
Artificial entity thinks, learns like a person
The software, called Sallie, pairs a centralized computer "mind" with mobile sensory pods whose multiple senses and abilities let the system learn from a real-world environment and gain a fundamental understanding of physical objects, cause and effect, and the passage of time. As a result, the company says, Sallie can draw conclusions, a critical facet of genuine thinking and a necessary step toward artificial general intelligence (AGI).
“The first component of being able to understand like a person is learning about immediate surroundings,” says Charles Simon, Founder and CEO, Future AI. “Sallie can recognize objects with vision, build an internal model, ask questions, and take direction without any initial information.”
“Our work advances new algorithms that simulate biological neuron circuits with high-level artificial intelligence techniques. Sallie can infer information about objects she doesn’t understand, demonstrating one-shot, real-world learning without tagged data sets or backpropagation.”
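Future AI has not published the algorithms behind these claims, but the general idea of one-shot learning without backpropagation can be illustrated with a minimal sketch: a single exposure stores a feature "prototype," and later observations are matched against stored prototypes by similarity, with no gradient training and no tagged data set. The class name OneShotAssociator and the feature vectors below are hypothetical and purely illustrative, not Future AI's code.

```python
# Illustrative sketch only: one-shot association of a feature vector with a
# label, using similarity matching instead of backpropagation.
import numpy as np


class OneShotAssociator:
    def __init__(self):
        self.prototypes = {}  # label -> stored, normalized feature vector

    def learn(self, label, features):
        """Store a single observation as the prototype for a label (one shot)."""
        v = np.asarray(features, dtype=float)
        self.prototypes[label] = v / (np.linalg.norm(v) + 1e-9)

    def recall(self, features):
        """Return the best-matching label for a new observation, or None."""
        if not self.prototypes:
            return None
        v = np.asarray(features, dtype=float)
        v = v / (np.linalg.norm(v) + 1e-9)
        # Cosine similarity against every stored prototype; no gradients involved.
        label, score = max(
            ((lbl, float(v @ proto)) for lbl, proto in self.prototypes.items()),
            key=lambda pair: pair[1],
        )
        return label if score > 0.8 else None


# A single exposure is enough to recognize a similar observation later.
memory = OneShotAssociator()
memory.learn("red ball", [0.9, 0.1, 0.8])   # hypothetical vision features
print(memory.recall([0.85, 0.15, 0.75]))    # -> "red ball"
```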
The company’s software creates connections on its own between different types of real-world sensory input, much as the human brain interprets everything it knows in the context of everything else it knows. Sallie emulates the processes of human thought, beginning with perception.
The company says the software uses unique, self-adaptive graph algorithms and data structures. It recently raised $2 million in initial funding to accelerate development of its technology and algorithms, including its Universal Knowledge Store (UKS), which aggregates different types of information and creates connections between them, similar to the cognitive processes of human intelligence. Because it is modeled in neurons, the UKS is biologically plausible and can learn and function unsupervised, the way children do.
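Future AI has not disclosed the UKS internals, but a minimal sketch of a knowledge store in the spirit described above might look like the following: a graph whose nodes hold pieces of information from any modality and whose labeled links connect them, so that queries can follow those connections. The class KnowledgeStore and the relationship names are assumptions for illustration only, not the actual UKS implementation.

```python
# Minimal sketch of a cross-modal knowledge store: labeled links between
# pieces of information, queried by following those links.
from collections import defaultdict


class KnowledgeStore:
    def __init__(self):
        # source -> relationship -> set of targets
        self.links = defaultdict(lambda: defaultdict(set))

    def add_link(self, source, relationship, target):
        """Connect two pieces of information with a labeled relationship."""
        self.links[source][relationship].add(target)

    def related(self, source, relationship):
        """Return everything linked to `source` by `relationship`."""
        return self.links[source][relationship]


# Different kinds of sensory and verbal input end up connected in one graph.
uks = KnowledgeStore()
uks.add_link("ball", "has-color", "red")        # from vision
uks.add_link("ball", "makes-sound", "bounce")   # from hearing
uks.add_link("ball", "is-a", "toy")             # from spoken direction
print(uks.related("ball", "has-color"))         # -> {'red'}
```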
Sallie’s technology, and the knowledge she acquires, will be incorporated into existing AI applications, the company says, radically improving personal assistants such as Alexa and Siri, language translation, computer vision, automated customer service systems, and many other human-interactive systems. Additional enhancements already in development will give Sallie an even deeper understanding of the world around her.
Companies or individuals interested in evaluating Sallie can sign up now for priority beta-testing in Q4 of this year.
