Does new research out of three universities—Duke, Surrey, and Hamburg—represent a tipping point in the development of artificial general intelligence?
The biggest technology game changers don’t always grab the biggest headlines. Two emerging AI developments may not go viral on TikTok or YouTube, but they represent an inflection point that could radically accelerate the development of artificial general intelligence (AGI). That’s AI that can function and learn like us.
As the types of data used to interact with the environment become richer and more integrated, AI moves inexorably closer to true AGI.
Coming to our senses: WildFusion
As humans, we rely on all sorts of stimuli to navigate the world, including our senses: sight, sound, touch, taste, smell. Until now, robots have relied almost entirely on a single sense: visual perception. New research from Duke University goes beyond vision alone. It’s called WildFusion, and it combines sight with touch and vibration.
The four-legged robot used by the research team includes microphones and tactile sensors in addition to the standard cameras commonly found in state-of-the-art robots. The WildFusion robot can use sound to assess the quality of a surface (dry leaves, wet sand) as well as pressure and resistance to calibrate its balance and stability. All of this data is gathered and combined, or fused, into a single representation that improves over time with experience. The research team plans to enhance the robot’s capabilities by enabling it to gauge things like heat and humidity.
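To make the idea of fusing senses into one representation concrete, here is a minimal sketch in Python. The sensor names, feature values, and the simple moving-average update are illustrative assumptions, not the Duke team’s actual WildFusion implementation, which the article does not detail.

```python
# Toy illustration of multimodal sensor fusion (not the actual WildFusion code).
# Hypothetical feature vectors for vision, sound, and touch are merged into one
# shared representation that is refined as new readings arrive.

from dataclasses import dataclass
from typing import List


@dataclass
class SensorReading:
    vision: List[float]   # e.g., features extracted from camera images
    sound: List[float]    # e.g., spectral features from contact microphones
    touch: List[float]    # e.g., pressure / resistance from tactile sensors


def fuse(reading: SensorReading) -> List[float]:
    """Concatenate the per-sense features into a single fused vector."""
    return reading.vision + reading.sound + reading.touch


def update(state: List[float], fused: List[float], rate: float = 0.1) -> List[float]:
    """Blend a new fused reading into the running representation
    (a simple exponential moving average, standing in for learning from experience)."""
    return [(1 - rate) * s + rate * f for s, f in zip(state, fused)]


# Example: the representation starts from a first reading and is refined over time.
first = SensorReading(vision=[0.8, 0.1], sound=[0.3], touch=[0.5, 0.9])
state = fuse(first)
for _ in range(3):
    new = SensorReading(vision=[0.7, 0.2], sound=[0.4], touch=[0.6, 0.8])
    state = update(state, fuse(new))
print(state)
```

The point of the sketch is the shape of the approach: each sense contributes its own signal, the signals are merged into one representation, and that representation is updated as the robot accumulates experience.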
Read the complete Fast Company article by Tom Barnett: https://www.fastcompany.com/91358434/these-two-game-changing-breakthroughs-advance-us-towards-artificial-general-intelligence