BrainChip Holdings, which describes itself as the world's first commercial producer of neuromorphic AI IP and chips, and nViso, a human behavioral analytics AI company, will collaborate on developing systems that combine high AI performance with ultra-low-power technology. The initial effort will focus on implementing nViso's AI solutions for automotive in-cabin monitoring systems (IMS) on BrainChip's Akida processors.
BrainChip says that its neuromorphic processor mimics the human brain, analyzing only essential sensor inputs at the point of acquisition and processing data with high efficiency, precision and minimal energy use. Keeping AI/ML local to the chip, independent of the cloud, also dramatically reduces latency while improving privacy and data security.
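Neither company details Akida's programming model here, but the "only essential inputs" claim reflects a general event-driven principle: compute is spent only where a signal changes. The Python sketch below is a conceptual illustration of that idea under stated assumptions; it is not BrainChip's API, and the function and variable names are hypothetical.

```python
# Conceptual sketch of event-driven processing: work is generated only
# for sensor values that change meaningfully, so a static scene costs
# almost nothing. This is NOT BrainChip's Akida API; names are hypothetical.

import numpy as np

def to_events(frames: np.ndarray, threshold: float = 0.1) -> list:
    """Convert a stream of frames into sparse change events.

    Each event is (frame_index, active_pixel_indices, deltas); pixels
    whose change falls below `threshold` emit no event and cost no compute.
    """
    events = []
    prev = frames[0]
    for i, frame in enumerate(frames[1:], start=1):
        delta = frame - prev
        active = np.flatnonzero(np.abs(delta) > threshold)
        if active.size:  # only emit work when something changed
            events.append((i, active, delta[active]))
        prev = frame
    return events

# A mostly static in-cabin scene: 100 identical 64x64 frames with one
# change (e.g., a driver's head turning) at frame 50.
rng = np.random.default_rng(0)
frames = np.tile(rng.random(64 * 64), (100, 1))
frames[50] += 0.5
# Only 2 of 99 frame transitions produce events (the onset of the
# change and its reversion), which is why always-on event-driven
# inference can fit a micro- to milliwatt power budget.
print(len(to_events(frames)))  # -> 2
```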
The companies note that developers of automotive and consumer technologies are striving to build devices that respond better to human behavior, which requires tools and applications that interpret behavior captured by on-device cameras and sensors. These environments, however, are constrained by limited compute performance, tight power budgets and lapses in cloud connectivity. Akida processors address these weaknesses with high performance and ultra-low power consumption (micro- to milliwatts), performing AI/ML processing of vision/image, motion and sound data directly on the device rather than in a remote cloud.
“Our work with BrainChip will support AI’s demanding power/cost/performance needs for OEMs, even at mass production and scale, so they can benefit from faster and more efficient development cycles,” said Tim Llewellynn, CEO of nViso. “Ultra-low power edge-based consumer processing is expected to deliver a more intelligent and individualized user experience, and we believe running our AI solutions for social robots and in-cabin monitoring systems on Akida will provide a competitive edge for joint customers demanding always-on features on low power budgets.”
nViso says that its technology is uniquely able to analyze signals of human behavior such as facial expressions, emotions, identity, head pose, gaze, gestures, activities, and the objects with which users interact. In robotics and in-vehicle applications, this human behavior analytics detects a user's emotional state, enabling devices and systems that are personalized, adaptive, interactive and safe.