The company sells an “AI-in-Sensor” system-on-a-chip that enables faster processing of AI problems at the edge of the network.
Fresh off a funding round worth $13.2 million, edge artificial intelligence startup AIStorm announced its first two products.
The processors are designed for sensors meant for embedding into mobile devices, IoT machinery and self-driving cars, and process data directly within those systems. According to the company, they are well suited to AI computation at the edge because they can process information in raw analog form, eliminating the need to digitize sensor data and hand it off to power-hungry, time-consuming GPUs.
“By combining the processing chip with the sensor, we can deliver efficient and extremely fast processing of sensor data right at the edge,” David Schie, co-founder and chief executive officer of AIStorm, told SiliconANGLE. “This offers two huge benefits. It reduces response latency and it eliminates the need to send data over the network to a different processor for machine learning and decision making.”
Low-power SoCs are a better option than conventional digital processors, the company argues, because they eliminate the need to transform data into a digital format. Instead, they can process information directly from sensors while it’s still in its native analog form. The processed data can then be used to train AI and machine learning models for a wide range of different tasks, the company said.
The IoT Vision and Waveform chips target smartphones, wearables and other IoT devices, assisting with tasks such as heart monitoring, drone imaging, fingerprint sensing, gesture control, image stabilization and voice input. The company’s Advanced Driver Assistance System chip helps vehicles process sensor data to improve safety through better collision avoidance.