Fresh off a round of funding worth $13.2 million, edge artificial intelligence startup AIStorm announced its first two products. The company sells an “AI-in-Sensor” system-on-a-chip that enables faster processing of AI problems at the edge of the network.
The processors — designed for sensors embedded in mobile devices, IoT machinery and self-driving cars — process data directly within the sensor system. Because they can work on information in its raw analog form, they are well suited to AI computation at the edge and eliminate the need to hand data off to power-hungry, higher-latency GPUs.
“By combining the processing chip with the sensor, we can deliver efficient and extremely fast processing of sensor data right at the edge,” says David Schie, co-founder and CEO of AIStorm. “This approach offers two huge benefits. It reduces response latency, and it eliminates the need to send data over the network to a different processor for machine learning and decision making.”
The low-powered SoCs offer a better option than conventional digital processors because they eliminate the need to convert sensor data into a digital format. Instead, they process information in its native analog form, directly from the sensor. That processed analog data can then feed AI and machine learning models for a wide range of tasks.
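To make the idea concrete: AIStorm's chips perform this computation in analog circuitry, but the underlying operation is a multiply-accumulate over raw sensor readings. The sketch below simulates that operation digitally purely for illustration; the function name, sample values, and weights are all made up and do not come from AIStorm.

```python
# Conceptual sketch only: a single "neuron" computing a weighted sum
# (multiply-accumulate) directly over raw sensor readings, with no
# intermediate digitization step in the pipeline it models. In the
# actual chips this happens in analog circuitry; here it is simulated.

def analog_mac(samples, weights, bias=0.0):
    """Weighted sum of raw sensor samples, as an analog MAC array would compute."""
    assert len(samples) == len(weights)
    acc = bias
    for s, w in zip(samples, weights):
        acc += s * w  # each multiply-accumulate maps to one analog cell
    return acc

# Hypothetical raw photodiode readings and learned weights:
samples = [0.12, 0.87, 0.43, 0.05]
weights = [0.5, -0.25, 1.0, 0.75]

activation = analog_mac(samples, weights)
print(round(activation, 4))  # 0.31
```

Avoiding the analog-to-digital conversion step is what saves both power and latency: the sum is formed as charge accumulates, rather than after each sample is sampled, quantized, and shuttled to a separate processor.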
Designed for smartphones, wearables, and other IoT devices, the IoT Vision and Waveform chips also support:
- Heart monitoring
- Drone imaging
- Fingerprint sensing
- Gesture control
- Image stabilization
- Voice input
The company’s Advanced Driver Assistance System chip helps vehicles process sensor data that can improve safety through better collision avoidance.