Dealing with Rising Performance Demands at the Edge
Posted 08/29/2019 by Hussein Osman
Migrating AI applications to the network Edge offers promising benefits like reduced data latency, better privacy, and lower power consumption. But it poses design challenges as well. Designers are being asked to develop Edge AI solutions that combine lower power requirements with a smaller footprint. At the same time, new AI/ML applications like presence detection and object counting demand that developers build solutions that operate at higher performance levels than ever before.
Looking for a way to overcome these challenges? Look no further than Lattice sensAI™, our award-winning technology stack that delivers low power AI to Edge devices. In May, Lattice announced major performance and design flow enhancements to sensAI, including a 10X performance boost for low power, smart IoT devices.
Recently, Lattice released a new white paper, 'Rising Edge AI Requirements Demand Higher Performance Solutions,' that lays out how designers can use the sensAI stack and Lattice's low power, low density FPGAs to accelerate neural network performance.
Real World Use Cases
If you're looking for ways to implement Edge AI, be sure to read the use cases for sensAI covered at the end of the white paper. They illustrate how sensAI can address some common data processing problems associated with supporting AI in Edge devices, both in new designs and when integrated into legacy designs to add AI support.
These use cases can be split into two categories: data pre-processing and data post-processing.

Data pre-processing is a great way to keep power and data latency low in smart vision applications. sensAI can determine whether data needs further inspection on the device or in the cloud before the device takes action. This keeps false positives (very common in smart vision applications) from triggering the device to activate its SoC or MCU (keeping power low) or from being forwarded to the cloud (keeping data latency low).

Post-processing data via sensAI is a great way for designers to add smart vision support to existing embedded vision applications at the Edge. For example, designers may find it more cost effective to add an accelerator that offloads some of the processing workload from a legacy SoC or MCU; sensAI makes it easy to do this. And because the sensAI stack runs on small footprint, low power Lattice FPGAs, using sensAI to add smart vision to a legacy device has minimal impact on power consumption and overall design footprint.
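The pre-processing pattern above can be sketched in a few lines. This is a minimal, hypothetical illustration of the idea, not Lattice's implementation: an always-on, low-power classifier screens incoming sensor frames and only wakes the host SoC/MCU (or forwards data to the cloud) when a detection is confident enough. The names `run_inference`, `preprocess_frame`, and `wake_host`, along with the toy confidence heuristic, are assumptions made for the example.

```python
# Hypothetical sketch of an Edge AI pre-processing gate: a small
# always-on model screens frames, and the expensive host SoC/MCU is
# only woken for confident detections, filtering out false positives.

WAKE_THRESHOLD = 0.8  # confidence required before waking the host


def run_inference(frame):
    """Stand-in for a small neural network running on the low-power device.

    Returns a (label, confidence) pair. Here the mean pixel intensity of
    an 8-bit frame is used as a toy "presence" confidence score.
    """
    confidence = sum(frame) / (255 * len(frame))
    return ("person", confidence)


def preprocess_frame(frame, wake_host):
    """Gate the expensive path: call wake_host() only on a confident hit.

    Returns True if the host was woken, False if the frame was discarded
    on-device (e.g. a likely false positive), keeping power and latency low.
    """
    label, confidence = run_inference(frame)
    if confidence >= WAKE_THRESHOLD:
        wake_host(label, confidence)  # hand off to the SoC/MCU or cloud
        return True
    return False  # stay asleep; nothing forwarded
```

In a real design the gate would run on the FPGA fabric and `wake_host` would assert an interrupt line to the SoC/MCU; the point is that low-confidence frames never leave the low-power device.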
For more information, visit our sensAI page.