AI and Embedded Systems: A Future in the Making


Embedded systems have a huge role to play in the implementation and success of AI. The most interesting developments in AI hardware can be seen at the edge, where filtering data and making quick decisions are paramount.

 

Artificial Intelligence has been around for decades. The term was coined in the mid-20th century, when a handful of computer scientists met at the Dartmouth Conference in 1956. But the field has exploded since 2015, driven by faster, cheaper, and more powerful processors, near-limitless storage capacity, a flood of data, and advances in deep learning (a subset of machine learning, which is in turn a subset of AI).

 

Advances in AI algorithms and supporting technologies created an ecosystem for the emergence of speech, image, video, and text recognition, applications that require specialized hardware capable of accelerating AI-based development.

 

EDGE AI:

The most interesting developments in AI hardware can be seen at the edge, where AI algorithms or models run seamlessly on embedded devices. Devices that need some form of AI processing include smartphones, robots, drones, and security cameras. According to one study, the overall edge AI hardware market was expected to ship 610 million units in 2019 and is likely to reach 1,559.3 million units by 2024 [i].

 

Edge AI delivers four important capabilities: (1) local data processing, (2) filtering data before transferring it to the cloud, (3) quick decision-making, and (4) low latency. These capabilities are imperative for the growth of the Internet of Things (IoT), industrial robotics (IIoT), and autonomous vehicles. Edge AI filters the relevant data and compresses it down to what actually needs to be analyzed and stored. It pre-processes that data before shipping it to the cloud, which saves the cost of extra bandwidth and equipment.
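
To make the filter-and-forward idea concrete, here is a minimal Python sketch of an edge loop that scores each sensor reading locally and uploads only what crosses a threshold. The scoring stub, the threshold, and the https://example.com/ingest endpoint are hypothetical placeholders for illustration, not part of any specific product.

```python
import json
import random
import urllib.request

CLOUD_ENDPOINT = "https://example.com/ingest"  # hypothetical ingestion endpoint
ANOMALY_THRESHOLD = 0.8                        # illustrative cut-off

def run_local_model(reading):
    """Stand-in for an on-device model: score how 'interesting' a reading is."""
    # A real edge device would run a quantized ML model here.
    return min(abs(reading["vibration"]) / 10.0, 1.0)

def upload_to_cloud(reading):
    """Ship only the flagged reading, not the raw stream, to the cloud."""
    payload = json.dumps(reading).encode("utf-8")
    req = urllib.request.Request(CLOUD_ENDPOINT, data=payload,
                                 headers={"Content-Type": "application/json"})
    try:
        urllib.request.urlopen(req, timeout=5)
    except OSError:
        pass  # in a real system: buffer locally and retry later

def edge_loop(sensor_stream):
    for reading in sensor_stream:
        score = run_local_model(reading)       # decide locally, with low latency
        if score >= ANOMALY_THRESHOLD:         # forward only the relevant data
            upload_to_cloud({**reading, "score": score})
        # readings below the threshold never leave the device

if __name__ == "__main__":
    # Simulated sensor stream standing in for real hardware.
    stream = ({"vibration": random.uniform(0, 12)} for _ in range(100))
    edge_loop(stream)
```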

 

A typical artificial intelligence technology stack contains nine layers. Enabling emerging AI applications relies heavily on hardware, especially on logic and memory.

 

(Figure courtesy of "Artificial-intelligence hardware: New opportunities for semiconductor companies")

 

On the logic and compute side, running cutting-edge AI systems requires cutting-edge AI chips, which can be up to a thousand times more efficient than general-purpose CPUs for these workloads. The artificial intelligence (AI) chipset market is expected to reach $57.8 billion by 2026 [i], which presents a greater opportunity for semiconductor companies than ever before.

 

Edge computing has limited resources and computing power compared with public cloud infrastructure, so to fill the gap chip manufacturers are building AI accelerators that significantly speed up the inference process. This results in faster prediction and classification of the data received at the edge layer.
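
As a rough sketch of what accelerated inference at the edge looks like in code, the snippet below loads a TensorFlow Lite model and, when an accelerator delegate library is present, offloads the supported operations to it, falling back to the CPU otherwise. The model file name and delegate library path are assumptions for illustration; other runtimes and vendor SDKs follow a similar load-then-invoke pattern.

```python
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

MODEL_PATH = "classifier.tflite"   # hypothetical quantized model file
DELEGATE_LIB = "libedgetpu.so.1"   # example accelerator delegate library

def make_interpreter():
    try:
        # Offload supported ops to the hardware accelerator, if one is attached.
        return Interpreter(model_path=MODEL_PATH,
                           experimental_delegates=[load_delegate(DELEGATE_LIB)])
    except (OSError, ValueError):
        # No accelerator found: run the same model on the CPU.
        return Interpreter(model_path=MODEL_PATH)

def classify(frame):
    """Run one frame (already resized to the model's input shape) through the model."""
    interpreter = make_interpreter()
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], frame.astype(inp["dtype"]))
    interpreter.invoke()                          # inference happens on-device
    scores = interpreter.get_tensor(out["index"])[0]
    return int(np.argmax(scores))                 # index of the predicted class
```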

 

CPU, GPU, FPGA, ASIC? WHICH ONE IS BETTER?

Compute performance relies on central processing units (CPUs) and on accelerators: graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs).

 

As each use case has a different set of computing requirements, the optimal AI hardware architecture will vary. In the current landscape, CPUs and GPUs lead in AI sockets. All else being equal, FPGAs and ASICs offer lower latency than GPUs and CPUs because the workload runs directly in hardware, with no operating system (OS) in the path. Each has its own disadvantages, however: FPGAs suit low-volume production, since cost rises with quantity and developers have very limited control over power optimization, while ASICs involve long development times and expensive design tools.

 

MARKET AND OPPORTUNITIES:

There is a lot of opportunity for edge AI hardware on the surveillance side, where governments use surveillance cameras for law enforcement. Behavior analysis and face recognition are the major applications of AI-based surveillance. The use case for edge AI in surveillance is quite simple: a surveillance camera should recognize a suspect in real time rather than wait for back-end support to process the footage and respond.

 

On top of that, these solutions can be used by private companies in the video protection market. A passenger counting system is one such example: cameras and intelligent people counters log the number of people getting on and off at each station. A video counting system is often around 98% accurate and can differentiate between children, adults, and objects. With embedded AI, companies working in mass transit can implement high-speed, accurate passenger counting.
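
As a toy illustration of on-device people counting, and not the production-grade systems described above, the sketch below runs OpenCV's stock HOG pedestrian detector on a video feed and keeps a naive per-frame count. The camera index and counting logic are simplified assumptions; a real deployment would pair a trained neural network with object tracking to count entries and exits reliably.

```python
import cv2

def count_people(video_source=0, max_frames=300):
    """Toy on-device counter using OpenCV's built-in HOG pedestrian detector."""
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(video_source)  # 0 = default camera; a file path also works
    peak_count = 0
    frames = 0
    while cap.isOpened() and frames < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (640, 480))        # keep detection cheap on edge hardware
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        peak_count = max(peak_count, len(boxes))     # naive per-frame count, no tracking
        frames += 1
    cap.release()
    return peak_count

if __name__ == "__main__":
    print("Most people seen in a single frame:", count_people())
```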

 

Similarly, a surveillance system coupled with embedded AI and sensors can detect possible accidents or spillage at an oil and gas refinery and make decisions to neutralize the hazard.

 

CONCLUSION:

AI hardware is still in its infancy, though chip makers and component manufacturers are building promising hardware. In the near future, almost all devices will run ML algorithms, and the majority of the computing will be done at the edge. Trunexa is already seeing this trend and is designing AI-enabled edge devices for our clients. Please get in touch for more information.
