Edge AI Trends Showcased at Embedded World

June 24, 2024

Edge AI is seeing notable advances as the integration of artificial intelligence (AI) with edge computing progresses. The term "Edge AI" describes running AI algorithms directly on edge devices, enabling real-time data processing and decision-making closer to the source of the data.

Embedded World, the premier global conference and exhibition for embedded systems, provides a stage for presenting the newest developments and use cases in Edge AI. By showcasing advances in hardware and software alongside cutting-edge applications across multiple sectors, the event highlights the transformative potential of Edge AI in our increasingly connected world.

Briefly about Edge AI

Rather than depending on cloud services or Internet access, Edge AI places computation close to the point of data generation, on edge devices such as IoT devices or sensors. Executing AI algorithms directly on these devices optimizes IoT deployments, enabling quick decision-making without transferring data back and forth to the cloud. Because data is processed near its source, latency drops and response times improve, and local processing also reduces bandwidth requirements and lowers data transfer costs.
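
To make the latency argument concrete, here is a minimal, purely illustrative sketch (not tied to any vendor) that compares an on-device inference call with the same call routed through a simulated cloud round trip; the 80 ms network delay and the toy "model" are assumptions.

```python
# Illustrative only: a toy "model" and an assumed 80 ms network round trip,
# used to contrast on-device inference with a cloud detour.
import time
import numpy as np

NETWORK_ROUND_TRIP_S = 0.08  # assumed WAN latency to a cloud endpoint

def local_inference(sample: np.ndarray) -> int:
    """Stand-in for an on-device model, e.g. a small quantized classifier."""
    return int(sample.mean() > 0.5)

def cloud_inference(sample: np.ndarray) -> int:
    """Same 'model', but the data must travel to the cloud and back."""
    time.sleep(NETWORK_ROUND_TRIP_S)  # simulated upload + download time
    return int(sample.mean() > 0.5)

sample = np.random.rand(64)  # e.g. one window of sensor readings

t0 = time.perf_counter()
local_inference(sample)
print(f"edge : {(time.perf_counter() - t0) * 1e3:.2f} ms")

t0 = time.perf_counter()
cloud_inference(sample)
print(f"cloud: {(time.perf_counter() - t0) * 1e3:.2f} ms")
```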

One of the main advantages of Edge AI is the improvement in security and data privacy. By processing data locally on edge devices, the risk of sensitive data being exposed or intercepted during transmission is significantly reduced.

What is Embedded World 2024?

Embedded World is a major event for the embedded systems community. This year's conference, held in Nürnberg, Germany from April 9 to 11, focused on the most recent developments and breakthroughs in chipsets, edge computing, embedded software, embedded systems, and related topics. The 2024 edition placed particular emphasis on the integration of AI with embedded systems, with a specific focus on Edge AI solutions.

Corporate research subscribers looking for in-depth analysis can obtain the extensive 67-page Embedded World 2024 event report. It covers the key trends the team identified, noteworthy announcements, product launches, and highlights from the keynote addresses.

6 Edge AI trends showcased at Embedded World

1. NVIDIA as a leading Edge AI company

The US-based chipmaker NVIDIA has played a significant role in promoting the adoption and application of AI technology in many industries. NVIDIA is well-known for producing high-performance GPUs, especially for use in data centers, but the company is also making great progress in implementing sophisticated AI models at the edge. NVIDIA has firmly established its dominance in the AI technology sector, surpassing rivals like AMD and Intel, thanks to a strong partner network that comprises more than 1,100 businesses.

One of NVIDIA's partners, Taiwan-based embedded systems manufacturer Aetina, showcased its AI-driven industrial edge solutions powered by NVIDIA GPUs at Embedded World 2024. Notably, Aetina unveiled the AIB-MX13/23, a device built around the NVIDIA Jetson AGX Orin module, which delivers up to 275 trillion operations per second (TOPS) of processing capacity. Together with the Finnish company TrueFlaw, which provides flaw recognition solutions, Aetina demonstrated a portable ultrasonic testing device connected to the AIB-MX13/23.

MediaTek, a Taiwan-based fabless semiconductor company, unveiled four new embedded systems-on-chips (SoCs), the CX-1, CY-1, CM-1, and CV-1, intended for automotive applications. The fact that these SoCs can run NVIDIA's DRIVE OS, a reference operating system for autonomous vehicles, shows how NVIDIA's technology is being adopted in areas beyond gaming and data centers.

2. AI model training shifting to the thick edge

The training of AI models is shifting from centralized cloud infrastructure to smaller, more localized "thick edge" settings, such as micro data centers or on-site servers. This change is driven by the integration of high-performance CPUs and GPUs that provide significant computational capacity at the edge, making both AI training and multiple AI inferencing workloads feasible there. Training AI models on-site at vendor facilities, instead of relying only on cloud infrastructure, has several advantages, including lower costs, better privacy, and faster responses from edge device AI applications.

Aetina debuted its AIP-FR68 Edge AI training platform at the event. The platform supports several configurations of four NVIDIA GPUs and offers up to 200 teraflops of processing power per GPU, a significant amount for a single GPU. The launch illustrates the growing trend of strengthening edge computing capabilities to support complex AI tasks, such as training and inferencing, at the edge.
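
For orientation, here is a minimal sketch of the kind of on-site training loop a GPU-equipped thick-edge server could run. It is an assumption-laden illustration, not Aetina's software: the model and data are dummies, and PyTorch is simply used as a stand-in framework.

```python
# Minimal on-site training sketch: dummy model and dummy data that never
# leave the local machine; uses whatever local GPUs CUDA exposes.
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"training on {device}, visible GPUs: {torch.cuda.device_count()}")

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 2)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Dummy "local" dataset standing in for data collected at the facility.
x = torch.randn(1024, 32, device=device)
y = torch.randint(0, 2, (1024,), device=device)

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss = {loss.item():.4f}")
```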

3. Tiny AI and ML models enabling micro-edge AI capability

As the name suggests, tiny AI and ML models are condensed forms of artificial intelligence and machine learning algorithms that can run on low-resource devices, such as sensor-based micro-edge devices. These models allow everyday equipment and objects to make decisions on their own without relying on cloud connectivity. Processing data directly on the device, at the edge, also improves privacy and data security.
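
To illustrate how such condensed models are commonly produced, the sketch below applies TensorFlow Lite's post-training int8 quantization to a tiny placeholder Keras classifier; the architecture and the random calibration data are assumptions, not any exhibitor's model.

```python
# Illustrative tiny-ML workflow: shrink a small (untrained, placeholder)
# Keras model into a fully int8-quantized TFLite flatbuffer suitable for
# resource-constrained micro-edge devices.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),                      # e.g. 3-axis sensor input
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),  # normal vs. anomaly
])

def representative_data():
    # Calibration samples; random values stand in for real sensor readings.
    for _ in range(100):
        yield [np.random.rand(1, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
print(f"quantized model size: {len(tflite_model)} bytes")
```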

Several examples of small-scale machine learning integration were showcased at Embedded World 2024. One was a compact ML-powered speech verification solution intended for extremely low-power edge AI devices, which improves security by adding speech recognition and passcode verification.

In a similar vein, SensiML, a US-based AI/ML software startup, unveiled a smart drill proof-of-concept. This drill offers real-time edge sensing and anomaly detection capabilities by using AI/ML models to categorize various screw fastening statuses.

Furthermore, Nordic Semiconductor, a Norway-based fabless semiconductor company, showcased its Thingy:53 IoT prototyping device, which is built around Nordic's nRF5340 chipset and uses embedded machine learning to detect anomalies. The Thingy:53 combines a miniature ML model with an accelerometer to detect abnormal vibrations in equipment; for example, it can switch off a machine's power when it notices anomalies.
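
The general pattern behind such a device is easy to sketch. The example below is illustrative only and is not Nordic's firmware: it scores accelerometer windows and cuts power when vibration looks abnormal, using a simple RMS threshold as a stand-in for a trained tiny ML model; the sensor read and power-off calls are hypothetical.

```python
# Illustrative vibration-anomaly loop: a stand-in RMS threshold plays the
# role of the tiny ML model, and the sensor/actuator calls are hypothetical.
import numpy as np

WINDOW_SAMPLES = 128   # accelerometer samples per inference window
RMS_LIMIT = 1.5        # assumed boundary between normal and abnormal vibration

def read_accelerometer(n: int) -> np.ndarray:
    """Hypothetical sensor read; synthetic data is used here."""
    return np.random.normal(0.0, 1.0, size=n)

def is_anomalous(window: np.ndarray) -> bool:
    rms = float(np.sqrt(np.mean(window ** 2)))
    return rms > RMS_LIMIT

def cut_power() -> None:
    """Hypothetical actuator call, e.g. toggling a relay GPIO."""
    print("anomaly detected: switching the machine off")

window = read_accelerometer(WINDOW_SAMPLES)
if is_anomalous(window):
    cut_power()
else:
    print("vibration within normal range")
```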

4. Localization of autonomous decision-making

An increasing number of cellular IoT devices are being integrated with AI-enabled chipsets, signaling a move toward intelligent, autonomous IoT systems that can make decisions locally. With major benefits including real-time data processing, lower latency, and higher efficiency thanks to smaller form factors, this development has the potential to transform industries such as smart cities and smart factories.

The intelligent mowing robot solution from Fibocom, a Chinese supplier of wireless communication modules, is a prime example of this trend. The robot's Qualcomm-based intelligence module enables reliable on-device processing: without relying on continuous cloud connectivity, it can map its surroundings, avoid obstacles, and carry out cost-effective boundary identification on its own. This practical application demonstrates the significant benefit of AI-enabled chipsets in improving how IoT devices operate.

At the event, Thundercomm, a US-based joint venture that specializes in IoT solutions, showcased its EB3G2 IoT edge gateway. By executing AI models directly on a Qualcomm system-on-chip (SoC), the gateway dramatically lowers latency and dependence on the cloud.

5. Micro- and thin-edge AI acceleration

Integrating specialized neural processing units (NPUs) into edge devices greatly improves AI inference capabilities. Because it reduces power consumption, improves thermal regulation, and enables efficient multitasking, this integration is well suited to power- and latency-sensitive applications such as wearables and sensor nodes.

NXP, a Dutch semiconductor firm, displayed its latest MCX N Series microcontrollers (MCUs) at the event. These MCUs deliver machine learning (ML) inference up to 42 times faster than with the CPU core alone. ARM, a UK-based semiconductor design company, showcased two configurations: one that used only the ARM Cortex-A55 CPU and another that combined it with the ARM Ethos-U65 NPU. In the latter configuration, the NPU handled 70% of the AI inference tasks previously handled by the CPU, leading to an 11-fold increase in performance.
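
To show how applications commonly hand inference off to an NPU, the sketch below loads a TensorFlow Lite model through a vendor NPU delegate and falls back to the CPU when the delegate is unavailable. The model file and delegate library names are placeholders, not specific to NXP, ARM, or any other exhibitor.

```python
# Illustrative NPU offload via a TensorFlow Lite delegate; the model path
# and delegate library name are placeholders for vendor-provided files.
import numpy as np
import tflite_runtime.interpreter as tflite

MODEL_PATH = "model_int8.tflite"            # placeholder model file
NPU_DELEGATE = "libvendor_npu_delegate.so"  # placeholder vendor delegate

try:
    delegate = tflite.load_delegate(NPU_DELEGATE)
    interpreter = tflite.Interpreter(model_path=MODEL_PATH,
                                     experimental_delegates=[delegate])
    print("running on the NPU delegate")
except (ValueError, OSError):
    interpreter = tflite.Interpreter(model_path=MODEL_PATH)
    print("delegate unavailable, falling back to the CPU")

interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()
out = interpreter.get_output_details()[0]
print("output:", interpreter.get_tensor(out["index"]))
```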

6. On-device AI processes for developers

Developers face several obstacles when integrating on-device AI, such as having to purchase new devices before they can assess the effectiveness of an AI chipset and its suitability for their AI models; commonly evaluated factors include temperature, CPU/NPU utilization, and device TOPS. To overcome these issues, companies are launching AI development platforms that can simulate on-device AI performance. These platforms allow developers to test the deployment of AI models against the resource specifications of particular edge devices and chipsets without having to buy the actual hardware.

One such solution was the EdgeAI SDK platform from Advantech, a Taiwan-based IoT and embedded solutions supplier. The platform facilitates the deployment of AI models on popular AI chipsets from NVIDIA, Qualcomm, Hailo, and Intel. Advantech demonstrated a pose detection model running on an Intel Arc A380E embedded GPU combined with an AIMB-278 industrial motherboard.
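
As a rough idea of the pre-purchase check such platforms automate, the back-of-the-envelope sketch below estimates whether a model's memory footprint and per-inference compute fit a target chipset's budget. It is not Advantech's EdgeAI SDK, and every figure in it is an assumed example value.

```python
# Back-of-the-envelope deployment check with made-up example numbers.
from dataclasses import dataclass

@dataclass
class Chipset:
    name: str
    tops: float    # peak int8 throughput in trillions of operations per second
    ram_mb: float  # memory available to hold the model

@dataclass
class Model:
    name: str
    gops_per_inference: float  # billions of operations per forward pass
    size_mb: float

def estimated_fps(model: Model, chip: Chipset, utilization: float = 0.3) -> float:
    """Rough throughput at an assumed achievable fraction of peak TOPS."""
    return (chip.tops * 1e3 * utilization) / model.gops_per_inference

chip = Chipset("hypothetical-edge-soc", tops=8.0, ram_mb=512)
model = Model("pose-detector", gops_per_inference=12.0, size_mb=9.5)

fits_memory = model.size_mb <= chip.ram_mb
print(f"{model.name} on {chip.name}: fits_memory={fits_memory}, "
      f"~{estimated_fps(model, chip):.0f} fps at 30% utilization")
```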

Conclusion

The ideas and developments presented at Embedded World 2024 highlight how Edge AI can fundamentally change embedded systems. As the integration of AI with edge computing continues to mature, Edge AI use cases will undoubtedly be essential for boosting real-time processing capabilities, enhancing privacy and security, and enabling more effective and scalable solutions across industries.

Embedded World 2024 offered a thorough insight into the direction of Edge AI, ranging from recent improvements in edge computing infrastructure to creative software solutions and AI chipset advancements. The event demonstrated how various technologies are coming together to build more intelligent, self-sufficient systems that can function well at the edge, leading to important advancements in smart homes, healthcare, industrial automation, and other fields.