www.magazine-industry-usa.com

Full-Stack Edge AI for MCUs and MPUs

Microchip Technology expands its edge AI portfolio with integrated silicon, software and tools for production-ready real-time inference applications.

  www.microchip.com

Microchip Technology has introduced full-stack edge AI application solutions for its microcontrollers and microprocessors, enabling real-time machine learning inference directly on embedded devices across industrial, automotive, data center and IoT networks.

From cloud AI to real-time edge inference
As AI and ML workloads move closer to data sources, edge devices are increasingly required to perform real-time inference without relying on cloud connectivity. This shift reduces latency, enhances data privacy and minimizes bandwidth dependency, all critical factors in industrial automation, automotive systems and consumer IoT environments.

Microchip’s expanded offering targets embedded platforms that interface directly with sensors, motors, actuators and control systems. By embedding AI capabilities into MCUs and MPUs, the company positions these devices as processing nodes capable of executing inference tasks at the point of data generation.

Full-stack approach to edge AI
Microchip’s edge AI strategy integrates silicon, software frameworks, optimized ML models and development tools into unified application solutions. The objective is to simplify the transition from proof-of-concept to production deployment.

The new application solutions include pre-trained, deployable models along with modifiable application code. Developers can adapt models to different environments using Microchip’s embedded software ecosystem or partner tools, enabling flexibility across performance and memory configurations.

Targeted application use cases
The initial application portfolio addresses common edge AI scenarios, including:
  • AI-based detection and classification of electrical arc faults through signal analysis.
  • Condition monitoring and predictive maintenance via equipment health assessment.
  • On-device facial recognition with liveness detection for secure identity verification.
  • Keyword spotting for voice-driven command interfaces in industrial, automotive and consumer systems.
These applications reflect demand for low-latency, always-available inference capabilities directly on embedded hardware.
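As an illustration of the condition-monitoring pattern listed above, the sketch below shows the general shape of on-device equipment health assessment: window a vibration signal, extract a simple energy feature and compare it against a healthy baseline. This is a minimal, hypothetical example, not Microchip's implementation; all function names, the RMS feature and the drift threshold are assumptions chosen for clarity.

```python
import math

def rms(window):
    """Root-mean-square energy of one sensor window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def health_status(window, baseline_rms, tolerance=0.25):
    """Flag a window whose energy drifts more than `tolerance`
    (relative) away from the healthy baseline."""
    drift = abs(rms(window) - baseline_rms) / baseline_rms
    return "fault" if drift > tolerance else "healthy"

# Simulated healthy vibration: low-amplitude oscillation.
healthy = [0.1 * math.sin(0.3 * n) for n in range(256)]
# Simulated degraded bearing: an added higher-energy component.
faulty = [x + 0.2 * math.sin(2.1 * n) for n, x in enumerate(healthy)]

base = rms(healthy)
print(health_status(healthy, base))  # healthy
print(health_status(faulty, base))   # fault
```

In a production deployment this hand-written threshold would typically be replaced by a trained classifier running on the MCU, but the windowing-feature-decision loop stays the same.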

Development environment and scalability
Engineers can implement these AI solutions using the MPLAB® X Integrated Development Environment (IDE), combined with the MPLAB Harmony software framework and the MPLAB ML Development Suite plug-in. This unified toolchain supports embedded AI integration through optimized libraries and scalable workflows.

Developers can begin with low-complexity implementations on 8-bit MCUs and scale to 16-bit or 32-bit MCUs for higher-performance production systems. For FPGA-based implementations, the VectorBlox™ Accelerator SDK 2.0 provides hardware-accelerated inference capabilities for computationally intensive workloads such as computer vision and sensor analytics.

Complementary hardware support
Microchip’s edge AI ecosystem extends beyond processing devices. Complementary components, including PCIe® connectivity devices and high-density power modules, support system-level integration in industrial automation and data center applications. Reference designs, such as motor control systems using dsPIC® digital signal controllers, facilitate data extraction in real-time AI pipelines.

Additional application examples include load disaggregation in smart e-metering, object detection and motion surveillance, demonstrating the breadth of deployment scenarios across industrial and infrastructure markets.
