AI-enabled flexible electronic systems via near-sensor and in-sensor computing – npj Flexible Electronics

Artificial intelligence is no longer confined to data centers and smartphones; it’s increasingly being woven into the very fabric of our devices—literally. AI-enabled flexible electronic systems (AI-FESs) are turning soft, stretchable, and skin-like sensors into intelligent front ends that don’t just detect the world but interpret it. This shift, from simple data capture to on-the-spot analysis, is transforming health monitoring, human–machine interaction, soft robotics, and the Internet of Things. Yet most current AI-FESs still lean on external processors or the cloud, a dependency that adds latency, drains power, and raises privacy concerns. The field is now pivoting toward making sensors themselves smarter—via near-sensor and in-sensor computing that pushes intelligence to the extreme edge.

What makes AI-FESs different

Traditional sensor systems collect raw signals—pressure, strain, temperature, biopotentials, chemicals, light—and ship them off-chip for processing. AI-FESs integrate these sensors with embedded compute so the system can recognize patterns, extract context, and act in real time. Flexible substrates (like polymers and textiles), stretchable conductors, and ultrathin devices conform to the body or objects. Layer AI atop that, and you move from “measuring a heartbeat” to “detecting arrhythmia,” or from “sensing pressure” to “identifying a grasp type.” The result is a tighter loop between the physical and digital worlds, enabling responsive, adaptive behavior without a broadband connection.

Why near-sensor and in-sensor computing matter

Shuttling raw data to a remote processor is costly in power and bandwidth. For wearables and untethered robots, it’s also too slow. Near-sensor computing co-locates microcontrollers, neural accelerators, or compute-in-memory blocks with the sensor array, shrinking data movement and latency. In-sensor computing goes a step further: the sensing element participates in computation. Examples include photodetectors that perform weighted summation before digitization, piezoresistive networks that embed feature extraction, and memristive crossbar arrays that both store synaptic weights and execute multiply–accumulate operations in analog.
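The analog multiply–accumulate performed by a memristive crossbar can be sketched in a few lines of linear algebra. This is a minimal idealized model (made-up conductance and voltage values, no device noise, wire resistance, or nonlinearity), not a circuit simulation:

```python
import numpy as np

# Hypothetical 3x2 crossbar: each cell stores a synaptic weight as a
# conductance G (siemens). Driving the three rows with voltages V
# produces column currents I = G^T @ V (Kirchhoff's current law) --
# a multiply-accumulate executed in the analog domain, so the result
# is digitized only once, after the summation.
G = np.array([[1e-6, 5e-6],
              [2e-6, 1e-6],
              [4e-6, 3e-6]])    # conductances; one column per output

V = np.array([0.2, 0.5, 0.1])   # input voltages on the rows

I = G.T @ V                     # column currents = weighted sums
print(I)                        # two analog outputs, in amperes
```

The same picture covers the photodetector example: replace conductances with pixel responsivities and voltages with light intensities, and the summed photocurrent is the weighted feature.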

The benefits are clear:

  • Lower latency: decisions made at the edge, within milliseconds.
  • Energy efficiency: reduced analog-to-digital conversions and data transfer.
  • Privacy: sensitive biosignals can be analyzed locally.
  • Robustness: inference continues even without connectivity.

Recent progress in integrating flexible sensors with processors

Research is converging on several integration strategies. Hybrid “chip-on-flex” and chiplet approaches combine mature silicon AI accelerators with flexible interconnects and sensor skins, providing high performance while preserving mechanical compliance. All-flex systems are emerging too, powered by thin-film transistors and printed electronics that deliver basic logic, memory, and analog front ends directly on plastic. Compute-in-memory arrays—often using resistive RAM or ferroelectric devices—offer analog multiply–accumulate at microwatt scales, well matched to continuous sensing.

Multi-modal fusion is a hot spot: combining tactile, strain, temperature, and bioelectrical signals locally to improve accuracy. On-device training is still rare due to energy costs, but incremental learning, calibration updates, and personalization (for unique physiologies or usage patterns) are being demonstrated with lightweight algorithms.
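Lightweight personalization of the kind described above can be as simple as tracking a per-wearer baseline on-device. A minimal sketch, assuming a slowly drifting scalar sensor signal; the decay constant, margin, and class name are illustrative, not from any specific system:

```python
class BaselineCalibrator:
    """Tracks a slowly drifting sensor baseline with an exponential
    moving average, so events are flagged relative to the wearer's
    own signal rather than a factory-fixed threshold."""

    def __init__(self, alpha=0.05, margin=2.0):
        self.alpha = alpha      # EMA decay: smaller = slower adaptation
        self.margin = margin    # deviation that counts as an event
        self.baseline = None

    def update(self, x):
        """Fold one raw sample into the baseline; return True if the
        sample deviates beyond the margin (an 'event')."""
        if self.baseline is None:
            self.baseline = x
            return False
        is_event = abs(x - self.baseline) > self.margin
        if not is_event:        # adapt only on quiescent samples
            self.baseline += self.alpha * (x - self.baseline)
        return is_event

cal = BaselineCalibrator()
stream = [10.0, 10.1, 10.2, 10.1, 15.0, 10.2]   # spike at index 4
events = [cal.update(x) for x in stream]
print(events)   # only the spike is flagged; slow drift is absorbed
```

The update costs one multiply-add per sample and a few bytes of state, which is why calibration of this flavor fits where full retraining does not.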

Advanced processing methods moving on-sensor

To make AI fit on and within sensors, the community is embracing:

  • TinyML: aggressively quantized and pruned neural networks running on milliwatt microcontrollers.
  • Spiking neural networks and event-driven sensing: data flows only when something changes, slashing power.
  • Compute-in-memory and analog preprocessing: feature extraction and dimensionality reduction before digitization.
  • Reservoir and morphological computing: leveraging the intrinsic dynamics of soft materials as computational substrates.
  • Federated and privacy-preserving learning: local updates with shared model improvements, keeping raw data on-device.
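The TinyML entry above leans on aggressive quantization. A minimal sketch of symmetric per-tensor post-training quantization to int8; the toy weight matrix stands in for a small dense layer and is not from any real model:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor quantization: map float weights into
    int8 via a single scale factor (max |w| maps to 127)."""
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 codes."""
    return q.astype(np.float32) * scale

# Toy weight matrix standing in for a small dense layer.
w = np.array([[0.42, -0.91], [0.05, 0.77]], dtype=np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage is 4x smaller than float32, and rounding keeps the
# reconstruction error within half a quantization step.
print(q.dtype, float(np.abs(w - w_hat).max()))
```

Pruning then zeroes small weights so sparse kernels can skip them; together the two techniques are what let networks run on milliwatt microcontrollers.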

These methods align algorithmic demands with hardware realities, maximizing accuracy per microwatt and per millimeter of flexible real estate.

Key challenges on the road to mainstream AI-FESs

Despite rapid progress, several hurdles remain:

  • Materials and reliability: Flexible devices face strain, sweat, and temperature swings. Minimizing drift, hysteresis, and variability while maintaining comfort and biocompatibility is essential.
  • Power and energy harvesting: Battery capacity limits wear-time. Integrating solar, thermoelectric, piezo, or triboelectric harvesters with ultra-low-power and intermittent computing is a priority.
  • Memory and endurance: Nonvolatile, flexible memories must retain weights across millions of cycles with tight variability control.
  • Calibration and data quality: On-sensor self-calibration and robust baselining are needed to tame noise and long-term degradation.
  • Security and privacy: Edge inference reduces exposure, but secure boot, model protection, and encrypted updates are vital.
  • Standardization and benchmarks: Common datasets and test protocols for soft, on-body, and robotic scenarios are still scarce.
  • Sustainability: Designing for repairability, recyclability, and minimal e-waste will shape responsible deployment.

Where the field is heading

Next-generation AI-FESs will blend near- and in-sensor intelligence in hierarchical stacks: sensors perform analog filtering and compression; local arrays handle feature extraction; microcontrollers or chiplets run compact inference; and the cloud supports periodic retraining or fleet analytics. Expect advances in:

  • Self-healing and stretchable materials that maintain performance under repeated deformation.
  • Batteryless operation and energy-aware algorithms that adapt to harvested power budgets.
  • Reconfigurable metasensors that change modality or sensitivity in software.
  • Open toolchains that co-design materials, devices, and models, accelerating from lab prototypes to manufacturable systems.
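The hierarchical stack described above (filter at the sensor, extract features nearby, infer on a microcontroller-class core) can be illustrated end to end in software. Every detail here — window size, feature set, threshold classifier — is an illustrative stand-in for the analog and quantized stages a real system would use:

```python
import numpy as np

def sensor_stage(raw, k=4):
    """In-sensor filtering and compression, modeled as a k-sample
    moving average that also decimates the stream by k."""
    raw = np.asarray(raw, dtype=float)
    n = len(raw) // k
    return raw[:n * k].reshape(n, k).mean(axis=1)

def feature_stage(window):
    """Near-sensor feature extraction: a compact descriptor
    replaces the raw window."""
    return np.array([window.mean(), window.std(),
                     window.max() - window.min()])

def inference_stage(features, threshold=1.0):
    """Microcontroller-class inference: a trivial range classifier
    standing in for a compact quantized network."""
    return "active" if features[2] > threshold else "idle"

raw = [0.1, 0.2, 0.1, 0.2,  0.1, 0.1, 0.2, 0.1,   # quiet
       3.0, 2.8, 0.1, 3.1,  2.9, 0.2, 3.0, 0.1]   # activity burst
filtered = sensor_stage(raw)
label = inference_stage(feature_stage(filtered))
print(label)
```

Each stage shrinks the data it passes upward, which is the point of the hierarchy: the cloud, if involved at all, sees labels and model updates rather than raw waveforms.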

Applications will span continuous health analytics without raw data leaving the skin, dexterous soft grippers that recognize objects by touch, and smart surfaces that perceive occupancy, posture, and intent with minimal infrastructure.

The bottom line

AI-enabled flexible electronic systems are moving from sensing to understanding—and doing so at the edge. By bringing computation closer to, and even inside, the sensor, researchers are overcoming the bottlenecks of latency, power, and privacy that have hampered real-time, on-body, and untethered intelligence. This review charts the integration strategies, processing techniques, and open challenges shaping the path toward truly autonomous, adaptive, and unobtrusive electronics.
