Robot Talk Episode 154 – Visual navigation in insects and robots, with Andrew Philippides
In this episode of Robot Talk, host Claire Asher sits down with Andrew Philippides of the University of Sussex to explore what ants and bees can teach us about building robots that find their way in the real world: reliably, efficiently, and without GPS.
Philippides is a Professor of Biorobotics who co-directs the Centre for Computational Neuroscience and Robotics and the be.AI Leverhulme Doctoral Centre for Biomimetic Embodied AI at Sussex. His work blends behavioral biology, robotics, computational modeling, and machine learning to understand how intelligent behavior emerges from the tight loop between brain, body, and environment. With a focus on visual navigation, he studies how insects learn and travel through complex landscapes, translating those insights into new AI and biorobotic algorithms.
Why insects?
Ants and bees navigate long distances with astonishing reliability using minuscule brains and minimal energy. Instead of building heavy, globally consistent maps, they leverage:
- View-based homing: comparing the current visual scene with stored “snapshots” to steer toward familiar views (sketched in code after this list).
- Landmark guidance: learning the visual relationships of distinctive objects and horizons to follow routes.
- Optic flow and path integration: estimating distance traveled and heading using motion cues and celestial compasses.
- Rapid, data-efficient learning: encoding useful, low-dimensional features without exhaustively mapping the world.
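To make the first of these ideas concrete, here is a minimal sketch of view-based homing via a rotational image difference. It assumes panoramic views arrive as 2-D NumPy arrays whose columns map linearly to azimuth; the function names are illustrative, not taken from the episode.

```python
import numpy as np

def familiarity(view: np.ndarray, snapshot: np.ndarray) -> float:
    """Negative mean squared pixel difference: higher means more familiar."""
    return -float(np.mean((view.astype(float) - snapshot.astype(float)) ** 2))

def best_heading_deg(view: np.ndarray, snapshot: np.ndarray) -> float:
    """Rotate the panoramic view column by column and return the azimuth
    (in degrees) at which it best matches the stored goal snapshot."""
    n_cols = view.shape[1]
    scores = [familiarity(np.roll(view, s, axis=1), snapshot)
              for s in range(n_cols)]
    best = int(np.argmax(scores))          # columns map linearly to azimuth
    return 360.0 * best / n_cols
```

Steering then amounts to turning by the returned azimuth and stepping forward; in practice the views are heavily downsampled, which is part of what keeps the approach cheap enough for a tiny brain or a small robot.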
From the nest to the robot lab
Philippides explains how these principles are reshaping robotic navigation:
- View-based navigation for robots: storing compact visual signatures and steering by simple familiarity metrics that stay robust to clutter and lighting changes.
- Embodied AI: using sensor placement, wide field-of-view cameras, and morphology to simplify control—mimicking the advantages of insect compound eyes.
- Biorobotic validation: testing algorithms on real robots outdoors to probe limits and refine models grounded in biology.
- Learning that stays lean: combining insect-inspired strategies with machine learning to fuse path integration with landmark memories without bloating the model (see the sketch after this list).
- Energy-efficient autonomy: exploring neuromorphic hardware and event-based vision to deliver long-lived, low-power navigation.
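As a toy illustration of that fusion, the sketch below accumulates a path-integration home vector and blends its bearing with a view-based bearing. It assumes per-step heading and distance estimates are available; the confidence weighting and function names are illustrative assumptions, not the method discussed in the episode.

```python
import numpy as np

def home_vector(headings_rad: np.ndarray, step_lengths: np.ndarray):
    """Accumulate outbound steps (heading from a compass cue, length from
    optic-flow odometry) and return the bearing and distance back home."""
    dx = np.sum(step_lengths * np.cos(headings_rad))
    dy = np.sum(step_lengths * np.sin(headings_rad))
    return float(np.arctan2(-dy, -dx)), float(np.hypot(dx, dy))

def fuse_bearings(pi_bearing: float, pi_weight: float,
                  view_bearing: float, view_weight: float) -> float:
    """Blend the path-integration bearing with a view-based bearing by
    confidence, using a weighted circular mean so angles wrap correctly."""
    x = pi_weight * np.cos(pi_bearing) + view_weight * np.cos(view_bearing)
    y = pi_weight * np.sin(pi_bearing) + view_weight * np.sin(view_bearing)
    return float(np.arctan2(y, x))
```

The path-integration estimate drifts as odometry errors accumulate, so one simple policy is to down-weight it as the journey lengthens and lean on familiar views near known routes, which also delivers the graceful degradation described in the next section.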
What makes it different from mainstream AI?
Instead of depending on dense maps, heavy perception stacks, or constant connectivity, insect-inspired methods emphasize simplicity and robustness:
- Local strategies over global maps: rely on what’s visible now and previously learned views, not a perfect world model.
- Interpretable control: clear links between sensory cues and steering decisions.
- Graceful degradation: when signals are noisy or missing, behavior degrades predictably rather than failing catastrophically.
Challenges and frontiers
Real environments are messy. Philippides highlights ongoing work to handle seasonal appearance changes, moving objects, and long-term drift. Another frontier is connecting behavior to underlying neural circuits—such as how insect brain regions support visual learning—then turning those principles into algorithms that scale from small robots to complex platforms.
About Andrew Philippides
Andrew Philippides is Professor of Biorobotics at the University of Sussex. He co-directs the Centre for Computational Neuroscience and Robotics and the be.AI Leverhulme Doctoral Centre for Biomimetic Embodied AI. His research brings together biological experiments, robotics, modeling, and machine learning to uncover how intelligence emerges from the interplay of body and brain in the real world. By focusing on visual navigation and learning in ants and bees, he develops new AI and biorobotic methods that are compact, data-efficient, and robust outdoors.
Tune in to Robot Talk Episode 154 to hear how lessons from tiny navigators are inspiring the next generation of resilient, low-power robot guidance systems—and why the smartest route home might be the simplest one.