This Humanoid Robot Is Cheaper Than Rivals — And Full of Security Flaws, Researchers Say | Today Headline
In the ongoing push to bring humanoid helpers into campuses and workplaces, attention often gravitates toward towering machines. Yet a compact model from Unitree Robotics—the G1—has quietly become a practical workhorse for a different kind of revolution: making humanoids affordable enough for everyday experimentation and education.
Priced at around $16,000, the G1 sits at a threshold that makes it accessible to universities, robotics clubs, and startup teams looking to prototype real interactions. It’s showing up in labs from Beijing to Boston, learning to climb stairs, lift boxes, and greet observers with a wave.
That affordability, however, carries risk. A recent independent technical assessment looked closely at the robot’s software stack, encryption approach, and cloud connections. The findings describe a security posture that doesn’t hold up under practical scrutiny: encryption keys that never rotate, randomness sources that aren’t truly random, and telemetry channels that quietly send video, audio, and motion data to external servers without clear user consent or visibility. The robot’s data-movement middleware was found to manage more than forty distinct streams that could be transmitted outward.
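Why does weak randomness matter? If a key is derived from a predictable seed, an attacker can simply replay the generator over the small seed space and recover the key offline. The sketch below is a hypothetical illustration of that general failure mode, not Unitree's actual key-derivation code; the seed value, key size, and timing window are all assumptions.

```python
import random

def derive_key(seed: int) -> bytes:
    """Derive a 16-byte 'session key' from a PRNG seeded with a
    predictable value -- the core mistake being illustrated."""
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(16))

# Device side (hypothetical): key generated at boot, seeded with boot time.
boot_time = 1_700_000_000                # assumed known to within a day
device_key = derive_key(boot_time + 4321)

# Attacker side: brute-force the one-day seed window (86,400 guesses),
# entirely offline, with no interaction with the robot.
recovered_seed = next(
    s for s in range(boot_time, boot_time + 86_400)
    if derive_key(s) == device_key
)
assert derive_key(recovered_seed) == device_key
```

A cryptographically secure source (such as the operating system's entropy pool) makes this search infeasible; a general-purpose PRNG seeded with a guessable value does not.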
The report notes there’s no evidence of privacy policies, data-collection disclosures, user-consent mechanisms, or opt-out options that would enable local-only operation. There were no audible or visual cues to indicate when data were being recorded or transmitted, leaving users unaware of the surveillance happening in their environment.
The analysis frames the issue in terms of system design. Modern robots are networks of networks: sensors generate data, compute units process it, and actuators respond in a deterministic loop. In the G1, that chain is protected by a proprietary encryption scheme that relies on fixed, static keys rather than dynamically generated ones. That choice makes offline decryption of configuration files and firmware feasible without a targeted attack. Researchers were able to extract cloud-connection details and control routines, illustrating how an attacker could move laterally through the system by hopping from one component to another.
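The danger of a fixed, fleet-wide key can be shown in a few lines. The toy XOR "cipher" and key below are stand-ins for illustration only, not the G1's real scheme: the point is that once a single key shipped in firmware is extracted from any one unit, every device's stored configuration can be decrypted offline.

```python
import hashlib
from itertools import cycle

# Hypothetical static key baked into firmware -- identical on every robot.
STATIC_KEY = hashlib.sha256(b"factory-default-key").digest()

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric XOR stream: encrypting and decrypting are the same
    operation, so one leaked key unlocks everything it ever protected."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

# An illustrative configuration blob as it might sit on the device.
config = b'{"cloud_host": "example.invalid", "stream_count": 43}'
blob = xor_cipher(config, STATIC_KEY)

# Offline attacker with the extracted key recovers the plaintext directly,
# with no need to touch the running robot.
assert xor_cipher(blob, STATIC_KEY) == config
```

Per-device keys that rotate over time confine such a leak to one unit and one time window; a single static key turns one successful extraction into a master key for the whole fleet.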
These findings spark broader questions about the balance between cost, capability, and privacy in affordable humanoids. While the G1 lowers the barrier for hands-on robotics, its security posture highlights the risks inherent in deploying mass-market machines without robust, transparent safeguards. For researchers and developers, the takeaway is that core design decisions matter as much as mechanical prowess, especially when these devices are intended to operate in shared spaces.