Silicon Valley startup Sabi unveils brain-reading beanie to decode human thoughts

What if you could type by thinking? Silicon Valley startup Sabi is betting on that future with a brain-reading beanie designed to turn a wearer’s unspoken words into text on a screen. It’s an ambitious take on brain-computer interfaces (BCIs), aiming to make mind-to-machine communication non-invasive, wearable, and everyday.

How Sabi’s “thinking cap” works

Instead of surgical implants, Sabi’s approach relies on a dense matrix of miniature EEG sensors woven directly into fabric. The company says the beanie will pack between 70,000 and 100,000 electrodes, vastly increasing the surface-level signal coverage compared to conventional EEG caps. That density, Sabi claims, helps capture more informative neural patterns associated with internal speech.

“Given that high-density sensing, it pinpoints exactly what and where neural activity is happening. We use that information to get much more reliable data to decode what a person is thinking,” CEO Rahul Chhabra said.

Those signals are fed into machine-learning models trained to map patterns of brain activity to words the wearer is silently articulating. The goal is an end-to-end system: imagine a phrase internally, see it appear as text—no keystrokes required.
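Sabi has not published its model details, but the basic idea of mapping neural feature vectors to a small vocabulary of words can be sketched with a deliberately toy approach. Everything here is hypothetical: the words, the 16-dimensional features, and the nearest-centroid "decoder" stand in for whatever representation and model the company actually uses.

```python
import numpy as np

# Hypothetical sketch (not Sabi's actual model): treat each silently
# articulated word as a class, average feature vectors per class into a
# template ("centroid"), and decode new signals by nearest-centroid match.

rng = np.random.default_rng(0)

WORDS = ["yes", "no", "stop"]
DIM = 16  # toy feature dimension; real EEG features are far richer

# Simulated training data: each word's signals cluster around a prototype.
prototypes = {w: rng.normal(size=DIM) for w in WORDS}
train = {w: prototypes[w] + 0.1 * rng.normal(size=(20, DIM)) for w in WORDS}

# "Training" here is just computing one centroid per word.
centroids = {w: x.mean(axis=0) for w, x in train.items()}

def decode(signal):
    """Return the word whose centroid is closest to the feature vector."""
    return min(centroids, key=lambda w: np.linalg.norm(signal - centroids[w]))

# A new noisy observation of "no" decodes correctly because the
# prototypes are well separated relative to the noise.
obs = prototypes["no"] + 0.1 * rng.normal(size=DIM)
print(decode(obs))
```

Real systems replace the centroid lookup with deep networks and continuous decoding, but the shape of the problem is the same: signal in, word out.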

Design and timeline

Sabi’s first product is a winter-hat-style beanie slated for late 2026. The company is also developing a more streamlined baseball cap that trades cold-weather aesthetics for an everyday look. The common thread is accessibility: a BCI you can put on and take off like any other piece of clothing.

That non-invasive approach is core to Sabi’s thesis. Vinod Khosla, founder of Khosla Ventures and one of Sabi’s investors, put it plainly: “The biggest and baddest application of BCI is if you can talk to your computer by thinking about it. If you’re going to have a billion people use BCI for access to their computers every day, it can’t be invasive.”

Performance targets

At launch, Sabi is targeting an initial typing speed of roughly 30 words per minute. That’s slower than most people type on a keyboard, but the company expects improvements as users spend more time with the system and as the models adapt to individual neural signatures. As with speech-to-text and predictive typing, personalization and training data typically drive steady gains.

Why non-invasive is hard

Reading brain signals from the scalp is notoriously challenging. Skin, skull, and other tissue attenuate and blur the electrical activity generated by neurons, making it harder to isolate specific patterns. Surgically implanted electrodes can pick up much stronger, cleaner signals—which is why implant-based BCIs have historically led on speed and accuracy.

Sabi’s bet is that sensor density plus better algorithms can close much of that gap without the risks and barriers of surgery. Still, the company acknowledges that not every user’s signals will be equally easy to detect, and real-world performance will depend on both hardware coverage and the robustness of the decoding models.

The AI behind the beanie

BCIs live or die by their machine-learning stack. To power its interface, Sabi says it has trained a “brain foundation model” on 100,000 hours of neural data collected from 100 volunteers. The idea mirrors trends in AI elsewhere: train large, generalized models that can then be adapted to new users with comparatively little data.

In practice, the model attempts to translate patterns correlated with internal speech into words and sentences in real time. The company describes a pipeline that blends high-density sensing with probabilistic language modeling—so the system doesn’t just decode a signal; it also constrains outputs to likely words and phrases, much like modern speech recognition.
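The combination of per-signal word scores with a language prior can be illustrated with a small dynamic-programming search. This is a generic speech-recognition-style technique, not Sabi's disclosed pipeline; the three-word vocabulary, the per-step "neural" probabilities, and the bigram prior are all invented for the example.

```python
import math

# Toy illustration of constrained decoding: combine per-position
# P(word | brain signal) with a bigram language prior P(word | prev),
# and find the sequence maximizing the joint log-probability (Viterbi).

VOCAB = ["hello", "world", "there"]

# Per-position neural evidence for two decoded positions (made up).
neural = [
    {"hello": 0.6, "world": 0.2, "there": 0.2},
    {"hello": 0.1, "world": 0.5, "there": 0.4},
]

# Bigram prior; "<s>" marks sentence start (also made up).
bigram = {
    ("<s>", "hello"): 0.8, ("<s>", "world"): 0.1, ("<s>", "there"): 0.1,
    ("hello", "world"): 0.7, ("hello", "there"): 0.2, ("hello", "hello"): 0.1,
    ("world", "hello"): 0.3, ("world", "there"): 0.4, ("world", "world"): 0.3,
    ("there", "hello"): 0.5, ("there", "world"): 0.3, ("there", "there"): 0.2,
}

def decode(neural_scores, prior):
    """Viterbi search: for each word, keep the best-scoring sequence
    ending in that word, then extend it one position at a time."""
    best = {w: (math.log(neural_scores[0][w]) + math.log(prior[("<s>", w)]), [w])
            for w in VOCAB}
    for step in neural_scores[1:]:
        nxt = {}
        for w in VOCAB:
            lp, path = max(
                (plp + math.log(prior[(p, w)]) + math.log(step[w]), ppath)
                for p, (plp, ppath) in best.items()
            )
            nxt[w] = (lp, path + [w])
        best = nxt
    return max(best.values())[1]

print(decode(neural, bigram))  # -> ['hello', 'world']
```

The prior does real work here: a noisy second-position signal is pulled toward "world" because "hello world" is far more likely under the language model than the alternatives.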

What to watch next

Key milestones to track before the 2026 debut include third-party demos, real-world typing benchmarks, and details on comfort and battery life for full-day wear. If Sabi can deliver reliable, non-invasive internal-speech decoding in a beanie or cap, it would mark a notable step toward practical consumer BCIs, opening new options for hands-free computing and accessibility tools while sidestepping the invasiveness of implants.

Whether dense-fabric EEG and large-scale AI training can overcome the fundamental physics of scalp sensing is the central question. For now, Sabi’s blend of ambitious hardware and foundation-model AI sets an aggressive but intriguing course for mind-to-text technology.
