

A $500 robot just did what billion-dollar AI demos couldn’t. Reachy Mini shows us why: one head tilt, one blink, one giant leap for lovable machines.
First appeared in Essentialist Edge · September 5, 2025.
“Tiny price, small size, huge possibilities.”
That’s how Hugging Face introduced Reachy Mini in June 2025. Day one: $500,000 in preorders. Fully open-source. A desktop robot you can build and program yourself.
The real story, though, isn’t the sales figure. It’s what Reachy signals: we’re redesigning AI beyond IQ, into taste, tactility and trust.
Charm beats charts.
While billion-dollar humanoids make headlines for folding laundry or climbing stairs, Reachy Mini quietly did something more radical: it made AI feel lovable, approachable, safe enough to sit on our shelf.
If you grew up on The Jetsons like I did, tiptoeing out of bed at 7:00 a.m. to flip on the screen while your parents were still asleep, this hits different.
Reachy isn’t just a robot. It’s the feeling we’ve been chasing for decades. Small enough for a desk. Affordable enough for a student. Open enough to spark a movement.
And for builders, that’s the point: Reachy Mini isn’t a launch. It’s a product thesis in motion. A proof that in the embodied AI era, the interface is behavior. It distills five principles that will define how we build AI from here.
The timing isn’t accidental. In 2023, 65.7% of newly released AI models were open source, roughly double the share from the year before. Reachy Mini rides that same wave: embodiment, openness and emotional resonance matter more than raw IQ points.
This is the real quest of our era: What does it mean to build AI that cohabits with us, not just serves us? Products need to blend in effortlessly. They must feel native, not bolted on.
For AI-native builders, this isn’t a shift at all. It’s the water they already swim in. Reachy doesn’t signal the future anymore. It shows it’s already here.
So let me ask: are you keeping pace or helping shape what comes next in this radically new world?
Artificial intelligence is moving into a new phase where systems must feel intuitive, trustworthy and natural to interact with. Reachy Mini’s $500K day-one success proves the point. Embodiment, openness and emotional resonance create momentum that specs alone cannot.
We’re entering an embodied AI era where charm beats charts and robots win hearts before they win market share. Given the breakneck pace, the focus is shifting from “what AI can do” to “how it belongs in our lives”. That includes not only work but also daily environments — on a desk, in a classroom, beside a human. We will judge it by whether it feels native and trustworthy.
1. Delight = Moat
Micro-interactions like Reachy’s tilt or blink build bonds that raw performance never achieves. Emotional resonance is sticky.
2. Intent = Interface
Interfaces are shifting from screens to behaviors. Gestures, tone and subtle signals become the language of interaction. Reachy shows intent through expression rather than menus.
3. DIY Assembly = User Ownership
Users who assemble their robot feel a sense of authorship. Kits transform customers into co-creators who experiment, tinker and advocate. Empowered users become loyal evangelists.
After all, who hasn’t felt oddly proud of an IKEA shelf? So imagine how it feels when the flatpack blinks back at you.
4. Platforms > Products
Closed systems shine on launch day. Open platforms compound every day after, because users extend and remix them. Reachy’s modular design and direct link to Hugging Face’s 1.7M+ models amplify this effect.
5. Open Source + Embodiment = Real Trust
Open source is a launchpad, not a license.
Every user becomes a builder, every builder your R&D. Momentum compounds when the community tinkers and shares. But momentum alone isn’t enough. Embodiment makes AI real. Code becomes character and behavior becomes presence. Trust comes less from what AI does, more from how it welcomes the human in the loop.
And just like that… the most loved AI becomes the most used AI.
The next AI moat will go beyond intelligence. The differentiation will be whether systems feel native and trustworthy enough for us to welcome them into our daily lives. Reachy shows the way. And it’s already here.
“Don’t explain your philosophy. Embody it.”
— Epictetus
The original Reachy was born at Pollen Robotics, launched at CES 2020. Quirky, lovable and priced around $70,000 with ~100 units sold globally.
Fast forward to April 2025: Hugging Face acquired Pollen Robotics and made a brilliant move. They shrunk it to desktop size, kept the charm and dropped the price by 99.5%.
Voilà: from lab-only to laptop-friendly. From high-end research tool to something you can set on your kitchen table.
And in doing so Hugging Face didn’t just democratize machine learning tools. They embodied the most lovable one: Reachy Mini.
A programmable robot that tilts its head, blinks with intent and costs less than the smartphone in your pocket while arguably being smarter in how it connects.
This is where robotics stops mimicking humans… and starts meeting them.
Builder takeaway: Shrinking a product isn’t dilution, it’s distribution.
What looks like a small desktop robot is really a signal of irreversible momentum and real democratization. The global service robotics market is projected to grow from about $47 billion in 2023 to $108 billion by 2030. Sit with that for a second. This may end up as one of the biggest industries in history and the wave is compounding faster than most people realize.
Even the conservative forecasts point to a market that compounds fast. The numbers vary, but the trajectory is unmistakable. The takeaway is simple: we are not watching a niche gadget space. We are watching one of the biggest industries of the century take shape in real time.
Reachy Mini collapses cost and scale while making accessibility real. It shows that embodied AI is no longer a lab demo but a consumer-ready movement.
This isn’t just cute. It marks a seismic shift in how robotics is built and shared. Openness, accessibility and hands-on creativity are no longer aspirations. They are front and center. Now anyone, regardless of background, can build with it.
And that is the point: not just the robot, but what it enables.
I have always believed in the power of open-source AI. Hugging Face’s thesis leans on shifting power away from a handful of AI giants and handing tools to the rest of us. Yes, this creates risks (a topic for another day) but it also unleashes remarkable momentum. It’s the path through which AI will thrive.
This isn’t a gimmick. It is part of Hugging Face’s larger bet:
Robotics and AI should be built with people not merely delivered to them.
Reachy Mini lives at the intersection of: delight, embodiment, open source and real-world AI usability.
Remi Fabre, a robotics engineer from Pollen Robotics, sits across from Reachy, talking to it. Not prompting. Not debugging. Just talking. In his eyes you can see the moment mind and emotion begin to connect. Almost like human-to-human interaction.
It blinked.
It tilted.
It felt like something real.
That was the moment it became clear. The value is no longer in intelligence alone but in the experience of interacting with it.
So how do you design for that? In my view, there are five lessons that will shape the AI-native era. Consider them a mindset rather than rigid doctrine. A lens for building AI that feels right.
1. Delight = Moat
Most robots still feel like they belong in a lab. Reachy feels alive.
It is not the model size or inference speed. It is the head tilt when it is curious, the antenna wiggle when it is “thinking.” Micro-interactions like these create bonds that raw performance never could.
Reachy Mini does not just function. It feels good to be around. And that matters more than most builders admit.
Its DIY kit, beginner-friendly programming (Python now, JS and Scratch soon) and ready-to-go demos lower the floor. But what lifts it is personality: calm, playful, human-adjacent.
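To make that low floor concrete, here is a hypothetical sketch of what a first script for a desktop robot like this could look like. The `ReachyMini` class and every method on it are invented for illustration (this is not the real SDK); a mock that logs gestures stands in for the hardware.

```python
# Hypothetical sketch, NOT the actual Reachy Mini SDK. A mock robot
# records gestures instead of driving motors, so beginners can see the
# shape of the code before touching hardware.

class ReachyMini:
    """Mock desktop robot: logs each gesture it is asked to perform."""

    def __init__(self):
        self.log = []

    def tilt_head(self, degrees):
        self.log.append(f"tilt {degrees}")

    def blink(self):
        self.log.append("blink")

    def wiggle_antennas(self):
        self.log.append("wiggle")


def act_curious(robot):
    """Compose micro-interactions into one 'curious' behavior."""
    robot.tilt_head(15)      # lean in, as if listening
    robot.blink()            # a beat of acknowledgement
    robot.wiggle_antennas()  # the 'thinking' signal


if __name__ == "__main__":
    bot = ReachyMini()
    act_curious(bot)
    print(bot.log)  # ['tilt 15', 'blink', 'wiggle']
```

The point of a sketch like this is the ergonomics: three readable lines produce a recognizable personality beat, which is exactly the floor a beginner needs.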
And here is the question I come back to across every product I have worked on:
What emotions does this evoke?
That question belongs in the room whether you are defining the problem, sketching the interface or debating launch priorities.
The Principle
Emotional resonance beats raw intelligence. Reachy Mini proves that delight is not decoration. It is the moat.
Key Question
What emotion does your product evoke and is that feeling strong enough to make someone return or tell a friend?
2. Intent = Interface
Forget screens for a moment. The real interface now is gestures, glances and tone.
Reachy doesn’t hand you buttons to tap or menus to browse. It nods to confirm. It tilts to show doubt. It behaves.
That is not a robotics gimmick. It is a preview of post-screen interfaces. Frontier AI products are moving away from static dashboards and toward natural, embodied and emotionally resonant interactions, from pixels to presence.
In the AI-native era, the interface is no longer a control panel. It embodies your intent. It adapts to your behavior. It turns invisible signals into intuitive exchange.
Reachy Mini nails this through modular, expressive design. Stuffed-animal sized. Playful. Relatable. But also endlessly hackable: designed to be upgraded, personalized and reimagined.
That flexibility is not just fun. It makes the product yours.
The best AI-native products blur the line between action and intent. They respond to what you mean, not just what you click. It is not just how you interact with Reachy. It is how Reachy interacts and melds with you.
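A toy way to picture behavior as the interface: a lookup from what the system means to what the body does. The intent names and gestures below are my own illustrative assumptions, not Reachy’s actual vocabulary.

```python
# Illustrative sketch: a screenless "interface" as a mapping from
# conversational intent to expressive gestures. Intent and gesture
# names are assumptions for this example only.

GESTURES = {
    "confirm":  ["nod"],
    "doubt":    ["tilt_head", "pause"],
    "thinking": ["wiggle_antennas"],
    "greet":    ["tilt_head", "blink"],
}

def express(intent):
    """Translate what the system means into what the body does."""
    # Fall back to a neutral blink rather than a frozen stare,
    # so the robot always behaves, even on unknown input.
    return GESTURES.get(intent, ["blink"])

print(express("doubt"))    # ['tilt_head', 'pause']
print(express("unknown"))  # ['blink']
```

Notice what replaced the error dialog: even the failure mode is a behavior.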
The Principle
Do not just build dashboards. Design behavior.
Key Question
If your product had no screen, how would it express intent?
3. DIY Assembly = User Ownership
Hugging Face could have shipped polished, ready-to-use robots. Instead they shipped DIY kits. Why?
Because users who build it do more than use it. They understand it. They become co-creators rather than customers.
They form the feedback loops. They tinker. They modify. They evangelize because they had a hand in shaping it. Invite users to co-build and you invite them to care.
Reachy Mini is not locked in a lab or hidden behind glass at a demo booth. It is designed for the messy, unpredictable real world: homes, classrooms and offices.
And it is priced for access, not prestige: $299–$449 instead of the $2,000–$70,000 range of research-grade robots.
That choice is not symbolic. It is structural. Hugging Face made a deliberate bet by baking openness and user agency into the product’s foundation.
What would make Reachy Mini fail? If it were too hard to program. Too generic. Too lifeless. Programmable alone is not enough. Lovable, relatable, intuitive, expressive. That is what makes people stay.
This is why working backwards from real-world goals and outcomes matters. It is how you avoid building something that is usable but unwanted.
The Principle
Turn users into builders. Assembly creates ownership.
Key Question
Where could you let your users assemble, adapt or extend your product, and how would that change the way they feel about it?
4. Platforms > Products
Reachy Mini is more than a robot. It is a robotics platform disguised as a toy.
Imagine what 10 million developers will make it do tomorrow.
Here’s the deeper pattern:
Open source is no longer just about sharing models. It’s about sharing behaviors, interactions, embodied experience.
When Hugging Face open-sourced the hardware, the software and the training playground, they didn’t just hand over tools. They handed over agency to remix, to rebuild, to invent new use cases nobody had imagined yet.
Reachy is now part of the largest open AI dev ecosystem in the world. That means every update and experiment adds momentum. Platforms grow because users make them smarter, lower the cost of experimentation and accelerate iteration. Product direction turns into a community conversation.
Open source isn’t just a dev philosophy. It’s becoming a strategic lever on the global stage. Look at China, which treats open-source models as the fastest way to close the gap with the US. Europe has ground to cover, but its collaborative culture could set benchmarks for responsible embodied AI.
Both Unitree and Reachy (by Hugging Face / Pollen Robotics) champion open-source robotics. But Reachy goes further by opening the entire build stack, hardware and software. Unitree supports open robot control and development tools but keeps its hardware designs closed.
The Principle
Build for the use cases you haven’t imagined yet.
Closed products shine on launch day. Open platforms compound in value as others build on top of them. They improve every single day while you sleep.
Key Question
Where can your product become a playground? What would change if you gave others permission to build on top of it?
5. Open Source + Embodiment = Real Trust
When you open the platform, you multiply the pace. Every user becomes a builder. Every builder becomes your R&D. Here’s where the math gets wild:
Stanford reports that 149 foundation models launched in 2023, more than double the number from 2022.
And nearly 66% of them were open source.
Closed robotics platforms die slow deaths. Open ones explode. Open-sourcing everything gives Reachy instant leverage: every improvement is shared, every user a multiplier.
As Ethan Mollick put it:
“The genie is out of the bottle… Open-source models create competition, lower costs, and make innovation possible at the edge.”
This is more than strategy. It is how compounding systems behave.
Platforms like Reachy compound because they grow in the wild. Reachy Mini connects directly to the Hugging Face Hub, making cutting-edge AI models usable in physical form.
Developers can build, test, deploy and remix behaviors in simulation or on the real robot, then instantly share them with the world. That’s a radically different loop from the old model of closed R&D.
It’s also why Reachy is built as a modular generalist, not a single-function tool. You architect for the core use case but leave APIs and extension points wide open for surprise hits from the community.
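One minimal way to leave extension points open, sketched here with hypothetical names (this is not Reachy’s actual API): a registry where community-contributed behaviors plug into the core dispatch loop, so the platform can run code its makers never wrote.

```python
# Hypothetical extension-point sketch: a behavior registry. All names
# are invented for illustration; no real Reachy API is shown here.

BEHAVIORS = {}

def behavior(name):
    """Decorator: register a community-contributed behavior by name."""
    def register(fn):
        BEHAVIORS[name] = fn
        return fn
    return register

@behavior("greet")
def greet():
    return ["tilt_head", "blink"]

@behavior("celebrate")  # a surprise hit the core team never planned for
def celebrate():
    return ["wiggle_antennas", "wiggle_antennas"]

def run(name):
    """Core loop dispatches to whatever the community has registered."""
    return BEHAVIORS[name]()

print(sorted(BEHAVIORS))  # ['celebrate', 'greet']
print(run("celebrate"))   # ['wiggle_antennas', 'wiggle_antennas']
```

The core team owns `run`; everyone else owns what gets registered. That inversion is what turns a product into a playground.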
Projects like Open X-Embodiment take this even further: they pool robotic experience from arms, humanoids, quadrupeds and more into open datasets that let anyone train embodied agents from a collective memory.
Large-scale, general-purpose systems trained on diverse data often outperform narrow, specialized ones. That shift is already under way.
The Principle
Your community is your R&D department. Act accordingly.
Key Question
Where could your product act more like a platform, inviting others to extend, remix or reinvent it in ways you wouldn’t have imagined?
Reachy Mini proves that AI will not win on intelligence alone. It will win because it feels right to interact with.
The data supports this shift. In 2023, AI models surpassed human performance on tasks like reading comprehension and visual reasoning. But 57% of people still expect AI to disrupt their jobs in the next 5 years.
The missing piece? Trust. And increasingly, that trust comes through embodiment. We’re entering the embodied AI era. Not just chatbots, but AI you can see, touch and trust.
And the companies that figure out the emotional layer, the cute head tilts, the antenna wiggles, will own the next decade. That goes beyond interfaces; it’s a whole different game. And Reachy has just made the first move.
One where charm beats charts and robots win hearts before market share.
Embodiment makes AI real. It turns code into character, behavior into interface.
What we remember isn’t intelligence. It is presence. Maya Angelou said it best:
“People will forget what you said, people will forget what you did, but people will never forget how you made them feel.”
She was speaking about people, yet the same truth now applies to the machines we choose to coexist with.
Reachy made us feel something. A blink, a tilt, a spark of connection. That is how the next era of AI begins. Not with IQ, but with presence that earns trust.
End of Line.
{ Next release loading… }
Stay essential,
Nihal
P.S. If you are experimenting with Reachy or building on embodied AI, I’d love to see what you’re creating. The weirdest one gets featured first. I promise.
P.P.S. Join the Reachy Mini Community here.
Written with curiosity and maybe a little Jetsons nostalgia for the future we are finally building.
