Reachy Mini
Desktop robotics
Reachy Mini is the open-source desktop humanoid built by Pollen Robotics and acquired by Hugging Face in April 2025. Petronella Technology Group operates one in our Raleigh lab as the entry point into our sovereign robotics prototyping practice for defense, research, and healthcare clients.
What Reachy Mini is, and where it came from
Reachy Mini is an open-source, expressive desktop humanoid robot designed for AI experimentation, human-robot interaction research, and education. It is small enough to fit on a developer's desk at roughly 28 centimetres tall in active mode and 23 centimetres tall when sleeping, and it weighs about 1.5 kilograms. It does not have arms, hands, or legs. The body holds a head with six degrees of freedom, two animated antennas, a wide-angle camera, four microphones, a 5 watt speaker, and a base that rotates the entire torso around the vertical axis. The robot is sold in two variants. Reachy Mini Lite, at $299, depends on a Mac or Linux host and ships without onboard wireless. Reachy Mini, the wireless variant, ships at $449 with an onboard Raspberry Pi 4, Wi-Fi, an accelerometer, and a battery for untethered operation. Both variants ship within roughly 90 days of order. For fleet purchases of ten or more units, contact Petronella Technology Group directly at our contact page or call (919) 348-4912 and we will scope the procurement, supporting compute, and security posture with you.
The robot was designed by Pollen Robotics, a French open-source robotics company founded in 2016 by Matthieu Lapeyre and Pierre Rouanet, both former researchers at the French national digital science institute Inria. Before Reachy Mini, Pollen built the full-size Reachy 2 humanoid, a $70,000 research-grade platform with seven-degree-of-freedom Orbita-joint arms, an omnidirectional mobile base, LiDAR, and VR teleoperation, deployed at Cornell University, Carnegie Mellon University, Accenture, the French Atomic Energy Commission, and the French National Centre for Scientific Research, among other institutions. Pollen described its mission, since founding, as making research-grade robots open and affordable.
On April 14, 2025, Hugging Face acquired Pollen Robotics. The deal made Hugging Face, already the dominant open community for AI models and datasets, the new home of one of the most credible open-source robotics platforms in the world. Hugging Face co-founder Thomas Wolf framed the rationale this way: "We believe robotics could be the next frontier unlocked by AI, and it should be open, affordable, and private." Reachy Mini was announced as a Hugging Face product on July 9, 2025, and is positioned as the entry-level open hardware that pairs with the LeRobot open-source robotics framework. Reachy Mini and LeRobot together are how Hugging Face is recapitulating, for robotics, the open-model and open-dataset playbook that reshaped natural language processing.
Petronella Technology Group operates a Reachy Mini in our Raleigh, North Carolina lab. We bought it because it is the most credible, lowest-friction way today for a regulated-industry consultancy to climb the AI-driven robotics learning curve on real hardware, on premises, with no cloud dependency. Everything about the platform aligns with how we already work for our defense, research, and healthcare clients: open, inspectable, on-prem-capable, and small enough to run in a controlled enclave. This page documents what the hardware is, what we run alongside it, what we are exploring with it, and how to engage us if your team needs a robotics prototype with the same compliance posture you would expect from any other Petronella engagement.
Reachy Mini hardware specifications
The table below collects every published specification from the Hugging Face Reachy Mini launch announcement on July 9, 2025, retrieved May 2, 2026. Where a specification was not stated in the launch announcement, it is marked as being confirmed against Pollen Robotics documentation rather than guessed. Reachy Mini is open hardware: the software side is fully open-source and on GitHub today, and the CAD files were described as pending release at the launch announcement. We will update this page as additional values are confirmed against the Pollen Robotics product page.
| Spec | Reachy Mini Lite | Reachy Mini (Wireless) |
|---|---|---|
| Form factor | Desktop expressive humanoid: head plus rotating torso, no arms, no locomotion | Same |
| Height (active) | 11 inches / 28 centimetres | Same |
| Height (sleep) | 9 inches / 23 centimetres | Same |
| Width | 6.3 inches / 16 centimetres | Same |
| Weight | 3.3 pounds / 1.5 kilograms | Same |
| Head degrees of freedom | 6 | Same |
| Body rotation | Full rotation about the vertical axis | Same |
| Animated antennas | 2 | Same |
| Camera | 1 wide-angle camera (resolution and field of view being confirmed against Pollen Robotics documentation) | Same |
| Microphones | 4 | Same |
| Speaker | 5 watts | Same |
| IMU / accelerometer | Not present | Present |
| Onboard compute | None; pairs with a Mac or Linux host | Raspberry Pi 4 onboard (RAM SKU and storage being confirmed) |
| Windows host support | Announced as coming soon at launch, not at general availability | Same |
| Wi-Fi | No | Yes |
| Bluetooth | Specifications being confirmed against Pollen Robotics documentation | Same |
| Power | Wired only | Wired and battery (battery capacity and runtime being confirmed) |
| Connectivity ports | USB and charge port specifications being confirmed against Pollen Robotics documentation | Same |
| Servo motor model | Specifications being confirmed against Pollen Robotics documentation | Same |
| Bundled behaviours at launch | 15 or more | Same |
| Hardware open-source posture | Open hardware; CAD release was pending at launch | Same |
| Software open-source posture | Fully open-source on GitHub | Same |
| Default data privacy posture | No personal data stored, transmitted, or processed by default; camera and microphone use is fully user-controlled (per Hugging Face launch announcement) | Same |
| Price (USD, ex-tax, ex-shipping) | $299 | $449 |
| Stated delivery window at launch | Approximately 90 days | Same |
| Fleet procurement | Contact Petronella Technology Group or call (919) 348-4912 to scope a fleet purchase | Same |
A few things in that table are worth pulling out for engineering audiences. The 9-degree-of-freedom mechanical design (six in the head, two in the antennas, one in the body rotation) is a deliberate compromise: enough articulation for credible expressive interaction and head-motion-driven research, none of the safety, calibration, and certification burden that arms or legs would impose. This is what makes Reachy Mini useful as a teaching and research platform in environments, including classrooms and shared labs, where a full-body humanoid would not be welcome. The Wireless variant's onboard Pi 4 is sufficient for sensor capture, behaviour playback, and small inference workloads, but heavier policy training and large vision-language-action models run on a paired host or, in our case, on Petronella's owned datacenter GPU fleet. The data-privacy default ("no personal data stored, transmitted, or processed by default") is the language we wanted to see from a vendor that lands on a desk inside a regulated-industry client. Specifications still being confirmed against the Pollen Robotics product page or repository documentation will be updated here as soon as those confirmations land.
The software stack we run with Reachy Mini
Reachy Mini ships with a Python-first software stack designed for the Hugging Face open-source AI ecosystem. The four pieces that matter, in the order they show up in a real engagement, are LeRobot, the Pollen Robotics Reachy Mini Python SDK, the MuJoCo-based simulation environment, and the Hugging Face Spaces apps and datasets the community has already published.
LeRobot
LeRobot is the open-source "AI for real-world robotics" framework that Hugging Face launched in 2024 and that Pollen's lead robotics scientist Remi Cadene, formerly of the Tesla Optimus team, has driven since. By the April 2025 acquisition announcement, LeRobot had crossed 12,000 GitHub stars and grown into a hub of more than 100 ecosystem repositories, 39 robotics models, and 181 datasets on the Hugging Face Hub. LeRobot supports a range of real-world hardware platforms beyond Reachy, including the SO-101 and SO-100 arms from The Robot Studio, Koch v1.1, LeKiwi, Hope Jr, Reachy 2, Unitree G1, Earth Rover Mini, OMX, OpenArm, the Aloha bimanual platform from Trossen Robotics, and the ViperX arm. Camera support spans OpenCV-compatible USB and built-in cameras, ZMQ network cameras, Intel RealSense depth cameras, and Reachy 2 native cameras. The framework requires Python 3.12 or newer and PyTorch 2.10 or newer. It runs end to end on a self-hosted Linux and GPU stack, with no mandatory cloud component. That last property is the one that makes LeRobot fit our practice: it is the rare open robotics framework where training and inference can both stay inside a client's data plane.
Pollen Robotics Reachy Mini Python SDK
The Reachy Mini Python SDK is the canonical way to talk to the robot. It exposes head and antenna control, body rotation, the camera and microphone arrays, the 5-watt speaker, the bundled 15-plus behaviours that ship at launch, and the LeRobot integration. Pollen has announced JavaScript and Scratch language bindings as coming after the initial Python release. The SDK is open-source on the pollen-robotics/reachy_mini GitHub repository. ROS 2 bridge availability is being confirmed against the Pollen Robotics repository documentation; if a community ROS 2 bridge is published, we will document the integration on this page rather than asserting native ROS 2 support today.
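We have not pinned the SDK's published method names on this page yet, so the sketch below keeps the SDK out of the executable path: the pose-interpolation helper is plain Python and runs without the robot, while the SDK calls appear only as commented placeholders (`ReachyMini()` and `client.head.goto(...)` are our assumed names, not the verified API).

```python
import math

def ease_in_out(t: float) -> float:
    """Cosine ease: maps 0..1 to 0..1 with zero velocity at both ends."""
    return 0.5 - 0.5 * math.cos(math.pi * t)

def interpolate_pose(start, goal, steps):
    """Yield intermediate 6-DOF head poses (e.g. roll, pitch, yaw, x, y, z).

    Linear interpolation with an ease-in-out time profile, so the head
    accelerates and decelerates smoothly instead of snapping to the target.
    """
    for i in range(1, steps + 1):
        s = ease_in_out(i / steps)
        yield tuple(a + s * (b - a) for a, b in zip(start, goal))

# Hypothetical usage against an SDK client -- method names are assumptions:
# client = ReachyMini()                      # placeholder, not the verified constructor
# for pose in interpolate_pose(current, target, steps=50):
#     client.head.goto(pose)                 # assumed method name
#     time.sleep(0.02)                       # ~50 Hz command rate

waypoints = list(interpolate_pose((0, 0, 0, 0, 0, 0), (0, 0.2, 0.4, 0, 0, 0), 10))
print(waypoints[-1])  # final waypoint equals the goal pose
```

The interpolation helper is deliberately SDK-agnostic: once the canonical method names are confirmed against the pollen-robotics/reachy_mini repository, only the commented lines change.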
MuJoCo-based simulation
Pollen ships an open MuJoCo-based simulation environment for Reachy Mini. Simulation matters in our work for two distinct reasons. The first is the standard sim-to-real argument: it lets a research team iterate on policy code, behaviour design, and perception pipelines on a developer laptop or training cluster without occupying the physical robot. The second is the compliance argument: simulation is where a CMMC-aligned development workflow can run continuous integration, regression tests, and security review without ever touching the camera and microphone of the physical unit. We use the simulator as the default execution target during early scoping, and we move to real-hardware runs only after the corresponding policy or behaviour clears its safety review.
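A simulation-side CI gate can be as small as asserting that a behaviour's commanded trajectory stays inside joint limits before the behaviour is cleared for real-hardware runs. The sketch below is illustrative: the limit values and the behaviour are ours, not Pollen's published joint specifications.

```python
# Illustrative joint limits in radians -- not Pollen's published values.
HEAD_LIMITS = {"pitch": (-0.6, 0.6), "yaw": (-1.2, 1.2)}

def within_limits(trajectory):
    """CI regression gate: every commanded pose in a behaviour's trajectory
    must stay inside the configured joint limits."""
    for pose in trajectory:
        for joint, value in pose.items():
            lo, hi = HEAD_LIMITS[joint]
            if not lo <= value <= hi:
                return False
    return True

# A toy "nod" behaviour: two pitch targets, yaw held at zero.
nod = [{"pitch": -0.3, "yaw": 0.0}, {"pitch": 0.3, "yaw": 0.0}]
print(within_limits(nod))  # True
```

In practice this check runs against trajectories replayed in the MuJoCo simulator, so the physical unit's camera and microphone never enter the CI loop.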
Hugging Face Spaces, datasets, and models
The Pollen Robotics organization on Hugging Face publishes 18 Spaces apps, of which at least five are explicitly Reachy Mini-named (Reachy Mini, Reachy Mini Realtime URL, Reachy Mini Conversation App, Reachy Mini Skins, Reachy Mini Debug and CI Testbench), 3 models (anyskin-slip-detection, act_reachy2_mobile_household_apple, act_reachy2_static_cup), and 15 datasets including the Reachy Mini official app store, the dances library, and the emotions library. The LeRobot organization adds another 39 models and 181 datasets, including the SmolVLA vision-language-action stack designed to run on consumer-grade GPUs. We pull these as starting points, fork them into our private Hugging Face Hub workspace, and only push back upstream when a client engagement explicitly green-lights it. Our default posture is on-prem mirror, not public publish.
What we run Reachy Mini on, and why it matters
Reachy Mini's onboard Pi 4 (Wireless variant) is sized for sensor capture, behaviour playback, and small inference. Anything heavier than that, including policy training, vision-language-action model fine-tuning, and large-scale teleoperation data ingest, runs off-robot on owned compute. For the regulated buyers we serve, off-robot does not mean public cloud. It means our owned datacenter GPU fleet inside our managed infrastructure footprint, which is the same fleet that runs the rest of our private AI workloads.
Owned training inventory
We train robotics policies, fine-tune small vision-language-action models, and run large data conversions on a tiered GPU fleet sourced through the NVIDIA Elite Partner Channel. The fleet spans NVIDIA DGX systems for the heaviest workloads, NVIDIA HGX-class servers for production inference, and NVIDIA RTX PRO 6000 Blackwell workstations for development, prototyping, and small-scale training. The Reachy Mini integration, in practice, looks like this: data captured on the robot in our Raleigh lab streams to a development workstation, gets cleaned and converted to LeRobot's dataset format, and is then either trained locally on the workstation tier for small policies or shipped over our internal network to a DGX or HGX node for larger jobs. Inference for the resulting policy can run on the robot's Pi 4, on a Linux host on the same desk, or on a server-class node depending on latency budget. The LeRobot SmolVLA paper documents the same pattern explicitly: "consumer-grade GPUs or CPUs" for inference, with a "remote GPU server for asynchronous inference" pattern as a supported deployment.
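The capture-clean-convert step above can be sketched without the LeRobot dependency. LeRobot's actual dataset API and schema differ, so treat the record layout below as an illustration of the pipeline's shape, not the framework's format.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One captured sensor frame from a recording session."""
    episode: int
    t: float        # capture timestamp, seconds
    joints: tuple   # head 6-DOF + 2 antennas + body rotation = 9 values
    ok: bool = True # set False by the cleaning pass

def to_flat_records(frames):
    """Clean and flatten captured frames into per-episode, frame-indexed
    records -- the general shape a LeRobot-style conversion needs.
    Field names here are illustrative, not LeRobot's real schema."""
    records, index_in_episode = [], {}
    for f in sorted(frames, key=lambda f: (f.episode, f.t)):
        if not f.ok:
            continue  # cleaning pass: drop frames flagged invalid
        i = index_in_episode.get(f.episode, 0)
        records.append({"episode_index": f.episode,
                        "frame_index": i,
                        "timestamp": f.t,
                        "observation.state": f.joints})
        index_in_episode[f.episode] = i + 1
    return records

raw = [Frame(0, 0.0, (0.0,) * 9),
       Frame(0, 0.1, (0.1,) * 9, ok=False),  # dropped by cleaning
       Frame(0, 0.2, (0.2,) * 9)]
print(len(to_flat_records(raw)))  # 2
```

The real conversion adds camera frames and action targets, but the invariant is the same: sorted, gap-free frame indices per episode on owned storage before anything ships to a training node.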
If you are evaluating which compute to bring to a robotics prototype, our enterprise AI workstations page documents the same RTX PRO 6000 Blackwell, Threadripper PRO, Xeon W, and EPYC platforms that we pair with Reachy Mini for training-side work. Larger projects, including ones that need multi-GPU NVLink for vision-language-action model training, route through our private AI infrastructure stack.
Onboard inference path
Wireless Reachy Mini runs onboard inference on the Raspberry Pi 4. That budget is enough for the bundled 15-plus behaviours, for keyword-driven conversation flows, and for distilled small models compiled for ARM. When a workload exceeds the Pi 4's envelope, we fall back to the asynchronous-remote-inference pattern: the robot streams sensor frames over the local network to a paired Linux host or workstation, the host runs the policy, and the action stream returns over the same channel. We keep all of that traffic inside the client's network, the client's lab segment, or our own controlled enclave. Public-internet inference is not part of our default architecture.
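A minimal threaded sketch of that asynchronous remote-inference pattern, with the network hop replaced by in-process queues so it runs anywhere; in the lab deployment, the two queues are channels on the segmented lab network and `policy` is a trained model on the host.

```python
import queue
import threading

def policy(frame):
    """Stand-in for the host-side policy; a real deployment runs a trained
    model here on the paired Linux host or workstation."""
    return {"head_yaw": 0.01 * frame["seq"]}

def host_worker(frames_q, actions_q):
    """Host side: pull sensor frames, run the policy, push actions back."""
    while True:
        frame = frames_q.get()
        if frame is None:  # shutdown sentinel
            break
        actions_q.put(policy(frame))

frames_q, actions_q = queue.Queue(), queue.Queue()
host = threading.Thread(target=host_worker, args=(frames_q, actions_q))
host.start()

# Robot side: stream sensor frames; actions come back asynchronously.
for seq in range(3):
    frames_q.put({"seq": seq})
frames_q.put(None)
host.join()

actions = [actions_q.get() for _ in range(3)]
print(actions[-1])  # {'head_yaw': 0.02}
```

The decoupling is the point: the robot keeps streaming at sensor rate while the host answers at whatever rate the policy allows, which is the same asynchronous-inference pattern the SmolVLA work describes.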
Why owned, not cloud
The reason this matters, and the reason it is the wedge that earns us the right to do this work for regulated-industry clients, is that public-cloud robotics is a non-starter in the markets we serve. A defense contractor handling Controlled Unclassified Information cannot stream Reachy Mini sensor data into a tenant they do not own. A university research group with a National Institutes of Health-funded protocol cannot push human-subject video into a third-party model API without rewriting the protocol. A healthcare research group with Health Insurance Portability and Accountability Act-aligned data on the bench cannot route it through a vendor that has not signed a Business Associate Agreement. Owned compute, on premises, with a known provenance chain, is what makes a robotics prototype shippable in those rooms. That is what we built our cybersecurity practice on for twenty-three years, and it is the foundation we are extending into the robotics overlay now.
Use cases we are exploring with Reachy Mini
Robotics is a new practice area for our team, so the language in this section is deliberately precise. The use cases below are projects we are exploring, not delivered engagements. Where we have shipped code, datasets, or merged pull requests, we will say so on the page that documents the work. Until then, this is a roadmap of where the platform fits inside our existing client base.
Conversational AI prototyping
Pairing Reachy Mini's microphone array, speaker, and animated antennas with private speech-to-text and large language model inference on our owned cluster. The objective is a desktop demo that a defense or research client can hold a one-on-one conversation with, where the entire transcript and audio path stays inside the client's data plane.
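The transcript-stays-local constraint shapes the code more than any model choice does. The skeleton below keeps every stage an in-process call; the speech-to-text and LLM stubs stand in for on-prem models, and the function names and signatures are ours, not any vendor's API.

```python
def transcribe(audio_chunk: bytes) -> str:
    """Stub for an on-prem speech-to-text model; in a real build this calls
    a locally hosted model, so audio never leaves the host."""
    return audio_chunk.decode("utf-8", errors="ignore")

def generate_reply(transcript: str, history: list) -> str:
    """Stub for a locally hosted LLM; the real call targets an inference
    server inside the client's data plane, never a public API."""
    history.append(("user", transcript))
    reply = f"You said: {transcript}"
    history.append(("assistant", reply))
    return reply

def converse(audio_chunks):
    """One conversation turn per captured audio chunk; the full transcript
    history stays in process memory on the host."""
    history = []
    replies = [generate_reply(transcribe(c), history) for c in audio_chunks]
    return replies, history

replies, history = converse([b"hello robot"])
print(replies[0])  # You said: hello robot
```

Swapping the stubs for real local models changes the two function bodies and nothing about the data plane, which is the property the demo exists to prove.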
Teleoperation studies
Driving Reachy Mini's head pose and body rotation from a remote operator session over a CMMC-aligned segmented network. Use cases include after-hours physical-presence demos, remote welcome desks for our own front lobby, and as a control prototype for downstream teleoperation work on larger platforms.
Manipulation perception research
Using Reachy Mini's wide-angle camera and an external arm (such as the SO-101 or Koch v1.1 from the LeRobot supported-platform list) as a perception-and-behaviour testbed for clients who eventually want to deploy on a larger arm. The Mini's cost makes it the right surface for the exploratory phase before a real-hardware commitment.
Education and demo builds
Hands-on demonstrations for engineering teams at the labs and primes we serve, showing what an open-source, AI-driven robot looks like in practice. Particularly useful for boards, executive briefings, and non-engineering stakeholders who need to see the technology before approving a research budget.
Behaviour authoring with LeRobot
Authoring new behaviours on top of the 15-plus that ship with the unit, recording datasets in LeRobot format on our owned storage, and contributing back to the open ecosystem when the client engagement permits public release. We track authorship and license provenance on every behaviour we author.
CMMC-aligned development workflow
Building out the development workflow itself: secrets handling for SDK keys, software bill of materials for every model and dataset pulled from the Hugging Face Hub, dependency provenance, and audit-log capture for every training run. This is engineering on the engineering process, and it is the half of the practice that twenty-three years of cybersecurity work earns us the right to do.
What we are not exploring with Reachy Mini, and will instead refer to vendors who own those capabilities, includes industrial cobot integration on a moving production line, autonomous-vehicle fleet work, operating-room surgical robotics, weapons-platform integration, and humanoid manufacturing at any scale. Reachy Mini is a desktop research platform, and our practice around it is sized to that reality.
Reachy Mini vs the alternatives we considered
Reachy Mini is one of three open-source-leaning humanoid platforms a research team will reasonably evaluate today, and one of several research-grade platforms in the LeRobot ecosystem. The table below sets the comparison honestly. The three platforms are not direct substitutes. Reachy Mini is a $299 to $449 desktop expressive head-and-torso unit. The Unitree G1 is a full-size bipedal humanoid in the $16,000-and-up range. Hello Robot's Stretch 3 is a research-grade mobile manipulator. They answer three different questions, and the right answer for a given client depends on the question being asked.
| Axis | Reachy Mini (Lite / Wireless) | Unitree G1 | Hello Robot Stretch 3 |
|---|---|---|---|
| Form factor | Desktop head plus rotating torso, no arms, no locomotion | Full-size bipedal humanoid, legs and arms | Mobile manipulator: telescoping arm on a wheeled mobile base |
| Height | 28 cm active / 23 cm sleep | Approximately 127 cm per general reporting; specs being confirmed against vendor sheet | Approximately 142 cm at full extension per vendor; specs being confirmed |
| Weight | 1.5 kg | Approximately 35 kg per general reporting; specs being confirmed | Approximately 24 kg per vendor; specs being confirmed |
| Total degrees of freedom | 9 (head 6 + antennas 2 + body rotation 1) | 23 base, 43 with dexterous hands per general reporting; specs being confirmed | Telescoping arm with multi-DOF wrist and gripper; specs being confirmed |
| Onboard compute | None on Lite / Raspberry Pi 4 on Wireless | Onboard SoC plus NPU per general reporting; specs being confirmed against vendor sheet | Onboard Linux compute with ROS 2; specs being confirmed |
| Software stack | LeRobot, Pollen Python SDK, Hugging Face Hub apps and datasets, MuJoCo simulation | Unitree native SDK; LeRobot lists Unitree G1 as a supported platform | Native ROS 2 stack and Stretch SDK |
| Price (USD, ex-tax, ex-shipping) | $299 Lite / $449 Wireless | $16,000 starter tier per general reporting; tiered pricing being confirmed against vendor sheet | Research pricing on inquiry; figure being confirmed against vendor sheet |
| Open-source posture | Software fully open; hardware open with CAD release pending | Closed hardware; SDK partially documented; LeRobot adds community openness on top | Hardware partially open with research-program licensing; software ROS 2 ecosystem |
| Best fit | Education, conversational AI research, perception experiments, teleop prototyping | Locomotion research, full-body control, professional humanoid R&D | Mobile manipulation, assistive robotics, in-home and rehabilitation research |
| In Petronella's lab | Yes, owned and operated in Raleigh | No, comparison from public spec data only | No, comparison from public spec data only |
The honest framing for a buyer: a $299 Reachy Mini Lite gets you onto LeRobot, lets you run perception and conversation experiments at low cost, and earns you a real entry into the open robotics ecosystem. It does not give you locomotion or arm manipulation. A Unitree G1 gives you locomotion and arm research at a $16,000-and-up price point with a different open-source posture. A Hello Robot Stretch 3 gives you mobile manipulation in a more clinical, ROS-2-native form factor. We chose Reachy Mini because it pairs the lowest cost of entry into LeRobot with a desk-friendly footprint that fits the early scoping conversations we have with the buyers we serve. If a client needs locomotion or arm reach, we will scope the engagement to the platform that fits, not force-fit Reachy Mini into a job it cannot do. Specifications still being confirmed against each vendor's published documentation will be updated here as they are confirmed; we link the canonical sources rather than reciting general-knowledge approximations as fact.
Security and data-handling posture for client prototypes
The reason a regulated-industry client buys a robotics prototype from us, rather than from a generalist robotics shop, is the security and data-handling overlay. Reachy Mini ships with a "no personal data stored, transmitted, or processed by default" posture (verbatim from the Hugging Face launch announcement), with camera and microphone use fully under user control. That is the default we want from a vendor, and it is the floor we build on. The CMMC-aligned development workflow we wrap around Reachy Mini for client engagements is documented below.
Practices we apply on every robotics engagement
- Aligned to NIST SP 800-171 and CMMC Level 2 as the baseline for any engagement that involves Controlled Unclassified Information or that supports a downstream Department of Defense contract. We are a CMMC-AB Registered Provider Organization, RPO #1449, verifiable at CyberAB. Our team holds CMMC-RP credentials.
- Aligned to NIST SP 800-218 SSDF for the development workflow itself: secrets handling, dependency provenance, code-signing, build reproducibility, and audit-log capture. Robotics codebases pull from public model and dataset hubs, which makes provenance non-negotiable.
- Software bill of materials per CISA SBOM guidance for every model, dataset, library, and container image pulled from the Hugging Face Hub or upstream package indexes. We track license, source URL, and date-of-pull on every artifact.
- Network segmentation for the robot's local network. Reachy Mini Wireless does not share a Wi-Fi segment with general-purpose office traffic. Sensor streams stay inside the robot enclave or get tunneled across explicit, logged paths.
- Default-deny outbound traffic from any host paired with the robot. Public-internet inference is not part of the default architecture. If a workload needs to reach an external service, the path gets explicit allow-listing and review.
- Data minimisation. We capture the minimum sensor data the workload requires, retain it for the minimum time needed for training and verification, and delete on schedule.
- HIPAA and Common Rule alignment for healthcare research engagements. Where a study touches protected health information or human-subjects data, the prototype operates under the client's existing IRB protocol, with our team covered by a Business Associate Agreement. We do not introduce a public-cloud surface that would require renegotiating the protocol.
- Authorship and license provenance on every model, behaviour, dataset, and dependency we ship with a prototype. Hugging Face Hub artifacts carry an explicit license; we honour them.
- Audit-log capture for every training run, every dataset import, every behaviour deployment. Logs are retained per the engagement's contractual retention rule.
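The per-artifact record behind the SBOM and provenance bullets above is deliberately simple. A minimal sketch follows; the field names are ours, not a CISA-mandated schema, and the Hub identifier shown is illustrative.

```python
from dataclasses import dataclass, asdict
from datetime import date

@dataclass(frozen=True)
class ArtifactRecord:
    """One pulled artifact (model, dataset, library, or container image).
    Field names are illustrative, not a standardized SBOM schema."""
    name: str
    kind: str         # "model" | "dataset" | "library" | "image"
    source_url: str
    license: str
    pulled_on: date
    sha256: str       # content hash for provenance verification

rec = ArtifactRecord(
    name="lerobot/smolvla_base",  # illustrative Hub identifier
    kind="model",
    source_url="https://huggingface.co/lerobot/smolvla_base",
    license="apache-2.0",
    pulled_on=date(2026, 5, 2),
    sha256="<hash of the downloaded snapshot>",
)
print(asdict(rec)["license"])  # apache-2.0
```

Frozen dataclasses keep the record immutable after capture, and `asdict` gives a serialisable form for the audit log; a production SBOM would export to a standard format such as SPDX or CycloneDX on top of records like this.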
None of this is exotic, and none of it is robotics-specific. It is the same engineering hygiene we apply to every cybersecurity, compliance, and private-AI engagement we run for our defense, research, and healthcare clients. The Reachy Mini overlay is the new thing; the security floor underneath it is twenty-three years old.
How to engage us on a Reachy Mini prototype
If you are reading this page because your team needs a robotics prototype on Reachy Mini hardware, with a Petronella security and compliance overlay, here is the path. We do not publish fixed pricing because the work is custom-quote by definition. The phases below are the same scoping pattern we run on every engagement.
Discovery call
Thirty to forty-five minutes. Tell us the use case, the regulatory context (CMMC level, HIPAA, IRB, ITAR awareness), the rough timeline, and who the decision-makers are. We tell you what we have learned with Reachy Mini so far and where the platform fits, or does not fit, your problem. Outcome: a written one-page scope memo.
Architecture sketch
One to two weeks. We produce a written architecture sketch that names the hardware (Reachy Mini Lite or Wireless plus optional LeRobot-supported peripherals), the compute (your fleet, our fleet, or hybrid), the data plane, the security overlay, and the deliverables. Outcome: a fixed-fee scoping document the right people on your side can sign.
Prototype build
Four to ten weeks, scoped to your problem. We build, in our Raleigh lab on our owned compute and the Reachy Mini we already operate, with weekly demo touch-points back to your team. We do not invoice for shadow work; everything we build is visible.
Handoff
One to two weeks. We deliver a hardware list (with the path to procure it through the Pollen Robotics direct channel for ten-or-more-unit orders), the code repository on your infrastructure, a written runbook, the security plan, and a recorded walkthrough. Optionally, we extend into a managed-prototype phase where we keep operating the prototype while your team learns the codebase.
The fastest path into the conversation is a phone call to (919) 348-4912 or a written scoping request through our contact form. We respond within one business hour during weekday business hours. We are not a high-pressure sales shop; if Reachy Mini is not the right platform for your problem, we will say so on the discovery call rather than after a contract.
Frequently asked questions
What does Reachy Mini cost, and what is included?
Reachy Mini Lite is $299 (USD, before tax and shipping) and pairs with a Mac or Linux host. Reachy Mini, the Wireless variant, is $449 and ships with an onboard Raspberry Pi 4, Wi-Fi, an accelerometer, and a battery for untethered operation. Both variants ship within roughly 90 days of order, and 10-or-more-unit orders are routed through Pollen Robotics direct sales. Petronella Technology Group does not resell the hardware; we operate one in our Raleigh lab and use it as the entry point into a custom-quoted prototyping engagement.
Is Reachy Mini the same product as Reachy 2?
No. Reachy 2 is Pollen Robotics' full-size humanoid: 7-degree-of-freedom Orbita-joint arms, an omnidirectional mobile base, LiDAR, VR teleoperation, at $70,000. Reachy Mini is a desktop expressive humanoid: head with 6 degrees of freedom, two antennas, body rotation, no arms, no locomotion, at $299 to $449. They share an open-source software lineage and the LeRobot integration, but they answer very different research questions. Reachy Mini fits desktop research; Reachy 2 fits full-body humanoid research at universities and corporate labs.
Can Reachy Mini run without an internet connection?
Yes. Reachy Mini ships with a "no personal data stored, transmitted, or processed by default" posture from Hugging Face. The Wireless variant runs onboard inference for the bundled 15-plus behaviours on the Raspberry Pi 4. Heavier inference can run on a paired Linux host or workstation on the same local network with no public-internet path. This is exactly the architecture our regulated-industry clients require, and it is one of the reasons we chose this platform.
Does Reachy Mini support ROS 2?
The Pollen Robotics SDK is Python-first. The Hugging Face Reachy Mini launch announcement does not state native ROS 2 support, and ROS 2 bridge availability is being confirmed against the Pollen Robotics repository documentation. If a community ROS 2 bridge is published or already shipping in the pollen-robotics/reachy_mini repository, we will document it here. For projects that need a robust ROS 2 stack today, we typically pair Reachy Mini with a separate LeRobot-supported arm such as the Koch v1.1, the SO-101, or a Trossen Robotics Aloha; or we recommend a different platform.
Is Petronella Technology Group an authorized Pollen Robotics or Hugging Face partner?
No, and we will not claim to be. We operate a Reachy Mini in our Raleigh lab and we participate in the Hugging Face LeRobot open-source robotics ecosystem. There is no public Pollen Robotics partner program, reseller program, or channel program we have been admitted to as of the date on this page. If that changes, this answer will change with it. We sourced the unit through Pollen Robotics direct sales, the same channel that any team buying ten or more units would use.
What can a $299 desktop robot actually be used for in a real client engagement?
Conversational AI prototyping with a private speech-to-text and large language model stack on owned compute. Teleoperation studies where a remote operator drives the head and torso over a CMMC-aligned segmented network. Manipulation perception research, paired with a separate LeRobot-supported arm. Education and demo builds for executive and board audiences who need to see the technology before approving budget. Behaviour authoring on top of the bundled behaviours, with proper authorship and license provenance for the engagement. CMMC-aligned development workflow on the engineering process itself. Reachy Mini is not the right platform for industrial cobot integration on a moving production line, autonomous-vehicle work, surgical robotics, or weapons-platform work, and we will not scope it that way.
Where is the work done?
In our lab at 5540 Centerview Drive, Suite 200, Raleigh, North Carolina 27606. Training-side work runs on our owned datacenter GPU fleet sourced through the NVIDIA Elite Partner Channel: NVIDIA DGX systems for the heaviest workloads, NVIDIA HGX-class servers for production inference, and NVIDIA RTX PRO 6000 Blackwell workstations for prototyping. Sensor data captured on the Reachy Mini in our lab does not leave our infrastructure unless the client engagement explicitly authorises that, and even then it ships through paths the client has reviewed.
Can my team buy Reachy Mini units through Petronella Technology Group?
For fleet purchases of ten or more units, contact Petronella Technology Group directly so we can scope the variants, supporting compute, and security posture together. Reach us via our contact page or call (919) 348-4912. For single-unit research purchases the Reachy Mini ships through Pollen Robotics standard channels; our value is in the prototyping engagement, the security and compliance overlay, and the integration with private AI infrastructure, not in marking up the hardware.
Is the Reachy Mini hardware open-source?
The software side, including the Pollen Reachy Mini Python SDK and the LeRobot integration, is fully open-source on GitHub. The hardware side is described by Hugging Face as open, with the CAD files described as pending release at the launch announcement on July 9, 2025. We will update this page when the CAD release lands so that any team that wants to fabricate replacement parts has a citation for the license.
How do you handle data captured on the robot during a client engagement?
By default, sensor data captured on Reachy Mini in our Raleigh lab during a client engagement is stored on owned, encrypted storage inside our infrastructure. It is retained for the minimum time needed to train, verify, and deliver the prototype, and is deleted on a contractually-defined schedule. For healthcare research engagements, the storage path is covered by a Business Associate Agreement and the engagement operates under the client's existing Institutional Review Board protocol. For defense engagements, the storage path is aligned to NIST SP 800-171 and to CMMC Level 2 controls. We document every artifact we keep, including software and dataset bills of materials, and we hand the documentation to the client at the close of the engagement.
Ready to scope a Reachy Mini prototype?
Call (919) 348-4912 to talk to Craig directly, or schedule a hardware call at a time that fits your week. Our robotics practice overview, our CMMC compliance program, and our robotics prototyping engagement page have more on how the work fits together.
Petronella Technology Group · 5540 Centerview Dr, Suite 200, Raleigh, NC 27606 · Since 2002 · CMMC-AB RPO #1449
Get the Secure Robotics Development Brief
Tell Petronella Technology Group about your robotics project. We will reply within 4 business hours with a CMMC-RP led scoping conversation and the early-access edition of our Secure Robotics Development Brief covering CUI handling, on-prem AI inference for robotics, and CMMC-aligned development practices. No obligation. No sales pressure.