Reachy Me, Reachy You: A Share‑Style Integration of Reachy Mini with OpenClaw
This is a more “share‑style” integration log: bringing Reachy Mini (via ClawBody) into OpenClaw so the AI can move with intent, not just speak. I’ve added commit links so you can jump straight to the code changes.
1. The Foundation: Devbox & GitHub
Before anything else, we stabilized the dev environment and repo access:
- Permissions: direct GitHub access isn’t available to the model, so all operations run on a controlled machine (devbox).
- SSH setup: generated keys for the devbox node user, added to GitHub, and verified connections.
- Repo management: moved from HTTPS to SSH for reliable server-side operations.
2. Expression & Gestures: From “It Moves” to “It Feels”
What’s working now (with commit links)
- Core motion (look / goto_target / set_target) works reliably.
- Macro gestures are stronger (nod/shake/wave/bounce): f416c04
- Speech‑aligned gestures + body_sway tool: 111668b
- Tool result logging for debugging: 863d8ae
- Cue‑word gesture triggers: 525f7d1
- Live transcript‑delta gesture triggers: 4053353
- Natural turn‑level gestures: 1978064
- Discovery of the daemon’s 81 recorded emotions/dances: 90c7da3
- Dynamic capability registry + local capabilities calls: f25c102, 8b9e538
- Dataset layout support for reachy_mini_dances_library: a53d93a
- emotion() prefers daemon recorded expressions: bda899a
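To make the cue‑word trigger idea concrete, here is a minimal sketch of how streaming transcript deltas could be scanned for trigger words. This is an assumption about the mechanism, not OpenClaw's actual code; `CUE_GESTURES` and `gestures_for_delta` are illustrative names.

```python
import re

# Hypothetical cue-word -> gesture map; the real trigger table will differ.
CUE_GESTURES = {
    "yes": "nod",
    "no": "shake",
    "hello": "wave",
}

def gestures_for_delta(buffer: str, delta: str) -> tuple[str, list[str]]:
    """Append a transcript delta to the buffer and return (new_buffer,
    gestures) for any cue words that just became complete."""
    buffer += delta
    # Only count a word once it is followed by whitespace or punctuation,
    # so partial streaming tokens like "hel" don't misfire.
    words = re.findall(r"[a-z']+(?=[\s.,!?])", buffer.lower())
    fired = [CUE_GESTURES[w] for w in words if w in CUE_GESTURES]
    # Keep only the unterminated tail; completed words are consumed.
    tail = re.split(r"[\s.,!?]", buffer)[-1]
    return tail, fired
```

The key design point is that deltas arrive mid-word, so the trigger must buffer across calls and fire only on word boundaries.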
Key improvements in progress
- Gesture timing precision: cue‑word and delta triggers are usable, but GESTURE_BIAS_S still needs tuning to align actions with speech timing.
- body_sway: implemented; needs full real‑hardware testing and API responsiveness checks.
- Dance library: the reachy_mini_dances_library layout is detected, but the Python API (collection.dance) still needs full verification.
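As a sketch of what the timing knob does: GESTURE_BIAS_S shifts when a gesture fires relative to the estimated onset of the spoken word, since motion takes longer to start than audio. The constant's value and the clamping below are illustrative assumptions, not the project's actual defaults.

```python
# Negative bias: start the gesture slightly before the word is heard,
# compensating for motor ramp-up latency. Value is a placeholder.
GESTURE_BIAS_S = -0.15

def gesture_start_time(word_onset_s: float, bias_s: float = GESTURE_BIAS_S) -> float:
    """Return when to trigger a gesture so it visually lands on the spoken
    word, clamped so we never schedule into the past."""
    return max(0.0, word_onset_s + bias_s)
```

Tuning then reduces to measuring actual gesture latency on hardware and adjusting the bias until motion and speech coincide.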
3. Current State & Next Steps
✅ What’s solid
- Reachy Mini can express emotion clearly.
- Gestures are executed, not just narrated.
- Daemon-side expressions are now part of the capability registry.
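The registry idea can be sketched as follows: daemon-recorded expressions are folded into the same lookup structure as built-in gestures, so callers have one dispatch path. All names here are illustrative assumptions, not ClawBody's actual API.

```python
from typing import Callable

class CapabilityRegistry:
    """Minimal sketch of a dynamic capability registry (illustrative)."""

    def __init__(self):
        self._caps: dict[str, Callable[..., object]] = {}

    def register(self, name: str, fn: Callable[..., object]) -> None:
        self._caps[name] = fn

    def discover(self, recorded_names: list[str],
                 player: Callable[[str], object]) -> None:
        # Fold daemon-recorded expressions into the same registry used
        # for built-in gestures; bind each name at definition time.
        for n in recorded_names:
            self.register(f"emotion:{n}", lambda n=n: player(n))

    def call(self, name: str, *args):
        if name not in self._caps:
            raise KeyError(f"unknown capability: {name}")
        return self._caps[name](*args)
```

A registry like this is what makes "discovered at runtime" and "built in" capabilities indistinguishable to the model issuing the calls.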
🔧 Next steps
- Tighten speech sync with better GESTURE_BIAS_S tuning
- Validate body_sway execution on real hardware
- Expand dance execution across the full library
- Improve fallbacks when daemon APIs or datasets fail
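The fallback item above amounts to graceful degradation: prefer the daemon's recorded expression, and if that call fails, fall back to a built-in macro gesture so the robot still responds visibly. This is a hedged sketch of the pattern; `daemon_play`, `macro_play`, and the default gesture are hypothetical.

```python
def play_emotion(name, daemon_play, macro_play, fallback="nod"):
    """Prefer a daemon-recorded expression; on any failure, degrade to a
    built-in macro gesture (nod/shake/wave/bounce) and report which path ran."""
    try:
        daemon_play(name)   # e.g. one of the 81 recorded emotions/dances
        return "daemon"
    except Exception:
        macro_play(fallback)
        return "macro"
```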
Closing
We’re past the “it moves” stage and into “it feels” territory. The next milestone is precision: tighter timing and more reliable capability execution. That’s where physical AI starts to feel truly alive.