EO-1 is now available directly in LeRobot!
You can train, evaluate, and deploy EO-1 through the standard LeRobot policy interface, making it much easier to try EO-1 on robot-control workflows.
Huge thanks to @pepijn223 and the LeRobot team for the collaboration and support.
EO-1 is an open-source Vision-Language-Action model for general robot control. It combines a Qwen2.5-VL backbone with continuous flow-matching, enabling multimodal perception, embodied reasoning, and action generation in one unified model.
Try it out:
LeRobot: https://github.com/huggingface/lerobot/tree/main/src/lerobot/policies/eo1
Project: http://eo-robotics.ai/eo-1
Paper: https://arxiv.org/abs/2508.21112
Code: https://github.com/SHAILAB-IPEC/EO1