Introducing Stretch AI
Explore Embodied Intelligence
Explore the future of robotics and AI with our open-source (Apache 2.0) set of tools, tutorials, and reference code for:
grasping
manipulation
mapping
navigation
LLM agents
text-to-speech and speech-to-text
visualization and debugging
Stretch AI is still in its early days. It is intended for advanced Stretch users familiar with building and debugging embodied intelligence systems.
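To give a feel for the toolkit, here is a minimal sketch of driving the robot from Python. It assumes the RobotClient interface from the stretch_ai repository; method names and arguments may differ between releases, so treat this as an outline rather than a definitive example.

```python
# Minimal sketch, assuming stretch_ai's RobotClient interface.
from stretch.agent import RobotClient

# Connect to the robot over the network (replace with your robot's IP).
robot = RobotClient(robot_ip="192.168.1.15")

# Switch to the navigation posture, then move relative to the current
# pose: 0.5 m forward, no lateral offset, no rotation.
robot.move_to_nav_posture()
robot.navigate_to([0.5, 0.0, 0.0], relative=True, blocking=True)
```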
Build Your Own AI Skills
Designed to accelerate your research.
Build your own complex skills starting with our reference code and tools, including:
LLM Agent: Language-directed pick and place driven by a large language model
Dynamem OVMM system: Deploy open-vocabulary mobile manipulation with Dynamem
Data Collection: Collect data for learning from demonstration
Learning from Demonstration: Train and evaluate policies with LfD
Open-vocabulary mobile manipulation: Handle more complex language commands
Rerun: Real-time data visualization (see the sketch after this list)
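Rerun streams robot data to a live viewer with a few lines of Python. A self-contained example with placeholder data (on a real robot you would log live camera frames and mapped points instead):

```python
import numpy as np
import rerun as rr

rr.init("stretch_ai_demo", spawn=True)  # launch and connect to the viewer

# Log a stand-in point cloud under the "world/map" entity path.
points = np.random.uniform(-1.0, 1.0, size=(500, 3))
rr.log("world/map", rr.Points3D(points))

# Log a stand-in 480x640 RGB camera frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
rr.log("robot/head_camera", rr.Image(frame))
```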
Language Guided Pick and Place
“Hey, Stretch, pick up the Totoro toy and put it in the basket.”
Use an LLM agent to pick up a variety of objects from the floor and place them in nearby containers, such as baskets and boxes.
Use Dynamem to dynamically handle moving objects and people in the environment.
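One way to picture the loop: the LLM reduces the spoken command to a structured plan, and the robot executes it with grasping and placement skills while the map is kept up to date. The plan schema and skill calls below are hypothetical stand-ins for illustration, not the repository's actual interfaces.

```python
# Illustrative sketch only: the plan schema and skill methods are
# hypothetical; Stretch AI's LLM agent and Dynamem provide the real ones.

def parse_command(command: str) -> dict:
    # A real implementation would prompt an LLM; here we show the
    # structured plan for the example command above.
    return {"object": "totoro toy", "receptacle": "basket"}

def pick_and_place(robot, plan: dict) -> None:
    # Re-querying the map at each step (the role Dynamem plays) lets
    # the robot find targets even when the scene has changed.
    robot.grasp_object(plan["object"])    # hypothetical skill call
    robot.place_in(plan["receptacle"])    # hypothetical skill call
```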
Learning from Demonstration
Train your Stretch to do simple manual tasks
Train Stretch to perform simple dexterous tasks using learning from demonstration (LfD).
Available policies include ACT, VQ-BeT, and Diffusion Policy.
Data collection is conducted using the Dex Teleop Kit and managed with Hugging Face's LeRobot.
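Demonstrations recorded with the Dex Teleop Kit end up as LeRobot datasets, which load like ordinary PyTorch datasets. A sketch with a placeholder dataset id (the import path matches LeRobot at the time of writing; check its docs for your version):

```python
import torch
from lerobot.common.datasets.lerobot_dataset import LeRobotDataset

# "your-name/stretch_dex_teleop" is a placeholder dataset id.
dataset = LeRobotDataset("your-name/stretch_dex_teleop")
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

# Each batch is a dict of tensors; keys such as "observation.state"
# and "action" feed policy training (ACT, VQ-BeT, Diffusion Policy).
batch = next(iter(loader))
print(batch.keys())
```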
An Invitation for Open Collaboration
Join us in democratizing embodied intelligence for home robots
Stretch AI is just the start of our journey to bring the power of embodied intelligence to robots that assist people in their homes.
We develop in the open, share our work early, and encourage collaboration across our diverse community.
Join us!