Robotics Tutorials

Hands-on, reproducible, step-by-step tutorials for the tools and workflows modern robot-learning teams actually use.

Welcome to the Silicon Valley Robotics Center tutorials hub. Every tutorial on this page is structured as a proper HowTo guide: a defined goal, a prerequisites list, an ordered sequence of steps with real commands, expected output, and a troubleshooting section. Each tutorial is also published with schema.org/HowTo and VideoObject structured data so search engines can surface the right step to the right reader at the right time.
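To make the structured-data claim concrete, here is a minimal sketch of a schema.org/HowTo payload built in Python. The field values are hypothetical placeholders, not the actual markup shipped with any tutorial; a real page would embed the serialized JSON in a script tag of type "application/ld+json".

```python
import json

# Minimal schema.org/HowTo payload (hypothetical values, for illustration only).
howto = {
    "@context": "https://schema.org",
    "@type": "HowTo",
    "name": "How to Record a LeRobot-Compatible Dataset",
    "totalTime": "PT2H",  # ISO 8601 duration
    "step": [
        {"@type": "HowToStep", "position": 1, "name": "Install the CLI"},
        {"@type": "HowToStep", "position": 2, "name": "Record an episode"},
    ],
}

print(json.dumps(howto, indent=2))
```

Each HowToStep carries a position, which is what lets a search engine surface an individual step rather than the whole page.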

Our focus is practical workflows at the intersection of vision-language-action models, open robot-learning datasets, and the hardware that lives in real labs — the kind of hardware you can buy or lease from SVRC. That means step sequences that match the actual tooling releases (Isaac Sim 4.x, ROS 2 Humble, HuggingFace LeRobot 2.x, PEFT/LoRA fine-tuning) rather than blog posts that go stale the moment a dependency ships a new major version.

Each tutorial sits inside a broader path: if you are new to simulation, start with Isaac Lab. If you already have demonstration data in hand, jump to the fine-tuning tutorial. If you have hardware but no data, start with LeRobot recording or the ALOHA teleop build. You can always cross-reference related topics in the guides, the academy, and the RL environments catalog.

All tutorials are free. Video walkthroughs are attached where available. If you spot a command that no longer matches the current upstream release, please open a thread in the community forum — we update tutorials monthly.

How these tutorials are structured

Each tutorial follows a consistent shape so you always know where to look. A short lede explains what you will accomplish and who the tutorial is for. A prerequisites list tells you what hardware, software, and skill level you need before starting. The steps section is a numbered list of concrete actions with real commands, expected output, and callouts for gotchas. A deep dive section at the end covers the non-obvious trade-offs — the kind of context that usually only shows up in Slack messages after something breaks. A FAQ at the bottom answers the common follow-ups. Finally, a related tutorials grid links to the logical next steps.

Recommended reading order

If you are starting from zero and you have hardware, record data first: Record a LeRobot-Compatible Dataset is the shortest path to having something to train on. Then fine-tune a policy: Fine-Tune OpenVLA gives you a modern VLA baseline. If you are doing bimanual work, add ALOHA Teleop Rig Setup before the dataset tutorial. For humanoid teams, start with Unitree G1 Camera Calibration — bad calibration wastes every hour of data collection that follows. And if you want to pretrain in simulation before recording any real data, Install Isaac Lab is the right starting point.

Versioning and accuracy

Robotics tooling moves fast. Commands in these tutorials were validated against the April 2026 releases: Isaac Sim 4.x, LeRobot 2.x, OpenVLA on PEFT 0.12+, ROS 2 Humble, Unitree G1 SDK, OpenCV 4.8+. Where exact CLI flags change release-to-release, we err toward documenting the general shape of the command and pointing you at upstream docs for the current exact form. If something is broken on your side, the first question to ask is "did a dependency ship a new major version since this was written?" — the answer is often yes.

Simulation

How to Install Isaac Lab on Ubuntu 22.04

Clone the IsaacLab repo, create a conda env, pair it with Isaac Sim 4.x, and run your first manipulation RL training in simulation.

45 min · Intermediate
VLA / Fine-tuning

How to Fine-Tune OpenVLA on Your Own Robot Dataset

LoRA-based fine-tuning of OpenVLA-7B on an RLDS dataset. Covers GPU sizing, dataloader, LoRA config, and evaluation.

3 hours · Advanced
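The GPU-sizing intuition behind LoRA comes down to simple arithmetic: instead of updating a dense d_out × d_in weight, LoRA trains two low-rank factors B (d_out × r) and A (r × d_in). The layer dimensions below are illustrative stand-ins, not OpenVLA-7B's actual shapes.

```python
# LoRA trains only r * (d_in + d_out) parameters per adapted layer,
# leaving the original dense weight frozen.
def lora_trainable_params(d_in: int, d_out: int, r: int) -> int:
    return r * (d_in + d_out)

d = 4096          # hypothetical hidden size of one attention projection
r = 16            # LoRA rank
full = d * d      # parameters in the frozen dense weight
lora = lora_trainable_params(d, d, r)

print(full, lora, f"{100 * lora / full:.2f}%")  # ~0.78% of the dense layer
```

That sub-1% ratio is why a 7B-parameter model can be fine-tuned on a single high-memory GPU: optimizer state only needs to be kept for the adapter parameters.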
Teleoperation

How to Set Up an ALOHA-Style Bimanual Teleop Rig

Build an ALOHA-style rig with 4 WidowX 250 arms, 3 cameras, ROS 2 Humble, and leader-follower sync.

1 day · Advanced
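At its core, leader-follower teleoperation is a fixed-rate loop that reads the leader arms' joint positions and commands them to the followers. The Arm class below is a stand-in for illustration; the actual rig goes through ROS 2 joint controllers, not direct attribute access.

```python
import time

class Arm:
    """Stand-in for a real arm driver (the actual rig uses ROS 2 controllers)."""
    def __init__(self):
        self.joints = [0.0] * 6

    def read_joints(self):
        return list(self.joints)

    def command_joints(self, q):
        self.joints = list(q)

def sync_step(leader: Arm, follower: Arm) -> None:
    # One tick of leader-follower sync: mirror leader joints to the follower.
    follower.command_joints(leader.read_joints())

leader, follower = Arm(), Arm()
leader.joints = [0.1, -0.2, 0.3, 0.0, 0.5, -0.1]
for _ in range(3):       # in practice this loop runs continuously, e.g. at 50 Hz
    sync_step(leader, follower)
    time.sleep(0.0)      # placeholder for real rate limiting
print(follower.read_joints())
```

A bimanual rig runs one such loop per leader-follower pair; keeping both pairs on the same clock is what the tutorial means by leader-follower sync.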
Humanoid

How to Calibrate Cameras on Unitree G1

ChArUco intrinsic and extrinsic calibration for the G1 head cameras. Save results to YAML, validate on a test scene.

90 min · Intermediate
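What intrinsic calibration actually produces is a 3×3 camera matrix K (focal lengths and principal point) plus distortion coefficients. The sanity check below projects a camera-frame 3D point through a hypothetical K, ignoring distortion; the tutorial itself does the full validation with OpenCV on a test scene.

```python
# Hypothetical intrinsics (fx, fy, cx, cy), as a ChArUco calibration would yield.
fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0

def project(X: float, Y: float, Z: float) -> tuple[float, float]:
    """Pinhole projection of a camera-frame 3D point to pixel coordinates,
    ignoring lens distortion."""
    return (fx * X / Z + cx, fy * Y / Z + cy)

u, v = project(0.1, -0.05, 1.0)   # a point 10 cm right, 5 cm up, 1 m ahead
print(u, v)
```

If a point at the optical axis (X = Y = 0) does not land on the principal point (cx, cy), the saved YAML and the code disagree about conventions, which is exactly the kind of error the test-scene validation step catches.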
Data collection

How to Record a LeRobot-Compatible Dataset

Use the lerobot record CLI to capture episode-based parquet + video shards ready for HuggingFace Hub.

2 hours · Beginner
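LeRobot datasets are episode-indexed: every frame row carries an episode index and a per-episode frame index. The rows below are simplified for illustration (a real dataset stores parquet shards with observation and action columns alongside encoded video; check the LeRobot docs for the exact schema), but they show the indexing convention the recorder produces.

```python
from itertools import groupby

# Simplified frame rows; real LeRobot shards carry many more columns.
frames = [
    {"episode_index": 0, "frame_index": 0, "timestamp": 0.000},
    {"episode_index": 0, "frame_index": 1, "timestamp": 0.033},
    {"episode_index": 1, "frame_index": 0, "timestamp": 0.000},
    {"episode_index": 1, "frame_index": 1, "timestamp": 0.033},
    {"episode_index": 1, "frame_index": 2, "timestamp": 0.066},
]

# Group consecutive rows by episode to recover per-episode frame lists.
episodes = {
    ep: [f["frame_index"] for f in rows]
    for ep, rows in groupby(frames, key=lambda f: f["episode_index"])
}
print(episodes)   # {0: [0, 1], 1: [0, 1, 2]}
```

Training code samples by episode, so frame_index restarting at 0 per episode (with timestamps relative to episode start) is what makes the shards directly usable downstream.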