EGalahad/simple-tracking
# HDMI: Learning Interactive Humanoid Whole-Body Control from Human Videos

HDMI is a novel framework that enables humanoid robots to acquire diverse whole-body interaction skills directly from monocular RGB videos of human demonstrations.

This repository contains the official training code for *HDMI: Learning Interactive Humanoid Whole-Body Control from Human Videos*.

## 🚀 Quick Start

Set up the virtual environment with `uv sync`, then apply the mjlab patch to the venv files:

```bash
uv sync
patch --forward -p0 < patches/mjlab_local.patch
```

To generate MuJoCo type stubs (for IDE support):

```bash
uv pip install mypy
uv run stubgen -m mujoco -o ./stubs
uv run stubgen -p mujoco -o ./stubs
```

Then add `./stubs` to your VSCode settings:

```jsonc
// .vscode/settings.json
{
  "python.analysis.extraPaths": [
    "${workspaceFolder}/stubs"
  ]
}
```

## Prepare Data

AMASS data: refer to https://github.com/Axellwppr/gentle-humanoid-training, then use `scripts/data_process/generate_amass_dataset.py` to convert the data to the HDMI format.

LAFAN data: refer to https://github.com/EGalahad/lafan-process.

## Train and Evaluate

Teacher policy:

```bash
uv run scripts/train.py algo=ppo_roa_train task=G1/tracking/amass
uv run scripts/train.py algo=ppo_roa_train task=G1/tracking/lafan
```
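The `algo=ppo_roa_train` commands above train the teacher with PPO. As background, the clipped surrogate objective at the heart of PPO can be sketched in plain Python (a generic illustration, not code from this repository; the function name and the `eps` default are my own):

```python
def ppo_clip_loss(ratio, advantage, eps=0.2):
    """Clipped PPO surrogate loss for a single sample (to be minimized).

    ratio: pi_new(a|s) / pi_old(a|s); advantage: estimated advantage A(s, a).
    Clipping the ratio to [1 - eps, 1 + eps] keeps updates near the old policy.
    """
    unclipped = ratio * advantage
    clipped = max(min(ratio, 1.0 + eps), 1.0 - eps) * advantage
    # Take the pessimistic (smaller) objective, then negate to get a loss.
    return -min(unclipped, clipped)

# A large ratio with positive advantage is clipped, capping the step size:
print(ppo_clip_loss(1.5, 1.0))   # -1.2 (ratio clipped to 1.2)
print(ppo_clip_loss(0.5, -1.0))  # 0.8  (ratio clipped to 0.8)
```

Frameworks differ in how they batch and normalize this term, but the clipping logic is the same.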

Student policy:

```bash
uv run scripts/train.py algo=ppo_roa_finetune task=G1/tracking/lafan
```
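The finetune stage appears to follow a teacher-student scheme, with the teacher checkpoint supervising the student. As a generic illustration of the idea (not this repository's actual loss; the function and argument names are hypothetical), a distillation objective can be as simple as a mean squared error between student outputs and frozen-teacher targets:

```python
def action_distill_loss(student_actions, teacher_actions):
    """Mean squared error between student actions and frozen-teacher targets."""
    assert len(student_actions) == len(teacher_actions)
    n = len(student_actions)
    return sum((s - t) ** 2 for s, t in zip(student_actions, teacher_actions)) / n

# Student matches the teacher on one joint and is off by 2.0 on another:
print(action_distill_loss([0.1, 1.0], [0.1, 3.0]))  # 2.0
```

In practice such a term is often applied to latent or action outputs alongside the RL loss rather than on its own.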

Play policy:

```bash
uv run scripts/play.py algo=ppo_roa_play task=G1/tracking/lafan checkpoint_path=run:<wandb_run_path>
```

Add `export_policy=true` to export an ONNX model.

## Sim2Real

Please see https://github.com/EGalahad/sim2real for details.
