⚠️ Alpha Release Notice
This is an alpha release of the MOPS data-generation tooling. The public API may still change, and some bugs may be present.
This project builds on ManiSkill3, a simulation framework built on SAPIEN, which requires Python 3.10.
```bash
# Create and activate conda environment
conda create --name mops python=3.10
conda activate mops

# Install ManiSkill3 and dependencies
pip install mani_skill
pip install torch torchvision torchaudio

# Install this package in editable mode
pip install -e .
```
Download the RoboCasa assets:

```bash
python -m mani_skill.utils.download_asset RoboCasa
```
Download the PartNet Mobility assets: get them from SAPIEN (UCSD).
Organize the directory structure as follows:

```
mops-data/
├── demos/
├── data/
│   └── partnet_mobility/
├── src/
└── ...
```
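The layout above can be prepared from the repository root with a couple of commands (a sketch: `demos/` and `src/` ship with the repository, so only the asset directory needs creating):

```shell
# Create the asset directory expected by the layout above;
# the downloaded PartNet Mobility assets go inside it.
mkdir -p data/partnet_mobility
```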
For code development, install and configure pre-commit hooks:

```bash
pip install black isort pre-commit
pre-commit install
pre-commit run
```

This project builds on ManiSkill3, so much of its official documentation is applicable to MOPS-data.
Our CustomEnvs accept an optional keyword argument `np_rng`, which expects a NumPy random number generator (`numpy.random.Generator`).
The `AffordanceKitchenEnv-v1` environment also supports an optional `preroll` argument, which deterministically creates a kitchen setup.
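Because `np_rng` takes a `numpy.random.Generator`, seeding the generator makes scene sampling reproducible across runs. A minimal sketch — the commented `gym.make` call is hypothetical (it assumes gym-style environment registration and requires ManiSkill installed), and `preroll=True` is an assumed value:

```python
import numpy as np

# Two generators built from the same seed produce identical draws,
# so an environment handed either one samples the same scene.
rng_a = np.random.default_rng(42)
rng_b = np.random.default_rng(42)
assert np.array_equal(rng_a.random(5), rng_b.random(5))

# Hypothetical usage (requires mani_skill; the np_rng and preroll
# keyword names come from the notes above):
# import gymnasium as gym
# env = gym.make("AffordanceKitchenEnv-v1",
#                np_rng=np.random.default_rng(0), preroll=True)
```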
For MOPS-specific functionality, refer to the code documentation in the src/ directory and example scripts in demos/.
| Component | Description |
|---|---|
| `demos/` | Executable high-level scripts for quick image generation |
| `src.mops_data` | Core source code |
| `mops_data.asset_manager` | Observation augmentation using annotation resources |
| `mops_data.envs` | Custom environments for rendering objects and scenes |
| `xr_teleop` | WebXR-based VR teleoperation controller |
The xr_teleop module provides rudimentary VR teleoperation through WebXR. Serve the webpage and access it with a VR headset to send control commands.
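Serving the page can be done with any static file server; a minimal sketch using Python's standard library (the page location is an assumption — point the server at wherever `xr_teleop` keeps its static files). Note that WebXR requires a secure context, so a headset must reach the page over HTTPS or via localhost forwarding:

```python
import http.server
import socketserver
import threading
import urllib.request

# Serves the current working directory -- run from the folder that
# contains the xr_teleop webpage (location assumed, adjust as needed).
handler = http.server.SimpleHTTPRequestHandler

# Port 0 lets the OS pick a free port.
with socketserver.TCPServer(("", 0), handler) as httpd:
    port = httpd.server_address[1]
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    # A browser or headset would now load http://<host>:<port>/
    resp = urllib.request.urlopen(f"http://localhost:{port}/")
    print(resp.status)
    httpd.shutdown()
```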
🔜 Coming Soon: IRIS support!
- 🎨 Photoreal Simulation: High-quality visual rendering for computer vision
- 🤖 Robotic Manipulation: Specialized environments for manipulation tasks
- 🏠 Kitchen Environments: Realistic household scenarios
- 📦 Object Diversity: Support for cluttered tabletop scenarios
- 🔧 Extensible: Modular design for custom environments
If you use MOPS-data in your research, please cite:
```bibtex
@article{li2025mops,
  title={Multi-Objective Photoreal Simulation (MOPS) Dataset for Computer Vision in Robotic Manipulation},
  author={Maximilian Xiling Li and Paul Mattes and Nils Blank and Korbinian Franz Rudolf and Paul Werker L\"odige and Rudolf Lioutikov},
  year={2025}
}
```

We welcome contributions! Please see our development setup above and feel free to submit issues and pull requests.
This project is licensed under the MIT License - see the LICENSE file for details.