iGibson, the Interactive Gibson Environment, is a simulation environment that provides fast visual rendering and physics simulation (based on Bullet). It comes packaged with a dataset of hundreds of large 3D environments reconstructed from real homes and offices, together with interactive objects that can be pushed and actuated. iGibson allows you to train and evaluate robotic agents that use RGB images and/or other visual sensors to solve indoor (interactive) navigation and manipulation tasks such as opening doors, picking up and placing objects, or searching in cabinets.
- containing 572 buildings, 1,400 floors, and 211k square meters of indoor space
- for five object categories (chairs, desks, doors, sofas, and tables) in ten buildings (more coming soon!)
- with internal degrees of freedom that can be actuated, such as doors, drawers, and cabinets
- of one house, with the original furniture replaced by textured models that can be interacted with
- achieving more than 200 fps with full physics, or close to 1,000 fps with kinematics only
- to train on collaborative tasks where agents see each other
- for interactive and point-to-point navigation, including pretrained visuo-motor SAC, DDPG, and PPO baselines
Comparison to other environments:
| Real-world Scenes | 3D Object Assets | Articulated Objects | Realistic Robot Control | Multi-Agent |
|---|---|---|---|---|
There are two ways to install iGibson on your machine:
On most machines, you can install iGibson directly with a single command. Source your virtual environment (Python 2.7 and 3 are both supported) and execute
`pip install gibson2`
You can then run
`python -m gibson2.envs.demo_interactive`
This command will download one fully interactive environment and show a simple robot navigating in it.
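Once installed, iGibson environments are driven through a standard reset/step control loop. The sketch below uses a stand-in environment, not the real `gibson2` API (the class name, observation shapes, and action layout are all illustrative assumptions), to show the loop a trained agent would run:

```python
import numpy as np

class StandInEnv:
    """Illustrative stand-in for an iGibson navigation environment.
    The real environment renders camera images and simulates physics;
    this stub only mimics the reset/step interface and observation shape."""

    def __init__(self, height=128, width=128, max_steps=10):
        self.height, self.width = height, width
        self.max_steps = max_steps
        self.steps = 0

    def reset(self):
        self.steps = 0
        # An RGB frame, as the real environment would render from the robot camera
        return np.zeros((self.height, self.width, 3), dtype=np.uint8)

    def step(self, action):
        self.steps += 1
        obs = np.zeros((self.height, self.width, 3), dtype=np.uint8)
        reward = 0.0
        done = self.steps >= self.max_steps
        return obs, reward, done, {}

env = StandInEnv()
obs = env.reset()
done = False
while not done:
    action = np.array([0.5, 0.0])  # e.g. a linear and angular velocity command
    obs, reward, done, info = env.step(action)
print(obs.shape)  # (128, 128, 3)
```

A learned policy would replace the constant `action` with a function of `obs`; the rest of the loop stays the same.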
There are several datasets that you can download and use with iGibson, all of them accessible once you fill in this form. You will get access to the ten environments with annotated instances of furniture (chairs, tables, desks, doors, sofas) that can be interacted with, and to the original 572 reconstructed 3D environments. You can also download a fully annotated environment in which interactive objects replace the original spatial arrangement of a real house (the original 3D model is also available).
2020.04 iGibson Dataset v1 Released: This release includes the simulation environment, ten houses annotated with interactive objects of five categories, and one house fully annotated to be interactive, with selected textures. We include documentation with code examples, as well as baselines of navigation agents trained with state-of-the-art reinforcement learning algorithms.
2020.02 Beginning of the simulation phase of our CVPR Challenge "Sim2Real Challenge with Gibson": Do you want to see your visual navigation algorithm run on a real robot, but don't want to deal with the real-world setup? Participate in our CVPR20 challenge! We will test the best entries from a first, simulation-only phase on our own LoCoBots in a real apartment. If you want to participate, follow the instructions on the challenge page.
iGibson handles multiple agents: it simultaneously renders images from the virtual cameras of all agents and simulates collisions between them.
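A sketch of the bookkeeping this implies (one rendered image per agent plus pairwise collision tests), using simple sphere colliders and blank placeholder images rather than iGibson's real renderer and Bullet physics:

```python
import itertools
import numpy as np

def render_all(agent_positions, height=64, width=64):
    """One image per agent; in iGibson these would come from each agent's
    virtual camera. Here they are blank placeholders of the right shape."""
    return {i: np.zeros((height, width, 3), dtype=np.uint8)
            for i in range(len(agent_positions))}

def colliding_pairs(agent_positions, radius=0.3):
    """Pairs of agents whose sphere colliders overlap
    (a stand-in for the physics engine's collision detection)."""
    pairs = []
    for i, j in itertools.combinations(range(len(agent_positions)), 2):
        if np.linalg.norm(agent_positions[i] - agent_positions[j]) < 2 * radius:
            pairs.append((i, j))
    return pairs

# Agents 0 and 1 are 0.4 m apart (overlapping colliders); agent 2 is far away.
positions = np.array([[0.0, 0.0], [0.4, 0.0], [5.0, 5.0]])
images = render_all(positions)
print(len(images))                 # 3 images, one per agent
print(colliding_pairs(positions))  # [(0, 1)]
```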
iGibson supports rendering of RGB images, surface normals, depth, and segmentation masks.
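These modalities differ in shape and dtype, so agent code typically carries them in a per-modality observation dict. A minimal sketch with synthetic arrays (the shapes and dtypes are plausible conventions, not guaranteed to match the renderer's actual output):

```python
import numpy as np

H, W = 128, 128

# Synthetic stand-ins for one rendered frame in each modality
rgb = np.random.randint(0, 256, size=(H, W, 3), dtype=np.uint8)        # color image
normals = np.random.uniform(-1, 1, size=(H, W, 3)).astype(np.float32)  # surface normals
depth = np.random.uniform(0.1, 10.0, size=(H, W)).astype(np.float32)   # metric depth
seg = np.random.randint(0, 40, size=(H, W), dtype=np.int32)            # per-pixel class ids

obs = {"rgb": rgb, "normals": normals, "depth": depth, "seg": seg}
for name, arr in obs.items():
    print(name, arr.shape, arr.dtype)
```

Keeping modalities in a dict (rather than concatenating channels) lets a policy consume any subset without reshaping.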
In iGibson, photorealistic textures have been baked onto the interactive object surfaces with path-traced rendering.
In iGibson, agents can push interactive objects away to achieve navigation tasks.
Thanks to the photorealism of iGibson, transferring from simulation to the real world is easier.
iGibson is a new version of our original Gibson V1 environment, presented at CVPR18. If you would like to know more details about iGibson, you can find them in the following publication: "Interactive Gibson Benchmark: A Benchmark for Interactive Navigation in Cluttered Environments", by Fei Xia, William B. Shen, Chengshu Li, Priya Kasimbeg, Micael Edmond Tchapmi, Alexander Toshev, Roberto Martín-Martín, and Silvio Savarese. IEEE Robotics and Automation Letters and ICRA 2020.
Consider citing it if you use iGibson in your research.
The documentation for the current version of iGibson can be found here. The documentation includes multiple code examples and snippets to help you develop your own solutions or modify the code for your project. If what you are looking for is not in the documentation, check the issues section of our GitHub repository or contact Fei Xia.