iGibson GitHub

iGibson, the Interactive Gibson Environment, is a simulation environment that provides fast visual rendering and physics simulation (based on Bullet). It comes with a dataset of hundreds of large 3D environments reconstructed from real homes and offices, and with interactive objects that can be pushed and actuated. iGibson allows you to train and evaluate robotic agents that use RGB images and/or other visual sensors to solve indoor (interactive) navigation and manipulation tasks such as opening doors, picking up and placing objects, or searching in cabinets.


Features


Dataset of Large Environments

containing 572 buildings, 1400 floors, and 211k square meters of indoor spaces

Interactive Environments

for five object categories (chairs, desks, doors, sofas, and tables) in ten buildings (more coming soon!)

Robot Models

of the most common real robots and AI agents: Fetch and Freight, Husky, TurtleBot v2, Locobot, Minitaur, JackRabbot, a generic quadrocopter, Humanoid and Ant

Articulated Objects

with internal degrees of freedom that can be actuated, such as doors, drawers and cabinets

Full Interactive Model

of one house in which the original objects have been replaced by textured models that can be interacted with

Fast Rendering and Physics

achieving more than 200 fps with full physics, or close to 1000 fps with kinematics only

Multi-Agent Support

to train agents on collaborative tasks in which they see each other

Reinforcement Learning Baselines

for interactive and point-to-point navigation, including pretrained visuomotor policies trained with SAC, DDPG, and PPO

Comparison to other environments:

(Table comparing iGibson, Habitat-sim, Sapien, and AI2Thor on real-world scenes, 3D object assets, articulated objects, realistic robot control, and multi-agent support.)

Installing iGibson Environment


There are two ways to install iGibson on your machine:

Pip install:
On most machines, you can install iGibson directly with a single command. Activate (source) your virtual environment (Python 2.7 or 3) and execute
pip install gibson2
You can then run
python -m gibson2.envs.demo_interactive
which will download one fully interactive environment and show a simple robot navigating in it.

From source:
If you plan to modify the code, follow the instructions on the installation page of the documentation to clone our GitHub repo and install from source.
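
Once installed (with either method), iGibson can also be driven directly from Python. The following is a minimal sketch assuming the Gym-style NavigateEnv class and the example TurtleBot configuration file shipped with this release; the exact module path and config name may differ in your version, so check the documentation:

from gibson2.envs.locomotor_env import NavigateEnv

# Create a point-to-point navigation environment in headless mode.
# The config file below is one of the examples shipped with the repository.
env = NavigateEnv(config_file='examples/configs/turtlebot_p2p_nav.yaml', mode='headless')

obs = env.reset()
for _ in range(300):
    action = env.action_space.sample()          # random policy, just to exercise the environment
    obs, reward, done, info = env.step(action)  # standard Gym-style step
    if done:
        obs = env.reset()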


Download Interactive Gibson Dataset


There are several datasets that you can download and use with iGibson, all of them accessible once you fill in this form. You will get access to the ten environments with annotated instances of furniture (chairs, tables, desks, doors, sofas) that can be interacted with, and to the original 572 reconstructed 3D environments. You can also download a fully annotated environment in which interactive objects replace the original spatial arrangement of a real house (the original 3D model is also available).


News


  • 2020.04 iGibson Dataset v1 Released: This release includes the simulation environment, ten houses annotated with interactive objects of five categories, and one house fully annotated to be interactive and with selected textures. We include documentation with code examples, as well as baselines of navigation agents trained with state-of-the-art reinforcement learning algorithms.

  • 2020.02 Beginning of the simulation phase of our CVPR Challenge "Sim2Real Challenge with Gibson": Do you want to see your visual navigation algorithm run on a real robot, but don't want to deal with the real-world setup? Participate in our CVPR20 challenge! We will test the best entries from a first, simulation-only phase on our own Locobots in a real apartment. If you want to participate, follow the instructions on the challenge page.

  • 2020.01 Paper Accepted at RA-L and ICRA2020: [Paper] [Arxiv Version]


Demo Videos


Multi-Agent Simulation

iGibson handles multiple agents: it simultaneously renders images from each agent's virtual camera and simulates collisions between the agents.
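
A minimal two-agent sketch, assuming the Simulator, BuildingScene, and Turtlebot classes documented for this release (the module paths, the 'Rs' scene ID, and the render_robot_cameras call are taken from the examples of the time and may differ in newer versions):

from gibson2.core.simulator import Simulator
from gibson2.core.physics.scene import BuildingScene
from gibson2.core.physics.robot_locomotors import Turtlebot
from gibson2.utils.utils import parse_config

config = parse_config('examples/configs/turtlebot_p2p_nav.yaml')  # robot parameters
s = Simulator(mode='headless', image_width=256, image_height=256)
s.import_scene(BuildingScene('Rs'))      # 'Rs' is an example scene ID from the dataset

robots = [Turtlebot(config), Turtlebot(config)]
for robot in robots:
    s.import_robot(robot)                # both agents live in the same physics world

for _ in range(100):
    for robot in robots:
        robot.apply_action([0.2, 0.2])   # drive both robots; collisions between them are simulated
    s.step()                             # advance physics for all agents at once
    frames = s.renderer.render_robot_cameras(modes=('rgb'))  # one RGB frame per agent camera

s.disconnect()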


Multi-Modal Rendering

iGibson supports rendering of RGB images, surface normals, depth maps, and segmentation masks.
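
These modalities can be requested together from the mesh renderer. Below is a sketch assuming the MeshRenderer API of this release; the module path, mode names, and the example mesh path are assumptions, and the '3d' mode is assumed to return per-pixel 3D points from which depth can be read:

from gibson2.core.render.mesh_renderer.mesh_renderer_cpu import MeshRenderer

renderer = MeshRenderer(width=256, height=256)
renderer.load_object('path/to/scene_mesh.obj')   # hypothetical path to a scene mesh
renderer.add_instance(0)
renderer.set_camera([0, 0, 1.2], [1, 0, 1.2], [0, 0, 1])  # eye, target, up
renderer.set_fov(90)

# Render four modalities in one call: RGB, surface normals, segmentation masks,
# and per-pixel 3D points (the z channel of '3d' gives the depth map).
rgb, normal, seg, pc = renderer.render(modes=('rgb', 'normal', 'seg', '3d'))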


Arm Manipulation

iGibson allows agents to interact with the environment and change its configuration using their arms. Here, a simulated JackRabbot pulls open a door; see HRL4IN for more details.


Textured Interactive Objects

In iGibson, photorealistic textures have been baked onto the interactive object surfaces using path-traced rendering.


Interactive Navigation

In iGibson, agents can push interactive objects out of the way to accomplish navigation tasks.


Sim2real Transfer

Thanks to the photorealism of iGibson, transferring from simulation to the real world is easier.


Documentation


iGibson is a new version of our original Gibson V1 environment, presented at CVPR18. If you would like to know more details about iGibson, you can find them in the following publications:

"Interactive Gibson Benchmark: A Benchmark for Interactive Navigation in Cluttered Environments.", by Fei Xia, William B. Shen, Chengshu Li, Priya Kasimbeg, Micael Edmond Tchapmi, Alexander Toshev, Roberto Martín-Martín, and Silvio Savarese. IEEE Robotics and Automation Letters and ICRA 2020.
[Paper] [Arxiv Version] [Bibtex]
(Outdated) "Gibson env V2: Embodied Simulation Environments for Interactive Navigation.", by Fei Xia, Chengshu Li, Kevin Chen, William B. Shen, Roberto Martín-Martí, Noriaki Hirose, Amir R. Zamir, Li Fei-Fei, and Silvio Savarese. Technical Report 2019.
[Technical Report] [Bibtex]

Consider citing them if you use iGibson in your research.


The documentation for the current version of iGibson can be found here. The documentation includes multiple code examples and snippets to help you develop your own solutions or modify the code for your project. If what you are looking for is not in the documentation, check the issues section of our GitHub repository or contact Fei Xia.


Projects Developed based on iGibson, the Interactive Gibson Environment

[full list]