igibson package

Subpackages

Submodules

igibson.simulator module

class igibson.simulator.Simulator(gravity=9.8, physics_timestep=0.008333333333333333, render_timestep=0.03333333333333333, solver_iterations=100, mode='gui', image_width=128, image_height=128, vertical_fov=90, device_idx=0, render_to_tensor=False, rendering_settings=<igibson.render.mesh_renderer.mesh_renderer_settings.MeshRendererSettings object>, vr_settings=<igibson.render.mesh_renderer.mesh_renderer_vr.VrSettings object>)

Bases: object

The Simulator class is a wrapper around the physics simulator (pybullet) and MeshRenderer. It loads objects into both pybullet and MeshRenderer, and syncs the poses of objects and robot parts between them.

add_normal_text(text_data='PLACEHOLDER: PLEASE REPLACE!', font_name='OpenSans', font_style='Regular', font_size=48, color=[0, 0, 0], pos=[0, 100], size=[20, 20], scale=1.0, background_color=None)

Creates a Text object to be rendered to a non-VR screen. Returns the text object to the caller, so various settings (e.g. text content, position, scale) can be changed later.

Parameters
  • text_data – starting text to display (can be changed later with set_text)

  • font_name – name of the font to render - same as the font folder in the iGibson assets

  • font_style – style of the font - one of [regular, italic, bold]

  • font_size – size of the font to render

  • color – [r, g, b] color

  • pos – [x, y] position of the top-left corner of the text box, as a percentage across the screen

  • size – [w, h] size of the text box, as a percentage across the screen-space axes

  • scale – scale factor for resizing the text

  • background_color – color of the background in the form [r, g, b, a] - the background only appears if this is not None

add_overlay_image(image_fpath, width=1, pos=[0, 0, -1])

Add an image with a given file path to the VR overlay. This image will be displayed in addition to any text that the user wishes to display. This function returns a handle to the VrStaticImageOverlay, so the user can show/hide it at will.

add_viewer()

Attach a debugging viewer to the renderer. This makes each step much slower, so it should be avoided when training agents

add_vr_overlay_text(text_data='PLACEHOLDER: PLEASE REPLACE!', font_name='OpenSans', font_style='Regular', font_size=48, color=[0, 0, 0], pos=[20, 80], size=[70, 80], scale=1.0, background_color=[1, 1, 1, 0.8])

Creates Text for use in a VR overlay. Returns the text object to the caller, so various settings (e.g. text content, position, scale) can be changed later.

Parameters
  • text_data – starting text to display (can be changed later with set_text)

  • font_name – name of the font to render - same as the font folder in the iGibson assets

  • font_style – style of the font - one of [regular, italic, bold]

  • font_size – size of the font to render

  • color – [r, g, b] color

  • pos – [x, y] position of the top-left corner of the text box, as a percentage across the screen

  • size – [w, h] size of the text box, as a percentage across the screen-space axes

  • scale – scale factor for resizing the text

  • background_color – color of the background in the form [r, g, b, a] - the default is semi-transparent white so the text is easy to read in VR

can_assisted_grasp(body_id, c_link)

Checks whether an object with the given body_id can be grasped. This is done by checking whether its category is in the allowlist.

disconnect()

Clean up the simulator

disconnect_pybullet()

Disconnects only pybullet - used for multi-user VR

fix_eye_tracking_value()

Calculates and fixes the eye tracking data to its value during step(). This is necessary because multiple calls to get eye tracking data can return different results, due to the SRAnipal multithreaded loop that runs in parallel with the main iGibson thread

gen_assisted_grasping_categories()

Generates the list of categories that can be grasped using assisted grasping, using the labels provided in the average category specs file.

gen_vr_data()

Generates a VrData object containing all of the data required to describe the VR system in the current frame. This data is used to power the BehaviorRobot each frame.

gen_vr_robot_action()

Generates an action for the BehaviorRobot to perform based on VrData collected this frame.

Action space (all values are non-normalized, and are clipped if they are too large - see BehaviorRobot.py for details on the clipping thresholds):

Body:
  - 6-DOF pose delta, relative to the body frame from the previous frame

Eye:
  - 6-DOF pose delta, relative to the body frame (where the body will be after applying this frame’s action)

Left hand, right hand (in that order):
  - 6-DOF pose delta, relative to the body frame (same as above)
  - Trigger fraction delta
  - Action reset value

Total size: 28
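The breakdown above can be sanity-checked by laying the components out as a flat vector. The component names below are informal labels, not iGibson identifiers; only the sizes follow the docstring (6 for the body, 6 for the eye, 8 per hand):

```python
# Illustrative layout of the 28-dimensional BehaviorRobot action vector,
# following the component breakdown documented above (names are informal).
ACTION_LAYOUT = [
    ("body_pose_delta", 6),          # 6-DOF pose delta, body frame
    ("eye_pose_delta", 6),           # 6-DOF pose delta, body frame
    ("left_hand_pose_delta", 6),     # 6-DOF pose delta, body frame
    ("left_hand_trigger_delta", 1),  # trigger fraction delta
    ("left_hand_reset", 1),          # action reset value
    ("right_hand_pose_delta", 6),
    ("right_hand_trigger_delta", 1),
    ("right_hand_reset", 1),
]

total_size = sum(size for _, size in ACTION_LAYOUT)  # 28
```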

get_button_data_for_controller(controller_name)

Call this after get_data_for_vr_device - returns analog data for a specific controller. Returns trigger_fraction, touchpad finger position x, and touchpad finger position y. The data is only valid if is_valid was true in the previous call to get_data_for_vr_device. Trigger data: 1 (closed) <--> 0 (open). Analog data: X: -1 (left) <--> 1 (right), Y: -1 (bottom) <--> 1 (top)

Parameters

controller_name – one of left_controller or right_controller

get_button_for_action(action)

Returns a (button, state) tuple corresponding to an action

Parameters

action – an action name listed in the “action_button_map” dictionary for the current device in vr_config.yml

get_category_ids(category_name)

Gets ids for all instances of a specific category (floors, walls, etc.) in a scene

get_data_for_vr_device(device_name)

Call this after step() - returns all VR device data for a specific device. Returns is_valid (indicating the validity of the data), plus the translation and rotation in iGibson world space

Parameters

device_name – one of hmd, left_controller or right_controller

get_data_for_vr_tracker(tracker_serial_number)

Returns the data for the tracker with a specific serial number. This number can be found in the SteamVR device information.

Parameters

tracker_serial_number – the serial number of the tracker

get_device_coordinate_system(device)

Gets the direction vectors representing the device’s coordinate system, in list form: x, y, z (in iGibson coordinates). The list contains the “right”, “up” and “forward” vectors, in that order

Parameters

device – one of “hmd”, “left_controller” or “right_controller”

get_eye_tracking_data()

Returns eye tracking data as a list of lists, in the order: is_valid, gaze origin, gaze direction, gaze point, left pupil diameter, right pupil diameter (both diameters in millimeters). Call this after get_data_for_vr_device to guarantee that the latest HMD transform has been acquired.
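Given the ordering above, the returned list can be unpacked directly. The numbers below are made up for illustration; only the structure comes from the docstring:

```python
# Example data in the documented order; all values are illustrative.
eye_data = [
    True,             # is_valid
    [0.0, 0.0, 1.6],  # gaze origin
    [0.0, 1.0, 0.0],  # gaze direction
    [0.0, 2.0, 1.6],  # gaze point
    3.1,              # left pupil diameter (mm)
    3.3,              # right pupil diameter (mm)
]

is_valid, origin, direction, point, left_mm, right_mm = eye_data
mean_pupil_mm = (left_mm + right_mm) / 2.0 if is_valid else None
```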

get_hidden_state(obj)

Returns the current hidden state of the object - hidden (True) or not hidden (False)

get_hmd_world_pos()

Get world position of HMD without offset

get_hud_show_state()

Returns the show state of the main VR HUD.

get_scroll_input()

Gets scroll input. This uses the non-movement controller, and determines whether the user wants to scroll by testing whether they have pressed the touchpad while keeping their finger on the left/right side of the pad. Returns True for up and False for down, or -1 for no scroll.
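A minimal sketch of that decision logic, assuming a hypothetical touchpad x-coordinate in [-1, 1] (the helper name and the threshold value are assumptions, not taken from iGibson):

```python
def decode_scroll(touchpad_pressed, touch_x, threshold=0.3):
    """Hypothetical helper mirroring the scroll logic described above:
    a touchpad press with the finger on the right scrolls up (True),
    on the left scrolls down (False); anything else is no scroll (-1)."""
    if not touchpad_pressed:
        return -1
    if touch_x > threshold:
        return True
    if touch_x < -threshold:
        return False
    return -1
```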

get_vr_events()

Returns the VR events processed by the simulator

get_vr_offset()

Gets the current VR offset vector in list form: x, y, z (in iGibson coordinates)

get_vr_pos()

Gets the world position of the VR system in iGibson space.

import_behavior_robot(bvr_robot)

Import registered behavior robot into the simulator.

import_ig_scene(**kwargs)
import_non_colliding_objects(objects, existing_objects=[], min_distance=0.5)

Loads objects into the scene such that they don’t collide with existing objects.

Parameters
  • objects – A dictionary with objects, from a scene loaded with a particular URDF

  • existing_objects – A list of objects that the newly loaded objects must be kept at least min_distance away from

  • min_distance – The minimum distance required between newly loaded objects and existing objects

import_object(**kwargs)
import_particle_system(**kwargs)
import_robot(**kwargs)
import_scene(**kwargs)
isconnected()
Returns

pybullet is alive

load()

Set up MeshRenderer and physics simulation client. Initialize the list of objects.

load_articulated_object_in_renderer(**kwargs)
load_object_in_renderer(**kwargs)
load_visual_sphere(**kwargs)
load_without_pybullet_vis()

Load without pybullet visualizer

perform_vr_start_pos_move()

Sets the VR position on the first step iteration where the hmd tracking is valid. Not to be confused with self.set_vr_start_pos, which simply records the desired start position before the simulator starts running.

poll_vr_events()

Returns VR event data as a list of lists. The list is empty if all events are invalid.

Components of a single event:
  - controller: 0 (left_controller), 1 (right_controller)
  - button_idx: any valid idx in the EVRButtonId enum in the openvr.h header file
  - press: 0 (unpress), 1 (press)
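Assuming the [controller, button_idx, press] layout documented above, the event list can be filtered in plain Python; the button index here is just a placeholder value:

```python
# Example events in the documented [controller, button_idx, press] layout.
events = [
    [1, 33, 1],  # right_controller, button 33, press
    [0, 33, 0],  # left_controller, button 33, unpress
]

CONTROLLER_NAMES = {0: "left_controller", 1: "right_controller"}
presses = [
    (CONTROLLER_NAMES[controller], button_idx)
    for controller, button_idx, press in events
    if press == 1
]
```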

query_vr_event(controller, action)

Queries the system for a VR event, and returns True if that event happened this frame

Parameters
  • controller – device to query - can be left_controller or right_controller

  • action – an action name listed in the “action_button_map” dictionary for the current device in vr_config.yml

register_main_vr_robot(vr_robot)

Register the robot representing the VR user.

reload()

Destroy the MeshRenderer and physics simulator and start again.

set_hidden_state(obj, hide=True)

Sets the hidden state of an object to be either hidden or not hidden. The object passed in must inherit from Object at the top level

Note: this function must be called after step() in the rendering loop Note 2: this function only works with the optimized renderer - please use the renderer hidden list to hide objects in the non-optimized renderer

set_hud_show_state(show_state)

Shows or hides the main VR HUD.

Parameters

show_state – whether to show the HUD or not

set_hud_state(state)

Sets the state of the VR HUD (heads-up display)

Parameters

state – one of ‘show’ or ‘hide’

set_render_timestep(render_timestep)
Parameters

render_timestep – render timestep to set in the Simulator

set_timestep(physics_timestep, render_timestep)

Set physics timestep and render (action) timestep

Parameters
  • physics_timestep – physics timestep for pybullet

  • render_timestep – rendering timestep for renderer
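With the default constructor values (physics_timestep ≈ 1/120 s, render_timestep ≈ 1/30 s), each render (action) step spans a whole number of physics substeps - a relationship worth preserving when changing either value:

```python
# Default timesteps from the Simulator constructor signature above.
physics_timestep = 1 / 120  # ~0.00833 s
render_timestep = 1 / 30    # ~0.03333 s

# Number of physics substeps simulated per render (action) step.
substeps = round(render_timestep / physics_timestep)  # 4
```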

set_vr_offset(pos=None)

Sets the translational offset of the VR system (HMD, left controller, right controller) from world-space coordinates. Can be used for many things, including adjusting height and teleportation-based movement

Parameters

pos – must be a list of three floats, corresponding to x, y, z in iGibson coordinate space

set_vr_pos(pos=None, keep_height=False)

Sets the world position of the VR system in iGibson space

Parameters
  • pos – position to set the VR system to

  • keep_height – whether the current VR height should be kept

set_vr_start_pos(start_pos=None, vr_height_offset=None)

Sets the starting position of the VR system in iGibson space

Parameters
  • start_pos – position to start the VR system at

  • vr_height_offset – starting height offset. If None, uses the absolute height from start_pos

step(print_stats=False)

Step the simulation by self.render_timestep and update the object positions in the renderer

step_vr(print_stats=False)

Step the simulation when using VR. Order of function calls:
  1. Simulate physics
  2. Render frame
  3. Submit rendered frame to VR compositor
  4. Update VR data for use in the next frame

sync(force_sync=False)

Update positions in renderer without stepping the simulation. Usually used in the reset() function

sync_vr_compositor()

Sync VR compositor.

trigger_haptic_pulse(device, strength)

Triggers a haptic pulse of the specified strength (0 is weakest, 1 is strongest)

Parameters
  • device – device to trigger the haptic pulse for - one of [left_controller, right_controller]

  • strength – strength of the haptic pulse (0 is weakest, 1 is strongest)

update_position(instance, force_sync=False)

Update the position of an object or a robot in the renderer

Parameters

instance – Instance in the renderer

vr_system_update()

Updates the VR system for a single frame. This includes moving the VR offset, adjusting the user’s height based on button input, and triggering haptics.

Module contents