igibson.utils package


igibson.utils.assets_utils module


Download iGibson assets


Download Gibson dataset


Download iGibson demo dataset


Download iGibson dataset


Get 3D-FRONT scene path


scene_name – scene name


file path to the scene


Get iGibson all object models


a list of all object model paths


Get cubicasa scene path


scene_name – scene name


file path to the scene


Get iGibson asset version


iGibson asset version


Load average object specs (dimension and mass) for objects


Get iGibson object categories


a list of all object categories


Get iGibson object category path


category_name – object category


file path to the object category

igibson.utils.assets_utils.get_ig_model_path(category_name, model_name)

Get iGibson object model path

  • category_name – object category

  • model_name – object model


file path to the object model


Get iGibson scene path


scene_name – scene name


file path to the scene


Get Gibson scene path


scene_id – scene id


scene path for this scene_id


Get texture file


mesh_file – mesh obj file


texture file path

igibson.utils.behavior_robot_planning_utils module

igibson.utils.behavior_robot_planning_utils.dry_run_arm_plan(robot: igibson.robots.behavior_robot.BehaviorRobot, plan)
igibson.utils.behavior_robot_planning_utils.dry_run_base_plan(robot: igibson.robots.behavior_robot.BehaviorRobot, plan)
igibson.utils.behavior_robot_planning_utils.get_hand_distance_fn(weights=array([1.0, 1.0, 1.0, 1.0, 1.0, 1.0]))
igibson.utils.behavior_robot_planning_utils.plan_base_motion_br(robot: igibson.robots.behavior_robot.BehaviorRobot, end_conf, base_limits, obstacles=[], direct=False, weights=array([1.0, 1.0, 1.0]), resolutions=array([0.05, 0.05, 0.05]), max_distance=0.0, override_sample_fn=None, **kwargs)
igibson.utils.behavior_robot_planning_utils.plan_hand_motion_br(robot: igibson.robots.behavior_robot.BehaviorRobot, obj_in_hand, end_conf, hand_limits, obstacles=[], direct=False, weights=(1, 1, 1, 5, 5, 5), resolutions=array([0.02, 0.02, 0.02, 0.02, 0.02, 0.02]), max_distance=0.0, **kwargs)

igibson.utils.checkpoint_utils module

This file contains utils for BEHAVIOR demo replay checkpoints.

igibson.utils.checkpoint_utils.load_checkpoint(simulator, root_directory, frame)
igibson.utils.checkpoint_utils.load_internal_states(simulator, dump)
igibson.utils.checkpoint_utils.save_checkpoint(simulator, root_directory)

igibson.utils.constants module

Constant Definitions

class igibson.utils.constants.OccupancyGridState

Bases: object

class igibson.utils.constants.PyBulletSleepState(value)

Bases: enum.IntEnum

An enumeration.

class igibson.utils.constants.SemanticClass(value)

Bases: enum.IntEnum

An enumeration.

class igibson.utils.constants.ShadowPass(value)

Bases: enum.IntEnum

An enumeration.


igibson.utils.generate_trav_map module

igibson.utils.generate_trav_map.generate_trav_map(scene_name, scene_source, load_full_scene=True)

igibson.utils.git_utils module


igibson.utils.ig_logging module

IG logging classes that write/read iGibson data to/from HDF5. These classes can be used to write regular logs, iGATUS task logs or VR logs to HDF5 for saving and replay.

class igibson.utils.ig_logging.IGLogReader(log_filepath, log_status=True)

Bases: object


Call this once reading has finished to clean up resources used.


Gets action for agent with a specific name.


Returns whether there is still data left to read.

static get_obj_body_id_to_name(vr_log_path)

Returns VR data for the current frame as a VrData object. This can be indexed into to analyze individual values, or can be passed into the BehaviorRobot to drive its actions for a single frame.

static has_metadata_attr(vr_log_path, attr_name)

Checks whether a given HDF5 log has a metadata attribute.


Reads the action at action_path for the current frame.

action_path: /-separated string representing the action to fetch. This should match an action that was previously registered with the VRLogWriter during data saving.

static read_metadata_attr(vr_log_path, attr_name)

Reads a metadata attribute from a given HDF5 log path.


Reads any saved value at value_path for the current frame.


value_path: /-separated string representing the value to fetch. This should be one of the values listed in the comment at the top of this file. Eg. vr/vr_button_data/right_controller


Sets camera based on saved camera matrices. Only valid if VR was used to save a demo. :param sim: Simulator object

class igibson.utils.ig_logging.IGLogWriter(sim, log_filepath, frames_before_write=200, task=None, store_vr=False, vr_robot=None, filter_objects=True, profiling_mode=False, log_status=True)

Bases: object

Class that handles saving of physics data, VR data, iGATUS task data and user-defined actions.

Usage: 1) Before simulation loop init -> N x register_action -> set_up_data_storage

2) During simulation: N x save_action (at any point during frame) -> process_frame (at end of frame)

3) After simulation, before disconnecting from PyBullet server: end_log_session


Creates data map of data that will go into HDF5 file. All the data in the map is reset after every self.frames_before_write frames, by refresh_data_map.


Closes hdf5 log file at end of logging session.


Generates lists of name paths for resolution in HDF5 saving. Eg. [‘vr’, ‘vr_camera’, ‘right_eye_view’].


Resolves a list of names (group/dataset) into a numpy array. eg. [vr, vr_camera, right_eye_view] -> self.data_map[‘vr’][‘vr_camera’][‘right_eye_view’]

static one_hot_encoding(hits, categories)

Asks the VRLogger to process frame data. This includes updating pybullet data and incrementing the frame counter by 1.


Resets all values stored in self.data_map to the default sentinel value. This function is called after we have written the last self.frames_before_write frames to HDF5 and can start inputting new frame data into the data map.

register_action(action_path, action_shape)

Registers an action to be saved every frame in the VRLogWriter.

action_path: The /-separated path specifying where to save action data. All entries but the last will be treated as group names, and the last entry will be the dataset. The parent group for all actions is called action. Eg. action_path = vr_hand/constraint. This will end up in action (group) -> vr_hand (group) -> constraint (dataset) in the saved data.

action_shape: tuple representing action shape. It is expected that all actions will be numpy arrays. They are stacked over time in the first dimension to create a persistent action data store.
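The register/save flow described above can be sketched with a plain nested dict standing in for the HDF5 groups and datasets. `MiniLogWriter` and its storage details are illustrative simplifications, not the library's implementation:

```python
# Minimal sketch of the IGLogWriter register/save flow, with a plain
# nested dict in place of HDF5 groups/datasets. Method names mirror the
# documented API; the storage details are simplified for illustration.

class MiniLogWriter:
    def __init__(self, frames_before_write=200):
        self.frames_before_write = frames_before_write
        self.registered = {}   # action_path -> action_shape
        self.data_map = {}     # nested dict: groups -> dataset (list of frames)
        self.frame_counter = 0

    def register_action(self, action_path, action_shape):
        # All entries but the last are group names; the last is the dataset.
        self.registered[action_path] = action_shape

    def set_up_data_storage(self):
        # Called once, after all actions have been registered.
        for path in self.registered:
            node = self.data_map
            parts = path.split("/")
            for group in parts[:-1]:
                node = node.setdefault(group, {})
            node[parts[-1]] = []  # frames stacked over time

    def save_action(self, action_path, action):
        node = self.data_map
        parts = action_path.split("/")
        for group in parts[:-1]:
            node = node[group]
        node[parts[-1]].append(action)

    def process_frame(self):
        self.frame_counter += 1

writer = MiniLogWriter()
writer.register_action("vr_hand/constraint", (3,))
writer.set_up_data_storage()
writer.save_action("vr_hand/constraint", [0.0, 1.0, 2.0])
writer.process_frame()
```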

save_action(action_path, action)

Saves a single action to the VRLogWriter. It is assumed that this function will be called every frame, including the first.


action_path: The /-separated action path that was used to register this action.

action: The action as a numpy array - must have the same shape as the action_shape that was registered along with this action path.


Performs set up of internal data structures needed for storage, once VRLogWriter has been initialized and all actions have been registered.


Writes frame data to the data map.


s (simulator): used to extract information about VR system


Write all pybullet data to the class’ internal map.


Writes data stored in self.data_map to HDF5. The data is saved each time this function is called, so data will be saved even if a Ctrl+C event interrupts the program.


Writes all VR data to map. This will write data that the user has not even processed in their demos. For example, we will store eye tracking data if it is valid, even if they do not explicitly use that data in their code. This will help us store all the necessary data without remembering to call the simulator’s data extraction functions every time we want to save data.


s (simulator): used to extract information about VR system

igibson.utils.map_utils module

class igibson.utils.map_utils.Plane(orig, normal)

Bases: object

igibson.utils.map_utils.compute_triangle_plane_intersections(vertices, faces, tid, plane, dists, dist_tol=1e-08)

Compute the intersection between a triangle and a plane.

Returns a list of intersections in the form (INTERSECT_EDGE, <intersection point>, <edge>) for edge intersections, or (INTERSECT_VERTEX, <intersection point>, <vertex index>) for vertex intersections.

This returns between 0 and 2 intersections: - 0: the plane does not intersect the triangle - 1: one of the triangle’s vertices lies on the plane (so it just “touches” the plane without really intersecting) - 2: the plane slices the triangle in two parts (either vertex-edge, vertex-vertex or edge-edge)
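The three cases above can be illustrated by classifying the triangle's vertices by their signed distance to the plane. This pure-Python sketch uses illustrative helper names and omits the actual edge/vertex intersection-point computation:

```python
# Illustration of the 0/1/2 intersection cases: classify a triangle
# against a plane using signed vertex distances. Helper names are
# illustrative; the real code works on indexed mesh arrays.

def point_to_plane_dist(p, orig, normal):
    # Signed distance from point p to the plane through orig with unit normal.
    return sum((pi - oi) * ni for pi, oi, ni in zip(p, orig, normal))

def count_plane_crossings(triangle, orig, normal, dist_tol=1e-8):
    dists = [point_to_plane_dist(v, orig, normal) for v in triangle]
    on_plane = sum(abs(d) <= dist_tol for d in dists)
    pos = sum(d > dist_tol for d in dists)
    neg = sum(d < -dist_tol for d in dists)
    if pos == 3 or neg == 3:
        return 0          # plane does not intersect the triangle
    if on_plane == 1 and (pos == 2 or neg == 2):
        return 1          # one vertex just touches the plane
    return 2              # plane slices the triangle into two parts

# Triangle straddling the z=0 plane -> 2 crossings.
tri = [(0.0, 0.0, -1.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0)]
crossings = count_plane_crossings(tri, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```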

igibson.utils.map_utils.gen_map(vertices, faces, output_folder, img_filename_format='floor_{}.png')
igibson.utils.map_utils.gen_trav_map(vertices, faces, output_folder, add_clutter=False, trav_map_filename_format='floor_trav_{}.png', obstacle_map_filename_format='floor_{}.png')

Generate traversability maps.

igibson.utils.map_utils.get_xy_floors(vertices, faces, dist_threshold=-0.98)
igibson.utils.map_utils.point_to_plane_dist(p, plane)

igibson.utils.mesh_geometric_conversion module

igibson.utils.mesh_geometric_conversion.insert_geometric_primitive(obj, insert_visual_mesh=True)

igibson.utils.mesh_util module

3D mesh manipulation utilities.

igibson.utils.mesh_util.anorm(x, axis=None, keepdims=False)

Compute L2 norms along specified axes.

igibson.utils.mesh_util.frustum(left, right, bottom, top, znear, zfar)

Create view frustum matrix.

igibson.utils.mesh_util.homotrans(M, p)

Load 3D mesh from an ‘.obj’ file.


fn: Input file name or file-like object.


dictionary with the following keys (some of which may be missing): position: np.float32, (n, 3) array, vertex positions uv: np.float32, (n, 2) array, vertex uv coordinates normal: np.float32, (n, 3) array, vertex normals face: np.int32, (k*3,) triangular face indices
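A minimal sketch of how those keys map onto Wavefront .obj records (‘v’, ‘vt’, ‘vn’, ‘f’); plain lists stand in for the numpy arrays, and `load_obj_minimal` is an illustrative name, not the library function:

```python
# Minimal sketch of loading the keys described above from .obj source
# lines: 'v' -> position, 'vt' -> uv, 'vn' -> normal, 'f' -> face indices.
def load_obj_minimal(lines):
    mesh = {"position": [], "uv": [], "normal": [], "face": []}
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            mesh["position"].append([float(x) for x in parts[1:4]])
        elif parts[0] == "vt":
            mesh["uv"].append([float(x) for x in parts[1:3]])
        elif parts[0] == "vn":
            mesh["normal"].append([float(x) for x in parts[1:4]])
        elif parts[0] == "f":
            # Keep only the vertex index of each 'v/vt/vn' token, 0-based.
            mesh["face"].extend(int(tok.split("/")[0]) - 1 for tok in parts[1:4])
    return mesh

obj_text = [
    "v 0 0 0", "v 1 0 0", "v 0 1 0",
    "vn 0 0 1",
    "f 1//1 2//1 3//1",
]
mesh = load_obj_minimal(obj_text)
```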

igibson.utils.mesh_util.lookat(eye, target=[0, 0, 0], up=[0, 1, 0])

Generate LookAt modelview matrix.

igibson.utils.mesh_util.normalize(v, axis=None, eps=1e-10)

L2 Normalize along specified axes.
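For plain 3-vectors, `anorm` and `normalize` reduce to the following sketch (the library versions operate on numpy arrays along arbitrary axes):

```python
import math

# Sketch of anorm/normalize for plain vectors; eps guards zero vectors.
def anorm(v):
    return math.sqrt(sum(x * x for x in v))

def normalize(v, eps=1e-10):
    n = max(anorm(v), eps)
    return [x / n for x in v]

unit = normalize([3.0, 0.0, 4.0])
```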


Scale mesh to fit into -1..1 cube

igibson.utils.mesh_util.ortho(left, right, bottom, top, znear, zfar)

Create orthographic projection matrix.

igibson.utils.mesh_util.perspective(fovy, aspect, znear, zfar)

Create perspective projection matrix.


quat – quaternion in w,x,y,z


rotation matrix 4x4


mat – 4x4 matrix


quaternion in w,x,y,z
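The w,x,y,z quaternion to rotation-matrix direction above can be sketched with the standard unit-quaternion formula; `quat_wxyz_to_mat4` is an illustrative name:

```python
# Convert a unit quaternion in w, x, y, z order to a 4x4 rotation matrix
# (standard formula; plain nested lists in place of a numpy array).
def quat_wxyz_to_mat4(quat):
    w, x, y, z = quat
    return [
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z), 2 * (x * z + w * y), 0.0],
        [2 * (x * y + w * z), 1 - 2 * (x * x + z * z), 2 * (y * z - w * x), 0.0],
        [2 * (x * z - w * y), 2 * (y * z + w * x), 1 - 2 * (x * x + y * y), 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]

# The identity quaternion maps to the identity matrix.
identity = quat_wxyz_to_mat4([1.0, 0.0, 0.0, 0.0])
```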

igibson.utils.mesh_util.sample_view(min_dist, max_dist=None)

Sample a random camera position, directed at the origin, within the given distance range from the origin. The ModelView matrix is returned.

igibson.utils.mesh_util.save_obj(vertices_info, faces_info, fn)
igibson.utils.mesh_util.transform_vertex(vertices, pose_rot, pose_trans)

orn – quaternion in xyzw


quaternion in wxyz

igibson.utils.monitor module

class igibson.utils.monitor.Monitor(env, filename, allow_early_resets=False, reset_keywords=())

Bases: gym.core.Wrapper

EXT = 'monitor.csv'

Override close in your subclass to perform any necessary cleanup.

Environments will automatically close() themselves when garbage collected or when the program exits.

f = None

igibson.utils.motion_planning_wrapper module

class igibson.utils.motion_planning_wrapper.MotionPlanningWrapper(env=None, base_mp_algo='birrt', arm_mp_algo='birrt', optimize_iter=0, fine_motion_plan=True)

Bases: object

Motion planner wrapper that supports both base and arm motion


Dry run arm motion plan by setting the arm joint position without physics simulation


arm_path – arm trajectory or None if no plan can be found


Dry run base motion plan by setting the base positions without physics simulation


path – base waypoints or None if no plan can be found

execute_arm_push(plan, hit_pos, hit_normal)

Execute arm push given arm trajectory. Should be called after plan_arm_push().

  • plan – arm trajectory or None if no plan can be found

  • hit_pos – 3D position to reach

  • hit_normal – direction to push after reaching that position


Attempt to find arm_joint_positions that satisfies arm_subgoal. If failed, return None.


arm_ik_goal – [x, y, z] in the world frame


arm joint positions


Get IK parameters such as joint limits, joint damping, reset position, etc


IK parameters

interact(push_point, push_direction)

Move the arm starting from the push_point along the push_direction and physically simulate the interaction

  • push_point – 3D point to start pushing from

  • push_direction – push direction


Attempt to reach arm_joint_positions and return the arm trajectory. If failed, reset the arm to its original pose and return None.


arm_joint_positions – final arm joint position to reach


arm trajectory or None if no plan can be found

plan_arm_push(hit_pos, hit_normal)

Attempt to reach a 3D position and prepare for a push later

  • hit_pos – 3D position to reach

  • hit_normal – direction to push after reaching that position


arm trajectory or None if no plan can be found


Plan base motion given a base subgoal


goal – base subgoal


waypoints or None if no plan can be found


Set subgoal marker position


pos – position

set_marker_position_direction(pos, direction)

Set subgoal marker position and orientation

  • pos – position

  • direction – direction vector

set_marker_position_yaw(pos, yaw)

Set subgoal marker position and orientation

  • pos – position

  • yaw – yaw angle


Set up arm motion planner


Step the simulator and sync the simulator to renderer


Sync the simulator to renderer

igibson.utils.muvr_utils module

igibson.utils.sampling_utils module

igibson.utils.sampling_utils.check_cuboid_empty(hit_normal, bottom_corner_positions, refusal_log, this_cuboid_dimensions)
igibson.utils.sampling_utils.check_hit_max_angle_from_z_axis(hit_normal, max_angle_with_z_axis, refusal_log)
igibson.utils.sampling_utils.check_normal_similarity(center_hit_normal, hit_normals, refusal_log)
igibson.utils.sampling_utils.check_rays_hit_object(cast_results, body_id, refusal_log)
igibson.utils.sampling_utils.compute_ray_destination(axis, is_top, start_pos, aabb_min, aabb_max)
igibson.utils.sampling_utils.compute_rotation_from_grid_sample(two_d_grid, hit_positions, cuboid_centroid, this_cuboid_dimensions)

Fits a plane to the given 3D points. Copied from https://stackoverflow.com/a/18968498


points – np.array of shape (k, 3)

:return Tuple[np.array, np.array] where the first element is the points’ centroid and the second is the normal of the fitted plane.

igibson.utils.sampling_utils.get_distance_to_plane(points, plane_centroid, plane_normal)
igibson.utils.sampling_utils.get_parallel_rays(source, destination, offset, new_ray_per_horizontal_distance=0.1)

Given a ray described by a source and a destination, sample parallel rays and return together with input ray.

The parallel rays start at the corners of a square of edge length offset centered on source, with the square orthogonal to the ray direction. That is, the cast rays are the height edges of a square-base cuboid with bases centered on source and destination.

  • source – Source of the ray to sample parallel rays of.

  • destination – Destination of the ray to sample parallel rays of.

  • offset – Orthogonal distance of parallel rays from input ray.

  • new_ray_per_horizontal_distance – Step in offset beyond which an additional split will be applied in the parallel ray grid (which at minimum is 3x3 at the AABB corners & center).

:return Tuple[List, List, Array[W, H, 3]] containing sources and destinations of the original ray and the unflattened, untransformed grid in object coordinates.
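The corner construction described above (a square of edge length offset, centered on source and orthogonal to the ray direction) can be sketched as follows. `parallel_ray_corners` is an illustrative helper, and the real function also builds the denser interior grid:

```python
import math

def _normalize(v):
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def parallel_ray_corners(source, destination, offset):
    # Unit direction of the input ray.
    d = _normalize([b - a for a, b in zip(source, destination)])
    # Pick any vector not parallel to d to build an orthonormal basis (u, v)
    # spanning the plane orthogonal to the ray.
    ref = [1.0, 0.0, 0.0] if abs(d[0]) < 0.9 else [0.0, 1.0, 0.0]
    u = _normalize([d[1] * ref[2] - d[2] * ref[1],
                    d[2] * ref[0] - d[0] * ref[2],
                    d[0] * ref[1] - d[1] * ref[0]])
    v = [d[1] * u[2] - d[2] * u[1],
         d[2] * u[0] - d[0] * u[2],
         d[0] * u[1] - d[1] * u[0]]
    h = offset / 2.0
    # Corners of a square of edge length `offset` centered on source.
    return [[s + su * u_i + sv * v_i for s, u_i, v_i in zip(source, u, v)]
            for su in (-h, h) for sv in (-h, h)]

# Downward ray from the origin; corners lie in the z = 0 plane.
corners = parallel_ray_corners([0.0, 0.0, 0.0], [0.0, 0.0, -1.0], 0.2)
```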

igibson.utils.sampling_utils.get_projection_onto_plane(points, plane_centroid, plane_normal)
igibson.utils.sampling_utils.sample_cuboid_on_object(obj, num_samples, cuboid_dimensions, bimodal_mean_fraction, bimodal_stdev_fraction, axis_probabilities, undo_padding=False, aabb_offset=0.1, max_sampling_attempts=10, max_angle_with_z_axis=2.356194490192345, hit_to_plane_threshold=0.05, refuse_downwards=False)

Samples points on an object’s surface using ray casting.

  • obj – The object to sample points on.

  • num_samples – int, the number of points to try to sample.

  • cuboid_dimensions – Float sequence of len 3, the size of the empty cuboid we are trying to sample. Can also provide a list of cuboid dimension triplets, in which case each i’th sample will be sampled using the i’th triplet.

  • bimodal_mean_fraction – float, the mean of one side of the symmetric bimodal distribution as a fraction of the min-max range.

  • bimodal_stdev_fraction – float, the standard deviation of one side of the symmetric bimodal distribution as a fraction of the min-max range.

  • axis_probabilities – Array of shape (3, ), the probability of ray casting along each axis.

  • undo_padding – bool. Whether the bottom padding that’s applied to the cuboid should be removed before return. Useful when the cuboid needs to be flush with the surface for whatever reason. Note that the padding will still be applied initially (since it’s not possible to do the cuboid emptiness check without doing this - otherwise the rays will hit the sampled-on object), so the emptiness check still checks a padded cuboid. This flag will simply make the sampler undo the padding prior to returning.

  • aabb_offset – float, padding for AABB to make sure rays start outside the actual object.

  • max_sampling_attempts – int, how many times sampling will be attempted for each requested point.

  • max_angle_with_z_axis – float, maximum angle between hit normal and positive Z axis allowed. Can be used to disallow downward-facing hits when refuse_downwards=True.

  • hit_to_plane_threshold – float, how far any given hit position can be from the least-squares fit plane to all of the hit positions before the sample is rejected.

  • refuse_downwards – bool, whether downward-facing hits (as defined by max_angle_with_z_axis) are allowed.


List of num_samples elements where each element is a tuple in the form of (cuboid_centroid, cuboid_up_vector, cuboid_rotation, {refusal_reason: [refusal_details…]}). Cuboid positions are set to None when no successful sampling happens within the max number of attempts. Refusal details are only filled if the debug_sampling flag is globally set to True.

igibson.utils.sampling_utils.sample_origin_positions(mins, maxes, count, bimodal_mean_fraction, bimodal_stdev_fraction, axis_probabilities)

Sample ray casting origin positions with a given distribution.

The way the sampling works is that for each particle, it will sample two coordinates uniformly and one using a symmetric, bimodal truncated normal distribution. This way, the particles will mostly be close to the faces of the AABB (given a correctly parameterized bimodal truncated normal) and will be spread across each face, but there will still be a small number of particles spawned inside the object if it has an interior.

  • mins – Array of shape (3, ), the minimum coordinate along each axis.

  • maxes – Array of shape (3, ), the maximum coordinate along each axis.

  • count – int, Number of origins to sample.

  • bimodal_mean_fraction – float, the mean of one side of the symmetric bimodal distribution as a fraction of the min-max range.

  • bimodal_stdev_fraction – float, the standard deviation of one side of the symmetric bimodal distribution as a fraction of the min-max range.

  • axis_probabilities – Array of shape (3, ), the probability of ray casting along each axis.


List of (ray cast axis index, bool whether the axis was sampled from the top side, [x, y, z]) tuples.
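A sketch of the sampling scheme described above, approximating the truncated bimodal normal with a clipped Gaussian; the function body is illustrative, not the library implementation:

```python
import random

# Per sample: one axis is chosen by axis_probabilities; that coordinate is
# drawn from one mode of the bimodal distribution near an AABB face
# (clipped Gaussian here, as an approximation); the other two are uniform.
def sample_origin_positions(mins, maxes, count,
                            bimodal_mean_fraction, bimodal_stdev_fraction,
                            axis_probabilities):
    samples = []
    for _ in range(count):
        axis = random.choices([0, 1, 2], weights=axis_probabilities)[0]
        top = random.random() < 0.5  # which face the mode hugs
        pos = [random.uniform(lo, hi) for lo, hi in zip(mins, maxes)]
        span = maxes[axis] - mins[axis]
        frac = random.gauss(bimodal_mean_fraction, bimodal_stdev_fraction)
        frac = min(max(frac, 0.0), 1.0)  # clip instead of truncating
        pos[axis] = maxes[axis] - frac * span if top else mins[axis] + frac * span
        samples.append((axis, top, pos))
    return samples

random.seed(0)
origins = sample_origin_positions([0, 0, 0], [1, 1, 1], 5, 0.1, 0.05, [0.3, 0.3, 0.4])
```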

igibson.utils.scene_geometric_conversion module

igibson.utils.semantics_utils module


Get mapping from semantic class name to class id


starting_class_id – starting class id for scene objects

igibson.utils.tf_utils module


igibson.utils.urdf_utils module

Add a fixed link onto a URDF tree.

  • tree – The URDF tree (ElementTree) to add to.

  • link_name – The name of the link to add.

  • link_info – A dict that stores the link info, including geometry (box or None), size (for box), and the xyz and rpy of the origin of the joint.

:return: None
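Adding such a fixed link with xml.etree.ElementTree might look like the following sketch. The element names follow the URDF format, while the parent-link choice and the helper name are assumptions for illustration:

```python
import xml.etree.ElementTree as ET

# Sketch: append a link plus a fixed joint to a URDF ElementTree, driven
# by the link_info fields described above (geometry, size, xyz, rpy).
def add_fixed_link(tree, link_name, link_info):
    robot = tree.getroot()
    link = ET.SubElement(robot, "link", {"name": link_name})
    if link_info.get("geometry") == "box":
        visual = ET.SubElement(link, "visual")
        geom = ET.SubElement(visual, "geometry")
        ET.SubElement(geom, "box", {"size": link_info["size"]})
    joint = ET.SubElement(robot, "joint",
                          {"name": "%s_joint" % link_name, "type": "fixed"})
    ET.SubElement(joint, "origin",
                  {"xyz": link_info["xyz"], "rpy": link_info["rpy"]})
    ET.SubElement(joint, "parent", {"link": "base_link"})  # assumed parent
    ET.SubElement(joint, "child", {"link": link_name})

tree = ET.ElementTree(
    ET.fromstring('<robot name="demo"><link name="base_link"/></robot>'))
add_fixed_link(tree, "marker", {"geometry": "box", "size": "0.1 0.1 0.1",
                                "xyz": "0 0 0.5", "rpy": "0 0 0"})
```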


Parse URDF for splitting by floating joints later

igibson.utils.urdf_utils.round_up(n, decimals=0)

Helper function to round a float

igibson.utils.urdf_utils.save_urdfs_without_floating_joints(tree, main_body_is_fixed, file_prefix)

Split one URDF into multiple URDFs if there are floating joints and save them

igibson.utils.urdf_utils.splitter(parent_map, child_map, joint_map, single_child_link)

Recursively split URDFs by floating joints

igibson.utils.urdf_utils.transform_element_xyzrpy(element, transformation)

Transform a URDF element by transformation

  • element – URDF XML element

  • transformation – transformation that should be applied to the element

igibson.utils.utils module

igibson.utils.utils.brighten_texture(input_filename, output_filename, brightness=1)
igibson.utils.utils.cartesian_to_polar(x, y)

Convert cartesian coordinate to polar coordinate


Converts a YAML config into a string


Returns the roll, pitch, yaw (Euler) angles for a given rotation or homogeneous transformation matrix.

transformation = Array with the rotation (3x3) or full transformation (4x4)

igibson.utils.utils.get_transform_from_xyz_rpy(xyz, rpy)

Returns a homogeneous transformation matrix (numpy array 4x4) for the given translation and rotation in roll, pitch, yaw.

xyz = Array of the translation rpy = Array with roll, pitch, yaw rotations
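A sketch of that transform, composing R = Rz(yaw) · Ry(pitch) · Rx(roll) (a common convention; the library may differ in convention details):

```python
import math

# Build a 4x4 homogeneous transform from a translation and roll/pitch/yaw,
# using the composed Rz(yaw) @ Ry(pitch) @ Rx(roll) rotation.
def get_transform_from_xyz_rpy(xyz, rpy):
    r, p, y = rpy
    cr, sr = math.cos(r), math.sin(r)
    cp, sp = math.cos(p), math.sin(p)
    cy, sy = math.cos(y), math.sin(y)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, xyz[0]],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, xyz[1]],
        [-sp, cp * sr, cp * cr, xyz[2]],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Zero rotation: identity rotation block with the translation column.
T = get_transform_from_xyz_rpy([1.0, 2.0, 3.0], [0.0, 0.0, 0.0])
```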

igibson.utils.utils.l2_distance(v1, v2)

Returns the L2 distance between vector v1 and v2.

igibson.utils.utils.multQuatLists(q0, q1)

Multiply two quaternions that are represented as lists.


Normalizes a vector list.


Parse iGibson config file / object


Parse string config

igibson.utils.utils.quatFromXYZW(xyzw, seq)

Convert quaternion from XYZW (pybullet convention) to arbitrary sequence.

igibson.utils.utils.quatToXYZW(orn, seq)

Convert quaternion from arbitrary sequence to XYZW (pybullet convention).
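Both conversions are pure reorderings of the four quaternion components; a sketch with illustrative lower-case names:

```python
# Reorder a quaternion between pybullet's x, y, z, w order and an
# arbitrary 4-character sequence string such as 'wxyz'.
def quat_to_xyzw(orn, seq):
    # orn is given in `seq` order; return it in x, y, z, w order.
    return [orn[seq.index(axis)] for axis in "xyzw"]

def quat_from_xyzw(xyzw, seq):
    # xyzw is in pybullet order; return it reordered into `seq` order.
    return [xyzw["xyzw".index(axis)] for axis in seq]

xyzw = quat_to_xyzw([1.0, 0.0, 0.0, 0.0], "wxyz")  # w-first identity -> xyzw
```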


Convert quaternion from rotation matrix

igibson.utils.utils.quat_pos_to_mat(pos, quat)

Convert position and quaternion to transformation matrix

igibson.utils.utils.rotate_vector_2d(v, yaw)

Rotates 2d vector by yaw counterclockwise

igibson.utils.utils.rotate_vector_3d(v, r, p, y, cck=True)

Rotates 3d vector by roll, pitch and yaw counterclockwise
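The 2D case reduces to the standard counterclockwise rotation matrix; a sketch (sign conventions are an assumption based on the "counterclockwise" wording above):

```python
import math

# Counterclockwise rotation of a 2D vector (x, y) by angle yaw.
def rotate_vector_2d(v, yaw):
    c, s = math.cos(yaw), math.sin(yaw)
    return [c * v[0] - s * v[1], s * v[0] + c * v[1]]

rotated = rotate_vector_2d([1.0, 0.0], math.pi / 2)  # approximately (0, 1)
```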

igibson.utils.utils.transform_texture(input_filename, output_filename, mixture_weight=0, mixture_color=(0, 0, 0))

igibson.utils.vision_utils module

class igibson.utils.vision_utils.RandomScale(minsize, maxsize, interpolation=2)

Bases: object

Rescale the input PIL.Image to the given size.

Args:

size (sequence or int): Desired output size. If size is a sequence like (w, h), the output size will be matched to this. If size is an int, the smaller edge of the image will be matched to this number, i.e. if height > width, the image will be rescaled to (size * height / width, size).

interpolation (int, optional): Desired interpolation. Default is PIL.Image.BILINEAR.
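The size-resolution rule in the docstring can be sketched as plain arithmetic. `resolve_scaled_size` is an illustrative helper, and drawing the int target uniformly between minsize and maxsize is an assumption based on the class name; the real class may differ in details:

```python
import random

# Match the smaller image edge to a randomly drawn int target, preserving
# aspect ratio (the int branch of the docstring's size rule).
def resolve_scaled_size(width, height, minsize, maxsize):
    size = random.randint(minsize, maxsize)  # target for the smaller edge
    if width <= height:
        return size, int(round(size * height / width))
    return int(round(size * width / height)), size

random.seed(0)
w, h = resolve_scaled_size(200, 400, 100, 100)  # smaller edge -> 100
```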

igibson.utils.vr_log_converter module

igibson.utils.vr_plot_profiling module

Code to plot realtime graph.

To use, run the iGibson simulator loop with simulator.step(print_stats=True) and redirect stdout to a file:

python my_gibson_test.py > log.txt

and provide log.txt to the profiling script via the --filename arg


igibson.utils.vr_utils module

This module contains vr utility functions and classes.

class igibson.utils.vr_utils.VrData(data_dict=None)

Bases: object

A class that holds VR data for a given frame. This is a clean way to pass around VR data that has been produced/saved, either in MUVR or in data replay.

The class contains a dictionary with the following key/value pairs: Key: hmd, left_controller, right_controller Values: is_valid, trans, rot, right, up, forward, left/right model rotation quaternion

Key: torso_tracker Values: is_valid, trans, rot

Key: left_controller_button, right_controller_button Values: trig_frac, touch_x, touch_y

Key: eye_data Values: is_valid, origin, direction, left_pupil_diameter, right_pupil_diameter

Key: reset_actions Values: left_reset bool, right_reset bool

Key: event_data Values: list of lists, where each sublist is a device, (button, status) pair

Key: vr_positions Values: vr_pos (world position of VR in iGibson), vr_offset (offset of VR system from origin)

Key: vr_settings Values: touchpad_movement, movement_controller, movement_speed, relative_movement_device


Utility function to print VrData object in a pretty fashion.


Queries VrData object and returns values. Please see class description for possible values that can be queried.

q is the input query and must be a string corresponding to one of the keys of the self.vr_data_dict object

refresh_action_replay_data(ar_data, frame_num)

Updates the vr dictionary with data from action replay. :param ar_data: data from action replay :param frame_num: frame to recover action replay data on


Returns dictionary form of the VrData class - perfect for sending over networks

class igibson.utils.vr_utils.VrTimer

Bases: object

Class that can be used to time events - eg. in speed benchmarks.


Gets timer value. If there is no start value, return 0. If we haven’t stopped (i.e. self.time_stop is None), return the time since start. If we have stopped, return the duration of the timer interval.


Returns state of timer - either running or not


Refreshes timer


Starts the timer running


Stop timer

igibson.utils.vr_utils.calc_offset(s, touch_x, touch_y, movement_speed, relative_device)
igibson.utils.vr_utils.calc_z_dropoff(theta, t_min, t_max)

Calculates and returns the dropoff coefficient for a z rotation (used in both VR body and Fetch VR). The dropoff is 1 if theta > t_max, falls off quadratically between t_max and t_min, and is clamped to 0 thereafter.
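A sketch matching that description; the exact quadratic used here is an assumption based on the wording "falls off quadratically":

```python
# Dropoff coefficient: 1 above t_max, 0 below t_min, quadratic ramp
# between them (assumed shape; the library's exact curve may differ).
def calc_z_dropoff(theta, t_min, t_max):
    if theta >= t_max:
        return 1.0
    if theta <= t_min:
        return 0.0
    frac = (theta - t_min) / (t_max - t_min)
    return frac * frac

mid = calc_z_dropoff(0.5, 0.0, 1.0)  # -> 0.25
```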


Calculates z rotation of an object based on its right vector, relative to the positive x axis, which represents a z rotation euler angle of 0. This is used for objects that need to rotate with the HMD (eg. VrBody), but which need to be robust to changes in orientation in the HMD.


Converts a list of binary vr events to (button_idx, press_id) tuples. :param bin_events: binarized list, where a 1 at index i indicates that the data at index i in VR_BUTTON_COMBOS was triggered


Converts a list of button data tuples of the form (button_idx, press_id) to a binary list, where a 1 at index i indicates that the data at index i in VR_BUTTON_COMBOS was triggered :param bdata: list of button data tuples

igibson.utils.vr_utils.get_normalized_translation_vec(right_frac, forward_frac, right, forward)

Generates a normalized translation vector that is a linear combination of forward and right.
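A sketch of the linear combination plus normalization described above:

```python
import math

# Combine the right and forward direction vectors with the given
# fractions, then normalize the result to a unit translation direction.
def get_normalized_translation_vec(right_frac, forward_frac, right, forward):
    combo = [right_frac * r + forward_frac * f for r, f in zip(right, forward)]
    norm = math.sqrt(sum(x * x for x in combo)) or 1.0
    return [x / norm for x in combo]

vec = get_normalized_translation_vec(1.0, 1.0, [1.0, 0.0, 0.0], [0.0, 1.0, 0.0])
```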

igibson.utils.vr_utils.move_player(s, touch_x, touch_y, movement_speed, relative_device)

Moves the VR player. Takes in the simulator, information from the right touchpad, player movement speed and the device relative to which we would like to move.

igibson.utils.vr_utils.translate_vr_position_by_vecs(right_frac, forward_frac, right, forward, curr_offset, movement_speed)

Generates a normalized translation vector that is a linear combination of forward and right.

Module contents