.. _tools:

Tools
=========================

We provide a set of tools to help you with robot configuration, environment setup, visualization, and teleoperation. Below are some of the key tools available:

.. code-block:: bash

   python3 tools.py [options]

Available commands:

1. **view** - Environment Visualization
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Visualize robot environments and configurations in an interactive GUI.

**Syntax**:

.. code-block:: bash

   python3 tools.py view [options]

**Options**:

- ``--random-action``: Apply random actions to the robot (default: False)
- ``--paused``: Start the simulation paused when using ``--random-action`` (default: False)
- ``--hz``: Set the action frequency in Hz when using ``--random-action`` (default: None)

**Example**:

.. code-block:: bash

   python3 tools.py view vr_teleop/configs/examples/maniskill_panda/pick_cube.yaml

.. figure:: assets/GUI_default.png
   :width: 800px
   :align: center

   Visualizing the Panda robot in the "PickCube-v1" environment

2. **replay** - Trajectory Playback
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Replay recorded robot trajectories. We use the ManiSkill3 built-in recording and replay tools. The recorded trajectories are stored in the same format as ManiSkill3, with `meta data `_ storing environment information and `trajectory data `_ storing the recorded trajectories.

**Syntax**:

.. code-block:: bash

   python3 tools.py replay [options]

This script is a wrapper that launches the `ManiSkill3 replay tools `_, and you can use all the options provided by them.

**Example**:

.. code-block:: bash

   python3 tools.py replay records/maniskill/PegInsertionSide-v1/20250615-145430/trajectory.h5 --allow_failure

.. video:: assets/replay_peg_insertion.mp4
   :width: 400px
   :align: center

   Replaying a recorded trajectory in the "PegInsertionSide-v1" environment

3. **calibrate** - Retargeting Calibration
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Calibrate the scaling factors for human-to-robot motion retargeting to ensure accurate motion transfer.

**Syntax**:

.. code-block:: bash

   python3 tools.py calibrate

**Usage**: While wearing your VR headset, keep your hands flat and open, and make sure they are detected by the camera. Hold this gesture for 15 seconds, and the scaling factor will be printed out. Please refer to :ref:`Retargeting` for more details.

4. **convert** - Pose Conversion Utilities
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Compute the quaternion that converts the controller / hand frame to the end-effector frame. This tool calculates the transformation needed to align coordinate frames for proper robot control.

**Syntax**:

.. code-block:: bash

   python3 tools.py convert

The argument is a string of space-separated axes, such as ``"x -y -z"`` or ``"-x z y"``.

**Axis Mapping Format:**

- Use space-separated axis directions to define how controller axes map to end-effector axes
- Positive axes: ``x``, ``y``, ``z``
- Negative axes: ``-x``, ``-y``, ``-z``
- Order represents: [end-effector X maps to controller axis] [end-effector Y maps to controller axis] [end-effector Z maps to controller axis]

**Example**: see the example in :ref:`ik` for the Panda robot.

5. **ik-vis** - Visualize IK Robot
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Launch a SAPIEN GUI to visualize the robot model used for inverse kinematics (IK) calculations. This tool helps verify the robot's kinematic structure, joint configurations, and end-effector poses.

**Syntax**:

.. code-block:: bash

   python3 tools.py vis_ik

**Example**: The following is an example for the Panda robot in ``configs/agents/panda/ik_right.yaml``:

.. code-block:: yaml

   robot:
     urdf: ${oc.env:MANISKILL_ASSETS}/robots/panda/panda_v2.urdf
     fix_root_link: false
     cut:
       root: panda_hand
       cuts: [panda_hand_tcp, panda_leftfinger, panda_rightfinger]
     # add qpos for visualization
     # use default qpos in https://github.com/haosulab/ManiSkill/blob/9561ef04747daa881209a143f2ad8b4bd5a6d98d/mani_skill/agents/robots/panda/panda.py
     qpos: [0.0, 0.392, 0, -1.96, 0, 2.36, 0.785]
   ik:
     ee_pose_convertor: [q: [0, 1, 0, 0]]
     ee_name: ["panda_hand"]
     fix_joint_indices: []
     n_retry: 0
     max_iterations: 100
     threshold: 1e-3
     use_projected_ik: false  # true
     mod_2pi: false  # true

.. code-block:: bash

   python3 tools.py vis_ik configs/agents/panda/tele_right.yaml

This shows the Panda robot without gripper fingers: a pure robot model containing only the IK-related joints, which lets you validate the IK configuration.

.. figure:: assets/GUI_ik_robot.png
   :width: 800px
   :align: center

   Interactive SAPIEN GUI showing the Panda robot with only IK-related joints

For detailed information about IK configuration parameters, see :ref:`ik`.

6. **retargeting-vis** - Visualize Retargeting Robot
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Launch a SAPIEN GUI to visualize the robot model used for retargeting calculations. This tool helps verify the robot's kinematic structure and link configurations (the poses of the links used for ``target_origin_link_names`` and ``target_task_link_names``).

**Syntax**:

.. code-block:: bash

   python3 tools.py vis_retargeting

**Example**: The following is an example for the Floating Ability Hand in ``configs/agents/floating_ability_hand/retargeting_right.yaml``:

.. code-block:: yaml

   hand_model:
     urdf: ${oc.env:ASSETS}/agents/floating_hand/ability_hand/ability_hand_right.urdf
     mimic_joints: ...  # skip for brevity
     # add qpos for visualization (vis_retargeting)
     qpos: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.72349796, 0.72349796, 0.72349796, 0.72349796]

.. code-block:: bash

   python3 tools.py vis_retargeting configs/agents/floating_ability_hand/tele_right.yaml

This shows the Floating Ability Hand; you can check the link poses to verify the retargeting configuration.

.. figure:: assets/GUI_retargeting_robot.png
   :width: 800px
   :align: center

   Interactive SAPIEN GUI showing the Floating Ability Hand

For detailed information about retargeting configuration parameters, see :ref:`Retargeting`.
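The axis-mapping format described for the ``convert`` tool above amounts to building a rotation matrix whose columns are the mapped controller axes and converting that matrix to a quaternion. The sketch below is an illustrative reimplementation of that idea, not the actual source of ``tools.py convert``; the function name ``mapping_to_quat`` and the (w, x, y, z) output order are assumptions for illustration.

.. code-block:: python

   # Illustrative sketch of an axis-mapping -> quaternion computation, in the
   # spirit of ``tools.py convert``. NOT the tool's actual code; the function
   # name and quaternion ordering here are assumptions.
   import numpy as np
   from scipy.spatial.transform import Rotation

   _AXIS = {
       "x": (1, 0, 0), "y": (0, 1, 0), "z": (0, 0, 1),
       "-x": (-1, 0, 0), "-y": (0, -1, 0), "-z": (0, 0, -1),
   }

   def mapping_to_quat(mapping: str) -> np.ndarray:
       """Turn a mapping like "x -y -z" into a unit quaternion (w, x, y, z).

       The i-th token names the controller axis that end-effector axis i
       (X, Y, Z in order) maps to; the tokens become the columns of a
       rotation matrix, which is then converted to a quaternion.
       """
       cols = [_AXIS[tok] for tok in mapping.split()]
       rot_matrix = np.array(cols, dtype=float).T  # tokens become columns
       # A valid mapping must be a proper rotation (right-handed, det = +1).
       if not np.isclose(np.linalg.det(rot_matrix), 1.0):
           raise ValueError(f"{mapping!r} is not a right-handed axis mapping")
       x, y, z, w = Rotation.from_matrix(rot_matrix).as_quat()
       return np.array([w, x, y, z])

   # "x -y -z" is a 180-degree rotation about X, i.e. the quaternion
   # [0, 1, 0, 0] in (w, x, y, z) order, up to sign.
   print(mapping_to_quat("x -y -z"))

Under these assumptions, the mapping ``"x -y -z"`` produces the same quaternion as the ``ee_pose_convertor: [q: [0, 1, 0, 0]]`` entry in the Panda IK config shown above (up to the usual quaternion sign ambiguity).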