...
Info: Welcome to the OpenSense documentation! To complete this example, you will need to download OpenSim 4.1 or later. If you try the example and software, please send any issues or feedback to opensim@stanford.edu.
...
OpenSense provides an interface to associate and register each IMU sensor with a body segment of an OpenSim model (as an IMU Frame). We provide a basic calibration routine in which the first time step of IMU data is registered to the default pose of the model. You can change the registration pose by changing the default coordinate values of the model. You can also write your own calibration procedures in Matlab, Python, etc. to optimize the initial pose of the model for calibration using other data sources (markers, goniometers, etc.). Read more about these steps in our User's Guide chapter on the IMU Placer tool.
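If you script this step, the basic calibration routine can be run directly from Matlab or Python. Below is a minimal Python sketch, assuming the OpenSim Python bindings are installed and that an IMU Placer settings file (here, the myIMUPlacer_Setup.xml file used later in this example) already points at your model and orientations data; adapt the names and paths to your own setup.

```python
# Minimal sketch of scripted calibration with the OpenSense IMU Placer.
# Assumes the OpenSim Python bindings are installed and that
# myIMUPlacer_Setup.xml lists the model, the orientations file, and the
# sensor-to-OpenSim rotations.
import opensim as osim

imu_placer = osim.IMUPlacer('myIMUPlacer_Setup.xml')

# Pass True to open the visualizer and inspect the registered IMU frames;
# if an output model file is specified in the setup, the calibrated model
# is also written to disk.
imu_placer.run(True)

# The calibrated model can be retrieved in memory for further scripting,
# for example to implement your own calibration checks.
calibrated_model = imu_placer.getCalibratedModel()
print(calibrated_model.getName())
```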
Computing Inverse Kinematics
An inverse kinematics method is used to compute the set of joint angles at each time step of a motion that minimizes the errors between the experimental IMU orientations and the model's IMU Frames. The angles can then be used as inputs to other OpenSim tools and analyses, or you can visualize them in the OpenSim GUI. The OpenSense capabilities are available through the command line and through scripting (Matlab or Python). As of OpenSim 4.2, the calibration and inverse kinematics steps are also available through the OpenSim GUI, and the resulting model and motion can be loaded, visualized, and analyzed there. Read more about this step in the User's Guide chapter on IMU Inverse Kinematics.
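Schematically (the notation below is ours, not taken from the OpenSim source), the pose computed at each time step is the one that minimizes a weighted sum of squared orientation errors:

$$
q^{*}(t) = \arg\min_{q} \sum_{i \in \text{IMUs}} w_{i} \left\lVert \theta_{i}(q, t) \right\rVert^{2}
$$

where $\theta_i(q, t)$ is the rotation error between the measured orientation of IMU $i$ at time $t$ and the orientation of its IMU Frame on the model in pose $q$, and $w_i$ is a per-sensor weight.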
...
How to Set Up the OpenSense Tools
...
A visualizer window will appear, showing the calibrated model. The pose of the model is determined by the model's default pose and will not change from one calibration to the next (unless you edit the model's default pose). What will change is the orientation of the sensors attached to each body. You can zoom in on the sensors, represented as small orange bricks located at the center of mass (COM) of each body. Note: with the visualizer window selected, you can close it using the keyboard shortcut Ctrl-Q (Cmd-Q on Mac).
You will see a printout of the calibration offset for each IMU. This is the transform between the model body and the IMU sensor. To continue the calibration and print the calibrated model to file, select the visualizer window and press any key. The calibrated model is written to file with the postfix '_calibrated' added (i.e., if the input model file is called model.osim, the output calibrated model file will be named model_calibrated.osim).
Using the OpenSim Application (GUI)
As of version 4.2, you can execute this step from the OpenSim application by invoking Tools→IMU Placer and loading the settings from the myIMUPlacer_Setup.xml file created above, or entering the data manually in the dialog as shown below, then hitting the Run button. After you run the tool, a new model with IMUs placed on it will appear in the application.
Step Four: Perform IMU Sensor Tracking
Now that you have read in your data and calibrated your model, you can use OpenSense's Inverse Kinematics to track Orientation data from IMU sensors. The Inverse Kinematics step finds the pose of the model at each time-step that minimizes, in the least-squares sense, the difference between the orientation data from the IMU sensors and the IMU Frames on your calibrated model. The computed kinematics depend on both the calibrated model and the sensor data. Thus to perform inverse kinematics tracking of orientation data you need (i) a Calibrated Model (.osim), (ii) an orientations file (as quaternions), and (iii) an Inverse Kinematics Setup file (.xml). Using the calibrated model we generated in the previous section, we will track orientation data for walking that we read in during Step Two.
In a text editor (such as Notepad++, Sublime Text, Atom, or Matlab), open the myIMUIK_Setup.xml file. The setup file stores properties that tell OpenSense how to run the Inverse Kinematics simulation. In the setup file, you specify:
- <time_range> The time range for the inverse kinematics tracking (in seconds). In our example, we use data between 7.25 and 15 seconds.
- <sensor_to_opensim_rotations> The rotation needed to convert the IMU world Frame (typically Z up, Y to the left) to the OpenSim world Frame (Y up, Z to the right).
- <model_file_name> The name/path to the calibrated model file (.osim) to be used in tracking. In our example, this is the Rajagopal_2015_calibrated.osim file that was the output of Step Three.
- <orientations_file_name> The name/path to a .sto file of sensor Frame orientations (as quaternions) that will be tracked. In our example, this is the MT_012005D6_009-001_orientations.sto we created in Step Two.
- <results_directory> The directory where the results will be printed to file.
An example setup file is shown below.
For now, leave these settings as they are. This settings file can be copied and edited for your own workflow.
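Besides the command line and the GUI described below, the same settings file can be used from a script. The following is a minimal Python sketch, assuming the OpenSim Python bindings are installed and that myIMUIK_Setup.xml is in the working directory.

```python
# Minimal sketch of scripted orientation tracking with OpenSense IK.
# Assumes the OpenSim Python bindings are installed and that
# myIMUIK_Setup.xml points at the calibrated model and the orientations file.
import opensim as osim

imu_ik = osim.IMUInverseKinematicsTool('myIMUIK_Setup.xml')

# Pass True to visualize the model while the orientations are tracked;
# results are written to the <results_directory> given in the setup file.
imu_ik.run(True)
```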
...
To perform Inverse Kinematics with OpenSense from the command line, use the following steps.
The output motion file is written to file and will have the prefix 'ik_' added (i.e., if the input orientations file is called MT_012005D6_009-001_orientations.sto, the output motion file will be named IKResults/ik_MT_012005D6_009-001_orientations.mot).
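If you want to post-process the computed kinematics in a script, the output motion file can be read back with the OpenSim bindings. Below is a minimal Python sketch; the file path assumes the IKResults directory used in this example, so adjust it to your own results directory.

```python
# Minimal sketch: load the IK results for further analysis or plotting.
# Assumes the OpenSim Python bindings are installed and that the motion
# file below was produced by the IMU IK step of this example.
import opensim as osim

motion = osim.Storage('IKResults/ik_MT_012005D6_009-001_orientations.mot')

# Report what was computed: coordinate names and the tracked time range.
labels = motion.getColumnLabels()
names = [labels.get(i) for i in range(labels.getSize())]
print('Columns:', names)
print('Time range:', motion.getFirstTime(), 'to', motion.getLastTime())
```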
Using the OpenSim Application (GUI)
As of version 4.2, you can execute this step from the OpenSim application by invoking Tools→IMU Inverse Kinematics and loading the settings from the myIMUIK_Setup.xml file created above, or entering the data manually in the dialog as shown below, then hitting the Run button. The IK problem will be solved and the solution will be animated in the application.
Step Five: Visualize the Results of IMU Tracking
...
To view the Inverse Kinematics results:
- Open the OpenSim 4.1 (or later) application.
- Open the calibrated model: Rajagopal_2015_calibrated.osim
- Load the motion you created in Step Four: IKResults/ik_MT_012005D6_009-001_orientations.mot. Since the IMUs track only relative orientations, not global translations, the model appears to rotate about a single point.
...