Team Members

  • Richard Hsiao
  • Michael Fanton

Video: https://www.youtube.com/watch?v=l52s1VRyenw

 

Software:

Matlab Scripting:

 

  • Matlab Scripting Software.zip
  • WritingtheMotionFileFromXSensIMUs.rar
  • DataProcessing.rar

 

OpenSim Model:

  • Neck6dof_EDIT.osim

Description

This project focuses on using inertial measurement units (IMUs) in the "wild," meaning using IMUs to capture state-specific data in a non-laboratory, uncontrolled environment. IMUs capture linear acceleration and angular velocity data using accelerometers and gyroscopes. Through integration, these linear accelerations and angular velocities can be used to determine the position of the IMU, which provides a useful way to capture biomechanical data such as joint angles and body positions. However, one of the largest issues with capturing real-time position data is the noise present not only in the environment but also in the sensors themselves. Integrating to determine positions and joint angles introduces drift error, a systematic artifact of the integration that causes the data to shift in one direction over time. Existing studies have shown that applying a Kalman filter can ameliorate the effects of this drift.

Our project aims to use IMUs to capture orientations of the head/neck and then simulate these orientations in OpenSim, where we can analyze the kinematics and forces acting on the head and neck. The XSens model outputs neck joint angles using a two-pivot neck joint model (Young, 2010). Other biomechanical models constrain the neck joint such that the angles are distributed over the cervical vertebrae (Vasavada et al., 1998). Distributing the angles between the vertebrae may give a more accurate representation of the forces acting on each vertebra. In this project, we aim to determine whether distributing the joint angles is significant in determining the overall kinematics of the neck.

...

  1. Collected experimental data of knee flexion using a high-speed motion capture camera and the XSens IMU tracking system at three movement rates (120 bpm, 80 bpm, 40 bpm).
    Using high-speed mo-cap, markers on the shank, knee joint, and thigh, and a goniometer, we captured the ground truth of the biomechanical movement and state data during knee flexion. We used knee flexion as a starting point because it was easier to collect the data, since the XSens system has marker data for the thigh and shank. These steps were also the starting point for another group project (Using Inertial Measurement Units to Calculate Knee Flexion Angle), and the steps for the knee flexion experiment can be found there.

  2. Processed the motion-capture data of knee flexion/extension to calculate the knee angle from the markers.
    Video: KneeFlexExp.mp4


    The above video shows the knee flexion set-up and movement as well as the trackers (yellow: thigh, green: knee joint, blue: shank). We can see the knee has a range of motion (ROM) from around 10 degrees to 80 degrees. Using the 2D marker position data, we can find the vector from the thigh marker to the knee joint and the vector from the knee joint to the shank marker, and then calculate the angle between the two vectors to estimate the knee joint angle with the equation cos(θ) = (v · w) / (‖v‖‖w‖). The plot below shows the knee angle over time; we can see that it follows what we observed in the video. (A MATLAB sketch of this calculation appears just after this list.)

  3. Processed the raw IMU data of knee flexion to obtain position by integrating the linear accelerations and angular velocities.
    To show the drift error inherent in the raw IMU data, we took the raw data from the IMUs and integrated it to get position data; this step illustrates the error.
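
A minimal MATLAB sketch of the marker-based knee-angle calculation referenced in step 2, assuming thigh, knee, and shank are N-by-2 arrays of 2D marker positions (one row per video frame; the names are illustrative):

% Vectors along the thigh and shank segments, per frame
v = knee - thigh;    % thigh marker -> knee joint
w = shank - knee;    % knee joint -> shank marker

% cos(theta) = dot(v, w) / (||v|| ||w||), evaluated row-wise
cosTheta = sum(v .* w, 2) ./ (sqrt(sum(v.^2, 2)) .* sqrt(sum(w.^2, 2)));
kneeAngle = acosd(cosTheta);   % knee flexion angle in degrees, per frame

plot(kneeAngle), xlabel('Frame'), ylabel('Knee angle (deg)')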

Figure 1. Linear accelerations and angular velocities from the raw data of the IMUs placed on the upper and lower leg

The linear acceleration of the upper leg is fairly constant, which is what we expect because in the experiment only the lower leg moves to flex the knee. The lower leg's linear acceleration also behaves as expected; however, in both cases the raw data is very noisy. If we integrate this data, we would expect to see position data similar to what was captured in the motion capture video: the upper leg should be relatively constant with minimal changes in position, while the lower leg should show oscillatory changes in position. The plot below shows the integrated angular velocity for the upper and lower leg, and a sketch of the integration follows Figure 2.

Figure 2. The positions of the upper and lower leg from the processed (integrated) angular velocities of the IMUs
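
A sketch of the integration step just described, assuming gyro and accel are N-by-1 raw signals (rad/s and m/s²) from one IMU; the variable names and sample rate are illustrative:

% Build a time vector from an assumed sample rate.
fs = 100;                       % assumed IMU sample rate (Hz)
t  = (0:numel(gyro)-1)' / fs;

% One integration of angular velocity gives an orientation angle.
angle = cumtrapz(t, gyro);

% Two integrations of linear acceleration give position; any constant
% sensor bias grows quadratically here, producing the drift in Figure 2.
vel = cumtrapz(t, accel);
pos = cumtrapz(t, vel);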

...

Using high-speed mo-cap and markers on the shoulder, IMU, neck, cheek, and forehead, we captured the ground truth of the biomechanical movement and state data during neck flexion. We used an altered version of the experiment from the first part to detect neck flexion. The motion capture was processed using Kinovea; since we didn't use a goniometer, we used the software to capture the angle and position data. The XSens IMU data was captured with the subject flexing and extending their neck at different rates.

Figure 3. The experiment set-up with the high-speed video capture (120 fps) and marker/IMU placement on the subject.

...

The XSens IMU tracking outputs joint angle data for C7T1 and C1Head. We use the joint angles we captured as inputs to the OpenSim Vasavada model to calculate positions of the head, and compare these positions to the motion capture data. The specific positions of each vertebra were also used to calculate the joint angles of the vertebrae. The XSens data for position and orientation are quaternions representing the body segment positions and body segment angles, respectively. The XSens data is kept as is, using the processing built into the software; however, we use these same angles as the inputs to the OpenSim Vasavada neck model, which then distributes the neck angles along the cervical vertebrae according to the constraints of the model (Vasavada et al., 1998).
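
As an illustrative sketch (not the exact XSens processing), a sagittal-plane pitch angle can be recovered from an orientation quaternion q = [w x y z]; the component ordering and axis convention here are assumptions to check against the XSens documentation:

% Extract pitch (rotation about the lateral axis) from a unit quaternion
% using the standard ZYX Euler decomposition; q = [w x y z] is assumed.
w = q(1); x = q(2); y = q(3); z = q(4);
pitch = asind(2*(w*y - z*x));   % sagittal-plane angle in degrees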

OpenSim model with the joint angles from the XSens IMUs

Figure 4. Vasavada OpenSim neck model driven by the .mot file containing the joint angles collected from the IMUs.

Side note: In the Vasavada model, the joint angles are coupled through functions between the joints. For example, if you adjust one joint angle by 1 degree, each of the other joint angles adjusts by a different factor. To reduce it to a two-link model, the coupling weights could be set to 0, which would limit the number of dependent coordinates. This distinction, independent versus dependent coordinates for the angles, is what separates feeding the XSens angles into the XSens two-pivot model versus into the Vasavada model.
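
A hypothetical MATLAB sketch of this coupling idea: one independent neck angle distributed over the cervical joints by fixed weights. The joint names and weights below are made up for illustration; the actual Vasavada model defines its own coupling functions:

% Distribute a single neck flexion angle across the cervical joints.
jointNames = {'C1Head','C2C1','C3C2','C4C3','C5C4','C6C5','C7C6','T1C7'};
weights    = [0.25 0.10 0.10 0.10 0.10 0.10 0.10 0.15];   % sum to 1

neckFlexion = 30;                      % total flexion angle (deg)
jointAngles = neckFlexion * weights;   % each joint's dependent share

% Setting a joint's weight to 0 removes its dependent motion, which is
% how the model could be reduced toward a two-pivot (XSens-like) model.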

To input the XSens IMU data into OpenSim, we used a MATLAB script to write a .mot file that controls pitch2 and pitch1, which correspond to the movement of the C7T1 and C1Head joints in the sagittal plane, respectively. We had to edit the .osim file available from Vasavada to remove the constraints placed on the C1Head and C7T1 joints, because the joint angles obtained from the IMUs exceeded the range of motion the model could handle, resulting in clipping; removing these constraints removed the clipping errors. From there, the Analyze tool was used to obtain the BodyKinematics of the OpenSim model, from which we obtained the positions of the skull and each cervical vertebra. Using the same calculation as for the knee joint above, we used the positions of T1, C1, C3, C7, and the skull to obtain three joint angles. T1 was constant and used as a reference point to represent the torso, which did not move in the biomechanical model. The joint angle between T1, C1, and C3 was named C1; the joint angle between C1, C3, and C7 was named C3; and the joint angle between C3, C7, and the skull was named C7.
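
A minimal sketch of the .mot-writing step, assuming time, pitch1, and pitch2 are N-by-1 vectors of time stamps and the C1Head and C7T1 sagittal joint angles in degrees; the header fields shown follow the common OpenSim .mot layout:

% Write a tab-delimited OpenSim motion (.mot) file for the two pitches.
N = numel(time);
fid = fopen('neck_angles.mot', 'w');
fprintf(fid, 'neck_angles\nnRows=%d\nnColumns=3\ninDegrees=yes\nendheader\n', N);
fprintf(fid, 'time\tpitch1\tpitch2\n');
fprintf(fid, '%.6f\t%.6f\t%.6f\n', [time(:), pitch1(:), pitch2(:)]');
fclose(fid);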

...

The marker data outputs position data along the head/neck. To get joint angle data, we took vectors between the markers and treated them as rigid bodies between joints.

Figure 5. Illustration of the vectors and joint angles calculated from the motion capture data

In Figure 5, the vectors are depicted between the markers, and the light blue shows the joint angles we calculated. To estimate the position of the head, we averaged the positions of the markers on the cheek and head.


 

In collecting data from the XSens, the OpenSim model, and our marker positions, there are errors that come with comparing the joint angles. Each dataset has its own reference coordinates and methodology for calculating the joint positions. For example, the marker positions are placed visually and may not accurately correspond to where the cervical vertebrae are actually located. Depending on where markers are placed on a rigid body, the computed angle between two vectors can differ greatly, although the change in angle should remain the same. Additionally, the XSens IMUs estimate the neck angles from placements on the shoulder, back, and forehead, meaning they may position the angles in yet another way. One way to accommodate, characterize, and compare the neck angles of each dataset is to systematically demean the data: since the position differences are systematic within each dataset, the joint angles are offset by a constant, and by removing this constant we can track the relative joint angle changes. Figure 6 below visualizes what demeaning the oscillatory neck flexion looks like. This was implemented in MATLAB.

Figure 6. Demeaning a signal
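
A minimal sketch of the demeaning step, assuming angleXsens, angleOpenSim, and angleMocap are N-by-1 joint-angle vectors from the three datasets (the names are illustrative):

% Remove each dataset's constant offset so only relative changes remain.
demean    = @(x) x - mean(x);
xsensDm   = demean(angleXsens);
opensimDm = demean(angleOpenSim);
mocapDm   = demean(angleMocap);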

4. Results

Because we base the joint angles for both models on the same set of data from the XSens system, this study does not validate the accuracy of the Vasavada model; however, if the joint angles and positions calculated by the Vasavada model affect the accuracy of the joint angles, we can say that modeling the cervical spine with multiple joints may improve the biofidelity of the model.

Figure 7. Study results. A) Distributing the C1 and C7 neck joint angles over the cervical spine predicts the C3 joint angle much more accurately than assuming the C3 joint angle is fixed (i.e., a rigid body). Similarly, the Vasavada model more accurately predicts the C1 joint angle than the XSens two-pivot model. However, the Vasavada model greatly underestimates the C7 joint angle, possibly because we did not have the ground-truth C7 marker in the anatomically correct location. B) Increasing the speed at which the subject moved her head decreased the accuracy of the XSens data capture system.

 

Bibliography

Vasavada, A. N., Li, S., and Delp, S. L. "Influence of muscle morphometry and moment arms on the moment-generating capacity of human neck muscles." Spine, 1998.

Young, A. D. "Wireless realtime motion tracking system using localised orientation estimation." PhD thesis, 2010.

 


Home: BIOE-ME 485 Spring 2017

...