Team Members

  • Richard Hsiao
  • Michael Fanton

...

Our project aims to use IMUs to capture orientations of the head and neck and then simulate these orientations in OpenSim, where we can analyze the kinematics and forces acting on the head and neck. The XSens model outputs neck joint angles using a two-pivot neck joint model (Young, 2010). Other biomechanical models instead constrain the neck joint angles so that they are distributed over the cervical vertebrae (Vasavada et al., 1998). Distributing the angles between the vertebrae may give a more accurate representation of the forces acting on each vertebra. In this project, we aim to determine whether distributing the joint angles significantly affects the overall kinematics of the neck.

...

  1. Collected experimental data of knee flexion using a high-speed motion capture camera and the XSENS IMU tracking system, at three metronome-paced rates (120 bpm, 80 bpm, 40 bpm).
    By using high-speed mo-cap, markers on the shank, knee joint, and thigh, and a goniometer, we were able to capture the ground truth of the biomechanical movement and state data during knee flexion. We used knee flexion as a starting point because the data was easier to collect, since the XSENS system provides marker data for the thigh and shank. These steps were also the starting point for another group project (Using Inertial Measurement Units to Calculate Knee Flexion Angle), and the steps for the knee flexion experiment can be found there.

  2. Processed the motion-capture data of the knee flexion/extension to capture the knee angle with markers
    [Video: KneeFlexExp.mp4 — knee flexion experiment recording]


    The above video shows the knee flexion set-up and movement as well as the trackers (yellow - thigh, green - knee joint, blue - shank). We can see the knee has a range of motion (ROM) from around 10 degrees to 80 degrees. Using the 2D marker position data, we can find the vector v from the thigh marker to the knee joint and the vector w from the knee joint to the shank marker. From here, we can calculate the angle between the two vectors and estimate the knee joint angle with the equation cos(θ) = (v · w) / (|v| |w|); a minimal sketch of this calculation appears below. The plot below shows the knee angle over time, and we can see that it follows what we observed in the video.
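    The following is a minimal MATLAB sketch of this calculation, assuming the 2D marker trajectories have been exported as N-by-2 arrays named thigh, knee, and shank (one row per frame; these names are illustrative, not taken from our actual scripts):

    v = knee - thigh;                              % vector: thigh marker -> knee joint
    w = shank - knee;                              % vector: knee joint -> shank marker
    num = sum(v .* w, 2);                          % per-frame dot product dot(v, w)
    den = sqrt(sum(v.^2, 2)) .* sqrt(sum(w.^2, 2));  % per-frame norm(v) * norm(w)
    kneeAngle = acosd(num ./ den);                 % knee joint angle in degrees, per frame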

  3. Processed the raw IMU data of knee flexion to obtain position by integrating the linear accelerations and angular velocities
    To show the drift error inherent in the raw IMU data, we took the raw data from the IMUs and integrated it to get position data. This step was meant to illustrate the error.

Figure 1. Linear accelerations and angular velocities from the raw data of the IMUs placed on the upper and lower leg

The linear acceleration of the upper leg is fairly constant, which is what we expect because only the lower leg moves to flex the knee in this experiment. The lower leg's signals also follow the expected flexion pattern; however, in both cases the raw data is very noisy. If we integrate this data, we would expect to see position data similar to what was captured in the motion capture video: the upper leg should be relatively constant, with minimal changes in position, while the lower leg should show oscillatory changes in position. The plot below shows the integrated angular velocity for the upper and lower leg.

Figure 2. The positions of the upper and lower leg from the processed (integrated) angular velocities of the IMUs
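The integration itself is simple; the drift visible in Figure 2 comes from accumulating noise and bias. A minimal MATLAB sketch of the step, assuming a single gyroscope trace gyro (in rad/s) sampled at rate fs (variable names are illustrative):

fs = 100;                          % assumed IMU sample rate [Hz]
t = (0:numel(gyro)-1)' / fs;       % time vector for the recording
theta = cumtrapz(t, gyro);         % integrate angular velocity -> angle [rad]
% Any constant bias in gyro grows linearly in theta; integrating the linear
% acceleration twice to obtain position makes the drift grow even faster.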

...

By using high-speed mo-cap, with markers on the shoulder, IMU, neck, cheek, and forehead, we were able to capture the ground truth of the biomechanical movement and state data during neck flexion. We used an altered version of the experiment from the first part to detect neck flexion. The motion capture was processed using Kinovea; since we did not use a goniometer, we used the software to capture the angle and position data. The XSens IMU data was captured with the subject flexing and extending their neck at different rates.

Figure 3. The experiment set-up with the high-speed video capture (120 fps) and marker/IMU placement on the subject.

...

The XSens IMU tracking outputs joint angle data for C7T1 and C1Head. We plan to feed the joint angles we captured into the OpenSim Vasavada model to calculate positions of the head, and to compare these positions to the motion capture data. The specific positions of each vertebra were also used to calculate the joint angles of the vertebrae. The XSens position and orientation outputs are body-segment positions and body-segment angles (as quaternions), respectively. The XSens data will be kept as is, using the processing built into the software; however, we use these same angles as inputs to the OpenSim Vasavada neck model, which then distributes the neck angles along the cervical vertebrae according to its constraints (Vasavada et al., 1998).
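As an illustration of this hand-off, here is a minimal MATLAB sketch that writes joint angles into an OpenSim .mot motion file; the time vector, angle traces, and coordinate names below are placeholders and must be replaced with the captured data and the actual coordinate names in the Vasavada model:

t = (0:0.01:10)';                        % example time vector [s]
c7t1 = 20 * sin(2 * pi * 0.5 * t);       % placeholder C7T1 angle trace [deg]
c1head = 10 * sin(2 * pi * 0.5 * t);     % placeholder C1Head angle trace [deg]
data = [t, c7t1, c1head];
fid = fopen('neck_angles.mot', 'w');
fprintf(fid, 'neck_angles\nversion=1\n');
fprintf(fid, 'nRows=%d\nnColumns=%d\n', size(data, 1), size(data, 2));
fprintf(fid, 'inDegrees=yes\nendheader\n');
fprintf(fid, 'time\tC7T1_flexion\tC1Head_flexion\n');   % assumed coordinate names
fprintf(fid, '%.6f\t%.6f\t%.6f\n', data');              % one row per frame
fclose(fid);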


Figure 4. Vasavada OpenSim neck model with the .mot file with the joint angles collected from the IMUs

...

The marker data outputs position data along the head/neck. To get joint angle data, we took vectors between each pair of markers and treated them as rigid bodies between joints.

Figure 5. Illustration of the vectors and joint angles calculated from the motion capture data

In Figure 5, the vectors are depicted between each pair of markers, and the light blue arcs show the joint angles we calculated. To estimate the position of the head, we averaged the positions of the markers on the cheek and forehead.
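A minimal MATLAB sketch of these two steps, assuming each marker trajectory is an N-by-2 array of 2D positions (variable names are illustrative):

head = (cheek + forehead) / 2;            % head position: average of two markers
v1 = neck - shoulder;                     % lower segment vector, per frame
v2 = head - neck;                         % upper segment vector, per frame
% signed joint angle between the two rigid segments, per frame [deg]
jointAngle = atan2d(v2(:,2), v2(:,1)) - atan2d(v1(:,2), v1(:,1));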


 

With data collected from the XSens, the OpenSim model, and our marker positions, errors arise when comparing the joint angles. Each dataset has its own reference coordinates and methodology for calculating the joint positions. For example, the markers are placed visually and may not accurately coincide with where the cervical vertebrae are actually located. Depending on where markers are placed on a rigid body, the computed angle between two vectors can differ greatly, although the change in angle should remain the same. Additionally, the XSens IMUs estimate the neck angles from placements on the shoulder, back, and forehead, meaning they may position the angles differently. One way to accommodate, characterize, and compare the neck angles of each dataset is to systematically demean the data. Since the position differences are systematic within each dataset, the joint angles are offset by a constant; by removing this constant, we can track the relative joint angle changes. The figure below visualizes what demeaning the oscillatory neck flexion looks like. This was implemented in MATLAB.

Figure 6. Demeaning a signal
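A minimal MATLAB sketch of the demeaning step, assuming each joint-angle trace is a column vector (variable names are illustrative):

demean = @(x) x - mean(x);        % remove each trace's constant offset
xsensDm = demean(xsensAngle);     % XSens joint-angle trace
modelDm = demean(opensimAngle);   % OpenSim Vasavada model trace
markerDm = demean(markerAngle);   % motion-capture marker trace
% After demeaning, all traces share a zero mean, so the relative joint
% angle changes can be compared directly across the three datasets.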

4. Results

Because we base the joint angles of both models on the same set of XSens data, this study does not validate the accuracy of the Vasavada model. However, if distributing the joint angles through the Vasavada model improves the accuracy of the predicted joint angles and positions, we can say that modeling the cervical spine with multiple joints may improve the biofidelity of the model.

Figure 7. Study results. A) Distributing the C1 and C7 neck joint angles over the cervical spine predicts the C3 joint angle much more accurately than assuming that the C3 joint angle is fixed (i.e., a rigid body). Similarly, the Vasavada model predicts the C1 joint angle more accurately than the XSens two-pivot model. However, the Vasavada model greatly underestimates the C7 joint angle, which could be because the ground-truth C7 marker was not in the anatomically correct location. B) Increasing the speed at which the subject moved her head decreased the accuracy of the XSens data capture system.

 

Bibliography

Vasavada, A. N., Li, S., and Delp, S. L. "Influence of muscle morphometry and moment arms on the moment-generating capacity of human neck muscles." Spine, 1998.

Young, A. D. "Wireless realtime motion tracking system using localised orientation estimation." 2010.

 


...