Soft Sensors and Actuators for Wearables to Assist People with Disabilities

By James Jones
Slide 1: Title slide presenting soft sensors and actuators for wearables to assist people with disabilities

Mentor: Erik Engeberg, PhD

Acknowledgments: Maohua Lin, Rudy Paul, Darryl Dieujuste, Moaed Abd

Scholar: James Jones

Home Institution: Boise State University

Slide 2: Wearable soft robotic actuator showing bare exoskeleton and exoskeleton worn on hand

Wearable Soft Robotic Actuator

Fig. 1. Bare Exoskeleton

Fig. 2. Exoskeleton worn on Hand

This slide displays two images of the soft robotic actuator device: first as a bare exoskeleton structure, and second as it is worn on a human hand.

Slide 3: Myo Armband sensor device with reference citation

Myo Armband

Fig. 3. Tatarian, K., et al. (2018).

This slide shows the Myo Armband, which is a wearable sensor device used for electromyography (EMG) signal detection.
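
As a rough illustration only (not from the slides), a windowed root-mean-square (RMS) envelope is one common way to turn a raw EMG stream, such as the Myo Armband's output, into a usable activation signal. The window length and the synthetic signal below are assumptions for the sketch:

```python
import numpy as np

def emg_rms(signal, window=200):
    """RMS envelope of a raw EMG signal over non-overlapping windows."""
    n = len(signal) // window * window          # drop the ragged tail
    chunks = np.asarray(signal[:n]).reshape(-1, window)
    return np.sqrt(np.mean(chunks ** 2, axis=1))

# Synthetic example: a burst of muscle activity between quiet segments.
rng = np.random.default_rng(0)
quiet = rng.normal(0, 5, 400)                   # low-amplitude baseline noise
burst = rng.normal(0, 60, 400)                  # high-amplitude contraction
envelope = emg_rms(np.concatenate([quiet, burst, quiet]))
```

The envelope rises sharply during the burst, which makes simple thresholding or classification downstream much easier than working with the raw oscillating signal.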

Slide 4: Difficulties encountered during the REU process including technical and design challenges

Difficulties Throughout REU Process

  • Myo Armband no longer supported
  • Changing the experimental setup
  • Air leak in the index finger
  • Designing the wearable apparatus

Slide 5: Results section showing experimental outcomes and data

Results

Figure 5:

Two graphs: The top plot, "Human Hand Sensor Array," shows multiple colored lines fluctuating between 1200 and 1400 on the Y-axis over a time span of 0 to 35 on the X-axis. The bottom plot, "Human Hand EMG Trial," shows multiple colored lines ranging from -100 to 100 on the Y-axis over a time span of 0 to 2 × 10⁴ on the X-axis.

Slide 6: Additional results with data visualization and analysis

Results

Figure 6:

Titled "Human Hand Sensor Array and Valve Control," this figure consists of four pairs of plots stacked vertically. Each pair has a top plot showing multiple colored lines fluctuating between 1200 and 1400 on the Y-axis, and a bottom plot with a blue line that switches between 0 and 100 on the Y-axis. The X-axis for all eight plots is labeled from 5 to 6.5.
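
The figure suggests a binary valve signal (0 or 100) driven by the sensor readings (roughly 1200 to 1400). One minimal way such a mapping could work is a simple threshold; the threshold value of 1300 here is an assumption for illustration, not taken from the slides:

```python
def valve_command(sensor_value, threshold=1300, on=100, off=0):
    """Map a soft-sensor reading to a binary valve command:
    open (100) when the reading meets the threshold, else closed (0)."""
    return on if sensor_value >= threshold else off

# Example readings spanning the sensor's observed range.
commands = [valve_command(v) for v in [1210, 1295, 1340, 1390, 1250]]
# commands -> [0, 0, 100, 100, 0]
```

A real controller would likely add hysteresis or debouncing so sensor noise near the threshold does not rapidly toggle the valve.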

Slide 7: Neural network accuracy results presented in a table format

Results

Figure 7:

This figure presents a table and a matrix. The table, "Table 1. Average Neural Network Accuracy," has four columns: "Human," "No Human 1," "No Human 2," and "Surface." It contains data for ten trials and rows for "Average" and "Std. Dev." The matrix, "Test Confusion Matrix," is a 10x10 grid with rows labeled "Output Class" and columns labeled "Target Class." The cells contain numbers and percentages, with some cells highlighted in green.
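
For readers unfamiliar with confusion matrices: overall accuracy is the sum of the diagonal (correctly classified samples) divided by the total. The sketch below uses a toy 3-class matrix for brevity, not the 10-class matrix from the slide:

```python
import numpy as np

def accuracy_from_confusion(matrix):
    """Overall classification accuracy from a confusion matrix:
    correct (diagonal) counts divided by total test samples."""
    m = np.asarray(matrix, dtype=float)
    return np.trace(m) / m.sum()

# Toy 3-class confusion matrix (rows: output class, columns: target class).
cm = [[48, 1, 0],
      [2, 45, 3],
      [0, 4, 47]]
acc = accuracy_from_confusion(cm)
# acc = 140 / 150, about 0.933
```

Off-diagonal cells show which classes are confused with each other, which is often more informative than the single accuracy number when diagnosing a classifier.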

Slide 8: Applications and future research directions with demonstration video

Applications and Further Research

Video 1.

This slide contains a blank placeholder for a video with a video icon in the bottom left corner.

Slide 9: References section with academic citations

References

1. Tatarian, K., Couceiro, M., Ribeiro, E., & Faria, D. (2018). Stepping-Stones to Transhumanism: An EMG-Controlled Low-Cost Prosthetic Hand for Academia. doi:10.1109/IS.2018.8710489.

Last slide: Contains plain text stating 'End of presentation. Click the right arrow to return to beginning of slide show.'

End of Presentation

Click the right arrow to return to the beginning of the slide show.

For a downloadable version of this presentation, email i-sense@fau.edu.
