[U of U ME Dept. Shoutout] [Patent Initiated]

Overview:

By combining computer vision with Inertial Measurement Unit (IMU) sensors, we built an inexpensive and accurate body pose estimation framework using only $250 of wearable components plus a webcam or phone camera. On lateral raises, this sensor fusion framework measured joint angles within a 5-degree root mean square error of a $25,000 Vicon system. Initially intended to assist physical therapy, the idea also has potential in areas such as VR, where it could provide accurate body pose estimation on a budget.
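
For context, the 5-degree figure is a root mean square error (RMSE) computed over time-aligned joint-angle traces from the two systems. The snippet below only illustrates that calculation with synthetic data; the variable names and traces are not from the project.

```python
import numpy as np

def angle_rmse(fused_deg: np.ndarray, vicon_deg: np.ndarray) -> float:
    """Root mean square error (degrees) between two time-aligned angle traces."""
    return float(np.sqrt(np.mean((fused_deg - vicon_deg) ** 2)))

# Synthetic example: an ideal 0-90 deg lateral raise vs. a noisy fused estimate.
t = np.linspace(0, np.pi, 200)
vicon = 90 * np.sin(t)                            # reference mocap angle
fused = vicon + np.random.normal(0, 5, t.shape)   # CV+IMU estimate with ~5 deg noise
print(f"RMSE: {angle_rmse(fused, vicon):.1f} deg")
```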

https://lalopunch.github.io/ST/index.html

As the software lead, I oversaw the creation of the project: C++ for reading IMU sensor data and communicating with 5 wearables simultaneously, and Python for the computer vision framework and a GUI built with PyQt. I was also in charge of the sensor fusion system that combines the Python and C++ data streams and eliminates each framework's biggest constraint: computer vision's poor body depth measurement and the IMU's drift error over long sessions. The project received strong praise from our professor, and we held talks with health companies (SWORD Health and Kaia Health). A patent application is currently being initiated.
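
The fusion math itself is not shown on this page. One common way to blend a drift-free but noisy camera-derived angle with a smooth but drifting IMU angle is a complementary filter; the sketch below illustrates that general technique only and is not the project's actual filter. The function name, alpha value, and sample data are hypothetical.

```python
def fuse_angle(prev_fused_deg: float,
               imu_rate_dps: float,
               cam_angle_deg: float,
               dt: float,
               alpha: float = 0.98) -> float:
    """Complementary filter: integrate the gyro rate for smooth short-term motion,
    and pull toward the camera angle to cancel long-term gyro drift."""
    imu_angle = prev_fused_deg + imu_rate_dps * dt          # short-term estimate, drifts
    return alpha * imu_angle + (1 - alpha) * cam_angle_deg  # camera corrects the drift

# Usage: call once per frame as matched IMU and camera samples arrive.
fused = 0.0
for imu_rate, cam_angle in [(10.0, 0.6), (12.0, 1.1), (11.0, 1.7)]:  # example samples
    fused = fuse_angle(fused, imu_rate, cam_angle, dt=1 / 30)
print(f"fused angle: {fused:.2f} deg")
```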

How It Works

SettingUpWearable.png

1️⃣ Set up wearables

BluetoothConnectionStatus.png

2️⃣ Confirm Wireless Connection

Note: This is a presentation demo app; a full mobile app is currently in development.

NewExerciseSelection.png

1️⃣ Select the exercise

Calibration.png

2️⃣ Follow Calibration Instructions

NewExerciseGuidance.png

3️⃣ Start the exercise

NewResults.png

4️⃣ Review your results

5️⃣ The 3D results data is saved to a CSV file that can be opened and reviewed by you or a trainer (one possible layout is sketched below)
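
The CSV layout is not documented on this page. As an assumption only, a per-frame log of fused 3D joint angles might look like this sketch; the file name pattern and column names are hypothetical.

```python
import csv
from datetime import datetime

def save_results_csv(path, samples):
    """Write per-frame results: timestamp plus a fused 3D joint angle (deg)."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "angle_x_deg", "angle_y_deg", "angle_z_deg"])
        writer.writerows(samples)

# Usage: one row per camera/IMU frame captured during the exercise.
save_results_csv(
    f"lat_raise_{datetime.now():%Y%m%d_%H%M%S}.csv",
    [(0.00, 5.2, 1.1, 0.3), (0.03, 7.9, 1.0, 0.4)],
)
```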


Class Diagram

Python - Application (Trainer GUI)

classDiagram

class TrainerGUI{
-guiWindow: ExerciseSelection
-widget: QStackedWidget
-bluetooth: Bluetooth
-show() void
}

class ExerciseSelection{
-gotoLatRaiseCal()
-gotoCatCowCal()
-gotoSquat()
}

class ExerciseCalibration{
__init__(self, Nodes[])
calibration(bodyNodeLoc: Nodes[])
setAngleThresh(self, float[], float[], Nodes[]) void
setLatRaiseThresh(float[], float[]) void
cancelFeed() void
gotoExercise(Object) void
}

class ExerciseGuidance{
setReps(reps: int) void
setTimer(timer: int) void
setSets(sets: int) void
calculateAngle(startNode: int, endNode: int) void
timerEvent() void
}

class CameraDetect{
-thread: QThread
__init__(self, Max[], Min[], Nodes[])
ToggleCamera(QThread)
run(self, Max[], Min[], Nodes[])
getAngles(self, QList[Array])
setAngles(self, QList[Array])
stop(self)
}

class Bluetooth{
-mac_addr: char[]
-uuid: char[]
-setup(mac_addr: char[]) bool
-sendPacket(UUID: char[]) void
-getPacket() char[]
-connect(mac_addr: char[]) void
-isConnected(mac_addr: char[]) bool
-disconnect(mac_addr: char[]) void
}

class Results{
showRepCount(rep: int)
showTimeCount(time: int)
showBodyMotion(qStacks: Array[])
showBodyPlot(timeX: int, angleY: float)
}

TrainerGUI o-- Bluetooth

TrainerGUI *-- ExerciseSelection
TrainerGUI *-- ExerciseCalibration
TrainerGUI *-- ExerciseGuidance
TrainerGUI *-- Results

ExerciseCalibration <|-- latRaiseCal
ExerciseCalibration <|-- catCowCal
ExerciseCalibration <|-- squatCal

ExerciseGuidance <|-- squat
ExerciseGuidance <|-- catCow
ExerciseGuidance <|-- latRaise

ExerciseCalibration *-- CameraDetect
ExerciseGuidance *-- CameraDetect

Bluetooth o-- ExerciseCalibration
Bluetooth o-- ExerciseGuidance
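
In the diagram above, CameraDetect runs the computer vision loop on its own QThread so the PyQt GUI stays responsive while frames are processed. The sketch below shows that worker-thread pattern with PyQt5 and OpenCV; the signal name, constructor arguments, and placeholder angle estimation are illustrative, not the project's actual implementation.

```python
import cv2
from PyQt5.QtCore import QThread, pyqtSignal

class CameraDetect(QThread):
    """Grab webcam frames off the GUI thread and emit per-frame joint angles."""
    angles_ready = pyqtSignal(list)          # emitted once per processed frame

    def __init__(self, nodes):
        super().__init__()
        self._nodes = nodes                  # landmark indices to track (assumed)
        self._running = False

    def run(self):
        self._running = True
        cap = cv2.VideoCapture(0)            # default webcam
        while self._running and cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            angles = self._estimate_angles(frame)   # pose model would run here
            self.angles_ready.emit(angles)
        cap.release()

    def stop(self):
        self._running = False
        self.wait()                          # block until run() has returned

    def _estimate_angles(self, frame):
        # Placeholder for the real CV pipeline (e.g. a pose landmark model).
        return [0.0 for _ in self._nodes]
```

The GUI side would connect `angles_ready` to a slot that updates the exercise guidance view, and call `stop()` when the feed is cancelled, mirroring the `cancelFeed()` and `stop(self)` entries in the diagram.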

C++ - Wearables Host