For my second year of school, I did an 11-week internship at B|Braun - Aesculap, where I was in charge of developing two serious games for training on the OrthoPilot platform.
The goal was to study the relevance and efficiency of Gesture Interactions for this platform.
The technologies I used for this project were: C++, Qt/QML.
B|Braun is a company specializing in medical services and products; in particular, they sell ankle, knee, and hip prostheses. Within this company, Aesculap sells tools and systems dedicated to surgery.
I was part of a small R&D team in Echirolles (FR 38) working on the OrthoPilot platform, a computer-assisted tool for orthopedic surgeons that allows them to save time and gain precision in hip, knee, and ankle prosthesis surgery, as well as ACL and other orthopedic surgeries.
This platform is based on an OptiTrack system that tracks the surgeon's objects (such as tools or prostheses) thanks to object-specific rigid bodies mounted on them. This allows the system to know the position and orientation of the tracked objects very precisely and at every moment, and as a consequence helps the surgeon place and measure what they are doing in real time.
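To give an idea of what this tracking layer provides, here is a minimal sketch of the kind of per-frame data such a system delivers for each rigid body; the names and types are purely illustrative and are not the actual OrthoPilot or OptiTrack API.

```cpp
#include <QtGlobal>
#include <QVector3D>
#include <QQuaternion>

// Illustrative only: one pose sample for a tracked rigid body.
struct TrackedPose {
    int rigidBodyId;          // which tool or prosthesis this sample belongs to
    QVector3D position;       // position of the rigid body in the tracking frame
    QQuaternion orientation;  // orientation of the rigid body
    qint64 timestampMs;       // acquisition time, so the display can follow the tool in real time
};
```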
It was an "Assistant Engineer" internship over a short period (11 weeks), and I had no choice regarding the technologies to use. The goal was to develop two serious games in Qt/QML, integrating a gesture detection module developed by another engineer on the team.
The goal was to study the relevance and efficiency of gesture interactions for the OrthoPilot platform. Indeed, back then, the only interaction available to the user was a two-button foot pedal, operated while handling the tools with the hands. The idea was to replace the pedal with movements performed directly with the tools.
I released two games using Qt/QML (a declarative language similar to HTML but easily integrated with C++ code): Surgeon Warrior, a game focusing on the circle-left and circle-right gestures (see below), where the player has to kill skeletons attacking from the sides, and a puzzle game, focusing on the pull-out gesture, where the player has to pick up bones and place them at the right position on a skeleton.
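Below is a minimal sketch of how such a gesture detection module could be exposed to the QML games, assuming a Qt signal-based interface with one signal per gesture, playing the role of the pedal's two buttons. The class name and signal names are hypothetical and are not the actual API of the module developed by my colleague.

```cpp
#include <QObject>

// Hypothetical interface: the real module's API differed; this only
// illustrates a signal-based integration between C++ and QML.
class GestureDetector : public QObject
{
    Q_OBJECT
public:
    explicit GestureDetector(QObject *parent = nullptr) : QObject(parent) {}

signals:
    void circleLeft();   // e.g. attack the skeleton on the left in Surgeon Warrior
    void circleRight();  // e.g. attack the skeleton on the right
    void pullOut();      // e.g. pick up or place a bone in the puzzle game
};
```

Exposed to QML as a context property (for example with QQmlContext::setContextProperty), such an object lets the games react to the recognized gestures directly from their declarative UI code.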
The gesture recognition module was designed to be used with the pointing tool of the OrthoPilot platform. This tool is used by surgeons to record positions precisely (for example, on specific bones of the ankle).
The pointing tool and its rigid body.
There were three gestures that could be recognized by the gesture recognition module: circle right, circle left, and pull-out.
The 3 gestures.