A demo of how an AI-powered virtual reality morgue training simulator would work.
A pseudo-haptic VR demo with hand tracking and physics, created to teach massage techniques.
A test I did with real-time cloth physics simulation in Unity, running on the CPU alone with no GPU acceleration. A sketch of the core simulation loop follows.
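As a rough illustration of how a CPU-only cloth simulation can work, here is a minimal sketch in Python using Verlet integration and distance constraints. The Unity demo itself is in C#; the grid size, stiffness iterations and pinning choices below are assumptions, not the demo's actual parameters.

import numpy as np

W, H = 20, 20                 # cloth grid resolution (assumed)
REST = 0.05                   # rest length between neighbouring points
GRAVITY = np.array([0.0, -9.81, 0.0])
DT = 1.0 / 60.0
ITERATIONS = 5                # constraint-relaxation passes per frame

# Particles laid out on a grid; the two top corners stay pinned
pos = np.array([[x * REST, 0.0, y * REST] for y in range(H) for x in range(W)])
prev = pos.copy()
pinned = [0, W - 1]

# Structural constraints: each particle linked to its right and bottom neighbour
constraints = []
for y in range(H):
    for x in range(W):
        i = y * W + x
        if x + 1 < W:
            constraints.append((i, i + 1))
        if y + 1 < H:
            constraints.append((i, i + W))

def step():
    global pos, prev
    # Verlet integration: new position from current, previous and gravity
    new = pos + (pos - prev) + GRAVITY * DT * DT
    prev, pos = pos.copy(), new
    for _ in range(ITERATIONS):
        # Relax each distance constraint towards its rest length
        for a, b in constraints:
            delta = pos[b] - pos[a]
            dist = np.linalg.norm(delta)
            if dist == 0.0:
                continue
            correction = delta * (dist - REST) / dist * 0.5
            pos[a] += correction
            pos[b] -= correction
        # Keep pinned particles fixed at their original grid positions
        for p in pinned:
            pos[p] = np.array([(p % W) * REST, 0.0, (p // W) * REST])

for frame in range(300):
    step()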
A colleague and I did this test with robot vision, tracking ArUco markers for the CONSENSUS 2018 conference.
Marker-free robot vision in MicroPython, using shapes and colour blobs.
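A minimal sketch of the colour-blob part of the approach, assuming an OpenMV-style camera board (the sensor and image modules below are OpenMV MicroPython APIs; the actual board and colour thresholds in the demo may differ):

import sensor

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

RED_THRESHOLD = (30, 100, 15, 127, 15, 127)   # LAB colour range, tuned per target

while True:
    img = sensor.snapshot()
    # find_blobs groups connected pixels inside the colour range into blobs
    for blob in img.find_blobs([RED_THRESHOLD], pixels_threshold=200, area_threshold=200):
        img.draw_rectangle(blob.rect())
        img.draw_cross(blob.cx(), blob.cy())
        print("blob at", blob.cx(), blob.cy(), "pixels:", blob.pixels())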
Virtual reality auscultation simulator running standalone on a Meta Quest 2, without a laptop or PC.
Rigid elbow exoskeleton with soft actuation using McKibben muscles.
Virtual reality percussion simulator featuring pseudo-haptics (visual force feedback), soft-tissue deformation and collision force conveyed through sound.
Three-degrees-of-freedom, completely soft elbow-and-shoulder exoskeleton test rig made with McKibben muscles and Bowden cables. The test runs slowly for safety reasons, but with faster pneumatic valves it can move faster than human limbs. McKibben muscles can lift up to around 600 kilograms, based on Festo's fluidic muscle technology.
Python ArUco marker tracking code; it outputs each marker frame's 3D position and size.
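A minimal sketch of what such tracking looks like with OpenCV's ArUco module (requires opencv-contrib-python; function names follow the pre-4.7 API and vary slightly across versions, and the intrinsics and marker size below are placeholders, not the project's calibration):

import cv2
import numpy as np

MARKER_SIZE_M = 0.05                      # assumed physical marker edge length in metres
camera_matrix = np.array([[800, 0, 320],  # placeholder intrinsics; replace with real calibration
                          [0, 800, 240],
                          [0,   0,   1]], dtype=float)
dist_coeffs = np.zeros(5)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
parameters = cv2.aruco.DetectorParameters_create()

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary, parameters=parameters)
    if ids is not None:
        # One rotation and translation vector per detected marker (pose of the marker frame)
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, MARKER_SIZE_M, camera_matrix, dist_coeffs)
        for rvec, tvec, marker_id in zip(rvecs, tvecs, ids.flatten()):
            print(marker_id, tvec.ravel())   # 3D position of the marker in camera coordinates
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("aruco", frame)
    if cv2.waitKey(1) == 27:
        break
cap.release()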
The "Jelly Cube in Space" pseudo-haptic (visual force feedback) experiment with ultrasound-based haptics, featuring visual collisions and deformation.
Visible Human Project reconstruction from still images. Two human bodies were donated to science, then frozen and sliced, producing thousands of cross-section photos. I reconstructed the bodies with a script that stitched those thousands of pictures into a 3D XYZRGB point cloud.
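A minimal sketch of the idea behind such a reconstruction: each slice photo supplies one z level, and every non-background pixel becomes a coloured point. The file layout, slice spacing, pixel size and background test below are assumptions, not the original script.

import glob
from PIL import Image

SLICE_SPACING_MM = 1.0   # assumed distance between consecutive slices
PIXEL_SIZE_MM = 0.33     # assumed in-plane pixel size
STEP = 4                 # subsample pixels to keep the cloud manageable

with open("body.xyzrgb", "w") as out:
    for z_index, path in enumerate(sorted(glob.glob("slices/*.png"))):
        img = Image.open(path).convert("RGB")
        width, height = img.size
        pixels = img.load()
        z = z_index * SLICE_SPACING_MM
        for v in range(0, height, STEP):
            for u in range(0, width, STEP):
                r, g, b = pixels[u, v]
                if r + g + b < 60:          # skip near-black background pixels
                    continue
                out.write(f"{u * PIXEL_SIZE_MM} {v * PIXEL_SIZE_MM} {z} {r} {g} {b}\n")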
The robot show I helped put on at the CONSENSUS 2018 conference. Four robots, each with a camera and a Raspberry Pi, were controlled wirelessly from a laptop. The robots stacked cubes to represent the UK national energy grid's consumption in real-time.
IMU tracking with a Kalman filter, visualised in Unity; the filter runs in JavaScript on an Espruino Pico.
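For illustration, here is a minimal single-axis Kalman filter that fuses a gyro rate with an accelerometer angle, the usual structure for this kind of IMU orientation tracking. The original runs in JavaScript on the Espruino; this Python sketch only shows the filter itself, and the noise values are assumptions.

class KalmanAngle:
    def __init__(self, q_angle=0.001, q_bias=0.003, r_measure=0.03):
        self.q_angle = q_angle      # process noise for the angle
        self.q_bias = q_bias        # process noise for the gyro bias
        self.r_measure = r_measure  # accelerometer measurement noise
        self.angle = 0.0            # estimated angle
        self.bias = 0.0             # estimated gyro bias
        self.P = [[0.0, 0.0], [0.0, 0.0]]  # error covariance

    def update(self, accel_angle, gyro_rate, dt):
        # Predict: integrate the bias-corrected gyro rate
        rate = gyro_rate - self.bias
        self.angle += dt * rate
        P = self.P
        P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0] + self.q_angle)
        P[0][1] -= dt * P[1][1]
        P[1][0] -= dt * P[1][1]
        P[1][1] += self.q_bias * dt

        # Correct: blend in the accelerometer angle measurement
        S = P[0][0] + self.r_measure
        K = [P[0][0] / S, P[1][0] / S]
        y = accel_angle - self.angle
        self.angle += K[0] * y
        self.bias += K[1] * y
        P00, P01 = P[0][0], P[0][1]
        P[0][0] -= K[0] * P00
        P[0][1] -= K[0] * P01
        P[1][0] -= K[1] * P00
        P[1][1] -= K[1] * P01
        return self.angle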
A 15-IMU (inertial measurement unit) tracking glove that could track the whole hand in virtual reality, with four actuated cables planned to provide soft force feedback. I abandoned it due to time constraints, but the 15 IMUs already ran the Kalman filter and could accurately report the positions of the fingers and the hand.
I did this interactive 3D demo during my master's. It was programmed in OpenGL and C++ without graphics engines, CAD programs or animation software. The 3D models were entered as vertices and RGB colours, with some shading and many transform matrices. The idea was to illustrate a full-body haptic exoskeleton for virtual reality, similar to the one featured in "The Lawnmower Man".
Object detection using OpenCV in Unity with C#.
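For reference, a minimal object-detection sketch using a Haar cascade, one standard OpenCV technique; the Unity demo calls OpenCV from C# and may use a different detector, so this plain-Python version is only an illustration of the approach.

import cv2

# Pretrained face cascade shipped with opencv-python; any Haar cascade works the same way
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Returns one bounding box per detected object
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in boxes:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) == 27:
        break
cap.release()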
Text detection using OpenCV in Unity with C#.
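A minimal sketch of text-region detection on a single image using MSER, one common OpenCV technique; the Unity/C# demo may rely on a different detector, and the input file name below is hypothetical.

import cv2

img = cv2.imread("sign.jpg")           # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

mser = cv2.MSER_create()
regions, _ = mser.detectRegions(gray)  # stable regions, which text strokes tend to form
for pts in regions:
    x, y, w, h = cv2.boundingRect(pts)
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 1)

cv2.imwrite("text_regions.jpg", img)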
Two spheres are learning to play sumo. The demo is coded in Unity, C#, Python and TensorFlow (machine learning).
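As a hypothetical illustration of the learning setup, here is how a per-step reward for the sphere-sumo task could look: reward for pushing the opponent off the platform, penalty for falling off yourself. The function name, ring radius and reward values are illustrative assumptions, not the project's code.

import math

RING_RADIUS = 3.0   # assumed platform radius in Unity units

def sumo_reward(agent_xz, opponent_xz):
    """Return the per-step reward for one agent given both positions on the platform."""
    agent_off = math.hypot(*agent_xz) > RING_RADIUS
    opponent_off = math.hypot(*opponent_xz) > RING_RADIUS
    if opponent_off and not agent_off:
        return 1.0        # win: opponent pushed out of the ring
    if agent_off:
        return -1.0       # loss: fell or was pushed out
    return -0.001         # small time penalty to encourage taking the initiative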