Small Marker Tracking with Low-Cost, Unsynchronized, Movable Consumer Cameras for Augmented Reality Surgical Training

This Project

Small Marker Tracking with Low-Cost, Unsynchronized, Movable Consumer Cameras for Augmented Reality Surgical Training

  • Goal: Tracking objects in motion usually requires synchronized cameras. Most consumer cameras cannot be hardware-synchronized, research cameras such as Point Grey models are very expensive, and cheaper alternatives such as Pi cameras make it hard to get images off the device. This project tries to deal with all of these problems at once by using cheap $7-10 cameras that are "synchronized" in software with careful threading instead (see the capture sketch after this list). Two of the cameras are attached to the moving manipulators themselves to deal with occlusion, which requires Vive Trackers as well.

  • Current progress: We can track Vive hardware in Hololens space and the entire pipeline is complete, though it still relies on some naive assumptions that should eventually be corrected, and pose interpolation between samples still needs to be added (a sketch of one approach also follows this list). There is also some delay that needs to be dealt with to get better realtime performance.

  • Technology that I'm using: Unity with C# & Vuforia for marker tracking, UE4 with Blueprints for the Vive Trackers & pooling, and Hololens/Unity/Vuforia/C# for the application.

  • Documentation for replicability & source code will be provided after publication.
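Since the "synchronized with threading" idea is only summarized above, here is a minimal sketch of one way to do it: each cheap camera is read on its own thread and every frame is stamped with a shared monotonic clock, so frames from different cameras can later be matched by nearest timestamp. This is an illustration under my own assumptions, not the project's actual pipeline; OpenCvSharp, the camera indices, and all class and field names are placeholders.

```csharp
// Soft-synchronization sketch: one capture thread per camera, one shared clock.
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Threading;
using OpenCvSharp;

public sealed class TimestampedFrame
{
    public int CameraId;
    public double TimestampMs;   // time on the shared Stopwatch clock
    public Mat Image;            // cloned frame; the consumer is responsible for disposing it
}

public static class SoftSyncCapture
{
    // One shared clock for all capture threads ("soft" synchronization).
    static readonly Stopwatch Clock = Stopwatch.StartNew();

    // Latest timestamped frame per camera; a consumer pairs entries by nearest timestamp.
    public static readonly ConcurrentDictionary<int, TimestampedFrame> Latest =
        new ConcurrentDictionary<int, TimestampedFrame>();

    public static Thread StartCamera(int cameraId)
    {
        var thread = new Thread(() =>
        {
            using var cap = new VideoCapture(cameraId);
            var frame = new Mat();
            while (cap.IsOpened())
            {
                if (!cap.Read(frame) || frame.Empty())
                    continue;
                Latest[cameraId] = new TimestampedFrame
                {
                    CameraId = cameraId,
                    TimestampMs = Clock.Elapsed.TotalMilliseconds,
                    Image = frame.Clone(),
                };
            }
        })
        { IsBackground = true };
        thread.Start();
        return thread;
    }
}
```

A consumer would call StartCamera for each device index and then, whenever it needs a "synchronized" pair, take the latest frame from each camera and treat the timestamp difference as the residual synchronization error.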
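The interpolation mentioned under "Current progress" is also not spelled out, so here is a minimal sketch of the general idea: given two timestamped pose samples from an unsynchronized source (a tracker or a camera thread), estimate the pose at the time another sensor's frame was captured. The PoseSample type and method names are my own assumptions, not the project's code.

```csharp
// Pose interpolation between two timestamped samples (Unity math types).
using UnityEngine;

public struct PoseSample
{
    public double TimeMs;
    public Vector3 Position;
    public Quaternion Rotation;
}

public static class PoseInterpolation
{
    // Linearly interpolate position and slerp rotation between the two samples
    // that bracket the query time; clamp outside the sampled interval.
    public static PoseSample AtTime(PoseSample before, PoseSample after, double queryTimeMs)
    {
        float t = after.TimeMs <= before.TimeMs
            ? 0f
            : Mathf.Clamp01((float)((queryTimeMs - before.TimeMs) / (after.TimeMs - before.TimeMs)));

        return new PoseSample
        {
            TimeMs = queryTimeMs,
            Position = Vector3.Lerp(before.Position, after.Position, t),
            Rotation = Quaternion.Slerp(before.Rotation, after.Rotation, t),
        };
    }
}
```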

Short ISMAR paper/poster (more succinct; less technical detail)

Laparo_ISMAR2020_edits_camera.pdf

Longer, older paper draft that's not as well-written but has more technical detail and pipeline figures

ismar20a-sub1208-i7 (1) (1).pdf

Short class presentation (pending the ISMAR presentation)

6dof Dynamic Marker Tracking with Low-Cost, Unsynchronized, Dynamic Cameras
laparo documentation

Related Ongoing Work

Using AR to Improve Training for Laparoscopic Surgery

  • Co-investigators: Hao Jiang, Andrei State, and Andrei Illie

  • Goal: We hypothesize that having a 3D view of the surgical area through an AR display such as the Hololens will improve movement accuracy, since the user gains additional stereoscopic depth cues that help them localize the workspace.

  • Current progress: We are able to get a basic laparoscopic training scenario working on the Hololens, with limitations in tracking and calibration that we are still working on. I personally focus on the realtime simulation and the user study scenario that run after the tracking data is received.

  • Technology that I'm using: Unity with C# for the realtime implementation, the Hololens as the headset, Blender to build all 3D-printed models, and OpenCV + Vuforia for tracking (Hao uses OpenCV to track the cube accurately while I use Vuforia for world calibration; a sketch of the frame conversion follows the attachment below).

  • Attached are pictures of a theoretical portable laparoscopic training apparatus and some videos of our process (roughly ordered from most recent to oldest).

laparoposter (1).pdf
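Since the split between OpenCV cube tracking and Vuforia world calibration is only summarized above, here is a minimal sketch of the calibration step as transform composition in Unity: a pose reported relative to a calibration marker (e.g. a Vuforia image target anchored in Hololens space) is re-expressed in Unity world coordinates. The component and field names are placeholders, not the project's actual code.

```csharp
// World calibration sketch: re-express a marker-relative pose in Unity world space.
using UnityEngine;

public class MarkerFrameToWorld : MonoBehaviour
{
    // Transform of the tracked calibration marker (e.g. the Vuforia target's GameObject).
    public Transform calibrationMarker;

    // Convert a pose given relative to the marker into Unity world space.
    public void ToWorld(Vector3 localPosition, Quaternion localRotation,
                        out Vector3 worldPosition, out Quaternion worldRotation)
    {
        worldPosition = calibrationMarker.TransformPoint(localPosition);
        worldRotation = calibrationMarker.rotation * localRotation;
    }
}
```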

Using Vuforia with Small Markers

  • Goal: Find a good method of tracking very small markers (< 2 cm²). Vuforia and OpenCV are currently among the best marker-tracking options, but here I focus specifically on Vuforia because it works with a wider range of markers, uses many image features, and seems to track very well after the initial acquisition (although acquisition appears to take much longer than with OpenCV).

  • Current progress: Vuforia seems to fare very well with small markers and at long ranges after the initial acquisition, which is the hard part. Auto-focus and glare can easily break the tracking in these more difficult cases (a sketch for controlling the camera focus follows this list).

  • Technology that I'm using: Unity with C# and Android for realtime implementation. Vuforia for tracking.

  • This was expanded into the title project.
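Because auto-focus is called out above as a failure mode, here is a minimal sketch, using the Vuforia Unity API of that era, of triggering a single focus pass once Vuforia has started instead of leaving continuous auto-focus hunting. Whether this actually helps with very small markers is my assumption; the notes above only say that auto-focus can break tracking.

```csharp
// Focus-control sketch for Vuforia in Unity.
using UnityEngine;
using Vuforia;

public class LockCameraFocus : MonoBehaviour
{
    void Start()
    {
        VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
    }

    void OnVuforiaStarted()
    {
        // Run one auto-focus pass; the focus distance then stays where it settled,
        // instead of continuous auto-focus re-hunting and losing the small marker.
        CameraDevice.Instance.SetFocusMode(CameraDevice.FocusMode.FOCUS_MODE_TRIGGERAUTO);
    }

    void OnDestroy()
    {
        VuforiaARController.Instance.UnregisterVuforiaStartedCallback(OnVuforiaStarted);
    }
}
```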