Graphics I Made

(and how I made them)

Here I'll provide info about how I made (and/or where I found) certain graphics from my papers and projects (rendered, animated, etc.). This is mostly a place for graphics that are complicated, or where someone might be curious how they were done and the method isn't already obvious. For example, I already mention in My Projects that the Old Well rendering was done in Blender, and the method really isn't much more complex than that apart from the manual modelling.


(this is from fall 2019 and needs to be updated!)

Some general notes:

  • All of my videos are made in Premiere Pro. I usually render with the H.264 YouTube 720p preset, which is very quick (<5 minutes to render a >5 minute video). You can then use Clipchamp or a YouTube downloader to get a lower file size version.

  • Most recordings of the game engine are recorded with OBS Studio (flv format to lower CPU usage and increase framerate).

  • All of my own papers are made in Overleaf.

Made for: "A Walk to the Park": Virtual Locomotion with Robot-Guided Haptics and Persistent Distractors

  • The virtual implementation was made in Unreal 4. The green spline is what the robot should follow. It's constructed using Unreal's great spline mesh system (a rough sketch of how that's done is below).

  • The robot is a modified Elegoo-brand 4-wheeled Arduino Uno robot (found on Amazon). I added a separate HC-06 Bluetooth adapter, and with the Toshiba Bluetooth Stack I can connect to it easily from a Windows 10 PC (I haven't gotten the connection to work with any other Bluetooth software). The stack exposes the pairing as a regular COM port, so I can use an Unreal 4 serial communication plugin called UE4Duino to send commands to the robot. The robot doesn't do anything by itself besides read and parse commands structured like "m[leftWheelDirection][rightWheelDirection][leftWheelSpeed]_[rightWheelSpeed]". I compute the wheel velocities using the standard differential drive equations (see paper); a rough sketch of the command construction is below.
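
A rough sketch (not the exact project code) of the UE4-side command construction, assuming the velocities are already scaled so wheel speeds land in the Arduino's 0-255 PWM range and that 'f'/'b' direction flags are used (the real encoding may differ):

```cpp
// Standard differential-drive kinematics + the "m[leftDir][rightDir][leftSpeed]_[rightSpeed]" packing.
// WheelBase is the distance between the two wheels; the returned string is what gets written
// to the COM port via UE4Duino.
FString MakeRobotCommand(float LinearVel, float AngularVel, float WheelBase)
{
    // Split the body velocity into per-wheel velocities.
    const float LeftVel  = LinearVel - AngularVel * WheelBase * 0.5f;
    const float RightVel = LinearVel + AngularVel * WheelBase * 0.5f;

    // Direction flag per wheel plus a magnitude clamped to the 8-bit PWM range.
    const TCHAR LeftDir  = (LeftVel  >= 0.f) ? TEXT('f') : TEXT('b');
    const TCHAR RightDir = (RightVel >= 0.f) ? TEXT('f') : TEXT('b');
    const int32 LeftPwm  = FMath::Clamp(FMath::RoundToInt(FMath::Abs(LeftVel)),  0, 255);
    const int32 RightPwm = FMath::Clamp(FMath::RoundToInt(FMath::Abs(RightVel)), 0, 255);

    return FString::Printf(TEXT("m%c%c%d_%d"), LeftDir, RightDir, LeftPwm, RightPwm);
}
```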
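
And for the green guide spline mentioned above, the idea is just to lay spline mesh segments along a USplineComponent authored in the editor. A minimal sketch with illustrative names (GuideSpline and PathSegmentMesh are assumptions, not the project's actual members):

```cpp
// Build a visible path out of spline mesh segments, one per pair of spline points.
void ARobotGuidePath::BuildPathMeshes()
{
    const int32 NumPoints = GuideSpline->GetNumberOfSplinePoints();
    for (int32 i = 0; i < NumPoints - 1; ++i)
    {
        FVector StartPos, StartTangent, EndPos, EndTangent;
        GuideSpline->GetLocationAndTangentAtSplinePoint(i,     StartPos, StartTangent, ESplineCoordinateSpace::Local);
        GuideSpline->GetLocationAndTangentAtSplinePoint(i + 1, EndPos,   EndTangent,   ESplineCoordinateSpace::Local);

        USplineMeshComponent* Segment = NewObject<USplineMeshComponent>(this);
        Segment->SetMobility(EComponentMobility::Movable);
        Segment->SetStaticMesh(PathSegmentMesh);            // e.g. a thin green plank mesh
        Segment->SetStartAndEnd(StartPos, StartTangent, EndPos, EndTangent);
        Segment->AttachToComponent(GuideSpline, FAttachmentTransformRules::KeepRelativeTransform);
        Segment->RegisterComponent();
    }
}
```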

A_Walk_to_the_Park__Robot_Based_Active_Haptics_for_Virtual_Navigation__Copy___Copy_ (11).pdf
  • Most figures were made in Illustrator, combining Illustrator vector assets, screenshots from Unreal, and real photos

  • Figures 2, 3, 5, and 6 were made in Google Drawings for a specific reason: its arrow connectors work better than PowerPoint's and it's easier to play with their curvature

  • Data was analyzed in SPSS and the tables were made with that data in Excel.

  • Protip: you can disable the UE4 tonemapper when you take screenshots in Unreal 4 (e.g. with the ShowFlag.Tonemapper 0 console command) so that the background of a complex object is white (like the room in Figure 9), reducing the need to do anything but crop

  • The city in Fig 1e is borrowed from a free asset in the UE4 asset store called ModularBuildings: https://www.unrealengine.com/marketplace/en-US/slug/modular-building-set?sessionInvalidated=true

  • The Razzaque fire drill scenario was provided by Razzaque himself. I used Maya to convert the old '98 Maya format into something Unreal could use.

  • This smaller project was made with Unreal 4. Everything is an Unreal standard asset with additional cel shading provided by an asset pack called "Advanced Cel Shader Lite."

Made for: Shared Haptics in Multi-User Surgical Training Environments

  • Both of these demos were made in UE4. For the upper video, multiplayer features are provided through Unreal's great RPC system (a rough sketch of the pattern is after this list). The Leap Motion hands are from the Leap UE4 plugin, although I needed to modify a lot of their code to get it to work with my system (and to enable better grabbing, collision detection, networking, etc.)

  • The lower video uses a VR adaptation of an old plugin called Kinect4Unreal, which animates the default UE4 mannequin using skeletal data from the Kinect v2.
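
For reference, here's a minimal sketch of the server/multicast RPC pattern behind the networked grabbing mentioned above (illustrative names on a hypothetical hand pawn, not the Leap plugin's actual API):

```cpp
// --- In the pawn's UCLASS body (header) ---
UFUNCTION(Server, Reliable, WithValidation)
void ServerGrabActor(AActor* Target);

UFUNCTION(NetMulticast, Reliable)
void MulticastOnGrabbed(AActor* Target);

// --- In the .cpp ---
bool AHandPawn::ServerGrabActor_Validate(AActor* Target) { return Target != nullptr; }

void AHandPawn::ServerGrabActor_Implementation(AActor* Target)
{
    // Runs on the server: claim the object, then tell everyone about the grab.
    Target->SetOwner(this);
    MulticastOnGrabbed(Target);
}

void AHandPawn::MulticastOnGrabbed_Implementation(AActor* Target)
{
    // Runs on the server and every client: snap the object to the hand so all players see it.
    Target->AttachToActor(this, FAttachmentTransformRules::SnapToTargetNotIncludingScale);
}
```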

Made for: Using AR to Improve Training for Laparoscopic Surgery

Made for: Evaluating the Effectiveness of Redirected Walking with Auditory Distractors for Navigation in Virtual Environments

  • All of the realtime part is in Unreal 4

  • The visualizer at the top of the obstacle avoidance section was made with matplotlib in Python

  • The top visualizer in the non-obstacle avoidance section is still UE4, just rendered from an aerial view. It reads data that was gathered from the realtime implementation and is synced with the other videos

  • The bottom left subvideo was made by implementing perspective matching in UE4 (sketched below). Basically, I play the video on a plane that is parallel to the camera's image plane, then approximate the floor position in the 3D world with another plane that is rotated relative to the video plane from the camera's perspective. The result is that the Python visualizer looks like it's projected onto the real floor. I think it came out pretty well for a rough estimate.
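
A minimal sketch of that perspective-matching setup (illustrative names; VideoPlane/FloorPlane are assumed UStaticMeshComponent members, and the floor is simplified to a horizontal plane at a known height):

```cpp
void APerspectiveMatcher::UpdatePlanes(const FVector& CamLoc, const FRotator& CamRot,
                                       float VideoDistance, float FloorHeight)
{
    const FVector Forward = CamRot.Vector();

    // 1) Video plane: fixed distance in front of the camera, parallel to the image plane
    //    (plus whatever fixed rotation offset your quad mesh needs to face back at the camera).
    VideoPlane->SetWorldLocationAndRotation(CamLoc + Forward * VideoDistance, CamRot);

    // 2) Floor plane: intersect the camera's forward ray with z = FloorHeight and lay the quad
    //    flat there, so the projected visualizer appears to sit on the real floor.
    if (FMath::Abs(Forward.Z) > KINDA_SMALL_NUMBER)
    {
        const float T = (FloorHeight - CamLoc.Z) / Forward.Z;   // > 0 when the camera looks toward the floor
        const FVector FloorHit = CamLoc + Forward * T;
        FloorPlane->SetWorldLocationAndRotation(FloorHit, FRotator(0.f, CamRot.Yaw, 0.f));  // default plane mesh lies flat
    }
}
```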

Evaluating_the_Effectiveness_of_Redirected_Walking_with_Auditory_Distractors_for_Navigation_in_Virtual_Environments__Revision_ (2).pdf
  • Figure 3 was also made with perspective matching (same method as above)

  • All of the non-table figures (besides Fig 3) were made with Illustrator+screenshots from UE4. The table data was analyzed with SPSS and made presentable with Excel.

Made for: Original traffic sim for Maze Day (2016)

This UE4 project used a mix of assets from Epic Games' Infiltrator and Showdown demos.

Made for: COMP872 Virtual Worlds VR assignment (2017)

This is a modified version of the UE4 Infiltrator demo by Epic Games (just the above-ground part, with some optimizations for VR performance).

Made for: Eye-tracking for egocentric displays

  • These were all rendered in Unreal 4 and edited with Illustrator

  • The extended lens apparatus thing was made in Blender with boolean modeling and sent to Unreal.

  • Glass materials came from the Advanced Glass Material Pack in UE4.

  • The glasses were found on a free 3D object site and use the Automotive Materials Pack (also used in the demos further down). The glasses above (a) were older; the other pair looks nicer.

  • The head comes from Epic Games' Digital Human demo.

Made for: Diffraction Kernels for Interactive Sound Propagation in Dynamic Environments

Diffraction_Kernels.pdf
  • Figures 1, 2, 7, and 9 were made in Illustrator

  • Fig 4 was made in Blender

  • Fig 3 was also made with Illustrator, but the "Projected Area & Curvature" part was made by rendering different perspectives in Blender and placing them correctly so it looks nice.

  • Fig 5 was made similarly, with some extra sample point positions

  • Fig 6 was made by merging Blender renderings in Illustrator with some opacity magic

  • Fig 10 was made in Unreal

  • All of these demos are made in UE4

  • The concert scene is a heavily modified free concert scene from SketchUp that I augmented with Mixamo characters and the Automotive Materials Pack. I also used foliage from Epic Games' Kite Demo, and the background buildings are from their Infiltrator demo

  • The helicopter scene is a mix of Epic Games' Showdown and Infiltrator demos + Mixamo characters + some free objects and a helicopter I modified in Blender to add a rig. Helicopter motion is done with UE4's spline system (see the sketch after this list).

  • The parking garage is a heavily modified version of a mesh I bought from some old 3D model website + free car meshes (augmented with the Automotive Materials Pack) + spline motion

  • The 2 interactive demos are modified versions of demos provided to us through partnership with Oculus. I made some new disks and spawnable objects for the First Contact demo in Blender. The other one is a merger of Epic Games' ShooterGame demo and Oculus' Toybox demo, networked with UE4's RPC system.
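
The spline motion in these demos (helicopter, cars) boils down to sampling a USplineComponent by distance every tick. A minimal sketch with illustrative names (Spline, Speed, and Distance are assumed members, not the demos' exact setup):

```cpp
void ASplineFollower::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Advance along the spline and wrap at the end so the motion loops.
    Distance = FMath::Fmod(Distance + Speed * DeltaSeconds, Spline->GetSplineLength());

    const FVector  Loc = Spline->GetLocationAtDistanceAlongSpline(Distance, ESplineCoordinateSpace::World);
    const FRotator Rot = Spline->GetRotationAtDistanceAlongSpline(Distance, ESplineCoordinateSpace::World);
    SetActorLocationAndRotation(Loc, Rot);
}
```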

Made for: Towards Fully Mobile 3D Face, Body, and Environment Capture Using Only Head-worn Cameras

  • This demo was made in UE4 using the Alembic system for geometry caches (the reason being that Young-woon's human mesh was a merger of 2 animation methods that UE4 can't deal with easily). The Alembic file was made in Maya using MEL to merge a sequence of OBJs. I would recommend avoiding Alembic for this as it uses an insane amount of RAM in UE4 due to caching+runtime RAM requirements, but we didn't have much of a choice.

Made for: Audio-Material Reconstruction for Virtualized Reality Using a Probabilistic Damping Model

  • All demos were made in UE4

  • The windchime demo was made with the following:

    • I made the chimes myself so that the synthesis could be computed from their meshes

    • Open World Collection in UE4 for some foliage and trees

    • Epic Games' Kite Demo in UE4 for more trees, grass, logs, etc. (note that it takes forever to cache and compile shaders)

    • Swinging motion is done with UE4's cable system and built-in wind. You can daisy-chain cable components like this (see the sketch at the end of this list).

  • The video of a hand holding a wrench and hitting the porcelain bowl was rendered in Blender. In hindsight, I wouldn't have done it this way. The cables are done by daisy-chaining tiny cubes together (I don't recall there being a better way of doing this in Blender back in 2015). I animated the hand in Blender as well.

  • The virtual museum demo is a VERY heavily modified version of Epic Games' Realtime Rendering demo and some materials are modified UE4 standard assets.

  • The twinkle twinkle demo is a modified version of Epic Games' Infiltrator demo. Ball motion relies on a spline because if you relied solely on physics, the ball would bounce differently every time.
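
For the cable daisy-chaining mentioned in the windchime bullet above, the gist is that each cable hangs from one link and has its end attached to the next link. A minimal C++ sketch (illustrative; the actual demo may have been wired up in the editor instead, and AWindChimeFrame/Links are assumed names):

```cpp
#include "CableComponent.h"   // built-in Cable Component plugin

// Chain a series of link actors together with cables.
void AWindChimeFrame::BuildChain(const TArray<AActor*>& Links)
{
    for (int32 i = 0; i + 1 < Links.Num(); ++i)
    {
        UCableComponent* Cable = NewObject<UCableComponent>(Links[i]);
        Cable->AttachToComponent(Links[i]->GetRootComponent(), FAttachmentTransformRules::KeepRelativeTransform);
        Cable->CableLength = 20.f;                        // slack between links, in Unreal units
        Cable->NumSegments = 4;                           // keep the simulation cheap
        Cable->SetAttachEndTo(Links[i + 1], NAME_None);   // cable end follows the next link's root component
        Cable->RegisterComponent();
    }
}
```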

Made for: P-Reverb: Perceptual Characterization of Early and Late Reflections for Auditory Displays

1902.06880.pdf
  • All demos made in UE4.

  • Figures 1 and 11 were made in Illustrator with screenshots from UE4.

  • Figure 9 was made in Illustrator; the complex mesh in it was designed mathematically in Blender.

  • Figure 10 was made in UE4 with a simple Blender mesh.

  • This "Tuscany" demo is a merge of the typical Tuscany demo + Sibenik cathedral, ported to UE4.

  • The "Sun Temple" demo is exactly the same as Epic Games' Sun Temple apart from the motion.

  • The "Shooter Game" demo is exactly the same as Epic Games' ShooterGame demo apart from the motion.