VR for Scientific Sensemaking
I am a researcher on a multi-year project to create multi-user VR interfaces for scientific sensemaking, supported by a Sloan Foundation grant.
I am a VR developer and design researcher pursuing a PhD, advised by Katherine Isbister at the SET Lab at the University of California, Santa Cruz.
I believe that virtual reality and spatial computing should empower us to overcome difficult existential problems such as climate change. On a project supported by a Sloan Foundation grant, I research multi-user applications for scientific sensemaking, such as VR simulations for wildfire evacuation specialists.
Outside of research, I serve as a community leader for the Creative Code Collective, collaborate on arts projects, and participate in movement communities.
Reach me at [email protected]
For PhysioCHI '24, our team explored how passthrough head-mounted displays could let us use physiological signals to enhance collocated movement. We wrote about how these interfaces might change across one-to-one, one-to-many, and mutual training scenarios.
Eye Ball is a research project I led to explore how gaze tracking can be used for novel interfaces in immersive multiplayer games. We tested how gaze interactions could augment VR dodgeball: for example, if you closed your eyes, a shield would pop up to protect you!
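The shield interaction boils down to a small piece of per-frame logic. Here is an illustrative sketch under my own assumptions (the threshold, frame count, and function names are mine, not the project's actual code); eye trackers typically report a per-eye "openness" value between 0 and 1.

```javascript
// Hypothetical gaze-shield logic: raise the shield only after both eyes
// have been closed for a few consecutive frames, so ordinary blinks
// don't trigger it.
const BLINK_THRESHOLD = 0.15; // below this, treat the eye as closed
const HOLD_FRAMES = 3;        // frames of closure required to raise the shield

function makeShieldController() {
  let closedFrames = 0;
  return function update(leftOpenness, rightOpenness) {
    const eyesClosed =
      leftOpenness < BLINK_THRESHOLD && rightOpenness < BLINK_THRESHOLD;
    closedFrames = eyesClosed ? closedFrames + 1 : 0;
    return closedFrames >= HOLD_FRAMES; // true => shield is up this frame
  };
}
```

Debouncing over a few frames is the design choice that makes the mechanic feel intentional rather than accidental.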
Chem Reality is a WebXR demo created at Vision Dev Camp 2024 that lets a user browse 3D structures of molecules from an Apple Vision Pro or Meta Quest 3. An early test of the potential of three.js-based projects, built at the largest AVP developer gathering to date!
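Before a viewer like this can place atoms in a three.js scene, it has to pull coordinates out of a structure file. A minimal sketch of that step, assuming the input is a PDB-format record (the column ranges follow the PDB format specification; the function name is mine, not Chem Reality's actual code):

```javascript
// Parse a single PDB ATOM/HETATM record into an atom object.
// PDB files are fixed-width: x/y/z live in columns 31-54,
// the atom name in columns 13-16, the element symbol in 77-78.
function parsePdbAtom(line) {
  if (!line.startsWith("ATOM") && !line.startsWith("HETATM")) return null;
  return {
    name: line.slice(12, 16).trim(),   // e.g. "CA" for an alpha carbon
    x: parseFloat(line.slice(30, 38)), // orthogonal coordinates in angstroms
    y: parseFloat(line.slice(38, 46)),
    z: parseFloat(line.slice(46, 54)),
    element: line.slice(76, 78).trim() // e.g. "C", "N", "O"
  };
}
```

Each parsed atom can then become a sphere mesh positioned at `(x, y, z)`, colored by `element`.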
For With or Without Justice, Karen Abe, Jules Park, and I explored how situated AR could bring to light historical artifacts related to colonial tourism.
As part of an experiment to put acrobatic and martial arts movement inside VR, Danny Andreev from Sunburn Schematics created a 3D mesh of my body using photogrammetry and post-processing tools, and I used Mixamo to rig and animate capoeira movements. The ultimate goal is to be able to fight, dance, and play with this avatar using complex, expressive kicks, spins, and dodges.
While Angie Fan was at the SET Lab, I was able to assist with an Applied Cuteness Research DIS paper and with some VR development for Cutie, an MFA thesis exhibition.
With the Processing Foundation, I learned how to contribute open-source code and made contributions to p5xr, a library that implements the WebXR standard and lets p5 3D sketches be viewed in VR headsets.
Booksnake lets you explore digitized archival items in the real world using the International Image Interoperability Framework (IIIF). This open framework, used by leading galleries, libraries, and museums, allows Booksnake to download an item's image and metadata, creating a custom virtual object for you to explore.
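The IIIF Image API that Booksnake relies on exposes each image behind a predictable URL pattern of the form `{base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}`. A hedged sketch of how a client can build such requests (the pattern comes from the IIIF Image API spec; the base URL, identifier, and function name below are placeholders, not Booksnake's own code):

```javascript
// Build a IIIF Image API request URL from a base endpoint and an item
// identifier. Defaults ask for the full image at maximum size.
function iiifImageUrl(base, identifier, opts = {}) {
  const {
    region = "full",     // "full", "square", or "x,y,w,h" pixel region
    size = "max",        // "max", "w,h", or "!w,h" to fit within a box
    rotation = "0",      // degrees clockwise
    quality = "default", // "default", "color", "gray", "bitonal"
    format = "jpg"
  } = opts;
  return `${base}/${encodeURIComponent(identifier)}` +
         `/${region}/${size}/${rotation}/${quality}.${format}`;
}
```

For example, passing `{ size: "!512,512" }` requests a thumbnail that fits inside a 512-pixel box, which is handy for building gallery views before downloading full-resolution assets.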
Bunker Hill VR is a VR recreation of 1930s Downtown Los Angeles by the Ahmanson Lab. Architectural and humanities scholars created 3D assets from historical photographs and civil engineering schematics, which power web, desktop, and VR applications. We collaborated with many institutions, including the Los Angeles City Archives, the Natural History Museum of Los Angeles, the Library of Congress, the Getty, and the Huntington Library.
Breath Garden is a WebXR-based immersive environment linked to breathing exercises and integrated with a BCI platform. It was built for the 2021 XR Brain Jam. The flow of a river responds to guided breathing exercises or to data from heart-rate sensors, hemoencephalography (HEG), or electroencephalography (EEG). I wrote the graphics code for the environment and worked with an interdisciplinary team to design the experience.
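The core of an experience like this is a mapping from a breathing signal to a render parameter. An illustrative sketch under my own assumptions (cycle length, flow range, and names are mine, not the project's actual code), using a timed guided-breathing cycle as the input:

```javascript
// Map a guided breathing cycle to a river flow multiplier.
// The cosine ramps the value smoothly from minFlow (start of inhale)
// up to maxFlow (mid-cycle) and back, once per cycle.
const CYCLE_SECONDS = 10; // e.g. a 10-second guided inhale/exhale cycle

function flowFromBreath(timeSeconds, minFlow = 0.2, maxFlow = 1.0) {
  const phase = (timeSeconds % CYCLE_SECONDS) / CYCLE_SECONDS; // 0..1
  const breath = 0.5 - 0.5 * Math.cos(2 * Math.PI * phase);    // 0..1..0
  return minFlow + (maxFlow - minFlow) * breath;
}
```

Calling this each frame with elapsed time and feeding the result to the water shader makes the river visibly "breathe"; a sensor-driven version would replace the timed `phase` with a normalized HEG, EEG, or heart-rate value.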
Minerva is a Raspberry Pi-based telepresence robot that serves as an educational platform, letting students practice implementing features such as networking, controls, and computer vision in a computationally constrained environment.
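A typical controls exercise on a platform like this is a proportional steering loop. A generic sketch under my own assumptions (the gain, command range, and names are illustrative, not Minerva's actual code):

```javascript
// Proportional heading controller: turn command is proportional to the
// heading error, clamped to the motor command range [-1, 1].
function makeHeadingController(kp = 0.8) {
  return function steer(error) {
    // error: desired heading minus measured heading, in radians
    const turn = kp * error;
    return Math.max(-1, Math.min(1, turn));
  };
}
```

Students can then experiment with the gain `kp`: too low and the robot turns sluggishly, too high and it overshoots and oscillates, which motivates adding derivative or integral terms.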
The Ahmanson Lab collaborated with the Vatican to render the Raphael Rooms, which contain multiple High Renaissance frescoes. The 16th-century art historian Vasari compared figures across the rooms, creating a 3D narrative that moves between walls and the ceiling. An interdisciplinary team built a novel annotation interface for this: lines that criss-cross and entangle the space.
Three years of practical, critical, and hands-on workshops run out of the makerspace at the Ahmanson Lab.
Bodyscape is a 3D-printed fashion piece by Behnaz Farahi that incorporates wearable technology. It was exhibited internationally (like at Ars!) and locally. I contributed generative patterns of light that respond to the model's gait. For other pieces I worked on maintenance code, Bluetooth communications, and safety engineering.
Corpus Callosum is an engineering student organization named after the part of the brain that bridges the two hemispheres; in the same spirit, it encourages projects that bridge technology and the arts. As technical director, I oversaw the org's growth to over 100 participants, and it is still active today!