Overview
Early this year, Mike, Nick, and Media participated in a spatial computing hackathon at the MIT Media Lab. (Spatial computing here means the combination of technologies that represents the next step in the evolution of mass-market XR: SLAM tracking, 6DoF controllers, and a live view of the environment via waveguide optics or passthrough cameras.) At the hackathon, they experimented with applications for AR-enabled headsets such as the Magic Leap One, the HoloLens, and the Vive Pro with SRWorks. Media and Mike won the Best Use of Magic Leap prize with their assistive vision prototype, cleARsight.
They’re going to share their experiences with these devices, walk through some of the frustrations they encountered and how they resolved them, and describe their plans to apply spatial computing in their respective fields: photography, animation, and healthcare. They’ll also talk about the imminent changes in the ecosystem that will make spatial computing capabilities much cheaper and more widely available.
Objective
Share our experiences developing app prototypes for spatial computing devices.
Target Audience
People interested in building apps for spatial computing devices.
Assumed Audience Knowledge
A basic understanding of how an app gets onto a standalone XR headset will be helpful for following along; a little firsthand experience with either Unity or mobile development on any platform is enough.
Five Things Audience Members Will Learn
- The essential characteristics of spatial computing devices
- Applications for creation tools
- Applications for assistive devices
- Why Android developers should be paying special attention
- …and why they’re about to get a lot cheaper