Anyone can create a pretty little bamboo garden in the world, but I doubt the gardener would succeed in incorporating the world in his bamboo grove.
In these precedented times it was a joy to see many of my favorite XR industry friends in person at Augmented World Expo (AWE) 2021. Although we work on 3D immersive technology, the physical world is still where we go for full connection. For how much longer will the meat-verse remain so much better than a metaverse that we will want to fly in from around the world to visit with each other?
This was my first AWE, and I was attending for work now that I’m the main XR UI/UX designer at BadVR. I’ve been reflecting on how far I’ve come: first meeting Steve Lukas and seeing his Magic Leap demos in his garage, playing with Spatiate alpha builds with the Leap squad, making Holo Healing as my first Magic Leap project (I pitched it to the Independent Creator Program), then attempting a medical first-response XR startup, to now working in an official industry position at a growth-stage startup. The wide-eyed idealism of the Magic Leap launch inspired me to dream huge and learn 3D skills as fast as possible, and fueled my own naïve approach of throwing my whole heart and personal skin into a complex startup. Now, though, I have been tempered by the pandemic and the real-world need for money to buy food. Somehow, I have been fortunate enough to ‘go pro’ in this amazing XR industry, just in time for it to become ‘the metaverse’ and grow to many trillions of dollars. I owe it all to my heart and my crazy investment in this wild new technology. I took the magic leap and landed on the other side, yet the magic has become more subtle, muted perhaps as it mixes further with industry and capitalism. If I hadn’t tried, I could not have succeeded, even if that success did not look like I imagined.
From a passionate cohort of talented dreamers and builders to the next industrial revolution, the XR ride has been intense yet thrilling.
So this AWE felt special to me personally, and for the industry it marks a turning point: this AR/VR alphabet soup has grown up into ‘the metaverse’ and is poised to touch every industry and human activity of technological society. There’s still a long way to go, but now no one can deny that XR is the next wave of computing, and computing is the dominant paradigm for economic value in this era.
So what did I learn at AWE? That business models are already moving faster than the actual vision for a metaverse, and now it appears we are entering the Xverse wars, as Rony tweets about it: a time when walled-garden platforms will compete to become your one-stop realm for all things, begrudgingly letting you load into another app if you must, but no fusion! The metaverse has already morphed from the idea of a virtual neighborhood or city (shared standards acting as the roads, trains, sidewalks, and so on) to the idea of a Disneyland for work and play. One company looms above it all wanting to grab it, and every other company seems to want it all too. Interoperability seems destined for a later discussion, after the consolidation into private silos. It brings me some sadness to see companies sell platforms and apps as an entire metaverse, yet it brings me more hope to continue being part of the amazing XR community, which is actually building the metaverse out of many pieces that we will stitch together organically as the technology truly evolves.
A feeling I’m left with: we’re moving forward tangibly, yet conceptually we are still where we have been for decades: in early grind mode, trying to get concepts working and out to people. This is how I know we’ve grown up into a proper industry: it’s kinda boring, with sparkles of magic.
Let’s get the coolest demo experience out of the way: yes, I did see Magic Leap 2. Thanks to Shane and my association with BadVR (a company originally funded by Magic Leap), our team was able to get an eyes-on experience with the ML2. I was blown away. The visual quality is incredible, with rich colors and a bright, uniform display, plus a welcome improvement in field of view. Yet the real star of the ML2 is the selective dimming feature: it works incredibly well. The background can become almost completely opaque. Yes, that’s right folks, the Magic Leap can now render black. This is a huge paradigm shift for additive display technology, coming just in time to compete with video-passthrough devices. The dimming feature is not just a global mode; it is an array that can dim arbitrary shapes. For example, I was told the dimming system could block out lights in the ceiling while leaving the rest of the view open. Before this demo I was not particularly excited about Magic Leap 2, yet now I’ve flipped over to wanting one. It still has a tethered compute pack, and I can’t wear my glasses with it; like the ML1, it requires prescription inserts. Those are acceptable trade-offs, since the optics were so impressive that I am now ready to buy one, even if my credit cards are not as ready! I hope ML1 experiences ‘just work’ out of the box on ML2, and maybe there will be a discount for us early leapers who believed when many didn’t? 😉
On the presentation front, two of the largest announcements at AWE were Niantic’s Lightship platform for building AR experiences and Qualcomm’s Snapdragon Spaces development platform and vision.
Niantic showed off a vision of an augmented world that brings you closer to the real world by weaving in joyful characters driven by complex computational understanding of mapping and context. It looked like the Magicverse talk at LeapCon, yet more boring and tangible, since it’s running on phones now. Features familiar to headset users are coming to mobile, from scene meshing to semantic understanding, along with impressive real-time occlusion and situational mapping systems that make world-scale content possible. Yet sitting next to Andres, we both wondered: where does this live? Is it something I have to choose to use at the expense of other systems? Could I use Niantic’s anchors with someone else’s meshing? It felt like a ‘one stop shop’ solution, which is worrying, because right after Niantic’s CEO got off stage, Qualcomm announced a development platform too.
It’s called Snapdragon Spaces, and it again offers meshing, hand tracking, semantic understanding, and other goodies. Still, I was curious: is this something I add to Unity? Is it per app? Is it for all systems running XR2 chips? Their intro talk was vague, showing us polished if generic demos of image tracking on a book and something else I don’t remember. I knew I would have to get a better understanding at Steve Lukas’ talk. From it I gathered that this is a base platform API for Qualcomm chipset hardware. Steve conveyed Qualcomm’s hypothesis: in order to get to full lived-in headset experiences, we need to take a trip through mobile-phone-driven, headset-enhanced experiences. Again, this is a ‘growing up’ moment for XR that I think will pay off. Solving the entire ‘headset as a computer’ problem is going to take us all years of working on our various pieces. Adding value via a headset as a companion to a mobile device can make sense now: a specialized tool that adds value through synergy rather than trying to bring all the value on its own.
Still, my questions linger: could I mix the Snapdragon Spaces frameworks with Niantic’s? What about Microsoft’s? As a researcher and professional, all this still feels frozen in time: we’re still being locked into single platforms that build specific experiences. The XR computer is fragmented within its walled gardens. It’s increasingly possible to build amazing AR and VR applications, yet the path toward millions or billions of applications existing in ‘the metaverse’ is still not on the real roadmap. Users and money come first, and interoperability seems like a more distant dream. Everyone seems to treat XR systems more like game consoles than like a desktop OS. It will take our continued effort to pry these systems open and make them interoperate. Unless WebXR really can deliver feature parity with these special software stacks…
On the open standards front, I attended an overview of OpenXR’s progress and next steps. Thank the Standards Gods that the industry went with OpenXR; I remember back in 2017 and 2018 being concerned (read: terrified) that each headset would stay its own esoteric platform target. Instead, it is now possible to (kind of) build once and deploy anywhere. OpenXR has so far focused on hardware compatibility, so its next journey will be into the more meta layers. The presenter mentioned that the Khronos Group is considering ways for semantic layers to inter-communicate within the framework. The actual ‘metaverse’ will need standards far more than it needs big single apps, so hopefully these mega-corps and new startups alike will realize that short-term user capture must be merged with long-term ecosystem support. If each company wants people to live in its amusement park, then it will need to invest in the standards that provide the on-ramps and sidewalks getting us to and from their power-play spaces. Walking across the ‘home screen’ of one sanitized company’s platform, or surfing the endlessly self-evolving metanet? Choose the better future!
On the interfaces side, I attended one talk about eye tracking and another about hand tracking. Ultraleap argued that hand tracking is as important to XR as the touch screen was in making smartphones part of daily life. Clearly controllers are not the mainstream path for XR devices, and at BadVR we design primarily for hands for the same reasons described: instant immersion, intuitive interface, nothing to hold and lose. Yet hand tracking alone is not the future; it is merely our best present. My brief experiments with eye tracking showed me that the real power of XR will be unlocked not by doing more physical action, but by doing less. The Minority Report future will not arrive; instead you’ll gaze and think your way to more activity than could ever be done with hands and arms flailing. Eye gaze is more universal and reliable than hand tracking, yet it is still in its infancy. And to enable eye tracking is to open the window of the soul to computerized scrutiny. We will need to demand that certain data never leave the headset, lest your deepest secrets be revealed by where you look and what you care about.
Cue running into Kavya Pearlman, the indefatigable avatar of XRSI and an all-around kind person. I was glad to see her flying about AWE; someone needs to think of the future: how will these technologies bring us sovereignty rather than tyranny? Only through our demands and our action of holding the whole industry accountable to human rights and decency.
Finally, I need to shout out the AR House brought to life by Lucas and Aidan. This is the antidote to corporate boredom, a place where the early hacking LeapNation creator spirit lives on and expands the medium beyond what can be imagined by ‘industry’ approaches. I hope to see it grow into a foundation that people are encouraged to support just because it should exist, not for any ROI. Now that XR is an industry, we risk getting stuck in boring loops of funding what is possible to imagine, rather than imagining what is possible.
Out on the vendor floor, I spent the second day giving demos of BadVR’s software to conference-goers before departing back to the office. It was a very rewarding experience. When I first saw SeeSignal on Magic Leap in 2019, I could see that BadVR was making truly native XR software: things that could only be made in the new medium. I still feel that we are at the forefront of what XR is actually all about: new ways of seeing and understanding that were never possible before.
It was great to hear that sentiment shared by many conference-goers; a common thread of feedback was ‘most of the other AR stuff seems like a solution looking for a problem, but this makes sense as a solution for actual problems.’ Kent Bye came by and was impressed with the SeeSignal demo I gave him, and Robert Scoble visited and shared endless wacky yet accurate details about how spatial audio is the real revolution. It was an action-packed day showing the tangible possibilities of headsets: new ways of seeing, both on the ground and as a situational-awareness eye in the sky.
I slipped away to peruse the whole floor, including the play area, and my heart felt full seeing Stan Larroque and the sweet Lynx headset that I backed (don’t worry Stan, I won’t rush you all… too much) and marveling at how powerful the technology is becoming. Yet overall I felt wary. Behind the bright pins and characters of new fantastical games, the XR technology stack is among the most powerful things humans have yet built. It is revolutionizing every industry, from games to war, and ushering in a new era in which humans are surrounded by computation, rather than being mere computer users.
2021 is the year that XR grows up into ‘the metaverse’. All the tech is the same, but now, with Facebo-er, I mean Meta pumping billions into VR and gathering millions of users, Microsoft trying to call everything a metaverse while continuing to ship useful AR headsets, and all the other macro and micro players jockeying for position, it really feels like a proper industry. It is our time now: the world is starting to notice that there’s a new kind of computer you wear on your head, and a new type of computer understanding that can perceive the real world.
Still, let’s not get ahead of ourselves: it has only just become possible to put an animated model of a Pokémon on the ground in the same place for multiple people to see. There’s still much work to do!
The XR industry is so young that it still has its soul to gain or lose through the choices made about what it’s for and how we design it. Will it uplift us and bring new empowerment, allowing us to see ourselves anew so that we can heal and advance to the technics and societies we need to thrive amid the demands of the new Earth era? Or will it blind us in the most powerful saccharine pleasure trap ever constructed, run on oil money until microplastic waste chokes the bottom of the food chain and the oceans collapse, with all of us soon following? The choice is ours, but don’t delude yourself about the stakes. This is not a mere ‘product’; XR is at its heart a new way to perceive, structure, and modify reality. This level of change comes around rarely; perhaps only the advent of language itself can compare. Now that another revolution of new monetary forms and value experiments on blockchains is crashing into 3D graphics, all bets are on for a new kind of technological reality to emerge and consume prior forms. A world is dawning where computation becomes not something you do, but something you inhabit, along with the world, the entire economy, and the ever more numerous machines…
What will we do with this? What are we using our time for? In the 1960s and 70s there was a dream of computation: that it would liberate knowledge, enhance thought, and bring creativity to every corner of human affairs; that everyone would learn to program and ride the bicycle of their mind. Then it got locked down into ‘apps’, and the profit motive warped it into a contest for attention and subjugation. Rather than seeing us all as its programmers, the modern internet sees us all as its product. Once again we are at a turning point for computation, yet unlike last time we know how valuable it will become; we are greedy adults in this computer revolution, unlike the wild-eyed dreaming youths who started the prior one.
Can we resist the temptation of domination and let a trillion free, open, and interoperable meta-worlds bloom, or must consolidation be the name of the game until only one not-so-meta-verse remains? You and I hold part of the answer to these questions. I live the need to use this technology for more than we can currently imagine, and I invite you to show me what’s in your heart that can only be unlocked through this special new medium.
Even as you do business, free your mind…