A series of sketches on what current interactions might look like in the metaverse. Date of implementation: undetermined
Reading Snow Crash was illuminating and disappointing. I can see why so many startups latched onto the idea of selling land (a misstep, in my opinion), but also what today's flat, two-dimensional interactions might look and feel like once embodied in the metaverse. The hypercard as a metaphor for transacting, or the Librarian as an aid to research, are still, thirty years after Snow Crash was first published, genuine advancements over the digital interactions we have today.
This got me thinking: what is the difference between the web of today and the web of tomorrow likely to be? Reality moves in increments rather than sudden leaps, so if we take today as the starting point for imagination, what are we likely to see? From a designer's point of view, the fundamental innovation enabling the metaverse, the equivalent of HTTP, is the Universal Scene Description (USD), developed and later open-sourced by Pixar. It lets everyone create scenes and lets those scenes talk to each other.
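To give a concrete feel for USD, here is a minimal scene in its human-readable text format (.usda). This is an illustrative fragment, not from any real product; the prim names are invented for this example:

```usda
#usda 1.0
(
    defaultPrim = "Lounge"
)

def Xform "Lounge"
{
    # A placeholder object inside the lounge scene
    def Sphere "Ornament"
    {
        double radius = 0.5
    }
}
```

Because the format is open and layered, one creator's scene can reference or compose another's, which is what makes scenes able to "talk to each other."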
Based on this, here is my take on what is likely to be different in the metaverse compared to flatland, once we get there.
Flatland vs. the Metaverse
Web 2.0 | Flatland
Web 3.0 | Metaverse
Today we are about a third of the way down the list, at Objects and the first wave of Experiences such as Beat Saber and exercise apps. These are native experiences with no parallel in today's world. But what about the myriad experiences we do have in Web 2.0? How might they look different in the metaverse?
We see the future best in stories. To that end, here is the first in a series of metaverse sketches, moving today's flat experiences into the third dimension:
Sketch 1: High-end fashion in the metaverse
You enter a personal runway at an appointed hour, as a hyper-realistic avatar with millimeter-perfect body measurements, and step into a lounge area. A quick conversation ensues with your personal style advisor, who is waiting there for you, surfacing a few keywords and your current color palette.
Your personal wardrobe is populated with a short, curated collection of clothes. You run your fingers through it and touch each item, checking the texture, zooming in on the weave, seams, and detailing. You send back a few items that don’t match your criteria. Your style advisor notes your gestures and pupil dilation, and based on your preferences and her judgement she looks for alternatives that might meet your needs.
Ready with your wardrobe, you try on each item on your runway to see how the clothes look on your body in motion. The lighting changes to match the event setting for each item of clothing. You can choose event settings manually — dinner, cocktail, party, picnic, and more — but for the most part they are cued automatically by the clothing. The environment around your runway shifts to put you in the appropriate setting, giving you a realistic sense of how each piece will look and move.
You pick the ones you love and send back the ones you don’t. Your style advisor asks for your permission to charge your account. Before you agree, you want to know that this is indeed an exclusive piece, part of a limited run. Your advisor scans the tag on the virtual garment — which also exists on the physical garment — and verifies it on-chain. The verification returns the item's number in the series and how many other pieces of the series have been sold so far.
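The on-chain check above can be sketched in miniature. The code below is a toy illustration, not a real blockchain client: a Python dict stands in for the ledger, and the tag string, function names, and run numbers are all invented for this example:

```python
import hashlib

# Hypothetical stand-in for an on-chain registry of limited-run garments.
# Each entry maps a hashed garment tag to (item number, run size, units sold).
# A real system would query a blockchain here, not an in-memory dict.
LEDGER = {}

def register_garment(tag: str, item_number: int, run_size: int, sold: int) -> None:
    """Record a garment tag in the stand-in ledger, keyed by its hash."""
    key = hashlib.sha256(tag.encode()).hexdigest()
    LEDGER[key] = (item_number, run_size, sold)

def verify_garment(tag: str):
    """Return (item_number, run_size, sold) if the tag is registered, else None."""
    key = hashlib.sha256(tag.encode()).hexdigest()
    return LEDGER.get(key)

# Example: a dress that is piece 7 of a run of 50, with 12 sold so far.
register_garment("DRESS-2042-007", item_number=7, run_size=50, sold=12)
print(verify_garment("DRESS-2042-007"))  # (7, 50, 12)
print(verify_garment("UNKNOWN-TAG"))     # None
```

Hashing the tag before the lookup means the ledger never stores the raw tag itself, which is one plausible way a physical and virtual garment could share a verifiable identity.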
Satisfied, you accept, inserting your hypercard into the slot presented. Your account is charged, you review your order, and you confirm your delivery details. You exit the metaverse. The dresses make their way to you, arriving a few days later. They look and fit exactly as they did in the metaverse.