Everything is being scanned.
Drop by Sketchfab and run a search for ‘scans’ and you’ll find user-generated uploads of everything from old gas stations to mandarin oranges. (The company just passed 4M registrations, an indicator of how quickly the market for 3D content is growing.)

A Digital Twin, Piece By Piece
The world is being captured digitally, in glorious colour and three dimensions. It’s turning into a massive crowd-sourced effort to record reality.
All of these scans are tiny pieces of a larger digital twin: a cloud-based repository of point clouds and mesh objects, coupled in some way to the real world. Sometimes that coupling is just a tag: “this is an apple” or “this is a scan of Notre Dame Cathedral”.
But some of these scans are ‘attached’ to maps, meaning you can locate the digital scan back in the real world. Some are raw point clouds, which aren’t especially “readable” to a human on their own. And some capture all of the detail you see in the Sketchfab scans above.
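To make those distinctions concrete, here’s a hypothetical sketch of what a single record in such a repository might hold. This isn’t any real service’s schema; every name in it is illustrative.

```swift
import Foundation

// A hypothetical scan record: geometry plus its coupling to the world.
struct ScanRecord {
    enum Geometry {
        case pointCloud(url: URL)   // raw points: hard for a human to "read"
        case texturedMesh(url: URL) // the detailed, photo-real kind
    }

    let geometry: Geometry
    let tags: [String]                        // loose coupling: "this is an apple"
    let location: (lat: Double, lon: Double)? // tight coupling: anchored to a map
}

// A scan that's only tagged...
let apple = ScanRecord(
    geometry: .pointCloud(url: URL(string: "https://example.com/apple.ply")!),
    tags: ["apple"],
    location: nil
)

// ...versus one you can locate back in the real world.
let cathedral = ScanRecord(
    geometry: .texturedMesh(url: URL(string: "https://example.com/notre-dame.glb")!),
    tags: ["Notre Dame Cathedral"],
    location: (lat: 48.8530, lon: 2.3499)
)
```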
There’s a pincer movement in creating these digital twins. From the top down, big companies like Apple and Google are wandering around scanning streets and bike paths, every Tesla is capturing data as it drives, and Amazon is probably capturing scans of your living room with its “security slash surveillance” drones.
Meanwhile, the world is also being scanned from the ground up: individuals capturing scans of their shoes and living rooms, faces and pets.
I imagine game developers will get in on the action too (beyond Niantic, I mean): city-wide scans used as “levels” in a game. I can even imagine them updated in real time, so that the New York you visit in Grand Theft Auto – Reality reflects the actual city. With the coming launch of Unreal Engine 5, that digital reality will be looking pretty darn real.
Things That Disappear
Scanning isn’t just about uploading content. It’s also the invisible engine driving more realistic experiences in augmented reality. With better depth sensing and the ability to rapidly scan the space around you, virtual objects can be placed in your living room, where they can interact with your furniture or hide behind a chair.
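One concrete way this works today is Apple’s ARKit and RealityKit, sketched below: ask the LiDAR sensor for a mesh of the room, then let that mesh occlude and collide with virtual content. (The function itself is my own illustrative wrapper, not an Apple API.)

```swift
import ARKit
import RealityKit

// A minimal sketch: reconstruct the room as a mesh, then let real
// geometry hide and bump into virtual objects.
func enableSceneUnderstanding(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Mesh reconstruction is only available on LiDAR-equipped devices.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Real surfaces now occlude virtual content: a virtual ball rolling
    // behind your couch disappears, just as a real one would.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    // And virtual objects can rest on, or collide with, real furniture.
    arView.environment.sceneUnderstanding.options.insert(.physics)

    arView.session.run(config)
}
```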
And one of the things those scans will allow us to do is make things disappear.
The LiDAR on your iPad (and coming iPhones) can help differentiate between “couch” and “floor”. It doesn’t just detect one continuous mesh; it can start to separate the objects from one another.
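Here’s roughly how those labels surface in ARKit, assuming a LiDAR device and a session already running with scene reconstruction set to .meshWithClassification. Each triangle of the room mesh carries a semantic class (.floor, .seat, .wall, and so on); the buffer peek below is illustrative, and a real app would iterate every face.

```swift
import ARKit

// A sketch of reading per-face labels from the reconstructed mesh.
class MeshWatcher: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            // Each anchor carries a chunk of the room mesh; `classification`
            // stores one label per triangle.
            guard let faces = meshAnchor.geometry.classification else { continue }

            // Peek at the first face's label as an example; walking all
            // `faces.count` faces is what carves "couch" out of "floor".
            let raw = faces.buffer.contents()
                .advanced(by: faces.offset)
                .assumingMemoryBound(to: UInt8.self).pointee
            if let label = ARMeshClassification(rawValue: Int(raw)) {
                print("First face is classified as:", label) // e.g. seat
            }
        }
    }
}
```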
To get a sense of how powerful this will be, check this out (and read more about it here):
While this level of detail and ‘exclusion’ is still more difficult for things like people, it isn’t so difficult for something stationary (like a couch).
Soon, IKEA will be able to provide an AR app that doesn’t just PLACE a couch in your living room, but can remove the couch that’s already there.
This can lead to some interesting effects. Like this one, where objects aren’t just detected and scanned, but ‘replaced’ in a temporal inversion (reversing time itself!):
Replace Everything
The ability to “delete” physical objects from the scenes we see in augmented reality is tightly coupled to replacement and overlays. We already know how this works from Snapchat: the ability to take a real-time scan of your *face* and replace your skin with something else.
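Snapchat’s own pipeline is proprietary, but the same idea is easy to sketch with Apple’s face tracking: ARKit hands you a live mesh of the user’s face every frame, and you wrap it in whatever texture you like. The “anime_skin” asset name below is a placeholder.

```swift
import ARKit
import SceneKit
import UIKit

// Run the hosting ARSCNView's session with ARFaceTrackingConfiguration()
// on a TrueDepth-equipped device; this delegate does the "skin swap".
class FaceOverlay: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = MTLCreateSystemDefaultDevice(),
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }

        // Replace your skin with something else: any image becomes the
        // face's surface. "anime_skin" is a hypothetical asset.
        faceGeometry.firstMaterial?.diffuse.contents = UIImage(named: "anime_skin")
        return SCNNode(geometry: faceGeometry)
    }

    // ARKit updates the face mesh every frame, so the overlay tracks
    // your expressions in real time.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
        faceGeometry.update(from: faceAnchor.geometry)
    }
}
```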
This week, anime-style replacements are the viral rage:
But now think about how this concept of replacement might apply to buildings or objects. We can replace entire neighborhoods with a Fortnite Island. The buildings can be overlaid with scans of those same buildings from a year ago.
As we move towards AR glasses and the affordance of the phone is removed, the visual fidelity will slowly improve. Eventually, our eyes won’t be able to distinguish the real from the virtual.
Facebook (and others) are already making this possible, but with audio:
“Even when we’re in the same geographic location, the type of environment affects the quality of human connection. Noisy backgrounds get in the way, often causing us to stay quiet, get frustrated, or end up losing our voice from all the shouting. Now imagine that same pair of AR glasses takes your hearing abilities to an entirely new level and lets you hear better in noisy places, like restaurants, coffee shops, and concerts. What would this do for the quality of your in-person interactions?”
Reality Won’t Be Fully “Real”
The benefits, well, sound obvious. But this also provides the capacity to edit reality. In real time.
Extending this out, we may be able to walk down the street and edit other people’s faces, excluding racial groups or maybe just changing the fashions people wear to something more to our liking. (Imagine a Prada AR filter!)
We’ll be able to decide whether we want to walk around in a Star Wars environment or a Fortnite one. We’ll be able to choose whether Pokémon or dragons pop up in the alleys. We’ll be able to edit out entire buildings, or turn a crowded square into an empty one.
And it’s all starting with those scans: the convergence of world-scale mapping with individual efforts, of crowd-sourced scanning projects and the walled gardens of whatever Tesla has on its servers.
“Mixed reality” will, eventually, be everywhere you want it to be: whether walking around or logged in to your console at home. A brave new world in which everything we see will be, to some degree, for our eyes alone.