


AR Apps Are Interactive Live Video

The future of TV isn’t apps; the future of apps is interactive live video

I. Overview

I know that headline might be a bit confusing to some of you, so just stick with me here for a bit because this is new thinking that I’m still trying to digest. As I alluded to in this week’s Diary of a Madman newsletter, it hit me like a flash after I read Scoble’s latest post on what he believes is the future of AR/VR and how that lines up with the upcoming iPhone 8.

So, let’s connect a few dots:

  • Apple’s CarPlay is nothing more than an interactive live video stream.
  • There is a continuum from reality to digital-overlay reality to full virtual reality, all enabled by live video.
  • The iPhone 8 won’t actually be transparent; it will just look that way using live video.

Three insights that, when put together, add up to something that seems to have merit. Let’s keep going.

II. Apple’s CarPlay Transforms into CarOS

Apple’s trojan horse into the automotive market has been around for years. They call it CarPlay.

CarPlay is nothing more than an H.264 live video stream from the iPhone, sent through a cable or Bluetooth to the car’s display. It doesn’t install software like Maps, Messages, or Music in the car’s infotainment system or the car’s operating system. Instead, all you’re doing is interacting with a live video stream.

You can read more about it in my review of CarPlay a few months back (complete with pictures!).

Apple’s CarPlay isn’t an “app” in the traditional sense. It’s a live video stream that you’re interacting with through the dashboard display.
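To make that concrete, here’s a minimal sketch of the loop: the phone renders its own UI, streams encoded frames to the car’s display, and the display sends raw touch input back for the phone to interpret. Every class and method name below is hypothetical; this shows the shape of the interaction, not Apple’s actual protocol.

```python
# Minimal sketch of a CarPlay-style interactive live video loop.
# All names are hypothetical placeholders, not Apple's real protocol.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TouchEvent:
    x: int  # pixel coordinates reported by the car's touchscreen
    y: int

class PhoneUI:
    def render_frame(self) -> bytes:
        """Render the current UI (Maps, Music, etc.) and return an encoded frame."""
        return b"\x00" * 1024  # stand-in for an H.264-encoded frame

    def handle_touch(self, event: TouchEvent) -> None:
        """The phone, not the car, decides what the tap at (x, y) means."""
        print(f"phone handling tap at ({event.x}, {event.y})")

class CarDisplay:
    def show(self, frame: bytes) -> None:
        """Decode and display the frame; no app logic lives on the car side."""

    def poll_touch(self) -> Optional[TouchEvent]:
        return TouchEvent(x=640, y=360)  # pretend the driver tapped the screen

def carplay_loop(phone: PhoneUI, display: CarDisplay, frames: int = 3) -> None:
    for _ in range(frames):
        display.show(phone.render_frame())  # video flows phone -> car
        touch = display.poll_touch()
        if touch is not None:
            phone.handle_touch(touch)       # input flows car -> phone

carplay_loop(PhoneUI(), CarDisplay())
```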

Allow me to digress a bit and talk about the automotive industry at large, as it changes from a mechanical machine shop (hardware) to a digital development shop (software). No doubt, automakers will still be massive system integrators.

The software stack of the future will look something like this, from the top down (a rough sketch in code follows the list):

  • User Interface: CarPlay, Android Auto, etc., which is really just H.264 live video streamed from the phone that mimics a locally installed app.
  • Self-Driving App(s): these will have to be certified by the NHTSA. The deep learning approaches currently employed by the industry aren’t enough to get to true Level 5 self-driving, because they’ll never capture enough training data to handle every Black Swan event. We need a different approach.
  • Operating System: like BlackBerry’s QNX, which Ford is using, but also Apple’s CarOS and Android Auto extended to a full OS. Currently, folks are just running a fork of Ubuntu or the Robot Operating System (pretty scary, right, that your entire life is in the hands of some open source robotics software?).
  • Digital Sensor Hardware: video cameras, sonar, LIDAR, etc.
  • Normal Car Mechanical Parts: all the moving parts like axles, wheels, etc.
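For the curious, here’s a rough, purely illustrative sketch of that layering from the bottom up. None of these class names correspond to a real automotive API; they just show which layer talks to which.

```python
# Illustrative only: a toy model of the stack in the list above, bottom-up.
# No real automotive API is being modeled here.

class MechanicalParts:
    """Axles, wheels, brakes: the moving pieces."""
    def apply_brakes(self, force: float) -> None: ...

class SensorHardware:
    """Video cameras, sonar, LIDAR, etc."""
    def read(self) -> dict:
        return {"camera": b"...", "lidar_meters": [1.2, 3.4, 5.6]}

class CarOperatingSystem:
    """QNX today; maybe a CarOS or a full Android Auto OS tomorrow."""
    def __init__(self) -> None:
        self.sensors = SensorHardware()
        self.actuators = MechanicalParts()

class SelfDrivingApp:
    """The regulator-certified layer: perceives via the OS, then plans and acts."""
    def __init__(self, car_os: CarOperatingSystem) -> None:
        self.car_os = car_os

    def step(self) -> None:
        readings = self.car_os.sensors.read()
        if min(readings["lidar_meters"]) < 2.0:        # toy decision rule
            self.car_os.actuators.apply_brakes(force=0.8)

class StreamedUserInterface:
    """CarPlay / Android Auto: H.264 frames in, touch events out."""
    def show_frame(self, encoded_frame: bytes) -> None: ...

SelfDrivingApp(CarOperatingSystem()).step()
```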

So, if we’re talking about pushing interactive live video from the access device (your smartphone) through to the display, while trying to work in concert with off-the-shelf, open source car software, well, we have a ways to go to tighten this up.

This is likely the reason Apple sees such an opportunity in this space: to generate an industry-standard OS on the back of CarPlay that allows for higher reliability and easier integration for developers.

But instead of having Apple approve your app, you’re going to have to get approved by the much more stringent NHTSA.

III. Continuum of RR to AR to VR

AR and VR aren’t separate things, but rather a continuum from real reality (i.e., live video) to digital overlays on live video (i.e., AR) to a full digital universe (i.e., VR). Instead of the various transitions from not wearing any headset, to putting on a Hololens (AR headset), to putting on something like the Oculus (VR headset), the future will be different. Really, it will be more like putting on a pair of Warby Parkers where you turn a dial to go more or less digital. Think Snap Spectacles, but a little more functional, and without the look of tiny LIDAR units on either corner of the frame.
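If you want the dial in its most stripped-down form, think of a single blend parameter between the camera passthrough and the rendered digital layer. A toy sketch, with frames reduced to lists of pixel values (my simplification, not any vendor’s rendering pipeline):

```python
# A toy version of the "dial": one parameter from 0.0 (all real) to 1.0 (all
# digital), blending a camera passthrough frame with a rendered digital layer.
# Frames are reduced to flat lists of grayscale values for illustration.
def blend(real_frame, digital_frame, dial):
    """dial = 0.0 -> real reality, ~0.3 -> AR overlay, 1.0 -> full VR."""
    dial = max(0.0, min(1.0, dial))
    return [(1.0 - dial) * r + dial * d for r, d in zip(real_frame, digital_frame)]

camera = [0.2, 0.5, 0.9, 0.1]       # what the lenses see
overlay = [1.0, 1.0, 0.0, 0.0]      # what the computer draws
print(blend(camera, overlay, 0.0))  # sunglasses mode: pure passthrough
print(blend(camera, overlay, 0.3))  # AR: mostly real, with digital hints
print(blend(camera, overlay, 1.0))  # VR: the digital universe takes over
```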

Back to Scoble’s post (linked above), he says Microsoft’s future strategy comes down to one simple thing: Hololens + Cloud. That means their AR headset will run that gamut. And it means that Microsoft Office will be used while wearing glasses, goggles, headsets, whatever.

Imagine going into work and instead of taking your laptop or phone, you just put your shades in your pocket. The lenses get darker as you go outside and operate as sunglasses. Inside, they turn clear again. In both cases you have a digital overlay that shows the directions to get to the office and avoid traffic (maps! live video!). You get to the office and sit down at your desk, or in a new “private work room,” where your glasses appear to project big objects throughout the room that you can physically manipulate in 3D space.

You log into Excel and manipulate data and analysis by speaking to your personal AI (it won’t all be based on deep learning, folks; you need biology for that) and by moving physical blocks with your hands. VLOOKUPs and pivot tables actually turn into 3D objects.

So, let’s go back to the in-car experience for a moment and predict what that’s going to be like when self-driving cars are the norm. Well, we already did in detail. In essence, get a comfy couch in your luxury SUV and watch movies, play games, or turn it into your mobile office like above. Smaller space, smaller objects that appear on a desk instead of in an entire room.

Heck, who knows what’s going to happen?

But what I do know is that different use cases call for different interactions. One is the real world where we eat, drink, and move around. The other is digital + real and the final is all digital.

In all cases, you’ll be pushing a live video stream through your phone and viewing it through your glasses. Sometimes there will be interactive elements in that live video. Other times it will be entirely digital. What we need is massive infrastructure for cellular and CDN bandwidth to handle the 16K resolution required for truly real reality, and GPUs about 5x more powerful than the ones we have now to process it all in real time, all while consuming only a fraction of the energy of today’s wearables and smartphones.
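To put rough numbers on that bandwidth claim, here’s a back-of-the-envelope estimate. The frame size, frame rate, and especially the compression ratio are assumptions on my part (real codecs vary widely with content and quality settings), so treat the output as an order of magnitude, not a spec.

```python
# Order-of-magnitude estimate for streaming "16K" live video.
width, height = 15360, 8640   # one common definition of a 16K frame
fps = 60                      # frames per second
bits_per_pixel = 24           # 8-bit RGB, before compression

raw_bps = width * height * bits_per_pixel * fps
compression_ratio = 200       # assumed; real ratios vary widely

print(f"raw:        {raw_bps / 1e9:.0f} Gbit/s")                      # ~191 Gbit/s
print(f"compressed: {raw_bps / compression_ratio / 1e6:.0f} Mbit/s")  # ~956 Mbit/s
# Even heavily compressed, a single stream lands near a gigabit per second,
# which is why the cellular and CDN infrastructure has to grow with it.
```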

A heavy challenge, indeed, to make all of those things line up. But the future is coming faster and faster. Build toward where the puck will be, methinks.

IV. Transparent iPhone 8

The iPhone 8 won’t be transparent the way Scoble predicted, because where would the battery and logic board go (obviously)? Instead it will just use live video from cameras on the front and back to mimic some sort of transparency on an edge-to-edge display with a software-only home button.

Our friendly neighborhood CEO, Mr. Timothy Cook, can’t stop gushing about how awesome AR is. At this point the only thing he hasn’t specifically said is, “Apple’s making an AR headset”.

Of course, if you’ve been following this publication for some time, you already know how that’s all going to come together as a step on Apple’s ultimate roadmap.

So, when you make the entire front of your iPhone appear transparent because it’s just showing the video of what’s behind it, then you start getting to a place where you can overlay digital objects on that and fool the mind into thinking there’s no device resting in a headset strapped to your face.

Again, because this is pushing live video from one camera on the back of the phone to the screen on the front of the phone, and then overlaying digital objects on top of that, you’re talking about H.264 live video. As it becomes a more social experience it means you’ll be sharing that video and trying to operate almost like a two-way augmented reality FaceTime call. We can digitally “live” in each other’s spaces.
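As a sketch of that loop, with every function name a hypothetical placeholder: capture the world behind the phone, composite digital objects on top, show the result up front, and push the same composited frame to the other side of the call.

```python
# Sketch of the "transparent phone" / shared-AR loop described above.
# All function names are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Frame:
    pixels: bytes

def capture_back_camera() -> Frame:
    return Frame(pixels=b"\x10" * 64)  # stand-in for a camera frame

def composite(base: Frame, overlays: list) -> Frame:
    # Real compositing happens on the GPU; concatenating bytes here just marks
    # the idea that digital objects are drawn on top of the live camera feed.
    return Frame(pixels=base.pixels + b"".join(overlays))

def show_on_display(frame: Frame) -> None:
    """Drive the edge-to-edge display with the composited frame."""

def send_to_peer(frame: Frame) -> None:
    """The shared, two-way 'AR FaceTime' leg: encode and stream the frame out."""

def transparent_phone_tick(overlays: list) -> None:
    frame = capture_back_camera()            # the world behind the phone
    composited = composite(frame, overlays)  # digital objects on top
    show_on_display(composited)              # looks "transparent" up front
    send_to_peer(composited)                 # shared with the other caller

transparent_phone_tick([b"\xff" * 8])
```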

With 4K live-streaming cameras like the Orah installed in various places and venues, we can also transport ourselves to a different digital place, which is essentially 360° video in virtual space (only it’s not all digital, it’s real).

V. Conclusion

I know this all gets confusing, so I’ll sum it up. Like turning a dial, your future augmented reality will exist as a range from nothing digital to everything digital and everywhere in between. It’s a setting, just like the volume control on your phone.

Everything will exist as interactive live video that you can tap, tweak, move, or speak to. Apps in the traditional sense where you install some software to the phone’s operating system and then open it and interact with its functionality won’t exist in that form any longer.

Instead everything will be more fluid because it’s delivered over a live stream with interactive hot spots.
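One way to picture those interactive hot spots: lightweight metadata rides alongside the stream describing tappable regions, and your tap gets hit-tested against those regions rather than against locally installed app code. A toy example, with made-up region names and coordinates:

```python
# Toy model of "interactive hot spots" in a live stream: tappable regions are
# described as metadata and a tap is hit-tested against them. The names and
# coordinates below are made up for illustration.
from dataclasses import dataclass

@dataclass
class HotSpot:
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def handle_tap(px: int, py: int, hot_spots: list) -> str:
    for spot in hot_spots:
        if spot.contains(px, py):
            return f"trigger action for '{spot.name}'"
    return "tap ignored: no hot spot here"

spots = [
    HotSpot("play_button", x=100, y=100, w=80, h=40),
    HotSpot("map_pin", x=300, y=220, w=24, h=24),
]
print(handle_tap(120, 110, spots))  # -> trigger action for 'play_button'
print(handle_tap(10, 10, spots))    # -> tap ignored: no hot spot here
```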

That means if you’re not investing in live video, or at least working to understand it, then you should be. And if you’re curious about what our transportation experience is going to look like in the future, you might see a consolidation of operating systems that are marketed to you much like mobile already has been. Because it’s a shortcut to consumer understanding.

Sean

