Self-Driving’s Dirty Little Secret
Fountainhead News: May 23, 2017
I. The Disconnect
If you listen to the media and the tech industry at large, and maybe view some of the PR videos, ads, and commentary put out by the various car companies and OEMs, you might be led to believe that self-driving is a solved problem.
That there’s really nothing left to do but wait until these car companies can get their hands on more data.
I mean, we’ve seen the demo videos, right? Elon has said it. The tech press now only reports on accidents and newly funded startups making sensors. Solved. Now onto what comes next, world. Good job, Valley. Erlich Bachman is ready to roll.
Not so fast.
Here’s the truth, drawn from direct quotes (anonymized and consolidated) from the people actually building these systems inside car companies, startups, and OEMs:
- “Everyone is using the same tools and approach. It gets us 90% of the way there, but not all the way. The problem is edge cases.”
- “Keeping cars inside white lines is not a trivial problem and not even that is easily solved.”
- “If you run a car wired with sensors down the same well-marked road on a sunny day many times, train an AI model and do it again, it will work just fine. But try it with a different car with sensors in a slightly different place, and you need to retrain your AI model. Because it won’t work.”
- “We have multiple groups working on self-driving across commercial trucking and consumer autos, plus looking for startups in emerging tech. We’re duplicating effort but not making much progress.”
- “Startups getting acquired in the self-driving space only get us to Level 3, not to the goal of Level 5 full autonomy.”
How can this be? How can there be such a big disconnect from what’s being reported and the truth inside these companies?
It’s because people squint their eyes, see the PR demo video hocus pocus, hear every car company’s official claim that we’re only a few years out from full self-driving, and start to believe it.
Meanwhile, like the Wizard of Oz, the engineers behind the curtain are working tirelessly to make it a reality.
II. Where Part of the Confusion Lies
There are hardware companies making LIDAR that are also building self-driving software. There are chip companies also building self-driving software (read: NVIDIA DRIVE PX 2). And finally, there are OEMs partnering with chip companies building self-driving software to help sell it into Tier 1 manufacturers as a Master ECU (read: NVIDIA + ZF = ProAI).
That’s why this thing is tricky. As we continue to preach, you have to understand How It Works if you ever hope to pick the winners.
A car’s hardware and software system is nothing like cloud computing or web and mobile apps. You have to start from a completely different paradigm.
The first thing you have to understand is how data gets communicated in a car.
Think of it this way: imagine a hose with holes along its length. Maybe your mom and dad had one of these to water the flower bed or bushes out front of your house. Instead of water, imagine that data is flowing through this hose. And instead of only streaming out, imagine that data can go in and out of those holes, flowing both ways.
So what’s actually happening here is you have a sensor sending data down one of these holes into the massive stream of the rest of the data. Other sensors and computer chips can choose to do something with that data flowing through or just let it pass without interacting with it.
The technical terms for these things are an ECU (electronic control unit) and a CAN bus. The ECU can be thought of as your computer processor sending data/water down the hose. The CAN bus is the data hose itself, carrying things back and forth.
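To make the hose metaphor concrete, here’s a minimal Python sketch of ECUs sharing a broadcast bus. It’s illustrative only: the class and variable names are made up, and it ignores real CAN details like arbitration timing and bit rates. The point is simply that every node sees every frame and chooses whether to act on it or let it pass.

```python
# Toy model of ECUs on a shared CAN-style bus (illustrative, not real CAN).
from dataclasses import dataclass

@dataclass
class Frame:
    arbitration_id: int   # on a real CAN bus, lower ID = higher priority
    data: bytes

class ECU:
    def __init__(self, name, listens_to):
        self.name = name
        self.listens_to = listens_to   # set of frame IDs this ECU cares about
        self.received = []

    def on_frame(self, frame):
        # Every ECU sees every frame; it only acts on IDs it subscribes to.
        if frame.arbitration_id in self.listens_to:
            self.received.append(frame)

class Bus:
    def __init__(self):
        self.nodes = []

    def attach(self, ecu):
        self.nodes.append(ecu)

    def broadcast(self, frame):
        # The "hose": one frame flows past every hole.
        for node in self.nodes:
            node.on_frame(frame)

bus = Bus()
brakes = ECU("brake_controller", listens_to={0x100})
dash = ECU("dashboard", listens_to={0x100, 0x200})
bus.attach(brakes)
bus.attach(dash)

bus.broadcast(Frame(0x100, b"\x32"))   # wheel-speed frame: both nodes act
bus.broadcast(Frame(0x200, b"\x01"))   # turn-signal frame: only the dashboard acts
```

After those two broadcasts, the brake controller has acted on one frame and the dashboard on two, even though both saw everything that went down the hose.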
Over the last decade or so, as cars have moved from purely physical systems to digital ones, we now have something called drive-by-wire. Airplanes have had the fly-by-wire equivalent for some time. It means you can control the car with software instead of direct mechanical linkages. Less linkage, more software.
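As a hedged sketch of what “control by software” means in practice: the actuator never sees a mechanical input, only a validated numeric command. The function names and limits below are hypothetical, not any manufacturer’s actual interface.

```python
# Hypothetical drive-by-wire steering command: software sets a target,
# a controller clamps it to a safe range before it reaches the actuator.

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def steering_command(requested_angle_deg, max_angle_deg=35.0):
    # No mechanical linkage exists; the actuator only ever sees
    # this validated number.
    return clamp(requested_angle_deg, -max_angle_deg, max_angle_deg)

print(steering_command(20.0))   # within range: passed through as 20.0
print(steering_command(90.0))   # out of range: limited to 35.0
```

The interesting consequence is that once control is just numbers, the “driver” can be replaced by any software that produces those numbers, which is exactly what makes the next paragraph possible.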
This is why a Tesla can be made by a startup. The hardest and most expensive part of a car is the powertrain. Tesla and other electric cars minimize all the hardware and mechanical machinery. It’s just a battery, a chassis, some wheels, and a couple of pedals and a steering wheel for control.
If you go a step further and remove the driver, then you have a pretty easy system. Some computer processors, batteries, an aluminum shell, and some wheels. The rest is just creature comforts for humans.
The problem, from the vantage point of 2017, is that every single automotive manufacturer and OEM is working with different systems. A different self-driving architecture, a different set of ECUs, different sensors, a different self-driving platform and software. It’s essentially the wild west.
Then throw regulators into the mix as well as the folks from the ADAS units (read: safety) and you’ve got yourself a soup of confusion and chaos.
But I guess that’s why they call it disruption. Things are no longer as they once were. Expect more executive shake-ups at Big Auto over the coming years as we get closer to the 2021 dates many have publicly promised without a solution that works end to end.
III. Where Does Biologic Intelligence Fit In?
Look, we don’t pretend to be the holy grail for every problem that’s ever come up in the technology world. But we do know a self-learning AI system when we see it. Because we have proof of capability all around us and because we built it.
As we begin to open the vest a bit over the back half of this year, you can trust that we’ll tell you the truth, whether good or bad for us. After all, the #1 component of our culture is Candor + Confidentiality = Trust.
So, we figured we’d make a little one-pager to help folks understand how we plug in and why. Of course, part of the problem is first figuring out what kind of self-driving architecture is currently in place (a Master ECU doing deep learning and sensor fusion, individual ECUs, separate path planning and object avoidance, etc.).
The entire self-driving industry is still in the prototype phase, and the technical approach the industry will settle on is only just starting to emerge.
Now into the Biologic Intelligence aspect. The process might look like a Step 1, 2, 3 pipeline from sensory input to cognitive processing to motor output, but in reality all three are happening at the same time, in real time, as each new bit of data is ingested into its brain and nervous system.
That’s the basis of what we call “Adaptable Self-Driving”.
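As a rough sketch of that idea (with made-up function names, and a toy low-pass filter and feedback rule standing in for real perception and cognition), here’s a per-sample loop where sensing, decision, and actuation all happen within the same tick rather than as separate batch stages:

```python
# Toy event-driven sense/think/act loop -- hypothetical names throughout.

def perceive(sample, state):
    # Fold the new reading into a running estimate (toy low-pass filter).
    state["estimate"] = 0.8 * state["estimate"] + 0.2 * sample
    return state

def decide(state):
    # Steer back toward center: simple negative feedback on the estimate.
    return -0.5 * state["estimate"]

def act(command, state):
    state["actuator"] += command
    return state

def tick(sample, state):
    # Sensing, cognition, and motor output happen per sample, not in batches.
    state = perceive(sample, state)
    command = decide(state)
    return act(command, state)

state = {"estimate": 0.0, "actuator": 0.0}
for sample in [1.0, 0.5, -0.2, 0.0]:
    state = tick(sample, state)
```

Each new sample nudges the actuator immediately; there is no “perception phase” that finishes before a “planning phase” begins, which is the distinction the paragraph above is drawing.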
IV. The Future
Now, we’d like you to use a small bit of imagination for a moment and go one step beyond self-driving cars into autonomous robotics.
So, something that doesn’t just have four wheels, but maybe many legs and arms. One of our good friends (you know who you are) that we’ve been talking to for the better part of a year has custom-built his own self-driving “spider,” for lack of a better term. Servos running 18 individual leg movements in step. He’s created the hardware shell and some basic software to make it move slightly forward.
The problem, using traditional approaches, is that you have to coordinate the legs at just the right time so it doesn’t topple over. Basic reinforcement learning with TensorFlow running on an NVIDIA Jetson TX1 will get you so far but no further.
How can it learn by itself, on the fly, as it begins to encounter and explore not just its own robotic limitations, but also the world around it?
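To make the coordination problem concrete, without claiming this is how anyone’s robot actually works, here’s a toy on-the-fly search over per-leg phase offsets. It uses simple random hill-climbing with a stand-in reward function; a real walker would score forward progress and stability measured on the hardware itself.

```python
# Toy gait search for an 18-servo walker -- everything here is hypothetical.
import math
import random

NUM_SERVOS = 18

def gait_score(phases):
    # Stand-in reward: a real robot would measure forward progress and
    # stability; here we just reward evenly spaced phase offsets.
    target = [2 * math.pi * i / NUM_SERVOS for i in range(NUM_SERVOS)]
    return -sum((p - t) ** 2 for p, t in zip(sorted(phases), target))

def hill_climb(steps=500, seed=0):
    rng = random.Random(seed)
    phases = [0.0] * NUM_SERVOS            # start with all legs in lockstep
    best = gait_score(phases)
    for _ in range(steps):
        candidate = phases[:]
        i = rng.randrange(NUM_SERVOS)
        candidate[i] += rng.gauss(0, 0.3)  # perturb one servo's phase
        score = gait_score(candidate)
        if score > best:                   # keep only improvements
            phases, best = candidate, score
    return phases, best

phases, best = hill_climb()
print(best)   # higher (closer to zero) is better
```

Even this crude loop improves on legs moving in lockstep, but it illustrates the limitation in the question above: the reward function is fixed in advance, so the robot isn’t discovering its own limitations or its environment, only optimizing a score someone else wrote down.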
At this point, you might be getting a glimpse into why self-driving is 90% of the way home, but needs an entirely different approach to get that last 10%.
Because, as we all know, the last 10% takes 90% of the time.
Read More on the Self-Driving Channel.
from Stories by Sean Everett on Medium http://ift.tt/2qTpjH0