

Humanizing Tech

Explaining Smart Dust & 4D Printing

What the future of our physical objects might look like

I. Overview

As we move beyond the world of mainstream tech media and VC 10-year fund horizons, toward the realm of futurists and shamans who predict the future of our technological world, you begin to see the edges of what’s possible for humanity.

Things that we may have only dreamed about in dusty Arthur C. Clarke novels or deep in the weathered bowels of a Da Vinci sketchbook where fiction begins to give way to reality.

When we predict the future, what you first have to realize is that we’re not actually predicting. We’re willing. Because we see the future a certain way, we believe that is what’s possible, and therefore that is what we manifest into reality.

We predict it because we can see a fuzzy path to building it. And by building it, something that only existed in our imaginations can become reality. I think we have a word for that.

Magic.

II. Disney’s Smart Dust

But, before we delve further into what the way-out-there jockeys already understand, we first must imagine the world through the eyes of our children. Theta waves and all that. When I discovered what I’m about to show you, it immediately reminded me of this beautiful film that I was lucky enough to see years before it hit the theaters.

Kudos to you, Disney Animators. The keepers of our magic.

What you’ve just seen warms the heart. A family of rag-tag dudes and dudettes fascinated with building things that can change things.

They call them Microbots in this Disney flick, but the real science is named Smart Dust. Where do you think Disney got the concept?

If the universe is a fractal, and it looks the same whether you zoom all the way in or all the way out, then shouldn’t our tech be the same? Small, smart components that, when combined, create larger structures at the speed of thought.

Maybe the robotics of the future won’t be made of traditional metal parts or even wetware as fantasized in Ex Machina, but rather the real R2D2 will be made of Smart Dust combined in a particular shape to do a particular job.

Big Hero 6’s MicroBots

Is there a way to build something like this in the real world? You bet your bum there is.

III. Graphene’s Honeycomb Building Blocks

We are reaching the limits of Moore’s Law. With transistors at the 5-nanometer scale, electrons begin to quantum-tunnel through the gates, which means new innovations and manufacturing methods are needed if we want our electronic components to get much smaller and faster than they already are.

To get smaller, some researchers are using graphene (there’s that space-age material again) in strips only 1 nanometer across. To put that into perspective, that’s only about 10 atoms wide. How small is an atom?

Pretty darn small. So about ten of those is what we’re looking at as a core building block: essentially a tiny sheet of graphene. So what does graphene look like when you connect some of those honeycomb structures together (beehive, anyone)?
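The “about 10 atoms wide” figure can be sanity-checked with a quick back-of-envelope calculation. This is only a rough sketch: it treats atoms as spaced one carbon–carbon bond length apart (0.142 nm is the standard value for graphene) and ignores the zigzag geometry of the lattice.

```python
CC_BOND_NM = 0.142   # carbon-carbon bond length in graphene, in nanometers
STRIP_NM = 1.0       # the ~1 nm graphene strip discussed above

# Crude estimate: one atom per bond length across the strip.
atoms_across = STRIP_NM / CC_BOND_NM
print(round(atoms_across))  # prints 7
```

That lands around 7 atoms, the same order of magnitude as the figure above; the exact count depends on which crystal direction you measure across.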

Graphene sheets, tubes, and buckyballs with potential for electrochemical energy storage. Looks a lot like a beehive, no?

We’re starting to get somewhere. At a very small scale, this is beginning to look like the magic that Disney’s animators conceived.

IV. Introducing Smart Dust

Now that we’ve got the building blocks out of the way, and shown how evolution and biology are already using similar structures for one of Earth’s most important animals, we can move on to the fun stuff.

The concept of Smart Dust began in the early-to-mid 90s as, you guessed it, a DARPA project exploring potential military applications. Note that the implications of this are also touched on in Big Hero 6.

The core concept of Smart Dust is very small, millimeter-sized sensors (or, further into the future, full computers) that, when pulled together, can create more complex machines. A fractal computer, if you will.

In mid-2016, the University of Stuttgart released a paper showing how very small sensors can be combined to create ultra high resolution imagery, essentially Smart Dust as a combination of very tiny cameras. Imagine that for the Snap Spectacles of the future.

These micro-lenses, about the size of a grain of sand, can be 3D printed. That means no matter what kind of lens system or design you can imagine, you can make it with a simple CAD drawing.

Single lens (left) and multi-lens (right) showing the tiny scale of these new micro-vision sensors.

Of course, you could imagine all sorts of use cases for these things: recording the reality around you like Snap’s Spectacles, security and monitoring, imaging inside the body for future medical devices, all the way to future self-driving cars with a layer of cameras embedded within the paint itself to find their way around the world.

And you could imagine how, if you put these smart lenses around a 3D object and combined them with a high-resolution display, you could essentially make that object appear invisible. By projecting what’s behind the object to people viewing it from the front, the object would appear to have disappeared.
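In its simplest flat-panel form, that trick is just copying the rear camera’s view onto the front display. Here’s a toy sketch of that idea (the function name and frame format are made up for illustration; a real curved object would need a per-viewing-angle light-field mapping, not a straight copy):

```python
def cloak_frame(rear_camera_frame):
    """Flat-panel 'invisibility': show the rear camera's view on the front.

    rear_camera_frame is a 2D list of pixel values. Returns a deep copy
    so the display buffer is independent of the camera buffer.
    """
    return [row[:] for row in rear_camera_frame]

# The scene behind the object becomes the image shown on its front.
scene_behind = [[1, 2], [3, 4]]
front_display = cloak_frame(scene_behind)
print(front_display)  # prints [[1, 2], [3, 4]]
```

The hard part in practice isn’t the copy, it’s making the projected view line up correctly for every viewer position, which is exactly where a dense skin of micro-lenses would help.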

Transparent glass iPhone 7, anyone?

V. 3D and 4D Printing

What you’ve seen above can be made using a comparatively low-cost 3D printer: a Nanoscribe laser lithography printer, to use the fancy term. Here’s a video of that in action (check out that detail!).

As you can see, we’re talking about 3D printing at close to the molecular scale, down to a precision of about a single nanometer. Recalling the 1-nanometer transistor described above, you could imagine a very real leap where we have embedded SoCs as part of these imaging sensors, combined in a graphene honeycomb pattern, to give us something like we’ve seen in Big Hero 6.

Of course, your next question is going to be: how do we make something that can change shape, combine, uncombine, and move after we’ve already built it?

For that we need to extend 3D printing to 4D. The fourth dimension is time: the object morphs into new shapes after it’s already been printed. Two videos from our friends at the MIT Media Lab show this interaction taking place.

But first, back to our graphene-honeycomb shape. You will see this shape a lot in the coming decades so you should probably start getting used to it now.

And the second video shows the fractal nature of this: print one honeycomb, then let it combine with other honeycombs automatically.

Eventually, you want to program exactly how these things come together, so it happens much more quickly and for a specific purpose. The “job to be done,” in product parlance.
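To make “program exactly how these things come together” concrete, here is a toy sketch of one naive strategy: assign each unit the nearest unclaimed cell of a target shape. Every name here is hypothetical, and real Smart Dust would need distributed algorithms far beyond this centralized greedy plan, but it shows what a shape-assembly “program” could even mean.

```python
import math

def assemble(motes, target_cells):
    """Greedily assign each mote (an (x, y) position) to its nearest
    unclaimed cell of the target shape.

    Returns a plan: a dict mapping each mote's start position to the
    cell it should move to. A toy stand-in for programmed self-assembly.
    """
    unclaimed = set(target_cells)
    plan = {}
    for mote in motes:
        cell = min(unclaimed, key=lambda c: math.dist(mote, c))
        plan[mote] = cell
        unclaimed.remove(cell)
    return plan

# Four scattered motes are told to form a 2x2 block at the origin.
motes = [(5.0, 0.0), (0.0, 5.0), (6.0, 6.0), (0.0, 0.0)]
block = [(0, 0), (0, 1), (1, 0), (1, 1)]
plan = assemble(motes, block)
print(plan[(5.0, 0.0)])  # prints (1, 0): the nearest open cell
```

A better-than-greedy planner would minimize total travel across all motes at once, but even this version captures the idea of compiling a shape into per-unit instructions.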

For that, you’re going to need an operating system to control all the subsystems: something lightweight and low-power, but also parallel enough to run the artificial intelligence needed to make sense of all the information being sent in by the sensors. Essentially, a very scaled-down sensory-input-to-motor-output robot.
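The sensory-input-to-motor-output loop can be sketched in a few lines. This is purely illustrative (the function and command names are invented, and a real mote OS would run compiled event-driven code, not Python), but it shows the shape of one tick of such a system:

```python
def control_step(sensor_reading, threshold=0.5):
    """One tick of a mote's sense -> decide -> act loop.

    Maps a normalized sensor reading (say, light or proximity,
    in the range -1..1) to a motor command: chase strong positive
    signals, retreat from strong negative ones, otherwise hold.
    """
    if sensor_reading > threshold:
        return "move_toward"
    elif sensor_reading < -threshold:
        return "move_away"
    return "hold"

# A stream of readings becomes a stream of motor commands.
readings = [0.9, 0.1, -0.7]
commands = [control_step(r) for r in readings]
print(commands)  # prints ['move_toward', 'hold', 'move_away']
```

Everything interesting in a real system lives inside the “decide” step, which is where the AI discussed below comes in; the loop structure itself stays this simple.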

Luckily, one such project has been under active development for years, called simply TinyOS. It’s now hosted on GitHub and still maintained, though sadly development has slowed over the last few years. Its time will come.

Put all these pieces together and you’re talking about building a Robot Explorer that can change shape to handle any situation or environment. The robots of the future will not look anything like the way Robin Williams portrayed them in Bicentennial Man. Rather, they will look more like Big Hero 6’s Microbots that shift into humanoid form, fly along the wind like Michael Crichton’s Prey, or transform into 3D letters or shapes to communicate with alien species, as in the more recent Arrival.

Robots of the future won’t be made of hard metal, but rather smart dust combined into any shape for any purpose, including communication.

The last aspect of this, then, is artificial intelligence. Not just the brains of the system, but also the nervous system controlling the physical form of this object in our 3-dimensional universe (or 4D, if you count time).

So, if we’re talking about artificial intelligence that can work across spacetime, and evolve to handle the end-to-end sensory-input-to-motor-output loop necessary for real-world robotics, then you need Biologic Intelligence. It’s the only way something like Microbots, Smart Dust, and 4D Printing can combine into something that can help us explore the universe we inhabit.

Of course, things like this often create fear, for that which we do not understand must automatically be dangerous. If you haven’t seen the movie Transcendence with Johnny Depp, I urge you to watch it to the very end. The entire arc of not just the technology, but how humanity handles it, might awaken some insight in your subconscious.

Sean

Read More

What is an Artificial Connectome?


Explaining Smart Dust & 4D Printing was originally published in Humanizing Tech on Medium, where people are continuing the conversation by highlighting and responding to this story.



from Stories by Sean Everett on Medium http://ift.tt/2g5k8x0