Convergence in Robotics

Apr 2, 2018

Last year, I ran a competition to find my next “Strike Force” interns. Max Goldberg and Claire Adair won the competition, and joined me at Abundance 360 as special correspondents. In this blog, you’ll learn key insights from A360 on the robotics revolution, as written by Max and edited by Marissa Brassfield.

From R2-D2 to Commander Data, robots have been on movie and TV screens for decades.

Thanks to the convergence of AI, sensors, material sciences, actuators and mobile phone technology, we are on the cusp of a robot revolution.

This year at A360, Peter brought Dean Kamen, legendary inventor, entrepreneur and founder of FIRST Robotics, to forecast how robotics will converge with other exponential technologies to enable new breakthroughs.

Before we dive in, here is some background on the field. 

As the Draw Shop video above notes, today’s robots primarily perform dull, dirty or dangerous work. But as enabling technologies like batteries and sensors improve, we’ll see complex and capable robots doing everything from performing precise surgery to caring for our elderly parents.

Robotics will influence far more than the healthcare industry, however.

Transporting Consciousness Through Avatars

Soon, we'll put on virtual goggles and haptic suits that enable us to remotely operate robot avatars, transporting our skills and consciousness to distant locations.

“Once we have devices that in real time can do sensing, and be safe, and be reliable, and don't require a lot of kludgy human interface,” Kamen explained at A360, “we're going to start creating ways for people to interface with each other around the world. And if you think making a car is efficient, how about being able to project yourself anywhere in the world at essentially zero physical movement… that's an efficiency gain that's even more than exponential.”

This future may be closer than we realize.

In March 2018, Peter and the XPRIZE team announced the $10 million ANA Avatar XPRIZE, which challenges small teams around the world to design a multipurpose robotic avatar system. Following two milestone competitions in 2020 and 2021, XPRIZE plans to award the prize purse on October 1, 2021.

In Part 1 of this Convergence Catalyzer blog, we shared how materials science will converge with virtual reality and augmented reality breakthroughs to produce a mobile revolution for VR headsets. As a recap, we’ll soon see the following innovations:

  • OLED display advances that increase headset resolution from 500 pixels per inch to 3,000+ pixels per inch
  • Optics that shrink from today’s traditional structures to thin films just nanometers thick
  • Memory improvements that enable storage of large VR graphics files
  • More energy-efficient processors and graphics cards
  • Battery advances that support the large energy requirements of high-quality graphics

In particular, battery advancements are essential for avatars. Because your avatar will operate remotely on your behalf, the device must carry enough onboard power to support computation, data acquisition and actuation over several hours.

Better materials will also enable anthropomorphic robots -- devices that look, feel, and move like humans.

At some point, these avatars will outperform even our own biological systems:

  • Audio will stream in higher quality
  • Translations will be automatic
  • Visuals will be crisper
  • Artificial ‘muscles’ will run faster and longer, and jump farther

You might ‘uber’ yourself into a robot, so that you can be at home in Los Angeles and remotely explore New York City.

By tapping into different types of avatars, you could inspect a manufacturing plant in Detroit, shake hands across a dinner table in Shanghai, or play a concert at Madison Square Garden… all in the same day.

“What if we could get you and your ideas anywhere in the world to interact with anybody else’s ideas anywhere in the world without having to use your clumsy body and somewhat limited physical resources between your ears?” Kamen asked A360 members.

Operating an avatar is a far more efficient mechanism of travel than today’s methods… but what about when we want (or need) to move our physical “meat bodies” from place to place?

Ubiquitous Autonomous Electric Vehicles

What happens when autonomy (enabled by sensors, computation and networks), innovative business models (the sharing economy), and electrification converge?

Transportation disruption.

Current transportation is filled with inefficiencies -- from the mere 15 percent efficiency of internal combustion engines, to the human hours wasted piloting vehicles, to the waste of moving our bodies across the country for business meetings.

But one class of robots -- autonomous vehicles -- is poised to make the industry hyper-efficient.

Dozens of car manufacturers are experimenting with or developing autonomous vehicles, racing to market between 2020 and 2025.

Tesla recently presold 450,000 electric Model 3 vehicles to be delivered by 2020, each with the hardware needed to function fully autonomously. Uber’s self-driving test cars are already on the road.

What are the implications of autonomous electric vehicles?

As Ramez Naam, Chair of the Energy track at Singularity University, described at A360:

“Uber is successful for a lot of reasons. One of them is that it costs about half as much per mile as a taxi. If you make that autonomous, you drop the cost in half again. From a dollar a mile to 50 cents. Make it shared, [and] you make it 25 cents perhaps. Make it electric, and you can cut the price again, because electric vehicles cost a little bit more upfront now, but they're already cheaper to operate because they have fewer maintenance costs and less energy costs… All Taxis and Ubers will go electric.”
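Naam’s cost cascade can be sketched as a chain of successive reductions. This is a minimal illustration: the dollar figures and halving factors come straight from his quote above, not from measured data.

```python
# A minimal sketch of the cost-per-mile cascade Ramez Naam describes above.
# All figures are illustrative, taken directly from his quote.
def apply_savings(cost_per_mile, factors):
    """Multiply a starting cost by each successive reduction factor."""
    for factor in factors:
        cost_per_mile *= factor
    return cost_per_mile

uber = 1.00                                # ~$1.00/mile, about half a taxi fare
autonomous = apply_savings(uber, [0.5])    # remove the driver: half again
shared = apply_savings(autonomous, [0.5])  # pooled rides: ~25 cents/mile

# prints: uber $1.00 -> autonomous $0.50 -> shared $0.25
print(f"uber ${uber:.2f} -> autonomous ${autonomous:.2f} -> shared ${shared:.2f}")
```

Electrification would apply one more (smaller) factor on top of these, since the savings come from maintenance and energy rather than a clean halving.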

Lithium-ion battery prices have fallen roughly fivefold over the past eight years, to under $200 per kilowatt-hour. Over the next five years, we should see prices fall below $100 per kilowatt-hour.

A sharing economy of autonomous vehicles requires an abundance of low-cost sensors: LiDAR, radar, ultrasonics, high-dynamic-range cameras.

Ten years ago, the LiDAR systems that Google used were about $75,000. Today the same systems cost about $1,000 -- two orders of magnitude less. Within the next 10 years, the cost will be reduced by another two orders to under $10.
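A quick back-of-the-envelope check on those LiDAR figures: the $75,000 and $1,000 price points are from the paragraph above, while the assumption of a smooth annual decline is mine, for illustration only.

```python
# Rough annualized decline implied by the LiDAR costs cited above:
# ~$75,000 ten years ago -> ~$1,000 today (a 75x drop per decade).
start, today, years = 75_000, 1_000, 10

# Fraction of cost retained per year, assuming a steady exponential decline.
annual_factor = (today / start) ** (1 / years)
print(f"~{(1 - annual_factor):.0%} average annual cost reduction")

# Projecting the same 75x-per-decade pace forward another ten years:
projected = today * annual_factor ** years
print(f"projected cost in 10 years: ${projected:,.2f}")
```

Continuing at the same pace yields roughly $13, so the sub-$10 forecast implies a slightly faster decline (a full 100x per decade) -- plausible given rising production volumes, but a forecast rather than a trendline.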

Sensor breakthroughs enabled by materials science and signal processing are giving autonomous vehicles superhuman perception and operational capabilities.

Implications for Humans

Exponential technologies aren’t developing in a vacuum.

By tracking how they converge, entrepreneurs can identify disruptive opportunities (which, to large legacy organizations, might represent disruptive threats).

And for all of us, we’ll soon see robots enter every area of our lives -- from bedrooms to boardrooms to highways, operating rooms and airplanes.

As these robots free humans up for higher-level work, what will we focus on next? What problems will we choose to solve?

Join Me

1. A360 Executive Mastermind: This is the sort of conversation I explore at my Executive Mastermind group, Abundance 360. The program is highly selective, limited to 360 abundance- and exponentially minded CEOs (running $10M to $10B companies). If you’d like to be considered, apply here.

Share this with your friends, especially if they are interested in any of the areas outlined above.

2. Abundance-Digital Online Community: I’ve also created a Digital/Online community of bold, abundance-minded entrepreneurs called Abundance-Digital.

Abundance-Digital is my ‘onramp’ for exponential entrepreneurs – those who want to get involved and play at a higher level. Click here to learn more.


Written by Peter H. Diamandis
