Controlling a Human Surrogate

"Human Uber" concept would remotely display a user's face on another person.

"Human Uber" is a new concept from Sony-affiliated Japanese AR/VR researcher Jun Rekimoto, who recently presented the concept at MIT Tech Review's EmTech conference in Asia. The concept, officially known as the ChameleonMask, is a "telepresence system that displays a remote user’s face on another user’s face".

The researchers wanted to tackle the problems of tele-operated robots. It's hard, for example, to take a disembodied head mounted on a robotic scooter seriously, so they created a human-to-human augmentation system instead.

The surrogate views the world through a VR rig, while you, as the pilot, send commands and hand gestures through interactive software. According to the researchers, surrogates would be hired to act as the remote user in the physical world, much like a "human Uber" who does our shopping and hangs out with our grandmothers.
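
How that pilot-to-surrogate link is actually structured isn't spelled out in the article. As a minimal sketch, assuming a simple JSON message protocol, a single pilot command might look something like the Python below; the PilotCommand name and its fields are hypothetical illustrations, not anything from Rekimoto's system.

    # Hypothetical command message from pilot to surrogate.
    # None of these names come from the ChameleonMask work; they are
    # illustrative assumptions about how such a link could be structured.
    import json
    from dataclasses import dataclass, asdict
    from typing import Optional

    @dataclass
    class PilotCommand:
        """One instruction sent from the remote pilot to the surrogate's headset."""
        action: str                    # e.g. "walk_forward", "turn_left", "stop"
        gesture: Optional[str] = None  # hand gesture for the surrogate to mirror
        note: Optional[str] = None     # free-text prompt shown inside the VR rig

    def encode(command: PilotCommand) -> str:
        """Serialize a command for transmission over the network."""
        return json.dumps(asdict(command))

    if __name__ == "__main__":
        cmd = PilotCommand(action="walk_forward", gesture="wave", note="Say hello to Grandma")
        print(encode(cmd))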

Some issues persist, both moral and logistical, but let's stick with logistics. How do you know that you've hired the right driver/surrogate? Would they need to share your body type and style? Could I really hire a petite blonde woman to watch my son's baseball games while I was out gallivanting in Iowa?

The technology has major implications not only for interpersonal relationships but also for the surrogates themselves. We already have trouble vetting Uber drivers, and now we're going to let this person visit our grandmother?

Now, this is merely one example of emerging technology, and one that has been predicted before, but the researchers contend that it also solves real problems, such as eliminating both the need for tele-operated robots and the crowded space they occupy.
