One of the interesting and somewhat underappreciated takeaways from the Qualcomm wireless event in San Diego a few weeks back is that 5G will make realistic augmented reality avatars possible. This advance will be possible because 5G's high bandwidth and very low latency allow processing to shift from the device to the cloud.
Coupled with the rollout of distributed data centers to keep latency down, this means you could have realistic virtual pets and friends, or even visit remote locations virtually. Using virtual reality, you could not only feel as though you were there, but also appear to people there as your avatar, rather than as the drone or robot providing you with the experience.
This also creates an interesting path to immortality by advancing the capability of creating digital clones of pets and people. Coincidentally, the first digital human clone was born last week, and I think it opens the door to many interesting future possibilities.
I’ll close with my product of the week: a new portable second screen from Lenovo, the ThinkVision M14.
Moving Local Processing to the Cloud
There are three limitations to creating a realistic AR experience:
- Optics. Developers need to make the rendered image look realistic, and they must place it in the scene while realistically occluding the real objects it conflicts with.
Right now, AR glasses provide an experience where reality bleeds through, and the rendered image tends to look like a cartoon ghost.
Oh, and those glasses have to look good when you wear them. The failure of 3D TVs showcased that we don’t want to wear dorky prosthetics. (This may turn out to be the biggest problem to overcome.)
- Computing power. The power required would challenge a workstation, let alone something you might put on your head or anyplace on your body.
Walking around with one of those HP backpack workstations with double holstered batteries would get old quickly, and I doubt it would provide the needed performance.
- Content. For inanimate objects like furniture, creating content is fairly easy with a 3D scanner and existing technology.
However, if you want a virtual pet that behaves like a pet, or a virtual human that behaves like a human, you have to code the personalities. Until now, that has been beyond our capability.
Moving the processing to the cloud potentially allows unlimited headroom without the need to carry a workstation on our bodies — a solution for limitation No. 2. That extra processing power also could drive an artificial intelligence that behaves realistically like an animal or even a human, which addresses limitation No. 3.
Optics are still a problem, and until we can replace eyeballs, a lot of work will need to be done to make us accept optical prosthetics in any form. As a society, we don't regard even well-designed glasses favorably (with the exception of dark glasses, which may point to a solution).
Virtual Pets, Virtual People
One of the issues for people who are just entering the workplace or who have a job that requires a lot of travel is that schedules make it impractical to have the companionship of a pet. While there are people who travel with their pets, doing so while having to go into work remains problematic. A few companies allow pets in the workplace, but most don’t, so having a real pet could limit your ability to change jobs and advance.
What if you could have a virtual pet that only you could see? It could always be with you. If tied to an AI, it actually could guide you to locations as a GPS solution, converse with you like a far smarter digital assistant, and even provide a rolling commentary that only you could hear. If done right, it could address your need for companionship without the overhead of a living pet.
Just as you could create a virtual pet with superpowers (I'm thinking of a pet dragon or Superman's dog Krypto), you also could create a human companion. That human companion could be modeled on a real person but modified to remove anything you might find annoying.
You could even recreate yourself and have that virtual clone be the best version of yourself. That could be your perfect advisor, because you likely would trust a perfected version of yourself more than any other person, real or virtual. Not to mention that this virtual version of you could transcend your life, going on for as long as the technology existed to provide advice to your heirs.
I could use a resource like that, and there are a whole bunch of folks in politics who could use a trusted advisor to keep them out of trouble.
I got a pitch along those lines last week from a company called “1sec inc.” It had turned Hiro Newman into a virtual celebrity influencer. Fully rendered and based on a real person, this is one of a number of efforts to create immortal avatars based on real people.
These efforts are creating the content that we'll need to populate the planet with realistic virtual people. Not to mention, they could do a lot for in-game non-player characters (NPCs).
I expect the technology to have a big impact on computer gaming as teams are created using a blend of real and virtual players. You could even digitize top players and provide them, for a fee, as companions to help train or compete with players on their favorite games.
Granted, the porn industry would have an entirely different spin on this.
The shift of processing from local devices to centralized cloud computing will have a massive impact on how we perceive reality — from being able to re-render the world around us, to being able to acquire both human and non-human virtual companions to help us through our lives, and even to digital immortality.
Granted, that will create some rather interesting and unfortunate side effects, like folks preferring digital pets and mates to real ones. However, it also could help us become better people, as we create advisors to help us become the best versions of ourselves.
It potentially could provide a better model for kids to learn appropriate behavior than watching typical adults in action.
These advances are just a reminder that our world is going to change massively over the next few decades — so much so that magic will seem to become real, and our lasting legacy could be virtual clones representing our better selves. Something to noodle on this week.
Lenovo ThinkVision M14

I don't like to write when I'm on the road, because the 49-inch Dell monitor I have on my desk spoils me. I can watch email, view my reference material, and keep what I'm working on all on the same screen. I can even add a video to the mix and keep working.
When I’m on a laptop, the 13- to 15-inch screen doesn’t give me the real estate to which I’ve grown accustomed. The answer is a portable external monitor, and I’m always on the hunt for a better one.
The latest is the Lenovo ThinkVision M14, which connects using USB-C and has a power supply, so it doesn’t draw down the power of your laptop.
I’m using it with the amazingly light Lenovo ThinkPad X1 (which uses real carbon fiber). Although the combination is lighter than the 15-inch notebook I’d otherwise carry, I have the equivalent of two 14-inch displays.
When I first got it, the device just plugged in and worked, with no new drivers required or other issues — it was a dream to set up. I wish everything I bought worked that easily. The battery is integrated into the stand, making the M14 one of the most stable portable monitors I've used.
With two USB-C connectors, you can daisy-chain it with your USB-C power supply, so both your laptop and the portable monitor charge at the same time.
Also, at US$249 it won’t break the bank.
Until we work out head-mounted displays, moving to a second portable monitor is the easiest way I’ve found to overcome the real estate issue with a laptop screen, so the Lenovo M14 is my product of the week.
The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.