We are entering a New Normal, a subject richly codified by Kevin Kelly in his thought-provoking new book, 'The Inevitable.'
The Next Wave, which is quickly becoming the New Way, is approaching ubiquity.
Soon, it will be so embedded in the fabric of everything that it's essentially 'invisible.'
Examples of the New Way readily come to mind. With Uber, you can push a button, get picked up and taken anywhere on demand. And it's a GREAT experience.
With Airbnb, you can push a button, and secure lodging anywhere in the world. And it's a transformative experience.
With Amazon, you can have anything delivered to your door at the push of a button. And you can even return it, too. Incredible, right?
But, what's coming next are the Archetypes for long heralded concepts, such as robotics, machine learning and augmented reality.
By archetypes, I mean emblematic examples that provide the pattern recognition from which more complex, more integrated instances (systems) will be iterated on and created.
Consider the case of Tesla, with their Auto Pilot function. What does Tesla's Auto Pilot say about the future of software driven robotics?
On the one hand, it was the failure of this function that resulted in the death of a Tesla owner.
This is a critical reminder that new technologies can yield dramatically unintended consequences, and so the pros and cons of enabling specific functions must be evaluated. "Public beta" can't be a disclaimer for not having a fully formed safety plan.
That noted, entrepreneur and blogger, Marco Arment, wrote a seriously crisp assessment of Tesla Autopilot that captures just how intricate -- and powerful -- these systems can be.
Tesla has built FIVE different subsystems that comprise its Autopilot capabilities -- Automatic Emergency Braking; Autopark; Summon; Adaptive Cruise Control; and Autosteer -- some of which Marco dismisses, a few of which he uses faithfully.
In baseball terms, software robotics is the second batter in the first inning of a nine-inning game. But it's also a very real, directed approach to solving specific problems that make humans better, safer drivers.
Meanwhile, machine learning has paved the way for what we think of as Artificial Intelligence. Google's Image Search is based upon a deep learning system that literally learns what a face looks like, the parts that comprise it, the pieces that fit together a certain way, and the degrees of variance that can be applied across its feature set.
What can you do with all of this goodness? Well, the ability to autonomously recognize sophisticated objects, decompose them into base elements, and then support precise assembly, overlay and masking logic is a central recipe in enabling augmented reality.
One archetypal example is MSQRD (masquerade), an app for iPhone and Android that uses facial detection and recognition to map a library of dynamic masks to the contours of your face...even in video.
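The geometric heart of this kind of mask mapping can be sketched in a few lines. This is a toy illustration, not MSQRD's actual pipeline: real apps track dozens of facial landmarks per video frame, while here we fit a similarity transform (rotation, uniform scale, translation) from just two hypothetical anchor points, the mask's eye positions, to the detected eye positions on a face:

```python
# Toy sketch: map a mask's anchor points onto detected facial landmarks.
# Real AR apps track many landmarks per frame; here we use just two
# hypothetical points (left eye, right eye) and complex arithmetic to
# solve for the similarity transform z -> a*z + b.

def fit_similarity(src, dst):
    """Solve a*z + b so that src[0] -> dst[0] and src[1] -> dst[1]."""
    s0, s1 = complex(*src[0]), complex(*src[1])
    d0, d1 = complex(*dst[0]), complex(*dst[1])
    a = (d1 - d0) / (s1 - s0)   # encodes rotation + uniform scale
    b = d0 - a * s0             # encodes translation
    return a, b

def transform(a, b, point):
    """Apply the fitted transform to any point on the mask."""
    z = a * complex(*point) + b
    return (z.real, z.imag)

# Mask designed with eyes at (0, 0) and (10, 0); face detected with
# eyes at (40, 50) and (60, 50) -- twice the size, no rotation.
a, b = fit_similarity([(0, 0), (10, 0)], [(40, 50), (60, 50)])
print(transform(a, b, (5, 8)))  # any mask point now lands on the face
```

Once the transform is fitted per frame, every pixel or vertex of the mask can be warped onto the face with the same two numbers, which is why this runs comfortably in real-time video.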
Don't take my word for it, download MSQRD on your phone, and you'll be hooked.
The next level up from this is creating overlay experiences so real that you don't know that a car cruising on a race track in a TV commercial is actually a sensor-wrapped shell optimized for motion capture and digital car-body overlay (see: Blackbird).
Watch the video to get a sense of how the car advertising business could change fairly dramatically in short order.
And, of course, there is Pokemon Go, which uses your camera, physical space, graphical overlay and location-based services to create an interactive universe in the real world. It, too, is an archetype for the social, mobile and play capabilities of augmented reality.
Consider the curious case of the Amazon Echo, the speaker that listens to you.
If I had told you a decade ago that people would put voice capture devices in their households, you would have assumed I was talking about the People's Republic of China.
But, no, I am talking about Amazon. The same company that transformed the book in the way Apple transformed the song. The same company that created a massive cloud array of services for Software Developers through Amazon Web Services and for Merchants through Selling on Amazon.
Echo takes Siri and goes further, asserting persistence as a staple. For a home- or office-bound device always tethered to electrical power and network connectivity, this is a reasoned value exchange for a broad swath of users.
Users trade privacy, and trust that Amazon really only listens in when the user says 'Hey Alexa...' In exchange, they get a software-driven agent ready to handle a number of daily tasks on verbal command.
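That value exchange rests on a simple gate: audio is discarded until the wake word is spotted, and only then does anything leave the device. Here is a toy sketch of that gate (function and data names are hypothetical; the real Echo does wake-word spotting locally in firmware, on raw audio rather than text):

```python
# Toy sketch of a wake-word gate: nothing is forwarded until the wake
# word is heard, then exactly one command is captured before re-gating.
# Names here are hypothetical, for illustration only.

WAKE_WORD = "alexa"

def handle_audio(chunks, send_to_cloud):
    """Scan a stream of (pre-transcribed) chunks; forward only the
    command that immediately follows the wake word."""
    awake = False
    for chunk in chunks:
        if not awake:
            if WAKE_WORD in chunk.lower():
                awake = True          # wake word heard: capture next chunk
        else:
            send_to_cloud(chunk)      # only now does audio leave the device
            awake = False             # re-gate until the next wake word

captured = []
handle_audio(
    ["chatter in the room", "Alexa", "what's the weather", "more chatter"],
    captured.append,
)
print(captured)  # ambient chatter before and after is never forwarded
```

The interesting design point is that the trust boundary sits at that `awake` flag: everything before it is ephemeral, everything after it is a cloud request.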
Amazon, after all, is the same company that envisioned a bot-driven future through the human-powered Mechanical Turk.
Echo closes the loop.
Finally, consider the Simulation goodness of playing NBA 2K. The experience of playing a competitive basketball game that 'feels' like real basketball tells me that the first 'holy shit' virtual worlds will be a better, smarter version of NBA 2K, but bot-infused, location-aware, physical and augmented.
Full immersion is coming, but augmentation is where the New Way is coming to life. It's the one with a well-framed set of archetypal experiences from which to expand.
Side Notes and Links:
- If you really wanted to accelerate machine learning, would you put more energy into ENCODING or INTERPRETING?
- If AI, Deep Learning and Machine Learning are of interest, this video is DEEP, but worth it
- If we all agree that some combination of artificial intelligence and machine learning is obvious and inevitable, what is the platform play?