Let Me Talk to My Agent
A while back, I mused about the magic of Hollywood, specifically the user interfaces in last summer’s blockbuster movies. Besides prepping for this year’s ICS QuickStart series, I’ve had time to catch some of this year’s crop of movies too. It’s only May, and we’ve already seen a few good Sci-Fi tent-pole movies. Even better, there have been some good UI tidbits to observe in them.
The latest Star Trek installment, Into Darkness, has plenty of eye candy on the screens of the Enterprise. As a UI geek, I found it interesting that this movie tries to maintain many of the physical aspects of the old series’ control panels. While very shiny, the new movie series features a bridge complete with knobs, buttons and a throttle. Apparently, the all-touch Library Computer Access and Retrieval System (LCARS) interface of the Enterprise-D remains about a century away, even though we’re all using tablets today.
Of greater interest to me was the latest outing of Tony Stark and his electronic sidekick, Jarvis (actually J.A.R.V.I.S., short for Just A Rather Very Intelligent System), in Iron Man 3. I’ll admit it: I want to be Tony Stark. Besides the obvious appeal of the whole “Genius, billionaire, playboy, philanthropist” gig, the man just has the coolest toys. Beyond the metal suit itself, an integral part of the whole Iron Man user experience is the intelligent agent system that runs the armor, the house and just about every aspect of Tony’s life.
The notion of helpful intelligent agents goes back many years. In 1987, Apple produced its Knowledge Navigator video showing Professor Bradford working in his home office, putting together notes for a lecture (yes, the video is on YouTube). While some aspects of the technology might seem quaint by today’s standards, it was pretty advanced stuff back in 1987. The video shows a flat-screen device with touch interaction that looks suspiciously like a tablet, video conferencing, desktop sharing and even voice interaction with the agent system. All of these technologies are available today in one form or another.
Flash forward to 2013 and let’s revisit the Iron Man franchise from a speculative-fiction point of view. As I mentioned in a previous installment of this blog, Tony’s interactions with Jarvis are highly immersive. In his office, Tony sits inside a volumetric projection of data manipulated by gestures, letting him do forensic analysis of a crime scene in full 3D. Interactions with Jarvis are delivered by voice, with all of the nuances of speech handled accurately by the agent. Jarvis quickly and efficiently does all the things one would expect from agent software, from routing calls and displaying information to controlling environmental systems. So, will we have to wait another 25 years to get to this level of interaction?
I don’t think we will, at least for some of the tech shown. Taking a quick look at the Gartner Hype Cycle for Emerging Technologies, we see gesture control moving past the high point of hype (the Peak of Inflated Expectations) as people realize what gesture can and cannot currently do. Gesture interactions are expected to reach the mainstream (the Plateau of Productivity) in 2 to 5 years. Speech recognition is even further along the curve, although, to be fair, it’s been listed in the same range for the last three years. Natural-language question answering, the key to interacting with Jarvis as seen in the movie, is projected to be 10 years out. Combine those predictions with Moore’s Law, which tells us that computing power doubles roughly every two years, and while we’re not going to have our own personal digital butlers in the immediate future, it’s likely we’ll get there eventually. When we will have volumetric displays is anyone’s guess.
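As a rough back-of-the-envelope calculation (my own arithmetic layered on top of those projections, not anything from Gartner): a 10-year horizon at one doubling every two years works out to about five doublings of raw computing power between now and a Jarvis-class natural-language assistant:

\[
2^{\,10 / 2} = 2^{5} = 32 \times \text{ today's computing power}
\]

Whether 32 times the horsepower is actually enough for that level of speech nuance and reasoning is the real question, of course; the point is simply that the raw capability keeps compounding while we wait.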
Even though we can’t create the full movie experience yet, ICS R&D has been experimenting with gesture and voice input for a while now. The results are encouraging, or at least entertaining. For those in the San Jose (CA) or Santa Ana (CA) areas in the next couple of weeks, feel free to pop into our Qt QuickStart seminar to see some examples of what can be done today. I promise Jarvis will be on his best behavior.