Brad Feld

I’ve Seen The Future

Nov 16, 2006
Category: Technology

On Tuesday night, I hung out in LA with my longtime friends and frat brothers John Underkoffler and Kevin Parent, co-founders of Oblong along with Kwin Kramer and Tom Wiley.  I hadn’t seen John and Kevin in a while, so it was fun to catch up with them.  More fun, however, was seeing and playing with the amazing stuff they are creating.

In the movie Minority Report, Tom Cruise’s character John Anderton is shown interacting with a computer on a wall using a futuristic user interface in which he uses hand gestures to manipulate images and video.  Underkoffler created that – including the beginnings of a new UI paradigm and a language for describing it.

Wind the clock forward four years.  John, Kevin, Kwin, and Tom are hard at work commercializing this next-generation user interface.  It’s magical to watch John control a set of complex applications projected on the screen with his hands.  No mouse, no keyboard – just gestures.  All that was missing was speech – and, for someone who has spent some time working on speech-related companies, it’s pretty clear where that could fit in.

Pause.  Ponder.  After a few minutes, John gave the gloves to me and taught me the UI.  It took about five minutes for me to get comfortable (probably less time than it takes a windows/mouse novice to get comfortable with the Windows / Mac UI).  While I had some trouble with my non-dominant hand (right hand), I could feel the “brain wiring” taking place as I got more and more comfortable working with the applications.

These weren’t trivial applications.  A few of them were set up just to demonstrate the UI characteristics.  But – there were deeper ones that included a 3-D view of LA (think pan and zoom, along with annotating objects).  Or – a time-sequenced example of traffic moving down the street (time forward, time back, pan, zoom).  Or – a time-sequenced map of the world showing all flight patterns over an elapsed period of time, including selecting specific origins and destinations to filter the data.

All of this was running on top of a Mac G5.

We went out for sushi afterwards and talked about it for several more hours.  I’m 40 years old.  In my life, there have been only two major UI paradigms that I’ve interacted with.  The first was character-based ASCII terminals and keyboards (circa 1979).  The second was WIMP (ironically, I saw a Xerox Alto around 1980 – so while the Mac was the first popular WIMP UI, I actually saw a WIMP UI before that).  Of course, there were punch cards and toggle switches – but let’s start in 1977, when the Apple II – arguably the first mainstream personal computer – came out.

So – 1977 – character-based becomes mainstream.  1984 – WIMP appears, but it probably doesn’t really become mainstream until Windows 3.0 – around 1990.  Speech – which has been stumble-fucking around since I was a kid – is still not mainstream (HAL – “I feel much better now, I really do.”)  I suppose you could argue that there is a new paradigm for handheld devices, but it’s so poor that it’s hard to consider it an innovation.  Twenty years and we’ve got nothing that is a discontinuous UI paradigm.

John, Kevin, Kwin, and Tom are inventing it right now.  Awesome.