I’ve written in the past about my obsession with measuring things. While my manual measurements via Daytum include miles run, books read, flights taken, and cities slept in, I’ve become much more focused in the past year on what I’ve been calling “human instrumentation.” This resulted recently in Foundry Group leading a $9 million financing in a San Francisco company called Fitbit.
If you want to see the type of data I’m tracking, take a look at my Fitbit profile. For now, I’m focused on the data that Fitbit tracks automatically for me, primarily derived from the step and sleep data. But from my profile page you can see a variety of other data that I can currently enter manually (I’ve entered a few examples) even though I use other sources to track it (for example, my weight via my Withings scale).
I now have a house full of personal measurement devices and an iPhone full of apps to track various things. A few are still active; many have long been relegated to the “closet of dead, useless, obsolete, or uninteresting technology.” During this journey over the past year, I feel like I tried everything and finally found a company – in Fitbit – that has a team and product vision that lines up with my own.
A year ago when I first encountered the company, they were just launching their product. I was an early user and liked it a lot, but hadn’t clearly formed my perspective on what the right combination of software and hardware was. As I played around with more and more products, I started to realize that the Fitbit product vision as I understood it was right where I thought things were going. The combination of hardware, software, and web data integration is the key, and the Fitbit founders (James Park and Eric Friedman) totally have this nailed. That made it easy to pull the trigger quickly when we explored investing again.
One of the things my partners and I love about products like Fitbit is the combination of hardware, software, and a web service that lets the product continually improve without requiring a hardware upgrade. Fitbit is a great example of this; if you buy one today, I expect you’ll see the product improve over the next quarter.
I firmly believe that in 20 years we’ll simply swallow something that will fully instrument us. Until then, we still have to clip a small plastic thing to our belt or keep it in our pocket. But that’s ok since it now knows how to talk to my computer, which is connected to the web, which is getting smarter every millisecond.
“In five years when you buy a computer you’ll get this.” John Underkoffler, Oblong’s Chief Scientist, at 14:20 in the video.
I’ve been friends with John Underkoffler since 1984 and we’ve been investors in Oblong since 2007. Ever since I first met John I knew that he was an amazing thinker. John, his co-founders at Oblong, and the team they have assembled are creating the future of user interfaces. This year has started off incredibly fast for them – they’ve spent the last five months scaling the business as the result of several large customers and are in the home stretch of releasing their first “shrink wrapped product” in Q3. Get ready – the future is closer than you imagine.
I talk about human computer interaction (HCI) a lot on this blog. We’ve invested in a number of companies in our HCI theme, including Oblong, Organic Motion, and EmSense and have a few more that we are working on that hopefully will be announced shortly. When I think about the areas I’ve been paying the most attention to and am the most intrigued with as an investor, HCI rises to the top of the list.
This morning I read an article on SeattlePI titled UW researchers look to reinvent the graphical user interface. While the headline is a bit sensational, the project (Prefab) is very cool. At first glance I thought it was simply rewriting HTML pages (clever, but not that big a deal) but then I realized it was doing something more profound. The five minute video is worth a look if you are into these types of things.
The bubble cursor and sticky icon examples are great ones. Starting at 1:45 you see the bubble cursor and sticky icons in action on Firefox in Vista. At 2:05 you see it on OS X. At 2:45 you see it in action on a YouTube player. The magic seems to be around pixel-level mapping, and anyone working in adtech knows that’s where the real action is. It’s pretty cool to see it being used to map UI functionality.
A week or so ago, Fred Wilson dictated a blog post on his Nexus One phone. He then discovered Swype, which now has an unofficial Android app. As usual, the comment threads on AVC were very active, with lots of thoughts about the future (and past) of voice and keyboard input.
When I talk about Human Computer Interaction, I regularly say that “20 years from now, we will look back on the mouse and keyboard as input devices the same way we currently look back on punch cards.”
While I don’t have a problem with mice and keyboards, I think we are locked into a totally sucky paradigm. The whole idea of having a software QWERTY keyboard on an iPhone amuses me to no end. Yeah – I’ve taught myself to type pretty quickly on it but when I think of the information I’m trying to get into the phone, typing seems so totally outmoded.
Last year at CES, “gestural input” was all the rage in the major CE booths (Sony, Samsung, LG, Panasonic, …). In CES speak, this was primarily things like “changing the channel on a TV using a gesture”. This year the silly basic gesture crap was gone, replaced with IP everywhere (very important in my mind) and 3D (very cute, but not important). And elsewhere there was plenty of 2D multitouch, most notably front and center in the Microsoft and Intel booths. I didn’t see much speech and very little 3D UI stuff – one exception was the Sony booth, where our portfolio company Organic Motion had a last-minute installation, at Sony’s request, showing off markerless 3D motion capture.
So – while speech and 2D multitouch are going to be an important part of all of this, they’re a tiny part. If you want to envision what things could be like a decade from now, read Daniel Suarez’s incredible books Daemon and Freedom (TM). Or, watch the following video that I just recorded from my glasses and uploaded to my computer (warning – cute dog alert).