I got an interesting email from a friend who has historically been a huge Apple fanboy. I asked him if I could repost it verbatim and he said yes. It follows – I’m curious what your response is to this.
While I’m still very involved with the art world here in Colorado and still working on conservation issues, we’ve actually just returned from almost a year away, the last 6 months in India. I realize that a lot of what I see is colored with the lens of India, but maybe that’s helping to make things more clear.
Anyway, in preparation for re-entry after India (we were in rural southeast India, without much electricity, so I figured home might be a shock), I started to try and catch up on things. Your blog was one of my tools for this. I read the post on creating the best product, agreed, and moved on. One of the first things I planned on doing once home was to buy a shiny new MacBook to replace my 4 year old white MacBook. Maybe going to the mall, rather than just buying it online, was my first mistake, but the cult of apple and the temple that is that store made me gag the second I walked in there. And while my MacBook may be old, my use of apple products is right where they want it to be… had the iPhone 5 the 2nd day it was out, MacGyvered the Airtel SIM cards to work as nano-SIM cards in India, have a small film production crew all working on the latest MacBook Pros and iMacs, iPads and iPods at home… on and on. But in the store, what I noticed was a culture of elitism and insincerity. I had a 4 year old laptop with me, and was treated like a Luddite because I didn’t look up to speed. Insulted, I kept the $4,500 in my pocket, thinking I’d keep the laptop running, which I did. Small thing I know, but my thought was “if apple doesn’t care about me, who do they care about?” Today an even smaller issue illuminated this even more. I went in again, this time to replace the defective “top case/keyboard” from these old white plastic Macs, and was told that the machine was now “vintage” (that’s the official apple label), and that they couldn’t replace the “defective part” (also their official language) as they had done in the past, because it is more than 4 years old. I thought that maybe I should just get a new machine and quit bellyaching, but I pushed a little just to see what apple thought about a customer like me… and called apple to ask if there was anything more they could do. After a lot of insincere apologies, I asked if there was really nothing they could do.
The support supervisor insisted that there was no more senior person to address this issue but that I might try craigslist. I was pretty surprised that apple’s official support process ended with telling the customer to check out craigslist for an old mac to scrap for parts. I’m such a pushover that if he’d offered me $100 credit towards a new macbook, I’d have smiled and bought another apple product.
As I write this, it sounds too much like a rant. But I couldn’t help writing, first to say hello after a long while (I did hear about the 3D printed tooth in Croatia…amazing!) and second to just try and make sense of what apple could possibly be thinking… the “cool factor” is clearly waning, their products are overpriced, and now they’re indifferent, even hostile, to customers who regularly spend tens of thousands of dollars on their products. Can they really be thinking that the best product is the one that you replace really quickly with something “cooler” and more expensive? I think this time, I might really go get the Chromebook. I can’t be alone, and that can’t be good for them.
Marc Andreessen recently wrote a long article in the WSJ in which he asserted that “Software Is Eating The World.” I enjoyed reading it, but I don’t think it goes far enough.
I believe the machines have already taken over and resistance is futile. Regardless of your view of the idea of the singularity, we are now in a new phase of what has been referred to in different ways, but most commonly as the “information revolution.” I’ve never liked that phrase, but I presume it’s widely used because of the parallels to the shift from an agriculture-based society to the industrial-based society commonly called the “industrial revolution.”
At the Defrag Conference I gave a keynote on this topic. For those of you who were there, please feel free to weigh in on whether the keynote was great or sucked, and whether you agreed, disagreed, were confused, mystified, offended, amused, or anything else that humans are capable of having as stimulus-response reactions.
I believe the phase we are currently in began in the early 1990s with the invention of the World Wide Web and the subsequent emergence of the commercial Internet. Those of us who were involved in creating and funding technology companies in the mid-to-late 1990s had incredibly high hopes for where computers, the Web, and the Internet would lead. By 2002, we were wallowing around in the rubble of the dotcom bust, salvaging what we could while putting energy into new ideas and businesses that emerged with a vengeance around 2005 and the idea of Web 2.0.
What we didn’t realize (or at least I didn’t realize) was that virtually all of the ideas from the late 1990s about what would happen to traditional industries that the Internet would disrupt would actually happen, just a decade later. If you read Marc’s article carefully, you see the seeds of the current destruction of many traditional businesses in the pre-dotcom bubble efforts. It just took a while, and one more cycle for the traditional companies to relax and say “hah – once again we survived ‘technology’”, for them to be decimated.
Now, look forward twenty years. I believe that the notion of a biologically-enhanced computer, or a computer-enhanced human, will be commonplace. Today, it’s still an uncomfortable idea that lives mostly in university and government research labs and science fiction books and movies. But just let your brain take the leap that your iPhone is essentially making you a computer-enhanced human. Or even just a web browser and a Google search on your iPad. Sure – it’s not directly connected into your gray matter, but that’s just an issue of some work on the science side.
Extrapolating from how it’s working today and overlaying it with the innovation curve that we are on is mindblowing, if you let it be.
I expect this will be my intellectual obsession in 2012. I’m giving my Resistance is Futile talk at Fidelity in January to a bunch of execs. At some point I’ll record it and put it up on the web (assuming SOPA / PIPA doesn’t pass) but I’m happy to consider giving it to any group that is interested if it’s convenient for me – just email me.
Next week at Defrag I’ll be giving a talk titled “Resistance is Futile”. I’ll be talking about my premise that the machines have already taken over. A few days ago a friend of mine emailed me a perfect image to summarize where we are today. Ponder and enjoy.
I’ve worn glasses since I was three years old. I was trying to look at something on my iPad yesterday without them on and I heard Amy burst out laughing with “you really can’t see a thing without your glasses.” True – my eyes are defective. I’ve contemplated getting LASIK a few times but chickened out each time – if 42 years of glasses have worked, I expect another 42 will be just fine.
For years I’ve fantasized about getting glasses that have a heads-up display (HUD) integrated into them. This HUD would be connected to a computer somehow, which would of course be connected to the Internet, which would then give me access to whatever I wanted through my glasses. I can’t remember a sci-fi movie over the past decade that didn’t have this technology available and since my jetpack now seems like it’s finally around the corner (I’m hoping to get one for my 46th birthday), I have hope for my HUDglasses.
The pieces finally exist since I’m carrying a computer in my pocket (my iPhone or my Android) that’s always connected to the Internet. My glasses just need bluetooth to pair with my phone, an appropriate display, a processor, a camera, and the right software. Optimally I could control it via a spatial operating environment like Oblong’s g-speak.
I’m interested in investing in a team going after this. The magic will be on the software side – I want to work with folks who believe the hardware will be available, can integrate existing products, and are comfortable with consumer electronics, but are more obsessed with “hacking the software” than with “assembling the hardware.”
If this is you, or someone you know, please aim them at me. In the meantime, I tried to hunt down Tony Stark but don’t have his email address.
A post in the New York Times this morning asserted that Software Progress Beats Moore’s Law. It’s a short post, but the money quote is from Ed Lazowska at the University of Washington:
The rate of change in hardware captured by Moore’s Law, experts agree, is an extraordinary achievement. “But the ingenuity that computer scientists have put into algorithms have yielded performance improvements that make even the exponential gains of Moore’s Law look trivial,” said Edward Lazowska, a professor at the University of Washington.
The rapid pace of software progress, Mr. Lazowska added, is harder to measure in algorithms performing nonnumerical tasks. But he points to the progress of recent years in artificial intelligence fields like language understanding, speech recognition and computer vision as evidence that the story of the algorithm’s ascent holds true well beyond more easily quantified benchmark tests.
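Lazowska’s point is easy to see with a toy sketch (my example, not from the article): swap a quadratic algorithm for an n log n one and the speedup grows with the input size, which no single hardware generation can match.

```python
import random
import time

def bubble_sort(xs):
    """O(n^2) comparison sort, standing in for the 'old' algorithm."""
    xs = list(xs)
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs

def timed(fn, *args):
    """Return (result, elapsed seconds) for a single call."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

data = [random.random() for _ in range(3000)]
slow, t_slow = timed(bubble_sort, data)
fast, t_fast = timed(sorted, data)  # built-in Timsort, O(n log n)

assert slow == fast  # same answer, wildly different cost
print(f"O(n^2): {t_slow:.3f}s   O(n log n): {t_fast:.4f}s")
```

Doubling the input size roughly doubles the gap between the two, while a Moore’s Law doubling of hardware speed merely halves both times – the algorithmic gain compounds in a way the hardware gain doesn’t.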
If you agree with this, the implications are profound. Watching Watson kick Ken Jennings’ ass in Jeopardy a few weeks ago definitely felt like a win for software, but someone (I can’t remember who) had the fun line that “it still took a data center to beat Ken Jennings.”
That doesn’t really matter, because Moore’s Law will continue to apply to the data center, but my hypothesis is that there’s a much faster rate of advancement on the software layer. And if this is true it has broad impacts for computing, and computing-enabled society, as a whole. It’s easy to forget about the software layer, but as an investor I live in it. As a result of several of our themes, namely HCI and Glue, we see first hand the dramatic pace at which software can improve.
I’ve been through my share of 100x to 1000x performance improvements because of a couple of lines of code or a change in the database structure in my life as a programmer 20+ years ago. At the time the hardware infrastructure was still the ultimate constraint – you could get linear progress by throwing more hardware at the problem. The initial software gains happened quickly but then you were stuck with the hardware improvements. If you don’t believe me, go buy a 286 PC and a 386 PC on eBay, load up dBase III on each, and reindex some large database files. Now do the same with FoxPro on each. The numbers will startle you.
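That kind of “couple of lines” win is easy to reproduce today. Here’s a hypothetical sketch (not the dBase experiment): replacing repeated linear scans with a hash index is two lines of code, and the speedup grows with the size of the data.

```python
import time

# A flat table of records, queried by id.
records = [{"id": i, "name": f"user{i}"} for i in range(200_000)]
wanted = [150_000 + i for i in range(50)]

# Before: linear scan, O(n) per lookup -- fine for small tables, brutal at scale.
start = time.perf_counter()
scan_hits = [next(r for r in records if r["id"] == w) for w in wanted]
t_scan = time.perf_counter() - start

# After: "reindex" -- build a hash map once, then O(1) lookups. Two extra lines.
index = {r["id"]: r for r in records}
start = time.perf_counter()
index_hits = [index[w] for w in wanted]
t_index = time.perf_counter() - start

assert scan_hits == index_hits  # identical results
print(f"scan: {t_scan:.4f}s   indexed: {t_index:.6f}s")
```

The hardware didn’t change between the two runs; only the data structure did. That’s the software layer doing the work Moore’s Law used to do.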
It feels very different today. The hardware is rapidly becoming an abstraction in a lot of cases. The web services dynamic – where we access things through a browser – built a UI layer in front of the hardware infrastructure. Our friend the cloud is making this an even more dramatic separation as hardware resources become elastic, dynamic, and much easier for the software layer folks to deploy and use. And, as a result, there’s a different type of activity on the software layer.
I don’t have a good answer as to whether it’s core algorithms, distributed processing across commodity hardware (instead of dedicated Connection Machines), new structural approaches (e.g. NoSQL), or just the compounding of years of computer science and software engineering, but I think we are at the cusp of a profound shift in overall system performance, and this article pokes us nicely in the eye to make sure we are aware of it.
The robots are coming. And they will be really smart. And fast. Let’s hope they want to be our friends.