If you’d like to do something political that has nothing to do with the upcoming elections, read through Tom Evslin’s post Act Now for Better Internet Access. Then go sign the online petition at freetheairwaves.com.
It kind of blows my mind that the National Association of Broadcasters is still fighting this stuff, especially with the impending federally mandated cutover to digital TV in February. But hey, lots of things seem illogical – this is nothing new.
Thanks Tom for alerting everyone to this. The deadline for comments is Tuesday 10/28 so click and comment now.
Fred Wilson gave a phenomenal speech at Web 2.0 Expo NY titled New York’s Web Industry From 1995 to 2008: From Nascent to Ascendent. It’s about 25 minutes long – worth watching from beginning to end. It’s a fantastic history lesson that details the rise, fall, and re-emergence of the Web industry in New York.
As part of this, Fred makes a plea to "bury the name Silicon Alley." He hates it in the same way I’ve always hated the names "Silicon Flatirons" and "Silicon (whatever)" to describe the tech communities in geographies other than Silicon Valley. Fred appropriately suggests that we should call "New York" simply "New York" – which I completely agree with.
Now that I’m 42 years old, I’ve been around the computer industry long enough to understand that it runs in cycles. I don’t know how long the cycles will be, or when they will reach a peak or a trough, but I do know that things will get better, will get worse, will get better, will get worse, will get better, …
When I reflect on it, the long-term trend over the last 42 years has been amazing. There are lots of formal and informal studies and articles on this that all link to Schumpeter’s theory of creative destruction and Clay Christensen’s ideas around disruptive innovation. As the cycles play out, great new companies get created around new innovation, some reach escape velocity, some get absorbed into other large incumbent companies, and some disappear.
Today’s New York Times has two short articles – one in Bits and the other in DealBook – that reminded me of this.
Our good friend Microsoft makes a key appearance in both articles. Pondering the rise, fall, rise, fall, … of each of these companies over a 50-year period – both at a macro company level and within specific product groups – is a fun mental exercise (at least for me).
When I reflect on the various companies we’ve funded over the past year, I get really excited about the stage of the cycle we are in with the new Foundry Group portfolio. Independent of who wins the upcoming election, I think the vector of innovation around software and the Internet will be steep, and many of the things we’ve been talking about for the past 20 years as science fiction are going to start to instantiate themselves as real products and services. The relationship between humans and computers is once again changing rapidly, and the number of different amazing things that I can envision happening in the next two decades is extensive.
I was going to call this post "Private Beta is Bullshit" but then I realized I might be wrong. Rather than decide, I’m looking for reasons to change my mind. Please help me. In the spirit of thinking out loud on my blog, I’m going to go through a history lesson from my perspective to frame the problem.
When I started writing commercial software in the 1980s, there was a well-defined "beta process." Your first beta was actually called an alpha – when you had your friends and a few lead users play around with your stuff, which was guaranteed to break every few minutes. But they were a good source of much-needed early feedback and testing. Then came beta – you shipped your disks out to a wider audience, including a bunch of people you didn’t know but who were interested in your product, and they banged away looking for bugs. You had a bug collecting and management process (if you were really cutting edge, you even had a BBS for this), and while there wasn’t a code freeze, you spent all of your time fixing bugs and hardening / optimizing the code. If you had a complex system, you started shipping release candidates (RCs); less complex systems went straight to a general availability release (GA). Inevitably some bugs were found and a bug fix version (v x.01) was released within a week or two. At this point you started working on the next version (v x+1.0); if there were meaningful bugs still in v x, you did small incremental releases (v x.02) on the previous code base.
This approach worked well when you shipped bits on disks. The rise of the commercial Internet didn’t change this approach much other than ultimately eliminate the need to ship disks as your users could now download the bits directly.
The rise of the web and web-based applications in the late 1990s (1997 on) did change this, as it was now trivial to "push" a new version of your app to the web site. Some companies, especially popular consumer sites and large commercial sites, did very limited testing internally and relied on their users to help shake down the web app. Others had a beta.website.com version (or equivalent) where limited (and often brave) users played around with the app before it went into production. In all cases, the length of the dev/test/production loop varied widely.
At some point, Google popularized the idea of an extended beta. This was a release version that carried the beta label, which was supposed to indicate to people that it’s an early version that is still evolving. Amazingly, some apps like Gmail (or Docs or Calendar) seem to never lose their beta label, while others like Reader and Photos have dropped them already. At some point, "beta" stopped really meaning anything other than "we’ve launched and we probably have a lot of bugs still, so beware of using us for mission critical stuff."
With the rise of the Web 2.0 apps, beta became the new black and every app launched with a beta label, regardless of its maturity (e.g. a whole bunch of them were actually alphas). Here’s where the problem emerged. At some point every beta got reviewed by a series of web sites led by TechCrunch (well – not every one – but the ones that managed to rise above the ever increasing noise). When they got written up, many of them inevitably ran into The First 25,000 Users Are Irrelevant problem (which builds on Josh Kopelman’s seminal post titled 53,651 – which might be updated to be called 791K). During this experience, many sites simply crashed under the sudden load, as they weren’t built to handle the scale or peak load.
Boom – a new invention occurred. This one is called "private beta" and now means "we are early and buggy and can’t handle your scale, but we want you to try us anyway when we are ready for you." I’ve grown to hate this as it’s really an alpha. For whatever reason, companies are either afraid to call an alpha an alpha or they don’t know what an alpha is. For a web app, operational scale is an important part of the shift from alpha to beta, although as we’ve found with apps like Twitter, users can be incredibly forgiving with scale problems (noisy – but forgiving).
So – why not get rid of the "private beta" label and call all of these things alphas? Alphas can be private – or even public – and they create a clear emotional and conceptual distinction between "stuff that’s built but not ready for prime time" (alpha), "stuff that’s getting close but still needs to be pounded on by real users and might go down" (beta), and "stuff that’s released" (v x.0). That seems a lot clearer to me than "private beta", "beta" (which might last forever), and ultimately v x.0.
In the grand scheme of things this might simply end up in "Brad Pet Peeve" land, but recently it’s been bothering me more than my other pet peeves so it feels like there’s something here. Call me out if there isn’t or pile on if there is.
If that heading makes you think "Relax, relax, relax I need some information first" then you have the same Pink Floyd addiction that I have.
Eric Norlin – my co-conspirator in the Defrag Conference – has a very relevant post up titled Beyond Incrementalism 2.0. I expect we are going to hear a new wave of "why aren’t we (where "we" is the computer industry) going after big problems right now?"
Tim O’Reilly had a dynamite post up over the weekend titled MicroHoo: corporate penis envy? (Anyone bold enough to use the phrase "penis envy" in the title of a blog post is a personal hero of mine.) Fred Wilson weighed in, calling it the Best Blog Post In A Long Time, and pulled out some of the great one-liners. Among other things, his post is about the need for Big Hairy Audacious Goals to move innovation forward.
On my run this morning (during which I got lost in the mountains and added a very muddy extra 30 minutes to my normal two hour run to the office), I ruminated on the dynamics of incrementalism and whether I was seeing enough radically new stuff – or whether the new things I was seeing were merely incremental builds.
In our friendly neighborhood "Web 2.0 space" (god I hate that phrase) there is a ridiculous amount of incrementalism. When the echoes in the echo chamber echo even more than usual, that is a signal – mostly about the signal-to-noise ratio getting out of whack.
When I think of other areas we are playing around in (HCI, Digital Life) I’m seeing plenty of stuff that I would put in the "radically new / BHAG" category (e.g. "the mouse and keyboard are an anachronism – their time is up – let’s make them vanish.")
Based on Eric’s brain and what he’s thinking about, I expect this year’s Defrag Conference to step far outside the Web 2.0 / Implicit Web echo chamber and try to re-energize some seriously cool thinking around BHAGs in this arena. Come play.