Brad Feld

Category: Technology

Niel Robertson, the CTO of Newmerix, has what I anticipate will become a legendary blog post up called I Pity The Fool that dissects Oracle’s recent “Half Way To Fusion” event.  In addition to casting John Wookey (Oracle’s SVP of Application Development) as Mr. T, Niel does an extraordinary job of walking through what Oracle has announced and what he thinks it means for Oracle’s customers.  If you use (or, more importantly, are responsible for managing and deploying) Oracle, PeopleSoft, JD Edwards, or Siebel applications, you must read this.  At a minimum, you’ll get a great laugh out of it.


Con(Fusion)?

Jan 19, 2006
Category: Technology

Yesterday Oracle held a major set of presentations around its new Fusion platform to help celebrate the Fusion project’s one-year anniversary.  The headline for the event was “Oracle is halfway to Fusion.”  This is a huge deal as it’s the basis for Oracle’s integration of a number of disparate products and technologies from the acquisitions they’ve done in the past two years (PeopleSoft / JD Edwards and Siebel being the most notable.)

Niel Robertson – the CTO at Newmerix – watches this closely as the consolidation in the ERP / packaged application market is a key driver of the opportunity for Newmerix.  I asked Niel what he thought and his quick response was:

Imagine if Microsoft came to you and said the following: “We know you like MS-Office. It has some great features. But we really think next gen of OpenOffice is the way to go. Hey, its standards based. And you can extend it with your own functionality. And look, we have this totally cool OpenDoc format. And, we took the best features of word, wordperfect, framemaker, PDF, and anything else you have and we sort of merged it all together. Well, some of the features won’t be there. Like sub-bullet points. They didn’t make it. And track changes will work totally differently. And folders in outlook won’t be able to be organized the same – but pretty close. But hey, its open. And oh yeah, we’ll give you some tools to convert all your word, PPT, outlook, excel, etc.. over. But you can’t bring anything based on a template (like PPT slides) or any custom formatting you have done in your documents and tables of contents in PDFs won’t work anymore. But don’t worry, all word processors and office tools go through a 7 year evolution – it’s totally normal.”  Now take this conversation and consider how many word, ppt, pdf, excel docs you have alone. Then consider rolling this out to 45,000 people in your organization. Ahhh! It’s a total mess.


It kind of reminds me of Microsoft’s Project Green.  Expect more on this from Niel on his blog.


David Jackson is now putting up conference call transcripts on his Seeking Alpha blog.  He’s got a feed up that you can subscribe to for your daily dose of earnings call entertainment (and some of them are very funny.)  Plus you can quickly get an extra dose of Overstock.  Paul Kedrosky reinforces the potential humor value with an old Mary Meeker question on a Microsoft Q2 2003 call.


The We Media Deal

Jan 10, 2006
Category: Technology

Matt Blumberg has a good thoughtful post up on the notion of the “We Media Deal.”  Matt spends a lot of time paying attention to the transition from “Old Media” to “New Media” to “Next Media” (or “We Media”).  Matt posits that We Media has two components:

  1. The value of the service to you increases in lock-step as you contribute more data to it.
  2. The more transparent the value exchange, the more willing you are to share your data.

Worth a careful read if you are developing anything that interacts with end users.


I still read a handful of print magazines (you gotta do something in the bathroom) – one of my favorites is Technology Review (MIT’s Magazine).  This month’s cover story is The Internet Is Broken, a fascinating (and probably important) article about the cost of the Internet’s basic flaws, which result in the need for the “clean-slate approach” being advocated by MIT’s David Clark (an Internet old-timer and chief protocol architect from 1981 – 1989.)

Clark lays out four basic elements he’d like to see designed into the “new Internet architecture”:

  1. Security: The Internet should authenticate the people and computers you communicate with and keep spam and hazards like viruses from ever reaching your PC.
  2. Mobility: Assigning Internet Protocol addresses to small and mobile computing devices such as sensors, phones, and embedded processors in cars would allow them to connect to the network securely.
  3. Protocols: Better traffic routing agreements between Internet service providers would allow them to collaborate on advanced services without compromising their businesses.
  4. Instrumentation: All pieces of the network should have the ability to detect and report emerging problems – whether technical breakdowns, traffic jams, or replicating worms – to network administrators.

The article is bound to be controversial, but covers a lot of ground, including discussing a proposed $300 million effort from the NSF to create a new Internet infrastructure.


The Nacchio File

Jan 02, 2006
Category: Technology

If you are interested in how things play out with Joe Nacchio’s criminal insider trading indictment, I recommend you subscribe to the New West Networks Boulder feed, as Richard Martin is writing a series titled The Nacchio File.  Nacchio is at the top of the Qwest meltdown pyramid and – among the expected editorial chatter – the case will likely be covered heavily by the local (and national) news guys.

I was at Qwest headquarters a couple of times during the bubble for various meetings with senior Qwest execs (a few cases of potential M&A that never went anywhere, some executive recruiting, and a random hysterical meeting that will go down in history as one of the silliest experiences I’ve ever had at a corporate headquarters anywhere.)

I never met with Nacchio, but I bumped into him twice.  One time he was hunkered over the Bloomberg that was prominently located in the center of the executive waiting area, looking at the movements of the 100 or so stocks on the Qwest default screen.  The other time I literally ran into him as I turned a corner while he was saying something like “nice sale today” to Weisberg.  I was easily ignored as a long-haired nerd wearing jeans – he likely thought I was up there fixing the executive network connectivity and – since I clearly had nothing to do with the Bloomberg machine (which seemed to be working at the time) – wasn’t worthy of a greeting.


The Deathstar Rises

Dec 29, 2005
Category: Technology

I sat stunned this morning as I read that AT&T (the “new name” for SBC) is going to spend $1 billion on a branding campaign.

For that?  Now, I’m well known for hating money wasted on marketing, but $1 billion for that?  You’ve got to be fucking kidding me.  I can imagine about 1 billion better uses for the money.  Apparently, they’ve spent lots of high powered marketing energy (and money, I expect) replacing the “Reach out and touch someone” slogan with “Your world, delivered.”  Excellent.  If I’d been on the “branding committee”, I would have recommended “We Suck Less.” 

This just in – Intel also announced that they are going to do a major overhaul of their branding, replacing “Intel Inside” with “Leap ahead.”  Double excellent.

Seth – the world needs you man – go save these guys from themselves.


When I saw my first demo of the World Wide Web at an MIT Athena Cluster in 1994 (it was Freshman Fishwrap – among other things – running on a very clunky version of Mosaic) I remember thinking something along the lines of “wow – this could be used for a lot of things.”  Duh. 

Every now and then I run into an unintended consequence of the Web.  I’ve been involved in many companies that were trying to create “intended consequences” (some succeeded, some failed), but I’m intrigued when I stumble upon an unintended consequence, especially if it’s buried deep in the fabric of the mainstream.

I found one the other day.  Amy and I were visiting her relatives in Hotchkiss, Colorado.  Hotchkiss is in the middle of the western foothills of the Rocky Mountains, is a beautiful place, has some great running, and – while I’m probably the only Jew for 50 miles – I’m always welcomed by Amy’s wonderful family at Christmas time.  Amy’s uncle Mike and aunt Kathy own Weekender Sports – the local sporting goods store (need a snowmobile, ATV, fishing rod, gun, or ammo anyone?)  It’s a great local store and is everything you’d expect.

I was sitting around talking with Amy’s cousin Mario who helps run the store and I asked him how business was.  While the answer was “generally good”, we talked about the ups and downs of a local retail store (big city discounters, Wal*Mart, up and down days, challenging suppliers.) 

One comment from Mario that stood out was that “business is slow at the beginning of hunting season.”  I pressed on this and asked him why he thought this was the case.  The answer was stunningly simple – the Colorado Division of Wildlife now sells hunting and fishing licenses on the Web.  Historically, if you wanted to hunt or fish in Colorado, you had to go to one of the Colorado license agents (e.g. Weekender Sports) and buy your license.  This resulted in lots of traffic to the store, especially from out-of-state visitors who were coming to Colorado for a hunting / fishing vacation and wouldn’t otherwise go to the store.  While you can still buy a license in the store, many people are opting to purchase theirs online since it’s better to have everything done in advance rather than have to scramble around on the first day of your trip.  Of course, the unintended consequence is that visitors from out of state don’t bother stopping in at the local sporting goods store to pick up their license, and – correspondingly – don’t buy the random extra hunting and fishing gear they forgot to bring with them.

Now – there’s plenty of ongoing discussion about e-commerce and the endless shift of purchasing from stores to the Web (Amy bought almost all of her Christmas presents on the Web this year.)  But – this example has an interesting effect.  Think of the aggregate amount of secondary in-store purchases that won’t get made because people can now get their fishing / hunting licenses on the Web.  While you might think this is not a big deal, it clearly has an impact on local merchants like Weekender Sports and is yet another e-commerce side effect that mainstream American businesses have to contend with.


When I woke up this morning, I decided that I wanted to see how hard it was to implement a map using the Google Map API (god only knows why I think of things like this when I wake up.)  I’m no longer much of a programmer, but I can hack around with the best of them, especially if someone else does all the hard work and all I need to do is play trial and error with some HTML, CSS, and Javascript.

I got my Google Maps API key and tossed up the “Hello, World” of Google Maps on my web site.  Pretty easy.  I then started trying to get the size, zoom, and center point set the way I wanted them.  Suddenly, I had to find the longitude / latitude points and – to make sure I was putting them in the right place – began digging through the Class Reference, which is well documented but doesn’t give much of a clue about the actual boundary parameters (e.g. I figured out that map.centerAndZoom(latLng, zoomLevel) was what I wanted and I could figure out the latLng, but I didn’t know how to determine zoomLevel without trial and error.)
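For reference, here’s roughly what that first pass looked like – a minimal sketch of the v1-era API as I remember it, where the key is a placeholder, the div id and size are arbitrary, and the coordinates / zoom level (roughly centered on Colorado) came from trial and error:

    <html>
      <head>
        <!-- v=1 was the Maps API version at the time; YOUR_API_KEY is a placeholder -->
        <script src="http://maps.google.com/maps?file=api&amp;v=1&amp;key=YOUR_API_KEY"
                type="text/javascript"></script>
      </head>
      <body>
        <!-- the div the map gets drawn into; size it with plain CSS -->
        <div id="map" style="width: 500px; height: 400px"></div>
        <script type="text/javascript">
          var map = new GMap(document.getElementById("map"));
          // GPoint takes (longitude, latitude); the zoom level took trial and error
          map.centerAndZoom(new GPoint(-105.5, 39.0), 9);
        </script>
      </body>
    </html>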

I found an easy latitude / longitude geocoder site and started monkeying around with the addresses of the five marathons that I have run.  I kept thinking there should be something that would generate the code for me as it was starting to get a little messy and it seemed like I was trying to create a pretty simple map – centered on Colorado, with five locations with some text associated with them.
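Before I found a better way, the hand-rolled version of the markers was shaping up something like this – just a sketch, assuming the v1-era GMarker / GEvent / openInfoWindowHtml calls, with made-up coordinates, placeholder text, and a little helper function (attachInfoWindow) of my own:

    // one entry per marathon: a GPoint (longitude, latitude) plus some HTML to show
    var marathons = [
      { point: new GPoint(-105.28, 40.01), html: "<b>Marathon #1</b> - placeholder text" },
      { point: new GPoint(-104.99, 39.74), html: "<b>Marathon #2</b> - placeholder text" }
      // ... three more of these
    ];

    // drop a marker on the map for each marathon and wire up its info window
    for (var i = 0; i < marathons.length; i++) {
      var marker = new GMarker(marathons[i].point);
      map.addOverlay(marker);
      attachInfoWindow(marker, marathons[i].html);
    }

    // a separate function so each click handler captures its own marker and text
    function attachInfoWindow(marker, html) {
      GEvent.addListener(marker, "click", function() {
        marker.openInfoWindowHtml(html);
      });
    }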

As I was looking for other geocoder options, I stumbled upon Map Builder which did everything I wanted (including embedding the geocoder in its UI.)  Ten minutes later, I had the source code for the map.  I had to muck around with it some to integrate it properly into my Marathon page on my web site, but it’s up, functional in Firefox, IE, and Safari, and in pretty good shape. 

After I run my next marathon, I’ll work on creating an XML file with the data and actually feeding both the map and the table from it (e.g. I’ll clean up the code from Map Builder and put an abstraction layer in place.)  In the meantime, I’ve satiated my need to play around with the Google Map API.
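When I get there, the XML-driven version will probably look something like this – again only a sketch under my own assumptions: a hypothetical marathons.xml file and element names, the GXmlHttp helper I believe the Maps API provides for cross-browser requests, and the same marker-plus-info-window approach as above:

    // marathons.xml would hold one <marathon lat="..." lng="..." html="..."/> element per race
    var request = GXmlHttp.create();
    request.open("GET", "/marathons.xml", true);
    request.onreadystatechange = function() {
      if (request.readyState == 4) {
        var races = request.responseXML.documentElement.getElementsByTagName("marathon");
        for (var i = 0; i < races.length; i++) {
          var lat = parseFloat(races[i].getAttribute("lat"));
          var lng = parseFloat(races[i].getAttribute("lng"));
          addMarathonMarker(new GPoint(lng, lat), races[i].getAttribute("html"));
        }
        // the same XML could also feed the table on the Marathon page
      }
    };
    request.send(null);

    // hypothetical helper: create a marker and open its info window on click
    function addMarathonMarker(point, html) {
      var marker = new GMarker(point);
      map.addOverlay(marker);
      GEvent.addListener(marker, "click", function() {
        marker.openInfoWindowHtml(html);
      });
    }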