The Internet Is Broken

I still read a handful of print magazines (you gotta do something in the bathroom) – one of my favorites is Technology Review (MIT’s magazine).  This month’s cover story, The Internet Is Broken, is a fascinating (and probably important) article about the cost of the Internet’s basic flaws, which result in the need for a “clean-slate approach” advocated by MIT’s David Clark (an Internet old-timer and chief protocol architect from 1981 to 1989).

Clark identifies four basic elements he’d like to see designed into the “new Internet architecture”:

  1. Security: The Internet should authenticate the people and computers you communicate with and keep spam and hazards like viruses from ever reaching your PC.
  2. Mobility: Assigning Internet Protocol addresses to small and mobile computing devices such as sensors, phones, and embedded processors in cars would allow them to connect to the network securely.
  3. Protocols: Better traffic routing agreements between Internet service providers would allow them to collaborate on advanced services without compromising their businesses.
  4. Instrumentation: All pieces of the network should have the ability to detect and report emerging problems – whether technical breakdowns, traffic jams, or replicating worms – to network administrators.

The article is bound to be controversial, but covers a lot of ground, including discussing a proposed $300 million effort from the NSF to create a new Internet infrastructure.

  • Chip

    I agree with the premise that the Internet is broken, especially from the security perspective. My question is: how does a public network authenticate its users without invading their privacy? In other words, how do I trust you without knowing much, if anything, about you – and without being able to ask for information about you?

    It would be interesting to see some of the original Internet transit service agreements enforced. Simply stated, they said that if malicious activity came from your subnet, you were cut off from Internet access. Imagine cutting off the AOL or Comcast domain for spam. Would this help fix the problem?

    This is a Gordian knot that I look forward to seeing broken.

  • This smells like an argument for moving intelligence further into the network core.

    Traditionally it has been the strategy of innovators to innovate at the edge; it’s tough to sell core infrastructure changes to established owners of infrastructure like telcos. These guys don’t tend to buy just anything, especially not all the bells-and-whistles if it risks impacting their stability and uptimes.

    I agree that more intelligence in the core would be nice in theory, though. Consider, for example, the extremely difficult task of mitigating distributed denial of service attacks. Doing this at the edge is too little, too late. Filtering the attacks as close to the source as possible (the ideal scenario) involves adding detection and filtering intelligence into or closer to the core. In practice, it’s extremely challenging to implement intelligent devices that do deep packet inspection when the traffic they have to handle is very high-volume (which is the case for core devices, but not for edge devices). This is why it’s much simpler to argue that this stuff is needed than it is to actually implement and deploy it (at considerable cost).
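    The per-source detection idea the comment alludes to can be sketched in a few lines. This is a toy illustration only – real core routers do this at line rate in hardware, and the class name, window size, and threshold here are all hypothetical:

    ```python
    from collections import defaultdict, deque

    class RateAnomalyDetector:
        """Toy sliding-window packet-rate monitor (illustrative sketch).

        Counts recent packets per source address and flags any source
        whose rate exceeds a threshold. Real DDoS mitigation in the core
        is far more involved; this just shows the counting idea.
        """

        def __init__(self, window_seconds=10, threshold_pps=100):
            self.window = window_seconds
            self.threshold = threshold_pps
            self.events = defaultdict(deque)  # src_ip -> recent timestamps

        def observe(self, src_ip, timestamp):
            """Record one packet; return True if src_ip now looks anomalous."""
            q = self.events[src_ip]
            q.append(timestamp)
            # Evict timestamps that have fallen out of the sliding window.
            while q and q[0] <= timestamp - self.window:
                q.popleft()
            rate = len(q) / self.window  # packets per second over the window
            return rate > self.threshold
    ```

    Note that even this trivial heuristic keeps per-source state – exactly the kind of memory and processing burden that is cheap at the edge but prohibitive at core line rates, which is the commenter’s point.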

  • I tend to worry that any reengineering of the Internet at this point will be used to control the ability of “regular people” to distribute content. Traditional media sources are very much afraid of the blogosphere and would love to shut it down, and they have the money to influence any redesign of the Internet such that the open publishing model that we have grown to both love and hate would be sharply curtailed under the “new model”. At least, there would be very powerful forces pushing very hard for such a result.

  • I wouldn’t really go so far as to say that traditional media _sources_ have the worst end of the stick. After all, they’re providing content, and if it’s good content, they’ll be OK.

    Probably one of the most important challenges belongs to the big TelCo. Or the big cable company. Sometimes, the cable company *is* the TelCo, and vice-versa. These guys are the traditional media distributors. What makes the Internet/Web/Blogosphere particularly different from traditional media mediums (oh God, pardon the repetition) is that unlike TV, for example, the content you have access to is completely controlled by the content provider. This means that FOX can put up whatever it wants to give out to the public on their website and they’re no longer dealing with the cable company to make it both available and _visible_ to the public. Instead, they’re dealing with whoever is hosting them and/or providing their bandwidth to make them _available_, and with whoever is sourcing most of the traffic to their site (e.g., Google, Yahoo!, etc.) to make them _visible_.

    You might think I’m arguing semantics here; not so. For the web, Google provides a large chunk of the value that the cable company provides for more traditional media (TV), even though it doesn’t own all the infrastructure. This is because Google enables people to find content (the conceptual equivalent for the cable company is the “TV channel”). By the way, when I say “Google” here, it might not be Google – it could be Yahoo! or whoever.

    Currently, the only value that the TelCo/cable provider/infrastructure owner provides to the consumer is the last mile – the connection to his or her home. Not for long, though: in comes Google with free WiFi.

    And that’s why traditional media distributors have to worry. Because suddenly, not only do they have the least of the advertising minutes/real-estate on all the infrastructure they actually own, but they’re in real danger of eventually no longer being able to capitalize on the consumer.

    The reality of all this – to relate back to Brad’s blog post – is that most of the intelligence/innovation/money is at the edge. The advertising revenue model doesn’t apply to the core, where the technology is not only unaware of (and unconcerned with) the nature of the content it facilitates, but would be difficult to make aware of it.