Morning Keynote with Nick Carr
Morning! It’s time for the big Nick Carr keynote. Ready? Me too!
[Okay, we are rocking out to some trendy rock group that I am sure I would know if Susan’s crankiness wasn’t rubbing off on me. Oh, it’s Green Day, I think? Maybe. I think it’s Green Day. I think that’s what Li Evans just said to my left.
This keynote is scheduled to start at 9am, but it’s 8:59am and the room is half empty and the stage is bare. I’m not sure what’s going on. I guess I’ll wait.]
And we’re off!
Nick jokes that it’s good to have the first slot in the morning after the most alcoholic holiday of the year. Hee.
He thinks we’re at a major turning point in the history of computing and the way we use computers. It will have a huge set of ramifications for the way we think about things – from media to society and for culture as well. Computer systems increasingly are at the center of the way we do business, the way we find media, the way we find entertainment, the way we communicate. A fundamental change in the technology will have enormous ripple effects. These changes aren’t just technology-related; they’re also the economic changes that will be set off by the new technologies. After all, it’s economics, not technology that affects how companies behave and affects what we do.
If you look at your own behavior over the last 5 years, you can already see that we’ve changed the way we think about computers. It used to be you’d have to buy a piece of software to do something new with your computer. Increasingly that’s becoming foreign to us. Now we just fire up our Web browser, go out into to the Internet, find the software we want to use and bypass the process of buying it and installing software on our computer. It’s the whole Web 2.0 movement.
So far, businesses have kept themselves at a distance from the Web 2.0 phenomenon. They’ve looked at it as something for geeks or people in their homes. But if you look at what’s going on you can see some parallels to what happened 25 years ago when we got the first personal computer. [I’m as old as the first personal computer?] At first it was kept out of the corporate world and then people realized it was hugely valuable. It forced corporations to adapt to this new technology. It didn’t only mean that companies bought lots of PCs; it meant that they restructured their way of thinking about information technology.
He says the change we’re starting into today is even more fundamental. You have to go back even further in time to really understand it. In fact, he’d argue the best place to start is back in 1851, when a man in Troy, NY built a magnificent machine: a water wheel, the biggest and most powerful one of its time. This man knew that every factory owner also had to generate the power to run his machines. Power generation was something every company had to do on its own. They had to set up their own department, invest in their own equipment, etc. He knew he could get a competitive advantage if he could figure out how to dominate the power supply. What’s telling is not the picture of the water wheel, but what you would have seen if you went back to that site 50 or so years later. You would have seen that the wheel had been abandoned, left to collapse and overgrown with trees and weeds.
Suddenly companies didn’t have to generate their own power thanks to technology innovations culminating in the creation of the alternating current generating system and the electric grid. Now companies could plug into a new network and get all the power they needed. As a result, they no longer had to run their own water wheels and they could abandon them and the time that went into them.
When you think about that and you put yourself into their shoes, it must have seemed like an incredible leap of faith. Something that you had always had to do yourself, and now you are trusting it to an outside utility? Still, they did.
The old data centers are like the old water wheels.
In 1910, only 40 percent of the electricity generated in the US was supplied by utilities. Just 20 years later, the utility supply of power had risen to 80 percent. That change would continue at that radical pace until it reached about 95 percent utility supply to 5 percent private supply. A major change swept through industry as soon as there was an economically more efficient means of distributing a crucial technology resource. And this was only the start of the revolution that would be the electrification of industry and society. The big news was that as soon as you moved to utility supply, you could drive down the cost of energy dramatically. What we saw was an explosion of innovation at the socket. It started with simple machines, but then business people started to realize that cheap power would allow them to completely change the way they manufactured their own products.
Computing is the next great important technology that is going to go through a very similar change – moving from private supply to a central utility supply model. We’ll plug into a shared grid to get all the processing power we need. Information technology and electricity are the only two technologies that can be supplied over a grid.
If you look through the history of computing, you see that it follows a very similar pattern to power generation. The pattern of computing was established in the early years of the last century. Even as companies were shutting down their private power generation departments, they were building up the IT departments that would become the core of the economy. As technology advanced, the power and applications of computers grew enormously, and we had the next great stage of computing in the 50s with the introduction of the mainframe into corporations.
The mainframe was incredibly efficient, very clean, neat, etc. The mainframe computer would operate at about 80-90 percent of its capacity. But it had a huge disadvantage – it was very impersonal. It could only be used for big institutional jobs. The average user couldn’t apply it to his own personal work.
As soon as we had the introduction of the PC, everything changed and computing looked different.
Every company has to build a datacenter that has the same machinery and mostly the same software as all of its competitors. It’s incredibly inefficient. Then they have to maintain it. But this is the only way that we’ve known how to deploy computing on a large scale. If you look at some of the stats, you see the incredible inefficiency of this fragmented model of computing. HP found that 80 percent of the average server’s capacity goes to waste and only 20 percent is actually applied. Seventy percent of IT labor goes to keeping the machines running. It doesn’t provide you with any business advantage; it’s just the cost of doing business. If you look more broadly, you realize that we’re all paying a big tax for this inefficient mode of computing.
If there were a more efficient way of deploying these resources, we could dramatically reduce the capital that goes into computing and IT and release it throughout the economy. Companies, instead of having to invest in peripheral activities, could invest that time and money in their main business. That’s where we are today.
Nick shows a picture of a Google datacenter in Oregon and says that animal sacrifices are going on in there. Hee. When did all these SES speakers become funny? Lisa likes.
Nick says that there’s a massive build out of the new computing grid that is going to transform computing.
Why is this happening now?
A number of things have changed at a tech level that make cloud computing unavoidable.
Moore’s Law: The power of computers for a given price is going to double about every 18 months. That’s played out throughout the last few decades. It underpins the explosion of computers. It has also meant that computing power has gotten cheap enough that we can move from having physical machines to replicating those physical machines in software, which is the trend called virtualization. This allows much of the build out of the utility computing system. You can get cheap generic servers and employ virtualization.
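As a rough sketch of the arithmetic behind the Moore’s Law claim Carr cites – computing power per dollar doubling about every 18 months – here’s what that compounding looks like (the 18-month figure is the one from the talk; the starting value and function names are just illustrative):

```python
def power_per_dollar(years, doubling_months=18, start=1.0):
    """Relative computing power per dollar after `years` years,
    assuming it doubles every `doubling_months` months."""
    doublings = (years * 12) / doubling_months
    return start * 2 ** doublings

# Over a single decade, price-performance grows by 2**(120/18),
# i.e. roughly a hundredfold.
print(round(power_per_dollar(10)))  # → 102
```

That hundredfold-per-decade compounding is what makes computing cheap enough to virtualize and centralize, which is the point Carr is building toward.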
You also have to have an efficient means of distributing that power. You have to have the computer grid.
Grove’s Law: The capacity of network communications doubles only every century. What Grove meant was a dig at the telephone companies, who he thought were dragging their feet and holding back the computer revolution. To tap into the latest, most efficient computing power, you’ve had to run your machines locally. With the build out of broadband Internet, suddenly the capacity of the network has begun to catch up with the power of the computer. Grove’s Law has been repealed, and once network capacity catches up to computing power, you can deploy very sophisticated services centrally over this rich new grid. This is something Eric Schmidt predicted back in 1993 when he was working at Sun Microsystems. This is what we’re seeing today. The network is becoming the computer and the data center.
This isn’t only some theoretical occurrence. We can see massive investments among IT providers to businesses. The IT industry is acknowledging that a massive change is on the horizon.
What does this mean?
Just like the electrical socket brought a wave of innovation, we’re going to see similar effects come out of the utility model of computing. It dramatically reduces the cost of computing and increases its availability. Pretty soon the network will be available everywhere. This is going to require a rethinking and a new wave of innovation that will affect all areas of commerce. One of the first areas is in rethinking corporate IT. It’s important to recognize the importance of this trend. As big companies move into the cloud, it will lead to new economies. Corporations are going to move their data centers into the cloud and create virtual networks out on the grid. Suddenly, all this physical machinery and the labor required to maintain it becomes software. It moves out onto the utility grid. Anyone who thinks about how software applications are deployed is going to have to think about the interfaces.
The idea of cloud computing is that it’s built on the assumption that data should be shared, not isolated. It’s not about creating control anymore or hiding things behind your own firewall. Business is built on sharing. The real value comes when you can choose people to share information with and do it easily. What we have now is the ability to do this. He shows how Facebook users can customize their data streams and information flows. It’s just about moving levers up.
For many of us, this is becoming the natural way and the way computing should be delivered. It should put the users in control and work on the assumption that we can share information easily. It’s a radical departure from the systems we’re using today.
We’re going to start to see the ramifications begin to spread out and influence society and culture, the way people think and the way they live.
We’re seeing a blurring of the line between the consumer software business and the media business. They’re taking on each other’s characteristics. We can see it in new services being supplied over the net, like Mint – free versions of programs modeled on Quicken and others. That means you have to have an indirect means of supporting these applications – advertising. Essentially, the media model is being applied to software. The success of consumer software is no longer being measured by the number of units sold; it’s measured by the size of the audience, how much of their attention you can hold, and your ability to monetize it. This is going to be a massive challenge for businesses, but will also open up big new opportunities for companies who can reengineer.
In turn, the media business is taking on many characteristics of the software business. There are big opportunities for traditional technology and software companies who can move into media, leverage their software scale, and adapt to a new model of doing business.
We’re also going to see continued consolidation of control of online business. We have this idea that the Web democratizes media and provides new opportunities to individuals. But between 2001 and 2006, even as we saw an explosion in the number of domains on the Web, traffic consolidated greatly. This trend is going to continue, pushed by economics that rewards aggregated traffic with huge profits.
On the more scary side, as soon as you begin to translate businesses into software, you have the phenomenon of the workerless company. You can run very large scale operations, serving millions of customers with very, very few employees, because you’re running them on cheap infrastructure. Look at Skype. Back when it got purchased by eBay, it was serving the same number of customers as British Telecom, but employing just a fraction of the people. Same goes for YouTube and Craigslist.
As this computing power is deployed we’re going to see a greater degree of personalization of information and media. On the one hand, that’s great. On the other, it raises concerns about what happens to society and culture as we begin to automatically receive highly customized information from the media that is based on preexisting prejudices. Creates polarization. We should be concerned about this. There’s an assumption that Internet is leading to greater social harmony, but there’s also that polarization effect going on.
Consumers are prey. This challenges user privacy and free will. People’s entire lives are laid out in the search terms they use and where they go on the Web. Everywhere people go on the net today, there are companies looking over their shoulder. This phenomenon is only going to increase in the future as data mining techniques continue to expand and accelerate.
Does the World Wide Computer liberate us or control us?
As we have centralization of data, essentially the controlling aspect of computers will be applied to people without their knowledge. We’re going to see the struggle between control and liberation.
Well, that’s a scary way to end things…