2. GigaOm/Bitcurrent briefing on Cloud Computing (2008)
   The impact of latency on web performance (2009)
   Cloud Interop (2009)
   On the business of clouds (2010)
   Application Delivery Networks (2009)
   Complete Web Monitoring (2009)
   WebOps (2010)
4. Enough about me; what about you?
http://www.flickr.com/photos/jamesjordan/2751393381/
33. For much of its history, AT&T and its Bell System functioned as
a legally sanctioned, regulated monopoly.
The US accepted this principle, initially in a 1913 agreement
known as the Kingsbury Commitment.
An antitrust suit filed in 1949 led, in 1956, to a consent decree
whereby AT&T agreed to restrict its activities to the regulated
business of the national telephone system and government
work.
Changes in telecommunications led to a U.S. government
antitrust suit in 1974.
In 1982, AT&T agreed to divest itself of the wholly owned
Bell operating companies that provided local exchange service.
In 1984 the Bell System was dead. In its place was a new AT&T and seven
regional Bell operating companies (collectively, the RBOCs).
http://www.corp.att.com/history/history3.html
Hi! I’m Alistair, and I’m going to chat about the roadmap of clouds. Most of what you’ll see is the result of Bitcurrent’s research into clouds and some of the content presented at cloud and big data events.
Cloud computing is an approach to computing that’s more flexible and lets organizations focus on their core business by insulating them from much of the underlying IT work.
At its most basic, it’s computing as a utility – pay for what you need, when you need it, rather than paying for it all up front.
This is what Nicholas Carr talked about in his book The Big Switch.
But clouds can be confusing. Part of the reason is that they’re a big deal, which means everyone wants to be a part of them – even companies who have nothing to do with clouds.
There’s a lot of misinformation too, and the lines between clouds, virtualization, SaaS, and managed hosting are blurry at best.
I’m going to start by trying to put some definitions in place, so we can discuss the consequences of clouds. This may be remedial for some of you, but I’ll try to make it fun.
First, let’s talk about disruption.
Once, IT was a monopoly.
Today, it’s a free market. The line of business has tremendous choice in what it owns, runs, and uses.
The boardroom loves this: instead of managing machines, they manage services.
But enterprise IT doesn’t like it much, because it forces them to compete, and puts them side-by-side with organizations that spend their entire day doing detailed usage and billing.
It’s not all bad, though. There’s a lot to be learned from a transition from monopoly to a free market.
There were a couple of reasons IT was a monopoly for so long.
First, the machines were expensive. That meant they were a scarce resource, and someone had to control what we could do with them.
Second, they were complicated. It took a very strange sect of experts to understand them. AVIDAC, Argonne's first digital computer, began operation in January 1953. It was built by the Physics Division for $250,000. Pictured is pioneer Argonne computer scientist Jean F. Hall.
AVIDAC stands for "Argonne Version of the Institute's Digital Automatic Computer" and was based on the IAS architecture developed by John von Neumann.
This was also a result of scarcity. When computers and humans interact, they need to meet each other halfway. But it takes a lot of computing power to make something that’s easy to use;
in the early days of computing, humans were cheap and machines weren’t.
So we used punched cards,
and switches,
and esoteric programming languages like assembler.
Think about what a monopoly means.
A monopoly was once awarded for a big project beyond the scope of any one organization, but needed for the public good.
Sometimes, nobody wants the monopoly—like building the roads.
For the most part, governments have a monopoly on roadwork, because it’s something we need, but the benefits are hard to quantify or charge back for.
(IT’s been handed many of these thankless tasks over the years, and the business has never complained.)
The only time we can charge back for roads is when the resource is specific and billable: a toll highway, a bridge.
Sometimes, we form a company with a monopoly, or allow one to operate, in order to build something or allow an inventor to recoup investment. This is how we got the telephone system, or railways.
When monopolies are created with a specific purpose, that’s good. But when they start to stagnate and restrict competition, we break them apart.
In fact, there’s a lot of antitrust regulation that prevents companies from controlling too much of something because they can stifle innovation and charge whatever they want. That’s one of the things the DOJ does.
In other words, early on monopolies are good because they let us undertake hugely beneficial, but largely unbillable, tasks.
Later, however, they’re bad because they reduce the level of creativity and experimentation.
Today, computing is cheap. We can buy many times the compute power of the Apollo missions with a swipe of a credit card.
It’s also not complicated. Everyone can use a computer. Because today the computer is cheap and the human is expensive, we spend so much time on user interfaces, from GUIs to augmented reality to touchscreens to voice control to geopresence.
What used to take a long time to procure, configure, and deploy is now a mouseclick.
The way data centers are designed must reflect this shift from IT-as-a-monopoly to IT-as-an-enabler.
When you’re building something huge and expensive, you build what you want, and expect people to be grateful for it.
But today’s IT user is driving IT requirements.
They can shop around—choosing SaaS, clouds, and internal IT according to their business requirements. Amazon now has microinstances at $0.03 an hour!
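The utility math behind that shopping around can be sketched with simple arithmetic. A minimal back-of-the-envelope sketch in Python, using the $0.03-per-hour rate mentioned above and a hypothetical $3,000 server with an assumed three-year lifespan (both round numbers for illustration, not figures from the talk):

```python
# Break-even sketch: renting an on-demand cloud instance vs. buying a server.
# The $0.03/hour rate comes from the text; the $3,000 server price and
# three-year lifespan are hypothetical round numbers for illustration.

HOURLY_RATE = 0.03             # $/hour for a micro instance
SERVER_COST = 3000.0           # hypothetical upfront hardware cost
LIFESPAN_HOURS = 3 * 365 * 24  # assume the server lasts three years

# Hours of on-demand usage that cost as much as buying outright.
break_even_hours = SERVER_COST / HOURLY_RATE

# Fraction of the server's lifetime you'd need to run the instance
# before owning becomes cheaper than renting.
utilization_needed = break_even_hours / LIFESPAN_HOURS

print(f"Break-even at {break_even_hours:,.0f} hours of usage")
print(f"That's {utilization_needed:.0%} of a three-year lifespan")
```

At these illustrative prices, on-demand usage never reaches the cost of owning within the hardware's lifetime, which is the heart of the pay-for-what-you-use argument. A real comparison would also weigh power, staffing, and the micro instance's much smaller capacity, so treat this strictly as a sketch.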
They’re increasingly able to build the applications themselves, but expect IT to deliver smooth, fast platforms on which to experiment.
It’s an inversion of the traditional IT “pyramid”, where the hardware dictates the platforms, which in turn dictate the apps, which dictate what users can do.
Today, what users want to do drives the apps they use, which drives the platforms and the hardware.
In 2008, the word of the year was Overshare.
(and BTW, in 2009 it was Unfriend)
They’re focusing on adoption rates, trying to understand which IT services are being used and which are neglected
As a species, we’re sharing more content, more ways, than ever before.
Think about it: We’re only a generation away from the steno pool
and the mimeograph.
Okay, maybe two generations.
But we’re an entirely different species.
We communicate with one another instantaneously.
We favor “phone a friend” crowdsourcing over rote knowledge.
Public disclosure has replaced privacy.
Think about it. It’s only been 6000 years that we’ve been communicating and recording information as a species. If you added up the years of experience of everyone in this room, that’s more than all the years that humans have been writing things down.
Throughout history, we’ve invented things as we need them, in response to environmental pressures.
Irrigation moved us from hunting to farming, giving us time to research, teach, and learn.
Aqueducts allowed us to live in cities, finding shelter and controlling the spread of disease.
Literacy and the printing press democratized knowledge, replacing myth with recorded facts everyone had access to.
Steam power meant land ownership gave way to the industrial era.
Electricity mechanized the household.
And now computing and the web have created a universal, real-time, flat world of information built for the human crowd.
In many ways, crowds and clouds are inextricably linked.
The online crowd is experimenting. It’s trying new things—new ways of working, playing, and interacting. Cloud computing is the sandbox in which it plays.
The crowd creates content—photos, videos, blogs, mails—at an unprecedented rate. Clouds let us track, analyze, and make sense of it all.
The crowd is whimsical, spontaneous. Clouds offer the elasticity needed to handle bursts of curiosity and unpredictable usage patterns.
Clouds are the fuel for augmented reality, giving the crowd superpowers: remote sight, omniscience, flawless navigation, and more.
Most importantly, clouds let the computer recede into the background, delivering on the promise of ubiquitous computing made decades ago.
The technology we talk about this week might seem very real and tangible, but make no mistake: it’s the raw material for science fiction and comic books. On top of pervasive, connected, elastic computing, we’re redefining what it means to be human.
Clouds are the grey matter of Human 2.0.
We’ve already got central nervous systems.
It’s our job to build Humanity’s distributed nervous system—with all of the promise and peril, all the risk and reward that it entails.
That’s what we’re creating today. It might not look like it at the moment, but we’re building the grey matter for the next iteration of human consciousness.
Cable operators once controlled our content. The set-top box, tied to a PVR, freed us from the tyranny of the O’Clock. We could watch things whenever we wanted.
Today, each of us has several screens.
And the cost of getting information to those screens is negligible. So slim, in fact, that anyone with a $100 camera can get a 10-minute video to the entire planet for free, which viewers can watch any time they want.
That means the death of one-size-fits-all, watch-when-I-tell-you-to television, and with it the end of traditional broadcast networks.
But the fact remains that for most of us, it will make no economic sense to own something we don’t use 90 percent of the time, particularly when that thing is the second most expensive item we’ll buy (after a house).
Augmented reality will be the biggest change to human consciousness of the next decade. We’ll see the world around us as just the "default layer" or "layer zero." We’re adding layers for transportation, friends, restaurants, historical information -- whatever you like. We’ll take them for granted, and when they stop working, it’ll feel like a stroke.
Science is how humans try to perceive things better.
We try to understand millennia, life on the head of a pin, galaxies.
We try to understand patterns that move too slowly—
like global warming,
or the spread of a conflict,
or the dissemination of a disease.
Technology, applied properly, helps us to perceive changes. It shows us patterns we can’t see; lets us discern shifts beyond our own senses.
All of these are about delaying gratification—not doing what’s immediately obvious, and recognizing the bigger picture.
The reason we don’t think of personal transportation and broadcast TV as ephemeral is that we’re slaves to our own perception. As humans, we have a hard time looking at the big picture with our own senses.
We see only a narrow range of the visible spectrum.
We hear a small amount of the audible spectrum.
We perceive distances from a kilometer to a millimeter.