The Past and Future of Open Computing

My keynote from the Open Compute Platform Summit in Santa Clara, CA on January 16, 2013. I talk about the influence of open source on the history of computing, starting with von Neumann, and ending with a vision of the "Internet Operating System" behind modern applications, and the question of who will control that operating system's software and hardware.







Usage Rights

CC Attribution-NonCommercial License


    Presentation Transcript

    • Compute Summit, January 16-17, 2013, Santa Clara
    • Compute Summit: The Architecture of Participation. Lessons from open source software for open source hardware. +Tim O’Reilly @timoreilly, O’Reilly Media
    • “History doesn’t repeat itself, but it does rhyme.” -Mark Twain
      I’m going to look both backwards and forwards in this talk. Before talking about the future, it’s worthwhile to reflect on the past, to see what lessons we can take from it. As Mark Twain said, “History...”
    • The computing universe that we take for granted began with a profound act of open source hardware. As George Dyson explains in his book Turing’s Cathedral, John von Neumann and his colleagues at the Institute for Advanced Study put the fundamental architecture of stored-program computers, the ancestral architecture reflected in all computers we use today, into the public domain, declining to seek any patents. This was an act of radical idealism. But it wasn’t “political” in the sense that “free software” came to be seen in the 1980s. It was really all about the sense that this technology was too important, too fundamental for one organization to try to wring proprietary advantage out of it. For computing to reach its full, world-changing potential, it had to be available for everyone to build on. That’s the spirit with which the Open Compute Project also operates.
    • “What we’re selling to users of open source is control.” - Michael Tiemann, Red Hat
      Michael Tiemann echoed this sense of the importance of the users of technology being in charge of their own destiny when he said “What we’re selling...” (This is probably not an exact quote, just a memory of a conversation we had the better part of a dozen years ago.)
    • Why Fidelity and Goldman Sachs Care About Open Compute
      “In the past we needed to simplify and reduce motherboards of unnecessary proprietary components, open up and simplify management software, maximize hands-free management software, and so on, in order to make them work efficiently for us. This behavior was similar to Facebook’s early days’ server OEM experiences. Despite our numerous attempts in the past to influence design, none of the server providers listened to our needs. Although not an ideal design, we maximized power efficiency and automated system management as best we could.”
      “What jumped out at us last summer at the OCP summit was that for the first time the non-hyperscale world could access many of the same design points, ODMs, simplifications and ‘freedom of choice’ advantages indigenous to the hyperscale Web 2.0 world. This open access led to an evolution in thinking.” - Peter Krey, in A Concise History of AMD’s Roadrunner Server for OCP, http://www.opencompute.org/2012/06/25/a-concise-history-of-amds-roadrunner-server-for-ocp/
      It’s this sense of users working together to advance the state of the art that comes through in the OCP blog post explaining why Fidelity Investments and Goldman Sachs came to work together on the AMD Roadrunner server for OCP. “Despite our numerous attempts...”
    • This is also the underlying spirit of Richard Stallman’s Free Software Definition, the first overtly political statement of user rights, from the early 1980s. Unfortunately, the Free Software Foundation brought in a lot of stridency to the discussion, and what, to my mind, became an excessive focus on legal means - licenses - as the heart of the free software story.
    • “Given enough eyeballs, all bugs are shallow” --Eric Raymond
      The open source movement emerged in 1998 with a more pragmatic approach, selling openness as a benefit, providing better software development practices through community.
    • But there was still an unfortunate focus on licensing as the heart of the open source movement. Open source was ultimately defined by a set of approved licenses. We need to get away from the narrative that makes us focus on licensing. The most important things are system architecture, community, and tools and practices for actually sharing our work. Licenses are just a way of making sure that bad actors don’t ruin the party.
    • I like how the OCP seems pretty clear about this. The licenses are simple, focused mainly on restricting patent assertions, and the emphasis is on providing specifications and implementation details. You know that it’s about working designs, and about community.
    • The core advances of open source software, in my opinion, have always come from people who are more pragmatic. I was around in the early days of Unix, and what drove code sharing wasn’t radical idealism or licenses. The early Unix code wasn’t shared under a license that would have qualified as open source, but it was open enough. Early versions of Unix were developed collaboratively by hundreds of developers at a loose network of institutions, most notably AT&T, where it started, and UC Berkeley. That community collaboration is a big part of what OCP is trying to recreate. The future of OCP depends on you. It is your contributions that will push it forward. Own it!
    • “No matter who you are, most of the smart people work for someone else.” - Bill Joy
      The importance of open source in enabling a distributed community is summed up in what has been referred to as Joy’s Law: “No matter...” That’s what leads to so much innovation. Open source is fabulous for innovation.
    • “Richard Stallman talks about the evil of copyright, and says we need copyleft to fix it. At Berkeley, we just say ‘Go down to Copy Central and copy it.’” --Kirk McKusick, head of the Berkeley Unix project
      But there was another element to the early spread of open source. Unix was the first operating system that became divorced from the underlying hardware. It ran on many different machines with very different architectures. Code from one machine couldn’t simply be run on another; it had to be recompiled. With a standardized hardware architecture, PC software could be distributed in binary. Unix *had* to be distributed in source form, because that was the only way to get the software to run. All of us spent time “porting” programs we’d received to account for either differences in hardware architecture or differences between various implementations of the operating system. In addition, because it was initially a non-commercial operating system, software was shared freely. Unix was developed collaboratively by hundreds of developers across many organizations. In an odd way, open source was a response to the problem of incompatibility.
    • We know all about incompatibility in the hardware world. My friend Nat Torkington once said that there must be a special circle of Dante’s hell reserved for the makers of incompatible power supplies for consumer devices. In his infernal vision, the manufacturers of such devices were all condemned to a hell of perpetual sexual arousal combined with incompatible sexual organs. I’m sure that those of you in the data center world know people who belong in this same hell. But I thought this image of one of Dante’s circles of hell recreated in Lego tells another story about architecture. You can make anything out of Lego, because the pieces are designed to fit together.
    • “The book is perhaps most valuable for its exposition of the Unix philosophy of small cooperating tools with standardized inputs and outputs, a philosophy that also shaped the end-to-end philosophy of the Internet. It is this philosophy, and the architecture based on it, that has allowed open source projects to be assembled into larger systems such as Linux, without explicit coordination between developers.”
      This is also what was so powerful about Unix, the system that Linux emulated. It wasn’t itself open source by today’s standards of licensing, but it had an architecture that allowed it to be developed collaboratively by a community of loosely connected developers. It was the architecture that mattered. In writing an entry for this classic book on Wikipedia, I wrote... I believe this philosophy of interoperable components is also at the heart of the OCP vision.
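The “small cooperating tools with standardized inputs and outputs” philosophy can be sketched in a few lines. This word-counting pipeline is an editorial illustration, not from the talk: each stage has one job, and the stages compose because they share a common interface, mirroring the classic `tr | sort | uniq -c | sort -rn` shell pipeline.

```python
# Unix "pipes and filters" style in miniature: each stage is a tiny
# filter with a single responsibility, and they compose because they
# all speak the same interface (an iterable of lines/tokens).
from collections import Counter

def words(lines):
    """tr-like stage: split each input line into individual words."""
    for line in lines:
        yield from line.split()

def count(tokens):
    """sort | uniq -c | sort -rn stage: count tokens, most common first."""
    return Counter(tokens).most_common()

text = ["to be or not to be"]
print(count(words(text)))  # each stage is independently reusable
```

The point is not the word counting itself but that neither stage knows about the other; any filter with the same line-in/tokens-out shape could be slotted in between.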
    • “The architecture of participation”
      “I couldn’t have written a new kernel for Windows even if I had access to the source code. The architecture just didn’t support it.” -Linus Torvalds
      I heard another striking assertion about the importance of architecture fifteen years or so ago in a conversation with Linus Torvalds. He observed... That term “architecture” stuck in my head, and I realized how true it was of all the most successful open source projects - that it was far more than a matter of just releasing source code. It was designing systems in such a way that someone could bite off a manageable chunk and modify, replace, or extend it. I call this “the architecture of participation.” Some systems are designed for participation; others are not.
    • The internet would not exist without open source software
      Here’s an even stronger assertion: “The internet...” And that’s not just because the initial implementations of TCP/IP and related tools like the DNS came out of Berkeley Unix and were open source. It’s not just because the services we all take for granted are built on top of an open source foundation. It’s because the very architecture of the internet and the web is shaped by open source.
    • Tim Berners-Lee put the web into the public domain, and that was a profound act of open source software. But the software that Tim wrote is long gone, subsumed by other software that built on the architecture, communication protocols, and markup language that he designed. An even deeper contribution was the fundamental architecture of the web, which allowed anyone to put up a site without permission from anyone: all they had to do was speak the same language and communication protocol.
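That permissionless architecture can be made concrete with a minimal sketch (an editorial illustration using Python's standard library, not anything from the talk): anyone who speaks HTTP can put up a site, and any client that speaks the same protocol can fetch it, with no central authority involved.

```python
# Minimal demonstration of "speak the same protocol and you're on the
# web": a tiny HTTP server on an ephemeral port, and a client fetching
# the page back over plain HTTP. No registration, no permission needed.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body><h1>Hello, web</h1></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), Hello)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
page = urllib.request.urlopen(
    f"http://127.0.0.1:{server.server_address[1]}/").read()
server.shutdown()
```

The server and client share nothing but the protocol, which is exactly the property that let the web grow without anyone's permission.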
    • You also see this architectural element in the success of the Apache web server. I remember back in the mid 90s, when there was this media hysteria that Apache wasn’t keeping up, because it wasn’t adding features as fast as Netscape’s web server or Microsoft IIS. The folks at Apache were clear: We’re an HTTP server. We have an extension layer (read “we are a platform”) that allows other people to add new features. Fifteen years later, Apache is still the dominant web server, and Netscape and IIS are footnotes in history.
    • Work on stuff that matters
      Moving on to another topic: I’ve made a practice for the past half-dozen years of asking the tech industry to work on stuff that matters. The Open Compute Project matters, and I want to give you some forward-looking perspective on just why I think it does.
    • Opportunities for innovation; bring computing to people at the lowest cost and widest distribution; minimize environmental impact; improved upon by anyone
      I want to start with the mission statement for OCP. I’ve pulled out some key phrases. This is idealism of the kind expressed by von Neumann and his colleagues at the Institute for Advanced Study.
    • Why does this matter so much?
      The traditional wisdom was always that there weren’t that many companies of Google or Facebook scale. We now know better.
    • What we’re really engaged in is building a platform for a global internet operating system. Back in 2002, I ran a conference entitled Building the Internet Operating System. In his keynote at that conference, Clay Shirky told a thought-provoking story. He remarked on the assertion by IBM CEO Thomas Watson Jr. that he saw no need for more than five computers in the world. “We now know that he was wrong,” said Clay. The audience nodded, thinking of the millions of PCs in the world. Today, it’s billions of smartphones. But then Clay delivered his devastating punch line: “We now know that he overstated the number by four.” We are moving towards a world that can be thought of as one global computer. The big battle in computing is about who will control the operating system for that computer.
    • With the rise of applications like Facebook, which reach a billion people, you can see why this matters. Before we know it, there will be applications with many billions of users.
    • And of course, the smartphone is really just a portal to network services.
    • What happens when the kind of collective intelligence applications of the web are driven by sensors rather than people typing on keyboards?
      But the big question I’ve been asking myself for the past half dozen years is this: “What happens...”
    • The Google Autonomous Vehicle
      We see this in unexpected places, such as the Google autonomous vehicle. This car is thought-provoking on a number of levels.
    • 2005: Seven Miles in Seven Hours
      You see, back in 2005, the car that won the DARPA Grand Challenge went seven miles in seven hours.
    • AI plus the recorded memory of augmented humans
      What was the difference? It turns out that the autonomous vehicle is made possible by Google Streetview. Google had human drivers drive all those streets in cars that were taking pictures, and making very precise measurements of distances to everything. The autonomous vehicle is actually remembering the route that was driven by human drivers at some previous time. That “memory”, as recorded by the car’s electronic sensors, is stored in the cloud, and helps guide the car. As Peter Norvig of Google pointed out to me, “picking a traffic light out of the field of view of a video camera is a hard AI problem. Figuring out if it’s red or green when you already know it’s there is trivial.” Effectively, the Google autonomous vehicle is part of a cloud-based system reliant on what I’ve called “the global brain.”
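Norvig's point about priors can be illustrated with a toy sketch. This is entirely hypothetical editorial illustration, not Google's actual system: once a prior map tells you *where* the light sits in the camera frame, classifying its state can reduce to a trivial color comparison over that known region.

```python
# Toy illustration of "hard AI problem vs. trivial lookup": finding a
# traffic light anywhere in an image is hard, but classifying a light
# whose bounding box is already known from a prior map is a simple
# red-vs-green intensity comparison. All names here are invented.
def light_state(frame, region):
    """frame: 2D grid of (r, g, b) pixel tuples.
    region: (x0, y0, x1, y1) bounding box supplied by the prior map."""
    x0, y0, x1, y1 = region
    pixels = [frame[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    avg_red = sum(p[0] for p in pixels) / len(pixels)
    avg_green = sum(p[1] for p in pixels) / len(pixels)
    return "red" if avg_red > avg_green else "green"

# A synthetic 4x4 "camera frame" that is uniformly red in the mapped region.
red_frame = [[(200, 10, 0)] * 4 for _ in range(4)]
print(light_state(red_frame, (0, 0, 4, 4)))
```

A real perception stack is vastly more involved; the sketch only shows how a strong prior converts an open-ended search problem into a bounded classification.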
    • You can see this same “data center behind apps in everyday life” in applications like Square. Square is revolutionizing the retail experience for small merchants. I don’t know how many of you have tried the combination of Square Register and the Square wallet app. It automatically checks you in when you walk into a participating merchant. Your name and face appear on the register, and since your payment details are already on file, all the retail clerk has to do is confirm your identity, as shown in this screen shot.
    • We can also see this in the Apple Store. If you squint a little, you can see the Apple Store clerk as a cyborg. Where most stores (at least in America) have used technology to eliminate salespeople, Apple has used it to augment them. Each store is flooded with smartphone-wielding salespeople who are able to help customers with everything from technical questions to purchase and checkout. Walgreens is experimenting with a similar approach in the pharmacy, and US CTO Todd Park foresees a future in which health workers will be part of a feedback loop including sensors to track patient data coupled with systems that alert them when a patient needs to be checked up on. The augmented home health worker will allow relatively unskilled workers to be empowered with the much deeper knowledge held in the cloud.
    • Or consider how a taxi service like Uber creates a system connecting passenger and driver, with a data-center app providing the dispatching, coordination, billing, and reputation system that ties it all together.
    • They said that CES was “the break out year for the Internet of Things.” While much of it may be hype, we do see a lot of activity around ideas like the connected car, smart homes, and the quantified self, all consisting of sensor-driven apps with big data back ends.
    • Perhaps the most striking development on the Internet of Things front is what GE is calling “the Industrial Internet.” I spoke at GE’s event in San Francisco a few months ago. Jet engines equipped with sensors are putting out a terabyte of data a day. Analysis of that data can improve fuel efficiency, predict when parts are breaking down and require service, and much more. Again, devices are being woven into something greater, and there’s a data center somewhere in the background.
    • I really started thinking about the operational implications of the internet as operating system back in 2006. I wrote a blog post called Operations: The New Secret Sauce, which made the assertion that in the future, operations - what’s now called “devops” in particular - would become a key competency not just for internet companies but also for the enterprise.
    • That prediction led a group of operations professionals to ask me to launch “a gathering place for their tribe.” That became our Velocity Conference, which focuses on web performance and operations, and increasingly, the back end for the internet of everything. What you do with the Open Compute Platform is very much part of that same story.
    • Large cloud end users “are beginning to see devops, openstack, open source methods, and hardware as one long continuum.” - Bob Ogrey, AMD
      Bob Ogrey of AMD reportedly described how the devops movement, open source, and open hardware are all part of the same enterprise transformation. He said...
    • “Being a developer on someone’s platform will mean being hosted on their infrastructure.” - Debra Chrapaty, Microsoft, 2006
      But that’s a key to why the Open Compute Platform really matters. Remember what Michael Tiemann said about the benefit of open source being control by users? The conversation that led me to write that 2006 blog post about web operations was with Debra Chrapaty, who at the time ran operations for Windows Live. (She’s now the CIO of Zynga.) She said, “Being a developer...” Since I was talking with her at the O’Reilly Open Source Convention, that led me to ask, “who will control that platform?” and to make the case that what we now call the cloud, not the desktop, should be the focus of the open source community.
    • “What we are creating now is a monster whose influence is going to change history, provided there is any history left.” -John von Neumann
      But as I’ve suggested here, the enterprise transformation is only the tip of the iceberg. We’re talking about something that is incredibly pervasive. von Neumann’s wife Klari recalled him waking up one night in 1945 in a cold sweat. He said: “What we are creating...” She thought he was talking about the atom bomb, but George Dyson argues that his greater worry was the growing power of machines. Again, if, as Michael Tiemann notes, open source is about control, our ability to have distributed control over the hardware of the global brain may turn out to be very important.
    • “The species of devices of which this is to be the first representative is so radically new that many of its uses will become clear only after it is put into operation. These uses which are not, or not easily, predictable now are likely to be the most important ones.” -John von Neumann
      But I prefer to end on a more hopeful note. In a letter to one of the military funders of the first computing project at the Institute for Advanced Study, von Neumann wrote, “The species...” That’s the real beauty of open source, that it allows everyone to play a role in inventing the future. Your creativity is what will make this a success. Go forth and make the future happen! Thank you very much.