
2020 foresight - Tech Views of the Future - Ed Maguire




1. US software - Special report
The group of companies that comprise CLSA are affiliates of Credit Agricole Securities (USA) Inc. For important disclosure information please refer to page 192.
Ed Maguire (1) 212 261 3997
Dominic White, CFA (1) 212 261 7759
13 September 2010 - USA Technology
2020 foresight: Tech views of the future
2. Contents
Executive summary ... 3
Tapping into accelerating change ... 4
Transparent IT ... 12
Intelligent systems ... 16
Convergence ... 22

2020 interviews
Jan Baan, Cordys ... 28
Willem van Biljon, Nimbula ... 34
David Cohen, EMC ... 40
Simon Crosby, Citrix ... 47
Jill Dyche, Baseline Consulting ... 58
Andrew Feldman, SeaMicro ... 65
Promod Haque, Norwest Venture ... 70
Timo Hannay, Nature Publishing ... 75
Parker Harris ... 83
Dave Kellogg, MarkLogic ... 91
Gary Kovacs, Sybase ... 101
Andy Lawrence, The 451 Group ... 111
Seth Levine, Foundry Group ... 118
Glen Mella, Control4 ... 122
Geoffrey Moore, TCG Advisors ... 131
Lew Moorman, Rackspace ... 139
Sanjay Poonen, SAP ... 145
Keith Schaefer, BPL Global ... 150
Stratton Sclavos, Radar Partners ... 157
Michael Skok, North Bridge ... 164
Michael Tiemann, Red Hat ... 171
Ray Wang, Altimeter Group ... 181
Stephen Wolfram, Wolfram Research ... 186
3. Executive summary

We sought the views of over 20 people in and related to the technology industry, including venture capitalists (VCs), technologists, software companies, authors and industry analysts to identify key themes to help investors frame their decisions over the next several years. This report includes transcripts of conversations touching on the world of technology in 2020. We explore key areas including cloud computing, software as a service (SaaS), information management, enterprise applications, open source, mobility, energy information technology (IT), social enterprise and collaboration. We identify three “meta-themes”: the rise of transparent IT, intelligent systems and convergence.

Technologists and investors tend to project the future in stepwise terms, but innovations and paradigm shifts occur at an accelerating, often exponential pace. Over the next decade, hardware, storage and computing capabilities will improve at exponential rates, while progress in software may prove the sole limitation. Increasingly rapid paradigm shifts (Facebook and the iPad, for instance) reinforce that change is accelerating and will continue to surprise.

Computing will be increasingly embedded into daily life, more intuitive and pervasive as a result of increasingly powerful and flexible software, rapid growth of endpoint devices, availability of “instant-on” connectivity and declining costs of hardware, bandwidth and storage. The trend of “consumerization” really reflects complexity that is giving way to simplicity.

Intelligence will increasingly be embedded into “closed-loop” and point-of-control systems. Solutions will benefit from growing predictive powers of software, standards-enabled integration, pervasive connectivity and increasing availability of sensors and remote-controlled devices.
Intelligence will extend beyond ecommerce and business to embrace physical systems, including home networking, smart grids and location-based services. For technology users and information consumers, the experience is paramount.

Over the next decade, the distinction between discrete software, hardware, services and content vendors will become blurred as leading vendors both diversify and vertically integrate through mergers and acquisitions. Scale and specialization will define competitive differentiation.

Figure: Paradigm shifts are accelerating - years from invention to mass adoption (telephone, radio, television, PC, mobile phone, the web, Facebook), 1860-2020. Source: Ray Kurzweil
4. Section 1: Tapping into accelerating change

Anticipating the direction of the future is the charter of professional investors. By gathering data and context from the past and present, we attempt to extrapolate future trends. The immediate dynamics of markets and the technology industry demand that investors focus on the issues at hand; however, it is helpful to step back from time to time to take stock of the trends and themes that will shape the future.

“Tomorrow will give us something to think about.” - Marcus Tullius Cicero

We sought the views of people in and related to the technology industry, including VCs, technologists, software companies, authors and industry analysts. Our intent is not to accumulate a codex of future predictions; rather, our goal is to identify key themes that may help investors frame their decisions over the next several years. A review of demographic and geopolitical trends, while certainly relevant, lies beyond the scope of this project.

Imagination is linear, progress is exponential
The amount of change that can occur in a decade is accelerating as technology evolves at an exponential pace. The world of technology in 2020 will see wide adoption of innovations that may now be in the earliest stages of conception, or that may not be conceived and realized for several years. Our conversations have provided a few surprises, revealed common threads, confirmed some prevailing views and challenged others, but overall they have given us a framework to guide our ongoing efforts to anticipate the course of technological change and the investment opportunities that follow.

“If we have learned one thing from the history of invention and discovery, it is that, in the long run - and often in the short one - the most daring prophecies seem laughably conservative.” - Arthur C Clarke

We believe there is a lot of reason for optimism.
The emergence of cloud computing, the mobile internet, non-traditional user interfaces, advances in programming science and artificial intelligence, and falling costs of computing, networking and storage place unprecedented power in the hands of everyone, from a child with a cell phone to entrepreneurs to researchers seeking to solve the challenges of medicine. The barriers to innovation have never been lower. Resources available for free or at nominal cost, combined with the global availability of information, communications and collaboration tools, provide a springboard for imagination, experimentation and risk taking.

For investors, the challenge always remains a combination of timing and careful selection. As we look to the next decade for investment opportunities, the key themes we have identified - transparent IT, intelligent systems and convergence - provide a framework that we hope will help investors place emerging opportunities, disruptive forces and secular drivers in context along the full continuum of industry maturity cycles.

Our conversations revealed a number of key themes and predictions about the future of technology. While hardware, bandwidth and storage will continue to increase in performance and decline in price, software remains a key domain without a Moore's Law-style paradigm of exponential improvement.
5. Section 1: Tapping into accelerating change

Three “meta-themes” provide a framework
We have identified three “meta-themes” that provide a framework for anticipating the changes in technology and software over the next decade:

Transparent IT - Complexity is giving way to simplicity. Powerful capabilities become more easily accessible and pervasive, while advanced technologies become increasingly embedded in systems and the environment.

Intelligent systems - Software and technology systems are increasingly gaining the ability to drive intelligent automation, decision enhancement, operational optimization and risk management in self-directed, recursive systems. Advancements in analytics and artificial-intelligence techniques, such as machine learning and neural nets, leverage the exponential growth of data from the proliferation of users, devices, sensors, applications and systems.

Convergence - Software, hardware, services, content and business processes increasingly straddle formerly discrete definitional and categorical barriers. We expect increasing integration, both vertical and horizontal, a steady pace of cross-disciplinary development and M&A, and a growing emphasis on holistic solutions.

“We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next 10.” - Bill Gates

What changed from 2000-2010
It is helpful to take brief measure of the changes that have occurred over the past decade. The year 2000 saw the height of the internet bubble, with the Dow and Nasdaq hitting all-time highs, companies raising money on little more than an idea (often spurious) and the investment community at large applying “best case” scenarios to the transformative power of the internet business model. Through the crash of 2000-2001 and the creative destruction that followed, the software industry in particular has followed a course few might have predicted.
In hindsight, it is not difficult to imagine that the parabolic tech-stock gains of the late 1990s would be followed by a crash. However, the lackluster aggregate returns of established tech giants, such as Microsoft, Cisco and Oracle, and the steady multiple compression across the sector were far less likely to have been anticipated.

Figure 1: Largest technology companies by market cap in 2000
Company    Market cap (US$bn)  PS (x)  PE NTM (x)  TTM sales (US$bn)
Cisco      448.4               28.8    97.8        15.5
Microsoft  422.6               19.8    72.5        21.4
Intel      202.1               6.0     32.7        33.7
Oracle     201.8               21.0    97.0        9.6
IBM        149.8               1.6     26.9        95.8
Source: Bloomberg
6. Section 1: Tapping into accelerating change

Figure 2: The declining valuations of the former top-five technology companies in 2010
Company    Market cap (US$bn)  PS (x)  PE NTM (x)  TTM sales (US$bn)
Cisco      132.2               3.5     15.8        38.1
Microsoft  199.5               3.3     15.5        60.0
Intel      112.3               3.2     13.5        35.1
Oracle     113.4               4.5     14.7        25.3
IBM        170.9               1.9     12.0        88.4
Source: Bloomberg

Alternately, the rise of companies such as Google and others from startup origins, as well as the mainstream acceptance of virtualization and cloud computing, are dramatic developments that have reshaped the industry landscape. However, these developments sprung from ideas that had germinated well ahead of the new millennium. Similarly, the advent of the mobile internet and many of its manifestations, such as smartphones and mobile applications, was clearly envisioned in the late 1990s. That the ideas were ahead of their time resulted in significant losses of venture investments, but steady progress over the decade has led to a definitive paradigm shift at the beginning of the 2010s.

The explosive rise of social-networking technologies, such as Facebook and Twitter, has occurred in a far more compressed time frame. With the number of Facebook users approaching half a billion and Twitter nearing 200 million, this is remarkably rapid adoption for applications only a few years old. As investors seek to extrapolate future trends from historical perspective, it is helpful to qualify any expectations in the context of exponential change.

“Nobody, 20 years ago, forecast the internet.” - Bryan Appleyard

In his book The Singularity is Near, Ray Kurzweil makes a compelling case that we should expect continuing exponential change in technology and society at large, so that over the next century we will see not 100 years of progress but more like 20,000 years of progress at today's rate.
The idea that progress in technology occurs at exponential rates is best illustrated by comparing the mass adoption of inventions over the past 150 years. One only has to look at the rapid growth of Facebook and the vision of tablet computing the iPad has catalyzed to see these accelerating paradigms. The ramifications are significant: over the next decade, we can anticipate that successful new innovations will see adoption at an increasingly rapid pace.
7. Section 1: Tapping into accelerating change

Figure 3: Adoption paradigms are accelerating - years from invention to mass adoption (telephone, radio, television, PC, mobile phone, the web, Facebook), 1860-2020. Source: Ray Kurzweil

The principle of Moore's Law, which holds that processor performance can double every 18 months, has held fast since the 1970s, while the price performance of DRAM continues to improve along a similar dynamic.

Figure 4: MIPS growth since 1970 - Intel processors from the 4004 through the Core i7 Extreme (i980EE). Figure 5: DRAM price/performance since 1970 (DRAM bits per US$). MIPS = Millions of instructions per second. Source: Ray Kurzweil

The power of wireless handheld devices similarly reflects an accelerating rate of performance, while the bandwidth capacity of the internet is growing at exponential step functions.
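To make the compounding concrete: a doubling every 18 months works out to roughly a hundredfold improvement over a decade. A quick sketch (illustrative only, not from the report):

```python
def growth_multiple(years: float, doubling_months: float) -> float:
    """Total growth multiple for a metric that doubles every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

# Processor performance doubling every 18 months, compounded over one decade:
print(f"{growth_multiple(10, 18):.0f}x")  # ~102x
```

The same relation shows why small differences in doubling time matter enormously: a 1.1-year doubling compounds to well over 500x in a decade, while a 2.7-year doubling yields only about 13x.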
8. Section 1: Tapping into accelerating change

Figure 6: Smartphone/PDA processing power since 1994 (MIPS). Figure 7: Internet backbone bandwidth since 1965 (bits per second). MIPS = Millions of instructions per second; BPS = Bits per second. Source: Ray Kurzweil

In fact, the dynamic of exponential cost and performance improvement is occurring across a broad range of technologies. While improvement occurs at different rates, the consistent historical trend remains a common dynamic across different hardware technologies.

Figure 8: Time to double (or halve)
Dynamic RAM memory “half pitch” feature size: 5.4 years
Dynamic RAM memory (bits per dollar): 1.5 years
Average transistor price: 1.6 years
Microprocessor cost per transistor cycle: 1.1 years
Total bits shipped: 1.1 years
Processor performance in MIPS: 1.8 years
Transistors in Intel microprocessors: 2.0 years
Microprocessor clock speed: 2.7 years
Source: Ray Kurzweil

“I never think of the future. It comes soon enough.” - Albert Einstein

We highlight a few key predictions for the progress of technology, data and connectivity over the next 10 years:

The National Science Foundation predicts the number of internet users will reach almost 5 billion by 2020, up from 1.7 billion in 2010 and 360 million in 2000. Vast numbers of people in developing countries will gain access to the web as a result of declining costs and exponential technology improvement.

The worldwide mobile-subscriber base is forecast to increase from 4.6 billion at end-2009 to 7.5 billion by end-2020. (Source: Portio Research 2009)

Worldwide mobile penetration will touch 94% by end-2020, from 64% in 2009. (Source: Portio Research 2009)

The first commercial quantum computer will be available by mid-2020. (Source: Cisco IBSG 2009)
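The doubling times in the table above can be restated as implied annual growth rates, assuming clean exponential growth (a hypothetical helper, not from the report):

```python
def annual_growth_rate(doubling_time_years: float) -> float:
    """Annual growth rate implied by a given doubling time under exponential growth."""
    return 2 ** (1 / doubling_time_years) - 1

# DRAM bits per dollar double every 1.5 years; clock speed every 2.7 years:
for t in (1.5, 2.7):
    print(f"{t} years to double -> {annual_growth_rate(t):.0%} per year")
```

So a 1.5-year doubling time corresponds to roughly 59% growth per year, while a 2.7-year doubling time corresponds to roughly 29% per year.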
9. Section 1: Tapping into accelerating change

In the next 10 years, we will see a 20x increase in home-networking speeds. (Source: Cisco IBSG 2009)

The adoption of Internet Protocol version 6 (IPv6) will dramatically increase the number of available unique IP addresses, and hence the number of unique devices that can be connected to the internet. IPv6 addresses are 128 bits long, whereas IPv4 addresses (the prior standard) are 32 bits. While the IPv4 address space contains roughly 4.3x10^9 (4.3 billion) addresses, IPv6 has room for about 3.4x10^38 (340 trillion trillion trillion) unique addresses.

By 2020, a US$1,000 personal computer will have the raw processing power of a human brain. (Sources: Hans Moravec, Robotics Institute, Carnegie Mellon University 1998; Cisco IBSG 2006-2009)

The world's data will increase sixfold in each of the next two years, while corporate data will grow fiftyfold. (Source: Technorati)

By 2020, the average person worldwide will maintain 130 terabytes of personal data, up from ~128 gigabytes today. (Source: Cisco IBSG 2009)

Extrapolating the exponential progression of microchip processing power suggests we will reach 10^14 to 10^16 calculations per second, possibly by the end of this decade. Justin Rattner of Intel believes that 3D chips will take off where standard silicon chips leave off; through the transition, Moore's Law will continue.

The worldwide volume of all digital data will grow from 1.2 million petabytes (or 1.2 zettabytes) in 2010 to 35 zettabytes in 2020. (Source: IDC 2010) Note: 1 zettabyte is 10^21 bytes, or 1 sextillion bytes (2^70 bytes in the binary convention). This equates to a billion terabytes (a terabyte is a trillion bytes, or a million megabytes).
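The IPv4/IPv6 address counts and the zettabyte conversion quoted above follow directly from the bit widths and unit definitions; a quick check (illustrative only):

```python
# Address-space sizes implied by 32-bit (IPv4) and 128-bit (IPv6) addresses.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128
print(f"IPv4: {ipv4_addresses:.1e} addresses")  # ~4.3e+09, i.e. 4.3 billion
print(f"IPv6: {ipv6_addresses:.1e} addresses")  # ~3.4e+38

# One zettabyte in the decimal convention, as used for the data-volume forecast:
zettabyte = 10 ** 21
terabyte = 10 ** 12
print(zettabyte == 10 ** 9 * terabyte)  # a billion terabytes -> True
```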
Looking forward, we identify a few common threads from our conversations:

Simplicity rules - Consumers and businesses want a straightforward experience and will care less about the pieces that make up a solution. Complex technology systems will continuously be broken down into components and exposed as services to create a simpler experience for users.

Solutions will predominate - The convergence of hardware, software and services will result in pre-integrated solutions productized and delivered as “application appliances.” Recent M&A illustrates this direction, with IBM's software acquisitions, Oracle's acquisition of Sun and Intel's acquisition of McAfee.

Scale wins the platforms, innovation fragments apps - Scale and scope stake out defensible high ground in the cloud. Cloud services will be about cost efficiencies and flexibility, and providers with distribution networks and capital resources will be best positioned. Google, Amazon, Microsoft and Rackspace are early leaders, but we expect a growing presence from telecom carriers, such as AT&T and Verizon, as well as integrators, such as IBM and Fujitsu. This will alter the competitive landscape as software vendors compete directly with hardware vendors (SAP vs IBM, Oracle vs HP, Symantec vs Dell, and Cisco vs Microsoft).
10. Section 1: Tapping into accelerating change

Internet time arrives - 10 years later. The accelerating pace of paradigm shifts creates a dynamic where competitive advantages can arise and crumble more rapidly than ever. Though it is not a stretch to assume companies like Microsoft, IBM, SAP, HP, Google and Oracle will remain powerful forces in the tech industry, it will be incumbent on their management teams to ensure they remain as relevant in 2020 as they are today.

Who benefits, who faces disruption?
We identify vendors who appear well positioned to benefit from accelerating change over the next decade. This list is far from comprehensive, but provides a general framework for evaluating secular beneficiaries and companies potentially under threat:

Cloud-service providers - Scale is important for infrastructure as a service, and a rush is underway to build out datacenter capacity to accommodate the expected demand. Key players include Microsoft, Google, Rackspace and IBM. Other important infrastructure- and platform-as-a-service providers include AT&T, Fujitsu, Savvis, Equinix, Terremark, OpSource, Joyent, GoGrid, NaviSite, NetSuite and Intuit, along with enabling services such as Akamai.

Startups - The growing availability and falling cost of cloud-computing services lower barriers for entrepreneurs. The environment favors new providers of mobile and vertical applications, developers of algorithmic services and ecommerce/personal-data brokers.

We identify vendors who stand to benefit from the disruptive technologies that create new markets:

Integration and middleware - Vendors that provide integration platforms for the cloud, connected devices and smart grids are well positioned to enable new environments of networked services. These include Informatica, IBM, SAP/Sybase, Pervasive, Oracle and Syniverse, along with private vendors Boomi, Iris Wireless, Motricity, mBlox and others.
Traditional infrastructure-management vendors, such as BMC, CA and Symantec, have important roles in managing complexity, but must balance challenges from legacy businesses.

Cloud-infrastructure vendors - At least for the near term, vendors that specialize in networking infrastructure for cloud computing (both public and private clouds) are well positioned for a capital-spending buildout. These vendors include Cisco, Juniper Networks, F5 Networks, Riverbed Technology, Huawei and others. Public software vendors that provide and manage cloud-computing infrastructure include VMware, Citrix, Microsoft and Red Hat. There is also a vibrant ecosystem of cloud-focused startups, including Univa UD, Platform Computing and Elastra.

Sensors, home networking and smart devices - There is opportunity for providers of a new generation of connected devices that will enable intelligent systems. These include makers of smart meters and industrial control systems (Itron, GE, Honeywell, Siemens, APC/Schneider Electric, Emerson, Eaton) as well as software vendors that will enable smart-grid and networked-home solutions (Control4, BPL Global, Silver Spring Networks, GridPoint).
11. Section 1: Tapping into accelerating change

Providers of mobile and location-based services - These include technology, internet, media and communications-services companies. Key vendors include Google, Facebook, Nokia, Microsoft, Apple, Yelp, foursquare, Twitter, Verizon, AT&T, Sprint, DoCoMo, BT and many others, both public and private.

We identify which vendors could face disruption to their businesses:

Legacy packaged-software applications - This includes on-premises applications from the likes of Oracle, SAP, Sage, Lawson, Epicor, Kenexa, JDA Software, Infor, Kronos and others.

Relational-database vendors - On-premises database vendors could face some disruption from an increasingly fragmented technology landscape. These vendors include Oracle, IBM, Microsoft, Teradata and others.

Non-integrated hardware vendors - Commodity server, hardware, storage and networking technologies will face ongoing pricing erosion and increasing concentration in the customer base. We expect hardware vendors to acquire for scale and to expand into higher-margin services and software for differentiation.
12. Section 2: Transparent IT

We believe innovation over the next decade will be ruled by a consistent trend toward “transparent IT” - technology so simple that the underlying complexity becomes invisible to the user. In other words, the difficult stuff will be increasingly distant from end users.

“Any sufficiently advanced technology is indistinguishable from magic.” - Arthur C Clarke

Over the next decade, computing will become increasingly embedded in the daily life of consumers, businesses and other organizations. Computing will become more intuitive and pervasive with the evolution of more powerful software, rapid growth of endpoint devices, availability of “instant-on” connectivity and declining costs of hardware, bandwidth and storage. We see the continuous elevation of simplicity of experience for the user as logic controls the underlying systems, processes and infrastructure with increasing power.

“Technology happens. It's not good, it's not bad. Is steel good or bad?” - Andy Grove

Four critical vectors are likely to drive accelerating innovation of software and systems toward transparent IT:

Cloud computing - A paradigm shift in computing
The ramifications of the shift to cloud computing cannot be overstated. Most of our conversations referenced the transformative impact that cloud computing will have on the availability of flexible and increasingly cheap computing power. The first generation of public clouds from Amazon, Google, Rackspace and Microsoft Azure provides field validation of the viability of the model.

Virtualization has been an enabling technology for cloud computing. The ability to run multiple workloads on a shared server to improve utilization, to run a single workload across multiple servers for scale, and to quickly scale server images up or down provides compelling efficiencies in terms of cost, power and flexibility.
Commitment to the cloud model is evidenced by the rapid pace of investment in datacenter infrastructure by corporations, colocation providers, systems integrators, telecom carriers and others. We are seeing aggressive allocation of R&D resources toward cloud-infrastructure initiatives by software companies of all sizes. Notably, Microsoft (which operates the largest R&D organization in the software industry) disclosed that 70% of its developers are focused on the cloud, a figure that will soon move to over 90%.

Lew Moorman, CTO of Rackspace, comments on the laws of accelerating returns: ‘There are laws of accelerating returns on these technologies at play. Going from mainframe to minicomputer was a small advance . . . the PC becomes a transformative advance and I think this next step is again another exponential leap. With cloud computing, we now have ubiquitous computing. Not only does everyone have computing power at their fingertips, they have the power of a datacenter at their fingertips. They have the ability to manipulate and access all types of data; to connect with and do things with that data; to create and store new data.
13. Section 2: Transparent IT

We are just starting to understand the potential of cloud computing. It is just starting to transform companies and individuals' lives.’

Importantly, the availability of cloud-based resources is lowering barriers for startups, which in turn should fuel accelerating innovation.

Stratton Sclavos of Radar Partners comments on lower barriers to startups: ‘Eighty percent of the entrepreneurs that come into our firm looking for seed capital or A-round capital already have an application up and running, if not hosted on Amazon then hosted somewhere like Rackspace . . . The capital efficiency that can be applied to developing, introducing and then iterating on these new applications is phenomenal to us and we don't think that is going to change. In fact, we think it is going to get more and more like that. More interesting is that the capital efficiency for a venture firm trying to launch these new companies is much better.’

The mobile internet - “Any device, always on, anywhere”
The rapid adoption of smartphones and the growing availability of wireless internet are a key vector for realizing the vision of pervasive computing and a wealth of related applications, including micropayments, content streaming, multiplayer gaming, location-based services and enterprise applications. The introduction of high-speed mobile networks based on technologies including high-speed packet access (HSPA), worldwide interoperability for microwave access (WiMAX) and long-term evolution (LTE) will encourage adoption of data-based applications. Over the next 10 years, we expect the move to 5G wireless to make significant progress toward the “always-on” high-speed internet connection. More advanced mobile networks will support a new range of applications, including content, shopping, HDTV, collaboration, social networking, video conferencing, robust gaming and additional personalized offerings.
This in turn will expand the range of devices beyond phones, PDAs, smartphones and laptops to embrace additional audio, video, sensor, industrial and appliance devices. The availability of greater bandwidth will facilitate adoption of advanced applications, which in turn should drive further growth of data traffic.

Figure 9: Smartphone penetration growth forecast - smartphone share of global handset shipments and YoY growth (%), 2007-2013CL. Source: Credit Agricole Securities (USA)
The proliferation of mobile applications is representative of the variety of innovations enabled by smartphones and the mobile internet, and this will be reflected in both consumer and enterprise adoption. Examples of the types of mobile applications expected to see healthy growth include money transfer by short message service (SMS), mobile search and browsing, location-based services, mobile music and video services, near-field communications services, mobile health monitoring and many others. The ability to combine location awareness with content streaming will enable applications that can deliver content to users based on their specific location and preferences (for marketing, entertainment or educational purposes). David Cohen of EMC comments on the impact of mass connectivity: ‘Another jump in terms of the numbers: just in sheer distribution, the number of people with cell phones is at least an order of magnitude more than the number of people who have highly connected PCs. It is probably much higher than that. If you look at sensor networks, wireless point-of-sale devices and RFID data collectors, there is also a massive expansion from where we were with tethered devices in previous generations. We are talking about connectivity that is literally on a planetary scale.’ Gary Kovacs of Sybase sees mobility as a paradigm shift, not just technology: ‘I think many people who have been around mobility for a long time don’t talk about mobility as a phone anymore. We talk about it much more as a method of interaction . . . basically accessing cloud services through devices. What I love about that is I can go to my PC, I can go to my iPad, and I can go to my iPhone or my Blackberry and I can get a file from my online service and it just saves. In 10 years, we will have machine to machine and we will have devices that are always on.
I think we will have much less in our lives that takes a boot-up cycle.’

User interfaces - “See me, feel me, touch me - think me?”

The development of new types of interfaces promises to expand the experience of computing beyond the traditional keyboard/mouse, touch and speech-based interaction currently available. New types of touch and haptic interfaces promise to enable new classes of applications - for gaming, enabling the disabled, medical procedures, industrial processes, training, simulation and therapy. Haptic interfaces have applications in virtual reality (by enabling real touch to operate in artificial environments) and in teleoperation (using real touch to operate in real environments via computer). Motion-control interfaces are becoming mainstream, particularly in the realm of video gaming. Microsoft’s Kinect enhancement for Xbox combines speech recognition, 3D sensing and motion sensing in a controller-less user experience, while Nintendo’s Wii and Sony’s Move technologies exploit motion-control capabilities. Startups, such as Oblong, are exploring non-physical interfaces for computing. There is also growing progress on brain-computer interface (BCI) technology. Currently, research is focused on physical implants (mostly to benefit the disabled through physical mobility and prosthetics), but there is also growing progress in non-invasive brain interfaces that track brain activity to control computing and physical devices.
Over the next decade, we expect advancements in non-traditional computing interfaces to be accompanied by new software applications that match the tactile or sensory interface to a corresponding function or automation. This is one of the areas where we expect innovation to be quite surprising and to have a paradigm-shifting impact on the experience of transparent IT.

Development tools & standards - Simpler, more powerful

The evolution of higher-level languages that bring development closer to the business process puts increasing power in the hands of business users and veils the underlying complexity of code. The evolution of the open-source programming model has created in aggregate over 1 billion lines of freely available open-source code that developers and business users can build upon to create applications and new businesses. Standards, such as HTML 5, promise to enable a new class of rich, interactive and mobile applications, while markup languages (ie, variants of XML and BPML for business process) help enhance data interchange and interoperability in increasingly distributed environments. Jan Baan, chairman & chief innovation officer of Cordys, believes new languages will change the role of developers from coders to process orchestrators: ‘The new generation of development language is 5GL. This increasingly takes the developer out of the process. Instead, we have the business driving the use of the commodity type of components that are decoupled with services that have been established.’ Michael Tiemann, VP of public affairs for Red Hat, highlights the transformative role that the open-source model plays in software development: ‘I believe the open-source model for software development and intellectual development has been quite disruptive over the last decade . . .
cultivating these communities of innovation and enabling a rise of incredibly pervasive, powerful technology that is really powering a new transition to cloud-based computing. Open source has wonderful economics compared to proprietary software in terms of cost model, but the fact that this open-source model can run at such a high level of quality in such a robust manner, with so much interoperability - I think that’s the reason you see companies like Google, and other companies who are putting open-source infrastructures together, able to achieve levels of scale and start making forward progress by simply remediating the existing software.’

Figure 10: Open-source project defect densities
2004: 985 defects found in 5.7 MLOC of Linux kernel source code.
2005: Linux kernel grew 4.7%, while defect density decreased 2.2%; 100% of all identified “serious” defects were fixed within six months.
2006: Survey expanded to the entire LAMP stack and 32 OSS programs; no correlation found between size and defect density.
2008: Survey expanded to 250 OSS projects comprising more than 55 MLOC; defect density reduced an additional 16% since 2006. Funded by the Department of Homeland Security.
2009: According to Coverity, the overall integrity, quality and security of open-source software is improving; Rung 2 projects - those with near-zero defects (Rung 1 projects have zero defects) - increased from 11% to 36%, and project involvement has increased more than 50% since 2008.
Source: Coverity
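The defect-density figures in Figure 10 reduce to simple arithmetic; a minimal sketch checking the 2004 and 2005 rows, using only the numbers reported in the table:

```python
# Defect density = reported defects / lines of code.
# 2004 row of Figure 10: 985 defects in 5.7m lines (MLOC) of Linux kernel source.
defects = 985
mloc = 5.7

density_per_kloc = defects / (mloc * 1000)   # defects per thousand lines
print(f"{density_per_kloc:.3f} defects per KLOC")   # 0.173

# 2005 row: the kernel grew 4.7% while defect density fell 2.2%,
# which implies the absolute defect count still rose slightly.
kloc_2005 = mloc * 1000 * 1.047
density_2005 = density_per_kloc * (1 - 0.022)
print(f"implied 2005 defect count: {density_2005 * kloc_2005:.0f}")
```

The second calculation illustrates why density, not the raw count, is the right quality metric for a growing codebase.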
Section 3: Intelligent systems

Over the next decade, we expect to see intelligence increasingly embedded into systems, benefiting from the growing predictive powers of software, standards-enabled integration, “always-on” connectivity and the increasing availability of sensors and remote-controlled devices. The concepts behind intelligent systems have been around for decades, but the gating factors have been the limitations of connectivity, throughput and computational power.

‘The Star Trek computer doesn't seem that interesting. They ask it random questions, it thinks for a while. I think we can do better than that.’ - Larry Page

The proliferation of data from corporate applications provides unprecedented visibility into operations and business. New-generation sensors and radio frequency identification (RFID) tracking generate huge amounts of data which can be used to improve energy efficiency and optimize broad functional aspects of the supply chain. Technologies related to data warehousing, business intelligence and predictive analytics have matured and are easier and faster than ever to deploy. Applications and systems will leverage the power of predictive analytics and advanced techniques to optimize business processes, improve collaboration, target information flow and reduce risk. We have seen the mainstream acceptance of business intelligence and data warehousing over the past three decades, while leading independent BI vendors Hyperion, Cognos and Business Objects were absorbed by Oracle, IBM and SAP, respectively, in 2007. This wave of consolidation effectively mainstreamed business-intelligence software within the enterprise IT ecosystem, and these technologies have continued to rank as high priorities since.

Prediction and optimization embedded in the walls

The value of predictive analytics has continued to grow with increases in computational power.
If business intelligence is largely backward-looking, predictive analytics look to the future - to improve marketing accuracy and operational efficiencies, and to reduce risk of all types. In the past, predictive capabilities have been employed in a number of specific scenarios: online search and advertising, to improve query results and better target ads; marketing of all types, to improve effectiveness and revenue “lift” from campaigns; manufacturing, to anticipate potential quality-assurance issues; and financial-risk management, to help mitigate portfolio or credit risk (credit scores are a prominent example). The holy grail of analytics is to “close the loop” from data to insight, to prediction, to action. In many cases, such as marketing, the value of human interpretation of predictive analytics is critical to optimize decisions. Analytics will be increasingly embedded into real-time operational and transactional systems. We already see this dynamic playing out in ecommerce, with marketing and merchandizing optimization attempting to anticipate what the user may be interested in. Amazon, Netflix, Pandora,
Google and others target consumer interests in books, movies, music and search. Operational systems increasingly incorporate analytics into domain-specific tasks: pricing and inventory optimization, IT performance management, and QA in manufacturing. Integration of predictive analytics throughout the value chain of products and services, from production to end user, will enable the realization of “mass customization.” There will be growing integration between the physical and the logical, with systems that incorporate data from sensors and physical systems and apply predictive analytics recursively to improve experience, operational or other efficiencies. This is a consequential trend in its nascent stages. Some of the most visible manifestations include the growing prevalence of embedded intelligence in passenger autos, such as self-diagnosing systems and location-aware navigation systems that route around traffic jams. Examples of the types of solutions that will incorporate physical/logical intelligence are: smart grid (including demand/response energy management, municipal traffic-management solutions, critical-infrastructure management and networked home-control solutions); self-diagnosing and self-correcting datacenter management; networked video surveillance and security systems; and location-aware services, such as smart billboards.

Figure 11: An example of location-aware services. Source: IBM

Several underlying trends enable the rise of intelligent systems:

Big data. This refers to large sets of data of all types (created by enterprise and internet applications and the proliferation of audio, video and social-networking data) that have historically proven unwieldy and even impossible to manage and analyze. Leading internet companies, such as Yahoo, Facebook, LinkedIn, Twitter and AOL, commonly generate large amounts of data, often over 100 terabytes each year.
Preparing this data for analysis can double or even triple its size. This is driving new approaches to structuring the data and new ways to analyze and visualize it effectively.
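Tools in this space typically express an analysis job as a “map” step that emits key/value pairs and a “reduce” step that aggregates them, so both can be spread across many machines. A minimal single-process sketch of the pattern (the log format and URLs are invented for illustration):

```python
from collections import defaultdict

# Toy "big data" job: count page views per URL from raw log lines.
# On a real cluster the map and reduce phases run in parallel across nodes;
# here they run in one process to show the shape of the pattern.
logs = [
    "2010-09-13 GET /home",
    "2010-09-13 GET /pricing",
    "2010-09-14 GET /home",
    "2010-09-14 GET /home",
]

def map_phase(line):
    """Emit (key, 1) pairs - one per record."""
    _date, _verb, url = line.split()
    yield url, 1

def reduce_phase(pairs):
    """Group by key and aggregate the values."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

pairs = (pair for line in logs for pair in map_phase(line))
print(reduce_phase(pairs))   # {'/home': 3, '/pricing': 1}
```

Because each record is processed independently in the map step, the job scales out by adding machines rather than by buying a bigger database server.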
Improvements in hardware and processing architectures will allow greater amounts of data to be manipulated and analyzed with far greater capacity. Sanjay Poonen of SAP discusses this: ‘Behind the scenes, these intelligent systems will have voluminous data-handling capacity because much of what ends up getting processed today in disk-based structures can be handled in in-memory structures, which may be either physical RAM or flash. Industries that build their reputation on voluminous amounts of data - retail, consumer packaged goods, financial services, utilities, even healthcare and the public sector - will be the first consumers of these devices, appliances and software that run in the form factor that allows operations that took seconds to take milliseconds, and operations that took minutes to take seconds.’ Importantly, there is an increasing range of technologies to organize and store data that go beyond the traditional relational database promoted by Oracle, IBM, Microsoft and others. Technologies that deal with unstructured data (ie, Hadoop, Cassandra, MarkLogic) and analytics-focused columnar databases (ie, Vertica, Kognitio, Sybase IQ and others) represent the emergence of an increasing array of database types to store and leverage data. There is increasing convergence between structured and unstructured data as vendors (ie, MarkLogic, Attivio and others) bridge existing distinctions between data stored in relational databases (structured data) and everything else, including flat files, Word documents, spreadsheets, multimedia files, etc. Jill Dyche of Baseline Consulting discusses the emerging concept of data virtualization: ‘A lot of people are talking about software virtualization, but we are going to start to hear a lot of buzz around data virtualization.
We simply have to assume that the days of the big behemoth mega data warehouses are over and there will always be new data introduced by both internal and external sources. Data like social-media interactions become important to companies. This “data is everywhere” mindset that everyone is embracing - by the time you are able to load all that data into a big database, it could be irrelevant.’

Proliferation of sensors and networked devices. The growth of devices that can generate data for analysis and receive instructions remotely will give rise to new solutions. Sensors that can measure energy use, temperature and other conditions are being built into smart meters, appliances, building-control systems and other devices. The ability to receive instructions (for instance, to run appliances to capitalize on low utility rates or avert demand spikes) is critical to realizing the vision of “closed loop” intelligent systems. The adoption of IPv6 will allow billions more devices to be connected to the internet, each with a unique IP address. The “Internet of Things” will enable far more pervasive analytics.

Better analytics. Predictive mathematical and statistical algorithms improve the more they can be tuned for effectiveness: the more data and the more iterations, the better the predictive accuracy. The exponential improvement in cost/performance of computation, DRAM and storage enables greater predictive precision. The power of prediction improves as experts in the problem domain deploy the appropriate, finely tuned algorithm. Advances in machine learning, neural nets and other artificial-intelligence techniques continue to empower users with a growing arsenal of analysis tools.
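At its most minimal, a predictive algorithm fits parameters to historical data and extrapolates; a closed-form least-squares trend forecast makes the idea concrete (the quarterly revenue figures are invented for illustration):

```python
# Fit y = a + b*x by ordinary least squares, then forecast the next period.
# Illustrative quarterly revenue figures (US$m); x is the quarter index.
y = [10.0, 13.0, 12.0, 15.0, 16.0]
x = list(range(len(y)))

n = len(y)
mean_x = sum(x) / n
mean_y = sum(y) / n
# Closed-form slope and intercept.
b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
    sum((xi - mean_x) ** 2 for xi in x)
a = mean_y - b * mean_x

forecast = a + b * len(y)   # next quarter (x = 5)
print(round(b, 2), round(forecast, 2))   # 1.4 17.4
```

Production-grade predictive models add many more variables and regularization, but the workflow - fit on history, score the future, retune as new data arrives - is the same loop described above.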
Stephen Wolfram’s team is working on software that mines the universe of possible programs to find the optimal program itself: ‘One of the things that has come out of a bunch of science that I have done is this idea that there is this computational universe of possible programs. Even quite simple programs are useful for things, which means that it becomes feasible to search this computational universe for programs that are useful for your purpose. Whether your purpose is cleaning up images or encryption or doing routing or linguistics, it becomes possible to not have the engineer build the program step by step, but instead have the program be mined from this computational universe of possibilities. That sounds very futuristic, but it is, in our own work in Mathematica and Wolfram Alpha, a methodology we increasingly use. So there is an increasing number of algorithms that no human built. We found it. We searched a trillion possible programs and one of them was the best with respect to certain criteria, and that ends up being the one we use for such and such a function. There are a lot of places where this type of mined algorithm will become more and more prevalent.’

The rise of social intelligence. Social networking and collaboration technologies foster business agility, collaboration and innovation by allowing more fluid communications across the organization. These technologies give rise to new sources of data for analysis as well as new avenues to insight through the process of “crowdsourcing.” Crowdsourcing refers to the practice of taking problems that are typically solved by employees and extending them to a group of individuals in order to arrive at a collective solution.
Crowdsourcing is the paradigm underlying open-source software development, designing organizational algorithms, solving complex collaborative questions and optimizing search-engine results. Technologies, such as Facebook and Twitter, allow problems and data to be distributed across groups of users in real time. There is increasing interest in using “social intelligence” - harnessing the collective wisdom of large groups of users via technology - to solve global problems, such as energy usage and climate change. We expect to see an increasing range of organizations incorporate the power of social-intelligence techniques to enhance broader analytic capabilities.

Several domains are fostering the evolution to intelligent systems:

Real-time performance management. With the mainstream acceptance of methodologies, such as Total Quality Management (TQM), Six Sigma and Balanced Scorecard, software technologies evolved to help management and line workers adopt metrics-driven strategies through dashboards and visualization tools. There is increasing interest in enabling continuous planning processes with more frequent refreshes or real-time data updates. Vendors such as IBM, Microsoft and SAP are leading the evolution toward real-time performance management. The key goal for performance management lies in improving the decision-making process. We expect to see increased use of real-time analytics to drive optimized decision making. While companies, such as Fair Isaac, have focused on enabling enterprise decision management through a combination of analytics, domain knowledge and data management, this has historically required a costly, highly service-oriented approach.
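The enterprise-decision-management pattern combines a predictive score with business rules so that routine decisions execute automatically and only borderline cases reach a human. A hypothetical credit-decision sketch - the scorecard weights, thresholds and field names are all invented for illustration:

```python
def credit_score(applicant):
    """Toy additive scorecard - a stand-in for a real predictive model."""
    score = 600
    score += min(applicant["years_on_file"], 10) * 8
    score -= applicant["late_payments"] * 40
    score += 50 if applicant["income"] > 75_000 else 0
    return score

def decide(applicant):
    """Business rules close the loop from prediction to action."""
    score = credit_score(applicant)
    if applicant["late_payments"] > 5:   # hard policy rule overrides the score
        return "decline", score
    if score >= 680:
        return "approve", score
    if score >= 620:
        return "refer to analyst", score   # human stays in the loop
    return "decline", score

print(decide({"years_on_file": 7, "late_payments": 1, "income": 80_000}))
```

The middle band is the important design choice: automation handles the clear cases, while the “refer” outcome preserves the human interpretation that the text notes is still critical.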
The use of analytics to
optimize internet ad networks’ placement and targeting in real time is a key opportunity in this area. Vendors, such as Google, Akamai, Adobe and Splunk, are positioned to chart new ground here.

Marketing optimization. The use of analytics is a long-standing part of marketing and campaign management, where vendors, such as SAS and IBM/SPSS, have provided the analytic horsepower to power effective campaigns. Vendors, such as SaaS leader Eloqua, Responsys, Aprimo and IBM/Unica, continue to drive incremental value through the use of scoring and predictive analysis to complement functions related to lead generation, campaign design and analysis of effectiveness.

Risk management, churn analysis and fraud prevention. In the wake of the financial crisis, financial-services firms have been compelled to deploy enhanced credit-risk-management solutions. Solving the risk-management problem requires pulling together multiple sources of data and applying risk analytics to detect potentially consequential changes. Financial fraud remains a significant challenge for consumers, businesses, banks and credit-card companies, and there is a consistent need for more effective analytics to detect potential fraud with minimal false positives. Anti-money-laundering requirements continue to drive the need for more sophisticated solutions to detect and prevent illegal financial activities. Vendors, such as RiskMetrics, Actimize, IBM/SPSS, SAS and many others, are focusing on the evolving challenges associated with detecting electronic fraud.

Smart grid, smart buildings, smart datacenters. One of the most promising areas for innovation lies in the use of software to manage energy usage, aligning the needs of customers and utilities.
The “smart grid” is envisioned as a system that employs two-way technologies to control appliances and HVAC systems at a home or building, allowing customers to automatically reduce energy usage during periods when spot energy prices are high, and allowing utilities to manage their own resources to avoid “brownouts” from overload or investment in costly spinning capacity. “Demand response” technologies employ meters at the customer’s site that can be controlled by the power utility. Predictive technologies allow the utility to anticipate when demand may be approaching “peak load” or when the spot price of energy will exceed a certain threshold. In response, the utility can issue remote instructions to customers’ systems to reduce demand by raising the air-conditioning temperature or turning off a pool heater, for instance. Companies focused on this problem include Comverge, EnerNoc and private companies SpringSource, GridPoint and BPL Global, among others. While there is significant promise, Andy Lawrence of The 451 Group believes it will take time to realize these visions: ‘I think we are going to get to a point where the power consumption of corporations scales up and down, as power consumption drops naturally, without human intervention, at the end of the workday, with intelligent policies making decisions about what to turn off. It may be an energy-management system or some kind of network of building-management systems and network-management systems.
I think it is going to take a couple of decades. It is not always going to be clear that it is worth doing the retrofit and embracing the complexity. People don’t rip out building-management systems so they can get a better view of their energy footprint. They will do it over time. This is a long-term project.’ Software-enabled home networking also shows significant promise as the wireless ZigBee standard enables compatible appliances and systems to be controlled automatically from a central system. Control4 is perhaps the most advanced vendor in this area and is focusing on enhancing its demand-response capabilities to allow homes and hotels to automatically schedule certain tasks (ie, running a dishwasher) or adjust climate-control settings in order to make the most cost-effective energy choices. Over the next decade, we expect home networking and monitoring to extend to areas, such as infant and elder care, which promise significant improvements in quality of life. Glen Mella of Control4 describes some of the uses of monitoring technology: ‘A partner of ours, CloseBy Networks, sells a software system on top of a Control4 deployment which essentially enables children to assist aging parents. They can monitor things like, “Well, it’s 10am and grandma hasn’t gone into the master bathroom; there may be something wrong because she usually gets up at 8.” Another one is actually helping seniors get a movie set up and playing from a remote location, or setting lighting scenes to help them settle down for the evening. People are starting to experiment with ZigBee monitoring devices - you can have a pad next to the bed, so when they step out of the bed a signal is given. There are even ZigBee garments that can monitor vital signs.’ Intelligence helps deliver personalized medicine.
There is increasing use of analytics and business-intelligence tools to improve both the delivery of care for individual patients and fundamental research. Healthcare providers are increasingly able to respond to the demands of pandemic prevention by using business-intelligence tools to sort clinical data and identify which patients might be most at risk, enabling providers to target and prioritize vaccination programs. Another focus is helping doctors make more accurate diagnoses and recommend appropriate treatments. Systems that integrate sample analysis into decision-support applications can help pinpoint details that may elude physicians. Public-health initiatives, such as the Cancer Biomedical Informatics Grid and UCSF’s Athena Breast Health Network, aggregate shared data and increasingly enable researchers and physicians to improve their research, diagnosis and treatment efforts.

RFID and the supply chain. RFID technology experienced a significant level of hype in 2004-2005, which created skepticism, but steady adoption of the technology is paving the way for new applications and services. RFID employs two components: a reader (or interrogator) and a tag (or label). There are many applications for the technology, which is becoming increasingly prevalent as the cost of tags decreases. RFID tags are used in conjunction with mobile phones and credit cards for payments, for automated toll collection and for similar uses. It is in asset management, retail sales and product tracking that RFID shows significant promise, enabling manufacturers, distributors and retailers to gain granular insight into both the supply chain and demand patterns. The tremendous amount of data generated by RFID applications has long been anticipated as a demand driver for data-warehouse technologies from the likes of Teradata, IBM, SAP/Sybase, Oracle and other vendors.
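The raw output of an RFID deployment is a stream of (reader, tag, timestamp) read events, and a tag sitting in range is typically read many times over, so turning reads into supply-chain insight starts with deduplication and aggregation. A minimal sketch (reader and tag identifiers are invented for illustration):

```python
from collections import defaultdict

# Raw read events: (reader_id, tag_id, epoch_seconds). A pallet sitting on
# a dock is read repeatedly, so the same (reader, tag) pair recurs.
reads = [
    ("dock-1", "TAG-0001", 100),
    ("dock-1", "TAG-0001", 101),   # duplicate read of the same tag
    ("dock-1", "TAG-0002", 103),
    ("store-7", "TAG-0001", 900),  # same tag later seen downstream
]

# Deduplicate: keep the first time each reader saw each tag.
first_seen = {}
for reader, tag, ts in reads:
    first_seen.setdefault((reader, tag), ts)

# Aggregate: distinct tags per reader - the inventory-style view a
# retailer would load into its data warehouse.
inventory = defaultdict(set)
for (reader, tag), _ts in first_seen.items():
    inventory[reader].add(tag)

print({reader: len(tags) for reader, tags in inventory.items()})
# {'dock-1': 2, 'store-7': 1}
```

Following the same tag across readers over time is what yields the granular supply-chain visibility described above.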
Section 4: Convergence

Users and consumers of information technology are ultimately concerned with the experience and functions of their chosen solutions. Increasingly, providers bundle components of hardware, software, services, connectivity and content as a complete solution. This has resulted in a blurring of the lines as offerings and vendors straddle multiple categories. The key characteristics that have driven adoption include accelerated time to deploy, reduced cost of integration and implementation services and increased efficacy of the overall solution.

‘Information technology and business are becoming inextricably interwoven. I don't think anybody can talk meaningfully about one without talking about the other.’ - Bill Gates

The distinction between discrete software, hardware, services and content will become blurred into solutions. Vendors will be at once more diversified and vertically integrated. Over the next decade, we expect convergence to be a theme that will drive both organic development and active M&A in the technology industry. Business models will evolve as value creation accrues at different points in the delivery chain. Apple and Microsoft, which have long embraced hardware as part of the business model (via Macs, iPods/iPhones and the Xbox), have expanded into content-based businesses (iTunes and Xbox Live) as a way to reinforce the broader value proposition of their integrated offerings. We expect increasing convergence over the next decade across the technology and media sectors as software, hardware and content providers seek to bolster customer stickiness, drive incremental revenues and create higher competitive barriers.
Figure 12: S&P 500 net debt per share (US$), 4Q00-4Q09. Source: Bloomberg

We expect convergence to play out along several vectors over the next decade:

Everything as a Service. The mainstream acceptance of Software as a Service over the past decade has paved the way for services to reach lower down the stack. Platform as a Service and Infrastructure as a Service hide the underlying complexity of compute and storage infrastructure and allow users, service providers and application developers to access resources in holistic
fashion. To those that consume these services, the experience (functionality or service levels) is what counts, and the environment will lead to new competitive dynamics. Amazon.com, Rackspace, Salesforce.com, Google and Microsoft all compete directly for emerging opportunities, but have dramatically distinct origins.

Commoditization of IT infrastructure. A corollary to the trend toward “Everything as a Service” and the trend toward transparent IT is the continual commoditization of compute, memory, bandwidth and storage. We expect to see continuing convergence of hardware, storage and networking capabilities. Cisco’s move into servers and HP’s initiatives around its ProCurve networking equipment are only the leading edge of what we expect to be an increasing emphasis on integration of commoditized capabilities. As hardware vendors continue to compete on price, performance and supply-chain efficiencies, there will continue to be focus on value and differentiation from software.

Content integrates into the IT ecosystem. This is a longer-term trend, as proprietary content and information services become increasingly integrated into solutions. So far this has been indirect, as Apple, Amazon and Barnes & Noble have used specialized hardware and ecommerce to deliver digital content to end users. Microsoft’s Xbox Live service builds on the Xbox hardware console in a broader effort to deliver a complete home entertainment solution. We expect the next phase of evolution to increasingly stress the role of proprietary content as publishers and content producers seek ways to monetize their intellectual assets and software vendors look to use content to enhance value and differentiation to their user bases.

Physical and logical control systems converge. There is a slow but steady integration between physical facilities control systems and IT systems. One area that has seen significant interest has been energy management in the datacenter.
Part of the challenge is managing energy usage according to the device load. This is being addressed by companies such as APC, Emerson (which acquired Avocent to extend its control of the datacenter) and privately held Modius. A number of startups have sprung up around energy-usage measurement, with several companies focusing specifically on PC power consumption (notably Power Assure and Faronics). We expect to see increasing innovation around the integration between physical systems and IT systems control.

Beyond energy management, we believe one of the key areas of promise is tying together physical and IT security. For banks and high-security areas of government, this would involve the ability to tie together physical and logical access control (ie, allowing someone to log onto a system only when their security badge has been scanned into the building).

How will convergence play out?
We identify several likely directions for convergence over the next decade for software, hardware, services and content vendors. We believe the most likely directions are:

- Software vendors move into hardware and content
- Hardware vendors move into software and services
- Services vendors move into software and content
- Content vendors move into hardware and software
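The badge-and-login scenario described above can be sketched in a few lines. This is a toy illustration under invented names: BadgeDirectory and its methods are hypothetical, not any vendor's API.

```python
# Hedged sketch: tie logical login to physical badge state.
# BadgeDirectory and its methods are invented for illustration only.

class BadgeDirectory:
    """Tracks which badges have been scanned into which buildings."""
    def __init__(self):
        self._inside = {}  # user_id -> building_id

    def scan_in(self, user_id, building_id):
        self._inside[user_id] = building_id

    def scan_out(self, user_id):
        self._inside.pop(user_id, None)

    def is_inside(self, user_id, building_id):
        return self._inside.get(user_id) == building_id


def may_log_on(directory, user_id, terminal_building):
    """Converged policy: allow logical access only when the user's badge
    has been scanned into the building that houses the terminal."""
    return directory.is_inside(user_id, terminal_building)


badges = BadgeDirectory()
badges.scan_in("alice", "hq")   # physical event feeds the logical policy
```

In a real deployment the badge events would come from the facilities system and the check would sit in the authentication path; the point is only that the two control planes share one policy decision.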
Figure 13
Converging IT ecosystem
[Diagram omitted: software, content, hardware and connectivity converging into solutions]
Source: Credit Agricole Securities (USA)

Software vendors move into hardware and content
The rise of appliances that integrate both hardware and software has been a significant trend over the past decade, and we expect this to continue over the next decade. We expect continued interest from software vendors in delivering software in appliances as organizations seek greater ROI and deployment efficiencies from the combined solutions. In particular, we expect application appliances to become more common, along with hybrid offerings that combine an on-premise hardware presence with cloud-based services. Oracle’s acquisition of Sun and Check Point’s acquisition of the Nokia Enterprise business are examples of the types of transactions we expect to continue.

The hardware-appliance approach has become increasingly mainstream, and adoption is spreading across numerous categories. We expect application appliances to gain increasing ground over the next decade, following Oracle’s lead in targeting high-end opportunities for converged systems that combine software, networking, hardware and storage.
Figure 14
Software/hardware appliances by category

Category                          Representative vendors
Collaboration                     Bull
CRM                               Sage SalesLogix CRM Appliance
Data integration                  Cast Iron (IBM)
Data warehousing                  Teradata, Oracle Exadata, Netezza, SAP, IBM
e-Discovery                       Clearwell, StoredIQ, FTI, Kazeon (EMC)
ERP                               SAP (ERP in a box with Novell, Intel)
Firewall, VPN                     Check Point, McAfee, Palo Alto Networks
IT management                     ScienceLogic
Mail security                     Barracuda, Cisco/IronPort, McAfee, Trend Micro
Messaging                         Tibco
Search                            Google, Thunderstone, Index Engines
Secure web content                Websense, McAfee
Storage (deduplication, backup)   EMC, NetApp, HDS, 3Par, ExaGrid, Data Domain (EMC)
Unified threat management         Fortinet, Barracuda, Check Point
Video security                    NICE Systems
WAN acceleration                  Certeon, Citrix NetScaler, F5 Networks

Source: Credit Agricole Securities (USA)
We expect to see continued diversification of revenue streams among both larger and smaller software vendors. Traditional perpetual-license vendors have been moving toward additional subscription-based businesses and hardware over the past several years. Some of this has been through acquisition, some through partnership, but the key objective is to enhance the overall strategic value of the vendor through a broader portfolio and reduce reliance on a single business model.

Ray Wang of Altimeter believes software companies will increasingly focus on content: ‘Software companies are emerging to become information brokers. They are not just delivering a solution. What the cloud is doing is allowing them to aggregate data for prediction, for trends, and for benchmarking purposes. Those trends drive another point where companies must consider if they even need any of these solution providers. We are seeing companies buy out key software assets so they can deliver services directly to the customers without having to use a packaged application. One of the great examples of this occurred when Roper Industries bought a software company that just did supplier networks.’

Hardware vendors move into software and services
For hardware vendors, the appeal of software is undeniable and we expect continuing development and M&A. This trend has been so consistent that we expect consolidation in the software sector to continue to be driven by hardware vendors. Recent examples include:
- IBM: Rational, Cognos, SPSS, Unica (software)
- HP: Mercury Interactive, Opsware, Tower (software); EDS (services)
- Cisco: WebEx, ScanSafe, IronPort (Software as a Service)
- Dell: 3Par attempted (appliances); Perot Systems (services)
- EMC: VMware, Documentum, RSA, Kazeon, Greenplum (software)
- Intel: McAfee, Wind River (software)

Promod Haque of Norwest Venture Partners highlighted the hardware-commoditization trend: ‘We are seeing hardware get commoditized rapidly, not just in the server space, in the compute space. You are starting to see networking vendors under tremendous pressure from the Huaweis and the ZTEs from China. The general commoditization of that space as well, at least the lower end of that space. Over a period of time, that will happen with the high end, but that lower end is getting commoditized. There is a lot of emphasis on the part of companies like Alcatel Lucent, and whether it is Extreme Networks or Brocade, and equipment providers including Cisco and Juniper. You see Oracle now in the hardware business. HP acquired 3Com. One of the more interesting things that we saw recently was the IBM announcement that the entire hardware group will be reporting to Steve Mills, who previously ran software only.
You are starting to see some very interesting trends in that hardware, not just compute, but compute, storage and networking are coming together.’

Services vendors move into software and content
IT services vendors, particularly the likes of Accenture, CSC and others, commonly specialize in customized solutions that incorporate proprietary IP, packaged software and hardware. Given the ongoing pricing pressure from offshoring competition, we would expect North American IT-services vendors to look to acquire or incorporate more internally developed proprietary software and content in order to differentiate their franchises. There have not been many large acquisitions as of yet; the Canadian IT services provider CGI Group’s 2003 acquisition of AMS, a vendor of government software offerings, is a notable example. Part of the challenge for traditional services vendors is that their valuations (on a per-employee basis and on price/sales metrics) tend to be far lower than in other areas of technology. There have been a few examples of services vendors moving into hardware (both IBM and Fujitsu retain legacy hardware businesses). Though not a traditional services vendor, Google’s unsuccessful launch into hardware with the Nexus One smartphone demonstrates that convergence is not without risks.

Content vendors move into hardware and software
Although the indications of this trend are early, we think there are growing signs of increasing efforts to converge solutions on the part of media companies. As media companies seek new ways to monetize content (to offset pressures in advertising and traditional book and periodical sales) and technology firms seek new and recurring sources of revenue, we expect increasing cooperation, partnerships and potentially M&A between traditional media and publishing companies and technology firms.
Amazon’s Kindle and the Barnes & Noble Nook represent efforts by channel players to provide a hardware-based vehicle to monetize digital content, and of course Apple’s iPad appears to be revitalizing the business of magazines and other periodicals in digital form. Publishers similarly are seeking ways to monetize their own content and are developing applications to offset declines in their traditional businesses. Reed Elsevier has sought to offset declining sales of its hardbound medical texts and reference books by offering specialized software applications, such as MD Consult, that allow access to the company’s proprietary data on a subscription basis.
2020 interviews

Jan Baan, Cordys .............................................................................. 28
Willem van Biljon, Nimbula ................................................................ 34
David Cohen, EMC ............................................................................ 40
Simon Crosby, Citrix ......................................................................... 47
Jill Dyche, Baseline Consulting ........................................................... 58
Andrew Feldman, SeaMicro ................................................................ 65
Promod Haque, Norwest Venture ........................................................ 70
Timo Hannay, Nature Publishing Group ............................................... 75
Parker Harris, Salesforce.com ............................................................ 83
Dave Kellogg, MarkLogic ................................................................... 91
Gary Kovacs, Sybase ...................................................................... 101
Andy Lawrence, The 451 Group ....................................................... 111
Seth Levine, Foundry Group ............................................................ 118
Glen Mella, Control4 ....................................................................... 122
Geoffrey Moore, TCG Advisors ......................................................... 131
Lew Moorman, Rackspace ............................................................... 139
Sanjay Poonen, SAP ....................................................................... 145
Keith Schaefer, BPL Global .............................................................. 150
Stratton Sclavos, Radar Partners ..................................................... 157
Michael Skok, North Bridge ............................................................. 164
Michael Tiemann, Red Hat ............................................................... 171
Ray Wang, Altimeter Group ............................................................. 181
Stephen Wolfram, Wolfram Research ................................................ 186
Jan Baan, Cordys

Jan Baan has over 25 years of entrepreneurial and business leadership experience in the software industry. Internationally recognized for his key role in creating the worldwide market for enterprise resource planning (ERP) software, he founded Cordys in 2001, together with Theodoor van Donge, the key architect behind Baan Company's pioneering ERP solution. Jan founded Baan Company in 1978, soon after attending business college. With the development of his first software package in 1979, Jan started what was to become a pioneering career in the ERP industry. Under Jan's stewardship, Baan Company grew from a US$35m company in the early 1990s to US$680m in 1998. Baan became the No.2 ERP player in the software industry. In the late 1990s, Jan became a highly successful venture capitalist by focusing on software innovation.

The importance of the business process layer
Today, with cloud computing in the enterprise gaining traction, businesses must re-evaluate their existing IT portfolios and transition towards the cloud to remain competitive. Businesses are increasingly demanding a shift from rigid business processes to more flexible and dynamic ones, where a business process layer can link all silos under a service-oriented architecture (SOA). SOAs offer a loosely integrated suite of services that can be built upon existing IT infrastructure, offering the flexibility and functionality of the cloud without foregoing the benefits of the investment in on-premise IT infrastructure.

A feature of this shift is that differentiation within software will occur not in manufacturing, but in assembly. Users will have the ability to dynamically change the components of a system by making systems into customizable web services to be delivered over the web. Ultimately, complexity will be hidden underneath a layer of processes, a phenomenon Gartner calls the ‘meta of the meta’ layer.
Using the meta, or ‘data about data,’ commodity components will enable projects to be created with no new lines of Java code, because best practices codified in legacy systems, all the way down to the databases, will be reused.

Business operations platform process layer
[Diagram omitted]
Source: Cordys
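The "differentiation in assembly, not manufacturing" idea can be illustrated with a toy process layer: legacy routines wrapped as services and recombined into a process with no new application-level code. All function and process names below are invented for illustration, not Cordys APIs.

```python
# Illustrative sketch: legacy routines exposed as services and
# assembled by a process layer. Names are invented, not Cordys APIs.

def check_credit(order):      # wraps a legacy ERP credit routine
    return order["amount"] <= order["credit_limit"]

def reserve_stock(order):     # wraps a legacy logistics routine
    order["reserved"] = True

def raise_invoice(order):     # wraps a legacy financials routine
    order["invoiced"] = True

# "Assembly": the process is just an ordered recombination of
# commodity components; swapping steps needs no new component code.
ORDER_TO_CASH = [check_credit, reserve_stock, raise_invoice]

def run_process(steps, order):
    """Execute steps in order; a step returning False halts the process."""
    for step in steps:
        if step(order) is False:
            return None          # guard step rejected the order
    return order                 # steps mutate the order in place

ok = run_process(ORDER_TO_CASH, {"amount": 500, "credit_limit": 1000})
rejected = run_process(ORDER_TO_CASH, {"amount": 5000, "credit_limit": 1000})
```

The underlying components never change; only the list that assembles them does, which is the sense in which the process layer hides complexity.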
Commoditizing complexity, simplifying process
Our discussion with Jan Baan focused on his visions of the evolution of the business process, the commoditization of applications into best-practice components and the promise that flexibility will bring. Baan envisions traditional enterprise-software applications becoming discrete components that are manipulated by higher-level business process orchestration. The applications of the future will weave together these components (comprised of formerly distinct lower-level application functions) into a cohesive solution framework. Baan effectively used the analogy of the auto and the airplane in describing his vision for software over the next 10 years. The ability to “componentize” functions and applications allows business users unprecedented flexibility in matching software to the needs of the business, not vice versa.

Key points
- In the future, a business process layer will link system “silos,” with integration via SOA. Business Process as a Service (BPaaS) will provide the logical overlay on top of Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and SaaS to enable true multi-silo integration of software systems from the cloud.
- Today, data and systems reside in silos (ERP, CRM, HRM, etc), and integration between silos (if any) is difficult, especially as new processes are introduced to the system(s).
- Legacy systems (and new code going forward) are redesigned as web services, which will be integrated into the business process layer, delivered over the web with customizable interfaces.
- Complexity will be commoditized down to simple elements representing best practices. These functions can be decoupled and then recombined with other components, enabling mass customization via enterprise “mash-ups.” The analogy for software is that we will see differentiation in the assembly, not the manufacturing.
The result of this evolution is that corporations will become more agile and efficient. As processes are commoditized and IT becomes transparent to users, large system integrators and consultants could find they have less demand for their offerings. The key differentiator for applications will not be the logic, but increasingly content from the unstructured data world.
Jan Baan - Interview summary
We spoke to Jan Baan, chairman and chief innovation officer of Cordys, on 17 June 2010 about his vision for the evolution of software over the next 10 years.

Jan Baan became an entrepreneur in 1978. The original vision was to tackle the “mother of all complexity,” enterprise resource problems. This led to the rise of Baan Software, which became one of the top ERP vendors in the late 1990s. In the ERP era, “data was king.” These enterprise transactional applications were based on UNIX and relational databases, which are designed to handle structured data. This generation of applications was never built for the internet, and it has led to extraordinarily high costs. Baan cited a company that has been in business for 150 years and is happy that its IT budget is under US$1bn. This typical company spent US$1bn to implement an SAP system: after buying the software for US$150m, the company needed to spend US$850m more to customize it. This model is not only costly, but at the end, all the user has is the application.

The computer is now mobile. We can have all the relevant data around us along with the relevant processes. We see examples of this with many of the new iPad applications. We have a new generation of beautiful emerging technologies based on the internet - the iPad, smartphones, etc - that address only unstructured data. Increasingly, there is value for large legacy systems (more so in healthcare than in manufacturing) to connect with all these new devices.

The dream for Cordys 10 years ago had been to go beyond ERP, and it has taken 10 years to build a business process platform. This is focused on a completely mobile platform that is able to leverage both structured and unstructured data. In this new approach, data is no longer king... business process is. In tomorrow's world, we will no longer be employees, but knowledge workers in the supply chain.
This will take the form of a business process layer that can link all of the system silos, with integration via SOA. A wealth of possibilities exists as a result. Why not have an order-to-cash collection process, then link it with unstructured data rules (like Facebook in a CRM environment), while keeping it all compliant? We need to find ways to link collaborative and social technologies to process - linking something like a Google Wave to a business process, for example - using both structured and unstructured data. This is the promise of using technology to mirror how people naturally interact.

Today there is a lot of discussion about the transition to SaaS, PaaS and IaaS. On top, we will have Business Process as a Service. We need to follow the commoditization of services. One supposes you could open legacy systems to view the processes, but you end up having one view with one process. In the future, we will access these systems from the cloud.

We realize enterprises have to combine different technology waves. Take the analogy of the automobile: in the auto industry, there have been different technology waves, but there are still wheels, an engine, etc. Here the idea of combining an old engine with a new dashboard comes together. The same holds for the cloud, where we see the cloud and on-premise worlds combining.
Currently, we still have an on-premise world that needs to be combined across silos. Technology systems are still very system-centric, and the goal is to put the business in the driver's seat, controlling process. At the enterprise, business processes have been standardized for compliance, but this is not flexible enough for tomorrow's business needs.

In a typical business entity, there are many different silos. These silos incorporate structured data in multiple areas: ERP, CRM, product lifecycle management (PLM), financial systems, logistics and more. The business process layer can link together all the silos, using SOA to integrate - one rule in a business process.

Tomorrow's world is no longer about employees doing work directly related to a company. It is about knowledge workers participating in a supply chain. Workers will have their own devices with access to their corporate resources and will be able to access data from mobile internet connections.

For business processes, it is critical to follow steps in the appropriate order. Process requires linear steps. Workflows, such as email, do not follow the same pattern. A critical language enabling the rise of the cloud is business process language. Business Process Markup Language (BPML) has been standardized for 10 years. It used to be that the human element was missing in the BPM layer. We can now use BPML not only for compliance and system-to-system communications, but also for human-to-human communications. The evolution of software will allow technology to mirror behavior rather than vice versa.

Mass customization is the next wave of software
The goal is to preserve existing investments, while providing a layer of flexibility on top of them. Like a mobile phone, the complexity underneath will be hidden as complex processes are combined into “mass custom apps”.
For example, the iPhone external case may change with each version, but the underlying functional components are consistent. We are still driving our software “cars” based on what is under the hood. Though the body and the dashboard may change, a car may have the same wheels and engine as 10 years ago. A similar case will hold in software. We need to look at SAP, Oracle and other applications to respect and preserve the value of their complex capabilities. In the past we had logic which was stored and concrete, but in the future knowledge workers will be able to harness this logic in a better way - as services. We can build a web service and weave it together with others to combine commodity functions at the process layer.

The crucial element in IT is complexity, which creates costs and inefficiencies. What we need to be able to do is commoditize complexity down to simple elements that can be decoupled and recombined with other components. Customer auditing is a decoupling point.

Think of creating software like manufacturing an airplane. Airplanes have been manufactured in a similar way for decades: commodity components are built all over the world and the plane is assembled in one place in the same way, while the individual configuration differs.
The analogy for software is that we will see differentiation in the assembly, not the manufacturing. Not only will organizations build Software as a Service, they will build the business itself.

Tomorrow's world will be one of mass customization, where users will have the ability to dynamically change components of a system. Essentially you will be able to make “mash-ups” out of complex processes. The way you achieve mass customization is to take vanilla, traditional systems, make them into web services and deliver them over the web with customizable interfaces. Baan believes Cordys had been early to the market, but that people are getting it now.

Commoditization preserves best practices
Commoditization means there is respect for best practices and best practices for components. What is meant by commoditizing software is to bring together all of these underlying application silos as a process for the information worker. In this way, the complexity is hidden underneath a layer of process. Gartner calls this the ‘meta of the meta’ layer. This approach is really the only way to simplify everything. In this model, the underlying information is hidden and the process uses information in the way that commoditization allows. Since they are commoditized as components, there is no need to change the underlying applications, the user doesn't have to worry about how it comes together, and there is no need to have engineers customize when systems are built on best practices.

The challenge with the cloud is to use the inherent strengths and underlying processes of legacy systems, make them multi-tenant and cloud-enable them. Basically the vision is to make web services out of legacy systems and integrate them into the business process layer.
Commodity components will enable projects to be created with no new lines of Java code, with the processes engineered in the “meta-layer,” leveraging best practices codified in legacy systems all the way down to the databases.

Business Process as a Service
For the knowledge worker, this new model requires that the business user/developer only needs to manipulate the business process layer. This allows the user to build upon existing investments in IP and orchestrate a higher level of processes that are transparent to the end user. At Baan Software in the 1990s, there were 2,000 engineers working on various versions of the software. Programmers of yesterday made “spaghetti” with all of the customized code. This is analogous to a car where friends come in and change everything. In the cloud, there are new efficiencies because there is only one version of the software.

There is no longer room to build big, monolithic enterprise applications. The differentiator nowadays is content from the unstructured data world. It is not functionality that is the driver; content is now the driver, and this includes both unstructured and structured data. Content itself is stored under the umbrella of the business process layer. In the past, it was logic that was stored and concrete - now logic becomes used as a component.
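The idea that a business user manipulates only a declarative process description, while an engine handles execution, can be sketched as follows. The structure is loosely inspired by process languages like BPML; the step names and schema are invented for illustration.

```python
# Hedged sketch: a business user edits only this declarative process
# description; an engine executes it. Step names and the schema are
# invented, loosely inspired by process languages such as BPML.

PROCESS = {
    "name": "expense_approval",
    "steps": [
        {"kind": "system", "task": "validate_receipt"},
        {"kind": "human",  "task": "manager_approval"},  # human step in the flow
        {"kind": "system", "task": "post_to_ledger"},
    ],
}

def next_step(process, completed):
    """Processes are linear: return the first step not yet completed,
    in declared order (unlike ad-hoc workflows such as email)."""
    for step in process["steps"]:
        if step["task"] not in completed:
            return step
    return None

# A business user reshapes behavior by editing the description,
# with no engine changes - here, inserting a compliance step:
PROCESS["steps"].insert(1, {"kind": "system", "task": "policy_check"})
```

Mixing `system` and `human` step kinds in one sequence mirrors the point that process languages now cover human-to-human as well as system-to-system communication.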
There are still many complex common best practices. In the past, it was about functionality. Now we like to incorporate unstructured content into the process. For the first time, we can bring unstructured rules and content into the process layer.

The new generation of development language is 5GL, which increasingly takes the developer out of the process. Instead, we have the business driving the use of commodity-type components that are decoupled, with services that have been established. The challenge is how to make legacy on-premise systems multi-tenant to bring them into the cloud. The carrier for this new wave of enablement is the internet.

In the 1960s, data was king. Today it is the “meta-meta model” that enables collaborative process. With the internet as the delivery mechanism, the end device will be whatever the user likes. The crucial element is how to decouple different technology components. Collaborative workspaces will bring together different models under one view. With the XML standard, software itself can become increasingly based on components.

IT is now participating in business decisions. Previously, IT was regarded as a “necessary evil” for the business, but with increased flexibility from the cloud to orchestrate and integrate processes, IT is becoming an increasingly strategic part of the business.

In coming years, the sun will set on mainframe technology. However, client-server technology can continue to be useful for the next 20 years if we can decouple processes and maintain on-premise and in-the-cloud models.
Willem van Biljon, Nimbula

Willem van Biljon is a senior technology executive and entrepreneur who started his career building a UNIX-based operating system for mini-computers and the first retail debit-card payment system for one of the largest retailers in South Africa. Building on that expertise, he co-founded Mosaic Software to build the first high-end payment transaction switch for commodity hardware and operating systems. Mosaic became one of the world's leading electronic funds transfer companies, with operations in more than 30 countries, and was successfully sold to S1 Corp in 2004. Willem then joined Amazon to develop the Elastic Compute Cloud (EC2) service business plan and to drive product management and marketing for the service. Willem is a graduate of the University of Cape Town.

The next layer of cloud automation
Willem was part of the team that built the Amazon EC2 cloud, and his new company, Nimbula, is focused on creating a new layer of automation for a broad vision of cloud-based infrastructure. Our discussion focused on the next generation of cloud-infrastructure management as organizations increasingly seek to apply policy and provisioning in a dynamic fashion across public and private clouds.

Key points
- There will be a change in how people purchase computing resources, driven by the desire for flexible use of hardware, cost drivers, how applications are deployed, underutilization of servers and the desire to change how to pay for software.
- A new breed of cloud-infrastructure software will be able to automate the orchestration of cloud resources, policy management and federation between clouds, resulting in lower cost of implementation and management of datacenters.
- Automated and federated systems that orchestrate cloud services will reduce the cost and complexity of IT, making it more transparent.
- The continuum of new infrastructure and payment models allows for a stream of innovation by removing the investment hurdles and management requirements that previously stood in the way. Projects or investments that might have been unthinkable in the past because of resources will suddenly be trivial.

‘Previously, IT was task based, and hardware was purchased to solve a particular task or problem. If the task didn't warrant the cost of buying five to 10 servers or whatever the costs would be, then it never got done. If it now becomes trivially easy to do it, you are going to find pent-up demand for all of these tasks that really make sense to do now.’

Summary of interview on 2 August 2010. Full transcript follows.
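The kind of policy-driven provisioning across public and private clouds described above can be caricatured in a few lines. The cloud names, capacities and the policy rule below are all invented for illustration; real systems like the one Nimbula describes involve far richer policy and federation machinery.

```python
# Caricature of policy-driven placement federated across clouds.
# Cloud names, capacities and the policy rule are all invented.

CLOUDS = {
    "private": {"free_cores": 8,   "location": "on_premise"},
    "public":  {"free_cores": 512, "location": "external"},
}

def place(request, clouds):
    """Toy policy: workloads tagged sensitive must stay on premise;
    everything else lands wherever capacity exists, private first."""
    candidates = ["private"] if request["sensitive"] else ["private", "public"]
    for name in candidates:
        if clouds[name]["free_cores"] >= request["cores"]:
            clouds[name]["free_cores"] -= request["cores"]
            return name
    return None  # rejected by policy or capacity

first = place({"cores": 4, "sensitive": True}, CLOUDS)     # fits on premise
second = place({"cores": 64, "sensitive": False}, CLOUDS)  # bursts to public
third = place({"cores": 64, "sensitive": True}, CLOUDS)    # policy blocks it
```

The point of the sketch is only that placement, policy and capacity accounting sit in one automated layer, so users never reason about individual servers.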