
Accenture Technology Vision 2011
The technology waves that are reshaping the business landscape
Table of Contents

Foreword: Pierre Nanterme and Kevin Campbell
Introduction
Data Takes its Rightful Place as a Platform
Analytics Is Driving a Discontinuous Evolution from BI
Cloud Computing Will Create More Value Higher up the Stack
Architecture Will Shift from Server-centric to Service-centric
IT Security Will Respond Rapidly, Progressively—and in Proportion
Data Privacy Will Adopt a Risk-based Approach
Social Platforms Will Emerge as a New Source of Business Intelligence
User Experience is What Matters
Seeing Beyond the Walls: The Process of Change Begins Here
Notes
Research Methodology
Contacts
Foreword

What's next? That's a simple question to ask, but it's not so simple to answer. Our clients and our own company are constantly looking around the corner to see what's coming, and what the future will hold for our business and our lives.

The Accenture Technology Vision for 2011 represents our look toward the future of technology. But as you will see in this report, technology trends are not isolated and are intimately intertwined with business and societal trends. Our Technology Vision is as important for business and government leaders as it is for IT.

Technology touches everyone in the modern world. It's no longer on the sidelines in a support role, but instead is driving business performance and enriching people's lives like never before.

We encourage you to take the time to read this important piece, reflect on what the future of technology means for you and your organization, and then take all the steps necessary to deliver on technology's enormous potential to create value.

Pierre Nanterme
Chief Executive Officer
Accenture

Kevin Campbell
Group Chief Executive - Technology
Accenture
Introduction

The 1969 moon landing required radical, back-to-the-drawing-board ideas about everything from earth orbit to life in zero gravity. The entire program called for exceptional innovation, willingness to shed old dogma, unprecedented teamwork, and great boldness.

Just as the U.S. space program could not have put a man on the moon using conventional aviation technology, IT leaders and business executives cannot use yesterday's approaches to realize tomorrow's objectives. Their long-held assumptions are being turned upside down as three forces converge. First, price-performance ratios keep improving; we have access to a superabundance of computing power, network bandwidth, and storage capacity, all at lower and lower price points. Second: The expectations of consumers are changing dramatically because they are being exposed to technology choices that empower them as never before. And third: New technology trends put IT in position to drive innovation and growth rather than focusing on cost-cutting and efficiency improvements.

Many changes are under way in parallel. Some, like cloud computing, have been talked about and debated for years but only now are able to deliver their potential. Others, like the strategic recognition of the importance of data, are just now becoming apparent. Still others, like data privacy, are being propelled by a worldwide wave of concern about individual rights and the greatly expanded potential for abuse of those rights in an information age.

So this year's Accenture Technology Vision report tells a story of discontinuous change. That story is apparent in the trends that comprise the core of this report—the trends that will have the greatest impact on performance in the future.

Three threads run through the report:

1. Things will be distributed
The obvious and immediate realization is that data today is spread far and wide. Data is also dispersed across many more locations, and under the control of far more owners. At the same time, services will be distributed more widely. Analytics will follow data and services, and will become distributed too. All of which accentuates the importance of factors such as master data management, secure communications, and federated identity.

2. Things will be decoupled
Technology today enables decoupling of entities and layers once deemed inseparable. Data representation is being decoupled from data access. Software layers can be addressed separately. Application interfaces no longer need to be tied to physical interfaces. Decoupling on such a scale promises unprecedented agility and flexibility. But it also calls for a very different mindset—and skills set—and for wise governance disciplines.

3. Things will be analyzed
Since everything from keystrokes to consumer behavior can now be tracked and studied, analytics will become the super-tool with which to drive more agile and effective decision-making. Business processes will have to keep pace if those super-tools are to be effective. There are a host of positive implications, in categories as diverse as customer intelligence and threat detection. But there is no shortage of negative implications—among them the risks to data privacy and the over-optimization of business processes.

IT and business leaders who see and understand the significance of the technology changes now under way will be those who are best placed to help their organizations outperform. They will root their observations and actions in nine core capabilities essential for the effective operation of all IT departments. And while they will continually sharpen the skills specific to their roles, they will never limit their perspectives to their areas of specialty.
Data Takes its Rightful Place as a Platform

The age of viewing everything through an application lens is coming to an end. Coming next: a world in which the quantity, processing speeds, and distribution of data compel IT leaders to see the world through a data lens.

Generations of programmers and architects have grown up thinking in terms of applications—seeing the world through the lens of the functions that the business has needed and with data being the object, not the subject. That thinking will change. Although a focus on applications will continue to be important, it will give way to an emphasis on data. It is our belief that in the near future, platform architectures will be selected primarily to cope with soaring volumes of data and the complexity of data management—not for their ability to support this or that application.

Whereas traditional databases are designed to keep track of where the data is stored and how it can and should be accessed, data platforms will provide a layer of abstraction that hides the data's location, and is not concerned with the form in which the data is stored or how its consistency is maintained. So in effect, the data representation architecture will be decoupled from the application. Data architecture, much like today's application architecture, will refer to abstraction layers and separation of concerns, and not just to data models.

Recognizing three catalysts

Three factors will drive this shift in perspective: dramatic increase in the quantity of data; sustained processing requirements in the context of growing data volumes; and widening distribution of data. Let's look at each factor in turn.

To start with, there are enormous increases not only in the volumes but in types of data—more and more transactional data, a surge in meta-data, an explosion of sensor data, and a staggering rise in the volumes of unstructured data such as e-mails, tweets, blogs, video clips, and more. Second, because organizations want to do more with the data in service of their business processes, there are very strict performance requirements for analyzing the vast new data volumes. As a result, new processing architectures emphasizing horizontal scaling will be increasingly deployed to help meet the performance requirements.

The third factor is that data is no longer "contained" in an enterprise data center—nor is it necessarily "owned" by the enterprise. The adoption of cloud computing will lead to corporate data being distributed across the enterprise boundary and potentially among several cloud providers. Further, publicly available information as well as data from third-party data service providers is rapidly becoming a crucial part of the mix.

Distributed data is the new normal

While it is true that data today is already distributed among different data centers and application silos, we are talking about distributed ownership and control of data, which requires a different approach to management of the data as well as its security and governance. If IT leaders are not already facing up to the fact that they must deal with data that resides outside their enterprises, they will very soon have to do so.

The distribution of ownership will have a host of consequences. For a start, distributed data will be more frequently shared across applications. For instance, one process, say one involving an asset tracking application, may produce data and require only a basic level of data storage. But if another process, such as an inventory management or supply chain application, now needs to use that data in a mission-critical context, the new demand completely changes the storage and access requirements for the data.

At the same time, master data management (MDM) will become considerably more complex than it is today. Specifically, MDM needs to keep track of the origin and location of data, access policies, backup frequencies, degrees of redundancy, location of ownership of meta-data, etc. We expect that this will create big headaches because MDM will become crucial when already scarce MDM skills will become scarcer.

The shift toward a data platform mindset will turn the spotlight on alternative databases. The trusted relational database is not about to be retired, of course. But it will soon start to make way for other types of databases—streaming databases, for instance—that mark a significant departure from what IT departments and business users have relied upon for decades. Naturally, not all of the new database technologies will be right for every user in every circumstance; cost, flexibility, reliability, and speed will be key motivators. It is essential for IT leaders to start thinking beyond conventional constructs in terms of how data is organized, accessed, and managed.

Then there are the questions about backup and recovery, which become far more complex in a world where data may reside with several cloud vendors. Each vendor is likely to have its own backup frequency, compounding the difficulty of taking snapshots of the data and of managing recovery from disaster consistently. And there is the challenge of recombining data dispersed among different providers into a single view of the truth.

Related to this point is the question of how to ensure the destruction of sensitive but old data and handle the destruction audits. How does an organization now confirm that data has been destroyed according to its policies? And when auditing data destruction, it must be possible to audit the data paths too—tracking all of the places where the data may have left a trace along the way.

Our prediction is that these factors will lead to new value-added services from cloud providers or will become part of future service-level agreements (SLAs) that differentiate cloud providers.

Distribution will also affect data quality. How can we effectively detect duplication and inconsistent data when it is distributed across different silos, vendors, and providers? And how can we tell when data duplication is beneficial—and should be planned for—as opposed to accidental and potentially unsafe? Therefore it is necessary for IT leaders to reframe the whole concept of data quality more broadly around the idea of data value and utility.

We believe that more and more organizations will come to see data as something that can bestow a competitive advantage, and begin to view application services and algorithms as utilities that can be procured off the shelf. In other words, the roles of application and data will be reversed, with data becoming the platform that supports application services.

Think data utility, not just data quality

The concept of data quality will soon give way to the idea of data utility, which is a more sophisticated measure of fitness for purpose. This will get IT departments away from often-fruitless discussions about the cleanliness of data and toward productive talks about what can be done with the data on hand. Importantly, it will allow them to apply semantic and analytic tools to extract useful insights from inaccurate data and to integrate data silos more easily.

We characterize data utility in eight ways:
• Quality: Data quality, while important, will be one of the many dimensions of data utility.
• Structure: The mix of structured and unstructured data will have a big impact, and will vary from task to task.
• Externality: The balance between internal and external data will be important, with implications for factors such as competitive advantage (external data may be less trustworthy, yet fine for certain analysis or tasks).
• Stability: A key question is how frequently the data changes.
• Granularity: It is important to know whether the data is at the right level of detail.
• Freshness: Data utility can be compromised if a large portion of the data is out of date.
• Context dependency: It's necessary to understand how much context (meta-information) is required to interpret the data.
• Provenance: It is valuable to know where the data has come from, where it has been, and where it is being used.

Action step

Begin to reframe IT's perspectives around the idea of data platforms—and start the conversations and workshops that enable those perspectives to take hold quickly.
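The eight dimensions in the sidebar above lend themselves to a simple fitness-for-purpose calculation. The sketch below is illustrative only: the dimension names come from the report, while the 0-to-1 scoring scale, the weights, and the example values are assumptions added here.

```python
from dataclasses import dataclass

# The eight data-utility dimensions named in the report. Each dimension is
# rated 0.0 (unusable) to 1.0 (ideal) for a specific task, and the weights
# express how much that task cares about each dimension.
DIMENSIONS = [
    "quality", "structure", "externality", "stability",
    "granularity", "freshness", "context_dependency", "provenance",
]

@dataclass
class DataUtilityProfile:
    scores: dict   # dimension -> 0.0..1.0 rating for this dataset
    weights: dict  # dimension -> relative importance for the task at hand

    def utility(self) -> float:
        """Weighted fitness-for-purpose score in the range 0.0..1.0."""
        total_weight = sum(self.weights.get(d, 0.0) for d in DIMENSIONS)
        if total_weight == 0:
            return 0.0
        weighted = sum(
            self.scores.get(d, 0.0) * self.weights.get(d, 0.0)
            for d in DIMENSIONS
        )
        return weighted / total_weight

# Example: third-party social data for a marketing analysis. Quality is
# mediocre, but the task tolerates that; freshness and provenance matter more.
profile = DataUtilityProfile(
    scores={"quality": 0.5, "structure": 0.4, "externality": 0.9,
            "stability": 0.3, "granularity": 0.7, "freshness": 0.9,
            "context_dependency": 0.6, "provenance": 0.8},
    weights={"quality": 1.0, "structure": 0.5, "externality": 1.0,
             "stability": 0.5, "granularity": 1.0, "freshness": 2.0,
             "context_dependency": 0.5, "provenance": 2.0},
)
print(f"Utility for this task: {profile.utility():.2f}")
```

The same dataset would score differently for a different task simply by changing the weights, which is the point of measuring utility rather than quality alone.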
Analytics Is Driving a Discontinuous Evolution from BI

Analytics is emerging as a major differentiator and value creator for businesses. But to reap the real benefits, companies must see analytics as a discontinuous change that will involve several different architectures and deployment models.

Analytics drives insights; insights lead to greater understanding of customers and markets; that understanding yields innovative products, better customer targeting, improved pricing, and superior growth in both revenue and profits.

That's why farsighted companies are viewing analytics as essential for creating value. In contrast, their peers who think about analytics only as a simple extension of business intelligence (BI) are severely underestimating the potential of analytics to move the needle on the business. For one thing, they overlook the fact that traditional BI does not address the wealth of unstructured data that is now available.

So what does the future of analytics look like for IT organizations? First of all, despite a steady drumbeat calling for the integration of data across an organization, there will be no such thing as an integrated analytics platform, technology, or deployment model. The emergence of technologies such as cloud computing is changing how data is generated, collected, and stored across an organization. In practice, this will require a distributed approach to analytics.

This distributed approach will require different ideas about who is best at doing what. In general, we expect to see some companies sourcing from third parties the deep analytics skills required for, say, customer segmentation, route planning or process optimization, and keeping in-house the even deeper skills for the interpretation of the results.

As analytics becomes integrated into the underlying technology platforms, one existing challenge will ease. ETL (extract, transform, and load), the process of retrieving data and preparing it for analysis, has traditionally been the most time-consuming element of any analytics project. But ETL will become easier as data quality tools improve, analytics applications become more tolerant of "noisy" data, and ad hoc capabilities are replaced with integrated platforms.

However, new challenges will appear. As distributed data becomes the new normal, we will see the emergence of distributed ETL—that is, the need to extract data from multiple on-premise and off-premise platforms in order to run centralized analysis. Call it the price of progress.

The quest for closed-loop nirvana continues

The ultimate goal for analytics-savvy organizations is complete integration of analytics with business process automation, leading to a true "closed loop" capability that integrates analytics with automated responses to the results of the analysis. While this analytics nirvana won't be achievable anytime soon, we will begin to see less complex and more pragmatic levels of integration between analytics and business logic embedded in IT systems.

Leading IT organizations will go through a progression, moving from traditional BI (reporting) to business activity monitoring (BAM) to measure specific business activities and report business metrics. From there they'll proceed to predictive analytics, in which business rules and processes are adjusted to address business changes – a peak sales period, for example, or a market disruption.

This move to predictive analytics will drive the use of analytics to acquire new data to fill knowledge gaps – which will further improve analysis and decision-making. Think of how the combination of additional data can provide deeper context to improve the quality of analysis. A sales team for a retailer could compare regional point-of-sale data with local weather, for example, to gain better insights into customer behavior.

To truly take advantage of analytics, businesses need to integrate their analytics capabilities into their business rules and processes to connect the relevant insights across all stages of decision-making. Business processes involve a series of discrete decision points: demand prediction, pricing, promotion, etc. Today, businesses tend to apply analytics to these decision points in isolation. While this approach can certainly improve decisions at each step, it can also lead to problems.

Consider, for example, the well-known bullwhip effect, which shows that even if each stage of a process is optimized based on the data that's available at that stage, the overall process will still be suboptimal because different decision points are not coordinated and do not have visibility into the entire process. Because every decision in a process generally is predicated on the outcomes of prior decisions, analytics must integrate all decision points to provide an understanding of the larger decision process.

As decisions across the process become apparent, companies can eliminate undesirable side effects and optimize business processes enterprise-wide, leading to better results. The best way to head down this path is to start decoupling processes and rules from applications to define the primary decision points.

The need for analytical literacy

The growing sophistication of analytics capabilities and supporting technologies will open up the risk of "oversteering"—of making increasingly frequent, fine-grained decisions. That's especially so because business users will be tempted to "get value" from these powerful tools. However, too-frequent optimization can be counterproductive when the decision-making time scale is not appropriate for the process to which it is applied. For example, just because a utility can gather real-time data on fuel (e.g., oil, gas, coal) pricing from the markets doesn't mean it should be changing its generation mix every 30 seconds – especially if its business processes are tuned for long-term fuel contracts.

So how can companies avoid analytics-induced course corrections that do more harm than good? Effective use of analytics will require considerable analytical literacy. Remember that real-time data does not necessitate real-time decisions. Decision-making with analytics requires an understanding of the sampling rates of different events and their interdependencies, because decision-making must be consistent with the time scale of data. The business process should dictate the analytics; not vice versa.

Brute-force improvements

Don't expect sophisticated, handcrafted analytical models to drive performance improvements in analytics. A more likely driver: brute-force computational power applied to larger data sets.

Traditionally, analytics solutions have been constrained by computing performance and limited availability of data. Improvements often stemmed from meticulous program optimization, better algorithms, simplified assumptions, and other methods for milking limited data for all it was worth.

Increasingly, however, analytical improvements are coming from the availability of greater computing power applied to more data. Utilizing machine learning techniques instead of handcrafted rules, analytics teams will gain the scale needed to match the increasing complexity of business problems.

Action step

Determining the right approach to analytics involves many critical decisions; IT executives should work closely with business leaders to identify where analytics and insights can be leveraged most effectively as well as the proper mix of services required to optimize analytics capabilities across the enterprise.
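The chapter's caution that real-time data does not necessitate real-time decisions can be expressed as a small guard in a closed-loop system. The Python sketch below is a hypothetical illustration: the class name, the decision interval, and the 5 percent threshold used in the fuel-mix example are assumptions, not part of the report.

```python
from datetime import timedelta

# Illustrative sketch of "analytical literacy": a closed-loop controller should
# act on the business process's time scale, not on the data's arrival rate.

class ClosedLoopDecision:
    def __init__(self, decision_interval: timedelta, change_threshold: float):
        self.decision_interval = decision_interval  # process time scale
        self.change_threshold = change_threshold    # minimum signal worth acting on
        self._since_last_decision = timedelta(0)

    def observe(self, elapsed: timedelta, signal_change: float) -> bool:
        """Return True only when it is appropriate to re-optimize."""
        self._since_last_decision += elapsed
        too_soon = self._since_last_decision < self.decision_interval
        too_small = abs(signal_change) < self.change_threshold
        if too_soon or too_small:
            return False          # real-time data, but no real-time decision
        self._since_last_decision = timedelta(0)
        return True               # hand off to the automated response

# A utility sees fuel prices tick every 30 seconds, but its generation mix is
# re-planned weekly and only when prices move more than 5 percent.
fuel_mix_loop = ClosedLoopDecision(timedelta(days=7), change_threshold=0.05)
print(fuel_mix_loop.observe(timedelta(seconds=30), signal_change=0.02))  # False
print(fuel_mix_loop.observe(timedelta(days=8), signal_change=0.09))      # True
```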
Cloud Computing Will Create More Value Higher up the Stack

The current focus on infrastructure cloud doesn't help organizations differentiate themselves. Together, SaaS and PaaS rather than IaaS will enable IT to create value through a combination of cost reduction, speed to market, agility, and the ability to gracefully integrate business processes with partners and suppliers.

There's no denying the momentum of cloud computing. Accenture's research shows that enterprises are already moving applications into the cloud. [1, 2] The demand is anything but an IT fad; it is coming from a host of business functions. And it is truly a global phenomenon; companies everywhere from Brazil to China are moving ahead rapidly with adoption. It's clear that IT and business executives should expect cloud computing to become ever more pervasive – to the point that the term "cloud computing" itself becomes superfluous.

But what's needed now is a shift in thinking from obvious but non-differentiating benefits such as cost reduction through cloud infrastructure to where the cloud will have its real impact. When we look at the different facets of cloud computing – Infrastructure-as-a-Service (IaaS), Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and so on – it is easier to see that most of the current emphasis on cloud is focused on the lower levels of the technology stack. For many large enterprises, the logical next step after virtualizing their data centers has been to leverage IaaS to augment those centers.

However, IaaS is becoming a commodity. We see much greater value in SaaS and PaaS, higher up the stack. That is where tomorrow's leaders will find real differentiation. Hybrid clouds – i.e., SaaS and PaaS in combination with internal applications – will help organizations accomplish tasks that they cannot accomplish today and will cement IT's role as a driver of business growth.

The reason? Unlike IaaS, which is driven mostly by cost-management objectives, the migration of applications and application development to the cloud will be based more on business need. SaaS-based applications reduce concerns about the infrastructure and help organizations get to market quickly. When SaaS also provides a platform, it makes it easier to customize and to tap into an expanding ecosystem of third-party applications. Service providers will begin to provide targeted vertical solutions as a way to entice more enterprises to move their applications to the cloud.

Hybrid solutions will emerge as the dominant model

Cloud computing deployments will take many forms. More organizations will deploy virtualized desktop infrastructure (VDI) for high-security or highly standardized desktop environments to manage sensitive data centrally and keep it out of the individual's control. Private clouds will also be utilized for development and testing, along with "transient" applications, such as product demonstrations for customers, that can be set up and retired quickly. Public clouds will be leveraged more for non-differentiating applications or for "cloudbursting" – a computing-on-demand model for processing heavy, short-term workloads.

But we expect hybrid clouds – mixing public and one or more private services – to emerge as the dominant model in most enterprises. As data and services are spread across a variety of service providers, hybrid models will provide the best balance of flexibility while managing risk. In such an environment, IT will focus on orchestrating the business process while treating everything else as a service. This will force service providers to compete less on price and more on how well they can differentiate their offerings based on factors such as quality of service and robust application catalogs.

Economics of the cloud: a new game

As cloud services proliferate, conversations about cost will have to change. We expect the shift to move from the cost of discrete IT components to a discussion about the total cost of ownership (TCO) of cloud solutions. Although IT TCO constructs are well known and well accepted, measuring TCO in the cloud is a mystery. It's fair to say that currently, there are no recognized models for cloud TCO. The current emphasis is on economies of scale from volume, automation, commoditization, and consolidation.

But cloud hosting is not merely a summation of compute hours, storage bytes, and network bandwidth. There are many implicit cost elements, involving quality of service, staff and skill requirements, the granularity of licensing costs and charge backs, the unknown costs of skills loss, the implicit and explicit insurance cost to offset downtime, and the unpredictability of operating expenses compared with capital expenses.

In the future, companies will need to examine the full life-cycle cost when considering public, private, or hybrid cloud services. For example, a short application life span may not warrant an investment in supporting infrastructure, making a case to move the application to the cloud. Load elasticity is another consideration; the ability to reduce utilization when demand is low makes costs variable as well. Cost decisions will need to be built into architecture planning and provider selection – traditionally the realm of architects, not finance professionals. As a consequence, the technical sourcing of cloud providers will become a new skill that organizations have to acquire and master.

Many technical and business challenges remain

Before the full potential of cloud computing can be realized, companies and their service providers have plenty of technical and business hurdles to cross. Technologically, IT teams will have to develop strategies for implementing and managing federated identities, which are necessary for enabling consistent permissions, roles, and traceability across multiple service providers. Organizations' IT leaders will have to work with cloud service providers to determine the right federation rules for their communities.

Federated identities are but one of many technical challenges. Concerns for providers include: platform-level version management activities (the ability to manage local patches, versioning, and upgrades without introducing risk to cloud consumers); and richer application programming interfaces (APIs), which are necessary for more sophisticated cloud applications. Concerns for users involve: consistent policy enforcement across cloud providers; access to data and event meta-data, which is critical for analytics; governance over applications and data that is stored in third-party data centers; better fault tolerance for application architectures; and network latency.

In addition to the technical problems, several business issues in deploying cloud solutions need to be solved as well. IT teams will have to ensure that any cloud services contain facilities for e-discovery and disaster recovery – processes made more complex by the distributed nature of cloud-based information. They will need new processes and procedures for tracking user activities and data paths across cloud-based and on-premise systems. While license management may become easier, SLA management will become far more complex when incorporating cloud providers into the mix.

The lock-in problem

Cloud-based solutions have low barriers to entry. There is no procurement or development lead time; getting into the cloud needs nothing more than a credit card. But getting out of the cloud will be a lot harder than getting in—especially with SaaS. SaaS vendors have their own styles for implementing data models, meta-data, user group administration, etc. The same issues apply to security domain models, proprietary scripting and markups, and proprietary meta-data schemas. So taking all of the data back will require a full IT migration project.

Cloud lock-in can occur along several dimensions: app stores, data migration schema, and skills. Further, if the cloud service provider finds itself in trouble, needing to raise prices to stay in business, its customers will have little bargaining power because they need the provider to remain viable for the sake of their own business continuity.

So the selection of a service provider must come with the realization that a shift to another provider will call for re-implementing the application or expending considerable effort to migrate to a new provider.

Action step

Deploying infrastructure cloud today should not distract from planning for the transformational, differentiating opportunities that SaaS/PaaS offer tomorrow. The main focus should be on developing a cloud strategy that drives business transformation by delivering increased functionality and flexibility using a mix of public and private cloud-based application and platform services.
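The point that cloud TCO is more than a summation of compute hours, storage bytes, and network bandwidth can be made concrete with a rough life-cycle comparison. In the sketch below, only the first three cost categories come from the report; the remaining categories, the function name, and all figures are illustrative assumptions.

```python
# A minimal sketch of cloud TCO that adds the "implicit" cost elements the
# report mentions to the obvious metered charges. Every number is invented.

def cloud_tco(
    compute: float,          # metered compute charges over the app's life span
    storage: float,          # metered storage charges
    bandwidth: float,        # metered network charges
    migration: float = 0.0,  # one-off cost of moving in (and, later, out)
    skills: float = 0.0,     # training, plus the implicit cost of skills loss
    downtime_insurance: float = 0.0,  # explicit/implicit cover for outages
    licensing_chargebacks: float = 0.0,
) -> float:
    """Life-cycle total cost of ownership for one deployment option."""
    return sum([compute, storage, bandwidth, migration, skills,
                downtime_insurance, licensing_chargebacks])

# Compare keeping a short-lived application on-premise vs. moving it to a
# public cloud, including the less obvious cost elements.
on_premise = cloud_tco(compute=180_000, storage=40_000, bandwidth=10_000,
                       skills=25_000, licensing_chargebacks=30_000)
public_cloud = cloud_tco(compute=95_000, storage=20_000, bandwidth=15_000,
                         migration=40_000, skills=15_000,
                         downtime_insurance=10_000)
print("cheaper option:", "public cloud" if public_cloud < on_premise else "on-premise")
```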
Architecture Will Shift from Server-centric to Service-centric

More mature tools, frameworks, and methodologies will pave the way toward process-centric IT. Rapid advances in infrastructure will drive a new generation of system architecture that will be reconfigurable at runtime to operate on different infrastructures.

Information technology is evolving from a world that is server-centric to one that is service-centric. Companies are quickly moving away from monolithic systems that were wedded to one or more servers toward finer-grained, reusable services distributed inside and outside the enterprise.

The evolution is being driven by the ongoing maturation of supporting tools, frameworks, and methodologies. There is still much to be done to decouple infrastructure, systems, applications, and business processes from one another. This shift has major repercussions for all levels of the enterprise architecture stack, from infrastructure to applications. Decoupling will enable components to operate independently while making software architectures reconfigurable during run time to adapt to various environments and design objectives, which will increase the flexibility of application deployment and maintenance. Although dynamic reconfiguration is not a new concept in academia, advances in cloud technology at all layers of the stack create a burning platform for such architecture.

Dynamic reconfiguration will take place at the business process level – in which processes can switch to an alternate service provider in response to an outage or promotion – and at the infrastructure level, in which servers or nodes can be added through cloudbursting and similar techniques to handle temporary, peak processing needs, or discrete projects. In the future, business processes, not technology functionality, will dictate how or when you use these scaling mechanisms. Business rules will help make this a reality.

The most interesting part of this movement is that infrastructure – commonly viewed as a constraint to application functionality – will become the primary driver of architectural change. Infrastructure (e.g., computing, storage, and network connectivity) and applications have always been tightly coupled, even though their architectures evolve at dramatically different paces. The life span of an enterprise application is measured in decades – most mature companies have 45 years or more of intellectual property locked up in custom-built applications and data structures [3] – while dozens of start-ups promising infrastructure innovations pop up every month.

As a result, applications that are tightly coupled to their infrastructures often restrict access to new innovations in infrastructure. An application that relies on a single instance, for example, cannot take advantage of horizontal scaling technologies.

The case for decoupling

Why is it imperative that the application architecture be decoupled from infrastructure? Application design typically makes assumptions about certain characteristics of the infrastructure required to run it. These assumptions tie the application architecture to the architecture of the infrastructure for which it was originally designed – assumptions that must be preserved even as the infrastructure improves. This is the curse of backward compatibility, in which support for new functionality must be sacrificed for the applications to continue to run reliably.

New architectures, on the other hand, allow – or even require – companies to relax many long-standing assumptions about application and infrastructure. For example, no longer will developers be restricted to a single box with fixed capabilities, a single instruction or processing stream, pure vertical scaling, or the co-location of processing and data. Instead, infrastructure is being provided as services that can be chosen, procured, and configured based on application requirements.

We are already seeing horizontal scaling at the chip level, the grid architecture level, and the server level. Parallelism and new processing mechanisms such as MapReduce, non-relational databases, virtualization, and fabric computing are rapidly becoming the mainstream today. There is a reverse trend as well, as evidenced by the rise in special-purpose appliances where complete stacks have been collapsed into an appliance, effectively creating a service in a box. For example, SAP released SAP HANA (High-Performance Analytic Appliance) recently, optimized for in-memory execution of applications such as its BusinessObjects offering. [4]

Going forward, architectural heterogeneity will only increase as hybrid clouds, distributed data, parallel algorithms, non-relational databases, service-centric applications, and specialized appliance-based applications all coexist, and new processing paradigms emerge. Application architectures will continue to be decoupled from infrastructure and from business processes, resulting in applications that are self-describing, self-correcting, self-scaling, and self-modifying.

In this new paradigm, applications will be immune to any underlying changes in data representation or infrastructure, resulting in scalability, increased performance, and cost reduction thanks to the adoption of new advances in infrastructure. This will speed the adoption of new technologies and enable companies to upgrade or migrate at will.

To fully reap the benefits of cloud-based infrastructure and service orientation, system design will have to adhere to well-established best practices that will enable systems to be dynamically reconfigurable. Some examples include: using parallelization frameworks for flexible scaling; avoiding explicit assumptions regarding service configuration in the service design; using intermediaries like a service bus to avoid direct communication between services and data stores; and isolating the stateful and stateless components of an application from each other.

Ultimately, this means that a large part of what used to be hardwired into the system during design must be made configurable at run time so the system can adapt to the highly dynamic operating environment. The system may have to be designed around the possibility of more frequent failures, requiring more attention to managing state in order to recover from failures with a consistent state.

Innovations must trickle up the stack

IT organizations have been excited by the agility that service orientation, business process management, and other emerging technologies have provided to date. But they are seeking even greater agility. Innovations in lower levels of the stack require flexibility in the higher levels to deliver the true potential of service-centric IT.

As layers of the architectural stack are decoupled from one another, languages and notations for formal communication between the layers will become necessary. Today, an advanced application might have logic embedded in it that triggers other actions as the application approaches peak capacity, for example. Tomorrow, communications between layers will alert higher levels of the stack to the fact that the application is approaching capacity, and a combination of business rules and business processes will handle the issue rather than the infrastructure layer making a decision in isolation.

Better inter-layer communications will enable dynamically reconfigurable architectures to be self-monitoring – meaning that they can generate events based on changes to the environment. These changes may take many forms. They may involve new demand—for instance, when the infrastructure is reaching capacity, as would be the case if a cloud provider has an outage. They could be configuration changes, such as new capabilities in underlying services, and organizational, as in the case of adding a new outsourcing provider. Business processes will have to respond to these events appropriately: by executing existing services differently or switching to new services, by invoking different business rules for evaluating decisions, or by modifying the underlying business process itself.

The approach to system and enterprise architecture will fundamentally change as we go from applications that are tightly coupled to infrastructure to those that are infrastructure-agnostic and, eventually, infrastructure-aware.

Ultimately, applications will control the infrastructure instead of being constrained by it. These types of dynamically reconfigurable services are a key element of the next generation of software architecture that will increase IT's agility – and thus boost its ability to innovate and deliver business value.

Software engineering becomes more science than art

Watch for software development to become a lot more predictable and easier to measure and manage. New and highly sophisticated tools, exploiting analytics and standardized instrumentation, are already becoming integral to the tooling used by leading developers.

Instrumentation is starting to provide richly detailed information across the entire development process. And analytic techniques, applied across multiple teams and projects, will allow longitudinal tracking and benchmarks against which new projects can be measured and evaluated.

Examples of tools that enable instrumentation and analytics include Rational Jazz, a framework for connecting to underlying tools such as ClearCase and Subversion, Microsoft Visual Studio's Team Foundation Server, and Hackystat, an open source instrumentation framework.

These advanced tools make it possible to measure software development processes in systematic, consistent ways. As a result, software project managers will be able to estimate and plan development projects more precisely and with more predictability—for example, tracking team and individual productivity, or monitoring the quality of code over time. And when necessary, the new approaches will help managers to take corrective action much earlier.

Action step

Explore ways to begin decoupling applications from infrastructure as part of your life-cycle management strategy.
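One way to picture business-rule-driven reconfiguration at run time is an intermediary, standing in for a service bus, that routes a capability to an alternate provider when the primary one fails. The sketch below is a simplified illustration; the provider functions, the simulated outage, and the ServiceBus class are assumptions introduced here.

```python
import random

# Minimal sketch: callers go through an intermediary (a stand-in for a service
# bus) that can switch to an alternate provider when the primary is unavailable.

class ProviderUnavailable(Exception):
    pass

def primary_pricing_service(order_total: float) -> float:
    if random.random() < 0.3:                 # simulate an occasional outage
        raise ProviderUnavailable("primary pricing provider down")
    return round(order_total * 0.95, 2)       # primary applies a 5% promotion

def fallback_pricing_service(order_total: float) -> float:
    return round(order_total, 2)              # fallback: list price, no promotion

class ServiceBus:
    """Routes a named capability to whichever registered provider responds."""
    def __init__(self):
        self._providers = {}

    def register(self, capability: str, *providers):
        self._providers[capability] = list(providers)

    def call(self, capability: str, *args):
        for provider in self._providers[capability]:
            try:
                return provider(*args)
            except ProviderUnavailable:
                continue                      # business rule: try the next provider
        raise RuntimeError(f"no provider available for {capability}")

bus = ServiceBus()
bus.register("pricing", primary_pricing_service, fallback_pricing_service)
print(bus.call("pricing", 120.0))             # succeeds via primary or fallback
```

Because the caller only knows the capability name, the routing rule can be changed, or a new provider added, without touching the application logic.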
IT Security Will Respond Rapidly, Progressively—and in Proportion

In the past, IT has architected everything around the idea of "100 percent" security. This fortress mentality must now give way to a realistic and practical approach to IT security. What's needed now is a cascaded, reflex-like security architecture that responds proportionately to threats when and where they happen.

There is no such thing as watertight IT security. Yet for years, business and technology leaders have acted as if the only alternative to a "fully secure" state is an unacceptable "fully breached" state.

This "fortress mentality" is outdated—and no longer realistic or practical. Leading security specialists are devising reflex-like systems whose responses step up with the severity of the breach. In extreme cases, counterattacks may even become part of an organization's repertoire of responses.

We believe that new security solutions and architectures will, like human reflexes, respond instinctively to the growing speed, scale, and variety of attacks. This implies that for the first line of defense, people will not be part of the decision loops; the speed and frequency of attacks dictate that human responses must make way for automated capabilities that detect, assess, and respond immediately. And the increasing "attack surface" – across more devices, more systems, more people, more business partners, and broader physical infrastructure – supports the case for automated capabilities that detect, assess, and respond to external threats immediately.

Leading organizations will understand the consequences of inevitable data leaks. For example, they will know that the leak of data about a major retailer's transportation routes does not automatically mean that rivals can replicate the retailer's supply chain. They will be able to think beyond the simple binary notion that their organizations are either secure or have been breached.

Those organizations will know that different levels of attacks require different speed, scale, and types of responses. A cascade of responses might, for example, involve the immediate shutdown of one portion of a network coupled with active monitoring, which, if it detects that the threat is large or moving fast, shuts down other parts of the network. More sophisticated responses will also place a premium on using intelligence-gathering and forensics techniques to learn about adversaries.

Overall, it will be essential to step up the organization's collective understanding of what "security" really means. That will involve moving away from simple low, medium, and high brackets and assessments at the level of the individual object and toward the security of a network of interconnected objects. In essence, security will become a fluid continuum across the network.

The bottom line: Automation will quickly become a "must-have" component in the overall security strategy of every IT organization. There is simply no other way to detect threats swiftly enough, let alone to contain the damage and recover from it.

Teaming up to fight back

In extreme circumstances, it may be necessary to actively fight back. Given the speed and scale of today's cyber-attacks and the growing significance of the inflicted damage, organizations can no longer stay only in defensive mode. Cyber crime accounts for an estimated $1 trillion in annual losses to businesses around the world. [5] Counterattacks, then, are likely to become part of the strategy for augmenting IT security. Organizations are likely to collaborate to counterattack their assailants, not least because few organizations have the resources needed to defend themselves.

Efforts are underway to develop the systems that can enable countermeasures. One example: Sypris Electronics recently unveiled plans to create an international "cyber range." The company hopes the "range" will become the preferred practice battlefield for digital warfare where military, government agencies, and, later, businesses running critical infrastructure services can test their defensive – and offensive – firepower against cyber enemies. The range is expected to be operational by the first quarter of 2011. [6]

In parallel, offensive "weapons" are being explored and tried out. Researchers successfully demonstrated counter-hacking techniques at the 2010 Black Hat security conference. They reverse-engineered a bot and looked for flaws in the bot itself to exploit, in the same way that the bot uses flaws in commercial software. From there they were able to retrace and, in theory, take control of the botnet itself. [7] At the same time, Microsoft and others have partnered to attack malware spread by registering the domain names necessary for continuing the exploit. A pseudorandom technique generates a large number of domains per day that the worm looks for in order to connect to the command-and-control structure. The team is either pre-registering, or at least flagging, those domains to block the spread of malware. [8]

The idea of counterattacks is fraught with policy and governance headaches. To begin with, the amorphous nature of the Internet makes it very easy for attackers to hide – and makes it equally easy for counterattacks to harm the innocent. Jurisdictions will complicate matters further. Whose laws apply if a Japanese company counterattacks a Latvian hacker who is using botnet computers in Norway?

Making sure every router and every chip is secure

We also believe that IT security will soon expand beyond just securing information systems to securing critical business processes. Think of this as a continuation of the conversation about closer alignment between IT and the business, so that IT staff have a better understanding of what's important about key business processes and are able to quickly identify the security vulnerabilities of those processes and build defenses against them.

This more holistic view will use "integrity measurement" tools to reach into every corner of the IT realm. The tools will be used to gauge the trustworthiness of everything from processor chips to software to smartphones to servers and entire clouds. Already there are warnings that most cyber attacks are enabled by programming errors—and consequent calls from security experts for technology vendors to work within a strict set of software development security standards.

Counterfeit IT products

This far-reaching view of security extends to the proliferation of counterfeit products. As far as IT security is concerned, a worrying question is: "If we do discover fake products, how do we know they aren't exporting data to a hacker cartel somewhere?" The counterfeiting problem is very real. For example, fake products – particularly counterfeit network equipment – have been the target of enforcement initiatives across several countries, leading to seizures worth more than $100 million.

Recognizing that people are the weakest security links, more and more enterprises will shift a large part of their security spend to training and cultural change programs. The motivation is as profound as it is trivial: the recognition that people are the weakest link when it comes to security. We expect that organizations will make "security fluency" central to their corporate culture; in those organizations, even job interviews are likely to involve ways of gauging candidates' security consciousness.

There is one other facet of IT security that is worth touching on. We expect leading organizations to revisit their approaches to federation of identity—to start thinking in terms of federated identity that extends beyond the boundaries of any one organization so that an employee or contractor's identity can follow him or her from place to place.

The importance of being able to prove you're you

Identity will become even more important in the future. And biometrics—already established as a security tool—will augment and steadily replace other methods of identification and authorization. The importance of identity will drive biometrics adoption in two very different directions: high-security uses in government and business, and convenience-driven uses for average individuals.

The catalysts for these twin tracks are clear. There is certainly a growing emphasis on identity for consumer and enterprise needs; mobility, health, and e-commerce all require strong forms of authentication in the face of increasing security threats. At the same time, biometrics solutions are becoming more affordable as deployment of high-value solutions brings down unit costs, enabling other business processes to take advantage of the technology. In short, organizations will have more options for acquiring and implementing biometrics at differing investment levels.

In the high-security realm, we expect biometrics to evolve from a high-end, James Bond–type specialist technology to the primary tool for high-volume applications with a strong security requirement – for instance, border control, voting, or police applications. We also anticipate that biometrics solutions will proliferate in the private sector, used everywhere from online banking and payments to securing electronic medical records and to help prevent clinical errors, prevent fraud, and protect patient confidentiality.

For consumers and individuals, biometrics will be used increasingly in less security-centric domains and developing countries to access benefits or services, from healthcare or food subsidies to unemployment benefits or banking. They will also help make things more convenient in low-security situations—for example, providing easier access to a gym or library. As device costs continue to fall, the technology will be increasingly integrated with products and processes – for fingerprint unlocking of laptops, for instance.

As a consequence, we predict that average citizens will soon start to see biometrics less as an intrusion on their privacy and more as a means of enhancing their privacy—securing their bank accounts or health records, for example.

Action step

Stop thinking in terms of watertight security—there is no such thing. Instead, begin planning for cascaded, reflex-like security systems that rely heavily on automation to respond immediately and locally—and then step up their responses as the severity and scope of the threat increases.
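A cascaded, reflex-like response can be thought of as a small policy function whose containment actions step up with severity and spread. The sketch below is purely illustrative: the severity bands, segment counts, and action names are assumptions, not a recommended incident-response playbook.

```python
# Illustrative sketch of a proportional, cascaded response: containment starts
# local and automated, and widens as the severity and spread of a threat grow.

def cascaded_response(severity: float, affected_segments: int) -> list[str]:
    """Return the ordered containment actions for a detected threat."""
    actions = ["log event", "alert security operations"]
    if severity >= 0.3:
        actions.append("isolate affected host and snapshot it for forensics")
    if severity >= 0.6 or affected_segments > 1:
        actions.append("shut down the affected network segment")
        actions.append("enable active monitoring on adjacent segments")
    if severity >= 0.9 or affected_segments > 3:
        actions.append("shut down adjacent segments")
        actions.append("start intelligence-gathering on the adversary")
    return actions

# A fast-moving threat seen on two segments triggers the middle tier.
for step in cascaded_response(severity=0.65, affected_segments=2):
    print(step)
```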
Data Privacy Will Adopt a Risk-based Approach

Complete data privacy is a myth—all the more so in the WikiLeaks era. Leading organizations already know that. They will be attuned to regulations governing privacy and will develop a risk-based approach to data privacy.

In an age when WikiLeaks has become a household name, every business leader is right to be even more paranoid about data privacy. Just as leading organizations now realize there is no such thing as 100 percent IT security, so complete data privacy is being exposed as a myth.

In one study, the Wall Street Journal assessed and analyzed the cookies and other surveillance technology that companies use on the Internet. The study found that the nation's 50 top web sites on average installed 64 pieces of tracking technology onto the computers of visitors, usually with no warning. A dozen sites installed more than a hundred each. [9]

If you think privacy protection is important today...

It will not be enough simply to accept the reality of data leaks. It will require very proactive responses from organizations to understand the risks surrounding the use and misuse of personal data. And it will require constant vigilance because things are changing so fast.

To begin with, it will call for close attention to regulation—worldwide. Just one example: Authorities recently found that Google committed a serious breach of the U.K.'s Data Protection Act when its Street View mapping service collected personal information from unsecured wireless networks in England. [10] The U.K. was not the only nation whose privacy policies Google violated.

We predict that individual privacy will take center stage as a result of increased regulation and policy enforcement. Privacy outcries are getting louder. And governments are becoming considerably more active in enforcing compliance and investigating the flexibility of current policies in adjusting to emerging capabilities and business models.

The public is clearly becoming more sensitive: November 9, 2010, marked the two millionth consumer complaint filed with the Internet Crime Complaint Center (IC3) in response to suspected or actual online criminal activity. This milestone is especially notable because it took seven years for the IC3 to receive its first million complaints, between May 2000 and June 11, 2007. The second million arrived in less than half the time – just under three and a half years. [11]

The privacy challenges may well become even more burdensome. In the United States, some politicians are proposing to fine technology companies up to $100,000 a day unless they comply with directives imposed by the U.S. Department of Homeland Security. The new bill is called the Homeland Security Cyber and Physical Infrastructure Protection Act (HSCPIPA). [12]

Privacy by design

At the same time, the concept of "privacy by design" will become much more prominent; U.S. and European regulators expect technology companies to incorporate data privacy in the design of their products and services. But it will be some time before enterprises are rewarded for proactive privacy controls. The converse applies: They can expect to be punished for what is deemed to be poor privacy practices.

We expect that leading players will develop superior levels of understanding, enterprise-wide, about the distinctions between being a data processor—broadly, handling the personal data of others—versus being a data controller, thus lowering their risks of unwitting breaches of privacy regulations and perceptions of privacy breakdowns. We also expect the privacy exemplars to deploy the kinds of cascaded, reflexive, automated systems that the leaders will use as the backbones of their overall IT security strategies.

Action step

Given the difficulty of securing data long term, the questions to consider are how to plan the right responses to leaks, and whether the data should be created or acquired in the first place.
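The action step's question of whether data should be created or acquired at all suggests a simple risk-versus-value gate applied before collection. The sketch below is a hypothetical illustration; the risk factors, weights, and thresholds are assumptions that a real privacy or legal team would replace with its own policy.

```python
# A minimal sketch of a risk-based approach to deciding whether a data element
# should be collected at all. All factors and thresholds are illustrative.

def collection_risk(is_personal: bool, is_sensitive: bool,
                    leaves_enterprise: bool, retention_years: int) -> float:
    """Crude 0..1 risk score for acquiring and storing one data element."""
    score = 0.0
    score += 0.4 if is_personal else 0.0
    score += 0.3 if is_sensitive else 0.0       # e.g., health or financial data
    score += 0.2 if leaves_enterprise else 0.0  # processed by third parties
    score += min(retention_years, 10) * 0.01    # longer retention, more exposure
    return min(score, 1.0)

def should_collect(business_value: float, risk: float) -> bool:
    """Collect only when the expected value clearly outweighs the risk."""
    return business_value - risk > 0.2

risk = collection_risk(is_personal=True, is_sensitive=False,
                       leaves_enterprise=True, retention_years=5)
print(f"risk={risk:.2f}, collect={should_collect(business_value=0.7, risk=risk)}")
```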
  • 19. Social Platforms Will Emergeas a New Source of BusinessIntelligenceSocial networks will evolve into “platforms” for reaching customers, tapping intotheir social identity, and gaining information about them, and about competitors andthe market as a whole.The rapid growth of social media has Social platforms have three major the final short list and provides abeen eye-popping—especially so in dimensions: functionality, or the basic classic case study. By opening up itsthe last few years. Facebook, founded capabilities these platforms offer; development platform, the companyin 2004, now has more than half a community, or the groups of people has encouraged third parties tobillion users and is spending heavily to who belong to them; and user identity, build applications that augment itsaccommodate more. Twitter’s service the unique name and associated basic services. Facebook Connect,generates billions of tweets per month. information that characterizes an a mechanism that enables users toSocial networks are not just a product individual. log into a number of other onlineof and for the young consumer: Many communities with their Facebookof the world’s Internet users aged 50 A virtuous cycle of growth identity, has made it doubly attractiveand over are active users of social Only a small number of social for third parties to support and formedia. And increasingly, businesses communities will emerge as true social users to join.and government organizations are platforms with a large ecosystem of The cycle will eventually lead dominantusing social media to connect their services built around them. We believe players to squeeze out smaller networksconstituents in an effort to improve that social networks evolve through a and make it increasingly difficult forcollaboration. virtuous cycle of growth: More features new social networks to join the space,This is just the tip of the iceberg. The attract more customers, in turn and companies will look for ways toevolution of social media will continue attracting even more customers and connect to these dominant platformsto disrupt the way companies do making the whole ecosystem appealing using APIs such as Facebook Connectbusiness, posing new challenges to IT for third parties to support with and Google’s OpenSocial. Already, moreas it attempts to harness social media additional features, and so on. than 250 million people are usingin the enterprise. The key driver of Among the many candidates – Facebook Connect on third-party sitesthis change? The transformation of Facebook, MySpace, Yahoo Groups, every month, and 10,000 new sitessocial networks into social platforms, Google, Orkut, Twitter, LinkedIn, and are adding Facebook Connect everyeach with its own ecosystem to fuel Renren, to name some – we believe day. 13 Companies are also “fishingincreasingly deeper levels of interaction. that Facebook has already made where the fish are” by launching targeted communities inside Facebook’s 19
  • 20. Disintermediation is a good thing

One of the vaunted business beliefs has been that companies should own the relationship with their customer and never let any third party disintermediate between them and the customer. Social platforms will overturn that belief.

The rich history of information that individuals leave in social networks through their interaction with others will be a much more valuable form of identity – a "social identity" – than name, physical address, social security number, tax file number, driver's license number, and other such isolated forms of identity. Through APIs, social identities can now be linked across the Web, providing a consistent and comprehensive view into individuals' preferences, interactions with peers, and other activities.

For this reason, social identities will become much more valuable to businesses than getting an individual to register on the corporate web site. Social identity not only provides authentication (just like registration), but also a wealth of additional data about that person. That's why more web sites, including leading media sites such as Reuters, CNN, and ABC, are allowing visitors to log in with various social accounts (see the first sketch at the end of this section).

A better source of business intelligence
The same wealth of information created by users and businesses in the social platform is also a valuable source of business intelligence. Think of it as an ongoing focus group, in which any interaction between users tells you something about your customers, the market, even your competitors. This customer intelligence – mined and analyzed at aggregate or individual levels – will help companies monitor their brands, develop more targeted promotions, and measure their performance more effectively against competitors (see the second sketch at the end of this section).

The integration points that social platforms provide for this information will enable companies to communicate by design instead of by opportunity. Combinations of social platforms, devices, mobile apps, etc., mean that corporate web sites will lose their primacy as online destinations. As such, companies will begin placing less emphasis on search engine optimization and promotions designed to bring people to their sites, shifting their resources to programs for engaging users where they congregate online – that is, on social platforms.

Enterprises should be looking at these "social identity providers" to connect all of their interaction channels into a cohesive, multichannel customer experience. The winners will be those who recognize and serve both the short-term whims and the long-term goals of individuals and establish an ongoing relationship that transcends any single interaction.

Process-oriented collaboration inside the enterprise
Today, collaboration in enterprises is evolving from communication and channel integration (also known as unified communication) into process-based platforms where the underlying collaboration technology has knowledge of the business process in which the collaborating individuals are engaged and is specifically tuned to support it.

Architecturally, process-based collaboration will evolve in two distinct directions, making it necessary for IT organizations to develop clear and careful guidelines for when to adopt which direction. For mission-critical social processes (say, CAD design in a high-tech company or software development in an IT organization) where the process burden outweighs the collaboration needs, niche vertical solutions that support the end-to-end process will be the preferred solution. However, for most simple processes, the collaboration burden will outweigh the process burden, making it both necessary and simpler to standardize on a corporate collaboration platform (such as Microsoft SharePoint or Lotus Notes) on which many specific processes can be implemented.

The pros and cons are obvious. Too many vertical solutions will lead to a proliferation of platforms that need to be licensed and supported, and users trained. Using a generic collaboration platform for complex, mission-critical processes like design or pharmaceutical drug approval may be architecturally simpler, but will require considerable custom development and may suffer from poor usability because it is built on a generic, lowest common denominator. Essentially, social platforms call for clear guidelines about when to use which type of solution.

Collaboration platforms and analytics
Lessons learned from social platforms will lead to fresh perspectives on collaboration models inside the enterprise as well, eventually enabling more sophisticated and optimized process-oriented collaboration. Analytics, in combination with knowledge of the collaborative process, can help measure, reengineer, or tune the processes. As with social platforms, internal collaboration platforms will provide more visibility into user activities. By analyzing this data, companies will be able to gain more intelligence and insights about their internal communities and collaborative processes. Extracting "employee intelligence" or "process intelligence" in the same manner that marketers use to extract customer intelligence from external communities will also enable companies to capture and preserve the organizational knowledge that is created and exchanged through these communities.

The next stage of the evolution of social networks—as they become social platforms—will bring users new levels of engagement and interaction. At the same time, it will transform the way in which businesses must think about social media. The changes will be much too important to ignore.

Action step
Build a case for accommodating social identities in your web site registration process, based on the additional insights your business will be able to capture. And begin designing the frameworks for next-generation enterprise collaboration models.
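As a starting point for the action step above, the first sketch shows how a site might accept a "social identity" at registration through an OAuth 2.0-style authorization-code exchange. It is illustrative only: the endpoint URLs, parameter names, and profile fields are placeholders rather than any specific platform's API (Facebook Connect and similar services each define their own), and a real integration would add error handling and state validation.

```python
# Illustrative server-side handler for an OAuth 2.0-style "log in with a
# social account" flow. Endpoints, parameters, and fields are placeholders.
import requests

CLIENT_ID = "your-app-id"          # issued by the social platform (placeholder)
CLIENT_SECRET = "your-app-secret"  # keep server-side only (placeholder)
REDIRECT_URI = "https://www.example.com/auth/callback"

TOKEN_URL = "https://social-platform.example/oauth/access_token"  # placeholder
PROFILE_URL = "https://social-platform.example/me"                # placeholder


def exchange_code_for_token(auth_code: str) -> str:
    """Swap the short-lived authorization code returned to REDIRECT_URI
    for an access token that can read the user's public profile."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "redirect_uri": REDIRECT_URI,
            "code": auth_code,
            "grant_type": "authorization_code",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def fetch_social_profile(access_token: str) -> dict:
    """Read profile attributes the user has consented to share; these
    enrich, rather than replace, the site's own registration record."""
    resp = requests.get(
        PROFILE_URL, params={"access_token": access_token}, timeout=10
    )
    resp.raise_for_status()
    profile = resp.json()
    # Keep only the fields the business case justifies capturing.
    return {
        "external_id": profile.get("id"),
        "display_name": profile.get("name"),
        "interests": profile.get("interests", []),
    }
```

The design point is that the social account supplies both authentication and consented profile data, which is why it can be more valuable than a stand-alone registration form.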
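The second sketch illustrates the "ongoing focus group" idea at aggregate level: counting brand mentions and a naive sentiment signal across a stream of posts. The post format, word lists, and scoring are assumptions for the example; a production system would pull data through a platform's API and apply proper text analytics rather than simple word matching.

```python
# Illustrative aggregate "customer intelligence" from a stream of posts.
# Post format, word lists, and scoring are assumptions for the example.
from collections import Counter

POSITIVE = {"love", "great", "recommend", "fast"}
NEGATIVE = {"hate", "broken", "slow", "refund"}


def aggregate_intelligence(posts, brands):
    """posts: iterable of dicts like {"author": ..., "text": ...}.
    Returns per-brand mention counts and crude sentiment tallies."""
    mentions = Counter()
    sentiment = {brand: Counter() for brand in brands}
    for post in posts:
        words = [w.strip(".,!?") for w in post["text"].lower().split()]
        for brand in brands:
            if brand.lower() in words:
                mentions[brand] += 1
                sentiment[brand]["positive"] += sum(w in POSITIVE for w in words)
                sentiment[brand]["negative"] += sum(w in NEGATIVE for w in words)
    return mentions, sentiment


sample = [
    {"author": "u1", "text": "Love the new phone from BrandA, so fast!"},
    {"author": "u2", "text": "BrandB support is slow, I want a refund."},
]
print(aggregate_intelligence(sample, ["BrandA", "BrandB"]))
```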
  • 21. User Experience is What Matters

The ability to create experiences by deeply engaging the user, using natural interfaces and integrating processes and devices, will be what differentiates leading companies and systems from the rest.

Today, business process design is driven by the need for optimization and cost reduction. But tomorrow it will be driven by the need to create superior user experiences that help to boost customer satisfaction.

In the future, great user experiences will require more layered approaches than what is typical today. Leading IT providers are thinking way beyond the next great touch-screen interfaces or gesture-driven devices. They are preparing to address three specific factors: the integrated user experience, with no cognitive cost of switching from one context to another; a compelling experience, which minimizes tedium and boredom; and a natural device interface – one that involves little or no learning time. Apple has mastered all three factors; for instance, its iPhone and iPod products can be used right out of the box, with little need to resort to a user manual.

Let's look at each factor in turn. We expect that integrated experiences will be created by minimizing the context-switching cost for the user. Put another way, there will be further synchronization around a single identity—a customer-centric, follow-the-user approach as seen today in Facebook Connect, which allows users to maintain their identity as they browse.

We predict that leading providers will offer ways to synchronize across multiple devices, multiple services, and multiple processes. For example, it will be possible to unify the user's experience on Amazon.com's Kindle platform across devices – the Kindle e-reader as well as the iPhone, iPad, etc. Similarly, it will be possible to participate in computer gaming across mobile devices and consoles.

At the same time, the application interface and the physical interface will gradually decouple. So, for example, a game on a mobile phone will not be constrained by the physical keys; it could be controlled by a blend of voice, touch, and gesture (a simple input-abstraction sketch appears below).

Design will be a multidisciplinary exercise: Typically handled today by IT architects and business owners, tomorrow it will involve optimization from the perspective of the process actor, with the emphasis on simplicity and on removing inefficiencies. As such, it will call for the talents of sociologists and social anthropologists, among other less typical professions. Today, these talents, in connection with the user experience, are neither recognized nor easily available.

Experiences that truly engage
Second, leading providers will concentrate on experiences that "hook" the individual. That implies personal engagement—customizing the experience to that person's interests, sense of what is fun, responsiveness to challenge, and social connection.
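The decoupling of the application interface from the physical interface can be pictured as an input-abstraction layer: touch, voice, and gesture events are all normalized into the same logical commands before the application sees them. The sketch below is a simplified illustration; the event names, commands, and mappings are assumptions rather than any vendor's framework.

```python
# A minimal input-abstraction layer: physical inputs (touch, voice, gesture)
# are translated into logical commands the application binds to once.
# Event names, commands, and mappings are illustrative assumptions.
from typing import Callable, Dict


class InputRouter:
    """Routes heterogeneous device events to logical application commands."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], None]] = {}

    def on(self, command: str, handler: Callable[[], None]) -> None:
        self._handlers[command] = handler

    def dispatch(self, command: str) -> None:
        handler = self._handlers.get(command)
        if handler is not None:
            handler()

    # Adapters: each physical modality maps onto the same logical commands.
    def from_touch(self, swipe_direction: str) -> None:
        mapping = {"left": "next_item", "right": "previous_item"}
        self.dispatch(mapping.get(swipe_direction, "noop"))

    def from_voice(self, utterance: str) -> None:
        text = utterance.lower()
        if "next" in text:
            self.dispatch("next_item")
        elif "back" in text:
            self.dispatch("previous_item")

    def from_gesture(self, gesture: str) -> None:
        mapping = {"wave": "toggle_pause", "tilt_left": "next_item"}
        self.dispatch(mapping.get(gesture, "noop"))


# The game or app binds to commands once, regardless of input device.
router = InputRouter()
router.on("next_item", lambda: print("advance to the next screen"))
router.on("toggle_pause", lambda: print("pause or resume"))

router.from_touch("left")         # a touch flick
router.from_voice("Next please")  # a voice command
router.from_gesture("wave")       # a gesture
```

Because the application binds only to logical commands, swapping or blending input devices does not require changes to the application itself.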
  • 22. The customization aspect is crucial. Tomorrow's leading providers will devise ways to reflect back the user's sense of self—perhaps his reputation among his peers or an amusing aspect of her personality—and will ensure the right levels of socialization, teaming users with people they like. Similarly, it will be essential to emphasize the fun aspect, providing immediate gratification, appealing to the senses or to the desire for escapism, perhaps. Just one example: Pop quizzes on airlines' frequent-flier sites that provide participants with a few minutes of entertainment and the chance to win a prize.

Also important to engagement is the idea of a challenge with tangible goals and incentives and a gauge of progress. Challenges invoke competitive spirit and provide a sense of accomplishment. Interestingly, some features such as avatars or 3D space will be loosely translated as a focus on identity or structure while others, like clear story lines, currency to drive incentives, or feedback that measures progress, will be more directly visible.

Highly entertaining
The entertainment industry will be the trailblazer in terms of user experience. To a large extent, it already is. More and more consumers worldwide are using their TV sets not just for broadcast and cable viewing but to watch movies on demand—and to stream what's on their computers to a bigger screen.

At the same time, entertainment is now free of the constraints of the home theater setup. The episode of your favorite comedy that you missed last week is seen as easily on your Motorola Droid phone as it is on your laptop or TiVo. North American and European consumers are getting used to the idea of decoupled content, broadcast, and device. Many of their counterparts in South Korea, Japan, and now China are already quite familiar with the concept.

The decoupling is moving fast. Already, Google TV promises "a full Web browser on your HDTV screen." And more and more entertainment-center products—not only the TV set itself—can now connect to the Internet for fast access to movies, music, and more.

The point is that entertainment experiences will span more and more devices and tap into more and more sources of content. Increasingly, we will have to rethink our concept of "the TV" so that we separate the device from the service and the service from the process of streaming content. Broadcast as we know it will fade away, edged out by personalized services. The idea of watching football games or Formula One races, with surrounding advertising matched to your profile, is not so far away.

It's clear that as the TV set, the content, and content-delivery processes become more digital, TV will become more Web-like. We are not talking about Web pages and hyperlinks; we mean "Web-like" in the sense of optimization, personalization, and advertising focused on what will appeal to you.
  • 23. Interacting through more than touch

Third, more and more consumers will expect natural interfaces that require little learning and have few or no barriers to use. Touch screens, of course, have become familiar as standard interfaces for phones, airport check-in terminals, ticket vending machines, and tablet computers. Now we see touch screens migrating rapidly into laptops, desktops, and panel displays in public places. As a next step, we expect multiscreen interfaces—for instance, pairing an iPhone or iPad with a personal computer for additional input.

Gestures comprise a logical extension to touch interfaces. They are already becoming common on phones and tablets. And in the consumer realm, Nintendo's Wii gaming system set the bar—and raised the kinds of "Why can't we do that here?" questions that will help us use waving and pointing and more of these natural forms of expression to control our devices.

Possibilities that are being explored include tilting, rotating, or waving a device to turn it off or on or to switch screens. Gestures will also be used to control common user-interface tasks such as scrolling and switching windows. And they will involve interesting combinations of intuitive and learned behavior—for instance, flicking left to right to move through lists (a simple flick-detection sketch appears below).

Voice is the great hope for tomorrow. While voice inputs do well with limited vocabulary sets and are usually good enough for common command sets and simple scripted interactions, the technology still has not developed to the point where it is good enough for, say, dictation.

We are confident that in a few years we will be able to use a broader range of voice-controlled inputs to control our phones and car communications systems. But for several years to come, most of us will still be struggling to make ourselves understood when calling "customer service."

Action step
Start planning for superior user experiences that help to boost customer satisfaction—experiences that have little or no cognitive cost of switching from one context to another, that are very engaging, and that are entirely natural, requiring little or no learning time.
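As a small illustration of how a learned gesture such as a flick might be recognized from raw touch samples and mapped to a common user-interface task, consider the sketch below; the distance and duration thresholds are assumptions chosen for the example rather than values from any particular platform.

```python
# Illustrative flick detection from raw touch samples; thresholds are
# assumptions for the example, not values from any particular platform.
def classify_flick(samples, min_distance=120, max_duration=0.30):
    """samples: list of (timestamp_seconds, x, y) touch points in order.
    Returns "flick_right", "flick_left", or None."""
    if len(samples) < 2:
        return None
    t0, x0, _ = samples[0]
    t1, x1, _ = samples[-1]
    duration = t1 - t0
    dx = x1 - x0
    if duration <= max_duration and abs(dx) >= min_distance:
        return "flick_right" if dx > 0 else "flick_left"
    return None  # too slow or too short: treat as a drag or a tap


# A quick left-to-right swipe across the screen moves to the next list item.
touch_trace = [(0.00, 40, 300), (0.08, 120, 302), (0.16, 240, 305)]
print(classify_flick(touch_trace))  # -> "flick_right"
```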
  • 24. Seeing Beyond the Walls: The Process of Change Begins Here

The trends discussed in this report will have profound impacts on IT inside the enterprise. But if they are to realize their full impact, they must not be addressed in isolation.

The themes discussed in this report will have profound impacts on IT inside the enterprise. Faced with change on so many fronts, IT specialists have to be prepared to change too. When faced with many changes all at once, they can't afford to become overwhelmed. Nor can they afford to haphazardly reap the benefit of one trend or other in isolation.

Instead, a more effective planning approach starts by actively looking for connections among the themes inside one's own organization. The cloud, for instance, pervades several of the themes. It is a major influence on the distribution of data, it raises questions about information security, and it presents new ways to conceive of IT architecture. So the relevant question for an IT leader might be: How do your cloud strategy and your data strategy collide?

Essential IT capabilities
A useful next step is to understand how the trends apply when mapped against the capabilities that should be part of the fabric of every IT organization—capabilities in which high-performance IT groups excel:

IT governance
Perhaps the most significant changes requiring a reevaluation of IT policies are related to security and privacy. These can no longer be seen as black-and-white issues, but rather as shades of grey. IT governance thus should set clear policies and guidelines around risk tolerance, so that the IT organization can understand where it is positioned along the security and privacy continuum – and where it should be positioned.

Architecture
New infrastructure architectures will continue to emerge. That means system architecture should be decoupled from infrastructure architecture as well as data architecture. You want your system to run on any architecture and to become reconfigurable. The goal is to set the enterprise on a course toward greater agility.

Information management
Distributed data requires good master data management, which is the foundation for better analytics and for managing data privacy – two crucial differentiators. Converting data to data services will also help you decouple the data from applications – the first step to achieving a true data platform.
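As a simple picture of what converting data to data services can mean, the sketch below puts master customer records behind a small service contract so that applications depend on the contract rather than on the underlying store. The route, fields, and in-memory data are illustrative assumptions standing in for a governed master data management layer.

```python
# Illustrative "data as a service" endpoint; route, fields, and the in-memory
# store are assumptions standing in for a governed master data layer.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for the master data store behind the service.
CUSTOMERS = {
    "C001": {"id": "C001", "name": "Example Pty Ltd", "segment": "enterprise"},
    "C002": {"id": "C002", "name": "Sample GmbH", "segment": "mid-market"},
}


@app.route("/customers/<customer_id>")
def get_customer(customer_id):
    """Consumers call the service contract; they never touch the database,
    so the store can be moved or re-platformed without changing them."""
    record = CUSTOMERS.get(customer_id)
    if record is None:
        abort(404)
    return jsonify(record)


if __name__ == "__main__":
    app.run(port=8080)
```

Because applications see only the contract, the same service can later front a different store or a cloud-hosted master data hub without changes on the consuming side.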
  • 25. Workforce and resource management
New and advanced skills will be needed for data management, architecture, security, and analytics. Obtaining these skills will require a clear workforce and compensation strategy that emphasizes hiring, training, retaining, or outsourcing to the best talent available. Just as importantly, your people must be willing to embrace change.

Security
The new challenges include the cloud, the consumerization of IT, and the complexity of cyber threats. High-performing organizations will take a practical rather than emotional approach to the new devices and new IT reality, informed by their tolerance for risk and policies on risk management. Threat complexity, for its part, merits coordinating responses with partners in industry and government.

Solutions delivery
Business needs should motivate adoption of new technologies in your solutions. For example, a practical approach to security and privacy, and therefore cloud adoption, will help your business speed time to market. Focusing on decoupled architecture will increase agility, so that you can respond to market shifts quickly. And a focus on analytics will help the IT organization to become a close partner with business units in making better decisions that lead to improved business outcomes.

Service management
Although most technology trends are raising user expectations, the reality is that IT specialists actually have less control over the technology universe. As a result, it pays to understand user expectations, to improve the user experience with processes and interfaces, and to obtain the right providers motivated by the right incentives.

Outsourcing
Buying services involves different skills than buying products. In addition to the traditional emphasis on technology features, what matters in services are the viability and business practices of the provider. You need to understand the economics of sourcing, the trade-offs involved, and the strengths of various providers. Outsourcing has moved beyond commodity functions to include a partnership with organizations that provide sophisticated skills in security, analytics, architecture, and other crucial activities. Sourcing and vendor management thus have become critical skills.

Strategic IT alignment
In the past, alignment referred to how the IT organization served the business's needs. The new trends discussed here, and their accelerated pace, shift the alignment emphasis to educating the business about what new technologies can do and how IT can help improve execution of the chosen strategy. In that way, the IT function can move from its focus on service-level agreements and costs to being a creator of value. That is the next frontier of high-performance IT.

Accenture's Technology Vision 2011 report describes discontinuous change and explains why IT leaders and their business colleagues now need to recast their approaches to technology in the context of these capabilities. Specifically, they must accept that everything – hardware, software, applications, data – will be distributed. They must accommodate the fact that everything – data from data representation, infrastructure architecture from application architecture, business processes from applications – will be decoupled from everything else. And they have to recognize that everything will be analyzed – structured data, unstructured data, meta-data, and even keystrokes on a web site.

The organizations whose IT and business leaders are quick to grasp those realities will be those that rapidly pull away from the pack.
  • 26. Notes

1 Jeanne G. Harris and Allan E. Alter, "Cloudrise: Rewards and Risks at the Dawn of Cloud Computing," Accenture Institute for High Performance, November 2010
2 Accenture, "Mind the Gap: Insights from Accenture's Third Global IT Performance Research," November 2010
3 Forrester Research, Inc., "A Workable Application Modernization Framework Is Job No. 1 Now," April 26, 2010
4 SAP press release, "New Reality of Real Time With Launch of SAP High-Performance Analytic Appliance," December 1, 2010
5 CQ Weekly, "Cybersecurity: Learning to Share," August 1, 2010
6 Sypris Electronics press release, "Sypris to Develop International Cyber Range Host Cyber Warfare Testing for U.S. and Its Partners," November 10, 2010; St. Petersburg Times, "Cyberspace War Games, For Security and Profit," November 21, 2010
7 Dark Reading, "Researcher Demonstrates How to Counterattack Against a Targeted Attack," April 19, 2010
8 PCWorld, "Conficker Worm Draws a Counter-Attack," February 12, 2009
9 Wall Street Journal, "The Web's New Gold Mine: Your Secrets," July 30, 2010
10 BBC News, "Google in Significant Breach of UK Data Laws," November 3, 2010
11 Internet Crime Complaint Center (IC3), "The Internet Crime Complaint Center Hits 2 Million," November 15, 2010
12 U.S. House, 111th Congress, "H.R. 6423: Homeland Security Cyber and Physical Infrastructure Protection Act of 2010" (introduced November 17, 2010)
13 Mashable.com, "Each Month 250 Million People Use Facebook Connect on the Web," December 8, 2010

Additional Resources

Mind the Gap: Insights from Accenture's third global IT performance research study. Register to download PDF.

Computing in the clouds. By Kishore S. Swaminathan, Chief Scientist, Accenture. This article originally appeared in the May 2008 issue of the journal of high-performance business. Read on Accenture.com / Download PDF.

Cloud computing: Where is the rain? By Kishore S. Swaminathan, Chief Scientist, Accenture. This article originally appeared in the October 2010 issue of the journal of high-performance business. Read on Accenture.com / Download PDF.

What the Enterprise Needs to Know About Cloud Computing, October 2009. Read on Accenture.com / Download PDF.
  • 27. Research Methodology

For this year's Tech Vision report, we cast the net wider and deeper than before. In late 2010, the Accenture Technology Labs developed hypotheses about information technology developments that will have a significant impact on Accenture's clients in the next five years.

At the same time, a wide range of other sources was scanned to add ideas to the mix. The sources included the recent activities of commercial R&D labs, the flow of venture capital funding, trends highlighted by IT analysts, key themes at industry conferences, and the academic literature.

We also drew on Accenture's High Performance IT research and on the findings from our annual IT Executive Forum. And we tapped the expertise of Accenture practices in areas such as analytics, IT security, and Innovation.

The response—approximately 400 hypotheses with input from scientists, architects, and engineers—covered topics as well publicized as cloud computing and mashups along with many others much less familiar, such as text mining and sensor fusion. The team then worked with the R&D groups to look for overlaps and redundancies, and to test each hypothesis against these six criteria:

• Certainty of transformational impact on corporate IT departments
• Velocity and scale of technology change
• Impact beyond any one IT "silo"
• More than a "one for one" replacement of an existing solution
• Being actively explored today and considered practical for the near future
• Transcends any one vendor or discrete "product" technology

Out of this process came more than 50 defensible hypotheses that were synthesized into the themes presented in this year's report.

Contacts

Don Rippert
Chief Technology Officer and Managing Director – Technology

Dr. Gavin Michael
Global Managing Director, R&D and Alliances

Dr. Kishore Swaminathan
Chief Scientist
  • 28. About Accenture

Accenture is a global management consulting, technology services and outsourcing company, with approximately 211,000 people serving clients in more than 120 countries. Combining unparalleled experience, comprehensive capabilities across all industries and business functions, and extensive research on the world's most successful companies, Accenture collaborates with clients to help them become high-performance businesses and governments. The company generated net revenues of US$21.6 billion for the fiscal year ended Aug. 31, 2010. Its home page is www.accenture.com.

Copyright © 2011 Accenture. All rights reserved. Accenture, its logo, and High Performance Delivered are trademarks of Accenture.