I-CIO Special Edition: A new class of high-performance computers

SUPERCOMPUTING: THE NEXT GENERATION
A new class of high-performance computers


I: Global Intelligence for the CIO (www.i-cio.com)
SPECIAL EDITION: IT UNLIMITED

The world's biggest challenges: the world's most powerful solutions
- Shaping the future with next-generation supercomputers
- Case studies: petascale systems in action
- Why big data will push high-performance computing into the cloud
From "World's Fastest" to "Value for the World"

Disaster Prevention and Global Environmental Issues: Helping to analyze global warming trends and natural disasters, as well as analyzing the strength of buildings.
Medical and Pharmaceutical Advances: Used for simulating optimal treatment options and surgery, as well as for developing new medicines.
Discovering the Universe: Helping to identify unknown matter and uncover the mysteries of the universe.
New Energy Sources and New Material Research: Helping to develop new materials and environmentally friendly energy sources.
State-of-the-Art Product Development: Helping to develop safer aircraft and cars equipped with shock-reducing functions.

Our modern society is faced with many issues, such as diminishing natural resources, environmental disasters, and the increasing need for complex medical care. The K computer*1 is the key to solving these problems. Computer simulation allows the reenactment of various phenomena through computer calculations, and is known as the third principle of science, along with hypothesis and experimentation. For example, it is used in the development of safer cars, aircraft design, and revolutionary medicines and cures. It is also used in developing new materials with nanotechnology, in research for preventing global warming and natural disasters, and in discovering the answers to the wonders of the universe. Without the supercomputer it would be impossible to undertake the advanced and complex simulations necessary to find the answers for our future. The K computer, officially ranked as the fastest in the world*2, will enable society to rise to the challenge of shaping a prosperous future for all mankind.

The "K computer": the number one supercomputer in the world.

*1 The K computer, which is being jointly developed by RIKEN and Fujitsu, is part of the High-Performance Computing Infrastructure (HPCI) initiative led by Japan's Ministry of Education, Culture, Sports, Science and Technology (MEXT).
*2 Ranked first in the TOP500 (announced June 20 and November 14, 2011), which benchmarks the top 500 supercomputer calculation speeds using the LINPACK program and publishes the latest rankings twice a year, in June and November.
Editor's letter

What would you do with the world's most powerful supercomputer, a system capable of tackling world-changing challenges that have hitherto been beyond the reach of researchers, governments and businesses? Create life-saving cancer drugs? Predict natural disasters and minimize their impact? Uncover the origins of the universe? Revolutionize business R&D by being able to simulate even the most complex of prototypes?

It's a tantalizing prospect, and one that is becoming a reality with the quantum leap in supercomputing power that has occurred in the past year. When the "K computer" was ranked the world's fastest in November — with a processing speed of 10.5 petaflops — the supercomputer (jointly developed by RIKEN, Japan's leading research institute, and global ICT company Fujitsu) was classed as four times more powerful than any rival design. Since then the breakthrough performance has only accelerated: Fujitsu's PRIMEHPC FX10 further improves on the supercomputer technology employed in the K computer, making it capable of a theoretical performance of 23.2 petaflops.

While access to such high-end machines has historically been restricted to only the largest research facilities, that is not going to be the case with this generation of systems. The arrival of petascale supercomputing dovetails with another key shift in IT: cloud. The application of the cloud computing model will open up top-of-the-range computing to organizations of all sizes as a pay-per-use service.

We hope this Special Edition of I: Global Intelligence for the CIO, the exclusive publication for CIOs from Fujitsu, captures some of the excitement behind these new possibilities — and fires your imagination about how next-generation supercomputers have the potential to tackle many of the biggest challenges of business and society.

Kenny MacIver, Editor
kenny.maciver@redwoodgroup.net

Contents
ANALYSIS. Supercomputers: the next generation. How a new class of high-performance computers is bringing solutions to some of the world's biggest challenges.
DATA FEED. Super-sized. The biggest computers, the biggest numbers: the Top 10 systems, the HPC market, and the K computer's vital statistics.
INTERVIEWS. Potential for greatness. Fujitsu is building the world's most powerful supercomputers. Two executives leading its push into petascale IT talk about the far-reaching ambitions behind the program.
CASE STUDIES. World-changing superpowers. Demystifying the origins of the cosmos; creating a virtual aircraft; accelerating anti-cancer drug discovery; and optimizing traffic flow using mass-sensor analysis.
STRATEGIC VIEWPOINT. The petascale imperative. David Smith, Fujitsu's chief innovation and technology officer, argues that the latent power of "big data" will push high-performance computing into the cloud.

Fujitsu I Publication is published on behalf of Fujitsu by Redwood, 7 St Martin's Place, London, WC2N 4HA. Tel +44 (0)20 7747 0700, Fax +44 (0)20 7747 0701. Copyright Redwood Publishing Ltd 2012. All rights reserved. Reproduction in whole or in part is prohibited without prior permission of the editor. Email: fujitsu.contactus@redwoodgroup.net. Fujitsu and Redwood Publishing Ltd accept no responsibility for the views expressed by contributors to the publication. Fujitsu I Publication cannot take responsibility for unsolicited manuscripts, photographs or illustrations, or for errors in articles or advertisements in the publication.

Editor: Kenny MacIver. Deputy editor: James Lawrence. Art director: Finnie Finn. Senior account director: Lisa Marie Mills. Editorial director: Sara Cremer. Creative director: Paul Kurzeja. Illustrations: Yusuke Saitoh.
SUPERCOMPUTING: THE NEXT GENERATION

A new class of high-performance computers is bringing solutions to many of the biggest challenges of business and society within reach.

Words: Jim Mortleman and Kenny MacIver. Illustrations: Andy Gilmore.
ANALYSIS

It's often said that "information is power," the suggestion being that the more information you have access to, the greater the influence you can have — whether in business, in society, in politics or elsewhere. But in the digital age, as the sheer volume of information being generated has started to overtake many organizations' ability to extract value from it, the information-power equation seems too simplistic. Now, real power flows from an ability to deal effectively with vast amounts of information.

Rising to that challenge, several of the world's top research institutes and leading technology vendors have been energetically pursuing a vision of a new class of IT — petascale computing — capable of delivering the ultra-high performance needed to meet exploding information processing requirements and, ultimately, tackle some of the world's most pressing problems.

Already such supercomputing capability is having a major impact in fields as diverse as the life sciences, disaster prevention, the synthesis of advanced materials, and astrophysics — addressing areas where the scale of the information processing requirement has previously put solutions to complex problems beyond the reach of researchers. That is just the first wave of petascale computing adoption; across industry, government and academia the scope of what can be achieved with such power is only now being appreciated.

It's a sense of new horizons shared by Masahiko Yamada, head of the Technical Computing Solutions Unit at Fujitsu, where major advances in supercomputing technologies are making it possible to "grasp change on many different scales."

"The most successful innovators are not motivated by money; they want to change the world," says Yamada. And that is just what he and his colleagues at the Japanese ICT giant have in mind with petascale computing. "We want to be a major resource in solving humankind's common problems, from measures against global warming and natural disasters, through saving resources and energy, to curing serious illness."

Such ambitions are fueling the development of immensely powerful machines. The current world number one in the performance stakes is the "K computer," the supercomputer developed by Japan's leading research institute, RIKEN, in collaboration with Fujitsu. In November 2011, the K computer — part of the High-Performance Computing Infrastructure initiative led by Japan's Ministry of Education, Culture, Sports, Science and Technology — took the number one position for the second consecutive time in the TOP500 world supercomputer rankings. Its performance of over 10 quadrillion ("kei" in Japanese) floating point operations per second, or 10 petaflops, makes it more than four times faster than its nearest competitor.
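Those performance figures are easy to sanity-check. Below is a minimal sketch (not from the article) using the LINPACK numbers quoted here and in the tables later in this edition; it confirms the "more than four times faster" claim.

```python
# Back-of-envelope check on the performance claims above (a sketch;
# figures are taken from this edition's TOP500 tables).
K_COMPUTER_FLOPS = 10.51e15   # 10.51 petaflops (LINPACK, Nov 2011)
TIANHE_1A_FLOPS = 2.566e15    # nearest competitor, Tianhe-1A

# One "kei" is 10 quadrillion (10^16), hence the K computer's name.
print(f"K computer: {K_COMPUTER_FLOPS:.3e} operations/second")
print(f"Lead over nearest rival: {K_COMPUTER_FLOPS / TIANHE_1A_FLOPS:.1f}x")
# -> roughly 4.1x, matching "more than four times faster"
```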
Driving innovation, changing lives
Indeed, the need for speed among advanced users of supercomputers seems insatiable. "R&D based on large-scale computer simulations is the key to solving global issues related directly to our daily lives," says Yuji Oinaga, head of Fujitsu's Next Generation Technical Computing Unit, who led the development of the world's fastest supercomputer. "Many areas demand as much power as possible — even 10 petaflops is [sometimes] not enough."

Two aspects are driving demand for super-fast systems, according to independent high-performance computing (HPC) analyst Dan Olds, principal at Gabriel Consulting Group. "First, such systems let you analyze the existing data you have much faster. If you're working on storm prediction, for example, that enables you to issue warnings sooner," says Olds. "Second, they make it possible to add in many more variables and do more iterations of your problem in the same or less time. That means you can make more accurate predictions about, say, the storm's path and severity without losing time." As a reflection of how critically important that is, some of the most advanced applications of such supercomputing horsepower are in Japan, where they are being used extensively to simulate the impact of earthquakes and tsunamis.

Another area where ultra-high performance has the potential to save lives is in medicine. Here, for example, advanced simulations are allowing drugs to be developed by modeling compounds and testing their properties inside the computer. As David Smith, chief innovation and technology officer at Fujitsu, highlights: "If you wanted to find a drug that would be effective against a new virus strain, you would need to discover a chemical compound that could bind to the virus and render it inactive. With the new breed of supercomputers, several million candidate compounds can be tested in a short period of time to find one that could be effective."

In a related field, surgeons are able to create 3-D simulations of patients' faulty organs with ever more accuracy and detail. For example, the University of Tokyo can use data recorded from CT and MRI scans to replicate and render a virtual model of a cardiac patient's heart in 3-D. That allows doctors to infer the cause of the relevant disease and even to use a high-performance system to perform virtual surgery prior to any actual operation. The most detailed calculations, in which each cardiac muscle cell is simulated to replicate heart movement, previously took 700 days; the supercomputer can achieve the same result in just two days.
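The cardiac figures above imply a striking speedup. A quick sketch of the arithmetic, assuming the two runs perform comparable work:

```python
# Making the cardiac-simulation comparison explicit (a sketch; assumes
# the old and new runs compute the same workload).
conventional_days = 700   # per-cell heart simulation on previous systems
supercomputer_days = 2    # the same calculation on the new system

speedup = conventional_days / supercomputer_days
print(f"Effective speedup: {speedup:.0f}x")   # -> 350x
```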
Petascale computing is also proving invaluable for industrial testing and innovation. Car manufacturers use such supercomputers to model crashes and road performance in minute detail, as well as to optimize parts and vehicle design through accurate simulation. Similarly, aircraft manufacturers are analyzing complex flight airflows without the need for expensive wind tunnels. Engineers can "step inside" engines virtually and see how complex parts work and interact. And industrial researchers are developing advanced materials such as carbon nanotubes by simulating the properties of molecules and their interactions with increasing speed and accuracy. As Yamada says: "Manufacturers are using these tools to become more competitive. The day isn't that far off when things like cars or cellphones will be designed from beginning to end inside a computer."

New applications landscape
Such R&D may seem far removed from many organizations' activities, but the vast datasets requiring such high-speed (often real-time) analysis — so-called "big data" — are now a fact of business life for a growing number of commercial and public-sector organizations. The availability of low-cost mass storage, the expanding volumes of both structured and unstructured data being generated by corporate systems and individuals' online activity, and the economic imperative for organizations to become more competitive and efficient are all elements driving demand for ever higher-performance computing power. For some years, retail giants such as Walmart and eBay have been employing the largest data engines to analyze such facets as customers' buying habits and the optimum placement of goods in stores. Financial services firms similarly use such machines to crunch petabytes of data and predict the movement of markets, stocks and shares, or to detect unusual trading patterns that could signal fraudulent activity.

Now, owning systems of such scale is not the only option: they are becoming available as cloud services, with all the associated advantages. Dr. Joseph Reger, CTO of Fujitsu Technology Solutions, observes that supercomputers and high-performance computers are already being used as a cloud service — and that is set to grow fast. "HPC as a cloud service can bring very large systems into reach for users who may need it only for a couple of days per month and for whom such investment in owned infrastructure would not pay off," he says.

The sense of opportunity was neatly summed up by Gus Hunt, CTO at the US's Central Intelligence Agency (CIA), when he spoke at Information Week's GovCloud conference late last year: "I have a petascale problem and need a petascale solution. Cloud computing has emerged to enable us to deliver capabilities we weren't able to deliver before at a scale and price and agility level we were never able to do before. This is about being able to correlate data ahead of time, about using the computational capacity of the cloud to see how things are related before you even do a search."

But there is huge potential for petascale computing to be employed even more widely. The exponential growth of data generated by trends such as the use of social networks and smartphones is already producing a wealth of information that could be usefully mined to extract new insights about people's sentiments, needs, behavior, movements and more. But more dramatically, the data stream will turn into a torrent as more and more network-connected sensors and devices are deployed across the globe — the so-called "Internet of Things."
Sensors are being used, for example, to manage aircraft parts, including life jackets, oxygen masks — even engine components — to track expiration dates and repair history, thereby cutting the length of cabin inspections prior to departure by a factor of ten. Indeed, aerospace company Boeing is promoting just such a Fujitsu-developed system to carriers, with Alaska Airlines an early adopter.

Other sensor-driven petascale applications — such as smart metering and smart grid — are in a league of their own. Utilities companies around the world have advanced plans to use wireless meters to provide up-to-the-minute readings that will help balance demand with capacity and optimize energy usage. That will result in astronomical amounts of data. By some estimates, the roughly 100 million smart meters expected to be installed in the US alone by 2019 will generate 100 petabytes of data.

Meanwhile, body area networks — where inexpensive sensors can be used to monitor people's health metrics, such as blood pressure and heart rate, in real time wherever they are — offer the potential to revolutionize the practice of medicine. The key reason: the data generated is much more valuable than that collected in hospitals, because it measures the real-life conditions of the patient.

On a different level, environmental sensors and sensors in vehicles could greatly improve traffic management and environmental wellbeing in the world's towns and cities. For example, Fujitsu recently launched a cloud-based intelligent traffic management system in Japan that collects large amounts of data from a rich variety of sources — roadside sensors that monitor traffic flow and sensors in fleets of vehicles such as taxis or hauliers (including subtle elements like the speed windscreen wipers are moving in the rain). The system also pulls in data from individuals and communities, social media and events. And all of this data is presented on a cloud platform, making it available for many different uses: fleet and logistics managers can use it to route their traffic in the most efficient way, individuals can use it to get simple reports of traffic, and urban authorities can use it to manage traffic control in real time.

SUPERCOMPUTING IN ACTION
HPC Wales: creating a national platform for high-performance IT

An important way in which access to supercomputers is being opened up to a broader base of organizations is through a series of high-performance computing (HPC) initiatives being backed by governments around the world — from the US, EU and Japan to Germany, India and Australia.

HPC Wales — one of the most dynamic of such moves — is a UK- and EU-backed initiative partnering Welsh universities and Fujitsu's technology capabilities to provide a distributed HPC service to research institutions and businesses across what is an increasingly technology-driven economy. "Most organizations wouldn't be able to afford the infrastructure for high-performance computing," says HPC Wales's CEO, David Craddock.
"So we are investing to create a customer-oriented service that can provide affordable access to this technology for both businesses and academia."

From improving cancer treatment and optimizing offshore wind farms to rendering computer-generated imagery (CGI) for movies, HPC Wales is set to put high-performance computing to work on a variety of academic, scientific and commercial projects for which such technology would previously have been out of reach. The £40 million ($62m) project will eventually include 1,400 nodes across eight linked sites, with an aggregated performance of more than 190 teraflops. The system will mainly be powered by Fujitsu PRIMERGY cluster servers.

The project's first phase was completed in 2011, and infrastructure in place at six hubs will give HPC Wales capacity for about 100 major research projects during 2012, says Craddock. Phase two will follow rapidly this year, increasing capacity by about tenfold, he adds.

According to Joe Duran, director of HPC systems at Fujitsu UK and Ireland, the program is geared to encourage widespread take-up. "It is very much about bringing HPC to a broader set of users. Much smaller organizations now have the ability to access these kinds of technologies in more cost-effective ways. Instead of researchers and enterprises, including SMEs, having to skill up to use the technology, they can just access a familiar application for the time required," says Duran. Critical to that is the uniquely distributed nature of the HPC capability, with multiple sites able to use each other's capability in a very dynamic way, he highlights.

So far, 87 projects are using or have applied to use the systems, with priority areas identified as:
- Environment: the modeling of weather patterns and climate change, the development of climate-resistant crops, research into efficient energy development and renewables.
- Manufacturing: simulation, prototyping and modeling, new product design, project management techniques, quality control analysis, the development of advanced materials.
- Life sciences: genetics research, modeling to replace physical testing/food safety testing, remote surgery techniques.
- Digital/creative industries: creative design, supply chain/customer service improvements.
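For a sense of scale, the HPC Wales figures above can be reduced to per-node terms. A rough sketch follows; the per-node and phase-two numbers are inferences from the quoted totals, not figures from the article, and they assume capacity is spread evenly across identical nodes.

```python
# Rough per-node arithmetic for the HPC Wales figures quoted above
# (a sketch; assumes performance is spread evenly across identical nodes).
total_flops = 190e12    # "more than 190 teraflops" aggregated
nodes = 1400            # planned node count across eight linked sites

per_node_gflops = total_flops / nodes / 1e9
print(f"~{per_node_gflops:.0f} gigaflops per node")   # ~136 gigaflops

# Phase two is described as a roughly tenfold capacity increase:
print(f"Phase two ballpark: ~{total_flops * 10 / 1e15:.1f} petaflops")  # ~1.9
```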
One of the barriers to broadening supercomputer opportunities is also being addressed: the availability of standard development and application software for such high-end machines. Goro Watanabe, SVP at Fujitsu North America's R&D Center, observes that changes in demand are driving much more packaged code. "Historically, the most common approach was to develop proprietary software for each application; now, software is emerging that addresses common applications in areas such as crash-test simulation and drug discovery."

Compelling business case
It's clear that with the volumes of data surging and the complexity of challenges growing, far more organizations will need access to petascale processing. "We have an explosion of complexity in the business world. There is an abundance of data around us now. It's readily available but not excavated yet," says Reger. "There is a very strong business case for supercomputers to attack that. You only have to look at how simulation can make testing much less expensive in the car or the airline industries, or how it makes predicting the impact of natural disasters more effective."

The emergence of compelling business cases is already changing buying patterns. In the past, supercomputing power was almost exclusively the domain of scientists, observes Watanabe, but today around 60% of that power is already being used by business — and that figure will only grow. "More businesses need supercomputing power because there are so many development processes that justify the investment," he says.

And the technologies that will underpin further investment in increasingly affordable supercomputing are rapidly moving from the R&D labs and into commercial production. As analyst Dan Olds says, such resources — whether owned, jointly funded or sourced as a cloud service — will provide real competitive edge. "Increasingly, organizations' differentiators will be how well they analyze the information available to them and use it to make decisions and predictions. Much of Wall Street is there [already], Walmart is there, a few others are there, but everybody else is going to have to do it too. That's the economic imperative. Now we finally have the technology to be able to analyze these vast stores of data and the deluge of data that's coming next. We've had the math to do it for a long time — we just didn't have the horsepower at a reasonable cost."

That horsepower is certainly there now, though, says Michael Keegan, executive director of the Technology Product Group at Fujitsu UK and Ireland. "There is a huge business market opening up around high-performance computing because of the increasing accessibility of computer power at an economical price — that will enable innovative design and the handling of very large volumes of data and large-scale problems. It will also provide better granularity and better resolution of business and research problems. The economics have changed to such a degree that our leading-edge technology is now much more available as a commercial offering."

Fujitsu's proof of that commercialization of petascale computing comes in the form of its PRIMEHPC FX10, the supercomputer based on the technology employed in the K computer developed jointly with RIKEN. The PRIMEHPC FX10 incorporates the system's next-generation processor design (SPARC64 VIIIfx in the K computer and IXfx in the PRIMEHPC FX10) and six-dimensional mesh/torus interconnect technology ("Tofu"), as well as an improved cooling design.
Those breakthrough technologies provide a scalability that can take the FX10 to a theoretical capacity of more than 20 petaflops, according to Fujitsu's Reger. "This is where the significance for business and society comes in, as the highest levels of computing become much more widely affordable," he says. "Aside from the possibilities that it opens up for research institutes and universities, on the business side there is now a whole host of opportunities that can be attacked, because economic barriers are fading."

And for supercomputer-sector watchers like Dan Olds, that amounts to a period of accelerated take-up: "The social imperative is there. The economic imperative is there. And there are so many parties doing their damnedest to take advantage of it that we're likely to see innovation push forward far more quickly than imagined."

THE SUPERCOMPUTER NO. 1 CLUB
Changing supercomputer leadership, 2008-2012

Developer  Host           Country  Supercomputer  Max performance  Period at no. 1
FUJITSU    RIKEN          Japan    K computer     10.5 petaflops   Jun 11–present
NUDT       NSCC-TJ        China    Tianhe-1A      2.6 petaflops    Nov 10–Jun 11
CRAY       ORNL           US       Jaguar         1.8 petaflops    Nov 09–Nov 10
IBM        DOE/NNSA/LANL  US       Roadrunner     1.1 petaflops    Jun 08–Nov 09

Source: TOP500 lists (www.top500.org)
DATA FEED

SUPER-SIZED: the biggest numbers behind the world's most powerful supercomputers

Calligraphy: Souun Takeda for RIKEN. "Kei," the Japanese character meaning 10 quadrillion, gives the K computer its name.

864 racks
The final configuration of the "K computer," the world's most powerful supercomputer jointly developed by Japan's leading research institute, RIKEN, and Fujitsu, will comprise 864 compute node racks, each housing 24 system boards.

7 billion people
It would take the world's entire population, each performing one computation per second for 17 days, to match what the K computer can complete in just one second.

6 dimensions
The K computer's network employs an innovative 6-dimensional mesh/torus topology that enables the interconnection of the system's 88,128 processors, delivering a performance of over 10 quadrillion ("kei" in Japanese) operations per second.

The HPC market, 2010-15
According to Intersect360 Research Worldwide's analysis of the high-performance computing (HPC) market:
- Product and services revenue in 2010 was $25.6 billion, a rise of 22.4% compared with 2009.
- 69% of revenue comes from high-performance technical computing in science and engineering.
- 31% comes from high-performance business computing applications in areas such as financial services, logistics, online games and insurance.
- Industrial usage of HPC is growing faster than academic or governmental, and it is expected to bring in 58.3% of HPC server revenue in 2015.
- The overall market will grow by 41% between 2010 and 2015 to reach $36 billion.
- Three-quarters of HPC applications today are installed at only one site.
- 34% of applications are open source, 31% developed by commercial suppliers, the rest being public software or developed in-house.

Vital role of supercomputers
High-performance computing capabilities have become increasingly important not only for the well-being of individuals but also for the scientific and economic competitiveness of entire nations and regions, says IDC:
- Few people know that HPC plays an integral part in designing the vehicles they drive and the airplanes they fly in, locating and extracting the fuel that powers these vehicles and heats their homes, developing life-saving new drugs and medical treatments, and producing the weather forecasts they rely on to plan for daily activities and severe conditions that can devastate lives and property.
- An IDC study for the US's Council on Competitiveness revealed that 97% of tier 1 industrial companies using HPC now consider it indispensable for their ability to innovate, compete and survive.
- For advanced economies, HPC-based modeling and simulation is a powerful tool for competing with nations which have lower labor costs, especially as costs for physical experimentation have skyrocketed.
- In his 2010 State of the Union address, US President Barack Obama named supercomputing as one of only three areas destined for increased funding rather than recession-driven budget cuts.
Top 10 supercomputing sites (November 2011); speeds in petaflops. Source: TOP500 lists (www.top500.org)

1. RIKEN Advanced Institute for Computational Science, Japan: K computer (Fujitsu), 10.510
2. National Supercomputing Center in Tianjin, China: NUDT YH MPP (NUDT), 2.566
3. DOE/SC/Oak Ridge National Laboratory, USA: Cray XT5-HE (Cray), 1.759
4. National Supercomputing Centre in Shenzhen, China: Dawning TC3600 Blade System (Dawning), 1.271
5. GSIC Center, Tokyo Institute of Technology, Japan: HP ProLiant SL390s (NEC/HP), 1.192
6. DOE/NNSA/LANL/SNL, USA: Cray XE6 (Cray), 1.110
7. NASA/Ames Research Center/NAS, USA: SGI Altix ICE 8200EX/8400EX (SGI), 1.088
8. DOE/SC/LBNL/NERSC, USA: Cray XE6 (Cray), 1.054
9. Commissariat à l'énergie atomique, France: Bull bullx super-node S6010/S6030 (Bull), 1.050
10. DOE/NNSA/LANL, USA: BladeCenter QS22/LS21 Cluster (IBM), 1.042

"The expectation is that petascale computing will open up new frontiers. Its success will have a lasting impact on the planet and people all around the world and for generations into the future."
Steve Conway, research vice president, HPC group, IDC
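The "7 billion people" comparison above holds up numerically. A minimal sketch, assuming one operation per person per second as stated:

```python
# Sanity-checking the "7 billion people" comparison in the Data Feed
# (a sketch; assumes one operation per person per second, as stated).
population = 7e9
seconds = 17 * 24 * 3600          # 17 days
human_ops = population * seconds  # ~1.03e16 operations

k_computer_ops_per_second = 10.51e15
print(f"Humanity, 17 days: {human_ops:.2e} operations")
print(f"K computer, 1 second: {k_computer_ops_per_second:.2e} operations")
# The two figures agree to within a few percent.
```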
INTERVIEWS

Potential for greatness
Fujitsu is building the world's most powerful supercomputers. Two executives leading the company's push into petascale IT talk about the far-reaching ambitions behind the program.

Masahiko Yamada is president of the Technical Computing Solutions Unit at Fujitsu Limited in Japan.

Throughout recorded history, humankind has dreamed of being able to predict the future — and has been driven forward by taking up that challenge. In every age, the most advanced technologies — astronomical observations or sundials, for example — helped people to see into the future as accurately as possible and understand the laws of nature, so they were more prepared for its vagaries. Now, with advances in supercomputing, I believe the dream that human civilization has been pursuing since ancient times is at last becoming a reality.

Today's supercomputers have made it possible to grasp change on an altogether different scale — from the formation of the universe to the motions of nuclear particles — and, as a result, to make very precise predictions about challenges like global warming and natural disasters, the saving of resources and energy, curing serious illness and much more.

There are several theories about the origin of the modern computer, but the famous ENIAC, invented to compute ballistic firing tables during World War II, was arguably the world's first general-purpose electronic computer. Later, computers were used to improve the efficiency of office work, and ever since we've seen computing evolve at an exponential rate to the point where IT is now indispensable across all industries. But while some people suggest the computer does not need to be developed much further, it's my opinion that the technology has just reached the starting line in terms of its ability to fulfill the role for which it was originally intended.

Vital new capabilities
The earthquake and tsunami that hit Japan last year left a huge scar. The floods in Thailand, the major earthquake in Turkey — all such natural disasters continue to wreak devastation around the world. What is clear to me is that ICT should be much more useful when it comes to predicting such events and minimizing their impact, with the goal of bringing about a sustainable society. For example, we more or less know how tsunamis occur, and supercomputers should be very helpful when it comes to predicting the likely damage and planning strategies around that.

However, there are problems in actually accomplishing this. Firstly, it requires detailed data on landscapes and seabeds; secondly, the processing ability of current supercomputers to cope with such data is far from sufficient. With the supercomputers available today, the simulation of a tsunami is limited to kilometers. Even if a supercomputer-driven simulation could predict the general direction of a wave, it would be far from capable of analyzing where people should evacuate to, or where we could effectively build wave breakers. The precision of the simulation required to make such decisions on human movement needs to be in terms of a few dozen meters. This means that we need computer capacities that are a million times better than current levels.
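Yamada does not spell out where the "million times" figure comes from, but a standard grid-refinement argument gets there: finer spatial resolution in three dimensions also forces proportionally smaller time steps. The sketch below is illustrative only, with round numbers assumed from his "kilometers" and "few dozen meters" remarks.

```python
# One way to see where a "million times" cost factor could come from
# (an illustrative scaling argument, not taken from the article): refining
# a 3-D simulation grid also forces proportionally smaller time steps.
coarse_m = 1000   # today's tsunami simulations: resolution "limited to kilometers"
fine_m = 30       # "a few dozen meters" needed for evacuation planning

refinement = coarse_m / fine_m                  # ~33x per spatial dimension
cells = refinement ** 3                         # 3-D grid: ~37,000x more cells
timesteps = refinement                          # finer grid -> ~33x more steps
print(f"Rough cost factor: {cells * timesteps:.2e}")   # ~1.2e6, about a million
```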
Answering the unanswerable
In my view, computers have just reached a capacity in which they can play new roles in areas such as simulation — roles that can only be fulfilled by computers. And the world's most powerful system, the "K computer," jointly developed by RIKEN, Japan's leading research institute, and Fujitsu, has opened this door. A new age has just dawned, I believe, one in which ICT can actively protect people's lives and ensure a sustainable future.

To achieve that aim, "digital landscapes" will need to be built — virtual landscapes that use three-dimensional digital data on such things as topography, buildings, structures, underground shopping centers, subways, power networks and gas pipe systems.
If this can be done, computers will enable us to pose questions that were hitherto unanswerable. If a mega-city is struck by a tsunami, how will it affect towering skyscrapers? What kind of chaos will be created by roads being shut off? How will flooding impact large underground shopping complexes? Even more than this, we will be able to plan ideal cities that can contain damage as much as possible, and make preparations such as identifying safe evacuation sites and escape routes. A digital landscape also has the potential to create many new businesses.

Of course, there are challenges that need to be tackled — such as the establishment of coherent policies between the different government departments responsible for maps, topography, structures and seabeds, and the need to maintain and use data on electricity, gas and subways, which are managed by separate private companies. We also need to invent and install three-dimensional sensing technology that would speed up the development of digital landscapes in a much simpler and more accurate way. It might be a huge undertaking — spread over a decade — but I want to dream of such a future.

There remain many other issues in the world that are far from being solved. Computers will contribute towards solutions across many other fields that are universal to mankind: ways to treat terminal cancer, to develop new ecologically friendly materials or to solve the mysteries of the universe. Indeed, while computers are already indispensable in industry, just around the corner is an age in which the performance of a supercomputer will determine competitiveness.

Sixty years ago, ENIAC was dismissed by many critics as something that would not work and was a waste of money. But it is a historical fact that ENIAC was the key to founding the ICT society as it is now. The K computer may have been crowned world number one, but its real achievements are yet to come. A few decades from now, supercomputers will be a crucial part of our social infrastructure. Nothing would delight me more than to look back on the K computer and say: that was where it all started.

IT TO THE MAX
The quest for the world's fastest — flexible and highly reliable — supercomputer.

Yuji Oinaga is head of Fujitsu's Next Generation Technical Computing Unit. With over 30 years' experience in supercomputer engineering, he led the development of the "K computer," jointly developed with Japan's leading research institute, RIKEN.

In developing the world's most advanced supercomputer, was reliability and an ability to run different workloads as important as speed?
We did not develop the supercomputer merely to compete on speed; reliability and efficiency were also vital. In general, in a system with 100 CPUs, the failure rate is said to be about once a year. However, if such a rate were applied to the K computer, this would mean there would be a few failures each day. We therefore needed to eliminate this problem during the development stage. Decisions were made to use materials with good track records, for example, and a water-cooling system was adopted, something not widely used for supercomputers. As a result, the organizers of the TOP500 evaluated our system highly for the fact that it realized an efficiency ratio of 93% during 29.5 hours of continuous operation, in addition to the calculation speed.
The goal of the K computer was to support a wide range of application programs, including five strategic application areas specified by the Japanese Ministry of Education, Culture, Sports, Science and Technology. The system's capabilities were designed to meet the requirements of diverse R&D fields, such as aerospace, manufacturing, biotechnology and disaster prevention. We used a massively parallel system design, the current global trend for supercomputer development, which has an intrinsic feature of being easier to adapt to different applications. As the system software, instead of proprietary code, we used open source software, for which applications can be easily developed.

Will some of the work on the K computer make its way into Fujitsu products?
These technologies are already fully utilized in the commercial supercomputer, the Fujitsu PRIMEHPC FX10. The technological advances in the supercomputer arena will also benefit other parts of the Fujitsu product portfolio, such as our server offerings.
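Oinaga's reliability arithmetic can be made explicit. A sketch assuming failures scale linearly with CPU count; the processor count comes from the Data Feed pages and the failure rate from his rule of thumb:

```python
# Making Oinaga's reliability point concrete (a sketch; assumes failures
# scale linearly with CPU count, as his rule of thumb implies).
failures_per_cpu_year = 1 / 100      # "100 CPUs fail about once a year"
k_computer_cpus = 88128              # processor count from the Data Feed page

failures_per_year = failures_per_cpu_year * k_computer_cpus
print(f"Naive expectation: {failures_per_year / 365:.1f} failures per day")
# -> ~2.4 per day, i.e. "a few failures each day" without extra engineering
```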
CASE STUDIES

World-changing superpowers
The world's fastest supercomputers are delivering solutions to some of the greatest challenges in business and society.

SPACE: Demystifying our cosmic origins

Set more than 5,000 metres above sea level on the Chajnantor plateau in the Chilean Andes, the Atacama Large Millimeter/submillimeter Array (ALMA) is destined to be the world's most powerful radio telescope — and the largest astronomical project ever conceived. As it becomes fully operational, its 66 high-precision giant antennas — which act as a single, giant telescope — will gather incomprehensible amounts of radio wave data from the distant universe with unprecedented sensitivity and resolution, allowing scientists to see back in time and expose a tantalizing picture of how the universe began and has evolved.

Capturing and processing that information flow from the 7- and 12-meter antenna dishes requires prodigious computing power. With over six years' involvement in the international project, global ICT company Fujitsu has developed a bespoke supercomputer, the Atacama Compact Array Correlator for ALMA, to make sense of the torrent of data; it is capable of performing interference processing in real time at 88 trillion operations per second. "The ACA Correlator is the fastest computer ever used at an astronomical site," say the project coordinators. It is also the highest in the world.

Despite numerous observers saying that designing a system for such an extreme environment was almost impossible, Fujitsu created a unique supercomputer, employing diskless storage, which allows it to perform effectively at a pressure of 0.5 atmospheres and in hostile desert conditions. (The Atacama Desert is reputedly the most arid place on earth, which, along with its altitude, makes it an ideal astronomical observation site.)

Image: Antennae Galaxies, a composite of ALMA and Hubble telescope observations.

Image credits: Space image: ALMA (ESO/NAOJ/NRAO). Visible light image: the NASA/ESA Hubble Space Telescope. Aircraft image: Shutterstock. Cancer research image: University of Tokyo Research Center for Advanced Science and Technology. Map: Geospatial Information Authority of Japan.
Image: Molecular dynamics simulation of antigens and antibodies.

HEALTH CARE: Accelerating anti-cancer drug discovery

The complex fight against cancer has a new, powerful ally — next-generation supercomputing. At the University of Tokyo's Research Center for Advanced Science and Technology (RCAST), medical researchers are employing cutting-edge supercomputing technology from Fujitsu to speed the identification of new drug candidates for cancer treatment.

In a world first, researchers are using molecular dynamics (simulating the physical movements of individual atoms and molecules) to design genome-based antibody drugs. The project is focused on developing effective treatments for cancer patients suffering from relapse or metastasis. And to produce the desired results, their approach requires at least 10 times the computing power of traditional IT-based drug development techniques.

"To work effectively inside the body, these new antibody drugs have to be precisely the right molecular shape," says project leader professor Tatsuhiko Kodama. "Without supercomputing power, such sophisticated simulations just wouldn't be possible."

In 2010, RCAST and Fujitsu completed development of a 300-node supercomputing cluster with a theoretical performance of 38 teraflops. The system is reducing the time it takes to design artificial antibodies from three or four years down to just a few months. And in future, by employing even greater supercomputer power (such as the "K computer," the world's fastest system), the team hopes to be able to develop an even wider range of antibody drugs.

Until now, R&D for antibody drugs has undergone two generational shifts, Fujitsu outlines. Starting in the 1990s, research involved animal testing and then "humanizing" the resulting antibodies for use in treatment on patients. The second generational shift, which began in the 2000s, led to the production of antibodies that can directly act on cancer utilizing radiation therapies. The new research project enables researchers to employ supercomputer simulations to design basic structures for antibody drugs, thereby developing revolutionary third-generation treatments with fewer side effects.
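For readers unfamiliar with molecular dynamics, the core of such a simulation is compact: compute inter-atomic forces, then step positions and velocities forward in time, millions of times. The toy sketch below uses a standard velocity-Verlet integrator on a single harmonic "bond"; every value in it is invented for illustration, and production antibody simulations apply the same loop to millions of atoms with far more elaborate force fields.

```python
# Toy molecular dynamics: one harmonic bond, velocity-Verlet integration.
# Purely illustrative; not the RCAST project's actual code.
def force(x, k=1.0, x0=1.0):
    """Restoring force of a harmonic bond stretched to length x."""
    return -k * (x - x0)

def velocity_verlet(x, v, dt=0.01, steps=1000, mass=1.0):
    """Integrate one degree of freedom; returns the position trajectory."""
    traj = []
    f = force(x)
    for _ in range(steps):
        x += v * dt + 0.5 * (f / mass) * dt * dt    # position update
        f_new = force(x)
        v += 0.5 * (f + f_new) / mass * dt          # velocity update
        f = f_new
        traj.append(x)
    return traj

# A bond stretched 20% past its rest length oscillates around x0 = 1.0:
path = velocity_verlet(x=1.2, v=0.0)
print(min(path), max(path))   # ~0.8 and ~1.2; energy is well conserved
```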
INDUSTRIAL INNOVATION: Enabling the simulation of complete aircraft prototypes

Computer simulation is already widely used in industrial design and testing — from shipbuilding to car crash testing. In the field of aircraft design and development, such simulation methods enable the analysis of the complex air flows which occur during flight. But the current generation of supercomputers which support that analysis can only work at the component level — simulating wings and fuselages, for example. Wider tests involve the use of wind-tunnel facilities and the creation of scale models of aircraft, as full-scale designs and facilities would be simply too expensive to build and maintain.

The next generation of supercomputers will change all that. Machines such as the K computer — the 10-petaflops machine developed by Japan's leading research institute, RIKEN, and ICT company Fujitsu — are expected to be able to simulate an aircraft as a whole, reducing the cost and time of development while supporting the design of aircraft with superior performance.

"One of the expectations of supercomputers is the creation of a 'wholly numeric wind tunnel,'" Fujitsu outlines. "Data from the full-scale aircraft under development can then be input so that experiments can be performed under various conditions." The availability of numerical wind tunnels will provide low-cost, short-timeframe design and development capabilities, together with improved safety, the company says. It will also enable the holistic investigation of energy efficiency and noise reduction.

TRANSPORTATION: Optimizing traffic flow through mass-sensing data analysis

With roughly a billion cars on the planet and congestion problems affecting most towns and cities, high-performance computing will have an increasingly important role to play in managing traffic flows, reducing pollution from vehicles and improving the environment of both drivers and pedestrians.

The SPATIOWL service sits at the cutting edge of such initiatives. Announced in June 2011 by ICT company Fujitsu and initially deployed in Japan, the cloud-based platform gathers and analyzes location and other data in real time from a massively distributed pool of sensors in vehicles, buildings, smartphones and a wide range of other end-points. By acting as an aggregation and integration hub, SPATIOWL is opening up the potential for corporate and government customers to develop innovative yet cost-effective location-based solutions that draw on that data — from the real-time routing of commercial vehicles to the delivery of new citizen services.

SPATIOWL is one of the company's first forays into big data analytics, with a focus on the "human-centric" aspects of location data. "Fujitsu is working to leverage massive volumes by capturing and analyzing the vast amounts of data generated by human activity and the movement of things, and harnessing it to develop new insights," outlines the company.

But it is the potential of the system to dramatically reduce congestion — and hence pollution — that could have the biggest impact on society. Already, for instance, Fujitsu has demonstrated how SPATIOWL can be used to analyze and re-route traffic in real time following a major event such as an earthquake. The company has also used the platform to develop an advanced telematics service for commercial vehicles that gives operators real-time visibility of their fleet and the ability to manage operations quickly and efficiently.

Today, the SPATIOWL service is powered by a range of different server options, but as data volumes grow, such applications will increasingly require supercomputer performance.

Image: Spatial awareness. Taxi demand hotspots on a rainy evening in Tokyo.
STRATEGIC VIEWPOINT

The petascale imperative
David Smith, Fujitsu's chief innovation and technology officer, argues that the latent power of "big data" will push high-performance computing into the cloud.

Illustration: Andy Gilmore.

David Smith is chief innovation and technology officer at Fujitsu's Global Business Group and has held a number of CIO and CTO roles in the technology and BPO sectors, as well as client relationship and business development roles.

We have reached an inflection point where the cost economics of IT and new models like cloud computing are enabling us to store and process previously unimaginable amounts of data. What's more, the proliferation of smart mobile devices and the drive to sensor-enable our world (the so-called Internet of Things) means organizations will require massive computing power to handle the data they gather, particularly if solutions require us to move to real-time processing.

Traditionally, high-performance computing (HPC) has been the exclusive domain of organizations that need to conduct complex research and simulations: for example, meteorologists, climate researchers, pharmaceutical companies developing new drugs, car manufacturers testing the safety of new vehicles, and scientists trying to predict natural disasters. Such endeavors require that complex algorithmic models are applied to vast volumes of data, and the more flops (floating-point operations per second — the standard measure of supercomputing power) you can fling at the problem, the better the results you can achieve in the time available. In areas such as weather forecasting and the prediction of natural disasters, processing has to be as close to real-time as possible.

But the bulk of commercial and public sector organizations to date have had neither the need nor the means to access such petascale power. Supercomputers don't come cheap, and few organizations can justify the considerable capital investment required to install such cutting-edge IT.

Of course, one generation's supercomputer is the next generation's consumable. Indeed, today's smartphones are 100 times smaller, 1,000 times more powerful and a million times cheaper than the (then state-of-the-art) supercomputer installed at MIT in 1965. But I believe organizations will have affordable access to the power of today's supercomputers over the next four to five years, via cloud-based solutions and services.
Yet this won't be a supply-led "solution looking for a problem." Rather, it will be driven squarely by commercial and social demand. Specifically, organizations will seek to extract value and efficiencies from ever-closer-to-real-time analysis of "big data" — the exponentially growing information generated by our increasingly connected digital world.

The statistics on the growth of the digital universe are truly mind-boggling. In 2011, the amount of information created and replicated surpassed 1.8 zettabytes (1.8 trillion gigabytes). By 2015, this will have more than quadrupled to around 8ZB. In 2010, there were 305 million smartphones on the planet. By 2015, analysts predict that more than 2 billion of the world's population will own a smartphone. At the same time we will see an explosion of low-cost, web-connected sensors embedded into all manner of objects and devices in the physical world.

Currently around three-quarters of the information online is generated by individuals and their interactions with web services such as blogs, Google, Facebook, Twitter and YouTube. (Even so, enterprises already interact with 80% of this content at some time during its digital life, for example trawling social networks for mentions of their company name.) To put this in context, on average in a single day people are generating 30 billion pieces of content on Facebook, performing around 1 billion searches on Google, sending 140 million tweets on Twitter and watching 3 billion videos on YouTube.

As sensors also proliferate, the vast range and diversity of information gathered by ubiquitous connected devices will generate exponentially more ways available data can be combined, sliced, diced, interpreted and presented. Already, we're becoming accustomed to the idea of public data stores and application mash-ups that deliver value through aggregating and analyzing data from disparate sources. The more information that's out there, and the more diverse and unstructured it is, the harder this task becomes. And there's no point in having the data if you can't do anything with it. Clearly, those organizations that succeed in making sense of it will gain a competitive advantage.

Different sectors will face different challenges, of course. Some (such as fast-moving consumer goods and financial services) are already used to analyzing vast amounts of data, but even they are seeing a year-on-year trebling that will ramp up their hunger for computing power. Another example is the utilities sector, which is pursuing the vision of smart grids equipped with web-connected meters to help promote energy conservation. Again, if they want to realize that vision it will require much greater processing power than has traditionally been available to them.

Public authorities face a similar challenge when it comes to the vision of smart cities, where sensors embedded into the urban environment can potentially automate and optimize such things as traffic flows, pollution control, parking, waste management, lighting and civic problem detection/resolution (e.g. water leaks). And if national governments are committed to open data — that is, giving free access to large amounts of public information not previously available, in a way that's truly useful for citizens — then they will also need vast computing capability in order to extract and present that information in a meaningful way.

All of these examples, and more, suggest there will be a huge appetite for affordable petascale computing among an increasingly diverse range of businesses and public-sector organizations. Almost every sector will face some big-data challenge, and it is this demand, I believe, that will accelerate the IT industry's drive to offer massively scalable HPC as a cloud service.

We can already see this happening in the public cloud space. Amazon recently introduced a cloud for HPC applications which, while not up to the petascale performance levels that will be required for real-time processing of big data, nonetheless demonstrates the growing potential for such services.

At Fujitsu, we expect to cascade technologies from our joint development with Japan's flagship research institute, RIKEN — which resulted in 2011's announcement of the "K computer," the world's fastest supercomputer — through our portfolio of commercial products. This will inevitably include cloud offerings once the requisite level of demand from customers becomes evident.

However, some social and technological challenges remain. For example, there are likely to be ongoing debates around the privacy implications of aggregating and analyzing public data, and these will need to be adequately addressed.
In addition, the IT industry is still playing catch-up when it comes to the development of smarter software for the intelligence/analysis layer — tools that can make sense of the vast volumes and diversity of information and data types out there.

But there's little question petascale cloud computing is coming. At Fujitsu, we recently introduced the PRIMEHPC FX10 supercomputer, which incorporates key technologies used in the K computer, including the next-generation processor design, advanced cooling and the six-dimensional mesh/torus interconnect technology ("Tofu"). PRIMEHPC FX10 has a theoretical performance of 23.2 petaflops, and it's only a matter of time before this and other HPC technologies begin to appear in our commercial cloud services, giving many more organizations an affordable way of accessing the computing speeds they will increasingly need to tackle their own — and the world's — biggest challenges.
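The growth statistics Smith cites imply a steep compound rate. A minimal sketch of the arithmetic, assuming smooth year-on-year growth:

```python
# The digital-universe numbers quoted above imply a steep compound growth
# rate; this sketch derives it (assumes smooth year-on-year growth).
zb_2011 = 1.8    # zettabytes created and replicated in 2011
zb_2015 = 8.0    # projected for 2015

years = 2015 - 2011
cagr = (zb_2015 / zb_2011) ** (1 / years) - 1
print(f"Implied growth: {cagr:.0%} per year")   # ~45% annually
# Four years of ~45% growth is what turns 1.8ZB into "more than quadruple."
```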
I: Global Intelligence for the CIO (www.i-cio.com)
