Victoria A. White, Head, Computing Division, Fermilab

  • CDF and D0 are both highly distributed collaborations, with many physicists and computing resources. Making use of that potential has been an issue from the start for D0 (Monte Carlo generation) and, more recently, for CDF. These efforts are going to grow. The SAM data handling system, used from the beginning by D0 and recently adopted by CDF, is being used and extended for grid/distributed computing.

1. Global Scientific Collaborations and Grids: A View from Fermilab
2. The 3 Top Issues for Particle Physics Today
   • (1) Collaboration (2) Collaboration (3) Collaboration
   • Global within the field – needed for every stage of an experiment
     • to build the detector and the software, operate and monitor the experiment, produce Monte Carlo data, process data, analyze data, and participate in the physics
   • With other sciences – for funding we need recognition and partnerships outside our field, and also to train the next generation of scientists
   • With educators and communicators – for public education and to make a difference in society
     • This is needed worldwide – on both sides of the digital divide
3. The Fermilab Experimental Program: Statistics
   • "Active" experiments from the 2003 Research Program Workbook
   • Of 213 institutions involved, 114 are non-US
   • Of 1,916 physicists, 753 are non-US
   • Of 699 students, 234 are non-US
4. Significant Contributions to the Fermilab Program from the "Other Side" of the Digital Divide
   • Brasil
   • Argentina
   • Colombia
   • Mexico
   • Czech Republic
   • Taiwan
   • Russia
   • China
   • Korea
   • India
   • … and others
5. Highlighting Brasil (1)
   • History of truly significant contributions to Fermilab experiments since 1983
     • Series of charm and heavy flavor experiments
       • E791 was ahead of its time in volume of data
       • About 10% of the E791 data was analyzed in Rio – one of only 4 analysis institutions (Ohio, Mississippi, Fermilab, CBPF)
       • Significant contribution to the physics – that data led to the first publication for E791
       • FOCUS
     • SELEX
   • CBPF/LAFEX and Fermilab hosted CHEP95 – pushing networking and videoconferencing technology
6. Highlighting Brasil (2)
   • Most recently, big contributions to the D0 experiment
     • Designed and built the pots for the Forward Proton Detector, giving the D0 experiment a new physics reach
     • Physics leadership – far beyond the size of the effort
   • Leon Lederman, Fermilab Director Emeritus, has throughout provided support and encouragement for physics in South America
7. IT Technology and Physics
   • Network connectivity and IT infrastructure must be available at some minimal level in order to participate in a physics experiment
     • Being on the wrong side of the digital divide is painful
   • We need to help ensure that governments understand
     • just how important this contribution is to the science, and
     • that high energy physics is at the forefront of showing how global e-science can be carried out, and of demonstrating the use of technologies and behaviors that will profoundly affect education, commerce, and society
8. Fermilab Ongoing Scientific Program, 2004
   • Run II (CDF, D0, Accelerator Upgrades) ***
   • MINOS *
   • MiniBooNE
   • Sloan Digital Sky Survey **
   • CMS *****
   • Lattice QCD theory collaboration **
   • Pierre Auger and CDMS
   • Test beams
   • BTeV in pre-project stage, preparing for review ****
   • SNAP, off-axis neutrinos, and a few other efforts at the R&D or LOI stage
   • * indicates a large global collaboration / petabyte datasets / massive distributed computational power needed
   • And R&D for the Linear Collider – a worldwide effort, we assume
9. Inclusive Worldwide Collaboration is Essential for Particle Physics
   • What are we doing at Fermilab to help this?
     • Networks and network research
     • Grids – lots on this, important strategically
     • Guest Scientist program
     • Education and Outreach program
     • Experiment sociology and leadership – changes
     • Videoconferencing
     • Virtual Control Rooms
     • Physics Analysis Center for CMS
     • Public Relations
10. Networks
   • Starting a year ago, Fermilab moved aggressively to make a plan for ~3 GigE connectivity in the (fairly near) future
     • Dark fiber to Starlight (contract just about in place, finally)
     • Worked on the Office of Science strategic networking plan
       • Expect a Metropolitan Area Network (MAN) in the Chicago area linking ANL, Fermilab and Starlight in a ring
     • Public planning processes and reports are important – e.g. the TAN report, ICFA SCIC
       • We try to participate whenever we are invited
11. ESnet in 2003: OC192 and OC48 Links Coming Into Service; Consider Links to US HENP Labs; Evolution Not Sufficient
12. ESnet Proposal for a Chicago-Area MAN (Metropolitan Area Network)
14. Network Research
   • Also in the past year, Fermilab chose to become a beginning player in the network research arena
   • As Harvey Newman often says, raw bandwidth is not enough – we need to learn how to use it, monitor the network, and look for research partners and opportunities
     • Joining with several partners on a proposal to exploit UltraNet (the DOE research network), and with Caltech et al. on the UltraLight program of work/proposal
     • Some international opportunities for broader collaborations
       • UK, CA, Czech Republic, Netherlands, DataTag, ?
   • The research agenda pushes provisioning of the network
15. Network Monitoring
   • A modest effort to help with Les Cottrell's IEPM and PingER projects (a minimal latency-probe sketch in that spirit follows below)
   • Modest efforts by the Fermilab network group to help track down network bottlenecks and problems
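To make the flavor of such end-to-end monitoring concrete, here is a minimal sketch of a latency probe. It is not IEPM or PingER code; the target hosts, port, and probe counts are hypothetical examples chosen only for illustration.

```python
# Illustrative end-to-end latency probe in the spirit of PingER-style monitoring.
# Hosts, port and sample counts are hypothetical; this is not IEPM/PingER code.
import socket
import statistics
import time

SITES = ["www.fnal.gov", "www.cern.ch"]  # example monitoring targets (assumed)

def tcp_latency_ms(host, port=80, samples=5, timeout=5.0):
    """Measure TCP connect latency in milliseconds as a rough network probe."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                times.append((time.perf_counter() - start) * 1000.0)
        except OSError:
            pass  # a failed probe is simply not counted
    return times

if __name__ == "__main__":
    for site in SITES:
        rtts = tcp_latency_ms(site)
        if rtts:
            print(f"{site}: median {statistics.median(rtts):.1f} ms "
                  f"({len(rtts)} of 5 probes succeeded)")
        else:
            print(f"{site}: unreachable")
```

In practice, projects like PingER archive measurements of this kind over long periods so that trends across regions, including across the digital divide, can be compared.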
16. Grids for Science
   • There is no doubt that computing resources for particle physics are going to be distributed globally and shared between "Virtual Organizations"
     • This is already happening with the Run II, BaBar, RHIC and LHC experiments
17. LHC: Key Driver for Grids
   • Complexity: millions of individual detector channels
   • Scale: PetaOps of CPU, hundreds of petabytes of data
   • Distribution: global distribution of people and resources
   • CMS Collaboration: 2000+ physicists, 159 institutes, 36 countries
18. Grid Projects and Coordinating Bodies
   • Fermilab is involved in numerous Grid projects and coordination bodies
     • PPDG, GriPhyN and iVDGL – the Trillium US Grid projects
     • LCG – the GDB, SC2 (and the PEB until recently), plus bodies and working groups on security, the operations center, rollout, etc.
     • SRM – storage system standards
     • Joint Technical Board and interoperability work
     • Global Grid Forum – Physics Research Area
     • Global Grid Forum – Security Research Area
     • Open Science Grid – meta-level collaboration and coordination
     • … and probably some I forgot
19. Fermilab Grid Strategy
   • Strategy (from a Fermi-centric view)
     • Common approaches and technologies across our entire portfolio of experiments and projects
   • Strategy (from a global HEP view)
     • Work towards common standards and interoperability
   • At Fermilab we are working aggressively to put all of our computational and storage fabric on "the Grid" and to contribute constructively to the interoperability of Grid middleware
20. Successes – D0 and CDF
   • SAM-GRID – a functional Grid system developed for D0 and now in use by CDF, providing data storage with metadata, data delivery and caching, job execution (JIM) and monitoring (a conceptual sketch of the data handling pattern follows below)
     • SAM "stations" worldwide
     • Lots of real operational experience and solutions for robustness and scalability
     • Evolving to use Globus- and Condor-based "standards"
       • Supports GridFTP and other storage access / file transfer methods
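To illustrate the data handling pattern this slide describes (datasets selected by metadata, files delivered into a local cache at a station), here is a small conceptual sketch. The class and method names are invented for illustration and are not the actual SAM interfaces.

```python
# Hypothetical sketch of a metadata-driven data handling pattern in the style of
# SAM (dataset selection by metadata, delivery into a local disk cache).
# Class and method names are invented; this is NOT the SAM API.
import os
import shutil

class FileCatalog:
    """Toy metadata catalog: maps metadata queries to logical file names."""
    def __init__(self, entries):
        # entries: list of dicts such as {"lfn": "run1234_evt.raw", "run": 1234, "stream": "all"}
        self.entries = entries

    def select(self, **criteria):
        """Return logical file names whose metadata match all given criteria."""
        return [e["lfn"] for e in self.entries
                if all(e.get(k) == v for k, v in criteria.items())]

class Station:
    """Toy 'station': delivers files into a local cache, reusing cached copies."""
    def __init__(self, catalog, storage_dir, cache_dir):
        self.catalog = catalog
        self.storage_dir = storage_dir   # stands in for mass storage / a remote source
        self.cache_dir = cache_dir
        os.makedirs(cache_dir, exist_ok=True)

    def deliver(self, **criteria):
        """Resolve a metadata query and return local paths, copying on cache miss."""
        local_paths = []
        for lfn in self.catalog.select(**criteria):
            cached = os.path.join(self.cache_dir, lfn)
            if not os.path.exists(cached):           # cache miss: fetch from storage
                shutil.copy(os.path.join(self.storage_dir, lfn), cached)
            local_paths.append(cached)
        return local_paths
```

The real system layers job submission (JIM), monitoring, and Grid transfer protocols such as GridFTP on top of this basic catalog-plus-cache pattern, all of which are omitted here.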
21. SAM-GRID System at D0 – Summary of Resources (DØ)
   • 56 stations, 900 registered nodes, 600 registered users
   • 1.5M files, 40 TB total disk cache
   • Integrated consumption, 3/02 to 3/03: 4.0M files and 1.2 PB consumed
   • Deployment ranges from regional centers to analysis sites
22. SAM at a Glance – D0 and CDF
   • http://d0db.fnal.gov/sam_local/SamAtAGlance/
     • Includes stations in the U.S., Czech Republic, India, France, UK, Netherlands, Germany and Canada
       • Monitored
       • Drill down to display jobs running and files being delivered
   • http://cdfdb.fnal.gov/sam_local/SamAtAGlance/
23. Successes – US-CMS
24. Grid2003: An Operational Grid – a Huge Success!
   • 28 sites (2100-2800 CPUs), including Korea
   • 400-1100 concurrent jobs
   • 10 applications
   • Running since October 2003
   • http://www.ivdgl.org/grid2003
25. Grid2003: A Collaborative Effort
   • Trillium Grid projects
     • PPDG + GriPhyN + iVDGL
     • (US-ATLAS, US-CMS, BTeV, LIGO, SDSS, computer science)
   • US-ATLAS and US-CMS projects
     • Fermilab, LBL, Argonne
     • U. New Mexico, U. Texas Arlington
   • Korean site
     • Kyungpook National University (CMS)
   • New sites
     • University of Buffalo (CCR)
26. Grid2003: 3 Months of Usage
27. Open Science Grid
   • Goals (http://www.opensciencegrid.org/)
     • Support the US-LHC research program and other scientific efforts
     • Federate with LCG and peer with EGEE in Europe
     • Involve laboratories (DOE) and universities (NSF)
   • Getting there: Grid2004 (OSG-1), Grid2005 (OSG-2), …
     • A series of releases, each increasing functionality and scale
     • A persistent, production-quality Grid plus a Grid laboratory
   • Jan. 12 meeting in Chicago
     • More than 60 participants from labs and universities, computer science, physics and biology, and the Grid projects
28. Federating Grids
   • As the US representative to the LHC Computing Grid's Grid Deployment Board (LCG GDB), I have repeatedly pushed the case for the LHC Computing Grid to be a "Grid of Grids", not a distributed computing center (a toy sketch of the idea follows below)
     • Resources must be able to belong to more than one Grid
       • e.g. NorduGrid, UK Grid, Open Science Grid, TeraGrid, Region XYZ Grid, University ABC Grid and the Fermilab Grid!
     • Grids must learn to interoperate
     • Governance, policies, security, resource usage and service level agreements must be considered at many levels, including at the level of a federation of Grids
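As a toy illustration of the "Grid of Grids" point (not any real middleware), the sketch below shows a single compute resource registered with more than one grid, and a federation-level check that respects each member grid's own access policy. All names, virtual organizations and policies here are hypothetical.

```python
# Toy illustration of a "Grid of Grids": one resource can belong to several grids,
# and a federation brokers placement by consulting each grid's own policy.
# Names and policies are hypothetical; this is not real grid middleware.
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    cpus: int
    grids: set = field(default_factory=set)   # a resource may belong to >1 grid

@dataclass
class Grid:
    name: str
    allowed_vos: set                           # virtual organizations this grid accepts

class Federation:
    def __init__(self, grids, resources):
        self.grids = {g.name: g for g in grids}
        self.resources = resources

    def candidate_sites(self, vo, cpus_needed):
        """Return resources a VO can reach through at least one member grid."""
        return [r for r in self.resources
                if r.cpus >= cpus_needed
                and any(vo in self.grids[g].allowed_vos for g in r.grids)]

# Example: the same cluster is visible through both a regional grid and OSG.
osg = Grid("OpenScienceGrid", allowed_vos={"cms", "dzero"})
regional = Grid("RegionXYZGrid", allowed_vos={"dzero"})
cluster = Resource("example-farm", cpus=512, grids={"OpenScienceGrid", "RegionXYZGrid"})

fed = Federation([osg, regional], [cluster])
print([r.name for r in fed.candidate_sites("cms", cpus_needed=100)])  # ['example-farm']
```

The design point is that the federation brokers across grids rather than owning the resources: each grid keeps its own membership and policy, which is exactly the governance concern raised in the last bullet above.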
29. Inclusive Worldwide Collaboration is Essential for Particle Physics
   • What are we doing at Fermilab and in Fermilab experiments to help this?
     • Networks and network research
     • Grids – lots on this, important strategically
     • Guest Scientist program
     • Education and Outreach program
     • Experiment sociology and leadership – changes
     • Videoconferencing
     • Virtual Control Rooms
     • Physics Analysis Center for CMS
     • Public Relations
30. Support for Guest Scientists and Engineers
   • Although budgets are tight, Fermilab (and especially the Computing Division) wants to encourage and support a guest scientist and guest engineer/software engineer program
     • Tremendously beneficial to the experiments
     • Tremendously important for ongoing collaboration and communication
31. Education
   • QuarkNet meets "Grid"
   • The traditional K-12 education program at Fermilab is working with the Grid projects and the Computing Division on the educational opportunities afforded by the Grid
   • Open Science Grid will have a significant educational component
32. Sociology and Leadership Changes
   • Increasing awareness at Fermilab of the need to include those outside Fermilab
     • Meeting times that take account of time zones
     • Videoconferencing
     • Posting materials and documenting
     • Leaders not resident at Fermilab
33. Videoconferencing
   • We do more and more at Fermilab each year – 5 new fully equipped conference rooms last year
     • A total of 19 rooms, all Polycom based
   • Fermilab is instituting better "official" support for VRVS and H.323-based conferencing
     • New server very soon
     • Expect to see growth in desktop video in the next 2 years
     • User requirements meeting soon, to plan the next steps
34. Virtual Control Room
   • MINOS has one at Fermilab for a detector at the Soudan Mine in Minnesota
     • Control Room Logbook
   • US CMS is going to build one at Fermilab
     • Now we in the US are the ones who will be far from the apparatus and the action!
     • We hope our work in the next few years will be broad-reaching and benefit all who are "far from the experiment"
35. CMS Physics Analysis Center(s) in the U.S.
   • Serious thought is going into this now. What is a PAC?
   • We can't all go to CERN, all the time, to do physics
     • We must empower physicists to do analysis away from CERN
     • Technology is important – networks, videoconferencing, access to data
       • But there is more
     • What do we need at Fermilab to have a vibrant center for physics where people come to work together, both physically and virtually?
   • The effort we put into this may have far-reaching effects on the "democratization of science"
36. Public Relations
   • Judy Jackson, head of Public Affairs at Fermilab, works closely with many other public affairs groups – at CERN, SLAC and elsewhere
   • The Interactions.org website
   • Forming a strategic Grid communications group – a plan for this was presented at a recent Open Science Grid meeting
37. Role of Science in the Information Society
   • The Fermilab/SLAC exhibit from SC2003 on networking, Grids, storage and simulations recently went to the RSIS exhibition in Geneva
38. Inclusive Worldwide Collaboration is Essential for Particle Physics
   • What are we doing at Fermilab to help this?
     • Networks and network research
     • Grids – lots on this, important strategically
     • Guest Scientist program
     • Education and Outreach program
     • Experiment sociology and leadership – changes
     • Videoconferencing
     • Virtual Control Rooms
     • Physics Analysis Center for CMS
     • Public Relations
39. IT Technology and Physics
   • Network connectivity and IT infrastructure must be available at some minimal level in order to participate in a physics experiment
     • Being on the wrong side of the digital divide is painful
   • We need to help ensure that everyone – not just governments – understands
     • just how important this contribution is to the science, and
     • that high energy physics is at the forefront of showing how global e-science can be carried out, and of demonstrating the use of technologies and behaviors that will profoundly affect education, commerce, and society
40. Conclusions
   • We must keep working on all fronts to collaborate, to educate, and to make the case for science and for the information technology and networking improvements that are essential for success – everywhere
     • in particular, to ensure that the digital divide does not exclude physicists
