
Federation and Interoperability in the Nectar Research Cloud


Audience Level
Beginner

Synopsis
The Nectar Research Cloud provides an OpenStack cloud for Australia’s academic researchers. Since its inception in 2012 it has grown steadily to over 30,000 CPU cores, with over 10,000 registered users from more than 50 research institutions. It differs from many clouds in being a federation across eight organisations, each of which runs cloud infrastructure in one or more data centres and contributes to a distributed help desk and user support. A Nectar core services team runs centralised cloud services. This presentation will give an overview of the experiences, challenges and benefits of running a federated OpenStack cloud, and a short demonstration of using the Nectar cloud. We will also describe current approaches to extending this federation to encompass other institutions, including some in New Zealand, to extending the infrastructure using commercial cloud providers, and to moving towards interoperability with the growing number of international science and research clouds through the new Open Research Cloud initiative.

Speaker Bio
Dr Paul Coddington is a Deputy Director of Nectar, responsible for the Nectar national Research Cloud, and also Deputy Director of eResearch SA. He has over 30 years’ experience in eResearch, including computational science, high-performance and distributed computing, cloud computing, software development, and research data management.



Federation and Interoperability in the Nectar Research Cloud

  1. National Research Cloud. Paul Coddington, Deputy Director, Research Platforms. NeCTAR (National eResearch Collaboration Tools and Resources) is supported by the Australian Government through the National Collaborative Research Infrastructure Strategy to establish eResearch infrastructure in partnership with Australian research institutions, organisations and research communities. The University of Melbourne has been appointed as the Lead Agent. nectar.org.au
  2. NeCTAR: National eResearch Collaboration Tools and Resources
     • Funded by the Commonwealth Government (EIF and NCRIS) since 2010
     • Cloud-based infrastructure to enable research collaboration across institutional boundaries
     • Enables the research community to store, access, share, and analyse data
     National Research Cloud
     • Computing infrastructure, software and services, at scale
     • Self-service capability to quickly provision resources
     Virtual Laboratories
     • Domain-oriented online environments
     • Combine research data, models, analysis tools and workflows
     • Support national and international collaborative research
  3. NeCTAR enhances Australian research. It enhances the capability and competitiveness of Australian research by:
     • Facilitating knowledge-centred innovation
       § Innovative infrastructure enabling innovative research methodologies
       § Democratising access to sophisticated digital methods
     • Accelerating access to research data, tools and models
       § Bringing together access to modelling and observation; platforms, tools and applications to derive knowledge from data
     • Removing barriers to collaboration
       § Supporting cross-institutional and international research collaboration: people build collaborations, and technology can reduce barriers to collaboration
  4. NeCTAR Virtual Laboratories
     • Climate and Weather Science Laboratory. Lead: Bureau of Meteorology, 6 partners. Integrated environment for climate and weather science modelling and data.
     • Genomics Virtual Lab. Lead: University of Queensland/University of Melbourne, 9 partners. Easy access to genomics tools and resources for Australian biologists.
     • Endocrine Genomics Virtual Lab. Lead: University of Melbourne, 7 partners. Statistical power for clinical research.
     • Marine Virtual Lab. Lead: University of Tasmania, 8 partners. Ocean observations and modelling to improve planning for marine and coastal environments.
     • All Sky Virtual Observatory. Lead: Astronomy Australia Limited, 4 partners. Theoretical and observational astronomy data, simulations and tools accessible from your desktop.
     • Biodiversity and Climate Change Virtual Lab. Lead: Griffith University, 18 partners. Simplifies biodiversity and climate change modelling.
     • Humanities Network Infrastructure (HuNI). Lead: Deakin University, 13 partners. Integrating 28 of Australia’s most important cultural datasets.
     • Characterisation Virtual Lab. Lead: Monash University, 11 partners. Integrating Australia’s key research imaging instruments with data and analysis tools on the cloud.
     • Geophysics Virtual Lab. Lead: CSIRO, 7 partners. Easy access to geophysics workflows, simulations and datasets.
     • Alveo (Human Communications Sciences). Lead: Western Sydney University, 16 partners. Studying speech, language, text, and music on a larger scale.
     • Industrial Ecology Virtual Laboratory. Lead: Sydney University, 9 partners. Supporting comprehensive environmental carbon footprinting and sustainability assessments.
     Infrastructure partnerships:
     • High demand: funded a quarter of proposals
     • Research institution led, addressing identified research priorities
     • Highly networked: over 35 universities and research organisations participating, with over 1:1 co-investment
     • Collaboratively building collaborative infrastructure
  5. NeCTAR Research Cloud: supporting innovation and collaboration in the business of research.
     • Stemformatics: stem cell data visualisation on the cloud. Find and visualise interesting genes in datasets from leading stem cell laboratories on the Research Cloud. Over 400 users nationally; 100 cores, multi-site; NCRIS supported.
     • Plant Energy Biology Centre of Excellence: building collaboration on the Research Cloud. Researchers study how plants capture energy from sunlight and how they use that energy to grow and develop. Hosting collaborations with the Max Planck Institute and the Beijing Genomics Institute on the NeCTAR Research Cloud. “NeCTAR makes it much easier, much faster. It means more collaborations — projects that would have just been too hard to go ahead.” Professor Ian Small, Laureate Fellow, West Australian Scientist of the Year 2015.
     • Cancer Therapeutics CRC: access to cancer research data, tools and visualisation on the NeCTAR Cloud. Providing access to analysis and visualisation tools, and over 30 TB of cancer research data on the Research Cloud. The Nectar choice was easy, and the migration process seamless. “The service, support and responsiveness that we have received from the Nectar team has been first class, and feels like an extension to our own internal support services.” Paul Reeve, Director of Operations, Cancer Therapeutics CRC.
  6. The NeCTAR Research Cloud: a world first. The NeCTAR Research Cloud is a partnership between 8 institutions and research organisations that operate Australia’s first federated research cloud:
     • University of Melbourne
     • National Computational Infrastructure (NCI)
     • Monash University
     • Queensland Cyber Infrastructure Foundation (QCIF)
     • eResearch SA (eRSA)
     • University of Tasmania
     • Intersect, NSW
     • iVEC, WA
     A single integrated cloud operated by 8 national partners and supporting over 10,000 research users.
  7. Cloud 7-year timeline (2010–2016):
     • 2010: Nectar begins at U. Melbourne; proof of concept launched, 300 cores.
     • 2011: National cloud workshops; OpenStack selected in April 2011; first call for nodes, Sept 2011.
     • 2012: Melbourne, 1920 cores, Q1 2012; second call for nodes, May 2012; Melbourne, 3840 cores, late 2012.
     • 2013: QCIF, 512 cores, Q2 2013; Monash, 2560 cores, Q2 2013.
     • 2014: eRSA, 2560 cores, Q1 2014; NCI, 2560 cores, Q1 2014; UTas, 1408 cores, Q2 2014; iVEC node, 2944 cores, Q4 2014; Intersect node, 4352 cores, Q4 2014.
     • 2016: Science clouds program begins.
  8. Some facts. Provisioning resources at scale to researchers:
     • Single sign-on with university username and password
     • Supporting diverse needs across the breadth of Australian research
     • Any researcher, anywhere
     40,000 CPU cores; 4 petabytes; 10,000+ registered users since Jan 2012. http://status.rc.nectar.org.au
  9. Nectar Federation. A single national cloud interface:
     • OpenStack cells to support 8 regional sites
     • Users can request a site, or deploy anywhere (see the code sketch after this slide)
     National services:
     • Cloud dashboard (Horizon) and API
     • Authentication (Keystone, AAF)
     • Image repository (Glance)
     Federated services:
     • Object store (Swift)
     • Compute (Nova) and volume storage (Cinder)
     • User support, help desk, user guides, documentation
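  As a concrete illustration of "request a site, or deploy anywhere", the following minimal sketch uses the openstacksdk Python library to list the federation's availability zones and boot an instance at one site. The clouds.yaml entry name ("nectar"), the zone name ("melbourne") and the image and flavour names are illustrative assumptions, not confirmed Nectar identifiers.

      # Minimal sketch: site-targeted launch on a cells-based OpenStack cloud.
      import openstack

      # Credentials come from a clouds.yaml entry; "nectar" is an assumed name.
      conn = openstack.connect(cloud="nectar")

      # Each federated site appears as a Nova availability zone.
      for zone in conn.compute.availability_zones():
          print(zone.name)

      # Hypothetical image and flavour names, for illustration only.
      image = conn.compute.find_image("Ubuntu 16.04 LTS")
      flavor = conn.compute.find_flavor("m1.small")

      server = conn.compute.create_server(
          name="demo-instance",
          image_id=image.id,
          flavor_id=flavor.id,
          # Assumed zone name; omit this argument to let the scheduler
          # place the instance anywhere in the federation.
          availability_zone="melbourne",
      )
      conn.compute.wait_for_server(server)
      print(server.status)

  Omitting availability_zone is the "deploy anywhere" case: placement is left to the federation-wide scheduler. Network selection is also omitted here; a cloud with multiple networks would need a networks argument as well.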
  10. Nectar Federation. OpenStack higher-level services:
      • Data and analytics: Trove, Sahara, Gnocchi
      • Application services: Heat, Murano, Magnum
      • Software-defined networking: Neutron
      • Storage, backup and recovery: Manila
      National and local resources:
      • Nationally and locally funded cloud resources
      • National and/or local resource allocations
      • National standards and local customisations
      • Standard and specialised instance configurations (GPUs etc.)
      • ‘Golden images’ and community contributed images (see the image-listing sketch after this slide)
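  To show how curated ‘golden images’ and community contributed images can coexist in a shared Glance repository, here is a minimal sketch using openstacksdk and Glance's standard visibility filter; the "nectar" cloud name is again an assumed clouds.yaml entry.

      # Minimal sketch: list curated public images and community images
      # from a shared Glance image repository.
      import openstack

      conn = openstack.connect(cloud="nectar")  # assumed clouds.yaml entry

      # 'Golden images' curated by the operators are typically public.
      for image in conn.image.images(visibility="public"):
          print("public:", image.name)

      # Community images are hidden from default listings until the
      # 'community' visibility filter is requested explicitly.
      for image in conn.image.images(visibility="community"):
          print("community:", image.name)

  The community visibility level is a standard Glance feature that lets any project share an image cloud-wide without cluttering the default image list, which fits the national-repository model described above.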
  11. NeCTAR Research Cloud Ops Team: a federation of 8 operators (~16 FTE total), comprising Core Services, Cloud Operators and a Distributed Help Desk.
  12. Activities used to “glue” distributed ops:
      • Six-monthly tech and operations workshops: physically bringing Core Services, Node Operators and the DHD together for 2 days
      • Fortnightly RC-ops meetings: video conference of Core Services, Node Operators and some DHD
      • Weekly Core Services meetings: video conference of Core Services staff from UoM, Monash and NCI
      • Fortnightly DHD meetings: video conference of the 5 nodes participating in the DHD
      • Monthly Control Group meetings: video conference of Node Management, the Nectar Directorate, DHD and Ops Management
  13. Federation challenges. Collaboration is great but not always easy:
      • Federated organisations in Nectar are all very different
      • Funding contracts and OLAs with Nodes
      • Have to make it worthwhile to federate into a national service
      • Scaling out will be interesting; a hierarchical model is key
      Security, access, networking:
      • Security challenges of broad-access self-service versus a managed service
      • Incident response within Nodes and across the federation
  14. Challenges. Access and cost recovery:
      • Costs are covered by institutions, not directly by users (this has pros and cons)
      • The allocations process has been fairly ad hoc; it is moving to a more structured model
      • Capacity management across multiple Nodes
      Authentication and authorisation:
      • The Australian Access Federation (AAF) has worked very well, with some issues
        § People in institutions not in the AAF (e.g. international, government)
      • Now trialling NZ access via their SAML federation
      • Looking forward to eduGAIN providing international access via SAML
      • Authorisation is not so straightforward
  15. Future directions: international science clouds. Nectar pioneered the way, but others have followed. Over 22 international science clouds have been established (OpenStack-based):
      § CERN cloud: 250,000 cores
      § Jetstream at Indiana University: 15,000 cores
      § Chameleon NSF cloud: 16,000 cores
      § UTSA Open Cloud Institute: 12,000 cores
      § Grid'5000 in France: 8,000 cores
      § CLIMB UK cloud: 7,680 cores (4 universities)
      § Compute Canada Research Cloud (a distributed national resource)
      § European Open Science Cloud
      Nectar is pursuing interoperability with other science clouds.
  16. Future directions: interoperability. We want to:
      • Make it easy for researchers to collaborate and share infrastructure, data and applications across institutions, states, countries and research domains
      • Reuse applications and workflows developed elsewhere
      • Make use of external resources in private or public clouds
      • Add Nectar resources to international projects
      This is pursued through involvement with:
      • The OpenStack Scientific Working Group
      • The Global e-Infrastructures Interoperability Working Group
      • The Open Research Cloud Congress (and Declaration)
      • Other research clouds
        § Working with NZ institutions to develop a NZ research cloud
  17. National eResearch Collaboration Tools and Resources. www.nectar.org.au
