Introducing HPC with a Raspberry Pi Cluster

In this deck from FOSDEM 2020, Colin Sauzé from Aberystwyth University describes the development of a Raspberry Pi cluster for teaching an introduction to HPC.

"The motivation for this was to overcome four key problems faced by new HPC users:

* The availability of a real HPC system: running training courses can load the production system, and conversely a lack of spare resources on the production system can disrupt the training course.

* A fear of using a large and expensive HPC system for the first time, and worries that doing something wrong might damage the system.

* That HPC systems are very abstract, sitting in data centres that users never see; this makes it difficult for users to understand exactly what it is they are using.

* That new users fail to understand resource limitations; because of the vast resources in modern HPC systems, a lot of mistakes can be made before anything runs out. A more resource-constrained system makes these limits easier to understand.

The talk will also discuss some of the technical challenges in deploying an HPC environment to a Raspberry Pi, and the attempts to keep that environment as close to a "real" HPC as possible. The issues in trying to automate the installation process will also be covered."

Learn more: https://github.com/colinsauze/pi_cluster
and
https://fosdem.org/2020/schedule/events/

Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter

Introducing HPC with a Raspberry Pi Cluster

  1. Introducing HPC with a Raspberry Pi Cluster
     Colin Sauzé <cos@aber.ac.uk>
     Research Software Engineer, Super Computing Wales Project, Aberystwyth University
     A practical use of and good excuse to build Raspberry Pi Clusters
  2. Overview
     ● About Me
     ● Inspirations
     ● Why teach HPC with Raspberry Pi?
     ● My Raspberry Pi cluster
     ● Experiences from teaching
     ● Future Work
  3. About Me
     ● Research Software Engineer with the Supercomputing Wales project
       – 4-university partnership to supply HPC systems
       – Two physical HPCs
     ● PhD in Robotics
       – Experience with Linux on single-board computers
       – Lots of Raspberry Pi projects
  4. Inspiration #1: Los Alamos National Laboratory
     ● 750-node cluster
     ● Test system for software development
     ● Avoids tying up the real cluster
  5. Inspiration #2: Wee Archie/Archlet
     ● EPCC’s Raspberry Pi cluster
     ● Archie: 18x Raspberry Pi 2s (4 cores each)
     ● Archlet: smaller 4- or 5-node clusters
     ● Used for outreach demos
     ● Setup instructions: https://github.com/EPCCed/wee_archlet
     (Image from https://raw.githubusercontent.com/EPCCed/wee_archlet/master/images/IMG_20170210_132818620.jpg)
  6. Inspiration #3: Swansea’s Raspberry Pi Cluster
     ● 16x Raspberry Pi 3s
     ● CFD demo using a Kinect sensor
     ● Demoed at the Swansea Festival of Science 2018
  7. Why Teach with a Raspberry Pi Cluster?
     ● Avoids loading real clusters doing actual research
       – Less fear from learners that they might break something
     ● Resource limits are more apparent
     ● More control over the environment
     ● Hardware is less abstract
     ● No need to have accounts on a real HPC
  8. My Cluster
     ● “Tweety Pi”
       – 10x Raspberry Pi Model B version 1s
       – 1x Raspberry Pi 3 as head/login node
       – Raspbian Stretch
     ● Head node acts as a WiFi access point
       – Internet via phone or laptop
  9. Demo Software
     ● British Science Week 2019
       – Simple “estimate Pi with Monte Carlo methods” demo
       – MPI based
       – GUI to control how many jobs launch and to show queuing
     ● Swansea CFD demo
       – Needs more compute power: 16x Raspberry Pi 3 vs 10x Raspberry Pi 1
     ● Wee Archie/Archlet demos
       – Many demos available (I only found this recently)
       – https://github.com/EPCCed/wee_archie
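
The deck doesn't include the demo's source, so here is a minimal sketch of an MPI-based Monte Carlo Pi estimate of the kind slide 9 describes, assuming mpi4py on top of the cluster's MPICH install (the library choice, file name, and sample count are illustrative, not taken from the talk):

    # pi_estimate.py - Monte Carlo estimate of pi across MPI ranks (sketch)
    import random
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    samples_per_rank = 1_000_000
    random.seed(rank)  # give each rank its own random stream

    # Count random points in the unit square that land inside the
    # quarter circle of radius 1.
    inside = sum(
        1
        for _ in range(samples_per_rank)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )

    # Sum the per-rank counts onto rank 0 and print the estimate there.
    total_inside = comm.reduce(inside, op=MPI.SUM, root=0)
    if rank == 0:
        total_samples = samples_per_rank * size
        print("pi is approximately", 4.0 * total_inside / total_samples)

Launched with something like mpirun -n 10 python3 pi_estimate.py, the sampling spreads across the compute nodes and a single reduce gathers the counts, which also gives learners a visible link between node count and runtime.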
  10. Making a Realistic HPC Environment
     ● MPICH
     ● Slurm
     ● Quotas on home directories
     ● NFS-mounted home directories
     ● Software modules
     ● Network-booting compute nodes
  11. Network Booting Hack
     ● No PXE boot support on the original Raspberry Pi (or Raspberry Pi B+ and 2)
     ● Kernel + bootloader on the SD card
     ● Root filesystem on NFS; cmdline.txt contains:
       console=tty1 root=/dev/nfs nfsroot=10.0.0.10:/nfs/node_rootfs,vers=3 ro ip=dhcp elevator=deadline rootwait
     ● SD cards can be identical: a small 50 MB image, easy to replace
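
The server side of the NFS root isn't shown in the deck. Inferred from the nfsroot address in cmdline.txt above, the head node's /etc/exports entry might look something like this sketch (the subnet and export options are assumptions, not taken from the talk):

    # /etc/exports on the head node (10.0.0.10): share the read-only
    # compute-node root filesystem with the cluster subnet
    /nfs/node_rootfs 10.0.0.0/24(ro,no_root_squash,no_subtree_check)

Running exportfs -ra after editing reloads the export table; because the export is read-only and every node mounts the same tree, the SD cards really can stay identical.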
  12. Teaching Materials
     ● Based on the Introduction to HPC with Super Computing Wales carpentry-style lesson:
       – What is an HPC?
       – Logging in
       – Filesystems and transferring data
       – Submitting/monitoring jobs with Slurm
       – Profiling
       – Parallelising code, Amdahl’s law
       – MPI
       – HPC best practice
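
For reference, Amdahl's law, which the lesson covers alongside profiling, bounds the speedup of a program whose parallelisable fraction is p when run on N cores:

    S(N) = \frac{1}{(1 - p) + p/N}

Even with unlimited cores the speedup never exceeds 1/(1 - p), which makes profiling (measuring p) the natural first step before parallelising.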
  13. Experiences from Teaching: STFC Summer School
     ● New PhD students in solar physics
       – Not yet registered at universities, so no academic accounts
     ● 15 people each time
       – First time using HPC for many
       – Most had some Unix experience
     ● Used a subset of the Super Computing Wales introduction-to-HPC carpentry lesson
  14. Feedback
     ● Very positive
     ● Many seemed to enjoy playing around with SSH/SCP
       – First time using a remote shell for some
       – Others were more adventurous than they might have been on a real HPC
     ● Main complaint was lack of time (only 1.5 hours)
       – Only got as far as covering basic job submission
       – Quick theoretical run-through of MPI and Amdahl’s law
       – Probably have 3-4 hours of material
     ● Queuing became very apparent
       – 10 nodes, 15 users
       – “watch squeue” running on screen during the practical parts
  15. Problems
     ● Slurm issues on day 1
       – Accidentally overwrote a system user when creating accounts
     ● WiFi via laptop/phone was slow
       – When users connect to the cluster, it’s their internet connection too
       – Relied on this for access to the course notes
  16. Experiences from Teaching: Supercomputing Wales Training
     ● Approximately 10 people
       – Mix of staff and research students
       – Mixed experience levels
       – All intending to use a real HPC
     ● Used the Raspberry Pi cluster and a real HPC simultaneously
       – Same commands run on both
     ● Useful backup system for those with locked accounts
     ● Feedback was good
       – Helped make HPC more tangible
  17. Future Work
     ● Configuration management tool (Ansible/Chef/Puppet/Salt, etc.) instead of a script for configuration
     ● CentOS/OpenHPC stack instead of Raspbian
     ● Public engagement demo which focuses on our research
       – Analysing satellite imagery
       – Simulating the monsters from MonsterLab (https://monster-lab.org/)
  18. More Information
     ● Setup instructions and scripts: https://github.com/colinsauze/pi_cluster
     ● Teaching material: https://github.com/SCW-Aberystwyth/Introduction-to-HPC-with-RaspberryPi
     ● Email me: cos@aber.ac.uk
