In this video from the DDN User Group Meeting at ISC'13, Dr. Daniel Hanlon from University College London presents: Advancing Research at London's Global University.
"As UCL's storage demands grow, the university expects to build a storage foundation that will scale up to 100PB. Looking for a storage solution that was massively scalable yet simple to manage as part of the first phase of the infrastructure build out, UCL will use DDN object storage technology to store up to 600TB of research data. DDN object storage capabilities also will be able to empower UCL researchers to collaborate without having to worry about data reliability, compliance obligations or long-term retention of critical research assets."
Learn more: http://www.ddn.com/press-releases/2013/ucl-selects-ddn-object-storage-for-cloud-infrastructure
Watch the presentation video: http://inside-bigdata.com/video-advancing-research-at-londons-global-university/
In this video from Moabcon 2013, Daniel Hardman from Adaptive Computing presents: Practical Guidelines for Moab Stacks.
Learn more at:
http://www.adaptivecomputing.com/company/news-and-events/events/moabcon-2013/moabcon-2013-full-agenda/
Watch the video: http://insidehpc.com/?p=36350
In this presentation, Jill King from Adaptive Computing describes the 2013 Beowulf Bash. The free event will take place on Monday, Nov. 18 at 9pm at the Wynkoop Brewery in Denver.
Learn more: http://xandmarketing.com/beobash13/
Watch the video presentation: http://wp.me/p3RLHQ-aO3
In this slidecast, Sumit Gupta from Nvidia discusses the latest product news on GPU computing for HPC.
* IBM and NVIDIA Partner to Build Next-Generation Supercomputers
* NVIDIA Launches the Tesla K40 GPU Accelerator, their fastest accelerator ever
Learn more: http://nvidianews.nvidia.com/Releases/NVIDIA-Launches-World-s-Fastest-Accelerator-for-Supercomputing-and-Big-Data-Analytics-a66.aspx
Watch the video presentation: http://wp.me/p3RLHQ-aRY
"While traditional HPC is at the core of supercomputing, recent advancements are allowing HPC centers to offer new and exciting compute services to their users and other customers. Adaptive Computing will discuss two such advancements, topology-based scheduling and HPC cloud, and how together they provide an infrastructure that allows HPC centers to provide Big Data and services along with traditional offerings."
Over the course of this talk, Trev Harmon from Adaptive Computing looks back to the utility computing vision of Douglas Parkhill and proposes an application-centric workflow for the future that fulfills that vision across many disciplines of computing.
Watch the presentation video: http://wp.me/p3RLHQ-aX3
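The topology-based scheduling mentioned above can be illustrated with a toy placement heuristic. This is only a sketch of the general idea (prefer nodes under one switch to minimize inter-switch hops); the function name, data layout, and tie-breaking rules are illustrative assumptions, not Moab's actual algorithm:

```python
def pick_nodes(free_nodes_by_switch, nodes_needed):
    """Toy topology-aware allocation: free_nodes_by_switch maps a
    switch id to its list of free nodes."""
    # First try to fit the whole job under a single switch
    # (zero inter-switch hops); best-fit order reduces fragmentation.
    for _, nodes in sorted(free_nodes_by_switch.items(),
                           key=lambda kv: len(kv[1])):
        if len(nodes) >= nodes_needed:
            return nodes[:nodes_needed]
    # Otherwise spill across switches, largest pools first,
    # so the job touches as few switches as possible.
    allocation = []
    for _, nodes in sorted(free_nodes_by_switch.items(),
                           key=lambda kv: -len(kv[1])):
        allocation.extend(nodes)
        if len(allocation) >= nodes_needed:
            return allocation[:nodes_needed]
    return None  # not enough free nodes anywhere
```

A 3-node job against switches holding 2 and 3 free nodes would land entirely on the 3-node switch, while a 2-node job would take the smaller switch and leave the larger pool intact for bigger jobs.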
PBS Works at Imperial College: 30 Million Jobs a Year and Counting
In this deck from the 2015 PBS Works User Group, Simon Burbidge from Imperial College London presents: 30 Million Jobs a Year and Counting.
Watch the video presentation:
http://insidehpc.com/2015/09/video-imperial-college-30-million-hpc-jobs-a-year-and-counting/
Sign up for our insideHPC Newsletter: http://insideHPC.com/newsletter
In this presentation, Kris Thorleifsson and Agust Egilsson from QuantCell Research describe their easy-to-use big data spreadsheet and end-user programming environment.
"QuantCell is a big data spreadsheet and an end-user programming tool. It improves turnaround time and enables SMEs to benefit from big data. It enables non-developers to build complex analysis, models and applications, and it brings the capabilities of major programming languages to the spreadsheet user. QuantCell supports real time Big Data frameworks and Apache Hadoop installations. It allows you to build MapReduce analysis or submit real time queries right from the spreadsheet, submit it to a Hadoop server and watch it hack away. The results can be delivered back to the spreadsheet for further analysis and visualization."
Learn more: http://www.quantcell.com
Watch the video presentation: http://wp.me/p3RLEV-1aI
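The quote above describes building MapReduce analyses and shipping them to a Hadoop server. The classic shape of such a job can be sketched as a word count; the function names and the in-memory driver standing in for Hadoop's shuffle/sort are illustrative assumptions, not QuantCell's actual API:

```python
from itertools import groupby

def mapper(lines):
    # Map phase: emit (word, 1) for every word in the input.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    # Reduce phase: expects pairs sorted by key, as the shuffle
    # phase guarantees, and sums the counts per word.
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

def run_job(lines):
    # Tiny in-memory driver; on a real cluster the framework
    # distributes the map tasks and performs the sorted shuffle.
    return list(reducer(sorted(mapper(lines))))
```

In a tool like the one described, the spreadsheet front end would generate the map and reduce steps from cell formulas and submit them to the Hadoop cluster, with results flowing back into cells for visualization.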
In this presentation, Larry Jones from Xyratex describes how ClusterStor puts productivity into high performance data storage. The new ClusterStor 9000 dramatically raises the bar in terms of both performance and efficiency, delivering up to 50% more performance in the same rack space compared to previous ClusterStor generations.
Learn more: http://xyratex.com
Watch the video presentation: http://wp.me/p3RLHQ-aTx
In this presentation from Radio Free HPC, Fritz Ferstl from Univa leads a discussion on the continuing HPC Datacenter Evolution.
Watch the video presentation: http://wp.me/p3RLHQ-b6U