Heavy duty Abaqus structural analysis using HPC in the cloud

Frank Ding presents challenges, solutions, lessons learned, and recommendations.
Visit HPCExperiment.com to learn how you can start using HPC in the Cloud.

Transcript

  • 1. Paving the way to HPC as a Service
    Heavy Duty Abaqus Structural Analysis Using HPC in the Cloud – Challenges, Solutions, Lessons Learned, & Recommendations
    Frank Ding, May 16, 2013
  • 2. What is the UberCloud HPC Experiment?
    - Started in mid-2012 as a voluntary effort
    - Demonstrate the potential of HPC in the Cloud
    - Uncover and overcome the obstacles
    - Now with over 450 participants
    - Round 3 is in progress; approaching 80 teams as of today
    - Please submit your project ideas at the end of this webinar!
  • 3. About the presenter
    - Frank Ding, Engineering Analysis & Computing Manager, Simpson Strong-Tie; HPC Experiment Advocate; an early adopter of HPC for simulation-driven design and a huge fan of Linux-based cluster computing
    - Thanks to members of HPC Experiment Team #47:
      Matt Dunbar, SIMULIA 3DS
      Steve Hebert & Rob Sherrard, Nimbix
      Sharan Kalwani, DataSwing
      Antonio Arena & Cynthia Underwood, NICE Software
      Dennis Nagy, BeyondCAE
  • 4. Simpson Strong-Tie
    - A leader in structural systems research, testing, and innovation
    - Product lines:
      Connectors for light frame construction
      Fasteners and fastening systems
      Lateral systems
      Anchoring systems
      Fiber reinforcing materials
  • 5. Realistic Simulation Requires HPC
    - High-fidelity modeling
    - Highly nonlinear materials
    - Multi-physics
    - Fracture and cracking
    - Progressive damage and failure
  • 6. Why HPC in the Cloud?
    - Current HPC cluster: 4 nodes, 32 cores total, Nehalem-based Xeon, InfiniBand DDR
    - Motivation for HPC in the Cloud: lack of in-house HPC resources; capacity surge (speeding up large jobs); capacity overflow (large numbers of jobs)
    - Project tested in Round 2 of the HPC Experiment: concrete anchor bolt tension capacity, 1.9 million DOFs, 11.5-hour runtime on 32 cores
  • 7. Cloud-Based HPC Workflow
    Project workflow:
    - Pre-processing on the local workstation at the user end
    - Abaqus input file loaded to the data staging point through SFTP (see the illustrative upload sketch after the transcript)
    - Abaqus job submitted to the compute cloud through the Nimbix web portal
    - Job monitoring through the Nimbix dashboard plus email notification of job status
    - Post-processing using the remote visualization tool NICE DCV
  • 8. Barriers and Challenges
    - Data movement limited by internet bandwidth; use remote visualization for post-processing
    - End-user-side internet bandwidth issues
    - Team member time schedules (voluntary effort)
    - Team member expertise gap
    - Meeting the project deadline
  • 9. Benefits
    - HPC in the Cloud solutions require multi-vendor support:
      ISV – 3DS SIMULIA provided the Abaqus license
      Cloud infrastructure and service provider – Nimbix HPC Cloud for CAE
      Remote visualization – NICE Desktop Cloud Visualization (DCV)
      End-user applications
    - The HPC Experiment provides a collaboration platform:
      Form a team based on the end-user application requirements
      Basecamp.com to support team communication
      Third-party solution providers invited to the team if needed
  • 10. Lessons Learned & Recommendations
    - End-point internet bandwidth variability is the top barrier
    - Some workflow details have been identified to improve the end-user experience
    - Service providers should offer a connection bandwidth testing tool to end users (a hypothetical sketch of such a check follows the transcript)
    - Remote visualization using NICE DCV is a good platform if stable, consistent internet bandwidth is available; results are accessible anywhere when needed
    - The HPC Experiment is a great platform to test your project on cloud HPC
  • 11. Ready for your project!
    - Round 3 is in progress and will conclude at the end of June
    - Anyone can create a project, and it is not too late for Round 3 (and beyond)
    - Please fill out the form at the end of this webinar
  • 12. Q&A session notes
    Q: What bandwidth is needed, in your opinion?
    FD: Roughly, a consistent bandwidth over 5 Mbps is required for a smooth response during dynamic 3D model manipulation.
    Q: What is the largest memory per node/core on the cloud for Abaqus purposes?
    FD: I use 4 GB/core for my local HPC. I don't know the Nimbix Compute Cloud's configuration, but I would recommend the same.
  • 13. Q&A session notes
    Q: Do you use .pbs to submit in batch?
    FD: No, I use the Nimbix Compute Cloud job submission web portal, which talks to the Torque load manager. (An illustrative batch submission sketch follows the transcript.)
    Q: What sort of interconnect types did you use / try?
    FD: We used GigE in Round 1 and InfiniBand in Round 2. InfiniBand performs better.
    Q: Can you talk about the memory required for your project?
    FD: The job was solved by Abaqus/Explicit, which does not have intensive memory requirements like Abaqus/Standard does. Usually I go with 4 GB/core.
  • 14. Q&A session notes
    Q: You mentioned that internet speed (or lack of it) was a major factor. What was the maximum internet speed available to you for the project?
    FD: The maximum I ever tested was 10 Mbps on a 20 Mbps pipe.
    Q: Is the NICE visualization using a secure channel (encryption)?
    FD: Yes, encryption using the standard AES algorithm (128- or 256-bit).
  • 15. Thank You
    http://www.hpcexperiment.com
    http://www.cfdexperiment.com
    http://www.compbioexperiment.com
    http://www.bigdataexperiment.com
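
The slide 7 workflow begins by pushing the Abaqus input deck to the data staging point over SFTP. The Python sketch below shows one way that step could be scripted; it is a minimal illustration only, and the hostname, username, key file, and paths are hypothetical placeholders rather than the actual Nimbix staging configuration.

    # Minimal sketch, not the actual Nimbix workflow: upload a local Abaqus
    # input deck to a data staging host over SFTP (slide 7, step 2).
    # Hostname, credentials, and paths below are hypothetical placeholders.
    import os
    import paramiko

    LOCAL_INP = "anchor_bolt_tension.inp"      # hypothetical Abaqus input file
    STAGING_HOST = "staging.example.com"       # hypothetical staging endpoint
    REMOTE_DIR = "/data/incoming"              # hypothetical staging directory

    def upload_input_deck(local_path, host, remote_dir, username, key_file):
        """Copy an Abaqus .inp file to the staging point over SFTP."""
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username=username,
                       key_filename=os.path.expanduser(key_file))
        try:
            sftp = client.open_sftp()
            remote_path = f"{remote_dir}/{os.path.basename(local_path)}"
            sftp.put(local_path, remote_path)  # transfer speed is limited by the uplink
            print(f"Uploaded {local_path} to {host}:{remote_path}")
        finally:
            client.close()

    if __name__ == "__main__":
        upload_input_deck(LOCAL_INP, STAGING_HOST, REMOTE_DIR,
                          username="enduser", key_file="~/.ssh/id_rsa")

After the upload, the job itself would be submitted and monitored through the Nimbix web portal and dashboard, as described on slide 7.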
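
Slide 10 recommends that the service provider supply a connection bandwidth testing tool to end users. The sketch below shows the kind of check such a tool could perform, assuming a hypothetical test file URL hosted near the compute cloud; it simply times a download and reports the sustained throughput.

    # Minimal sketch of a connection bandwidth check (slide 10 recommendation).
    # The test URL is a hypothetical placeholder; a real tool would download a
    # file hosted by the service provider close to the compute cloud.
    import time
    import urllib.request

    TEST_URL = "https://speedtest.example.com/10MB.bin"   # hypothetical test file

    def measure_download_mbps(url, chunk_size=1 << 16):
        """Download the test file and return the average throughput in Mbps."""
        start = time.time()
        total_bytes = 0
        with urllib.request.urlopen(url) as response:
            while True:
                chunk = response.read(chunk_size)
                if not chunk:
                    break
                total_bytes += len(chunk)
        elapsed = time.time() - start
        return (total_bytes * 8) / (elapsed * 1_000_000)

    if __name__ == "__main__":
        mbps = measure_download_mbps(TEST_URL)
        # Slide 12 suggests a consistent >5 Mbps for smooth remote 3D manipulation.
        print(f"Average download throughput: {mbps:.1f} Mbps")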
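
In the slide 13 Q&A, jobs were submitted through the Nimbix web portal, which talks to the Torque load manager, rather than with hand-written .pbs scripts. For readers curious what a direct Torque submission might look like, the sketch below builds a .pbs script for a 32-core Abaqus run and hands it to qsub; the node/core layout, walltime, and file names are hypothetical, sized only to echo the 32-core job described on slide 6.

    # Minimal sketch of a direct Torque batch submission for an Abaqus job.
    # The presenter used the Nimbix web portal instead; everything below
    # (node/core layout, walltime, file names) is a hypothetical example.
    import subprocess
    import textwrap

    PBS_SCRIPT = textwrap.dedent("""\
        #!/bin/bash
        #PBS -N anchor_bolt_tension
        #PBS -l nodes=4:ppn=8
        #PBS -l walltime=12:00:00
        #PBS -j oe
        cd $PBS_O_WORKDIR
        # 32-way Abaqus run (4 nodes x 8 cores), echoing the slide 6 job size
        abaqus job=anchor_bolt_tension input=anchor_bolt_tension.inp \\
               cpus=32 mp_mode=mpi interactive
        """)

    def submit_job(script_text, script_name="anchor_bolt.pbs"):
        """Write the PBS script to disk and hand it to qsub; return the job ID."""
        with open(script_name, "w") as f:
            f.write(script_text)
        result = subprocess.run(["qsub", script_name],
                                capture_output=True, text=True, check=True)
        return result.stdout.strip()

    if __name__ == "__main__":
        print("Submitted as", submit_job(PBS_SCRIPT))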