The document discusses the Blue Gene/L supercomputer system. It provides an overview of the overall organization and communications hardware of the Blue Gene/L system, including its torus network and tree architecture. It also discusses the system software, including the software for I/O and compute nodes, the compiler and runtime support, and the communication infrastructure. The document outlines how the Blue Gene/L system has been used for various scientific applications and simulations.
Blue Gene
Introduction
The word "supercomputer" entered the mainstream lexicon in 1996 and 1997 when IBM's Deep Blue supercomputer challenged the world chess champion in two tournaments broadcast around the world.
Since then, IBM has been busy improving its supercomputer technology and tackling much deeper problems.
Its latest project, code-named Blue Gene, is poised to shatter all records for computer and network performance.
What is a Super Computer
A supercomputer is a computer that is at the frontline of current processing capacity, particularly speed of calculation.
Today, supercomputers are typically one-of-a-kind custom designs produced by "traditional" companies such as Cray, IBM, and Hewlett-Packard, which acquired many of the 1980s supercomputer companies to gain their experience.
Why we need Super Computers
Supercomputers are very useful in highly calculation-intensive tasks such as
Problems involving quantum physics,
Weather forecasting,
Climate research,
Molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals),
Physical simulations (such as simulation of airplanes in wind tunnels, simulation of the detonation of nuclear weapons, and research into nuclear fusion).
They are also useful for a particular class of problems, known as Grand Challenge problems, whose full solution would require practically unlimited computing resources.
NASA's Linux-based supercomputer
Why Supercomputers are Fast
Several elements of a supercomputer contribute to its high level of performance:
Numerous high-performance processors (CPUs) for parallel processing
Specially-designed high-speed internal networks
Specially-designed or tuned operating systems
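The payoff from "numerous processors" is bounded by how parallel the workload is. Amdahl's law (not mentioned on the slide, but the standard model for parallel speedup) makes this concrete:

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Speedup predicted by Amdahl's law: the serial fraction of a
    program limits how much extra processors can help."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# Even with 65,536 processors (one full Blue Gene/L system),
# a 1% serial fraction caps the speedup just below 100x.
print(round(amdahl_speedup(0.99, 65536), 1))  # → 99.8
```

This is why the networks and operating systems in the list above matter as much as raw processor count: anything that serializes work erodes the achievable speedup.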
What is Blue Gene
Blue Gene is a computer-architecture project to produce several supercomputers designed to reach operating speeds in the PFLOPS (petaFLOPS = 10^15 FLOPS) range, currently reaching sustained speeds of nearly 500 TFLOPS (teraFLOPS = 10^12 FLOPS).
It is a cooperative project among IBM (particularly IBM Rochester and the Thomas J. Watson Research Center), the Lawrence Livermore National Laboratory, the United States Department of Energy (which is partially funding the project), and academia.
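As a quick sanity check on the units above, a sketch using only the two figures quoted in the text:

```python
TERA = 10**12   # 1 teraFLOPS = 10^12 FLOPS
PETA = 10**15   # 1 petaFLOPS = 10^15 FLOPS

sustained = 500 * TERA   # ~500 TFLOPS sustained (quoted above)
target = 1 * PETA        # 1 PFLOPS design goal

# The petaflop target is exactly 2x the quoted sustained speed.
print(target / sustained)  # → 2.0
```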
Why Blue Gene
Blue Gene is an IBM Research project dedicated to exploring the frontiers of supercomputing:
in computer architecture,
in the software required to program and control massively parallel systems, and
in the use of computation to advance the understanding of important biological processes such as protein folding.
Learning more about biomolecular mechanisms is expected to give medical researchers better understanding of diseases, as well as potential cures.
Why the name Blue Gene
Blue - the corporate color of IBM
Gene - the intended use of the Blue Gene clusters: computational biology.
Blue Gene Projects
There are four Blue Gene projects: Blue Gene/L, Blue Gene/C (Cyclops64), Blue Gene/P, and Blue Gene/Q.
2. INDEX
Introduction
Overview of the Blue Gene/L supercomputer system
Blue Gene system software
Blue Gene/L simulation environment
Trends in supercomputers
Programming methodologies for petascale computing
Validating the architecture with application programs
Blue Gene science applications development
B. E. COMPUTER SEMINAR
3. Overview of the Blue Gene/L Supercomputer System
Overall Organization
7. ETHERNET
JTAG
Outline of the Blue Gene/L system software.
Only I/O nodes are attached to the Gbit/s Ethernet network, giving 1024 x 1 Gbit/s links to external file servers.
The JTAG protocol is used for reading and writing to any address of the 16 KB SRAMs in the BG/L chips.
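The aggregate I/O bandwidth implied by those figures follows from simple arithmetic on the numbers in the slide:

```python
io_nodes = 1024      # I/O nodes attached to the Ethernet network
link_gbit_s = 1      # 1 Gbit/s per link

aggregate_gbit_s = io_nodes * link_gbit_s   # 1024 Gbit/s total
aggregate_gbyte_s = aggregate_gbit_s / 8    # = 128 GB/s to the file servers
print(aggregate_gbit_s, aggregate_gbyte_s)  # → 1024 128.0
```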
8. Hardware Technologies for Petascale Computing
Special-Purpose Hardware
5 main categories:
• Conventional technologies
• Processing-in-memory (PIM) designs
• Designs based on superconducting processor technology
• Special-purpose hardware designs
• Schemes that use the aggregate computing power of Web-distributed processors
One example of the special-purpose approach is a family of special-purpose attached processors built for the gravitational-force computations that form the inner loop of N-body simulation problems.
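The inner loop those attached processors accelerate is the all-pairs gravitational force sum. A minimal pure-Python sketch (function names and the softening parameter are illustrative, not from the slides):

```python
import math

def gravitational_forces(positions, masses, G=6.674e-11, eps=1e-9):
    """Direct O(n^2) pairwise gravitational force accumulation --
    the kernel that special-purpose N-body hardware accelerates."""
    n = len(positions)
    forces = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + eps   # softening avoids divide-by-zero
            inv_r3 = 1.0 / (r2 * math.sqrt(r2))
            f = G * masses[i] * masses[j] * inv_r3
            for k in range(3):
                forces[i][k] += f * dx[k]      # F = G*mi*mj * dx / r^3
    return forces
```

Because every pair interacts, the work grows quadratically with the particle count, which is exactly why fixed-function hardware for this one loop pays off.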
9. Blue Gene System Software
System Software for the I/O Nodes
10. System Software for the Compute Nodes
System Management Software
Compiler and Run-time Support
• Linux for I/O nodes
• BLRTS for compute nodes
CIOD in two scenarios:
• driven by a console shell (called CIOMAN)
• driven by a job scheduler (such as LoadLeveler)
Midplane Management and Control System (MMCS): low-level hardware operations.
The custom control library can:
• turn on power supplies, monitor temperature sensors and fans, and react accordingly
• configure and initialize IDo, Link, and BG/L chips
• read and write configuration registers and SRAM, and reset the cores of a BG/L chip
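Those operations could be sketched as a toy control interface. Every name below is a hypothetical illustration — the real BG/L control library's API is not shown in this deck:

```python
class ControlLibrarySketch:
    """Hypothetical stand-in for the low-level control operations the
    slide lists: power, sensor monitoring, and register/SRAM access."""

    def __init__(self):
        self.powered = False
        self.registers = {}   # models configuration registers / SRAM

    def power_on(self):
        # A real library would drive the power supplies over the
        # control network; here we just record the state change.
        self.powered = True

    def read_temperature(self):
        # A real library would query hardware sensors; return a dummy value.
        return 25.0

    def write_register(self, addr, value):
        self.registers[addr] = value

    def read_register(self, addr):
        return self.registers[addr]

ctl = ControlLibrarySketch()
ctl.power_on()
ctl.write_register(0x10, 0xFF)
print(ctl.powered, hex(ctl.read_register(0x10)))  # → True 0xff
```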
12. Blue Gene/L Simulation Environment
Trends in Performance of Supercomputers
Programming Methodologies for Petascale Computing
Validating the Architecture with Application Programs
• Rapid changes in vendors, architectures, technologies, and system usage over the last 50 years.
• The Top500 list has been updated twice a year since June 1993.
• Blue Gene/L's unique features are especially appealing for ASCI-scale scientific applications.
Candidate programming models:
• a data-parallel programming language,
• message passing between sequential processes, and
• multithreading.
13. Blue Gene Science Applications Development
• It is the application platform for the Blue Gene Science program.
• It serves as a prototyping platform for research into application frameworks suitable for cellular architectures.
• It provides an application perspective in close contact with the hardware and systems-software development teams.
14. Conclusion/Summary Points
• A new series of high-performance machines
• The first small prototypes will be available
• Simulation environment
• Demonstrates a complete and functional system-software environment before hardware becomes available
15. Bibliography
[1] "An Overview of the Blue Gene/L System Software Organization", George Almási, Ralph Bellofatto, José Brunheroto, Călin Caşcaval, José G. Castaños, Luis Ceze, Paul Crumley, C. Christopher Erway, Joseph Gagliano, Derek Lieber, Xavier Martorell, José E. Moreira, Alda Sanomiya, and Karin Strauss
[2] "High Performance Computing: The Quest for Petascale Computing", Jack J. Dongarra, David W. Walker
[3] "An Overview of the Blue Gene/L Supercomputer", W. Barrett, C. Engel, B. Drehmel, B. Hilgart, D. Hill, F. Kasemkhani, D. Krolak, C. T. Li, T. Liebsch, J. Marcella, A. Muff, A. Okomo, M. Rouse, A. Schram, M. Tubbs, G. Ulsh, C. Wait, J. Wittrup