This white paper introduces the EMC Isilon scale-out data lake as the key enabler to store, manage, and protect unstructured data for traditional and emerging workloads.
Scale-Out Architectures for Secondary Storage (InteractiveNEC)
IT organizations have seen explosive growth in the amount of data for several years. Forecasts call for that growth to continue at a rapid pace, and even to accelerate for organizations where the deluge of data from next-generation applications such as rich media or IoT networks is just beginning to have an impact. All this growth puts pressure on storage resources, IT budgets, and the delivery of IT services, including data protection. This pressure in turn is driving organizations to re-evaluate various aspects of their IT environment, including their data protection strategies.
EMC Isilon: A Scalable Storage Platform for Big Data (EMC)
This white paper provides insights into EMC Isilon's shared storage approach, covering a wide range of desired characteristics including increased efficiency and reduced total cost.
BRIDGING DATA SILOS USING BIG DATA INTEGRATION (ijmnct)
With cloud computing, cheap storage, and technology advancements, an enterprise uses multiple applications to operate its business functions. Applications are not limited to transactions, customer service, sales, and finance; they also cover security, application logs, marketing, engineering, operations, HR, and more. Each business vertical uses multiple applications that generate a huge amount of data. On top of that, social media, IoT sensors, SaaS solutions, and mobile applications record exponential growth in data volume. In almost all enterprises, data silos exist across these applications, which can produce structured, semi-structured, or unstructured data at different velocities and in different volumes. Having all data sources integrated and generating timely insights helps overall decision making. With recent developments in Big Data Integration, data silos can be managed better, generating tremendous value for enterprises. Big data integration offers flexibility, speed, and scalability for integrating large data sources, along with tools to generate analytical insights that help stakeholders make effective decisions. This paper presents an overview of data silos, the challenges they pose, and how big data integration can help overcome them.
Beyond backup to intelligent data management (Sirius)
Backup isn’t going away, but it is changing. Data is now one of your company’s most valuable assets, but you may be challenged to leverage it for competitive and operational advantage. More flexibility, visibility, and intelligence into data are needed, as well as improved insight into where data resides and its importance in the business hierarchy.
Increased efficiency, effectiveness, and control are the goal. Are you ready to modernize backup from a transactional process into a forward-thinking strategy?
View to learn:
--New questions you should be asking about where your data lives, who owns data decisions, and what’s optimal for the business.
--Keys to jump-starting modernization.
--What improved visibility and intelligence into data delivers.
--Why multi-cloud management and risk assignment are essential to protecting your most valuable data.
The future of storage is here, so don’t get left behind—start planning your next move.
A data center centralizes a company’s shared IT operations as well as equipment for processing, storing and disseminating applications and data. It is essential to have knowledge about the terms that are used frequently in the context of data centers.
Efficient multicast delivery for data redundancy minimization over wireless data centers (redpel dot com)
For more IEEE papers, full abstracts, and implementations, visit www.redpel.com
As businesses grow and become more complex, the information systems that support them require continuous updates to keep up with the changes. Moving to an ERP (Enterprise Resource Planning) system doesn’t reduce the need to change.
View the original Blog post: http://www.eprentise.com/blog/data-systems/erp-systems-the-next-legacy-dinosaur/
Website: www.eprentise.com
Twitter: @eprentise
Google+: https://plus.google.com/u/0/+Eprentise/posts
Facebook: https://www.facebook.com/eprentise
Ensure your data is Complete, Consistent, and Correct by using eprentise software to transform your Oracle® E-Business Suite.
Microsoft India - SQL Server: The Forrester Wave Enterprise Database Managemen... (Microsoft Private Cloud)
In Forrester’s 153-criteria evaluation of enterprise open source and closed source database management systems (DBMSes), we found that Oracle, IBM, Microsoft, and Sybase lead the pack because each offers mature, high-performance, scalable, secure, and flexible solutions. It was no surprise to see Oracle dominating in most of the features and functionality, such as performance, availability, security, and administration. IBM DB2 for Linux, UNIX, and Windows showed strong support for application and data integration, performance, scalability, and administration, while Microsoft has impressive capabilities for database programmability, application development, administration, and security. Sybase Adaptive Server Enterprise continues to show improvement in its product, offering good support for availability, performance, and administration. IBM Informix Dynamic Server, MySQL, and Ingres came out as Strong Performers, following very closely on the heels of the Leaders and offering very respectable alternatives and a multitude of choices for application developers and architects. PostgreSQL lacks the Leaders’ breadth of features but is a reputable Contender for some use cases.
The high-volume data processing demands of IoT exceed the capabilities of the majority of today's data centers. This presentation examines the issues that must be addressed to ensure a successful IoT implementation.
The next generation of computing for content-centric applications will require even greater responsiveness and performance, as well as a resiliency that may not have been demanded previously. Storage systems for these workloads must deliver high performance and reliability while meeting strict price-performance levels, and should come from a known, premier storage supplier.
A perfect storm of data growth is brewing. According to a recent survey by Gartner, data growth is now the leading infrastructure challenge. Left unchecked, data growth negatively impacts application performance, compliance goals, and IT costs. Yet this very same data is also the lifeblood of today’s organizations, driving demand for enterprise analytics to extract value from enterprise data like never before.
For users of Hadoop, MapReduce is new territory. MapReduce design patterns are all about documenting the knowledge and lessons learned of seasoned Hadoop developers so that new developers can leverage the experts’ experience in solving problems. This talk outlines a few of the most popular patterns and gives an overview of the rest.
After this session you will be able to:
Objective 1: Understand what kinds of problems are solvable by Hadoop and MapReduce.
Objective 2: Understand why Hadoop engineers need to know what MapReduce Design Patterns are and what they are useful for day-to-day.
Objective 3: Begin to understand how to summarize, reorganize, and search through your data with Hadoop and MapReduce.
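The summarization objective is the classic starting point. As a minimal sketch (plain Python, no Hadoop cluster required; all names and inputs below are illustrative), the numerical-summarization pattern boils down to a map phase that emits key-value pairs, a shuffle that groups values by key, and a reduce phase that aggregates each group:

```python
from collections import defaultdict

def map_phase(records):
    """Mapper: emit (word, 1) pairs, as in the summarization pattern."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Group intermediate values by key (the Hadoop framework does this step)."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reducer: sum the counts for each word."""
    return {key: sum(values) for key, values in grouped.items()}

docs = ["the quick brown fox", "the lazy dog"]
counts = reduce_phase(shuffle(map_phase(docs)))
```

In real Hadoop the mapper and reducer run in parallel across the cluster, but the contract is the same: mappers never share state, and each reducer sees all values for one key.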
Transforming Expectations for Threat-Intelligence Sharing (EMC)
Gain insight into a new approach to information-sharing processes for threat intelligence, one that ensures data distribution is relevant, actionable, and automated.
RSA Security Briefs provide executives and practitioners with essential guidance on today’s most pressing information-security risks and opportunities. Each Brief is created by a select response team of security and technology experts who mobilize across companies to share specialized knowledge on a critical emerging topic. Offering both big-picture insight and practical technology advice, these papers are vital reading for today’s forward-thinking security leaders.
At the Client Awards, PostNL was voted the best client for freelancers (zzp'ers) in the Netherlands out of 593 participating organizations. The quality of participating organizations was measured on the basis of references from more than 19,000 freelancers, with communication, rates, and contracts as the main criteria.
Research by ZZP Barometer shows that communication, rates, contracts, payment terms, and working atmosphere are the five most important criteria for being a "good client." These criteria were put to freelancers, alongside questions about overall satisfaction, likelihood of recommendation, and suggestions for improvement. PostNL scores an average of 7.6 among freelancers, with a Net Promoter Score (NPS) of 77. This puts PostNL two positions higher than last year, just ahead of Aegon, Evita Zorg, Stedin, and ING.
The top 10 of the Client Awards is as follows:
1. PostNL
2. Aegon
3. Evita Zorg
4. Stedin
5. ING
6. Domijn
7. ABN AMRO
8. KPN Consulting
9. Tele2
10. Rijkswaterstaat
Data lakes are central repositories that store large volumes of structured, semi-structured, and unstructured data. They are ideal for machine learning use cases and support both SQL-based access and programmatic distributed data processing frameworks. Data lakes can store data in the same format as their source systems or transform it before storing it. They support native streaming and are well suited to storing raw data without an intended use case. Data quality and governance practices are crucial to avoid a data swamp. Data lakes enable advanced analytics and let end-users leverage insights for improved business performance.
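The "store raw, decide later" property above is often called schema-on-read: the lake keeps records exactly as produced, and each consumer projects its own schema at query time. A minimal sketch (pure Python; the event shapes and field names are invented for illustration):

```python
import json

# Raw events land in the "lake" exactly as produced (schema-on-read):
raw_events = [
    '{"user": "alice", "action": "login", "ts": 1}',
    '{"user": "bob", "action": "purchase", "amount": 9.99, "ts": 2}',
]

def apply_schema(raw, fields):
    """Project each raw JSON record onto the fields one consumer cares about.
    Fields absent from a record become None instead of failing ingestion."""
    for line in raw:
        record = json.loads(line)
        yield {f: record.get(f) for f in fields}

# Two consumers read the same raw data with different schemas:
audit_view = list(apply_schema(raw_events, ["user", "action"]))
revenue_view = list(apply_schema(raw_events, ["user", "amount"]))
```

Because no schema is enforced at write time, governance (cataloging, quality checks) is what keeps such a lake from degrading into the "data swamp" mentioned above.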
A New Frontier in Securing Sensitive Information – Taneja Group, April 2007 (LindaWatson19)
Sensitive information is increasingly landing in the hands of malicious individuals, either through external breaches or insider theft by employees or contractors. This paper looks into what it takes to secure sensitive information and thwart any potential breaches.
Solix Cloud – Managing Data Growth with Database Archiving and Application Re... (LindaWatson19)
Mission-critical ERP and CRM applications are the lifeblood of any business. This paper examines how Solix Cloud Database Archiving and Application Retirement Solutions enable enterprises to achieve their ILM goals while reducing complexity and offering superior performance.
White Paper: EMC Isilon OneFS Operating System (EMC)
This white paper provides an introduction to the EMC Isilon OneFS operating system, the foundation of the Isilon scale-out storage platform. The paper includes an overview of the architecture of OneFS and describes the benefits of a scale-out storage platform.
In the Age of Unstructured Data, Enterprise-Class Unified Storage Gives IT a ... (Hitachi Vantara)
Your enterprise can no longer be the realm of monolithic, block-centric storage systems. Unstructured data drives the adoption of unified storage systems that support multiple protocols, while mobile smartphones and tablets accelerate the spread of file-based applications and data. Is your enterprise ready to join the mobile revolution? In this paper, see how Hitachi, with our next generation of unified enterprise storage, will help you move into the next era of business agility and efficiency.
Enterprise Archiving with Apache Hadoop Featuring the 2015 Gartner Magic Quad... (LindaWatson19)
Read how Solix leverages the Apache Hadoop big data platform to provide low cost, bulk data storage for Enterprise Archiving. The Solix Big Data Suite provides a unified archive for both structured and unstructured data and provides an Information Lifecycle Management (ILM) continuum to reduce costs, ensure enterprise applications are operating at peak performance and manage governance, risk and compliance.
Enterprise Storage Solutions for Overcoming Big Data and Analytics Challenges (INFINIDAT)
Big Data and analytics workloads represent a new frontier for organizations. Data is being collected from sources that did not exist 10 years ago. Mobile phone data, machine-generated data, and website interaction data are all being collected and analyzed. In addition, as IT budgets are already being pressured down, Big Data footprints are getting larger and posing a huge storage challenge.
This paper provides information on the issues that Big Data applications pose for storage systems and how choosing the correct storage infrastructure can streamline and consolidate Big Data and analytics applications without breaking the bank.
InfiniBox bridges the gap between high performance and high capacity for Big Data applications. InfiniBox allows an organization implementing Big Data and Analytics projects to truly attain its business goals: cost reduction, continual and deep capacity scaling, and simple and effective management — and without any compromises in performance or reliability. All of this to effectively and efficiently support Big Data applications at a disruptive price point.
Learn more at www.infinidat.com.
Data storage and networking are no exceptions. Development is moving fast, and tipping points have already tipped: the cloud, next-gen networks, the Internet of Things (IoT), innovative file systems, NVMe SSDs. These technologies are active today in enterprise data centers and in the public clouds that serve them.
Securing Your Future: Cloud-Based Data Protection Solutions (MaryJWilliams2)
Explore the essential strategies for safeguarding your data with cloud-based protection solutions. This comprehensive guide delves into the benefits of using cloud services for data security, including enhanced scalability, reliability, and disaster recovery capabilities. Learn about the latest trends, best practices, and how to effectively implement cloud-based data protection to ensure your data is secure, accessible, and recoverable. To Know more: https://stonefly.com/white-papers/cloud-based-data-protection-strategies/
Securing the Future: A Guide to Cloud-Based Data Protection (MaryJWilliams2)
In an era where data breaches and cyber threats are increasingly common, cloud-based data protection emerges as a critical pillar for safeguarding digital assets. This article offers an in-depth exploration of cloud-based data protection strategies, tools, and best practices. Discover how leveraging the cloud can enhance your organization's data security posture, ensure business continuity, and provide scalability to meet future demands. To Know more: https://stonefly.com/white-papers/cloud-based-data-protection-strategies/
Know whether cloud-based storage or dedicated storage is best for your business IT infrastructure, depending on your organization's requirements. Check Netmagic's outlook.
Similar to The EMC Isilon Scale-Out Data Lake (20)
INDUSTRY-LEADING TECHNOLOGY FOR LONG TERM RETENTION OF BACKUPS IN THE CLOUD (EMC)
CloudBoost is a cloud-enabling solution from EMC that facilitates secure, automatic, and efficient data transfer to private and public clouds for Long-Term Retention (LTR) of backups. It seamlessly extends existing data protection solutions to elastic, resilient, scale-out cloud storage.
Transforming Desktop Virtualization with Citrix XenDesktop and EMC XtremIO (EMC)
With the EMC XtremIO all-flash array, improve:
1) your competitive agility, with real-time analytics and development;
2) your infrastructure agility, with elastic provisioning for performance and capacity;
3) your TCO, with 50% lower capex and opex and double the storage lifecycle.
• Citrix & EMC XtremIO: Better Together
• XtremIO Design Fundamentals for VDI
• Citrix XenDesktop & XtremIO
-- Image Management & Storage
-- Demonstrations
-- XtremIO XenDesktop Integration
EMC FORUM RESEARCH GLOBAL RESULTS - 10,451 RESPONSES ACROSS 33 COUNTRIES (EMC)
Explore findings from the EMC Forum IT Study and learn how cloud computing, social, mobile, and big data megatrends are shaping IT as a business driver globally.
Reference architecture with the Mirantis OpenStack platform. IT is being disrupted by changes in technology, business, and culture; to solve these issues, IT has to move from traditional models to a broker/provider model.
Force Cyber Criminals to Shop Elsewhere
Learn the value of having an Identity Management and Governance solution and how retailers today are benefiting by strengthening their defenses and bolstering their Identity Management capabilities.
Container-based technology has experienced a recent revival and is being adopted at an explosive rate. For those new to the conversation, containers offer a way to virtualize an operating system. This virtualization isolates processes, providing limited visibility and resource utilization to each, such that the processes appear to be running on separate machines. In short, containers allow more applications to run on a single machine. Here is a brief timeline of key moments in container history.
This white paper provides an overview of EMC's data protection solutions for the data lake, an active repository for managing varied and complex Big Data workloads.
This infographic highlights key stats and messages from the analyst report from J.Gold Associates that addresses the growing economic impact of mobile cybercrime and fraud.
This white paper describes how an intelligence-driven governance, risk management, and compliance (GRC) model can create an efficient, collaborative enterprise GRC strategy across IT, Finance, Operations, and Legal areas.
The Trust Paradox: Access Management and Trust in an Insecure Age (EMC)
This white paper discusses the results of a CIO UK survey on a "Trust Paradox," defined as employees and business partners being both the weakest link in an organization's security and trusted agents in achieving the company's goals.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... (UiPathCommunity)
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
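For readers new to the technique, the core loop behind active learning is often uncertainty sampling: from a pool of unlabeled documents, ask a human to label the one the current model is least sure about, so each label teaches the model as much as possible. A minimal sketch (plain Python; the document IDs and probabilities below are invented for illustration, not UiPath output):

```python
def least_confident(predictions):
    """Uncertainty sampling: pick the unlabeled example whose top predicted
    probability is lowest -- labeling it should teach the model the most."""
    def confidence(item):
        _, probs = item
        return max(probs)
    return min(predictions, key=confidence)[0]

# (doc_id, class probabilities from the current model)
pool = [
    ("invoice_1", [0.98, 0.01, 0.01]),   # model already confident
    ("invoice_2", [0.40, 0.35, 0.25]),   # model unsure -> ask a human
    ("invoice_3", [0.90, 0.05, 0.05]),
]
next_to_label = least_confident(pool)
```

Real active-learning pipelines iterate this selection, retraining after each batch of human labels.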
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
State of ICS and IoT Cyber Threat Landscape Report 2024 preview (Prayukth K V)
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across more than 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on countries – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
The Art of the Pitch: WordPress Relationships and Sales (Laura Byrne)
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Accelerate your Kubernetes clusters with Varnish Caching (Thijs Feryn)
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
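As background on what a cache like Varnish contributes to such a cluster, the core idea is serving stored responses until a time-to-live (TTL) expires, sparing the backend. The sketch below is an illustrative Python model of that behavior, not Varnish's actual implementation (Varnish is configured in VCL and caches at the HTTP layer):

```python
import time

class TtlCache:
    """Minimal model of an HTTP cache: serve stored responses until their
    TTL expires, then fall through to the backend (a miss)."""

    def __init__(self):
        self._store = {}

    def put(self, key, response, ttl, now=None):
        now = time.monotonic() if now is None else now
        self._store[key] = (response, now + ttl)

    def get(self, key, now=None):
        now = time.monotonic() if now is None else now
        entry = self._store.get(key)
        if entry and entry[1] > now:
            return entry[0]          # cache hit: backend not contacted
        self._store.pop(key, None)   # expired or missing
        return None                  # cache miss

cache = TtlCache()
cache.put("/home", "homepage-body", ttl=120, now=0.0)
hit = cache.get("/home", now=60.0)    # within TTL
miss = cache.get("/home", now=200.0)  # past TTL
```

Explicit `now` arguments make the TTL behavior testable; a production cache would also handle invalidation, grace periods, and cache-control headers, which Varnish provides out of the box.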
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
PHP Frameworks: I want to break free, IPC Berlin 2024 (Ralf Eggert)
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk encourages a more independent use of PHP frameworks, moving towards more flexible and future-proof PHP development.
JMeter webinar - integration with InfluxDB and Grafana (RTTS)
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
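Under the hood, JMeter's backend listener ships each sampler's metrics to InfluxDB as line-protocol records, which Grafana then queries for its dashboards. A simplified sketch of building such a record (pure Python; the measurement, tag, and field names are illustrative, and real InfluxDB additionally marks integer fields with an `i` suffix and requires escaping of special characters):

```python
def to_line_protocol(measurement, tags, fields, timestamp_ns):
    """Render one simplified InfluxDB line-protocol record:
    <measurement>,<tag_key=tag_val,...> <field_key=field_val,...> <timestamp>"""
    tag_part = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_part = ",".join(
        f'{k}="{v}"' if isinstance(v, str) else f"{k}={v}"
        for k, v in sorted(fields.items())
    )
    return f"{measurement},{tag_part} {field_part} {timestamp_ns}"

# A JMeter-style sampler result as InfluxDB might receive it:
line = to_line_protocol(
    "jmeter",
    {"application": "webapp", "transaction": "login"},
    {"count": 42, "avg": 187.5},
    1700000000000000000,
)
```

Tags are indexed and used for Grafana filtering (application, transaction), while fields hold the actual measurements, which is why the two are kept separate in the protocol.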
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Neuro-symbolic is not enough, we need neuro-*semantic* (Frank van Harmelen)
Neuro-symbolic (NeSy) AI is on the rise. However, simply doing machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These gains will only be realized when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
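As a concrete instance of link prediction over knowledge graphs, one widely used family of methods scores candidate triples with embeddings; the sketch below uses a TransE-style scorer, where a plausible triple (head, relation, tail) satisfies head + relation ≈ tail. The 2-D embeddings are toy values for illustration only (real models learn hundreds of dimensions from training data):

```python
import math

# Toy 2-D embeddings for a tiny knowledge graph; values are illustrative.
entities = {"paris": (1.0, 2.0), "france": (2.0, 4.0), "berlin": (1.1, 2.1)}
relations = {"capital_of": (1.0, 2.0)}

def transe_score(head, relation, tail):
    """TransE: a plausible triple has head + relation close to tail,
    so a *lower* distance means a more likely link."""
    h, r, t = entities[head], relations[relation], entities[tail]
    return math.dist((h[0] + r[0], h[1] + r[1]), t)

# Rank candidate tails for the query (paris, capital_of, ?):
good = transe_score("paris", "capital_of", "france")
bad = transe_score("paris", "capital_of", "berlin")
```

Note the purely geometric nature of the inference here: whether such scores constitute "predictable inference" in the semantic sense is exactly the question the talk raises.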
Search and Society: Reimagining Information Access for Radical Futures (Bhaskar Mitra)
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build, inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in the context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
EMC WHITE PAPER
THE EMC ISILON SCALE-OUT DATA LAKE
Key capabilities
ABSTRACT
This white paper provides an introduction to the EMC Isilon scale-out data lake as the key enabler to store, manage, and protect unstructured data for traditional and emerging workloads. Business decision makers and architects can leverage the information provided here to make key strategy and implementation decisions for their storage infrastructure.
December 2014
TABLE OF CONTENTS
EXECUTIVE SUMMARY
Audience
Terminology
OVERVIEW
DATA FLOW
Ingest
Storage
Analysis
Application: Surface and Act
BUSINESS CHALLENGES
Silos or islands of data
Inefficiencies across the system
Security and compliance
Time to insights
DATA LAKE
ISILON SCALE-OUT DATA LAKE
Multiple access methods
Cost efficiency
Reduce risks
Protect and secure data assets
Faster time to insights
STORAGE AND DATA SERVICES
CONSUMPTION MODELS
CONCLUSION
REFERENCES
ABOUT EMC
EXECUTIVE SUMMARY
Data is growing rapidly as the number of people using digital devices to interact with systems and networks grows across the globe. A majority of this data growth is unstructured, and, as organizations are discovering, it contains valuable insights that can be used to improve business results. The technology to capture, store, and analyze this data is maturing rapidly as enterprises look for ways to handle data growth effectively. By some industry estimates, over 80 percent of the new storage capacity deployed in organizations around the world will be for unstructured data.
EMC® Isilon® scale-out network-attached storage (NAS) provides a simple, scalable, and efficient platform to store massive amounts of unstructured data, and enables various applications to create a scalable, accessible data repository without the overhead associated with traditional storage systems. Isilon enables organizations to build a scale-out data lake where they can store their current data and scale capacity, performance, or protection as their business data grows. The scale-out data lake helps lower storage costs through efficient storage utilization, eliminates islands or silos of storage, and lowers the storage management costs of migration, security, and protection.
EMC Isilon OneFS®, the intelligence behind Isilon systems, is the industry's leading scale-out NAS operating system, known for its massive capacity, operational simplicity, extreme performance, and unmatched storage utilization. With proven enterprise-grade protection and security capabilities, Isilon is an ideal platform to meet a wide variety of storage needs.
AUDIENCE
This white paper is intended for business decision makers, IT managers, architects, and implementers. By leveraging the Isilon scale-out data lake, a key enabler for storing and managing massive quantities of unstructured data, enterprises can build their storage strategies and implement their infrastructure to maximize the return on their IT investments.
TERMINOLOGY
The acronyms used in this paper are summarized in Table 1.
Table 1. Acronyms used in this paper
OVERVIEW
According to the IDC study "The Digital Universe of Opportunities: Rich Data and the Increasing Value of the Internet of Things," sponsored by EMC, the digital universe will grow from 4.4 zettabytes (1 zettabyte is a trillion gigabytes) in 2013 to 44 zettabytes in 2020, doubling in size every two years. Although enterprises create only 30 percent of this data (1.5 ZB in 2013), they come in contact with 85 percent of it (2.3 ZB in 2013) and have some liability associated with the data. IDC further estimates that 22 percent of the data is a candidate for analysis and contains valuable information that organizations can use to make critical decisions. Figure 1 shows this trend for enterprise data.
ACRONYM   DESCRIPTION
CIFS      Common Internet File System
DAS       Direct-attached storage
HDFS      Hadoop Distributed File System
LAN       Local area network
NAS       Network-attached storage
NFS       Network File System
SAN       Storage area network
SMB       Server Message Block
Figure 1. Enterprise data¹
Not all of this data is stored by enterprises in the datacenter, for cost, liability, or process reasons.
Isilon scale-out NAS provides the key capabilities to build a data lake that physically decouples compute from storage without affecting the seamless data flow outlined below. You can leverage a data lake to get the most value from your data: data from a variety of sources can be converged using native protocols, protected, secured, and exposed to the analytics systems that drive value, while organizations plan, implement, and manage loosely coupled but seamless storage and applications.
DATA FLOW
Organizational data typically follows a linear flow: it originates from various sources, both consumer and corporate, and is ingested into a store, analyzed, and surfaced for actions that create value for the organization, as shown in Figure 2. For easy reference, the five distinct stages are typically grouped into three broad categories: data, analytics, and application.
Figure 2. Data flow
¹ EMC Digital Universe Study, with research and analysis by IDC: http://www.emc.com/leadership/digital-universe/index.htm
In the following sections we detail each stage of the data flow process and the implications of each phase. The choices made at each stage have a profound impact on the value derived from an organization's data, and ultimately on the data store.
INGEST
Data ingestion is the process of obtaining and processing data for later use by storing it in the system most appropriate for the application. An effective ingestion methodology validates the data, prioritizes the sources, and commits data to storage with reasonable speed and efficiency. Velocity, volume, and variety tax the capture speed, throughput, and efficiency of the ingest systems, and therefore of the store, as discussed in the following section. The ingest process can become the starting point of a storage silo if planners do not look at the dataset holistically.
High-velocity data, characterized by a continuous feed, requires a specialized ingestion mechanism that typically captures the continuous stream and commits records in batches of varying sizes to storage for further processing. Capture strategies come in two flavors: commit records to storage directly, or use higher-speed buffers as an intermediate step before storing to a more persistent downstream system. Examples of high-velocity data include clickstreams and video surveillance feeds.
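The buffer-then-commit pattern described above can be sketched as a simple batching layer. This is an illustrative sketch under assumptions of our own (the batch size and dictionary-shaped records are not Isilon specifics):

```python
from typing import Iterable, Iterator, List

def batch_records(stream: Iterable[dict], batch_size: int = 1000) -> Iterator[List[dict]]:
    """Accumulate a continuous record stream into fixed-size batches before commit."""
    batch: List[dict] = []
    for record in stream:
        batch.append(record)
        if len(batch) >= batch_size:
            yield batch          # hand a full batch to the persistent store
            batch = []
    if batch:                    # flush the final partial batch
        yield batch

# Each yielded batch would be written to shared storage in one operation,
# amortizing per-record overhead at high velocity.
```

Committing in batches is what lets the intermediate buffer absorb bursts while the downstream store sees a steady, efficient write pattern.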
The volume of data is a factor not only of velocity but also of the size of the datasets generated in batches from non-streaming origins. High-volume data typically takes time to move from one point in the system to another, depending on network speed, and places burdens on both the network and storage. Optimizing for volume, as with large files, can affect velocity, and vice versa. High-definition video and audio files are typical examples of high-volume data, separate from streaming.
Variety poses translation challenges at the data, file, and protocol levels, as disparate organic systems may need varying levels of translation for data to be of value to downstream processes. A combination of CRM, line-of-business (LOB), social media, web, and mobile information about customers is a good example of data variety.
STORAGE
Storing data is typically dictated by the type of storage strategy, namely block or file, the flow mechanism, and the application. Over the years, storage has evolved into an optimization problem between storage costs and performance. As the volume and variety of data grow, the cost to reliably store, manage, secure, and protect it grows as well, particularly for data subject to compliance and regulatory mandates, such as personally identifiable information (PII), financial transactions, and medical records. Adding an availability component adds cost pressures at the expense of performance, and vice versa.
The segregation started by the choice of ingestion system is further exacerbated by storage, bringing about true silos or islands of information catering to the various application requirements of real-time, interactive, or batch processing. As silos permeate the IT infrastructure, hot spots arise in heavily used systems while capacity goes unused elsewhere. Downstream applications, compliance, security, and data policies can create further silos within silos, giving rise to a very complex system.
ANALYSIS
Data analysis technologies load, process, surface, secure, and manage data in ways that enable organizations to mine value from
their data. Traditional data analysis systems are expensive, and extending them beyond their critical purposes can place a heavy
burden on IT resources and costs. Integrating data from disparate sources adds complexity and management overhead that can be a
deterrent to most organizations looking to derive value from their data.
APPLICATION: SURFACE AND ACT
After analysis, results and insights must be surfaced for actions such as e-discovery, post-mortem analysis, business process improvements, decision making, or a host of other applications. Traditional systems use traditional protocols and access mechanisms, while new and emerging systems are redefining access requirements for data already stored within an organization. A system is not complete unless it caters to the range of requirements placed by traditional and next-generation workloads, systems, and processes.
BUSINESS CHALLENGES
As organizations face large datasets and data and application growth, they are seeing a tremendous increase in costs, both CAPEX and OPEX; limitations in their ability to realize the value of their data; and protection and compliance lapses, to name a few. These challenges can be attributed to a combination of factors throughout the data flow process, but fall into the following broad categories:
SILOS OR ISLANDS OF DATA
The ingestion strategy employed for real-time, interactive, or batch applications originates a silo that diverges as the data flows downstream. For example, an organization might set up a combination of SAN (to buffer streaming data) and NAS (for persistent storage) for a real-time application like customer targeting, while elsewhere CRM analysis happens on a combination of DAS and cloud. A third silo of archived data uses a combination of DAS and NAS to run batch analytics: a typical scenario in organizations today.
In the first silo, the compliance requirements for handling PII can force administrators to choose between applying policy to the entire silo or carving out a protected zone, if the technology supports one. This adds complexity to the system even before including access control for financial data to meet SEC requirements, or HIPAA for medical data. If the design of the system does not support these protections, organizations risk falling out of compliance, with large fines at a minimum and, at the other extreme, the liability of data leaks causing massive lawsuits.
Figure 3. Silos or islands of data
Silos of data also add cost pressures due to the inefficiencies addressed in the following section, lock insights within silos at the expense of the business, and introduce management and protection inconsistencies. In many organizations, applying learnings from a batch processing system to a real-time system may involve lengthy change management processes, creating friction across departments or adding barriers to value generation.
INEFFICIENCIES ACROSS THE SYSTEM
If one dataset of, say, 10 terabytes is used for three different analyses by three systems, you will have a minimum of three copies requiring three times the storage capacity. If this dataset grows by 30% a year, the annual scaling requirement amounts to 90% of the original dataset, demonstrating the inefficiencies organizations experience at the most basic level. If one silo has lower utilization than another, hotspots arise in the system while capacity goes unutilized elsewhere. Inefficient use of budget can also result from low-value data residing on high-cost, high-performance storage.
Organizations face management, datacenter footprint, and power and cooling inefficiencies due to silos and hotspots, in addition to reconfigurations, migrations, and complicated maintenance activities.
SECURITY AND COMPLIANCE
Organizations with silos face duplicate and inconsistent application of policies, security measures, and governance, whereas organizations with shared infrastructure face access control violations. Working around these issues is typically time-consuming and painful, and diverts resources away from standard procedures as issues arise. Securing data against leaks or destruction, both accidental and malicious, presents a separate set of challenges. These challenges are compounded in regulated sectors like finance, healthcare, and government.
TIME TO INSIGHTS
More users than ever are mobile or geographically dispersed, with access to a larger dataset as a basic requirement for performing their duties effectively. Traditional systems operated within the confines of corporate IT, through regulated and measured interconnections with the external world. With the growth of cloud, mobility, and devices, the capabilities of traditional systems are being challenged in new ways, increasing the time required to access, process, and consume insights.
Figure 4. Enterprise storage challenges
Providing access while enforcing policies consistently across a wide range of approved and unapproved devices and platforms is a growing challenge for today's storage administrators. Traditional systems, silos, and implementation strategies add latency as protection and security layers are wrapped around the application or the sources of information.
DATA LAKE
The data lake represents a paradigm shift from the linear data flow model. As data, and the insights gleaned from it, increase in value, enterprise-wide consolidated storage is transformed into a hub around which the ingestion and consumption systems work (see Figure 5). This enables enterprises to bring analytics to the data and avoid the expense of multiple systems, storage, and time for ingestion and analysis.
Figure 5. Scale-out data lake
By eliminating a number of parallel linear data flows, enterprises can consolidate vast amounts of their data into a single store, a data lake, through a native and simple ingestion process. This data can be secured, analyses performed, insights surfaced, and actions taken in an iterative manner as the organization and its technology mature. Enterprises can thus eliminate the cost of having silos or islands of information spread across the enterprise.
The scale-out data lake further enhances this paradigm by providing scaling capabilities in terms of capacity, performance, security,
and protection. The key characteristics of a scale-out data lake are that it:
• Accepts data from a variety of sources like file shares, archives, web applications, devices, and the cloud, in both streaming and
batch processes
• Enables access to this data for a variety of uses from conventional purposes to next-gen mobile, analytics, and cloud
applications
• Secures and safeguards data with the appropriate level of protection from highly critical data like medical records, financial
transactions, credit card data, and PII to website logs and temporary data that might not require any security
• Scales to meet the demands of future consolidation and growth as technology evolves and new possibilities emerge for applying
data to gain competitive advantage in the marketplace
• Provides a tiering ability that enables organizations to manage their costs without setting up specialized infrastructures for cost
optimization
• Maintains simplicity, even at the petabyte scale
ISILON SCALE-OUT DATA LAKE
Isilon enhances the data lake concept by enriching your storage with improved cost efficiency, reduced risk, data protection, security, compliance, and governance, while enabling you to get to insights faster. You can reduce the risks and operational expenses of your big data project implementation, and try out pilot projects on real business data before investing in a solution that meets your exact business needs. Isilon is based on a fully distributed architecture that consists of modular hardware nodes arranged in a cluster. As nodes are added, the file system expands dynamically, scaling out capacity and performance without adding corresponding administrative overhead.
Multiple access methods
Isilon natively supports multiple protocols, such as SMB, NFS, File Transfer Protocol (FTP), and Hypertext Transfer Protocol (HTTP) for traditional workloads, and HDFS for emerging workloads like Hadoop analytics. By its very nature, shared storage with multiple access methods eliminates storage silos, bringing efficiency and consistency to your IT infrastructure. Batch, real-time, and interactive applications and systems can store and access data from one shared storage pool without the need for migration, loading, or conversion. Access to data for reads and writes is achieved at the protocol level. This means data can be created by any of a myriad of systems, ingested into the data lake using a natively supported protocol such as SMB for Windows or Mac, and accessed or modified seamlessly using another protocol like NFS, FTP, or HDFS.
Multiprotocol support enables the storage infrastructure to provide access to applications that leverage third-platform protocols like HTTP or HDFS, which drive emerging workloads. Support for future protocols, data access mechanisms, or interfaces can be added easily to scale the interoperability of the data lake. Without a data lake, interoperability would require an expensive and time-consuming sequence of operations on data across multiple silos, or even costly and inefficient data duplication.
Cost efficiency
You can achieve great cost efficiency by investing only in the storage capacity and capability required today, smaller than traditional or Hadoop requirements; scaling in small steps proportional to data growth; tiering data according to its value; and simplifying management.
Isilon scale-out NAS can scale from 18 terabytes to over 50 petabytes in a single cluster, so you do not have to overprovision your storage infrastructure. With over 80% utilization, you can keep your CAPEX in check with a smaller footprint. SmartDedupe reduces the physical data footprint by locating and sharing common blocks across files, with minimal impact on performance during writes or concurrent reads. You will typically observe a reduction in storage expansion costs in the range of 30%, along with lower storage capacity, power, and cooling requirements and, of course, less rack space.
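Block-level deduplication of the kind SmartDedupe performs can be illustrated in miniature: identical blocks across files are detected by hash and stored only once. This is a conceptual sketch, not the OneFS implementation; the 8-byte block size is purely for illustration.

```python
import hashlib

BLOCK_SIZE = 8  # illustrative; real systems use much larger blocks

def dedupe(files: dict) -> tuple:
    """Map each file to an ordered list of block hashes; store each unique block once."""
    store = {}  # hash -> block bytes (single shared copy)
    index = {}  # filename -> ordered list of block hashes
    for name, data in files.items():
        hashes = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            h = hashlib.sha256(block).hexdigest()
            store.setdefault(h, block)  # keep only the first copy of each block
            hashes.append(h)
        index[name] = hashes
    return store, index

# Two files sharing the 8-byte block b"commonAA": four logical blocks, two stored.
files = {"a.log": b"commonAAcommonAA", "b.log": b"commonAAuniqueBB"}
store, index = dedupe(files)
```

The physical footprint shrinks in proportion to how many blocks the files have in common, which is exactly where the quoted expansion-cost savings come from.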
Figure 6. Isilon Scale-out data lake
Isilon enables you to tier the data lake to drive further cost efficiency by optimizing for performance, throughput, and density. Using policy-based tiering, you can reduce the cost of storing your data based on its inherent value and utility. Tiering using EMC Isilon S-Series, X-Series, NL-Series, and HD-Series nodes is shown in Figure 7. High-value, readily needed data can be kept on a high-performance tier, whereas low-value, infrequently used data can be moved to a more cost-effective active archive without any impact on your application or analytics infrastructure.
Figure 7. Tiering using Isilon storage nodes
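Policy-based tiering reduces, at its core, to a placement rule evaluated per file. The thresholds and tier names below are illustrative assumptions of our own, not OneFS policy syntax:

```python
def assign_tier(days_since_access: int) -> str:
    """Place a file on a tier by how recently it was used (illustrative policy)."""
    if days_since_access <= 30:
        return "performance"   # e.g., S-Series class nodes for hot data
    if days_since_access <= 180:
        return "capacity"      # e.g., X-Series class nodes for warm data
    return "archive"           # e.g., NL/HD-Series class nodes for cold data

# Applied across the file population on a schedule, such a rule drains cold
# data off expensive media with no change visible to applications.
```

The value of doing this inside a single namespace is that the move is transparent: the path the application uses does not change when the file changes tiers.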
Under the covers, an Isilon cluster provides automatic storage balancing and deduplication, which not only enhance storage efficiency but also enable storage administrators to weather hardware outages better.
Reduce risks
The risks in any large and growing data infrastructure can be classified as:
• The ability to successfully implement an initial solution
• The ability and flexibility to scale the solution as the environment changes
• Protection against current and future data loss
• Protection against data theft
These risks are amplified when the coupling between the ingestion, storage, and analytics systems remains tight. With a loosely coupled system, you can build mitigation strategies, and a data lake based on Isilon provides mitigation for all of these components. By decoupling storage from compute and using multiple access methods, you can deploy a storage system capable of effortlessly ingesting the data your solution needs. By using the protection and security features outlined in the following sections, you can ensure your data is safe and resilient to external forces.
Isilon is the only scale-out NAS to support multiple distributions of Hadoop through a native HDFS implementation. This gives you the flexibility to analyze, surface, and act on data using the best tool for your application, and to scale storage, compute, or both as your needs change.
Protect and secure data assets
Isilon storage systems are highly resilient and provide unmatched data protection and availability. Isilon uses the proven Reed-Solomon erasure coding algorithm rather than RAID to provide a level of data protection that goes far beyond traditional storage systems. With N+1 protection, data remains 100 percent available even if a single drive or a complete node fails, comparable to RAID 5 in conventional storage. You can also deploy Isilon with N+2 protection, which allows two components to fail, similar to RAID 6, or with N+3 or N+4 protection, where three or four components can fail while keeping the data 100 percent available.
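The idea behind N+1 protection can be seen with simple XOR parity, which tolerates the loss of any one element; Reed-Solomon coding generalizes this to N+2 through N+4 by computing multiple independent parity symbols. A minimal sketch of the N+1 case (illustrative only, not the OneFS implementation):

```python
def xor_parity(blocks: list) -> bytes:
    """Compute a parity block as the byte-wise XOR of all data blocks."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            parity[i] ^= b
    return bytes(parity)

def recover(surviving: list, parity: bytes) -> bytes:
    """Rebuild the single missing block: XOR of the survivors and the parity."""
    return xor_parity(surviving + [parity])

# Three equal-size data blocks, one parity block; lose any one data block
# and it can be rebuilt from the other two plus parity.
data = [b"node1---", b"node2---", b"node3---"]
parity = xor_parity(data)
rebuilt = recover([data[0], data[2]], parity)   # node2 lost, then recovered
```

Because only the lost blocks need to be recomputed, rebuild work scales with the data actually affected rather than with whole-disk capacity, which is the point made in the next paragraph.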
Isilon recovers from hardware failures faster than traditional systems because recovery entails rebuilding only the lost data rather than an entire disk. In a data lake, this can have a huge impact on operations, as storage is shared across systems with varying performance demands. A truly distributed storage system like Isilon can orchestrate all nodes to participate in the restoration or recovery of data over a dedicated backend InfiniBand network, speeding up recovery without impacting front-end performance.
As organizations come to view data as an asset, securing data for its inherent value becomes even more important than meeting regulatory compliance and corporate governance requirements. Isilon helps organizations address security needs by providing the robust and flexible security features outlined below and shown in Figure 8.
Figure 8. Authentication zones
• Secure role separation enables role-based access control (RBAC), where a clear separation between storage administration and file system access is enforced.
• Authentication zones serve as secure, isolated storage pools that insulate departments within the organization from data they are not supposed to see. For example, legal, financial, PII, and employee data can be seen only by authorized employees in the corresponding authentication zone.
• Write once, read many (WORM) protection is provided by SmartLock software, which prevents accidental or malicious alteration or deletion of data, a key requirement for governance and compliance.
• Data-at-rest encryption through self-encrypting drives ensures that the physical loss of hardware does not equate to a data leak.
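An authentication zone is, in effect, a mapping from an access context to an isolated slice of the namespace. The zone names, group names, and paths below are hypothetical and serve only to illustrate the isolation property:

```python
# Hypothetical zone table: each zone exposes only its own base path.
ZONES = {
    "finance": {"base_path": "/ifs/finance", "groups": {"finance-users"}},
    "legal":   {"base_path": "/ifs/legal",   "groups": {"legal-users"}},
}

def can_access(zone: str, group: str, path: str) -> bool:
    """A request succeeds only inside its zone's subtree and authorized groups."""
    z = ZONES.get(zone)
    if z is None or group not in z["groups"]:
        return False
    return path == z["base_path"] or path.startswith(z["base_path"] + "/")

# A finance user cannot reach legal data even though both departments
# share the same physical cluster.
```

The design point is that isolation is enforced at the namespace boundary, so consolidating departments onto shared storage does not weaken their separation.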
Faster time to insights
By utilizing a shared infrastructure, you can consolidate data from multiple islands into one single storage system based on Isilon. This is made possible through the multiple access mechanisms at the protocol level, where data can be ingested into the Isilon store from a wide variety of sources and surfaced for use by others. This eliminates the need for the data migration or extract-transform-load (ETL) operations typical of any data analytics solution, saving you precious time and resources. You can then run Hadoop analytics on your dataset in place, as depicted in Figure 9. By using a large, loosely coupled, shared, scalable store for a typical dataset of around 100 terabytes, you can save over 24 hours of data movement time on a 10 Gbps network; time you can use to actually generate insights from your data.
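The time saved by in-place analytics is simple to estimate. The 100 TB dataset and 10 Gbps link are the figures from the text; the calculation below gives the theoretical minimum at line rate, and protocol overhead only lengthens a real transfer:

```python
dataset_bytes = 100e12            # 100 TB dataset
link_bits_per_s = 10e9            # 10 Gbps network
link_bytes_per_s = link_bits_per_s / 8

# Hours needed just to move the data before any analysis can start.
transfer_hours = dataset_bytes / link_bytes_per_s / 3600
print(round(transfer_hours, 1))   # 22.2 hours at line rate, before overhead
```

With real-world protocol and storage overhead on top of the 22-hour line-rate floor, a full-day saving per analysis cycle is a conservative estimate.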
Isilon is the only scale-out NAS that works with multiple distributions of Hadoop from a variety of vendors. This enables you to try tools from all of these vendors, at the same time if necessary, to find the best solution for your business requirements.
The data lake is the key enabler for driving business value in customer environments through a multiprotocol, multi-access, tiered, protected, and scalable single-namespace data repository. Leveraging the scale-out data lake, customers can consolidate multiple disparate islands of storage into a single cohesive, unified data repository that is easier to manage and more cost-effective.
Figure 9. Faster insights
STORAGE AND DATA SERVICES
A scale-out data lake leverages industry-leading, enterprise-grade storage and data services that extend the business value of your data. Isilon infrastructure software provides powerful storage management capabilities that help protect your data assets, control costs, and optimize the storage resources and system performance of your scale-out data lake.
The data management capabilities include EMC Isilon InsightIQ®, MobileIQ, SmartDedupe, SmartPools®, and SmartQuotas™, which together help you improve the ROI of your data lake infrastructure. For more information on these data management services, review the Isilon OneFS white paper listed in the References section.
The scale-out data lake is further strengthened by proven data protection capabilities provided by EMC Isilon SmartConnect™, SmartLock®, SnapshotIQ™, and SyncIQ®. For more information on these data protection services, review the Isilon data availability white paper listed in the References section.
CONSUMPTION MODELS
Isilon provides a number of consumption models so that you can choose the strategy that is best for your business. The simplest and most common is an appliance: a preinstalled combination of hardware and software. You can also choose a converged infrastructure solution in conjunction with EMC Vblock®, or a cloud-based utility infrastructure-as-a-service, in which you pay for the service based on usage. The key is that you have choices in the storage purchasing model and the flexibility to fit your business procurement needs.
CONCLUSION
Given that unstructured data will double every two years, enterprises need higher efficiency, architectural simplicity, and more protection as they scale capacity and capabilities. A scale-out data lake provides key capabilities to eliminate silos of data, secure and protect information assets, and support existing and next-generation workloads while speeding time to insights. Starting with a scale-out data lake, organizations can:
1. Invest in the infrastructure today to get started
2. Realize the value of data by storing, processing, and analyzing it in the most cost-effective manner
3. Grow capabilities as needs grow in the future
This enables organizations to store everything, analyze anything, and build the solution with the best ROI. By decoupling storage from analysis and application, organizations gain the flexibility to choose among a larger number of strategies to deploy solutions without risking data loss, cost overruns, or dataset leaks. A data lake based on Isilon offers organizations all of this, along with the ability to simplify the IT infrastructure; tier, secure, and protect data efficiently; and get to insights faster.
Because a large dataset is the reason big data exists, organizations can start with the data: understand the value locked in their large and growing datasets by running pilots, without worrying about ingestion or surfacing, and pilot applications or emerging technologies to make better-informed strategic decisions.
REFERENCES
EMC Digital Universe Study, with research and analysis by IDC: http://www.emc.com/leadership/digital-universe/index.htm
EMC Isilon OneFS Operating System: http://www.emc.com/collateral/hardware/white-papers/h8202-isilon-onefs-wp.pdf
High Availability and Data Protection with EMC Isilon Scale-Out NAS: http://www.emc.com/collateral/hardware/white-papers/h10588-isilon-data-availability-protection-wp.pdf
ABOUT EMC
EMC Corporation is a global leader in enabling businesses and service providers to transform their operations and deliver IT as a
service. Fundamental to this transformation is cloud computing. Through innovative products and services, EMC accelerates the
journey to cloud computing, helping IT departments to store, manage, protect, and analyze their most valuable asset—information—
in a more agile, trusted, and cost-efficient way. Additional information about EMC can be found at www.EMC.com.