2. The Pressures of Digital Medicine
Growing Costs: Medical imaging is one of the fastest-growing components of medical costs, constituting between 7.5 and 10% of total health-care expenditures.1
Size: Medical imaging stores just in U.S. hospitals grew from less than 8,900 terabytes in 2005 to more than 27,000 terabytes in 2011.1
Storage: PACS storage requirements are growing 17% globally.1
Oversight: Regulatory oversight is strong, with national and regional rules governing retention, availability, privacy and security.1
Cost: Imaging costs amount to more than $100 billion per year; 10% of these exams ($10 billion worth) tend to be duplicative or unnecessary.1
Analytics and Medical Data: With a large portion of the electronic medical record represented as unstructured data, the trend is toward seeing medical images as enterprise content and mining their value through analytics.2
1. Frost & Sullivan, via The Data Center Journal: "What's Next for the Health-Care Data Center?", Morris Panner, Apr 6, 2015.
2. Gartner Market Guide for Vendor-Neutral Archives, published 18 August 2015.
3. Image Capture, Use and Storage
[Diagram: Radiology, Pathology, Dermatology and other departments each capture images into their own separate departmental PACS.]
5. Simplify and Consolidate
[Diagram: Radiology solutions, cardiology, pathology and other specialty solutions, and their PACS (handling videos, images, sensor data, etc.) all connect over S3, SMB and NFS to shared Scality RING storage, which acts as a vendor-neutral archive alongside public cloud storage, with data protection and resiliency locally and across datacenters.]
6. Scality Meets Business and Clinical Requirements
Privacy: Keep data safe to reduce breach risk and the associated regulatory penalties.
Low TCO: Keep infrastructure and administrative costs low.
Access: Don't slow clinicians down with access waits.
Integrity: Don't compromise patient care by risking loss of data, whether total loss or degradation in quality.
7. The Scality RING
• Scality RING software turns standard x86 servers into reliable object storage
• Scales to hundreds of petabytes to manage billions of small or large files
• Supports thousands of simultaneous users without affecting performance
• 100% uptime; no maintenance windows required
[Diagram: object, file and OpenStack applications all access the Scality RING software layer (management, local and geo-protection, self-healing; object, file and OpenStack access; advanced routing and object storage) running on standard Linux and standard x86 servers.]
8. …and Reduces Cost
Scale out globally across multiple workloads; keep productivity up with next-to-zero downtime; protect data integrity with over ten nines of durability.
EXECUTIVE SUMMARY
Return on Investment: 229%
Net Present Value: $8.0M
Cost Advantage: $3.1M
SOURCE: Forrester Report, "The Total Economic Impact of Scality RING", for 2 petabytes growing to 4 petabytes
9. Scality RING Scalable Architecture
• Scale-out access and capacity to millions of clients and billions of objects
• Any-to-any: any client can access any object in parallel with no added latency
• Shared nothing: no "master" metadata node or single point of failure
• Self-healing: automated, fast repair and rebuild
• Supports object (S3 API) and file system (NFS, SMB) natively
• 100% software: install on any hardware, flexible deployments, 100% available during software/hardware upgrades, no data migrations
[Diagram: applications A, B, … n reach the RING through file, object and other connectors; data is distributed across storage nodes, each managing multiple HDDs.]
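Shared-nothing, any-to-any access implies that clients can locate objects deterministically without consulting a central metadata service. The generic technique behind that idea is consistent hashing; the sketch below is a minimal Python illustration, not Scality's actual placement algorithm. The node names, the SHA-1 ring and the single-owner lookup are all assumptions made for the example.

```python
# Illustrative consistent-hashing sketch (NOT Scality's actual algorithm):
# nodes and keys share one hash space arranged as a ring, so any client can
# compute an object's home node deterministically, with no central lookup.
import bisect
import hashlib

def ring_hash(value: str) -> int:
    # Map a string onto a 160-bit ring, echoing the RING's 2**160 key space.
    return int(hashlib.sha1(value.encode()).hexdigest(), 16)

class HashRing:
    def __init__(self, nodes):
        # Precompute sorted node positions; each lookup is then O(log n).
        self.ring = sorted((ring_hash(n), n) for n in nodes)
        self.positions = [pos for pos, _ in self.ring]

    def node_for(self, key: str) -> str:
        # The first node clockwise from the key's position owns the key.
        i = bisect.bisect(self.positions, ring_hash(key)) % len(self.ring)
        return self.ring[i][1]

ring = HashRing(["node-a", "node-b", "node-c"])
print(ring.node_for("radiology/study-0001.dcm"))  # same answer from any client
```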
10. Scality HALO Cloud Monitoring
• Real-time monitoring, alerting and capacity-planning solution
• Based on a best-of-breed cloud monitoring solution powered by Netuitive
• Provides access to the first level of Scality support teams
Included in Support Services at two levels of service:
Standard Service: entry-level monitoring service available for free to all Scality customers; 15 metrics on the RING (characteristics, operations, latency).
Dedicated Care Service: full monitoring service available as part of the Always-On service; hundreds of metrics, including RING status, disk/node monitoring, IO daemons, connectors, buckets and more.
11. Always On: Guaranteed 100% Availability
• Designed for no data loss, with fourteen nines (14x9s) of durability, guaranteed by contract with DCS
• Automatic failure detection, data rebuild and rebalancing
• Optimal protection for both large and small objects
• Telenet, one of our first customers, has had zero downtime since their installation in 2010, while growing 25x
No downtime means data is always there when it's needed.
12. Scale On Demand: Grow Capacity and Performance with Needs
• Capacity can grow with data, easily
• Performance scales linearly with capacity growth
• Scale-out architecture with a 100% parallel design for data and metadata
Predictable performance keeps clinicians happy.
13. Value and Versatility: Rich Application Ecosystem, File and Object, Commitment to Standards
• Field-tested with leading PACS and VNAs
• Standards-based APIs for broad compatibility (strong compatibility with the de facto standard AWS S3 API and S3 developer toolkits)
• Full compatibility with public cloud services, enabling seamless use of the same applications on premises or in the public cloud (see the sketch below)
• Supports file (NFS, SMB) and object natively
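Because the RING exposes the de facto standard S3 API, existing S3 tooling should work largely unchanged against an on-premises endpoint. Here is a minimal sketch using Python and boto3; the endpoint URL, bucket name, object key and credentials are placeholders invented for the example, not real values.

```python
# Sketch: archiving and retrieving an imaging file through an S3-compatible
# endpoint. All names and credentials here are hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.ring.example.org",  # hypothetical on-prem RING endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

s3.create_bucket(Bucket="imaging-archive")

# Store a study exactly as you would against AWS S3.
with open("study-0001.dcm", "rb") as f:
    s3.put_object(Bucket="imaging-archive", Key="radiology/study-0001.dcm", Body=f)

# Retrieve it later; with native file support the same data is also
# reachable via the NFS/SMB connectors.
obj = s3.get_object(Bucket="imaging-archive", Key="radiology/study-0001.dcm")
data = obj["Body"].read()
```

Pointing the same code at a public cloud S3 endpoint only requires changing endpoint_url, which is what the "same applications on premises or in the public cloud" bullet is getting at.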
14. Secure for Privacy and Compliance: Integrated Secure Encryption
• Secure data storage using integrated AES-256 encryption
• Data integrity with S3-native MD5 checksums
• User access control lists (ACLs) and authorization control
Enables compliance.
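As a sketch of how the MD5-checksum and ACL points look through the S3 API: for a simple single-part upload without server-side encryption, S3 returns the payload's MD5 hex digest as the ETag, so a client can verify integrity end to end. This reuses the hypothetical s3 client and bucket from the previous sketch.

```python
# Sketch: client-side integrity check plus a canned ACL on upload.
# For single-part uploads without server-side encryption, the returned
# ETag is the hex MD5 of the payload (multipart uploads differ).
import hashlib

body = open("study-0001.dcm", "rb").read()
resp = s3.put_object(
    Bucket="imaging-archive",
    Key="radiology/study-0001.dcm",
    Body=body,
    ACL="private",  # canned ACL: owner-only access
)
assert resp["ETag"].strip('"') == hashlib.md5(body).hexdigest()
```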
15. Predictable Costs: 100% Software-Defined and Hardware-Independent
• Software-defined, running on standard x86 servers, allows hardware choice
• Hardware-independent licensing based on usable capacity facilitates anytime upgrades
• Reduces administrative overhead and cost for ongoing operations
No forklift upgrades.
16. Availability Matters
Store ePHI, keeping it both safe and quickly, effortlessly available. Store more, longer, for less: it's easier and cheaper than you think.
18. World Leader in Object Storage
• 500 million+ people access, store and protect data on the Scality RING
• 800 billion+ objects are managed by the Scality RING in production environments
• 150+ cloud-scale customers
• $92 million in funding from investors in the United States, Europe and Japan
19. Scality Is the Object and Scale-Out File Storage Leader, Proven by Analysts
20. Only Scale-Out Storage Supplier for All Three Server Leaders
Cisco: Scality is a Preferred Solution Partner of Cisco, with a Cisco Validated Design; a key solution of Cisco DCS (Data Center Services); reference architectures with Cisco UCS S3260, C240 M4 and C220 M4.
Dell: resells Scality as an "SDS object storage" partner; S7000 file and object storage; reference architectures with Dell R730xd, R630 + MD3060e, and DSS7000.
HPE: resells Scality as its sole "scale-out file and object" solution; reference architectures with HPE Apollo 4510, 4200 and 4530 / Cloudline; joint engineering; strategic venture investment.
21. Our Experts Make It Easy for You
Fast Track: an appliance-like experience; pre-configured and pre-installed; reference designs.
Custom Track: configured to order; optimized for your needs.
We are here for you 24 x 7.
22. Experience the RING in 10 Minutes
• Log on to our free web trial: www.scality.com/trial
• Download the free open-source S3 Server and have it running on your laptop in less than 10 minutes: https://s3.scality.com/
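For the laptop trial, a rough smoke test might look like the following. It assumes the open-source S3 Server container (published on Docker Hub as scality/s3server) is already running locally on port 8000 with the project's documented default test credentials; check the current README at https://s3.scality.com/ for the exact values.

```python
# Sketch: smoke-testing a locally running open-source S3 Server.
# Assumes something like: docker run -d -p 8000:8000 scality/s3server
# accessKey1/verySecretKey1 were the documented defaults; verify in the README.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:8000",
    aws_access_key_id="accessKey1",
    aws_secret_access_key="verySecretKey1",
)

s3.create_bucket(Bucket="demo")
s3.put_object(Bucket="demo", Key="hello.txt", Body=b"hello from the laptop")
print(s3.get_object(Bucket="demo", Key="hello.txt")["Body"].read())
```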
Presentation Title: Efficient, Accessible Storage for Medical Images and Other Unstructured Healthcare Data
Last Edited: January 2017
Presentation Goal: Introduce an efficient alternative to PACS storage, conventional NAS or tape for medical imaging and other healthcare data
Target Audience: All levels, introductory: CIO/CTO of a hospital or healthcare practice; IT or infrastructure managers/directors; PACS and storage infrastructure managers
Speaking Points:
Welcome (introduce yourself and other associates)
Get to know your audience (ask about current levels of storage understanding, their current role in the decision/evaluation process, …)
Over the next 30-45 minutes we will discuss the current trends in healthcare storage and its biggest consumer, imaging: specifically, the strain that electronic medical record initiatives, cross-institutional sharing requirements and long retention periods are putting on your storage infrastructure and team.
We’ll also introduce you to Scality’s software defined storage solution as a world leader in object and cloud storage
Discovery Opportunity:
How familiar are you with your current and near-term future storage strategy? Can you share highlights?
What specific issues would you like to discuss today? How much time do we have?
Slide Goal: Show an understanding of the challenges posed by the digital medicine data explosion
Speaking Points:
As medicine goes digital, the requirements— operational and regulatory—to create, store and share access to data are growing exponentially. Scality presents a great solution for consolidation of the massive amounts of unstructured data in healthcare that solves the problem now, and sets you up for easy management and expansion as the stores grow ever larger.
Discovery Opportunity:
What type of data growth (explosion) are you anticipating in the coming years?
Speaking Points
Imaging is the place to start when working to get unstructured data under control. Images are generated across multiple departments, and have been managed within those departments.
In most larger healthcare organizations, there are lots of departmental "islands" of storage. And as those siloed storage islands grow, management is difficult and costs are hard to control.
Speaking Points
Expensive storage on departmental PACS gets bloated, or data is offloaded to tape, making image access difficult.
A shared object storage pool enables long-term storage from all of these sources, that stays accessible within and across organizations.
Openness, with oversight and security; interoperability across ecosystems; access to data across and among health delivery organizations (HDOs) and vendors.
Whether through the VNA (which makes the data more universally accessible) or direct from the PACS, Scality RING Software-Defined Storage can tame the storage sprawl. There is no limit to the number of files or objects that can be stored on the RING, or to the number of storage nodes/disks that can be added to a RING cluster, allowing you to grow from 1 or 2PBs to 10s or 100s of PBs.
The Scality RING is an object store at its core, offering both native object protocol access (S3, HTTP REST, CDMI) and file system protocols (NFS, SMB, FUSE). By leveraging object storage, we don't hit the traditional file system inode limitations that other storage systems do (as with building multiple NAS heads, which becomes very difficult to manage at the multi-petabyte scale).
Detail: At the lower layer, all chunks are stored via HTTP request (PUT, GET, DELETE) and leverage a unique key identifier coded on 160 bits. A single RING addressable namespace is near limitless, at 2^160.
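To make "near limitless" concrete: a 160-bit key space holds 2^160 distinct identifiers, on the order of 10^48. A quick check in plain Python:

```python
print(2 ** 160)  # 1461501637330902918203684832716283019655932542976, about 1.46e48
```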
There are several areas to be considered when choosing storage for healthcare: clinicians, IT, finance and ultimately, patients. Scality meets the needs across the spectrum.
Slide Goal: Introduce Scality RING
Speaking Points:
The Scality RING is software-defined storage. We run on standard x86 servers and create a giant pool of storage. We protect the data and provide 100% reliable, high performance access for any capacity-driven application.
100% reliability with no maintenance windows, for superior SLAs to your customers – case in point, our first customer hasn’t been down since they installed the RING five years ago!
The only storage platform with native file, object, and OpenStack support – legacy file, digital, and cloud workloads all in one platform for better utilization, holistic protection, and lower costs
Real hardware choice means the best hardware for you at any point in the lifecycle, and no data migration – better economics and lower risk
Discovery Opportunity:
Have you ever seen the RING in action?
Slide Goal: Reminder that Scality is the most effective solution for petabyte scale (from analyst reports)
Speaking Points:
In 2016, Forrester did a Total Economic Impact (TEI) study to look at the benefits of the Scality RING compared to traditional NAS storage arrays. This was a detailed analysis done by interviewing our customers and then building a comparative analysis against traditional storage.
The benefits were clear: $8M in expected cost savings over the life of the storage, over $3M from capital savings alone, and a 229% ROI over the life of the storage.
It also delivered a capital payback within only 6 months.
The key benefits Forrester found were:
Avoided Capital expenditures on traditional scale-out NAS solutions.
Improved availability and durability with zero downtime
And increased business productivity since they could now deliver new storage on-demand as the business required.
Discovery Opportunity:
Would you be interested in a custom ROI/TCO analysis for your needs?
Slide Goal: Introduce Scality RING Architecture
Speaking Points:
Use talking points on slide as architectural differentiators of Scality RING
Discovery Opportunity:
Does this meet your architectural expectations?
What is most important?
What is missing?
Slide Goal: Take a deeper look at availability/durability expectations of the RING
Speaking Points:
With scalable object storage, you can scale beyond petabytes with linear performance in less space.
Architected for up to trillions of objects in a single namespace, the platform’s access and storage layers can be scaled independently to thousands of nodes, all of which can be accessed directly and concurrently with no added latency.
These density-optimized platforms are designed for space efficiency at scale; to add capacity, simply add more storage drives or servers, which also increases back-end performance. This modular scalability between the access and capacity layers dramatically reduces cost.
Architected for 100% uptime and 14 nines of durability, with policy-based data protection schemes, native multi-site capabilities, and self-healing features (a toy durability model follows these points).
Scality HALO remote monitoring service and cloud dashboard, with immediate alerting on abnormal cluster activity and dedicated Scality support staff.
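To give a feel for where a figure like 14 nines comes from, here is a toy erasure-coding durability model in Python. This is not Scality's actual durability math; the 9+3 scheme, 2% annualized disk failure rate, and 4-hour rebuild window are illustrative assumptions, and real models also account for multi-site placement and correlated failures:

# Toy durability model (NOT Scality's actual math): an object is erasure-coded
# into k data + m parity chunks on independent disks, and is lost only if more
# than m chunks fail before a rebuild completes.
from math import comb

def loss_probability(k: int, m: int, p: float) -> float:
    """P(object loss in one window) = P(more than m of the k+m chunks fail)."""
    n = k + m
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(m + 1, n + 1))

# Illustrative assumptions, not real parameters:
annual_disk_failure = 0.02   # 2% annualized disk failure rate
rebuild_hours = 4            # assumed time to rebuild a lost chunk
p_chunk = annual_disk_failure * rebuild_hours / 8760  # failure within one window

windows_per_year = 8760 / rebuild_hours
annual_loss = loss_probability(9, 3, p_chunk) * windows_per_year
print(f"annual object-loss probability ~ {annual_loss:.1e}")  # ~7e-15, i.e. ~14 nines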
Discovery Opportunity:
What is the cost of downtime?
Slide Goal: Take a deeper look at scale capacity and performance expectations with the RING
Speaking Points:
The 100% parallel design for metadata and data provides high-throughput performance, with no bottlenecks at central metadata databases or indexes and no lookups needed for replicas.
It also provides consistent response time at scale: each storage node will retrieve the requested data, or find the correct node, in a deterministic number of hops no matter the scale (see the lookup sketch after these points).
Cover other bullets on slide
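As an illustration of what deterministic, lookup-free placement can look like, here is a generic consistent-hashing sketch in Python. It shows the general technique, not Scality's actual routing code; the node names and object key are made up:

# Generic consistent-hashing sketch: key-to-node routing on a ring. Each node
# owns an arc of the 160-bit keyspace; an object's home node is a pure
# function of its key, so no central index or lookup table is needed.
import bisect
import hashlib

KEYSPACE = 2**160  # matches the 160-bit identifiers mentioned earlier

def ring_position(name: str) -> int:
    """Map a node name or object key onto the ring (SHA-1 yields 160 bits)."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

# Hypothetical node names; in practice each server knows the full ring layout.
nodes = sorted((ring_position(f"node-{i}"), f"node-{i}") for i in range(6))
positions = [pos for pos, _ in nodes]

def home_node(object_key: str) -> str:
    """Find the node responsible for a key: O(log n) locally, no lookups."""
    idx = bisect.bisect_right(positions, ring_position(object_key)) % len(nodes)
    return nodes[idx][1]

print(home_node("pacs/study-001.dcm"))  # same deterministic answer on any node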
Discovery Opportunity:
What is your current storage staffing? What is your ideal?
How are you handling your internal SLA requirements?
Slide Goal: Take a broader view
Speaking Points:
Cover bullets on slide
Discovery Opportunity:
Multi-site private cloud, or even secure public cloud, can help keep data safe and available
Versatility and agility are key to IT success
Slide Goal: Regulatory compliance and the promise of patient privacy are a basic requirement
Speaking Points:
AES-256 encryption and TLS network security (a usage sketch follows these points)
Can mix file and object types in a single system natively
We have healthcare customers, as well as other customers for whom security and privacy are paramount, including large banks
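As a concrete illustration, the sketch below requests TLS on the wire and AES-256 encryption at rest via the standard S3 server-side encryption header. The endpoint, bucket, and credentials are placeholders, and whether a given RING deployment honors the SSE header this way depends on its configuration:

# Sketch: TLS in transit plus the standard S3 server-side encryption header.
# All names and credentials below are hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://ring.example.local",  # https:// gives TLS in transit
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

s3.put_object(
    Bucket="patient-records",          # hypothetical bucket and key
    Key="records/12345.json",
    Body=b'{"example": "payload"}',
    ServerSideEncryption="AES256",     # request AES-256 encryption at rest
)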
Discovery Opportunity:
Which applications are most taxing on your storage infrastructure?
What level of file and object requirements do you anticipate?
Slide Goal: Take a deeper look at the TCO expectations of the RING
Speaking Points:
Access all of your data types at petabyte scale, with the freedom to choose a platform today that remains sustainable into the future for the best ROI.
You have a choice of interfaces and deployment topologies, and can mix and match different configurations and server models across multiple server generations.
The hardware-independent, pay-as-you-grow licensing model liberates you from purchasing new licenses every time the hardware platform is refreshed.
It also lowers your total cost of ownership (TCO) by avoiding future data migrations and upgrades, while maximizing ROI on software license fees.
Discovery Opportunity:
How do you predictably manage your storage budgets today?
How do you plan for migrations and technical obsolescence?
Bottom line, patient data has to be available and intact – and cost concerns are real, so TCO is a big consideration.
Slide Goal: Transition slide to show how Scality is a proven leader across partners, customers and industry analysts
Speaking Points:
Scality, world leader in object and cloud storage, develops cost-effective Software-Defined Storage: the RING, which serves over 500 million end-users worldwide with over 800 billion objects in production; and the open-source S3 Server. Scality RING software deploys on any industry-standard x86 server, uniquely delivering performance, 100% availability and data durability, while integrating easily in the datacenter thanks to its native support for directory integration, traditional file applications and over 45 certified applications. Scality’s complete solutions excel at serving the specific storage needs of Global 2000 Enterprise, Media and Entertainment, Government and Cloud Provider customers while delivering up to 90% reduction in TCO versus legacy storage. A global company, Scality is headquartered in San Francisco.
Discovery Opportunity:
N/A
Slide Goal: Show analyst reports establishing Scality as a market leader
Speaking Points:
Gartner Disclaimer
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
[1] Gartner “2016 Magic Quadrant for Distributed File Systems and Object Storage” by Julia Palmer, Arun Chandrasekaran, Raj Bala, October 20, 2016.
IDC MarketScape vendor analysis model is designed to provide an overview of the competitive fitness of ICT suppliers in a given market. The research methodology utilizes a rigorous scoring methodology based on both qualitative and quantitative criteria that results in a single graphical illustration of each vendor’s position within a given market. The Capabilities score measures vendor product, go-to-market and business execution in the short term. The Strategy score measures alignment of vendor strategies with customer requirements in a 3-5-year timeframe. Vendor market share is represented by the size of the circles. Vendor year-over-year growth rate relative to the given market is indicated by a plus, neutral or minus next to the vendor name.
Discovery Opportunity:
Are you a Gartner or IDC client?
Do you have access to these reports?
If not, can I send them to you?
Slide Goal: Introduce Scality alliance partnerships
Speaking Points:
Scality is the only storage vendor with strong partnerships with over 50% of the server market.
HPE, Dell, and Cisco each offer a wide range of hardware that can be deployed with Scality: from performance-oriented to capacity-optimized.
You can buy Scality through HPE or Dell today.
Discovery Opportunity:
Are you already a customer of one (or more) of these vendors?
Which is your preference for ease of acquisition and support?
Slide Goal: Introduce two ways to get started with Scality
Speaking Points:
Fast Track: an appliance-like approach, the fastest way to get started
Flexible Track: custom configuration for your specific requirements
Discovery Opportunity:
Which is your preferred approach?
Which use cases would you like to prioritize first? Second? Third?
Slide Goal: Call to action slide
Speaking Points:
In 10 minutes you can log into our free web trial and experience the RING for yourself
Or your developers can download our open-source S3 server and start developing today
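For the developer path, a hedged sketch of the first few calls against a locally running instance of the open-source S3 server. The Docker image name, port, and development credentials shown in the comments reflect the project's documentation at the time and should be verified against the current README:

# Assumes a local instance of the open-source S3 server is already running,
# e.g. via: docker run -d -p 8000:8000 scality/s3server
# (verify image name, port, and credentials against the current README).
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:8000",
    aws_access_key_id="accessKey1",          # development defaults documented
    aws_secret_access_key="verySecretKey1",  # at the time; may have changed
)

s3.create_bucket(Bucket="demo")
s3.put_object(Bucket="demo", Key="hello.txt", Body=b"hello ring")
print(s3.list_objects_v2(Bucket="demo")["KeyCount"])  # -> 1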
Discovery Opportunity:
Which is your preferred approach?
Which use cases would you like to prioritize first? Second? Third?