Cisco Live in-booth presentation explaining how clustered Data ONTAP gives organizations and cloud service providers the capability to rapidly and cost-effectively deliver new services and capacity with maximum application uptime.
Webinar: How To Use Software Defined Storage to Extend Your SAN, Not Replace It – Storage Switzerland
Join Storage Switzerland and ioFABRIC for this on-demand webinar, "How to Use Software Defined Storage to Extend Your SAN, Not Replace It." We discuss the different types of software-defined storage, why vendors want to replace your SAN rather than enhance it, and what you can do to not only protect your current storage investments but also prepare a path to the future.
VMworld 2013: Software-Defined Storage: The VCDX Way – VMworld
VMworld 2013
Wade Holmes VCDX, VMware
Rawlinson Rivera VCDX, VMware
Learn more about VMworld and register at http://www.vmworld.com/index.jspa?src=socmed-vmworld-slideshare
Emergence of Software Defined Storage
SDS role in Software Defined Data Center
The value SDDC/SDC will bring to developers, system integrators and the IT community.
'Software-Defined Everything' Includes Storage and Data – Primary Data
Is your data stuck where it started? Join us and industry analyst Jason Bloomberg this Tuesday, July 26 to discover how you can automate data mobility across your software-defined datacenter.
If you’re like most enterprises, you’ve likely added the benefits of flash and cloud storage to your traditional infrastructure. This storage diversity delivers more choice in meeting the performance, protection and cost requirements of different applications, but without a way to converge data across your different storage investments, it’s nearly impossible to align the right data to the right storage at the right time. Data virtualization is a software-defined solution that finally unites different storage systems into a global pool of resources, so that even data can be part of your SDDC architecture, from on-premises into the cloud.
In Tuesday’s webinar, Jason will provide insight on how the principle of Software-Defined Everything supports the business agility needs of today’s enterprises. He will also discuss the software-defined approach to championing agility by automatically aligning storage resources to evolving data demands through data virtualization and orchestration, even as business needs change.
Following Jason’s talk, Primary Data Senior Systems Engineer Brett Arnott will cover how data orchestration ensures that data is automatically aligned to the right storage resource to deliver breakthrough agility and efficiency. Attendees will learn how data virtualization and orchestration help enterprises not only develop a roadmap for their transition to software-defined storage and data, but also execute the move to automated, objective-driven storage efficiency.
Imagine an entire IT infrastructure controlled not by hands and hardware, but by software. One in which application workloads such as big data, analytics, simulation and design are serviced automatically by the most appropriate resource, whether running locally or in the cloud. A Software Defined Infrastructure enables your organization to deliver IT services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. It is the foundation for a fully integrated software-defined environment, optimizing your compute, storage and networking infrastructure so you can quickly adapt to changing business requirements. A comprehensive portfolio of management tools dynamically manages workloads and data, transforming a static IT infrastructure into a workload-, resource- and data-aware environment.
Learn more: http://ibm.co/1wkoXtc
Watch the video presentation: http://insidehpc.com/2015/03/slidecast-software-defined-infrastructure/
Software Defined Anything (SDx) is a movement toward promoting a greater role for software systems in controlling different kinds of hardware: more specifically, making software more "in command" of multi-piece hardware systems and allowing for software control of a greater range of devices.
Software Defined Everything (SDx) includes:
Software Defined Networks (SDN)
Software Defined Computing (SDC)
Software Defined Storage (SDS)
Software Defined Data Centers (SDDC)
Project Presentation. Gives a good overview of the various software defined technologies and quality attributes. I am looking for sales jobs at high tech companies. My profile is on LinkedIn if you need to contact me. I appreciate feedback and comments on this presentation.
Software Defined Storage - Open Framework and Intel® Architecture Technologies – Odinot Stanislas
This presentation offers a fairly detailed introduction to the notion of the "SDS Controller": in short, the software layer intended ultimately to control all storage technologies (SAN, NAS, distributed storage on disk, flash, and so on) and to expose them to cloud orchestrators, and therefore to applications. Lots of good content.
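To make the SDS Controller idea concrete, here is a minimal sketch of such a control layer: a single entry point that fronts heterogeneous storage backends (SAN, NAS, distributed disk, flash) and routes provisioning requests to them by policy, so that an orchestrator never talks to a backend directly. All class and method names here are invented for illustration; they are not part of any real product's API.

```python
from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """One underlying storage technology (SAN, NAS, distributed disk, flash)."""

    @abstractmethod
    def provision(self, size_gb: int) -> str:
        """Create a volume and return its identifier."""


class FlashBackend(StorageBackend):
    def provision(self, size_gb: int) -> str:
        return f"flash-vol-{size_gb}gb"


class NasBackend(StorageBackend):
    def provision(self, size_gb: int) -> str:
        return f"nas-share-{size_gb}gb"


class SDSController:
    """Single control plane fronting many backends, selected by storage tier."""

    def __init__(self) -> None:
        self.backends: dict[str, StorageBackend] = {}

    def register(self, tier: str, backend: StorageBackend) -> None:
        self.backends[tier] = backend

    def provision(self, tier: str, size_gb: int) -> str:
        # A cloud orchestrator calls this one entry point; the controller
        # routes the request to whichever backend serves the requested tier.
        return self.backends[tier].provision(size_gb)


controller = SDSController()
controller.register("performance", FlashBackend())
controller.register("capacity", NasBackend())
print(controller.provision("performance", 100))  # -> flash-vol-100gb
```

The point of the abstraction is the one the slides make: orchestrators and applications depend only on the controller's interface, so new storage technologies can be registered without changing the consumers.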
Big Data – Shining the Light on Enterprise Dark Data – Hitachi Vantara
Content stored for a business purpose often lacks the structure or metadata required to determine its original purpose. With Hitachi Data Discovery Suite and Hitachi Content Platform, businesses can uncover dark data that could be leveraged for better business insight and surface compliance issues before they become business risks. View this session and learn: What is enterprise dark data? How can enterprise dark data impact business decisions? How can you augment your underutilized data and deliver more value? How can you decrease the headaches and challenges created by dark data? For more information please visit: http://www.hds.com/products/file-and-content/
Webinar: Don't believe the hype, you don't need dedicated storage for VDI – NetApp
This webinar covers how the combination of SolidFire and Citrix XenDesktop enables customers to confidently support the storage demands of a virtual desktop environment in a multi-tenant or multi-application environment.
HDS Influencer Summit 2014: Innovating with Information to Address Business Needs – Hitachi Vantara
Top executives at HDS share how the company is innovating with information to address business needs. Learn how the company is transforming now and into the future. #HDSday
Object Storage 3: How to Use and Develop Applications Designed for Object Storage – Hitachi Vantara
In part 3 of 3, you will learn how to make object storage part of your existing environments, how to use object storage with the next generation of cloud-enabled applications, and how to develop applications for object storage. Join Wayzen Lin, director of technology for Hitachi content services, to explore using object storage with legacy and new applications and developing applications to work with object storage. View this WebTech to learn how to: Integrate legacy applications with object storage. Integrate Web 2.0 applications with object storage. Develop applications for use with object storage. For more information on object storage please view our white paper: http://www.hds.com/assets/pdf/hitachi-white-paper-introduction-to-object-storage-and-hcp.pdf
DataCore’s Fifth Annual State of Software-Defined Storage (SDS) Survey Reveals Surprising Lack of Spending on Big Data, Object Storage and OpenStack. In contrast, more than half of organizations polled (52 percent) look to extend the life of existing storage assets and future-proof their IT infrastructure with SDS in 2015.
On the other hand, this year’s report reveals several major business drivers for implementing Software-Defined Storage. 52 percent of respondents expect SDS will extend the life of existing storage assets and future-proof their storage infrastructure, enabling them to easily absorb new technologies. Close to half of respondents look to SDS to avoid hardware lock-in from storage manufacturers, while lowering hardware costs by allowing them to shop among several competing suppliers. Operationally, they see SDS simplifying management of different classes of storage by automating frequent or complex operations. This is notable in comparison with earlier surveys, as these results portray a sharp increase in the recognition of the economic benefits generated by SDS (reduced CAPEX), complementing the OPEX savings referenced in prior years.
Other surprises include: while flash technology penetration has expanded, flash is still absent in 28 percent of cases, and 16 percent of respondents reported that it did not meet application acceleration expectations. Also interesting is that 21 percent reported that highly touted hyper-converged systems did not perform as required or did not integrate well within their infrastructure. On the other hand, Software-Defined Storage and storage virtualization are deemed very urgent now, with 72 percent of organizations making important investments in these technologies throughout 2015. 81 percent also expect similar levels of spending on Software-Defined Storage technologies incorporated within server SANs / virtual SANs and converged storage solutions.
Hitachi Virtual Infrastructure Integrator (V2I) is a VMware vCenter plugin plus associated software. It provides data management efficiency for large VM environments. Specifically, the latest release addresses virtual machine backup, recovery and cloning services. Customers want to leverage storage-based snapshots because they are scalable and enable more granular backups, reducing the interval between backups from hours to minutes and improving RPO. VMworld 2015.
When to Select Hyperconverged – 2016 Sydney VMUG – Keith Townsend
From the 2016 Sydney VMUG, this is the accompanying slide deck in which Keith Townsend discusses both technical and non-technical requirements for hyperconverged infrastructure versus converged and bring-your-own. The video presentation can be found on YouTube:
https://www.youtube.com/watch?v=o5ClvQFIbmY
Accelerate the Business Value of Enterprise Storage – Hitachi Vantara
When it comes to enterprise storage, IT has always had to choose between features and cost. Ongoing tradeoffs between the best technologies to support business operations and an adequate budget to pay for them generally impede an organization’s ability to be competitive, innovative and cost efficient. The entry-enterprise storage market has opened up new opportunities for storage customers – and eliminated the need for tradeoffs. Join this webinar to understand how to accelerate business value with entry-enterprise storage systems and learn about the new Hitachi Data Systems offering, Hitachi Unified Storage VM. View this WebTech to: Understand the common tradeoffs and challenges within the entry-enterprise storage market. Understand the business value of new entry-enterprise offerings. Learn how Hitachi Unified Storage VM is bringing enterprise-level features to the midrange. For more information on Hitachi Unified Storage VM please visit: http://www.hds.com/products/storage-systems/hitachi-unified-storage-vm.html?WT.ac=us_mg_pro_husvm
The webinar series in collaboration with Veritas Technologies continues.
In this second session we looked at the Veritas Software Defined Storage solutions.
IT is today one of the business areas most affected by the exponential growth of data. Consequently, IT managers face rising costs and complexity when implementing storage solutions capable of containing the growth in data volume.
At the same time, they must choose solutions able to satisfy the ever-higher performance levels demanded by new business applications while keeping legacy applications functional.
Deploying high-performance NAS hardware or adopting an assortment of disparate storage solutions is no longer ideal from an economic and management standpoint. New technologies, developed precisely in response to the need for efficiency and cost containment, now make it possible to build infrastructures that maximize the use of the storage already present in the data center and enable the adoption of object storage solutions.
To that end, Veritas presents its line of Software Defined Storage solutions.
Hyperconvergence 101: A Crash Course in Redefining Your Infrastructure – ePlus
Is a hyperconverged infrastructure (HCI) the right choice for your data center? EMC partners with ePlus to help transform your data center. From assessment to implementation, ePlus can be a trusted guide to get your HCI solution up and running.
View the performance metrics that turned the heads of VMware, EMC, and NetApp at VMworld 2011.
See the reason why Nexenta is now the single biggest threat to legacy storage.
A Glimpse into Software Defined Data Center – Fung Ping
Note: This article is not published yet, it is for preview purpose. Interested publisher please contact hpfung1@gmail.com or hanping.fung@aeu.edu.my
A Glimpse into Software Defined Data Center
Abstract – Existing data centers today are not ready to support IT organizations in meeting ever-changing business demands. Hence, the next generation of data center, the Software Defined Data Center (SDDC), is being explored and is expected to come to the rescue. However, SDDC is relatively new since its inception in 2012, and there are different early interpretations of its definition, criteria, reference architecture and the values it brings. There is also limited literature on how an SDDC works. The objective of this study is to shed some light on an SDDC operational definition, criteria and reference architecture, to depict how an SDDC works in three scenarios, and to standardize the values it brings. Moreover, some factors to guide IT organizations in adopting SDDC are also discussed. This study takes a qualitative approach in which SDDC literature is reviewed and some SDDC IT professionals are interviewed. Lastly, limitations of the study, future research and a conclusion are also provided.
Software Defined Data Center (SDDC) is the next evolution of the underlying technology, where software delivers greater levels of intelligence and value, on top of standardized hardware.
The Future of Storage: EMC Software Defined Solution – RSD
EMC provides intelligent software-defined storage solutions that help organizations drastically reduce management overhead through automation across traditional storage silos and pave the way for rapid deployment of fully integrated next generation scale-out storage architectures.
Presentation of Executive Briefing, April 2015
Setting the Foundation for Data Center Virtualization Cisco Canada
Today the Data Centre is at the heart of IT and business innovations. As the Data Centre evolves from a pure cost centre to a strategic asset to achieve business goals, Cisco is evolving our Data Centre Architectural Framework to help accelerate IT Innovations that deliver better business value. Ed Bugnion will explain that to do this successfully it is important to have a reliable and strategic roadmap to establish the DC foundation, the virtualized services that support the key cloud capabilities such as automation, provisioning and metering.
Cloudian HyperStore offers 100% S3 compatibility for low-cost, scalable smart object storage.
With HyperStore 6.0, we are focused on bringing down operational costs so that you can more effectively track, manage, and optimize your data storage as you scale.
The Software-Defined Data Center - Dell and Cumulus Networks – Cumulus Networks
The software-defined data center has been rapidly identified as a key technology to better enable organizations of all sizes to achieve the affordable capacity and operational efficiency that the largest cloud operators enjoy.
To watch the on-demand webinar: http://go.cumulusnetworks.com/cumulus-SDDC
Today, CIOs are moving from being builders of apps and operators of data centers to becoming brokers of information services to the business. They're embracing new technologies and new service models that allow them to make IT faster, cheaper, and smarter, and make their companies more responsive and more competitive. Joel Kaufman, Senior Manager, VMware Technical Marketing at NetApp, explains how NetApp's clustered Data ONTAP fits into the software-defined storage discussion.
Cloud economics: design, capacity and operational concerns – Marcos García
Learn how to choose your e-commerce infrastructure and how to forecast its TCO with a simple model, including explanations of how public, private and hybrid cloud computing work.
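A simple TCO model of the kind this presentation describes can be sketched in a few lines: total cost of ownership is upfront capital expenditure plus recurring operating cost over a planning horizon. Every number below is an invented placeholder for illustration, not pricing or figures from the presentation.

```python
def tco(capex: float, monthly_opex: float, months: int) -> float:
    """Total cost of ownership: upfront spend plus recurring cost over a horizon."""
    return capex + monthly_opex * months


# Illustrative inputs only (placeholder numbers, not real vendor pricing).
private_cloud = tco(capex=120_000.0, monthly_opex=3_000.0, months=36)  # own hardware
public_cloud = tco(capex=0.0, monthly_opex=7_500.0, months=36)         # pay as you go

print(f"private: ${private_cloud:,.0f}")  # private: $228,000
print(f"public:  ${public_cloud:,.0f}")   # public:  $270,000
```

Even this toy version shows the shape of the decision: public cloud trades capital expense for higher recurring cost, so the horizon length and utilization assumptions dominate the comparison; a hybrid forecast would simply split workloads across both cost functions.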
In early March, Harbour IT hosted a breakfast session in conjunction with VMware – “vForum Wrap – All the best bits from VMware’s vForum 2010”.
Held in both the Norwest and Sydney offices, local customers were given a VMware update from guest speaker Bo Leksono. The presentation covered the latest VMware technology and the steps to follow on your journey to the cloud.
Cozystack: Free PaaS Platform and Framework for Building Clouds – Andrei Kvapil
With Cozystack, you can transform your bunch of servers into an intelligent system with a simple REST API for spawning Kubernetes clusters, Database-as-a-Service, virtual machines, load balancers, HTTP caching services, and other services with ease.
You can use Cozystack to build your own cloud or to provide cost-effective development environments.
How uCast is using AWS Media Services and the Power of the Cloud to Deliver G... – Amazon Web Services
Live streaming in sports and music continues to grow by double digits year over year. Both industries are challenged with complex rights management, scaling at cost, worldwide distribution and monetization business models. In this session, learn how uCast’s platform automates previously difficult and cost-prohibitive video operations that were unsustainable for premium video brands. uCast, together with AWS services, solves these challenges while delivering broadcast-grade video quality, content personalization, and geo and rights management while streaming thousands of live events each year.
Brian Lisi, CTO, uCast
What's New In Microsoft System Center 2016 & OMS – Asaf Nakash
In this presentation, we will cover the latest features of Microsoft System Center 2016 as well as the Operations Management Suite (OMS) and how connecting it to your organization will help you gain insight into your environment.
Hybrid Infrastructure Integration is an approach to connect on-premises IT resources with AWS and bridge processes, services, and technologies used in common enterprise customer environments. This session addresses connectivity patterns, security controls, account governance, and operations monitoring approaches successfully implemented in enterprise engagements. Infrastructure architects and IT professionals can get an overview of various integration types, approaches, methodologies, and common service patterns, helping them to better understand and overcome typical challenges in hybrid enterprise environments.
DevOps the NetApp Way: 10 Rules for Forming a DevOps TeamNetApp
Does your enterprise IT organization practice DevOps without a common team approach? To create a standardized way for development and operations teams to work together at NetApp, the IT team differentiates a DevOps team from a regular development team based on these 10 rules.
Spot Lets NetApp Get the Most Out of the CloudNetApp
Prior to NetApp acquiring Spot.io, two of its IT teams had adopted Spot in their operations: Product Engineering for Cloud Volumes ONTAP test automation and NetApp IT for corporate business applications. Check out the results in this infographic.
NetApp has fully embraced tools that allow for seamless, collaborative work from home, and as a result was fully prepared to minimize COVID-19's impact on how we conduct business. Check out this infographic for a look at results from the new remote work reality.
4 Ways FlexPod Forms the Foundation for Cisco and NetApp SuccessNetApp
At Cisco and NetApp, seeing our customers succeed in their digital transformations means that we’ve succeeded too. But that’s only one of the ways we measure our performance. What’s another way? Hearing how our wide-ranging IT support helps Cisco and NetApp thrive. Here’s what makes FlexPod an indispensable part of Cisco’s and NetApp’s IT departments.
With the widespread adoption of hybrid multicloud as the de-facto architecture for the enterprise, organizations everywhere are modernizing to deliver tangible business value around data-intensive applications and workloads such as AI-driven IoT and Hyperledgers. Shifting from on-premises to public cloud services, private clouds, and moving from disk to flash – sometimes concurrently – opens the door to enormous potential, but also the unintended consequence of IT complexity.
10 Reasons Why Your SAP Applications Belong on NetAppNetApp
NetApp has been supporting SAP for 20 years, delivering advanced solutions for SAP applications. Here are 10 reasons why your SAP applications belong on NetApp!
Redefining HCI: How to Go from Hyper Converged to Hybrid Cloud InfrastructureNetApp
The hyperconverged infrastructure (HCI) market is entering a new phase of maturity. A modern HCI solution requires a private cloud platform that integrates with public clouds to create a consistent hybrid multicloud experience.
During this webinar, NetApp and an IDC guest speaker covered what led to the next generation of hyperconverged infrastructure and the five capabilities required to go from hyperconverged to hybrid cloud infrastructure.
As we enter 2019, what stands out is how trends in business and technology are connected by common themes. For example, AI is at the heart of trends in development, data management, and delivery of applications and services at the edge, core, and cloud. Also essential are containerization as a critical enabling technology and the increasing intelligence of IoT devices at the edge. Navigating the tempests of transformation are developers, whose requirements are driving the rapid creation of new paradigms and technologies that they must then master in pursuit of long-term competitive advantage. Here are some of our perspectives and predictions for 2019.
Künstliche Intelligenz ist in deutschen Unternehmen ChefsacheNetApp
According to a recent survey by NetApp, the leading hybrid cloud data management specialist, artificial intelligence (AI) is becoming increasingly relevant in German companies.
Iperconvergenza come migliora gli economics del tuo ITNetApp
In this NetApp webinar, we present how NetApp HCI improves the economics of IT: accelerating and assuring performance for each application, simplifying your data center, making your architecture more scalable by reducing waste, implementing and expanding your HCI infrastructure quickly and inexpensively, making management even simpler and more intuitive, and saving time by using the skills you already have in the company.
NetApp IT’s Tiered Archive Approach for Active IQNetApp
NetApp AutoSupport technology proactively monitors the health of NetApp systems installed at customers' locations and provides 24/7 actionable intelligence to optimize their storage environments. The amount of data sent back to NetApp doubles approximately every 16 months. To manage the swelling waves of data to archive, NetApp IT sought a more flexible solution.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many features that provide convenience and capability do so at the expense of security. This best practices guide outlines steps users can take to better protect personal devices and information.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
DevOps and Testing slides at DASA ConnectKari Kakkonen
Slides by Rik Marselis and me from the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We closed with a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses.
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Removing Uninteresting Bytes in Software FuzzingAftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux tools: Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean, optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
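The core idea, pruning seed bytes whose removal leaves the program's observable behavior unchanged, can be sketched with a toy reducer. This is an illustrative simplification, not the published DIAR algorithm: the `coverage` function below is an invented stand-in for running the instrumented target and collecting edge coverage.

```python
# Toy sketch of pruning "uninteresting" seed bytes, in the spirit of DIAR:
# drop bytes whose removal leaves a coverage proxy unchanged.
# Illustrative only -- coverage() is an invented stand-in for executing the
# instrumented target and collecting edge coverage.

def coverage(seed: bytes) -> frozenset:
    """Pretend coverage signal: this toy 'parser' only reacts to digits
    and brackets, so only those bytes influence its behavior."""
    return frozenset(b for b in seed if b in b"0123456789[]")

def prune_uninteresting(seed: bytes) -> bytes:
    """Greedily remove each byte whose absence does not change coverage."""
    baseline = coverage(seed)
    out = bytearray(seed)
    i = 0
    while i < len(out):
        candidate = out[:i] + out[i + 1:]
        if coverage(bytes(candidate)) == baseline:
            out = candidate      # byte was uninteresting; drop it
        else:
            i += 1               # byte matters; keep it and move on
    return bytes(out)

seed = b"xx[12]yy[3]zz"
lean = prune_uninteresting(seed)
print(len(seed), "->", len(lean))  # the padding bytes are gone
```

A real implementation would, of course, replace the coverage proxy with actual instrumented executions, which is exactly the expensive step that makes starting from lean seeds pay off.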
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Enchancing adoption of Open Source Libraries. A case study on Albumentations.AIVladimir Iglovikov, Ph.D.
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster and former Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
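The directory-watcher trigger mentioned above follows a simple poll-and-diff pattern. The sketch below is a generic, hypothetical illustration of that trigger-to-action pattern, not FME code; `on_new_file` stands in for whatever action (such as running a workspace) the automation would fire.

```python
# Generic sketch of the trigger -> action pattern behind a directory watcher.
# Hypothetical illustration only; FME implements this inside its Automations
# engine rather than exposing it as user code.
import os
import tempfile

def watch_once(directory: str, seen: set, on_new_file) -> set:
    """One polling pass: fire the action for every file not seen before."""
    current = set(os.listdir(directory))
    for name in sorted(current - seen):
        on_new_file(os.path.join(directory, name))  # the "action"
    return current  # becomes 'seen' for the next pass

# Demo: collect new file paths instead of running a workspace.
with tempfile.TemporaryDirectory() as d:
    fired = []
    seen = watch_once(d, set(), fired.append)       # empty dir: no events
    open(os.path.join(d, "orders.csv"), "w").close()
    seen = watch_once(d, seen, fired.append)        # new file: one event
    print(fired)
```

A production watcher would run this pass on a schedule or use OS file-system notifications, but the diff-against-seen logic is the essence of the trigger.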
3. The Changing Role of IT
Everyone has to think like a service provider
4. Software Defined Data Center
An architectural model where resources are:
Delivered via API integration
Provisioned by policies
Defined in software
The promise:
Purchasers have more options
IT increases agility & operational efficiency
Users and app owners get services faster
5. Software Defined Storage
Application self-service
Autonomy & speed for app owners
Virtualized Storage Services
Efficiency, control, & automation
Provisioning via policies & service levels
Support Diverse Set of Hardware
Deploy on platform of choice
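The "provisioning via policies & service levels" bullet can be made concrete with a small sketch: an application owner requests a service level, and software resolves it to storage attributes. This is a vendor-neutral illustration; the catalog names and attribute values below are invented, not NetApp's.

```python
# Vendor-neutral sketch of policy-based provisioning: an app owner requests a
# service level, and software resolves it to concrete storage attributes.
# The catalog entries below are invented for illustration.
SERVICE_CATALOG = {
    "gold":   {"min_iops": 10000, "protection": "sync-mirror",  "media": "flash"},
    "silver": {"min_iops": 2000,  "protection": "async-mirror", "media": "hybrid"},
    "bronze": {"min_iops": 500,   "protection": "snapshots",    "media": "disk"},
}

def provision(app: str, capacity_gb: int, service_level: str) -> dict:
    """Resolve a service-level request into a provisioning specification."""
    try:
        policy = SERVICE_CATALOG[service_level]
    except KeyError:
        raise ValueError(f"unknown service level: {service_level}")
    return {"app": app, "capacity_gb": capacity_gb,
            "service_level": service_level, **policy}

spec = provision("crm-db", 500, "gold")
print(spec["protection"])  # the policy, not the admin, picked the protection
```

The point of the pattern is that the application owner never names SAN/NAS, media, or replication; those are derived from the policy.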
10. Clustered Data ONTAP
Software-Defined Storage
Deploy on
Platform of Choice
NetApp Hardware:
– Storage array (FAS) or Integrated Stack (FlexPod)
Non-NetApp Storage:
– V-Series storage virtualization
Commodity disks in x86 Servers:
– ONTAP Edge
Cloud:
– NetApp Private Storage for AWS
11. Clustered Data ONTAP
Software-Defined Storage
Virtualized
Storage Services
Multi-Vendor
Hardware
[Slide diagram: application VMs served by Storage Virtual Machines (SVMs)]
Application
Self-Service
12. Clustered Data ONTAP
Software-Defined Storage
Autonomy and
Self-Service
Empower the application owner & developer
Application storage services via OnCommand Plug-ins
and programmable APIs
Deep integrations with
[Slide diagram: application VMs served by Storage Virtual Machines (SVMs)]
13. Clustered Data ONTAP
[Slide diagram: application VMs served by Storage Virtual Machines (SVMs)]
Virtualized
Storage Services
Multi-Vendor
Hardware
Application
Self-Service
15. NetApp at Cisco Live! 2013
Learn how real companies use FlexPod to beat out competition
FlexPod as a Competitive Edge: How a Global Civil Engineering Firm Boosted
Data Center Performance While Cutting Overhead
• Date & Time: Wednesday, June 26th, 8:30 am – 9:20 am
• Session ID: BRKPCS-2027
• Location: Room 104 A
Speakers include:
Shawn McCullough, Director of IT, Moffatt & Nichol
Patrick Rogers, VP Data Center Platforms, NetApp
Mark Melvin, Chief Technology Officer, ePlus
Editor's Notes
In the past, CIOs have viewed their IT organizations as builders of services for the business. But with the explosion of new web services to reach customers, incredible growth in mobile devices, new technologies like flash, and the new paradigm of cloud computing, CIOs are dealing with more complexity than ever, and this is driving them to rethink their role. Today, CIOs are moving from being builders of apps and operators of data centers to becoming brokers of information services to the business. They're embracing new technologies and new service models that allow them to make IT faster, cheaper, and smarter, and make their companies more responsive and more competitive.
SDDC is an emerging architecture and set of technologies that build upon existing cloud and virtualization models. The design goal is to enable resources to be defined in software, provisioned based on policy, and deployable on any hardware; the innovation is to increase IT agility and operational efficiency while speeding delivery of services to application owners. So what we've seen emerge from the technology vendors, to advance the delivery of these services to the application owners and IT end users, is this concept of the Software Defined Data Center, where every device in the IT infrastructure will be consumed and managed through software-defined constructs such as service levels and quality of service, service policies, and chargeback mechanisms. This emerging architectural model is designed to speed delivery of IT services while improving operational and resource efficiency. Built on cloud and virtualization models, SDDC has three core characteristics: Resources (compute, networks, security, and data storage) are defined in software, allowing more efficient use and reuse of equipment. Resources are provisioned based on policies and service levels, allowing a mix of workloads to use a shared pool of hardware. Resources can be deployed on a variety of hardware, providing more choice in the purchasing process.
Software-Defined Storage, or SDS, is one of the elements in the SDDC model, along with Software-Defined Compute, Network, and Security. SDS is about doing for storage what server virtualization did for servers: breaking down the physical barriers that bind data to specific hardware, enabling greater operational efficiency, greater availability, and lower cost. At NetApp, we're excited by the market energy around SDS, because it's so consistent with the direction we've been taking with our storage operating system, Data ONTAP. For years we've been attacking the challenges of the hardware-centric, monolithic storage model, where data is imprisoned: its expense, complexity, and inflexibility. Our latest release, version 8.2 of Clustered Data ONTAP, has capabilities that match the three core tenets of Software-Defined Storage: It is integrated with applications to allow application admins direct access and control; it's a fully virtualized storage environment that allows provisioning based on SLAs, including performance, availability, and efficiency; and it's deployable across a range of hardware. Let me tell you a bit more about how ONTAP delivers Software-Defined Storage, and why this gets CIOs so excited.
It all begins with Virtualized Storage Services, defined in software….
It all begins with Virtualized Storage Services, defined in software. Clustered ONTAP is a highly virtualized storage operating system that abstracts all of the physical storage into a set of Storage Virtual Machines. This allows us to deliver natively multi-tenant, policy-based storage services, for SAN and NAS, with unmatched storage efficiency. Quality of Service and data protection can be configured and managed at the level of the Storage VM. This helps IT be more responsive, provision based on service level, and use policy-based management to operate more consistently. I'm hearing that more and more from customers: "Help me operate in a more disciplined and consistent way."
If you're a service guy, you want x capacity and x protocol; these are services. If you're not NetApp, you set up a JBOD box and get a bunch of LUNs from there. Services are things that enable the elements of a policy that are relevant to the application: I need this much capacity, this much performance. So the app owner wouldn't say "I need SAN/NAS" or "efficiency"; he'd ask for a service level tied to performance attributes at a particular price point. Policy attributes are things like performance, capacity, protection, and cost; users care about these. Storage service enablers are things like efficiency, protocols, multi-tenancy, mobility, and what hardware it's provisioned on. So in ONTAP, QoS is a core service; the storage admin cares about this. Highly virtualized storage service delivery engines make the service levels possible.
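Since QoS surfaces here as a core service configured per Storage VM, one way to picture a per-tenant I/O cap is a token bucket. The sketch below is a generic illustration of rate limiting, not how Data ONTAP implements QoS internally.

```python
# Generic token-bucket sketch of per-tenant (per-SVM) IOPS capping.
# Illustrative only -- not Data ONTAP's actual QoS implementation.
class TokenBucket:
    def __init__(self, iops_limit: int):
        self.rate = iops_limit           # tokens replenished per second
        self.capacity = float(iops_limit)
        self.tokens = float(iops_limit)  # bucket starts full

    def tick(self, seconds: float) -> None:
        """Refill tokens as time passes, up to the bucket's capacity."""
        self.tokens = min(self.capacity, self.tokens + self.rate * seconds)

    def try_io(self) -> bool:
        """Admit one I/O if a token is available, else throttle it."""
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per Storage VM gives each tenant an independent cap.
svm_qos = {"svm-finance": TokenBucket(1000), "svm-dev": TokenBucket(100)}
admitted = sum(svm_qos["svm-dev"].try_io() for _ in range(250))
print(admitted)  # only 100 of 250 back-to-back I/Os are admitted
```

Tying the bucket to the Storage VM rather than to a physical controller is what lets a noisy tenant be throttled without touching its neighbors.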
In keeping with the SDS model, Data ONTAP can be deployed on a range of hardware options…
In keeping with the SDS model, Data ONTAP can be deployed on a range of hardware options: It can be deployed on our own optimized FAS systems, where customers benefit from highly tuned hardware and flash integrations that let you manage hot data spikes, accelerate metadata, and do host-side data caching. It can be deployed on controllers over other storage vendors' systems through our V-Series storage virtualization. And (coming soon) it can be deployed on commodity x86 systems with ONTAP Edge. We also offer it on an integrated FlexPod system, and in the cloud through NetApp Private Storage for Amazon Web Services. This gives customers the purchasing flexibility to choose the platforms that meet their needs, with common software-defined storage capabilities across a wide range of platform choices.
Lastly is the powerful way we can automate service to the applications…
Lastly is the powerful way we can automate service to the applications. Application VMs reside on their Storage VMs and are managed by the application owner. Application-driven storage services, available through our OnCommand plug-ins and APIs, allow application owners to automatically provision, protect, and manage data directly through their application management tools. And we offer deep integrations into cloud stacks from VMware, Microsoft, Red Hat, Cisco, and OpenStack. With SDS based on Data ONTAP, application owners can dynamically respond to shifts in demand and instantly deploy new services. While others may talk about Software-Defined Storage as a "vision of things to come," Clustered ONTAP lets you realize that vision today. We believe it will make a huge difference to IT, and that it will bring new levels of agility to help CIOs reach their holy grail of "faster, cheaper, smarter." This new version of Clustered ONTAP is very exciting. And to tell you more about it, I'd like to introduce George Kurian, Senior Vice President for Data ONTAP strategy and development at NetApp.