One of the world's largest software companies needed secure and consistently high-performance storage for their business-critical data. Fibre Channel SANs met their requirements.
AOL manages over 20 petabytes of distributed storage across data centers to provide content to over 24 million users. They standardized on Fibre Channel to build a highly available and reliable storage solution. Fibre Channel allowed AOL to improve utilization rates, easily scale their infrastructure, and maintain high availability even when upgrading systems.
The document discusses how small to medium businesses have swung between using direct-attached storage (DAS) and storage area networks (SANs) for their VMware environments. It provides an example of a construction firm, Torcon, that converted their DAS setup into a SAN using ATTO technology to save costs and extend the usable life of their servers and storage. The conversion took less than three hours and provided benefits like isolated storage networking and easier capacity expansion. The document argues that SAS-based SANs provide performance and flexibility comparable to Fibre Channel SANs at a lower cost, making them suitable for cost-conscious small to medium businesses.
Presidio's Data Center Practice focuses on delivering advanced data center solutions through virtual data centers (VDCs) to help customers reduce costs and complexity while improving service levels. Presidio specializes in VMware, Cisco, and EMC technologies and can rapidly deploy VDCs using its expertise in server virtualization, virtual desktop infrastructure, converged networks, unified computing, storage, and backup/recovery solutions.
The document discusses application mobility across data centers using VMware VMotion, Cisco networking technologies, and NetApp storage solutions. It describes how VMotion can be used over long distances for business continuity and disaster recovery. A joint validation by VMware, Cisco and NetApp tested VMotion over 200km and found application performance degradation of less than 3%. Infrastructure requirements, configuration options, and best practices are provided to support long distance VMotion across data centers.
Data center 2.0: Data center built for private cloud, by Mr. Cheng Che Hoo of ... (HKISPA)
The Chinese University of Hong Kong is facing challenges in upgrading its aging and decentralized IT infrastructure to support its growing student population. It plans to address this by building a new, larger and more advanced tier 3 data center with 800kVA capacity for high availability. It will also consolidate servers through virtualization, improve storage integration, and provide centralized backup and new infrastructure services like virtual machines on a subscription basis. This will help optimize resource sharing and costs while supporting the university's dynamic computing needs.
The Next Wave of 10GbE webcast with Crehan Research was held on 10/5 and focused on current and future 10GbE adapter and switch market drivers and adoption trends, and the effects of the introduction of 10GBASE-T products on the overall 10GbE market.
ENEA Wytwarzanie, a large Polish utility company, implemented a Cisco FabricPath network to virtualize their data center infrastructure and support mission-critical power monitoring applications in an active-active configuration across two data centers. The Cisco FabricPath solution improved application performance, simplified management, and helped ENEA Wytwarzanie meet regulatory compliance requirements by ensuring high availability of their systems through improved disaster recovery processes. The new network architecture provided the reliability, scalability, and flexibility needed to support ENEA Wytwarzanie's virtualized environment and critical energy industry applications.
Iron Networks builds turnkey converged cloud infrastructure platforms optimized for hybrid cloud deployments using industry-standard hardware. These platforms provide cost-effective and scalable solutions for enterprises and service providers to build private and public clouds. Iron Networks offers pre-configured and pre-validated platforms for general infrastructure as a service and specialized workloads, reducing the cost and time of deploying these technologies.
EMC SAN provides benefits such as high availability and manageability, improved application performance through dedicated storage networks, fast scalability through centralized storage, and better data replication and recovery options. Case studies show that EMC SAN solutions can help businesses reduce costs through storage consolidation, improve business continuity through centralized data management, and increase business flexibility to support growth. EMC SAN migration services help minimize business impact through detailed planning and the elimination of downtime during implementation.
The document provides an overview of Macroview Solution's data center virtualization offerings. It discusses their technology partners including VMware, Cisco, Citrix, Microsoft, and NetApp. It then summarizes their service catalog including virtualization, compute, storage, virtual desktop, enterprise mobility, disaster recovery, and multi-cloud capabilities. Specific storage solutions from NetApp are highlighted including all-flash arrays, snapshots, cloning, deduplication, encryption, quality of service, and data replication technologies.
Programmable I/O Controllers as Data Center Sensor Networks (Emulex Corporation)
This is a presentation on 'Programmable I/O Controllers as Data Center Sensor Networks', presented by Shaun Walsh and Sanjeev Datla at the Storage Developer's Conference in October 2011.
The document discusses the evolution of utility services and new data center models, including:
- The cloud will become a new utility similar to water and electricity, providing massive scale and lower costs through modular solutions.
- New data center models are emerging for small/medium businesses, hybrid enterprise data centers, and hyper-scale service providers.
- Trends driving the cloud include new server deployment models, the emergence of "the other x86 market" of large cloud service providers, and focus on network optimized delivery, cloud optimized clients, and green/commoditized infrastructure.
- The cloud can be constructed using new "lego blocks" including data center containers, network edge solutions, and application/network
Data is being generated at rates never before encountered. The explosion of data threatens to consume all of our IT resources: People, budget, power, cooling and data center floor space. Are your systems coping with your data now? Will they continue to deliver as the stress on data centers increases and IT budgets dwindle?
Imagine if you could be ahead of the data explosion by being proactive about your storage instead of reactive. Now you can be, with NetApp's approach to the designs and deployment of storage systems. With it, you can take advantage of NetApp's latest storage enhancements and take control of your storage. This will allow you to focus on gathering more insights from your data and deliver more value to your business.
NetApp's most advanced storage solutions are NetApp Virtualization and Scale-Out. By taking control of your existing storage platform with either solution, you get:
• An "immortal" storage system
• Infinite scalability
• The best possible ROI from your existing environment
The Emulex Advanced Development Organization offers an in-depth analysis of how Emulex OneConnect Adapters quadruple the performance over 1GbE networks for Hadoop cluster environments, addressing the 'Big Data' performance needs of cloud providers and users. Traditional 1GbE networks have not kept pace with the growth of Big Data – Emulex offers an ideal solution.
This document provides an overview of software-defined storage (SDS) concepts and discusses several SDS solutions from major vendors. It defines SDS and explains how adding a control layer allows for visibility, communication, and allocation of storage resources. Benefits highlighted include efficiency, automation, flexibility, scalability, reliability and cost savings. Specific SDS products are then profiled from vendors such as EMC, HP, IBM, NetApp, VMware, Coraid, DataCore, Dell, Hitachi, Pivot3, and RedHat.
Fortissimo converged super_converged_hyper (Emilio Billi)
Fortissimo Foundation introduces a revolutionary converged computing architecture that removes layers of inefficiency in the data path. By consolidating server nodes and allowing direct hardware access, it can deliver 10-100x higher performance than existing solutions at a fraction of the cost. The architecture introduces no virtualization overhead, enabling ultra-low latency access and linear scalability for both virtual and non-virtual workloads. This makes it suitable for converged analytics, supercomputing and hyper-computing applications.
RONNIEE Express: A Dramatic Shift in Network Architecture (inside-BigData.com)
In this slidecast, Emilio Billi from A3 Cube presents an overview of the company's RONNIEE Express network architecture.
"RONNIEE Express is a new High-Performance Cluster and data plane Interconnect based on a disruptive pure memory-mapped communication paradigm."
Learn more: http://www.a3cube-inc.com
Watch the video presentation: http://insidehpc.com/2014/02/25/ronniee-express-dramatic-shift-network-architecture/
ScaleIO is software that creates a server-based storage area network (SAN) using local storage drives. It provides elastic scaling of capacity and performance on demand across server nodes. Data is distributed across nodes for high performance parallelism. Additional servers and storage can be added non-disruptively to scale out the system.
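The elastic scale-out behavior described above can be sketched in a few lines: data chunks are hash-placed across server nodes, and adding a node migrates only the chunks whose placement changes. This is an illustrative toy model, not ScaleIO's actual placement algorithm; the class and method names are invented for the example.

```python
import hashlib

class ElasticPool:
    """Toy model of a server-based storage pool: data chunks are spread
    across all nodes, and adding a node triggers a rebalance.
    (Illustrative only -- not ScaleIO's actual placement scheme.)"""

    def __init__(self, nodes):
        self.nodes = list(nodes)
        self.placement = {}          # chunk_id -> node

    def _pick_node(self, chunk_id):
        # Deterministic hash-based placement across the current node set.
        h = int(hashlib.md5(str(chunk_id).encode()).hexdigest(), 16)
        return self.nodes[h % len(self.nodes)]

    def write(self, chunk_id):
        self.placement[chunk_id] = self._pick_node(chunk_id)

    def add_node(self, node):
        # Non-disruptive scale-out: add the node, then move only the
        # chunks whose hash now maps to a different node.
        self.nodes.append(node)
        moved = 0
        for chunk_id in self.placement:
            target = self._pick_node(chunk_id)
            if target != self.placement[chunk_id]:
                self.placement[chunk_id] = target   # background migration
                moved += 1
        return moved

pool = ElasticPool(["node1", "node2", "node3"])
for c in range(1000):
    pool.write(c)
moved = pool.add_node("node4")
print(f"rebalanced {moved} of 1000 chunks onto the expanded pool")
```

A production system would use a placement scheme that minimizes data movement (the simple modulo hash above moves more chunks than strictly necessary) and would replicate each chunk for availability, but the shape of the operation is the same: capacity and performance grow with each node added, with migration happening in the background.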
DataCore’s Fifth Annual State of Software-Defined Storage (SDS) Survey Reveals Surprising Lack of Spending on Big Data, Object Storage and OpenStack. In contrast, more than half of organizations polled (52 percent) look to extend the life of existing storage assets and future-proof their IT infrastructure with SDS in 2015.
On the other hand, this year’s report reveals several major business drivers for implementing Software-Defined Storage. 52 percent of respondents expect SDS will extend the life of existing storage assets and future-proof their storage infrastructure, enabling them to easily absorb new technologies. Close to half of respondents look to SDS to avoid hardware lock-in from storage manufacturers, while lowering hardware costs by allowing them to shop among several competing suppliers. Operationally, they see SDS simplifying management of different classes of storage by automating frequent or complex operations. This is notable in comparison with earlier surveys, as these results portray a sharp increase in the recognition of the economic benefits generated by SDS (reduced CAPEX), complementing the OPEX savings referenced in prior years.
Other surprises include: while flash technology penetration has expanded, it is still absent in 28 percent of cases, and 16 percent reported that it did not meet application acceleration expectations. Also interesting is that 21 percent reported that highly touted hyper-converged systems did not perform as required or did not integrate well within their infrastructure. On the other hand, Software-Defined Storage and storage virtualization are deemed very urgent now, with 72 percent of organizations making significant investments in these technologies throughout 2015. 81 percent also expect similar levels of spending on Software-Defined Storage technologies incorporated within server SANs / virtual SANs and converged storage solutions.
The document provides an overview of the new NetApp FAS2240 storage system. Key points:
1) The FAS2240 controllers plug directly into NetApp disk shelves, allowing for a fully redundant storage system in just 2U of rack space.
2) The FAS2240 uses the same Intel processors and 64-bit architecture as NetApp's mid-range and high-end systems, improving compatibility.
3) Optional mezzanine cards allow the FAS2240 to support 10GbE and 8Gb Fibre Channel connectivity for high performance.
EMC ViPR HDFS Data Service technical overview (solarisyougood)
This document provides an overview of EMC's ViPR HDFS Data Service. Key points include:
1) ViPR HDFS allows users to leverage existing storage infrastructure as an HDFS data repository or "data lake" without needing dedicated analytics clusters.
2) It addresses limitations of off-the-shelf HDFS and brings HDFS capabilities to existing storage hardware, enabling HDFS, object, and file-based scenarios from a single platform.
3) ViPR HDFS provides an HDFS-compatible interface but replaces name nodes to eliminate single points of failure and uses ViPR's object storage engine for high scale.
This white paper provides a detailed overview of the EMC ViPR Services architecture, a geo-scale cloud storage platform that delivers cloud-scale storage services, global access, and operational efficiency at scale.
The Future of Storage: EMC Software-Defined Solution (RSD)
EMC provides intelligent software-defined storage solutions that help organizations drastically reduce management overhead through automation across traditional storage silos and pave the way for rapid deployment of fully integrated next generation scale-out storage architectures.
Presentation of Executive Briefing, April 2015
ClearSky - Value to Managed Service Providers (rbcummings)
The document discusses ClearSky, a data storage and protection service that delivers primary storage, backup, and disaster recovery as an on-demand, multi-tenant service. It allows customers to pay for their data once and access it anywhere, on-premises or in the cloud. ClearSky helps reduce costs for MSPs by up to 50% by eliminating separate storage silos for primary, secondary, and DR and offering a consumption-based pricing model. It provides integrated primary storage, offsite backup, and DR in one service to help MSPs acquire more customers.
IBM FlashSystem and other SSDs are being adopted for OLTP and analytics applications. Fast 16Gb flash storage requires a reliable, high-performance network to ensure applications can utilize it effectively. Learn how to plan a high-speed, reliable network that handles the increased demands while delivering consistent application response times. Understand the reliability, performance, and simplified management features of Gen5 FC and Fabric Vision. Be prepared for the next jump in SANs.
The document provides an overview of Demartek's 16Gb Fibre Channel Deployment Guide. It discusses the history and progression of Fibre Channel technology. The guide is intended to provide information and guidance for planning and deploying 16Gb Fibre Channel solutions, focusing on virtualized environments. It covers topics such as Fibre Channel technologies, virtualized deployment, performance measurement, best practices, and real-world deployment examples.
The Role of Fibre Channel in Server Virtualization (TheFibreChannel)
Fibre Channel is the most widely deployed solution for connecting highly virtualized servers to storage because it provides low latency, high performance, and industry-leading bandwidth to support applications and server consolidation. It also offers flexible architectural designs and topologies to reduce cabling infrastructure while NPIV provides VM-level IO visibility and isolation. Additionally, its rich yet simple management aids in implementing, tuning, and troubleshooting storage networks.
- Instrumentation allows SAN events to be addressed proactively through reliable metrics like CRC errors, code violations, and class 3 discards to predict and avoid application slowdowns.
- Exchange completion time (ECT) can definitively prove where a slowdown originates by tracking latency at the initiator-target-LUN level in real-time.
- VirtualWisdom provides comprehensive visibility across the heterogeneous SAN through its probes, enabling proactive optimization through real-time root cause analysis of issues.
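The ECT idea above can be illustrated with a short sketch: pair each SCSI command with its completing response and average the latency per (initiator, target, LUN) flow, so the slow flow identifies itself. The event schema and field names below are hypothetical, invented for the example; they are not VirtualWisdom's actual data model.

```python
from collections import defaultdict

def exchange_completion_times(events):
    """Pair each command with its completing response and compute average
    latency per flow, keyed by (initiator, target, lun).
    `events` is a list of dicts with hypothetical fields:
      kind: 'cmd' or 'rsp', exch: exchange id,
      itl: (initiator, target, lun), t: timestamp in ms."""
    open_cmds = {}                # exchange id -> (itl, command start time)
    per_flow = defaultdict(list)  # itl -> list of completion times
    for ev in sorted(events, key=lambda e: e["t"]):
        if ev["kind"] == "cmd":
            open_cmds[ev["exch"]] = (ev["itl"], ev["t"])
        elif ev["exch"] in open_cmds:
            itl, start = open_cmds.pop(ev["exch"])
            per_flow[itl].append(ev["t"] - start)
    return {itl: sum(v) / len(v) for itl, v in per_flow.items()}

events = [
    {"kind": "cmd", "exch": 1, "itl": ("hostA", "array1", 0), "t": 0.0},
    {"kind": "cmd", "exch": 2, "itl": ("hostB", "array1", 3), "t": 0.5},
    {"kind": "rsp", "exch": 1, "itl": ("hostA", "array1", 0), "t": 1.2},
    {"kind": "rsp", "exch": 2, "itl": ("hostB", "array1", 3), "t": 9.5},
]
avg = exchange_completion_times(events)
slowest = max(avg, key=avg.get)
print(slowest, avg[slowest])   # the flow where the slowdown originates
```

Because the latency is attributed per initiator-target-LUN tuple rather than per link, a slowdown on one LUN stands out even when aggregate port counters look healthy, which is the point made above about definitively proving where a slowdown originates.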
Symantec is not only a software company; it also buys and sells companies to maintain its market-leading position and expand its business. This report examines how Symantec financed this activity by issuing convertible bonds.
Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16G... (Emulex Corporation)
This webinar covers the improvements in storage I/O throughput and CPU efficiency that VMware vSphere gains when using an Emulex 16Gb Fibre Channel Host Bus Adapter (HBA) versus the previous generation HBA. Applications virtualized on VMware vSphere 5.1 that generate storage I/O of various block sizes can take full advantage of 16Gb Fibre Channel wire speed for better sequential and random I/O performance.
This document presents the Fibre Channel Speedmap which outlines past, present, and future Fibre Channel speeds and standards. It includes sections on FC, ISL (Inter-Switch Link), and FCoE (Fibre Channel over Ethernet) technologies. For each speed or technology, it provides the product naming, throughput in Mbytes/s, line rate in Gbaud, relevant T11 specification, year of technical completion, and estimated year of market availability. Speeds range from the initial 1GFC up to potential future speeds of 1TFC and beyond. It establishes that each new speed is backward compatible with at least the previous two generations.
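The Speedmap's relationship between line rate (Gbaud), line encoding, and usable throughput can be checked with a little arithmetic: usable rate is the line rate times the encoding's payload fraction. The sketch below uses the published nominal figures (8GFC at 8.5 Gbaud with 8b/10b encoding; 16GFC at 14.025 Gbaud with 64b/66b) and ignores framing and protocol overhead, which is why the results come out slightly above the marketed per-direction ratings. The switch to the more efficient 64b/66b encoding is why 16GFC doubles 8GFC's data rate without doubling its line rate.

```python
def usable_mb_per_s(gbaud, payload_bits, total_bits):
    """Approximate single-direction data rate: line rate times the
    encoding's payload fraction, divided by 8 bits per byte.
    (Nominal figures; ignores framing and protocol overhead.)"""
    return gbaud * 1e3 * payload_bits / total_bits / 8   # decimal MB/s

# 8GFC: 8.5 Gbaud with 8b/10b encoding (8 payload bits per 10 line bits)
print(round(usable_mb_per_s(8.5, 8, 10)))      # prints 850 (rated 800 MB/s)
# 16GFC: 14.025 Gbaud with 64b/66b encoding
print(round(usable_mb_per_s(14.025, 64, 66)))  # prints 1700 (rated 1600 MB/s)
```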
Learn about the IBM Flex System FC5052 2-port and FC5054 4-port 16Gb FC Adapters. The IBM Flex System FC5052 2-port and FC5054 4-port 16Gb FC Adapters enable the highest FC speed access for Flex System compute nodes to an external storage area network (SAN). These adapters are based on the proven Emulex Fibre Channel stack, and work with 16 Gb Flex System Fibre Channel switch modules. For more information on Pure Systems, visit http://ibm.co/18vDnp6.
Visit the official Scribd Channel of IBM India Smarter Computing at http://bit.ly/VwO86R to get access to more documents.
Next Generation Storage Networking for Next Generation Data CentersTheFibreChannel
Learning Objectives
What is the future of Fibre Channel and Ethernet Storage?
What I/O bandwidth capabilities are available with the new crop of servers?
Share some performance data from the Demartek lab
This document provides a summary and comparison of various storage interface types, including their maximum transfer rates, attributes, cable types, and distances supported. It also includes tables comparing the interfaces and notes that the document will be periodically updated with additional information. Demartek analyzes storage, server, and networking technologies through hands-on testing and research.
The document discusses Cisco's innovations in storage networking, including their portfolio of Fibre Channel and converged Ethernet switches. Some key points discussed are:
- Cisco is introducing their new MDS 9396S 96-port 16G Fibre Channel switch and expanding 16G FC support across their MDS family for flash storage environments.
- Their MDS 9700 directors provide the highest performance FC switching with support for 384 16G ports, and they are designed to easily support future 32G FC and 40G FCoE protocols.
- Cisco is bringing 40G FCoE support to their Nexus 7700/7000 series switches to converge IP and Fibre Channel storage networking over 10/40G Ethernet fabrics
HDS-Brocade Joint Solutions Reference GuideSteve Lee
Hitachi-Brocade Joint Solutions Reference Guide provides an overview of joint solutions between Hitachi and Brocade focused on virtualization, cloud, and data center solutions. Key solutions highlighted include VMware site recovery, Hyper-V live migration over distance, unified compute platforms, block and file storage, and core platforms including embedded and FC adapters. The guide also provides contact information for Hitachi and Brocade field sales and technical support staff.
SDN & NFV: Driving Additional Value into Managed ServicesTBI Inc.
From offering seamless scalability to providing best-in-class security, SDN and NFV are driving value into managed services. Discover how these technologies combine to build a simple, more agile infrastructure at a significantly lower cost. Offer NFV, SDN, and other products from best-of-breed provider NTT through TBI.
Value Journal, a monthly news journal from Redington Value Distribution, intends to update the channel on the latest vendor news and Redington Value’s Channel Initiatives.
Key stories from the May Edition:
• Dell EMC takes open networking to the edge for next-generation access.
• Microsoft announces new intelligent security innovations
• Powering partners’ transformation - Pradeep Kumar, Principal Consultant, Security, Redington Value.
• Redington Value signs new vendor partnerships.
• Fortinet delivers integrated NOC-SOC solution.
• HPE to acquire Cape Networks.
• Malwarebytes introduces endpoint protection and response solution.
• Mimecast offers cyber resilience for email with new capabilities.
• SonicWall announces Capture Cloud Platform.
• Veritas revamps channel program to drive partner growth.
• Red Hat announces availability of Red Hat Storage One.
• AWS announces Amazon S3 One Zone-Infrequent Access(Z-IA).
VM Farms Thrive with Dedicated IP Storage NetworksBrocade
Is your VM farm in hyper growth? Is it slowing you down? See how to boost performance for VM farms with a dedicated IP storage network from Brocade and EMC. Benefit from a network that keeps up with apps and delivers on customer SLAs.
F5's application traffic management products can enhance web server deployments in several ways:
1) They provide load balancing, security, and optimization to improve performance, availability, and scalability of web servers.
2) Features like intelligent compression, caching, and TCP optimization offload processing from web servers to increase efficiency.
3) Solutions like the BIG-IP system and FirePass controller ensure secure remote access to web applications and content.
The petrochemical company Innovene successfully migrated from an MPLS/frame relay network to a public Internet solution provided by Science Applications International Corporation, reducing its total network cost of ownership. The solution uses Cisco routers and IPsec encryption to provide a secure VPN across multiple carriers. While public Internet poses security and performance risks, Innovene addresses these through encryption, firewalls, intrusion prevention systems, and the ability to change carriers if needed. The migration was completed successfully and provides Innovene with a more flexible and cost-effective global network.
IRJET- Effective Privacy based Distributed Storage StructureIRJET Journal
This document summarizes a research paper that proposes an effective and secure distributed storage structure for cloud computing. The proposed structure classifies user data into three categories based on importance. More important data is encrypted with stronger encryption algorithms and larger key sizes, while less important data uses lighter encryption. This reduces complexity and cost compared to encrypting all data with the same encryption. The structure aims to provide security for data transmission and storage in the cloud while balancing efficiency. It was found to have better performance than other existing cloud storage structures.
PCTY 2012, Tivoli Storage Strategi og Portfolio Update v. Greg TevisIBM Danmark
The document provides an overview and strategy for IBM Tivoli Storage. It discusses IBM's Tivoli Storage strategy, including moving to a cloud strategy with consumption-based metering and dynamic capacity optimization. It summarizes updates to products in the Tivoli Storage portfolio, including enhancements to IBM TSM for improved data reduction, virtualization support, user experience, disaster recovery, scalability, platform coverage, and licensing.
High Availability Disaster Recovery Customer Success Stories[1]Michael Hudak
The document summarizes several customer success stories that used NetApp business continuity products like SnapMirror and MetroCluster. It provides brief overviews of how various companies from industries like energy, real estate, healthcare, and retail used NetApp solutions to strengthen data protection, improve disaster recovery capabilities, enhance storage management and performance, and gain business benefits.
EmaxIT International helped Modern Architecture Contracting Company (MACC) take its business and security performance to new heights by migrating its hosting platform and applications to Amazon Web Services (AWS). This consolidated MACC's disparate applications onto a single, highly available and scalable cloud platform. It also automated backups, enhanced security through measures like web application firewalls, and reduced operational costs by 30% while increasing efficiency by 20%. EmaxIT provided fully managed AWS services with 24/7 support.
Expedient provides IT infrastructure and managed services to businesses. To offer more customized security, Expedient deployed Juniper's vSRX virtual firewall, allowing each customer their own firewall. This allows Expedient to quickly provision tailored security services in minutes rather than hours. Over 50 customers now use vSRX, allowing Expedient to introduce new services faster and reducing workload.
The document discusses Datacomm Cloud Business, a division of PT. Datacomm Diangraha focusing on cloud opportunities in Indonesia, with over 28 years of experience and 500 employees. It outlines their cloud and security products and services, including infrastructure, data, and network services provided with local data center support and international certifications. The remainder of the document discusses architectural approaches, availability, security, and performance-related topics like scalability, disaster recovery, and content delivery networks.
VMware: Innovate and Thrive in the Mobile-Cloud EraVMware
The document discusses how organizations can evolve their IT infrastructure to deliver IT as a service (ITaaS) in the mobile-cloud era. It describes how VMware solutions such as vSphere, NSX, and vCloud Suite provide the foundation for organizations to build software-defined data centers and hybrid clouds. This allows organizations to virtualize their infrastructure, automate operations, and seamlessly extend their data centers to the public cloud in order to empower users, increase business agility, and accelerate innovation.
This document describes IBM's Application Acceleration managed services which help companies accelerate web applications and strengthen security. The services include the WebSphere DataPower Edge Appliance, which consolidates functions and simplifies application infrastructure deployment. IBM Application Accelerator for Hybrid Networks accelerates traffic between public clouds/SaaS and enterprise users. The services cache content close to users to improve performance and offload backend servers, while identifying fast internet paths and optimizing network communication.
Swisscom AG implemented application performance monitoring using Brocade's Analytics Monitoring Platform to gain visibility into application performance across its large, complex storage network supporting many enterprise customers. This identified oversubscribed ports causing high latency, allowing Swisscom to redistribute traffic and reduce peak latency by over 90%. It reduced troubleshooting time from 30 days to under 10 minutes and improved reporting and assurance of SLA compliance for customers.
Cloud Security: A Brief Journey through the Revolutionary Technologyrosswilks1
Vanderbilt cloud solutions such as ACT365 and SPC Connect are proving popular for security installers due to their adaptability and flexibility. The concept of cloud computing has existed for decades, originating from John McCarthy's idea of time-sharing in the 1960s. Today cloud spending is growing at 12.5% annually and is integrated into everyday activities. Cloud solutions offer convenience and simplicity for businesses, reducing costs compared to maintaining on-site servers. They provide remote access and management capabilities. While some have security concerns about the cloud, providers like Vanderbilt implement strong encryption to protect customer data.
Draft Juniper Experience First Networking | SlideshareSelena829218
The document summarizes Juniper's experience-first networking solutions which include AI-driven wired and wireless access solutions, Session Smart SD-WAN, automated WAN solutions, and cloud-ready data centers. These solutions are driven by Mist AI to provide visibility, automation, and assurance from client to cloud to optimize user experiences. Key benefits include proactive automation through AIOps, service level monitoring and enforcement, anomaly detection, and AI-driven actions.
The document discusses choosing a secondary storage solution for G&J Consultation Sdn Bhd, which is facing performance bottlenecks impacting business productivity. It considers network attached storage (NAS) which offers benefits like easy setup, ease of use, scalability, and reliability. NAS allows for file sharing across a network and can improve performance by offloading tasks from email servers. The document also discusses factors to consider in secondary storage like performance, transfer speeds, scalability, and reliability.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?Speck&Tech
ABSTRACT: A prima vista, un mattoncino Lego e la backdoor XZ potrebbero avere in comune il fatto di essere entrambi blocchi di costruzione, o dipendenze di progetti creativi e software. La realtà è che un mattoncino Lego e il caso della backdoor XZ hanno molto di più di tutto ciò in comune.
Partecipate alla presentazione per immergervi in una storia di interoperabilità, standard e formati aperti, per poi discutere del ruolo importante che i contributori hanno in una comunità open source sostenibile.
BIO: Sostenitrice del software libero e dei formati standard e aperti. È stata un membro attivo dei progetti Fedora e openSUSE e ha co-fondato l'Associazione LibreItalia dove è stata coinvolta in diversi eventi, migrazioni e formazione relativi a LibreOffice. In precedenza ha lavorato a migrazioni e corsi di formazione su LibreOffice per diverse amministrazioni pubbliche e privati. Da gennaio 2020 lavora in SUSE come Software Release Engineer per Uyuni e SUSE Manager e quando non segue la sua passione per i computer e per Geeko coltiva la sua curiosità per l'astronomia (da cui deriva il suo nickname deneb_alpha).
Salesforce Integration for Bonterra Impact Management (fka Social Solutions A...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on integration of Salesforce with Bonterra Impact Management.
Interested in deploying an integration with Salesforce for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Your One-Stop Shop for Python Success: Top 10 US Python Development Providersakankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of a presentation to the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on Communications and signalling systems on Railways, which was held in Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). Attended by around 500 participants and 200 on-line followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
Project Management Semester Long Project - Acuityjpupo2018
Acuity is an innovative learning app designed to transform the way you engage with knowledge. Powered by AI technology, Acuity takes complex topics and distills them into concise, interactive summaries that are easy to read & understand. Whether you're exploring the depths of quantum mechanics or seeking insight into historical events, Acuity provides the key information you need without the burden of lengthy texts.
Webinar: Designing a schema for a Data WarehouseFederico Razzoli
Are you new to data warehouses (DWH)? Do you need to check whether your data warehouse follows the best practices for a good design? In both cases, this webinar is for you.
A data warehouse is a central relational database that contains all measurements about a business or an organisation. This data comes from a variety of heterogeneous data sources, which includes databases of any type that back the applications used by the company, data files exported by some applications, or APIs provided by internal or external services.
But designing a data warehouse correctly is a hard task, which requires gathering information about the business processes that need to be analysed in the first place. These processes must be translated into so-called star schemas, which means, denormalised databases where each table represents a dimension or facts.
We will discuss these topics:
- How to gather information about a business;
- Understanding dictionaries and how to identify business entities;
- Dimensions and facts;
- Setting a table granularity;
- Types of facts;
- Types of dimensions;
- Snowflakes and how to avoid them;
- Expanding existing dimensions and facts.
Digital Marketing Trends in 2024 | Guide for Staying AheadWask
https://www.wask.co/ebooks/digital-marketing-trends-in-2024
Feeling lost in the digital marketing whirlwind of 2024? Technology is changing, consumer habits are evolving, and staying ahead of the curve feels like a never-ending pursuit. This e-book is your compass. Dive into actionable insights to handle the complexities of modern marketing. From hyper-personalization to the power of user-generated content, learn how to build long-term relationships with your audience and unlock the secrets to success in the ever-shifting digital landscape.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfChart Kalyan
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU und die Lizenzen nach dem CCB- und CCX-Modell sind für viele in der HCL-Community seit letztem Jahr ein heißes Thema. Als Notes- oder Domino-Kunde haben Sie vielleicht mit unerwartet hohen Benutzerzahlen und Lizenzgebühren zu kämpfen. Sie fragen sich vielleicht, wie diese neue Art der Lizenzierung funktioniert und welchen Nutzen sie Ihnen bringt. Vor allem wollen Sie sicherlich Ihr Budget einhalten und Kosten sparen, wo immer möglich. Das verstehen wir und wir möchten Ihnen dabei helfen!
Wir erklären Ihnen, wie Sie häufige Konfigurationsprobleme lösen können, die dazu führen können, dass mehr Benutzer gezählt werden als nötig, und wie Sie überflüssige oder ungenutzte Konten identifizieren und entfernen können, um Geld zu sparen. Es gibt auch einige Ansätze, die zu unnötigen Ausgaben führen können, z. B. wenn ein Personendokument anstelle eines Mail-Ins für geteilte Mailboxen verwendet wird. Wir zeigen Ihnen solche Fälle und deren Lösungen. Und natürlich erklären wir Ihnen das neue Lizenzmodell.
Nehmen Sie an diesem Webinar teil, bei dem HCL-Ambassador Marc Thomas und Gastredner Franz Walder Ihnen diese neue Welt näherbringen. Es vermittelt Ihnen die Tools und das Know-how, um den Überblick zu bewahren. Sie werden in der Lage sein, Ihre Kosten durch eine optimierte Domino-Konfiguration zu reduzieren und auch in Zukunft gering zu halten.
Diese Themen werden behandelt
- Reduzierung der Lizenzkosten durch Auffinden und Beheben von Fehlkonfigurationen und überflüssigen Konten
- Wie funktionieren CCB- und CCX-Lizenzen wirklich?
- Verstehen des DLAU-Tools und wie man es am besten nutzt
- Tipps für häufige Problembereiche, wie z. B. Team-Postfächer, Funktions-/Testbenutzer usw.
- Praxisbeispiele und Best Practices zum sofortigen Umsetzen
OpenID AuthZEN Interop Read Out - AuthorizationDavid Brossard
During Identiverse 2024 and EIC 2024, members of the OpenID AuthZEN WG got together and demoed their authorization endpoints conforming to the AuthZEN API
Nordic Marketo Engage User Group_June 13_ 2024.pptx
Symantec Case Study

Executive Summary
When Symantec, a world leader in consumer and data protection, was looking to secure and provide consistent access to business-critical data, a top requirement was to accommodate the company's future data growth. Fibre Channel was able to provide all the answers.

Challenges:
- Ensure secure and reliable access to business-critical data
- Meet the scalability demands of Symantec's continued expansion
- Capitalize on advancements in virtualization and flash storage technologies

Fibre Channel Benefits:
- Flat, simple network architecture increases security, stability, and availability
- Optimal, simple scalability
- Performance-enhancing NPIV virtualization, segregation, and dynamic bandwidth provisioning capabilities
- Accommodates different service requirements between multiple groups

Customer
One of the world's largest software companies, Symantec specializes in information protection and operates one of the largest global data-intelligence networks in existence. It is on this network that the security, backup, and availability solutions consumers and organizations rely on to protect their critical data are developed. The network spans scores of secure, interconnected data centers across five continents, holding petabytes of data that must be consistently available to Symantec employees.

Symantec SAN Responsibilities
Symantec's IT infrastructure is responsible for ensuring that business-critical data is available to all Symantec employees when it is needed, making it imperative that the Symantec storage environment is capable of both storing massive quantities of data and retrieving that data reliably. As Symantec continues to expand, it requires a storage infrastructure capable of fluidly adapting to that growth. Symantec is currently seeing a 15 percent increase in data annually, and consequently its storage network has been experiencing annual growth rates of over five percent. In addition to meeting these demands for high scalability, the Symantec SAN must also support file sharing, database, and online transaction processing applications while consistently achieving the highest availability.

The Need for a Purpose-Built SAN Solution
Symantec's continuing expansion prompted company executives to look for a more reliable and scalable storage solution capable of both accommodating the company's growth and ensuring that Symantec employees have the most secure, consistent access to business-critical data possible. After implementing several Ethernet-based storage solutions and finding them unable to meet the company's stringent security and reliability requirements, Symantec decided to increase its reliance on Fibre Channel and provisioned it across critical data paths. The new Fibre Channel SAN has since allowed Symantec to realize increased system utilization, higher availability, and improved efficiencies, and has enabled the company to capitalize on the latest advancements in flash storage technologies.
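The growth figures cited in this case study (about 15 percent annual data growth against storage-network growth of just over five percent) compound quickly, and a short sketch makes the gap concrete. The 1 PB starting point, 100-port fabric, and five-year horizon below are illustrative assumptions, not figures from the case study:

```python
# Illustrative compound-growth projection for capacity planning.
# The 15% data growth and ~5% network growth rates come from the case
# study; the 1 PB start, 100 ports, and 5-year horizon are assumptions.

def project(start, annual_rate, years):
    """Return the projected size after compounding annual_rate for years."""
    return start * (1 + annual_rate) ** years

start_pb = 1.0       # assumed starting data set, in petabytes
data_rate = 0.15     # 15% annual data growth
network_rate = 0.05  # ~5% annual storage-network growth

for year in range(1, 6):
    data = project(start_pb, data_rate, year)
    ports = project(100, network_rate, year)  # assumed 100 SAN ports today
    print(f"Year {year}: {data:.2f} PB of data, ~{ports:.0f} SAN ports")
```

Because data grows roughly three times faster than the port count in this sketch, utilization per port must rise over time, which is exactly the pressure that higher attach rates and NPIV consolidation relieve.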
Business Results
As a company that specializes in security, the protection of end-user data is of paramount importance to Symantec. By leveraging standard Fibre Channel features such as deterministic bandwidth provisioning and NPIV (N_Port ID Virtualization) to segregate multiple fabrics, Symantec is now assured that end-user data is optimally secure and available. Fibre Channel also allows Symantec to accommodate different service requirements between multiple groups, securing data and preventing I/O floods between groups.
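Deterministic bandwidth provisioning can be pictured as giving each tenant group a fixed share of a link, so a burst from one group can never starve another. A minimal toy model of that rule follows; the group names and the 1600 MB/s (16GFC-class) link figure are illustrative assumptions, not configuration from the case study:

```python
# Toy model of deterministic bandwidth provisioning: each tenant group
# gets a fixed share of a link, so a burst from one group cannot starve
# another. Group names and the link rate are illustrative assumptions.

LINK_MBPS = 1600  # approximate usable throughput of one 16GFC-class link

# Fixed provisioned shares per group (fractions of the link).
shares = {"oltp": 0.50, "database": 0.30, "file_sharing": 0.20}

def grant(group, requested_mbps):
    """Cap a group's I/O at its provisioned share, regardless of demand."""
    ceiling = shares[group] * LINK_MBPS
    return min(requested_mbps, ceiling)

# A runaway file-sharing burst is clipped to its 20% share...
assert grant("file_sharing", 4000) == 320
# ...so OLTP still receives its full provisioned bandwidth.
assert grant("oltp", 800) == 800
```

The key property is that the ceiling depends only on the provisioned share, never on what other groups are doing, which is what makes the resulting performance deterministic.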
Fibre Channel has allowed Symantec's IT department to capitalize on new technologies like virtualization, which has increased system utilization, improved efficiencies, and enabled Symantec to take full advantage of SSDs. Fibre Channel has enabled Symantec to further virtualize connectivity, improving server-to-storage attach rates from 1:10 to 1:15 and securing data flows through NPIV segregation.
Providing consistent, sustained performance, Fibre Channel has allowed Symantec to achieve the absolute best performance from its solid-state drives, which now operate at peak performance. Fibre Channel's integrated diagnostics, monitoring, and simplified management have enabled seamless, progressive expansion along with increased resiliency.
Simply put: Fibre Channel's flat network architecture and bandwidth provisioning capabilities have been instrumental in increasing security, stability, and service availability for Symantec.
The following were among the advantages Symantec realized by deploying Fibre Channel:
Quotes:
"Fibre Channel has a great record from an availability and reliability standpoint, and it allows us to scale on demand as our business grows."
"On time, on target and on budget—Fibre Channel helps us maintain all three of these business requirements."
Kevin Tan, Infrastructure Architect, Symantec
Consistent Reliability
With a global workforce that spans five continents and many time zones, Symantec IT needs to ensure that data is available to employees at all times. Fibre Channel's credit-based flow control provides consistent delivery, with no dropped frames or lost data, and deterministic performance. Fibre Channel has allowed Symantec IT to deliver consistent connectivity with sustained availability. The result: business-critical data is highly available around the clock, whenever employees need it, enabling production schedules to stay on track.
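Credit-based flow control is what makes this lossless behavior possible: a transmitter holds a pool of buffer-to-buffer credits, spends one per frame, and only sends while credits remain; the receiver returns a credit (an R_RDY primitive) for each buffer it frees. A minimal sketch of that handshake follows; the credit count and frame counts are arbitrary illustrations:

```python
from collections import deque

# Minimal sketch of Fibre Channel buffer-to-buffer credit flow control.
# A sender may only transmit while it holds credits; the receiver
# returns one credit (an R_RDY) per buffer it drains. The credit and
# frame counts here are arbitrary illustrations.

class Receiver:
    def __init__(self, buffers):
        self.queue = deque()
        self.buffers = buffers

    def accept(self, frame):
        # The credit scheme guarantees the buffers never overflow.
        assert len(self.queue) < self.buffers
        self.queue.append(frame)

    def drain_one(self):
        """Process one frame and return a credit to the sender."""
        self.queue.popleft()
        return 1  # R_RDY

class Sender:
    def __init__(self, bb_credit):
        self.credits = bb_credit
        self.sent = 0

    def try_send(self, receiver, frame):
        if self.credits == 0:
            return False  # no credit, no frame: the sender must wait
        self.credits -= 1
        receiver.accept(frame)
        self.sent += 1
        return True

rx = Receiver(buffers=4)
tx = Sender(bb_credit=4)

# A burst of 6 frames against 4 credits: frames 5 and 6 must wait...
results = [tx.try_send(rx, f"frame-{i}") for i in range(6)]
assert results == [True, True, True, True, False, False]

# ...until the receiver drains a buffer and returns a credit.
tx.credits += rx.drain_one()
assert tx.try_send(rx, "frame-5") is True
```

Nothing is ever dropped and retransmitted in this scheme; a frame either transmits immediately or waits for a credit, which is why delivery stays lossless and latency stays deterministic under load.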
Scalable to Accommodate Growth
Fibre Channel has long been a valued part of Symantec's data retention plan, most recently providing access to critical data through organic and acquisition-driven growth of 15 percent. In the midst of its next growth phase, Symantec is adopting next-generation technologies such as expanded virtual machine deployments and migration to flash-based storage. Fibre Channel has proven it can easily scale up and down in response to Symantec's past needs and is designed with generational feature enhancements that unleash the full potential of evolving technologies.
Secure
Fibre Channel offers many security features that help ensure Symantec's data is protected. Hosting pertinent data on Fibre Channel reduces vulnerability to outside attacks, since it is a segregated network isolated from Ethernet. The use of NPIV ensures segregation of internal data repositories and accommodates different service requirements between groups. As an information protection company, Symantec takes advantage of Fibre Channel's security features to ensure that data, the lifeblood of the company, is protected.
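The segregation NPIV enables can be pictured as zoning at the virtual-port level: each virtual machine gets its own virtual WWPN, and the fabric only permits traffic between ports that share a zone. A toy model of that visibility rule follows; all WWPNs and zone names are invented for illustration:

```python
# Toy model of fabric zoning over NPIV virtual ports: each VM gets its
# own virtual WWPN, and two ports can only communicate if some zone
# contains both. All WWPNs and zone names below are invented examples.

zones = {
    "backup_group": {"10:00:00:00:c9:aa:00:01", "50:06:01:60:3b:00:00:01"},
    "oltp_group":   {"10:00:00:00:c9:aa:00:02", "50:06:01:60:3b:00:00:02"},
}

def can_communicate(wwpn_a, wwpn_b):
    """Fabric rule: traffic flows only between members of a shared zone."""
    return any(wwpn_a in members and wwpn_b in members
               for members in zones.values())

# A backup VM reaches its own storage target...
assert can_communicate("10:00:00:00:c9:aa:00:01", "50:06:01:60:3b:00:00:01")
# ...but is invisible to the OLTP group's storage, even though both
# virtual ports may sit on the same physical HBA.
assert not can_communicate("10:00:00:00:c9:aa:00:01", "50:06:01:60:3b:00:00:02")
```

Because each VM presents its own WWPN, zoning and per-group service policies can be applied per VM rather than per physical adapter, which is the mechanism behind the per-group segregation described above.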
A Closing Note on Fibre Channel
In considering the benefits above, one begins to understand why Symantec trusts Fibre Channel for its critical storage needs. Enterprises like Symantec realize that data connectivity does not stop at performance; it must also include consistent reliability, scalability, and security, along with data integrity, operational simplicity, and manageability.
No matter who is consuming your data, Fibre Channel offers superior value and consistently demonstrates why it is the purpose-built storage solution you can trust.