WHITE PAPER
Not Sluggish: Five Use Cases for the Agile Data Center
Contents
• Data Governance for Agility and Efficiency
• Securing the Virtual Resources and Infrastructure
• Protecting Enterprise Data
• Adopting the Latest Storage Technologies
• Protecting Applications Wherever They Live
• Delivering the Right Services in the Right Way
A TECHTARGET WHITE PAPER
Organizations need fast, reliable and responsive service delivery from their IT departments. To meet today’s requirements, the data center can’t be slow or unable to make adjustments quickly based on the needs of the business. Here are five examples of being ready, willing and able to deliver the right services in the right way.
Enterprise technologies are changing—think about the cloud, Software-as-a-Service, smartphones and tablets, virtualization, big data and Bring Your Own Device policies. Those new technologies directly enable new business opportunities, such as mobile delivery of services, or the use of analytics to personalize product offerings and optimize the supply chain. Technology also exposes established business models to new challenges, and raises the worry that a nimble competitor might grab your hard-won market share—or redefine the market to eliminate your offerings.
New technologies, new opportunities, new business models and new competitive threats. All of these
require an active response from business leaders. Implementing those responses requires ensuring
that IT is agile in terms of both systems and information management. Whether the focus is on
extending on-premises data centers or on leveraging cloud-based service providers, Symantec offers
tools and services to help enable an agile IT transformation.
This paper describes five use cases where offerings from Symantec can help IT move from a sluggish
environment to one that is more agile and able to respond strategically and tactically to emerging
technologies, opportunities, business models and competitive threats.
Data Governance for Agility and Efficiency
Unstructured data is growing at a phenomenal rate, consuming petabytes of storage on dozens or
hundreds of file servers. How much of that storage is wasted? What’s the capacity and utilization
of that storage? What type of data is being stored on corporate assets, and is it appropriate for the
organization’s business purposes?
Some of that data may be payroll records or bank transactions that require retention for years in case
of an audit or regulatory request. Some of that may be customer or inventory records being used every
second by an employee, or shown to consumers online or via a mobile app. Inevitably, some of the
storage space is wasted or misallocated. Or worse, part of the capacity may be allocated to a former
employee’s vast array of MP4 movies that infringe copyright law.
Considering the vast amount of unstructured data being created and the growth rates that are
expected, how does all this get managed?
Sluggish IT: The killer app is the spreadsheet. You’ve seen them: Rows and columns indicate storage
arrays, raw capacity, current utilization, perhaps the names of users, groups or departments and
budget codes. An admin may also track utilization over time, allowing for rudimentary projections of future storage needs.
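The utilization-over-time tracking described above amounts to a simple linear extrapolation. The sketch below shows what such a spreadsheet projection does; the figures are illustrative, not drawn from any real environment:

```python
# Rudimentary storage projection: fit a straight line to monthly
# utilization samples and extrapolate, much as an admin's spreadsheet
# might. All numbers here are invented for illustration.

def project_storage(samples, months_ahead):
    """Least-squares linear fit over (month_index, used_tb) samples,
    extrapolated months_ahead past the last sample."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, samples)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + months_ahead)

# Six months of utilization on one array, in TB, growing 2.5 TB/month:
history = [40.0, 42.5, 45.0, 47.5, 50.0, 52.5]
print(round(project_storage(history, 6), 1))  # → 67.5 (TB, six months out)
```

This is exactly the kind of projection that breaks down when it must be maintained by hand across hundreds of file servers.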
2 www.symantec.com
Updating those spreadsheets is a daunting task, but a necessary one, since automated storage discovery must be conducted within the confines of each vendor’s proprietary tool set. Going beyond the basics, for example to discover orphaned or inappropriate data (like those MP4 movie files), is extremely difficult. It can also be hard to know who owns data: Perhaps a set of documents was created by Sally,
but when she left the company, responsibility for those documents fell to Bob, making it very difficult
to manage the infrastructure where the data resides.
Agile IT: Symantec™ Data Insight helps organizations improve unstructured data governance
through actionable intelligence into data ownership, usage and access controls. The reporting,
analytics, and visualization capabilities in Data Insight help organizations drive efficiency and cost
reduction across the data lifecycle, improve protection of sensitive data and achieve compliance.
Organizations that have automated visibility and control can manage their data efficiently and have
the right information to make agile decisions on how to oversee the storage environment.
Securing the Virtual Resources and Infrastructure
Virtualized resources can be scaled, moved, allocated and reallocated, as often as necessary.
Sometimes resources are created and provisioned by administrators through easy-to-use self-service
consoles. Increasingly, virtual resources can also be brought online and deployed through intelligent
automation. The load on virtual servers in Southeast Asia is exceeding policy-set parameters and affecting application response time? Let’s spool up some new virtual CPUs and storage arrays and
adjust the virtualized load balancer. Problem solved.
Problem solved … maybe. Is everything secure? As virtual resources expand and contract, virtual
server operating systems, applications, firewalls, user access control lists, privacy policies and more
are affected. Some of those may be in virtual images that are copied onto the new virtual machines
(VMs); others are external, such as in the physical network router’s virtual private network and
virtual Local Area Network (LAN) settings or virtualized application delivery controller’s configuration.
Antivirus, intrusion detection, intrusion prevention, secure configuration, firewall, lockdown of insecure ports, application whitelisting, and file system and admin lockdown are just some of the hardening practices that must be enabled and running before virtual resources go live.
Sluggish IT: An army of IT professionals leaps into action when a new virtual resource must be
provisioned. Emails go out, alerting the team that the routers and firewalls must be updated, the
operating system on the virtual server must be updated with the latest patches, the antivirus software
on the virtual server must be installed and updated with the latest malware profiles, and the access
control list must be synced with the corporate ACLs. Those changes, and more, may require operating
dozens of tools and controls from as many vendors. In many instances, there is a significant lag
between the time new virtual resources are created (which could be done in minutes or a few hours)
and the time it takes security and IT operations to sync up and apply the appropriate security policies (days or weeks) to the newly created resources.
If a new vulnerability is discovered, the army of IT professionals must leap into action again, this
time discovering which systems may be affected, and then manually applying the patches or bringing
systems offline until patches become available.
Agile IT: Symantec™ Data Center Security provides tools for setting and applying security
policies against physical and virtual resources. From Windows to Linux, from application whitelisting
to targeted prevention policies, from real-time configuration monitoring to centralized management,
it’s all covered. Data Center Security supports elasticity by applying security policies to new virtual
resources as they are created. It also constantly monitors resources and infrastructure to detect
vulnerabilities, and can apply automation rules to respond quickly to challenges, such as by applying
hot patches or moving affected servers into a quarantine zone for further investigation. It can even
detect unauthorized access or configuration changes, and apply countermeasures via workflow. With
this approach, policy-based automation and intelligence enable security to orchestrate a timely
response to changes in the data center.
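In heavily simplified and hypothetical form (this is not Data Center Security’s API), the policy-driven response described above boils down to comparing each resource’s reported state against a baseline and emitting remediation actions:

```python
# Hypothetical sketch of policy-based security response: compare each
# resource's reported state against a baseline policy and emit
# remediation actions for any drift. Field names and the baseline
# values are invented for illustration.

BASELINE = {"antivirus": "enabled", "firewall": "enabled", "patch_level": 42}

def evaluate(resource):
    """Return the list of remediation actions for one resource dict."""
    actions = []
    if resource.get("antivirus") != BASELINE["antivirus"]:
        actions.append("install_antivirus")
    if resource.get("firewall") != BASELINE["firewall"]:
        actions.append("enable_firewall")
    if resource.get("patch_level", 0) < BASELINE["patch_level"]:
        actions.append("apply_hot_patch")
    if resource.get("unauthorized_change"):
        actions.append("quarantine")
    return actions

vm = {"antivirus": "enabled", "firewall": "disabled", "patch_level": 40}
print(evaluate(vm))  # → ['enable_firewall', 'apply_hot_patch']
```

The point of automating this loop is that it runs the moment a virtual resource is created, closing the days-to-weeks gap described in the sluggish scenario.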
Protecting Enterprise Data
Data must be protected. Documents that were inadvertently destroyed need to be restored for
a panicking end user. Financial hedge-fund records might be intentionally destroyed, violating
compliance requirements. Someone might trip over a power cord, unplugging a server and corrupting
the fragile ones and zeros in an important SQL database, which then need to be restored from backup.
A flood might wipe out a data center, meaning that the disaster recovery plan must be activated either
manually or automatically in accordance with a carefully rehearsed workflow.
The most sophisticated data backup and disaster recovery systems are only as good as the
implementation of those policies. If backups are accidentally turned off or a software update breaks
the policy, data is at risk—and the business is at risk, too, for lost productivity or even legal liabilities.
This is especially true with virtual servers, which may be created, provisioned, administered, moved to other physical hosts, moved again, and then shut down without a human administrator ever logging into a management console. This represents a challenge for data protection.
Sluggish IT: When new servers, virtual or physical, are brought online, a human backup administrator
is notified via email. Of course, that assumes that the backup administrator receives the message. The
admin installs a backup agent onto the server, and tests that the backup will work. (Making life
more complicated, different tools are used for configuring backups on physical and virtual servers.)
The backup data is written to a tape drive that’s usually close, or fairly close, to the host for the
virtual server. Since virtual servers tend to move around from one physical host to another, kicking
off a backup job of multiple virtual servers at the same time could overload the host server’s resources, failing the backup. And since backup policies are manually applied to each virtual server,
the chance that a VM comes online without a backup policy is very high. This could lead to data
being left unprotected, which could have major risk, cost and even legal liability consequences for
the organization.
Agile IT: The administrator uses a unified backup management platform, Symantec NetBackup™
with V-Ray technology, to efficiently protect all of the organization’s data across both physical
and virtual infrastructures, as well as the cloud. NetBackup automatically applies backup policies to
the servers and virtual data centers alike in an automated fashion, ensuring that the organization’s
data is never left unprotected—even that critical metadata layer needed to restore software-defined
infrastructures. The solution is truly enterprise-class, offering a unified portal with physical and virtual
machine detection and protection, as well as unmatched scalability for protecting environments in the
multiple petabyte range, with thousands upon thousands of virtual machines.
Furthermore, NetBackup offers granular recovery from a single-pass backup, which means an
administrator—or a user through self-service recovery—can easily restore files, folders or whatever is required, without inefficient post-processing. Eliminate full backups forever, instantly recover virtual
machines and deploy data protection in software or appliance form factor to suit even the most
demanding data center needs.
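The automatic policy application described above can be illustrated with a toy reconciliation pass (hypothetical code, not NetBackup’s API): compare the discovered server inventory against configured backup jobs so no machine comes online unprotected.

```python
# Hypothetical sketch: reconcile an inventory of discovered servers
# (physical or virtual) against configured backup jobs, so any server
# without a policy can have one attached automatically. Names and the
# job-record shape are invented for illustration.

def unprotected(inventory, jobs):
    """Return server names present in inventory but missing from jobs."""
    covered = {job["server"] for job in jobs}
    return sorted(s for s in inventory if s not in covered)

inventory = ["web-01", "db-01", "vm-new-07"]
jobs = [{"server": "web-01", "policy": "daily"},
        {"server": "db-01", "policy": "hourly"}]
print(unprotected(inventory, jobs))  # → ['vm-new-07']
```

In the sluggish scenario this reconciliation is an email thread and a human memory; in the agile one it runs continuously, and the newly detected VM gets a policy before its first byte of data is at risk.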
Adopting the Latest Storage Technologies
To effectively compete in business, IT needs the capability to seamlessly adopt new technologies. The
ability to quickly deploy the latest technology for higher performance, better availability or lower costs
can result in competitive advantages that provide faster or improved services through the data center,
to the business. However, adopting new technologies is not always a simple proposition. Oftentimes,
the new technology does not have all the required functionality or reliability that enterprise-class data
centers require.
One use case to consider is the storage environment. Storage Area Networks (SANs) are the core
storage systems in many data centers. SANs aren’t new; they’ve been around for many years, and have
provided solid performance and availability in the data center. However, SAN technology as we know it
has reached a tipping point in terms of performance. The spinning hard drives that comprise storage
arrays simply cannot exceed 15K rpm, which has created a performance bottleneck.
To address this, IT organizations are testing and implementing new technologies such as solid-state drives (SSDs) to alleviate the performance limitations of the spinning disk. Flash/SSDs have no
moving parts, and can be deployed in various form factors such as on the SAN, as all-flash arrays
or in the server. SSDs offer input/output (I/O) performance substantially faster than even the fastest
hard drives. However, there is a downside: server-class SSDs are perceived to be more expensive than
rotating hard drives, and individual units have smaller capacity and lower reliability. Additionally, certain functionality necessary for enterprise-class data centers, such as snapshots, data replication and tiering, is not available natively on the SSD. How can SSD technology be incorporated and
adopted in a data center storage environment to provide competitive advantage for the organization?
Sluggish IT: Administrators continue to focus on SANs based on standard hard drives, not only because
there is already a tremendous investment in this technology, but also because on a cost-per-terabyte
basis, it is the most affordable. SSDs are used as point solutions—installed internally to some
particularly performance-sensitive servers, perhaps.
Agile IT: To be agile, IT needs to be able to take advantage of new SSD technology. The high capacity
of hard drives and high performance of SSDs can be combined into a hybrid storage system using
Symantec™ Storage Foundation. Whether in a physical server environment or when using
virtual machines, SmartIO in Storage Foundation includes intelligence to understand the usage
patterns of data, and provides a cache on the SSD for substantially better performance than
traditional SANs or raw SSDs can provide. Storage Foundation has advanced storage features that
enable IT to use SSDs as a tier in the allocation of data and will reflect new writes to other nodes
within the cluster to provide data protection even if a drive fails. What’s more, Storage Foundation also
provides Flexible Storage Sharing, which enables a “shared nothing” architecture. This removes the
need for expensive SAN infrastructure, allowing IT to take advantage of inexpensive, direct-attached
storage to keep costs in check as new SSDs are adopted.
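The caching idea behind an SSD tier can be illustrated with a toy read-through cache. This model, with its tiny capacity and a dict standing in for the disk, is purely illustrative and is not how SmartIO is implemented:

```python
# Toy model of SSD read caching in front of slower disk: hot blocks
# are served from a bounded "SSD" cache with LRU eviction; misses fall
# through to the "disk". Capacities and data are illustrative.
from collections import OrderedDict

class ReadCache:
    def __init__(self, capacity, backing):
        self.capacity = capacity           # blocks the "SSD" can hold
        self.backing = backing             # dict standing in for the disk
        self.cache = OrderedDict()
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.cache:
            self.cache.move_to_end(block)  # mark as recently used
            self.hits += 1
            return self.cache[block]
        self.misses += 1
        value = self.backing[block]        # slow path: "disk" read
        self.cache[block] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False) # evict least recently used
        return value

disk = {n: f"data-{n}" for n in range(100)}
ssd = ReadCache(capacity=2, backing=disk)
for block in [1, 2, 1, 3, 1]:
    ssd.read(block)
print(ssd.hits, ssd.misses)  # → 2 3
```

The value of an intelligent cache comes from exactly this effect: repeated reads of hot blocks (block 1 above) are absorbed by fast media, while cold blocks cycle through without displacing the working set for long.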
Protecting Applications Wherever They Live
In today’s data center, services are typically deployed via a multi-tiered architecture. For example, the
Web layer resides on virtual machines, connected to the application tier running on Linux, interfacing
with a back-end database running on legacy Unix. Critical applications or business services running
on such complex multi-tiered architectures need to be protected against the smallest failures, as well
as wide area outages and disasters. Protecting these services can be a challenge, especially when the
business expects constant availability.