Mark Tango, Director of Data Center Storage at a national defense contractor, discovered the company held 5.5 petabytes of structured and unstructured data, much of it outdated, duplicated, or simply unnecessary. With help from Active Navigation, Tango set out to cut the data footprint from 5.5 to 3.0 petabytes; over one year the team analyzed 2.0 petabytes and eliminated 881 terabytes of storage, saving nearly $2 million annually in storage costs and avoiding additional support costs. Tango's advice: start small, get executive buy-in by demonstrating cost savings, and keep a scorecard to document the savings.
EDF2013: Invited Talk Daragh O'Brien: The Story of Maturity – How data in Bus... by European Data Forum
Invited talk of Daragh O'Brien, Managing Director of Castlebridge Associates, at the European Data Forum 2013, 9 April 2013 in Dublin, Ireland: The Story of Maturity – How data in Business needs to pass the ‘So What’ tests
How Analytics as a Service Changes the Game and Expands the Market for Big Da... by Dana Gardner
- Analytics as a service has expanded the market for big data by making advanced analytics capabilities accessible to smaller companies and organizations without extensive data architecture experience. This "democratization" of big data allows more firms to leverage big data technologies.
- Mobility and real-time analytics have become more prevalent, allowing organizations to incorporate streaming data sources and provide insights in close to real-time rather than through batch processes. This helps companies make faster decisions.
- Dasher Technologies helps clients address big data challenges holistically by considering people, process, and technology factors and providing solutions that optimize architectures for long-term growth needs. They work with partners like HP and its Vertica platform to deliver analytics capabilities and reporting applications.
Jet Reports is the tool for building the best BI, faster, by CLARA CAMPROVIN
Business analytics when you need it, anywhere
Jet Enterprise is a business intelligence and reporting solution developed specifically to meet the needs of Microsoft Dynamics users. Now you can bring all your information together in one place and let anyone you choose in the organization easily run sophisticated business analytics from anywhere. Empower users to make better decisions, faster, from practically any device.
With Jet Enterprise you get:
A complete business intelligence and reporting solution, ready to use in just 2 hours
More than 80 dashboards and report templates
7 pre-built, customizable cubes
A data warehouse
Direct integration with your Microsoft Dynamics data and the ability to connect to other relevant business systems
The ability to build dashboards in minutes, with no need to know the underlying data structure
Optional Jet Mobile, to access your data from anywhere through a web browser or mobile device
A robust platform for data warehouse automation and customization
"We started with Sage Pro data, NAV 2009 data and, in addition, data brought in from the new company we had acquired, so we are now using three data systems. The benefits of combining the three systems in Jet Enterprise have been enormous."
– Davis & Shirtliff
Immediate success = fast ROI and low cost of ownership
Many business intelligence solutions carry hidden costs, such as long and difficult implementations, expensive customizations, and high license prices when scaled to a large number of users. Jet Enterprise typically installs in about two hours, requires minimal user training, and offers licensing for an unlimited number of users. Users typically see an increase in gross revenue within the first 12 months of use.
Loggly - Case Study - Stanley Black & Decker Transforms Work with Support fro... by SolarWinds Loggly
With Loggly, Stanley Black & Decker:
• Provides team with troubleshooting capabilities for mobile and IoT applications running on traditional and serverless architectures
• Supports performance monitoring, security, and PCI compliance needs
• Enables quick scalability as new innovations are launched
A Practical Guide to Rapid ITSM as a Foundation for Overall Business Agility by Dana Gardner
This document summarizes a podcast discussion on how rapidly advancing IT service management (ITSM) capabilities can improve IT performance and enable business agility.
The panelists discuss how traditional long IT project timelines no longer meet business needs, and how new ITSM technologies and methods allow for more rapid ITSM adoption. Rapid ITSM implementation using out-of-the-box configurations from SaaS solutions can establish best practices faster than custom approaches. However, data quality issues and unclear requirements can hinder speed. Adopting true agile principles and focusing on business needs rather than desired features helps overcome barriers to rapid ITSM.
For SBI Securities, Pivotal GemFire’s results speak for themselves – greater performance with fewer resources. It provides extremely low latency and scales with spikes as well as usage growth over time. Importantly, GemFire also provides a more cost-effective means of managing the data tier for high-volume data industries, like an online brokerage with plans for future growth.
To learn more, visit pivotal.io/big-data/pivotal-gemfire.
LIC needed to regularly update its SAP systems but copying the large databases took 9 days of downtime each time. Data Sync Manager reduced the copy process to just 1 hour by time-slicing and reducing the database size by 67%, saving $11,000 per year. Functional updates that previously took 9 days were also reduced to 1 hour. The solution allowed LIC to build test environments faster with no production interruptions.
M3 Glass Technologies implemented an ERP software system to help automate processes, reduce paper usage, and increase information flow and capacity. The ERP system integrated data from all departments, allowed remote access to order status and forecasts, and facilitated digital document management. While implementation required adjusting to changes, the increased throughput enabled new laminating capabilities and market opportunities. Managing expectations during the transition process was key to the overall success of the ERP system installation.
Mozilla Foundation Metrics - presentation to engineers by John Schneider
@rossbruniges and I talked with our fellow Mozilla Foundation engineers and development teams about getting the data for building a data driven operation using statsd, graphite, geckoboard, google analytics, and newrelic.
The Cabinet Office has been working hard to level the playing field for open source software, agile development and lean software. This picks out a number of quotes illustrating the success of these ideas outside the public sector, and hopes to explain why enterprises involved with government need to start paying attention.
The white paper discusses how enterprises are facing exponentially growing amounts of data that is breaking down traditional storage architectures. It outlines NetApp's approach to addressing big data challenges through what it calls the "Big Data ABCs" - analytics, bandwidth, and content. This allows customers to gain insights from massive data sets, move data quickly for high-performance applications, and store large amounts of content for long periods without increasing complexity. NetApp provides solutions to help enterprises take advantage of big data and turn it into business value.
Data analytics is needed everywhere to predict relevant outcomes, increase profits, and efficiently use resources. It involves transforming processed information into knowledge. Companies use data analytics to understand customer behavior and target them with personalized advertisements and offers. Different types of data include structured, semi-structured, unstructured, and big data. Big data is high volume, high velocity, and highly variable information that requires novel processing techniques. Hadoop is an open source software framework that allows distributed processing of big data across clusters of computers using simple programming models like MapReduce. Data analytics helps analyze these different data types to provide business insights.
Denodo provides a unique solution to enterprise database reporting. Instead of a traditional ETL solution, this product "grabs" the data from the source creating what they call a "data mashup". Our company is looking to offer this product as a reporting solution option to our clients.
This document discusses the rise of big data and how organizations are dealing with increasingly large volumes of data from a variety of sources. It defines big data as datasets that are too large to be captured, managed and processed by traditional software within a reasonable time frame. The document outlines how data has increased dramatically in terms of volume, velocity and variety in recent years. It provides examples of how companies are using big data to create transparency in business processes, enable experimentation, innovate new business models and support human decision making. The challenges of analyzing unstructured data and new techniques for in-memory analytics are also discussed.
Data Engineer's Lunch #85: Designing a Modern Data Stack by Anant Corporation
What are the design considerations that go into architecting a modern data warehouse? This presentation will cover some of the requirements analysis, design decisions, and execution challenges of building a modern data lake/data warehouse.
This document discusses the key characteristics of Big Data - volume, variety, velocity, and veracity. It provides examples and explanations of each characteristic. Volume refers to the large amount of data. Variety means the different types and sources of data. Velocity is about the speed at which data is processed. Veracity relates to the quality and trustworthiness of the data. The document emphasizes that understanding these characteristics is important for effectively managing and analyzing Big Data.
Third presentation in our seminar on business intelligence dashboards. Derek Murphy works for National Grid and related learning points from over 30 years' experience of delivering business intelligence projects.
Presentation also available on YouTube https://www.youtube.com/watch?v=Er90qIA2S7U
Learn “How to get your IT budget to 1 percent of Revenue” by HCL Technologies
Meeting an audacious goal such as this requires intelligent outsourcing, aggressive migration to the cloud, and savvy management of change
Download this report by Andy Nallappan, CIO Avago Technologies
AmPro Mortgage Corporation implemented Intelli-Mortgage Business Performance Manager from Intelli-Mine to gain a unified view of key performance metrics across its various business units. This allowed AmPro to more efficiently monitor production indicators, identify problem areas, and make timely decisions. Intelli-Mortgage provided over 100 predefined mortgage industry metrics and was implemented within 12 weeks, providing significant time and cost savings compared to building a custom business intelligence system. AmPro estimates savings of $1 million in reporting and analysis costs and a total benefit of over $4 million over three years from increased profitability and lower funding costs enabled by Intelli-Mortgage.
Littler Mendelson is the world's largest employment and labor law firm with over 1,000 attorneys. The firm's CTO realized the desktop environment could not keep up with business demands due to slow logons and stability issues. He decided to overhaul the entire desktop ecosystem to simplify management and improve the user experience. Littler implemented AppSense's DesktopNow and DataNow solutions to streamline management, accelerate logons to under 60 seconds, and provide reliable access to data from any device.
Every day we create roughly 2.5 quintillion bytes of data; 90% of the world's collected data has been generated in just the last 2 years. In this slide, learn all about big data in a simple, easy way.
Enterprises are facing exponentially increasing amounts of data that is breaking down traditional storage architectures. NetApp addresses this "big data challenge" through their "Big Data ABCs" approach - focusing on analytics, bandwidth, and content. This enables customers to gain insights from massive datasets, move data quickly for high-speed applications, and securely store unlimited amounts of content for long periods without increasing complexity. NetApp's solutions provide a foundation for enterprises to innovate with data and drive business value.
Copy Data Management & Storage Efficiency by Ravi Namboori
In this PPT, Ravi Namboori explains how copy data management practices can bring about changes in our workplaces. Creating more space to operate in is one of its main benefits, along with improved storage efficiency.
Before vs After: Redesigning a Website to be Useful and Informative for Devel... by Teresa Giacomini
There are so many fun challenges in creating a useful website for a developer audience today: you’ve got to empathize with your audience, nail the voice, understand the “jobs” your site’s visitors are trying to accomplish, make sure you anticipate (and answer!) the questions people are likely to have. In this quick lightning talk, I’ll share some before vs. after pics of a recent Citus Data site redesign—and will share some of the best practices we used, based on my years as a developer, software engineering manager, product manager, and now, as a marketer.
Big Data Means Big Business
Big data has the potential to disrupt existing businesses and help create new ones by extracting useful information from huge volumes of structured and unstructured data. To realize this promise, organizations need cheap storage, faster processing, smarter software, and access to larger and more diverse data sets. Big data can unlock new business value by enabling better-informed decisions, discovering hidden insights, and automating business processes. While the technology is available, organizations must also invest in skills, cultural change, and using information as a corporate asset to fully leverage big data.
Get Your Data Under Control in 5 Steps by gloverastera
This document discusses five steps for organizations to get their data under control: 1) implement a data governance structure, 2) refine software acquisition processes, 3) understand the full data lifecycle, 4) implement robust data management tools, and 5) initiate a data cleanup project. Taking these steps can help organizations achieve business outcomes like having a single version of the truth for analyzing data and improving data-driven processes.
Information Governance Saves Millions for National Defense Contractor
Shortly after taking over as Director of Data Center Storage at a defense contractor, Mark
Tango realized he had a problem – a really big problem.
To be exact, 5.5 petabytes' worth of problems: 5.5 petabytes of unstructured (e.g.,
documents, presentations, spreadsheets) and structured (e.g., databases) information. A
petabyte can store the equivalent of 500 billion pages of standard printed text. It would
take about 11,000 typical laptop computers to store all that information. That’s a whole lot
of information.
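As a quick sanity check on those comparisons, the arithmetic works out under two rough assumptions that are not stated in the article: about 2 KB per page of plain text and about 500 GB of usable storage per typical laptop.

```python
# Back-of-the-envelope arithmetic for the 5.5 PB figure.
# Assumptions (not from the article): ~2 KB per page of plain text,
# ~500 GB of usable storage per typical laptop.

PB = 10**15          # bytes in a petabyte (decimal)
GB = 10**9           # bytes in a gigabyte

total_data = 5.5 * PB
bytes_per_page = 2_000
laptop_capacity = 500 * GB

pages_per_pb = PB / bytes_per_page             # ~500 billion pages per petabyte
laptops_needed = total_data / laptop_capacity  # ~11,000 laptops for 5.5 PB

print(f"Pages per petabyte: {pages_per_pb:,.0f}")
print(f"Laptops to hold 5.5 PB: {laptops_needed:,.0f}")
```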
But to make matters worse, Mark quickly realized that, like most organizations the size of BAE Systems, the company had a lot of information that was providing little or no value to the organization, such as:
• decades-old, outdated information;
• information that had no business being in the data center (like personal music MP3s);
• PST files which, after email archiving was deployed, should not have existed; and
• information that duplicated what the company was already paying good money to store and manage elsewhere.
With the help of Active Navigation Inc., Tango set out to reduce the organization's
enterprise data footprint from 5.5 PB to 3.0 PB. Active Navigation provides file analysis
software for the discovery and ongoing control of unstructured information.
Over the course of one year, the team was able to analyze 2.0 PB of data across three data
centers, ultimately eliminating 881 TB of both Tier 1 and Tier 2 storage. These efforts
helped the company shave nearly $2 million per year off its storage costs.
Additionally, the teams were able to decommission approximately 75 storage units from
various vendors, reducing additional support costs and avoiding the risks associated with
aging storage devices. The team also saved money because the company didn’t have to buy
additional storage devices.
“The cost of Tier 1 storage is likely lower than the rest of the industry,” Tango said. “We are
at fifty-three cents a gigabyte for Tier 1 storage, which is certainly very cheap for the
industry. But our internal charge has a number of pieces associated with it and our project
is helping us keep that cost low.”
The project was so successful that this exact method is featured in the Gartner report,
“Save Millions in Storage Costs with These Effective Data Management Best Practices.”
How Did Tango and His Team Do It?
Initially, Tango asked his team a key question: “So guys, what is your formula for
forecasting storage growth?”
He was astounded by the answer: “We just buy 40 percent more [storage] every year.”
Tango’s response: “Oh, boy.”
So Tango got to work. An initial analysis confirmed that the company had a data sprawl
problem. It was not the first time the company had recognized the problem, but, as at most big companies, complex problems often take more than one attempt to solve.
Tango’s next step was to contact Peter Baumann, CEO, and Steve Matthews, VP of Business Development at Active Navigation, whom he had worked with on a previous project, and to start “brand new.”
Early on, the team ran into what Tango called “the fear factor.” People who had worked there for years didn’t want to rock the boat. They didn’t want to delete anything for fear of
losing their jobs.
“So it was my job to then become not only the project leader, but also the project sponsor,
the project educator, the project mentor, and then the project salesperson to get buy-in
from upper management,” Tango said.
A key tool was the detailed reports the team produced, bringing visibility into what was being stored, what could still be reduced, what had been reduced, and the cost savings that had
been achieved. Tango said the company was able to use the reports generated by the
software to get business leaders to act on data reduction by showing them how much
money they could save.
Once the execs understood that, the game changed, he said.
“Since that time, we have had buy-in all the way up to the C-level,” Tango said. “That
basically gives us the ability to move forward with little to no obstructions.”
After everyone was on board, the team worked with Records Management to determine
what was and what wasn’t a record.
“At first, their definition was ‘everything is a record,’” he said. “But once we got legal
involved, we helped them understand that keeping useless data around has real risks. If we
don’t have a business or legal need to keep it, then it should go.”
The team crawled 1.5 PB of data, first looking for easy opportunities for data reduction by
scanning for redundant, outdated and trivial (ROT) data. The initial scan identified some
easy targets, including, for example, 4,000 copies of the same document about the
company’s 2007 annual picnic. BAE Systems also scanned user home directories for log
files, pictures, and disallowed file types like digital music files, PSTs, hard drive clones,
and disk images.
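The article does not describe how the file-analysis software works internally, but a minimal sketch of the kind of ROT scan described above (hashing files to find duplicate copies and flagging disallowed types such as MP3s and PSTs) could look like the following; the scan root and the extension list are illustrative assumptions, not details from the project.

```python
# Minimal ROT-scan sketch: find duplicate files by content hash and
# flag disallowed file types. Illustrative only; the project used
# Active Navigation's file-analysis software, not this script.
import hashlib
import os
from collections import defaultdict

DISALLOWED_EXTENSIONS = {".mp3", ".pst", ".vmdk", ".iso"}  # assumed list

def sha256_of(path, chunk_size=1 << 20):
    """Content hash of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(root):
    duplicates = defaultdict(list)   # content hash -> list of paths
    disallowed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.splitext(name)[1].lower() in DISALLOWED_EXTENSIONS:
                disallowed.append(path)
            duplicates[sha256_of(path)].append(path)
    dupe_groups = {h: paths for h, paths in duplicates.items() if len(paths) > 1}
    return dupe_groups, disallowed

if __name__ == "__main__":
    dupes, bad_types = scan("/srv/home-directories")  # hypothetical path
    print(f"{len(dupes)} duplicate groups, {len(bad_types)} disallowed files")
```

In practice, a report that drives the kind of buy-in Tango describes would also need owners, sizes, and last-modified dates attached to each finding, so the cost of keeping the data can be shown alongside it.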
To make the job easier, the company decided not to analyze all 5.5 PB at once but instead
began by analyzing strategic targets that together added up to 2.0 PB – a much more
manageable workload for the data management team.
Changing the Internal Business Model for Storage
“Now this is a very interesting piece of the story – up until recently we were charging for
storage based only on what you used,” Tango said. “We worked very hard to show a
business case to the company that we should be charging for what was allocated. If a
database admin needed two terabytes of volume disk space, and we’re only charging them
for the 600 gigs they were using, nobody else can use the other 1.4 terabytes because it is
locked in at the volume allocation size and cannot be reduced.”
So a few months ahead of time, the team met with all of the business technology officers
and announced that a new chargeback model would be introduced, requiring business
areas to start paying for the storage they were allocated, not just the storage they actually
used.
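A small worked example of the difference between the two chargeback models, using the 2 TB allocated / 600 GB used volume from Tango's example and the fifty-three-cents-per-gigabyte Tier 1 rate he quotes; the billing period and the flat rate structure are simplifying assumptions.

```python
# Chargeback comparison: billing for used capacity vs. allocated capacity.
# The rate is the article's quoted $0.53/GB for Tier 1; the billing period
# and the 2 TB / 600 GB volume are illustrative.

RATE_PER_GB = 0.53

allocated_gb = 2_000   # the 2 TB volume requested by the database admin
used_gb = 600          # what the admin actually consumes

charge_by_usage = used_gb * RATE_PER_GB            # old model: $318.00
charge_by_allocation = allocated_gb * RATE_PER_GB  # new model: $1,060.00
stranded_gb = allocated_gb - used_gb               # 1,400 GB nobody else can use

print(f"Old model bill:    ${charge_by_usage:,.2f}")
print(f"New model bill:    ${charge_by_allocation:,.2f}")
print(f"Stranded capacity: {stranded_gb:,} GB")
```

Under the allocation-based model, over-provisioned volumes show up directly on a business area's bill, which is what sent people asking to have their storage right-sized.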
“Well, that sent droves of people running to us, which was lovely, because it gave us
the opportunity to right-size the storage,” Tango said. “And that right-sizing activity,
which lasted a week or two for each vertical, allowed us to show cost avoidance of around
two million dollars in three weeks.”
The software was also able to show the company which databases hadn’t been written to in
three years or more – databases that the company was able to eliminate, Tango said. After
the teams identified the worthless data, the Active Navigation tool made cleaning it up
easy and auditable, Tango said.
“We simply highlight those areas, and we either delete them, or in certain cases where we
have to hold it for contractual obligations, we move the data to a cold storage area, which
we built at a cost of zero,” he said. “So contractual data that has to be held doesn’t need to
sit on tier-one or tier-two data storage – it gets to sit on our free cold storage array built
from decommissioned arrays.”
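A minimal sketch of that kind of three-year staleness check, based on filesystem modification times rather than whatever database-level tracking the vendor software uses; the root path and threshold are illustrative.

```python
# Minimal staleness check: report top-level items under a path whose
# contents haven't been modified in N years. Illustrative only; the
# article attributes this capability to the file-analysis software.
import os
import time

YEARS = 3
CUTOFF = time.time() - YEARS * 365 * 24 * 3600

def newest_mtime(path):
    """Most recent modification time of any file under path."""
    latest = os.path.getmtime(path)
    for dirpath, _dirs, files in os.walk(path):
        for name in files:
            try:
                latest = max(latest, os.path.getmtime(os.path.join(dirpath, name)))
            except OSError:
                pass  # skip files that vanish or can't be read
    return latest

root = "/srv/database-volumes"  # hypothetical mount point
for entry in sorted(os.listdir(root)):
    full = os.path.join(root, entry)
    if newest_mtime(full) < CUTOFF:
        print(f"Stale (untouched for {YEARS}+ years): {full}")
```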
Next Steps
But the team isn’t finished yet – there’s a lot more work to do, Tango said.
“We’re doing it in phases. The first phase was to get everybody warmed up, hit the low-
hanging fruit. What’s easy? What’s a quick win?” he said. “That got everybody excited. So
when we hit the first ROT with Active Navigation, people felt good because, ‘hey, we just
saved ten grand, twenty grand, thirty grand.’ We used that excitement to continue to push
forward.”
During the second phase, the team will use the tool to scan at the content level for
sensitive data, including data subject to export control, data containing personally
identifiable information and so on.
The company also has another goal this year – to decommission another 30 storage units,
Tango said.
Now that the first phase of the project has been completed, Tango said the company has
established the Data Governance Board – a group of data compliance managers who work
within the different verticals.
“They are assisting us in the compliance and governance of the different types of data,
such as determining retention periods,” Tango said. “They are also working with the legal
department to determine what can be defensibly deleted.”
Tango’s advice to other companies looking to reduce their data
footprints
You can do it.
“Start off with small wins. Don’t try to be a hero. Go in, evaluate with whatever product
that you chose to interrogate your data – but start slow and do it in chunks, sector by
sector, or department by department,” he said. “That gives you the time to work with each
department, clean the data, show the value, show the cost savings. As you keep going, you
build up more steam. The more steam you build, the greater the savings.”
Additionally, Tango reiterated the need to get C-level buy-in by showing the executives
how much storing worthless data is costing the company.
Finally, Tango said every IG professional needs to keep a scorecard to document the
amount of money saved through storage cleanup or cost avoidance to show to upper
management at regular intervals.
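A tiny illustration of such a scorecard, using the article's $0.53-per-gigabyte Tier 1 rate; the departments and figures are made up for the example.

```python
# Minimal "savings scorecard" sketch: track reclaimed storage and the
# associated cost savings per area, then summarize for management.
# The areas and volumes below are illustrative, not figures from the article.
from dataclasses import dataclass

RATE_PER_GB_TIER1 = 0.53  # the article's quoted Tier 1 rate, $/GB

@dataclass
class CleanupEntry:
    area: str
    reclaimed_gb: float

    @property
    def savings(self) -> float:
        return self.reclaimed_gb * RATE_PER_GB_TIER1

scorecard = [
    CleanupEntry("Engineering file shares", 120_000),
    CleanupEntry("User home directories", 45_000),
    CleanupEntry("Decommissioned databases", 80_000),
]

total_gb = sum(e.reclaimed_gb for e in scorecard)
total_savings = sum(e.savings for e in scorecard)
for e in scorecard:
    print(f"{e.area:30s} {e.reclaimed_gb:>10,.0f} GB  ${e.savings:,.2f}")
print(f"{'TOTAL':30s} {total_gb:>10,.0f} GB  ${total_savings:,.2f}")
```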
The bottom line: “To keep storage costs under control, storage managers must effectively
manage – and even delete – current data,” according to Gartner. “They must also
implement a data management policy to keep future storage growth and related costs at
bay.”