waterstechnology.com, June 2015
Special Report: Data Management
Data represents the lifeblood of all capital markets firms' operating environments. But thanks to new regulatory requirements, firms are having to take on ever-greater volumes of information at the same time as they need clean and reliable data to manage risk and generate alpha. It's a balancing act between data management and analysis, overseen by a proper data governance structure.
Improving the Lifeblood
Q Which business processes across the buy side and sell side are currently most reliant on clean, consistent data, and why?
John Bottega, senior advisor and consultant, EDM Council: Every function is reliant on clean, consistent data. But what we have seen in the past several years is a shift from a "defensive" posture on data to an "offensive" one. One of the outcomes of the financial crisis has been an increased focus on regulation—specifically, on risk. Risk requires consistent, timely and accurate data. But it also requires consistent and accurate "engineering" of data, including proper design of information so that inherent linkages and dependencies of data are exposed and known. Linked risk, counterparty risk and capital management all depend upon this interconnectedness of data.
John Fleming, chief data governance officer, BNY Mellon:
The world of analytics has exploded under the umbrella of big
data. As a result of that, people are looking for new insight into
the data that they have. To perform analytics and to get value
out of what you’re analyzing, your data has to be good, clean
and fit for purpose. Analytics on troubled data is going to lead to
undesired results.
But I think the big change right now—in a world where you
could argue that not a lot has changed—is the fact that the regulators
have come in and given their perspective as to what “good data”
means, and what the management of that data means. Everywhere you turn in finance and in other sectors, you see firms hiring data scientists and putting together large analytics projects.
Michael Engelman, director, North American data practice,
GFT: For the sell side, you’re looking at the monitoring and
reporting of risk—both market risk and credit risk—and regulatory
compliance, ranging from the Volcker Rule and BCBS 239 to anti-money laundering (AML) protocols. On the buy side, strong data management is critical for monitoring fees, along with a number of reporting needs, including internal and external position reporting, performance reporting, and financial reporting. All of these
processes require a consistent, consolidated, cross-asset class and
cross-region view of the firm’s trading activity and positions.
In order for these processes to work correctly, these building
blocks need to be gathered horizontally from multiple trading and
accounting systems within an organization, and represented in a
standardized way. This allows the data to be resolved to a consistent
set of reference information and harmonized across an enterprise.
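As a rough illustration of the harmonization Engelman describes, the Python sketch below maps records from two hypothetical trading systems onto one canonical shape, resolved against a toy reference-data table. The system names, field names and identifiers are invented for the example, not any firm's actual schema.

```python
from dataclasses import dataclass

# Illustrative reference data: maps each system's local identifier to a
# canonical internal ID. Both the systems and the IDs are fictional.
REFERENCE_DATA = {
    ("equitysys", "IBM US"): "SEC-000123",
    ("fixedincsys", "US459200HU86"): "SEC-000456",
}

@dataclass
class CanonicalPosition:
    security_id: str    # canonical identifier from reference data
    quantity: float
    source_system: str

def normalize(source_system: str, raw: dict) -> CanonicalPosition:
    """Map one system's local record into the canonical representation."""
    # Each source system spells the same facts differently.
    if source_system == "equitysys":
        local_id, qty = raw["ticker"], raw["shares"]
    elif source_system == "fixedincsys":
        local_id, qty = raw["isin"], raw["face_value"]
    else:
        raise ValueError(f"unknown system: {source_system}")
    security_id = REFERENCE_DATA[(source_system, local_id)]
    return CanonicalPosition(security_id, float(qty), source_system)

# Records from two systems resolve to the same canonical shape, so
# downstream risk and reporting see one consistent, cross-system view.
print(normalize("equitysys", {"ticker": "IBM US", "shares": 1000}))
print(normalize("fixedincsys", {"isin": "US459200HU86", "face_value": 250000}))
```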
Dominique Tanner, head of business development, SIX
Financial Information: On one hand, you have the core
operational business processes like order routing, clearing and
settlement or corporate actions processing, which tend to be highly
automated. Any defect in the data that drives those processes leads
to exceptions, which usually need to be resolved manually, causing
delays and increasing cost. On the other hand, you have your high-stakes processes, which are less time-critical but can have a considerable financial or reputational impact. These include client on-boarding, regulatory reporting or tax reporting/processing. Portfolio valuation for NAV calculation on the buy side or risk management on the sell side are other examples.
Paul McInnis, head of enterprise data management, Eagle:
You really can’t overstate it—your data has to be clean. If people
within an organization—whether it’s in the back, middle or front
office—have reason to question the veracity of the data or come
across inconsistencies, they will lose faith in it and disregard it
altogether.
That being said, there are a few areas in particular in which there is quite literally zero room for error. Functions and processes such
as trade execution, compliance and regulatory reporting, risk and
exposure measurement, and client reporting are all areas in financial
services where “being close” doesn’t cut it. The operational and
reputational risks are too great.
Q What are the biggest challenges facing buy-side and sell-side firms in terms of their data management practices? Are these challenges mostly technology related, or are they a mix of technology and operational/governance issues?
Bottega: There are many challenges that face banks with respect to their data management. If I had to narrow it down to a few key challenges, I would say first, the challenge of legacy systems and environments. We have many years of disparate infrastructures and business processes that make holistic data management difficult. Second would be culture. Banks have to realize that one of their most valuable assets is their data, and they should treat it as such.
Thus, changing their business processes and mindset (their data program support and governance), to properly acquire, curate and utilize data according to industry best practices is key. Finally, and probably most important, is driving consistent meaning of data. As an industry, we have been focused—and are pretty good at—data "processing" and the mechanics of capturing, storing and moving data. But we have not been good at managing "content"—getting agreement across an organization or, dare I say, across an industry, as to the semantic meaning of data.
Fleming: One of the things that a good data governance program will create is the notion of helping people understand what data is important, versus all data. You can't stand at attention for all the data in the firm; there's just too much of it.
Years ago I worked for a gentleman who taught me this phrase: People know what they do, but not what they are doing. People who are in a data manufacturing job never understood what the implications of getting it wrong were; they did what they did and other people would come along and fix that data.
One of the very important things
that’s come along is this idea of data
lineage: Where has this data come
from that you’re using, and how good
is that data? So for the first time,
people are starting to think about
that end-to-end process of data.
The last piece is putting together some kind of metrics or scorecard. When you have so much data, you need to figure out how good the data is, and you have to provide some kind of empirical evidence back to the firm.
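Fleming's point about empirical evidence can be made concrete with a toy scorecard. The sketch below scores a handful of records for completeness and validity; the fields, sample values and rules are illustrative assumptions only.

```python
# A toy data-quality scorecard: score each critical field for
# completeness and validity, and report an empirical percentage back
# to the firm. Field names and rules are invented for the example.
records = [
    {"isin": "US0378331005", "price": 172.5, "currency": "USD"},
    {"isin": "", "price": 98.1, "currency": "USD"},              # missing ISIN
    {"isin": "DE0005557508", "price": -4.0, "currency": "EUR"},  # bad price
]

def completeness(recs, field):
    """Share of records where the field is populated."""
    return sum(1 for r in recs if r.get(field)) / len(recs)

def validity(recs, field, rule):
    """Share of records whose value passes the supplied rule."""
    return sum(1 for r in recs if rule(r.get(field))) / len(recs)

scorecard = {
    "isin completeness": completeness(records, "isin"),
    "price validity": validity(records, "price",
                               lambda p: p is not None and p > 0),
}
for metric, score in scorecard.items():
    print(f"{metric}: {score:.0%}")
```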
Engelman: Most financial services firms move data from place
to place and transform it into multiple formats. This constant
transformation introduces hundreds or thousands of reconciliation
points, which is extremely costly and introduces potential failure
points into a firm’s operations. Moreover, large-scale metrics and
measurement programs are nearly impossible to execute if data
representations are inconsistent across the firm.
As a result, the majority of data management programs are
being executed in silos; front-office repositories, risk repositories,
compliance repositories and finance repositories each transform
and reconcile the data in separate functions. This creates duplication of effort and cost and does not move the bank forward towards a common understanding of data on an enterprise level.
These difficulties are further compounded by a lack of
common standards and common data language. As most banks do
not understand their data lineage to the degree required to allow
the business to confidently state that the information they are
viewing is correct and complete, critical components to successful
program delivery—such as data definition, documentation, testing and implementation—are often left out of corporate software development life cycle processes.
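A minimal picture of one of the reconciliation points Engelman describes: the same book of positions, held in two repositories in different functions, is compared, and every mismatch surfaces as an exception to investigate. The repository contents are invented for the example.

```python
# One reconciliation point between two copies of the same positions.
front_office = {"SEC-000123": 1000, "SEC-000456": 250000}
risk_store   = {"SEC-000123": 1000, "SEC-000456": 240000}  # stale copy

def reconcile(left: dict, right: dict) -> list[str]:
    """Compare two repositories and return one break per mismatch."""
    breaks = []
    for sec_id in left.keys() | right.keys():
        l, r = left.get(sec_id), right.get(sec_id)
        if l != r:
            breaks.append(f"{sec_id}: front-office={l}, risk={r}")
    return breaks

# Every transformation hop adds a comparison like this one, which is
# why firms with many hops end up carrying thousands of such points.
for brk in reconcile(front_office, risk_store):
    print(brk)
```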
Tanner: Managing the complexity of normalising and mapping all
inbound data formats to the required outbound data formats poses
a sizeable challenge to data managers within buy-side and sell-side
firms alike. The solution requires a clear focus on the management of metadata. Managing metadata needs technology (i.e., data dictionary systems), clearly defined processes and a corresponding governance framework that defines ownership and responsibilities. But above all it requires expertise in the underlying data formats and their interpretations, as well as their usage across data sources and consuming systems. Acquiring and maintaining this expertise within
the data management function is key to a sustainable solution.
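One hedged reading of the data-dictionary idea Tanner raises: a metadata table, rather than hand-written code, drives the mapping from each inbound feed's field names to the firm's canonical vocabulary. All feed and field names below are hypothetical.

```python
# A data-dictionary sketch: the mapping metadata is data, so adding a
# new inbound feed means adding a dictionary entry, not new code.
DATA_DICTIONARY = {
    "vendor_a": {"Sym": "ticker", "Px": "price", "Ccy": "currency"},
    "vendor_b": {"symbol": "ticker", "last": "price", "curr": "currency"},
}

def to_canonical(feed: str, record: dict) -> dict:
    """Rename one feed's fields to the firm's canonical vocabulary."""
    mapping = DATA_DICTIONARY[feed]
    return {canonical: record[raw] for raw, canonical in mapping.items()}

# Two differently shaped inbound records normalize to one shape.
print(to_canonical("vendor_a", {"Sym": "ABC", "Px": 10.5, "Ccy": "USD"}))
print(to_canonical("vendor_b", {"symbol": "ABC", "last": 10.5, "curr": "USD"}))
```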
McInnis: The biggest challenge facing the industry unquestionably revolves around data governance. You saw it during the global
credit crisis in 2008, when very large and diverse asset managers
were basically in the dark as it related to their total exposures.
For many, their data was in silos and they didn’t have the tools
to be able to go in and quickly figure out their liabilities or the
risk sensitivities across their entire portfolio. Thus, following the
crisis there was this renewed appreciation, industry-wide, for an
Investment Book of Record (IBOR), for example.
There is growing awareness that data governance is not an
“IT problem” anymore; it’s been elevated to the business owners,
particularly as regulators, clients and other constituencies demand
accurate, consistent data. Data lineage and the ability to produce an audit trail are also vitally important given the regulatory scrutiny.
But there is still a level of cultural resistance. For instance, when you think of the financial services industry today, you're talking about these sprawling global organizations that operate across various asset classes and across geographies. So it can be a herculean task just to get enterprise-wide alignment on a business conceptual ontology standard, one that provides a strict and immutable interpretation of the structure and obligations of financial instruments, legal entities, market data and financial processes. It seems simple, but it's incredibly complex, and this work is vital. This is one of the reasons there needs to be a strong advocate from within and atop the organization to ensure there's buy-in company-wide.
Q Is it realistic in 2015 to expect capital markets firms
to design and build enterprise data warehouses to
cater for their data needs, or does the answer lie with
discrete data management tools?
Bottega: This assumes that building a data warehouse is the
correct approach to implementing data management. Warehouses
and discrete data management tools are just that: tools towards
an objective. In certain cases, warehouse implementation may
not be the best approach. Other technologies like visualization
and semantic technologies are introducing new ways to manage
information. In my opinion, the goal in 2015 should be to drive
consistent meaning of data, coupled with adopting and driving a
program of disciplined and organized best practices. With this as
the foundation, technology becomes the enabler of this new "data hygiene," and firms can utilize any number of technology solutions so long as they are built in support of these principles.
Engelman: Financial services firms are seeing a “new normal”
environment in 2015 in relation to regulatory reform and data
management. To meet these regulations, firms must not only have
tighter governance policies, but also robust and secure systems for
storing and sorting through the mountain of data that needs to be
collected to produce the required reports.
As a result of these new needs, capital markets firms are turning towards data management tools to provide them with crucial capabilities; in addition to automating the data governance process
and housing data definitions, these tools can be used to design
and build data warehouses. However, tools alone will not ensure
the delivery of a successful data warehouse. Sound information
architectures must include the components of data modeling, data
quality, data stewardship and governance.
Tanner: Different strategies can be applied. Centralised data
repositories or data warehouses presenting the data in a uniform
and consistent way would ultimately solve many of the issues
common in today's environment. Some firms have already achieved this or are close to doing so, but many have learned that it is a complex, time- and resource-consuming endeavour. The environment changes constantly, not least due to the volume of new regulation being introduced. The business wants to see a shorter time to market and benefits realised in months, not years. This requires more agility in data management, and with it the application of tactical solutions to specific data needs. Discrete tools or systems can help gain speed and flexibility but, on the other hand, increase the cost and complexity of the overall data management function.
McInnis: I really don’t think firms can achieve what they want
to—or need to—with either discrete data management tools or an
in-house enterprise data warehouse built from scratch. Discrete
tools will just result in siloed data, which creates the governance
issues I mentioned earlier. It’s one of the reasons Eagle offers a
continuum of services that span from an on-premise solution to a
co-sourced or fully outsourced offering through BNY Mellon.
Q Can capital markets firms realistically look to partner
with specialist data management vendors and
“outsource” their data management functions? If so, what
functions would such an arrangement cover?
Fleming: The vendor community has multiple responses to these
problems, and each of them has their own spin. Some of the larger
vendor companies have the perspective, “If you do everything
with us, you won’t have a problem.” The fact of the matter is
that there’s not a single firm that has just one way of doing stuff.
A lot of people are focusing on structured data—because that's what they know best—but what we really need to do is look at the world of structured, unstructured and semi-structured data. The problem is that there
are different classes of tools with different capabilities, so that’s an
area where the industry could do better.
Engelman: Capital markets firms have rightfully developed a
mixed view of outsourcing. On one hand, the cost savings and
increased efficiencies that are promised by outsourcing vendors are
too tempting to ignore, particularly when considering the continuously shrinking margins in the "new normal" world. On the other hand, the increased governance and communication problems that result from offshoring arrangements frequently outweigh the benefits offered by outsourcing, particularly for processes as sensitive
and complex as data management and regulatory compliance.
Capital markets firms must ensure that their selected vendor
can work as an extension of the firm, operating in a collaborative
outsourcing, or co-sourcing, business paradigm.
Tanner: Financial firms are increasingly looking at other sectors
to find ways of industrializing their own value chain. Firms no
longer see the need to produce all services in-house, specifically if
they are not considered to be part of the firm’s core competences.
Data management is an obvious candidate in this respect. Key concepts of this strategy are process innovation and standardisation. Existing processes need to be analysed and their potential for standardisation needs to be assessed.
Those with high potential, usually more routine and less firm-specific tasks, can easily be handed over to an external partner. They may include data feed management, system operation, data checks and validations. On the other side there are those with less potential. They could include exception handling for time-critical processes, managing changes with wide-ranging impacts on downstream systems, or producing value-added data which is unique to the firm.
McInnis: Absolutely. That’s where the market is going, toward
managed services and a service-oriented architecture, or SOA.
Firms don’t really care who provides the security master record,
as long as it’s clean and accurate. It’s the integration piece that
clients are increasingly seeking to hand over because it requires a
tremendous amount of bandwidth to keep up with it, as the inputs
are changing all the time.
Third-party vendors, who are working with hundreds of firms,
have the economies of scale to dedicate the necessary resources to
stay on top of things like alterations to benchmark feeds. Also, it
doesn’t necessarily help clients if they simply outsource one piece
of it, or just one component of their data, because it still requires
considerable resources to monitor and integrate everything else
like core security data, ratings and corporate actions.
Q Typically, where do capital markets firms go wrong when it comes to managing their data? What areas of data management do they tend to overlook or underestimate in terms of cost, time and complexity?
Bottega: In general, although we’re getting better, people need
to realize that data management is not a finite, short-term project,
but a multi-year, culture-changing initiative that must become part
of a firm’s operational DNA. I’ve seen too many projects cut short
because the expectation was an immediate return on investment.
Data management will improve efficiency and increase transparency of data, which leads to better risk management and marketing. But this will take time (and money) to build the proper infrastructures needed to support these goals. Firms have to continue to realize that there is a cost to all this—not so much in building good data practices, but in not doing so. A fragile and fragmented information infrastructure was not helpful during the financial crisis. Strong discipline, with accurate and transparent content, is critical to the soundness and safety of banks and the industry.
Engelman: Firms that govern data in an ad hoc fashion are inherently setting themselves up for failure. The collection, storage, reconciliation and analysis of data are unavoidably complex, meaning firms that do not implement a formal governance program stand to lose substantial time and money in data management. To avoid such expenditures and ensure enterprise-wide alignment, data governance programs must be highly standardized, adhering to a common language and business-domain model. A centralized process and dictionary for standardization should be supported by a common technical infrastructure to ensure the information derived from the data is consistent across the enterprise. This standardization requires that compromises be made across business applications, which introduces complexity in finalizing and signing off on specifications. Moreover, keeping standards current can only be accomplished by establishing an organizational structure that owns the standard (i.e., a run-the-bank function)—which can be difficult year over year as budgets and priorities change.
Ownership of these tasks often falls to IT organizations, which lack the authority to define and enforce standards or to implement change. This power vacuum often results in decentralized data management, which causes a multitude of problems. Without central alignment on how adjustments are made and, more importantly, how they are communicated, it is nearly assured that the same adjustment will be made multiple times within the organization, and may be made in an inconsistent way. An overall approach to the communication of adjustments across all data repositories is important to ensure the integrity of data.
Finally, and perhaps most crucially, firms must ensure that their employees are capable of understanding institutional data. Most capital markets firms do not understand their data lineage to the degree required to allow the business to confidently state that the information they are viewing is correct and complete; knowledge of data is, in many cases, limited to what people view on user interfaces. This is further challenged by vendor proprietary systems, attrition of staff supporting in-house systems and a lack of usable documentation.
What you really want to have is a good data architecture
program. Every time you look to build a new system along the
way, you don’t want people going out there and just solving the
problem for themselves. How are people going to take the data?
Consume the data? And if this new system is going to create data,
you have to make sure that you’re not making another copy of
something that exists someplace else. You need to have something like a data architecture review board for the on-boarding of data to make sure everything is tied together.
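A toy version of the onboarding check such a review board might perform: before a new system registers a dataset, an assumed central registry is consulted so the firm does not create another copy of data that already exists somewhere else. The registry contents and names are invented for the example.

```python
# Hypothetical central registry of datasets the firm already owns.
DATASET_REGISTRY = {
    "security_master": {"owner": "reference-data", "system": "refdata-hub"},
    "counterparty_master": {"owner": "client-data", "system": "cdm-core"},
}

def review_onboarding(dataset: str, proposing_system: str) -> str:
    """Reject duplicate copies; register genuinely new datasets."""
    existing = DATASET_REGISTRY.get(dataset)
    if existing:
        return (f"REJECT: '{dataset}' already owned by {existing['owner']} "
                f"in {existing['system']}; consume it from there.")
    DATASET_REGISTRY[dataset] = {"owner": "tbd", "system": proposing_system}
    return f"APPROVE: '{dataset}' registered to {proposing_system}."

print(review_onboarding("security_master", "new-analytics-app"))
print(review_onboarding("esg_scores", "new-analytics-app"))
```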
Tanner: For some data, it is essential to keep a history over time and to apply historic changes to it. Imagine a regulator spotting an error in a report that was produced three months ago and asking for a corrected one covering the same period. The data that caused the error might be frequently changing and is now in a different state. If you can't correct the mistake in the history and re-run the report for that period, you would not be able to meet your obligations unless you revert to a manual process of recreating it. This is what many firms have to do, because data management did not pay enough attention to identifying the data that needs to be kept over time. Incorporating time in a data repository is complex, and system support is limited.
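Tanner's regulator scenario is essentially a bitemporal-data problem. The sketch below, with invented values, keeps both the date a price was valid and the date the system recorded it, so a report can be re-run either as originally produced or as later corrected.

```python
import datetime as dt

# Each fact carries valid time (when it was true in the real world)
# and knowledge time (when the system recorded it). Values invented.
history = [
    # (security, price, valid_date, recorded_on)
    ("SEC-000123", 101.0, dt.date(2015, 3, 2), dt.date(2015, 3, 2)),
    ("SEC-000123", 110.0, dt.date(2015, 3, 2), dt.date(2015, 6, 1)),  # fix
]

def price_as_of(security: str, valid_date: dt.date, known_on: dt.date):
    """Latest correction for valid_date that was recorded by known_on."""
    candidates = [r for r in history
                  if r[0] == security and r[2] == valid_date
                  and r[3] <= known_on]
    return max(candidates, key=lambda r: r[3])[1] if candidates else None

# Re-running the original (wrong) March report versus the corrected one:
print(price_as_of("SEC-000123", dt.date(2015, 3, 2), dt.date(2015, 3, 31)))  # 101.0
print(price_as_of("SEC-000123", dt.date(2015, 3, 2), dt.date(2015, 6, 30)))  # 110.0
```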
One of the reactions frequently seen is that of applying brute force and maintaining a history of everything. This, however, is very costly and adds a lot of unnecessary complexity. An analysis of the history requirements of the business processes should be conducted, identifying the data points for which history needs to be maintained. A differentiated approach will ensure that a firm can fulfil the business's essential needs for access to historic data while at the same time managing cost and complexity.
McInnis: Unfortunately, a lot of firms still view data management
solely through a “security master data” lens, and fail to leverage
the information that’s available to them across their organizations.
Make no mistake, data management projects are costly, time-consuming and truly complex, but they're well worth it when you consider not only the new efficiencies, but the value-add as well.
For instance, I'll see some firms that extend their data management capabilities to their investment operations or to complement their portfolio management—be it through an IBOR, business intelligence, or equity and fixed-income attribution. When you juxtapose the value these firms are able to generate from their EDM capabilities against those that merely see data management as a reporting tool or for pricing, the difference is night and day, and you get a sense of the respective ROI potential or, conversely, the opportunity costs. I think we've entered a new era in which those that have a robust data management platform in place, and leverage it across their organizations, have a distinct competitive advantage over those that don't. In time, these capabilities will serve as a true moat and differentiator.
waterstechnology.com	 June 2015

More Related Content

What's hot

00 14092011-0900-derick-de leo
00 14092011-0900-derick-de leo00 14092011-0900-derick-de leo
00 14092011-0900-derick-de leoguiabusinessmedia
 
Big Data is Here for Financial Services White Paper
Big Data is Here for Financial Services White PaperBig Data is Here for Financial Services White Paper
Big Data is Here for Financial Services White PaperExperian
 
D&B Whitepaper The Big Payback On Data Quality
D&B Whitepaper The Big Payback On Data QualityD&B Whitepaper The Big Payback On Data Quality
D&B Whitepaper The Big Payback On Data QualityRebecca Croucher
 
White_Paper_Beyond_Visualisation copy
White_Paper_Beyond_Visualisation copyWhite_Paper_Beyond_Visualisation copy
White_Paper_Beyond_Visualisation copyTania Mushtaq
 
The Rise of Big Data and the Chief Data Officer (CDO)
The Rise of Big Data and the Chief Data Officer (CDO)The Rise of Big Data and the Chief Data Officer (CDO)
The Rise of Big Data and the Chief Data Officer (CDO)gcharlesj
 
How data analytics will drive the future of banking
How data analytics will drive the future of bankingHow data analytics will drive the future of banking
How data analytics will drive the future of bankingSamuel Olaegbe
 
risk management POV Digital (V.08)
risk management POV Digital (V.08)risk management POV Digital (V.08)
risk management POV Digital (V.08)Isabel Viegas
 
Getting IT Right
Getting IT RightGetting IT Right
Getting IT Rightwjgay19
 
Enterprise Information Management: Strategy, Best Practices & Technologies on...
Enterprise Information Management: Strategy, Best Practices & Technologies on...Enterprise Information Management: Strategy, Best Practices & Technologies on...
Enterprise Information Management: Strategy, Best Practices & Technologies on...FindWhitePapers
 
The Changing Data Quality & Data Governance Landscape
The Changing Data Quality & Data Governance LandscapeThe Changing Data Quality & Data Governance Landscape
The Changing Data Quality & Data Governance LandscapeTrillium Software
 
Reaping the benefits of Big Data and real time analytics
Reaping the benefits of Big Data and real time analyticsReaping the benefits of Big Data and real time analytics
Reaping the benefits of Big Data and real time analyticsThe Marketing Distillery
 
North American Financial Information Summit 2010, New York- May 26
North American Financial Information Summit 2010, New York- May 26North American Financial Information Summit 2010, New York- May 26
North American Financial Information Summit 2010, New York- May 26referencedata
 
Implementing business intelligence-whitepaper
Implementing business intelligence-whitepaperImplementing business intelligence-whitepaper
Implementing business intelligence-whitepaperKaizenlogcom
 
Business Critical Processes
Business Critical ProcessesBusiness Critical Processes
Business Critical ProcessesLarry Levine
 
Data Governance That Drives the Bottom Line
Data Governance That Drives the Bottom LineData Governance That Drives the Bottom Line
Data Governance That Drives the Bottom LinePrecisely
 
Big-Data-The-Case-for-Customer-Experience
Big-Data-The-Case-for-Customer-ExperienceBig-Data-The-Case-for-Customer-Experience
Big-Data-The-Case-for-Customer-ExperienceAndrew Smith
 

What's hot (20)

00 14092011-0900-derick-de leo
00 14092011-0900-derick-de leo00 14092011-0900-derick-de leo
00 14092011-0900-derick-de leo
 
Big Data is Here for Financial Services White Paper
Big Data is Here for Financial Services White PaperBig Data is Here for Financial Services White Paper
Big Data is Here for Financial Services White Paper
 
Capitalizing on Big Data
Capitalizing on Big DataCapitalizing on Big Data
Capitalizing on Big Data
 
Big Data strategy components
Big Data strategy componentsBig Data strategy components
Big Data strategy components
 
D&B Whitepaper The Big Payback On Data Quality
D&B Whitepaper The Big Payback On Data QualityD&B Whitepaper The Big Payback On Data Quality
D&B Whitepaper The Big Payback On Data Quality
 
White_Paper_Beyond_Visualisation copy
White_Paper_Beyond_Visualisation copyWhite_Paper_Beyond_Visualisation copy
White_Paper_Beyond_Visualisation copy
 
The Rise of Big Data and the Chief Data Officer (CDO)
The Rise of Big Data and the Chief Data Officer (CDO)The Rise of Big Data and the Chief Data Officer (CDO)
The Rise of Big Data and the Chief Data Officer (CDO)
 
How data analytics will drive the future of banking
How data analytics will drive the future of bankingHow data analytics will drive the future of banking
How data analytics will drive the future of banking
 
risk management POV Digital (V.08)
risk management POV Digital (V.08)risk management POV Digital (V.08)
risk management POV Digital (V.08)
 
Getting IT Right
Getting IT RightGetting IT Right
Getting IT Right
 
Enterprise Information Management: Strategy, Best Practices & Technologies on...
Enterprise Information Management: Strategy, Best Practices & Technologies on...Enterprise Information Management: Strategy, Best Practices & Technologies on...
Enterprise Information Management: Strategy, Best Practices & Technologies on...
 
The Changing Data Quality & Data Governance Landscape
The Changing Data Quality & Data Governance LandscapeThe Changing Data Quality & Data Governance Landscape
The Changing Data Quality & Data Governance Landscape
 
Data Disconnect
Data DisconnectData Disconnect
Data Disconnect
 
Reaping the benefits of Big Data and real time analytics
Reaping the benefits of Big Data and real time analyticsReaping the benefits of Big Data and real time analytics
Reaping the benefits of Big Data and real time analytics
 
North American Financial Information Summit 2010, New York- May 26
North American Financial Information Summit 2010, New York- May 26North American Financial Information Summit 2010, New York- May 26
North American Financial Information Summit 2010, New York- May 26
 
Implementing business intelligence-whitepaper
Implementing business intelligence-whitepaperImplementing business intelligence-whitepaper
Implementing business intelligence-whitepaper
 
Business Critical Processes
Business Critical ProcessesBusiness Critical Processes
Business Critical Processes
 
Data Governance That Drives the Bottom Line
Data Governance That Drives the Bottom LineData Governance That Drives the Bottom Line
Data Governance That Drives the Bottom Line
 
Big-Data-The-Case-for-Customer-Experience
Big-Data-The-Case-for-Customer-ExperienceBig-Data-The-Case-for-Customer-Experience
Big-Data-The-Case-for-Customer-Experience
 
Data Management
Data ManagementData Management
Data Management
 

Viewers also liked (11)

Diapositivas stefania cepeda vitores
Diapositivas stefania cepeda vitoresDiapositivas stefania cepeda vitores
Diapositivas stefania cepeda vitores
 
Computacion
ComputacionComputacion
Computacion
 
Craig Polden CV June 2015
Craig Polden CV June 2015Craig Polden CV June 2015
Craig Polden CV June 2015
 
Extract Software by ExactSolid
Extract Software by ExactSolidExtract Software by ExactSolid
Extract Software by ExactSolid
 
CURICULUM VITAE DIAH
CURICULUM VITAE DIAHCURICULUM VITAE DIAH
CURICULUM VITAE DIAH
 
Mackay HHS - Wish You worked here?
Mackay HHS - Wish You worked here?Mackay HHS - Wish You worked here?
Mackay HHS - Wish You worked here?
 
Encuadres
EncuadresEncuadres
Encuadres
 
Resus Room Service - Sarah Webb
Resus Room Service - Sarah WebbResus Room Service - Sarah Webb
Resus Room Service - Sarah Webb
 
Startup marketing plan
Startup marketing plan Startup marketing plan
Startup marketing plan
 
10 luisdiaz26165016
10 luisdiaz2616501610 luisdiaz26165016
10 luisdiaz26165016
 
How often to post on LinkedIn?
How often to post on LinkedIn?How often to post on LinkedIn?
How often to post on LinkedIn?
 

Similar to DataManagement_Waters_GFT_trimmed

Is effective Data Governance a choice or necessity in Financial Services?
Is effective Data Governance a choice or necessity in Financial Services?Is effective Data Governance a choice or necessity in Financial Services?
Is effective Data Governance a choice or necessity in Financial Services?Sam Thomsett
 
Reference data management in financial services industry
Reference data management in financial services industryReference data management in financial services industry
Reference data management in financial services industryNIIT Technologies
 
Eiu collibra transforming data into action-the business outlook for data gove...
Eiu collibra transforming data into action-the business outlook for data gove...Eiu collibra transforming data into action-the business outlook for data gove...
Eiu collibra transforming data into action-the business outlook for data gove...The Economist Media Businesses
 
The new ‘A and B’ of the Finance Function: Analytics and Big Data - -Evolutio...
The new ‘A and B’ of the Finance Function: Analytics and Big Data - -Evolutio...The new ‘A and B’ of the Finance Function: Analytics and Big Data - -Evolutio...
The new ‘A and B’ of the Finance Function: Analytics and Big Data - -Evolutio...Balaji Venkat Chellam Iyer
 
Data governance, Information security strategy
Data governance, Information security strategyData governance, Information security strategy
Data governance, Information security strategyvasanthi4ever
 
Baofortheintelligententerprise
BaofortheintelligententerpriseBaofortheintelligententerprise
BaofortheintelligententerpriseFriedel Jonker
 
Information Governance: Reducing Costs and Increasing Customer Satisfaction
Information Governance: Reducing Costs and Increasing Customer SatisfactionInformation Governance: Reducing Costs and Increasing Customer Satisfaction
Information Governance: Reducing Costs and Increasing Customer SatisfactionCapgemini
 
Hadoop training in bangalore
Hadoop training in bangaloreHadoop training in bangalore
Hadoop training in bangaloreappaji intelhunt
 
Not waving-but-drowning
Not waving-but-drowningNot waving-but-drowning
Not waving-but-drowningClaire Samuel
 
Predictive and prescriptive analytics: Transform the finance function with gr...
Predictive and prescriptive analytics: Transform the finance function with gr...Predictive and prescriptive analytics: Transform the finance function with gr...
Predictive and prescriptive analytics: Transform the finance function with gr...Grant Thornton LLP
 
Co pilot of the business
Co pilot of the businessCo pilot of the business
Co pilot of the businessikaro1970
 
Building an Effective Data Management Strategy
Building an Effective Data Management StrategyBuilding an Effective Data Management Strategy
Building an Effective Data Management StrategyHarley Capewell
 
Modernizing Data Quality & Governance: Unlocking Performance & Reducing Risk
Modernizing Data Quality & Governance: Unlocking Performance & Reducing RiskModernizing Data Quality & Governance: Unlocking Performance & Reducing Risk
Modernizing Data Quality & Governance: Unlocking Performance & Reducing RiskWorldwide Business Research
 
Modern Finance and Best Use of Analytics - Oracle Accenture Case Study
Modern Finance and Best Use of Analytics - Oracle Accenture Case StudyModern Finance and Best Use of Analytics - Oracle Accenture Case Study
Modern Finance and Best Use of Analytics - Oracle Accenture Case StudyJames Hartshorn FIRP MIoD
 
EAI Checklist
EAI ChecklistEAI Checklist
EAI ChecklistIdeba
 

Similar to DataManagement_Waters_GFT_trimmed (20)

Is effective Data Governance a choice or necessity in Financial Services?
Is effective Data Governance a choice or necessity in Financial Services?Is effective Data Governance a choice or necessity in Financial Services?
Is effective Data Governance a choice or necessity in Financial Services?
 
Reference data management in financial services industry
Reference data management in financial services industryReference data management in financial services industry
Reference data management in financial services industry
 
Eiu collibra transforming data into action-the business outlook for data gove...
Eiu collibra transforming data into action-the business outlook for data gove...Eiu collibra transforming data into action-the business outlook for data gove...
Eiu collibra transforming data into action-the business outlook for data gove...
 
The new ‘A and B’ of the Finance Function: Analytics and Big Data - -Evolutio...
The new ‘A and B’ of the Finance Function: Analytics and Big Data - -Evolutio...The new ‘A and B’ of the Finance Function: Analytics and Big Data - -Evolutio...
The new ‘A and B’ of the Finance Function: Analytics and Big Data - -Evolutio...
 
Data governance, Information security strategy
Data governance, Information security strategyData governance, Information security strategy
Data governance, Information security strategy
 
Baofortheintelligententerprise
BaofortheintelligententerpriseBaofortheintelligententerprise
Baofortheintelligententerprise
 
Information Governance: Reducing Costs and Increasing Customer Satisfaction
Information Governance: Reducing Costs and Increasing Customer SatisfactionInformation Governance: Reducing Costs and Increasing Customer Satisfaction
Information Governance: Reducing Costs and Increasing Customer Satisfaction
 
Hadoop training in bangalore
Hadoop training in bangaloreHadoop training in bangalore
Hadoop training in bangalore
 
Article in Techsmart
Article in TechsmartArticle in Techsmart
Article in Techsmart
 
The state of data in 2015
The state of data in 2015The state of data in 2015
The state of data in 2015
 
Not Waving but Drowning - The State of Data in 2015
Not Waving but Drowning - The State of Data in 2015Not Waving but Drowning - The State of Data in 2015
Not Waving but Drowning - The State of Data in 2015
 
Not waving-but-drowning
Not waving-but-drowningNot waving-but-drowning
Not waving-but-drowning
 
Predictive and prescriptive analytics: Transform the finance function with gr...
Predictive and prescriptive analytics: Transform the finance function with gr...Predictive and prescriptive analytics: Transform the finance function with gr...
Predictive and prescriptive analytics: Transform the finance function with gr...
 
Co pilot of the business
Co pilot of the businessCo pilot of the business
Co pilot of the business
 
Building an Effective Data Management Strategy
Building an Effective Data Management StrategyBuilding an Effective Data Management Strategy
Building an Effective Data Management Strategy
 
Big data is a popular term used to describe the exponential growth and availa...
Big data is a popular term used to describe the exponential growth and availa...Big data is a popular term used to describe the exponential growth and availa...
Big data is a popular term used to describe the exponential growth and availa...
 
Modernizing Data Quality & Governance: Unlocking Performance & Reducing Risk
Modernizing Data Quality & Governance: Unlocking Performance & Reducing RiskModernizing Data Quality & Governance: Unlocking Performance & Reducing Risk
Modernizing Data Quality & Governance: Unlocking Performance & Reducing Risk
 
Modern Finance and Best Use of Analytics - Oracle Accenture Case Study
Modern Finance and Best Use of Analytics - Oracle Accenture Case StudyModern Finance and Best Use of Analytics - Oracle Accenture Case Study
Modern Finance and Best Use of Analytics - Oracle Accenture Case Study
 
Data mining
Data miningData mining
Data mining
 
EAI Checklist
EAI ChecklistEAI Checklist
EAI Checklist
 

DataManagement_Waters_GFT_trimmed

  • 1. waterstechnology.com June 2015 Special Report Data Management Sponsored by:
  • 2. June 2015 waterstechnology.com4 Special Report Data Management Data represents the lifeblood of all capital markets firms’ operating environments. But thanks to new regulatory requirements, firms are having to take on great loads of information, while at the same time they need clean and reliable data to manage risk and generate alpha. It’s a balancing act between data management and analysis, overseen by a proper data governance structure. Lifeblood Improving the Q Which business processes across the buy side and sell side are currently most reliant on clean, consist- ent data, and why? John Bottega, senior advisor and consultant, EDM Council: Every function is reliant on clean consistent data. But what we have seen in the past several years is a focus from a “defensive” posture of data to an “offensive” posture on data. One of the outcomes of the financial crisis has been an increased focus on regulation—specifically, focus on risk. Risk requires consistent, timely and accurate data. But it also requires consistent and accurate “engineering” of data, including proper design of information so that inherent linkages and dependencies of data are exposed and known. Linked risk, counterparty risk and capital management are all depend upon this interconnectedness of data. John Fleming, chief data governance officer, BNY Mellon: The world of analytics has exploded under the umbrella of big data. As a result of that, people are looking for new insight into the data that they have. To perform analytics and to get value out of what you’re analyzing, your data has to be good, clean and fit for purpose. Analytics on troubled data is going to lead to undesired results. But I think the big change right now—in a world where you could argue that not a lot has changed—is the fact that the regulators have come in and given their perspective as to what “good data” means, and what the management of that data means. Everywhere you turn around in finance and in other sectors, you’re seeing the hiring of data scientists and putting together large analytics projects. Michael Engelman, director, North American data practice, GFT: For the sell side, you’re looking at the monitoring and reporting of risk—both market risk and credit risk—and regulatory compliance, ranging from the Volcker Rule and BCBS239, to anti- money laundering (AML) protocols. On the buy side, strong data management is critical for monitoring fees, along with a number of reporting needs, including internal and external position report- ing, performance reporting, and financial reporting. All of these processes require a consistent, consolidated, cross-asset class and cross-region view of the firm’s trading activity and positions. In order for these processes to work correctly, these building blocks need to be gathered horizontally from multiple trading and accounting systems within an organization, and represented in a standardized way. This allows the data to be resolved to a consistent set of reference information and harmonized across an enterprise.
  • 3. Roundtable 5waterstechnology.com June 2015 Dominique Tanner, head of business development, SIX Financial Information: On one hand, you have the core operational business processes like order routing, clearing and settlement or corporate actions processing, which tend to be highly automated. Any defect in the data that drives those processes leads to exceptions, which usually need to be resolved manually, causing delays and increasing cost. On the other hand, you have your high-stakes processes, which are less time critical but can have a considerable impact financially or reputation wise. These include client on-boarding, regulatory reporting or tax reporting/process- ing. Portfolio valuation for NAV calculation on the buy side or risk management on the sell side are other examples. Paul McInnis, head of enterprise data management, Eagle: You really can’t overstate it—your data has to be clean. If people within an organization—whether it’s in the back, middle or front office—have reason to question the veracity of the data or come across inconsistencies, they will lose faith in it and disregard it altogether. That being said, there are a few areas in particular, in which there is quite literally zero room for error. Functions and processes such as trade execution, compliance and regulatory reporting, risk and exposure measurement, and client reporting are all areas in financial services where “being close” doesn’t cut it. The operational and reputational risks are too great. Q What are the biggest challenges facing buy-side and sell-side firms in terms of their data man- agement practices? Are these challenges mostly technology related, or are they a mix of technology and operational/governance issues? Bottega: There are many challenges that face banks with respect to their data management. If I had to narrow it down to few key challenges, I would say first, the challenge of legacy systems and environments. We have many years of disparate infra- structures and business processes that make holistic data management difficult. Second would be culture. Banks have to realize that one of their most valuable assets is their data, and they should treat it as such. Thus, changing their business processes and mindset (their data program support and governance), to properly acquire, curate and utilize data according to industry best practices is key. Finally, and probably most important, is driving consistent meaning of data. As an industry, we have been focused—and are pretty good at—data “processing” and the mechanics of capturing, storing and moving data. But we have not been good at managing “con- tent”—getting agreement across an organization or, dare I say, across an industry, as to the semantic meaning of data. Fleming: One of the things that a good data governance program will create is the notion of helping people understand what data is important, versus all data. You can’t stand on attention for all the data in the firm; there’s just too much of it. Years ago I worked for a gentle- man who taught me this phrase: People know what they do, but not what they are doing. People who are in a data manufacturing job never understood what the implications of getting it wrong meant; they did what they did and other people would come along and fix that data. One of the very important things that’s come along is this idea of data lineage: Where has this data come from that you’re using, and how good is that data? 
So for the first time, people are starting to think about that end-to-end process of data. The last piece is putting together some kind of metrics or scorecard to go over. When you have so much data, you need to figure out how good the data is and you have to provide some kind of empirical evidence back to the firm. Engelman: Most financial services firms move data from place to place and transform it into multiple formats. This constant transformation introduces hundreds or thousands of reconciliation Paul McInnis Head of Enterprise Data Management Eagle Investment Systems Tel: +1 781 943 2200 Web: www.eagleinvsys.com “Years ago I worked for a gentleman who taught me this phrase: People know what they do, but not what they are doing. People who are in a data manufacturing job never understood what the implications of getting it wrong meant; they did what they did and other people would come along and fix that data. One of the very important things that’s come along is this idea of data lineage: Where has this data come from that you’re using, and how good is that data? So for the first time, people are starting to think about that end-to-end process of data.” John Fleming, BNY Mellon
  • 4. 6 June 2015 waterstechnology.com Special Report Data Management points, which is extremely costly and introduces potential failure points into a firm’s operations. Moreover, large-scale metrics and measurement programs are nearly impossible to execute if data representations are inconsistent across the firm. As a result, the majority of data management programs are being executed in silos; front-office repositories, risk repositories, compliance repositories and finance repositories each transform and reconcile the data in separate functions. This creates duplica- tion of effort and cost and does not move the bank forward towards a common understanding of data on an enterprise level. These difficulties are further compounded by a lack of common standards and common data language. As most banks do not understand their data lineage to the degree required to allow the business to confidently state that the information they are viewing is correct and complete, critical components to successful program delivery—such as data definition, documentation, test- ing and implementation—are often left out of corporate software development life cycle processes. Tanner: Managing the complexity of normalising and mapping all inbound data formats to the required outbound data formats poses a sizeable challenge to data managers within buy-side and sell-side firms alike. The solution requires a clear focus on the manage- ment of metadata. Managing metadata needs technology, i.e. data dictionary systems, clearly defined processes and a corresponding governance framework that defines ownership and responsibilities. But above all it requires expertise of the underlying data formats and its interpretations, as well as its usage across data sources and con- suming systems. Acquiring and maintaining this expertise within the data management function is key to a sustainable solution. McInnis: The biggest challenge facing the industry unquestiona- bly revolves around data governance. You saw it during the global credit crisis in 2008, when very large and diverse asset managers were basically in the dark as it related to their total exposures. For many, their data was in silos and they didn’t have the tools to be able to go in and quickly figure out their liabilities or the risk sensitivities across their entire portfolio. Thus, following the crisis there was this renewed appreciation, industry-wide, for an Investment Book of Record (IBOR), for example. There is growing awareness that data governance is not an “IT problem” anymore; it’s been elevated to the business owners, particularly as regulators, clients and other constituencies demand accurate, consistent data. Data lineage and the ability to perform an audit trail are also vitally important given the regulatory scrutiny. But there is still a level of cultural resistance. For instance, when you think of the financial services industry today, you’re talking about these sprawling global organizations that operate across vari- ous asset classes and across geographies. So it can be a herculean task just to get enterprise-wide alignment on a business conceptual ontology standard, one that provides a strict and immuta- ble interpretation of the structure and obligations of financial instruments, legal entities, market data and financial processes. It seems simple, but it’s incredibly complex, and this work is vital. This is one of the reasons there needs to be a strong advocate from within and atop the organization to ensure there’s buy-in company-wide. 
Q Is it realistic in 2015 to expect capital markets firms to design and build enterprise data warehouses to cater for their data needs, or does the answer lie with discrete data management tools? Bottega: This assumes that building a data warehouse is the correct approach to implementing data management. Warehouses and discrete data management tools are just that: tools towards an objective. In certain cases, warehouse implementation may not be the best approach. Other technologies like visualization and semantic technologies are introducing new ways to manage information. In my opinion, the goal in 2015 should be to drive consistent meaning of data, coupled with adopting and driving a program of disciplined and organized best practices. With this as the foundation, technology becomes the enabler of this new “data hygiene” and can utilize any number of technology solutions so long as they build in support of these principles. Engelman: Financial services firms are seeing a “new normal” environment in 2015 in relation to regulatory reform and data management. To meet these regulations, firms must not only have tighter governance policies, but also robust and secure systems for storing and sorting through the mountain of data that needs to be collected to produce the required reports. John Bottega EDM Council Michael Engelman Director, North American Data Practice GFT Tel: +1 212 205 3400 Web: www.gft.com
  • 5. 7waterstechnology.com June 2015 Roundtable As a result of these new needs, capital markets firms are turn- ing towards data management tools to provide them with crucial capabilities; in addition to automating the data governance process and housing data definitions, these tools can be used to design and build data warehouses. However, tools alone will not ensure the delivery of a successful data warehouse. Sound information architectures must include the components of data modeling, data quality, data stewardship and governance. Tanner: Different strategies can be applied. Centralised data repositories or data warehouses presenting the data in a uniform and consistent way would ultimately solve many of the issues common in today’s environment. Some firms have already or are close to achieving this, but many have learned that this is a com- plex time- and resource-consuming endeavour. The environment changes constantly, not least due to the volume of new regulation introduced. The business wants to see a shorter time to market and benefits realised in months and not years. This requires more agility in data management and with this the application of tacti- cal solutions to specific data needs. Discrete tools or systems can help gaining speed and flexibility but on the other hand increase the cost and complexity of the overall data management function. McInnis: I really don’t think firms can achieve what they want to—or need to—with either discrete data management tools or an in-house enterprise data warehouse built from scratch. Discrete tools will just result in siloed data, which creates the governance issues I mentioned earlier. It’s one of the reasons Eagle offers a continuum of services that span from an on-premise solution to a co-sourced or fully outsourced offering through BNY Mellon. Q Can capital markets firms realistically look to partner with specialist data management vendors and “outsource” their data management functions? If so, what functions would such an arrangement cover? Fleming: The vendor community has multiple responses to these problems, and each of them has their own spin. Some of the larger vendor companies have the perspective, “If you do everything with us, you won’t have a problem.” The fact of the matter is that there’s not a single firm that has just one way of doing stuff. Where we really need to focus is that a lot of people are focusing on structured data—because that’s what they know best—but what we really need to do is look at the world of structured, unstructured and semi-structured data. The problem is that there are different classes of tools with different capabilities, so that’s an area where the industry could do better. Engelman: Capital markets firms have rightfully developed a mixed view of outsourcing. On one hand, the cost savings and increased efficiencies that are promised by outsourcing vendors are too tempting to ignore, particularly when considering the continu- ously shrinking margins in the “new normal” world. On the other hand, the increased governance and communication problems that result from offshoring arrangements frequently outweigh the ben- efits offered by outsourcing, particularly for processes as sensitive and complex as data management and regulatory compliance. Capital markets firms must ensure that their selected vendor can work as an extension of the firm, operating in a collaborative outsourcing, or co-sourcing, business paradigm. 
Tanner: Financial firms are increasingly looking at other sectors to find ways of industrializing their own value chain. Firms no longer see the need to produce all services in-house, specifically if they are not considered to be part of the firm’s core competences. Data management is an obvious candidate in this respect. Key concepts of this strategy are process innovation and standardisa- tion. Existing processes need to be analysed and their potential for standardization needs to be assessed. Those with a high potential, usually more routine and less firm specific tasks can easily be handed over to an external partner. They may include data feed management, system operation, data checks and validations. On the other side there are those with less potential. They could include exception handling for time critical processes, managing changes with wide ranging impacts on downstream systems or producing value added data which is unique to the firm. McInnis: Absolutely. That’s where the market is going, toward managed services and a service-oriented architecture, or SOA. Firms don’t really care who provides the security master record, as long as it’s clean and accurate. It’s the integration piece that clients are increasingly seeking to hand over because it requires a tremendous amount of bandwidth to keep up with it, as the inputs are changing all the time. Third-party vendors, who are working with hundreds of firms, have the economies of scale to dedicate the necessary resources to stay on top of things like alterations to benchmark feeds. Also, it doesn’t necessarily help clients if they simply outsource one piece Dominique Tanner Head of Business Development SIX Financial Information Tel: +1 203 353 8100 Web: www.six-financial-information.com
Q Typically, where do capital markets firms go wrong when it comes to managing their data? What areas of data management do they tend to overlook or underestimate in terms of cost, time and complexity?

Bottega: In general, although we're getting better, people need to realize that data management is not a finite, short-term project, but a multi-year, culture-changing initiative that must become part of a firm's operational DNA. I've seen too many projects cut short because the expectation was an immediate return on investment. Data management will improve efficiency and increase the transparency of data, which leads to better risk management and marketing. But it will take time (and money) to build the proper infrastructures needed to support these goals. Firms have to continue to realize that there is a cost to all this—not so much in building good data practices, but in not doing so. A fragile and fragmented information infrastructure was not helpful during the financial crisis. Strong discipline, with accurate and transparent content, is critical to the soundness and safety of the banks and the industry.

Engelman: Firms that govern data in an ad hoc fashion are inherently setting themselves up for failure. The collection, storage, reconciliation and analysis of data is unavoidably complex, meaning firms that do not implement a formal governance program stand to lose substantial time and money in data management. To avoid such expenditures and ensure enterprise-wide alignment, data governance programs must be highly standardized, adhering to a common language and business-domain model. A centralized process and dictionary for standardization should be supported by a common technical infrastructure to ensure the information derived from the data is consistent across the enterprise. This standardization requires that compromises be made across business applications, which introduces complexity in finalizing and signing off on specifications. Moreover, keeping standards current can only be accomplished by establishing an organizational structure that owns the standard (i.e., a run-the-bank function)—which can be difficult year over year as budgets and priorities change.

Ownership of these tasks often falls to IT organizations, which lack the authority to define and enforce standards or to implement change. This power vacuum often results in decentralized data management, which causes a multitude of problems. Without central alignment on how adjustments are made and, more importantly, how they are communicated, it is nearly assured that the same adjustment will be made multiple times within the organization, and perhaps in an inconsistent way. An overall approach to communicating adjustments across all data repositories is important to ensure the integrity of the data.

Finally, and perhaps most crucially, firms must ensure that their employees are capable of understanding institutional data. Most capital markets firms do not understand their data lineage to the degree required to allow the business to confidently state that the information they are viewing is correct and complete; knowledge of data is, in many cases, limited to what people see on user interfaces. This is further challenged by vendors' proprietary systems, the attrition of staff supporting in-house systems, and a lack of usable documentation.
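One way to make the "centralized process and dictionary" Engelman calls for concrete is a machine-readable data dictionary that every application validates against before on-boarding data, so a term means the same thing everywhere in the enterprise. A minimal sketch; the terms and their specifications are illustrative, not drawn from any real business-domain model:

```python
# Sketch of a central, machine-readable data dictionary. The terms,
# types and formats below are illustrative examples only.
DICTIONARY = {
    "notional":        {"type": float, "unit": "reporting currency", "nullable": False},
    "trade_date":      {"type": str,   "format": "YYYY-MM-DD",       "nullable": False},
    "counterparty_id": {"type": str,   "format": "LEI (20 chars)",   "nullable": False},
}

def conforms(field: str, value) -> bool:
    """Check a value against the central definition of its field."""
    spec = DICTIONARY.get(field)
    if spec is None:
        return False                 # undefined term: reject, don't guess
    if value is None:
        return spec["nullable"]
    return isinstance(value, spec["type"])

# Every application runs the same check at on-boarding, so "notional"
# means the same thing in the risk system as in the finance system.
print(conforms("notional", 5_000_000.0))   # True
print(conforms("notional", "5MM"))         # False: a string, not a number
print(conforms("desk_pnl", 42.0))          # False: term not in the dictionary
```

Rejecting undefined terms, rather than silently accepting them, is what forces new data through the governance process Engelman describes instead of around it.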
What you really want to have is a good data architecture program. Every time you look to build a new system, you don't want people going out there and just solving the problem for themselves. How are people going to take the data? Consume the data? And if this new system is going to create data, you have to make sure that you're not making another copy of something that exists someplace else. You need something like a data architecture review board for the on-boarding of data, to make sure everything is tied together.

Tanner: For some data, it is essential to keep a history over time and to be able to apply corrections to that history. Imagine a regulator spots an error in a report that was produced three months ago and asks for a corrected one covering the same period. The data that caused the error might change frequently and may now be in a different state. If you can't correct the mistake in the history and re-run the report for that period, you will not be able to meet your obligations unless you revert to a manual process of recreating it. This is what many firms have to do, because data management did not pay enough attention to identifying the data that needs to be kept over time. Incorporating time in a data repository is complex and system support is limited (one common technique, bitemporal record-keeping, is sketched at the end of this article). One reaction frequently seen is to apply brute force and maintain a history of everything. This, however, is very costly and adds a lot of unnecessary complexity. Instead, an analysis of the history requirements of the business processes should be conducted, identifying the data points for which history needs to be maintained. Such a differentiated approach ensures that a firm can fulfil the business's essential needs for access to historic data while managing cost and complexity.

McInnis: Unfortunately, a lot of firms still view data management solely through a "security master data" lens, and fail to leverage the information that's available to them across their organizations. Make no mistake, data management projects are costly, time-consuming and truly complex, but they're well worth it when you consider not only the new efficiencies, but the value-add as well. For instance, I'll see some firms extend their data management capabilities to their investment operations or to complement their portfolio management—be it through an IBOR, business intelligence, or equity and fixed-income attribution. When you juxtapose the value these firms are able to generate from their EDM capabilities against those that merely see data management as a reporting tool or a source of pricing, the difference is night and day, and you get a sense of the respective ROI potential or, conversely, the opportunity costs. I think we've entered a new era in which those that have a robust data management platform in place, and leverage it across their organizations, have a distinct competitive advantage over those that don't. In time, these capabilities will serve as a true moat and differentiator.
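The history capability Tanner describes is commonly implemented with bitemporal records: each fact carries both the business date it applies to and the system date on which it was recorded, so a correction is appended rather than overwritten, and a report can be re-run either as it was originally produced or as corrected. A minimal sketch of the idea; the record layout and field names are illustrative:

```python
# Sketch of bitemporal storage: every fact carries the business date it
# applies to ("valid" time) and the date the firm recorded it ("knowledge"
# time). A correction is a new row, never an overwrite. Layout is illustrative.
from datetime import date

# (field, valid_date, recorded_date, value)
HISTORY = [
    ("rating", date(2015, 3, 1), date(2015, 3, 1), "A-"),    # original entry
    ("rating", date(2015, 3, 1), date(2015, 6, 1), "BBB+"),  # later correction
]

def as_of(field: str, valid: date, known: date):
    """Value of `field` for business date `valid`, using only what the
    firm had recorded by date `known`."""
    candidates = [row for row in HISTORY
                  if row[0] == field and row[1] <= valid and row[2] <= known]
    return max(candidates, key=lambda row: (row[1], row[2]))[3] if candidates else None

# Re-produce the March report exactly as it stood in April...
print(as_of("rating", date(2015, 3, 31), date(2015, 4, 1)))  # "A-"
# ...or re-run it for the same period after the correction was recorded.
print(as_of("rating", date(2015, 3, 31), date(2015, 7, 1)))  # "BBB+"
```

The cost Tanner warns about comes from the extra time dimension on every table, which is why his advice to maintain it only for the data points the business genuinely needs to replay, rather than for everything, matters.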