TH!NK
Volume 7
June 2012
The CVA Desk:
Pricing the True Cost of Risk _ P.18
The Optimization of Everything:
Derivatives, CCR and Funding _ P.26
Through the Looking Glass:
Curve Fitting _ P.32
The Social Media World:
What Risk Can Learn From It _ P.38
Stochastic and Scholastic:
The Interconnectivity of Risk _ P.44
Back to the Future
Revisiting capital and the bank of tomorrow

Not all risks are worth taking.
Measuring risk along individual business lines can lead to a distorted picture of exposures. At Algorithmics, we help clients to see risk in its entirety. This unique perspective enables financial services companies to mitigate exposures, and identify new opportunities that maximize returns. Supported by a global team of risk professionals, our award-winning enterprise risk solutions allow clients to master the art of risk-informed decision making through the science of knowing better.
algorithmics.com
BEST OF
Our commitment to innovation has helped
Algorithmics earn a number of public
recognitions from industry publications, reader
surveys, and judged panels year after year.
Below is a list of awards we recently received.
Best Risk Management Technology Provider, HFMWeek’s European Hedge Fund
Services 2012. Best Global Deployment for Algorithmics’ collateral
management client BNY Mellon, American Financial Technology Awards
(AFTAs) 2011. First place for Risk Management – Regulatory/economic capital
calculation, Structured Products Technology Rankings 2012. First place
overall for Enterprise-wide risk management and first place in enterprise-
wide market risk management, risk dashboards, risk aggregation, risk
capital calculation (economic) and collateral management in Risk magazine’s
Risk Technology Rankings 2011. Readers’ Choice Winner (Highly Commended)
for Best Risk Management Product or Service, Banking Technology Awards 2011.
First place in Market risk management and ALM, Asia Risk Technology
Rankings 2011. Best Risk Analytics Provider, Waters Rankings 2011.
Best Solvency II software package, Life & Pension Risk Awards 2011.
First place overall and first place for Scenario Analysis, Key risk indicators,
and Operational risk loss data collection, Operational Risk & Regulation
Software Rankings 2011. Shortlisted, best post-trade risk management
product for Algo Collateral, Financial News Awards for Excellence in
Trading & Technology, Europe 2011.
Opening Bell
Recent elections in France and Greece have added
a new chapter to the ongoing sovereign debt crisis
in Europe. At the time of this issue going to print,
Greek voters turned on the Conservative New
Democracy and Socialist PASOK, two parties that
have defined Greek politics for decades. New Greek
parties from the left and right are divided in outlook
but united in opposition to EU-IMF bailouts and
their widely unpopular austerity measures.
In France, François Hollande has replaced former President Nicolas Sarkozy. “Europe is watching us,” said Hollande during his victory speech. “At the moment when the result was proclaimed, I am sure that in many countries of Europe there was relief and hope: finally austerity is no longer destiny.” Yet following both elections, Chancellor Angela Merkel of Germany clearly stated that neither she nor her government were interested in reopening the eurozone fiscal pact, or the strategy of deficit-cutting austerity measures.

What is the appropriate response in times of uncertainty and conflicting views on future direction? This has been an issue for financial service firms since the financial crisis. Regulators, governments and analysts have called for financial firms to change the way they do business.

One way that firms may be able to respond is by looking to how they have managed uncertainty in the past. In “Back to the Future”, this issue’s cover story revisits capital and its role in the bank of tomorrow. When early banks operated as partnerships with personal liability attached, every decision regarding capitalization and risk profiles was owned by decision makers. The impact of this framework on their business holds interesting implications.

Elsewhere in our pages are other features that explore new approaches to existing challenges. These include a look at interconnectivity and stochastic modeling, risk and social media, and the CVA desk’s function of pricing the true cost of risk. In “Through the Looking Glass” we return to the topic of curve fitting, with an empirical look at how chief risk officers and supervisors can gain critical insights into major exposures they would otherwise be unable to obtain.

In finance and politics, there will always be an element of uncertainty. As an industry and as global citizens, we will continue to identify and respond to the challenges of our times by searching the past, and also for solutions that have yet to be constructed.

Michael Zerbs
Vice President, IBM Risk Analytics
In Conversation

Brenda Dietrich has spent her professional career with IBM Research, and recently became the company’s first CTO of Analytics Software. In this issue’s conversation, Brenda discusses the nature of research, new data streams, and how the way we think about information is changing.

TH!NK: You have been connected with IBM Research since the mid-1980s. Has the company’s approach to research changed over this span?

Brenda: It has changed quite a lot. In my early days with the group, IBM Research most closely resembled a think tank. Our job was to figure out cool things one could do with computing and computers first, and then to try and establish a shared vision within the company’s product lines. In that period we invented some wonderful things and published papers and patents. After we were done, it fell to others to find applications for our work. Over time, it has become more of a shared responsibility to connect our work with IBM product and service lines.

In the last decade or so, we in the Research division have been much more tightly engaged with actual end users. Part of our role is now to understand how people approach computing, how they would like to use computing, and doing experiments in the art of the possible with real people. And that is a huge amount of fun.
“The name of the game right now is to find insight faster than anybody else.”

TH!NK: Why the emphasis on working with people?

Brenda: Ten years ago, the research lab was focused on the algorithm. The operating model for the math team was, “someone gives me the mathematical representation of the business problem and I’ll work on the algorithm to solve it.” It would return a mathematical representation of the solution, or perhaps a code, and it was someone else’s role to fit that into the business process.

For the relatively static problems we were looking at then, like flight schedules for airlines or production schedules for the manufacturing floor, this worked reasonably well. But we now live in a world where things are much more dynamic. The easy stuff has been done. As we try to push the use of mathematics to support business decision making, we are working in problem domains that are much more subtle. Fine differences between the way two different enterprises in the same industry operate come into play.

TH!NK: Has the direction of research remained consistent, or has it evolved in surprising directions?

Brenda: The GTO (Global Technology Outlook) is an annual process in the Research division. I have been directly involved in every GTO since 1995. In the 1990s, our focus was much more about speeds and feeds: how fast would storage be – how dense would storage be? How fast would access be? How many compact transistors and circuits would fit onto a chip? I’d estimate the split at that time was around 50% hardware, 50% how the hardware will be used. This past year it was maybe 10% hardware, and 90% how the hardware will be used.

TH!NK: What caused the change?

Brenda: For many applications hardware is good enough now, whereas that wasn’t the case 15 years ago. It’s not that we are struggling with the challenge of how to do the things we want to do with computers. Now we are saying, “ok, we have this enormous wealth of data and computational power. What can we do with it?” In other words, it is much more focused on how we can use technology, less on how we can progress technology along its natural course.

We are also now more concerned with other perspectives. I would call this modeling compliance: do the people do with computers things they should do, and can they (both the computers and the people) adapt?

TH!NK: What would be an example of that type of compliance?

Brenda: Think about the GPS in your car. I don’t always follow the instructions mine gives me. And I really wish that she would keep track of what I do and learn that “Brenda prefers this street to that route,” for whatever reason and be responsive to that, rather than just yell at me and recalculate every time.

TH!NK: I would too. The information GPS devices pick up represents new data streams, which are a big focus of the 2012 GTO. What streams are out there?

Brenda: We’re most familiar with structured data, which is generally numeric and tends to be nicely organized. You can find each of the pieces of it that you want, and nothing else. You can do queries against structured data. You can find averages, and ranges, and apply standard deviations.

A lot of people say structured data is data you do arithmetic on, but a lot of properly formatted text data is also structured. For example, the name field in a client record. You can’t average two names or talk about a range of names; that doesn’t make any sense. But you can match names against one another in a way to say, “these two instances are actually the same person versus they are different people.”

With geo-spatial data, we tend to be computing along varied types of metrics, so what you tend to do with location data is compute distances. People we try to count. And then we try to categorize them. Most catalog data is now also fairly well structured. You couldn’t do things like Amazon searches if their catalog data weren’t reasonably well structured. Now, it still may be imperfect, but it’s far less imperfect than five years ago.

TH!NK: Catalog data takes us online, which is where most unstructured data exists. How do you define it, and what is its significance?

Brenda: I don’t have a concise definition for unstructured data, but I would say that unstructured data is the data that we don’t yet know how we are going to be computing (as opposed to just storing, copying, and accessing) with. A lot of free form text data is unstructured. Once you have annotated it and tagged it with the associated metadata, you can say, “this sentence is about this person. It’s about where she lives. It’s about where she works. It’s about her relationship with other people.” When you have all of that in the metadata it starts to move towards the structured space.

But when it is just free form text, without the metadata, that’s unstructured. Most instances of data are very unstructured, like voice data or tweets. Then it gets more complicated when we include data coming off of sensor systems. We don’t know what we are going to do with it yet, and right now it is just drilling random measurements. We know what each field means, we don’t yet know how we are going to be computing on them. So that is kind of in a strange land.

TH!NK: So the next challenge is to incorporate these new data streams?

Brenda: Yes. Let’s talk about a really simple example – retail forecasting. The easily acquired structured data comes off the point of sale device. You obtain huge amounts because every item is scanned. You can sort through that data and count the number of a given item bought each day at each location. You can keep track of what combinations are bought together. And you can do simple time series to forecast how much it will evolve in the future.

You can also pull in other sources of data like weather, advertising events, or a single time event, and begin to understand how they impact the consumption, demand, and purchasing of these items. If you extract the effect from the underlying signal, you can propagate that signal forward using your usual time series methods. You can then look at when these events are going to happen in the future, and put the multipliers back in. It can lead to an even better job of forecasting.
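Brenda’s extract-then-reapply recipe for event effects is concrete enough to sketch. The Python below is purely illustrative – the sales history, the promotion calendar, and the naive seasonal forecaster are all invented for the example, not anything from IBM’s work:

```python
import numpy as np

def forecast_with_events(sales, event_days, horizon, future_event_days, season=7):
    """Event-adjusted forecast in the spirit of the interview:
    estimate the event effect, remove it, forecast the baseline, put it back."""
    sales = np.asarray(sales, dtype=float)
    is_event = np.zeros(len(sales), dtype=bool)
    is_event[list(event_days)] = True

    # 1. Estimate the event multiplier: average uplift on event days vs. others.
    multiplier = sales[is_event].mean() / sales[~is_event].mean()

    # 2. Remove the event effect to recover the underlying signal.
    baseline = sales.copy()
    baseline[is_event] /= multiplier

    # 3. Naive seasonal forecast: repeat the mean weekly profile of recent weeks.
    weekly_profile = baseline[-4 * season:].reshape(4, season).mean(axis=0)
    forecast = np.tile(weekly_profile, horizon // season + 1)[:horizon]

    # 4. Reapply the multiplier on days where a known event is scheduled.
    for day in future_event_days:
        forecast[day] *= multiplier
    return forecast

# Toy usage: 8 weeks of invented history, a promotion every 10th day, 2-week forecast.
rng = np.random.default_rng(0)
history = 100 + 10 * rng.standard_normal(56)
promo_days = range(0, 56, 10)
history[list(promo_days)] *= 1.5
print(forecast_with_events(history, promo_days, 14, future_event_days=[3, 13]))
```

The same shape works with any baseline forecaster; the multiplier estimate is the only place the event data enters.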
TH!NK: And data streams can be used to look at individual decision making as well.

Brenda: To stay with retail for a moment, the area that everyone is looking at, especially for big-ticket items, is intent to buy. This is what people tweet about, what they post on social media sites, on various blogs, and mention in comment fields on sites like Amazon.

These are activities that tend to be done before the buy happens. A lot of them are probably noise. And so the two things you will want to do is to detect a mention of a product, and you want to keep track of the mentions of that product. You can review the data by source, by what type of person, by time, and then compare that to the actual buys of the product that occurred at some later point. You want to understand – is there a decent correlation here? Is expression of intent a signal to buy? How powerful of a signal is it?

If this process gives differentiating insight to one of the actors in an economic ecosystem that the other actors don’t have, it helps create an advantage. And the name of the game right now is to find insight faster than anybody else.
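The detect-count-compare loop Brenda describes reduces to a lagged correlation between mention counts and later purchases. A minimal sketch, with invented daily counts and a made-up five-day echo:

```python
import numpy as np

def lagged_intent_correlation(mentions, purchases, max_lag=14):
    """For each lag, correlate today's mention count with purchases `lag` days later.
    A strong peak suggests expressed intent is a leading buy signal."""
    mentions = np.asarray(mentions, dtype=float)
    purchases = np.asarray(purchases, dtype=float)
    results = {}
    for lag in range(max_lag + 1):
        x = mentions[: len(mentions) - lag]
        y = purchases[lag:]
        results[lag] = np.corrcoef(x, y)[0, 1]
    return results

# Toy data: purchases echo mentions about 5 days later, plus noise.
rng = np.random.default_rng(1)
mentions = rng.poisson(20, 120).astype(float)
purchases = np.roll(mentions, 5) * 0.3 + rng.poisson(5, 120)
best = max(lagged_intent_correlation(mentions, purchases).items(), key=lambda kv: kv[1])
print(f"strongest signal at lag {best[0]} days (r = {best[1]:.2f})")
```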
TH!NK: Moving away from retail, what does the future hold for high performance users?

Brenda: Over time I think it will become important to gain a better understanding of how data is used in combination. One of the first uses of mathematical algorithms to control how a computer actually worked dates back to the big flat platter disc drives. For those drives, you had to decide which track you were putting what data in. And you did the analysis up front of which data were you most likely to be accessing most often, so that data could go in the center track. The least frequently accessed data was at the very center and at the very edge.

You did this because the center is the point from which, no matter where the head happens to be, on average it’s the shortest distance to get to. So, this was a really important algorithm. It sped things up tremendously in terms of data access.
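That placement rule – hottest data on the middle track, coldest pushed toward the innermost and outermost tracks – can be written as a short sort-and-assign. The track count and access frequencies here are hypothetical:

```python
def assign_tracks(access_counts, n_tracks):
    """Place the most frequently accessed items nearest the middle track,
    pushing colder items out toward the innermost and outermost tracks."""
    middle = (n_tracks - 1) / 2
    # Tracks ordered by distance from the middle: middle first, extremes last.
    tracks_by_distance = sorted(range(n_tracks), key=lambda t: abs(t - middle))
    # Items ordered hottest first.
    items_by_heat = sorted(access_counts, key=access_counts.get, reverse=True)
    return dict(zip(items_by_heat, tracks_by_distance))

# Hypothetical access counts for five files on a 5-track platter.
counts = {"ledger": 900, "index": 700, "logs": 50, "archive": 10, "temp": 300}
print(assign_tracks(counts, n_tracks=5))
# hottest item lands on track 2 (the middle); the coldest on tracks 0 and 4
```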
TH!NK: How does that relate to using data in combination?

Brenda: Most of the advanced analytics we use pull in multiple sets of data. We might be pulling in bigger historical data sets when we are looking at one economic measurement versus another. Moving forward, we may want to pull in event information as well. We may want to see if an event, or a publication, or a blog, or some other signals affect our targets, and with what duration.

The goal would be to pull multiple sources of data and try to determine if one piece informs another piece in any way. Can we compute “a priori” the frequency with which two pieces or classes of data are going to be used together, and figure out some way to store them so that we can get them ready and together, at the same time? Because you can’t start the computations until you have both of those pieces. As long as the memory is in one place and the information is on a magnetic drive somewhere, you bring one in first and then wait for the other.

If we could fetch them together because they were the same distance apart, if you will, it would be very interesting. That’s my notion of using data together.

TH!NK: Let’s talk for a moment about titles. A Chief Risk Officer is common. I have heard people start to speak about a Chief Data Officer.

Brenda: I would want to be Chief Analytics Officer.

TH!NK: What would that role be?

Brenda: Within a company like IBM, analytics touch almost every part of our internal operations. We use analytics in human resources. We use them in supply chain. We use them in our own financial planning. We use them in our own risk analysis. We use them in facility planning. We use them in compensation planning. They’re everywhere.

There is a danger however, of each individual group using different tools and different data to make the same sets of decisions. And a Chief Data Officer, who may or may not be your CIO, may be charged with the one source of the truth. They are the keepers who know the data of record for everything, as there should be.

But there’s more than one way to analyze data. There are different techniques. This is a field where it does require some understanding of the theory behind the methods. In an enterprise that is using analytics in multiple business functions, it helps to have someone who will say, “this is our strategy; these are the tools we will use; this is how we will share; this is how we will be more efficient.”

It’s about more than having the same data go into two analysis processes. It is about the assumptions and the methodologies that are used in the analysis processes. That’s why I think a Chief Analytics Officer is going to be very important in the future in companies.

TH!NK: It sounds like the data we gather is changing, and the ways we gather that data are changing too. Does this mean we need to change the way we think about data itself?

Brenda: As someone who grew up believing in the scientific method, it’s my comfort zone. Researchers and scientists generally say, “I want to look at the data, I want the data to inspire hypothesis, and then I want a way to test that hypothesis.”

We can’t necessarily create new experiments with everything that we come across today. What we can observe, which is vast, isn’t the law of physics. We almost have to think of ourselves differently. Perhaps as researchers we should no longer think of ourselves as laboratory scientists, but more like astronomers. We have great tools to observe the stars, but we can’t move them around.
In Review
Earth Audit
The finite supply of natural
resources drives economies and
influences pricing. Even though
their costs may not be directly
factored into all products and
services, availability of key
minerals can impact operational
risk, markets, and capital. Using
rough calculations, this energy
audit illustrates how increases
in living standards affect the
rate of consumption, and brings
an eye-opening perspective to
the state of our planet.
IF DEMAND GROWS…
Some key resources will be exhausted more quickly if predicted new technologies appear and the population grows:
ANTIMONY 15-20 years
HAFNIUM 10 years
INDIUM 5-10 years
PLATINUM 15 years
SILVER 15-20 years
TANTALUM 20-30 years
URANIUM 30-40 years
ZINC 20-30 years
SOURCE: ARMIN RELLER, UNIVERSITY OF AUGSBURG; TOM GRAEDEL, YALE UNIVERSITY
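For readers who want to reproduce rough numbers like these: if current annual consumption C grows at rate g against remaining reserves R, the exhaustion horizon T solves R = C((1+g)^T - 1)/g. The reserve and consumption figures below are invented placeholders, not the researchers’ inputs:

```python
import math

def years_until_exhaustion(reserves, consumption, growth):
    """Solve reserves = consumption * ((1+g)^T - 1) / g for T (geometric series)."""
    return math.log(1 + growth * reserves / consumption) / math.log(1 + growth)

# Hypothetical: 600 kt of reserves, 20 kt consumed per year today.
for g in (0.0001, 0.03, 0.06):  # near-flat vs. growing demand
    t = years_until_exhaustion(600, 20, g)
    print(f"demand growth {g:>6.2%}: exhausted in about {t:.0f} years")
```

Even modest growth in living standards compresses a 30-year static horizon to roughly 20 years, which is the point the audit is making.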
READING ROOM
There’s a connection between our thought process and the courses of
action we choose. These new and noteworthy titles explore the science
of decision making, and the impact ideas can have on society as a whole.
SCIENCE +
James Gleick digs deep into The Information, a journey from the language of Africa’s talking drums to the
origins of information theory. Gleick explores how our relationship to information has transformed the very
nature of human consciousness. Charles Seife examines the peculiar power of numbers in Proofiness, an
eye-opening look at the art of using pure mathematics for impure ends. In Being Wrong, Kathryn Schulz
wonders why it’s so gratifying to be right and so maddening to be mistaken – and how attitudes towards error
affect decision making and relationships. John Coates reveals the biology of financial boom and bust in
The Hour Between Dog and Wolf. Coates, a trader turned neuroscientist, shows how risk-taking transforms
our body chemistry and drives us to extremes of euphoria or depression.
Poor Economics by Abhijit Banerjee and Esther Duflo (PublicAffairs)
The Price of Civilization by Jeffrey Sachs (Random House)
Paper Promises by Philip Coggan (Allen Lane)
Finance and the Good Society by Robert Shiller (Princeton University Press)
SCREENING ROOM +
Based on the bestselling book by Andrew Ross Sorkin, Too Big
To Fail reshapes the 2008 financial meltdown as a riveting thriller.
Centering on U.S. Treasury Secretary Henry Paulson, the film goes
behind closed doors for a captivating look at the men and women
who decided the fate of the world’s economy in a few short weeks.
Too Big to Fail
Directed by Curtis Hanson
HBO Films
The Information by James Gleick (Vintage)
Proofiness by Charles Seife (Viking Adult)
Being Wrong by Kathryn Schulz (Ecco)
The Hour Between Dog and Wolf by John Coates (Random House)
SOCIETY +
Jeffrey Sachs has travelled the world to help diagnose and cure seemingly intractable economic problems.
In The Price of Civilization, Sachs offers a bold plan to address the inadequacies of American-style capitalism.
Abhijit Banerjee and Esther Duflo offer up a ringside view of Poor Economics, arguing that creating a world
without poverty begins with understanding the daily decisions facing the poor. Philip Coggan’s Paper Promises
examines debt, the global finance system, and how the current financial crisis has deep roots – going back to the
nature of money itself. Robert Shiller believes that finance is more than the manipulation of money or management
of risk. In Finance and the Good Society, Shiller calls for more innovation and creativity so that society can
harness the power of finance for the greater good.
Back to the Future

We are unlikely to return to the capitalization levels or strict regional focus employed by the gentlemen of Pawtuxet. There are however crucial lessons to be learned when examining the structure and scope of early financial institutions. When we talk about addressing concerns over capital, funding, and liquidity, it just might be that what we need for the bank of tomorrow is not a new model, but rather one that takes inspiration from the bank of yesterday.
Over the decades there have been many views on what the bank of the future would be. Some ideas have been radical and some have been transitional, while others never really took hold. This particular view begins in Pawtuxet, Rhode Island, on the eastern seaboard of the United States. By all accounts a lovely place to visit, Pawtuxet is well known for scenic harbour views and boating along its historic river corridor. But in the early 19th century, textile mills and coastal trade dominated its landscape. As the community thrived and local businesses grew, the Pawtuxet Bank emerged. Common for its time, the bank was a partnership and its directors mostly merchant-manufacturers.

The Pawtuxet Bank’s directors shared personal liability in the event of loan default, or if the bank itself failed. In “The Structure of Early Banks in Southeastern New England”, Naomi R. Lamoreaux recounts the events of June 1840 when the bank’s stockholders presented themselves before the Rhode Island General Assembly. The group appeared seeking permission to reduce the bank’s capitalization from $87,750 to $78,000 in order to cover losses sustained due to the death of John Pettis, one of the bank’s directors:

Pettis died in 1838 with notes worth $8,800 outstanding at the bank and endorsements amounting to at least another $1,500...(t)his loss was not sufficiently large to cause the bank to fail. Nor did depositors or the bank’s own noteholders suffer. Most (91 percent) of the bank’s loans were backed by capital rather than notes or deposits, and the stockholders simply absorbed the loss.

FIRST DEPOSITS
In the early history of banking, each partner made decisions knowing they shared liability if the bank failed. As a result, choices on whom credit should be extended to were not taken lightly. Unambitious business owners with a slight but steady production of widgets were considered ideal customers. Less attractive was the dubious repayment potential of innovative or entrepreneurial types. The latter group represented the potential of a phenomenal return on investment, but only if an unproven product or process succeeded.

This philosophy was in keeping with the dominant banking theory of the 18th & early 19th centuries. The real-bills doctrine proposed that banks should restrict the extension of credit to customers involved in the transfer of existing products only. Real-bills supporters argued that by basing loans on the security of actual goods, any individual bank’s liquidity was ensured.

While there is an admirable simplicity in tying loans to tangible goods, the skyline of the 19th century was starting to change. There were towers, factories and infrastructure projects that needed to be built, without existing product to offer in exchange for funding. And these projects were poised to generate great returns.

The unlimited liability model was ill-suited to finance these projects. When shareholders’ money was directly on the line, banks had good reason to avoid speculative projects. Because the incentives for self-discipline were so high, banks often lent only to those they knew best. These could be local businessmen, often engaged in the same type of industry as the bank shareholders. Often bank funds became personal resources for the shareholders themselves, and this type of insider lending, or trading, would frequently make up the majority of a bank’s exposures.

As the world began to change, so did banks. By the late 1850s, Great Britain had moved towards limited liability, with France following suit in 1867. As with unlimited liability, the logic was easy enough to follow: if a bank could diversify its investor base, there would
be a greater availability of credit and capital. The adjustment would correct what had turned out to be the too successful risk measure of personal obligation: banks weren’t interested in funding anything risky. In “Early American Banking: The Significance of the Corporate Form,” Richard Sylla suggests that the tipping point away from unlimited liability originated with the New York Free Banking law of 1838 which stated, “no shareholder of any such association shall be liable in his individual capacity for any contract, debt or engagement of such association.” New York’s free banking law didn’t just make limited liability possible. It opened the door for the incorporation of banks and the freedom from personal obligation.

ONE STEP FORWARD, TWO STEPS BACK
As banks moved beyond their villages in search of capital and opportunities, the strategies and measurements used in their operation changed as well. Instead of prudence being the only driver, customer profitability and shareholder value became ongoing concerns. Expansions, mergers, and deregulation replaced local partnerships with a mandate to maximize customer bases and profitability. Operating at an extreme opposite of the early 19th century model were institutions like Alfinanz, an offshore administration factory that functioned as a global back office for a global network of financial advisors, intermediaries, or brokers.

As banks moved from private partnerships to public corporations, shareholder demands added another voice to how bank capital and risk would be managed. Enhanced returns were a factor in banks’ decisions to pursue diversification, more complex transactions such as structured products, and other strategies that gained support from managers operating with limited liability.

In the modern era of banking, even the idea of ‘who is a customer’ was up for grabs. In the 1990s, First Manhattan Consulting Group took a leading role in introducing profit-based segments to banking. First Manhattan came to prominence with the now famous conclusion that only 20% of a bank’s customers were profitable. Their idea to focus on profitable customers only was an attractive one to banks who were seeking to improve low revenue growth, particularly in core retail products. The concept also encouraged mergers and the creation of larger banks, who were better positioned to take advantage of segmentation opportunities.

Today we see banks retreating from these drivers and measures, often forced to adjust strategy by regulation, and perhaps in retreat from acting “in loco parentis”.

DYNAMIC CAPITAL MANAGEMENT
BY FRANCIS LACAN

To visualize the concept of dynamic capital management, think of flying an aircraft as close as possible to the ground. If you go too high, the cost of fuel becomes unreasonable. You cannot go negative because the option doesn’t really exist. The goal is to seek out the most efficient middle ground that best mimics the changing landscape below.

Managing capital dynamically would enable a bank to determine, on a day by day and month by month basis, the most efficient flight path for capital and allocate it accordingly. Optimization and anticipation are the two extremes guiding such decisions, and in the middle reside a big set of constraints. Basel III and its liquidity coverage ratio have restricted certain freedoms, particularly in terms of asset qualification. The other set of constraints is risk management. Liquidity is increasingly subject to risk management because there are lots of dependences in funding liquidity and the rest of the risk.

It seems liquidity is following a similar path to what happened with capital and solvency. Banks didn’t invest much in economic capital, but the strong requirement to look at regulatory capital acted as an incentive to build more analytics, more rigorous reporting, and to become more serious about addressing uncertainty with the right tools. What is more complex for capital management is to connect all the sources of information. Pulling cash flow across entities and supporting good cash management today still has a lot of room to evolve. There are for example too many overlapping systems of information that are not very good at talking to one another. Overcoming this hurdle would be a huge step towards active capital management, rebalancing, and optimization.

The current baseline for automation is extremely rudimentary. The only true mechanical element is the planning of short term inflows and outflows within the Treasury, because their contractual commitments are relatively easy to model. The rest is seen as shocks, and the focus seems to be on what the regulators are asking banks to address as the possibility of these shocks. As a result, banks are being pushed into modeling with greater consistency what may happen with respect to different uncertainties tied to cash flows.

In the short term, banks will have to continue on the foundations of operational management of cash and collateral, addressing regulatory requirements for cash flow modeling and forecasting, asset qualification, and scenario modeling. Together, these elements will provide a rugged foundation to move towards automated decision making, and eventually, a more automated approach to at least some aspects of capital management.

This prediction comes with a number of ‘ifs’: if you are able to have very good and trustable aggregated pooling of all internal and external balances of cash in all currencies, and if you have access to a very good repository for your treasury operations so you can see your money market for all these currencies, you could to an extent begin to automate capital allocations for particular areas of the business. Decisions on how to refinance each of these currencies, and perhaps rebalance positions into a smaller number of currencies to save costs or optimize even the risk profile of certain transactions, could in theory be automated.

This won’t happen tomorrow. But with the proper foundation, we have the technology to make dynamic capital management part of the bank of the future.
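None of this automation exists off the shelf, but the allocation step Lacan sketches can be caricatured in a few lines. A minimal sketch, assuming (hypothetically) that each business line exposes a known marginal risk-adjusted return and a hard cap:

```python
def allocate_capital(budget, lines, step=1.0):
    """Greedy dynamic-allocation sketch: at each step, hand the next slice of
    capital to the business line with the best marginal risk-adjusted return,
    respecting a per-line regulatory/limit cap. All figures are hypothetical."""
    allocation = {name: 0.0 for name in lines}
    while budget >= step:
        # Candidates that still have headroom under their cap.
        open_lines = [(spec["marginal_return"], name)
                      for name, spec in lines.items()
                      if allocation[name] + step <= spec["cap"]]
        if not open_lines:
            break
        _, best = max(open_lines)
        allocation[best] += step
        budget -= step
    return allocation, budget

# Invented business lines and limits (units: EUR millions).
lines = {
    "rates_desk":    {"marginal_return": 0.11, "cap": 40.0},
    "trade_finance": {"marginal_return": 0.09, "cap": 80.0},
    "retail":        {"marginal_return": 0.05, "cap": 200.0},
}
alloc, unused = allocate_capital(100.0, lines)
print(alloc, "unallocated:", unused)
```

A production system would replace the constant returns with curves and add the Basel III liquidity constraints Lacan mentions, but the rebalancing loop keeps this shape: rank, allocate, repeat.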
It just might be that what we need for the bank of tomorrow is not a new model, but rather one that takes inspiration from the bank of yesterday.

Shareholder demands, which focused exclusively on the creation of shareholder value, must now be balanced against closer regulatory scrutiny and the need to protect customer interests. Diversification led to its own set of challenges, as it did not help spread risk well. The credit crisis demonstrated that market risk and credit risk can appear in unexpected ways, and that the need to maintain strong liquidity positions was more crucial than realized. All the short term profitability in the world cannot help if the system isn’t stable. And today, if you want stability, every discussion must begin with the importance of access to capital.

CAPITAL: THE ONCE AND FUTURE KING
In his memoir On the Brink: Inside the Race to Stop the Collapse of the Global Financial System, former U.S. Secretary of the Treasury Henry Paulson reflects back on the credit crisis. One of his conclusions is that the financial system contained too much leverage, much of which was buried in complex structured products:

Today it is generally understood that banks and investment banks in the U.S., Europe, and the rest of the world did not have enough capital. Less well understood is the important role that liquidity needs to play in bolstering the safety and stability of banks...(f)inancial institutions that rely heavily on short-term borrowings need to have plenty of cash on hand for bad times. And many didn’t.

Politicians and regulators have joined hands on capital, proposing measures that would lead to banks holding more of it. Many banks have argued against this approach, claiming that additional capital requirements would affect performance and competition. Yet recent investigations into the correlation between bank capital and profitability suggest that holding additional capital may not be a bad thing. Which is encouraging, since banks will likely have to do it anyway.

Allen Berger and Christa Bouwman’s interests are reflected in the title of their recent paper, “How Does Capital Affect Bank Performance During Financial Crises?” The authors examine the effects of capital on bank performance, as well as how these effects might change during normal times as well as banking and market crises. The empirical evidence led Berger and Bouwman to the following conclusions:

First, capital enhances the performance of all sizes of banks during banking crises. Second, during normal times and market crises, capital helps only small banks unambiguously in all performance dimensions; it helps medium and large banks improve only profitability during market crises and only market share during normal times.

Empirical evidence, regulatory measures, and perhaps common sense dictate that holding additional capital is a worthy goal for banks. Yet even if banks wanted to raise capital thresholds, it isn’t as easy as flipping a switch. A lack of cheap availability and reduced funding sources have changed the capital landscape. Banks of the future must focus on the preservation and leverage of available capital, and make that capital work harder.

Part of this focus must be organizational. Allocation of capital can no longer be controlled at a business unit, subsidiary, country, or branch level. It needs to be allocated at the time of doing business with specific customers, business lines, and even at a transaction level. Dynamic capital leads to a radically different structure, where the treasury becomes the ‘owner’ of capital, lending it to deal makers on demand.

A dynamic treasury requires great understanding of the uses and cost of capital, connected to the technological ability to ‘solve’ the problem of Big Data. In the sidebars to this article, my colleagues have expanded on the linked topics of dynamic capital and managing complex data.

BANKING ON THE PAST
In the early 19th century, the UK limited banking partnerships to six members. No one is suggesting banks return to this restriction. But if we think about various aspects of the unlimited liability banking model, it appears many of their tendencies are being echoed in calls from regulators and stakeholders.
Long-dated compensation reform and shareholder ‘say on pay’ programs can be seen as measures intended to update the shared liability and sense of ownership partners used to bring to banks. The credit crisis has driven home the importance of liquidity, and that gaining capital can be expensive – if it can even be acquired in times of a crisis. In a way this reflects the notion early bankers held that capital was expensive, and bringing in additional funds or partners would dilute earnings. Insider lending and specialization that gave way to diversification and fewer restrictions on portfolios is being balanced by technologically enabled means to better know customers. Enhanced collateral management and approaches like CVA can be used to gain a deeper understanding of capital exposures before entering into an agreement.

If banks are to thrive in the future, preservation and leverage of available capital are crucial steps. Dynamic allocation, enabled by a treasury that quickly and effectively uses available capital in prudent ways, could perhaps be the defining characteristic of the bank of tomorrow. From the outside, these institutions would look nothing like the stakeholders of the Pawtuxet Bank, but they would be related in spirit.

DATA COMPLEXITY
BY LEO ARMER

In the Pawtuxet model, a small number of operating partners owned the bank’s data. It was their responsibility to collect information about their clients, and use this knowledge to guide business decisions.

Banks today have challenges managing data, in large part because the acts of collecting and analyzing information have become so separated. The greater this disconnect, the more important transparency becomes.

For both banks and clients, it’s crucial to be able to ask: “If this is my risk number, where did it originate? How do I track it? How can I see which systems it passed through, and what happened to it along the way?” Being able to take a number from a balance sheet or a general ledger and drill back to its origin provides a huge amount of confidence.

In order to make good decisions, you need to see the big picture. If data complexity is viewed purely as a technological issue, its strategic importance can be overlooked. When institutions attack data issues purely from an IT perspective, rules are created, transformations take place, and the data is considered ‘clean’ after going through a reconciliation process. Various systems and approaches are employed to ensure that the numbers coming out of the front system match numbers coming out of the general ledger system, and these match the numbers coming from treasury.

The problem is, as much as you can clean the data on a Monday, unless you change the people or method of entering that data, it’s going to need cleaning up again on Tuesday.

Today, a few firms are approaching data complexity from a business perspective. They have put their main focus on creating a single data warehouse where all the information is stored. This approach is based on the insight that every piece of data has a golden source: a reliable point of origin before it gets passed through different hands and different teams. It becomes as much about changing mental attitudes as it is about technical architectures.

Creating a golden source for data becomes even more crucial when we see what has happened in the last couple of years. CVA charges for example occur when a bank puts a variable fee on top of a deal, depending on whom they are trading with. If your bank was to trade with another bank, and you had a long history with the other bank and deep insights into their credit status, the bank would likely get a better price for that trade than a small finance company from Greece who might be looking less solvent.

In these transactions, where is the golden source? Is it in the middle office or front office data? Who is taking ownership over the trades? They can’t be processed the way they used to be, otherwise you’re swimming against the current demand for real time responses. If it takes six or seven days to work out what happened when a counterparty defaults, you’re too far behind the curve.

It is becoming more common to run into or hear discussions about appointing a CDO, or Chief Data Officer. A CDO, or at least the mindset within an institution that data quality is crucial and strategically relevant, can help banks evolve beyond workarounds and create a repository of golden source data. Through a framework where standards, direction, and architecture are provided to different departments throughout an organization, the bank of the future can overcome data complexity.
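The drill-back capability Armer describes amounts to carrying a lineage trail with each figure. A schematic sketch – the system names and values are invented, and no vendor API is implied:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TrackedNumber:
    """A figure plus the trail of systems and transformations it passed through,
    so a balance-sheet number can be drilled back to its golden source."""
    value: float
    source: str                      # the golden source it originated from
    trail: list = field(default_factory=list)

    def transform(self, system: str, new_value: float, note: str):
        self.trail.append({
            "system": system,
            "from": self.value,
            "to": new_value,
            "note": note,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self.value = new_value
        return self

# Invented flow: front office -> risk engine -> general ledger.
risk_number = TrackedNumber(1_250_000.0, source="front_office_trades")
risk_number.transform("risk_engine", 1_243_500.0, "netting applied")
risk_number.transform("general_ledger", 1_243_500.0, "FX translation, no change")
for hop in risk_number.trail:
    print(f'{hop["system"]}: {hop["from"]} -> {hop["to"]} ({hop["note"]})')
```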
The CVA Desk

“I wouldn’t want to overstate it – it’s not bringing the industry to a halt. But there is increasing focus on limiting exposures, even among global banks. And that is starting to affect the way we do business.”
Any time one bank takes a risk against another the probability of default exists. To offset this concern, and to support ongoing stability within the interbank market, banks have long emphasized the importance of measuring and managing counterparty risk. Yet over the past few months banks have become noticeably less comfortable trading with each other. The recent deterioration in credit ratings that has hit many U.S. and European banks has led to a heightened sensitivity over counterparty risk. These apprehensions may not be voiced directly, but they become evident when front office trades that would have cleared in the past no longer do because credit lines have been reduced.

As head of the CVA desk at Societe Generale Corporate & Investment Banking (SG CIB), David Murphy has a unique vantage point on interbank relationships. “I wouldn’t want to overstate it – it’s not bringing the industry to a halt. But there is increasing focus on limiting exposures, even among global banks. And that is starting to affect the way we do business.”

CVA desks have grown in popularity as banks seek more effective ways to manage and aggregate counterparty credit risk. From his seat at SG CIB, David has a bird’s eye view on the challenges associated with establishing CVA desks, and the benefits banks can realize by gaining an active view on their portfolio of credit risk.

THE SETUP
Life used to be different – at least in terms of how counterparty credit risk was calculated. In the past, an interest rate swap would have been priced the same for every client. But Lehman’s default, and more recently the Greek sovereign stress, has changed all that. Now, no client is assumed to be truly risk free. Different prices are now expected for different clients on that same interest rate swap, depending on variables including the client’s rating and the overall direction of existing trades between both parties.

Noting their emergence, and particularly their activity in the sovereign CDS market, the Bank of England defined CVA desks in their 2010 Q2 report as follows:

A commercial bank’s CVA desk centralises the institution’s control of counterparty risks by managing counterparty exposures incurred by other parts of the bank...CVA desks will charge a fee for managing these risks to the trading desk, which then typically tries to pass this on to the counterparty through the terms and conditions of the trading contract. But CVA desks are not typically mandated to maximise profits, focusing instead on risk management.

The Bank of England’s summary captures the classic model for running a CVA desk, which Murphy has implemented at SG CIB. The classic approach incorporates three elements:
1. pricing of new trades
2. transferring risk to a centralized desk from individual desks
3. hedging or otherwise mitigating the aggregated risk on a global basis

On all new interest rate, FX, equity, or credit derivatives, CVA desks price the marginal counterparty risk for inclusion into the overall price charged to the client. CVA is a highly complex calculation – and manually calculating that for the thousands of trades and potential trades that pass through a bank every day isn’t realistic. An effective automated system therefore becomes crucial to a CVA desk’s viability.
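The article treats that calculation as a black box. For orientation, the standard textbook discretization of unilateral CVA (not necessarily the method used at SG CIB) is CVA ≈ (1 - R) * sum_i D(t_i) * EE(t_i) * dPD_i: discounted expected exposure weighted by the marginal default probability of each period. A sketch with made-up inputs:

```python
import math

def unilateral_cva(ee_profile, hazard_rate, recovery, discount_rate, dt=0.25):
    """Textbook discrete CVA: (1-R) times the sum of discounted expected
    exposure weighted by the marginal default probability of each period."""
    cva = 0.0
    for i, ee in enumerate(ee_profile):
        t0, t1 = i * dt, (i + 1) * dt
        # Marginal default probability under a flat hazard rate.
        dpd = math.exp(-hazard_rate * t0) - math.exp(-hazard_rate * t1)
        discount = math.exp(-discount_rate * t1)
        cva += (1 - recovery) * discount * ee * dpd
    return cva

# Made-up quarterly expected-exposure profile for a 5-year swap (EUR).
ee = [0, 120_000, 210_000, 260_000, 280_000, 270_000, 240_000, 200_000,
      170_000, 140_000, 115_000, 90_000, 70_000, 50_000, 35_000, 20_000,
      12_000, 7_000, 3_000, 1_000]
charge = unilateral_cva(ee, hazard_rate=0.02, recovery=0.4, discount_rate=0.01)
print(f"CVA charge: EUR {charge:,.0f}")
```

The expensive part in practice is the EE profile itself, which comes from Monte Carlo simulation across netting sets – hence the need for the automated systems David describes.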
UPSIDES OF AUTOMATION
“For a plain vanilla trade with another bank done on an electronic trading platform, our target delivery time for the price is approximately 10 milliseconds,” says David. “On the other end of the complexity spectrum, highly-structured, long-dated trades may require two or three days to calculate the CVA price. Within this range we deliver CVA pricing within timescales that don’t delay the overall trade completion.”

While automated pricing copes well with vanilla products and the speeds required for those trades, there will always be exotic trades, trades where clients have a non-standard credit story, or a trade with special risk mitigation. In these cases, David has a team of four who provide this manual pricing to Sales and Traders on request.

“We try to reduce the need for manual pricing as much as possible, but the business will always have trades where they need someone to take a closer look. There may also be situations where we think the automated pricing isn’t good enough, so we want to take a look anyway and we don’t allow the sales or traders to use the automated pricing provided,” he explains.

In the manual process, the CVA desk team often passes along suggestions to the salesperson for improving the credit risk in a trade and enabling the sales person to offer the trade at a lower credit price. Examples of that would include improving the collateral agreement with a client, or inserting a break clause.

“Via the manual process, we have educated our sales team and traders how they can change the credit risk (and reduce the price). With this knowledge, they now use the automated pre-deal CVA calculations to provide several CVA prices for different versions of the same trade. This allows them to achieve the best price for the client – while minimizing the counterparty risk.”

“Really, what our sales team are interested in, is earning as much as possible net of the CVA. Through the automated system tools, we’ve empowered sales and traders to do trades with the lowest CVA possible. So it’s worth their while spending time looking for that price, and they can now do that themselves quickly and efficiently – without the delays or extra resources required when using the manual pricing process.”

These ‘pre-deal’ checks are purely indicative – and optional for Sales and Traders. But if they don’t do this check, they face a big risk, because every time a new trade is booked, an ‘official’ CVA fee is then auto-calculated – which is then recorded alongside the trade – and will be deducted from the Sales/Trader performance.

PERILS OF PRICING
“A key challenge of building a CVA pricing system is ensuring real-time access to data in three categories: trade details; static data (client data such as rating, details of all pre-existing trades, netting status, collateral details etc), and market data.”

“Designing the system which has reliable and timely data in all 3 categories is crucial, given the impact that will have on pricing and/or hedging decisions. It’s a tough market with sophisticated competitors: if we under-price a risk, you can be sure we will start attracting a large market share. And if we over-price, then we lose business unnecessarily.”
HALFWAY TO HEDGING
Murphy’s first priority for SG CIB has been to ensure the CVA desk correctly prices the risk in all new trades. Now that this process is well-advanced, the desk will start to focus more on hedging – or otherwise mitigating – its legacy portfolio of credit risks. “For hundreds of years banks have managed reasonably well hedging 0% of their counterparty risk. So instead of an instant seismic shift to 100% hedging of all risks, we will be hedging selected segments of the credit and market risk – which avoids paying away all CVA income to the market,” says Murphy.

For banks evolving the CVA function, there are two main reasons hedging is not further along. The first is technology: they may not yet be fully confident in their risk measurement system, which requires a complex and time-intensive development period. The second reason is strategic: the bank might not think that all of its potential hedges are very useful.

“If you take SG CIB’s total portfolio of clients, just over 10% have a liquid CDS curve. In other words, for 90% of our clients, if we wanted to go and buy CDS protection, we couldn’t do it because there’s no market. To hedge these illiquid risks, banks would need to use some kind of credit index.”

But this is an imperfect hedge. In ‘normal’ times, the credit spread of the index and the ‘generic’ spread applied to calculate the client’s CVA will move in tandem. The hedge is therefore effective in reducing earnings volatility from day-to-day changes in the CVA Reserve. But, if the client deteriorates – or even defaults – due to an idiosyncratic reason, then the index hedge may not be affected at all (i.e. the hedge doesn’t work). So the decision whether to hedge illiquid names depends on what you want to protect against: actual losses following default… or earnings volatility caused by changes in market credit spreads.
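The trade-off Murphy outlines can be made numerical. In the sketch below everything is invented – a CVA reserve with a known spread sensitivity (CS01), an index hedge sized to the same CS01, and two scenarios – but it shows why the hedge damps market-wide moves and does nothing for an idiosyncratic blow-up:

```python
def pnl(cva_cs01, hedge_cs01, market_move_bp, idio_move_bp):
    """P&L of (CVA reserve + index hedge) under a spread scenario.
    The index hedge responds only to the market-wide component."""
    cva_change = -cva_cs01 * (market_move_bp + idio_move_bp)  # reserve grows as spreads widen
    hedge_change = hedge_cs01 * market_move_bp                # index pays off on market moves only
    return cva_change + hedge_change

CS01 = 10_000.0  # EUR per 1bp move in the client's generic spread -- hypothetical

# Scenario 1: market-wide widening of 25bp -- the hedge works (P&L ~ 0).
print("systemic 25bp move:    ", pnl(CS01, CS01, market_move_bp=25, idio_move_bp=0))

# Scenario 2: idiosyncratic 200bp blow-up, index unchanged -- the hedge does nothing.
print("idiosyncratic 200bp move:", pnl(CS01, CS01, market_move_bp=0, idio_move_bp=200))
```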
ACCOUNTING FOR BASEL
In the traditional CVA approach, a bank accepts a new trade, takes a fee and uses that fee to buy good hedges for all the risks in that trade. These hedges should eliminate all of the bank’s risk, but this is not necessarily the case once Basel III is taken into account. Basel III does not recognize all types of hedges that the bank might want to use. Therefore the regulatory capital for certain trades will not be zero, even if the bank has used the full CVA fee to hedge all its risks.

The first impact Basel III has on CVA desks is on pricing. Pre-deal pricing needs to be reviewed to ensure the costs of imposed regulatory capital are covered. If not, additional pricing may need to be added. And the decision on which risks are efficient to hedge also becomes affected not just by strategic or business reasons, but also by the regulatory capital impact.

As part of Basel III’s updated regulatory capital guidelines, a new element has been added: VaR on CVA. Regulators have specified very precisely how the underlying CVA must be calculated for this charge. Banks will therefore need to decide whether to adjust their pricing and balance sheet CVA to match the BIII rules, or to use different CVA calculations for pricing and regulatory purposes.
A DEFINING ROLE
When individual trading desks own risk, one desk may have had a positive exposure to a client. This could lead the desk to hedge the positive exposure, without knowing that there was a negative exposure at another desk, which means the hedge wasn’t really necessary. Because the CVA desk owns all the risk from all the different derivatives desks, it has a full view of the risks with each counterparty, across all desks, products and locations and can price and hedge the risk appropriately.

It has been suggested that a CVA desk is just a ‘smart middle office’. Murphy doesn’t agree: “The main difference between CVA and other front office trading functions is that most of the risks are originated internally from the bank’s other trading desks. But the CVA desk must price, originate, and distribute those risks in exactly the same way as any other front office trading desk.”

CVA desks have evolved to price, centralize, and manage a bank’s counterparty risks, requiring sophisticated modeling of hybrid risks encompassing every asset class that the bank is involved in. When implemented correctly, CVA desks should support an institution’s business and strategic vision, while helping banks maintain normalized relationships and control risk in an ever more complex trading universe.
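The ‘full view’ argument is, at bottom, netting before hedging. A minimal sketch with invented desk-level exposures shows how much smaller the bank-wide hedge requirement can be than the sum of per-desk hedges:

```python
from collections import defaultdict

def net_exposures(desk_positions):
    """Aggregate signed exposures per counterparty across all trading desks,
    so hedges are sized on the bank-wide net rather than desk by desk."""
    net = defaultdict(float)
    for desk, counterparty, exposure in desk_positions:
        net[counterparty] += exposure
    return dict(net)

# Invented positions: (desk, counterparty, signed exposure in EUR millions).
positions = [
    ("rates",  "BankA", +30.0),
    ("fx",     "BankA", -25.0),  # offsetting exposure another desk can't see
    ("equity", "BankB", +12.0),
]
print("per-desk gross hedge need:", sum(abs(e) for _, _, e in positions))            # 67.0
print("CVA-desk net hedge need:  ", sum(abs(e) for e in net_exposures(positions).values()))  # 17.0
```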
The OPTIMIZATION of EVERYTHING
OTC Derivatives, Counterparty Credit Risk and Funding
By Jon Gregory
The global financial crisis has created much excitement
over counterparty credit risk (CCR) and, in recognition
of this, banks have been improving their practices
around CCR. In particular, the use of CVA (credit value
adjustment) to facilitate pricing and management of
CCR has increased significantly. Indeed, many banks
have CVA desks that are responsible for pricing and
managing CVA across trading functions. In addition
to CVA, DVA (debt value adjustment) is often used
as recognition of the “benefit” arising from one’s own
default and funding aspects may be considered via
funding value adjustment (FVA). Also, the impact that
collateral has on CVA, DVA and FVA is important to
quantify. Finally, there is a need to consider the impact
of the funding requirements and systemic risk when
trading with central counterparties (CCPs).
The dynamics of trading OTC derivatives are becoming increasingly driven by the components mentioned above. Such a trend can only grow as regulation arising from Basel III creates the need for significantly increased amounts of capital to be held against CCR. It therefore seems likely that banks will not only invest significantly in building knowledge around the aforementioned concepts but will also optimize their trading decisions. For example, should one trade through a CCP or not? Is it preferable to trade with a counterparty via a 2-way collateral agreement (CSA)? Should we collateralize via cash or other securities? What currency should I post cash collateral in?
There are a number of considerations around optimizing OTC derivatives trading with respect to CCR, funding, and systemic risk. From the point of view of a bank, an OTC derivative transaction depends very much on the type of counterparty to the trade. Most unsophisticated users of OTC derivatives will not post collateral against positions while more sophisticated users will post collateral or trade through a central counterparty. This creates a wide spectrum of behaviour with respect to CCR and funding aspects that we will discuss. A bank then has the issue of determining how best to optimize their trading across this spectrum.

The impact of regulation
The Basel III rules will be phased in from the beginning of 2013 and will force banks to hold a lot more equity capital, much of which is due to CCR requirements. Ballpark estimates are that most large banks will have to more than triple the amount of equity held compared with pre-crisis. Loopholes to reduce capital requirements, such as off balance-sheet entities, are being closed. A trillion dollars or so of extra equity will need to be raised by American banks by the end of the implementation of Basel III (2019) with European banks needing to raise a similar figure. Basel III will have a profound effect on banking behaviour. The changes will make all banking activities more expensive, in particular exposures held in the trading book.

Under Basel III, the changes around CCR (that will apply to banks from 1 January 2013) are particularly significant and include:

Stressed EPE. Banks which have permission to use the internal models method (IMM) must calculate exposures using data that includes a period of stressed market conditions, if this is higher than the standard calculation (a toy exposure-profile sketch follows this list).

Wrong way risk. Banks must identify exposures that give rise to a greater degree of “general” wrong-way risk and must assume a higher exposure for transactions with “specific” wrong-way risk.

Systemic risk. Banks must apply a correlation multiplier of 1.25 to all exposures to regulated financial firms with assets of at least $100 billion and to all exposures to unregulated financial firms.

Collateral. A “margin period of risk” of 20 days must be applied for transactions where netting sets are large (i.e. over 5,000 trades), have illiquid collateral, or represent hard-to-replace derivatives. The current time frame on such transactions is 5-10 days. No benefit can be achieved from downgrade triggers (e.g. receiving more collateral if the rating of a counterparty deteriorates). In addition, additional haircuts for certain securities and the liquidity coverage ratio will limit the amount of rehypothecation (reuse of collateral) and encourage the use of cash collateral. This ratio aims to ensure that a bank maintains an adequate level of unencumbered, high-quality liquid assets that can be converted into cash to meet its liquidity needs.

CVA VAR. Banks must hold additional capital to capture the volatility of CVA. This is in addition to the current rules that capitalise default risk.

Central counterparties. A risk weighting of 2% will be given to exposures to a CCP, not only via margin posted but also via the default fund contribution that must be made. In addition, the CCP must meet various rigorous conditions, including the establishment of a high specific level of initial margin and ongoing collateral posting requirements, and that it has sufficient financial resources to withstand the default of significant participants. While this represents an increase (from zero) in capitalization of CCP exposures, it is intended to incentivise the clearing of OTC derivatives through CCPs.
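As promised above, here is a toy illustration of the expected-exposure calculation behind the Stressed EPE item – a caricature of an internal model, not the IMM itself. It simulates portfolio-value paths as a driftless random walk, takes EE(t) = E[max(V_t, 0)], and averages over time; rerunning with a stressed (higher) volatility shows why the stressed calibration dominates:

```python
import numpy as np

def epe(volatility, n_paths=20_000, n_steps=20, horizon=1.0, seed=42):
    """Toy exposure profile: simulate portfolio value paths, take the
    expected exposure EE(t) = mean(max(V_t, 0)), then EPE = time-average of EE."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    increments = volatility * np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
    paths = np.cumsum(increments, axis=1)          # portfolio value starts at 0
    ee_profile = np.maximum(paths, 0.0).mean(axis=0)
    return ee_profile.mean()

print(f"EPE, normal calibration   (vol 10): {epe(10.0):.2f}")
print(f"EPE, stressed calibration (vol 25): {epe(25.0):.2f}")
```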
Collateral and CCPs
Collateral arrangements involve parties posting cash or securities to mitigate counterparty risk, usually governed under the terms of an ISDA Credit Support Annex (CSA). The typical frequency of posting is daily and the holder of collateral pays an (typically