TH!NK
Volume 7, June 2012

COVER STORY: Back to the Future. Revisiting capital and the bank of tomorrow.

ALSO IN THIS ISSUE:
The CVA Desk: Pricing the True Cost of Risk
The Optimization of Everything: Derivatives, CCR and Funding
Through the Looking Glass: Curve Fitting
The Social Media World: What Risk Can Learn From It
Stochastic and Scholastic: The Interconnectivity of Risk

ADVERTISEMENT: Not all risks are worth taking. Measuring risk along individual business lines can lead to a distorted picture of exposures. At Algorithmics, we help clients to see risk in its entirety. This unique perspective enables financial services companies to mitigate exposures, and identify new opportunities that maximize returns. Supported by a global team of risk professionals, our award-winning enterprise risk solutions allow clients to master the art of risk-informed decision making through the science of knowing better.
TABLE OF CONTENTS

Best Of: Recent Awards and Recognitions
Opening Bell: Responses to Uncertainty
In Conversation: IBM's Brenda Dietrich
In Review: Earth Audit
Reading Room: A Roundup of New and Noteworthy Titles
Back to the Future: Revisiting Capital and the Bank of Tomorrow (cover story)
The CVA Desk: Pricing the Cost of Risk at Societe Generale
The Optimization of Everything: OTC Derivatives, Counterparty Credit Risk and Funding
Through the Looking Glass: An Empirical Look at Curve Fitting Counterparty Credit Risk Exposures
The Social Media World: Information (and what risk can learn from it)
Stochastic and Scholastic: Assets, Liabilities and the Interconnectivity of Risk
The Last Word: Risk Man's Desk

PUBLISHER: Michael Zerbs
EDITORIAL, PRODUCTION & DISTRIBUTION MANAGER: Elizabeth Kyriacou
ART DIRECTION: Touchback
CONTRIBUTORS: Leo Armer, Andy Aziz, David Bester, Bob Boettcher, Tom Chernaik, Mike Earley, Jon Gregory, Alan King, Gary King, Francis Lacan, John Macdonald, Cesar Mora, David Murphy, Yaacov Mutnikas, Martin Thomas
CONTACT: Algorithmics, an IBM Company, 185 Spadina Avenue, Toronto, Ontario, Canada M5T 2C6, 416-217-1500

© 2012 Algorithmics Software LLC, an IBM Company. All rights reserved. You may not reproduce or transmit any part of this document in any form or by any means, electronic or mechanical, including photocopying and recording, for any purpose without the express written permission of Algorithmics Software LLC or any other member of the Algorithmics group of companies. The materials presented herein are for informational purposes only and do not constitute financial, investment or risk management advice.
BEST OF

Our commitment to innovation has helped Algorithmics earn a number of public recognitions from industry publications, reader surveys, and judged panels year after year. Below is a list of awards we recently received.

Best Risk Management Technology Provider, HFMWeek's European Hedge Fund Services 2012.
Best Global Deployment for Algorithmics' collateral management client BNY Mellon, American Financial Technology Awards (AFTAs) 2011.
First place for Risk Management – Regulatory/economic capital calculation, Structured Products Technology Rankings 2012.
First place overall for Enterprise-wide risk management, and first place in enterprise-wide market risk management, risk dashboards, risk aggregation, risk capital calculation (economic) and collateral management, Risk magazine's Risk Technology Rankings 2011.
Readers' Choice Winner (Highly Commended) for Best Risk Management Product or Service, Banking Technology Awards 2011.
First place in Market risk management and ALM, Asia Risk Technology Rankings 2011.
Best Risk Analytics Provider, Waters Rankings 2011.
Best Solvency II software package, Life & Pension Risk Awards 2011.
First place overall, and first place for Scenario Analysis, Key risk indicators, and Operational risk loss data collection, Operational Risk & Regulation Software Rankings 2011.
Shortlisted, best post-trade risk management product for Algo Collateral, Financial News Awards for Excellence in Trading & Technology, Europe 2011.
OPENING BELL

Recent elections in France and Greece have added a new chapter to the ongoing sovereign debt crisis in Europe. At the time of this issue going to print, Greek voters turned on the Conservative New Democracy and Socialist PASOK, two parties that have defined Greek politics for decades. New Greek parties from the left and right are divided in outlook but united in opposition to EU-IMF bailouts and their widely unpopular austerity measures.

In France, François Hollande has replaced former President Nicolas Sarkozy. "Europe is watching us," said Hollande during his victory speech. "At the moment when the result was proclaimed, I am sure that in many countries of Europe there was relief and hope: finally austerity is no longer destiny." Yet following both elections, Chancellor Angela Merkel of Germany clearly stated that neither she nor her government were interested in reopening the eurozone fiscal pact, or the strategy of deficit-cutting austerity measures.

What is the appropriate response in times of uncertainty and conflicting views on future direction? This has been an issue for financial service firms since the financial crisis. Regulators, governments and analysts have called for financial firms to change the way they do business.

One way that firms may be able to respond is by looking to how they have managed uncertainty in the past. In "Back to the Future", this issue's cover story revisits capital and its role in the bank of tomorrow. When early banks operated as partnerships with personal liability attached, every decision regarding capitalization and risk profiles was owned by decision makers. The impact of this framework on their business holds interesting implications.

Elsewhere in our pages are other features that explore new approaches to existing challenges. These include a look at interconnectivity and stochastic modeling, risk and social media, and the CVA desk's function of pricing the true cost of risk. In "Through the Looking Glass" we return to the topic of curve fitting, with an empirical look at how chief risk officers and supervisors can gain critical insights into major exposures they would otherwise be unable to obtain.

In finance and politics, there will always be an element of uncertainty. As an industry and as global citizens, we will continue to identify and respond to the challenges of our times by searching the past, and also for solutions that have yet to be discovered.

Michael Zerbs
Vice President, IBM Risk Analytics
IN CONVERSATION

Brenda Dietrich has spent her professional career with IBM Research, and recently became the company's first CTO of Analytics Software. In this issue's conversation, Brenda discusses the nature of research, new data streams, and how the way we think about information is changing.

TH!NK: You have been connected with IBM Research since the mid-1980s. Has the company's approach to research changed over this span?

Brenda: It has changed quite a lot. In my early days with the group, IBM Research most closely resembled a think tank. Our job was to figure out cool things one could do with computing and computers first, and then to try and establish a shared vision within the company's product lines. In that period we invented some wonderful things and published papers and patents. After we were done, it fell to others to find applications for our work. Over time, it has become more of a shared responsibility to connect our work with IBM product and service lines.

In the last decade or so, we in the Research division have been much more tightly engaged with actual end users. Part of our role is now to understand how people approach computing, how they would like to use computing, and doing experiments in the art of the possible with real people. And that is a huge amount of fun.
TH!NK: Why the emphasis on working with people?

Brenda: Ten years ago, the research lab was focused on the algorithm. The operating model for the math team was, "someone gives me the mathematical representation of the business problem and I'll work on the algorithm to solve it." It would return a mathematical representation of the solution, or perhaps a code, and it was someone else's role to fit that into the business process.

For the relatively static problems we were looking at then, like flight schedules for airlines or production schedules for the manufacturing floor, this worked reasonably well. But we now live in a world where things are much more dynamic. The easy stuff has been done. As we try to push the use of mathematics to support business decision making, we are working in problem domains that are much more subtle. Fine differences between the way two different enterprises in the same industry operate come into play.

TH!NK: Has the direction of research remained consistent, or has it evolved in surprising directions?

Brenda: The GTO (Global Technology Outlook) is an annual process in the Research division. I have been directly involved in every GTO since 1995. In the 1990s, our focus was much more about speeds and feeds: how fast would storage be, how dense would storage be? How fast would access be? How many compact transistors and circuits would fit onto a chip? I'd estimate the split at that time to be around 50% hardware, 50% how the hardware will be used. This past year it was maybe 10% hardware, and 90% how the hardware will be used.

TH!NK: What caused the change?

Brenda: For many applications hardware is good enough now, whereas that wasn't the case 15 years ago. It's not that we are struggling with the challenge of how to do the things we want to do with computers. Now we are saying, "ok, we have this enormous wealth of data and computational power. What can we do with it?" In other words, it is much more focused on how we can use technology, less on how we can progress technology along its natural course. We are also now more concerned with other perspectives. I would call this modeling compliance: do the people do with computers things they should do, and can they (both the computers and the people) adapt?

TH!NK: What would be an example of that type of compliance?

Brenda: Think about the GPS in your car. I don't always follow the instructions mine gives me. And I really wish that she would keep track of what I do and learn that "Brenda prefers this street to that route," for whatever reason, and be responsive to that, rather than just yell at me and recalculate every time.

TH!NK: I would too. The information GPS devices pick up represents new data streams, which are a big focus of the 2012 GTO. What streams are out there?

Brenda: We're most familiar with structured data, which is generally numeric and tends to be nicely organized. You can find each of the pieces of it that you want, and nothing else. You can do queries against structured data. You can find averages, and ranges, and apply standard deviations. A lot of people say structured data is data you do arithmetic on, but a lot of properly formatted text data is also structured. For example, the name field in a client record. You can't average two names or talk about a range of names; that doesn't make any sense. But you can match names against one another in a way to say, "these two instances are actually the same person versus they are different people." With geo-spatial data, we tend to be computing along varied types of metrics, so what you tend to do with location data is compute distances. People we try to count. And then we try to categorize them. Most catalog data is now also fairly well structured. You couldn't do things like Amazon searches if their catalog data weren't reasonably well structured. Now, it still may be imperfect, but it's far less imperfect than five years ago.

TH!NK: Catalog data takes us online, which is where most unstructured data exists. How do you define it, and what is its significance?

Brenda: I don't have a concise definition for unstructured data, but I would say that unstructured data is the data that we don't yet know how we are going to be computing with (as opposed to just storing, copying, and accessing). A lot of free form text data is unstructured. Once you have annotated it and tagged it with the associated metadata, you can say, "this sentence is about this person. It's about where she lives. It's about where she works. It's about her relationship with other people." When you have all of that in the metadata it starts to move towards the structured space.

But when it is just free form text, without the metadata, that's unstructured. Most instances of data are very unstructured, like voice data or tweets. Then it gets more complicated when we include data coming off of sensor systems. We don't know what we are going to do with it yet, and right now it is just drilling random measurements. We know what each field means; we don't yet know how we are going to be computing on them. So that is kind of in a strange land.

TH!NK: So the next challenge is to incorporate these new data streams?

Brenda: Yes. Let's talk about a really simple example: retail forecasting. The easily acquired structured data comes off the point of sale device. You obtain huge amounts because every item is scanned. You can sort through that data and count the number of a given item bought each day at each location. You can keep track of what combinations are bought together. And you can do simple time series to forecast how much it will evolve in the future.

You can also pull in other sources of data like weather, advertising events, or a single time event, and begin to understand how they impact the consumption, demand, and purchasing of these items. If you extract the effect from the underlying signal, you can propagate that signal forward using your usual time series methods. You can then look at when these events are going to happen in the future, and put the multipliers back in. It can lead to an even better job of forecasting.
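Dietrich's recipe, strip the event effect out of the signal, forecast the base series, then reapply the multipliers where future events are planned, is easy to make concrete. The Python sketch below is a deliberately minimal illustration: the sales figures, the single promotion flag, and the mean-based baseline model are all invented for the example, standing in for a real seasonal time-series model.

```python
import numpy as np

# Daily unit sales and a 0/1 flag marking promotion days (toy data).
sales = np.array([100, 102, 98, 250, 101, 99, 240, 103], dtype=float)
promo = np.array([0, 0, 0, 1, 0, 0, 1, 0], dtype=bool)

# 1. Estimate the event multiplier and strip it out of the signal.
multiplier = sales[promo].mean() / sales[~promo].mean()
baseline = np.where(promo, sales / multiplier, sales)

# 2. Forecast the de-eventized baseline (a trivial mean model here;
#    a real system would propagate it with a seasonal time-series model).
base_forecast = baseline.mean()

# 3. Put the multiplier back in wherever a future promotion is planned.
future_promo = [False, True, False]
forecast = [base_forecast * (multiplier if p else 1.0) for p in future_promo]
print(forecast)
```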
TH!NK: And data streams can be used to look at individual decision making as well.

Brenda: To stay with retail for a moment, the area that everyone is looking at, especially for big-ticket items, is intent to buy. This is what people tweet about, what they post on social media sites, on various blogs, and mention in comment fields on sites. These are activities that tend to be done before the buy happens. A lot of them are probably noise. And so the two things you will want to do is to detect a mention of a product, and you want to keep track of the mentions of that product. You can review the data by source, by what type of person, by time, and then compare that to the actual buys of the product that occurred at some later point. You want to understand: is there a decent correlation here? Is expression of intent a signal to buy? How powerful of a signal is it?

If this process gives differentiating insight to one of the actors in an economic ecosystem that the other actors don't have, it helps create an advantage. And the name of the game right now is to find insight faster than anybody else.

TH!NK: Moving away from retail, what does the future hold for high performance users?

Brenda: Over time I think it will become important to gain a better understanding of how data is used in combination. One of the first uses of mathematical algorithms to control how a computer actually worked dates back to the big flat platter disc drives. For those drives, you had to decide which track you were putting what data in. And you did the analysis up front of which data were you most likely to be accessing most often, so that data could go in the center track. The least frequently accessed data was at the very center and at the very edge.

You did this because the center is the point from which, no matter where the head happens to be, on average it's the shortest distance to get to. So, this was a really important algorithm. It sped things up tremendously in terms of data access.

TH!NK: How does that relate to using data in combination?

Brenda: Most of the advanced analytics we use pull in multiple sets of data. We might be pulling in bigger historical data sets when we are looking at one economic measurement versus another. Moving forward, we may want to pull in event information as well. We may want to see if an event, or a publication, or a blog, or some other signals affect our targets, and with what duration.

The goal would be to pull multiple sources of data and try to determine if one piece informs another piece in any way. Can we compute "a priori" the frequency with which two pieces or classes of data are going to be used together, and figure out some way to store them so that we can get them ready and together, at the same time? Because you can't start the computations until you have both of those pieces. As long as the memory is in one place and the information is on a magnetic drive somewhere, you bring one in first and then wait for the other. If we could fetch them together because they were the same distance apart, if you will, it would be very interesting. That's my notion of using data together.

TH!NK: Let's talk for a moment about titles. A Chief Risk Officer is common. I have heard people start to speak about a Chief Data Officer.

Brenda: I would want to be Chief Analytics Officer.

TH!NK: What would that role be?

Brenda: Within a company like IBM, analytics touch almost every part of our internal operations. We use analytics in human resources. We use them in supply chain. We use them in our own financial planning. We use them in our own risk analysis. We use them in facility planning. We use them in compensation.

There is a danger, however, of each individual group using different tools and different data to make the same sets of decisions. And a Chief Data Officer, who may or may not be your CIO, may be charged with the one source of the truth. They are the keepers who know the data of record for everything, as there should be.

But there's more than one way to analyze data. There are different techniques. This is a field where it does require some understanding of the theory behind the methods. In an enterprise that is using analytics in multiple business functions, it helps to have someone who will say, "this is our strategy; these are the tools we will use; this is how we will share; this is how we will be more efficient."

It's about more than having the same data go into two analysis processes. It is about the assumptions and the methodologies that are used in the analysis processes. That's why I think a Chief Analytics Officer is going to be very important in the future in companies.

TH!NK: It sounds like the data we gather is changing, and the ways we gather that data are changing too. Does this mean we need to change the way we think about data itself?

Brenda: As someone who grew up believing in the scientific method, it's my comfort zone. Researchers and scientists generally say, "I want to look at the data, I want the data to inspire hypothesis, and then I want a way to test that hypothesis."

We can't necessarily create new experiments with everything that we come across today. What we can observe, which is vast, isn't the law of physics. We almost have to think of ourselves differently. Perhaps as researchers we should no longer think of ourselves as laboratory scientists, but more like astronomers. We have great tools to observe the stars, but we can't move them around.
IN REVIEW: EARTH AUDIT

The finite supply of natural resources drives economies and influences pricing. Even though their costs may not be directly factored into all products and services, availability of key minerals can impact operational risk, markets, and capital. Using rough calculations, this energy audit illustrates how increases in living standards affect the rate of consumption, and brings an eye-opening perspective to the state of our planet.

IF DEMAND GROWS… Some key resources will be exhausted more quickly if predicted new technologies appear and the population grows:

Antimony: 15–20 years
Silver: 15–20 years
Hafnium: ~10 years
Tantalum: 20–30 years
Indium: 5–10 years
Uranium: 30–40 years
Platinum: 15 years
Zinc: 20–30 years

Source: Armin Reller, University of Augsburg; Tom Graedel, Yale University
Earth Audit graphic © 2007 Reed Business Information, UK. All rights reserved. Distributed by Tribune Media Services.
READING ROOM

There's a connection between our thought process and the courses of action we choose. These new and noteworthy titles explore the science of decision making, and the impact ideas can have on society as a whole.

SCIENCE +

James Gleick digs deep into The Information, a journey from the language of Africa's talking drums to the origins of information theory. Gleick explores how our relationship to information has transformed the very nature of human consciousness. Charles Seife examines the peculiar power of numbers in Proofiness, an eye-opening look at the art of using pure mathematics for impure ends. In Being Wrong, Kathryn Schulz wonders why it's so gratifying to be right and so maddening to be mistaken – and how attitudes towards error affect decision making and relationships. John Coates reveals the biology of financial boom and bust in The Hour Between Dog and Wolf. Coates, a trader turned neuroscientist, shows how risk-taking transforms our body chemistry and drives us to extremes of euphoria or depression.

Poor Economics by Abhijit Banerjee and Esther Duflo (PublicAffairs)
The Price of Civilization by Jeffrey Sachs (Random House)
Paper Promises by Philip Coggan (Allen Lane)
Finance and the Good Society by Robert Shiller (Princeton University Press)

SCREENING ROOM +

Based on the bestselling book by Andrew Ross Sorkin, Too Big To Fail reshapes the 2008 financial meltdown as a riveting thriller. Centering on U.S. Treasury Secretary Henry Paulson, the film goes behind closed doors for a captivating look at the men and women who decided the fate of the world's economy in a few short weeks.

Too Big to Fail, directed by Curtis Hanson (HBO Films)
The Information by James Gleick (Vintage)
Proofiness by Charles Seife (Viking Adult)
Being Wrong by Kathryn Schulz (Ecco)
The Hour Between Dog and Wolf by John Coates (Random House)

SOCIETY +

Jeffrey Sachs has travelled the world to help diagnose and cure seemingly intractable economic problems. In The Price of Civilization, Sachs offers a bold plan to address the inadequacies of American-style capitalism. Abhijit Banerjee and Esther Duflo offer up a ringside view of Poor Economics, arguing that creating a world without poverty begins with understanding the daily decisions facing the poor. Philip Coggan's Paper Promises examines debt, the global finance system, and how the current financial crisis has deep roots – going back to the nature of money itself. Robert Shiller believes that finance is more than the manipulation of money or management of risk. In Finance and the Good Society, Shiller calls for more innovation and creativity so that society can harness the power of finance for the greater good.
BACK TO THE FUTURE
Revisiting Capital and the Bank of Tomorrow

Over the decades there have been many views on what the bank of the future would be. Some ideas have been radical and some have been transitional, while others never really took hold. This particular view begins in Pawtuxet, Rhode Island, on the eastern seaboard of the United States. By all accounts a lovely place to visit, Pawtuxet is well known for scenic harbour views and boating along its historic river corridor. But in the early 19th century, textile mills and coastal trade dominated its landscape. As the community thrived and local businesses grew, the Pawtuxet Bank emerged. Common for its time, the bank was a partnership and its directors mostly merchant-manufacturers.

The Pawtuxet Bank's directors shared personal liability in the event of loan default, or if the bank itself failed. In "The Structure of Early Banks in Southeastern New England", Naomi R. Lamoreaux recounts the events of June 1840 when the bank's stockholders presented themselves before the Rhode Island General Assembly. The group appeared seeking permission to reduce the bank's capitalization from $87,750 to $78,000 in order to cover losses sustained due to the death of John Pettis, one of the bank's directors:

    Pettis died in 1838 with notes worth $8,800 outstanding at the bank and endorsements amounting to at least another $1,500...(t)his loss was not sufficiently large to cause the bank to fail. Nor did depositors or the bank's own noteholders suffer. Most (91 percent) of the bank's loans were backed by capital rather than notes or deposits, and the stockholders simply absorbed the loss.

We are unlikely to return to the capitalization levels or strict regional focus employed by the gentlemen of Pawtuxet. There are however crucial lessons to be learned when examining the structure and scope of early financial institutions. When we talk about addressing concerns over capital, funding, and liquidity, it just might be that what we need for the bank of tomorrow is not a new model, but rather one that takes inspiration from the bank of yesterday.

FIRST DEPOSITS

In the early history of banking, each partner made decisions knowing they shared liability if the bank failed. As a result, choices on whom credit should be extended to were not taken lightly. Unambitious business owners with a slight but steady production of widgets were considered ideal customers. Less attractive was the dubious repayment potential of innovative or entrepreneurial types. The latter group represented the potential of a phenomenal return on investment, but only if an unproven product or process succeeded.

This philosophy was in keeping with the dominant banking theory of the 18th and early 19th centuries. The real-bills doctrine proposed that banks should restrict the extension of credit to customers involved in the transfer of existing products only. Real-bills supporters argued that by basing loans on the security of actual goods, any individual bank's liquidity was ensured.

While there is an admirable simplicity in tying loans to tangible goods, the skyline of the 19th century was starting to change. There were towers, factories and infrastructure projects that needed to be built, without existing product to offer in exchange for funding. And these projects were poised to generate great returns.

The unlimited liability model was ill-suited to finance these projects. When shareholders' money was directly on the line, banks had good reason to avoid speculative projects. Because the incentives for self-discipline were so high, banks often lent only to those they knew best. These could be local businessmen, often engaged in the same type of industry as the bank shareholders. Often bank funds became personal resources for the shareholders themselves, and this type of insider lending, or trading, would frequently make up the majority of a bank's exposures.

As the world began to change, so did banks. By the late 1850s, Great Britain had moved towards limited liability, with France following suit in 1867. As with unlimited liability, the logic was easy enough to follow: if a bank could diversify its investor base, there would be a greater availability of credit and capital.
The adjustment would correct what had turned out to be the too successful risk measure of personal obligation: banks weren't interested in funding anything risky. In "Early American Banking: The Significance of the Corporate Form," Richard Sylla suggests that the tipping point away from unlimited liability originated with the New York Free Banking law of 1838, which stated, "no shareholder of any such association shall be liable in his individual capacity for any contract, debt or engagement of such association." New York's free banking law didn't just make limited liability possible. It opened the door for the incorporation of banks and the freedom from personal obligation.

ONE STEP FORWARD, TWO STEPS BACK

As banks moved beyond their villages in search of capital and opportunities, the strategies and measurements used in their operation changed as well. Instead of prudence being the only driver, customer profitability and shareholder value became ongoing concerns.

Expansions, mergers, and deregulation replaced local partnerships with a mandate to maximize customer bases and profitability. Operating at an extreme opposite of the early 19th century model were institutions like Alfinanz, an offshore administration factory that functioned as a global back office for a global network of financial advisors, intermediaries, or brokers.

As banks moved from private partnerships to public corporations, shareholder demands added another voice to how bank capital and risk would be managed. Enhanced returns were a factor in banks' decisions to pursue diversification, more complex transactions such as structured products, and other strategies that gained support from managers operating with limited liability.

In the modern era of banking, even the idea of 'who is a customer' was up for grabs. In the 1990s, First Manhattan Consulting Group took a leading role in introducing profit-based segments to banking. First Manhattan came to prominence with the now famous conclusion that only 20% of a bank's customers were profitable. Their idea to focus on profitable customers only was an attractive one to banks who were seeking to improve low revenue growth, particularly in core retail products. The concept also encouraged mergers and the creation of larger banks, who were better positioned to take advantage of segmentation opportunities.

Today we see banks retreating from these drivers and measures, often forced to adjust strategy by regulation, and perhaps in retreat from acting "in loco parentis".

DYNAMIC CAPITAL MANAGEMENT
By Francis Lacan

To visualize the concept of dynamic capital management, think of flying an aircraft as close as possible to the ground. If you go too high, the cost of fuel becomes unreasonable. You cannot go negative because the option doesn't really exist. The goal is to seek out the most efficient middle ground that best mimics the changing landscape below.

Managing capital dynamically would enable a bank to determine, on a day by day and month by month basis, the most efficient flight path for capital and allocate it accordingly. Optimization and anticipation are the two extremes guiding such decisions, and in the middle reside a big set of constraints. Basel III and its liquidity coverage ratio have restricted certain freedoms, particularly in terms of asset qualification. The other set of constraints is risk management. Liquidity is increasingly subject to risk management because there are lots of dependences in funding liquidity and the rest of the risk.

It seems liquidity is following a similar path to what happened with capital and solvency. Banks didn't invest much in economic capital, but the strong requirement to look at regulatory capital acted as an incentive to build more analytics, more rigorous reporting, and to become more serious about addressing uncertainty with the right tools. What is more complex for capital management is to connect all the sources of information. Pulling cash flow across entities and supporting good cash management today still has a lot of room to evolve. There are for example too many overlapping systems of information that are not very good at talking to one another. Overcoming this hurdle would be a huge step towards active capital management, rebalancing, and optimization.

The current baseline for automation is extremely low. The only true mechanical element is the planning of short term inflows and outflows within the Treasury, because their contractual commitments are relatively easy to model. The rest is seen as shocks, and the focus seems to be on what the regulators are asking banks to address as the possibility of these shocks. As a result, banks are being pushed into modeling with greater consistency what may happen with respect to different uncertainties tied to cash flows.

In the short term, banks will have to continue on the foundations of operational management of cash and collateral, addressing regulatory requirements for cash flow modeling and forecasting, asset qualification, and scenario modeling. Together, these elements will provide a rugged foundation to move towards automated decision making, and eventually, a more automated approach to at least some aspects of capital management.

This prediction comes with a number of 'ifs': if you are able to have very good and trustable aggregated pooling of all internal and external balances of cash in all currencies, and if you have access to a very good repository for your treasury operations so you can see your money market for all these currencies, you could to an extent begin to automate capital allocations for particular areas of the business. Decisions on how to refinance each of these currencies, and perhaps rebalance positions into a smaller number of currencies to save costs or optimize even the risk profile of certain transactions, could in theory be automated.

This won't happen tomorrow. But with the proper foundation, we have the technology to make dynamic capital management part of the bank of the future.
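Lacan frames allocation as an optimization squeezed between constraints: total capital, liquidity-style floors, concentration limits. As a purely illustrative sketch of that framing (not anything described in the article), here is a toy linear program in Python. The desks, returns, and constraints are all hypothetical; a real implementation would model risk, funding, and regulatory ratios far more richly.

```python
from scipy.optimize import linprog

# Hypothetical risk-adjusted returns per unit of capital for three desks.
returns = [0.12, 0.09, 0.15]

# linprog minimizes, so maximize total return by negating it.
c = [-r for r in returns]

# Constraint 1: allocations must not exceed total available capital.
# Constraint 2 (a crude stand-in for an LCR-style floor): the "safest"
# desk must receive at least 30% of total capital.
total_capital = 100.0
A_ub = [[1, 1, 1], [-1, 0, 0]]
b_ub = [total_capital, -0.3 * total_capital]

# Per-desk concentration limits.
bounds = [(0, 50), (0, 50), (0, 50)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(dict(zip(["desk_a", "desk_b", "desk_c"], res.x)))
```

Re-solving a program like this daily, with refreshed balances and constraints, is one concrete reading of the "flight path" metaphor.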
Shareholder demands, which focused exclusively on the creation of shareholder value, must now be balanced against closer regulatory scrutiny and the need to protect customer interests.

Diversification led to its own set of challenges, as it did not help spread risk well. The credit crisis demonstrated that market risk and credit risk can appear in unexpected ways, and that the need to maintain strong liquidity positions was more crucial than realized.

All the short term profitability in the world cannot help if the system isn't stable. And today, if you want stability, every discussion must begin with the importance of access to capital.

CAPITAL: THE ONCE AND FUTURE KING

In his memoir On the Brink: Inside the Race to Stop the Collapse of the Global Financial System, former U.S. Secretary of the Treasury Henry Paulson reflects back on the credit crisis. One of his conclusions is that the financial system contained too much leverage, much of which was buried in complex structured products:

    Today it is generally understood that banks and investment banks in the U.S., Europe, and the rest of the world did not have enough capital. Less well understood is the important role that liquidity needs to play in bolstering the safety and stability of banks...(f)inancial institutions that rely heavily on short-term borrowings need to have plenty of cash on hand for bad times. And many didn't.

Politicians and regulators have joined hands on capital, proposing measures that would lead to banks holding more of it. Many banks have argued against this approach, claiming that additional capital requirements would affect performance and competition. Yet recent investigations into the correlation between bank capital and profitability suggest that holding additional capital may not be a bad thing. Which is encouraging, since banks will likely have to do it anyway.

Allen Berger and Christa Bouwman's interests are reflected in the title of their recent paper, "How Does Capital Affect Bank Performance During Financial Crises?" The authors examine the effects of capital on bank performance, as well as how these effects might change during normal times as well as banking and market crises. The empirical evidence led Berger and Bouwman to the following conclusions:

    First, capital enhances the performance of all sizes of banks during banking crises. Second, during normal times and market crises, capital helps only small banks unambiguously in all performance dimensions; it helps medium and large banks improve only profitability during market crises and only market share during normal times.

Empirical evidence, regulatory measures, and perhaps common sense dictate that holding additional capital is a worthy goal for banks. Yet even if banks wanted to raise capital thresholds, it isn't as easy as flipping a switch. A lack of cheap availability and reduced funding sources have changed the capital landscape. Banks of the future must focus on the preservation and leverage of available capital, and make that capital work harder.

Part of this focus must be organizational. Allocation of capital can no longer be controlled at a business unit, subsidiary, country, or branch level. It needs to be allocated at the time of doing business with specific customers, business lines, and even at a transaction level. Dynamic capital leads to a radically different structure, where the treasury becomes the 'owner' of capital, lending it to deal makers on demand.

A dynamic treasury requires great understanding of the uses and cost of capital, connected to the technological ability to 'solve' the problem of Big Data. In the sidebars to this article, my colleagues have expanded on the linked topics of dynamic capital and managing complex data.

BANKING ON THE PAST

In the early 19th century, the UK limited banking partnerships to six members. No one is suggesting banks return to this restriction. But if we think about various aspects of the unlimited liability banking model, it appears many of their tendencies are being echoed in calls from regulators and stakeholders.
Long-dated compensation reform and shareholder 'say on pay' programs can be seen as measures intended to update the shared liability and sense of ownership partners used to bring to banks. The credit crisis has driven home the importance of liquidity, and that gaining capital can be expensive – if it can even be acquired in times of a crisis. In a way this reflects the notion early bankers held that capital was expensive, and bringing in additional funds or partners would dilute earnings.

Insider lending and specialization that gave way to diversification and fewer restrictions on portfolios is being balanced by technologically enabled means to better know customers. Enhanced collateral management and approaches like CVA can be used to gain a deeper understanding of capital exposures before entering into an agreement.

If banks are to thrive in the future, preservation and leverage of available capital are crucial steps. Dynamic allocation, enabled by a treasury that quickly and effectively uses available capital in prudent ways, could perhaps be the defining characteristic of the bank of tomorrow. From the outside, these institutions would look nothing like the stakeholders of the Pawtuxet Bank, but they would be related in spirit.

DATA COMPLEXITY
By Leo Armer

In the Pawtuxet model, a small number of operating partners owned the bank's data. It was their responsibility to collect information about their clients, and use this knowledge to guide business decisions.

Banks today have challenges managing data, in large part because the acts of collecting and analyzing information have become so separated. The greater this disconnect, the more important transparency becomes.

For both banks and clients, it's crucial to be able to ask: "If this is my risk number, where did it originate? How do I track it? How can I see which systems it passed through, and what happened to it along the way?" Being able to take a number from a balance sheet or a general ledger and drill back to its origin provides a huge amount of confidence.

In order to make good decisions, you need to see the big picture. If data complexity is viewed purely as a technological issue, its strategic importance can be overlooked. When institutions attack data issues purely from an IT perspective, rules are created, transformations take place, and the data is considered 'clean' after going through a reconciliation process. Various systems and approaches are employed to ensure that the numbers coming out of the front system match numbers coming out of the general ledger system, and these match the numbers coming from treasury. The problem is, as much as you can clean the data on a Monday, unless you change the people or method of entering that data, it's going to need cleaning up again on Tuesday.

Today, a few firms are approaching data complexity from a business perspective. They have put their main focus on creating a single data warehouse where all the information is stored. This approach is based on the insight that every piece of data has a golden source: a reliable point of origin before it gets passed through different hands and different teams. It becomes as much about changing mental attitudes as it is about technical architectures.

Creating a golden source for data becomes even more crucial when we see what has happened in the last couple of years. CVA charges for example occur when a bank puts a variable fee on top of a deal, depending on whom they are trading with. If your bank was to trade with another bank, and you had a long history with the other bank and deep insights into their credit status, the bank would likely get a better price for that trade than a small finance company from Greece who might be looking less solvent. In these transactions, where is the golden source? Is it in the middle office or front office data? Who is taking ownership over the trades? They can't be processed the way they used to be, otherwise you're swimming against the current demand for real time responses. If it takes six or seven days to work out what happened when a counterparty defaults, you're too far behind the curve.

It is becoming more common to run into or hear discussions about appointing a CDO, or Chief Data Officer. A CDO, or at least the mindset within an institution that data quality is crucial and strategically relevant, can help banks evolve beyond workarounds and create a repository of golden source data. Through a framework where standards, direction, and architecture are provided to different departments throughout an organization, the bank of the future can overcome data complexity.
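Armer's drill-back question ("if this is my risk number, where did it originate, and which systems did it pass through?") is, at bottom, a data-lineage requirement. The toy Python sketch below shows one way to attach lineage to a figure as it hops between systems; the class, the system names, and the values are all invented for illustration, not a description of any vendor's architecture.

```python
from dataclasses import dataclass, field

@dataclass
class RiskNumber:
    value: float
    source: str                      # the "golden source" system of record
    lineage: list = field(default_factory=list)

    def passed_through(self, system: str, new_value: float) -> "RiskNumber":
        # Record each hop so the number can be drilled back to its origin.
        self.lineage.append((system, self.value))
        self.value = new_value
        return self

exposure = RiskNumber(1_250_000.0, source="front_office_booking")
exposure.passed_through("middle_office_enrichment", 1_248_500.0)
exposure.passed_through("general_ledger", 1_248_500.0)
print(exposure.source, exposure.lineage)
```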
THE CVA DESK
Pricing the cost of risk at Societe Generale
By Bob Boettcher
Any time one bank takes a risk against another, the probability of default exists. To offset this concern, and to support ongoing stability within the interbank market, banks have long emphasized the importance of measuring and managing counterparty risk. Yet over the past few months banks have become noticeably less comfortable trading with each other. The recent deterioration in credit ratings that has hit many U.S. and European banks has led to a heightened sensitivity over counterparty risk. These apprehensions may not be voiced directly, but they become evident when front office trades that would have cleared in the past no longer do because credit lines have been reduced.

As head of the CVA desk at Societe Generale Corporate & Investment Banking (SG CIB), David Murphy has a unique vantage point on interbank relationships. "I wouldn't want to overstate it – it's not bringing the industry to a halt. But there is increasing focus on limiting exposures, even among global banks. And that is starting to affect the way we do business."

CVA desks have grown in popularity as banks seek more effective ways to manage and aggregate counterparty credit risk. From his seat at SG CIB, David has a bird's eye view on the challenges associated with establishing CVA desks, and the benefits banks can realize by gaining an active view on their portfolio of credit risk.

THE SETUP

Life used to be different – at least in terms of how counterparty credit risk was calculated. In the past, an interest rate swap would have been priced the same for every client. But Lehman's default, and more recently the Greek sovereign stress, has changed all that. Now, no client is assumed to be truly risk free. Different prices are now expected for different clients on that same interest rate swap, depending on variables including the client's rating and the overall direction of existing trades between both parties.

Noting their emergence, and particularly their activity in the sovereign CDS market, the Bank of England defined CVA desks in their 2010 Q2 report as follows:

    A commercial bank's CVA desk centralises the institution's control of counterparty risks by managing counterparty exposures incurred by other parts of the bank...CVA desks will charge a fee for managing these risks to the trading desk, which then typically tries to pass this on to the counterparty through the terms and conditions of the trading contract. But CVA desks are not typically mandated to maximise profits, focusing instead on risk management.

The Bank of England's summary captures the classic model for running a CVA desk, which Murphy has implemented at SG CIB. The classic approach incorporates three elements:

1. pricing of new trades
2. transferring risk to a centralized desk from individual desks
3. hedging or otherwise mitigating the aggregated risk on a global basis

On all new interest rate, FX, equity, or credit derivatives, CVA desks price the marginal counterparty risk for inclusion into the overall price charged to the client. CVA is a highly complex calculation – and manually calculating that for the thousands of trades and potential trades that pass through a bank every day isn't realistic. An effective automated system therefore becomes crucial to a CVA desk's viability.
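The article treats the CVA calculation itself as a black box. For readers who want its shape: a common textbook form of unilateral CVA is loss given default times the sum, over time buckets, of discounted expected exposure weighted by the marginal default probability in each bucket. The Python sketch below implements that form under deliberately simple assumptions (flat hazard rate, flat discount rate, pre-computed exposure profile); all numbers are invented, and this is a generic illustration, not SG CIB's methodology.

```python
import math

def cva(expected_exposure, times, hazard_rate, recovery, discount_rate):
    """Unilateral CVA: (1 - recovery) * sum over buckets of
    discounted expected exposure * marginal default probability."""
    lgd = 1.0 - recovery
    total = 0.0
    prev_t = 0.0
    for t, ee in zip(times, expected_exposure):
        # Survival probabilities from a flat hazard rate.
        surv_prev = math.exp(-hazard_rate * prev_t)
        surv_now = math.exp(-hazard_rate * t)
        marginal_pd = surv_prev - surv_now      # P(default in (prev_t, t])
        df = math.exp(-discount_rate * t)       # risk-free discount factor
        total += df * ee * marginal_pd
        prev_t = t
    return lgd * total

# Toy 5-year profile for a swap-like exposure (assumed numbers).
times = [1, 2, 3, 4, 5]
ee = [1.2e6, 1.8e6, 1.9e6, 1.5e6, 0.8e6]   # expected exposure per bucket
print(cva(ee, times, hazard_rate=0.02, recovery=0.4, discount_rate=0.03))
```

In production the expected exposure profile comes from Monte Carlo simulation across all trades under a netting agreement, which is exactly why the automated data plumbing discussed next matters so much.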
UPSIDES OF AUTOMATION

"For a plain vanilla trade with another bank done on an electronic trading platform, our target delivery time for the price is approximately 10 milliseconds," says David. "On the other end of the complexity spectrum, highly-structured, long-dated trades may require two or three days to calculate the CVA price. Within this range we deliver CVA pricing within timescales that don't delay the overall trade completion."

While automated pricing copes well with vanilla products and the speeds required for those trades, there will always be exotic trades, trades where clients have a non-standard credit story, or a trade with special risk mitigation. In these cases, David has a team of four who provide this manual pricing to Sales and Traders on request.

"We try to reduce the need for manual pricing as much as possible, but the business will always have trades where they need someone to take a closer look. There may also be situations where we think the automated pricing isn't good enough, so we want to take a look anyway and we don't allow the sales or trades to use the automated pricing provided," he explains.

In the manual process, the CVA desk team often passes along suggestions to the salesperson for improving the credit risk in a trade and enabling the sales person to offer the trade at a lower credit price. Examples of that would include improving the collateral agreement with a client, or inserting a break clause.

"Via the manual process, we have educated our sales team and traders how they can change the credit risk (and reduce the price). With this knowledge, they now use the automated pre-deal CVA calculations to provide several CVA prices for different versions of the same trade. This allows them to achieve the best price for the client – while minimizing the counterparty risk."

"Really, what our sales team are interested in, is earning as much as possible net of the CVA. Through the automated system tools, we've empowered sales and traders to do trades with the lowest CVA possible. So it's worth their while spending time looking for that price, and they can now do that themselves quickly and efficiently – without the delays or extra resources required when using the manual pricing process."

These 'pre-deal' checks are purely indicative – and optional for Sales and Traders. But if they don't do this check, they face a big risk, because every time a new trade is booked, an 'official' CVA fee is then auto-calculated – which is then recorded alongside the trade – and will be deducted from the Sales/Trader performance.

PERILS OF PRICING

"A key challenge of building a CVA pricing system is ensuring real-time access to data in three categories: trade details; static data (client data such as rating, details of all pre-existing trades, netting status, collateral details etc), and market data."

"Designing the system which has reliable and timely data in all 3 categories is crucial, given the impact that will have on pricing and/or hedging decisions. It's a tough market with sophisticated competitors: if we under-price a risk, you can be sure we will start attracting a large market share. And if we over-price, then we lose business unnecessarily."
HALFWAY TO HEDGING

Murphy's first priority for SG CIB has been to ensure the CVA desk correctly prices the risk in all new trades. Now that this process is well-advanced, the desk will start to focus more on hedging – or otherwise mitigating – its legacy portfolio of credit risks. "For hundreds of years banks have managed reasonably well hedging 0% of their counterparty risk. So instead of an instant seismic shift to 100% hedging of all risks, we will be hedging selected segments of the credit and market risk – which avoids paying away all CVA income to the market," says Murphy.

For banks evolving the CVA function, there are two main reasons hedging is not further along. The first is technology: they may not yet be fully confident in their risk measurement system, which requires a complex and time-intensive development period. The second reason is strategic: the bank might not think that all of its potential hedges are very useful.

"If you take SG CIB's total portfolio of clients, just over 10% have a liquid CDS curve. In other words, for 90% of our clients, if we wanted to go and buy CDS protection, we couldn't do it because there's no market. To hedge these illiquid risks, banks would need to use some kind of credit index."

But this is an imperfect hedge. In 'normal' times, the credit spread of the index and the 'generic' spread applied to calculate the client's CVA will move in tandem. The hedge is therefore effective in reducing earnings volatility from day-to-day changes in the CVA Reserve. But, if the client deteriorates – or even defaults – due to an idiosyncratic reason, then the index hedge may not be affected at all (i.e. the hedge doesn't work). So the decision whether to hedge illiquid names depends on what you want to protect against: actual losses following default… or earnings volatility caused by changes in market credit spreads.
ACCOUNTING FOR BASEL

In the traditional CVA approach, a bank accepts a new trade, takes a fee and uses that fee to buy good hedges for all the risks in that trade. These hedges should eliminate all of the bank's risk, but this is not necessarily the case once Basel III is taken into account. Basel III does not recognize all types of hedges that the bank might want to use. Therefore the regulatory capital for certain trades will not be zero, even if the bank has used the full CVA fee to hedge all its risks.

The first impact Basel III has on CVA desks is on pricing. Pre-deal pricing needs to be reviewed to ensure the costs of imposed regulatory capital are covered. If not, additional pricing may need to be added. And the decision on which risks are efficient to hedge also becomes affected not just by strategic or business reasons, but also by the regulatory capital impact.

As part of Basel III's updated regulatory capital guidelines, a new element has been added: VaR on CVA. Regulators have specified very precisely how the underlying CVA must be calculated for this charge. Banks will therefore need to decide whether to adjust their pricing and balance sheet CVA to match the Basel III rules, or to use different CVA calculations for pricing and regulatory purposes.
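For reference, the prescription the article alludes to is the CVA formula in the Basel III rules text (BCBS 189). It is reproduced here from memory of the rule text rather than from this article, so verify against the source before relying on it; it takes roughly the form

$$ \mathrm{CVA} = \mathrm{LGD}_{\mathrm{MKT}} \sum_{i=1}^{T} \max\!\left(0,\; e^{-s_{i-1} t_{i-1}/\mathrm{LGD}_{\mathrm{MKT}}} - e^{-s_i t_i/\mathrm{LGD}_{\mathrm{MKT}}}\right) \cdot \frac{\mathrm{EE}_{i-1} D_{i-1} + \mathrm{EE}_i D_i}{2} $$

where $s_i$ is the counterparty's credit spread at tenor $t_i$, $\mathrm{EE}_i$ the expected exposure, and $D_i$ the risk-free discount factor. The spread-implied survival probabilities here play the role the flat hazard rate played in the earlier pricing sketch, which is precisely why a bank's internal CVA and its regulatory CVA can diverge.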
A DEFINING ROLE

When individual trading desks own risk, one desk may have had a positive exposure to a client. This could lead the desk to hedge the positive exposure, without knowing that there was a negative exposure at another desk, which means the hedge wasn't really necessary. Because the CVA desk owns all the risk from all the different derivatives desks, it has a full view of the risks with each counterparty, across all desks, products and locations, and can price and hedge the risk appropriately.

It has been suggested that a CVA desk is just a 'smart middle office'. Murphy doesn't agree: "The main difference between CVA and other front office trading functions is that most of the risks are originated internally from the bank's other trading desks. But the CVA desk must price, originate, and distribute those risks in exactly the same way as any other front office trading desk."

CVA desks have evolved to price, centralize, and manage a bank's counterparty risks, requiring sophisticated modeling of hybrid risks encompassing every asset class that the bank is involved in. When implemented correctly, CVA desks should support an institution's business and strategic vision, while helping banks maintain normalized relationships and control risk in an ever more complex trading universe.
THE OPTIMIZATION OF EVERYTHING
OTC Derivatives, Counterparty Credit Risk and Funding
By Jon Gregory

The global financial crisis has created much excitement over counterparty credit risk (CCR) and, in recognition of this, banks have been improving their practices around CCR. In particular, the use of CVA (credit value adjustment) to facilitate pricing and management of CCR has increased significantly. Indeed, many banks have CVA desks that are responsible for pricing and managing CVA across trading functions. In addition to CVA, DVA (debt value adjustment) is often used as recognition of the "benefit" arising from one's own default, and funding aspects may be considered via funding value adjustment (FVA). Also, the impact that collateral has on CVA, DVA and FVA is important to quantify. Finally, there is a need to consider the impact of the funding requirements and systemic risk when trading with central counterparties (CCPs).
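The opening does not define how these adjustments fit together, so as a standard textbook orientation (not taken from this article): the bilateral value of a derivatives book is usually written as the default-free value less CVA, plus DVA, less any funding adjustment,

$$ V = V_{\text{risk-free}} - \mathrm{CVA} + \mathrm{DVA} - \mathrm{FVA}, $$

with, in one common formulation,

$$ \mathrm{CVA} = (1 - R_C) \int_0^T \mathrm{EE}(t)\, d\mathrm{PD}_C(t), \qquad \mathrm{DVA} = (1 - R_B) \int_0^T \mathrm{NEE}(t)\, d\mathrm{PD}_B(t), $$

where $\mathrm{EE}$ and $\mathrm{NEE}$ are the expected positive and negative exposures, and $R$ and $\mathrm{PD}$ are the recovery rate and cumulative default probability of the counterparty $C$ and the bank $B$ respectively. Sign and discounting conventions, particularly for FVA, vary across the literature.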