
Banking giant JP Morgan raised eyebrows in 2012 when it revealed that it had lost a substantial amount of money because of poorly conceived trades it had made for its own account. The losses raised questions about the adequacy of its internal controls, and broader questions about the need for regulations to reduce systemic risk to the banking system. At the heart of the matter were the transactions made by “the London Whale,” the name counterparties gave to JP Morgan’s trading operation in the City because of the outsized bets it was making. Until that point, JP Morgan’s Chief Investment Office had been profitable and apparently well controlled. In the wake of the discovery of the large losses racked up by “the Whale,” JP Morgan launched an internal investigation into how it happened, and released the findings of the task force established to review the losses and their causes [PDF document].

One of the key points that came out of the internal investigation was the role of desktop spreadsheets in creating the mess. “The Model Review Group noted that the VaR [Value at Risk] computation was being done on spreadsheets using a manual process and it was therefore ‘error prone’ and ‘not easily scalable.’” The report also cited as an inherent operational issue the process of copying and pasting data into analytic spreadsheets “without sufficient quality control.” This form of data entry in any critical enterprise function is a hazard because the data sources themselves may not be controlled. After the fact it is impossible to positively identify the source of the data, and (unless specifically noted) its properties (such as time stamps of the source data) will also be indeterminate. These last two are sensitive issues when marking portfolios to market (that is, determining their value for periodic disclosure purposes), especially when there are thinly traded securities in that portfolio.
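To make concrete the kind of calculation the report refers to, here is a minimal sketch of a historical-simulation Value at Risk, the simplest of several VaR methods. This is an illustration only, not JP Morgan's model (which was far more complex), and the daily returns below are made up:

```python
# Illustrative historical-simulation Value at Risk (VaR).
# A simplified sketch, not JP Morgan's actual model; the returns
# below are invented daily portfolio returns.

def historical_var(returns, confidence=0.95):
    """Loss level not expected to be exceeded at the given confidence."""
    losses = sorted(-r for r in returns)       # losses as positive numbers, ascending
    index = int(confidence * len(losses))      # order statistic at the confidence level
    index = min(index, len(losses) - 1)
    return losses[index]

returns = [0.01, -0.02, 0.003, -0.015, 0.007, -0.03, 0.012, -0.005,
           0.004, -0.01, 0.02, -0.025, 0.006, -0.008, 0.009, -0.012,
           0.001, -0.018, 0.011, -0.002]
var_95 = historical_var(returns, 0.95)
print(f"1-day 95% VaR: {var_95:.1%} of portfolio value")
```

A spreadsheet version of the same calculation is typically a percentile formula over a column of returns, which is exactly where a pasted-over cell or a quietly edited formula can distort the result without anyone noticing.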

The report notes, “Spreadsheet-based calculations were conducted with insufficient controls and frequent formula and code changes were made.” In particular, in response to a recommendation by the Internal Audit group for providing greater clarity and documentation of how securities prices were arrived at, the individual with responsibility for implementing these changes made changes to a spreadsheet that inadvertently introduced material calculation errors, which slipped through because the changes were not subject to a vetting process. Absent a thorough audit of a spreadsheet, these sorts of errors are difficult to spot.

This was not the first time a financial institution had incurred serious losses because of a faulty spreadsheet. One of the first big debacles took place more than 20 years ago, when a spreadsheet error caused First Boston to take a multimillion-dollar hit trading collateralized debt obligations, which had only recently been invented.

Spreadsheet mistakes are quite common. Our recent spreadsheet benchmark research confirmed that errors in data and formulas are common in users’ most important spreadsheets. There’s even an association, The European Spreadsheet Risks Interest Group, that tracks the fallout from spreadsheet errors. Yet, even when millions of dollars or euros are at stake, the misuse of spreadsheets persists.

It’s a good thing that spreadsheet hazards are intangible, or they might have been banned or heavily regulated long ago in the United States by the Occupational Safety and Health Administration (OSHA) or similar bodies in other countries. All kidding aside, the London Whale incident raises the question, “Why do people persist in using desktop spreadsheets when they pose this magnitude of risk?”

Our research finds that the answer is ease of use. This is particularly true in situations like the one described by the JP Morgan task force. In the capital markets portion of the financial services industry it’s common for traders, analysts, strategists and risk managers to use desktop spreadsheets for analysis and modeling. Spreadsheets are handy for these purposes because of the fluid nature of the work these individuals do and their need to quickly translate ideas into quantifiable financial models. Often, these spreadsheets are used and maintained mainly by a single individual and undergo frequent modification to reflect changes in markets or strategies. For these reasons, more formal business intelligence tools have not been an attractive option for these users. It’s unlikely that these individuals could be persuaded to take the time to learn a new set of programming skills, and the alternative – having to communicate concepts and strategies to someone who can translate them into code – is a non-starter. Moreover, these tools can be more cumbersome to use for these purposes, especially for those who have worked for years translating their concepts into a two-dimensional grid.

Desktop spreadsheets have become a bad habit when they are used in situations where the risk of errors and their consequences are high. Increasingly, however, they are a habit that can be broken without too much discomfort. The task force recommended more controls over the spreadsheets used for portfolio valuation. One way of doing this is simply to add vetting and sign-off before a spreadsheet is used, controls to prevent unauthorized changes and periodic audits after that to confirm the soundness of the file. This classic approach, however, is less secure and more time-consuming than it needs to be.

Organizations can and should use at least one of three approaches to achieve better control of the spreadsheets they use for important processes. First, tools available today can automate the process of inspecting even complex spreadsheets for suspicious formulas, broken links, cells that have a fixed value rather than a formula and other structural sources of errors. Second, server-based spreadsheets retain the familiar characteristics of desktop spreadsheets yet enable greater control over their data and formulas, especially when integrating external and internal data sources (say, using third-party feeds for securities pricing or parameters used in risk assessments). Third, multidimensional spreadsheets enable organizations to create libraries of formulas that can be easily vetted and controlled. When a formula needs updating, changing the source formula changes every instance in the file. Some applications can be linked to enterprise data sources to eliminate the risks of copy-and-paste data entry. Since they are multidimensional, it’s easy to save multiple risk scenarios to the same file for analysis.
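The first of those approaches, automated inspection, boils down to structural checks like the one sketched below: flagging a hard-coded constant sitting in a column of formulas, a telltale sign that someone pasted over a calculation. This toy version uses a plain dict in place of a real workbook; a commercial audit tool would parse the actual file:

```python
# Toy illustration of an automated spreadsheet audit: flag cells that
# hold a hard-coded constant in a column dominated by formulas -- a
# common source of silent spreadsheet errors. A real tool would parse
# an actual workbook; a dict of row -> cell content stands in here.

def audit_column(cells):
    """Return rows holding literal values in a mostly-formula column.
    By spreadsheet convention, formulas start with '='."""
    formula_rows = [row for row, v in cells.items() if str(v).startswith("=")]
    if len(formula_rows) <= len(cells) / 2:
        return []  # column is mostly literals; nothing suspicious
    return sorted(row for row, v in cells.items()
                  if not str(v).startswith("="))

column_c = {
    2: "=A2*B2",
    3: "=A3*B3",
    4: 1234.56,     # pasted-over constant: the original formula is gone
    5: "=A5*B5",
    6: "=A6*B6",
}
print("Suspicious rows:", audit_column(column_c))
```

Real audit tools layer many such heuristics (broken external links, inconsistent ranges, hidden cells) on top of this basic idea.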

Spreadsheets are a remarkable productivity tool, but they have limits that users must respect. Desktop spreadsheets are seductive because they are easy to set up. They are especially seductive in capital markets operations because they also are easy to modify and manipulate. However, these same qualities make it just as easy to build in errors with grave consequences that can be nearly impossible to spot.

A decade ago, there were few practical alternatives to desktop spreadsheets. Today, there are many, and therefore fewer good reasons not to find and use them. The issues uncovered by the “London Whale” episode are far from unique. Only when a disaster occurs and the fallout is made public do people see the consequences, but by then it’s too late. Executives, especially in risk management functions, must become more knowledgeable about spreadsheet alternatives so they can eliminate systemic risks in their internal operations.

Regards,

Robert Kugel

SVP Research

Increasingly, global financial markets compete on speed, so much so that high-speed trading capabilities have become a performance differentiator for the largest financial services firms and some investment funds. Transmitting messages with quotes, prices and trade data is a core capability for currency dealers. Informatica recently introduced Ultra Messaging, which is designed to offer global currency traders an efficient, high-throughput, lower-latency (that is, faster) and more secure method of linking their worldwide operations.

Currency trading, like much of the financial services landscape, has been transformed by IT. When I worked on a small currency trading desk in 1978, the two most sophisticated pieces of technology the traders used were the fax machine (in those days, before it became a consumer electronics item, a fax machine cost $10,000, or about $35,000 in current dollars) and a Monroe desktop financial calculator. Back then, a one-day movement of 50 “pips” (0.0050 of any currency unit, or a half-cent) was considered a huge move in the market. Today, in an era when currency swings of several cents in a day are common and daily trading volumes are measured in trillions, computers and the networks that connect them are integral components of any dealer’s operation, helping to manage trades, search for arbitrage opportunities and enable financial institutions to carefully monitor their risk exposure to their customers. Traders, when they are involved, must have up-to-the-split-second information. Those who are charged with managing counterparty risk must be certain that they have real-time information about their global exposure to individual credits.
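The pip arithmetic is simple but worth making concrete, since it is what turns small rate moves into large dollar amounts; the position size below is invented for illustration:

```python
# A pip is 0.0001 of a quoted currency unit for most pairs, so a
# 50-pip move is 0.0050, or half a cent. Figures are illustrative.

PIP = 0.0001

def pip_move_pnl(position_size, pips):
    """P&L in the quote currency when the rate moves by `pips`."""
    return position_size * pips * PIP

# A $10 million position and the 50-pip daily move once considered huge:
print(pip_move_pnl(10_000_000, 50))   # about 50,000 in the quote currency
```

At the trillions-a-day volumes the post mentions, the same half-cent move scales into sums that make real-time position monitoring essential.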

The worldwide currency market operates around the clock, and major trading centers (Tokyo, London and New York, for example) have normal working hours that overlap over the course of the day, so it’s important that the individual trading desks around the world have the exact same information simultaneously. If they don’t, they risk having competitors arbitrage their bids or otherwise trade against them.

Ultra Messaging is designed to optimize performance across wide area networks (WAN). Managing the flow of quotes and trade information is less difficult on a local area network (LAN) since bandwidth can be enormous, distances are short and messages on the network cross a limited number of nodes. When networks must span the world, though, latency develops because of the sheer distances involved, the sometimes complex routing a message takes between two points and potential bandwidth limitations in the technology that’s employed to handle the message. WANs also pose problems of lost data because of the relatively high number of nodes between the sending and receiving points.

Ultra Messaging optimizes the path across a WAN by continuously monitoring the network topography and using algorithms to select the optimal routing of messages for each specific moment, an approach that seeks to achieve an “as soon as physically possible” (ASAPP) latency. It is designed to offer graceful failover capability because it also has calculated the next best routing if that becomes necessary. Informatica claims it can regularly save tens of microseconds compared to other solutions on the market.
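Latency-optimal routing of this general kind can be framed as a shortest-path problem over continuously re-measured link latencies. The sketch below uses Dijkstra's algorithm over an invented topology with invented microsecond figures; it is not Informatica's actual algorithm, which the company does not disclose:

```python
import heapq

# Toy latency-aware WAN routing: pick the lowest-latency path through a
# network whose link latencies (in microseconds) are re-measured over time.
# Topology and numbers are invented; this is NOT Informatica's algorithm.

def best_path(links, src, dst):
    """Dijkstra over measured link latencies. Returns (total_us, path)."""
    queue = [(0, src, [src])]
    seen = set()
    while queue:
        latency, node, path = heapq.heappop(queue)
        if node == dst:
            return latency, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, us in links.get(node, {}).items():
            if neighbor not in seen:
                heapq.heappush(queue, (latency + us, neighbor, path + [neighbor]))
    return None

wan = {                        # measured one-way latencies, microseconds
    "london":    {"new_york": 35000, "frankfurt": 4000},
    "frankfurt": {"new_york": 41000, "tokyo": 120000},
    "new_york":  {"tokyo": 90000},
    "tokyo":     {},
}
print(best_path(wan, "london", "tokyo"))
```

Re-running the search as latency measurements change also yields the "next best" route, which is the essence of the graceful failover behavior described above.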

The software is available in three versions to address different optimization requirements. The Persistence Edition focuses on guaranteed messaging with zero-latency failover to address the need for institutions to have an accurate global record of all of their trades. The Queuing Edition is designed for guaranteed “once-and-only-once” delivery with low-latency messaging and automatic load balancing for constrained bandwidth environments. The Streaming Edition is designed to achieve the lowest possible latency by using “nothing in the middle” methods that eliminate daemons and brokers.

Global financial services companies have options when it comes to managing their WAN-based messaging and communications. They should investigate whether Ultra Messaging will improve their network performance.

Regards,

Robert Kugel – SVP Research
