
Insuring against a future financial crisis

Staying compliant & off the front page

By Dave Cartwright, 27 Apr 2017

There’s nothing quite like a nice, juicy financial crisis to wake up the regulators’ rule-setters, psych up the lawmakers and get the lawyers sharpening their quill pens and breaking out a fresh bottle of Quink. And so it seems to have been proven since the financial car crash of the mid to late noughties, with the appearance of a variety of new rules and legislation to keep the financial services industry on its toes.

So for example just as we’ve got used to referring to Basel II we now have the delights of Basel III: “A global regulatory framework for more resilient banks and banking systems”. At 77 pages for the basic doc it’s neither a light pamphlet nor the most compelling read I’ve ever experienced, but neither is it rocket science. In a sentence: banks need to hold enough capital and do sensible forecasting to make sure they continue to do so over time.

While phrases like “reducing procyclicality and promoting countercyclical buffers” need to be taken out and shot, even I can understand what they mean by “raising the quality, consistency and transparency of the capital base”.

And as an ISO 27001 sort I start to tune in more when I see sentences like: “Failure to capture major on- and off-balance sheet risks, as well as derivative related exposures, was a key destabilising factor during the crisis”. Hmmm … risk … I can relate to that.

Then you have the Solvency II Directive – EU Directive 2009/138/EC, stretching to a remarkable 309 pages – which is a lot like the Basel stuff but aimed at the insurance/reinsurance industry. There’s a lot in there, but it shares a notable theme in common with Basel III: the term “risk”, which on average gets a mention more than twice per page of the document.

Or there’s the Markets in Financial Instruments Directive (“MiFID” – or more appropriately these days “MiFID II”), the aim of which is to bring together the regulation frameworks for the member states of the EU.

And guess what: dig into the text and you keep finding the “r” word, in sentences like: “requiring an investment firm to establish, implement and maintain an adequate risk management policy” or “establish, implement and maintain adequate policies and procedures designed to detect any risk of failure by the firm to comply with the provisions of national law”.

So, although our friends in financial services have an increasing range of regulatory requirements to follow which are ostensibly financial regulations, what they actually have on their hands is a risk assessment and management mission. And to succeed in this mission the technology types must provide them with the information platform they need so they can rely on the data with which they’re working and use it to demonstrate compliance to the auditor and to the regulator.


Unsurprisingly security is paramount, but not in the sense that one often considers it. Much of the time the technology group focus on the security of their information to be sure that it’s protected against corruption (by taking and managing backups), unavailability (protecting against denial-of-service attacks and applying bug-fix patches, for example) and theft (by applying firewall rules and ensuring user logins are managed effectively).

The other thing you need to do to prove regulatory compliance is to demonstrate not just the integrity of your data but its provenance and its history. You need to show that the results of the reports you’ve produced for the regulator can be relied upon – that the underlying data’s origins can be traced back to their source and that the chain of evidence is unbroken. That means effective logging at all levels within the applications and systems, of all the operations that could affect the level of confidence anyone could have in the data.
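One way to make that chain of evidence tamper-evident is to hash-chain the log: each entry embeds the hash of its predecessor, so altering any historic record breaks every hash downstream. A minimal Python sketch, purely illustrative (the record fields and function names are assumptions, not anything from a particular product):

```python
import hashlib
import json

def append_entry(log, operation, user):
    """Append an audit record whose hash covers the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    record = {"operation": operation, "user": user, "prev_hash": prev_hash}
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(record)
    return log

def verify_chain(log):
    """Re-derive every hash; return False if any entry has been altered."""
    prev_hash = "0" * 64
    for record in log:
        body = {k: v for k, v in record.items() if k != "hash"}
        if body["prev_hash"] != prev_hash:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != record["hash"]:
            return False
        prev_hash = record["hash"]
    return True
```

Run `verify_chain` before handing figures to the auditor: if anyone has quietly edited an old entry, it returns False rather than letting you present a broken chain as gospel.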

Of course, this is not really any different from what you need to do for your annual financial audit. Auditors place just the same reliance as regulators on the integrity of the data you're providing to them, and if you've ever given an auditor reason to doubt your data integrity, you'll remember the bill you were whacked with for the additional investigation work they had to do to regain the missing confidence.


Reporting is the lifeblood of an audit and regulatory regime. Now, in the IT audit realm I usually live in there’s a great deal of overlap between the various standards against which you’re measured (or, in some cases, against which you measure yourself). In the financial world, the same applies; although the standards vary in size and all of them have their own nuances, the generic basics of risk assessment and risk management don’t change from situation to situation. And similarly, many of the reports you’ll be running for your periodic examinations will be common between standards and regulators: there are, after all, only so many ways to represent revenue, margins, costs, capital, and so on.

The other common factor between regulators and auditors, though, is that they'll all have areas that they want to burrow into in more depth. They'll take the mile-wide-and-inch-deep summary reports and will want to dig into particular details, and if it takes you five hours to run each report their fees department will have the metaphorical taxi meter running while it chugs along.

Reporting is a game of two halves. On one hand is the business-as-usual material that should just drop out of the system, and on the other is the need for efficient on-demand reporting. And scrolling backwards and forwards through time is also something you'll often be asked for: to produce a report "as at" a given date. So scalability, indexing, historic data and access speed are the orders of the day.
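The usual trick for "as at" reporting is to give every row a validity window, so yesterday's version of a figure survives alongside today's and a report can be re-run for any historic date. A toy Python sketch (the field names and figures are invented for illustration):

```python
from datetime import date

# Each row records the period during which it was the current version.
# An open-ended row uses a far-future valid_to as a sentinel.
positions = [
    {"account": "A1", "capital": 100,
     "valid_from": date(2016, 1, 1), "valid_to": date(2016, 6, 30)},
    {"account": "A1", "capital": 120,
     "valid_from": date(2016, 7, 1), "valid_to": date(9999, 12, 31)},
    {"account": "B2", "capital": 80,
     "valid_from": date(2016, 3, 1), "valid_to": date(9999, 12, 31)},
]

def as_at(rows, report_date):
    """Return the rows that were current on the given date."""
    return [r for r in rows
            if r["valid_from"] <= report_date <= r["valid_to"]]
```

Ask for the book as at May 2016 and you get the old A1 figure; ask as at August and you get the revised one. The same idea, with proper indexing on the date columns, is what keeps those on-demand historic reports from taking five hours.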


Remember also that there’s more to data than files. Alongside the file stores you have the data that’s sitting in back-end databases behind your apps, and perhaps in the on-board caches of your reporting engines – not to mention email messages whose retention is required by the regulations. When you are required by financial regulators to keep records for upwards of five years you don’t want to keep everything on on-line, live storage: aside from the storage space you’ll be paying for, storing little-used data on your front-line storage is simply going to slow everything down reporting-wise.

Archiving, at both the storage and application levels, is key to maintaining performance; and if you can let the kit take the load and automatically squirrel away the stuff that's not touched very often (and retrieve it if you should happen to need it) then all the better.
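The "squirrel away what's not touched" logic is simple enough to sketch: sweep the live store and move anything whose last access is older than some threshold to a cheaper tier. A hedged Python illustration, with the directory names and the 180-day threshold purely assumed (and note that last-access times aren't reliable on volumes mounted with noatime):

```python
import os
import shutil
import time

def archive_stale_files(live_dir, archive_dir, max_idle_days=180):
    """Move files not accessed within max_idle_days to the archive tier."""
    cutoff = time.time() - max_idle_days * 86400
    os.makedirs(archive_dir, exist_ok=True)
    moved = []
    for name in os.listdir(live_dir):
        path = os.path.join(live_dir, name)
        # Only demote plain files whose last-access time is past the cutoff.
        if os.path.isfile(path) and os.stat(path).st_atime < cutoff:
            shutil.move(path, os.path.join(archive_dir, name))
            moved.append(name)
    return moved
```

Real hierarchical storage management kit does this transparently and pulls the file back on access; the point of the sketch is only the decision rule.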

Data disposal

The final thing we'll mention, and one that the regulators don't tell you to do, is to throw away your data; failing to do so is plain daft. If you have a genuine need to keep data then that is fair enough, but can you really justify keeping ten-year-old sales data in case you feel like reporting on it one day?

At a fundamental level, it’s occupying storage – potentially lots of storage if you have multiple copies on a SAN that doesn’t do automated de-duplication. But more importantly there may be compelling reasons for you to actively dispose of data – not least data protection law that permits you only to hang onto data for as long as it’s required under your stated processing regime. And then there’s the consideration that if you’ve securely dumped the data that you’re no longer obliged to keep, it can’t be stolen by a hacker or demanded under a Freedom of Information request.
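Disposal works best as a rule, not an annual guilt trip: give each category of record a retention period and flag anything held past it. A small Python sketch, where the categories and year counts are invented examples rather than anything a regulator has blessed:

```python
from datetime import date, timedelta

# Illustrative retention periods per record category, in years.
RETENTION_YEARS = {"trade": 7, "email": 5, "marketing": 2}

def due_for_disposal(records, today):
    """Return records held longer than their category's retention period."""
    expired = []
    for rec in records:
        limit = timedelta(days=365 * RETENTION_YEARS[rec["category"]])
        if today - rec["created"] > limit:
            expired.append(rec)
    return expired
```

Run it on a schedule, feed the output to whatever does your secure deletion, and the ten-year-old marketing list stops being a liability you're paying to store.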

So … here we are, in this industry we call technology, supporting businesses whose regulatory requirements extend into the often-esoteric land of finance, accounting and audit.

And guess what? Turns out it’s really just risk management, but with numbers. And like most other risk, it’s still all about the data.

The Register - Independent news and views for the tech community. Part of Situation Publishing