In April 2010 I’ll be at SOURCE Boston on a panel discussing how compliance regulations get made.  This got me thinking about how to explain such a complex series of events in simple terms.  I’ve previously discussed the question of “why” regulatory compliance is important and its relation to vaccinations.  Here I’d like to discuss the “how” of regulatory issues.

(If you’d like to hear about this and other PCI-related issues, then register for the BrightTALK PCI Compliance Summit on March 25, 2010.)

There are many debates about the pros and cons of regulatory compliance, but they all focus on the individual and not the population as a whole.  In fact, the best way to model and examine the evolution of regulation and deregulation is through the eyes of a scientist examining the entire population of players.

Background:

Let’s take a look at the history of regulation and deregulation.  The following are a few industries that have experienced both regulation and deregulation over the years; the list could just as well include industries such as agriculture, telephone, communications (radio, TV, cable), medical, and pharmacy.

  • Airline
    –Civil Aeronautics Board (1937)
    –Airline Deregulation Act (1978)
  • Railway
    –Interstate Commerce Commission (1887)
    –Railroad Revitalization and Regulatory Reform Act (1976) / Staggers Rail Act (1980)
  • Trucking
    –Motor Carrier Act (1935)
    –Motor Carrier Regulatory Reform and Modernization Act (1980)
  • Energy
    –OPEC price hikes (1973)
    –Emergency Natural Gas Act (1977)

Each of these industries experienced a need for regulation, and eventual deregulation, in order to keep in check the potential for large problems that could impact large numbers of people (e.g. monopoly, poor conditions, unbound risk, lack of consumer protection).  In 1935 Congress passed the Motor Carrier Act, which gave the Interstate Commerce Commission (ICC) authority to regulate trucking involved in interstate commerce.  When the confines of this regulation outlived its usefulness, the tides turned.  From 1971 until the eventual passage in 1980, politicians worked to remove barriers to entry into the industry and finally passed the Motor Carrier Regulatory Reform and Modernization Act.  This migratory pattern of regulation and deregulation occurs regularly in many industries.

Pattern of Data Loss

It is no surprise to anyone that data loss is building momentum.  We can gather individual statistics from the news or detailed statistics from DataLossDB.org.  Either way, we notice a pattern of attacks and rising numbers of data breaches that make us ask: is the situation getting better or worse?  Is what we are doing having the desired effect?

It’s very difficult to answer that question since the problem is multi-factorial, but there are signs that things are getting better.  As fraud shifts from one industry to another and from one method to another, we are slowly driving it from the system.  (This type of analysis does not as easily apply to authentication/identity fraud, but may very well apply to system infiltration and data exfiltration techniques.)  For example, we see attack vectors moving from one method to another and from one geographic region to another.  Attackers originally stole data from flat files, but when those were encrypted the attackers began capturing data as it traversed the network.  When that was encrypted, they began installing custom malware to capture data in memory.  Slowly, defenses are moving from system protection to network, software, and finally hardware protection.

As protection systems such as Chip and PIN were implemented across Europe and Asia, we saw a drop in card-present fraud as the attackers moved to online and e-commerce fraud (per UKPA/APACS figures).  The attackers adapted to the system and moved on to other low-hanging fruit.

History of Regulatory Time

I can’t really do justice to replicating the work of David Lineman of Information Shield, so I’ll simply point to his paper “A History of Regulatory Time” and his graph showing a timeline of security and privacy-related regulations.  Take a look, map the regulations against the major data breaches of recent years, and we begin to notice the correlation: regulation arrives in reaction to the rising tide of data breaches.

Inflection Points and Traffic Jams

Simply analyzing data breaches and the reactionary regulation that follows doesn’t paint a precise picture of how the regulations are formed, only that the two are somehow correlated.  To understand this we first need to understand a little math.  An inflection point is where the slope of a curve changes from increasing to decreasing, or vice versa.  In terms of data breaches, we can ask whether the number of breaches, though currently increasing, is increasing at an accelerating or a decelerating rate.
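To make the idea concrete, here is a minimal Python sketch that looks for the point where growth, while still positive, begins to decelerate.  The yearly counts are made-up illustrative numbers, not real DataLossDB figures.

```python
# Minimal sketch: spotting an inflection point in a series of breach counts.
# The counts below are made-up illustrative numbers, not real statistics.
breaches_per_year = [120, 160, 230, 330, 450, 560, 640, 690, 720]

# First differences approximate the slope (breaches are still increasing),
# second differences approximate the change in that slope.
first_diff = [b - a for a, b in zip(breaches_per_year, breaches_per_year[1:])]
second_diff = [b - a for a, b in zip(first_diff, first_diff[1:])]

# An inflection point is where the second difference changes sign:
# growth remains positive, but it has started to decelerate.
for i, (prev, curr) in enumerate(zip(second_diff, second_diff[1:]), start=1):
    if prev > 0 and curr <= 0:
        print(f"Growth begins to decelerate around year index {i + 1}")
        break
```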

Andy Grove, former CEO of Intel, said in his book Only the Paranoid Survive that “An inflection point occurs where the old strategic picture dissolves and gives way to the new.”  We need to focus on this inflection point in order to understand whether the increasing numbers reflect a state of growth or decline in the system, something we are (unfortunately) only able to measure over time.

In fact, this concept is familiar to physicists as “hysteresis.”

For example, consider a thermostat that controls a furnace. The furnace is either off or on, with nothing in between. The thermostat is a system; the input is the temperature, and the output is the furnace state. If one wishes to maintain a temperature of 20 °C, then one might set the thermostat to turn the furnace on when the temperature drops below 18 °C, and turn it off when the temperature exceeds 22 °C. This thermostat has hysteresis. If the temperature is 21 °C, then it is not possible to predict whether the furnace is on or off without knowing the history of the temperature.
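A tiny sketch of that thermostat, using the 18 °C and 22 °C thresholds from the example above, makes the history dependence obvious:

```python
# A minimal sketch of the thermostat example: the furnace state depends on
# the temperature history, not just the current reading (hysteresis).
class Thermostat:
    def __init__(self, low=18.0, high=22.0):
        self.low = low          # turn the furnace on below this temperature
        self.high = high        # turn the furnace off above this temperature
        self.furnace_on = False

    def update(self, temperature):
        if temperature < self.low:
            self.furnace_on = True
        elif temperature > self.high:
            self.furnace_on = False
        # Between 18 and 22 °C the previous state is kept unchanged.
        return self.furnace_on

t = Thermostat()
print(t.update(17))  # True  - cold, furnace turns on
print(t.update(21))  # True  - still on; 21 °C alone does not decide the state
print(t.update(23))  # False - warm, furnace turns off
print(t.update(21))  # False - same 21 °C reading, different answer (history matters)
```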

The question we always ask is “Where are we on the Sine Wave of Pain?”  Is the rate of negative events increasing or decreasing?  The only way to know is to gather and map data, measure trending patterns in the industry, and make calculated estimates as to which it is.

One thing is for sure: it is the population, not the individual, that drives regulation, and as such it is the population that examines the rising data loss numbers and determines when it wants change.  It is this demand for change that ultimately starts the regulation engine to affect what the individual cannot directly.

Traffic Patterns and Modeling

Still, all we have shown at this point is that a culmination of actions can result in change brought about by the populace.   How that change is enacted is an area of great interest, and one that draws from, of all things, traffic patterns.  Before getting into that I’d like to reflect on different types of phase shifts seen both in nature and in fiction.  We are all familiar with ice melting into water and water freezing into ice.  It was Kurt Vonnegut who, in his book Cat’s Cradle, first proposed the fictional concept of ice-nine, a polymorph of water that freezes at 45.8 °C (114.4 °F) instead of 0 °C (32 °F).  The idea is that the ice could maintain its solid form even at room temperature, which is around 20 °C (68 °F) to 25 °C (77 °F).  In the book, it would take only a single fragment of ice-nine coming into contact with the ocean for the oceans to instantly freeze.  This shows how a seemingly stable system can react suddenly when given the proper catalyst.

A common method of modeling traffic patterns is the Nagel-Schreckenberg (NaSch) model.  (For more detailed information on this model I recommend reading Traffic Simulation using Agent-Based Modelling by Andrew Lansdowne.)  The diagram to the right shows this model, with traffic flow (y-axis) measured against traffic density (x-axis).  You can see that as the traffic density increases, the traffic flow increases.  This continues until point “A”, where we reach the critical density.  This is the density at which a change can occur, but not at which it must occur.  If everyone continues driving along at the same rate, the density can increase until a critical event occurs that breaks down the system.  An example could be one person applying the brakes, which causes the person behind them to do the same, and on and on.  Point “B” is the moment at which the critical event occurs.  At this point we see the traffic flow decrease, representing the slowing of traffic until the density is so high it stops (point “D”).
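For the curious, here is a rough Python sketch of the NaSch model on a circular road.  The road length, maximum speed, and dawdling probability are arbitrary illustrative choices, not values from Lansdowne’s paper.

```python
import random

# Rough sketch of the Nagel-Schreckenberg model on a circular road.
# Parameters are arbitrary illustrative choices.
ROAD_LEN, V_MAX, P_DAWDLE = 100, 5, 0.3

def nasch_step(positions, speeds):
    """Advance every car one time step (parallel update); returns new state."""
    order = sorted(range(len(positions)), key=lambda i: positions[i])
    new_speeds = list(speeds)
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % len(order)]
        gap = (positions[ahead] - positions[i] - 1) % ROAD_LEN
        v = min(speeds[i] + 1, V_MAX)          # 1. accelerate toward v_max
        v = min(v, gap)                        # 2. brake to avoid the car ahead
        if v > 0 and random.random() < P_DAWDLE:
            v -= 1                             # 3. random dawdling
        new_speeds[i] = v
    new_positions = [(positions[i] + new_speeds[i]) % ROAD_LEN
                     for i in range(len(positions))]
    return new_positions, new_speeds

def average_flow(n_cars, steps=500):
    """Mean number of cars passing a point per time step at a given density."""
    positions = random.sample(range(ROAD_LEN), n_cars)
    speeds = [0] * n_cars
    moved = 0
    for _ in range(steps):
        positions, speeds = nasch_step(positions, speeds)
        moved += sum(speeds)
    return moved / (steps * ROAD_LEN)

for n in (5, 15, 30, 60, 90):
    print(f"density {n / ROAD_LEN:.2f} -> flow {average_flow(n):.3f}")
```

Run it and you should see flow rise with density at first, then collapse once the road is crowded enough for jams to form.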

One interesting feature of this series of events is that the traffic flow pattern will always exist in a cycle moving from point A to B to D and back to A, in that order.  Traffic will never go from D to B because doing so requires it to first traverse A.  Remember that term hysteresis?  In the book Critical Mass, Philip Ball states, “A state of traffic depends not only on its density but on its history – on whether it was previously denser or less dense.  As the traffic rate rises and then falls, the flow rate follows a loop.”

We can examine the flow of traffic in another form by mapping space on the road (x-axis) against time (y-axis).  As you can see in the second diagram, we map the position of each vehicle over time.  Until the density decreases, the traffic jam will continue.  Here the traffic jam is visible as a dense diagonal band across the diagram.  Once the density decreases, we once again see a greater flow of traffic.
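Building on the nasch_step function from the previous sketch, a crude text rendering of that space-time picture looks like the snippet below: each printed row is one time step, ‘#’ marks a vehicle, and at high density the jam shows up as a dense band drifting against the direction of travel.

```python
# Continuing the previous sketch: a crude text-based space-time diagram.
# Each row is one time step; '#' marks an occupied cell of the circular road.
positions = random.sample(range(ROAD_LEN), 35)   # fairly dense: 35 cars on 100 cells
speeds = [0] * 35
for _ in range(40):
    row = ['.'] * ROAD_LEN
    for p in positions:
        row[p] = '#'
    print(''.join(row))
    positions, speeds = nasch_step(positions, speeds)
```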

What’s the Solution?

As you can see, modeling traffic patterns can be very similar to modeling the regulation and deregulation of an industry.  So what is the solution to an increase in incidents that pushes us past the critical density?  Contrary to initial thought, the solution to heavy traffic is not simply to build more roads.  In fact, Richard Moe, head of the US National Trust for Historic Preservation, once said that “building more roads to ease traffic is like trying to cure obesity by loosening the belt.”  Simply applying ‘more’ security does not mean you achieve ‘better’ security.

I propose the following approaches:

  • Help prevent data sprawl :: Security is required wherever data is maintained.  Does your environment reflect a “data, data, anywhere” or a “data, data, everywhere” philosophy?  Do you know where all your data is?  Does it exist in more locations than necessary?  Check these items and set measurable actions to correct them.
  • Examine use cases :: While medical record data requires persistence, payment card data is used once per transaction and then never again.  The use cases are simple, enabling a flexible set of measures to secure the data.  If your business model does require retention of data, then examine what data you are retaining and make sure it’s as benign as possible.
  • Brute force is effective but costly, while the elegant solution is simple and secure :: Have you ever considered replacing the data you retain with a reference number instead?  I recommend you read up on technologies such as point-to-point encryption and tokenization (a toy sketch of tokenization follows this list).
  • Solve tomorrow’s problems with today’s technology :: Problems are not hard if you know which ones to solve.  I recommend absorbing and comparing as many of the data breach reports (more) as you can to determine what emerging attack patterns exist in your industry and how to prevent them.  If you are only able to implement one set of technology every 10+ years, then make sure it solves tomorrow’s problems and not yesterday’s.
  • Plugging one hole doesn’t save the levee :: Reducing card-present fraud drives attackers to e-commerce.  Reducing fraud in one country drives them to others.  Only a holistic solution will work on such interconnected systems.  This is one of the arguments for industry regulation.
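As a toy illustration of the tokenization idea mentioned above (and only that; a real deployment involves far more than this), imagine swapping the card number for a random reference held in a separate vault:

```python
import secrets

# Toy illustration of tokenization: the merchant system keeps only a random
# reference token, while the real card number lives in a separate, hardened
# "vault".  A conceptual sketch, not a PCI-compliant implementation.
class TokenVault:
    def __init__(self):
        self._vault = {}   # token -> primary account number (PAN)

    def tokenize(self, pan):
        token = secrets.token_hex(8)   # random value with no relation to the PAN
        self._vault[token] = pan
        return token

    def detokenize(self, token):
        return self._vault[token]      # only the vault can reverse the mapping

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print("stored in the order database:", token)   # useless to a thief on its own
```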

3 Habits of Highly Effective Regulation

In the end there are three attributes, or habits, that make regulation effective in achieving adoption and acceptance.

  1. Education, education, education :: This is the single most effective method of driving adoption.  People want to know how to interpret, implement, and adapt the regulation to their business model.  I’ve seen more people fail to start because they didn’t know where to start than for any other reason.  People want to know whether they can use a $0.10 piece of duct tape or whether they need to replace the entire engine of the car.
  2. Flexibility of controls :: This is an attribute of so many regulations because they apply to such a range of companies, industries, and sizes of organization.  Remember that 100% compliance is not the goal when system failures occur in groups.  The PCI DSS has what are called “compensating controls.”  The EU Data Protection Directive has the “comply or explain” concept.  Even the ISO 27000 series does not mandate 100% adherence to each and every control.
  3. More data for Risk Modeling :: Let’s consider this without getting into a debate over Frequentist vs. Bayesian statistics (I’ll leave that to Alex Hutton).  The more data we have, the more closely we can make educated decisions about how to evolve the standard, protect against failure, and determine how to proceed.  More data will help us understand when we have reached an inflection point and ultimately determine when the rising regulation turns toward deregulation (a minimal sketch follows this list).
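To illustrate that last point without taking sides in the statistics debate, here is a minimal sketch of how more data tightens an estimate: a simple Gamma-Poisson update over made-up monthly breach counts, not anyone’s actual risk model.

```python
# Minimal sketch: more data tightens a risk estimate.  A Gamma-Poisson
# (conjugate Bayesian) update of a breach rate from hypothetical monthly counts.
# The prior and the observations below are illustrative assumptions only.
alpha, beta = 2.0, 1.0                        # weak prior: roughly 2 breaches/month
monthly_breach_counts = [3, 5, 2, 4, 6, 3]    # made-up observations

for count in monthly_breach_counts:
    alpha += count                # conjugate update with the new month's count
    beta += 1
    mean = alpha / beta
    sd = (alpha ** 0.5) / beta    # posterior uncertainty shrinks as data accumulates
    print(f"after this month: estimated rate {mean:.2f} ± {sd:.2f} breaches/month")
```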