The Poker Face of Wall Street Calls NHTSA’s Bluff

We read "Sympathy for the Flash Crash," a Minyanville post by Aaron Brown, uber-risk manager and author of The Poker Face of Wall Street, with great enthusiasm. (Minyanville is a business and investment information website.) It was fascinating to see a business pundit draw parallels between Toyota Unintended Acceleration and the flash crash of May 6, 2010, in which the Dow dropped about 1,000 points, but recovered almost immediately afterward. It was refreshing to read a financial columnist who actually understands what happened after NHTSA tried to wrest control over an elusive technical problem. He writes:

“…the net result was that the agency ordered the recall of 8 million vehicles and levied the maximum allowed civil fine, then waited for the problem to go away on its own before issuing a study denying there had been a problem in the first place because they looked really hard and couldn’t find one.

When you don’t understand a system, throwing experts at it to announce they can’t understand what happened so it must have been human error, is an unconvincing—but irresistible—tactic.”

While we part company with Mr. Brown over the possibility and advisability of implementing regulation to fix the problem, and a few other details, his viewpoint is worth a read. The good folks at Minyanville kindly gave us permission to reprint it. (The original article can be found here on Minyanville.)

Sympathy for the Flash Crash

Reprinted with permission from Minyanville

By Aaron Brown May 04, 2012 9:00 am

The entire modern world has become too complex for anyone to understand, and therefore, too complex for anyone to fix with top-down rulemaking.

MINYANVILLE ORIGINAL When I learned to drive 40 years ago, there were direct mechanical linkages between the car’s controls and its wheels and power train. When I turned the steering wheel, my muscle power (mediated by some levers and gears) changed the direction of the wheels. Accelerator, brakes, heater dial — all affected things through direct physical actions. We make fun of someone who confuses effects with causes by saying he tries to slow down a car by moving the speedometer needle. But in that 1962 model VW bus, pushing down the speedometer needle would, in fact, increase friction on one rear wheel and reduce the speed of the car. The cable was too delicate to provide significant deceleration, but in physical principle it would work.

In the car I drive today, none of that is true. The controls are inputs to or outputs from the computers that drive the car. Backup physical systems are preserved for steering and brakes in case the computers fail, but they have disappeared from larger vehicles and will likely soon be removed from passenger cars as well.

The surprising thing is, almost no one noticed. People know cars drive better, last longer, and are more fuel efficient (the 1962 version came with a 10,000 mile/one-year warranty and required minor maintenance every time you bought gas — its 38 horsepower engine consumed more gas per mile than far heavier and more powerful modern cars). Curious drivers who open the hood see a bunch of mysterious boxes instead of a few easily recognizable components that could be repaired out of general mechanical intuition. Many people know intellectually about the changes, but those changes have little effect on how you drive a car.

The mechanical systems in modern cars are two orders of magnitude more reliable than the systems they replaced. It’s very rare for a modern, properly-maintained car to have a serious failure due to mechanical breakdown. However, when cars do malfunction, it can be extremely difficult to locate the problem. In 1970 if your brakes failed, the accident investigators could point to a physical part that broke or otherwise failed (assuming there was enough of the car left to study). But in 2009 when drivers alleged out-of-control acceleration in Toyota’s (TM) Prius, it took rocket scientists to analyze the issue. The National Highway Traffic Safety Administration summarized its 10-month investigation:

At the Goddard Space Flight Center in Maryland, NASA hardware and systems engineers rigorously examined and tested mechanical components of Toyota vehicles that could result in an unwanted throttle opening. At a special facility in Michigan, NHTSA and NASA engineers bombarded vehicles with electromagnetic radiation to study whether such radiation could cause malfunctions resulting in unintended acceleration. NHTSA engineers and researchers also tested Toyota vehicles at NHTSA’s Vehicle Research and Test Center in East Liberty, Ohio to determine whether there were any additional mechanical causes for unintended acceleration and whether any of the test scenarios developed during the NHTSA-NASA investigation could actually occur in real-world conditions.

I think it’s fair to say that a similar announcement would have been inconceivable in 1970. Moreover, it’s striking how little content there is. The report included lots of other details about how much work was done to try to recreate the problem: enlistment of the National Academy of Sciences, examination of “tens of thousands of documents,” review of “128,000 lines of computer code,” and involvement of “the best and brightest engineers.”

But the net result was that the agency ordered the recall of 8 million vehicles and levied the maximum allowed civil fine, then waited for the problem to go away on its own before issuing a study denying there had been a problem in the first place because they looked really hard and couldn’t find one.

When you don’t understand a system, throwing experts at it to announce they can’t understand what happened so it must have been human error, is an unconvincing—but irresistible—tactic.

In 1970, if you put in an order to buy 100 shares of General Electric (GE), you called a real person, who sent it to the floor of the New York Stock Exchange, where real people executed a real transaction. You bought your shares from a real counterparty. Today, your order could be routed by a complex algorithm to a number of places. It will touch off an unpredictable flurry of high-frequency trading activity. Even after the fact, it is in principle impossible to determine who holds 100 fewer shares of GE as a result of your trade. The only way to determine that would be to go back in time and rerun trading without your order.

In fact, it’s only faith that tells us anyone holds 100 fewer shares. Whenever actual shares outstanding have to be added up for some purpose, significant discrepancies are found. So your shares could have been created out of thin air.

The same things are true of stocks as cars. Lots of people haven’t noticed the fundamental changes in the financial system. The new trading systems are far more efficient, handling far larger volumes at far lower cost, with far more efficient prices. But modern financial disasters like the flash crash resist simple explanation even after extensive study by experts and issuing of wordy reports stressing the effort that went into the investigation rather than credible and comprehensible explanations of events.

The fact that financial disasters have become unexplainable is the impetus for many reform proposals. It’s not just the flash crash; think of the missing money at MF Global (MFGLQ) or even the entire 2007-2009 financial crisis. Some people want to break up firms and simplify businesses. Others want to outlaw high-frequency trading, or at least tax it and slow it down. There are many proposals to discourage derivatives and structured products, and also to gather information so regulators can understand them. People want to discourage short-term and levered trading and to push execution to the simple 1970 public exchange model.

While some good may come of some of these ideas, I think the potential for good is limited and the potential for harm is great. Our modern complex and tightly-coupled financial system is far better than the one it replaced. It has made the world richer and better and fairer. Restoring things to 1970 technology would be a disaster worse than a thousand flash crashes. Milder proposals to add safety devices might help a little, but there are problems. Safety devices increase complexity, the harm of which can outweigh any good. Many engineering disasters are caused by safety systems interacting in unexpected ways. Even when they work, safety systems can encourage people to rely on them, thereby increasing net danger.

The mildest proposals are ones that merely attempt to understand the system, not to change it. These will cause little harm, despite the complaints of the enemies of transparency, but also little good. It’s just not possible to understand the system in the sense that people who like regulation mean. Gathering lots of data will not help.

And it’s not just cars and stocks, it’s the entire modern world that has become too complex for anyone to understand, and therefore, too complex for anyone to fix with top-down rulemaking. Things get fixed with bottom-up tinkering and experimentation, by natural selection and evolution.

I have no idea if we will have another flash crash, but I’m highly confident we will have many financial events as dramatic and mysterious. [Editor’s note: The flash crash took place at 2:45 p.m. EDT on May 6, 2010. During it, the Dow (^DJI) plunged about 1,000 points, almost one-tenth of its entire value, before recovering within minutes. It was the largest one-day point decline in Dow Jones Industrial Average history.] And these don’t worry me as much as dramatic and mysterious physical disasters that kill people, not just rearrange people’s wealth. I expect we will have many unexplained disasters: airplane crashes, blackouts, mass computer failures, diseases, and sudden environmental changes, just to name a few. I am also highly confident that the total amount of damage from financial and physical disasters will be far smaller than the damage caused by the simpler, but more common and bigger, disasters of the past.

This is the world we live in, and I think most people know it. Politicians and regulators have to pretend otherwise, that we can prevent unexplainable bad things by giving more power to the former, creating more of the latter and writing a few thousand more pages of rules. After all, as they are fond of saying, “We have to do something!”

But we all know none of them are remotely qualified to diagnose or solve these problems. It takes real experts with specialized knowledge and their own money on the line, and it takes trial and error, patience and creativity. The people making top-down rules have no more idea of the effect of those rules than they know what would happen if you Tasered the malfunctioning autopilot of a modern jetliner in mid-flight. I don’t know what would happen either, but I think it’s a bad idea. Complex systems are, well, complex, and amateurs changing them from the outside are far more likely to do harm than good.

I have heard some exciting things from people I respect about Scott Patterson’s upcoming book, Dark Pools: High-Speed Traders, A.I. Bandits, and the Threat to the Global Financial System (he wrote The Quants in 2010). Scott won’t give me an advance copy so I’ll have to buy it in June like everyone else. Maybe he, a professional journalist working from the inside, has figured out something about the flash crash. I don’t claim that a clear, comprehensible answer is impossible, just that no one has come close to delivering one yet.

Humans have always lived in a universe they don’t understand, and probably always will. The practical focus of some of that uncertainty has shifted from natural events like weather and disease to the behavior of systems we created ourselves. We have made the world a much safer, more interesting, and more comfortable place than the past, but our creations have sometimes proven to be as unpredictable as the natural systems they replaced. So on the anniversary of the flash crash, give thanks for the millions of complex systems that worked perfectly today, and don’t worry too much about the one or two that failed.