A fascinating article in the NYT about Dr. Nouriel Roubini points out that the current United States economy was a predictable disaster. He found a pattern and tried to warn people of the coming crisis back in 2006:
Most of these countries also had poorly regulated banking systems plagued by excessive borrowing and reckless lending. Corporate governance was often weak, with cronyism in abundance.
I noted in the article that he wrote a book with Brad Setser, a friend and former colleague of mine. Around the same time, I would guess, I had tried to convince Brad that he should work with me on a book about macro-security.

Brad was not convinced. He said he worked in an obscure monetary field that had little or no relevance to information security.
This article reminds me that economics is never far from security, even information security. The economic security of nations and the role of governance are a macro-scale study of the same issues most companies face every day when dealing with information security.
Maybe I’ll ping Roubini about this, although it sounds like he is probably in high demand:
Kenneth Rogoff, an economist at Harvard who has known Roubini for decades, told me that he sees great value in Roubini’s willingness to entertain possible situations that are far outside the consensus view of most economists. “If you’re sitting around at the European Central Bank,” he said, “and you’re asking what’s the worst thing that could happen, the first thing people will say is, ‘Let’s see what Nouriel says.’ ” But Rogoff cautioned against equating that skill with forecasting. Roubini, in other words, might be the kind of economist you want to consult about the possibility of the collapse of the municipal-bond market, but he is not necessarily the kind you ask to predict, say, the rise in global demand for paper clips.
That sounds exactly like the role of a security executive. They might be called a CSO, CISO, Chief Paranoid, or even court jester, but the role they play is critical to maintaining a balanced view of the information at hand. Many times in my career I have been the only person to say, “the data and reports show success is highly unlikely and the benefits do not outweigh the risks — proceed with caution.” The first time you tell executives or a board of directors that they are headed for disaster, you can expect resistance. But after a disaster you predicted, you can bet management will at least ask for your view on all remaining projects.
As I used to say in meetings, “it’s nice to look at the clouds during a picnic, but the guy watching ants is the most likely to predict the weather.”