Since 2013, the U.S. Intelligence Community’s annual worldwide threat assessment to Congress has highlighted cybersecurity as the top national security threat. In September 2015, the Director of National Intelligence, James Clapper, testified that “cyber threat cannot be eliminated; rather, cyber risk must be managed.” But how is a nation—let alone any company targeted by hackers—supposed to manage this pervasive risk when there has traditionally been no mechanism to reliably quantify the probabilities of adverse cyber incidents or to measure their likely costs in dollars? Indeed, that very challenge led the World Economic Forum to identify cyber threats as a critical risk to businesses in the 21st century.
During my tenure as the first National Intelligence Officer for Cyber Issues from 2011 to 2016, I regularly engaged industry leaders and stressed the need for risk management strategies for information and communication technology (ICT) resources. Functionality reigns supreme in the innovative ICT sector, and security—at least the kind that would be required to stymie well-resourced, determined adversaries—is often an afterthought in the competitive race to market. That continuing trend, combined with inchoate legislative and regulatory efforts that have not kept pace with the evolving cyber landscape, leaves many corporate enterprises not only vulnerably dependent on insecure ICT networks, but also highly uncertain about the potential impacts of cyber attacks and their own actual risk exposure.
To assess those concerns and chart a pragmatic path forward, one must first properly frame the problem. This past May, I delivered a keynote address at Temple University in Tokyo, where I explained that “cybersecurity” models must be replaced by “information risk” paradigms. That assertion carried two important messages. First, perfect cybersecurity is an unachievable goal, so risk analysis is a better intellectual construct to apply. Second, this is not just a technology problem; any sound approach must also consider people, processes, and other contextual factors in addition to the underlying hardware and software.
So, if the marketplace is not yet offering effective enterprise solutions, and public sector initiatives are lagging as well, then where can today’s C-suite turn? Ironically, the answer to that Information Age quandary may lie in the centuries-old insurance industry. During congressional testimony before the House Oversight and Government Reform Committee on July 13, Peter Singer (a senior fellow at the think tank New America) and I both highlighted the role of the growing cyberinsurance industry in driving best practices for cybersecurity. I have publicly opined for several years now that more sophisticated analytics are needed within the cyberinsurance industry, because we currently lack robust actuarial data regarding the frequency and cost of cyber incidents. Most enterprises are at risk, and many have already been compromised (whether they know it or not). Yet the majority of those events are either not reported or do not lend themselves to easy quantification.