Threat Assessment


May 2012
Ivan Obolensky

The National Security Agency’s data center in Bluffdale, Utah is scheduled to be operational by September 2013. This multibillion-dollar facility will be able to tap into and monitor the vast amount of electronic data transmitted anywhere in the world. The center will store that data, decrypt it, and analyze it. And because this is the National Security Agency, it is logical to assume it will perform an ongoing threat assessment based on the information gathered.

Exactly what is a threat assessment?

A threat assessment is a document that outlines what is valuable to an individual or organization and needs to be protected. It identifies the threats, who or what is behind them, and where the vulnerabilities lie. It describes what happens if a threat is realized and what the consequences are. It then recommends what can be done to minimize exposure or loss.

Threat assessments have become part of corporate culture and are used even by small organizations.

To begin a threat assessment, one outlines the scope: who will use it and how far-ranging it will be. A narrow scope might be an individual’s or corporation’s vulnerability to a hacker attack, while a national threat assessment might cover hundreds, if not thousands, of possible contingencies. Threats are usually assigned numerical values for severity and exposure to give some idea of their relative importance. Each potential threat is then drilled down into who and what is involved.
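The numerical scoring described above can be sketched in a few lines of code. This is a minimal illustration only: the threat names, scales, and weights below are invented, and real assessments use far richer scoring models.

```python
# Hypothetical sketch: ranking threats by a simple severity x likelihood score.
# All threat names and numeric values here are invented for illustration.

threats = [
    # (name, severity 1-5, likelihood 1-5)
    ("Hacker intrusion",      4, 3),
    ("Insider data theft",    5, 2),
    ("Phishing of employees", 3, 4),
]

# Score each threat and sort so the most pressing appear first.
ranked = sorted(
    ((name, severity * likelihood) for name, severity, likelihood in threats),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, score in ranked:
    print(f"{score:2d}  {name}")
```

Even this toy version shows why such scores are useful: they turn a long list of contingencies into an ordered agenda for management.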

Policies and procedures for defense are outlined, and if the potential damage is sufficiently widespread and costly, a preemptive offense might be recommended.

The result is a wide-ranging document outlining potential threats, exposure, and policies and procedures to combat them, with recommendations to be implemented. It is a tool for top management and is updated regularly.1

Threat priorities can change and can offer profound insights into the thinking on both a national and an individual level. For instance, in the United States prior to 1995, emphasis was primarily on external threats, but with the Oklahoma City bombing and the 2001 anthrax attacks, internal threats took on greater significance.

Since 2008 there has also been increased economic distress, leaving governments worldwide vulnerable to larger and more violent internal disturbances. Consequently, governments of democratic nations have gradually implemented plans to monitor not only external enemies but their own citizens as well.

One need only look at the overthrow of the Egyptian government in 2011 to recognize that the ability to curtail the use of the Internet and cell phones would be a valuable means of population control during times of civil unrest.

In the US, citizens are coming under more severe scrutiny: legislation regarding Internet “kill switches”; the indefinite detention, without trial or charges, of citizens suspected of involvement in terrorism; and a recent addition to the Highway Bill allowing passports to be suspended on the basis of a mere tax investigation.2

In Great Britain, the extraordinary number of closed-circuit cameras installed everywhere is legendary.

Threat assessments differ depending on who is doing them. If one were a government, the above actions might be viewed as prudent. If one were a citizen performing one’s own threat assessment, it might appear differently.

Regardless, a realistic and useful threat assessment would have to involve the collection and processing of information.

For a large organization this would include the formation of databases with which to detect patterns of interest.

The difficulty with massive amounts of data is that someone or something has to eventually process it and summarize it so it can be used by planners and management.

In the case of the NSA or any intelligence agency, not only must the information be correct, but it must also be timely and have the correct emphasis. Such information is costly and not easy to obtain.

One can train an agent like a 007 to go into an area, get the relevant information, handle the threat, and simultaneously create a script for a Hollywood blockbuster; but finding and training exceptional agents is difficult and expensive. They are also subject to human frailty, as evidenced by the numerous turned agents reported over the years.

Further, with threats both external and internal, the lower-cost and no-less-effective option is simply to monitor all communications traffic, as was done successfully against the Japanese and Germans during World War II, and thus be privy to the intentions and plans of enemies of all kinds.

To be effective, one must be able to read the traffic one receives, and this is where the facility at Bluffdale, Utah comes in. It will house a more powerful and faster version of Cray’s XT4 supercomputer, named Jaguar for its processing speed.

Code-breaking has always been a necessary adjunct to communication monitoring. In the rivalry between decoding and encryption, the private sector’s ability to encrypt has surpassed the ability to decode. This has resulted in a huge volume of recorded but unreadable traffic, at least until now. With the computing power soon to be available at Bluffdale, previously encrypted information becomes vulnerable to unlocking by brute-force attack.
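A brute-force attack is conceptually simple: try every possible key until the output looks plausible. The toy sketch below uses an invented single-byte XOR cipher whose keyspace of 256 is trivially searchable; real modern ciphers have keyspaces so astronomically larger that this is exactly where raw computing power of the Bluffdale kind comes in.

```python
# Toy brute-force attack: recover a single-byte XOR key by trying all 256
# possibilities. The cipher and message are invented for illustration only.

def xor_cipher(data: bytes, key: int) -> bytes:
    """XOR every byte with the key; the same call encrypts and decrypts."""
    return bytes(b ^ key for b in data)

intercepted = xor_cipher(b"attack at dawn", 0x5A)  # ciphertext only; key unknown

found_key = None
for key in range(256):
    candidate = xor_cipher(intercepted, key)
    # A "crib": a word the attacker expects to appear in the plaintext,
    # much as WWII codebreakers exploited predictable message fragments.
    if b"dawn" in candidate:
        found_key = key
        break

print(f"recovered key {found_key:#04x}: {xor_cipher(intercepted, found_key).decode()}")
```

The crib test is the weak link in practice: with stored traffic and enough historical context, an analyst has many more such cribs to try, which is one reason archiving unreadable traffic pays off.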

Storing even currently indecipherable transmissions serves two purposes. First, the more raw data available, the more likely patterns can be found and the faster the encryption is unlocked. Second, once the code is cracked, one has a long-term historical record. This gives the agency an over-the-shoulder view of what was being said and written at times when the sender considered its communications unreadable. The decoder can discover how a source has acted in the past and then extrapolate how a potential threat will probably act in the future. It is also possible to use this information in future criminal prosecutions, as it establishes links between planned actions and events.

For this form of intelligence-gathering to be effective, its goal must be to monitor and store all possible electronic communications, from emails to voice transmissions, from numerical data to tweets, across all frequencies and along all pathways, anywhere and at any time in the world. This is a lot of information. To give some perspective, worldwide cell-phone traffic alone amounted to around 500 million calls a day in 2001 and was growing exponentially. To monitor and store all electronic information will take a very large facility. To process and sift it for patterns and threats will take yet another type of computation. This is called data mining.3

An example of data mining is a program called NORA (Non-Obvious Relationship Awareness). Jeff Jonas put together such a program for use by Las Vegas casinos in the early 2000s. It sifted through vast amounts of personal information, from dates of birth to social security numbers, travel records, and many other databases, in order to find hidden connections between seemingly unrelated individuals. The program’s purpose was to prevent a crime before it happened: it was able to track down card cheats before they were hired by a casino.4
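The core idea behind NORA-style analysis can be sketched very simply: index records by shared attributes and flag any collision. The records, names, and fields below are entirely invented, and this is a minimal sketch of the general technique, not NORA’s actual implementation.

```python
# Hypothetical sketch of "non-obvious relationship" detection: index people
# by attributes (phone, address) and flag any pair of records that collide.
# All records and field choices here are invented for illustration.

from collections import defaultdict

records = [
    {"name": "Applicant A", "phone": "555-0100", "address": "12 Elm St"},
    {"name": "Dealer B",    "phone": "555-0199", "address": "98 Oak Ave"},
    {"name": "Known cheat", "phone": "555-0100", "address": "7 Pine Rd"},
]

index = defaultdict(list)  # (field, value) -> names sharing that value
for rec in records:
    for field in ("phone", "address"):
        index[(field, rec[field])].append(rec["name"])

# Any attribute shared by two or more records is a hidden connection.
links = [(field, names) for (field, _), names in index.items() if len(names) > 1]
for field, names in links:
    print(f"shared {field}: {' <-> '.join(names)}")
```

Here the job applicant and the known cheat are connected only through a shared phone number, a link no human reviewer scanning two separate files would be likely to notice.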

A similar program on a national scale was the Total Information Awareness (TIA) program, which surfaced in 2003. Its purpose was to recognize terrorist behavior amidst the transactional behavior of ordinary citizens. It created a hue and cry among privacy advocates and was shot down, at least from public view, shortly after it was announced. Similar programs remain in existence: according to a GAO report from 2004, there were 199 data-mining programs across fifty-two Federal agencies. Data mining has become a serious tool at all levels.5

Data mining is only as effective as the amount and quality of the data fed into it. One recent (and perhaps the greatest) intelligence breakthrough in the amassing and processing of personal information has been the founding and widespread use of Facebook and Twitter. A whole new type of relationship data has become available, and any data-mining program of the NORA variety at work at Bluffdale should find such a database an extraordinary opportunity.

And just when it seemed that the simple way to escape electronic scrutiny was not to use any electronics at all, there came the recent announcement of the unfettered use of drones over unpopulated areas. Further, under surveillance as all-encompassing as that envisioned, suddenly ceasing to use electronic devices might itself be flagged: information that is expected to be flowing is not, and that absence invites closer scrutiny. Exactly what the consequences of these policy decisions will be is not known, but a trend is in place.

One particular point about threat assessments is that they are tools. They can and should influence policy-making, but they should not be the entire basis for setting policy, whether in a small organization or a large one. By emphasizing threats, management is inclined to become locked into the defensive posture inherent in such an analysis and to forget the purpose for which the organization exists.

This applies to all who use them whether government or citizen.

1 Bayne, J. (2002). An Overview of Threat and Risk Assessment. Retrieved April 27, 2012, from SANS Institute InfoSec Reading Room:

2 Armstrong, M. (2012, April 22). Who is Really Behind the Curtain? Retrieved April 27, 2012, from Martin

3 Bamford, J. (2012, March 15). The NSA is Building the Country’s Biggest Spy Center (Watch What You Say). Retrieved April 27, 2012, from ff_nsadatacenter/all/1

4 Young, R. (2007, May 15). What Happens in Vegas… Retrieved April 27, 2012, from

5 Ibid.



© 2012 Ivan Obolensky. All rights reserved. No part of this publication can be reproduced without the written permission from the author.
