May 4, 2016
“We’re about to push out a social media policy for security clearance professionals,” said William Evanina, director of the National Counterintelligence and Security Center (NCSC) in the Office of the Director of National Intelligence.
Timing is not set and details are scant. Some agencies have already run pilots to see what social media monitoring might produce. Others include social media checks in their insider threat programs.
“People are ok with giving financial [data],” Evanina said during an April 28 Insider Threat Symposium presented by the Intelligence and National Security Alliance. “But there’s still great reluctance to show what we do online.”
How social media will be measured and exactly what will be observed are still being worked out, but social media data is likely to figure into both security clearance reform and insider threat monitoring. The two are not the same.
Security clearances focus on external data, such as self-reported financial information, credit reports and interviews with colleagues, family, friends and neighbors. By contrast, insider threat programs focus on actual behavior. They watch patterns, like arrival and departure times and computer usage, such as which files you access and download, which internet sites you visit and how often you use a given application.
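Insider threat tools of this sort are, at bottom, pattern detectors. The Python sketch below is purely illustrative (hypothetical thresholds and data, not any agency’s actual system): it compares a user’s activity today against that user’s own historical baseline and flags large deviations for a human reviewer.

```python
# Illustrative sketch only: flag users whose daily file-download count
# deviates sharply from their own historical baseline. Thresholds and data
# are hypothetical, not drawn from any real insider threat program.
from statistics import mean, stdev

def flag_unusual_downloads(history, today, threshold=3.0):
    """history: past daily download counts for one user; today: today's count.
    Returns True if today's count sits far above the user's baseline."""
    if len(history) < 2:
        return False  # not enough baseline data to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today > mu
    return (today - mu) / sigma > threshold  # flag large positive deviations only

# Example: a user who normally downloads ~20 files a day suddenly pulls 400.
print(flag_unusual_downloads([18, 22, 19, 25, 21], 400))  # True
print(flag_unusual_downloads([18, 22, 19, 25, 21], 24))   # False
```

A real program would correlate many such signals, such as badge times, application use and site visits, rather than act on any single one, but the baseline-and-deviation idea is the common thread.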
Both have an interest in social media activity.
Carrie Wibben, director for security and policy oversight within the office of the undersecretary of Defense for intelligence, said the Pentagon is “convinced of the value of social media” and the intelligence such data can convey.
Social media networks bring to light extended personal and family relationships that might be hard to detect otherwise, as well as off-duty activities and personal expression that could indicate potential risk factors. Wibben said DoD has piloted social media checks using “publicly available electronic information,” and that such searches, while expensive and difficult to automate, can be fruitful.
“Social media is a critical data source,” Wibben said, especially for the younger security professionals joining the intelligence community today.
At the National Security Agency (NSA), 80 percent of the workforce has been hired since 9/11, said Kemp Ensor, NSA’s director of security. “They live on the net. Their friends are online. That’s where we need to be.”
In a recent pilot test, he said, the agency examined the public internet profiles of 175 individuals and found that 45 percent of those profiles surfaced information that could be adjudicated, meaning further investigation was necessary. “We’ve got to mine what’s out there in adjudicated criteria,” he said.
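As a rough illustration of what mining public profile data against adjudicative criteria could look like in software, here is a deliberately simplified Python sketch. The term list, sample profiles and keyword matching are all hypothetical; real reviews weigh far more context than string matches.

```python
# Illustrative sketch only: screen publicly available profile text for terms
# that might warrant adjudicative review. Term list and profiles are made up.
CONCERN_TERMS = {"gambling debt", "foreign contact", "unreported travel"}

def needs_review(profile_text: str) -> bool:
    """Return True if the text contains any term from the (hypothetical) list."""
    text = profile_text.lower()
    return any(term in text for term in CONCERN_TERMS)

profiles = [
    "Weekend hiker, coffee enthusiast, amateur photographer.",
    "Still trying to dig out of a serious gambling debt this year.",
]
flagged = [p for p in profiles if needs_review(p)]
print(f"{len(flagged)} of {len(profiles)} profiles flagged for further review")
```

Even a crude filter like this hints at why Wibben called such searches difficult to automate: the relevant signals sit in free-form text, and a human still has to adjudicate whatever gets flagged.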
Evaluation vs. Monitoring
Interest in social media comes as the intelligence and national security communities work to update the 70-year-old security clearance process, which has been in place since World War II. Built on self-reporting and interviews, it provides only a snapshot of an individual’s trustworthiness.
Daniel Payne, director of the Defense Security Service, says that’s not good enough. “Lives change,” he said. “We have backlogs. Nobody has been reinvestigated every five years.”
He and others want to implement a system of continuous evaluation (CE) to “give us insight into behaviors in between those reinvestigation periods.”
At the same time, agencies are trying to roll out insider threat programs, which focus directly on day-to-day activity and behavior. Unlike the security clearance system, which is standardized across the community, insider threat programs are unique to each agency: The Pentagon alone will have 44 separate programs. Yet the two can overlap.
The new National Background Investigations Bureau (NBIB) is taking over security clearance management from the Office of Personnel Management, and will ultimately be responsible for implementing CE. Jim Onusko, NBIB’s transition director, said CE’s increased focus on data and new sources of information will transform the process over time and change both the way employees are evaluated and the skills and tools NBIB uses to do its job.
“This truly becomes a capacity issue,” he said. “The number of investigators and the volume coming in the front door” are not aligned. “CE has got to be the future, but we need the electronic means to harness this information … and greater automation to complete these faster.” The kinds of workers the agency needs will also change. “The more electronic information you bring in, it begins to displace the investigating officers,” he said, and increases the need for data analysts.
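The capacity point can be made concrete with a toy triage example. The sketch below (hypothetical data feeds, severity scores and thresholds, not NBIB’s actual design) shows how automation shifts the work: software routes only the records that meet a flag threshold to investigators, while data analysts build and tune the pipeline itself.

```python
# Hypothetical triage sketch: route incoming continuous-evaluation records to
# a human investigator queue only when they meet a severity threshold.
from dataclasses import dataclass

@dataclass
class Record:
    subject_id: str
    source: str    # e.g. "credit_alert", "court_record", "foreign_travel"
    severity: int  # 1 (informational) .. 5 (serious); scale is invented here

def triage(records, severity_cutoff=3):
    """Split records into an investigator queue and an auto-archived list."""
    investigator_queue, auto_archive = [], []
    for rec in records:
        (investigator_queue if rec.severity >= severity_cutoff else auto_archive).append(rec)
    return investigator_queue, auto_archive

incoming = [
    Record("A102", "credit_alert", 2),
    Record("A102", "court_record", 4),
    Record("B337", "foreign_travel", 1),
]
queue, archived = triage(incoming)
print(f"{len(queue)} record(s) for investigators, {len(archived)} auto-archived")
```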
Monitoring Programs
Security officials all say these monitoring programs cannot exist in a vacuum and that managers and supervisors, along with human resources officials, must understand and embrace these programs as a necessary part of securing the workplace for everyone. Likewise, program owners have to be clear with employees about what is being watched and how that information will be used.
Government systems display banners warning employees against inappropriate use of government computers, but the warnings don’t stop them, said the NSA’s Ensor. “You’d be surprised at what people do on bannered systems,” he said.
What’s the answer? Ensor says it boils down to “the ‘A’ word: Accountability.” Managers must be accountable for applying and enforcing the rules. And employees have to be accountable for their behavior.
It’s not the monitoring programs that are the problem, he said. It’s the behavior they identify.
Tobias Naegele is the editor in chief of GovTechWorks. He has covered defense, military, and technology issues as an editor and reporter for more than 25 years, most of that time as editor-in-chief at Defense News and Military Times.