
Sharing Sensitive Information

April 2, 2010


Despite efforts to improve information sharing among intelligence organizations, headlines continue to trumpet the fact that gaps still exist [“Tribal warfare,” by Daniel Dombey, Financial Times, 11 March 2010]. Dombey notes that “concern is mounting about the continued failure of the myriad agencies to co-operate as they face unprecedented pressure to maintain national security.” He reports:

“Rarely, if ever, have the myriad agencies that make up the intelligence landscape been as central to US national security as in the past decade, as the CIA, FBI and other services have refocused on the struggle against extremist militant groups. ‘This is a war,’ Leon Panetta, CIA director, said. … But as the agencies wage that war, questions keep surfacing about their ability to work together – even after the most comprehensive overhaul in their history following the attacks of September 11 2001. Their very tactics also remain the subject of intense, sometimes almost tribal, disputes. ‘It is hard to think of a decade in which the intelligence community has been more important for the core functions of the American government,’ says Philip Zelikow, formerly a senior state department official and a central figure on the 9/11 commission, which called for an intelligence shake-up. ‘And some of the problems have never been starker.’ Although espionage and counterespionage played central roles in the cold war, much of that struggle involved diplomacy, grand strategy and tests of military and economic strength. By contrast, in the battle against al-Qaeda, it is the US intelligence sector that is at the forefront.”

Although a number of potential attacks have undoubtedly been prevented as a result of shared intelligence, intelligence personnel remind us that would-be terrorists can fail time and again, but a single success is enough for people to claim that the intelligence system is broken. Each failure prompts a reexamination of the policies and procedures governing how information is shared. The dozen and a half organizations involved are shown in the graphic that accompanied Dombey’s article.

Dombey continues:

“The real question is whether the huge intelligence sector has changed enough to prevail in the long run; or whether the CIA still hankers after its old role as first among equals while the FBI resists co-operation, and a new center struggles to assert control. At stake is not just whether the different agencies opt for co-ordination or culture clash but the very means by which they take on their adversaries. America’s intelligence agencies have been held responsible for two historic blunders in the past decade: failure to anticipate the attacks of 9/11 and incorrect assertions that Saddam Hussein possessed weapons of mass destruction on the eve of the Iraq war. Dennis Blair, director of national intelligence, recently remarked that after 9/11 the US discovered it could not be protected by the military alone and intelligence agencies realized they had to pool information, rather than just report to different government departments. But only this year Mr Obama himself upbraided the agencies for failing to ‘connect the dots’ and use available information to stop Umar Farouk Abdulmutallab, who allegedly tried to blow up the Northwest Airlines flight over Detroit at Christmas, from travelling to the US. There has since been an almost uninterrupted stream of criticism of the 16 agencies’ problems in working together. ‘You have a very large posse and no real sheriff running it,’ says Bruce Riedel, a CIA veteran and former White House aide. The cultural differences go deep. In the old days, Mr Riedel says, the word was that FBI agents shopped at Sears & Roebuck – whose cheap suits went with a down-to-earth image of hunting down criminals – while DIA staff wore uniforms and CIA case officers had a penchant for fine tailoring. Today, Mr Riedel hastens to add, dress styles at the much expanded CIA look more like those of a college campus, but the old divisions between agencies have not disappeared.”

Dombey asserts that not all of the blame for this tribal warfare rests with the organizations involved. Some of the blame can also be laid at the feet of “legislators and the current administration.” He explains:

“When the 9/11 commission proposed the position of director of national intelligence, it envisaged a clear hierarchy in which the CIA director would serve as the DNI’s number two. That never happened. … Intelligence officials barely bother to disguise the tension between Mr Blair, a former commander of US forces in the Pacific, and Mr Panetta, a savvy former White House chief of staff who knows the Oval Office inside out.”

The old saw that “knowledge is power” has always been true in politics, and information sharing among organizations has not yet been divorced from politics at the top. Dombey continues:

“Officials and experts add that it is unrealistic to expect the intelligence reforms to have taken full effect barely half a decade after they were enacted. A common comparison is with the Goldwater-Nichols act that reshaped the military from 1986, and sought to reduce intra-service rivalries and co-ordination failures by, for example, increasing the powers of the chairman of the joint chiefs of staff. That process took 20 years to complete, by many accounts. Doubts remain, however, about the classic Washington solution of designing another level of bureaucracy. … But many intelligence professionals say the creation of the DNI has improved co-ordination, particularly with ‘fusion centres’ drawing in staff from the agencies in the US and across the world. Mr Blair touts the creation of A-Space – short for Analytic Space – a web-based resource where analysts from different agencies can post information and ideas, an effort he compared to a classified MySpace or Facebook, complete with hyperlinks and RSS news alert feeds. But differences between the agencies are reflected in tensions over striking the right balance between electronic surveillance, often [called] signals intelligence (‘sigint’), and human intelligence gathering (‘humint’) when handling extremists.”

These agency differences and tensions center on the operational groups and how they must share information from a day-to-day mission and technical standpoint. There has actually been significant progress in developing policies that promote cross-agency information sharing and provide the basic rules for striking “the right balance.” In response to the 9/11 Commission’s recommendations, Congress passed and the President signed the Intelligence Reform and Terrorism Prevention Act of 2004, which called for the creation of an Information Sharing Environment (ISE – www.ise.gov) across the intelligence community and defined it as “an approach that facilitates the sharing of terrorism information.” Many key agencies have implemented plans to meet the ISE objectives, and under the Obama Administration, the ISE’s Information Sharing Council has been integrated into the White House policy process through the Information Sharing and Access Interagency Policy Committee (IPC) to promote collaboration and the sharing of lessons learned.

Although the shift to a new intelligence-sharing culture will continue to take time, advances in hardware and software are helping it move faster. For example, on the hardware side, a technique known as photonic switching will help get the right information to the right place more quickly [“Inside the Black Box,” by Kelvin Chau, C4ISR, March 2010]. Chau, an electrical engineer and optics expert, is head of system integration and testing at Glimmerglass, a Hayward, CA-based company founded in 2000 to develop optical communications systems for commercial and intelligence community customers. He writes:

“Traditional optical communications equipment is widely deployed by the world’s most elite intelligence agencies, but the critical need for rapid and flexible data dissemination is creating a more central role for purely optical ‘photonic’ switching systems. … The great majority of voice, video and data traffic flows over fiber-optic networks. Undersea communications fibers are used to connect continents; signals at satellite terminals are converted into optical streams for ground transmission; sensor signals are carried over optical paths; and fiber-optic cables carry critical information for telecommunications service providers and intelligence agencies. The systemic deployment of fiber infrastructure has delivered profound benefits, but also major challenges. Before photonic switches, the main method for directing optical traffic at the intelligence community’s communications hubs was to convert it to electrical signals inside optical-electrical-optical (O-E-O) switches. The electrical signals would be redirected, or switched, in a new direction, and converted back to optical signals for transmission through fiber-optic cables. Managing these optical links is an increasing problem as network operators expand capacity and migrate to Internet Protocol/Ethernet. With an ever-increasing number of fibers and multiple wavelengths per fiber, and ever-increasing data rates, the process of optical-electrical conversion required hundreds of electronic chips, and these chips required a commensurate amount of space, cooling and power. Given these increasing burdens, the idea of sidestepping the conversion process where possible sparked the development of photonic switching technology.”

Chau notes that his company “has pioneered their use for photonic switching” and that “a Three-Dimensional Micro-Electro-Mechanical Systems (3D-MEMS) architecture for putting mirrors on silicon has emerged as the most economically viable approach for building reconfigurable, transparent and scalable photonic switches.” He reports that “3D-MEMS components are used today in car navigation systems and inkjet heads and many other applications.” The real value of 3D-MEMS, he writes, is that it provides “a scale needed to support a global communications network node with multiple fibers, each carrying hundreds of wavelengths.” Chau continues:

“The real power of 3D-MEMS technology is enabling many small but complex elements to be precisely built on a wafer through lithography, etching and masking processes. Through batch fabrication and minimization of materials, 3D-MEMS-based products are now manufacturable and cost-effective across many industries. They are lightweight and require very low power while providing high performance.”

Chau goes on to report how photonic optical signal management helps address “five major challenges facing the intelligence community”:

1. FAST SWITCHING, LESS OPTICAL LOSS

Glimmerglass harnessed 3D-MEMS technology to create and control beam steering mirrors, enabling millisecond reconfiguration of optical signal paths for the dissemination of intelligence. … A double-gimbal actuator structure allows a wide range of tilt in all directions. … Hinges provide the mirrors a full range of movement with no friction and virtually no material stress. With this design, a mirror can be adjusted to a new, stable position within 25 milliseconds. … This 3D-MEMS photonic switching architecture offers enormous flexibility and efficiency relative to cost and size. … 3D-MEMS have the best performance in terms of optical loss among the purely optical switch technologies. Excessive loss in an optical network could lead to the need for higher-power lasers, increasing the cost of surrounding equipment. Minimizing such loss improves the overall network economics.

2. MORE CAPACITY

There is a significant increase in port capacity, which equates to increases in the amount of information that can be disseminated and the number of users who can receive actionable intelligence. … The photonic switching approach provides great flexibility in intelligence monitoring and resources sharing, reduces network churn and lowers costs. These attributes become more important when data rates and volume climb.

3. FEWER UPGRADES NEEDED

Photonic switching produces bit rate and protocol transparent platforms with minimal system optical loss by redirecting the actual photons instead of processing the information into an electrical signal. Photonic switching devices can accept and manage evolving formats and data rates and manage higher capacity signals without the need for upgrades, thus ‘future proofing’ the network and protecting investments.

4. SMALLER IS BETTER

The switches provide a dramatic reduction in power, size and heat. … This translates into a reduced footprint and cost savings by eliminating power-generating, air-conditioning and distribution equipment such as batteries, rectifiers, diesel generators and monthly maintenance.

5. HANDS-FREE DISTRIBUTION

Photonic optical solutions are transforming intelligence collection and dissemination by enabling the flexibility of nonintrusive, remote monitoring and rapid reconfiguration of optical signal paths through software control, eliminating the need for costly manual equipment adjustments and precluding human errors. Leaps forward in software and hardware integration make it possible for network operators at remote locations, particularly headquarter control sites, to use Web-based graphical interfaces or command language to control photonic switches. Operators are able to form large networks of multiple switches capable of being monitored and controlled from a single remote server.
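
The software-controlled reconfiguration Chau describes in item 5 is easiest to picture with a short sketch. The snippet below is purely illustrative: the PhotonicSwitch class, its CONNECT/DISCONNECT command strings, the TCP port, and the hostname are assumptions made for the example, not Glimmerglass’s actual management interface. It simply shows the kind of remote, scripted cross-connect change that replaces a manual fiber patch at an unmanned site.

```python
# Hypothetical sketch of remote, software-driven path reconfiguration on a
# photonic switch. The command syntax, port number, and hostname are
# illustrative assumptions, not a real vendor API.
import socket


class PhotonicSwitch:
    """Thin client for a switch that accepts line-oriented text commands
    over TCP, a common pattern for optical cross-connect management."""

    def __init__(self, host: str, port: int = 3082, timeout: float = 5.0):
        self.sock = socket.create_connection((host, port), timeout=timeout)

    def _command(self, text: str) -> str:
        # Send one command line and return the switch's text response.
        self.sock.sendall((text + "\r\n").encode("ascii"))
        return self.sock.recv(4096).decode("ascii").strip()

    def connect_ports(self, in_port: int, out_port: int) -> str:
        # Ask the switch to steer its MEMS mirrors so that light entering
        # in_port exits out_port.
        return self._command(f"CONNECT {in_port} {out_port}")

    def disconnect_port(self, in_port: int) -> str:
        return self._command(f"DISCONNECT {in_port}")

    def close(self) -> None:
        self.sock.close()


if __name__ == "__main__":
    # Fail a sensor feed over from a degraded fiber (port 7) to a spare
    # downlink (port 12) without anyone visiting the remote site.
    switch = PhotonicSwitch("switch01.example.net")  # hypothetical address
    print(switch.connect_ports(in_port=7, out_port=12))
    switch.close()
```

Because the change happens entirely in the optical domain, a script like this works regardless of the bit rate or protocol riding on the fiber, which is exactly the transparency Chau highlights in item 3.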

Chau concludes by providing a few examples of why this new switching technique is so valuable to the intelligence community.

“Consider a military operations center, which requires constant feeds of critical data and where automatic switching from a failed fiber could preclude loss of actionable intelligence. One such example is a knowledge wall displaying high-definition information feeds from many sources requiring complex management. Many intelligence community applications require remote sensor management. Manning these sensor sites is challenging due to both geographic location and staffing requirements. Several intelligence programs have already realized the flexibility of the remotely managed photonic optical systems. Sensing locations have been set up and, because of the remote management capability, network managers and intelligence analysts can make fiber switching decisions from headquarters. Much has been debated about the ability of various intelligence parties to collaborate. Significant work has been done to improve intelligence sharing, such as across the Department of Defense’s Distributed Common Ground/Surface System, the National Geospatial-Intelligence Agency’s e-GEOINT Web services and the National Security Agency’s Real Time Regional Gateway. Thanks to major technology advances, photonic switching solutions today offer significant potential for enhancing real-time information exchange among all the relevant intelligence community partners.”

Having the right hardware supporting intelligence sharing is, of course, important. Moving information around, however, is only half the problem. Because the information is sensitive, once it arrives the system needs to verify that the person accessing it has the right clearance and a need to know. To deal with this problem, the intelligence services have been working on Attribute-Based Access Control (ABAC). I first wrote about ABAC in a post entitled Persistent Ocean Surveillance. In that post, I wrote:

“The goal of an Attribute-Based Access Control framework is to provide a community the ability to manage secure information and resource exchange while instilling confidence among information owners that only authorized users can access appropriate data and information. Role-based access control (RBAC) functions accomplish this today for stand-alone systems and enterprises but do not scale for enterprise and multi-party communities that need to share real-time data at a granular level (e.g., individual document) and/or under specific conditions. ABAC frameworks deal with both granularity and scalability challenges and are able to address dynamic situations. Because information owners retain control over access to their information, authority is inherently distributed and managed. ABAC, however, provides a means for information granting authority to determine and specify who gets access and when. As a result, while control policies are centralized, decentralized resource owners retain fine-grained control of their own information.”
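
To make that concrete, here is a minimal sketch of how an ABAC policy decision point might combine attributes of the requestor, the resource, and the environment. It is an illustration only, not any agency’s implementation: the Request structure, the attribute names (clearance, compartments, approved networks), and the rules themselves are hypothetical, and a real deployment would use a standardized policy language and a far richer attribute model.

```python
# Minimal ABAC policy-decision sketch. Attribute names, rules, and data
# structures are hypothetical and for illustration only.
from dataclasses import dataclass, field


@dataclass
class Request:
    subject: dict                                    # requestor attributes, e.g. clearance, compartments
    resource: dict                                   # resource attributes, e.g. classification, owner policy
    environment: dict = field(default_factory=dict)  # contextual attributes, e.g. originating network


CLEARANCE_RANK = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}


def permit(request: Request) -> bool:
    """Grant access only if every attribute-based rule is satisfied."""
    subj, res, env = request.subject, request.resource, request.environment

    # Rule 1: the requestor's clearance must dominate the resource's classification.
    if CLEARANCE_RANK[subj["clearance"]] < CLEARANCE_RANK[res["classification"]]:
        return False

    # Rule 2: need-to-know, expressed here as compartment membership.
    if not set(res.get("compartments", [])) <= set(subj.get("compartments", [])):
        return False

    # Rule 3: an environmental condition set by the information owner,
    # e.g. access only from networks the owner has approved.
    approved = res.get("approved_networks")
    if approved and env.get("network") not in approved:
        return False

    return True


if __name__ == "__main__":
    req = Request(
        subject={"clearance": "SECRET", "compartments": ["ALPHA"]},
        resource={"classification": "SECRET", "compartments": ["ALPHA"],
                  "approved_networks": ["NETWORK-A"]},
        environment={"network": "NETWORK-A"},
    )
    print("PERMIT" if permit(req) else "DENY")  # -> PERMIT
```

Because the decision is driven entirely by attributes rather than by pre-assigned roles, the information owner can tighten or loosen access at the level of an individual document simply by changing the resource’s attributes, which is the granularity and distributed control described above.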

As one academic paper concludes, “Attribute-based access control, making access decisions based on the attributes of requestors, resources, and the environment, provides the flexibility and scalability that are essential to large-scale distributed systems such as the Grid” [“Attribute Based Access Control for Grid Computing,” by Bo Lang, Ian Foster, Frank Siebenlist, Rachana Ananthakrishnan, and Tim Freeman, Argonne National Laboratory]. These distributed systems are growing in complexity and deal with security controls for sensitive information, as I discussed earlier, as well as proprietary information (think secure supply chains) and privacy issues (think new federal health care and smart grid programs). As things now stand, the technology for secure information sharing will likely be in place before the culture that it will ultimately support is fully embraced. If I’m correct, the technology will actually help speed up that cultural change and provide the means to address difficult security requirements at a reasonable cost.
