
The Curse of Silo Thinking

January 21, 2010


One of the characteristics of industrial-age enterprises is that they are organized around functional departments. This organizational structure results in both siloed information and siloed thinking. For nearly a hundred years, the culture of siloed thinking permeated almost every organization — including national intelligence services, where information was protected like a miser’s gold rather than shared as a common good. We all know the devastating consequences of this lack of shared data. The intelligence world isn’t the only arena of human activity where the absence of a cross-flow of information can prove devastating. Another recent case in point is the collapse of Lehman Brothers [“The lessons: The dangers of silo thinking,” by Gillian Tett, Financial Times, 14 December 2009]. Tett writes:

“When Larry McDonald, a former bond trader at Lehman Brothers, wrote an exposé of that broker’s collapse, it seems that his main intention was to reveal the extraordinary folly and ineptitude of the former Lehman bosses. In practice, though, his colourful tale also highlights – almost inadvertently – another crucial problem that haunts the modern financial world: the curse of silos.”

The silo is an apt image, but it’s not the only one in use. In some government circles, such as the military, people talk about stovepipes. Regardless of what you call it, organizations that fail to share information across artificial boundaries will inevitably suffer as a result. Tett continues:

“As McDonald narrates, several years before the Lehman collapse in the autumn of 2008, its own fixed-income department was already so alarmed by the American real estate market that they were hunting for ways to go ‘short’. However, while one department of Lehman was exceedingly bearish, other departments, such as the mortgage securitisation team, were so aggressively bullish that they were increasing their exposure – and the different departments were in such rivalry that they barely knew what the other was doing, with disastrous consequences. It is a saga that raises a wider moral, not just for bankers, but for investors too. Vats of ink have been spilt to explain all the macro-economic and regulatory reasons for the financial crash. But one issue that has received less attention is the trend towards fragmentation in the financial industry, not just in a structural sense (ie departments that do not talk), but a mental sense too (ie financiers operating in tunnel-vision mode). This fragmentation fuelled many of the recent failures of public policy.”

Whenever information is not shared within an organization, that organization becomes a house divided against itself. Tett reports, however, that it is not just siloed information within organizations that can create problems. Problems can also be created when information is not shared between organizations. She continues:

“Just look at how the activities of groups such as AIG ‘fell through the cracks’ because there were numerous competing regulatory bodies in the US. Look too at how British policy-makers tried to separate out the conduct of monetary policy (managed by the Bank of England) from financial regulation (handled by the Financial Services Authority) with equally disastrous effect. However, the problem of fragmentation has also been central to the disaster in private-sector institutions. Lehman was certainly not the only bank marked by internal tribalism. Institutions such as UBS, Merrill Lynch and Citi demonstrated similar problems. And this sense of fragmentation has not just hampered information flows around banks, but has also prevented information flowing across the market too. That, in turn, has fuelled a sense of tunnel vision among some investors, with equally dismal results.”

Tett notes that “the financial crisis has highlighted with painful clarity just how dangerous such tunnel vision can be,” but she’s encouraged by the fact that more and more people are engaging in “lateral thinking.” She continues:

“The good news is that some financiers, investors and policymakers are belatedly trying to combat it. The hot new fad among regulators, for example, is macro-prudential surveillance (a posh phrase for proactive regulation that tries to join up all the dots). Investment banks are scurrying to beef up their risk management functions, and stressing the importance of holistic oversight. Meanwhile, a host of asset managers are champions of lateral thought, and are trying to understand what is happening in seemingly disconnected silos – be that in the Chinese auto industry, carbon trading markets or credit default swaps.”

Tett is not naive, however, and she understands that “the curse of silos will not be easy to beat.” The reasons, she asserts, are two-fold. First, bad habits are hard to break. Second, the information age brings with it a crush of data that requires experts to make sense of it. The problem is that most experts only talk to other experts in the same field. Tett concludes:

“For one bizarre paradox of the modern age is that while technology is integrating the world in some senses (say, via the internet), it is simultaneously creating fragmentation too (with people in one mental silo tending to only talk to each other, even on the internet.) And as innovation speeds up, this is creating a plethora of activities that are only understood by ‘experts’ in a silo – be that in finance or in numerous other fields. That pattern implies there is now a big need for ‘cultural translators’, who can explain what is happening in those silos to everyone else. But the cadre of cultural translators in today’s world is pitifully small (and may even be shrinking, as institutions such as media organisations and rating agencies find their business models under threat).”

Long-time readers of this blog know that I’m a strong proponent of cross-sector interaction. Breaking down barriers and establishing dialogs between experts in different arenas almost always leads to positive consequences. When I founded Enterra Solutions®, one of my goals was to create a system that would help companies break down the silos hampering their operations. I called that system Enterra’s Resilient Technology Architecture™ (RTA), which I envisioned as a next-generation architecture that could establish the infrastructure for resilience on an organization-wide basis. Although Enterra’s business model has evolved to focus on other areas in addition to RTA, it remains a viable solution to silo thinking.


A Resilient Technology Architecture would take advantage of the features of a service-oriented architecture to deploy dynamic, automated processes, encoded in standards-based languages and technologies such as BPEL, XML, and Java, across multiple IT systems and layers within the architecture stack. It would provide for the encoding of extremely complex rules, including those governing contingencies and other multi-faceted processes operating across the full breadth of the organization. The processes involved in a Resilient Technology Architecture would pull data from existing data repositories, thus supporting one of the major goals of Enterra’s Enterprise Resilience Management Methodology®: breaking down information silos and allowing data to flow across the organization.
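To make the idea concrete, here is a rough sketch of what a cross-silo process might look like in plain Java. The interface, record, and class names below are hypothetical illustrations of the concept, not Enterra code or any product’s API; the point is simply that every department exposes its data through a common contract so a firm-wide view can be assembled.

```java
// A minimal, hypothetical sketch of a cross-silo process: each department
// exposes its positions through a shared interface, and a firm-wide report
// aggregates them so no single desk's view dominates.
import java.util.List;

record Position(String desk, String instrument, double exposure) {}

interface DepartmentRepository {
    // Each silo (fixed income, mortgage securitisation, etc.) implements this.
    List<Position> positions();
}

class ExposureReport {
    // Aggregate exposure across every departmental repository.
    static double totalExposure(List<DepartmentRepository> silos) {
        return silos.stream()
                    .flatMap(repo -> repo.positions().stream())
                    .mapToDouble(Position::exposure)
                    .sum();
    }
}
```

Had something this simple existed at Lehman, the bearish fixed-income desk and the bullish securitisation desk would at least have been looking at the same aggregate number.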

As I envisioned it, a Resilient Technology Architecture would also provide a mechanism to serve decision-support information to managers and senior leaders. That information would be delivered by Transparent Intelligent Interfaces™ (rich Internet applications), which would consolidate data about processes in progress and present it in context so that it could serve as the basis for individual and organizational decision-making. Complex event processing (CEP), the continuous processing and analysis of high-volume, high-speed data streams from inside and outside an organization to detect business-critical issues in real time, would be an important part of the architecture. This differs from traditional intelligence processes, which generally provide delayed analysis. The vast majority of event processing applications in use today are custom-coded; much of that custom coding effort, however, can be eliminated by using commercial off-the-shelf CEP engines. The CEP layer would be a critical component of the RTA because it would allow the architecture to filter, correlate, and react to events in real time or near-real time. Events in CEP can be business, external, or system events.
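The CEP idea can be illustrated with a small, self-contained sketch: filter an incoming event stream, correlate events inside a sliding time window, and react when a pattern emerges. The event type, threshold, and class names below are assumptions made purely for illustration; a real deployment would use a commercial CEP engine rather than hand-rolled code.

```java
// Illustrative (not production) sketch of the CEP pattern: filter,
// correlate within a time window, react.
import java.time.Instant;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Consumer;

record Event(String type, String source, double value, Instant at) {}

class SimpleCorrelator {
    private final Deque<Event> window = new ArrayDeque<>();
    private final long windowMillis;
    private final Consumer<String> alert;

    SimpleCorrelator(long windowMillis, Consumer<String> alert) {
        this.windowMillis = windowMillis;
        this.alert = alert;
    }

    // Called for every incoming event (business, external, or system).
    void onEvent(Event e) {
        // Filter: ignore event types we do not care about.
        if (!e.type().equals("ORDER_DELAY")) return;

        // Correlate: keep only events inside the sliding time window.
        window.addLast(e);
        Instant cutoff = e.at().minusMillis(windowMillis);
        while (!window.isEmpty() && window.peekFirst().at().isBefore(cutoff)) {
            window.removeFirst();
        }

        // React: repeated delays from the same source within the window
        // are treated as a business-critical issue.
        long fromSameSource = window.stream()
                                    .filter(w -> w.source().equals(e.source()))
                                    .count();
        if (fromSameSource >= 3) {
            alert.accept("Repeated delays detected at " + e.source());
        }
    }
}
```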


The linkage between Transparent Intelligent Interfaces and automated processes would also enable automatic real-time alerts to the appropriate managers in the event of a process failure, a threat event, or an emerging competitive opportunity. A Resilient Technology Architecture would establish the basis for consistent application of automated best practices and for near-real-time updating of automated rules in an event-driven service-oriented architecture. An RTA would have a dynamic knowledge base containing a repository of pre-configured business-process and best-practice templates. The knowledge base would include industry-standard rule-set collections that encode automated processes applying across an entire industry, such as established compliance documentation procedures and standard practices, as well as a custom rule-set collection of client-specific processes.
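The alert-routing portion of that description might be sketched as follows, with each alert category mapped to the manager responsible for it. The category names and the Notifier interface are illustrative assumptions, not part of the RTA itself.

```java
// Hypothetical sketch of routing real-time alerts to the appropriate
// manager by alert category.
import java.util.EnumMap;
import java.util.Map;

enum AlertKind { PROCESS_FAILURE, THREAT, OPPORTUNITY }

interface Notifier {
    void notify(String manager, String message);
}

class AlertRouter {
    private final Map<AlertKind, String> owners = new EnumMap<>(AlertKind.class);
    private final Notifier notifier;

    AlertRouter(Notifier notifier) {
        this.notifier = notifier;
        // Illustrative ownership mapping; a real system would load this
        // from the knowledge base described above.
        owners.put(AlertKind.PROCESS_FAILURE, "operations-manager");
        owners.put(AlertKind.THREAT, "risk-officer");
        owners.put(AlertKind.OPPORTUNITY, "business-development-lead");
    }

    void raise(AlertKind kind, String detail) {
        notifier.notify(owners.get(kind), "[" + kind + "] " + detail);
    }
}
```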


Because rules maintain linkages back to their source documents in this kind of architecture, they can be easily updated to account for changing regulations, new best practices, and changes in standard operating procedures. The update capability inherent in the knowledge base establishes a system that, much like antivirus software, keeps an organization always current with regulatory requirements and performance standards, a capability that is key to resilience and a central element of Enterprise Resilience Management. Because a Resilient Technology Architecture is only part of a broader enterprise system, Enterra now includes the concept under a broader business philosophy we call Advanced Enterprise Management Systems™ (AEMS). Enterra’s most advanced segment of AEMS involves helping Consumer Packaged Goods manufacturers with their order fulfillment process. It is one part of Enterra’s drive to help create “Intelligent Supply Chains” that reduce inefficiencies and increase profitability.
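The rule-to-source linkage can be pictured with one last short sketch: each rule record carries a reference to the regulation or standard operating procedure it encodes, and a new version of that document triggers re-derivation of the affected rules, much the way an antivirus definition update refreshes stale signatures. All names here are hypothetical.

```java
// Hypothetical sketch of a knowledge base whose rules stay linked to the
// source documents they encode.
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

record Rule(String id, String logic, String sourceDocument, int sourceVersion) {}

class RuleKnowledgeBase {
    private final Map<String, Rule> rules = new HashMap<>();

    void register(Rule rule) {
        rules.put(rule.id(), rule);
    }

    // When a source document changes (new regulation, revised SOP),
    // re-derive and replace every rule that traces back to it.
    void onSourceUpdated(String document, int newVersion, Function<Rule, Rule> rederive) {
        rules.replaceAll((id, rule) ->
            rule.sourceDocument().equals(document) && rule.sourceVersion() < newVersion
                ? rederive.apply(rule)
                : rule);
    }
}
```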
