Christian Verstaete has started a multi-part series about “the business aspects of cloud computing” on Hewlett-Packard’s Cloud Source Blog. In this post, I review the first six segments of that series. In Part 1 [“Responding to Megatrends,” 28 December 2011], Verstaete discussed “megatrends” that are changing the global business landscape. As a result, businesses must “become more responsive and agile.” So what does this have to do with cloud computing? Verstaete explains:
“[Being] more responsive and more agile … implies [having] a better understanding of the marketplace, competition, partners, suppliers, customers etc. [It also means] having the appropriate information … and being able to use powerful analytics tools across structured and unstructured data … to allow fast decision making.”
Verstaete believes that only cloud computing architectures can provide an infrastructure that is “flexible and agile” enough to meet the changing needs of businesses. He continues:
“A cloud platform managed and operated by the IT department can deliver the services that are ‘core’ for the business, while the ‘context’ ones may be sourced from public clouds. It’s the role of IT to manage the integration across those platforms while [keeping] the legacy environments operational. Secondly, the business requires easy access to the appropriate information provided by internal systems or available on the internet. Establishing a set of analytics tools to slice and dice the data will allow the business teams to make their decisions quickly. And obviously, this needs to be done at lower cost and with as little [capital expenditure] as possible. Cloud [computing] and its consumption models lend [themselves] ideally to those challenges.”
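That “slice and dice” capability is concrete enough to illustrate. Below is a minimal sketch, assuming a pandas DataFrame of order records (the column names and figures are invented for illustration), of the kind of quick, ad hoc aggregation Verstaete has in mind:

```python
import pandas as pd

# Illustrative data only; the columns and numbers are hypothetical.
orders = pd.DataFrame({
    "region":  ["EMEA", "EMEA", "APAC", "AMER"],
    "channel": ["web", "partner", "web", "web"],
    "revenue": [120_000, 80_000, 95_000, 150_000],
})

# "Slice and dice": aggregate the same records along different axes
# to answer ad hoc business questions quickly.
by_region = orders.groupby("region")["revenue"].sum()
by_channel = orders.groupby("channel")["revenue"].sum()

print(by_region)
print(by_channel)
```

The point is not the tooling but the turnaround: the same records answer different business questions in seconds, which is the “fast decision making” Verstaete is after.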
Over the course of his series, Verstaete tries to explain why cloud computing is up to the emerging challenges presented by his megatrends. The five trends he discusses are demographics, globalization, social networks, mobility, and sustainability.
In Part 2 [“When the Papy-Boom gets involved,” 4 January 2012], Verstaete talks about how the retirement of Baby Boomers (or Papy Boomers – Papy being a familiar nickname for European grandfathers) is going to affect the marketplace. He notes that “a fair amount of reasonably affluent people are reaching the retirement age where they have an increased amount of time to consume goods and services. It’s a [threat] and an opportunity.” What does that have to do with cloud computing? Verstaete’s first concern is that a lot of knowledge and expertise could walk out the door when a baby boomer retires unless it’s “properly harvested.” He explains:
“When they retire, the knowledge and experience of older technologies will disappear, as well as the deep understanding of the intrinsic nature of IT, as many learned to program in machine code, assembler and other lower level languages. IT departments will have to backfill this knowledge or start working differently. And can they find enough new hires to fill the positions left open with the retirement of the baby boomers? Here is where cloud [computing] can help. Consuming public cloud services for ‘context’ services leaves a smaller IT team available to focus on the delivery of the ‘core’ services required by the business. Virtualizing and automating the delivery environment allows them to increase productivity.”
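Verstaete’s “virtualizing and automating” remedy is also easy to sketch. In the fragment below, provision_vm is a hypothetical stand-in for whatever virtualization API a shop actually runs (vSphere, OpenStack, and so on); nothing here comes from Verstaete’s post:

```python
# Hypothetical sketch of an automated delivery environment.
# provision_vm() stands in for a real virtualization API.

STANDARD_TEMPLATES = {
    "web": {"cpus": 2, "ram_gb": 4},
    "db":  {"cpus": 4, "ram_gb": 16},
}

def provision_vm(name: str, role: str) -> dict:
    """Create a VM from a pre-approved template instead of a manual,
    ticket-driven build."""
    spec = STANDARD_TEMPLATES[role]
    return {"name": name, **spec, "status": "running"}

# One call replaces a multi-step manual build.
print(provision_vm("app-01", role="web"))
```

The design point is that a shrinking team codifies its know-how in templates once, rather than rebuilding servers by hand each time – one way the departing boomers’ knowledge gets “properly harvested.”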
Verstaete doesn’t write off Generation Y; he just believes they need to be managed differently from baby boomers or Generation X employees in order to make them as productive as possible. Generation Y employees will be the individuals who create the products and services that baby boomers will desire. Verstaete continues:
“The baby boomers have worked hard and expect to take full advantage of their retirement. They are also one of the first generations that experienced exponential progress in technology in general and electronics in particular. Many are actually eager adopters of those technologies and will continue [to] be that as they grow older. They now have time on their hand[s], many of them with a reasonable buying power. So, enterprises that can focus on the creation of products and services for the baby boomers have great opportunities ahead of them. Combining actual products with localization services and cloud [computing] can serve as a platform to provide entertainment, healthcare, and other innovative user experiences. I leave it up to your imagination to identify such products.”
In Part 3 [“The strategic service broker,” 11 January 2012], Verstaete discusses why software-as-a-service can offer good returns on investment for most businesses. He writes:
“After having invested heavily in the deployment of applications such as finance and enterprise resource planning (ERP) to support the day-to-day operation of the enterprise and address the millennium bug problem, many CIOs have started improving the efficiency of their environments through virtualization. But that is just the start. The next step is to add automation to speed up operations and improve flexibility. But will that be sufficient to address the needs of the business? Probably not, as one of the requirements is the ability to quickly address new needs, some of which may require new functionality. Efficiency, responsiveness and flexibility, obtained by transforming the in-house IT landscape into a private cloud, allow existing operations to be adapted to changing needs quicker, but do little to improve the development of new functionality. So, here is the question, does that functionality actually need to be developed by the IT department, or could it be sourced from somewhere else? You will probably tell me it takes a long time to evaluate software packages and applications, negotiate licensing terms and conditions with software vendors, etc. And you are right, but that’s in the old world. In a new world, where Software-as-a-Service (SaaS) becomes the norm, one takes an account, tests things out (and we can ask the business to do that, can’t we?) and make[s] a decision. If the service does not address the needs, well, it just costs us an hourly or monthly fee.”
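The economics behind “it just costs us an hourly or monthly fee” are easy to make concrete. The figures below are invented purely for illustration; what matters is the shape of the comparison, not the numbers:

```python
# Invented numbers for illustration only.
license_capex  = 250_000  # up-front package license plus deployment
annual_support = 50_000   # yearly maintenance on the on-premise package
saas_monthly   = 4_000    # per-month subscription for the SaaS option

def three_year_cost_onprem() -> int:
    return license_capex + 3 * annual_support

def three_year_cost_saas() -> int:
    return 36 * saas_monthly

print("on-premise:", three_year_cost_onprem())  # 400000
print("SaaS:", three_year_cost_saas())          # 144000
```

And if the trial shows the service doesn’t fit, the exit cost is a few months of fees rather than a sunk license – exactly the flexibility Verstaete is describing.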
As businesses move into the realm of cloud computing, Verstaete notes that the role of in-house IT departments will change dramatically – as will the role of the Chief Information Officer. He explains:
“The role of the CIO is changing. From being responsible for maintaining an existing IT infrastructure and application environment, he/she now becomes the guardian of the services to be provided to the business. It’s his/her decision to source those services [in] the most appropriate way to deliver them securely and [in a] timely [manner] to the business, taking security and compliance issues into account. Slowly but surely, the CIO becomes the ‘strategic service broker’ to the business. His/her main challenge: transform the IT department, manage the change and ensure he/she has the right resources for the job moving forward.”
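Read that way, the “strategic service broker” makes a routing decision per service. Here is a minimal sketch of the pattern; the service names and the rule set are my own simplifications, not Verstaete’s:

```python
# Simplified broker sketch; rules and names are invented.
SOURCING_RULES = {
    # (is_core, handles_regulated_data) -> sourcing decision
    (True,  True):  "private cloud (in-house)",
    (True,  False): "private cloud (in-house)",
    (False, True):  "hosted private cloud",
    (False, False): "public SaaS",
}

def broker(service: str, is_core: bool, regulated: bool) -> str:
    """Decide where a requested service is sourced, weighing
    core-versus-context status against compliance constraints."""
    return f"{service}: {SOURCING_RULES[(is_core, regulated)]}"

print(broker("pricing-engine", is_core=True,  regulated=False))
print(broker("crm",            is_core=False, regulated=True))
print(broker("email",          is_core=False, regulated=False))
```

A real broker would weigh many more dimensions (latency, data residency, vendor lock-in), but owning exactly this decision table is the CIO’s new job.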
In Part 4 [“Core versus context services,” 18 January 2012], Verstaete goes into more detail about his assertion that in-house IT departments should manage core services while context services should be managed by cloud service providers. He begins by explaining the difference between core and context services. He writes:
“A couple years ago I heard Geoffrey Moore speak about ‘core versus context.’ This is how he defines core: ‘My analysis in a nutshell is that core activities are those that increase the sustainable competitive advantage of a company. Core activities create value for customers in a way that is hard for competitors to replicate, and by doing so increase the market power of the company. Investors notice this, and reward the company with a higher stock price.’ … Obviously Moore recognizes that in the current world, core activities do not remain core very long as successful enterprises are often copied by others. Actually in a recent interview, he adds one important element: ‘Core is what allows a business to make more money and/or more margin, and make people more attracted to a business than to its competitors. Core gives a business bargaining power: it is what customers want and cannot get from anyone else.’ And obviously, context is everything else.”
Verstaete states that his “first rule of thumb is that core services should be kept internal. … The ‘secret sauce’ should remain secret.” When it comes to dealing with Big Data, Verstaete makes an interesting comment. He writes:
“One more aspect to look into is that data may be core. Indeed, enterprises may gather and combine specific information elements, providing them a unique view of the way their customers behave, for example. Such data should be considered core and treated in the same way as core services.”
This may be one area, however, where a company needs outside help to gain the full benefit of its data resources. But an outside provider can establish a private cloud for the company so that it need not fear losing control of a core asset.
In Part 5 [“Do you need multi-tenancy?” 30 January 2012], Verstaete talks about multi-tenancy and why it’s an important subject to understand. He writes:
“According to Wikipedia, multi-tenancy refers to a principle in software architecture where a single instance of the software runs on a server, serving multiple client organizations (tenants). Multi-tenancy is contrasted with a multi-instance architecture, where separate software instances (or hardware systems) are set up for different client organizations. With a multi-tenant architecture, a software application is designed to virtually partition its data and configuration, and each client organization works with a customized virtual application instance. Webopedia has a simpler definition: in cloud computing, multi-tenant is the phrase used to describe multiple customers using the same public cloud. Multi-tenancy enables sharing of resources and costs across a large pool of users, thus allowing for:
• Centralization of infrastructure in locations with lower costs (e.g., real estate, electricity)
• Peak-load capacity increases (users need not engineer for highest possible load-levels)
• Utilization and efficiency improvements for systems that are often only 10-20% utilized
“Actually, all of this sounds nice, but why does this really matter? In trying to maximize the use of our computing assets, we are confronted by two antagonistic elements. On the one hand, we would like to have our information assets used as much as possible to maximize utilization. This may imply people from different departments and/or companies (each being one tenant) using the same assets concurrently. But on the other, we want to ensure we are fully isolated from the others. So, multi-tenancy is all about being able to isolate multiple tenants using the same IT assets.”
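The Wikipedia definition – one software instance whose data is virtually partitioned per tenant – is easiest to see in code. Here is a minimal sketch of row-level tenant isolation using SQLite; the schema and data are invented for illustration:

```python
import sqlite3

# One database instance shared by all tenants; every row is tagged
# with its tenant, and every query is scoped to exactly one tenant.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE invoices (tenant_id TEXT, amount REAL)")
db.executemany(
    "INSERT INTO invoices VALUES (?, ?)",
    [("acme", 100.0), ("acme", 250.0), ("globex", 75.0)],
)

def invoices_for(tenant_id: str) -> list:
    """All reads are filtered by tenant: the 'virtual partition'
    that isolates tenants sharing the same instance."""
    cur = db.execute(
        "SELECT amount FROM invoices WHERE tenant_id = ?", (tenant_id,)
    )
    return cur.fetchall()

print(invoices_for("acme"))    # [(100.0,), (250.0,)]
print(invoices_for("globex"))  # [(75.0,)]
```

Real multi-tenant services layer stronger controls on top of this (separate schemas, per-tenant encryption keys), and how far a given provider goes is precisely what Verstaete says you should find out.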
Verstaete admits that he is “very cautious about shared assets” and recommends that companies understand the level of isolation they require for a specific application. “If you decide to use public cloud services for your application,” he writes, “ensure you understand the multi-tenancy model used by the service provider.” In Part 6 [“One size cloud does not fit all,” 8 February 2012], Verstaete discusses the fact that companies don’t have to choose between a private cloud architecture and a public cloud architecture. He explains that there are a number of architectures from which a company can choose, such as: “[private] cloud, hosted private cloud, virtual private cloud and public cloud.” He discusses each of those options in a little more detail. He concludes, “An enterprise has the option to choose any of those models for the implementation of a particular service.”
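Verstaete’s list of models can be boiled down to a lookup from a service’s isolation requirements to the architectures that satisfy them. The mapping below is my own simplification, loosely following the four models he names; the sharing characteristics are assumptions for illustration:

```python
# Simplified, illustrative mapping of deployment models to what
# they share; the flags are assumptions, not Verstaete's.
MODELS = [
    # (model, shares_infrastructure, shares_software_instance)
    ("public cloud",          True,  True),
    ("virtual private cloud", True,  False),
    ("hosted private cloud",  False, False),
    ("private cloud",         False, False),
]

def candidate_models(allow_shared_infra: bool, allow_shared_instance: bool):
    """Return the deployment models compatible with a service's
    tolerance for shared infrastructure and shared software."""
    return [
        name for name, shared_infra, shared_inst in MODELS
        if shared_infra <= allow_shared_infra
        and shared_inst <= allow_shared_instance
    ]

# A context service with no special isolation needs:
print(candidate_models(True, True))
# A core service that must not share anything:
print(candidate_models(False, False))
```

Choosing per service, as Verstaete suggests, means running this kind of evaluation for each workload rather than once for the whole enterprise.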
I would sum up what Verstaete is saying this way: To gain traction in the new business landscape and remain agile, a company is going to have to migrate some or all of its data and processes into cloud-based applications. Be careful, however, that you don’t expose your core data or processes in public clouds. There are plenty of options that let companies tailor their architectures to their particular needs and requirements.