
Plain-speaking: Data Privacy vs. Data Security – Espionage in the Cloud Age

When it comes to data privacy, data security and espionage, there are plenty of myths and misconceptions. Time and again, false claims circulate, especially in the context of the cloud. Vendors shamelessly exploit customer insecurities by using incorrect information in their PR and marketing activities.

Sowing confusion and misinformation

Headlines like "Oracle invests in Germany for data security reasons" are just one of many examples of information misinterpreted by the media. However, those who should know better – the vendors – are doing nothing to provide clarity. On the contrary, the fears and concerns of users are exploited mercilessly to drive business. For example, Oracle's country manager Jürgen Kunz justifies the two new German data centers by stating that "In Germany data security is a particularly sensitive topic." The NSA card is easy to play these days, simply by adding "… that Oracle, as a US company, stays connected to the German market."

However, the location has nothing to do with data security and the NSA scandal. If an intelligence agency gains access to the data in a data center in Germany, the US, Switzerland or Australia, this has very little to do with the country itself. If the cloud provider sticks to its own global policies for data center security on the physical as well as the virtual level, a data center should provide the same level of security regardless of its location. Storing data in Germany is no guarantee of a higher level of security. A data center in the US, UK or Spain is just as secure as a data center in Germany.

The confusion arises because two different terms are frequently mixed up when it comes to security: data security and data privacy.

What is data security

Data security means implementing all technical and organizational measures needed to ensure the confidentiality, availability and integrity of IT systems.

Public cloud providers offer far better security than a small business is able to achieve on its own. This is due to the investments that cloud providers make to build and maintain their cloud infrastructures: they invest billions of US dollars every year, employ staff with the right mix of skills and have created appropriate organizational structures. Only a few companies outside the IT industry are able to achieve the same level of IT security.

What is data privacy

Data privacy is about the protection of personal rights and privacy during data processing.

This topic causes the biggest headaches for most companies, because legislators do not make it easy for them. A customer has to audit the cloud provider for compliance with the applicable federal data protection act. Here it is advisable to rely on the report of an independent auditor, since it would be time- and resource-consuming for a public cloud provider to be audited by each of its customers.

Data privacy is a very important topic; after all, it concerns sensitive data. However, it is essentially a legal matter that must be enforced by means of data security measures.

The NSA is just a pretext. Espionage is ubiquitous.

Espionage is ubiquitous – yes, even in countries like Germany. And one should not forget that every company could have a potential Edward Snowden in its own ranks. The employee may feel comfortable today, but what happens when he receives a more attractive offer, or when the culture of the team or the company changes? Insider threats present a much greater danger than external attackers or intelligence agencies. The former hacker Kevin Mitnick describes in his book "The Art of Deception" how he obtained all the information needed to prepare his attacks simply by going through his victims' trash and using social engineering techniques. In his case it was more about manipulating people and doing extensive research than about compromising IT systems.

A German data center as protection against espionage by friendly countries is and will remain a myth. Where there is a will, there is a way. When an attacker wants the data, it is only a question of the criminal energy he is willing to expend and the funds he is able to invest. If the technical hurdles are too high, there is still the human factor as an option – and a human is generally "purchasable".

The cloud is the scapegoat!

The cloud is not the issue. Using espionage as an excuse for not adopting cloud services is too easy. Bottom line: in the times before the cloud, the age of outsourcing, spying was also possible – and intelligence agencies did it. Despite the contracts with their customers, providers were able to secretly hand data over to intelligence agencies.

If espionage had been in the spotlight during the age of outsourcing as it is today, outsourcing would have been demonized just the same. Today's discussions are largely a product of the political situation, in which a lack of trust characterizes formerly established economic, military and intelligence partnerships.

Due to the amount of data that cloud providers are hoarding and merging, they have become more attractive targets. Nevertheless, it takes a lot of effort for an outsider to get access to a data center. Andrew Blum describes in his book "Tubes: Behind the Scenes at the Internet" that because of its high connectivity to other countries (e.g. data from Tokyo to Stockholm or from London to Paris), one of the first internet hubs, "MAE-East" (1992), quickly became a target of US espionage. No wonder, since MAE-East was the de facto gateway to the internet. The bottom line is that intelligence agencies do not need to set foot in a single provider's data center – they simply need to tap a connectivity hub to eavesdrop on the data lines.

The so-called "Schengen routing" is discussed in this context. The idea is to keep data traffic within Europe by transferring data only between hosts located in Europe. In theory this sounds interesting; in practice it is completely unfeasible. When using cloud services from US providers, data are routed through the US. If an email is sent from an account at a German provider to an account managed by a US provider, the data have to leave Europe. Moreover, we have been living for many years in a fully interconnected world in which data are exchanged globally – and there is no way back.

A more serious issue is the market power and the clear innovation leadership of the US compared to Europe and Germany. The availability and competitiveness of German and other European cloud services is still limited. As a result, many companies have to use cloud services from US providers. Searching for a solution only at the network layer is pointless as long as competitive cloud services, infrastructure and platforms from European providers are not available. Until then, only data encryption helps to keep intelligence agencies and other criminals from accessing the data.
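
What client-side encryption means in practice can be outlined with a minimal sketch in Python, assuming the third-party cryptography package and its Fernet recipe; the record and the upload step are placeholders, and real deployments would additionally need proper key management.

    # Minimal sketch: encrypt data on-premises before handing it to any cloud provider.
    # Assumes the "cryptography" package (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # stays on-premises, never at the provider
    cipher = Fernet(key)

    record = b"sensitive customer record"
    ciphertext = cipher.encrypt(record)  # only the ciphertext leaves the company

    # ... upload `ciphertext` to the cloud service of your choice ...

    assert cipher.decrypt(ciphertext) == record  # only the key holder can read it

As long as the key never leaves the company, neither the provider nor anyone tapping the line can read the stored data.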

It is imperative that European and German providers develop and market innovative and attractive cloud services, because a German or European data center on its own contributes little to higher data security. It merely offers the benefit of fulfilling the regulatory framework of the German/European data privacy standard.



Cloud Computing Adoption in Germany in 2014

A current Crisp Research study of 716 IT decision makers provides a representative picture of cloud adoption in the German-speaking (DACH) market. In 2014, the cloud has been embraced by IT departments: more than 74 percent of the companies are already planning or implementing cloud services and technologies in production environments. Only about one in four respondents (26 percent) said that the cloud is not an option for their company, now or in the future.

The cloud naysayers are primarily found among small and medium-sized enterprises (more than 50 percent of all cloud naysayers come from companies with up to 500 employees). The share of companies in this segment that are actively planning or implementing cloud technologies is correspondingly low. One reason is the low affinity for IT at enterprises of this size. Many small and medium-sized enterprises still do not understand the different IaaS, PaaS and SaaS options and are not familiar with the cloud services on offer. In addition, compared to bigger companies, smaller ones assign more weight to the risks associated with using cloud services. IT departments of larger companies are better equipped with protective measures and IT security expertise, and therefore feel less exposed to the risk of data misuse or loss.

Cloud naysayers are a dying species in large businesses with at least 5,000 employees, where they account for less than 20 percent. IT managers there are already implementing cloud operating processes and are looking for the most intelligent architectures as well as the most secure systems. For them the question is not "if" but rather "how" they can use cloud technologies.

According to the survey, companies with 5,000 to 10,000 employees are the frontrunners. The share of companies that describe the cloud as a fixed element of their IT strategy and IT operations is around 39.5 percent – the peak in the study. Among companies of this size, the share of cloud naysayers is also the lowest, at only 16.3 percent.

In very large companies with more than 10,000 employees, cloud usage flattens out a little. There are various reasons for this. On the one hand, increasing IT and organizational complexity leads to longer planning and implementation processes. Discussions of security and governance also take correspondingly more time and naturally lead to demands for higher standards, which cloud providers sometimes cannot meet. On the other hand, the tendency to develop in-house applications for individual software solutions and processes is higher in this enterprise category. Stronger financial resources and comprehensive in-house data center capacity allow additional demands to be handled internally, even if the implementation is not as flexible and innovative as internal users and departments require.


Platform-as-a-Service: Strategies, technologies and providers – A compendium for IT decision-makers

"Software is eating the world" – with this sentence in a 2011 article for the Wall Street Journal, Netscape co-founder and famous venture capitalist Marc Andreessen described a trend which is today more than apparent: software will provide the foundation for transforming virtually all industries, business models and customer relationships. Software is more than just a component for controlling hardware: it has become an integral part of the value added in a large number of services and products. From the smartphone app to the car computer. From music streaming to intelligent building control. From tracking individual training data to automatic monitoring of power grids.

60 years after the start of the computer revolution, 40 years after the invention of the microprocessor and 20 years after the launch of the modern internet, the requirements for developing, operating and above all distributing software have changed fundamentally.

But it is not just the visionaries from Silicon Valley who have recognised this structural change. European industrial concerns such as Bosch are in the meantime also investing a large percentage of their research and development budgets in a new generation of software. So we find Bosch CEO Volkmar Denner announcing that by 2020 all equipment supplied by Bosch will be internet-enabled. Because in the Internet of Things, the benefit of products is only partially determined by their design and hardware specification. The networked services provide much of the benefit – or value added – of the product. And these services are software-based.

The implications of this structural shift have been clearly apparent in the IT industry for a few years now. Large IT concerns such as IBM and HP are trying to reduce their dependency on their hardware business and are investing heavily in the software sector.

In the meantime, the cloud pioneer Salesforce, with sales of four billion USD, has developed into one of the heavyweights of the software business. Start-ups like WhatsApp offer simple but extremely user-friendly mobile apps which have gained hundreds of millions of users worldwide within a few years.

State-of-the-art software can cause so-called disruptive changes nowadays. In companies and markets. In politics and society. In culture and in private life. But how are all the new applications created? Who writes all the millions of lines of code? What are the new platforms upon which the thousands upon thousands of applications are developed and operated?

This compendium examines these questions. It focuses particularly on the role that "Platform as a Service" offerings play for developers and companies today. Because although after almost a decade of cloud computing the terms IaaS and SaaS are widely known and many companies use these cloud services, only a few companies have so far gathered experience with "Platform as a Service".

The aim is to provide IT decision-makers and developers with an overview of the various types and potential applications of Platform as a Service. Because there is a wide range of these, extending from designing business processes (aPaaS) right through to the complex integration of different cloud services (iPaaS).

With this compendium, the authors and the initiator, PIRONET NDH, wish to make a small contribution to the better understanding of state-of-the-art cloud-based software development processes and platforms. This compendium is designed to support all entrepreneurs, managers and IT experts who will have to decide on the development and use of new applications and software-based business processes in the coming years. The authors see Platform as a Service becoming one of the cornerstones for implementing the digital transformation in companies, because the majority of the new digital applications will be developed and operated on PaaS platforms.

The compendium can be downloaded free of charge at "Platform-as-a-Service: Strategies, technologies and providers – A compendium for IT decision-makers".


Analyst Strategy Paper: Open Cloud Alliance – Openness as an Imperative

Future enterprise customers will be using a mixture of proprietary on-premises IT, hosted cloud services from local providers and offerings from globally active cloud service providers. This is a major opportunity for the market and all participants involved – especially for small hosters with existing infrastructures, as well as for system integrators with the appropriate know-how and existing customer relations.

In light of this, Crisp Research investigates in this strategy paper the challenges that both provider groups are facing and deals with the most important aspects and possible solutions.

The strategy paper can be downloaded at "Open Cloud Alliance – Openness as an Imperative".


The impact of OpenStack on cloud sourcing

In 2014, German companies will invest around 6.1 billion euros in cloud technologies. Cloud sourcing thus already accounts for seven percent of the total IT budget. Accordingly, cloud ecosystems and cloud marketplaces will gain even greater significance in the future.

Crisp Research predicts that by 2018 around 22 percent of cloud services will be traded via cloud marketplaces, platforms and ecosystems. The basic requirement for this, however, is to eliminate the current weak spots:

  • a lack of comparability,
  • limited transparency,
  • and poor integration.

These are elementary factors for successful cloud sourcing.

Openness vs. Comparability, Transparency and Integration

In bigger companies, the cloud sourcing process and the cloud buying center face a particular complexity, because the cloud environment is based on several operating models, technologies and vendors. On average, smaller companies use five vendors (e.g. for SaaS); large, globally distributed companies deal with over 20 different cloud providers. On the one hand, this shows that hybrid and multi-cloud sourcing is not a trend but reality. On the other hand, it shows that data and system silos remain an important topic even in the cloud era. But how should IT buyers deal with this difficult situation? How can a dynamically growing portfolio be planned and developed for the long term, and how can future viability be guaranteed? These are challenges that should not be underestimated. The reason is obvious: over the last few years, neither cloud providers nor organizations or standardization bodies have been able to establish binding and viable cloud standards.

Without these standards, clouds cannot be compared with one another, so IT buyers lack comparability on both the technical and the organizational level. Contracts and SLAs are one issue in this context; on the technical side, things get even more difficult and riskier. Each cloud infrastructure provider has its own magic formula for how the performance of a single virtual machine is composed. This lack of transparency creates a bigger overhead for IT buyers and increases the costs of planning and tendering processes. The IaaS providers are fighting out their competition on the backs of their customers. Brave new cloud world.

Another problem is the poor integration of common cloud marketplaces and cloud ecosystems. The variety of services on these platforms is growing, but the direct interaction between the different services within a platform has been neglected. Complexity increases further when services are integrated across infrastructures, platforms or marketplaces. Today, deep process integration is not possible without major effort, mostly because each closed ecosystem does its own thing.

Standardization: OpenStack to set the agenda

Proprietary infrastructure foundations can be a USP for the provider, but at the same time they lead to poor interoperability. This causes enormous problems when services are used across providers, increases the complexity for the user and makes it impossible to compare offerings.

Open source technologies put things right in this situation. Thanks to the open approach, several providers take part in such projects in order to push the solution forward and, of course, to represent their own interests. It has become apparent that a governing body is necessary to increase distribution and adoption. The benefit: if more than one provider uses the technology, interoperability across providers improves and the user gains better comparability. In addition, the complexity for the user decreases, and with it the effort of working across providers – for example, setting up hybrid and multi-cloud scenarios.

A big community of interest, in which well-known members push the technology and use it for their own purposes, leads to a de facto standard over time. This is a technical standard which "[…] may be developed privately or unilaterally, for example by a corporation, regulatory body, military, etc."

The open source project OpenStack shows impressively how this works. Since its start in 2010, the framework for building public and private cloud infrastructures has received a lot of attention and maintains strong, constant momentum. By now, OpenStack is the foundation of several public cloud infrastructures and product portfolios, among others from Rackspace, HP, IBM, Cisco and Oracle. Many enterprises have also discovered OpenStack for their private cloud environments, e.g. Wells Fargo, PayPal, Bloomberg, Best Buy and Walt Disney.

Because of the open approach and the continuous development by a huge and potent community (a new version is released every six months), OpenStack is a reliable and trustworthy choice for IT infrastructure managers. Professional distributions are helping to increase the footprint on the user side and will ensure that more and more IT decision makers at bigger companies build their cloud infrastructure on OpenStack in the future.

This positive development has also reached Germany. The results of a current Crisp Research study ("OpenStack im Unternehmenseinsatz", in German) show that almost 50 percent of cloud users know OpenStack, and 29 percent of cloud users already engage actively with it.

The OpenStack ecosystem keeps growing and is thus pushing standardization in the cloud. As a result, IT buyers gain more room to maneuver when purchasing cloud resources from several providers. They should keep in mind, however, that their IT architectures will have to be decoupled even further from the underlying infrastructure in the future in order to move applications and workloads across providers on demand. Container technologies like Docker – supported by OpenStack – are pushing this trend, as the sketch below illustrates.
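
How containers decouple a workload from the provider underneath can be outlined with a minimal sketch, assuming the Python Docker SDK (the docker package) and two reachable Docker daemons; the host addresses are purely illustrative placeholders, and a real remote daemon would be TLS-protected.

    # Minimal sketch: run the same container image unchanged against two
    # different Docker hosts, e.g. an on-premises machine and a cloud VM.
    import docker

    hosts = {
        "on-premises": "unix://var/run/docker.sock",
        "cloud-provider": "tcp://203.0.113.10:2375",  # hypothetical remote daemon
    }

    for name, url in hosts.items():
        client = docker.DockerClient(base_url=url)
        output = client.containers.run("alpine:3", ["echo", "hello from " + name],
                                       remove=True)
        print(name, output.decode().strip())

Because the image carries its dependencies with it, moving the workload means little more than pointing the client at a different daemon.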

Think across marketplaces

Cloud marketplace providers should act in the interest of their customers and, instead of relying on proprietary technology, also bet on open source technologies or a de facto standard like OpenStack. In doing so, they enable interoperability between cloud service providers as well as between marketplaces, and create the conditions for a comprehensive ecosystem in which users gain better comparability and the ability to build and manage true multi-cloud environments. This is the groundwork that empowers IT buyers to benefit from the strengths of individual providers and the best offerings on the market.

Open approaches like OpenStack foster the future ability of IT buyers to act across provider and data center borders. This makes OpenStack an important cloud sourcing driver – provided all parties involved commit to a common standard. In the interest of the users.


Data Center: Hello Amazon Web Services. Welcome to Germany!

Analysts had already written about it, and now the advance but unconfirmed reports have finally become reality: Amazon Web Services (AWS) has opened a data center in Germany (region: eu-central-1). It is aimed especially at the German market and is AWS' 11th region worldwide.

Data centers in Germany are booming

These days it is fun to be a data center operator in Germany. Not only are these "logistics centers of the future" getting more and more attention because of the cloud and the digital transformation; the big players of the IT industry are also increasingly reaching out to German companies. After Salesforce announced its arrival for March 2015 (in partnership with T-Systems), Oracle and VMware followed.

Contrary to the widespread opinion that data is more secure on German soil, these strategic decisions have nothing to do with data security for customers. A German data center on its own offers no higher data security; it merely provides the benefit of fulfilling the legal requirements of the German data privacy level.

However, from a technical perspective, location is of great importance. Due to the continuous relocation of business-critical data, applications and processes to external cloud infrastructures, IT operating concepts (public, private, hybrid) as well as network architectures and connectivity strategies are changing significantly for CIOs. On the one hand, modern technology is required to deliver applications in a performant, stable and secure manner; on the other hand, the location is decisive for optimal "cloud connectivity". It is therefore important to understand that the quality of a cloud service depends significantly on its connectivity and backend performance. A cloud service is only as good as the connectivity that delivers it. Cloud connectivity – low latency as well as high throughput and availability – is becoming a critical competitive advantage for cloud providers.

Now Amazon Web Services, too

Despite concrete technical hints, AWS executives had wrapped themselves in a cloak of secrecy. After all the denials, it is now official: AWS has opened a new region, "eu-central-1", in Frankfurt. The region consists of two availability zones (two separate data center locations) and offers the full functionality of the Amazon cloud. The new cloud region is already operational and can be used by customers. With the Frankfurt location, Amazon opens its second region in Europe (alongside Ireland). This allows customers to build a multi-region setup within Europe to achieve higher availability of their virtual infrastructure, from which the uptime of their applications also benefits.
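
What such a multi-region setup looks like from the developer's side can be outlined with a minimal sketch, assuming boto3 (the AWS SDK for Python) and configured AWS credentials; the same code merely switches the region identifier between Frankfurt and Ireland.

    # Minimal sketch: address the new Frankfurt region alongside Ireland.
    import boto3

    for region in ("eu-central-1", "eu-west-1"):  # Frankfurt and Ireland
        ec2 = boto3.client("ec2", region_name=region)
        zones = ec2.describe_availability_zones()["AvailabilityZones"]
        print(region, [z["ZoneName"] for z in zones])

Listing the availability zones per region is only an illustration; the same pattern applies to launching instances or replicating data across both European regions.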

Frankfurt is not an unusual choice of location for cloud providers. On the infrastructure side, Frankfurt am Main is the backbone of digital business in Germany. In terms of data center density and connectivity to central internet hubs, Frankfurt is the leader in Germany and Europe. The continuous relocation of data and applications to external cloud provider infrastructures has made Frankfurt the stronghold of cloud computing in Europe.

In order to help its customers both with data privacy requirements and on the technical side (cloud connectivity), Amazon has made the right decision regarding the German data center landscape. A first step was already taken in May this year, when the partnership with NetApp for building hybrid cloud storage scenarios was announced.

Serious German workloads on the Amazon Cloud

A good indication of the appeal of a provider's infrastructure is its reference customers. From the beginning, AWS focused on startups; in the meantime, however, it is doing everything it can to attract enterprise customers as well. Talanx and Kärcher are already two well-known customers from the German business landscape.

The insurance company Talanx has moved the reporting and calculation of its risk scenarios into the Amazon cloud (developed from scratch). This allows Talanx to take its risk management out of its own data center while remaining Solvency II compliant. According to Talanx, it gains a time advantage as well as annual savings of eight million euros. The corporation and its CIO Achim Heidebrecht are already evaluating further applications to move into the Amazon cloud.

Kärcher, the world's leading manufacturer of cleaning systems, is using the Internet of Things (machine-to-machine communication) to improve its business model. To optimize the utilization of its worldwide cleaning fleets, Kärcher relies on the global footprint of Amazon's cloud infrastructure. Kärcher's machines regularly send information into the Amazon cloud for processing, and Kärcher also provides information to its worldwide partners and customers via the Amazon cloud.

Strategic vehicle: AWS Marketplace

Software AG is the first well-known traditional ISV (independent software vendor) on the Amazon cloud. Its popular BPM tool Aris is now available as "Aris-as-a-Service" (SaaS) and scales on the Amazon cloud infrastructure.

Software AG is only one example, and several German ISVs could follow. The global scalability of Amazon's cloud infrastructure makes it an especially attractive partner, capable of quickly delivering SaaS applications to target customers beyond the German market. The AWS Marketplace plays a key role in this context: AWS owns a marketplace infrastructure that ISVs can use to offer their solutions to a broad, global audience. The benefits for ISVs are being able to:

  • develop directly on the Amazon cloud without the need for their own (global) infrastructure,
  • develop solutions "as a service" and distribute them via the AWS Marketplace,
  • use the popularity and reach of the marketplace.

This scenario means one thing for AWS: the cloud giant wins in any case. As long as infrastructure resources are being used, the money keeps rolling in.

Challenges of the German market

Despite their innovation leadership, public cloud providers like AWS are having a hard time with German companies, especially the powerful Mittelstand. For the Mittelstand, the self-service model and the complexity of using cloud infrastructure are among the main reasons for avoiding the public cloud. In addition, even though it has nothing to do with the cloud itself, the NSA scandal has left psychological scars on German companies. Data privacy concerns around US providers are the icing on the cake.

Nevertheless, AWS has done its homework with the data center in Frankfurt. However, to be successful in the German market, several tasks remain:

  • Build a potent partner network to appeal to the mass of German enterprise customers.
  • Reduce complexity by simplifying the use of the scale-out concept.
  • Strengthen the AWS Marketplace to ease the use of scalable standard workloads and applications.
  • Increase the attractiveness for German ISVs.

Top 15 Open Source Cloud Computing Technologies 2014

Open source technologies have a long history. Linux, MySQL and the Apache web server are among the most popular and successful technologies brought forth by the community. Over the years, open source experienced a wave of hype and, driven by developers, moved into corporate IT. Today, IT environments are no longer conceivable without open source technologies. Driven by cloud computing, open source is presently gaining strong momentum.

Several projects launched in the recent past have significantly influenced the cloud computing market, especially when it comes to the development, setup and operations of cloud infrastructure, platforms and applications. What are the hottest and most important open source technologies in the cloud computing market today? Crisp Research has examined and classified the “Top 15 Open Source Cloud Computing Technologies 2014” in order of significance.

OpenStack to win

Openness and flexibility are among the top five criteria for CIOs when selecting open source cloud computing technologies. At the same time, standardization is becoming increasingly important and serves as one of the biggest drivers for the deployment of open source cloud technologies. It is not without reason that OpenStack qualifies as the upcoming de facto standard for cloud infrastructure software. Crisp Research advises building modern and sustainable cloud environments based on the principles of openness, reliability and efficiency. Especially in the areas of openness and efficiency, open source makes a significant contribution. With this in mind, CIOs set the stage for implementing multi-cloud and hybrid cloud infrastructure scenarios and assist the IT department in introducing and enforcing a holistic DevOps strategy. DevOps, in particular, plays a crucial role in the adoption of Platform-as-a-Service and the development of applications for the cloud, and leads to significant speed advantages, which in turn affect the competitive strength of the business.

The criteria for assessing the top 15 open source cloud computing technologies include:

  • Innovation and release velocity
  • Development of the community including support of large suppliers
  • Adoption rate of innovative developers and users

In consulting projects, Crisp Research specifically identifies leading users who employ modern open source technologies to run their own IT environments efficiently and in a future-oriented way across different scenarios.

The “Top 5 Open Source Cloud Computing Technologies 2014”:

  1. OpenStack
    In 2014, OpenStack is already the most important open source technology for enterprises and developers. Over 190,000 individuals in over 144 countries worldwide already support the infrastructure software, and its popularity among IT manufacturers and vendors increases steadily. OpenStack serves as the foundation for a continuously growing number of public, private and managed infrastructure environments. Organizations in particular have used OpenStack to build their own private clouds, and IT providers like Deutsche Telekom (Business Marketplace) use it to build their cloud platforms. Today only a few developers have direct contact with OpenStack. Nevertheless, the solution is highly relevant for them, since platforms like Cloud Foundry or access to container technologies like Docker are often delivered on top of OpenStack. In other cases, developers access the OpenStack APIs to build their applications directly on top of the infrastructure.
  2. Cloud Foundry
    In the growing platform-as-a-service (PaaS) market, Cloud Foundry has moved into a leading position. The project was initiated by Pivotal, a spin-off of EMC/VMware. Cloud Foundry is mostly used by organizations to deploy a private PaaS environment for internal developers, while managed service providers use it to offer PaaS in a hosted environment. The PaaS project works perfectly together with OpenStack to build highly available and scalable PaaS platforms.
  3. KVM
    KVM (Kernel-based Virtual Machine) is the preferred hypervisor of infrastructure solutions like OpenStack or openQRM and enjoys a high priority within the open source community. KVM represents a cost-efficient and especially powerful alternative to commercial offerings like VMware ESX or Microsoft Hyper-V. KVM has a market share of about 12 percent, largely because Red Hat uses this hypervisor as the foundation for its virtualization solutions. Over time, KVM as the standard hypervisor will become closely intertwined with OpenStack, as CIOs are presently searching for cost-effective ways to virtualize their infrastructure.
  4. Docker
    This year's shooting star is Docker. The container technology, which was created as a byproduct during the development of the platform-as-a-service dotCloud, currently enjoys strong momentum and gets support from large players like Google, Amazon Web Services and Microsoft – for a good reason. Docker enables applications bundled in containers to be moved, loosely coupled, across several Linux servers, thus improving application portability. At first glance, Docker looks like a pure developer tool. From the point of view of an IT decision-maker, however, it is definitely a strategic tool for optimizing modern application deployments. Docker helps to ensure the portability of an application, to increase availability and to decrease the overall risk.
  5. Apache Mesos
    Mesos has risen to become a top-level project of the Apache Software Foundation. It was conceived at the University of California, Berkeley and helps to run applications in isolation from one another while distributing them dynamically across several nodes within a cluster. Mesos can be used together with OpenStack and Docker. Popular users are Twitter and Airbnb. One of the driving forces behind Mesos is the German developer Florian Leibert, who was also jointly responsible for implementing the cluster technology at Twitter.

Open Source is eating the license-based world

Generally, proprietary players such as IBM, HP and VMware are cozying up to open source technologies. HP's first cloud offering, "HP Cloud", was already based on OpenStack, and with the HP Helion Cloud the whole cloud portfolio (public, private) has been harmonized around OpenStack. In addition, HP has become the biggest code contributor to the upcoming OpenStack "Juno" release, which is due in October. IBM contributes to OpenStack and uses Cloud Foundry as the foundation for its PaaS "Bluemix". At VMworld in San Francisco, VMware announced tighter cooperation with OpenStack as well as with Docker. In this context, VMware will present its own OpenStack distribution (VMware Integrated OpenStack, VIO) in Q1/2015, which enables an OpenStack implementation to be set up on top of VMware vSphere. The Docker partnership is to ensure that the Docker engine runs on VMware Fusion as well as on servers with VMware vSphere and vCloud Air.

Open source solutions like OpenStack are attractive not only for technical reasons. From a financial perspective, OpenStack also makes an essential contribution, as the open source framework significantly reduces the cost of building and operating a cloud infrastructure. The license costs for current cloud management and virtualization solutions amount to around 30 percent of the overall cloud TCO. This means that numerous start-ups and large, well-respected software vendors like Microsoft and VMware make a good chunk of their business by selling licenses for their solutions. With OpenStack, CIOs gain the opportunity to provision and manage their virtual machines and cloud infrastructure using open source technologies. To support this, free community editions as well as professional, enterprise-ready distributions including support are available – in both cases options to significantly reduce the license costs of operating cloud infrastructures. OpenStack gives CIOs a valuable tool to exert pressure on Microsoft and VMware.


Analyst Strategy Paper: The significance of Frankfurt as a location for Cloud Connectivity

Due to the continuous relocation of business-critical data, applications and processes to external cloud infrastructures, IT operating concepts (public, private, hybrid) as well as network architectures and connectivity strategies are changing significantly for CIOs. On the one hand, modern technology is required to deliver applications in a performant, stable and secure manner; on the other hand, the location is decisive for optimal "cloud connectivity".

Against this background, Crisp Research assesses in this strategy paper the role of Frankfurt as a data center location and connectivity hub.

The strategy paper can be downloaded free of charge at "The significance of Frankfurt as a location for Cloud Connectivity".


Taking the lead – OpenStack is an important strategic investment for HP

Since the release of the HP Helion Cloud, HP's entire cloud computing portfolio has used OpenStack as the foundation for both its public and private cloud offerings. HP has been an active OpenStack contributor since the first release, "Austin". The company contributed 13 percent of the code to "Austin", then throttled back for "Bexar" and "Cactus". With "Diablo", the momentum picked up again but remained under 4 percent.

"Essex" can be seen as the turnaround in HP's strategy toward treating OpenStack as a more important project. HP contributed 7 percent of the overall code for the Essex release. For the next release, "Folsom" (16 percent), the pace accelerated, and for "Grizzly" it increased by one percentage point to 17 percent.

The contribution decreased for "Havana" (16 percent) and "Icehouse" (11 percent). But since development started on OpenStack "Juno", HP has become the number one contributor. Compared to the current "Icehouse" release, the code contribution has already risen by 9 percentage points for "Juno" (20 percent), which will be released in October 2014.

Driving force for a good reason

With respect to the overall OpenStack contribution, HP has so far delivered 13 percent of the OpenStack code. This makes HP the third largest code contributor after Rackspace (18 percent) and Red Hat (16 percent).

HP has a good reason to become the driving force behind OpenStack. Since its whole cloud computing portfolio depends entirely on OpenStack, HP must increase its influence on the project. This can be accomplished through a seat in the OpenStack Foundation (HP is already a Platinum Member) as well as through the development of its own code and ideas. HP already has an advantage stemming from its substantial experience in infrastructure management and software.

HP has been part of the project since the first OpenStack release, and the original HP Cloud (the predecessor of the Helion Cloud) was also based on OpenStack. The numbers clearly speak for themselves: HP has finally recognized OpenStack as an important strategic investment and is doing well to raise its investment in it. For the next 12 months, Crisp Research expects 25 percent growth for OpenStack-based enterprise private clouds. Here, HP can certainly play a leading role.

– –
* The numbers are based on official statistics from the OpenStack Foundation.


OpenStack Deployments Q3/2014: Private Clouds are far ahead – Hybrid Cloud is rising

The momentum of the open source cloud infrastructure software OpenStack remains strong. Along with the continuous improvement of the technology (current release: Icehouse), adoption by enterprise customers shows very healthy growth rates. This is clear from the worldwide growth of OpenStack projects. Although the number of new projects in Q3 went up by only 30 percent compared to Q2, which had exhibited an impressive 60 percent growth over Q1, the appeal remains unabated.

Here, on-premises private clouds are by far the preferred deployment model. In Q2 2014, the OpenStack Foundation counted 85 private cloud installations worldwide; in Q3, the number had already grown to 114. By comparison, the number of OpenStack public clouds worldwide grew by a mere 7 percent, falling steeply from the 70 percent jump just a quarter before. Hybrid cloud projects show the biggest growth, at 57 percent. For the next 12 months, Crisp Research expects 25 percent growth for OpenStack-based enterprise private clouds.

– –
* The numbers are based on official statistics from the OpenStack Foundation.