
AI Becomes the Game Changer in the Public Cloud

Today's artificial intelligence (AI) hype would not exist without cloud computing. Only easy access to innovative cloud-based AI services (machine learning etc.) and the fast availability of the necessary computing power enable the development of novel “intelligent” products, services and business models. At the same time, AI services drive the growth of public cloud providers like Amazon Web Services, Microsoft and Google. One can thus observe a “cloud-AI interdependency”.

After more than ten years, cloud computing has evolved into a thriving business for providers such as Amazon Web Services and Microsoft. However, competition from latecomers like Google and Alibaba is getting stronger. And with the massive, ongoing introduction of AI-related cloud services, the providers have raised the competitive pressure themselves in order to become more attractive to their customers.

The Cloud Backs AI and Vice Versa

Building and operating powerful, highly scalable AI systems is an expensive undertaking for companies of any size. After all, training algorithms and subsequently operating the corresponding analytics systems require oodles of computing power. Providing that computing power in the right amount and at the right time from one's own basement, server room or data center is practically impossible, especially since much of that capacity is no longer needed once training is complete.

Looking at Amazon, Microsoft and Google, all three providers have built up an enormous amount of computing power in recent years and each owns a big stake of the 40 billion USD cloud computing industry. For all of them, expanding their portfolios with AI services is the next logical step in the cloud. On the one hand, developing AI applications, or enhancing existing applications with intelligence, requires easy access to computing power, data, connectivity and complementary platform services. On the other hand, the providers need to remain attractive to existing customers and win new ones. Both groups are looking for accessible solutions to integrate AI into their applications and business models.

Amazon Web Services

Amazon Web Services (AWS) is not only the cloud pioneer and innovation leader, but still by far the market leader in the worldwide public cloud market. Right now, AWS is the leading cloud environment for developing as well as deploying cloud and AI applications, due to its scalability and comprehensive set of platform services. Among other announcements, AWS presented AWS Cloud9 (based on the acquisition of Cloud9 IDE Inc. in July 2016) at the recent re:Invent summit, a cloud-based development environment that is directly integrated into the AWS cloud platform for developing cloud-native applications. Moreover, AWS announced six machine learning as a service (MLaaS) offerings, including a video analysis service, an NLP service and a translation service. In addition, AWS offers MXNet, Lex, Rekognition and SageMaker, powerful services for the development of AI applications. SageMaker in particular attracts attention, since it helps to control the entire lifecycle of machine learning applications.
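
To illustrate how low the entry barrier to these MLaaS offerings is, here is a minimal sketch (not an official AWS sample) that calls two of them via the boto3 Python SDK; the region, bucket name and object key are placeholder assumptions, not values from the text.

```python
# Hypothetical example: label detection with Amazon Rekognition and
# text translation with Amazon Translate via boto3. Bucket, key and
# region are placeholders.
import boto3

# Detect labels in an image that is stored in S3.
rekognition = boto3.client("rekognition", region_name="us-east-1")
labels = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-example-bucket", "Name": "photo.jpg"}},
    MaxLabels=5,
)
for label in labels["Labels"]:
    print(label["Name"], label["Confidence"])

# Translate a sentence from English to German.
translate = boto3.client("translate", region_name="us-east-1")
result = translate.translate_text(
    Text="Cloud computing enables artificial intelligence.",
    SourceLanguageCode="en",
    TargetLanguageCode="de",
)
print(result["TranslatedText"])
```

A few lines of SDK code replace what would otherwise require building and operating dedicated computer vision and machine translation systems in-house.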

However, as with all cloud services, AWS pursues the lock-in approach with AI-related services as well. All AI services are tightly meshed with the AWS environment to make sure that AWS remains the operating platform after an AI solution has been developed.
Amazon also sticks to its so far successful strategy. After Amazon made the technologies behind its massively scalable e-commerce platform publicly available as a service via AWS, the technologies behind Alexa, for example, have followed to help customers integrate their own chatbots or voice assistants into their applications.

Microsoft

Microsoft has access to a broad customer base in the business environment. This, along with a broad portfolio of cloud and AI services, offers good preconditions for establishing itself as a leading AI market player as well. Particularly because of its comprehensive offering of productivity and business process solutions, Microsoft could be high on the agenda of enterprise customers.

With products like Windows, Office 365 or Dynamics 365, Microsoft sits deep inside the digital ecosystems of companies worldwide. And that is exactly where the data reside and the data flows occur that can be used to train machine learning algorithms and build neural networks. Microsoft Azure is the central hub where everything comes together, and it provides the necessary cloud-based AI services for executing a company's AI strategy.
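
As a hedged illustration of what consuming such a service looks like in practice, the following sketch calls an Azure Cognitive Services text sentiment endpoint over REST; the region, API version and subscription key are placeholder assumptions, not details from the text.

```python
# Hypothetical example: sentiment analysis via an Azure Cognitive
# Services REST endpoint. Region, API version and key are placeholders.
import requests

endpoint = ("https://westeurope.api.cognitive.microsoft.com"
            "/text/analytics/v2.0/sentiment")
headers = {"Ocp-Apim-Subscription-Key": "<your-subscription-key>"}
payload = {
    "documents": [
        {"id": "1", "language": "en",
         "text": "Our customers love the new self-service portal."}
    ]
}

response = requests.post(endpoint, headers=headers, json=payload)
for doc in response.json()["documents"]:
    # A score close to 1.0 indicates positive sentiment.
    print(doc["id"], doc["score"])
```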

Google

In the cloud, Google is still behind AWS and Microsoft. However, AI could become the game changer. Comparing today's Google AI services portfolio with those of AWS and Microsoft, Google is the clear laggard among the innovative providers of public cloud and AI services. This is astounding if you consider that Google has invested USD 3.9 billion in AI so far, compared to USD 871 million by Amazon and only USD 690 million by Microsoft. Google simply lacks consistent execution.

But! Google already has over 1 million AI users (mainly through the acquisition of the data science community “Kaggle”) and owns a lot of AI know-how (among other things due to the acquisition of “DeepMind”). Moreover, among developers Google is considered the most powerful AI platform with the most advanced AI tools. Furthermore, TensorFlow is the leading AI engine and, for developers, the most important AI platform, serving as the foundation of numerous AI projects. In addition, Google has developed its own Tensor Processing Units (TPUs), which are specifically adapted for use with TensorFlow. Recently, Google announced Cloud AutoML, an MLaaS offering that targets inexperienced machine learning developers and helps them create deep learning models.
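
To show why developers gravitate toward TensorFlow, here is a deliberately tiny sketch of its high-level programming model; the model architecture and the random training data are illustrative assumptions only.

```python
# Minimal TensorFlow sketch: a two-layer classifier trained on random
# data, purely to demonstrate the programming model.
import numpy as np
import tensorflow as tf

x = np.random.rand(256, 8).astype("float32")  # 256 samples, 8 features
y = (x.sum(axis=1) > 4.0).astype("float32")   # synthetic binary labels

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=5, batch_size=32, verbose=0)

print(model.predict(x[:3]))  # class probabilities for three samples
```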

And if you keep in mind where Google has a finger in the pie via its Android OS (e.g. smartphones, home appliances, smart homes or cars), the potential of AI services running on the Google Cloud Platform becomes clearly visible. The only downer is that Google is still only able to serve developers. The decisive access to enterprise customers, something that Microsoft owns, is still missing.

AI Becomes the Game Changer in the Public Cloud

The AI platform and services market is still at an early stage. But in line with the increasing demand to serve their customers with intelligent products and services, companies will continue to search for the necessary technologies and support. And only easy access to cloud-based AI services, as well as to quickly available computing power, makes the development of novel “intelligent” products, services and business models feasible. Hence, for enterprises it makes little sense to build in-house AI systems, since it is nearly impossible to operate them in a performant and scalable way. Moreover, it is important not to underestimate the access to globally distributed devices and to the data that have to be analyzed. Only globally scalable and well-connected cloud platforms can deliver this.
For providers, AI could become the game changer in the public cloud. After AWS and Microsoft took the lead, Google was not able to significantly catch up. However, Google's AI portfolio could make a difference. TensorFlow in particular, and its popularity among developers, could play into Google's hands. But AWS and Microsoft are aware of this and are acting together against it: “Gluon” is an open source deep learning library both companies have developed together, and it looks quite similar to TensorFlow. In addition, AWS and Microsoft provide a broad range of AI engines (frameworks) rather than just TensorFlow.
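
For comparison, here is a minimal sketch of the Gluon programming model on top of Apache MXNet; the network definition is arbitrary and only meant to show how closely the imperative style resembles other high-level deep learning APIs.

```python
# Minimal Gluon sketch: define a small network, initialize it lazily,
# and run one forward pass on random input.
from mxnet import nd
from mxnet.gluon import nn

net = nn.Sequential()
net.add(
    nn.Dense(16, activation="relu"),  # hidden layer
    nn.Dense(1),                      # output layer
)
net.initialize()  # shapes are inferred on the first forward pass

x = nd.random.uniform(shape=(4, 8))  # a batch of 4 samples, 8 features
print(net(x))                        # forward pass
```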

It is doubtful that AI services alone are enough for Google to catch up with AWS. But Microsoft could quickly feel the competition. For Microsoft it is crucial how fast it can convince its enterprise customers of its AI services portfolio, and at the same time convey how important other Microsoft products (e.g. Azure IoT) are and that they should be considered in the AI strategy. AWS is going to stick to its dual strategy, focusing on developers as well as enterprise customers, and will continue to lead the public cloud market. AWS will be the home for all those who do not want to rely solely on TensorFlow – in particular cloud-native AI users. And not to forget its large customer base that is innovation-oriented and aware of the benefits of AI services.


Figure: Machine Learning Cloud Provider Overview


Figure: Chatbot Cloud Provider Overview


Interview: Innovation and scalability in the public cloud

In this interview with Cloud Era Institute, I discuss the growing trend of companies opting for the public cloud to leverage scalable infrastructure and global reach. I also share how vendor lock-in contributes to innovation. I draw an important distinction between data privacy and data security, and explain why the public cloud is a shared responsibility.

What trends are you seeing in the public cloud right now?

The first trend is that the public cloud is growing because enterprises need innovation and global scalability. Previously, enterprises talked about building private cloud environments, but they soon realized the financial impact of building a massive, scalable infrastructure. Last year, Amazon Web Services (AWS) released over seven hundred new services and functionalities, something that would not be possible for a private cloud or a web hosting company. Amazon and Microsoft Azure are investing heavily in innovations at the infrastructure level, in data center operations, and in new services.

Another big trend we are seeing is containers like Docker, which have gained momentum because of the importance of portability. With Docker, you can move workloads to different cloud providers. You can encapsulate your application and its dependencies in a container, then move everything from one system to another.
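
A hedged sketch of this idea using the Docker SDK for Python (the “docker” package); the image tag and build path are placeholder assumptions.

```python
# Hypothetical example: build an image once, run it anywhere a Docker
# engine is available. Tag and path are placeholders.
import docker

client = docker.from_env()  # connects to the local Docker engine

# Package the application and its dependencies into an image.
image, build_logs = client.images.build(path=".", tag="myapp:1.0")

# The same image can be pushed to a registry and run unchanged on any
# cloud provider's container service.
container = client.containers.run("myapp:1.0", detach=True)
print(container.id)
```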

A third trend is microservices, such as Azure Machine Learning or the AWS Simple Notification Service (SNS). You can use the microservice approach to compose your own powerful application.
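
As a small, hedged illustration of this composition model, the sketch below publishes an event to an SNS topic with boto3; the topic name, region and message contents are illustrative assumptions.

```python
# Hypothetical example: decoupling microservices with Amazon SNS.
# One service publishes an event; subscribed services receive it.
import json
import boto3

sns = boto3.client("sns", region_name="eu-west-1")

# Create (or look up) a topic that decouples publisher and consumers.
topic = sns.create_topic(Name="order-events")

# Publish an event; subscribers (queues, functions, HTTP endpoints)
# are notified without the publisher knowing about them.
sns.publish(
    TopicArn=topic["TopicArn"],
    Message=json.dumps({"orderId": "4711", "status": "created"}),
)
```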

Netflix is an on-demand video streaming platform with massive scalability and a highly available application on top of AWS, built on a microservices architecture. Their application is always running because when one microservice has a problem, the others remain unaffected. It is a single application running on top of AWS and connected via a public API.

CIOs and developers typically don't like vendor lock-in, but I believe it helps with innovation. If you are using an iPhone, then you are totally locked into the Apple environment, and you love it. Apple is able to innovate because it has a closed ecosystem. It's the same with AWS and Azure, since they also have service lock-in. This is how companies are able to innovate.

What has been the biggest challenge for businesses in the public cloud?

For Germany, Europe, and the U.S., it is data privacy and security. It is important to separate data privacy from data security. Data privacy is about legal issues and ensuring that you comply with the law. Data security means that data is stored securely so nobody can access it without authorization.

Germany thinks its data centers are more secure than those in the U.S., which is not true. Data centers in Germany, the U.S., or Australia have the same physical security. When it comes to data security, it is no big deal to store data in the U.S.

Another big issue is a lack of cloud knowledge. The cloud has been around for more than ten years, yet there is still a global lag. Many people do not understand how to create applications that can be used on cloud platforms, from the design to the architecture, microservices, and containers.

Public clouds are shared-responsibility environments. There is a lack of knowledge about this as well. A public cloud provider is only responsible for the physical infrastructure and for ensuring that virtual infrastructure can be deployed.

Everything on top of the virtual infrastructure belongs to the customer. In the public cloud, the customer has to create their own virtual infrastructure, for example on top of AWS, and then has to run systems and applications on top of it. Simply firing up a virtual machine is not cloud; the application and the virtual infrastructure must also be scalable.
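
A hedged sketch of that difference: instead of firing up a single virtual machine, the customer defines scaling rules so that the virtual infrastructure grows and shrinks with demand. The names, AMI ID and availability zones below are placeholder assumptions.

```python
# Hypothetical example: a scalable group of instances instead of a
# single hand-started VM, using boto3. All identifiers are placeholders.
import boto3

autoscaling = boto3.client("autoscaling", region_name="eu-west-1")

# A launch configuration describes what each instance looks like.
autoscaling.create_launch_configuration(
    LaunchConfigurationName="web-config",
    ImageId="ami-12345678",  # placeholder AMI
    InstanceType="t2.micro",
)

# The auto scaling group keeps between 2 and 10 instances running,
# adding or removing machines as load changes.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-group",
    LaunchConfigurationName="web-config",
    MinSize=2,
    MaxSize=10,
    AvailabilityZones=["eu-west-1a", "eu-west-1b"],
)
```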

What have you observed about marketing as it relates to the cloud?

It is not only a cloud issue; many companies simply do not focus on content marketing. It is better to market your products with good content, not just advertisements. Unique content is just as important as having an expert voice contributing to it. It is better to let the experts write, not the marketing people.

– – –
The interview with Cloud Era Institute has been published under the title “Global technology expert, Rene Buest, on innovation and scalability in the public cloud“.


Interview: Public Cloud Services – In search of the white knight

Hybrid and multi-cloud strategies are near the top of the agenda for IT decision-makers. They understand that a modern, cloud-based IT world shouldn't just be drawn in black and white. Diversity is needed in order to purchase services and innovations from a larger number of cloud providers. Private clouds quickly reach their limits here and don't offer the benefits of a public cloud.

What is the optimal strategy for using public cloud services in enterprise IT? Find the answers in an interview with T-Systems in “Public cloud services: in search of the white knight”.

Mr. Buest, public, private, hybrid: when does which cloud offering become relevant for a company?
There's no catch-all answer here. We are now seeing an increasing number of companies that are intensely engaging with the public cloud and following an “all in” approach. This means they no longer run a local IT infrastructure or internal data centers; instead, they migrate everything to public cloud infrastructures or platforms, or purchase what they need under a SaaS (Software-as-a-Service) model. However, these companies are still a minority.

…and that means?
At the moment, most companies prefer to use private cloud environments. It’s a logical consequence of the legacy solutions that companies still maintain in their IT. However, we believe that in the future, a majority of German companies will move to hybrid or multi-cloud architectures, enabling them to cover all the facets they need for their digital transformation.

And how can companies coordinate these different solutions in combination?
By using cloud management solutions that have interfaces to the most common public cloud offerings as well as to private cloud solutions. They provide powerful tools for managing workloads in different environments and shifting virtual machines, data and applications around. Another option for seamless management is integration Platform as a Service (iPaaS), which provides cloud-based integration solutions. In the pre-cloud era, such solutions were also called “middleware”. They support the interaction between different cloud services.

What do companies principally have to watch out for when using these cloud services?
They should not underestimate the lack of understanding of the public cloud, nor the challenges associated with setting up and operating multi-cloud environments. The benefits gained from using multi-cloud infrastructures, platforms and services often come at a heavy price: namely, the costs that result from the complexity, integration, management and necessary operations. Multi-cloud management and a general lack of cloud experience are currently the key challenges many companies are facing.

What is the solution?
Managed public cloud providers (MPCPs) are positioning themselves as “white knights” or “friends in need”. They develop and operate the systems, applications and virtual environments for their customers – in both the public cloud infrastructures and multi-cloud environments – in a managed cloud service model.

– – –
The interview with T-Systems has been published under the title “Public cloud services: What really matters“.


Expert Panel: “Digitizing the Energy World” at AWS Summit Berlin 2016

An expert panel discussion on “Digitizing the Energy World” at the AWS Summit 2016 in Berlin, together with Werner Vogels (CTO, Amazon), Jan-Hendrik Sewing (President Lifecycle Services, Siemens) and Martin Goetz (CEO, RWE IT CZ).


Analyst Strategy Paper: Service Management in the Public Cloud

In the coming years, the Public Cloud will inevitably continue to take hold. From a technical point of view, the use of dynamic infrastructure is the only means to respond to ever-changing market situations and to address them in a proactive fashion.

The Public Cloud's black-box nature makes it harder for IT organizations to keep sight of the big picture and to live up to their supervisory obligations. This becomes evident mainly through the lack of close ties to the actual IT operations of the cloud infrastructure.

The use of Public Cloud infrastructure is based on the shared responsibility model, in which the responsibilities are clearly separated between the provider (physical environment) and its clients (logical environment).

In addition to carrying full responsibility for the logical environment, the customer not only needs to find an answer to the question of how to handle the black box, i.e. the physical environment, but also needs to know how to measure the cloud provider's services at this level in order to maintain control.

With ITIL, CIOs have a powerful framework at their disposal which enables them to monitor the public cloud provider at all levels. Through established ITIL procedures, they can provide the business side with the facts that are required for reporting.

The strategy paper can be downloaded free of charge at “Service Management in the Public Cloud – Sourcing within the Digital Transformation“.


Analyst Strategy Paper: Cloud Data Fabric – Enterprise Storage Services in the Cloud

The maturing public cloud infrastructure is gaining importance as an attractive alternative to on-premise enterprise IT infrastructure. Public cloud infrastructures offer CIOs an abundance of possibilities when it comes to operating existing infrastructure and application environments more flexibly and at a lower cost. However, existing public cloud storage services have been developed with a focus on the new generation of applications, which is why they are less well prepared to run existing enterprise applications.

At present, the requisite storage concepts, standards and technologies for using legacy enterprise applications on public cloud infrastructure are still not available. In order to ensure the continued existence as well as the proper operation of conventional application architectures on public cloud infrastructures, it is necessary to transfer the established and widespread standards into the public cloud. As legacy applications, rather than new types of cloud-native applications, still constitute the lion's share of potential cloud migration candidates, well-known storage concepts are required so that existing applications can be migrated without modifications to a public cloud infrastructure, where they continue to be operated.

In the strategy paper “Cloud Data Fabric”, Crisp Research analyses and explains the different enterprise storage options in the cloud and illustrates how to establish storage management with public cloud infrastructure.

The strategy paper can be downloaded free of charge at “Cloud Data Fabric – Enterprise Storage Services in the Cloud“.


Analyst Strategy Paper: How to resist Data Gravity

As a result of the growing digitization of business models and processes, CIOs and IT decision-makers are compelled to seek new sourcing models and infrastructure concepts. The Public Cloud plays a major role in this context. In the digital age, the significance of data management takes on a new dimension. Data have a certain inertia and hence need to be assigned to different classes so that they comply with regulations, legal requirements, technical limitations and individual security classes. This so-called “Data Gravity” impacts the mobility of data. New storage concepts are needed to process these hard-to-move data outside of one's own IT infrastructure without loss of control.

Hybrid and multi-cloud storage architectures provide implementation strategies and robust usage scenarios that are aligned with these new requirements. Within these architecture concepts, the data are located in a company-controlled area, and the data owner alone determines which parts are to be stored in the public cloud. Accordingly, all benefits of public cloud infrastructures can be utilized without losing control of one's data, while fulfilling the necessary compliance guidelines and legal requirements.

In the strategy paper “How to resist Data Gravity”, Crisp Research analyses and explains the connection between “data gravity” and the public cloud and illustrates new ways to reduce the impact.

The strategy paper can be downloaded free of charge at “How to resist Data Gravity“.


Analyst Strategy Paper: Public Cloud – The Key to a Successful Digital Transformation

Within the framework of the digital agenda, IT infrastructure is of central importance. More than two thirds (68 percent) of companies regard digital infrastructure as the most important building block and the key to the successful digitization of their business models and processes. The Public Cloud is one of the most important vehicles of the digital evolution. Only by means of dynamic and globally scalable infrastructures are companies able to adapt their IT strategies to continuously changing market situations and thus strongly support the technical aspects of their company strategy. With a Digital Infrastructure Fabric, companies map the technological image of their “Digital Enterprise“, defining all necessary players and drivers within their digital evolution.

Public Cloud infrastructure services represent a solid base for supporting the digitization strategies of companies regardless of their size, but predominantly of companies with highly scalable IT workloads. Startups, for example, can start small without having to invest massively in IT resources from the very beginning. In this way, companies get hold of one of the most important prerequisites for having a say in the digital evolution: speed. Today, there is more at stake for IT departments than just preserving the status quo. IT must position itself as a strategic partner and business enabler and be capable of satisfying the individual needs of specialized departments. It needs to pursue the goal of creating a competitive edge for the company on the basis of digital technologies. In this context, public cloud infrastructures support the proactive measures of IT departments.

The strategy paper can be downloaded free of charge at “Public Cloud – The Key to a Successful Digital Transformation“.