
Quo vadis VMware vCloud Hybrid Service? What to expect from vCHS?

In May of this year, VMware presented the vCloud Hybrid Service (vCHS), its first infrastructure-as-a-service (IaaS) offering, as a private beta. The infrastructure service is fully managed by VMware and is based on VMware vSphere. Companies that have already virtualized their on-premise infrastructure with VMware are to be given the opportunity to seamlessly extend their data center resources, such as applications and processes, into the cloud and thus span a hybrid cloud. At VMworld 2013 Europe in Barcelona in October, VMware announced the availability of vCHS in England, with a new data center in Slough near London. The private beta of the European vCloud Hybrid Service will be available from the fourth quarter of 2013, with general availability planned for the first quarter of 2014. This step shows that VMware ascribes an important role to the public cloud, but it was made above all to meet the needs of European customers.

Go easy on the euphoria?

During the press conference at VMworld in Barcelona, VMware CEO Pat Gelsinger happily announced that instead of the expected 50 registrations, 100 participants had signed up for the vCHS private beta. That is, after all, 100 percent above expectations. However, 100 test customers cannot count as a success for a vendor like VMware; just look at its broad customer base and the effectiveness of its sales force.

It is therefore necessary to ask how attractive a VMware IaaS offering can actually be, and how attractive VMware technology, most notably its hypervisor, which is widespread in the enterprise, will remain in the future. In discussions with IT leaders it comes up more and more frequently that, for cost reasons and for the sake of independence (avoiding lock-in), VMware's hypervisor should be replaced by an open source hypervisor such as KVM in the short to medium term.

Challenges: Amazon Web Services and the partner network

With vCHS, VMware is positioning itself exactly like about 95 percent of all vendors in the IaaS market: with compute power and storage. A portfolio of higher-level services is not available. Nevertheless, VMware sees Amazon Web Services (AWS) as its main target and primary competitor when it comes to attracting customers.

However, AWS customers, including enterprise customers, use more than just the infrastructure. Every time I talk to AWS customers, I ask my standard question: "How many infrastructure-related AWS services do you use?" Summing up all the answers so far, I arrive at an average of eleven to twelve services per AWS customer. Most say they use as many services as necessary in order to have as little effort as possible. A key moment was a customer who, without hesitation and with wide eyes, said he uses 99 percent of all the services.

AWS is a popular and particularly attractive target. But by focusing on AWS alone, VMware should be careful not to lose its partner network, and in particular not to lose sight of the service providers that have built their infrastructures on VMware technology, including CSC, Dimension Data, Savvis, Tier 3, Verizon and Virtustream.

The first service providers are already commenting that they may turn their backs on VMware and support other hypervisors such as Microsoft Hyper-V. So far VMware remains relaxed about these discussions, since vCHS is intended purely for standardized workloads, while VMware-powered service providers are supposed to take care of specific customer needs.

Quo vadis vCloud Hybrid Service?

Through its technology, which is used by a variety of cloud service providers, VMware has had a presence in the IaaS environment for some time, albeit a passive one. However, the vCloud Hybrid Service, still in beta, competes directly with this partner network, so a question of trust will arise between the two sides.

This is particularly because vCHS offers more functionality than the typical vCloud Datacenter Service. This includes, among other things, the new vCloud Hybrid Service online marketplace, through which users, according to VMware, have access to more than 3,800 applications and services that they can use in conjunction with vCHS.

VMware's strategy consists primarily of building a seamless technical bridge between local VMware installations and vCHS, including the ability to move existing standard workloads between a company's own data center and vCHS. How much success VMware will have with this strategy remains to be seen.

One result of our exploration of the European cloud market is that most European companies like the properties of the cloud (flexible use, pay-per-use and scalability) but would rather hand the construction and operation of their applications and systems over to the provider (as a managed or hosted service).

These are good conditions for all service providers that have built their infrastructure on VMware technology and help customers on their way into the cloud with professional services. They are not, however, the best conditions for vCHS, since vCHS will only support self-service and standard workloads.

As is typical for U.S. companies, VMware is launching its European vCHS in England, and it has already announced plans to expand into other European countries. The European commitment is mainly intended to appease the concerns of Europeans and to respond to their specific needs; VMware's EMEA strategy accordingly emphasizes data locality, sovereignty, trust, security, compliance and governance. I do not want to appear too critical, but the provider's name is interchangeable here: every cloud provider advertises with the same points.

Basically, the vCloud Hybrid Service, in combination with the rest of VMware's portfolio, has a good chance of playing a leading role in the IaaS market. This is mainly due to the broad customer base, a strong ecosystem and many years of experience with enterprise customers.

According to VMware, more than 500,000 customers, including 100 percent of the Fortune 500, 100 percent of the Fortune Global 100 and 95 percent of the DAX 100 companies, use VMware technology. Furthermore, more than 80 percent of virtualized workloads and a large number of mission-critical applications run on VMware.

For these customers, the transition to a VMware-based hybrid cloud is therefore not an unusual step and can be taken at scale without great expense. Furthermore, VMware, like no other virtualization or cloud provider, has a very large ecosystem of partners, resellers, consultants, trainers and distribution channels. With this ecosystem, VMware has a great advantage over its competitors in the IaaS market and in the cloud computing environment in general.

The situation is similar with regard to technical attractiveness. Organizationally, VMware does not rely on a typical public cloud environment but provides either a physically isolated dedicated cloud per customer or a virtual private cloud. The dedicated cloud delivers much more performance than the virtual private cloud and gives the customer a physically separated pool of vCPUs and vRAM, while storage and network are logically isolated per tenant. The virtual private cloud offers customers the same design architecture and resources as the dedicated cloud, but these are only logically, not physically, separated.

With vCHS, existing VMware infrastructures can comfortably be extended into a hybrid cloud in order to move resources back and forth as needed. A company can also use it to build test and development or disaster recovery environments. However, this is exactly what a number of VMware service providers already stand for, which could become the biggest challenge for both sides.


Hosted Private Platform-as-a-Service: Opportunities for ISVs and enterprises

Even though a rosy future has been predicted for platform-as-a-service (PaaS) time and again, the service model has never really gained momentum. In conversations one learns that PaaS is happily used for developing prototypes, but when it comes to production operation, companies switch to an infrastructure-as-a-service (IaaS) offering. The main reason is the level of control, which is also the main decision criterion for or against a PaaS. IaaS simply offers more ways to influence the virtual infrastructure, software and services. With a PaaS, by contrast, one programs against a standardized API that does not allow many liberties. Yet a PaaS offers attractive possibilities, especially for ISVs (independent software vendors) and enterprises, to supply their developers with resources conveniently. Those who do not trust a public PaaS can meanwhile also turn to hosted variants, so-called hosted private PaaS.

Platform-as-a-Service

Platform-as-a-service (PaaS) is the middle layer of the cloud computing service model and goes one step further than infrastructure-as-a-service (IaaS). A PaaS is responsible for providing a transparent development environment: the provider makes a platform available on which (web) applications can be developed, tested and hosted. The applications are then executed on the provider's infrastructure and use its resources, and the complete lifecycle of an application can be managed there. The services on the respective provider's platform are addressed via APIs. The advantage is that small companies in particular can reduce their development infrastructure to a minimum. To develop applications they only need a computer, a web browser, possibly a local IDE, a data connection and their knowledge. The rest is up to the provider, who is responsible for the infrastructure (operating system, web server, runtime environments, etc.).

Platform as a Service
Source: Cloud Computing mit der Microsoft Plattform, Microsoft Press PreView 1-2009
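
To make this division of labor concrete, here is a minimal sketch of what a developer typically hands over to such a platform: nothing but application code plus a one-line process declaration, while the operating system, web server and runtime environment are supplied by the provider. The Procfile convention shown in the comment is used by Heroku-style platforms (cloudControl used a comparable mechanism); the file names and the PORT variable are assumptions that vary from provider to provider.

    # app.py -- a minimal web application; the platform supplies the runtime.
    # A Heroku/cloudControl-style platform typically needs only this file, a
    # dependency list and a one-line process declaration (Procfile), e.g.:
    #   web: python app.py
    import os
    from flask import Flask  # assumes Flask is listed in requirements.txt

    app = Flask(__name__)

    @app.route("/")
    def index():
        return "Hello from a PaaS-hosted application"

    if __name__ == "__main__":
        # The platform injects the port to bind to; 5000 is a local fallback.
        app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 5000)))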

Hosted Private Platform-as-a-Service

Hosted private platform-as-a-service (hosted PaaS) transfers the idea of the public PaaS into a dedicated variant managed by a provider. It is particularly attractive for companies that want to avoid a public approach (shared infrastructure, multi-tenancy) but have neither the resources nor the knowledge to provide their developers with a real PaaS in their own IT infrastructure. In this case they can turn to providers that offer them an exclusive PaaS in an environment reserved just for them. The advantage is that the customer can use the hosted private PaaS exactly like a public PaaS, but on a non-shared infrastructure located in the provider's data center.

A further advantage of a hosted PaaS lies in the professional services that the provider delivers along with it, which help the customer either migrate existing applications to the PaaS or develop new ones there. The concept is exactly comparable to managed clouds or business clouds in the IaaS environment. This is interesting for enterprises as well as for ISVs that so far have little or no experience in developing cloud applications and do not trust public offerings such as Amazon Elastic Beanstalk, Microsoft Windows Azure, Heroku or Force.com.

One possible sign that public PaaS is not really gaining momentum in Germany is that the first German PaaS provider, cloudControl, has packaged its public PaaS into a private PaaS, the Application Lifecycle Engine, with which enterprises and web hosters (white label) can operate their own PaaS on a self-managed infrastructure. In addition, a bridge to a hybrid PaaS can be spanned.

The managed IaaS provider Pironet NDH is the first German provider to jump on the hosted private PaaS bandwagon. The Cologne-based company wants to offer ISVs and SaaS providers a platform, including professional services, for delivering their web applications out of a German data center. In addition to .NET, PHP, Java, Perl, Python, Ruby, node.js and Go, Pironet NDH also offers full Windows Azure PaaS support as known from the Microsoft Windows Azure public cloud, so applications developed for Azure can likewise be operated within Pironet NDH's German data center. The two PaaS offerings are built separately: the polyglot (multi-language) PaaS uses a Red Hat OpenShift implementation, while the Azure PaaS is based on the Microsoft Windows Azure Pack. Although Pironet NDH concentrates primarily on 1:1 business relationships, public PaaS variants will follow in Q1/Q2 2014, though they will only be marketed as a secondary offering.

With traditional ISVs in particular, Pironet NDH is pushing at an open door. Their customers will increasingly ask for web applications in the future, which poses major challenges for quite a few ISVs. They will benefit, among other things, from the professional services in order to bring existing and new applications to market faster.

Public cloud providers will have to react

The "hosted private" trend comes from the IaaS area, where hosted private clouds in the form of managed or business clouds are also increasingly in demand, and for good reason. Data protection in particular is driving ISVs and enterprises into the arms of providers of dedicated solutions. In addition, customers are willing to invest more money in cloud offerings in return for higher security, data sovereignty and consulting.

Public cloud providers will have to react to this in order to address these customer needs. With Salesforce, the first big public cloud player has shown its hand. First a quote from Joachim Schreiner, managing director of Salesforce Germany: "Private clouds are not clouds. That is nothing more than an application service provider contract that could already have been signed in the year 2000 if the necessary bandwidth had been available." Salesforce CEO Marc Benioff apparently sees it somewhat differently; after all, the money only flows when the customers are satisfied. To that end, Salesforce presented its "Salesforce Superpod" at Dreamforce 2013, a dedicated cloud based on HP's Converged Infrastructure, which is basically nothing other than a hosted private cloud.


Cloud Computing Myth: Less know-how is needed

I recently came across an interesting statement in an article describing the advantages of cloud computing. A caption read: "No additional know-how needs to be built up within the company." This is totally wrong; the opposite is the case. Far more knowledge is needed than any vendor promises.

There is a lack of knowledge

The kind and amount of knowledge needed depends on the service that is consumed from the cloud. For a supposedly highly standardized software-as-a-service (SaaS) application such as e-mail, considerably less knowledge about the service and its characteristics is needed than for a service that maps a specific process.

For the use of infrastructure-as-a-service (IaaS) or platform-as-a-service (PaaS), things look quite different. The provider does take care of building, operating and maintaining the physical infrastructure, but responsibility for the virtual infrastructure (IaaS) lies with the customer. The provider itself (via fee-based support) or certified system integrators can help with setup and operation. The same applies to running an own application on a cloud infrastructure: the cloud provider is not responsible for it, but merely provides the infrastructure along with APIs, web interfaces and sometimes value-added services to help customers with development. In this context one needs to understand that, depending on how the cloud scales (scale-out vs. scale-up), the application has to be developed completely differently than in a non-cloud environment, namely distributed across multiple systems and automatically scalable (scale-out). This architectural knowledge is lacking in most companies at every turn, partly because colleges and universities have not taught this kind of programmatic thinking.
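
To illustrate the scale-out point: an application that keeps its state in process memory cannot simply be multiplied across machines, while one that externalizes its state can. The following sketch is illustrative only; the Redis store and the environment variables are assumptions, not something the article prescribes. It shows a request counter kept in an external store so that any number of identical instances can serve traffic behind a load balancer.

    # counter_service.py -- a deliberately stateless web service.
    # Because the counter lives in an external store (Redis here), any number
    # of identical copies of this process can run on different machines
    # (scale-out); adding or removing instances does not lose or fork state.
    import os
    from flask import Flask
    import redis

    app = Flask(__name__)
    # The store's address comes from the environment, not from the code.
    store = redis.Redis(host=os.environ.get("REDIS_HOST", "localhost"), port=6379)

    @app.route("/hit")
    def hit():
        # INCR is atomic on the Redis server, so concurrent instances do not race.
        count = store.incr("page_hits")
        return f"This page has been hit {count} times\n"

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))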

Cloud computing is still more complex than it appears at first glance. The prime example is Netflix. The U.S. video-on-demand provider operates its platform on a public cloud infrastructure (Amazon AWS). In addition to an extensive production system that ensures scalable, high-performance operation, it has also developed an extensive test suite, the Netflix Simian Army, whose sole job is to ensure the smooth operation of the production system: among other things, virtual machines are constantly and deliberately shot down at random, and the production system must nevertheless continue to function properly.
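
Netflix's actual Simian Army is its own open-source tooling; purely as an illustration of the principle, a minimal "chaos" script against AWS might look like the following sketch (the tag name and region are assumptions, and boto3 is simply the current AWS SDK for Python, not anything Netflix uses).

    # chaos_sketch.py -- randomly terminate one instance from a tagged group,
    # mimicking the idea behind Chaos Monkey (illustration only!).
    import random
    import boto3

    ec2 = boto3.client("ec2", region_name="eu-west-1")  # region is an assumption

    # Find running instances that belong to the service under test.
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:chaos-target", "Values": ["true"]},       # assumed tag
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]

    instances = [i["InstanceId"] for r in reservations for i in r["Instances"]]

    if instances:
        victim = random.choice(instances)
        print(f"Terminating {victim}; the service must keep working without it.")
        ec2.terminate_instances(InstanceIds=[victim])
    else:
        print("No tagged instances running; nothing to terminate.")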

Demand for the managed cloud is rising

The deployment model cannot do much to reduce this complexity, but it can shift the responsibility and the necessary know-how. In a public cloud, self-service rules: the customer is initially 100 percent on his own and bears sole responsibility for the development and operation of his application.

Many companies have recognized this and admit to themselves that they do not have the necessary knowledge, staff or other resources to use a public cloud (IaaS) successfully. Instead, they prefer, or expect, help from the cloud provider. In these cases it is not public cloud providers that are needed but managed cloud or business cloud providers, which offer professional services in addition to infrastructure.

Find more on the topic of the managed cloud at "The importance of the Managed Cloud for the enterprise".


Cloud computing myth: less know-how is needed

In an article highlighting the advantages of cloud computing, I came across an interesting statement. A caption read: "No additional know-how needs to be built up within the company." This is completely wrong; the opposite is the case. Far more knowledge is still needed than the vendors promise.

The necessary knowledge is missing

The kind and amount of knowledge needed depends on the type of service that is consumed from the cloud. For a supposedly highly standardized software-as-a-service (SaaS) application such as e-mail, considerably less knowledge about the service and its characteristics is needed than for a service that maps a specific process.

For the use of infrastructure-as-a-service (IaaS) or platform-as-a-service (PaaS), however, things look quite different. The provider does take care of building, operating and maintaining the physical infrastructure, but responsibility for the virtual infrastructure (IaaS) lies with the customer. The provider itself (via fee-based support) or certified system integrators can help with setup and operation. The same applies to running an own application on a cloud infrastructure: the cloud provider is not responsible for it, but merely provides the infrastructure along with APIs, web interfaces and possibly further value-added services to make development easier for customers. In this context it must also be understood that, depending on how the cloud scales (scale-out vs. scale-up), the application has to be developed completely differently for the cloud than in a non-cloud environment, namely distributed across multiple systems and automatically scalable (scale-out). This architectural knowledge is lacking in most companies at every turn, which is partly because colleges and universities have not yet taught this kind of programmatic thinking.

Cloud computing is currently still more complex than it appears at first glance. The prime example is Netflix. The U.S. video-on-demand provider operates its platform on a public cloud infrastructure (Amazon AWS) and, in addition to an extensive production system that ensures scalable, high-performance operation, has also developed an extensive test suite, the Netflix Simian Army, whose sole job is to ensure the smooth operation of the production system: among other things, virtual machines are constantly and deliberately shot down at random, and the production system must nevertheless continue to function properly.

Demand for the managed cloud is rising

The deployment model cannot do much to reduce this complexity, but it can shift the responsibility and the necessary know-how. In a public cloud, self-service rules: the customer is initially 100 percent on his own and bears sole responsibility for the development and operation of his application.

Many companies have recognized this and admit to themselves that they do not have the necessary knowledge, staff or other resources to use a public cloud (IaaS) successfully. Instead, they prefer, or expect, help from the cloud provider. In these cases it is not public cloud providers that are needed but managed cloud or business cloud providers, which offer professional services in addition to infrastructure.

More on the topic of the managed cloud can be read in "Die Bedeutung der Managed Cloud für Unternehmen".


Dropbox alternatives for enterprise IT

The popularity of easy-to-use cloud storage services like Dropbox is causing IT decision makers quite a headache. Yet the market already offers enterprise-ready solutions. This article introduces cloud storage services for professional use.

Dropbox drives shadow IT

Dropbox has driven cloud storage services into the enterprise. The US provider's fan base extends from ordinary employees up to the executive floor. In particular, fast access, ease of use on every device and low costs have made Dropbox an attractive product. But what sounds like a true success story at first is in reality a serious problem for CIOs and IT managers. Dropbox has led to a new form of shadow IT: the largely uncontrolled growth of IT solutions that employees and departments use without involving the IT department, often purchased with a credit card. Behind this usually stands the criticism that internal IT departments are unable to deliver suitable solutions quickly and in the desired quality. The result is that company data ends up stored in private Dropbox accounts, where it does not belong.

The Dropbox boom, and the easy access to public cloud services in general, has led to a discussion about whether traditional IT departments have a right to exist at all. Some analysts predict they could die out sooner or later, leaving the IT strings in the hands of line-of-business (LOB) managers. Yet the reality looks different: the often cautious LOB managers normally have neither the time nor the knowledge to make such IT decisions. They know what matters for their own area, but do they know which systems have to work together? For many years companies have struggled with poorly integrated isolated applications and data silos. Public cloud solutions multiply this problem, and Dropbox is just the tip of the iceberg.

To get the Dropbox phenomenon under control, several vendors of enterprise cloud storage have established themselves in the past years. The widely used Dropbox service by far does not offer what typical enterprise policies and IT governance models demand.

Dropbox for Business

"Dropbox for Business", a corporate offering with advanced features for more security, team management and reporting, has existed since 2011. However, the solution does not offer the breadth and variety of functions of other comparable offerings on the market. Dropbox is therefore better suited for small, close-knit teams that do not require as much control as larger companies. For $795 per year, five users get unlimited storage space; each additional user costs $125 per year.

Via a dashboard, administrators get access to information about the activities of their users, including the devices used, browser sessions and applications. Here it is also possible to close browser sessions, disconnect devices and disable third-party apps.

For improved security, various authentication mechanisms can be activated, including two-factor authentication. There is also single sign-on (SSO) integration with Active Directory and other SSO providers. For its technical infrastructure, Dropbox uses Amazon S3, which means the data is stored in one of Amazon's global data centers. These data centers meet high security standards such as SSAE 16, ISAE 3402 and ISO 27001, but Dropbox does not guarantee a specific location of the data within the Amazon cloud, such as a data center in the EU. Dropbox states that the data is encrypted with 256-bit AES before it is stored on Amazon S3; however, Dropbox itself has plain-text access to user files. Separate encryption is only possible with external tools.
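
What "separate encryption with external tools" means in practice is that files are encrypted on the client before they ever reach the sync folder, so the provider only stores ciphertext. The following is a minimal sketch using the Python cryptography package and AES-256-GCM; the file names and paths are placeholders, and the sketch stands in for dedicated third-party tools rather than describing any particular one.

    # encrypt_before_sync.py -- encrypt a file locally before dropping it into
    # the Dropbox folder, so the provider only ever sees ciphertext.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # keep this key outside Dropbox!
    aesgcm = AESGCM(key)

    with open("report.xlsx", "rb") as f:        # placeholder file name
        plaintext = f.read()

    nonce = os.urandom(12)                      # 96-bit nonce, unique per file
    ciphertext = aesgcm.encrypt(nonce, plaintext, None)

    # Store the nonce alongside the ciphertext; without the key it reveals nothing.
    with open(os.path.expanduser("~/Dropbox/report.xlsx.enc"), "wb") as f:
        f.write(nonce + ciphertext)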

Another deficit is the lack of audit mechanisms at the level of files and user activities. It is not possible to centrally look into a single user account or to retrieve an earlier version of a file; this only works by logging in as that user. In addition, the reports provide no information about user activities such as uploading and sharing files, a big gap in the audit process.

Strengths

  • Ease of use.
  • Supports the major operating systems.
  • Big market share and acceptance in consumer space.
  • Unlimited storage space at an attractive price.

Weaknesses

  • Dropbox has full plain text access to user files.
  • No end-to-end encryption.
  • Additional encryption only possible via external tools.
  • Weak reporting.
  • Insufficient administration and audit options.
  • Location of the data cannot be chosen.

Box

Box is one of the best-known providers of public cloud enterprise storage and targets small and medium-sized as well as large companies. The Business plan costs $15 per user per month for 3 to 500 users and includes 1,000 GB of storage space. The Enterprise plan offers an unlimited number of users and unlimited storage space; prices are available on request.

Clients for the common desktop and mobile operating systems allow data to be synchronized and uploaded from almost any device. Files can be locked and automatically released again after a set time. In addition, depending on the plan, between 25 and 100 versions of a file are retained in the version history. Other functions cover external authentication mechanisms, user management and auditing capabilities. The Enterprise plan offers further management functions and API access.

Depending on the plan, more functions become available. This can be seen particularly well at the permissions level: the higher the plan, the more user types and access rights can be assigned to an object. Business and Enterprise customers also get detailed reporting capabilities, including information on who has viewed and modified which files. Box offers further security features with authentication mechanisms for Active Directory, Salesforce, NetSuite, Jive and DocuSign as well as single sign-on (SSO) integration. For data center capacity, Box cooperates with Equinix; among other locations, there is a data center in Amsterdam for the European market. Where Equinix has no sites, Box relies on Amazon Web Services.

Box's biggest weakness is the limit of 40,000 objects for files and folders. Customers already pointed out this restriction in mid-2012; so far, nothing has changed. There is only the information that the limit will be raised to 100,000 objects in "Box Sync 4".

Strengths

  • Ease of use.
  • Variety of extensions.
  • Supports the major operating systems.
  • Many relevant features for business (management, audit, etc).

Weaknesses

  • Files and folders are limited to 40,000 objects.
  • Encryption keys are owned by Box.

TeamDrive

TeamDrive, from Hamburg, is a file sharing and synchronization solution. It is intended for companies that do not want to store their sensitive data with external cloud services but still want to allow their teams to synchronize data and documents. To this end, TeamDrive monitors arbitrary folders on a PC, laptop or smartphone, which can be used and edited together with invited users, so the data is also available offline at all times. Automatic synchronization, backup and versioning of documents protect users against data loss. Because the TeamDrive registration and hosting servers can be operated in a company's own data center, the software can be integrated into existing IT infrastructures; all necessary APIs are available for this. For TeamDrive Professional, enterprise customers pay 5.99 euros per user per month, or 59.99 euros per year.

Using the global TeamDrive DNS service, several independently operated TeamDrive installations can be linked together. If necessary, this allows customers to build a controlled community cloud in a hybrid scenario.

TeamDrive offers many business-relevant functions for managing and controlling a storage service. These include rights management at the Space level for different user groups, as well as version control to access older versions of documents and the changes made by group members. For data synchronization, clients are available for all major desktop and mobile operating systems, including Windows, Mac, Linux, iOS and Android. With TeamDrive SecureOffice, the vendor has also brought an extension of its mobile clients to market with which documents can be edited inside the end-to-end encryption. Integrated mobile device management (MDM) helps to manage all devices used with TeamDrive; they can be added, blocked or wiped. TeamDrive can be connected to existing directory services such as Active Directory and LDAP to synchronize user administration.

In addition to these management functions, TeamDrive features fully integrated end-to-end encryption in which the encryption keys are owned exclusively by the user. TeamDrive is thus at no time able to access the data. For encryption, TeamDrive relies on AES-256 and RSA-3072.
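
As a rough idea of how such an AES-256/RSA-3072 combination typically works (this is a generic hybrid-encryption sketch, not TeamDrive's actual protocol): the document is encrypted with a random AES key, and that key is then wrapped with the recipient's RSA public key, so only the holder of the matching private key can unlock it.

    # hybrid_sketch.py -- generic AES-256 + RSA-3072 hybrid encryption
    # (illustration of the principle only, not TeamDrive's real implementation).
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # Recipient's key pair; in practice the private key never leaves the user.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    public_key = private_key.public_key()

    document = b"confidential contract draft"

    # 1. Encrypt the document with a fresh random AES-256 key.
    aes_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, document, None)

    # 2. Wrap the AES key with the recipient's RSA-3072 public key (OAEP).
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_key = public_key.encrypt(aes_key, oaep)

    # Only the private-key holder can recover the AES key and read the file.
    recovered = private_key.decrypt(wrapped_key, oaep)
    assert AESGCM(recovered).decrypt(nonce, ciphertext, None) == document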

It should also be mentioned that TeamDrive is the only enterprise storage solution to carry the privacy seal of the Independent Centre for Privacy Protection Schleswig-Holstein (ULD). The seal confirms that TeamDrive is suitable for use in businesses and government agencies for the confidential exchange of data.

Strengths

  • End-to-end encryption.
  • Different encryption mechanisms.
  • SecureOffice for mobile secure processing of documents.
  • Certification by the ULD.
  • Integrated mobile device management.
  • Many relevant functions for businesses.

Weaknesses

  • No locking of files.
  • No browser access.

Microsoft SkyDrive Pro

SkyDrive Pro is Microsoft's enterprise cloud storage, provided in conjunction with SharePoint Online and Office 365. The service is designed exclusively for business purposes and must therefore be distinguished from SkyDrive, which is aimed at home users who mainly store and share documents and photos in the Microsoft cloud. Managing SkyDrive Pro is the responsibility of the company; employees use it to store and share business documents and collaborate on them with colleagues within a private domain.

SkyDrive Pro is fully integrated with SharePoint 2013 and Office 365. An administrator decides how each user may use the libraries within SkyDrive Pro; for this purpose, different access rights can be assigned to users and user groups. Using a client, documents can be synchronized with the local computer. Mobile clients are available for iOS and Windows Phone; Android and BlackBerry are currently not supported.

Documents or entire folders can be shared with individual colleagues or distribution lists, with read or write permissions. A recipient then receives an e-mail containing the comment and the link to the document and can follow the document to be informed of later changes. Sharing with partners and customers outside the domain is possible if the company allows external sharing.

According to Microsoft, all data in SkyDrive Pro is protected with several layers of encryption, and information can only be accessed if an administrator has granted access rights to it. Furthermore, Microsoft guarantees that private corporate data is shielded from search engines so that no metadata is collected in any form. In addition, SkyDrive Pro is compliant with HIPAA, FISMA and other data protection standards.

Strengths

  • Integration with Office 365 and SharePoint.
  • Clients for mobile operating systems.

Weaknesses

  • Proprietary Microsoft system.
  • European data centers only (Dublin, Amsterdam).
  • No Android client.

Amazon S3

Via a web service, Amazon S3 (Amazon Simple Storage Service) provides access to an unlimited amount of storage in the Amazon cloud. Unlike competing cloud storage services, the storage can only be accessed through REST and SOAP interfaces (APIs); Amazon does not provide a local client of its own for synchronization. This is because Amazon S3 primarily serves as a central storage location that many other Amazon services use to store or retrieve data. An ecosystem of partners provides paid clients that add synchronization capabilities for desktop and mobile operating systems. Through Amazon's own AWS Management Console, folders and files can also be accessed via a web interface.

Via the API, data can be stored, read and deleted as objects in the Amazon cloud. The maximum size of an object uploaded in a single operation is 5 GB. Objects are organized in buckets (folders). Authentication mechanisms ensure that the data is protected from unauthorized third parties: objects can be marked for private or public access, and different user access rights can be assigned to them.
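
A minimal sketch of this object model using the AWS SDK for Python (boto3, the current SDK; the bucket and key names are placeholders) shows how an object is written to a bucket with a per-object ACL, read back and deleted.

    # s3_sketch.py -- store, read and delete an object in a bucket via the API.
    import boto3

    s3 = boto3.client("s3", region_name="eu-west-1")   # EU region, as in the text

    bucket = "example-company-backups"                 # placeholder bucket name

    # Write an object; the ACL marks it as private (which is also the default).
    s3.put_object(Bucket=bucket, Key="reports/2013-11.csv",
                  Body=b"month,revenue\n2013-11,42000\n", ACL="private")

    # Read it back.
    body = s3.get_object(Bucket=bucket, Key="reports/2013-11.csv")["Body"].read()
    print(body.decode())

    # Delete it again.
    s3.delete_object(Bucket=bucket, Key="reports/2013-11.csv")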

Amazon S3 pricing varies by the region in which the data is stored. For the first TB in the EU region, one GB of used storage costs $0.095 per month. In addition, outgoing data transfer is charged at $0.12 per GB for up to 10 TB per month. For example, storing 500 GB and transferring 200 GB out of the region in one month would cost roughly 500 × $0.095 + 200 × $0.12 ≈ $71.50.

Many other cloud storage services use Amazon S3 to store their users' data, including Dropbox, Bitcasa and Ubuntu One.

Strengths

  • The API is the de facto standard in the market.
  • Very high scalability.
  • Very good track record.

Weaknesses

  • No clients of its own.
  • The pay-per-use model requires strict cost control.

ownCloud

Like TeamDrive, ownCloud is a file sharing and synchronization solution. It is aimed at companies and organizations that want to keep their data under their own control and not rely on external cloud storage services. The core of the application is the ownCloud server, which allows the software, together with the ownCloud clients, to be integrated seamlessly into an existing IT infrastructure and managed with existing IT management tools. ownCloud acts as a local directory tree onto which different storage back ends are mounted, so the files are available to all employees on all devices. In addition to local storage, directories can be connected via NFS and CIFS.

ownCloud's functions are provided as a set of add-ons that are integrated directly into the system. These include a file manager, a contact manager, extensions for OpenID and WebDAV, and a browser plugin for viewing documents such as ODF and PDF files. Further applications for enterprise collaboration are available in ownCloud's own marketplace. Files can be uploaded via the browser or synchronized with clients for desktop and mobile operating systems.
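
Because files are also exposed via WebDAV, scripted uploads without the sync client are possible as well. A small sketch with the Python requests library follows; the server URL, credentials and the exact WebDAV path are assumptions that depend on the installation and version.

    # owncloud_webdav_upload.py -- push a file to an ownCloud server over WebDAV.
    import requests

    base = "https://cloud.example.com/remote.php/webdav"   # assumed endpoint path
    auth = ("alice", "app-password")                        # placeholder credentials

    with open("minutes.odt", "rb") as f:
        resp = requests.put(f"{base}/Documents/minutes.odt", data=f, auth=auth)

    # 201 = newly created, 204 = an existing file was overwritten.
    print("Upload status:", resp.status_code)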

Security is provided via a plugin for server-side encryption, which is not enabled by default. If the plugin is enabled, files are encrypted when they are stored on the server; however, only the contents of the files are encrypted, not the file names. In addition, ownCloud relies exclusively on security "at rest".

The biggest advantage of ownCloud is also its disadvantage. The control over its data that a company regains by using ownCloud comes at the cost of setup and operation. Administrators need sufficient knowledge of running web servers such as Apache, as well as of PHP and MySQL, to operate ownCloud successfully. In addition, meticulous configuration is needed; without it, the expected performance of an ownCloud installation cannot be reached.

Strengths

  • Open source.
  • Variety of applications.
  • Clients support the major operating systems.

Weaknesses

  • Weak security and encryption.
  • High costs for the operation of an own ownCloud infrastructure.

Dropbox alternatives for enterprise IT

The popularity of easy-to-use cloud storage services like Dropbox gives IT decision makers headaches. Yet the market now also offers enterprise-ready solutions. This article introduces cloud services for professional use.

This article was published exclusively in the IDG network and can be read at "Cloud Storage treibt die Schatten-IT: Dropbox Alternativen für Unternehmen".


My cloud computing predictions for 2014

The year 2013 is coming to an end and it is once again time to look into the crystal ball for the next year. To this end, I’ve picked out ten topics that I believe will be relevant in the cloud area.

1. The AWS vs. OpenStack API discussion will never end

This prediction, by the way, also holds for 2015, 2016 and beyond. It sounds amusing at first, but these discussions are getting annoying. OpenStack has more important issues to address than the never-ending comparisons with Amazon Web Services (AWS), especially since the unrest constantly being carried in from outside does not help. If the OpenStack API is supposed to be 100% compatible with the AWS API, then OpenStack might as well be renamed Eucalyptus!

The OpenStack community must find its own way and make it possible for service providers to build their own offerings apart from the industry leader AWS. However, this mammoth task lies solely in the hands of the providers. After all, OpenStack is just a large construction kit for building cloud computing infrastructures; what comes out of it in the end (the business model) is not the task of the OpenStack community.

Read more: "Caught in a gilded cage. OpenStack providers are trapped."

2. Security gets more weight

Regardless of the recent NSA scandal, the security and trustworthiness of (public) cloud providers has always been questioned. Yet (public) cloud providers are better positioned to deliver the desired level of data protection than most businesses worldwide. This is partly due to the extensive resources (hardware, software, data centers and staff) they can invest in strong security solutions, and partly because data protection is part of a (public) cloud provider's core business, so leaving its administration to the provider can ensure smoother and safer operations.

Trust is the foundation of any relationship, in private as in business. The confidence hard-won in recent years is being put to the test, and providers would do well to open up further. "Security through obscurity" is an outdated concept; customers need and want more clarity about what is happening with their data. This will vary from region to region, but at least in Europe, customers will continually put their providers to the test.

Read more: "How to protect companies' data from surveillance in the cloud?"

3. Private cloud remains attractive

Financially and in terms of resource allocation (scalability, flexibility), the private cloud is not the most attractive form of cloud. Nevertheless, despite the predictions of some market researchers, it is not losing its significance. On the contrary: despite the cost pressure, the sensitivity of decision makers who want to retain control and sovereignty over their data and systems is being underestimated. This is also reflected in recent figures from Crisp Research, according to which only about 210 million euros were spent on public infrastructure-as-a-service (IaaS) in Germany in 2013, while investments in private cloud infrastructure reached around 2.3 billion euros.

A recent study by Forrester also shows:

“From […] over 2,300 IT hardware buyers, […] about 55% plan to build an internal private cloud within the next 12 months.”

This does not mean that Germany has to become the land of private clouds. After all, it always depends on the specific workload and use case whether it runs in a private or public cloud. But it shows a clear trend toward another form in which cloud computing will be used in the future.

4. Welcome hosted private cloud

Now that Salesforce has also jumped on the bandwagon, it is clear that enterprise customers cannot warm to the use of a pure public cloud. In cooperation with HP, Salesforce has announced a dedicated version of its cloud offering, the "Salesforce Superpod".

This extreme change in Salesforce's strategy confirms our findings from the exploration of the European cloud market. Companies are interested in cloud computing and its properties (scalability, flexibility, pay-per-use). However, they admit to themselves that they have neither the knowledge nor the time, and that operating IT systems is not their core business; instead, they let the cloud provider take care of it and expect a flexible kind of managed service. Public cloud providers are not prepared for this, because their business is to provide highly standardized infrastructure, platforms, applications and services; the usual remedy is to bring specially certified system integrators on board alongside.

Here lies an opportunity for providers of business clouds: cloud providers that do not run a public cloud model but instead use managed services to help customers master the way into the cloud and take over operations, offering everything from a single source. This typically does not happen on shared infrastructure but within a hosted private cloud or a dedicated cloud, where a customer is explicitly placed in an isolated area. Professional services for integration, interfaces and further development round off the portfolio. Because of this exclusivity and the higher level of security (usually physical isolation), business clouds are more expensive than public clouds. However, considering the effort a company must invest to actually be successful in one public cloud or another, or the cost of engaging a certified system integrator, the public cloud's cost advantage is largely eliminated.

5. The hybrid cloud remains an evergreen

The hybrid cloud is a perennial topic in discussions about the future of the cloud, but it is real. Internationally, the public cloud is at first mainly used to get easy and convenient short-term access to resources and systems. In the long run this will form into a hybrid cloud, as IT departments bring their own infrastructure up to the level of a public cloud (scalability, self-service, etc.) in order to serve their internal customers. In Germany and Europe it is exactly the opposite: private clouds are preferred, because privacy and data security have top priority. Europeans must and will get used to the public cloud in order to connect certain components, including some more critical systems, to it.

The hybrid cloud is about the mobility of data. That means data is kept locally in the company's own infrastructure, moved to another cloud, e.g. a public cloud, for processing, and then placed back within the local infrastructure. This need not always involve the same cloud from the same provider; depending on cost, service levels and requirements, several clouds can be involved in the process.

6. Multi-cloud is reality

The topic of multi-cloud is currently hotly debated, especially in the IaaS context, with the goal of spreading risk and making the best use of the costs and capabilities of different cloud infrastructures. But the subject must also gain much more importance in the SaaS environment in order to avoid data silos and isolated applications in the future, to simplify integration and to support companies in adopting their best-of-breed strategies.

Notwithstanding these expectations, multi-cloud use is already a reality: companies use multiple cloud solutions from many different vendors, even if the services are not yet (fully) integrated with one another.

Read more: “Multi-Cloud is “The New Normal”“.

7. Mobile cloud finally arrives

Meanwhile, some providers have discovered the mobile cloud slogan for their marketing. Much too late. The fact is that we have been living in a mobile cloud world since the introduction of the first smartphones. The majority of the data and information we access from smartphones and tablets no longer resides on the local device but on a server in the cloud.

Solutions such as Amazon WorkSpaces or Amazon AppStream support this trend. Even if companies are still cautious about outsourcing desktops, Amazon WorkSpaces will strengthen the trend, from which vendors such as Citrix and VMware will also benefit. Amazon AppStream underlines the mobile cloud to the fullest extent, with graphics-intensive applications and games that are processed entirely in the cloud and only streamed to the device.

Read more: “The importance of mobile and cloud-based ways of working continues to grow“.

8. Hybrid PaaS is the top trend

The importance of platform-as-a-service (PaaS) is steadily increasing. Observations and discussions with vendors that have so far found no answer to Amazon AWS in the infrastructure-as-a-service (IaaS) environment show that they will expand their pure compute and storage offerings vertically with a PaaS and thus try to win the favor of developers.

Other trends show private PaaS and hosted private PaaS trying to gain market share. With the Application Lifecycle Engine, cloudControl has packaged its public PaaS into a private PaaS that enterprises can use to operate their own PaaS on a self-managed infrastructure; in addition, a bridge to a hybrid PaaS can be spanned. IaaS provider Pironet NDH has adapted Microsoft's Windows Azure Pack to offer an Azure PaaS in a hosted private environment on this basis. This is interesting because it closes the gap between a public and a private PaaS: with a private PaaS, companies have complete control over the environment, but they also have to build and manage it, whereas in a hosted variant the provider takes care of that.

9. Partner ecosystems become more important

Public cloud providers should increasingly take care to build a high-quality network of partners and ISVs to pave the customers' way into the cloud. This means embracing the channel strongly in order to address a broader base of customers who are potential candidates for the cloud. However, the channel will struggle with the problem of finding enough staff trained for the cloud age.

10. Cloud computing becomes easier

In 2014, more offerings will appear on the market that make the use of cloud computing easier. With these offerings, less attention needs to be paid to the "side issues" of availability and scalability. Instead, cloud users can focus on their own core competencies and invest their energy in developing the actual application.

An example: infrastructure-as-a-service market leader Amazon AWS says that IT departments using the AWS public cloud no longer need to take care of procuring and maintaining the necessary physical infrastructure. However, the complexity has shifted to a higher level. Even though numerous tutorials and white papers exist, AWS does not stress that achieving scalability and availability within a scale-out cloud infrastructure can become arbitrarily complicated. These are costs that should never be neglected.

Hint: (Hosted) Community Clouds

I see growing potential for the community cloud (more in German), which is currently not widespread. In this context, I also see a shift away from today's pronounced centralization toward a semi-decentralized approach.

Most companies and organizations see a great advantage in the public cloud and want to benefit from it in order to reduce costs, consolidate IT and at the same time gain more scalability and flexibility. On the other hand, future-proofing, trust, independence and control are important "artifacts" that no one wants to give up.

The community cloud is the means to achieve both. It combines the characteristics of future-proofing, trust, independence and control with the benefits of a public cloud built on a real cloud infrastructure.

Some solutions already exist that can be used as the basis for building a professional cloud of one's own. However, one should always keep a close eye on the underlying infrastructure that forms the backbone of the entire cloud. In this context, the effort it takes to build, properly operate and maintain a professional cloud environment should not be underestimated, and the cost of the required physical resources has to be factored in.

For this reason, it makes sense for many small and medium-sized enterprises, as well as shared offices, co-working spaces or recurring project partnerships, to consider the community cloud. This type of cloud environment offers the benefits of a public cloud (a shared environment) while meeting the requirements of future-proofing, trust, independence and control, alongside cost advantages that can be achieved, among other things, through dynamic resource allocation. To this end, one should think about building a team that takes care exclusively of the installation, operation, administration and maintenance of the community cloud. An own data center or co-location space is not a prerequisite; instead, an IaaS provider can serve as an ideal partner for a hosted community cloud.


My cloud computing predictions for 2014

The year 2013 is drawing to a close and it is once again time to look into the crystal ball for the coming year. To this end, I have picked out ten topics that I believe will be relevant in the cloud area.

1. The AWS vs. OpenStack API discussion will never end

This prediction, by the way, also holds for 2015, 2016 and beyond. It sounds amusing at first, but these discussions are slowly getting annoying. OpenStack has quite different problems to deal with than being constantly compared with Amazon Web Services (AWS), especially since the unrest that is continually carried in from outside does not help. If the OpenStack API is supposed to be 100% compatible with the AWS API, then OpenStack might as well rename itself Eucalyptus!

The OpenStack community must find its own way and enable service providers to build their own offerings apart from the industry leader AWS. However, this mammoth task again lies solely in the hands of the providers. After all, OpenStack is only a large construction kit of solutions for building cloud computing infrastructures; what comes out of it in the end (the business model) is not the task of the OpenStack community.

Read more: "Gefangen im goldenen Käfig. OpenStack Provider sitzen in der Falle."

2. Security gains more weight

Regardless of the recent NSA scandal, the security and trustworthiness of (public) cloud providers has always been questioned. Yet (public) cloud providers are better positioned in terms of security than most companies worldwide can be. This is partly due to the resources (hardware, software, data centers and staff) they invest in security, but above all to the fact that ensuring smooth and secure operations is their core business.

Trust is the foundation of any relationship, in private as in business. The confidence hard-won in recent years is currently being put to the test, and providers would do well to open up further. "Security through obscurity" is an outdated concept; customers need and want more clarity about what happens with their data. This will vary from region to region, but at least in Europe, customers will put their providers to the test much more rigorously.

Read more: "Wie schützen Unternehmen ihre Daten gegen die Überwachung in der Cloud?"

3. The private cloud remains attractive

Financially and in terms of resource allocation (scalability, flexibility), the private cloud is not the most attractive form of cloud. Nevertheless, despite the predictions of some market researchers, it will not lose its significance. On the contrary: despite the cost pressure, the sensitivity of decision makers who want to retain control and sovereignty over their data and systems is being underestimated. This is also reflected in recent figures from Crisp Research, according to which only about 210 million euros were spent on public infrastructure-as-a-service (IaaS) in Germany in 2013, while investments in private cloud infrastructures stood at 2.3 billion euros.

A recent Forrester study also shows:

"From […] over 2,300 IT hardware buyers, […] about 55% plan to build an internal private cloud within the next 12 months."

This does not mean that Germany has to become the land of private clouds. After all, it always depends on the specific workload and use case whether it runs in a private or public cloud. But it shows a clear trend toward another form in which cloud computing will be used in the future.

4. Welcome, hosted private cloud

Now that Salesforce has also jumped on the bandwagon, it is clear that enterprise customers cannot warm to the use of a pure public cloud. In cooperation with HP, Salesforce has announced a dedicated variant of its cloud offerings, the "Salesforce Superpod".

This extreme change in Salesforce's strategy confirms our exploration of the European cloud market. Companies are interested in cloud computing and its properties (scalability, flexibility, pay-per-use). However, they admit to themselves that they do not have the knowledge or the time, or that operating IT systems simply no longer belongs to their core business, and they would rather let the cloud provider take care of it. What they want is a flexible kind of managed service. Public cloud providers are not prepared for this, because their business consists of providing highly standardized infrastructure, platforms, applications and services.

Here lies the opportunity for providers of business clouds: cloud providers that do not run a public cloud model but instead use managed services to help customers master the way into the cloud and take over operations, offering everything from a single source. This normally does not happen on a shared infrastructure but within a hosted private cloud or a dedicated cloud, in which a customer is explicitly placed in an isolated area. Professional services that help with integration, interfaces and further development round off the portfolio. Because of these services, the exclusivity and the higher level of security (usually physical isolation), business clouds are more expensive than public clouds. However, considering the effort a company must invest on its own in one public cloud or another to actually be successful, or the cost of getting help from one of the certified system integrators, the cost advantage is usually eliminated.

5. Die Hybrid Cloud bleibt der Dauerbrenner

Die Hybrid Cloud ist ein Dauerbrenner während der Diskussionen um die Zukunft der Cloud. Aber sie ist real. International wird die die Public Cloud zunächst überwiegend eingesetzt, um kurzfristig auf Ressourcen und Systeme ohne großen Aufwand und bequem Zugriff zu erhalten. Langfristig betrachtet wird sich dies in eine Hybrid Cloud formen, indem IT-Abteilungen auch ihre eigenen Infrastrukturen auf das Level einer Public Cloud (Skalierbarkeit, Self-Service, usw.) bringen, um ihre internen Kunden zu versorgen. In Deutschland und Europa ist es genau umgekehrt. Hier wird bevorzugt auf Private Clouds gesetzt, da die Themen Datenschutz und Datensicherheit höchste Priorität besitzen. Europäer müssen und werden sich erst an die Public Cloud gewöhnen, um gewisse Teile – zum Teil auch kritischere Systeme – mit einer Public Cloud zu verbinden oder teilweise darin auszulagern.

Fundamentally, the hybrid cloud will increasingly be about the mobility of data. This means that data tends to be kept locally in the company’s own infrastructure and is moved to another cloud, e.g. a public cloud, for processing, only to be stored back in the local infrastructure afterwards. This does not always have to be the same cloud of one provider; depending on costs, service levels and requirements, several clouds can be involved.
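
To make this round trip concrete, here is a minimal sketch, assuming a Python environment with boto3 and an S3 bucket as the public cloud side; the bucket name and the cloud-side processing step are purely hypothetical placeholders, not a recommendation for a specific provider:

```python
# Minimal sketch: keep the master copy on premise, move data into a
# public cloud (here: an S3 bucket) only for processing, then pull the
# result back and remove the cloud copy again.
# The bucket name and the processing step are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")
BUCKET = "hybrid-processing-bucket"  # placeholder

def round_trip(local_in: str, local_out: str, key: str) -> None:
    s3.upload_file(local_in, BUCKET, key)       # 1. move data out
    # 2. a cloud-side job (hypothetical, e.g. a batch run) would now
    #    read and overwrite the object under the same key
    s3.download_file(BUCKET, key, local_out)    # 3. bring the result back
    s3.delete_object(Bucket=BUCKET, Key=key)    # 4. leave nothing behind
```

Depending on costs, service levels and requirements, the same pattern can be pointed at a different provider’s object storage for each job.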

6. The Multi Cloud is a reality

Multi-cloud is currently being discussed intensively, especially in the IaaS environment, as a way to spread risk and to make the best use of the costs and capabilities of different cloud infrastructures. But the topic also needs to be given much more weight in the SaaS environment in order to avoid data silos and isolated solutions in the future, to simplify integration and to support companies in adopting their best-of-breed strategy.

Regardless of these expectations, multi-cloud usage is already a reality: companies use multiple cloud solutions from many different providers, even if these are not yet (fully) integrated with each other.

Read more in “Wir leben bereits in einer Multi-Cloud getriebenen Welt“.

7. The Mobile Cloud finally arrives

By now there are vendors that have discovered the mobile cloud claim for their marketing. Far too late. The fact is that we have been living in a mobile cloud world since the introduction of the first smartphones. Most of the data and information we access from smartphones and tablets no longer resides on the local device but on a server in the cloud.

Solutions such as Amazon WorkSpaces or Amazon AppStream support this trend. Even though companies are still cautious about outsourcing desktops, Amazon WorkSpaces will strengthen this trend, and vendors such as Citrix and VMware will benefit from it as well. Amazon AppStream underscores the mobile cloud to its full extent by processing graphics-intensive applications and games entirely in the cloud and only streaming them to the end device.

Read more in “Die Mobile Cloud ist der wahre Megatrend” and “Die Bedeutung von mobilen und cloud-basierten Arbeitsweisen wächst stetig“.

8. Hybrid PaaS is the top trend

The importance of platform-as-a-service (PaaS) is steadily increasing. Observations and conversations with providers that have so far found no answer to Amazon AWS in the infrastructure-as-a-service (IaaS) space show that they will extend their pure compute and storage offerings vertically with a PaaS and thereby try to win the favor of developers.

Other trends show that private PaaS and hosted private PaaS offerings are trying to gain market share. With its Application Lifecycle Engine, cloudControl has packaged its public PaaS into a private PaaS with which companies can run their own PaaS on self-managed infrastructure. In addition, a bridge to a hybrid PaaS can be built. IaaS provider Pironet NDH has adopted Microsoft’s Windows Azure Pack in order to offer an Azure PaaS within a hosted private environment on this basis. This is interesting because it closes the gap between a public and a private PaaS. With a private PaaS, companies regain full control over the environment, but they also have to build and manage it themselves. In a hosted variant, the provider takes care of that.

9. Partner ecosystems become more important

Public cloud providers should put more effort into building a high-quality network of partners and ISVs in order to pave the way to the cloud for their customers. This means making greater use of the channel to reach a broad base of customers who are potential candidates for the cloud. The channel, however, will struggle to find enough well-trained staff for the cloud era.

10. Cloud Computing becomes easier

In 2014, offerings will appear on the market that make using cloud computing easier. With these offerings, less attention has to be paid to the “side issues” of availability and scalability. Instead, cloud users can concentrate on their actual core competencies and invest their energy in developing the actual application.

One example: infrastructure-as-a-service market leader Amazon AWS says that by using the AWS public cloud, IT departments no longer have to worry about procuring and maintaining the necessary physical infrastructure. Instead, however, the complexity shifts one level up. Even though numerous tutorials and whitepapers now exist, AWS does not publicly say much about the fact that scalability and availability within a scale-out cloud infrastructure can become arbitrarily complicated. These are costs that must never be neglected.
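
To illustrate where that complexity shows up, here is a minimal sketch, assuming Python with boto3 and placeholder identifiers (AMI ID, names, availability zones); it only wires up basic scale-out behaviour and is far from a complete, production-ready setup:

```python
# Minimal sketch: even "just" making a web tier scale out on AWS means
# defining a launch configuration, an Auto Scaling group across several
# availability zones and a scaling policy -- none of which exists by default.
# All identifiers below (AMI ID, names, zones) are placeholders.
import boto3

autoscaling = boto3.client("autoscaling", region_name="eu-west-1")

autoscaling.create_launch_configuration(
    LaunchConfigurationName="web-lc",
    ImageId="ami-00000000",          # placeholder AMI
    InstanceType="m3.medium",
)

autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-asg",
    LaunchConfigurationName="web-lc",
    MinSize=2,                       # at least two instances for availability
    MaxSize=10,
    AvailabilityZones=["eu-west-1a", "eu-west-1b"],
)

autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="scale-out-on-load",
    AdjustmentType="ChangeInCapacity",
    ScalingAdjustment=2,             # add two instances when triggered
    Cooldown=300,
)
# ...and a CloudWatch alarm still has to be created separately to actually
# trigger this policy, e.g. on average CPU utilization.
```

Load balancing, health checks, session handling and multi-region failover still come on top of this, which is exactly the complexity referred to above.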

A tip: (Hosted) Community Clouds

I see growing potential for the community cloud (more here), which is not yet widely adopted. In this context I also see a shift from the currently very pronounced centrality towards a semi-decentrality.

Most companies and organizations see a great advantage in the public cloud and want to benefit from it in order to reduce costs, consolidate their IT and at the same time gain more scalability and flexibility. On the other hand, future-proofness, trust, independence and control are important attributes that nobody likes to give up.

The community cloud is the middle ground for achieving both: future-proofness, trust, independence and control through influence over what happens in the community cloud, and the advantages of a public cloud by participating in a real cloud infrastructure.

A number of solutions are now available that can be used as the basis for building one’s own professional cloud. However, one should always keep an eye on the underlying infrastructure that forms the backbone of the entire cloud. In this context, the effort required to build, properly operate and maintain a professional cloud environment must not be underestimated either. Furthermore, the costs for the necessary physical resources have to be calculated.

For this reason it makes sense for many small and medium-sized companies, but also for shared offices, co-working spaces or recurring project partnerships, to consider the community cloud idea in order to enjoy the benefits of a public cloud (shared environment) while retaining influence over the future-proofness, trust, independence and control of the environment, and to achieve cost advantages, among other things through shared resource allocation. It is also worth considering building a team that looks exclusively after the setup, operation, administration and maintenance of the community cloud. A dedicated data center or co-location is not required for this; an IaaS provider can serve as an ideal basis for a hosted community cloud.

Categories
Analysis

Criteria for selecting a cloud storage provider

Anyone searching for secure and enterprise-ready alternatives to Dropbox should take a closer look at the vendors. In most cases, the choice of a cloud storage vendor depends on the individual requirements, which decision makers need to discuss and define beforehand. In particular, this includes classifying the data, i.e. defining which data is stored in the cloud and which remains in the company’s own on-premise infrastructure. When selecting a cloud storage vendor, companies should consider the following characteristics.

Configuration and integration

The storage service should be easy to integrate into existing or future cloud infrastructure. This enables users to cost-efficiently expand their existing storage in a hybrid scenario. In addition, data can be migrated from local storage into the cloud over a self-defined period, which in the long run creates the option of doing without a dedicated storage system for specific data. Likewise, a straightforward and seamless export of data out of the cloud needs to be ensured.

A further characteristic is the interaction of the cloud service with internal systems such as directory services (Active Directory or LDAP) for centrally providing data to applications. This characteristic is mandatory for an easy and holistic administration of user access to the storage resources. To this end, the vendor should provide an open and well-documented API to realize the integration; alternatively, he can also deliver native software.
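
As an illustration, here is a minimal sketch, assuming a Python environment with the open-source ldap3 library; the server address, base DNs and group name are hypothetical placeholders, not part of any specific vendor’s API:

```python
# Minimal sketch: verify a user's credentials against an LDAP/Active
# Directory server and check group membership before granting access
# to storage resources. All DNs and names below are placeholders.
from ldap3 import Server, Connection, ALL

def user_may_access_storage(username: str, password: str) -> bool:
    server = Server("ldaps://ldap.example.com", get_info=ALL)
    user_dn = f"uid={username},ou=people,dc=example,dc=com"

    # A simple bind with the user's own credentials verifies the password.
    conn = Connection(server, user=user_dn, password=password)
    if not conn.bind():
        return False

    # Check membership in a (hypothetical) storage-users group.
    conn.search(
        "ou=groups,dc=example,dc=com",
        f"(&(cn=storage-users)(memberUid={username}))",
        attributes=["cn"],
    )
    allowed = len(conn.entries) > 0
    conn.unbind()
    return allowed
```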

Platform independence to access data from everywhere

Mobility is becoming more and more important for employees. For companies it is vital to understand their working habits and to deliver appropriate solutions.

In the best case, the cloud provider enables platform-independent access to the data by providing applications for all common mobile and desktop operating systems as well as access via a web interface.

Separation of sensitive and public data

To give employees data access via mobile and web applications, further security mechanisms such as DMZs (demilitarized zones) and granular, file-level access controls are necessary. A cloud storage provider should offer functions to separate data with higher security requirements from public data. Companies that want to serve the data from their own infrastructure need to invest in additional security systems or find a vendor who has integrated this type of security.

Connection to external cloud services

A cloud storage service can be used as a common and consistent data foundation for various cloud services, connecting offerings such as software-as-a-service (SaaS) or platform-as-a-service (PaaS). The cloud storage then serves as the central repository. For this purpose the vendor needs to provide an open API to realize the connectivity.
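
A minimal sketch of what such connectivity could look like from the consuming service’s side, assuming a generic REST API; the endpoint, path and token below are hypothetical and will differ for every real vendor:

```python
# Minimal sketch: an external SaaS application pushes a document into
# the central cloud storage over an open (hypothetical) REST API.
import requests

API_BASE = "https://storage.example.com/api/v1"  # placeholder endpoint
TOKEN = "replace-with-a-real-access-token"       # placeholder credential

def upload_document(local_path: str, remote_name: str) -> None:
    with open(local_path, "rb") as f:
        response = requests.put(
            f"{API_BASE}/files/{remote_name}",
            data=f,
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        )
    response.raise_for_status()  # fail loudly if the API rejects the upload

upload_document("report.pdf", "shared/report.pdf")
```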

Cloud storage – Ecosystem and partner network

Especially for storage vendors who exclusively offer cloud solutions, a large ecosystem of applications and services is attractive and important in order to extend the storage service with further value-added functions. This includes, for example, an external word processor so that documents in the storage can be edited together with multiple colleagues.

Size of the vendor – national and international

The track record is the most important evidence of past success, giving an indication of the vendor’s standing based on well-known customers and successful projects. This aspect can be considered for a national as well as an international footprint. Besides the available capacity, and therefore the size of its technology base, a cloud storage vendor’s international scope is also of vital importance. If a company wants to enable its employees worldwide to access a central cloud storage but chooses a vendor that only has data centers in the US or Europe, latency alone can lead to problems. In this respect, scalability in terms of storage size as well as geographical reach are crucial criteria.

In addition, it is worth looking at the vendor’s roadmap: what kinds of changes and enhancements are planned for the future? Are these enhancements interesting for the customer compared with a potential competitor that does not offer them?

Financial background

A good track record is not the only criterion when choosing a vendor. Not least, the drama around the collapsed storage vendor Nirvanix has shown that the financial background must be considered as well. Especially during the risk assessment, a company should take a look at the vendor’s current financial situation.

Location and place of jurisdiction

The location where company data is stored is becoming more and more important. The demand for physical storage of the data in one’s own country is rising steadily. This is not a German phenomenon: the French, Spanish and Portuguese also expect their data to be stored in a data center in their own country. (http://research.gigaom.com/report/the-state-of-europes-homegrown-cloud-market/) The Czechs prefer a data center in Austria over one in Germany, while the Dutch are more relaxed on this topic. That said, local storage of the data is basically no guarantee of legal compliance. However, it does make it easier to apply local law.

Most US vendors cannot fulfill physical locality of the data in every European country. Their data centers are located either in Dublin (Ireland) or Amsterdam (Netherlands) and comply only with European law. Many vendors have joined Safe Harbor, which allows them to legally transfer personal data into the US. However, this is merely a self-certification which, in the wake of the NSA scandal, has been challenged by the Independent Regional Centre for Data Protection of Schleswig-Holstein (Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein, ULD).

Cloud storage – Security

When it comes to security, it is mostly about trust, and a vendor can only earn trust through openness: he needs to show his customers his hand, technologically. IT vendors in particular are often criticized when it comes to talking about their proprietary security protocols, and mostly the criticism is justified. But there are also vendors who willingly talk about it, and these are the companies to look for. Besides the subjective topic of trust, the implemented security plays a leading role. Here it is important to look at the up-to-date encryption mechanisms a vendor is using. These include: Advanced Encryption Standard (AES-256) for encrypting the data, and Diffie-Hellman and RSA 3072 for the key exchange.
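
To make the AES-256 part concrete, here is a minimal sketch, assuming Python with the widely used cryptography package; key storage and exchange (e.g. via Diffie-Hellman or RSA 3072) are deliberately left out:

```python
# Minimal sketch: encrypt a file with AES-256 in GCM mode on the client
# before it is uploaded. Key management is out of scope here.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(path: str, key: bytes) -> bytes:
    """Return nonce + ciphertext for the file at `path`."""
    with open(path, "rb") as f:
        plaintext = f.read()
    nonce = os.urandom(12)                 # 96-bit nonce, as recommended for GCM
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext

key = AESGCM.generate_key(bit_length=256)  # 32-byte AES-256 key
blob = encrypt_file("contract.pdf", key)   # this blob is what gets uploaded
```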

The importance of end-to-end encryption of the entire communication is also increasing. This means that the whole process a user goes through with the solution, from start to finish, is completely encrypted. Among other things, this includes: user registration, login, data transfer (send/receive), the transfer of the key pairs (public/private key), the storage location on the server, the storage location on the local device, as well as the session while a document is being edited. In this context, separate tools that try to encrypt an insecure storage service are to be advised against. Security and encryption are not a feature but a core function, and they belong to the storage vendor’s field of responsibility: he has to ensure highly integrated security and good usability at the same time.

In this context it is also important that the private key for accessing the data and systems is exclusively in the hands of the user, and that it is stored encrypted on the user’s local system. The vendor should have no capability to restore this private key and should never be able to access the stored data. Note: there are cloud storage vendors that are able to restore the private key and are thus also able to access the user’s data.
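
As an illustration of this principle, here is a minimal sketch, again assuming the Python cryptography package; the passphrase handling is simplified and the file name is a placeholder:

```python
# Minimal sketch: generate an RSA key pair locally and store the private
# key encrypted with a passphrase only the user knows; the provider only
# ever receives the public key and cannot restore the private one.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)

encrypted_pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.BestAvailableEncryption(b"user-passphrase"),
)
with open("storage_key.pem", "wb") as f:   # stays on the local device
    f.write(encrypted_pem)

public_pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)                                          # only this part is shared with the provider
```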

Certification for the cloud

Certifications are a further indicator of the quality of storage vendors. Besides standards such as ISO 27001, with which the security of information and IT environments is assessed, there are also national and international certificates issued by accredited certification bodies.

These independent and professional certificates are necessary to get an honest statement about the quality and characteristics of a cloud service, the vendor and all downstream processes such as security, infrastructure, availability, etc. Depending on how good the process and the auditor are, a certification can also lead to an improvement of the product, when the auditor proactively gives advice on security and further functionality.

Categories
Analysis

Criteria for selecting a cloud storage provider

In most cases, the search for a cloud storage provider depends on the individual requirements the service is supposed to fulfill. Decision makers need to discuss and define these beforehand. This includes, above all, classifying the data, i.e. defining which data is stored in the cloud and which in the company’s own on-premise infrastructure.

This article was published exclusively in the IDG network and can be read at “Wenn Dropbox die falsche Lösung ist: Cloud Storage – wie Sie den richtigen Anbieter finden”.