Thursday, August 20, 2009

Limitation of Public Cloud
• Service availability: Enterprises are very sensitive to whether a cloud provider can guarantee the availability their business requires, especially since they have little or no control over the physical cloud environment. Moreover, relying exclusively on a single cloud provider is itself a single point of failure, and most enterprises are reluctant to move to a cloud provider without a business continuity strategy in place. Using multiple cloud providers with independent software stacks can deliver high availability and remove that single point of failure, but it increases implementation complexity significantly, as the failover sketch below suggests.
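
A minimal sketch of that multi-provider failover idea, in Python. The provider names and the call_provider() helper are assumptions for illustration; a real deployment would wrap each vendor's own SDK, which is exactly the extra complexity the bullet warns about.

import random

PROVIDERS = ["provider-a", "provider-b", "provider-c"]

class ProviderUnavailable(Exception):
    pass

def call_provider(name, request):
    # Stand-in for a real API call; randomly fails to simulate an outage.
    if random.random() < 0.2:
        raise ProviderUnavailable(name)
    return f"{name} handled {request}"

def resilient_call(request):
    # Try each independent provider in turn so no single vendor is a
    # single point of failure.
    last_error = None
    for name in PROVIDERS:
        try:
            return call_provider(name, request)
        except ProviderUnavailable as err:
            last_error = err  # fall through to the next provider
    raise RuntimeError("all providers unavailable") from last_error

print(resilient_call("GET /orders/42"))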

• Service lock-in (proprietary APIs → no interoperability): Cloud providers today lack interoperability standards, so a cloud user cannot easily move applications and data between vendors, leading to a lock-in scenario. Lock-in is clearly advantageous to cloud providers, but it leaves cloud users vulnerable to price increases, reliability problems or, in the worst case, the provider becoming defunct. Standardizing APIs would not only mitigate lock-in but also allow the same software infrastructure to be used in private and public clouds, so that excess computation that cannot be handled in the private cloud could be off-loaded to the public cloud ("surge computing"); see the interface sketch below.
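
One way to limit lock-in at the application level is a thin storage interface that hides the vendor behind a common contract. This is a hypothetical sketch, not a real standard: the BlobStore class and both implementations are illustrative, and in-memory dicts stand in for vendor SDK calls so the example stays self-contained.

from abc import ABC, abstractmethod

class BlobStore(ABC):
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class PrivateCloudStore(BlobStore):
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

class PublicCloudStore(BlobStore):
    # In practice this would wrap a vendor SDK.
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

def archive(report: bytes, store: BlobStore) -> None:
    # The caller never names a vendor, so moving between providers,
    # or surging from private to public cloud, is a one-line change.
    store.put("reports/latest", report)

archive(b"q3 figures", PrivateCloudStore())  # normal load
archive(b"q3 figures", PublicCloudStore())   # overflow / surge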

• Data confidentiality and auditability: Security and auditability are significant concerns for most enterprises given the public nature of cloud offerings. Regulatory requirements such as HIPAA or Sarbanes-Oxley necessitate auditability of corporate data held in the cloud. In addition, some nations' laws may mandate that cloud providers keep data within national boundaries, or prevent a second country from gaining access to data via its court system. There is also a significant risk to the privacy of personal information, as well as to the confidentiality of business or government information, when data is located in the cloud. One common mitigation is to encrypt data before it leaves the enterprise, as sketched below.
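
A minimal sketch of client-side encryption using the symmetric Fernet scheme from the widely used Python cryptography package; the upload() helper and the record contents are assumptions for illustration. The point is that the provider only ever stores ciphertext, while key management stays on-premise.

from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()   # kept in an on-premise key store
cipher = Fernet(key)

def upload(key_name, blob):
    # Stand-in for a real cloud storage call.
    print(f"uploading {len(blob)} bytes as {key_name}")

record = b"patient-1234: diagnosis ..."
ciphertext = cipher.encrypt(record)  # provider never sees plaintext
upload("records/patient-1234", ciphertext)

# Later, back inside the enterprise boundary:
assert cipher.decrypt(ciphertext) == record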

• Data transfer bottleneck and cost (technology trends): Given the data-intensive nature of applications, transferring data into and out of the cloud becomes a major issue at current rates of $100 to $150 per terabyte transferred. This cost can quickly become prohibitive, making data transfer a major bottleneck to cloud adoption. The challenge is significant because, over the past decade, the cost of wide-area network bandwidth has fallen much more slowly than the cost of computation and storage capacity; a back-of-the-envelope calculation follows below.
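
To make the bottleneck concrete, here is the arithmetic for a modest 10 TB dataset, using the per-terabyte fee quoted above; the sustained 20 Mbit/s WAN throughput is an assumption for illustration.

terabytes = 10
cost_per_tb = 150   # USD, upper end of the quoted range
link_mbps = 20      # assumed sustained WAN throughput

transfer_cost = terabytes * cost_per_tb
bits = terabytes * 8 * 10**12
days = bits / (link_mbps * 10**6) / 86_400

print(f"fee: ${transfer_cost}, time: {days:.0f} days")
# -> fee: $1500, time: 46 days
# Both the bill and the wait make bulk transfer a serious obstacle.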

• Performance unpredictability: Multiple virtual machines (VMs) can share CPU and main memory quite well, but I/O sharing often leads to interference between VMs and hence unpredictable performance, as the micro-benchmark sketch below illustrates.
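
A minimal sketch of how one might observe that unpredictability: time repeated disk writes and look at the spread. On a host whose I/O path is shared with other VMs, the variance is typically much wider than for CPU-bound work; the payload size and repeat count are arbitrary choices.

import os
import statistics
import tempfile
import time

samples = []
payload = os.urandom(4 * 1024 * 1024)  # 4 MiB per write

for _ in range(20):
    start = time.perf_counter()
    with tempfile.NamedTemporaryFile() as f:
        f.write(payload)
        f.flush()
        os.fsync(f.fileno())  # force the write through to disk
    samples.append(time.perf_counter() - start)

print(f"mean {statistics.mean(samples):.4f}s, "
      f"stdev {statistics.stdev(samples):.4f}s")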

• Difficulty in debugging large-scale distributed systems: When an application is migrated to the cloud and executed in a large-scale distributed environment, bugs may manifest that cannot be reproduced in a small-scale configuration. Detecting and debugging faults in such large-scale distributed deployments is quite a challenge; tagging logs with a request-scoped correlation id, as sketched below, is one common aid.
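
A minimal sketch of that correlation-id idea, under the assumption that every node logs the same id; handle() and the node names are illustrative, not any particular system's API.

import logging
import uuid

logging.basicConfig(format="%(message)s", level=logging.INFO)
log = logging.getLogger("svc")

def handle(request, corr_id=None):
    corr_id = corr_id or uuid.uuid4().hex[:8]
    log.info("[%s] frontend accepted %r", corr_id, request)
    # The same id travels with the request to each downstream node,
    # so logs from hundreds of machines can be joined after the fact.
    log.info("[%s] worker-17 processed %r", corr_id, request)

handle("GET /report")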

• Software licensing: The licensing model for commercial software is a mismatch for cloud computing. Current licenses restrict the computers on which the software can be installed, and users pay an up-front fee plus annual maintenance charges. Many cloud providers have therefore relied on open source software; the key challenge for commercial software vendors is to devise a licensing model better suited to the cloud.
