• Handful Of Monopoly Infrastructure Players – Not So Fast


    This is the second post on a topic I have been emphasizing in many different forums. My earlier post, Handful Of Monopoly Infrastructure Players – A Shortsighted Idea, laid out the philosophical and economic reasoning against the idea that a handful of infrastructure providers will come to dominate the market. This idea is a pet theme for many cloud pundits. As I have argued many times in the past, these pundits either fail to understand the diversity in this world or try to ignore it completely. In this post, I am going to cite a recent news story and argue that such a consolidation cannot happen anytime in the near future.

    A little over ten days ago, Google shocked the world with an announcement that they are rethinking their Chinese operations. They cited an apparent attack on their infrastructure originating from the Chinese government. The story organically grew into a broader discussion of cyber-warfare and its consequences.

    Like many other well-known organizations, we face cyber attacks of varying degrees on a regular basis. In mid-December, we detected a highly sophisticated and targeted attack on our corporate infrastructure originating from China that resulted in the theft of intellectual property from Google. However, it soon became clear that what at first appeared to be solely a security incident–albeit a significant one–was something quite different.

    These attacks and the surveillance they have uncovered–combined with the attempts over the past year to further limit free speech on the web–have led us to conclude that we should review the feasibility of our business operations in China. We have decided we are no longer willing to continue censoring our results on Google.cn, and so over the next few weeks we will be discussing with the Chinese government the basis on which we could operate an unfiltered search engine within the law, if at all. We recognize that this may well mean having to shut down Google.cn, and potentially our offices in China.

    This is a perfect example of the current world order and should make us ask whether such a consolidation of infrastructure players can even happen. Whether we like it or not, such attacks by government agencies are bound to become common in the near future. It need not be just China; it could be the US trying to take down so-called "enemy states", India and Pakistan fighting it out on the internet, or the Israeli-Palestinian conflict escalating onto the net. With the internet gaining more and more importance in our daily lives as well as in enterprise and government operations, cyber-warfare is becoming a realistic possibility. With so much at stake, governments are going to do everything they can to protect their interests and their citizens' interests. This translates into laws imposed by governments on the cloud infrastructure industry. Such fears, along with other concerns around law enforcement, will definitely lead to restrictions that could require building infrastructure inside national borders.

    No, I am not cooking up this scenario. Recently, the Indian government was considering regulations that would force all businesses in India to store their data on servers inside the country, to prevent tax evasion and other types of fraud.

    The concept, known as cloud computing, allows a customer to use distant servers to store and manage data. The service is cost-effective and increasingly becoming popular. However, the Finance Minister has formed a high-level committee to study the Information Technology Act (IT) and suggest amendments that will make it compulsory for firms and individuals to maintain mirror servers in India.

    Even though this is not entirely tied to the emergence of cloud computing, as the article implies (it could have happened in the traditional hosting world too), the movement to the cloud has the potential to amplify such concerns. It is quite natural for governments to impose inward-looking regulations to protect their turf. We should also note that many governments engage in protectionism when it comes to foreign companies doing business on their soil. These factors, along with the ones I highlighted in the previous post, will make sure that new cloud infrastructure players spring up in different parts of the world, denying any possibility of the consolidation fantasized by some cloud pundits. Their fantasies can come true only if country borders vanish into thin air. The last time I checked, that is not happening anytime soon. What do you think?
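
    To make the mirror-server idea concrete: under such a rule, a business using a foreign cloud would have to keep an up-to-date replica of its data on servers physically located in India. Here is a minimal sketch of what that might look like; the hostnames and paths are hypothetical, and a real deployment would more likely rely on the replication features of its database or storage product than on a cron-driven script like this one.

    ```python
    import subprocess
    import sys

    # Hypothetical in-country mirror, as the proposed amendment to the
    # Indian IT Act would require. Hostname and paths are placeholders.
    SOURCE_DIR = "/var/data/customer-records/"
    MIRROR_HOST = "mirror.example.co.in"
    MIRROR_DIR = "/var/data/customer-records/"

    def sync_to_domestic_mirror():
        """Push the latest data to a server located inside the country."""
        result = subprocess.run(
            ["rsync",
             "-az",        # archive mode, compressed transfer
             "--delete",   # keep the mirror an exact replica
             SOURCE_DIR, f"{MIRROR_HOST}:{MIRROR_DIR}"],
            capture_output=True, text=True,
        )
        if result.returncode != 0:
            print(result.stderr, file=sys.stderr)
            raise RuntimeError("mirror sync failed")

    if __name__ == "__main__":
        sync_to_domestic_mirror()
    ```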


  • Gartner Talks About The Role Of Cloud Computing In IT Organizations In The Coming Years


    The analyst firm Gartner has made some predictions about how IT organizations will shape up over the next few years of this decade. Some of these predictions concern how Cloud Computing will transform IT, and I will highlight them in this post.

    One of the most important predictions by Gartner is that 20% of businesses (that is, one in every five) will have absolutely no IT assets. Gartner cites Cloud Computing as one of the important factors leading to this zero-IT-asset scenario.

    By 2012, 20 percent of businesses will own no IT assets. Several interrelated trends are driving the movement toward decreased IT hardware assets, such as virtualization, cloud-enabled services, and employees running personal desktops and notebook systems on corporate networks.

    In fact, this is a bold prediction considering how traditional IT vendors and companies interested in pushing private cloud offerings are dismissing the proliferation of public clouds. This prediction confirms what many of us have been saying already: cloud computing is not just here to stay, it is going to transform how we do business.

    Another interesting prediction from Gartner is related to how the Indian outsourcing industry is going to embrace cloud computing.

    By 2012, India-centric IT services companies will represent 20 percent of the leading cloud aggregators in the market (through cloud service offerings). Gartner is seeing India-centric IT services companies leveraging established market positions and levels of trust to explore nonlinear revenue growth models (which are not directly correlated to labor-based growth) and working on interesting research and development (R&D) efforts, especially in the area of cloud computing. The collective work from India-centric vendors represents an important segment of the market’s cloud aggregators, which will offer cloud-enabled outsourcing options (also known as cloud services).

    This reminds me of a prediction by Mike West of Saugatuck Technology at last year's Gluecon keynote and my apprehension about it.

    The day ended with a keynote by Mike West, an analyst from Saugatuck Technology. He gave an overview of where the market is going and the issues that matter. He noted that India Inc. is readying itself to offer IT as a Service in the future. This got me on my feet and I disputed the notion. My argument was based on the fact that enterprises do not yet trust cloud providers inside the US, like Amazon.com, so there is no way they will trust companies in foreign countries with their IT. Plus, regulatory issues will definitely prevent such a move. He tried to dismiss my concerns by arguing that his projection was about something further down the road and that these companies could use infrastructure providers in the US to take care of those issues. I am still not convinced, and I would love to have a discussion with him on the topic if an opportunity presents itself.

    I still feel that regulatory issues, and the security concerns that come with ceding control of important data to many different providers (cloud vendors and Indian outsourcing vendors), will make such large-scale adoption difficult. Right now, many of the Indian outsourcing vendors are engaged in cloud-washing their services. Nevertheless, it is important to keep a watch on this segment of the industry.


  • Cloud Computing's Electricity Metaphor Has Outlived Its Usefulness?


    Nick Carr, in his book The Big Switch, used the electricity analogy to explain the nature of Cloud Computing. Initially, this comparison helped people get enthusiastic about cloud computing by connecting the idea with that of electricity generation. However, I think the concept has outlived its utility and we need to go beyond this simplistic model. Recently, I had a Twitter discussion with Randy Bias of CloudScaling on the topic. He didn't like the idea of using the term outsourcing to describe Cloud Computing. I thought there was a need for the term because, unlike with electricity, we cede a lot of control to the cloud provider. However, Randy disagreed and stood by the electricity comparison.

    @krishnan Whoa. Disagree. You give up control of power generation. You become a self-service consumer. *exactly* like cloud.

    James Urquhart has already written an eloquent post on the topic. But, at the risk of appearing repetitive, I want to address it here because the 140-character limit on Twitter is too restrictive for such discussions.

    Let us compare cloud computing with the electricity model and see where the two resemble each other.

    • Electrical power generation and delivery through the electrical grid are similar to the cloud provider taking care of compute power and delivering it through the internet.
    • A pay-as-you-go pricing model.
    • Enormous cost savings that accompany large-scale, centralized power/compute generation and delivery.

    In a way, the comparison ends there. As James clearly highlighted in his post, when you put the all-important data into the mix, everything changes. There is no parallel to data in the electricity model, and the various issues surrounding data clearly limit the usefulness of the comparison. Even though we let go of control over the computing infrastructure much like we let go of control over power generation, the presence of data inserts an additional outsourcing component into the definition of cloud computing. This is due to the various risk factors and the security and regulatory issues that come up when we throw data into the mix. There is no analog in the electricity industry that matches the risk introduced by transferring control of data in the case of cloud computing.

    However, this doesn't mean the electricity comparison is invalid. On the contrary, it is still the most attractive way to introduce cloud computing. It also doesn't mean that we should keep away from cloud computing. Rather, we should embrace cloud computing for its electricity-like benefits while considering the various issues surrounding data as we plan for cloud adoption. It is important to realize that there is an outsourcing component involved here, and we need to apply due diligence during the planning stages, just as with any other outsourcing process. The electricity model is a good starting point for understanding the advantages of cloud computing, but we need to go far beyond it when we implement. Feel free to jump in with your take on this topic.


  • i365 Releases EVault Cloud-Connected Services Platform


    i365, the Seagate company offering data protection, backup, and recovery services, has taken the next step in pursuing its vision for cloud computing by releasing the EVault Cloud-Connected Services Platform. i365 is one of the largest cloud service providers in the mid-market segment; it was formed by Seagate's acquisition of EVault and a few other companies. They have approximately 22,000 customers, and their focus is on cloud storage software, SaaS, managed services, and appliances. I have been following them for the past few months, and they seem to be executing a cloud strategy that could cement their position in the mid-market, from SMBs to SMEs. i365, short for "Information, 365 days a year", is clearly focused on this narrow mid-market region, with an aim to create cloud-based storage solutions for its needs. They claim to be the godfather of the cloud storage segment.

    Selling both through channels and directly, their vision of cloud storage is a hybrid one. They want to blur the lines between on-premise and cloud-based storage, and they do this by adding value at the edge device. Their cloud backup process works as follows: the backup software backs up data into a primary vault inside the corporate datacenter; the data is then reduced (for example, deduplicated or compressed) prior to replication; and the primary vault then replicates the data to a passive vault inside the i365 cloud. The same backup history is kept in both vaults, so users can perform disaster recovery from either the primary vault or the vault inside the i365 cloud.
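
    To make that flow concrete, here is a toy sketch of the reduce-then-replicate pattern described above. This is my illustration, not i365's actual implementation; plain compression stands in for whatever data-reduction techniques (such as deduplication) EVault really uses.

    ```python
    import hashlib
    import zlib

    class Vault:
        """A toy vault that keeps an ordered history of reduced backups."""
        def __init__(self, name):
            self.name = name
            self.history = []  # list of (backup_id, reduced_blob) tuples

        def store(self, backup_id, blob):
            self.history.append((backup_id, blob))

    def reduce_data(raw: bytes) -> bytes:
        # Stand-in for EVault's data reduction step.
        return zlib.compress(raw)

    def backup(raw: bytes, primary: Vault, cloud: Vault) -> str:
        backup_id = hashlib.sha256(raw).hexdigest()[:12]
        reduced = reduce_data(raw)         # reduce before replication
        primary.store(backup_id, reduced)  # active vault, on-premise
        cloud.store(backup_id, reduced)    # passive vault in the i365 cloud
        return backup_id

    primary = Vault("corporate-datacenter")
    cloud = Vault("i365-cloud")
    backup(b"payroll records, Q1", primary, cloud)
    # Both vaults hold the same backup history, so disaster recovery
    # can proceed from either one.
    assert primary.history == cloud.history
    ```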

    i365 is taking a smart route in the cloud game. Instead of insisting on the use of their own software, they have designed their cloud to be application agnostic, allowing clients to use the software they are already comfortable with. In November 2009, they partnered with Microsoft on the development of a heterogeneous solution that will allow IT managers to extend Microsoft System Center Data Protection Manager (DPM) 2010 across non-Microsoft platforms and into the cloud, using i365's EVault data protection software and cloud-connected storage infrastructure. This is a smart move on the part of i365 to tap into the existing base of Microsoft Data Protection Manager clients.

    As part of this cloud-connected storage solutions vision, i365 announced the Cloud-Connected Services Platform on Monday. In short, it allows Independent Software Vendors (ISVs) to SaaSify their existing enterprise applications by tapping into the i365 cloud. The EVault Cloud-Connected Services Platform extends the i365 Cloud beyond EVault Software and allows ISVs to use i365's technology, Cloud storage, and SaaS infrastructure for their applications. It is the latest offering to support i365's Cloud-Connected storage solutions vision, which is focused on helping midmarket organizations manage their storage in an integrated on-premise, edge, and Cloud environment.

    Essentially, the Cloud-Connected Services Platform consists of the following (a rough sketch of the cloud interface follows the list):

    • A cloud interface that integrates an application with the i365 cloud, so that business functions such as account provisioning, metering, and billing are handled through RESTful web services.
    • A service connector that resides on-premises, caching data and efficiently sending it outside the firewall via a secure network connection to the i365 Cloud.
    • i365 cloud storage with SAS 70 Type II or ISO 9001:2000 certification.
    • SaaS business and support systems that can accommodate a variety of go-to-market strategies, including different pricing and billing models; account and contract management; and levels of customer service and support.
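
    i365 has not published the API details in this announcement, so the following is only a hedged sketch of what provisioning an account through such a RESTful interface might look like; the endpoint, payload fields, and authentication scheme are all invented for illustration.

    ```python
    import json
    import urllib.request

    # Hypothetical endpoint and token; i365's real REST API may differ.
    API_BASE = "https://api.i365.example.com/v1"
    API_TOKEN = "secret-isv-token"

    def provision_account(isv_id: str, plan: str, quota_gb: int) -> dict:
        """Ask the platform to create a storage account for an ISV customer."""
        payload = json.dumps({
            "isv_id": isv_id,
            "plan": plan,        # would drive metering and billing downstream
            "quota_gb": quota_gb,
        }).encode("utf-8")
        request = urllib.request.Request(
            f"{API_BASE}/accounts",
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {API_TOKEN}",
            },
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return json.load(response)

    # Example call (would fail here, since the endpoint is fictitious):
    # account = provision_account("acme-backup", "midmarket", 500)
    ```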

    Well, this doesn't lead to enterprise SaaS applications per se, but it enables ISVs to store data in the cloud and leverage enormous cost savings along with other benefits. When I spoke to Terry Cunningham, Senior Vice President, in November, he was very enthusiastic about their cloud plans and told me to expect more announcements this year.


  • Dissecting Google's Nexus One Strategy


    Two days ago, Google announced the release of their Android 2.1-based mobile phone, the Nexus One, with much fanfare. In spite of the fact that CES is just around the corner, the announcement made the tech blogosphere and mainstream media go wild. The reactions from the pundits ranged from "why is Google getting into the handset business" to "why is Google not giving away the phone for free". In this post, I am going to intentionally approach Google's strategy from a completely different angle, even at the risk of being dismissed as a conspiracy theorist. It is my argument that these pundits do not understand Google's strategy at all. Let me explain why.

    In 2008, when I wrote a post about Google, I highlighted two important points about them. 

    There are two types of companies in the world, companies with products and companies with a vision. Google belongs to the latter category and there are only a handful of companies in that category, with Microsoft being one of them. Bill Gates had a clear vision for his company: he wanted to put a PC on every desk. This vision drove Microsoft’s operating system products and they did manage to put a PC on every other desk (well, not literally). Similarly, Google also started with a vision, to organize all of the world’s information.

    and

    The case with Google is different. They are still focused on their initial mission to organize all of the world’s information. Everything they do, whether it is Google Apps or Google Books or their Chrome browser or even their investment on research related to alternative energy sources, is linked to their original vision. They just want more and more information on their servers. That is the bottom line and that is why Google still considers Search to be their core business. 

    This will be the basis of my argument in this post too. Google is still in the business of organizing all of the world's information. However, to succeed with that vision, they need to take care of many factors. Some of them include:

    • The application that handles all the information: the browser
    • An OS that controls the complete user experience
    • The last-mile network to the user's home
    • The internet itself
    • A mobile OS
    • The mobile network

    and so on. Even though there are other factors, the ones mentioned above are the relevant ones for my argument. Let me now try to explain how these factors shape Google's strategies for achieving their goal, and then point out how the Nexus One strategy is one among them.

    When Google started off on their mission to organize all of the world's information, their first concern was the web browser. With the complete dominance of Internet Explorer, Google was fully aware that it would be very easy for Microsoft to bump them out of the marketplace. They supported Mozilla in its quest to break the dominance of IE, but Firefox is an open source browser and Google cannot exert much pressure on Mozilla or make them dance to their tunes. So they built their own browser to gain better control over the user experience. They released it under an open source license to keep their "do no evil" mantra intact and also to keep the regulators at bay.

    As long as Microsoft controls the consumer and enterprise desktop market, there is always a danger of Microsoft continuing its dominance over productivity applications; it can easily offer a better user experience with its S+S (Software plus Services) strategy. If Google is to reach its goal of organizing all of the world's information, it has to get the data of every single user onto its servers. The best way to do that, and also to change users' traditional desktop mindset, is by taking control of the operating system itself. It is no easy task to break Microsoft's backbone in the OS market, but it is important to loosen Microsoft's hold on the user experience side. With Chrome OS, Google is now going after the consumer market.

    Thankfully for Google, AOL's walled-garden approach died before Google was even launched, and the world wide web was neutral and based on open standards. This made Google's task easy: they didn't have to worry about ISPs controlling the last mile to users' computers. This is probably why Google never bothered about the last-mile ISP marketplace or the internet plumbing itself. Google knew that the internet had to remain neutral, without a class system, for their vision to succeed; hence their support for net neutrality initiatives.

    As more and more users rely on mobile phones for web access, it became crucial for Google to get a foothold here. Initially, they relied on Apple to reach consumers, but I am sure that was never their long-term strategy. With Apple's thirst for maniacal control over its platform and its own plans for a cloudy future, it was just a matter of time before Google found its own way. Google could not rely on competing proprietary mobile operating systems for its strategy; the best option was an open platform that none of its competitors could control. Android was the result of this thinking and of the need to control the user experience in the mobile space. Android is already on ebook readers and other entertainment devices, which gives Google an opportunity to reach the entertainment market controlled by Apple, Microsoft, Sony, and others in its quest to organize the world's information.

    One of the biggest obstructions to Google's march toward its stated vision is the control exerted by the mobile network operators. Even though Google was lucky enough not to face an AOL-style walled garden on the web, it is different with the mobile operators, especially in the United States. The monopoly-like power exerted by these operators is one of the reasons for the slow adoption of the mobile web in the US compared to some other countries. Google has been at the forefront of efforts to thwart the hold mobile operators have on their users. By "participating" in the 700 MHz spectrum auction, Google forced the FCC's hand on imposing open-access rules. A couple of days ago, Google asked the FCC to designate it as one of potentially several administrators of a white-spaces geolocation database.

    However, these efforts may only yield slow change, and the lack of speed could be damaging to Google in the long run. One way to accelerate the opening up of the US mobile network space is to remove the operators' hold on their customers' mobile web experience. That involves ensuring access for open handset devices and educating US customers about the advantages of operator-neutral open devices. Such actions will loosen the hold mobile operators have on their customers, and once that hold is loosened, Google can easily gain better control over mobile platforms. In my opinion, the Nexus One is an attempt by Google to warn the telecom operators to open up their devices. If the operators refuse to budge, Google could then nudge them out by reaching users directly. If they fall in line, Google will let them continue with their operations and just focus on organizing all of the world's information. This, probably, is why Google has not subsidized the phone; they may end up doing so if it becomes a necessity.

    As I pointed out in the post mentioned above,

    Google takes an entirely different approach to putting their products into mainstream use. They don’t compete with other products head on but, rather, slowly change the consumer behavior towards Google products.

    This is how I see the release of the Nexus One too. Google doesn't want to compete with other companies offering handsets. Rather, they want to change the mindset of consumers toward having an open handset that will work with any network in any country. Once users get accustomed to this philosophy, it will be a cakewalk for Google to dominate the mobile web on their march toward achieving their vision of organizing all of the world's information. What do you think?


  • MySQL, Oracle And Cloud Computing


    Ever since Oracle announced the acquisition of Sun Microsystems, and with it MySQL, all hell has broken loose in the open source community. With the EU questioning the deal, a war (of words) has erupted inside the community, with one side asking the EU to block the deal or, at the very least, force a change from the GPL to another open source license, and the other side urging the EU to let the transaction go through. Even though I have no love for Oracle, I think it is time to let the deal go through, at least for the sake of the Sun employees who are sitting there with their future unknown. At the same time, I am not unduly worried about the future of MySQL, because I have complete confidence in MySQL's open source license. Let me try to explain my position in this post.

    For the sake of argument, let us consider the hypothetical scenario of Oracle killing off MySQL. This leaves us with the sole option of going with one or more of the MySQL forks. Such a scenario is perfectly fine for most of us, except for a handful of people who are either packaging MySQL into proprietary software or keen on building a business around a dual-licensing scheme like the one MySQL uses now. If you look at how MySQL is used in the world, it is easy to see that only a small percentage of users would be affected by Oracle killing MySQL. Now, if you factor in the odds of Oracle actually resorting to such an action, and the associated PR impact, you are left with only 2-3 people getting affected: the ones who are planning to run a business using a dual-licensing scheme on a MySQL fork. I can assure you that these 2-3 people are quite capable of taking care of themselves, and we need not waste EU taxpayers' money and our valuable time fighting for them. I want to emphasize once again that the only business model that would be affected by Oracle killing MySQL is the dual-licensing scheme. Everything else, including the hugely successful support services model, would continue to thrive.

    Now I am going to address the "Cloud Computing" part of the title. Richard Stallman and some other free software evangelists dismiss Cloud Computing as an attempt to push vendor lock-in through the backdoor. Others, like Tim O'Reilly, advocate the line that we should not worry about licensing and should instead focus on ensuring open architectures and standards. On the other hand, I have argued many times in this blog for the importance of open source from a moral and strategic point of view. However, as noted by industry observers including Matt Asay and Index Ventures general partner Bernard Dallé, cloud computing may turn out to be the best way to monetize open source software in the coming decade.

    If you are wondering how this argument fits into the MySQL story, I would argue that cloud computing will ensure that MySQL (or its forks) will continue to live and serve the needs of users as before. Here is my line of argument; feel free to poke holes in it (it will help me refine my understanding of the open source marketplace). Big cloud infrastructure players are already offering cloud-based services that rely on MySQL. Amazon is offering MySQL-like capabilities under the name Amazon RDS (note the absence of the term MySQL in the name), Joyent is offering a MySQL accelerator, and Rackspace is partnering with FathomDB to offer its own MySQL-based service. Of these three examples, Amazon and Rackspace jumped into the game after the announcement of the Oracle-Sun deal. These players are not there to squeeze out as much revenue as possible before MySQL goes under; they are in it for the long haul, which implies that they will fork MySQL or support one of the existing forks. Let us not forget that Rackspace has committed resources to the Cassandra project because it plans to offer an open cloud alternative to Amazon's SimpleDB and Google's datastore. I am pretty sure Rackspace will spend some of its resources on a MySQL fork to keep the project going. There is a good chance that Amazon might also support such a project, even though it has the ability to absorb many of the current MySQL employees and keep the development for in-house usage (thanks to the GPL's still-existing SaaS loophole). In fact, these cloud vendors may even end up supporting a neutral MySQL fork and then use their in-house expertise to differentiate themselves from competitors. With more and more cloud vendors offering some form of service based on MySQL, the longevity of MySQL increases. These vendors, in their own self-interest, will ensure that a fork of MySQL lives for as long as people use relational databases.
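
    To see why ownership of the MySQL trademark matters so little to users, consider that services like Amazon RDS speak the MySQL wire protocol, so an application written against an ordinary MySQL client library runs unchanged whether it points at community MySQL, a fork, or a cloud service. A minimal sketch, with a placeholder endpoint and credentials:

    ```python
    import mysql.connector  # any client speaking the MySQL wire protocol

    # Placeholder endpoint: an RDS instance, a MySQL fork, or a plain
    # MySQL server would all be addressed exactly the same way.
    conn = mysql.connector.connect(
        host="mydb.example.rds.amazonaws.com",
        user="app_user",
        password="app_password",
        database="inventory",
    )

    cursor = conn.cursor()
    cursor.execute("SELECT sku, quantity FROM stock WHERE quantity < %s", (10,))
    for sku, quantity in cursor.fetchall():
        print(f"low stock: {sku} ({quantity} left)")

    cursor.close()
    conn.close()
    ```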

    I think it is time for some sanity to prevail in the community; let Oracle absorb Sun and MySQL. The very nature of open source will ensure that users are never left in the lurch. MySQL, and any other open source software absorbed by proprietary vendors in the future, will survive irrespective of what the new owner does with the software it buys. Along with other factors, cloud computing will help them survive.

