• CloudCamp Sydney – The Same Old Memes

     

    Two weeks ago the Australasian CloudCamp tour hit Sydney for an event at the National Innovation Centre of the Australian Technology Park. A crowd of around 90 people turned up for some lively discussions – as always there were some common themes: the legal and jurisdictional aspects of cloud computing, and the public/private cloud debate. For the former we were lucky to have Alec Christie from law firm DLA Phillips Fox in attendance – somewhat surprisingly for a lawyer who makes money out of complexity, Alec was reasonably relaxed about the legal implications of the cloud. His firm actually produced this guide to cloud computing and the law – some of it is country specific but it’s still a worthwhile read. It’s a theme I’ve returned to since, in discussions at Cloud Connect in Santa Clara, and will do so again in May at the Glue conference – more on that another time.

    Anyway – as with previous CloudCamps, the private cloud debate reared its head. I even scheduled that particular session out in the foyer so that if people came to blows (this debate really inflames some passions), we could simply hose the blood off the tiles. There is a real continuum of thought around this topic. Fellow commentator Phil Wainewright, an analyst I greatly respect, comes firmly from the school of thought that says private cloud is bunkum. He certainly doesn’t mince words; however, he is moderate enough to accept a private segmentation of some public cloud resource as a valid approach (roughly what Amazon is doing with its Virtual Private Cloud) but is adamant that “it’s not cloud if it’s confined within a closed, single-enterprise environment.” In fact Wainewright goes so far as to contend that many so-called “cloud” offerings are:

    as alluring as lipstick daubed on a pig, because behind the scenes the hosting providers will be doing a lot of covert physical partitioning to cut corners

    Ouch…

    At the other end of the spectrum, at Cloud Connect we had a lot of what Krish calls “cloudwash” – traditional vendors dressing their legacy offerings up and calling them “cloud enabled”, “cloud middleware” or some other cloudy descriptor – little more than marketing but it seems to work for them, despite the howls of derision from attendees:

    (Attendee reaction: “rofl”)

    Anyway – at the Sydney event Samuel from UltraServe organized a camera to capture some thoughts from attendees – thanks to Geoff for spending time to record (and edit) the video.


  • Pearl and Rapportive – Helping Email Become the Killer App (Again)

     

    A couple of weeks ago when I posted about the launch of the Google Apps Marketplace, I mentioned Appirio’s “PS Connect” offering that embedded contextual data from Appirio within Gmail. At the time it was an interesting offering, but it didn’t garner much attention given the noise that was building around the Marketplace itself.

    The other day Chris Tanner from Pearl (more on them here) pinged me to tell me about a new tool they were rolling out that allows data from within the Pearl CRM system to be embedded into the Rapportive (more on that here) window within Gmail. This integration gives us another taste of what vendors who retain a core belief in email as the “killer app” for business are doing to turn it into an aggregation platform for a whole host of different data streams.

    Setting this integration up is easy – simply install Rapportive and the Pearl raplet and voila – your email has just become the killer communication and CRM tool. Rapportive on its own gives a little bit of data about the sender – links to their social media profiles, their avatar, their job title and so on – but this new integration builds on that to give a rich CRM data stream: contact details, the person within the organization who deals with the account, specific customer status and the like.
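
    For the curious, a raplet is essentially a small web service that Rapportive calls with the sender’s email address, and whose response gets rendered in the Gmail sidebar. Below is a minimal sketch of what a CRM-backed raplet endpoint might look like – the parameter names (“email”, “callback”), the JSONP response shape and the lookup_contact() helper are illustrative assumptions on my part, not Pearl’s actual implementation.

        # Minimal sketch of a Rapportive-style "raplet" endpoint that surfaces CRM
        # data for an email sender. The request parameters, response fields and the
        # lookup_contact() helper are assumptions for illustration only.
        import json
        from flask import Flask, request

        app = Flask(__name__)

        # Stand-in for a real CRM lookup (e.g. a call out to Pearl's system).
        FAKE_CRM = {
            "jane@example.com": {"account": "Acme Ltd", "owner": "Sam", "status": "Active customer"},
        }

        def lookup_contact(email):
            return FAKE_CRM.get(email)

        @app.route("/raplet")
        def raplet():
            email = request.args.get("email", "")
            callback = request.args.get("callback", "callback")  # JSONP callback name
            contact = lookup_contact(email)
            if contact:
                html = ("<div><strong>%s</strong><br>Account owner: %s<br>%s</div>"
                        % (contact["account"], contact["owner"], contact["status"]))
            else:
                html = "<div>No CRM record for this sender</div>"
            body = json.dumps({"html": html, "status": 200})
            # Wrap the JSON in the callback so the sidebar can consume it cross-domain.
            return "%s(%s)" % (callback, body), 200, {"Content-Type": "application/javascript"}

        if __name__ == "__main__":
            app.run(port=5000)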

    It’s yet another example of the value that can be gained by using email as the platform for data aggregation on the web – while many people contend that email is dead, tools such as this ensure that email’s utility will remain for the foreseeable future.

    (Screenshot: Pearl CRM data displayed in the Rapportive sidebar within Gmail)


  • Azure Targets Facebook Developers

     

    When Microsoft announced the commercial availability of Windows Azure during last year’s PDC, some of the companies they showcased were more Web 2.0-ish than enterprise scale. Even though Web 2.0 companies have generally not been built on Microsoft technologies, it is too big a market for Microsoft to ignore, and the huge success of Facebook applications has made this segment very important from Microsoft’s perspective.

    In fact, Windows Azure can be very handy for developers who want a seamless way to scale up and down based on demand without getting their hands dirty with the nuts and bolts of infrastructure. Even though I don’t like the lock-in aspect of the Azure cloud, I like the way it is set up to be developer friendly. Developers who code for social networking platforms are either individuals or small shops with a handful of people; they are not prepared for the sudden, unimaginable success that can come through these social networking sites. Cloud computing in general, and platforms like Azure and Google App Engine in particular, can come in handy for these developers.

    Recently, Microsoft announced that it has partnered with Thuzi, a consultancy specializing in social media platforms, to offer a toolkit for running Facebook apps on Windows Azure. Named the Facebook Azure Toolkit, this open source tool allows developers to get started easily and deploy Facebook applications on top of the Windows Azure cloud. This will help them focus on developing the app without worrying about the viral scaling effects of the Facebook platform.

    This starter kit consists of:

    • Facebook Developers Toolkit
    • Ninject 2.0 for Dependency Injection
    • ASP.NET MVC 2
    • Windows Azure Software Development Kit (February 2010)
    • AutoMapper
    • Azure Toolkit – simplified library for accessing Message Queues, Table Storage and SQL Server
    • Automated build scripts for one-click deployment from TFS 2010 to Azure

    With this toolkit, developers can either deploy directly to the Azure cloud or run locally on their own computers. The project can be used with Visual Studio 2010 RC and makes Facebook app deployment only a click away. Even though there are not many developers in this space building their apps on top of Microsoft technologies, this opens up another option for them. Since free markets are all about choice, I think this is good for the developer community.


  • HubCast – Ponoko for Printing ;-)

     

    Coming from a manufacturing background, and having an understanding of the pressures and imperatives facing manufacturing, I’ve long been excited by Ponoko’s attempts (more on ‘em here) to reinvent manufacturing. The other day I received a note from HubCast, which is seeking to do something similar for printing. According to their PR, HubCast:

    changes the way premium-quality print is bought, sold and delivered. HubCast completely automates printing with a simple cloud application that delivers premium print production, competitive pricing, global reach, and the speed of next-day delivery around the world.

    Essentially HubCast combines cloud content storage with distributed output of that content. The idea is that:

    users can upload and maintain an unlimited number of files to a library on the cloud…. File verification in HubCast Professional ensures that each file uploaded to the library is press-ready, guaranteeing easy and confident reordering…[and enables] printing with a simple cloud application that delivers print production, competitive pricing, global reach, and the speed of next-day delivery around the world

    Sounds good, huh? In essence it’s cloud storage meeting the ability to output material anywhere in the world, and HubCast has coined the term “cloud printing” to describe the service. In his introductory blog post, founder Toby LaVigne says:

    Cloud Printing. What?! Is Guttenberg printing from above?… Not exactly.

    Imagine this: you go online to your cloud printing account (think Amazon or Expedia). And you upload a high resolution pdf file, select a quantity, choose your stock and pick a delivery destination and hit “submit”.

    You just printed in the cloud, and you did it in seconds, on your time, from anywhere, to anywhere with the click of a mouse.

    Cloud computing is bringing to print what it has already brought to services like travel, banking and enterprise applications.

    • Lower overall costs
    • Smaller environmental impact (with less paper waste)
    • Substantial time savings
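
    HubCast itself is driven through its web storefront, but to make the “cloud printing” idea concrete, here is a purely hypothetical sketch of what the workflow LaVigne describes (upload a press-ready PDF, pick a quantity, stock and destination, then submit) might look like if it were scripted against a REST-style API. The base URL, endpoints and field names below are invented for illustration and are not part of HubCast’s actual service.

        # Hypothetical sketch of a scripted "cloud print" order, mirroring the workflow
        # described above. The API root, endpoints and payload fields are invented;
        # HubCast's real service is driven through its web storefront.
        import requests

        BASE = "https://api.example-cloud-print.com/v1"     # hypothetical API root
        HEADERS = {"Authorization": "Bearer your-api-key"}   # hypothetical auth

        # 1. Upload a press-ready PDF to the cloud library.
        with open("brochure.pdf", "rb") as f:
            doc = requests.post(BASE + "/library", headers=HEADERS,
                                files={"file": f}).json()

        # 2. Place an order against the stored file: quantity, stock, destination.
        order = requests.post(BASE + "/orders", headers=HEADERS, json={
            "document_id": doc["id"],
            "quantity": 500,
            "stock": "100lb-gloss",
            "delivery": {"country": "MZ", "city": "Maputo", "service": "next-day"},
        }).json()

        print("Order placed:", order["id"], "status:", order.get("status"))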

    The HubCast service sounds good, and the site looks nice – it is, however, a little unfortunate that it lacks a fair amount of what can only be seen as basic information. The pricing FAQ page, for example, is substantial, but nowhere does it actually indicate what it costs to output a document.

    I put a real-world example of needing to output some material to CEO Toby LaVigne. My example (somewhat far-fetched, I’ll agree) had me printing a document from here in New Zealand and getting copies to clients in Mozambique, Iraq, the Cayman Islands, Haiti and Antarctica. His answer was reasonable:

    …we focus on delivering our service in the top 100 GDP markets worldwide.  Not a stroke of brilliance on our part, it’s simply where most of the need and business is.  Our perspective is driven by HubCast service delivery.  We want to be able to support business in as much of the world as possible in as short an amount of time as possible.  Strictly speaking, the number of countries we print in is less relevant than the number of countries we can deliver to reliably – next day, 5-day, etc. Your specific examples are unique.  Antarctica and Iraq we are taking a pass on for now.  Mozambique, Haiti, and the Caymans are next-day service.  And, of course you can order from NZ

    Which is pretty good – to be honest, I’ll accept that Antarctica and Iraq are pretty much “edge cases”, but the ability to output from my location and have copies the next day in three places as different as Mozambique, the Caymans and Haiti is pretty powerful.

  • StrataScale Unveils New Cloud and Hybrid Solutions

     

    Public Cloud, Private Cloud, Automated Managed Hosting, and Hybrid Hosting Services to be Available 24/7 through Easy-to-use Storefront

    The launches are coming thick and fast at Cloud Connect here in Santa Clara. StrataScale is right now announcing the addition of three new cloud offerings. In addition to its physical Automated Managed Hosting service, StrataScale has developed its own public and private virtualized clouds. The company also unveiled a Hybrid Hosting solution that allows customers to integrate physical managed servers and virtualized cloud servers on the fly on the same secure network, optionally cross-connected with existing co-location infrastructure. StrataScale’s cloud offerings live on servers housed in parent company RagingWire’s 220,000 square foot data center, which offers 99.999% availability and N+2 redundancy.

    StrataScale Vice President of Marketing, Dave Geada, gave me a demo of their offering and more importantly their new self-service storefront. I am of the belief that cloud services are becoming more and more commoditized – there are two ways to differentiate – one is by price while the other is by value add. The likes of Microsoft and Amazon will always be able to beat the smaller players on price, so companies like StrataScale need to compete on service.

    In their case, StrataScale is making it easy for customers to provision – they’ve got a nice self-service storefront but at the same time provide good one-on-one customer support. On the storefront, customers can automatically scale up or down and add new computers and IT components without speaking to a sales representative. All systems are automated on the storefront, including physical managed servers, public virtual servers and private virtual servers.

    As for the technical details, both the public and private cloud offerings deliver hypervisor-based virtual machine instances running CentOS, Red Hat Linux, or Microsoft Windows. Hybrid Cloud customers can run Public Cloud, Private Cloud, and Automated Managed Hosting servers all on the same network, managed through the StrataScale portal. The new offerings will be available after April 1 – pricing details below.

    (Chart: StrataScale cloud pricing)


  • Appistry Introduces CloudIQ Storage for Data-Centric Applications

     

    It’s Cloud Connect week – which means that every cloud vendor under the sun is launching new products or services. Not wanting to be left out, Appistry today announced the availability for beta testing of Appistry CloudIQ Storage. CloudIQ Storage is a play for data-intensive applications in the cloud – beta clients are running highly sensitive, big data applications for government security and intelligence, using a combination of private and hybrid cloud setups.

    CloudIQ Storage may be used as a stand-alone cloud storage system or in conjunction with Appistry CloudIQ Engine to enable what Appistry is calling Computational Storage. Computational storage unifies applications and data by storing data across commodity servers and intelligently locating application processing on the machines containing the relevant data. As a result, computational storage allows for the delivery of data-intensive applications more cheaply than otherwise.

    Some relevant details:

    • Special CloudIQ Storage “editions” will address the unique storage requirements of particular communities.
    • The first of these, Hadoop Edition, offers plug-and-play compatibility with the Hadoop Distributed File System (HDFS), part of the popular Apache Hadoop open source framework.
    • The HDFS architecture is built around a single metadata repository, called the NameNode. Because the NameNode is not easily clustered, it represents a single point of failure and a bottleneck for the entire system. CloudIQ Storage has no single point of failure and no centralized bottleneck, making it more suitable for mission-critical deployment.
    • Appistry CloudIQ Storage Hadoop Edition ships with HDFS drivers, enabling it to be easily deployed in place of HDFS for applications where reliability and throughput are key considerations.

    It’s an interesting play that brings file-based storage down a few notches in the cost rankings while overcoming data access issues by co-locating processing and data storage. Finally, it brings a smart approach to application workloads, moving them closer to the data on which they work. Appistry CEO Kevin Haar talks it up, saying:

    Storage is an integral component of today’s data-centric applications. It’s no surprise then that traditional approaches to storage are often to blame for the high cost and inferior performance of many a mission-critical application… With Appistry CloudIQ Storage, we are able to unify application processing and storage requirements to the cloud to dramatically reduce total costs and improve overall performance.

     


  • Uh Oh – the Google Apps Marketplace is Failing Me

     

    Exciting news yesterday that the Google Apps Marketplace was launching – this morning I thought I’d give it a whirl but…

    (Screenshot: the Google Apps Marketplace failing to load)

    To quote Homer Simpson… Doh!


  • 2010 – The Year of the Cloud (or something)

     

    I kind of thought we were a little late for “10 things to watch for in 2010” type posts, but it seems ChannelWeb doesn’t think so and has compiled a list of somewhat conflicting cloud prophecies for 2010 from some of the clouderati. So without any further ado – let’s hear what them-that-know predict will happen this year, along with my measure of how accurate their predictions are:

    First up, James Demoulakis, CTO of GlassHouse Technologies, opines that Cloud Storage Adoption will Broaden. Coming from a perspective of technology developments solving security and latency issues, he predicts 2010 will be all about cloud storage. I kind of agree that cloud storage will broaden this year, but I don’t see that it’ll be caused by anything so high level – quite simply, it’s a reflection of a degree of momentum and some critical mass. Any issues that did exist still will. I give this an 80% chance of eventuating.

    Hybrid will Happen, says Jimmy Tan, general manager for PEER Software. He calls it hybrid, but I’d call it more offline-available web apps. Either way, he suggests that cloud services will continue to develop “off-line” working modes to complement their “always on” approach. Given HTML5, Google ascending and Microsoft’s play with Office 2010 – I give this a 90% chance.

    Platform-as-a-Service Takes Hold says Sam Charrington (a really nice guy, by the way) from Appistry. He believes that 2010 is the year that PaaS will really take hold as organizations look at how they can take advantage of cloud platforms and push them beyond just requests for virtual machines. I’m not entirely convinced – while I love PaaS as a concept, I just don’t see widespread use as a given. I’ll give this a 50% chance.

    Public Vs. Private Becomes Less Relevant says Vanessa Alvarez, an industry analyst from Frost & Sullivan (and someone I’m looking forward to meeting at Cloud Connect in a few weeks). Vanessa says that in 2010 we’ll START to move away from these terms as the question of how apps/services/resources are delivered, and from where, becomes less relevant to end users and the market overall. I’ve got to agree with Vanessa here – I’m not a hand-wringing dogmatist who gets caught up passionately defending “purity” chapter and verse. At the end of the day it’s about results, and I for one don’t care if those results are obtained through some sort of “pseudo cloud”. 75% chance of happening, but less if the hand-wringers have their way.

    2010 will be the year of planning for the cloud says John Ross, CTO of GreenPages. Apparently everyone will need to stop thinking about how we have done things in the past and begin to think about how we can do things differently with the resources that are being made available to us. I’m not so sure – I don’t see the world in black-and-white pre-cloud/post-cloud terms, and I see the planning that John talks about as being more of the same due-diligence-type work that has always occurred. I’m not sold, and I give this a 20%.

    Cloud Platforms Gain Acceptance opines Barry Lynn, CEO of 3Tera. Apparently 2010 will be the year that the best cloud platforms will be accepted as enablers of mission critical enterprise applications in need of high availability, dependable SLAs and world class disaster recovery. What? I don’t think so. I think Barry’s been drinking the Kool-Aid a little too much. 10% on this one.

    Disaster Recovery In The Cloud will be big says Chris Pyle, CEO of Champion Solutions Group. Clients will start considering the “cloud” as another choice when developing a disaster recovery plan, he says. I don’t think so. Clients who already use the cloud will think about using it for DR; those who don’t won’t give it a second thought. DR will stay in line with general cloud adoption – 20% from me.

    Private Clouds Die, Intercloud Rises, Openness Abounds says the normally reticent Sam Johnston. Sam’s a strong character who gets a little passionate about things and attached to the dogma of cloud. I love what he tries to do but disagree with much of his vehemence. When it comes to this prediction, I’m erring on the side of the (somewhat confusingly) opposite view given by Vanessa – public? private? who cares, just make it work. 10%.

    WAN Optimization-as-a-Service Surfaces preaches Adam Davison, corporate vice president for Expand Networks. Where do they get these guys from? Get a load of this: “As cloud-based services become more prevalent, whether private or public, the provision of an end-to-end software solution for virtualized WAN optimization from the data center, to the branch office and mobile users will be paramount.” Yeah, whatever dude – just buy some bigger pipes. 15%, although I’d qualify that by saying he’s probably got a 40% chance within enterprises that love the big words he uses.


  • NetSuite Brings VARs Some More Love

     

    NetSuite is this morning announcing a new reseller program aimed at convincing VARs that currently deal in on-premise products, and are fearful of the revenue implications of a move to cloud applications, to come on board as NetSuite resellers. As background, NetSuite currently has around 200 partner companies that receive margin throughout the renewal life of the end customers. NetSuite has found, however, that traditional resellers are struggling to decide how best to integrate cloud offerings into their existing practices – part of this struggle is caused by very real fears over the revenue impact of moving from a large up-front license fee model to a smaller recurring revenue one.

    The new program will mean that, for qualified new customer transactions of at least two years, NetSuite “SP100” partners get the entire year-one software subscription revenue; they will then also get a 10% margin on all renewals, giving them a recurring revenue stream. The existing benefits that resellers receive will remain for VARs under the new model – they’ll still get deal support, sales engineering support and so on.

    The idea of this program is to help traditional resellers, who have business models built on lucrative revenue streams from on-premise deployments, move themselves to a recurring revenue model – the SP100 program will be positive for VARs’ revenue flows in the initial years, when they feel the most impact from the move from on-premise to cloud.

    The following chart gives a comparison between an on-premise reseller, one who uses the existing NetSuite model and one who embraces the new SP100 model – note that the existing channel model gives resellers a 50% margin on license and a 30% margin on maintenance. You’ll notice that after a few years resellers are actually worse off under the SP100 model than under the existing one – as such it is very much a tool to aid VARs in the transition rather than an ongoing change of model.

    (Chart: reseller revenue comparison – on-premise vs. existing NetSuite channel model vs. the new SP100 model)
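
    To see why the SP100 model front-loads reseller revenue, here is a rough back-of-the-envelope calculation. The subscription value and my reading of the two margin structures are assumptions for illustration only – NetSuite’s own chart uses its real figures.

        # Back-of-the-envelope comparison of cumulative reseller revenue under the two
        # NetSuite models discussed above. The $12,000 subscription value and the
        # 50%/30% vs 100%/10% interpretation are illustrative assumptions.
        SUBSCRIPTION = 12000  # assumed annual subscription value per customer ($)

        def existing_model(years):
            # Assumed: 50% margin on the year-one subscription, 30% on each renewal.
            return 0.5 * SUBSCRIPTION + 0.3 * SUBSCRIPTION * (years - 1)

        def sp100_model(years):
            # SP100: 100% of year-one subscription revenue, then 10% on each renewal.
            return 1.0 * SUBSCRIPTION + 0.1 * SUBSCRIPTION * (years - 1)

        for years in range(1, 7):
            print("Year %d: existing $%.0f vs SP100 $%.0f"
                  % (years, existing_model(years), sp100_model(years)))
        # SP100 is well ahead in year one (front-loaded), but under these assumptions
        # the existing model overtakes it around year four - hence a transition tool.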

    NetSuite’s stated goal with this program is to attract new VARs – as Craig West from NetSuite pointed out in my briefing, existing resellers who have already bitten the bullet and moved to recurring revenue models have already suffered the pain of the shift, so they’re less likely to come on board with the program than new resellers. Resellers will be able to choose which commission model they adopt for each customer deal, giving them the opportunity to tailor their own revenue flows.

    So why now? Well NetSuite is adamant that 2010 is the year that the midmarket channel embraces the cloud – they’re dedicating much of their 2010 SuiteCloud Conference to building knowledge within the partner community.


  • Skytap Matures Into A Cloud Automation Provider

     

    Skytap, the Seattle-based cloud lab automation provider, has announced powerful network automation capabilities in the cloud. Skytap is one of the hot companies in the enterprise cloud space (see my previous coverage here and here). I have been following them closely and have often wondered about their long-term strategy; I was not convinced that lab automation alone would take them far. With today’s announcement, they are positioning themselves as a cloud automation provider getting ready for a long innings in the cloud marketplace.

    Starting off as a lab automation provider, they catered to the needs of enterprises by enabling testing labs, QA and training on their cloud. Their solution enabled enterprise applications to run unchanged in the cloud, making global collaboration easy. The 50-70% savings they could offer made Skytap very attractive to enterprises. From those lab automation beginnings, Skytap has evolved to support sales demos and complex ERP migrations. With today’s announcement of multi-tier network automation capabilities that will accelerate the creation, migration and deployment of multi-tier enterprise applications in the cloud, they stand out from the competition as one of the few providers offering powerful virtual datacenter features that can be managed with just a few mouse clicks. The evolution of Skytap can be captured as follows.

    VM Automation → Self-Service VM Import/Export → Network Automation

    There are many companies offering storage automation, CPU automation, server automation and so on, but Skytap goes further, offering complete network automation in the cloud. According to Rachel Chalmers of The 451 Group, network automation is an important capability for external clouds; it builds on approaches that have been shown to reduce costs in the physical data center. Compared to the weeks and months needed to configure complex networks, with dedicated IT admin resources, Skytap simplifies network configuration to a few clicks, and it can be done by an end user with absolutely no networking expertise. With this release, Skytap enables IT organizations to move multi-tiered enterprise applications with clustering and fail-over networking capabilities into the cloud with point-and-click ease. The resulting cost savings are huge, and this is what makes Skytap a hot company in this space.

    Skytap allows IT organizations to create “ready to run” virtual data centers with advanced networking topologies, customizable security policies, and scalable computing capacity.  Functional users can utilize the self-service Web interface to deploy those virtual data centers immediately. As application needs change, IT organizations can easily add or remove networks, change server and storage capacity, and rapidly adjust security and access policies. Skytap’s networking capabilities bring unprecedented power, scale, ease-of-use, time-to-value and cost efficiencies to cloud-based application deployments.

    When I talked to Sundar Raghavan, their chief product and marketing officer, he also pointed out that everything that can be done through Skytap’s web UI can also be done using the APIs. The API access allows for automated back-and-forth movement of virtual machines between the enterprise datacenter and the Skytap cloud. I asked Skytap about the possibility of “on the fly” configuration of networks while moving virtual machines to the Skytap cloud. They pointed out that even though such automated configuration is not possible right now, it is definitely on their roadmap. In fact, Sundar noted that the very availability of API access allows an admin to write a Python or shell script to push a virtual machine to the Skytap cloud and configure the network on the fly; in the future, this network configuration will be completely automated.
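
    To make that concrete, here is a minimal sketch of the kind of admin script Sundar describes – pushing a virtual machine image up and then configuring a network around it over a REST-style API. The base URL, endpoint paths, payload fields and authentication shown here are assumptions for the sake of illustration, not Skytap’s documented API.

        # Minimal sketch of an admin script of the kind described above: import a VM
        # image, then configure a network for it "on the fly" via a REST-style API.
        # The API root, endpoints, fields and credentials are illustrative assumptions.
        import requests

        BASE = "https://cloud.example.com/api/v1"   # hypothetical API root
        AUTH = ("admin@example.com", "api-token")   # hypothetical credentials

        # 1. Import a VM image exported from the enterprise datacenter.
        with open("app-server.vmdk", "rb") as image:
            vm = requests.post(BASE + "/vms", auth=AUTH,
                               files={"image": image},
                               data={"name": "app-server-01"}).json()

        # 2. Create a network and attach the new VM to it.
        net = requests.post(BASE + "/networks", auth=AUTH, json={
            "name": "app-tier",
            "subnet": "10.0.1.0/24",
            "gateway": "10.0.1.1",
        }).json()

        requests.post(BASE + "/networks/%s/attachments" % net["id"], auth=AUTH,
                      json={"vm_id": vm["id"], "ip": "10.0.1.10"})

        print("VM", vm["id"], "attached to network", net["id"])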

    Network automation itself is not easy, and the fluidity of virtual compute infrastructure makes virtual network automation a much more difficult problem to tackle. By attacking this problem head-on, Skytap has completely altered the game. By making the process of creating advanced multi-tier topologies just a few mouse clicks away, Skytap has provided the critical missing piece that has held back complex enterprise application deployments in the cloud. For more information on the features announced today, check out their blog post on the topic. This is truly game changing. The interesting aspect of their offering is not just the powerful network features available in the Skytap cloud but the ease with which one can create such networks. It is so easy that even your grandma could create complex networking topologies such as clustering, fail-over, shared resources, and multiple subnets with firewalls and security controls. This is a hot company to watch in this space.

