Friday, 26 February 2010

Small Businesses Set To Profit From Cloud Computing

Cloud enables SMBs to get access to fully fledged IT services. The key word in this statement is ‘Services’ - SMBs need services, not technology. They need a compelling business reason to spend their hard-earned reserves on further IT services. They will need to buy into the concept of ‘Business as a Service’.

And this is a message that should be heeded by the industry, and by marketing departments in particular, if Cloud is to deliver the benefits that the hype suggests. The figures are pretty impressive. IDC claims that there are an estimated 73.5 million small businesses worldwide, with a further 100 million SOHO businesses operating across the globe. 40% of all IT spend is attributable to the SMB sector, with 45% of this spend accounted for by businesses with fewer than 100 employees. Cloud spending is expected to reach $44 billion by 2013, up from $17 billion in 2009, and at a CAGR of 26% this is far and away outstripping traditional IT spend. Factor in the SMB spend predicted in this area and that’s a big chunk of pie for service providers operating in this market.
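As a quick sanity check, those figures do hang together: compounding the 2009 baseline forward at the quoted CAGR lands close to the 2013 prediction. A minimal sketch (the dollar figures come straight from the paragraph above; the function is just ordinary compound growth):

```python
# Sanity-check the quoted forecast: $17B of cloud spend in 2009,
# growing at a 26% CAGR, should land near the predicted $44B by 2013.
def project(base, cagr, years):
    """Compound `base` forward at rate `cagr` for `years` periods."""
    return base * (1 + cagr) ** years

spend_2013 = project(17.0, 0.26, 4)  # 2009 -> 2013 is four compounding periods
print(f"Projected 2013 cloud spend: ${spend_2013:.1f}B")  # ~$42.8B, close to the $44B forecast
```

The small gap between $42.8B and $44B simply reflects rounding in the published CAGR.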

So that’s it sorted then. We’re all going to become very rich, Cloud will be very successful and we’ll all live happily ever after. Well, not quite.

I highlighted the fact that whilst most SMBs surveyed were generally aware of the Cloud, they were somewhat reluctant to dive in and procure services. Several reasons were given for this, including hype over substance, internal resistance and security concerns – all of which suggest that we need to work harder to get the Cloud benefit message across. And it’s here that we should take a leaf out of the Hotel and Catering Industry’s book. The Hotel business has successfully packaged a very complex set of business processes into a simple-to-understand, simple-to-buy, all-in-one service, tailored to suit a particular market.

When was the last time that you phoned a hotel to make a booking and were asked: “Would Sir like to avail himself of our laundry as a service, room cleaning as a service, food as a service, ability to use the internet as a service, stay in trim during your stay as a service, watch the Winter Olympics live as a service, whilst he stays with us?”. The answer is (hopefully) never.

Whilst accepting that the hospitality industry has had centuries to ‘market its message’, the challenge for our industry outlined by both Melanie and Serguei is well made. We need to focus less on the technology, the terms which seem to proliferate on a daily basis, and more on the needs of the customer.

A customer that is currently using a hosting company for shared hosting, domain name and email can be gently migrated up the value chain as their business needs dictate. In reality they are hardly likely to make the jump from their existing basic business services to a full blown Cloud CRM system just because we believe it’s what they need. If you think about the Hotel business again, you check in and once you are a customer of that hotel, you then have the option of purchasing many services available on a ‘pay as you use’ basis – the mini bar, the restaurant, the telephone, the laundry etc.

This is the model that appears to be gathering momentum and support for successful Cloud implementation. Service providers will need to extend the value of existing offerings, transform the customer experience and reinvent the service creation process if they are to attract the SMB market.

To stretch the hotel analogy further, Cloud is about the evolution of services, not revolution. Hosting services have been around for a while, and perhaps we should think of Cloud services as ‘old wine in new bottles’.

Join Us:

iPhone, Android and RIM are the real winners in the Smartphone war

Gartner just released its annual numbers for worldwide mobile phone sales to end users in 2009.

Looking at smartphone OS market share alone, Gartner shows the iPhone OS, Android, and RIM making the biggest gains (up 6.2, 3.4, and 3.3 percentage points from 2008, respectively) at the expense of Windows Mobile (off 3.1 percentage points) and Symbian (off 5.5 points). Although Gartner says that Symbian "has become uncompetitive in recent years" (ouch), it concedes that Symbian's market share is still strong, especially for Nokia – something backed up by Nokia's Q4 financials, which reported quarterly smartphone growth of 5 percentage points.

Regarding total handsets of all classifications sold, Nokia continues to dominate with 36.4% of all sales to end users (down from 38.6% in 2008), while Samsung and LG continue to climb at the expense of Motorola (dropping from 7.6% to 4.5% of worldwide sales in 2009) and Sony Ericsson. Hit up the source for the full report.

If you are looking to manage a mixed estate of smartphones, check out MobileIron, next-generation smartphone device management from Cloud Distribution.


Thursday, 25 February 2010

CA to Acquire Cloud Computing Solution Provider 3Tera

CA, Inc. on Wednesday announced a definitive agreement to acquire privately held 3Tera, Inc., a pioneer in cloud computing.

3Tera's AppLogic offers an innovative solution for building cloud services and deploying complex enterprise-class applications to public and private clouds using an intuitive graphical user interface (GUI). Terms of the agreement were not disclosed.

With 3Tera – which follows CA's recent acquisitions of Cassatt, NetQoS and Oblicore – CA continues to aggressively expand its portfolio of solutions to manage cloud computing as part of an integrated information technology management program. 3Tera enables enterprises and service providers to provision, deploy and scale public and private cloud computing environments while maintaining full control, flexibility and reliability.

3Tera also makes it easy for service providers to offer application stacks on demand by adding applications to the AppLogic catalog, where they can be deployed to a low-cost, shared cloud infrastructure. 3Tera's customers include more than 80 enterprises and service providers globally, which use the cloud computing technology to provide services to thousands of users.


How smartphones are bogging down some wireless carriers

It's no secret that the iPhone has taxed AT&T's network in densely populated areas, especially New York and San Francisco. Reports of problems using iPhones at major tech conferences, like SXSWi, Macworld Expo, CES, and NAMM are not unusual. The iPhone's ease of use and focus on mobile media generally lead to higher data usage on average, but despite claims by AT&T Mobility CEO Ralph de la Vega, the amount of data being consumed is rarely the problem. The issue has to do with how modern smartphones—beginning with the iPhone—save power by disconnecting from the network whenever possible.

Even though AT&T has made improvements to its network over the last couple of years—including moving towers to an 850MHz spectrum that can more easily penetrate building walls, as well as upgrading to faster 7.2Mbps HSPA+ protocols—those improvements have done little to stem the tide of complaints from consumers in larger urban areas. Those users experience frequent dropped calls and an inability to make data connections, and in general they feel that service is spotty.


Cloud Computing Hosting vs. Traditional Hosting

How does the cost of a dedicated hosted server solution compare with on-demand hosting?

For the purpose of the comparison I have selected a provider of dedicated server hosting and OpSource for on-demand cloud hosting. (In an earlier post I compared the cost of different cloud providers.) This is only a direct cost comparison; I have not considered issues like support, security, reliability, availability etc.

There are also several other assumptions I have made that can be challenged. In conclusion, if you are going to use the application(s) in anger, pricing becomes very similar. The real difference is that in the on-demand hosting model with OpSource, all set-up, pre-configuration and staging cost very little, whereas the hosted provider charges its fixed rate month in, month out.

The benefits to an on-demand model are that you can experiment before going live, try out scenarios, re-configure your own network infrastructure and security policies on the fly and only deploy out to your users when you are entirely satisfied. Over a 12 month period, I estimate costs to be very similar.
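To make the comparison concrete, here is a minimal cost model. The rates are illustrative assumptions for this sketch only (the post does not publish its actual figures, and these are not OpSource's real prices); the point is the shape of the comparison – a flat monthly fee versus an hourly rate scaled by utilisation:

```python
# Illustrative rates only -- assumed for this sketch, not real provider pricing.
DEDICATED_MONTHLY = 250.0   # flat fee for a dedicated hosted server, $/month
CLOUD_HOURLY = 0.35         # on-demand instance rate, $/hour
HOURS_PER_MONTH = 730

def dedicated_cost(months):
    """The hosted provider charges its fixed rate month in, month out."""
    return DEDICATED_MONTHLY * months

def cloud_cost(months, utilisation=1.0):
    """On-demand cost; utilisation < 1.0 models switching instances off."""
    return CLOUD_HOURLY * HOURS_PER_MONTH * utilisation * months

# Used "in anger" (always on), the two models converge over 12 months:
print(f"Dedicated, 12 months:  ${dedicated_cost(12):,.0f}")   # $3,000
print(f"Cloud 24x7, 12 months: ${cloud_cost(12):,.0f}")       # $3,066
# During staging and experimentation, low utilisation favours on-demand:
print(f"Cloud at 20% use:      ${cloud_cost(12, 0.2):,.0f}")  # $613
```

Under these assumed rates the always-on costs converge, and the saving comes entirely from the staging and experimentation phase, which matches the conclusion above.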

The real benefits of Cloud-based IaaS are clear: easy introduction, complete control and, if you choose the right partner (I would of course recommend OpSource), the safety of an SLA a traditional hosting provider simply could not match.


Wednesday, 24 February 2010

Smarter phones

I just read this on BBC.

Mobile phones become smarter in 2010

By Dan Simmons
Click reporter

Despite the name, smartphones are often quite dumb as there are many tasks these devices struggle to do.

The Mobile World Congress in Barcelona saw the launch of software aiming to solve some of the bigger issues, such as handsets not playing Flash videos.

Despite the majority of video, animation, and many games on the net being Flash based, the system is not yet fully supported on handsets.

David Wadhani, vice president of Adobe, said: "We'll see over the next 12 months Flash player 10.1, which is the same version of Flash on desktops, running on a variety of smartphones."

The mobile download of Flash 10.1 will be available from this summer on Android and Palm Pre handsets - and with other manufacturers to follow.

Currently 19 out of the top 20 manufacturers have signed up to support the video and animation platform.
The update will not be available on the iPhone because Apple has decided not to support the software.

Familiar browsers
Firefox has finally brought its popular browser to the mobile.

It will feature tabbed browsing, session syncing between mobile and desktop, and currently includes 40 add-ons, making it among the most customisable browsers on a handset.

The Twitter add-on allows the user to tweet from the address bar. Firefox is already available on Nokia's N900 and will be downloadable to other handsets throughout 2010.

Opera has also produced its ultra-fast Opera Mini mobile browser for the iPhone, which is yet to be submitted for App Store approval.

The company claims it will load pages up to six times faster than the iPhone's current Safari browser.

New interfaces
One of the most anticipated events at the 2010 MWC was the launch of Microsoft's new mobile platform Windows Phone 7.

Due to be finished in November, the operating system has a more natural and lifestyle-led feel.
The name change from Windows Mobile to Windows Phone does not mean the firm will start making phones. Instead it reflects a move towards personalising what used to be a very PC experience.

One way it has done this is by taking the Start button off the software, and bringing in live tiles to show friends' updates in real time.

Microsoft has also replaced the Windows Media Player with its Zune music service to create a tool similar to a built-in iPod - Microsoft said Zune would sync with iTunes as well as computers.

There is also live integration with Microsoft's Xbox 360 gaming console which means a user's profile, avatar, top scores, and notifications are all kept up to date.

HTC launched two new Android smartphones called Desire and Legend with a revamped Sense user interface.

One feature is the Friend Stream app which aggregates updates from a user's social networks on to one page.
Both phones sport a new and brighter active-matrix OLED (AMOLED) touchscreen first seen in Google's Nexus One handset.

The Legend, which succeeds the popular HTC Hero, is flatter in design. Both phones include an optical mouse instead of a trackball.

Slicker interaction
Other innovations on display included the Israeli-designed Else phone, built to reach any contact, document, or application with a single touch.

People who prefer typing into a phone via a real keypad have, until now, not had the flexibility of changing the buttons' functions.
Now British firm Pelikon has produced a hybrid display that uses printed LED layers to allow users to change the design of their keypads.

For instance, letters and numbers can be changed to reveal a media player interface or transform the keyboard into one large trackpad.

The company said its keypads can be added to slider phones for less than $10 (£6) and it is aiming to team up with a major handset maker this year.

Immersion, which produces haptics, or touch feedback, for more than 100 million phones, showed off a package that enables users to decide how keystrokes should feel.

Watch Click on Saturday 20 February, BBC News Channel, 11.30 (GMT)

Software as a Service – A Guide to Transforming Your Software ...

OpSource, the SaaS delivery experts, is focused on providing the operational infrastructure and ongoing services that enable software companies, On-Demand businesses, and Web applications providers to deliver and maintain the highest ...



Tuesday, 23 February 2010

Amazon, Salesforce and Google Continue to Lead the Cloudsphere

Cloud computing continues to gain steam, and it has been some time since we first asked, who is gaining your mind share?

As previously noted in the 4Q 09 article, this list is a subjective impression of Cloud Service Providers (SaaS, PaaS, IaaS) on the cloud computing radar.

As usual, large technology companies like AT&T, Dell, EMC, HP, Microsoft, Unisys, and others where cloud computing is a small part of their overall offering portfolio have been excluded.

So here is my list of the Top 25 Global Cloud Services Providers gaining mind share in the first quarter of 2010.

1. Amazon Web Services (AWS) – Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Virtual Private Cloud (VPC)
2. Salesforce.com – Sales Cloud 2 (CRM), Service Cloud 2 (Support), Force.com (Development Platform), Chatter (Collaboration)
3. Google – Google Apps, App Engine
4. Citrix – XenServer (Virtualization)
5. VMware – vSphere (Virtualization)
6. Rackspace – Mosso
7. NetSuite
8. Rightscale
9. Joyent
10. GoGrid
11. 3Tera – AppLogic
12. Caspio
13. Zuora
14. Eucalyptus
15. CohesiveFT
16. Boomi
17. Appirio – Cloud Connectors
18. Relational Networks – LongJump
19. AppZero
20. Enomaly – Elastic Compute Platform (ECP)
21. Astadia
22. Bluewolf
23. Intacct
24. 3PAR
25. Elastra


Citrix & Novell to Collaborate on Cloud Computing & Virtualization

Citrix Systems and Novell have announced a collaboration that expands choice for customers through increased virtualization interoperability and new assessment tools to help pinpoint the most economically advantageous approach to virtualization. Through this new partnership, Novell has certified SUSE Linux Enterprise Server as a "Perfect Guest" running on Citrix XenServer, and both companies will provide joint technical support to customers.

As a result of this agreement, the more than 4,500 enterprise applications certified as Novell Ready for SUSE Linux Enterprise Server are now Citrix Ready community-verified when running in a SUSE Linux Enterprise Server guest virtual machine on XenServer. SUSE Linux Enterprise Server is the only Linux operating system that has been optimized to be the perfect guest on all major hypervisors, with outstanding performance when running on XenServer.

Now customers and cloud providers gain the added value of the free XenServer as an enterprise-class virtualization platform for Linux and Windows environments, providing them with an expanded set of technologies and processes to run and manage heterogeneous data centers.


Monday, 22 February 2010

Active Directory: 10 Years Old and Thinking Cloud

Ten years ago, Microsoft released its Active Directory technology to skepticism that it could build an infrastructure technology to anchor user management and access control. Today the software is an integral part of nearly every corporate network and stands ready for its next frontier: public and private clouds.


Cloud and the Utility Meter Analogy

The "home utility" analogy is often used to explain Cloud Computing. For example, most people do not have an electricity generator in their house, and even if they do, they do not run it all the time. Instead, they rely on an external provider for electricity.

In fact, I have often heard this analogy used by Cloud providers themselves: when asked "Why should I use your hosted services?", they ask back "Do you run an electricity generator in your house?".

We seem to take our utility providers' quality of service for granted. I for one live in a rural village which often has outages of electricity and telephony due to... well, I'm not entirely sure, and finding out would be a time-consuming task, I'd hazard a guess.

Now, applying this to Cloud-based applications and services points the finger directly at the connectivity provider. More often than not, the underlying provider is BT, and we all know (love them or hate them) that their network is improving constantly; 21CN should overcome the vast majority of the SLA issues we currently face.

So, in conclusion, putting your business-critical applications and services in the Cloud is not a 'leap of faith'. Today, for small and medium sized businesses, it's a simple case of economics and simplicity. Don't hold back on investigating your options; you never know, what the Cloud promises may just be true!


Friday, 19 February 2010

An API for Cloud Infrastructure Services

If you are a cloud provider or consumer, specifically in the Infrastructure as a Service (IaaS) landscape (like OpSource Cloud, for example), it's hard not to be excited by the work going on in the Open Cloud Computing Interface Working Group. Owing to the growing number of IaaS providers and increasing adoption of the relatively new paradigm by consumers, this group was formed with a goal of bringing about standards to manage these cloud-based infrastructures.

Vodafone, RIM CEOs sound alarm over wireless data crisis

Vodafone CEO Vittorio Colao took to the stage at Mobile World Congress in Spain this week to warn that smartphones—representing the only growth proposition in mobile phone sales—will soon begin to outstrip wireless networks' capacity to send and receive data. While chipmakers, handset makers, and services providers are all cashing in on the growth, mobile carriers are left trying to "cope" with the growing demand for data from increasingly data-hungry phones.

Echoing comments from AT&T Mobility CEO Ralph de la Vega late last year, Colao said that new business models need to be created to enable carriers to invest in improved network infrastructure while still earning profits. In effect, Colao is saying that users should be charged more for using more data. Also, he seems to be pointing a finger at Google—though it's not clear why—saying that its hold on a large majority of the search and mobile ad business needs to be "looked at," according to Reuters.


Thursday, 18 February 2010

Joke Of The Week: Microsoft Plans To Charge For Its Mobile Operating System [Windowsphone7]

What's the biggest difference between Microsoft's newest mobile operating system and the Google version that it will be competing against (eventually)?

Microsoft will charge carriers for its OS, while Google is giving Android away for free. (The other big competitors, Apple and RIM, don't license their operating systems to third parties.)

That's just silly. Microsoft is dead in the water in this business. If it wants to get moving again, it needs to do everything it can to help itself. And in mobile software, that means countering "free" with "free."
Yes, giving a product away for free that Microsoft used to charge for will undermine Microsoft's business model. But that's eroding anyway. And, again, to come from behind in this business, Microsoft needs all the help it can get.

The good news is that Microsoft is so big that its new Windows Phone operating system has almost no chance of making enough money to move the needle anyway. So it might as well join rival Google in giving it away for free, in an effort to drive up device sales and market share.

During Monday's keynote, Microsoft CEO Steve Ballmer announced that Microsoft would stick with its current Windows Mobile business model. That is, Microsoft would let handset makers license Windows Phone 7 Series in exchange for a fee.

Why bother?

"I think there's something clean and simple and easy to understand about our model," Ballmer said. "We build something, we sell that thing." He added, "I think it's not only in our best interests, but it's ... a simple model that's easy for developers, handset manufacturers, and our operator partners to deal with, to understand, and to build from."

Maybe so. But the problem is that the amount Microsoft can charge for a smartphone OS license is very low, and probably getting lower.

And even if the smartphone market grows significantly in the next few years, and even if Microsoft can capture a respectable share of the market, it stands to make a very small amount of money selling Windows Phone 7 software.

How little?

Not enough to represent a real growth driver for giant Microsoft. And therefore, not enough to justify charging a license fee — which is surely a deterrent when handset makers like LG, Samsung, HTC, Sony Ericsson, Motorola, and others choose which OS platforms to base their phones on.

To estimate what Microsoft's mobile OS license revenues might look like, we ran the numbers with a variety of license fees, ranging from $5 (if Google's strategy pushed Microsoft's pricing down even lower) to $20 (if an intoxicated gadget maker fell so in love with Windows Phone 7 that it would pay through the nose for it).

• A Microsoft rep wouldn't tell us how much Microsoft's hardware partners pay it to license Windows mobile software, but in the past, it ranged from about $8 to $15 per phone, according to research firm Strategy Analytics.

• We also estimated that the smartphone market would grow to around 250 million devices worldwide in 2011, the first full year that Microsoft's new OS would be on the market, and that it might capture 5% to 15% of the market that first year, or roughly 12.5 million (conservative) to 37.5 million units (aggressive). The smartphone market was about 175 million units in 2009, according to IDC.

The results: Microsoft's mobile revenue from license fees could be as low as $63 million, or as high as $750 million, across all those assumptions.
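The arithmetic behind those bounds is easy to reproduce. A sketch using only the figures stated above (the fee range, the share range, and the 250-million-unit market estimate):

```python
# Reproduce the back-of-the-envelope estimate: license fee scenarios
# ($5-$20 per phone) crossed with share scenarios (5%-15%) of a
# projected 250M-unit smartphone market in 2011.
MARKET_2011 = 250_000_000  # projected smartphone units

def license_revenue(fee, share):
    """Annual license revenue for a given per-unit fee and market share."""
    return fee * MARKET_2011 * share

low = license_revenue(5, 0.05)    # $5 fee at 5% share
high = license_revenue(20, 0.15)  # $20 fee at 15% share
print(f"Low case:  ${low:,.0f}")   # $62,500,000 (rounded to $63M above)
print(f"High case: ${high:,.0f}")  # $750,000,000

# Even the high case is small beside the ~$66B Wall Street expects for FY2011:
print(f"High case vs. total revenue: {high / 66e9:.1%}")  # 1.1%
```

Even at the most aggressive end of the range, license fees barely register against Microsoft's total revenue, which is the whole argument for giving the OS away.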

But a reasonable average is somewhere in the $300 million range, which is less than 0.5% of the $66 billion in revenue that Wall Street expects Microsoft to generate in fiscal 2011.

This suggests that mobile license fees will NOT be a significant growth story for Microsoft any time soon, and that unless our estimates are radically low, charging is a waste of time.

Instead, Microsoft should give Windows Phone software away for free, with the hope that manufacturers will use it to make more Windows phones than they were previously planning to make, that they'll charge lower wholesale prices for them, and that carriers will charge lower retail prices for those phones. That could drive up Microsoft's market share, with little negative effect on Microsoft's financial situation.

So how will Microsoft make money off Windows phones? Beats us. But it has to be a way other than license fees.

The options range from taking a cut from selling mobile applications, to subscription fees for Xbox Live and Zune music accounts, to mobile search and display advertising, via Bing and potential ad network acquisitions. Sure, those revenue streams will be minuscule to begin with, too. But they will grow with time, and with a bigger user base, which the no-license-fee phone business should provide.

Or Microsoft could take the route that Apple, Research In Motion, Nokia, Palm, and others take, which is selling their own hardware and software. That can be tricky, but it can also be very lucrative: Apple generated more than $15 billion selling iPhones in 2009.

Why Cloud Computing is More Secure

In the midst of the 1990s economic bubble, Alan Greenspan once famously referred to all the excitement in the market as "irrational exuberance". Similarly, in today's cloud computing market a lot of the discussions seem to be driven by a new set of irrational expectations.

Some expect that cloud computing will solve all of man's problems; others expect that it is inherently flawed – flawed by an endless list of problems, most notably security. Like most things in life, the reality is probably somewhere in the middle. So I thought I'd take a closer look at the unrestrained pessimism, and sometimes irrationality, found in the cloud security discussions.

To understand security, you must first understand the psychology of how [cloud] security itself is marketed and bought. It's marketing based on fear, uncertainty and most certainly doubt (FUD). Fear that your data will be unwittingly exposed, uncertainty about who you can trust, and doubt that there are any truly secure remote environments. At first glance these are all logical, rational concerns: hosting your data in someone else's environment means that you are giving away partial control and oversight to some third party. This is a fact. So in the most basic sense, if you want to micro-manage your data, you'll never have a more secure environment than your own data center, complete with biometric entry, gun-toting guards and trustworthy employees. But I think we all know that "your own" data center also suffers from its own issues. Is that guard with the gun actually trustworthy? (Among others.)

Recently it occurred to me that the problem with cloud security is a cognitive one. In a typical enterprise development environment, security is mostly an afterthought, if a thought at all. The general consensus is "it's behind our firewall", or "our security team will look at it later", or "it's just not my job". For all practical purposes, most programmers just don't think about security. What's interesting about cloud computing is that all the FUD that's been spread has had an interesting consequence: programmers are actually now thinking about security before they start to develop and deploy their cloud applications, and cloud providers are going out of their way to provide increased security (Amazon's VPC, for example). This is a major shift; proactive security planning is something that, as far as I can tell, has never really happened before. Security is typically viewed as a sunk cost (sunk costs are retrospective past costs which have already been incurred and cannot be recovered). But the new reality is that cloud computing is in a lot of ways more secure simply because people are actually spending time looking at the potential problems beforehand. Some call it foresight; I call it completely and totally rational.


Wednesday, 17 February 2010

Adecco Connects Its UK Branches with Magic Software's uniPaaS

Magic Software Enterprises Ltd., a provider of application platforms and business and process integration solutions, announced on Monday that Adecco, a market leader in human resource solutions, has developed MAX, a back-office system for its 200 UK branches, using Magic Software's uniPaaS application platform.

Magic Software's uniPaaS is an application platform that simplifies the process of building and deploying business applications. The MAX back-office system facilitates and optimizes the matching of candidates with enterprises, enables multi-agency registration of candidates, global access to human resources and rapid response to customer requirements.

The MAX system replaced Adecco's disparate local databases. It also simplifies procedures, eliminates the need for any manual intervention by data processing specialists and facilitates remote management of each recruitment agency by a central data processing team.

According to Andrew Chapman, IT Manager, Adecco UK: "It is obvious that centralized management from a single base is easier and more efficient than the previous situation - in which this same type of operation was done on over 200 separate remote servers."

Through the new system, batch processing times have also been reduced to a fifth, enhancing the handling of business opportunities, as well as speeding up customer response times.

The development process was sped up by the uniPaaS application platform, which uses pre-compiled business logic, or 'metadata', rather than hard code. "As any IT manager knows, it's not easy to get the budget to replace existing IT investments. Magic Software had already provided a decade's worth of stable and trusted technology for our internal back office system. Now with uniPaaS we have the scope to evolve this and take our existing application into the cloud without turning our IT investments upside down," continued Andrew Chapman, IT Manager, Adecco UK.

uniPaaS also enables easy remote deployment of new versions of MAX, and an almost real-time correction of potential input errors, assuring the data processing team of a powerful, secure, and scalable infrastructure.

Andrew Chapman understands the benefits of moving applications into the cloud, commenting: "As we enter the new era of cloud based applications we have the scope to deploy our existing MAX system through the cloud to all our branches. Through the use of uniPaaS, we hope to further enhance our status as the world's leading recruitment & HR organization."


Transverse Unveils Cloud Optimized Billing Solution

Provides communication service providers the ability to realize all the inherent advantages of cloud computing

Transverse, a provider of low-cost, carrier-grade business support systems (BSS), on Monday introduced Cloud Optimized blee(p), a complete billing and BSS solution that provides communication service providers the ability to realize all the inherent advantages of cloud computing including scalability, agility and reliability and integration to public cloud services.

With Cloud Optimized blee(p), communication service providers can deploy the award-winning blee(p) billing and BSS application within any virtual environment. These enhancements enable Transverse customers to realize a significant reduction in total cost of ownership for their back office operations. In addition to being optimized to run in cloud computing environments, this version of blee(p) also provides Transverse customers with integration to public cloud services such as inventory, fulfillment, eCommerce, and payment gateways, capturing the further economies of scale that the public cloud has to offer.

“In today’s rapidly changing environment, service providers need the flexibility to quickly and easily deploy new services with minimal investment,” said Jim Messer, chief executive officer, Transverse. “Transverse is unique in the BSS industry by providing SOA and web services-based BSS solutions, which are optimized for the cloud. We help keep our customers’ total cost of ownership to a minimum, while providing them with the ability to better monetize their most valuable assets -- their customer relationships.”

Cloud Optimized blee(p) scales up and down on demand based on communication service providers’ needs, eliminating the need for costly infrastructure that can stand idle at times of low or normal usage. By optimizing blee(p) for a virtual environment, customers can run their back office more cost-effectively, while still receiving best-in-class functionality.

Apps for business: Quick fixes grow in popularity

By Ingrid Lunden

Published: February 10 2010 12:58

Mobile applications could become a game-changer for enterprises, which can use them to implement services faster and more cheaply.

“In the past, to set up a fleet tracking system or a tool for your sales force, you had to work on a specific device platform, and it meant a lot of investment,” says Christophe François, vice-president for mobile multimedia products and services at Orange.

Read more of Ingrid's article at the Financial Times.


Tuesday, 16 February 2010

Apple Grabs 25% of the Smartphone Market, Android Doubles Market Share

comScore has released a report on the state of the US mobile market from September to December 2009, and it shows that the recently established trends of Android and iPhone growth don't show signs of ceasing.
In December 2009, RIM was still the leading smartphone operating system in the U.S. with 41.6% market share, a slight drop from 42.6% in September 2009. Apple rose from 24.1% to 25.3% in the same period, and Google, although still in fifth place, doubled its market share, from 2.5% to 5.2%.

Microsoft lost one percentage point and dropped to an 18% share, but the biggest loser of the bunch was Palm, which fell from 8.3% to 6.1% despite recent price cuts that made its smartphones among the cheapest on the market. If Palm doesn't do something to reverse this trend, it may soon be looking at the back of Android, which is growing like a weed both in the US and internationally.

Cloud Computing Will Cause Three IT Revolutions

During the next two to five years, you'll see enormous conflict about the technical pros and cons of cloud computing. Three of cloud's key characteristics will create three IT revolutions, each with supporters and detractors. Bernard Golden explains the first of these, involving scalability and IT operations.

Every revolution results in winners and losers — after the dust settles. During the revolution, chaos occurs as people attempt to discern if this is the real thing or just a minor rebellion. All parties put forward their positions, attempting to convince onlookers that theirs is the path forward. Meanwhile, established practices and institutions are disrupted and even overturned — perhaps temporarily or maybe permanently. Eventually, the results shake out and it becomes clear which viewpoint prevails and becomes the new established practice -- and in its turn becomes the incumbent, ripe for disruption.

This is true in technology as in every other domain. In the tech business, we often get caught up in focusing on vendor winners and losers. Point to the client/server revolution, and it's obvious — Microsoft (MSFT) and Intel (INTC). Over on the loser side stand the minicomputer vendors. This winner/loser phenomenon can be seen in every significant technology shift (and indeed, one shift's winner can become a future loser). This is understandable: we all love conflict and the vendor wars make for great press.

There is far less awareness of the effects these revolutions have on what makes up the vast majority of the technology industry — users. One could hazard a guess that for every dollar of revenue that Microsoft products pull in, IT organizations spend 10 or 20 additional dollars (or perhaps even more) building and running systems. By far the biggest impact of any technology revolution is that upon technology users (by which I mean those who work with the technology, i.e., IT organizations).

Another aspect of change is how individuals react to it. It's a cliche that "people don't like change." That's dead wrong. People accept — and even embrace — change when they see it brings a direct benefit. Look at the immediate adoption of the iPhone — didn't see a lot of resistance to that, did you? A more nuanced understanding of people's reaction to change would interpret likely reactions based upon how the effect of the change is perceived by the individual — is it a benefit or a threat?

When it comes to organizations, it's a misreading to assume that the organization will react as a whole — every organization is made up of groups and individual actors, each of which will have its (or his or her) own read on the implications of a change. If we look to the original move of PCs into companies, some portions of IT organizations embraced them, while others, wedded to the existing mode of performing IT, saw them as a distraction, a threat, or a toy. In other words, there were different camps that arose in reaction to the availability of this new form of computing, and there were pitched battles for personal and organizational influence that took the guise of a technical assessment.

Read More

Monday, 15 February 2010

Cloud: Hype, Opportunity or Disintermediation

Will such disintermediation be the final straw for the ailing Telecom giants already faced with the ultimate threat of being relegated to just bit pipe providers? 

With the growth of the ‘prosumer', Telco customers continue to search for the ultimate experience at the lowest possible price, and who can blame them? But with the vast redistribution of wealth from the operators to the ever creative software/hardware companies, is the implementation of cloud solutions just another headache for the industry?

Leaders in Voice such as AT&T, NTT and Virgin Mobile USA have called an urgent meeting at the NGT Summit (hosted by GDS International) to discuss the implications of the Cloud phenomenon in an already volatile and competitive market place.

"As with all business the customer will ultimately drive the move," said a spokesperson for NGT 20 consortium. "Disintermediation is a key concern for the industry which must be addressed."

There is no doubt the Cloud creates an improved customer service but will it be to the telecom industry what itunes was to the music industry? Will such disintermediation be the final straw for the ailing Telecom giants already faced with the ultimate threat of being relegated to just bit pipe providers?

Whatever mix of connectivity products and value-added services the large operators provide, customers need their business needs addressed, and the industry must be prepared to evolve to meet them. Is the industry ready?

read more

Transitioning to Cloud Computing

The drive toward cloud computing continues to be a dominant infrastructure deployment theme for organizations looking to reduce costs, increase storage and optimize mobility. What many fail to realize is the trend towards cloud computing is continually forcing IT managers to rethink fundamental security issues as a barrage of new attacks and exploits continue to assault the cloud every day. Compelling for any business model, cloud computing delivers a scalable, accessible and high-performing computing infrastructure that comes at an appealing price for organizations. Similarly, operating in the cloud allows for the convergence of new and emerging technologies. Providing appeal to both the provider and the consumer, cloud computing enables new application deployment and recovery options, as well as new application business models. However, cloud computing may not be the panacea that the press and many organizations make it out to be. We must have trust and confidence in the platform on which we are deploying our applications and data. We must be able to maintain control of the information that drives our business. Ultimately, we must be able to prove that trust to our auditors. The solution, having not yet been defined, could be deemed "auditability."

Friday, 12 February 2010

Ground Rules For Mobilizing Users

Last week, I discussed rolling user-owned mobile devices into your mobility plans. While letting users pay for their own wireless access may be a boon to the company's bottom line, it can also snowball into a nightmare for administrators stuck supporting a slew of new devices. A little advance work and planning can help thwart the dark side of mobility: the threat of unending help desk calls, security threats and general chaos. First and foremost, you'll need to set some ground rules.

On the top of the list should be defining the limits of IT support. All too often, end-users assume that it is IT's role to make every piece of technology they possess work. Don't believe me? Just try to find an IT professional who hasn't been asked to fix a co-worker's home computer. It has to be made very clear to users that IT makes no guarantees that connecting a mobile device to corporate email will work, but that they will do what they can to help the users succeed. Building up an archive of support documents and laying out the exact process needed to connect a given mobile device will go a long way in guiding all but the most technically-challenged users in getting and staying connected. If your IT staff does not have access to these mobile devices and your budget doesn't allow bringing a few in house, ask your users. There are no doubt a few folks within your organization who would be willing to trade a little time documenting and taking screen shots for early mobile access.

The second rule applies to the devices themselves, and how many are allowed to connect. Microsoft Exchange allows a virtually unlimited number of mobile devices to connect to a user's mailbox via Exchange ActiveSync. While this may thrill the gadget-happy users who want their entire collection of iPhones, iPads and Android devices linked to their mail, multiple devices compound the risk of one of them being stolen or lost.

Furthermore, administrators need to make it clear that "jailbreaking" or "rooting" devices, a process which opens them up to third-party applications and networks, is expressly forbidden. While it may be attractive for users to "free up" their devices, doing so opens up brand new security threats. In fact, rogue applications have already made their way onto these liberated devices. Another side effect of jailbreaking is that there is typically a lag between official releases and their jailbroken counterparts, meaning users wait longer for the known security fixes in the official releases.

If, despite your best efforts, the chaos of mobility gets to be too much, there are a number of products out there that can provide a level of visibility and control. Vendors like MobileIron, zenprise, iPass and Fiberlink are building tools, appliances and services that can bring visibility, control and a number of self-service options to mobile devices. As the paradigms of mobility continue to evolve, look for companies like these, as well as additional offerings from both traditional enterprise vendors and the wireless carriers, to make the dark side of mobility a little brighter.

Original Article

MobileIron

Read more Cloud Distribution News

Google's Smartphone Management Drops Another Big Barrier To "Apps" Adoption

Google has removed what for many CIOs and IT professionals has been one of the last remaining hurdles to their adoption of Google Apps for business documents, spreadsheets, presentations and probably most importantly: email. Yesterday, the biggest of the cloud-based challengers to Microsoft and IBM-Lotus announced it now has a range of Blackberry-esque mobile device management features (including the ability to remotely wipe a smartphone) for iPhones as well as Windows Mobile and Nokia-based devices.

Thursday, 11 February 2010

Oracle Shoots Down Sun Cloud

Amazon's EC2 and S3 have nothing to fear from Sun. Oracle says it's going to blow the Amazon-aping Sun Open Cloud away. It does not want to pursue the on-demand service (announced last March, a month before Oracle agreed to buy Sun, and promised for the summer), which was to be a worldwide public cloud using OpenSolaris, Linux, Windows, Sun Grid Engine, ZFS, MySQL and Java, running on Sparc and x86 blades, complete with open APIs.

Wednesday, 10 February 2010

AppZero Named One of “100 Coolest Cloud Computing Products”

AppZero has announced that Everything Channel's CRN has named its server-side application virtualization software one of the "100 Coolest Cloud Computing Products." The Top 100 Cloud Computing products include 20 storage vendors, 20 security vendors, 20 productivity vendors, 20 infrastructure vendors and 20 platform vendors.

AppZero technology encapsulates a server application in a Virtual Application Appliance (VAA), which contains all of the application's dependencies but zero operating system (OS) components. The VAA, freed from any OS encumbrance, can move seamlessly from the data center to, and among, clouds and back, across both physical and virtual servers.

"AppZero's mobility of server-side applications makes it easy for enterprise data centers to instantaneously provision a single copy of a ‘gold' application image to 1,000s of machines in the data center and/or to any cloud or combination of clouds," says Greg O'Connor, AppZero CEO. "This automation of installation and configuration of server applications speeds deployment, reduces errors, and cuts the need for a human touch."

The list was based on nominations from Solution Providers rating technology, channel influence, effectiveness and visibility along with business and sales impact. The final selections were made by a panel of Everything Channel Editors.

Winners were announced online.

Symbian switches to open source

Symbian phone operating system goes open source

The group behind the world's most popular smartphone operating system - Symbian - is giving away "billions of dollars" worth of code for free.

The Symbian Foundation announced that it would make its code open source in 2008 and has now completed the move.

It means that any organisation or individual can now use and modify the platform's underlying source code "for any purpose".

Symbian has shipped in more than 330m mobile phones, the foundation says.
It believes the move will attract new developers to work on the system and help speed up the pace of improvements.

"This is the largest open source migration effort ever," Lee Williams of the Symbian Foundation told BBC News.

"It will increase the rate of evolution and the rate of innovation of the platform."

Ian Fogg, principal analyst at Forrester Research, said the move was about Symbian "transitioning from one business model to another", as well as trying to gain "momentum and mindshare" for software that had been overshadowed by the release of Apple's iPhone and Google's Android operating system.

Evolutionary barrier
Finnish mobile phone giant Nokia bought the software in 2008 and helped establish the non-profit Symbian Foundation to oversee its development and transition to open source.

The foundation includes Nokia, AT&T, LG, Motorola, NTT Docomo, Samsung, Sony Ericsson, STMicroelectronics, Texas Instruments and Vodafone.

The group has now released what it calls the Symbian platform as open source code. This platform unites different elements of the Symbian operating system as well as components - in particular, user interfaces - developed by individual members.
Until now, Symbian's source code was only open to members of the organisation.
It can be downloaded from the foundation's website from 1400 GMT.

Mr Williams said that one of the motivations for the move was to speed up the rate at which the 10-year-old platform evolved.

"When we chatted to companies who develop third party applications, we found people would spend up to nine months just trying to navigate the intellectual property," he said.

"That was really hindering the rate of progress."

Opening up the platform would also improve security, he added.

'Mind share'
Symbian development is currently dominated by Nokia, but the foundation hoped to reduce the firm's input to "no more than 50%" by the middle of 2011, said Mr Williams.

"We will see a dramatic shift in terms of who is contributing to the platform."
However, said Mr Williams, the foundation would monitor phones using the platform to ensure that they met with minimum standards.

Despite being the world's most popular smart phone operating system, Symbian has been losing the publicity battle, with Google's Android operating system and Apple's iPhone dominating recent headlines.
"Symbian desperately needs to regain mindshare at the moment," said Mr Fogg.

"It's useful for them to say Symbian is now open - Google has done very well out of that."
He also said that the software "may not be as open and free as an outsider might think".
"Almost all of the open source operating systems on mobile phones - Nokia's Maemo, Google's Android - typically have proprietary software in them."

For example, Android incorporates Google's e-mail system Gmail.
But Mr Williams denied the move to open source was a marketing move.

"The ideas we are executing came 12-18 months before Android and before the launch of the original iPhone," he said.

Tuesday, 9 February 2010

The Network Beneath the Clouds

Lew Tucker's response to the Lew's Law blog helped to clarify the point I was trying to make about the rise of IT automation and the network's pivotal role: "While I'm sure this is more obvious than brilliant, the cost of computing will continue to fall, bounded only by the cost of the power to produce the computation. We're simply turning electricity into computation and communication, using electrons to move around bits. What cloud computing does is to use automation, scaling, multi-tenancy, and a competitive marketplace to bring us closer to this lower bound for the cost of computing."

For those of us familiar with the network equipment industry, the suggestion that computing costs will fall to the cost of power should at least raise some eyebrows within cultures that for the most part have been driven by manual labor and specialization.

While the network powered the automation of business practices it has managed to insulate itself from the competitive forces it unleashed between competing supply chains and operating models.
I think that the automation of the network is at least partially in response to:

1) the threat of consumerization;
2) the growing significance of new initiatives like IPv6, virtualization, DNSSEC and private cloud; and
3) rising network complexity (more endpoints, higher rates of change, ongoing labor specialization).
Now add to that lineup of manual opex pain a recent Gartner prediction that 1 in 5 businesses will dump their IT assets by 2012 (Jon Brodkin, Network World):

The analyst firm predicts that 20 percent of businesses will own no IT assets by 2012, a shift that will have a major impact on IT careers

Anyone still clinging to their manual DDI (DNS, DHCP, and IP address management) scripts and spreadsheets, for example, after reading this sobering and yet provocative prediction will likely join the legions of "middlemen" who were eventually retrained as network-enabled supply chains replaced similar business processes.  This transformation has today set the stage for the coming conflict between the real-time enterprise and the increasingly inflexible network.
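The DDI bookkeeping those scripts and spreadsheets handle is exactly the kind of task that lends itself to automation. A minimal, hypothetical sketch using Python's standard ipaddress module (network values invented for illustration):

```python
import ipaddress

def plan_subnets(block: str, new_prefix: int):
    """Carve an address block into equal-sized subnets: the sort of
    bookkeeping that DDI tooling automates and spreadsheets get wrong."""
    net = ipaddress.ip_network(block)
    return [str(subnet) for subnet in net.subnets(new_prefix=new_prefix)]

# Split a /22 allocation into four /24s for per-site assignment.
print(plan_subnets("10.0.0.0/22", 24))
# → ['10.0.0.0/24', '10.0.1.0/24', '10.0.2.0/24', '10.0.3.0/24']
```

Real DDI products layer DNS and DHCP record management, auditing and change control on top of this kind of address arithmetic; the point is simply that none of it belongs in a spreadsheet.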

The recent ESG 2010 Outlook and Gartner DDI MarketScope for DDI appliances are harbingers of a new shift in how networks are managed; and a host of vendors and professionals who embrace automation stand to benefit.

A reduction in the operating expense of networks could vastly expand the market for network solutions that are easier to manage, more powerful and connect ever larger populations of systems and endpoints.  Enterprises will be forced to automate or outsource to those who do.

This new network will be all about availability, flexibility and economy and will set the stage for a new resurgence in network spending and the rise of network software.

Monday, 8 February 2010

Elastra Named Cool Cloud Computing Vendor

Elastra Corporation announced that Everything Channel's CRN has named its Enterprise Cloud Server (ECS) as one of the "100 Coolest Cloud Computing Products" of the year, and part of the top "20 Coolest Cloud Infrastructure Vendors." The Top 100 Cloud Computing products include 20 storage vendors, 20 security vendors, 20 productivity vendors, 20 infrastructure vendors and 20 platform vendors.

Elastra ECS brings an unprecedented level of automation to IT organizations seeking to leverage cloud computing to address the fundamental challenges they face today. ECS increases IT agility by cutting lead times through automatically generating deployment plans and provisioning of systems. ECS also helps reduce IT costs by minimizing human intervention in systems management while allowing IT organizations to maintain operations control by integrating with their existing tools and technologies.

"It's an honor to be recognized by the editors of CRN. The demand for IT efficiency is on the rise and it's important to create products to meet those requirements," said Peter Chiu, Director of Product Management, Elastra. "ECS seamlessly fits into existing data centers and addresses IT's perennial challenge of getting more done with fewer resources, delivering faster results and reducing IT operational costs."

The "100 Coolest Cloud Computing Products" list was based on nominations from Solution Providers rating technology, channel influence, effectiveness and visibility along with business and sales impact. The final selections were made by a panel of Everything Channel Editors.

Considering a move to hosted email services? READ THIS FIRST

Cloud-based email services are mature, flexible and cost-effective. So why do so many small businesses try to recreate what they've been using (and paying through the nose for) for years?

In my last start-up, I wanted to ensure the business had a robust, flexible and cost-effective email solution, but I did not want to invest in servers, licensing and in-house support for a Microsoft Exchange solution. As it happened, we shared an office with a Microsoft Hosted Exchange provider, so the obvious route was to use their service, which cost circa £12 ($19) per seat per month. The solution worked, was familiar to the staff and provided web-based access (via Outlook Web Access), as well as calendaring and SharePoint for document management. For the first 18 months, I was happy, the staff were happy and the guys who supported the network were happy.

However, as the business grew, the cost of the solution began to outweigh the benefits and developments we were seeing from the system. The truth was (and still is, I am sure) that it became very costly and time-consuming for the Microsoft provider to deploy new features, releases and additional services. The end game was that, 18 months on, we still had exactly what we had on day one: old technology made for servers on the LAN, not suited to cloud-based deployment, development and scale.

More recently, I set about building Cloud Distribution, my new business, which is focused (as you may have guessed) on Cloud Computing solutions. I took a look around the market and, as expected, the same Microsoft-based providers were there en masse with little or no differentiation other than price, which still came in at around £12 ($19) per seat per month. That's £180 ($288) a month for the 15 mailboxes we needed from day one. Now, in start-up land every pound counts, and I simply was not prepared to splash out on old technology no matter how familiar it was to my staff, so I looked around and found (not that easily, I have to tell you) Google Apps Standard Edition. What an eye-opener!

Up to 50 seats, a 7GB mailbox for every user, web access via Gmail, desktop access via any IMAP or POP client such as Outlook, Thunderbird, etc. Full group calendaring, Google Docs document management (recently extended to a full storage solution where any type of document can be uploaded and accessed), iPhone push email, calendar and contact sync, and a whole host of other great small business features for NOTHING. Yes, that's right: nada, zilch, zero per seat per month, for up to 50 seats. Where's the catch?
Believe me, from a small business perspective, there simply isn't one. Google Apps rocks. Period.
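For reference, the IMAP/POP desktop access mentioned above uses the standard Gmail client settings. These are the usual SSL/TLS defaults, but verify them against Google's current help pages before rolling out:

```
IMAP:  imap.gmail.com   port 993  (SSL)
POP3:  pop.gmail.com    port 995  (SSL)
SMTP:  smtp.gmail.com   port 587  (TLS) or 465 (SSL)
```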

Now, I can hear the masses saying "yes, but it does not do this, that and the other", but the truth is I can live without the bits it does not do, saving £180 ($288) a month for all of the other bits it does do that Microsoft's solutions will never do, like delivering new features on a weekly basis, smart integration with our cloud-based CRM platform, and lots of other elements which work for any business that wants to do business, not support email systems.

Why am I selling Google Apps? It's certainly not to make money for Cloud Distribution. As I said, it's free, and any IT-savvy person who knows how to configure a domain via their ISP can get it up and running in a day, max. I am outlining this proposition because it is typical of the way Cloud solutions can change the dynamics of a traditionally LAN-based application or service. So think about what other services a small business could use Cloud IT for:

•    Clean Email (already widely used)
•    Firewalling (currently at the router or behind it)
•    Wi-Fi (Controller in the Cloud not on the LAN)
•    VPN (Push it up to the ISP)
•    Unified Threat Management (again, push it up to the ISP)
•    Telephony (Hosted IP Centrex and SIP Trunking)
•    Device Management
•    Security Assessments

All of these can be moved up to the cloud and delivered with many additional benefits at a fraction of the cost of LAN-based services. So, look for the Google Apps of your next application or service requirement: save your business money, make your life easier and help your users do more with less.
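As a concrete illustration of the "configure a domain via their ISP" step mentioned earlier, pointing a domain's mail at Google Apps is mostly a matter of publishing Google's MX records. A hedged zone-file sketch for a hypothetical example.com (hostnames and priorities as Google published them at the time; verify against the current Google Apps setup documentation before use):

```
; Point example.com's mail at Google Apps (illustrative only).
example.com.    IN  MX  1   ASPMX.L.GOOGLE.COM.
example.com.    IN  MX  5   ALT1.ASPMX.L.GOOGLE.COM.
example.com.    IN  MX  5   ALT2.ASPMX.L.GOOGLE.COM.
example.com.    IN  MX  10  ASPMX2.GOOGLEMAIL.COM.
example.com.    IN  MX  10  ASPMX3.GOOGLEMAIL.COM.
```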

Friday, 5 February 2010

SlideShare Launches Channels for Businesses and Brands


Cloud presentation sharing site SlideShare today adds a new Channels service to its professional content sharing arsenal, allowing businesses and brands to create custom microsites within the community. With Channels, companies can create a branded channel for sharing professional content including presentations, whitepapers and webinars, or sponsor a topical content channel curated by SlideShare staff.

Combined with the LeadShare and AdShare programs launched in late 2009, businesses can develop integrated social media campaigns, from custom brand experience to lead generation to targeted promotion of professional content to the large community of business leaders and decision makers that comprise SlideShare's more than 25 million unique visitors per month.

The above screenshot showcases a highly customizable branded channel experience for Microsoft Office, while the below image depicts the second curated channel option. In the latter, the focus is on curating great content within SlideShare to incorporate around a chosen topic (virtualization, in the example below). Interested users can follow a particular channel to get notified of new updates.

Clouds Are Like Onions

Which of course are like Ogres. They’re big, chaotic, and have lots of layers of virtualization.

In discussions involving cloud it is often the case that someone will remind you that “virtualization” is not required to build a cloud.

But that’s only partially true, as some layers of virtualization are, in fact, required to build out a cloud computing environment. It’s only “operating system” virtualization that is not required.

The problem is that, unlike the term “cloud”, “virtualization” has come to be associated with a single, specific kind of virtualization; it is almost exclusively used to refer to operating system virtualization, a la Microsoft, VMware, and Citrix. But many kinds of virtualization have existed for much longer than operating system virtualization, and many of them are used extensively in data centers both traditional and cloud-based.
Like ogres, the chaotic nature of a dynamic data center based on these types of virtualization can be difficult to manage.

Layer upon layer of virtualization within the data center, like the many layers of an onion, are enough to make you cry at the thought of how to control that volatility without sacrificing the flexibility and scalability introduced by the technologies. You can’t get rid of them, however, as some of these types of virtualization are absolutely necessary to the successful implementation of cloud computing. All of them complicate management and make more difficult the task of understanding how data gets from point A to point B within a cloud computing environment.


Yes, that’s right: eight kinds of virtualization exist, though we tend to focus on just one, operating system virtualization. Some may or may not be leveraged in a cloud computing environment, but at least four of them are almost always found in all data center environments.
  1. Operating System Virtualization is what we tend to think of when we simply say “virtualization.” This is the virtualization of compute resources, the slicing and dicing of a single physical machine into multiple “virtual” machines typically used today to deploy several different applications (or clones of a single application) on the same physical hardware.
  2. Network Virtualization is likely one kind of virtualization many don’t even consider virtualization, but it is, and it even has standards that help ensure consistency across implementations. The VLAN (Virtual LAN) has existed since the early days of networking and is used in cloud computing environments to isolate customer data. VLANs essentially create a virtual network overlay atop an existing physical network, slicing and dicing the physical connections into multiple virtual (and hopefully smaller) networks that can be configured to provide security and network-layer functions, like quality of service and rate shaping, peculiar to the applications and users that are directed over the VLAN. VLAN tagging, used to identify traffic as “belonging” to a specific virtual network, is defined by IEEE 802.1q.

    Also a form of network virtualization is trunking or link aggregation as defined by IEEE 802.1ad. Trunking aggregates multiple physical ports on a switching device and makes them appear as one logical (virtual) link, providing additional bandwidth to high volume networks as well as load balancing traffic across the physical interconnects in order to maintain consistent network performance. Interestingly enough, VLANs are almost always used when trunking is used in a network.

    And of course there is NAT (Network Address Translation), which is also a form of network virtualization. Because of the dearth of IP addresses, most users internal to an organization are directed through a pool of one or more public IP addresses (routable, i.e. accessible by people across the Internet) to access resources external to the organization. The virtualization here again makes many IP addresses (internal, non-routable, private) appear to be one or a small number of IP addresses (public, routable, external). This process is also used on inbound connections, making one or a small number of external, public IP addresses appear to represent multiple, internal, private IP addresses.
  3. Application Server Virtualization occurs when a load balancer, application delivery controller, or other proxy-based application network device “virtualizes” one or more instances of an application. Virtualizing an application server makes multiple servers appear to clients as one ginormous server, and it acts in a manner very similar to trunking in that this form of virtualization is about aggregation. When applied to application servers, the aggregation is of compute resources.

    This form of virtualization is almost always necessary in a data center, whether traditional or cloud-based. Application server virtualization is the foundation on which failover (reliability) and scalability are based, and one would be hard-pressed to find a modern data center in which this form of virtualization – whether provided by software or hardware – is not already implemented.
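As a rough illustration of how a proxy makes many servers look like one, here is a hypothetical round-robin virtual server in Python; the backend names and the health-tracking scheme are invented for the sketch:

```python
import itertools

class VirtualServer:
    """Presents a pool of application servers as one logical endpoint.

    Clients see a single address; the proxy picks a healthy backend for
    each request (round-robin here), which is how this virtualization
    underpins both failover and scalability."""

    def __init__(self, backends):
        self.backends = list(backends)
        self.healthy = set(self.backends)
        self._rr = itertools.cycle(self.backends)

    def mark_down(self, backend):
        # A failed health check removes the backend from rotation.
        self.healthy.discard(backend)

    def pick(self):
        # Skip unhealthy backends; give up after one full rotation.
        for _ in range(len(self.backends)):
            candidate = next(self._rr)
            if candidate in self.healthy:
                return candidate
        raise RuntimeError("no healthy backends")

vip = VirtualServer(["app1:8080", "app2:8080", "app3:8080"])
print(vip.pick())        # app1:8080
vip.mark_down("app2:8080")
print(vip.pick())        # app3:8080 (app2 is skipped)
```

Skipping unhealthy backends in pick() is the failover half of the story; adding backends to the pool without clients noticing is the scalability half.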
  4. Storage Virtualization is another form of aggregation-based virtualization. Storage virtualization aggregates multiple sources of storage such as NAS (network attached storage) devices and NFS/CIFS shares hosted on various servers around the data center and “normalizes” them into a single, consistent interface such that users are isolated from the actual implementation and see only the “virtual” namespaces presented by the storage virtualization device.
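The “single, consistent interface” idea can be sketched as a longest-prefix mount table that maps virtual paths onto physical shares; this hypothetical Python fragment invents all the paths and share names:

```python
class VirtualNamespace:
    """Aggregates several storage back ends behind one logical file tree.

    Users address files by a single virtual path; the prefix map decides
    which physical NAS share or NFS/CIFS export actually holds the data."""

    def __init__(self):
        self.mounts = {}  # virtual prefix -> physical location

    def mount(self, virtual_prefix, physical_target):
        self.mounts[virtual_prefix] = physical_target

    def resolve(self, virtual_path):
        # Longest-prefix match, like a mount table.
        best = max((p for p in self.mounts if virtual_path.startswith(p)),
                   key=len, default=None)
        if best is None:
            raise FileNotFoundError(virtual_path)
        return self.mounts[best] + virtual_path[len(best):]

ns = VirtualNamespace()
ns.mount("/corp/hr", "nas1:/exports/hr")
ns.mount("/corp/eng", "filer2:/vol/engineering")
print(ns.resolve("/corp/hr/payroll.xls"))   # nas1:/exports/hr/payroll.xls
print(ns.resolve("/corp/eng/specs/a.doc"))  # filer2:/vol/engineering/specs/a.doc
```

The point of the indirection is that hr’s share can be migrated to a different filer by changing one mount entry, with users’ virtual paths untouched.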
There are four other “types” of virtualization, but it is these four that are primarily utilized today and with which most folks are already familiar – it just may be that they are using different terminology. Perhaps that’s because virtualization of the network and application server have existed for so long that most people do not associate them with virtualization. All four of these kinds of virtualization end up forming layers of abstraction throughout the network and, like operating system virtualization, introduce management and architectural challenges that are increasingly difficult to address as environments become more and more dynamic, à la a cloud computing environment.

Read more Cloud Distribution News @

Thursday, 4 February 2010

Microsoft Brings Cloud Interoperability Down to Earth

An interoperable cloud could help companies cut costs and governments connect constituents, say Microsoft executives. Governments and businesses alike are looking at cloud services as a way to consolidate IT infrastructure, scale their IT systems for the future, and enable innovative services and activities that were not possible before. To help organizations realize the benefits of cloud services, technology vendors are investing in the hard work of identifying and solving the challenges presented by operating in mixed IT environments, and are collaborating to ensure that their products work well together. In fact, although the industry is still in the early stages of collaborating on cloud interoperability issues, there has already been considerable progress. But what does 'cloud interoperability' mean, and how is it benefiting people today?

Cloud interoperability is specifically about one cloud solution, such as Windows Azure, being able to work with other platforms and other applications, not just other clouds. Customers also want the flexibility to run applications either locally or in the cloud, or on a combination of the two. Microsoft is collaborating with others in the industry and working hard to ensure that the promise of cloud interoperability becomes a reality.
Leading Microsoft’s interoperability efforts are general managers Craig Shank and Jean Paoli. Shank spearheads the company’s interoperability work on global standards and public policy, while Paoli collaborates with Microsoft’s product teams as they map product strategies to customers’ needs.

Shank says one of the main attractions of the cloud is the degree of flexibility and control it gives customers: ‘There’s a tremendous level of creative energy around cloud services right now — and the industry is exploring new ideas and scenarios together all the time. Our goal is to preserve that flexibility through an open approach to cloud interoperability.’

Adds Paoli, ‘This means continuing to create software that’s more open from the ground up, building products that support the existing standards, helping customers use Microsoft cloud services together with open source technologies such as PHP and Java, and ensuring that our existing products work with the cloud.’
Shank and Paoli firmly believe that welcoming competition and choice will make Microsoft more successful in the future. ‘This may seem surprising,’ notes Paoli, ‘but it creates more opportunities for its customers, partners and developers.’

With all the excitement around the cloud, Shank says it’s easy to lose sight of the payoff. ‘To be clear, cloud computing has enormous potential to stimulate economic growth and enable governments to reduce costs and expand services to citizens.’

The public sector provides a great example of the real-world benefits of cloud interoperability, and Microsoft is already delivering results in this area through solutions such as the Eye on Earth project. Working with the European Environment Agency, Microsoft is helping the agency simplify the collection and processing of environmental information for use by government officials and the general public. Using a combination of Windows Azure, Microsoft SQL Azure and pre-existing Linux technologies, Eye on Earth pulls data from 22,000 water monitoring points and 1,000 stations that monitor air quality. It then helps synthesize this information and makes it available for people to access in real time in 24 languages.
This level of openness and interoperability doesn’t happen by accident. ‘The technical work of interoperability is challenging, and it requires a commitment to our customers’ needs as well as a concerted effort on multiple fronts and a measured, pragmatic approach in how technology is developed,’ Paoli says. Microsoft’s efforts in this area include designing its cloud services to be interoperable. The Windows Azure platform, for example, supports a variety of standards and protocols. Developers can write applications to Windows Azure using PHP, Java, Ruby or the Microsoft .NET Framework.

Many of these product developments are the result of diverse feedback channels that Microsoft has developed with its partners, customers and other vendors.

For example, in 2006 Microsoft created the Interoperability Executive Customer (IEC) Council, a group of 35 chief technology officers and chief information officers from organizations around the world. They meet twice a year in Redmond to discuss their interoperability issues and provide feedback to Microsoft executives such as Microsoft Server and Tools President Bob Muglia.

In addition, last week Microsoft published a progress report, sharing for the first time operational details and results achieved by the Council across six work streams, or priority areas. And the Council recently commissioned the creation of a seventh work stream for cloud interoperability, aimed at developing various standards related to the cloud, working through business scenarios and priorities such as data portability, and establishing privacy, security, and service policies around cloud computing.

Microsoft also participates in the Open Cloud Standards Incubator, a working group formed by the Distributed Management Task Force (DMTF), a consortium through which more than 200 technology vendors and customers develop new standards for systems management. AMD, Cisco, HP, IBM, Microsoft, Red Hat and VMware are among a handful of IT vendors that lead the Open Cloud Standards Incubator, creating technical specifications and conducting research to expedite adoption of new cloud interoperability standards.
Developers also play a critical role. Microsoft is part of Simple Cloud, an effort it co-founded with Zend Technologies, IBM and Rackspace designed to help developers write basic cloud applications that work on all of the major cloud platforms.

Microsoft is also engaging in the collaborative work of building technical ‘bridges’ between Microsoft and non-Microsoft technologies, such as the recently released Windows Azure Software Development Kits (SDKs) for PHP and Java and tools for Eclipse version 1.0, the new Windows Azure platform AppFabric SDKs for Java, PHP and Ruby, the SQL CRUD Application Wizard for PHP, and the Bing 404 Web Page Error Toolkit for PHP. Each is an example of the Microsoft Interoperability team’s yearlong work with partners to bring core scenarios to life.

Though the industry is still in the early stages of collaborating on cloud interoperability issues, great progress has already been made. The average user may not realize it, but this progress has had a significant positive impact on the way in which we work and live today.

Cloud interoperability requires a broad perspective and creative, collaborative problem-solving. Looking ahead, Microsoft will continue to support an open dialogue among the different stakeholders in the industry and community to define cloud principles and incorporate all points of view to ensure that in this time of change, there is a world of choice.

Most Not Interested in Cloud Data Storage

Another dose of reality for the Cloud Computing industry! A new survey by Forrester says that just 3% of companies use cloud storage. Worse, the vast majority of firms don’t plan to put data in the cloud. This is the latest in a run of poor showings for the cloud, and I have a theory about it. But first, read on: Forrester interviewed more than 1,200 IT decision makers at enterprises and small and mid-size businesses in North America and Europe. The research company asked IT decision makers if they had plans to adopt cloud storage services such as those offered by Amazon S3, EMC Atmos, Nirvanix, The Planet, or AT&T.

·       43% said they’re not interested in cloud storage;
·       An equal proportion were interested but have no plans to adopt;
·       3% plan to implement a cloud storage platform in the next year;
·       5% plan on it a year from now or later;
·       And, while 3% have already switched to cloud storage, only 1% are expanding an existing implementation.

To me, this reflects issues and concerns that just won’t go away on the part of IT folks and end users, chief among them the need for assurances of guaranteed service levels and security.  Forrester agrees, according to a story about the survey in SF Gate:

Forrester analyst Andrew Reichman writes in the report that “there is long-term potential for storage-as-a-service, but Forrester sees issues with guaranteed service levels, security, chain of custody, shared tenancy, and long-term pricing as significant barriers that still need to be addressed before it takes off in any meaningful way.”

One interesting finding of the survey is that companies are more interested in the cloud for back-up storage rather than general purpose storage. Why is that?

“First, it’s a complete service offering, not just CPU or storage capacity,” Reichman writes. “You get the backup software intelligence and storage capacity in a fully managed service. Second, it’s solving a very specific pain point – the pain of bringing a costly and error-prone, but very necessary, IT function under control. This is in contrast to storage-as-a-service offerings where the user has to figure out how to put it all together.”

I’ll put my theory about this finding out there, too. I think companies feel they’re taking less of a chance backing up data to the cloud: it’s data they own and can still access on internal servers. That doesn’t require the same leap of faith as migrating all your data to the cloud…with the fear that you can’t recover any information that gets lost.

This is why companies that do use cloud data storage also take safeguards to ensure their data remains safe and accessible – such as cloud monitoring services.


Wednesday, 3 February 2010


I am attending the OpSource webinar - Making Channels Work to Grow Your SaaS / Cloud Business TONIGHT @ 6pm


Genetic Cloud: Cloud Computing Is For Everyone Every Day

Genetic algorithms are very simple. They have only three steps.  

Cloud Computing provides almost unlimited resources, always available, with payment only for the time you actually use. This article continues the series about the new opportunities that are becoming available to everyone.

Do you play billiards? I do. I’m not a professional player, so I enjoy pocketing a ball by accident: the ball rebounds off the cushions, rolls toward the pocket, almost stops spinning right at the edge, and … drops in. What a stroke of casual luck!

But I’m not sure that luck is with me today. The manager heading my way doesn’t look like he’s about to simplify my workday.

“A task for you….”
Stripping out the technical details, commercial secrets, and the rest, the task can be presented as the well-known travelling salesman problem with 50 points. Yes, the number of possible routes is on the order of 50! (about 10^64). On top of that, the distance matrix is not static. Even if I find a suitable library quickly, it will take a long time to work the task’s peculiarities into it.
“When should it be ready?” I ask guardedly.
“I believe in you,” says my manager, tapping me on the shoulder. “By this evening.”
“And what if it’s ready before evening?” I try to joke.
“If it’s ready earlier, you can even play billiards for the rest of the day,” my manager answers in the same manner.

OK, at least I’m motivated now.
There are plenty of methods for solving such a problem.
It would be ideal if the task could be solved with minimal effort on my part, even if that costs a little more of some other resource, such as computing time. And indeed it can! A genetic algorithm is the simplest method I can implement. The catch is that I would then just sit waiting for the result, which is a waste of time. But I can use cloud resources instead: that saves my time, and I can play billiards while I wait.

Stage 1. Genetic Algorithm.

Genetic algorithms are very simple. They have only three steps (excluding an initialization): crossover, mutation, and selection.
The chromosome is a ring path that starts from a fixed first point and proceeds toward a fixed second point (so that the search does not duplicate the same route in both directions).
The implementation is:
  1. The initialization is easy, because the starting population is simply random.
  2. The selection is even easier: calculate each chromosome’s total distance and carry the best (shortest) chromosomes forward into the next round of the search.
  3. The mutation is not difficult either: two random cells of a chromosome are swapped with each other.
  4. The crossover is a little more complex. Two random cells are chosen, which defines two ring segments in each chromosome. One segment is taken from each parent such that the segments overlap as little as possible; duplicated cells are then removed at random, and missing cells are inserted at random positions guided by their placement in the parent chromosomes.
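The article’s C# implementation is not shown, so here is a hedged Python stand-in for the steps above. The circle-of-points demo, the parameters, and the use of a standard order crossover (in place of the ring-splice described in step 4) are my own simplifications:

```python
import math
import random

def tour_length(tour, dist):
    # Ring path: include the edge that closes the loop.
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def mutate(tour):
    # Step 3: swap two random cells of the chromosome.
    a, b = random.sample(range(len(tour)), 2)
    child = tour[:]
    child[a], child[b] = child[b], child[a]
    return child

def crossover(p1, p2):
    # Simplified order crossover (OX), standing in for the ring-splice
    # of step 4: keep a slice of parent 1 and fill the rest in parent
    # 2's order, so the child is always a valid permutation.
    i, j = sorted(random.sample(range(len(p1)), 2))
    middle = p1[i:j]
    rest = [c for c in p2 if c not in middle]
    return rest[:i] + middle + rest[i:]

def solve_tsp(dist, pop_size=60, generations=300):
    n = len(dist)
    # Step 1: random initialization.
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        children = [mutate(random.choice(pop)) for _ in range(pop_size)]
        children += [crossover(*random.sample(pop, 2))
                     for _ in range(pop_size)]
        # Step 2: selection keeps the shortest tours.
        pop = sorted(pop + children,
                     key=lambda t: tour_length(t, dist))[:pop_size]
    return pop[0]

# Tiny demo: eight points on a circle; the optimal tour is the circle.
pts = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
       for k in range(8)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
best = solve_tsp(dist)
print(round(tour_length(best, dist), 3))
```

For the eight-points demo the optimal tour is the circle itself, with length 8·2·sin(π/8) ≈ 6.12, which a run of the sketch should approach closely.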
The C# code is based on European cities, with distances computed from their geographical coordinates.
So the algorithm is ready. Is this search the most efficient one possible? No, but I am efficient: the code is ready quickly, and even if it needs some extra time to compute, the whole task will be finished sooner.

Stage 2. Cloud.

A genetic algorithm is still a randomized method: it guarantees only a near-optimal solution, and the quality depends on its parameters. I can increase the population size to improve the probability of reaching the best result, but that increases the calculation time. Or I can run several instances simultaneously and pick the best result.
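The run-several-instances-and-keep-the-best idea is independent of the particular solver. A hypothetical sketch, using local threads where the article would use separate EC2 instances:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def one_trial(seed):
    # Stand-in for one complete randomized run (e.g. a full GA search):
    # each trial gets its own seed and returns the best score it found.
    rng = random.Random(seed)
    return min((rng.uniform(0, 100) - 50) ** 2 for _ in range(1000))

def best_of(n_instances):
    # Launch several independent runs in parallel (threads here; in the
    # article, each run would be its own EC2 instance) and keep the
    # single best result any of them found.
    with ThreadPoolExecutor(max_workers=n_instances) as pool:
        return min(pool.map(one_trial, range(n_instances)))

# More instances can only improve (or match) the best score found.
print(best_of(4) <= one_trial(0))  # True
```

Because the runs are independent, this scales trivially: doubling the number of instances costs wall-clock nothing and can only improve the best result.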
Since I develop on .NET, I prefer my other working tools to live in Visual Studio as well, so my cloud is in the Studio too. I use EC2Studio, a free tool that for me is the most convenient way to operate Amazon EC2 from inside Visual Studio. Amazon EC2 provides computing resources whenever I need them: I choose the necessary AMI configuration and start an instance, and when I finish using the instance, I terminate it. Payment is only for the time used.
The deployment steps are:
  1. Launch an instance.
    1. Generate a key pair, if it is not generated yet.
    2. Find the necessary AMI (I use the following instances for competitive tests: ami-6a6f8d03 (Win2008, large, EBS), ami-a2698bcb (Win2008, small, EBS), ami-dd20c3b4 (Win2003, large), and ami-df20c3b6 (Win2003, small)).
    3. Start it.
    4. Wait until the instance reaches the running state.
  2. Run the test on the instance:
    1. Get the Administrator password for the instance.
    2. Start a console for the instance.
    3. Copy the application there.
    4. Run it there.
Perfect! Soon I can shut down my laptop and go play billiards.
The picture shows 4 cloud instances and 1 locally run version (on Vista).
The best solution here is 181.85 (it was found on the slowest machine, but then the genetic algorithm is random).
Different software and hardware configurations are available on EC2. My tests use small and large images (small images are 32-bit, large images are 64-bit). The comparative test result, measured as the average time per cycle (selection, mutation, and crossover according to their defined shares):
Small Windows 2003 (32-bit) Amazon instance: 1098 ms
Large Windows 2003 (64-bit) Amazon instance: 594 ms
Large Windows 2008 (64-bit) Amazon instance: 503 ms
Small Windows 2008 (32-bit) Amazon instance: 1139 ms
My laptop (Vista): 1659 ms
Taking into account the fourfold price difference between a small and a large instance on EC2, the small instance works out roughly twice as cheap for this program (when execution time is not critical).
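That back-of-the-envelope conclusion can be checked directly. The sketch below assumes the fourfold price ratio stated above and uses the measured Windows 2008 cycle times; the prices are relative units, not actual 2010 EC2 rates:

```python
# Relative prices: assume (as stated) a large instance costs 4x a small one.
small_price, large_price = 1.0, 4.0
# Measured average ms per GA cycle on the Windows 2008 images.
small_ms, large_ms = 1139, 503

# Cost of computing one cycle = price rate x time spent computing it.
small_cost = small_price * small_ms
large_cost = large_price * large_ms
ratio = large_cost / small_cost
print(round(ratio, 2))  # 1.77 -> the large instance costs ~1.8x more per cycle
```

So the large instance is about twice as fast but, at four times the price, costs nearly twice as much per unit of work, which is why the small instance wins when deadlines are loose.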

Later the same day.
By e-mail:
“Mr. Manager, the near-optimal route found consists of the following sequence of points…”
“Good! But where are you?”
“I’m enjoying a good game of billiards, as you said this morning.”
“I understand. Have some beautiful doublets! And big thanks for the result!”
Now I think almost all optimization tasks can be solved with two components: a cloud and a genetic algorithm.
Every problem leads to a good solution. Every cloud has a silver lining.
