Friday, 28 May 2010

Intercept Chooses 3PAR Utility Storage for High-Performance Servers

3PAR, the leading global provider of utility storage, announced today that Intercept, a leading UK provider of IT services and hosting, has deployed both 3PAR InServ T-Class and InServ F-Class Storage Server models to support its online and virtualization offerings.

Intercept replaced an inefficient, poorly performing network of six devices with these two resilient, high-performance 3PAR Utility Storage arrays. In doing so, the cloud computing service provider has simplified storage administration, reduced its storage footprint by 67%, and halved storage-related power consumption across its virtual server and desktop infrastructure. In addition, by choosing 3PAR, Intercept's storage environment now delivers 40% faster performance.

Join Us: http://bit.ly/joincloud

Nearly 60% of Euro CIOs Already Use Some Cloud Says IDC

Nearly 60 percent of European CIOs are already using cloud services - even if they don't realise it. That's according to a new survey from IDC which found that the use of cloud services was more prevalent than had been expected.

The finding was particularly revealing given that 50 percent of the same CIOs said they were not using cloud services when asked a more general question, and about a third said they won't be using cloud computing within the next five years, said Marianne Kolding, associate vice president, European services and software, IDC. "The results change when they're asked about specific products, however," she added. She explained that this suggested some CIOs weren't aware that some of the technologies they were using were actually cloud technologies.

Kolding was speaking at an event commemorating IDC's 40th birthday (IDC is a sister company of IDG, publisher of Techworld) and revealed details of spending plans for Europe for the coming year.

The signs point to an increase in spending, said Kolding, although vendors probably can't roll out the champagne just yet: the increases are modest, and there are still a significant number of CIOs whose top priority is reducing costs, she said. But, she added, 45 percent of organisations did expect IT spending to grow.

High priorities on CIOs' shopping list of technologies include virtualisation, with security not far behind and storage management third on the list. "Security is always up there," said Kolding.

There's little interest in green technology, particularly compared with the situation a few years ago. "In previous years, people were only talking about green and innovation but we don't hear so much about that. Now they're only interested in the green label when it offers something else too - like cost savings," she said.

For the survey, IDC questioned 673 CIOs across Europe, said Kolding.

Original Article - CIO

Join Us: http://bit.ly/joincloud

Thursday, 27 May 2010

Expand Networks wins MicroScope "Networking Vendor of the Year"

Last night (May 26th 2010), Expand Networks beat tough competition from the likes of Juniper, HP, Extreme, D-Link and Ruckus to take the "Networking Vendor of the Year" title. Full press release to follow.

Come and talk to Cloud Distribution if you'd like to know more about Expand's value proposition.

Join Us: http://bit.ly/joincloud

The real arguments for Cloud Computing

As more vendors dive into the cloud computing market, every possible claim regarding the supposed benefits of moving to a cloud-based service is being made.

I ran across an article titled "Why Cloud-based Monitoring is more reliable and secure than Nagios." The author, who represented a cloud-based network monitoring company, contended that the Software-as-a-Service (SaaS) model offered by his company was better for companies than Nagios and other open source products.

The question is not Cloud Computing vs. Open Source. In fact, there are open source SaaS providers like MindTouch out there. If considering a product like Nagios, a better comparison would be open source vs. commercial. In many cases, cost is the factor that leads companies to look to open source technologies. Other considerations include flexibility and security.

The more relevant comparison would be hosting and managing a network monitoring system on site vs. moving to a SaaS provider.

For many organizations, IT is considered overhead and not the primary function of the organization. Companies move to the cloud for most of the same reasons companies outsource: can someone else do it better for less? Cost is usually the easier consideration. Companies have to grapple with the 'better'.

Does 'better' mean more security, availability, or capacity? Many cloud providers would say 'yes' to all, and then some.

Organizations have to make that determination themselves: make a real comparison between the options rather than simply following the typical vendor hype.
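To make the cost half of that comparison concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is an invented placeholder, not real pricing; substitute your own hardware quotes, salary data, and host counts before drawing any conclusion.

```python
# Back-of-the-envelope comparison of on-site vs. SaaS monitoring costs.
# Every figure is an invented placeholder -- substitute real quotes,
# salary data, and host counts before drawing any conclusion.

def onsite_annual_cost(hardware=5000, amortization_years=3,
                       admin_hours_per_month=20, hourly_rate=75):
    """Rough yearly cost of hosting and managing a monitoring system yourself."""
    return hardware / amortization_years + admin_hours_per_month * 12 * hourly_rate

def saas_annual_cost(per_host_per_month=10, hosts=50):
    """Rough yearly cost of a subscription-based SaaS monitoring service."""
    return per_host_per_month * hosts * 12

print("On-site: $%.0f/year" % onsite_annual_cost())   # $19,667 with these placeholders
print("SaaS:    $%.0f/year" % saas_annual_cost())     # $6,000 with these placeholders
```

The point is not the totals (which are made up) but the inputs: the on-site figure is dominated by staff time, which is exactly the line item the 'better' debate tends to obscure.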

Original Article - Cloud Computing Journal

Join Us: http://bit.ly/joincloud

User Control and the Private Cloud

One of the comparison points between the public and private cloud domains is the difference in the level of control and customization over the cloud-based service.

In a public cloud environment, users typically receive highly standardized (and in many cases commoditized) IT services from the provider. In the private cloud, it is usually the case that users have a higher degree of control and customization over the cloud-based service. I would wager that to many this makes perfect sense; it is simply a manifestation of who has control over the means of delivery (a third party in the public domain, end users in the private domain).

Presumably, those users who are heading down the public cloud route have come to grips with the fact that they will have less control over certain elements. They are happy to get fast, on-demand access to services at the cost of some control. However, I have come to learn that private cloud users have a much different expectation: they want the features they get from a public cloud, but typically do not want to give up one iota of control/customization capability. Not to be the bearer of bad news, but in most cases this is all but impossible.

While I believe it is obvious that private clouds provide a much higher degree of customization and control than their public counterparts, that does not mean every single thing about a service delivered via a private cloud is configurable by the end-user. In fact, some things that are pre-configured and immutable deliver significant value to the user.

As an example, let's consider the scenario of an enterprise working to build a private cloud to run their application platforms. Part of this effort is likely to include the creation of a service catalog that contains different application platforms they can run in their cloud. A significant portion of the service catalog will consist of vendor-supplied, virtualized application platforms. These virtualized offerings have been pre-packaged, pre-integrated, and optimized by the vendors, and they are ready to simply activate and run within the private cloud.

By getting these virtualized packages from vendors, the company spares the cost of developing and maintaining them over time. This is good for multiple reasons, including:

- The company can focus on core competencies, which we can assume do not (and should not) include application infrastructure administration.
- The company benefits from pre-packaged application stacks, which the vendor configures, integrates, and optimizes appropriately to achieve the best possible result in terms of service delivery time and service performance.

Of course, these pre-packaged, virtualized application stacks also have another effect on the company. Namely, it means the company gives away some control over the exact configuration of the software bits. This may include things like the location of the installed product on disk (which I've learned is not nearly as inconsequential as I thought), default system users, initial disk allocation, etc. Some of these things may be reconfigurable by the end-user (system users, disk allocation), but invariably some remain burned into the virtualized stack for all of eternity (installed product location).
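As a rough illustration of that split, imagine a catalog entry that separates user-configurable defaults from vendor-frozen values. This is a hypothetical sketch; all field names are invented and do not reflect any particular vendor's packaging format.

```python
# Hypothetical sketch of a private-cloud service catalog entry. All field
# names are invented for illustration; the point is the split between
# values the end user may reconfigure and values burned into the stack.

FROZEN = {                       # fixed by the vendor, immutable in the image
    "install_path": "/opt/appserver",
    "base_image": "vendor-appstack-2.1",
}

DEFAULTS = {                     # reconfigurable by the end user per request
    "system_users": ["appadmin"],
    "disk_allocation_gb": 40,
}

def provision(overrides=None):
    """Merge user overrides with defaults; frozen values cannot be changed."""
    config = dict(DEFAULTS)
    for key, value in (overrides or {}).items():
        if key in FROZEN:
            raise ValueError("%s is fixed by the vendor" % key)
        config[key] = value
    config.update(FROZEN)
    return config

print(provision({"disk_allocation_gb": 80}))   # allowed: a default was overridden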

In this case, end-users sacrifice some control and in turn achieve simplicity, rapidity, and standardization. After all, if the system required the user to specify installation locations for each product in the application stack for every service request, the delivery of said service would not be as simple or fast. In addition, there would be a lesser degree of standardization among the running services, since it would be possible for the software inside those instances to have varying installation directories. This may seem minor, but it could have substantial impacts on the ability to centrally administer these cloud-based services.

In the end, the very specific use case above highlights general themes regarding control in a private cloud environment. Private cloud environments typically offer more control and customization capability than their public cloud counterparts, but that does not mean they offer the same control capability that users have grown accustomed to in their traditional environments. Users must usually sacrifice some control in order to achieve the benefits of simple, fast, and standardized cloud-based service delivery and administration. When confronted with potentially giving up control over a specific capability, users should consider this simple question:

- Do I need control over this element?

It seems like an obvious question, right? However, I emphasize need because users often merely want control. They are used to a certain culture, or a "that's the way we do it" mode of operation. Making decisions based on culture and standard operating procedure does not always yield the best results for the enterprise.

Of course, the entire onus is not on the users when it comes to this dilemma in the private cloud. Vendors must carefully design and implement private cloud solutions and services so that they offer a high degree of customization and configurability, all the while delivering out-of-the-box value. They cannot be overwhelming to use, and at the same time they should accommodate a wide array of use cases. As you may imagine, this is far from easy and the state of this art is just beginning to evolve.

Original Article - Cloud Computing Journal

Join Us: http://bit.ly/joincloud

Wednesday, 26 May 2010

MERAKI EXPANDS INTERNATIONAL PRESENCE BY PARTNERING WITH UK-BASED CLOUD DISTRIBUTION

Cloud Distribution adds Meraki as its first wireless LAN provider

SAN FRANCISCO — May 25th 2010 – Meraki, the cloud-based wireless networking company, today announced that it has partnered with distributor and cloud specialist Cloud Distribution, expanding Meraki’s presence in the UK. Given Meraki’s singular focus on cloud-based networking infrastructure, Cloud Distribution was a natural fit to distribute and promote its products.

Cloud Distribution is a value added distributor focusing on the security, management and optimization of devices that access the cloud. The company prides itself on finding innovative, leading edge vendors with cloud-based products and services it can bring to the UK market. With the addition of Meraki, Cloud Distribution can offer a cloud-based wireless networking solution to clients for the first time.

“Meraki’s cloud-based wireless networking solution is a perfect fit for our partner base,” said Scott Dobson, Managing Director at Cloud Distribution. “We are already seeing massive interest in Meraki’s Cloud based wireless platform and are actively looking for new channel partners to fulfill this demand.”

“Our partnership with Cloud Distribution not only provides a great match with our technology vision, but gives us access to a new, forward-thinking base of partners in the UK,” said Andy McCall, director of worldwide channels at Meraki. “Beyond fulfilling traditional WLAN needs like an 802.11n upgrade or simplifying network management, we look forward to connecting with organizations that are already excited about the promise of cloud computing.”

About Meraki:
Meraki offers enterprise-class wireless networks at a fraction of the cost and complexity of traditional networking vendors. Using Meraki’s unique cloud-based architecture, an administrator can configure thousands of Meraki access points over the web through a single interface. The company’s customers include small-to-medium sized businesses, global hotel chains, and world-class educational institutions. Meraki wireless networks serve millions of users on over 15,000 networks in more than 140 countries. Meraki is located in San Francisco, California, and is funded in part by Sequoia Capital and Google. Follow Meraki on Facebook and Twitter. For more information, go to www.meraki.com.

About Cloud Distribution:
Cloud Distribution are a Value Added Distributor focusing on the security, management and optimization of devices which access the Cloud.

Join Us: http://bit.ly/joincloud

A Cloud Computing Epiphany

One of the greatest moments a cloud evangelist indulges in occurs at the point a listener experiences an intuitive leap of understanding following your explanation of cloud computing. There is no greater joy or intrinsic sense of accomplishment.

Government IT managers, particularly those in developing countries, view information and communications technology (ICT) as almost a "black" art, unlike in the US, Europe, Korea, Japan, and other countries where the Internet and network-enabled everything have diffused into the core of Generation Y, the Millennials, and Generation Z. The black art gives IT managers in some legacy organizations the power they need to control the efforts of people and groups needing support, as their limited understanding of ICT still sets them slightly above the abilities of their peers.

But when the "users" suddenly have that right-brain flash of comprehension in a complex topic such as cloud computing, traditional IT control suddenly becomes a barrier which must be explained and justified. Suddenly everybody from the CFO down to supervisors can become "virtual" data center operators – at the touch of a keyboard. Suddenly cloud computing and ICT become a standard tool for work – a utility.

The Changing Role of IT Managers

IT managers normally make marginal business planners. While none of us like to admit it, we usually start an IT refresh project with thoughts like, "What kind of computers should we request budget to buy?" or "That new 'FuzzPort 2000' is a fantastic switch; we need to buy some of those..." And then we spend the next fiscal year making excuses for why the IT division cannot meet the needs and requests of users.

Times are changing. The IT manager can no longer think about control, but rather must think about capacity and standards. Setting parameters and process, not limitations.

Think about topics such as cloud computing, and how they can build an infrastructure which meets the creativity, processing, management, scaling, and disaster recovery needs of the organization. Think of gaining greater business efficiencies and agility through data center consolidation, education, and breaking down ICT barriers.

The IT manager of the future is not only a person concerned about the basic ICT food groups of concrete, power, air conditioning, and communications, but also concerns himself with capacity planning and thought leadership.

The Changing Role of Users

There is an old story of the astronomer and the programmer. Both are pursuing graduate degrees at a prestigious university, but from different tracks. By the end of their studies (this is a very old story), the computer science major focusing on software development found his FORTRAN skills were actually below the FORTRAN skills of the astronomer.

"How can this be?" cried the programmer. "I have been studying software development for years, and you have been studying the stars?"

The astronomer replied, "You have been studying FORTRAN as a major for the past three years. I have had to learn FORTRAN and apply it in real applications to my major, studying the solar system, and I needed to learn to code better than you just to do my job."

Original Article - Cloud Computing Journal

Join Us: http://bit.ly/joincloud

How Managers Stopped Worrying and Learned to Trust the Cloud

More than half of European organisations believe that cloud computing will result in an improvement in security defences. That's according to a survey conducted on behalf of the IT conference 360IT.

In February, a survey from Symantec found that only a third of IT managers thought cloud computing would improve security - the same number that thought cloud would make their systems more insecure.

The survey could herald a change in attitude by IT managers. Security frequently features as the number one inhibitor when it comes to looking at cloud. Only this week, SNIA Europe chairman Bob Plumridge described how it was the main concern of user companies. And in March, the former US National Security Agency technical director, Brian Snow, said that he didn't trust cloud services.

According to the 360IT poll, only 25 percent of managers surveyed thought that the onset of cloud computing would make security worse.

Commenting on the survey, Richard Hall, CEO of CloudOrigin, said that the current trend of businesses migrating their IT systems into the cloud does not mean a reduction in security defences.

"After decades performing forensic and preventative IT security reviews within banking and government, it was already clear to me that the bulk of security breaches and data losses occur because of a weakness of internal controls," he said in his 360IT blog post.

According to Hall, the complete automation by public cloud providers means the dynamic provision, use and re-purposing of a virtual server occurs continuously within encrypted sub-nets. "That's why solutions built on commodity infrastructure provided by the likes of Amazon Web Services have already achieved the highest standards of operational compliance and audit possible," he said.

Original Article - CIO

Join Us: http://bit.ly/joincloud

Tuesday, 25 May 2010

B2B Cloud Start-up Gets $10.2M To Enter the US

Huddle, the three-year-old UK cloud collaboration start-up, has closed a $10.2 million B round from Matrix Partners and existing investors Eden Ventures and Huddle chairman Charles McGregor.

Huddle, which recently went cash flow positive, according to co-founder Andy McLoughlin, has already been through a $5 million first round. It intends to use the new money to push deeper into the United States, which will entail quadrupling its current 40-person staff in the next year. McLoughlin has just moved to San Francisco from London to head the effort.

read more


Join Us: http://bit.ly/joincloud

iPhone still ahead of Android in Europe

iPhone is still beating Android in Europe despite the Google OS overtaking it in the US, according to the latest count by the market research boffins at Gartner. Symbian and Blackberry are still ahead of both of them. Dive into the figures and you'll find the future doesn't look so bright for Microsoft…

Gartner says twice as many iPhones flew off the shelves as Android handsets across Europe in the first quarter of 2010, helping Apple retain its spot as the third biggest seller of smartphones worldwide. Meanwhile, Android leapfrogged Windows Mobile to come in at number 4, with phones like the HTC Desire helping to deliver 5.2 million sales.

Symbian is still by far the largest smartphone OS: 24.1m Symbian-toting handsets were sold during the same period, but its market share is slipping, dropping from 48.8 percent to 44.3 percent. That's hardly a massive decline, and we've got our fingers crossed that Symbian 3 and the Nokia N8 will deliver.

Blackberry maker RIM is still holding on to the number 2 spot and also has the potential to deliver something exciting with the Blackberry 6 OS. It's Microsoft that really needs to worry, having dropped into 5th place with Windows Mobile phones shifting just 3.7 million units. Windows Phone 7 really needs to pull out all the stops.

Make your prediction: do you think iPhone 4.0 will keep Apple ahead of Android? Or does the sheer variety of Google OS-packing handsets mean the 'droid is destined to win?


Join Us: http://bit.ly/joincloud

Monday, 24 May 2010

The Cloud Computing Opportunity by the Numbers

How big is the opportunity for cloud computing?

It's a question asked at pretty well every IT conference these days, and whatever the number, it's a big one. Let's break down the opportunity by the numbers available today. Merrill Lynch says that by 2011 the cloud computing market will reach $160 billion. The number of physical servers in the world today: 50 million.

By 2013, approximately 60 percent of server workloads will be virtualized, and 10 percent of the total number of physical servers sold will be virtualized, with an average of 10 VMs per physical server sold. At 10 VMs per physical host, that means about 80-100 million virtual machines being created per year, or roughly 273,972 per day and more than 11,000 per hour.
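Those per-day and per-hour figures are straightforward division; a quick check shows the per-hour number actually lands a little above 11,000:

```python
# Quick check of the VM-creation arithmetic quoted above.
vms_per_year = 100000000          # upper end of the 80-100 million estimate
per_day = vms_per_year / 365      # ~273,972, matching the figure in the text
per_hour = per_day / 24           # ~11,416 per hour at the 100-million mark
print("%.0f per day, %.0f per hour" % (per_day, per_hour))
```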

According to a BusinessWeek report, 50 percent of the 8 million servers sold every year end up in data centers. The data centers of the dot-com era consumed 1 or 2 megawatts; today, facilities requiring 20 megawatts are common - 10 times as much as a decade ago.

Join Us: http://bit.ly/joincloud

Cloud Computing Revolution Seeding Unseen Servers

The rise of cloud computing is going to stoke demand for servers, according to a new forecast from IDC.

For me, the critical point is that public cloud providers like Google don't buy servers, they build them. And the design decisions they make -- constructing sparsely configured but powerful scale-out servers -- will feed back into the enterprise market.

If you want to check out the IDC forecast, go here. The raw numbers of which they speak are as follows: "Server revenue for public cloud computing will grow from $582 million in 2009 to $718 million in 2014. Server revenue for the much larger private cloud market will grow from $7.3 billion to $11.8 billion in the same time period."

When you consider that the overall server market was around $44 billion in 2009, the qualitative takeaway is that the public-cloud portion amounts to a spring rain shower. However, I believe the influence of this sector will be disproportionately large. Here's why:

The server market for public cloud vendors such as Google and Amazon is a hidden market. These guys don't go out and buy in bulk from Dell or HP. They roll their own, and the own that they roll are highly, highly, highly efficient designs. They strip their blades of all extraneous components and outfit them with nothing but the bare necessities. So if taking out a USB port saves $5 and a couple of watts of power, that's how the system is configured.

What's also not obvious to the public is that the Googles of the world exert influence on the server design arena right at the source. Namely, they go directly to Intel and AMD and tell them stuff they'd like to see in terms of TDP (processor power dissipation), power management, price/performance, etc. Since a Google can buy upwards of 10,000 of a processor SKU, Intel and AMD listen.

So who cares if Google builds cheap, stripped down servers and attaches them in their data centers with Velcro strips? Well, I admit that right now there isn't a clear connection between Google's servers and what a Tier 1 server maker sells to its enterprise customers. (It should be obvious, but in case it's not, I'm talking about the x86 server space, not RISC/Itanium.)

However, as the server technologies applied by the public cloud vendors become more visible, they will influence buyers. For one thing, buyers will say, 'Hey, we want these things.' Thus it will become harder, over time, for the Tier 1's to sell quite as many of their heavily configured conventional servers.

More correctly, I should say, it will force them to continue to push the value proposition up from the server level, per se, into the higher rung of the data-center play. As in: how can you offer an enterprise something configured to meet its total compute, storage, and networking needs? And also connect it together, manage it seamlessly, support heavy use of virtualization, and yada, yada, yada.

Fortunately for the likes of IBM, HP, Dell, Oracle/Sun and Cisco, that's where they excel anyway. Which is good, because that's where the competitive center of gravity in the server space will be moving, if it hasn't already.

Original Article - Information Week

Join Us: http://bit.ly/joincloud

Friday, 21 May 2010

Apple Might Announce Cloud-Based iTunes Service At WWDC

It is now over five months since Apple acquired the online music streaming service, Lala.

While there has been no official communication from Apple so far about how the company plans to integrate Lala with the company's own product offerings, it is being speculated that Apple may use Lala's technology to build a cloud-based iTunes service.

According to Macsimum News, Steve Jobs might make an announcement during the Worldwide Developers Conference (WWDC) 2010 next month. The new service would make storage space on users' devices irrelevant by moving all purchased content to the cloud. Macsimum reports:

"Purportedly, the service would allow iTunes shoppers to build up a digital video collection (music, movies, TV shows, etc.) without having to worry about the intensive storage space involved. iTunes Replay would, per the rumors, stream music, TV shows and movies purchased on iTunes, so you wouldn’t have to download them after purchasing, freeing up hard drive space."

If you consider Lala's announcement last month that the online streaming service would shut down operations on May 31st, the rumored announcement of the cloud-based iTunes service around June 7 could be a possibility.

But it contradicts a report published two weeks earlier, which claimed that the cloud-based iTunes service might not be coming anytime soon because Apple had not yet reached an agreement with record labels. According to the report, the primary reason for the delay was pricing issues. Apple is understood to be working on a cloud-based "locker" service from which users can stream the content they own to multiple devices. Record labels are apparently demanding a higher price for content, since they argue that streaming songs across multiple devices should constitute multiple use.

However, Apple is already under some pressure as it is facing competition from popular music streaming services such as Spotify in Europe, which is expected to launch its service in the US in the third quarter.

We hope Apple can bring affordable streaming music in the $3 (like Grooveshark) to $5 per month range.

Let us know your thoughts in the comments.

Join Us: http://bit.ly/joincloud

Realizing Cloud for the Enterprise

Most organizations today continue to view technology as merely a back-office cost center, a viewpoint that diminishes the breadth of Enterprise IT's impact on a firm's competitive profile.

To contribute to strategic growth, IT's role has to be understood as far more than a repository of software and hardware applications.

Instead, it should be seen from a meta-IT perspective, one that examines and understands the management of IT as a firm-wide service and infrastructure fabric in the cloud that collaborates with business units to respond rapidly, innovatively, and cost-effectively to changing market conditions. Traditional IT strategy and governance models fail because they do not take into account the changing needs of the markets in which businesses operate and the new cloud-enabling technologies that are now available to expedite and facilitate optimized IT management.

The strategy to fill this void is the concept of a meta-IT model in which IT is positioned as complementary to other strategies for attaining greater market agility.

Join Us: http://bit.ly/joincloud

Thursday, 20 May 2010

Wyse Announces Zero Footprint Software Engine for Cloud Client Computing

Wyse Technology revealed the Wyse Zero engine.

Wyse Zero is software that simplifies the development of cloud connected devices. Wyse Zero, which connects users to cloud computing services and virtual desktops with efficient communications and protocol technology, is already in use in millions of devices, including thin clients, handheld smart devices, and zero clients.

Specifically, the Wyse Zero engine is currently powering all devices that are utilizing Wyse ThinOS, all implementations of Wyse PocketCloud and -- as of today -- all Wyse Xenith devices. For current users of Wyse ThinOS, the highly optimized, management-free solution for Citrix XenApp, Citrix XenDesktop, Microsoft Terminal Server and VMware View virtual desktop environments, Wyse Zero ensures that every Wyse thin client delivers flexible and secure user access, boots up in seconds, updates itself automatically, and provides IT managers with simple, scalable administration to suit their organization's needs, all while featuring an unpublished API that delivers built-in protection from viruses and malware.

read more

Join Us: http://bit.ly/joincloud

Cloud Computing Market to Reach $12.6 Billion in 2014 - IDC

The breaking dawn of economic recovery combined with an aging server installed base and IT managers' desire to rein in increasingly complex virtual and physical infrastructure is driving demand for both public and private cloud computing.

According to new research from International Data Corporation (IDC), server revenue for public cloud computing will grow from $582 million in 2009 to $718 million in 2014. Server revenue for the much larger private cloud market will grow from $7.3 billion to $11.8 billion in the same time period. "Now is a great time for many IT organizations to begin seriously considering this technology and employing public and private clouds in order to simplify sprawling IT environments," said Katherine Broderick, research analyst, Enterprise Platforms and Datacenter Trends.

IDC defines the public cloud as open to a largely unrestricted universe of potential users, designed for a market rather than a single enterprise. A private cloud deployment is designed for, and access restricted to, a single enterprise (or extended enterprise): an internal shared resource, not a commercial offering, with the IT organization acting as "vendor" of a shared, standard service to its users.

"Many vendors are strategically advancing into the private and public cloud spaces, and these players are widely varied and have differing levels of commitment," Broderick added. Server revenue for public cloud computing will grow from $582 million in 2009 to $718 million in 2014. Server revenue for the much larger private cloud market will grow from $7.3 billion to $11.8 billion in the same time period.

read more

Join Us: http://bit.ly/joincloud

Wednesday, 19 May 2010

Seven Requirements for Virtual Desktop Success

What do you think are the main ingredients to any successful desktop virtualization project? Is it application integration methodology? Is it hardware?

What about the IT team? Based on my experience, the top requirements really boil down to a few core items, all of which I've discussed many times in previous blog postings (applications, standards, and executive buy-in, to name a few).

Before we get into the seven requirements, we must understand the point of desktop virtualization. Desktop virtualization is not about virtualizing the desktop. Virtualizing the desktop is easy. Desktop virtualization is about enabling IT to be more flexible and agile. If done properly, desktop virtualization can allow IT to deliver new operating systems, patches, security fixes and applications without requiring months or years of effort.

What does this mean for the business? It means that users will be more focused on their job instead of trying to deal with IT. How many of you, when you think of your IT team, can feel your stress level increase? You're sitting there watching your logon script take 10 minutes to execute and you know the person who wrote that bloated pile of garbage is sitting in their cube playing World of Warcraft.

So you now take matters into your own hands. You remove your computer from the domain. You install your own applications. You manage your desktop as if it were your own. This means you are no longer in sync with the rest of the organization. This is one of the reasons why it takes IT so long to do anything. If IT changes its practices and processes and utilizes technology that can deliver a pristine desktop environment to you every day, you might be less likely to go rogue. But in order for things to change for the better, the virtual desktop implementation must follow a few important requirements. These items are what I like to call the Seven Requirements for Desktop Virtualization Success.

Read it. Understand it. Let's discuss it. Did I miss anything? Do you think other items are more important than the ones discussed? Let's make sure virtual desktops are the solution to this ever-growing problem of desktop delivery.

Original Article - Cloud Computing Journal

Join Us: http://bit.ly/joincloud

Taking Control of the Cloud for Your Enterprise

Leverage existing identity management (IdM) and security information and event management (SIEM) systems for increased adoption and greater business value

This white paper is intended for enterprise security architects and executives who need to rapidly understand the risks of moving business-critical data, systems, and applications to external cloud providers. Intel's definition of cloud services and the dynamic security perimeter are presented to help explain three emerging cloud-oriented architectures. Cloud security risks, including insecure APIs, multi-tenancy, data protection, insider threats, and tiered access control, are discussed in depth.

To show how enterprises can regain control, the white paper from Intel illustrates how a Service Gateway control agent can be used to protect the enterprise perimeter with cloud providers. Various deployment topologies show how the gateway can leverage existing IdM and SIEM systems for increased adoption and greater business value.

Download the White Paper Now


Join Us: http://bit.ly/joincloud

Monday, 17 May 2010

Flexibility, Security, and Cost: Key Factors in Cloud Computing

"The security threats from highly organized offshore governments and cybermafias are so great that small enterprises are turning to large, trusted cloud servers for protection," notes John Sundman in this exclusive post-show interview with Cloud Expo Chair Jeremy Geelan. John Sundman, founder of Cloud Expo exhibitor wetmachine.com, is a novelist who writes for software developers. He has a lot of experience in hardware and software organizations.

Cloud Computing Journal: A very general question first, about Cloud Computing itself: Surely we've heard all of this before in various forms and guises - grid computing / utility computing, etc.? What is different this time - why is everyone so convinced it will now work?

John Sundman: A bunch of reasons, such as:

1. The cost of hardware keeps coming down, and hardware keeps becoming more reliable even as the challenges of managing vast secure networks multiply. Until fairly recently, CIOs and other decision-makers were concerned with hardware costs. Hardware costs are not such a concern anymore, so people are able to step back and look at the real costs of developing expertise. And it makes more sense to outsource when you can.
2. The marketplace has been prepared by applications like Hotmail, Salesforce, Flickr, Facebook, etc. This makes it easier for people to explain the difference between cloud-based applications and locally hosted applications. As recently as five years ago, the whole notion of web-based applications was hard for people to understand. Now everybody gets it.
3. The security threats from highly organized offshore governments and cybermafias are so great that small enterprises are turning to large, trusted cloud servers for protection, much as peasants turned to feudal lords for protection in the Middle Ages. ABC Manufacturing, Inc. is no match for the most sophisticated computer scientists of offshore criminal gangs and governments. But Microsoft and Amazon are.

Cloud Computing Journal: What are the three main factors driving CIOs toward the Cloud?

Sundman: The three main factors are flexibility, security, and cost.

Cloud Computing Journal: And what are the three main barriers preventing some CIOs from moving some of the on-premise computing to the Cloud?

Sundman: First, specialized needs not met by any off-the-shelf cloud solution. Second, CIO conservatism, that is, a general reluctance to adopt new trends until they've been proven. The final barrier is security - some concerns have even more expertise in specific data security issues than cloud vendors do.

Cloud Computing Journal: How does your own company's offering/s assist CIOs and organizations/companies?

Sundman: I write novels to help CIOs and people with similar responsibilities relax and learn. Throughout human history, art and literature have met an essential data-processing need: artists and storytellers help people digest and deal with the unsettling complexity all around them. There has never been a successful, decent human society that did not have artists and storytellers. There's a reason for that. Humans need art. We need stories.

Cloud Computing Journal: Are there other players in the Cloud ecosystem offering the same - or is your company unique? Why?

Sundman: There are other geek novelists, for example Neal Stephenson. But we are few.

Cloud Computing Journal: We hear talk of a Cloud Revolution and also of a Cloud "evolution" - either way, what kind of time span are we talking about, do you think? In other words, for how long is Cloud Computing going to exert its pull on the minds, hearts, and budgets of all involved in modern-day Enterprise IT?

Sundman: I would guess three years. Lots of IT people at small concerns are going to lose their jobs. This saddens me a lot. But I think it's inevitable.

Original Article - http://cloudcomputing.sys-con.com/node/1390642

Join Us: http://bit.ly/joincloud

8 Tips to Getting Started in Cloud Computing

In the old days, infrastructure meant buying $5,000 servers, putting them into high-security, cooled rooms, and hiring IT folks to make them all work.

And then doing this all over again so that you’d have redundancy. Those days are gone—in fact, if you tell investors that you need money for IT infrastructure, they’re going to question your intelligence.

Cloud computing is what has changed the game. It provides small businesses with the ability to deploy websites and applications quickly, to pay only for what you use, and leave all the management issues to someone else. It makes for a leaner business that can react faster to challenges and opportunities.

But cloud computing also requires understanding a whole new technology and computing philosophy. I was lucky when we put my company's website, Alltop.com, "into the cloud" because our service provider, Rackspace, hosts hundreds of companies and our developers at Electric Pulp have done this many times. The gang at Rackspace and I have come up with a list of eight tips to help you get started in cloud computing:

1. Know the different options available to you. "Cloud computing" simply means that you pay only for what you use - like electricity - but it comes in many forms. A platform-as-a-service (PaaS for short) works well for front-end applications and websites; it takes care of a lot of the infrastructure you need to get started. An infrastructure-as-a-service (IaaS for short) gives you access to a command line and allows you the flexibility to customize your load balancing, autoscaling, security and failover settings, but requires advanced technical knowledge in system administration.

If you don't have someone comfortable programming in a terminal, an IaaS is not for you; infrastructures-as-a-service require real system administration experience. If you lack that expertise, either find a competent systems administrator or consider a platform-as-a-service instead. A PaaS provider should be able to handle basic but necessary tasks such as load balancing and security.

2. Understand that scaling is a skill, not a default. Cloud computing gives you a lot of flexibility to scale at a lower cost. However, no cloud provider offers "infinite scalability" out of the box. The world's most popular websites and applications have professionals working full-time to ensure uptime and speed when they are needed most. Be sure to factor this into your long-term IT costs.

3. Implement a disaster plan. The cloud is not foolproof, but there are ways to protect yourself should it go down. A multi-tenant cloud will go down on occasion. If uptime is crucial to your business, be sure you have a disaster recovery plan, geographic failover, and redundancy in place.

4. Don’t be naïve. Cloud computing will not make up for a poorly written application or database structure. A hosting provider gets the blame for a lot of performance issues. If a database is not set up properly, or code is not optimized, there is nothing a hosting provider can do to make up for this. When it comes to developers, remember that you often get what you pay for. Be sure to check their resumes and portfolio for other work.

5. Budget for your specific use-case. Calculating your budget is not as simple as reading a provider's website. Cloud computing treats hosting as a utility: like other utilities such as electricity, your bill will vary each month with usage. If you see a surge in traffic or users, use more space, or process more information, expect to pay more at the end of the month (see the cost sketch after this list).

6. Choose a cloud provider based on your needs, not its popularity. Do you need something that is highly elastic in a short period of time? Are you going to need support and additional services? High availability? Integration with third party software? Different cloud providers excel at different things. Consider your individual needs, do your homework, and ask cloud providers questions about their availability, speed, security, and integration before you sign up.

7. Remember: some applications are not good fits for cloud. Cloud computing is great for anything you’d need to deploy quickly and at a low cost. However, just like multi-tenant buildings are not good for every business, a multi-tenant cloud is not good for every application. If you have high security or bandwidth needs, you will need to pursue dedicated gear.

For security reasons, any application that requires PCI or HIPAA compliance is not a good fit for cloud computing. A multi-tenant cloud may also not be able to handle the extreme performance loads often seen by more resource-intensive applications. Evaluate your specific needs and don't rule out dedicated or hybrid hosting (a combination of cloud and dedicated hosting) if it looks like the right fit.

8. Think outside of the box. When hosting becomes a commodity, it opens your business up to new and exciting things. You can deploy applications or sites on the fly. Consider media rich or real-time elements to your application or website. Set up a server just to comb through customer information or other data your company collects. These possibilities were not as accessible to the masses before cloud computing, so don’t be afraid to try new things and expand how your business operates.
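To make tip 5 concrete, here is a minimal sketch of how a usage-based hosting bill adds up. The rates are invented placeholders, not any real provider's pricing:

```python
# Illustrative usage-based hosting bill for tip 5. All rates are invented
# placeholders; real providers publish their own per-unit pricing.

RATES = {
    "instance_hours": 0.08,   # dollars per server-hour
    "storage_gb": 0.15,       # dollars per GB-month
    "bandwidth_gb": 0.10,     # dollars per GB transferred
}

def monthly_bill(usage):
    """Sum metered usage against per-unit rates, like an electricity bill."""
    return sum(RATES[item] * amount for item, amount in usage.items())

# A quiet month vs. a month with a traffic surge on the same application:
print("$%.2f" % monthly_bill({"instance_hours": 720, "storage_gb": 50, "bandwidth_gb": 100}))
print("$%.2f" % monthly_bill({"instance_hours": 2160, "storage_gb": 50, "bandwidth_gb": 900}))
```

The second bill is several times the first even though nothing about the application changed; that variability is exactly what you need to budget for.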

Full Source: Guy Kawasaki at OpenForum

Join Us: http://bit.ly/joincloud

Friday, 14 May 2010

15 Cloud Companies to Watch

Innovative vendors offering ways to make the transition to a cloud-based world less daunting.

1) Abiquo

Headquarters: Redwood City, Calif.
What it offers: Abiquo Enterprise Edition, cloud management software
How much it costs: Community Edition: free download. Enterprise Edition: from $211 for 1-49 physical cores to $432 for 1,000 or more cores.
Why we're watching the company: Abiquo has tackled two problems plaguing just about any IT organization that operates a virtual infrastructure – how to manage it effectively and how to avoid vendor lock-in. Abiquo combines open source software with cloud expertise and support for multiple hypervisors.

"We built our product around where virtual management is going, which is the cloud. We are hypervisor agnostic and can move virtual machine images between any hypervisor. We provide the management glue," said Abiquo CEO Pete Malcolm in an interview for Network World's "Network/Systems Management Alert" newsletter.

IT managers can use the software to create virtual data centers through which they can deploy bundled servers, storage and other physical resources, and applications from public or private virtual image libraries. The Abiquo cloud management software separates the task of managing those physical infrastructure pieces from the creation and management of applications. The software, running on a central server, auto-discovers machine resources, manages existing live virtual machines, and can capture and store "stateful" virtual machines, Malcolm explained.

The latest version, Abiquo 1.5, adds policy-based workload management, enforceable resource limits and multi-tenancy capabilities that provide delegated provisioning of virtual enterprises.
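Abiquo's actual policy syntax isn't shown here, so purely as a hypothetical sketch, an enforceable per-tenant resource limit of the kind described might behave like this (every name is invented):

```python
# Purely hypothetical sketch of enforceable per-tenant resource limits of
# the kind Abiquo 1.5 describes. Nothing here reflects Abiquo's actual
# API or syntax; every name is invented for illustration.

LIMITS = {"acme-corp": {"cores": 64, "ram_gb": 256}}   # enforced caps per tenant
USAGE = {"acme-corp": {"cores": 60, "ram_gb": 200}}    # current consumption

def can_provision(tenant, cores, ram_gb):
    """Reject any request that would push a tenant over its enforced limits."""
    used, cap = USAGE[tenant], LIMITS[tenant]
    return (used["cores"] + cores <= cap["cores"] and
            used["ram_gb"] + ram_gb <= cap["ram_gb"])

print(can_provision("acme-corp", cores=2, ram_gb=8))     # True: within both caps
print(can_provision("acme-corp", cores=8, ram_gb=128))   # False: exceeds both caps
```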

What this all means for enterprise IT shops is the grand promise of vendor-neutral, interoperable management of virtual resource pools, be those inside the company or out in the cloud.

Who heads the company: CEO Pete Malcolm, a serial entrepreneur brought in last year to help the company expand internationally. Most recently, he had been CTO at Orchestria, which he founded and CA bought in 2009.

How the company got into cloud computing: Founders initially focused on grid computing but interest in cloud technologies led to development of the AbiCloud open source project in early 2008. AbiCloud was formally released in February.

Continue Reading @ CIO.com - http://www.cio.com/article/593133/15_Cloud_Companies_to_Watch?source=rss_cloud_computing

Join Us: http://bit.ly/joincloud

Microsoft Office vs. Google Apps: The Business Brawl

In the business battle to rule the enterprise office suite of the future, Microsoft (MSFT) and Google (GOOG) both must overcome significant problems.

On Google's side, despite the Google Apps price advantage and Google's announcement that 2 million companies are now using Google Apps, various research data still shows Google is losing out in the enterprise. Google Apps adoption not only lags way behind Microsoft but also trails behind OpenOffice and even IBM's (IBM) Lotus Symphony. Estimated revenue for Google Apps in 2009 is $50 million, a tiny portion of the company's $22 billion war chest.

Microsoft, meanwhile, must be competitive with Web apps — but not too competitive, since these apps will presumably drive much less revenue than its longtime cash cow, the traditional Microsoft Office suite. In fiscal year 2009, Microsoft's business division (of which Office is a major component) generated a healthy $18.9 billion in revenue. Microsoft must also convince customers that it "gets" cloud computing the way that Google does.

If anyone has the cash and the consumer branding power to take on Microsoft, it's Google, analysts say. But there's still plenty of explaining to do.
Web Apps By the Numbers

The enterprise today is offered a plethora of productivity tools: There is the big horse, Microsoft Office, followed by lower-cost or free alternatives such as IBM's Lotus Symphony and Web-only products Google Apps and Zoho's Office Suite, among others.

Yet although the Web browser may be the most prevalent computing platform for both consumers and businesses, enterprise use of cloud-based alternatives compared to Microsoft Office remains low, according to recent data from Forrester Research.

In a March survey of 115 North American and European enterprise and SMB technology decision-makers, Forrester found that 81 percent are currently supporting Office 2007, while only 4 percent are using Google Apps.

Moreover, despite Office 2010's higher price than current alternatives, one-third of survey respondents plan to upgrade to Office 2010 in the next year and one-quarter plan to upgrade in the next two to three years.

So does Google Apps stand a chance against Microsoft Office in the enterprise? Maybe not right away, but Google is in a good position to advance as the enterprise shifts to the Web, say industry analysts.

Continue Reading @ CIO.com - http://www.cio.com/article/593149/Microsoft_Office_vs._Google_Apps_The_Business_Brawl?page=2&taxonomyId=3012

Join Us: http://bit.ly/joincloud

Thursday, 13 May 2010

Cloud Forms Integration Point for Printing from Smartphones

Hot on the heels of smartphone popularity, cloud-based printing scales to the enterprise mainstream

As enterprises focus more on putting applications and data into Internet clouds, a new trend is emerging that also helps them keep things tangibly closer to terra firma, namely, printed materials – especially from mobile devices.

Major announcements from HP and Google are drawing attention to printing from the cloud. But these two heavy-hitters aren’t the only ones pushing the concept. Lesser-known brands like HubCast and Cortado got out in front with cloud printing services that work to route online print orders to printer choices via the cloud. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Cloud printing is still a nascent concept, but some forward-thinking enterprises are moving to understand what printing from the cloud really means, what services are available, why they should give it a try, how to get started, and what's coming next. Again, we're early in the cloud printing game, but when Fortune 500 tech companies start advocating for a better way to print, it's worth investigating.

HP's Cloud Printing

HP is no stranger to cloud printing. The company is behind a service called MagCloud that lets self-publishers print on demand and sell through a web-based marketplace with no minimum orders. But a recent announcement suggests HP is looking to lead the charge into printing from the cloud for the broader enterprise ... and consumers.

Earlier this month, HP rolled out the ePrint Enterprise mobile printing solution developed in collaboration with RIM. It’s based on HP CloudPrint technology and works with BlackBerry smartphones. As HP describes it, CloudPrint lets users print documents from their mobile devices, computers and netbooks while they aren’t in the office on a LAN.

Essentially, CloudPrint blends cloud and web-services-based technologies to let people print anything—like reports, photos, emails, presentations, or documents—from anywhere. All you need is a destination network-connected printer. With CloudPrint and ePrint Enterprise, HP has a wide range of enterprise printing needs covered.

Google's Cloud Printing
Google got into the cloud printing game in mid-April. Dubbed Google Cloud Print, the search engine giant’s service will work with the Chrome operating system, where all applications are web apps. Google wanted to design a printing experience that would make it possible for web apps to give users the full printing capabilities that native apps have today. Access to the cloud is the one component all major devices and operating systems have in common.

Here’s how it works: Instead of relying on a single operating system—or drivers—to print, apps can use Google Cloud Print to submit and manage print jobs. Google Cloud Print will send the print job to the appropriate printer, with the particular options the user selected, and then return the job status to the app. But Google Cloud Print is still under development, which gives HP and other players a chance to gain market momentum.
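That submit-then-poll flow maps naturally onto a couple of HTTP calls. Since Google Cloud Print was still under development at the time, the endpoint and field names below are assumptions for illustration only:

```python
# Sketch of the submit-then-poll flow described above. The endpoint and
# field names are assumptions for illustration only -- Google Cloud Print
# was still under development at the time of writing.
import json
import urllib.request

SERVICE = "https://cloudprint.example.com"   # placeholder service URL

def submit(printer_id, title, content):
    """Hand a print job to the cloud service instead of a local driver."""
    payload = json.dumps({"printer": printer_id, "title": title,
                          "content": content}).encode()
    req = urllib.request.Request(SERVICE + "/submit", data=payload,
                                 headers={"Content-Type": "application/json"})
    return json.load(urllib.request.urlopen(req))["job_id"]

def status(job_id):
    """Ask the service how the job is progressing on the target printer."""
    return json.load(urllib.request.urlopen(SERVICE + "/jobs/" + job_id))

job_id = submit("office-laser-3", "Q2 report", "document bytes go here")
print(status(job_id))
```

The key design point is that the app never talks to a driver: it hands the job and options to the service, which owns routing and status.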

Cloud Printing Pioneers

Indeed, there are other players promoting printing from the cloud—and some could be considered pioneers. Hubcast is one of them. Hubcast bills itself as the only worldwide digital print delivery network. It routes your online print order to the high quality network printer closest to you. This way you don’t have to pay shipping charges for printing. Hubcast won the Gartner “Cool Vendor” Award back in 2008.

Meanwhile, Cortado offers one-stop mobile business software solutions aimed at the enterprise, including cloud printing. Cortado competes with HP, offering a free cloud printing app called Cortado Workplace for BlackBerry and iPhone that lets you print your documents to any printer reachable via Wi-Fi or Bluetooth. Enterprise customers can also get Cortado Corporate Server for use on their company network behind the firewall.

Why Print from the Cloud?
Road warriors, mobile workers and on-the-go professionals can see the value in being able to access information and personal documents from just about any device. The problem historically has been the need to install drivers that make printing possible. Keeping up to date with print drivers for the various printers you might meet with while out of the office is cumbersome at best and nearly impossible at worst.

HP has also invested heavily in new ways of publishing, making the mashup of printing and cloud services a commercial opportunity, with even small-batch, location-focused publications possible via printers rather than presses.

Similarly, the latest user-focused cloud printing solutions that are integrated with mobile devices make publishing boundary-less and set the stage to boost productivity with the ability to print documents on the fly at places like FedEx, hotels, business centers or anywhere else along a professional’s travels that offer access to a printer. In other words, these solutions extend the corporate network and offer cross-platform conveniences that aren’t available through traditional printing options.

Getting started is easy: you just have to download an application to your BlackBerry or iPhone. Becoming an early adopter of cloud printing puts you on the cutting edge of business and could give you an advantage in a competitive marketplace.

Think about the possibilities of being able to print, sign and fax a document back to a client from anywhere you happen to be. Cloud printing is poised to revolutionize the enterprise work environment in much the same way that cloud computing is transforming IT settings.

It also highlights the longer-term strength of cloud models, beyond cost savings from outsourcing. And that value is the powerful role that clouds play as integration platforms: to enable things that could not be done before, and to bind processes -- like printing -- that scale up and down easily and affordably.

Join Us: http://bit.ly/joincloud

Unlocking the Value of IaaS

Until recently I've been in an odd spot: generally speaking, when a potential customer came to us looking for an Infrastructure-as-a-Service (IaaS) platform, my biggest competition was either build-it-yourself (a huge risk) or buy it from us.

(Yes, there were a few other competitors.) But for the most part the space was a greenfield; we really didn't have to worry about our competitors because there weren't many, and the ones that were out there were positioned in a significantly different way than us.

The problem with being first is one of education: most potential customers didn't even realize they needed a cloud platform. The good news is that things are changing; the idea of Infrastructure as a Service is no longer a radical one.

IaaS companies are getting funded left and right, and customers are buying. I've long held the notion that an industry isn't real until you have direct competitors. This both proves there is an opportunity and brings broader awareness. A rising tide floats all boats, if you will. In my post today, I thought I'd briefly explore the value proposition of IaaS.

So you ask, what's the value of providing Infrastructure as a Service? From a private cloud point of view it's mostly about efficiency; the drawback is that you need to spend money to save money, which can be a tough sell in a rough economic climate. From a public cloud context it's about converting costs (fixed to variable, capex to opex, etc.). OK, we've heard the story before. It's really about saving money, or not losing it to a more nimble competitor. So the real question becomes: how do you unlock the value of an IaaS cloud, internally, externally, or both? For me it's all about the application.

If done right, an IaaS platform provides a simple unified interface to your infrastructure. At its heart, it simplifies your infrastructure so you can focus on what matters most: the applications that run on it. It also changes the job of a system/network admin from reactive to proactive. If a server dies, who cares -- leave it. Instead, hot-add additional servers just in time, based on automatic utilization notifications. Where previously a sysadmin could manage dozens of servers, that same admin can now manage thousands, because no one server really matters; it's the overall cloud that matters. IaaS allows you to focus higher in the stack, where the real business value lies. Do you think end users really care what flavor of load balancer you're using? Probably not. What they do care about is the application they can have instant access to when they have a real problem to solve, and that when it's deployed, it will scale to handle anything they throw at it. The application just works.
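
To illustrate that "hot-add on utilization" style of administration, here is a minimal sketch of an autoscaling loop. The thresholds are invented, and get_cluster_utilization, provision_server and retire_server are hypothetical stand-ins for whatever monitoring feed and IaaS API you actually use.

    import time

    SCALE_UP_AT = 0.80    # hypothetical: add capacity above 80% utilization
    SCALE_DOWN_AT = 0.30  # hypothetical: retire capacity below 30%

    def get_cluster_utilization():
        # Stand-in for a monitoring feed; returns average load from 0.0 to 1.0.
        return 0.5  # placeholder value

    def provision_server():
        print("hot-adding a server to the pool")

    def retire_server():
        print("removing an idle server from the pool")

    def autoscale_loop(poll_seconds=60):
        # No individual server matters: if one dies, the next pass of the
        # loop simply converges the pool back into the utilization band.
        while True:
            load = get_cluster_utilization()
            if load > SCALE_UP_AT:
                provision_server()
            elif load < SCALE_DOWN_AT:
                retire_server()
            time.sleep(poll_seconds)

This is the shift from reactive to proactive in miniature: the admin tunes the band and the loop, rather than nursing individual machines.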

Another of the more puzzling approaches I've seen lately in the virtualization and IaaS space is the so-called "Virtual Data Center". Basically, what a few of the more backward-thinking vendors are saying is: let's recreate the traditional physical experience of running a data center, but in a virtual or cloud context. Some have even gone to the extent of creating virtual blades which graphically look like physical servers to get their point across. To which I say: why? Infrastructure is already too complex, so why would I want to recreate an already painful experience? The sales pitch is "do what you've already done." I think a better approach is to take what you've already done and make it better, easier, more efficient and ultimately simpler. Instead of recreating complexity, remove it. If someone is trying to sell you a "Virtual Data Center", I say run. What they should be selling you is business value: how this can save or make you money. The value of IaaS is that it just works, it just scales and it makes you money, not how many features can be crammed into a user interface. More importantly, from a technical standpoint, it doesn't require endless hours of configuration and testing. Yep, it's turnkey.

The biggest value of an IaaS platform is as a stepping stone -- one that allows you to gracefully migrate from the traditional physical, single-tenant infrastructure of the past to the multi-tenant distributed cloud of the future, without requiring you to completely re-architect or rebuild your applications. What IaaS does is remove the need to make your applications multi-tenant by making your infrastructure multi-tenant. If it doesn't accomplish this, then it's not IaaS and there's not a lot of value in it.

Join Us: http://bit.ly/joincloud

Best Buy Goes to the Cloud

Enomaly Helps Best Buy Leverage Cloud Computing to Connect with Customers

In late May, Gary Koelling and Steve Bendt came to Enomaly looking for our help to realize their latest brainchild, called "Connect". We'd previously worked with Best Buy to develop an internal social networking site called "Blueshirt Nation", and we were eager for another opportunity to collaborate with them.

Inspired by Ben Herington's ConnectTweet, the concept was simple: connect BlueShirts directly with customers via Twitter. In less than two months, Enomaly developed a Python application on Google App Engine to syndicate tweets from Best Buy employees and rebroadcast them, allowing customers to ask questions via Twitter (@twelpforce) and have the answer crowdsourced to a vast network of participating Best Buy employees. Now the answer to "looking for a Helmet Cam I can use for snowboarding and road trips" is only a tweet away.
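
The Twelpforce code itself isn't public, but the pattern is easy to sketch. Below is a hypothetical Python outline of the syndicate-and-rebroadcast idea: the employee roster, fetch_recent_tweets and post_tweet are all invented stand-ins for an account list and Twitter API calls, not Enomaly's actual implementation.

    # Hypothetical sketch: syndicate tweets from many employee accounts
    # and rebroadcast them through one shared account.

    EMPLOYEES = ["bestbuy_employee_1", "bestbuy_employee_2"]  # invented roster
    seen_ids = set()  # on App Engine this state would live in the datastore

    def fetch_recent_tweets(username):
        # Stand-in for a Twitter API call; returns (tweet_id, text) pairs.
        return []

    def post_tweet(text):
        # Stand-in for posting via the shared account.
        print("rebroadcast: %s" % text)

    def syndicate():
        for employee in EMPLOYEES:
            for tweet_id, text in fetch_recent_tweets(employee):
                if tweet_id not in seen_ids:  # avoid rebroadcasting twice
                    seen_ids.add(tweet_id)
                    post_tweet("%s (via @%s)" % (text, employee))

Run on a schedule (App Engine's cron, for instance), a loop like this is all it takes to crowdsource answers across a large employee base.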

The application, which has since come to be named Twelpforce, is a prime example of an innovator like Best Buy leveraging cloud computing to enhance its business. The response to the service since it launched on Sunday has been very positive, and it's exciting to be an integral part of bringing this project to life. We're interested in hearing your feedback on the service, so please feel free to leave a comment!

Original Article - Cloud Computing Journal - http://cloudcomputing.sys-con.com/node/1388813

Join Us: http://bit.ly/joincloud

Wednesday, 12 May 2010

Android Now Outselling iPhone

Smartphones carrying Google’s Android operating system outsold the iPhone in the first quarter of 2010, according to new research out today from NPD.

During the quarter, Android handsets accounted for 28% of smartphone sales, beating out iPhone OS and its 21% share. BlackBerry was the bestselling OS, with its devices capturing 36% of the market. NPD attributes the shift to strong sales of the Motorola Droid and Droid Eris.

The news follows a report last month from AdMob that showed Android’s share of the mobile OS market had actually grown past that of iPhone OS, at least in the U.S. Worldwide, iPhone OS still has a big lead, however, with 46% share compared to Android’s 25%. AdMob’s data is based on ad requests across mobile websites and applications.

With the new Droid Incredible reportedly selling quite well, this trend may continue in the near-term. However, the fourth-generation iPhone is expected to debut next month. Rumors that the iPhone will soon come to Verizon also continue to intensify, with that carrier’s customers indicating they are ready to welcome the device with open arms. If all of that happens, the numbers could quickly swing back in Apple’s favor. Stay tuned.

Original Article - Mashable - http://mashable.com/2010/05/10/android-outselling-iphone/

Wi-Fi at 60 GHz Will Be 10 Times Faster

The Wi-Fi Alliance and the Wireless Gigabit Alliance have reached an agreement to set up standards for Wi-Fi to operate in the previously unclaimed 60 GHz frequency band, which would offer up to 10 times faster data transfer speeds. Currently, Wi-Fi operates in the 2.4 GHz and 5 GHz frequency bands.


"The 60 GHz band allows for a significant boost in performance, so we are talking about speeds in the gigabits-per-second range," says the executive director of the Wi-Fi Alliance, Edgar Figueroa.

Unfortunately, the new standard would come with severe drawbacks: a Wi-Fi signal operating in the 60 GHz band would not be able to penetrate walls, so one would need line of sight for a decent connection. However, the increase in speed is so significant that the trade-off would definitely be worth it in certain situations.

It’s not going to happen overnight, though. After the standard is set, consumers will want routers and wireless adapters that can switch between two or all three (2.4 GHz, 5 GHz and 60 GHz) Wi-Fi frequency bands. Figueroa estimates that these will be available on the market in about two years.

Tuesday, 11 May 2010

Speed & Access: Are They Opposite Goals?

The economic slowdown has not dampened the IT world’s enthusiasm for fast computing and fast start-up

Speed, speed, speed. The economic slowdown has not dampened the IT world’s enthusiasm for fast computing and fast start-up.

Today, Steria, a Europe-based IT services company, joined forces with Cisco to create a solution that claims to give users access to the cloud in about 30 minutes. Called Infrastructure on Command and due to debut around June, it's a Platform as a Service (PaaS) product built on Cisco's Unified Computing System and supported by Steria Advanced Remote Services (STARS).

Users will be able to access Infrastructure on Command through an online portal, and they can purchase additional computing power and infrastructure services on a pay-as-you-go basis. Companies – and this sounds like it would be perfect for small and mid-size firms – can scale up or down depending on what they need.

What's so attractive about this is not only that companies can adjust supply to match their demand, but also that they simply pay for the computing power they use.

While this is certainly good news for IT folks (especially at smaller companies) who want to get their apps and data onto the cloud quickly and stay within their budget guidelines, quick access doesn't necessarily address some of the major worries about cloud computing these days, especially who within a company should get access and how to control that.

A recent poll by the Ponemon Institute, called the “2010 Access Governance Trends Survey,” asked 728 IT practitioners about their procedures and outcomes in setting up access to information resources. The survey found that 87% thought individuals had too much access to information systems, up 9% from 2008.

Meanwhile, 73% said adoption of cloud-based applications is allowing users to circumvent existing corporate policies on access. This is clearly a problem for companies that want to tighten information control and security. And the study says that lack of central control over access to the cloud and online information “can contribute to excess user access and generally decrease the ability to apply policies and processes consistently across the enterprise.”

So, if you consider the first bit of news, that is, a way to get access to the cloud in 30 minutes, and then juxtapose it against the second, that is, that IT executives are unhappy about a lack of control over employees gaining access to the cloud, it seems we’ve got a problem here. Is cloud development moving too fast for IT folks concerned with governance?

This makes an even more compelling argument for monitoring services, especially the monitoring of networks, and above all multi-site networks. If more and more people are accessing information services, whether via internal servers or virtual servers at remote sites, you're bound to worry more about keeping it all running smoothly. What kind of monitoring can Monitis do for you? Here are some services for network monitoring:

* Native agent for Windows, Linux, Solaris, FreeBSD and Mac
* Computer configuration monitoring
* SNMP monitors, SNMP traps, MIB Browser with local infrastructure
* Internal telnet checks for switches, routers and other network devices
* Internal http check for intranet and extranet applications
* Office and cross-office WAN connectivity checks
* Ad hoc monitoring locations
* Expandable with custom scripts (see the sketch after this list)
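
As one example of the last item, here is a minimal custom HTTP check in Python. The intranet URL is hypothetical, and the pass/fail reporting convention is an assumption for illustration rather than Monitis's actual plugin interface.

    import sys
    import urllib2

    def http_check(url, timeout=10):
        # Minimal internal HTTP check: succeed on any 2xx/3xx response.
        try:
            response = urllib2.urlopen(url, timeout=timeout)
            return 200 <= response.getcode() < 400
        except Exception:
            return False

    if __name__ == "__main__":
        # Hypothetical intranet URL; a real check would take it as an argument.
        ok = http_check("http://intranet.example.com/login")
        print("OK" if ok else "FAIL")
        sys.exit(0 if ok else 1)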

Now is the time to consider letting a third-party monitoring system, like Monitis, do the tracking—and worrying—for you!

Join Us: http://bit.ly/joincloud

Start-up Claims to Make Cloud Storage Actually Work

StorSimple does real-time data deduplication to minimize the footprint of the data stored

It seems it's not that easy to use the cloud for storing mainstream Microsoft applications - or other people's mainstream applications for that matter - at least according to the budding integration start-up StorSimple.

And there must be something to it because it emerged from stealth mode Wednesday with agreements in hand with Amazon Web Services (AWS), AT&T, EMC, Iron Mountain and Microsoft's Windows Azure to connect on-premise Windows and VMware apps to EMC Atmos public and private cloud storage for joint customers.

The alternative appears to be modifying the applications.

See, the problem is that cloud storage uses HTTP/REST APIs, so clouds don't accept snapshots or integrate with existing backup applications, and data recovery processes need to be changed. On top of which, StorSimple says, WAN latency significantly degrades application performance, WAN bandwidth can be cost-prohibitive, and traditional WAN optimization doesn't work. Then there's always the security risk.
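
To see why block-oriented tools don't drop straight onto a cloud, it helps to look at the dialect object stores actually speak: whole-object PUTs and GETs over HTTP, rather than the sector-level reads and writes that snapshot and backup agents expect. The endpoint below is hypothetical, and real services such as S3 or Atmos add authentication headers on top.

    import httplib

    HOST = "storage.example-cloud.com"  # hypothetical endpoint, not a real service

    def put_object(bucket, key, data):
        # Object stores take a whole object per request...
        conn = httplib.HTTPConnection(HOST)
        conn.request("PUT", "/%s/%s" % (bucket, key), data)
        return conn.getresponse().status

    def get_object(bucket, key):
        # ...and hand a whole object back; there is no seek-and-rewrite.
        conn = httplib.HTTPConnection(HOST)
        conn.request("GET", "/%s/%s" % (bucket, key))
        return conn.getresponse().read()

Nothing in that interface resembles a LUN, which is why a translating controller has to sit in between.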

StorSimple's application-optimized storage controller is targeted at clearing these hurdles.

As a fledgling operation, the outfit is limiting how much it bites off: just the ever-popular SharePoint and Exchange 2010, Windows user files, and Microsoft and Linux virtual machines for now.

Down the road it means to do home directories over CIFS and NFS; content from EMC, IBM, Interwoven and Microsoft; libraries and archives; ERP and CRM such as SAP, Oracle, Siebel and PeopleSoft; OLTP; and simulation and modeling.

Pretty much all the biggies are on its roadmap.


Its iSCSI-based Armada Storage Appliance is a hybrid solution that makes cloud storage look like local data center storage and integrates with the customers' existing environment.

Armada identifies and stores all the hotspot and bottleneck data on a tier of high-performance solid state drives (SSDs) and uses lower-cost SATA storage and/or cloud storage as primary storage.

StorSimple does real-time data deduplication to minimize the footprint of the data stored and provides the WAN optimization functions for cloud storage.

The widgetry is supposed to deliver consistent storage performance at scale, simplify data protection and reduce cost by up to 90% compared to traditional enterprise storage.

The core of the StorSimple widgetry is called a Weighted Storage Layout or WSL (say whistle).

All volume data is dynamically broken into chunks, analyzed and weighted based on how frequently it's used. Frequently used data is deduped to minimize the space required and stored in the SSDs for fast access. Less frequently used data can be stored on the SATA drives or encrypted in the cloud.

All data stored in the cloud is encrypted.
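
StorSimple hasn't published WSL's internals, but the behaviour described above maps onto a familiar pattern, sketched hypothetically below: chunk the volume, dedupe by content fingerprint, and place chunks on tiers by access frequency. The chunk size and thresholds are invented for illustration.

    import hashlib

    CHUNK_SIZE = 64 * 1024   # hypothetical chunk size
    HOT, WARM = 100, 10      # invented access-count thresholds

    chunk_store = {}         # fingerprint -> chunk (dedupe: store once)
    access_counts = {}       # fingerprint -> how often the chunk is read

    def write_volume(data):
        # Split volume data into chunks, deduped by content fingerprint.
        fingerprints = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            fp = hashlib.sha256(chunk).hexdigest()
            chunk_store.setdefault(fp, chunk)  # identical chunks stored once
            access_counts.setdefault(fp, 0)
            fingerprints.append(fp)
        return fingerprints

    def tier_for(fp):
        # Weight chunks by usage: hot -> SSD, warm -> SATA, cold -> cloud.
        count = access_counts.get(fp, 0)
        if count >= HOT:
            return "ssd"
        elif count >= WARM:
            return "sata"
        return "cloud (encrypted before upload)"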


StorSimple's also got a scheme called Cloud Clones to simplify backup and restore using existing backup software for application object recovery. The way it works: an independent backup copy of the data volume is created in the cloud, changes are transferred daily, and clones can be mounted as data volumes for restore. Tape is eliminated.
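
Only the behaviour of Cloud Clones is described, so again a hedged sketch: every function below is a hypothetical stand-in for the mechanism as the article presents it.

    # Hypothetical sketch of the described behaviour: keep an independent
    # copy of each volume in the cloud and ship only the daily changes.

    def changed_chunks_since(volume, last_sync):
        # Stand-in: return the chunks modified since the last daily sync.
        return []

    def upload_encrypted(chunk):
        # Stand-in: data is encrypted before it ever reaches the cloud.
        pass

    def daily_clone_update(volume, last_sync):
        for chunk in changed_chunks_since(volume, last_sync):
            upload_encrypted(chunk)
        # The resulting clone can later be mounted as a volume for restores,
        # which is what lets existing backup software work without tape.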

The year-old, 35-man start-up is operating on an $8.2 million A round from Redpoint Ventures and Index Ventures.

It has been in beta for the last two months with eight large and mid-range sites, and it's still three months away from general availability. It reckons companies with 500 to 10,000 users are its sweet spot; makes for a faster close.

Join Us: http://bit.ly/joincloud

Monday, 10 May 2010

The Next Chapter in the Virtualization Story Begins

Virtualization is almost too successful, says Abiquo's CEO - adding that "the infrastructure beast is becoming uncontrollable"

Cloud Computing Journal recently caught up with Pete Malcolm, CEO of cloud management innovators Abiquo - a major new player in the fast-emerging Cloud ecosystem and Platinum Plus Sponsor of 5th Cloud Expo held in New York April 19-21, 2010.

Malcolm keynoted personally at the event. His theme was what he called "the next chapter in the virtualization story." As the CEO of one of cloud computing's fastest-rising stars put it: "Virtualized or not, the infrastructure beast is becoming uncontrollable. But, as they say, big change is a-coming..."

Read more @ The Cloud Computing Journal

Join Us: http://bit.ly/joincloud

Leveraging Public Clouds to SaaS-Enable Enterprise Applications

Q&A with Marty Gauvin, President & CEO of Virtual Ark

"Security in the Cloud isn’t bad, it’s just different," Gauvin at one point notes, adding: "It is essential to take a measured, careful approach to security issues." Here in the interview in full.

[On April 20, 2010 at 5th Cloud Expo in the Jacob Javits Convention Center, New York City, Gauvin gave a General Session presentation in the keynote room]

Cloud Computing Journal: As a successful serial entrepreneur and now the CEO of a major new player in the fast-emerging Cloud ecosystem, how do you see your company fitting in? What layer of the ecosystem are we talking about?

Marty Gauvin: Virtual Ark fills a significant gap between Cloud vendors such as Amazon, Rackspace and Microsoft, and the customer. By providing deep application management services and support, we help SaaS-enable applications on the Cloud with our ISV partners.

Rather than being a "cloud vendor", Virtual Ark is a Cloud expert that uses Cloud services on behalf of its customers to deliver their enterprise applications in a SaaS model with consumption-based pricing similar to that delivered on public clouds.

As a result, customers can choose their preferred application as a SaaS solution. Because of our expertise and use of Public Cloud infrastructure services, Virtual Ark has virtually instant international reach enabling customers to use their applications globally, quickly and easily.

Cloud Computing Journal: What kind of existing enterprise applications does the Virtual Ark platform support?

Gauvin: ERP, CRM, financial, web, content management systems and line-of-business applications across a wide range of industries. Virtual Ark works with ISV partners to SaaS-enable their applications. There is typically very little coding required of our ISV partners on their original applications, and Virtual Ark does almost all of the work to SaaS-enable them. Significantly for the enterprise, multi-tenancy is not a requirement for Virtual Ark to SaaS-enable these applications.

Cloud Computing Journal: But what about security issues: Is it really sensible for a Tier 1 company to be running its core business applications in the Cloud?

Gauvin: Enterprises require a risk analysis process that systematically identifies and assesses the relevant aspects of the chosen computing model or service. This equips them to analyze risk by examining the technical and process dimensions of a specific implementation rather than trying to second-guess the security needs of a generic service identified as "Cloud Computing."

Security in the Cloud isn't bad, it's just different. It is essential to take a measured, careful approach to security issues. This approach should apply whether you protect a person, a physical asset or your data. What changes are the risks and threats to which you respond.

In the case of Cloud-enabled applications, techniques such as encryption and overlay tools are available to meet significant security accreditations when implemented correctly.

Cloud Computing Journal: What is the cost model? Is the platform available - in true Cloud fashion - on a pay-as-you-go/usage basis?

Gauvin: Absolutely. Because we have no underlying fixed cost of infrastructure, we can bring very flexible billing models to market. While billing varies from application to application, all are aligned as closely as possible to a sensible consumption-based pricing model. Some use "number of users per hour"; others bill "by transaction". Ultimately, because we leverage existing Cloud service providers, we are able to bring pricing billed on a pay-as-you-go basis.

Read more of this interview @ Cloud Computing Journal

Join Us: http://bit.ly/joincloud

Friday, 7 May 2010

Cloud Computing – Demystifying SaaS, PaaS and IaaS

Is cloud computing the biggest thing since the Web? The answer is YES.

After the mainframe, the personal computer, client-server computing and the Web, this is the next big thing. According to an October 2008 report from the International Data Corporation (IDC), projected cloud IT spending in 2012 will be $42 billion, a growth of about 27% from 2008. IDC forecasts that Asia Pacific spending on IT cloud services will grow fourfold, reaching $3.6 billion by 2013, and sees new uses of cloud technology being introduced to markets which cannot yet take advantage of cloud computing.

Fig 1. Cloud Computing Services: SaaS, PaaS and IaaS

Cloud computing is broken down into three segments: “software”, “platform” and “infrastructure”.

Each segment serves a different purpose and offers different products to businesses and individuals around the world.

Software as a Service (SaaS) is a service based on the concept of renting software from a provider rather than buying it yourself. The software is hosted on centralized network servers, making functionality available over the web or an intranet. Also known as "software on demand", it is currently the most popular type of cloud computing because of its high flexibility, enhanced scalability and low maintenance. Yahoo Mail, Google Docs and CRM applications are all instances of SaaS. With a web-based CRM, all employees need to do is register and log in to the central system and import any existing customer data; the service provider hosts both the application and the data, so end users are free to use the service from anywhere. SaaS is very effective in lowering the cost of business, providing access to applications at a cost normally far cheaper than a licensed application fee, which is possible because of its monthly-fee revenue model. With SaaS, users need not worry about installation or upgrades.

From SaaS, the industry is now moving towards Platform as a Service (PaaS), which offers a development platform for developers: end users write their own code, and the PaaS provider uploads that code and presents it on the web. Salesforce.com's Force.com is an example of PaaS. PaaS provides services to develop, test, deploy, host and maintain applications in the same integrated development environment, along with some level of support for the creation of applications. PaaS thus offers a faster, more cost-effective model for application development and delivery. The PaaS provider manages upgrades, patches and other routine system maintenance, and pricing follows a metering or subscription model, so users only pay for what they use, taking what they need without worrying about the complexity behind the scenes.

There are basically four types of PaaS solutions: social application platforms, raw compute platforms, web application platforms and business application platforms. Facebook is a social application platform, on which third parties can write new applications that are made available to end users. The CRM solutions mentioned above are examples of business application platforms. Developers can upload and execute their applications on Amazon's infrastructure, an example of a raw compute platform, while Google provides APIs for developers to build web applications, an example of a web application platform. A minimal sketch of that last case follows.
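
As a concrete illustration of the web application platform case, here is the kind of minimal handler Google App Engine's Python runtime expected around this time. It only runs inside App Engine, since the google.appengine packages ship with its SDK; everything below the imports is the developer's entire deployment unit.

    # Runs on Google App Engine's Python runtime of the era; the platform
    # supplies the google.appengine packages, servers, scaling and patching.
    from google.appengine.ext import webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    class MainPage(webapp.RequestHandler):
        def get(self):
            # The developer writes only application logic like this handler.
            self.response.out.write("Hello from the platform")

    application = webapp.WSGIApplication([("/", MainPage)], debug=True)

    if __name__ == "__main__":
        run_wsgi_app(application)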

The final segment in cloud computing is infrastructure. Infrastructure as a Service (IaaS) is the delivery of computing infrastructure as a fully outsourced service. Some of the companies that provide infrastructure services are Google, IBM and Amazon.com. Managed hosting and development environments are among the services included in IaaS. Users can buy infrastructure according to their requirements at any particular point in time, instead of buying infrastructure that might sit unused for months. IaaS operates on a "pay as you go" model, ensuring that users pay for only what they use. Virtualization enables IaaS providers to offer almost unlimited instances of servers to customers while making cost-effective use of the hosting hardware. IaaS users enjoy access to enterprise-grade IT infrastructure and resources that might be very costly if purchased outright. Dynamic scaling, usage-based pricing, reduced costs and access to superior IT resources are thus some of the benefits of IaaS, which is also sometimes referred to as Hardware as a Service (HaaS). An IaaS offering also provides maximum flexibility, because just about anything that can be virtualized can be run on these platforms; this is perhaps its biggest benefit. For a startup or small business, one of the most difficult things to do is keep capital expenditures under control. By moving your infrastructure to the cloud, you have the ability to scale as if you owned your own hardware and data center.
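
To make "pay as you go" concrete, here is a hedged sketch using the boto library for Amazon EC2, one of the IaaS providers named above; the credentials and AMI id are placeholders, and running it against a real account would incur real charges.

    # Sketch of renting and releasing a server via Amazon EC2 with boto.
    from boto.ec2.connection import EC2Connection

    conn = EC2Connection("ACCESS_KEY", "SECRET_KEY")  # placeholder credentials

    # Rent a server only when you need it...
    reservation = conn.run_instances("ami-00000000",  # placeholder image id
                                     instance_type="m1.small")
    instance = reservation.instances[0]
    print("started %s" % instance.id)

    # ...and stop paying for it the moment you are done.
    conn.terminate_instances([instance.id])

The same two calls, wrapped in whatever scaling logic you choose, are what turn a fixed data center into a variable line item.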

So we can see that where SaaS offers a complete application as a service and PaaS offers the ability to develop an application, IaaS doesn't care about the application at all. If you have already written a lot of code, or have a software package you want to install and run in the cloud, you'll be looking for IaaS. If you have no software, or want to build something from scratch to solve a problem for which there is no package available (or the packages are too expensive or complicated), then you should go for PaaS. The unit of deployment varies from the server down to the application, and these three types of cloud service will have a great effect on the nature of IT operations.

Full Source: e2eNetworks

Join Us: http://bit.ly/joincloud

Sage to Deliver Cloud-Based CRM Solution on Amazon Web Services

Sage North America, which provides business management software and services to more than three million small and midsized businesses in Canada and the United States, will offer the Sage SalesLogix CRM Suite on the Amazon Elastic Compute Cloud (Amazon EC2) infrastructure.

The cloud-based, on-demand version of Sage SalesLogix will be available in North America in the first half of 2010, after which it will be expanded to other regions worldwide. Sage CRM Solutions business partners are now implementing Sage SalesLogix in the cloud for select North American customers. Leveraging Amazon EC2, Sage can provide customers with flexible deployment and user access, as well as customer data ownership, customer control over the timing of upgrades, and IT management and hardware cost benefits.

Join Us: http://bit.ly/joincloud

Thursday, 6 May 2010

How To Deploy An Open Area With Wireless

Whether you provide wireless as a courtesy or use it as a money-maker, there are definite do's and don'ts to setting up and sustaining a usable local wireless environment where many clients will rely on it.

There are no hard and fast rules, and individual wireless access points have different features, but the general principles of AP placement are well understood. Regardless of the wireless access point vendor you use, follow these steps for a successful deployment.

Up-front analysis is key. Have a scale drawing of your area so you can map out your AP placement. A free tool like Meraki's WiFi Stumbler will help you size up the neighboring wireless networks that need to be worked around, by showing adjacent network names, the channels in use and how strongly they register in your space. Once you map the airspace, you can design your own access point placement. Remember that your airspace will change over time as neighbors add or remove their own access points, but you can adjust your network as needed.

Estimate how many users are going to be on the wireless network at a given time. If more than 15-20 users might be on the wireless network at the same time, you'll need to plan for more access points even if just one can fill the given area with signal. Wireless is a shared medium and when you get over 20 simultaneous users, contention will become a limiting factor in performance.

Determine the minimum throughput you'd like to provide. 802.11a/g offers up to 54 Mbps, while 802.11n can in theory reach 600 Mbps (though actual data rates will be much, much lower). Data rates will increase and decrease based on conditions in the airwaves. Balancing the data rate against the number of users is a tricky process: web and mail traffic might be considered lightweight, but a single YouTube video is usually around 500 Kbps. Consider what your users are likely to be doing; 10 users all streaming video might max out an 11g access point, for example.

Map out the size of the area you need to cover. APs are usually "rated" for open areas when distances are claimed: you might get 200 feet of good signal outdoors, but only a couple of rooms indoors. Construction materials make effective wireless barriers, particularly in older buildings where metal screening was often used to hold plaster. Verification is a must: if you see signal falling below -70 dBm, you need another AP. You don't need to have your AP blasting at full power, either. If you are at an edge, try adjusting the power levels so that you have adequate power for your area while still being a nice neighbor. It's also a good time to talk to neighbors who are using maximum power, to see if they can turn it down a bit. A rough capacity calculation pulling these rules of thumb together is sketched below.
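
Contention, throughput and coverage each set their own floor on the AP count, and you need to satisfy all three. The sketch below combines the rules of thumb above with hypothetical per-AP defaults; usable 11g throughput and indoor coverage vary enormously, so treat them as placeholders to be replaced by your own survey numbers.

    import math

    def aps_needed(concurrent_users, per_user_kbps, area_sqft,
                   users_per_ap=20,            # rule of thumb from above
                   ap_throughput_kbps=20000,   # hypothetical usable 11g throughput
                   ap_coverage_sqft=2500):     # hypothetical indoor coverage
        # Each constraint sets its own floor; the answer is the largest one.
        by_contention = math.ceil(concurrent_users / float(users_per_ap))
        by_throughput = math.ceil(concurrent_users * per_user_kbps
                                  / float(ap_throughput_kbps))
        by_coverage = math.ceil(area_sqft / float(ap_coverage_sqft))
        return int(max(by_contention, by_throughput, by_coverage))

    # Hypothetical example: 60 users at ~500 Kbps each in a 6,000 sq ft hall.
    print(aps_needed(60, 500, 6000))  # -> 3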


Join Us: http://bit.ly/joincloud

Wednesday, 5 May 2010

Hubs, Spokes and WANs

Recently, we've had a number of discussions with enterprises about how they'd like to use the cloud.

The basic use case is around capacity on demand (not surprisingly), but the specifics have raised some interesting issues. These companies have distributed branch offices that need capacity for a range of applications, including dev/test environments as well as back-office and web apps. Today, these distributed groups rely on corporate IT to meet their scaling and infrastructure needs, and they are frequently bottlenecked.

The bottleneck shows up both in the overall challenge of getting new capacity approved in a timely way and in network bandwidth. At a panel this week at Interop, Riverbed noted that two-thirds of its enterprise customers have a hub-and-spoke model that requires the "spokes" to backhaul to the "hub" for connectivity to the internet, and thus to cloud computing services.

Only the remaining third have direct connections. At the same panel, Blue Coat agreed with the stats but commented that branch sites are trending towards a direct-connect model as new sites are added.

As Expand Networks is able to provide a complete, holistic VM-based solution at both branch offices and headquarters, the TCO can be significantly lower than appliance-based solutions, so check them out as well.

Join Us: http://bit.ly/joincloud

Tuesday, 4 May 2010

How to Move Your Business to Cloud Computing

Adam Pash is a technology guru and the editor-in-chief of Lifehacker, a popular Website that focuses on technology, productivity, and making your life easier with your computer.

The Internet has changed the way people do nearly everything–from consuming media to performing research to maintaining relationships to communicating.

Its effect on business has been similarly wide ranging. And today, for mission-critical data, Internet-based computing–aka cloud computing–is introducing major changes to the way work is done.

Whether you’re ready to move your business operations to the cloud full time or just want to dip a toe in the water vapor, cloud integration offers significant benefits. Specifically:

With the right tools, you can have your data in the cloud in no time–and it will still play nicely with your desktop, laptop, or smartphone.

Don’t Make Me Ditch Microsoft Office!

The Internet has certainly changed the way people work, but desktop applications continue to outperform the Web on a number of fronts–especially in the business world. Fortunately, you can take advantage of the cloud without ditching desktop apps altogether. With the right tools and know-how, you can pair your significant business apps and data with the Web to keep everything in sync, accessible from any browser, and backed up to the cloud.

Note: There are more options for syncing data to the cloud than we can highlight in one article. Our focus here is on some of the most popular and trustworthy options–and in general, that means a large number of Google applications.

Microsoft Office Live

Since businesses tend to create lots of documents, spreadsheets, and presentations, you probably use Microsoft Office regularly. Most Web alternatives still haven’t come close to matching the power of the Microsoft Office suite. But you do have a couple of good options for integrating the most popular Office apps with the Web.

Microsoft's own Office Live Workspace service works with Word, Excel, and PowerPoint; it aims to eliminate your need to carry a thumb drive by letting you store any of the three types of Office documents on the Live Workspace Website.

When you save a document to Live Workspace, you can view it from any browser, share it with other people in your workspace, and edit it on your desktop with Microsoft Office. The files are stored online on Microsoft's servers, but you can open, edit, and save them with your desktop Office applications just as though they were on your hard drive.

Full Source: WashingtonPost

Join Us: http://bit.ly/joincloud

Monday, 3 May 2010

Product Cloud Or Service Cloud? Know The Difference

In the world of on-premise computing, product companies are as distinct from services companies as land is from water. Yet in the cloud, it’s often hard to see the difference. Some people lump cloud companies together, as if they all do the same thing. But they don’t; in fact, not recognizing the difference can be an expensive mistake for customers, investors, and even the cloud companies themselves.

First, let’s understand what makes it hard to tell the product from the services companies in the cloud. A product vendor (for example, a provider of subscription-based software applications) sometimes sells services, but primarily to support its products. A services vendor relies on cloud consulting and implementation services as its primary revenue stream, yet may see an opportunity for additional revenue with products. With cloud computing so hot, and few established role models to emulate, both types of vendors steal each other’s ideas thinking the grass is greener on the other side. In the end, this can cause problems for these vendors, as well as their customers and investors.

So, what does a genuine cloud-based product vendor look like? Typically, this is a company that provides continuous product upgrades, support and maintenance as part of the subscription, and the product is served from a multitenant architecture. All customers immediately reap the benefits as the product evolves and improves.

Typically, more than 70% of a true product vendor’s revenues are recurring, and come primarily from product subscriptions. Like its on-premise counterparts, a cloud product vendor owns its intellectual property, has product managers, follows product release cycles, and provides upgrades and support for the life of the product. Unlike an on-premise vendor, a cloud product vendor may use a platform-as-a-service (PaaS) or an infrastructure-as-a-service (IaaS) as the base, making it dependent on other cloud providers (for instance, FinancialForce is a cloud product vendor built on Salesforce.com’s Force.com PaaS). In other words, a product vendor in the cloud is often the front-end of a consortium of cloud vendors, providing a bundled offering with multiple players supporting parts of its offering on an ongoing basis.

Since this reliance on other PaaS/IaaS vendors is a newer concept, it creates complexities of its own, putting the cloud product vendor in uncharted territory. The concept of multiple vendors working together to meet SLAs, without having exclusive partnerships, is still evolving and generates a lot of competition and conflict. In addition, building multitenancy into the product proves to be challenging and expensive.

To find an easier path, a product vendor sometimes takes a page out of the services vendor’s playbook. As opposed to focusing on building a product that can satisfy most customers, the product vendor starts implementing solutions tailored for individual customers. Since each customer pays for the implementation services on an hourly basis, the product vendor is able to generate more dollars upfront as well as avoid making difficult product architecture decisions. The danger is that this temptation may lead the vendor down the road of even more single-tenant engagements (see my previous blog, “Why Multitenancy Matters in the Cloud”). The single-tenant model forces it to continue supporting the unique needs of each customer, increasing its operating costs considerably over time. A product vendor in that situation is, slowly and steadily, forced to follow the path of a services company, and eventually starts becoming one.

Customer Hand Holding

Pure services firms (like Deloitte, where I spent part of my career), meanwhile, help companies implement cloud apps and infrastructure, and then charge them for these services. However, some services firms aspire to go a step further and dabble in selling products. The reason is simple: third-party cloud apps or platforms make it cheap and easy for them to create "templates" or "solutions" that they can offer to their customers, adding a recurring revenue stream to their otherwise one-time revenue model.

What a service vendor going down this path may not realize, however, is that the recurring revenue model requires it to provide continuous customer handholding for product support, which can be a low-margin business unless it’s implemented using a multitenant architecture. The services vendor then becomes even more reliant on growing cash-flow friendly service revenues. It may then use its templates or solutions as bait for customers (so that it can eventually charge them for more services); nevertheless, that makes it look like a product vendor.

In the majority of cases, a product vendor offering multitenant software is the best choice for companies attracted to the cost/value benefits of cloud computing. They'll receive continuous upgrades and support, and will always benefit from the highest point in the evolution cycle of the product. All of this can translate into significant cost savings and added value for the customer. That means, of course, that the product is going to change over time, and the customer must be willing to go along with the changes.

A services vendor, however, might make the most sense for customers that believe their needs are unique. Maybe they want very specific customizations or custom-built apps; or maybe they are confident in their own ability to deliver the benefits of cloud computing to their organization. Going with a services vendor means the customer bears the ongoing cost of activities such as customization, enhancement, maintenance and support.

In either case, it’s important for a customer to understand what type of company it wants to do business with and choose accordingly, since there can be significant cost implications. It’s also important for the cloud vendors to ask themselves the product vs. services question, since the kind of company they need to build can be very different depending on the answer. And venture capitalists and investment bankers need to look beyond standard financial numbers/projections of cloud companies, to ensure those companies aren’t inadvertently heading down the wrong path.

At a minimum, if you look at how the vendor is providing maintenance, support and upgrades for the entire solution, you should have a pretty good idea of whether they are a product or a services company.

Alok Misra is a cofounder of Navatar Group, which provides cloud apps for the financial services industry and helps software companies launch and support SaaS.

Join Us: http://bit.ly/joincloud