Wednesday, 30 September 2009

Moving to the Cloud: How Hard is it Really?

Today's cloud providers impose architectures that are very different from those of standard enterprise applications

Many IT managers would love to move some of their applications out of the enterprise data center and into the cloud. It's a chance to eliminate a whole litany of costs and headaches: in capital equipment, in power and cooling, in administration and maintenance. Instead, just pay as you go for the computing power you need, and let someone else worry about managing the underlying infrastructure.

But moving from theory into practice is where things get complicated. It's true that a new web application built from scratch for the cloud as a standalone environment can be rolled out quickly and relatively easily. But for existing applications running in a traditional data center and integrating with a set of other systems, tools and processes, it's not nearly so simple.

What's really involved when moving an application from your enterprise data center to the cloud? Let's say you've decided on a particular cloud, and you've identified the application you want to run there - now what? You need to consider a range of issues which can potentially turn the migration into a complex engineering project.

Migrating to the Cloud
Today's cloud providers impose architectures that are very different from those of standard enterprise applications. As Bernard Golden explains in his in-depth look at cloud computing, difficulty in migration is holding back uptake, and there aren't yet any automated tools to smooth the way. The result is lots of manual configuring, complex engineering, and trial and error before the enterprise application is able to run in the cloud. A whole landscape of specifications for OS versions, storage, networks, integration with other applications and databases - all those configuration steps that normally happen behind the scenes - have to be mapped to a cloud environment that is probably very different from what your IT staff is used to. It's the type of project that can tie up a development team for weeks or even months.

Keeping Your Data Safe
When data moves to the cloud, it moves beyond the reach of tools and mechanisms put in place over the years to preserve its integrity. In an environment characterized by multi-tenancy and decoupling between hardware and applications, cloud users need to be vigilant and understand the risks. (For a good introduction to cloud security issues, see David Binnings' article, Top Five Cloud Computing Security Challenges.) In brief, you'll need to make sure that the cloud provider has a level of physical security and regulatory compliance that meets the needs of your business and the specific application (for example, those with public information vs. confidential vs. compliance-regulated). You'll also need to consider what additional measures might be necessary to protect against potential threats, including protecting data in transit as well as at rest. It may also be appropriate in some cases to keep the database within your data center and put the rest of the application outside in the cloud.
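That last point lends itself to a simple decision rule: let the data classification drive where each tier runs. A minimal sketch in Python (the classification names and placements here are assumptions for illustration, not a prescribed policy):

```python
# Hypothetical mapping from data classification to deployment placement,
# following the idea of keeping sensitive databases in the data center
# while the rest of the application runs in the cloud.
PLACEMENT = {
    "public":       {"app": "cloud", "database": "cloud"},
    "confidential": {"app": "cloud", "database": "data center"},
    "regulated":    {"app": "data center", "database": "data center"},
}

def placement(classification: str) -> dict:
    """Return where the application tier and database should run."""
    return PLACEMENT[classification]

print(placement("confidential"))
# {'app': 'cloud', 'database': 'data center'}
```

In practice the table would come from your compliance team, but the point stands: the split between cloud and data center can be made explicit per data class rather than decided ad hoc.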

Managing Dual Environments
After you finally get your application running in the cloud, you'll find another big hurdle: how are you going to manage it? The cloud and the data center are currently two completely separate environments, each with its own set of system management tools, and no meaningful way to integrate the two. Accordingly, your IT staff will need to learn and use each cloud provider's management tools and policies, in addition to the ones they already have. They'll also have to give up some of the control and visibility into an application and its supporting infrastructure that's available in the data center, at least in current cloud environments. (More details about the challenges of managing enterprise applications in the cloud can be found in Peter Loh's article in Cloud Computing Journal.) And as the cloud provider makes changes to their underlying infrastructure (for example, patching a version of their OS), the cloud version of the application needs to be maintained to meet this new environment, so it becomes even more different from the local versions over time.

What if You Want to Change Clouds or Move Back to the Data Center?
All that effort was just for one cloud! What if another cloud provider comes along with lower prices or better service? Since you've invested all that time to set up the application for one cloud, you're going to be very reluctant to repeat all the development and integration work to meet the new provider's requirements. Many companies also wish they had the flexibility to use the cloud to develop and test a new application (leveraging the cloud's benefits in agility and low cost for early research/prototyping/development), before bringing it back to the data center to take advantage of the production set of data and their corporate processes and infrastructure. Today, it's not possible to move an application between different clouds or back to the data center easily, with a few mouse clicks. For many companies, the goal is to create a federated environment of their data center with one or more clouds, and to move applications and workloads wherever is most appropriate.

The cloud offers a great opportunity for enterprise applications, but it's important to understand the work required before embarking on a migration, and how the cloud environment will integrate with the existing data center. CloudSwitch has been working hard to address these issues. Stay tuned for further developments.

Original Article -

Tuesday, 29 September 2009

Should Our Business Tweet in the Cloud?

The jury’s still out for most businesses on whether significant effort should be invested in “tweeting” information via Twitter – but the evidence is in on whether to simply sign up and prepare to do so.

Your company’s online popularity, reputation and, ultimately, success are derived from two core things – what you say, and what others say about you. Let’s call these communication elements “attributed source information” (ASI): “attributed” in that there is a known source (though it may be an anonymous ID), “source” in that it’s the first sincere representation of the communication or concept actually published online, and “information” in that it’s not just data or graphic fragments – it’s an actual message or concept with enough context to drive interest.

If it’s non-attributed, non-source or just bits n’ pieces (i.e. not “information”), there’s not much you can do about it as “evidence” – but, just like any good lawyer or PR consultant, you can shape, evolve, dispute, share or otherwise react to the material to meet any of your agendas.

Back to Twitter (or any other social media channel), it’s a simple prospect – the more ASI you post about yourself, the more you’re likely to get posted about you. It’s common practice these days, and basic SEO “block and tackling”, to post as many “billboards” about your company as possible around reputable Internet sites and directories, with short marketing messages and direct backlinks. In fact, most businesses should take every opportunity to create a standard billboard, profile or directory entry among all popular social platforms that allow it – this also helps preserve and protect the core “brand”.

Therefore, most companies should establish an “unprotected” (i.e. publicly viewable) Twitter ID, create a basic profile, and build a minimally acceptable, germane and objective list of followers and regular corporate news updates (ASI). Re-tweets, requests to follow, and engagement with you online will be a typical “community-managed” affair, and will likely be kept to a minimum with few “incidents”. This is truly no different from online press releases, many of which these days syndicate the press release ASI across multiple social media channels, including Twitter.

Some may see this sort of bland, generic Twitter land-grab as contrary to the “spirit” of the medium, and therefore a “poser” action – but Twitter use is ubiquitous enough now in the business and marketing community where this may no longer be the majority vocal opinion. It simply must be done, and is expected. All professional Internet Marketing and Social Media Consultants should recommend this.

Beyond this generic use of Twitter is where the potholes are. If your company is prepared to engage, across multiple agendas and subjects, with the online community that WILL develop as you post additional ASI – then you’ll need to develop your policies and procedures for Twitter, as a part of your broader “public discourse” strategy and risk-management framework. These include addressing what you post (i.e. who the authors will be, the backlink strategy, the recurrence and subject-area focus, the degree of personalization, etc.), and how you proactively and/or reactively deal with ASI that others post, in response to your own. Apply some method to the madness.

Otherwise, don’t count on a lot of ROI from your Twitter account, though there are simple benefits from just being at the game. Also don’t be too worried that your Twitter presence, albeit somewhat passive and conformist, will create any significant PR issues. Don’t be surprised, though, if your competitors end up driving and shaping the online conversation (and collecting the customers) with their own risk-managed ASI, in your absence.

Original Article -

Monday, 28 September 2009

The new wave in cloud computing (from the Hindu newspaper, India)

CHENNAI: Cloud computing is the latest buzzword. But small-time business operators are wondering if it will be useful and affordable. For them, the answer is a “software as a service” (SaaS) delivery model built into cloud computing.

It is a relatively new model of software deployment in which an application is hosted as a service and delivered to customers across the Internet. This model eliminates the need to install and run the application on the customer’s own computers, along with the burden of maintaining the software, operations and support.

While the first revolution of the Internet saw the three-tier model emerge as a general architecture, virtualisation in clouds has created a new set of layers such as applications, services and infrastructure. These layers do not just encapsulate on-demand resources, they also define a new application development model.
There are three layers — SaaS; the middle layer, known as platform as a service (PaaS); and the lowest layer, infrastructure as a service (IaaS). is a good example of SaaS. There are many others, including Google Apps, offering basic business services such as e-mail. While cloud computing is considered a boon to large and mid-sized players, the neighbourhood mom and pop stores are waiting for a technology or solution that can help them run their business cost-effectively and efficiently. Business Intelligence (BI) software is making inroads into the small business area, and new technologies like SaaS appear perfect for smaller firms.

According to Niraj Jaipuria, founder of BI Retail Ltd, these management software solutions help retail outlets, providing information and data on bill collection, customer trends, inventory management, product display and sales patterns.

New technologies such as SaaS help small firms save money by having a ‘host’ somewhere in cyberspace rather than running everything on the firm’s own premises. The inherent ability of SaaS to reduce price points to a fraction of traditional costs is what is working to its advantage.

The use of SaaS can also reduce the upfront expenses on software purchase, through less costly on-demand pricing. End-users may thus reduce their investments in servers and other hardware, says Mr. Jaipuria.

HR solutions
Human resource solutions created by Adrenalin eSystems Ltd are also marketed through the SaaS delivery model in cloud computing. According to its chief executive officer and managing director, the HR department in every company is expected to take on the role of process owner on behalf of the leader and facilitate this across the organisation, yet it is also tied down to mundane administrative activities. The key challenge for the HR head, then, is to use human resource solutions to help the organisation meet both demands.

Such a solution for a small and medium enterprise (SME) could prove expensive. But implementing it through the SaaS delivery model could be cost-effective. Today, most of the mid and large-sized companies are using HR solutions through SaaS, says Mr. Ganesh.

Original Article -

Sunday, 27 September 2009

Has Virtualization Solved the Data Center Crisis?

Private clouds deliver several of the benefits of cloud computing within the enterprise

Over the past several years, many IT departments have committed to virtualization as an antidote to the spiraling costs and inflexibility plaguing corporate data centers everywhere.

By running applications on virtual servers and consolidating underutilized hardware, data centers can get maximum value from their equipment. Virtualization also makes IT more responsive to the needs of the business: rather than spending weeks or months to provision a physical server, a virtual server can be launched in minutes.

Virtualization was meant to be the solution to today's data center woes - but is it? While it brings much-needed flexibility and efficiency to an environment where these qualities were sorely lacking, virtualization alone doesn't cure the underlying problem and in some ways adds to it.
Companies still have large data center infrastructure footprints to maintain, plus virtualization licenses, plus management issues introduced by virtualization - ironically adding cost as they try to reduce cost. Many IT managers report that the technical and management challenges associated with virtualization are hindering them from realizing its full cost benefits. They're still paying huge energy bills (those consolidated servers are working much harder than previously). They're still running out of capacity and need to keep buying more servers and storage. And over half of them are still building new data centers at enormous cost.

We're Not Done Yet
But virtualization is one step toward a larger goal, not the end of the journey. IT is in the middle of a fundamental transition from the rigid, siloed world of traditional data centers toward a more elastic, responsive model where needs are met far faster and more efficiently. And we're not done yet. While virtualization helps companies reduce cost and improve agility, the full promise of the new model plays out with the addition of cloud computing, delivering infrastructure on demand as an easily-accessible, cost-effective service.

Rather than perpetuating a bloated data center, the new model will allow companies to get out of the computing infrastructure business where appropriate, retaining only the portion that is essential to the enterprise. As the cloud environment becomes increasingly agile and secure, provisioning decisions will be framed by asking: should we really be doing this ourselves, or can someone else do it better and at lower cost? The majority of companies surveyed that are either using the public cloud or actively planning to run at least some applications there have already started asking themselves that question.

Some companies - particularly larger enterprises with the skills and scale to do it effectively - are building on their virtualized environments to create private, or internal, clouds that deliver several of the benefits of cloud computing within the enterprise.

Private clouds provide users with an elastic computing resource on demand and help make better, more efficient use of existing capacity. But IT departments still face many of the same fundamental challenges - they still need to buy, manage and grow the data center infrastructure on which the private cloud depends. As Gartner Group's Tom Bittman points out, for most enterprises the private cloud is not the ultimate goal; it's another stepping stone toward the public cloud as suitable services become available.

It's All About the Application
The real issue is determining where each application truly belongs. Some apps are simply not suitable for any cloud, while others, at least for the foreseeable future, belong in the private cloud. Some applications are candidates for the public cloud, but the appropriate services aren't ready yet. And some data center applications could be moved to a public cloud now or in the very near future.

While virtualization is a key step toward moving beyond the rigid data center, cloud computing takes you all the way there - which is why it's getting so much attention. With new technology from CloudSwitch under development, it may work for your enterprise faster than you think. Stay tuned.

Original Article -

Friday, 25 September 2009

Facts People Don’t Know About SaaS

No need to worry about disaster recovery

Some people in the technology field understand and appreciate that the more a company runs its business in a SaaS environment, the less it has to worry about disaster recovery. Many SaaS providers run in data centers in a variety of places, including outside the United States. As a result, businesses often end up with distributed data operations for their solutions - no single center - spread across multiple solution providers. For example, a company’s ERP system could come from one SaaS provider in Virginia, while all of that same company’s mail could come from Google - a SaaS provider - in Seattle.

SaaS companies emphasize security procedures

SaaS companies will spend an extraordinary amount of money on security and privacy while a typical business will not spend as much. That said: “A SaaS company can actually provide you with better security and privacy than your own operations in a data center,” says Cakebread. “Most SaaS companies spend proportionally more dollars on security, privacy and prevention than a typical business would today because they have leveraged their customer base and their investments.”
SaaS lets you run your business everywhere

With the availability of broadband, WiFi and netbooks anytime, running a business from any location is now a reality. As airplanes begin to introduce wireless internet early next year, an individual can run a business from his or her coach seat, target specific applications, and save information on the Internet.
Multi-tenant efficiency

On the technical side, computer experts need to understand and appreciate the difference between a multi-tenant architecture and the typical model of keeping each customer's data with its own dedicated provider. With SaaS, multiple customers share the same technology, which results in a faster turnaround. SaaS technology allows for a sharing model in a multi-tenant environment - it is simply a matter of setting “switches” to determine who can see what data.
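Those visibility “switches” can be pictured as a tenant ID plus a sharing flag on every record. A toy Python sketch (the schema and names are invented for illustration; real multi-tenant systems enforce this in the data layer):

```python
# Toy multi-tenant data store: every record carries its owner's tenant_id,
# and a "shared" switch controls whether other tenants may see it.
records = [
    {"tenant_id": "acme",   "data": "Q3 sales report",   "shared": False},
    {"tenant_id": "globex", "data": "public price list", "shared": True},
]

def visible_to(tenant: str) -> list:
    """Return the rows this tenant owns, plus rows flagged as shared."""
    return [r for r in records if r["tenant_id"] == tenant or r["shared"]]

print([r["data"] for r in visible_to("acme")])
# ['Q3 sales report', 'public price list']
print([r["data"] for r in visible_to("globex")])
# ['public price list']
```

Every tenant runs against the same code and the same store; only the switches differ - which is exactly what makes the single shared upgrade possible.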

For almost twenty years, technicians have had to track down every computer at a company in order to upgrade its applications and distribute new versions. In a SaaS environment, particularly a multi-tenant architecture, only one upgrade is needed - and generally it can be accomplished over a weekend. This dramatic change comes from the fact that the solutions are delivered via the Internet.

SaaS Providers Today

According to Cakebread, Google has a development platform - resulting in numerous providers with Google solutions. Meanwhile, has a business solution platform called, with 60,000 subscribers. “What you’re starting to see is third parties developing on those platforms in a SaaS environment,” notes Cakebread.

Most software that exists today was developed in the late 70s and early 80s. Back then, the Internet, broadband and wireless did not exist. The architecture and technologies - and billions of lines of code - will not be converted to a SaaS environment. “This becomes a cash flow issue for most traditional software companies. It will be a rare software company that will deliver you a SaaS platform,” Cakebread adds.

Typically, a SaaS provider will upgrade or enhance their solutions every three to four months. “That’s something unheard of in traditional software, which is typically two to three years,” he says.

Challenges for SaaS

SaaS is a new “programming language” that IT people have to learn. Although it is the latest generation, it is not hard to learn. “Programs can be developed in SaaS very quickly,” notes Cakebread. “However, if you’re a third party developer, you have to create a business and what hasn’t changed is brand building, marketing and sales.” As Cakebread points out, if you create a product in SaaS, you still need to create a business that delivers that solution to your customers, and it could take five to seven years to do that.

One of the problems that SaaS has not solved yet is the ability to quickly integrate applications. If SaaS must be integrated with ten to twenty legacy solutions, it will take the IT department a long time to complete the integration. It is better to be point-specific and integrate over time, rather than trying to do every application at once.

Thursday, 24 September 2009

Making Cloud Computing Ridiculously Easy

With all the hullabaloo about cloud computing, it is easy to get caught up in the trend of the day and miss the big picture. The big picture is that cloud computing disrupts the data center world by slashing the capital and skills required to deploy a web application.

If that is the big prize, then most of what passes for news in cloud computing is more along the lines of "me speak cloud too."

Today, cloud development and deployment is still the exclusive domain of highly paid web experts and just as highly paid hosting providers and systems administrators. As much as cloud providers like Amazon and Rackspace have done to simplify web hosting and eliminate people from the equation, it still takes far too much expertise and effort to get applications built and deployed in the cloud.

The goal of cloud computing is to make web development and deployment something that any bum can do and charge to their credit card with nary a care in the world.

With all due humility, I think RightScale and WaveMaker have taken a big step towards that goal this week, introducing an easy-to-use cloud development platform with one-click deployment to Amazon EC2 via RightScale.

It is now monkeys-on-keyboards easy to create a web application and deploy it in a secure, scalable cloud environment using WaveMaker/RightScale and Amazon.

So who cares about this stuff anyway? How 'bout IBM and Amazon!

On Thursday, October 1, IBM and Amazon are hosting a half-day webinar entitled Cloud computing for developers: Hosted by IBM and Amazon Web Services. At that webinar, WaveMaker and RightScale will provide an online demonstration of building a web application with WaveMaker and deploying it to a WebSphere AMI using RightScale. One small click for man, one giant cloud for mankind!

Wednesday, 23 September 2009

SaaS: Revolutionizing the Way You Do Business

What can SaaS (Software as a Service) offer your organization? Review the countless benefits offered by this revolutionary software deployment model to determine if SaaS merits a test-drive.

As you may know, Software as a Service (SaaS) is a method for delivering software applications to customers over the Internet. However, since SaaS solutions have only been available for approximately seven years, many information technology (IT) people - as well as company managers and owners - still know very little about it. Therefore, two key issues will determine whether SaaS grows in popularity in the near future: the education and training of IT people, and the education of people in the business world.

In 2000, the United States Census Bureau noted that out of two million provider businesses, approximately 100,000 of these companies were utilizing SaaS. Therefore, the growth potential for SaaS providers is huge as people in the technology field realize that there are alternatives to the technology they have been using for the last twenty years.

Read more about the countless benefits offered by this revolutionary software deployment model here: SaaS: Revolutionizing the Way You Do Business. 

Tuesday, 22 September 2009

Doyenz ShadowCloud Promises SMBs Business Continuity

Doyenz has started calling its patent-pending managed service platform, now on rev 3, ShadowCloud - a name that makes a lot of sense when you think about it, since its first-mover strengths are active disaster recovery, failover and migrations in the cloud.

It replicates a virtual copy of the user’s production infrastructure in the cloud, letting service providers leverage virtualization and automation to deliver enterprise-grade IT capabilities to their SMB clients at predictable SaaS prices they can afford. For most SMBs, redundant data centers are out of the question and traditional DR takes days to restore.

Doyenz provides an agent-based production-ready image that can be run on-demand in the cloud in case of – eek! – a failure.

Service providers can also use the latest copy of the production image at any time to test, patch and upgrade servers in the Doyenz Production-Ready Virtual Lab and then provision, deploy or just plain run them in the cloud.

With its 3.0 version, the Doyenz platform now supports both physical and virtual IT environments, including VMware vSphere; Hyper-V support is on the way. The company claims ROI can be measured in hours or days instead of weeks or months, and complexity is low to non-existent.

Original Article -

Monday, 21 September 2009

One Billion Mobile Cloud Computing Subscribers!!

Yes. That's what I said! A recent EDL Consulting article cites the rising popularity of smartphones and other advanced mobile devices as the driving force behind a skyrocketing mobile cloud computing market.
According to ABI Research, there were 42.8 million mobile cloud computing subscribers worldwide in 2008, representing 1.1 percent of all mobile subscribers. The predicted 2014 figure of 998 million will represent almost 19 percent of all mobile subscribers. ABI also predicts that business productivity applications - including collaborative document sharing, scheduling, and sales force automation - will take the lead among mobile cloud computing applications.
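As a quick sanity check on those figures, the counts and percentages imply a total mobile subscriber base of roughly four billion in 2008 growing past five billion by 2014 - a back-of-the-envelope calculation, not part of the ABI report:

```python
# Back out the total mobile subscriber base implied by ABI's figures.
cloud_2008, share_2008 = 42.8e6, 0.011  # 42.8M subscribers = 1.1% of total
cloud_2014, share_2014 = 998e6, 0.19    # 998M subscribers ~= 19% of total

total_2008 = cloud_2008 / share_2008    # implied total mobile subscribers, 2008
total_2014 = cloud_2014 / share_2014    # implied total mobile subscribers, 2014

print(f"{total_2008/1e9:.1f}B (2008) -> {total_2014/1e9:.1f}B (2014)")
# 3.9B (2008) -> 5.3B (2014)
```

Both implied totals are consistent with published estimates of worldwide mobile subscriptions for those years, so the two figures at least hang together.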

"The major platform-as-a-service providers -, Google and Amazon - are expected to start "aggressively" marketing their mobile capabilities starting in 2010. An earlier study from ABI Research reported that mobile cloud computing will generate annual revenues of more than $20 billion by 2014."
With that recent bit of news, let's look back at an August Government Computer News article:
  • The U.S. Postal Service has equipped nearly 9,500 senior and operational managers with BlackBerrys, giving them access to real-time information and alerts they need to make decisions about services USPS provides to the public.
  • The Census Bureau has deployed the Microsoft Windows Mobile operating system on 140,000 handheld personal digital assistants that census workers use during the decennial head count.
  • Users of Army Knowledge Online (AKO) will be able to access and send sensitive information through a secure mobile platform using Windows Mobile devices.
  • President Barack Obama, for whom security is paramount, kept his BlackBerry after taking office - but not before it was locked down with strong encryption and security provisions to protect e-mail and communications with his inner circle.
"One has to wonder as application architectures adjust to cloud computing, how much longer they are going to be tightly coupled to data center architectures. At what point will it no longer be advantageous for application owners to define infrastructure in terms of servers, storage, and security devices?"
The timing of secure cloud computing technologies and secure mobile devices couldn't have been better.

Original Article -

Friday, 18 September 2009

Is Privacy in The Cloud an Illusion?

With today's announcement that Adobe is acquiring Omniture, one of the largest web analytics firms, something occurred to me. This is probably obvious to most, but the move to cloud-based services has some big potential ramifications for privacy, and real risks of unknowingly agreeing to have your private information data mined. The Adobe-Omniture deal makes this very apparent. More than ever, your privacy is becoming an illusion.

Like it or not, when it comes to terms of use and privacy policies, most people don't read them. I'm a bit of a nut about reading them, but I fear I'm in a minority. I also realize that with free or low-cost cloud services, you are typically asked to give up some privacy in return for the service; after all, nothing is truly free. So the question becomes: how much personal information should I be prepared to give away, whether for a free cloud service or even a paid one? What is an acceptable amount? None, or some?

For some applications, such as this blog or social networking applications, giving away some personal information doesn't appear to be a problem for most - think of a targeted ad on Facebook or a Google search. But what about greyer areas such as cloud-based document and content management, email, CRM or other business applications? There, the idea of unknowingly giving away personal information becomes horrifying. Making it worse, you may never even realize it's happening. And who can blame you for being scared? Nobody has time to read 50-page terms of use and various click-wrapped user agreements.

So my question: is the Omniture deal just the tip of the privacy iceberg?

Thursday, 17 September 2009

Cloud security through control vs. ownership

Cloud computing makes auditors cringe. It's something we hear consistently from enterprise customers: it was hard enough to make virtualization "palatable" to auditors; cloud is going to be even harder. By breaking the links between hardware and software, virtualization liberates workloads from the physical constraints of a single machine. Cloud takes that a step further, making the physical location irrelevant and even obscure.

Traditionally, control of information flows directly from ownership of the underlying platform. In the traditional security model, location implies ownership, which in turn implies control: you build the layers of trust with the root of trust anchored to a specific piece of hardware. Virtualization breaks the link between location and application. Cloud (at least "public cloud") further breaks the link between ownership and control.

As we've examined in many previous columns, we are rapidly moving from a location-centric security model to a more identity- and data-centric model. The unstoppable forces of ubiquitous connectivity and mobility have broken the location-centric security model and perimeter strategy, and left us searching for a better model for security. In the process, certain fundamental assumptions have also changed. When security is location-centric, location, ownership and control are aligned. The logical security model coincides with the physical one, and a perimeter separates trusted (owned, local) from untrusted (other, remote). As we move beyond this model, we have to examine the links between location, ownership and control.

Control of information does not in fact depend on total ownership or a fixed location. An easy example is public key encryption: I maintain ownership of a private key and I control access to it. Usually the private key is stored in a secure location, but from ownership of the key alone I can exert control over the information without having to own the rest of the infrastructure - I can build a trusted VPN over an untrusted infrastructure.
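The same control-without-ownership idea can be sketched in a few lines of Python. Here a keyed hash (HMAC) stands in for the public-key example: the key never leaves my site, so I can detect tampering by an untrusted provider without owning any of its infrastructure. This is an illustrative stand-in, not the public-key scheme itself, and a real deployment would also encrypt the payload:

```python
import hashlib
import hmac

# The key stays under my ownership; the untrusted cloud stores data + tag.
KEY = b"my-private-key-never-leaves-my-site"

def seal(data: bytes) -> str:
    """Compute an authentication tag to store alongside the data."""
    return hmac.new(KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, tag: str) -> bool:
    """On retrieval, check the tag before trusting what the provider returns."""
    return hmac.compare_digest(tag, hmac.new(KEY, data, hashlib.sha256).hexdigest())

tag = seal(b"quarterly results")
print(verify(b"quarterly results", tag))  # True: data intact
print(verify(b"tampered results", tag))   # False: modified outside my control
```

The provider holds the bytes; I hold the key - which is exactly the separation of ownership from control that the column describes.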

The key here is that public cloud computing requires us to exert control without ownership of the infrastructure. We can exert control and secure the information through a combination of encryption, contracts with service-level agreements and by (contractually) imposing minimum security standards on the providers. If those are in place, then there is no inherent reason why a cloud computing environment cannot be made secure and compliant. We do not need to own the assets in order to enforce security, any more than we need to own the Internet in order to trust a VPN.


Five Reasons to Choose a Private Cloud

The number of solutions in both the public and private cloud spaces is increasing every day

As enterprise interest in cloud computing offerings and concepts continues to increase, the number of solutions in both the public and private cloud spaces increases as well. There's been much debate over public versus private cloud, even to the point of debating whether there can be such a thing as a private cloud. I'm not here to debate the latter (in my opinion the location of the service has nothing to do with whether or not it is a cloud), but rather I want to take a look into why consumers would choose private clouds over their public counterparts.

During the last several months, I've been lucky enough to chat with numerous enterprise users about their thoughts on cloud computing. Quite a few of these users are already utilizing the cloud on some level, and they are doing this by leveraging both public and private clouds. In my experience, enterprises have been more comfortable leveraging the private cloud initially and using that as a stepping stone to jump into the public realm when and where it makes sense.

For enterprise users that have chosen the private cloud approach, I always try to understand their reasoning behind such an approach. Over time, it has become pretty clear to me that there is a common set of requirements and reasons users choose a private cloud. Here are the top five most common reasons I hear for implementing private clouds:

  1. Security and privacy: This is probably the most obvious and prevalent reason, regardless of the argument that public clouds can be more secure than on-premise data centers. Perception is reality, and sometimes users are very wary of putting their confidential data and applications in a cloud they don't necessarily control.
  2. Customization: Users who require highly tailored, customized environments are likely to go the private cloud route simply because customizations are easier to achieve. Oftentimes, public cloud solutions focus on offering the most common services. While good for a large majority of users, that approach doesn't suit all needs.
  3. Tight operational control: Some enterprise users desire complete control over the behavior of their cloud-based services. For instance, public cloud providers may not guarantee the necessary level of performance for the network, infrastructure, and application services they provide.
  4. Governance: In my observations, the perception of many public cloud offerings is that they sometimes lack capabilities that enable proper governance. Users expect to be able to apply many of the same governance concepts from SOA to their use of cloud. In some cases, this desire is strong enough to make them shun public clouds in favor of implementing a governance model around their own private cloud.
  5. Existing investment: In some situations it comes down to the very simple fact that a company has significant existing investment in its IT infrastructure. If a relatively inexpensive solution can be implemented to help them better leverage this infrastructure via a private cloud, they are more likely to embrace that than to turn their back on the investment in favor of an entirely new approach via the public cloud.
I think there's definitely a time and place for both public and private clouds. In my opinion it is unwise to dismiss private clouds out of hand, and I think it equally unwise to think enterprises cannot make use of public clouds. I'd like to hear what you think too. If you are leveraging cloud services, are they public or private, and why did you choose that particular approach?


Wednesday, 16 September 2009

DuPont Sues Employee for Insider Theft

Many of us think about protecting our data against the strangers of the world who might be trying to find a way to use our information to their benefit. It can be surprising, therefore, when the breach occurs within our company (or circle of friends, family, etc). Unfortunately, DuPont is learning that insider theft is becoming more and more common.

The industrial manufacturing company discovered that one of their employees, a senior research chemist, transferred confidential files containing trade secrets from his company-issued laptop to an external hard drive.

Immediately, I couldn't help but wonder why DuPont wouldn't have some sort of alert in place in case someone tried to attach a hard drive to company computers. I was further baffled when I learned that this isn't the first time they've been through this.

After 10 years with DuPont, an employee gathered information from thousands of documents and scientific abstracts. His mission? To sell the information to rival company, Victrex. He ended up being sentenced to 18 months of jail time.


Open Group Focuses on Cloud Security

Standards and open access are increasingly important to users of cloud-based services. Yet security and control also remain top-of-mind for enterprises. How to make the two -- cloud and security -- work in harmony?

The Open Group is leading some of the top efforts to make cloud benefits apply to mission critical IT. To learn more about the venerable group's efforts I recently interviewed Allen Brown, president and CEO of The Open Group. We met at the global organization's 23rd Enterprise Architecture Practitioners Conference in Toronto.

Brown: We started off in a situation where organizations recognized that they needed to break down the boundaries between their organizations. They're now finding that they need to continue that, and that investing in enterprise architecture (EA) is a solid investment in the future. You're not going to stop that just because there is a downturn.

In fact, some of our members who I've been speaking to see EA as critical to ready their organization for coming out of this economic downturn.

... We're seeing the merger of the need for EA with security. We've got a number of security initiatives in areas of architecture, compliance, audit, risk management, trust, and so on. But the key is bringing those two things together, because we're seeing a lot of evidence that there are more concerns about security.

... IT security continues to be a problem area for enterprise IT organizations. It's an area where our members have asked us to focus more. Besides the obvious issues, the move to cloud does introduce some more security concerns, especially for the large organizations, and it continues to be seen as an obstacle.

On the vendor side, the cloud community recognizes they've got to get security, compliance, risk, and audit sorted out. That's the sort of thing our Security Forum will be working on. That provides more opportunity on the vendor side for cloud services.

... We've always had this challenge of how do we break down the silos in the IT function. As we're moving towards areas like cloud, we're starting to see some federation of the way in which the IT infrastructure is assembled.

As far as the information, wherever it is, and what parts of it are as a service, you've still got to be able to integrate it, pull it together, and have it in a coherent manner. You’ve got to be able to deliver it not as data, but as information to those cross-functional groups -- those groups within your organization that may be partnering with their business partners. You've got to deliver that as information.

The whole concept of Boundaryless Information Flow, we found, was even more relevant in the world of cloud computing. I believe that cloud is part of an extension of the way that we're going to break down these stovepipes and silos in the IT infrastructure and enable Boundaryless Information Flow to extend.

One of the things that we found internally, in moving from the business-side view of our architecture that the stakeholders understand to one that the developers can understand, is that you absolutely need someone with the skill to do that translation. You can deliver to the business guys what it is you're doing in ways that they understand, but you can also interpret it for the technical guys in ways that they can understand.

As this gets more complex, we've got to have the equivalent of city-plan type architects, we've got to have building regulation type architects, and we've got to have the actual solution architect.

... We've come full circle. Now there are concerns about portability around the cloud platform opportunities. It's too early to know how deep the concern is and what the challenges are, but obviously it's something that we're well used to -- looking at how we adopt, adapt, and integrate standards in that area, and how we would look for establishing the best practices.


Tuesday, 15 September 2009

Virtual appliances cure appliance bloat

Over the past few years hardware appliances have become the preferred form factor for deploying specialized IT solutions. Appliances are preconfigured, easy to deploy and simple to manage, and usually offer a compact form factor.

However, the rapid adoption of appliances, particularly for security solutions, has led to appliance bloat -- racks of multicolored boxes, each performing a specialized function. As enterprises deploy more appliances, the ease-of-use and management benefits that make security appliances popular are at risk of being overwhelmed by the complexity and cost involved with managing a large number of point solutions. 

One solution is to use virtual appliances. They let enterprises install hardware-free appliances on existing virtualized server infrastructure. Virtualization technology, such as that of VMware, transforms a mix of industry-standard x86 servers and their associated processors, memory, disk and network components into a pool of logical computing resources that can be dynamically allocated to different virtual machines (each of which might be running an entirely different operating system, applications and services).

Commercial virtual appliances are just starting to appear. VMware's virtual appliance marketplace lists commercial virtual appliances certified to meet VMware's production-ready criteria.

Many of today's hardware appliances are based on standard x86 server hardware running a customized operating system and specialized applications that lend themselves well to transformation into a virtual appliance. Everything about a virtual appliance -- operating system, application, user interface -- is the same as with a physical appliance, except that it requires no dedicated, physical infrastructure.

The benefits are the same as those realized by traditional server virtualization -- server and storage capacity can be increased without investing in additional hardware -- but virtual appliances can also take advantage of the data center's virtualized failover, backup, change management and disaster recovery features, generating further efficiencies. In addition, virtual servers can be deployed for scalability or redundancy purposes on an as-needed basis at zero incremental cost.

Virtual appliances offer several other benefits as well:

* Streamlined product evaluations: Instead of waiting to get hardware in-house for evaluation, you can download a trial virtual appliance and begin using it in a matter of hours, without having to interact with the vendor or reseller.

* Simpler, more powerful lab environments: Virtual appliances make it easy to set up multiple appliances on a single server for testing purposes. You can try new products and modules, test configuration changes and evaluate different server configurations. You also can take a snapshot of your production environment and run it in a lab environment. Applying patches and upgrades can be performed at low cost in the lab environment on an identical snapshot of your production system.

* Lower capital expenditures: Virtual appliances can save thousands of dollars on initial purchase price vs. hardware appliances, and thousands more by utilizing existing data center failover and disaster recovery resources. But make sure your virtual appliance vendor has a virtualization-friendly pricing model that is tied to number of users, for example, rather than number of CPUs (which makes little sense when you can instantiate any number of virtual appliances dynamically).

* Increased performance and agility: Though there may be a small performance hit in running a virtual appliance on hardware that is otherwise identical to a vendor's hardware appliance platform, most servers used in virtual environments are more powerful than a single special-purpose appliance. So in practice, virtual appliances can offer superior performance in a cost-effective way.

While hardware appliances will remain the most popular deployment method for security applications in the near term, it's not a stretch to say that virtual appliances, coupled with commodity hardware, eventually will overtake customized, multifunction appliances. This will happen rapidly at enterprises with aggressive virtualization strategies, in which the significant cost savings, coupled with the benefits of using superior technology, will outweigh any perceived performance advantages of appliances, even those built on custom hardware.

Monday, 14 September 2009

What Is an OpenCloud API?

When it comes to defining Cloud Computing I typically take the stance of "I know it when I see it". Although I'm half joking, being able to spot an Internet-centric platform or infrastructure is fairly self-evident for the most part. But when it comes to an "OpenCloud API" things get a little more difficult.

Lately it seems that everyone is releasing their own "OpenCloud APIs"; companies like GoGrid and Sun Microsystems were among the first to embrace this approach, offering their APIs under friendly, open Creative Commons licenses. The key aspect in most of these CC-licensed APIs is the requirement that attribution is given to the original author or company. Personally I would argue that a CC license isn't completely open because of this attribution requirement, but at the end of the day it's probably open enough.

For some background, a Cloud API defines the vocabulary / taxonomy that a programmer needs to employ when using a particular set of cloud services. To be clear, an API is only the documents and processes for the usage of a certain set of cloud services; without these there is no API. In a recent Twitter post Simon Wardley said it this way: "Companies can't own APIs, only the documentation, tools and processes (which are generic to the API)."
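As a concrete (and entirely hypothetical) illustration of such a vocabulary, here is what a provisioning call against an invented RESTful cloud API might look like; the endpoint, JSON fields and token below are made up for the example, not any vendor's real API.

```python
import json
import urllib.request

# Hypothetical request to an invented cloud provider. The URL, fields and
# auth scheme are illustrative only.
payload = json.dumps({"image": "ubuntu-9.04", "ram_mb": 512}).encode()
request = urllib.request.Request(
    "https://api.example-cloud.test/v1/servers",
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <token>",  # placeholder credential
    },
    method="POST",
)
# The request is only constructed here, not sent; the point is the shape of
# the vocabulary (resource path, verb, fields) that the API's documents define.
```

Everything a client needs to know to build that request -- the path `/v1/servers`, the verb, the field names -- is exactly the documentation, tools and processes the licensing debate is about.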

Generally there are three types of Cloud APIs.

1. Blind APIs - APIs that don't tell you their restrictions. Amazon Web Services is the best example. (Personally, I'd rather know what I can or can't do than not know at all.)

2. Closed APIs - APIs that do tell you their restrictions. Google App Engine is a good example, using a highly restrictive license. The Google terms state in section 7.2 that "you may not (and you may not permit anyone else to): (a) copy, modify, create a derivative work of, reverse engineer, decompile or otherwise attempt to extract the source code of the Google App Engine Software or any part thereof, unless this is expressly permitted", which would make something like AppScale, an open-source implementation of Google App Engine, illegal under Google's terms of use.

3. OpenCloud APIs - APIs that generally let you do whatever you want as long as you give attribution. Rackspace, GoGrid and Sun are the best examples. A major issue facing a lot of these so-called open APIs is that although you may be free to remix and use the API, you could be limited by the use of a company's name or trademark, making the attribution clause a potential minefield.

I'd also like to note that a sub-branch of OpenCloud APIs is Open Community APIs, such as the OGF's OCCI project. These community APIs apply an open-source model that allows anyone to get involved in the development and specification of a Cloud API at little or no cost.

This brings us back to the question: what exactly is an OpenCloud API?

A Cloud API that is free of restrictions, be it usage, cost or otherwise.

Cloud Computing and Open Source

Any new technology market has its own lifecycle and rhythm. From mainframes through smartphones, there are the early years, the rapid growth, some slowing down and, inevitably, a decline. Some technologies never go away completely (e.g. mainframes), while others never really get a foothold (insert your own example here).

Open source was a software movement that began as an idea and now dominates how many new software offerings are marketed and sold. Open source is not a technology, but a business and legal framework within which technology is propagated.  Still, the biggest companies in software are largely closed source – Oracle, SAP, etc.  

Nearly all specialty vertical apps (e.g. trading systems) are closed source, whereas most new development technologies, including databases and tools, are open source. Given that open source is more a legal construct which bleeds into sales and marketing, it's highly likely that open and closed source models will co-exist for any foreseeable future. Further, open source shrinks the size of the industry from a revenue perspective by default (though, paradoxically, software spending is up this year even in this economy).

What about cloud computing?  Will there still be the need for the cloud modifier in the future?  In the past most infrastructure was sold directly to the users under a capex spending model – this includes servers, databases, operating systems, etc.  Of total infrastructure spending in 20 years, how much will still be for on-premise capex, and how much for cloud opex?  Will economies of scale drive infrastructure in the cloud to a point where the infrastructure market will shrink in both real and nominal terms?

Will the purveyors of servers, networking and core infrastructure software sell 90% of their wares to cloud companies? Will what we currently call "cloud computing" be just plain "computing" in the future? Time will tell, and it will be a long time before the cloud distinction becomes superfluous, but it will be interesting to watch.


Saturday, 12 September 2009

PCI compliance and web application security: what you need to know for the upcoming policy changes

If you are a merchant that processes credit cards, then you are probably already well aware of PCI compliance, but you may not be sure how web application security fits into the picture. You may also have heard that starting in June 2008, section 6.6 of the rules for PCI compliance will go from a ‘best practice’ to a mandatory requirement (if not, it's time to pay attention!), but you might not know what this means for your business. The fact is, in a perfect world you already have in place what is necessary to be compliant with not only section 6.6, but PCI rules as a whole. This is because ideally, you would have handled your web application security practices from the start, as the applications are built, so that you are not scrambling to add security to existing applications. Unfortunately, this is often not the case – which makes now a great time for businesses to re-evaluate their web application security processes overall.

What PCI compliance means

A bit of background regarding PCI compliance: as credit card use has become more widespread both offline and online, and as consumer concern about security has understandably grown, the credit card industry has made an effort to ensure that sensitive information is protected. To that end, in September 2006, the major credit card companies (American Express, Discover Financial Services, JCB, MasterCard Worldwide and Visa International) formed the PCI Security Standards Council (SSC) and established a set of rules for what they called PCI compliance. These rules have to be followed depending on the size of a business and the number of credit card transactions handled, and if followed properly will help protect consumers' data from theft.

The rules for PCI compliance

There are six major categories within the standards established by the PCI SSC, which are as follows:

* Build and maintain a secure network
* Protect cardholder data
* Maintain a vulnerability management program
* Implement strong access control measures
* Regularly monitor and test networks
* Maintain an information security policy.

Within these six categories are 12 requirements that address particular issues and that are directly related to web application security:

1. Install and maintain a firewall configuration to protect cardholder data
2. Do not use vendor-supplied defaults for system passwords and other security parameters
3. Protect stored cardholder data
4. Encrypt transmission of cardholder data across open, public networks
5. Use and regularly update anti-virus software
6. Develop and maintain secure systems and applications
7. Restrict access to cardholder data by business need-to-know
8. Assign a unique ID to each person with computer access
9. Restrict physical access to cardholder data
10. Track and monitor all access to network resources and cardholder data
11. Regularly test security systems and processes
12. Maintain a policy that addresses information security.

Each requirement for PCI compliance is broken up into a variety of subsections that go into detail about the process; the full list can be viewed on the PCI Security Standards Council's website. Section 6.6 - the most important subsection regarding web application security, because it is coming under scrutiny this year - states the following:

Ensure that web-facing applications are protected against known attacks by applying either of the following methods:

* Having all custom application code reviewed for common vulnerabilities by an organization that specializes in application security
* Installing an application layer firewall in front of web-facing applications.

As a result of this upcoming change, it is important for companies to have a game plan in place for web application security. Until now, companies may not have taken PCI compliance very seriously. No major fines have been levied for noncompliance so far, and the entire process may have been seen as something nonessential. But with this new change to 6.6, IT teams around the world are weighing the strengths and weaknesses of web application firewalls, code reviews and application assessment software, all of which satisfy the requirement.

What it means for your business

There are two mistakes that many organizations make related to web application security. First, many businesses and government organizations have historically focused their attention on network security rather than web application security, and it may seem that the June 2008 deadline is coming out of nowhere and that businesses will be scrambling to achieve PCI compliance. But the fact is, your business should have ensured that all of its web applications were secure from the beginning. PCI compliance shouldn't be viewed as a checklist, because then all that will happen is that unreliable fixes will be applied to problems. Instead, the concept of web application security needs to be implemented within the web application itself. When web application security is implemented properly, the PCI compliance requirements related to web application security are automatically met.

As a result, the development and QA teams at businesses need to be focused on web application security. It may be that businesses will need to take their web applications and break them down from the start, rather than trying to install patches and fixes for PCI compliance.
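One concrete example of building security into the application itself, rather than patching it on afterwards, is using parameterized queries so that user input can never be spliced into SQL. A minimal sketch with Python's built-in sqlite3 module (the table and data are invented for illustration):

```python
import sqlite3

# Toy cardholder table -- schema and data are invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cards (holder TEXT, pan_last4 TEXT)")
conn.execute("INSERT INTO cards VALUES (?, ?)", ("Alice", "4242"))

# A classic injection payload arrives as user input.
user_input = "Alice' OR '1'='1"

# The ? placeholder binds the input as data, so the payload is treated as a
# literal holder name and matches nothing.
rows = conn.execute(
    "SELECT pan_last4 FROM cards WHERE holder = ?", (user_input,)
).fetchall()
assert rows == []
```

Had the query been built by string concatenation, the same payload would have returned every row in the table; binding the value is the in-application defense that requirement 6 is driving at.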

Another section related to PCI compliance that could cause problems in the near future is requirement 11, which states that security scans must be done on a regular basis. If, instead of fixing web application security issues internally, patches have been installed as an afterthought, these scans could become nightmarish, because they will identify hundreds of issues that will need to be fixed. Better to take the time up front to build in web application security measures and avoid this problem altogether.


Businesses that process credit cards are likely to be already aware that they must be PCI compliant – but they may not have worked very hard to make sure that they are. In 2008, one of the subsections of PCI compliance will become mandatory, and businesses are going to have to evaluate their web applications very carefully. By ensuring that web application security is built from within, rather than by adding on fixes that will only work in the short term, businesses will find that not only are they compliant with one part of the PCI standards, but that they are compliant with all of them, and that their customers' data is secure across the board.


The laptop that shouts 'Stop, thief' when stolen

“Get your hands off me, I have been stolen,” the laptop shouted. That is a new solution to laptop computer theft: a program that lets owners give their property a voice when it has been taken.

The program allows users to display alerts on the missing computer's screen and even to set a spoken message. Tracking software for stolen laptops has been on the market for some time, but this is thought to be the first that allows owners to give the thief a piece of their mind.

Owners must report their laptop missing by logging on to a website, which sends a message to the machine: a red and yellow "lost or stolen" banner pops up on its screen when it is started. Under the latest version of the software, users can also set a spoken message.

The theft-busting program, Retriever, has been produced by Front Door Software Corporation, a small company based in Evergreen, Colorado. Carrie Hafeman, its chief executive, said that if owners tick a box indicating that their computer has gone, when it is started up, “a big red and yellow banner pops up, saying 'This laptop is lost or stolen'.”

The message can be set to reappear every 30 seconds, no matter how many times the thief closes it. “One customer sent a message saying, 'You are being tracked. I am right at your door',” Ms Hafeman said.

In the latest version, people can add a spoken message. The default through the computer's speakers is: “Help, this laptop is reported lost or stolen. If you are not my owner, please report me now.”

Others have introduced more direct alerts. “You can record your own. You can say, ‘Get your hands off me, you S.O.B.',” Ms Hafeman said.

The Retriever software package, which costs $29.95 (£21) but has a free trial period, has the functions of many security software programs. Owners can remotely switch to an alternative password prompt if they fear that the thief has also got hold of the log-in details.

If a thief accesses the internet with the stolen laptop, Retriever will collect information on the internet service provider in use and nearby wi-fi access points, so that the police can be alerted to its location.
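The reporting side of such tracking software can be sketched simply. The function below is a hypothetical stand-in, not Retriever's actual code: it gathers the kind of basic network identifiers a recovery beacon might phone home with.

```python
import json
import socket

def beacon_payload() -> str:
    """Collect basic network identifiers for a hypothetical recovery beacon.

    A real product would go further -- ISP details, nearby wi-fi access
    points, the external IP address -- but the shape of the report is the same.
    """
    hostname = socket.gethostname()
    try:
        ip = socket.gethostbyname(hostname)  # address on the local interface
    except socket.gaierror:
        ip = "unknown"
    return json.dumps({"hostname": hostname, "ip": ip})
```

On each internet connection, the beacon would POST this payload to the vendor's server, giving investigators a trail to follow.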

Thousands of laptops are stolen every year from homes and offices, but with the use of laptops increasing, the number stolen while their owners are out and about has been rising sharply.

Other laptop security software allows users to erase data remotely or lock down the computer.

The Undercover software from Orbicule for Macintosh laptops sends back screenshots from the stolen computer and takes pictures with the laptop's built-in camera.

The “talk” update of the Retriever program works with Microsoft Windows XP and Vista and a version for Mac users is in the pipeline, Ms Hafeman said.


Thursday, 3 September 2009

Defense-contract discs sold in African market for $40

Dumped hard drives with US defense data have turned up for open sale in a West African market. A team of Canadian journalism students bought a hard drive containing information on multi-million-dollar contracts between military contractor Northrop Grumman and the Pentagon for just $40 in a market near Accra, Ghana. The exercise was part of shooting a documentary on e-waste by Vancouver journalism students, researching what happens to the West's discarded and donated electronics.

"You'd think a security contractor that constantly deals with very secret proprietary information would probably want to wipe their drives," Blake Sifton, one of the three graduate journalism students told CBC. The team bought seven hard drives at a market in the port of Tema, a major point of entry for electronic waste from Europe and North America into Africa.

Northrop Grumman is reported to be investigating how an unencrypted hard drive containing sensitive data on the firm ended up on an African market, in violation of its established kit disposal procedures.

"Based on the documents we were shown, we believe this hard drive may have been stolen after one of our asset-disposal vendors took possession of the unit," Northrop Grumman told CBC.

Pensions Trust faces grilling over stolen laptop security lapse

The Information Commissioner has launched an investigation into the theft of a laptop containing the details of 57,000 members of social housing pension schemes.

The commissioner will attempt to find out how the details, held by the Pensions Trust, came to be put at risk and whether action is needed to prevent similar data losses.

The computer stolen on 23 March held the details of around 50,000 members of the Social Housing Pension Scheme and a further 7,000 members of the Scottish Federation Housing Association Pension Scheme.

The data, which included bank account details, National Insurance numbers and salary details, was password-protected but not encrypted. It was taken from the offices of Northgate Arinso, which supplies the trust’s computerised pensions administration system.

The commissioner launched the investigation after it was contacted about the theft by the trust on 22 May.

When you have Backstopp installed, you have protection against breaches like these.


Wednesday, 2 September 2009

Laptop security and data protection in schools

Schools, as is the case with any person or organisation handling personal data, must comply with the Data Protection Acts. Personal data collected by schools for their own purposes is the responsibility of the schools as they are the Data Controller. The website of the Data Protection Commissioner has advice and information in relation to the responsibilities of Data Controllers.

Sensitive data should not leave the school unless it is absolutely necessary. Other options for accessing school-based data, such as remotely accessing your school's network, can ensure that the data does not physically leave the network, and any information used in the manipulation of the data can be securely disposed of. These options should be considered before transferring data onto a device that will be accessed outside of school.

Laptop and Portable Data Security – Some advice and support

All schools should have a policy pertaining to the use of all portable data storage (PDS) media, e.g. laptops, USB sticks, external hard drives, PDAs, memory cards etc.

The assignment of laptops or other devices to teachers should always be recorded so that they can be easily tracked from an asset management point of view. A record of serial numbers, a list of hardware features (DVD, wireless mouse, etc.) and other information that can be used to identify the laptop or device should be compiled and retained in the school. These items should be included on the school insurance policy. If the laptop is for teachers' work-at-home use, this should also be noted in the school insurance policy. Any policy should include procedures for reporting loss or theft. It may also be useful to inform the manufacturer's technical support in case the laptop or device is sent back for repairs.
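Such a record can be kept in something as simple as a spreadsheet or CSV file. A minimal sketch follows; the column names and values are invented for illustration, not a prescribed format.

```python
import csv
import io

# Hypothetical columns for a school laptop asset register.
FIELDS = ["serial_number", "model", "features", "assigned_to", "home_use_insured"]

register = io.StringIO()  # stands in for a real file on the school server
writer = csv.DictWriter(register, fieldnames=FIELDS)
writer.writeheader()
writer.writerow({
    "serial_number": "SN-0001",        # manufacturer serial number
    "model": "ExampleBook 13",         # invented model name
    "features": "DVD; wireless mouse",
    "assigned_to": "J. Teacher",
    "home_use_insured": "yes",         # noted on the school insurance policy
})
```

One row per device, updated on assignment and return, is enough to answer both the asset-management and the insurance questions above.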

Security issues to consider

There are a number of issues and questions to consider before removing sensitive data from a school location.

* Is the need to access information outside of school justifiable?
* Has the data been backed up before it leaves the building?
* Have you reduced the risk of the device being lost or stolen?
* Have you taken steps to make it more difficult for an unauthorised person to access that device?
* Have you taken steps to make it more difficult for an unauthorised person to access the data on that device?

Security measures to implement

* All essential data leaving the office should be adequately backed up. With regard to email, it is important that a backup copy exists somewhere other than the physical laptop. Consider using web-based mail (or IMAP), or ensure a copy of mail is saved on the server.
* Where possible, the school name and contact details should be identifiable on purchased equipment to deter thieves and aid recovery. A well-secured security label on the frame of the laptop is advisable. The desktop background could contain the name and address of the school to aid its return if the laptop is lost.
* Cable locks should be used wherever possible to secure the laptop to the desk if used in areas where there is a risk of theft.
* Schools should also ensure that security features within the operating system are enabled and user profiles are protected. It is possible to set a password at the boot-up stage of the laptop to make it more difficult to access the hard drive or reinstall the operating system.
* Strong passwords should accompany any login user authentication.
* A secure password policy should be implemented, requiring staff to change their passwords on a regular basis (monthly is the industry standard).
* Use "strong" passwords consisting of letters (both uppercase and lowercase), numbers and symbols, such as w£Ty78&Q.
* Check your password strength with an online password-strength checker.
* A school should give serious thought to where sensitive data is stored and accessed. For example, if a laptop contains sensitive data, it may not be wise for that laptop to be used throughout the school by many users. It may be best kept in the principal's office.
* Where sensitive data is saved on the laptop, schools should make the effort to ensure the security of that data. It is possible to password-protect files and folders within the popular operating systems used in schools.
* There are different levels of data sensitivity. Data relating to a person’s bank details will be more sensitive than results of in school exams. The level of security should reflect the data it is protecting.
* There are also different encryption solutions available if required. The cost of these solutions ranges from free to expensive, and they offer varying strengths of encryption, making it more difficult for an unauthorised person to access the data.
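As a rough illustration of the password advice above, the sketch below (Python, standard library only; the 12-character default length and the specific character-class checks are assumptions, not a stated school policy) generates a random password and checks that a candidate mixes the recommended classes of letters, numbers and symbols:

```python
import secrets
import string

# Pool of characters matching the advice: upper, lower, digits, symbols.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def is_strong(password, min_length=8):
    """Heuristic check for the classes recommended above:
    uppercase, lowercase, a digit and a symbol, at a minimum length."""
    return (
        len(password) >= min_length
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )

def generate_password(length=12):
    """Generate a random password until it passes the strength check."""
    while True:
        pw = "".join(secrets.choice(ALPHABET) for _ in range(length))
        if is_strong(pw):
            return pw
```

Note that a character-class check like this is only a heuristic; in practice, password length contributes more to strength than symbol variety does.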


Tuesday, 1 September 2009

The Green Cloud: Hype or Reality?

The cloud, as we all know, is the hot IT topic of the moment; in Gartner's latest Hype Cycle, published last month, Cloud Computing holds the dubious honor of being tied with e-book readers at the top of the "peak of inflated expectations."
From the press release announcing the latest Hype Cycle:
The “Hype Cycle for Emerging Technologies” is the longest-running annual Hype Cycle, providing a cross-industry perspective on the technologies and trends that IT managers should consider in developing emerging-technology portfolios. This Hype Cycle features technologies that are the focus of attention in the IT industry because of particularly high levels of hype, or those that may not be broadly acknowledged but which Gartner believes have the potential for significant impact.

“Technologies at the Peak of Inflated Expectations during 2009 include cloud computing, e-books (such as from Amazon and Sony) and Internet TV (for example, Hulu), while social software and microblogging sites (such as Twitter) have tipped over the peak and will soon experience disillusionment among enterprise users,” said Jackie Fenn, vice president and Gartner Fellow.
But in the last week, I've happened upon a number of blog posts and news items talking about the most important (from my perspective, at least) element of the cloud: Is it green?

I raised this question, somewhat indirectly, in July from the standpoint of Microsoft's online-Office announcement:
Both Microsoft and Google have extremely efficient large-scale data centers; both companies are aiming for an industry-leading PUE of 1.12 in their computing centers. Expanding the use of these services means more incentive to concentrate IT operations in these top-of-the-line facilities, and will continue the shift that individuals are already undertaking toward netbooks -- cheap, web-centric laptops that forgo much of the established abilities of desktops and full-sized laptops for more portability and lower price.
We'll come back to my take on this topic, but first, the rundown. In a new article on NetworkWorld, Tom Jowitt looks at the findings of Rackspace's latest Green Survey, and finds plenty of skepticism:
over 21% of IT managers believe that cloud computing is a much greener alternative to traditional computing infrastructures, but it seems that the vast majority still remain to be convinced.

Thirty-five percent said they were not convinced on the green benefits of cloud computing, and 25 percent felt that there was too much hype around the green benefits of cloud computing. Meanwhile 19% felt that the true green benefits of cloud computing have not yet been realized.

Seven percent admitted that cloud computing was critical to their company becoming greener; 14% are currently evaluating cloud computing and its environmental benefits; 13% have considered the benefits of cloud computing as part of their overall environmental strategy; and 20% would be interested in learning more about the green benefits of cloud computing.

But a sizable portion (46%) said that cloud computing was not a part of their overall environmental strategy.
While CIOs and IT managers as a whole are still uncertain about the green benefits of the cloud, big business -- and industry experts -- see the green lining. Exhibit A: a blog post from David Talbot at MIT's Technology Review, which kicks off thusly:
Cloud computing may raise privacy and security concerns, but this growing practice -- offloading computation and storage to remote data centers run by companies such as Google, Microsoft, and Yahoo -- could have one clear advantage: far better energy efficiency, thanks to custom data centers now rising across the country.

"There are issues with property rights and confidentiality that people are working out for mass migration of data to the cloud," says Jonathan Koomey, an energy-efficiency expert and a visiting professor at Yale University. "But in terms of raw economics, there is a strong argument," he adds. "The economic benefits of cloud computing are compelling."