Friday, April 29, 2011

SETI - Shutting down for lack of funds


Remember SETI? The ambitious 'Search for Extra Terrestrial Intelligence' program was probably also the first public program of its kind to use distributed computing power.

The SETI program sought to use the power of millions of distributed home and office computers to process the tonnes of data that a set of radio telescopes, the Allen Telescope Array in Hat Creek, California, USA, collected day in and day out. Users whose systems were connected to the World Wide Web could install a small client that fetched bits of data from a remote SETI server, analysed the data when the computer's processor was idle, and sent the results back to the central server.
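That fetch-crunch-return loop can be sketched in a few lines of Python. This is a toy simulation, not the real SETI@home client: the "server" is a local list, the idle check is a stub, and the "analysis" just picks the strongest sample in each chunk, where the real client ran signal-processing passes over radio data.

```python
def fetch_work_unit(queue):
    """Ask the (simulated) SETI server for the next chunk of telescope data."""
    return queue.pop(0) if queue else None

def cpu_is_idle():
    """Stub for idle detection; a real client samples OS load and only
    crunches when the machine is otherwise unused."""
    return True

def analyse(samples):
    """Stand-in analysis: report the strongest signal in the chunk."""
    return max(samples)

def run_client(server_queue, results):
    """Main volunteer-computing loop: fetch, analyse when idle, return result."""
    while (unit := fetch_work_unit(server_queue)) is not None:
        if cpu_is_idle():
            results.append(analyse(unit))  # "send" the result back

# Simulated stream of three work units from the server
queue = [[0.1, 0.9, 0.2], [0.4, 0.3], [0.0, 0.7, 0.6]]
results = []
run_client(queue, results)
print(results)  # strongest signal per work unit
```

The design point the SETI project exploited is visible even in this toy: the work units are independent, so millions of machines can crunch in parallel with no coordination beyond the central queue.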

What's disheartening is that the program is being stalled for lack of funds. Started way back in 2007, the array of 42 radio dishes scanned deep space for signals indicative of communication from intelligent life. The SETI Institute itself was founded in 1984 with NASA funding.

The $50 million array was built by SETI and UC Berkeley, with a $30 million donation from Microsoft co-founder Paul Allen being the biggest chunk of the money. Operating the dishes costs about $1.5 million a year, mostly to pay the staff of eight to ten researchers and technicians who run the facility. An additional $1 million a year is needed to collect and sift the data from the dishes.
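Put together, the running costs quoted above come to a modest sum next to the build cost:

```python
# Figures from the post, per year
operations = 1_500_000     # staff of eight to ten researchers and technicians
data_pipeline = 1_000_000  # collecting and sifting the data from the dishes
annual_cost = operations + data_pipeline
print(f"${annual_cost:,} a year to keep the array listening")
```

That is $2.5 million a year against a $50 million array, which makes the shutdown all the more frustrating.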

If you enjoyed reading this post, Subscribe to the feed here ...And never miss a post

Thursday, April 28, 2011

Energy Consumption by Cloud Service Providers

An interesting statistic shared by Engadget shows data center power consumption across the major cloud service providers.

Facebook, Google and Microsoft (all services included) account for 26.38% of the energy consumption. What's interesting is that BoA (Bank of America) also registers as a high energy consumer. This is surprising, and I would definitely try to find out what makes them a data center energy guzzler!


Wednesday, April 27, 2011

Not all apps are meant for the cloud

Phil Wainewright of ZDNet today has a nice article on "7 things to learn from the Amazon outage". One of the key points he makes is that this should be a clarion call for enterprises to revisit their strategy on which of their applications should be deployed on the cloud.

If you put something on the cloud, you are implicitly accepting that you do not retain full control. A failure is one of the many things that can go wrong, with possibly nothing in your power to set things right.

A colleague of mine narrated another incident in which a medical application hosted on the EC2 cloud was affected by the outage. The owner in question raised tickets with the AWS folks - no response. Out of sheer desperation, they then posted their complaint on the AWS forum. Instead of help, what they got were dollops of advice from other people on the forum questioning their strategy of deploying a mission-critical medical application on the cloud!

I am narrating this here in light of how mere cost savings and on-demand availability cannot be the justifications for embracing the cloud. Yet another article elsewhere highlights the fact that most cloud providers aim to provide discrete services with as little support as possible, chiefly because they operate on razor-thin margins. Volumes count, not individual attention to customers.

Remember, the cloud is not the panacea for all your infrastructure woes :-)


Tuesday, April 26, 2011

Traditional Hosting versus Cloud Hosting - 3 Differences

Much too often, the question of what the difference is between hosting a website on a remote server and hosting it on a cloud crops up during passionate discussions on cloud computing. Truly geeky friends of mine often forget the fundamental differences between cloud hosting and non-cloud hosting and start wondering if cloud computing is just a fad invented to pull the wool over unsuspecting enterprises. To all those friends of mine, and to those of you who really are puzzled over the differences, here's a beautiful video that settles the matter once and for all. You can bookmark this URL for subsequent occasions...;-)
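One difference that usually surfaces in these discussions is fixed capacity versus elastic, pay-per-use billing. Here's a toy cost model for a site with spiky traffic; the $100/server/month and $0.15/instance-hour rates are made up purely for illustration, not anyone's real pricing:

```python
def traditional_cost(months, servers, per_server_month=100):
    """Fixed capacity: you pay for every server all month, busy or not."""
    return months * servers * per_server_month

def cloud_cost(instance_hours, per_hour=0.15):
    """Elastic: you pay only for the instance-hours you actually consume."""
    return instance_hours * per_hour

# A site that needs 4 servers during a 60-hour monthly peak
# and only 1 server the rest of a ~720-hour month:
peak_hours, base_hours = 60, 720 - 60
trad = traditional_cost(months=1, servers=4)          # provisioned for peak
cloud = cloud_cost(peak_hours * 4 + base_hours * 1)   # scales with demand
print(trad, round(cloud, 2))
```

The traditional host pays for peak capacity around the clock; the cloud tenant pays for the shape of their actual demand. That elasticity, more than the remoteness of the server, is the heart of the difference.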



Monday, April 25, 2011

SaaS - Biggest Cloud Computing Growth Driver

Forrester has predicted nearly 6-fold growth in SaaS between 2011 and 2020. The overall cloud computing market is slated to undergo slightly higher growth, of about 6.11-fold. This clearly indicates the huge role that SaaS will play in the growth of cloud computing in general.

If you refer back to the analogy of desktops, operating systems and applications that I provided earlier, this starts making more sense. Once you have the hardware and a bare operating system in place, these two items tend to get fairly standardized and hence commoditized. What does not stagnate is the continuous demand for applications for different computational needs. Each need is further driven by niche requirements, followed by the need to optimize, the need for cross-integration, and so on.

The cloud computing model is no different; it will follow a similar pattern. First, essential software will be available on demand. Next will come the need for specialized software, finally leading to mash-ups, marketplaces, and so on.
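As a quick sanity check on that 6-fold figure: spread over the nine compounding years from 2011 to 2020, it implies an average annual growth rate of roughly 22%.

```python
# A 6x increase over nine years implies a compound annual
# growth rate of 6**(1/9) - 1.
growth_multiple = 6
years = 2020 - 2011
cagr = growth_multiple ** (1 / years) - 1
print(f"implied annual growth: {cagr:.1%}")  # roughly 22% per year
```

Sustaining ~22% a year for nearly a decade is what makes SaaS the growth engine of the prediction.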

Sunday, April 24, 2011

AWS Disruption - Cloud Concerns Re-debated

In August 2006, Amazon launched EC2 - Elastic Compute Cloud - a service that allowed the first realization of the 4 tenets of cloud computing at the infrastructure layer: on-demand elastic compute power, available on a pay-per-usage model. Developers and enterprises could host applications themselves and program them to take advantage of the underlying infrastructure, which could scale up and down with a call to the APIs exposed by Amazon.

Architected, designed and rolled out from Cape Town, South Africa, 5 years ago, AWS ran into its first major battle this week with a major disruption at a datacenter in Northern Virginia.
  1. With major customers like NASDAQ, Netflix, Foursquare, Pfizer and the New York Times, this disruption brings to the forefront the debate on what enterprises should keep in house versus put on third-party infrastructure like AWS EC2.
  2. It will also result in enterprises revisiting their strategy on third-party infrastructure. Enterprises like Netflix that hosted from multiple AWS datacenter locations did not face disruptions, while SMBs who opted for single-datacenter hosting faced the brunt.
  3. A definitive third angle to be discussed would be: "Should we go with one provider or distribute our bets across multiple?"
Occurrences of this nature give both consumers and providers a chance to tighten their belts. A bane in the short term but a boon in the long term.
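The multi-datacenter lesson in point 2 can be sketched as a simple failover loop. The region names below mimic AWS naming for flavor, but this is illustrative pseudologic, not the AWS API: a tenant deployed to several regions rides out the loss of one, while a single-region tenant hits the final error path.

```python
def serve_request(request, regions, healthy):
    """Route a request to the first healthy region in preference order."""
    for region in regions:
        if healthy.get(region):
            return f"served {request} from {region}"
    raise RuntimeError("all regions down - single-region tenants see an outage")

regions = ["us-east-1", "us-west-2", "eu-west-1"]

# Normal operation: the primary region handles traffic
health = {"us-east-1": True, "us-west-2": True, "eu-west-1": True}
print(serve_request("GET /home", regions, health))

# The April 2011 scenario: the Northern Virginia site goes dark.
# Multi-region tenants fail over transparently; tenants deployed
# only to the first region would have no healthy entry left.
health["us-east-1"] = False
print(serve_request("GET /home", regions, health))
```

The catch, of course, is that every extra region costs money, which is exactly the trade-off the SMBs in point 2 had declined to pay for.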


Saturday, April 23, 2011

Understanding Cloud Computing - 3 - IaaS

The infrastructure layer is the base foundation of cloud computing; this is the layer that triggered the idea of true cloud computing. Additional hardware purchased by one division of a company and left unused must have prompted other divisions to request that hardware on a temporary basis. That is what sparked the thought of subscribing to hardware on demand rather than buying and owning it.

Hosting companies that provided space on remote servers took the first step in this direction. They were prompted more by the need for servers to be exposed to the public at all times, unlike enterprise servers that sit within the firewall. This also freed enterprises from having to invest in and maintain open-to-public servers; they just put their websites and related publicly consumable data onto these hosted servers.

It was exactly 5 years ago that the next big step in IaaS was taken, by a company named Amazon (more famous for its online book store). AWS, or Amazon Web Services (started in July 2002), announced the availability of its EC2 - Elastic Compute Cloud - offering in August 2006. EC2 allowed users to rent computing power and pay by the hour of usage. Users could load their applications onto Amazon-hosted infrastructure, and EC2-related services allowed the needed computing and storage power to scale up or down based on the consumption of those applications.
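Pay-by-the-hour is simple arithmetic: instance-hours consumed times the hourly rate. A quick sketch (the $0.10/hour figure matches EC2's original small-instance price, but treat it as illustrative rather than current pricing):

```python
def ec2_style_bill(instance_hours, rate_per_hour=0.10):
    """Pay-per-use billing: hours consumed times the hourly rate.
    The 0.10 default is illustrative, in the spirit of EC2's launch pricing."""
    return instance_hours * rate_per_hour

# Run 2 instances all day, then scale to 10 for a 6-hour traffic
# spike and back down - you are billed only for hours actually used.
baseline = 2 * 24       # two instances for 24 hours
spike = (10 - 2) * 6    # eight extra instances for six hours
total_hours = baseline + spike
print(f"{total_hours} instance-hours -> ${ec2_style_bill(total_hours):.2f}")
```

Contrast that with buying eight servers outright just to cover a six-hour daily spike; the rented hours are what made elastic scaling economically interesting.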

Today a whole set of companies is trying to match what AWS brought to the market; GoGrid, Rackspace and Akamai are among the top ones.

You could safely say that the current trend and hype around cloud computing had its seeds sown back in 2006. In our next post, let's dive into the Platform as a Service facet of cloud computing.

