Amazon's EC2 cloud computing system just became a lot more tempting to developers, with over a terabyte of public data now on offer.
If you've ever wanted to create the ultimate mash-up website but lacked the stupendously large datasets to get started, Amazon has the answer for you.
As revealed on ReadWriteWeb yesterday, the company has started to offer unlimited access to over a terabyte of public data via its Elastic Compute Cloud (EC2) system. There's some pretty neat stuff on there, too – although most of it is aimed at the US.
While access to certain public datasets has been available via EC2 for a while, the company has been ramping up the addition of new content – and has finally broken the impressive 1TB barrier. This week's milestone was helped by four new data sources: statistics from the US Bureau of Transportation covering aviation, maritime, road, rail, bike, pedestrian, and other modes of transportation; a copy of the DBpedia Knowledge Base containing more than 2.6 million entries; a complete copy of the semantic Freebase Data Dump; and the English-language version of Wikipedia in its entirety.
All this content – plus existing data sets, including genetic and other scientific databases already loaded into the EC2 cloud – has been made available in machine-readable format for immediate use by developers signed up to the service. In a blog post, the company described using these data sets from existing EC2 accounts as “basically trivial,” and explained that, because the data never has to move anywhere before it is used, there is no need to “[spend] days or weeks downloading these data sets [as] you can be up and running from a standing start in minutes.”
While all this data could lead to impressive creations from talented web developers with a neat idea, frequent updates will be crucial if the data sets are to maintain their allure: as yet, Amazon has not committed to an update schedule for these or any other public databases provided on the Amazon Web Services system.
Has the thought of all that data got your web-developing fingers itching to have a go, or has Amazon simply amassed over a terabyte of pure bore? Share your thoughts over in the forums.