During the 2010 Sort Benchmark competition, the World Cup of data sorting, computer scientists from the UC San Diego Jacobs School of Engineering also tied a world record for fastest sorting rate. They sorted one trillion data records in 172 minutes, and did so using just a quarter of the computing resources of the other record holder.
Companies looking for trends, efficiencies and other competitive advantages have turned to the kind of heavy-duty data sorting that requires hardware muscle typical of data centers.
"Companies are pushing the limit on how much data they can sort and how fast. This is data analytics in real time," said UC San Diego computer science professor Amin Vahdat, who led the project. If a major corporation wants to run a query across all of its page views or products sold, that can require a sort across a multi-petabyte dataset, one that is growing by many gigabytes every day.
The two new world records from UC San Diego are among the 2010 results released recently on sortbenchmark.org. The competitions provide a benchmark for data sorting and an interactive forum for researchers working to improve data sorting techniques.