Business Cloud News

Last month, German service provider Deutsche Telekom embraced Big Data wholeheartedly, forging a collaboration with Cloudera, a specialist integrator of Apache Hadoop, to deliver cloud-based big data analytics as a service.

Hadoop, an Apache-developed open source software framework, is causing a stir by making computation and analytics on very large volumes of data not just possible but relatively simple.

Given that it’s now technically, and increasingly economically, possible to store petabyte upon petabyte of data, the problem has shifted to acting on that data in a technically feasible and cost-effective manner.

So the deal between DT’s T-Systems and Cloudera will see the two working together to develop and deliver production-ready solutions that will enable T-Systems customers to more quickly and easily derive value from their data.

T-Systems will integrate its existing cloud computing infrastructure with Cloudera’s Hadoop-based platform, to enable easily accessible and affordable analytics-as-a-service solutions. The joint solution was first made available to T-Systems’ enterprise customers in Europe last month, with additional markets to follow worldwide.

We recently caught up with Juergen Urbanski, chief technologist for big data and cloud at Deutsche Telekom, at the Telco Cloud World Forum, to hear his thoughts on the Big Data landscape.

“Big Data is an entirely new way to take advantage of the incredible volume of data, the increasing diversity and variety of data, and the incredible velocity of data. Hadoop makes it possible to store almost anything for as long as you want, economically speaking, and then means you can start to make sense of that data,” Urbanski said.

“Essentially it means you can store first, ask questions later.”
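The "store first, ask questions later" idea Urbanski describes is often called schema-on-read: raw data lands in storage as-is, and a structure is imposed only when a question is finally asked. A minimal sketch of the concept, using a plain Python list of JSON strings as a stand-in for a raw store such as HDFS (the record fields and function name here are hypothetical, purely for illustration):

```python
import json

# Hypothetical raw event log: in a Hadoop cluster this would sit in HDFS;
# here a list of untouched JSON strings stands in for the raw store.
raw_store = [
    '{"user": "alice", "action": "login", "ts": 1}',
    '{"user": "bob", "action": "purchase", "amount": 42.0, "ts": 2}',
    '{"user": "alice", "action": "purchase", "amount": 13.5, "ts": 3}',
]

def total_purchases(records):
    """A question asked only after the data was stored: sum purchase amounts."""
    total = 0.0
    for line in records:
        event = json.loads(line)  # schema applied at read time, not write time
        if event.get("action") == "purchase":
            total += event.get("amount", 0.0)
    return total

print(total_purchases(raw_store))  # 55.5
```

The point of the pattern is that nobody had to decide up front that "amount" or "action" would matter; the login event with no amount field is simply skipped when the purchase question is asked.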