Nowadays we produce a huge volume of information, but unfortunately only about 12% of it is ever analyzed.
That is why we should dive into our data lake and pull out the Holy Grail: knowledge. But Big Data means big problems.
So, challenge accepted!
The perfect tool for achieving this goal is Hadoop. It is a 'data operating system' that allows us to process large volumes of any kind of data in a distributed way.
Together, we will take a phenomenal journey around the Hadoop world.
First stop: operations basics.
Second stop: a short tour around the Hadoop ecosystem.
At the end of our journey, we will walk through several examples that show you the real power of Hadoop as your data platform.
Arkadiusz Osinski - Works at Allegro Group as a system administrator. From the beginning he has been involved in building and maintaining the Hadoop infrastructure within Allegro Group. Previously he was responsible for maintaining large-scale database systems. Passionate about new technologies and cycling.
Robert Mroczkowski - In 2006 he completed a master's degree in Computer Science at Nicolaus Copernicus University, followed in 2007 by a bachelor's degree in Applied Informatics at the same university. From 2006 to 2011 he was a PhD student in Computer Science; his research field was computer science applied to bioinformatics. In 2012 he started working as a Unix system administrator at Allegro Group, where he gained Hadoop experience building and maintaining a cluster for GA. Every day he works with modern high-performance, highly available technologies, centrally managed in a cloud environment.