Just how big is Big Data, anyway? The best answer - data sets too large for existing or traditional tools to handle - raises some difficult questions. Sure, we can leverage Hadoop, NoSQL, the Cloud, or some other modern approach to deal with today's Big Data problems, but what about tomorrow, when Hadoop and the Cloud themselves move into the "existing tools" category?
The core problem is that the quantity of data we have to work with continues to grow exponentially, while our ability to deal with such Big Data grows only linearly. No matter how good our tools are, eventually Big Data will outgrow them.
The solution is to take an Agile Architecture approach that addresses change directly, calling for governance-centric use of tools to solve Big Data problems. The architectural focus shifts from the tools themselves to human activity and behavior. Tools alone may never solve the Big Data problem over the long term, but human innovation and ingenuity can.
Attendees of this session will:
- Gain an understanding of the fundamental limitations of technology in the face of ever-growing Big Data challenges
- Learn how a governance-centric Agile Architecture approach can enable data teams to deal with increasingly large data sets
- Gain an appreciation of how a properly architected human/technology system of systems can foster innovation sufficient for dealing with Big Data