Let's face it: data sources are growing larger and more diverse every year, far outpacing the ability of us puny humans to grok the data alone, let alone its relevance to an investigation. One of the most important skills for an investigator or incident responder is the ability to quickly answer diverse questions about the data they have been presented. Whether reviewing network captures, filesystem artifacts, online services, or anything else, knowing how to slice data gives the investigator a clear edge. Most importantly, it lets us quickly eliminate the overwhelming volume of data that has no bearing on the investigation, leaving behind just the valuable tidbits that matter most. This talk will discuss how normalizing diverse data sources into a database can turn raw data into a highly efficient tool that, when used properly, provides fast and decisive insight into the investigation at hand.
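As a minimal sketch of the idea, consider loading records from two different sources into a single normalized table and answering a cross-source question with one query. The source names, columns, and records here are hypothetical, standing in for whatever parsed artifacts an investigation actually produces:

```python
import sqlite3

# Hypothetical example: normalize two data sources (a DNS log and a
# proxy log) into one "events" table so both can be sliced with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        ts      TEXT,   -- ISO-8601 timestamp, normalized across sources
        source  TEXT,   -- which data source the record came from
        host    TEXT,   -- hostname or domain observed
        detail  TEXT    -- source-specific payload
    )
""")

# Toy records standing in for parsed log lines.
rows = [
    ("2015-06-01T12:00:01", "dns",   "evil.example.com", "A 203.0.113.7"),
    ("2015-06-01T12:00:02", "proxy", "evil.example.com", "GET /payload.bin"),
    ("2015-06-01T12:05:00", "dns",   "cdn.example.net",  "A 198.51.100.4"),
]
conn.executemany("INSERT INTO events VALUES (?, ?, ?, ?)", rows)

# One query now answers "what touched this host, across every source?"
# Everything unrelated to the host simply never comes back.
hits = conn.execute(
    "SELECT ts, source, detail FROM events WHERE host = ? ORDER BY ts",
    ("evil.example.com",),
).fetchall()
for ts, source, detail in hits:
    print(ts, source, detail)
```

The payoff is that filtering, joining, and timelining all become one-line SQL instead of bespoke scripts per source format.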