The amount of text data (news articles, blogs, social media, etc.) on the web is increasing at a staggering rate. However, irrelevant information, or noise, is growing far faster than the actionable information that can generate alpha, making it increasingly difficult to mine actionable stories from the web using standard, out-of-the-box language processing techniques and libraries. Because the performance, robustness, and reliability of any data-centric model depend directly on the quality of its input data, noise reduction becomes one of the most important steps in the data science pipeline. Thanks to recent advances in big data, deep learning, and natural language processing, we can now mine for actionable stories across millions of information pieces and hundreds of terabytes of data. In this talk, we will highlight the approaches and technologies we employ as part of the noise cancellation mechanism at Accern. We will also compare the performance of trading strategies built on social analytics derived with standard versus sophisticated noise cancellation techniques, as well as strategies that utilize other advanced metrics.
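To make the idea of noise reduction in a text pipeline concrete, here is a minimal sketch of two common first-pass filters: deduplication by content hashing and a crude keyword-based relevance score. This is an illustrative toy, not Accern's actual pipeline; the sample articles, keywords, and the 0.5 threshold are all assumptions made up for the example.

```python
import hashlib
import re

def normalize(text):
    """Lowercase and collapse whitespace so near-identical copies hash alike."""
    return re.sub(r"\s+", " ", text.lower()).strip()

def deduplicate(articles):
    """Drop exact duplicates (a common source of web noise) via content hashing."""
    seen, unique = set(), []
    for text in articles:
        digest = hashlib.sha256(normalize(text).encode()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(text)
    return unique

def relevance_score(text, keywords):
    """Fraction of query keywords present in the text -- a crude relevance proxy."""
    tokens = set(normalize(text).split())
    return sum(k in tokens for k in keywords) / len(keywords)

# Hypothetical inputs for illustration only.
articles = [
    "Acme Corp beats earnings estimates, shares rally",
    "Acme Corp beats earnings   estimates, shares rally",  # duplicate, extra spaces
    "Top 10 celebrity diets you must try",                 # off-topic noise
]
keywords = ["earnings", "shares", "acme"]

# Keep unique articles that match at least half the keywords.
filtered = [a for a in deduplicate(articles)
            if relevance_score(a, keywords) >= 0.5]
print(filtered)  # only the first Acme article survives
```

Production systems typically replace the keyword score with a trained relevance classifier and use near-duplicate detection (e.g., shingling or MinHash) rather than exact hashes, but the shape of the step is the same: score each document, discard what falls below a threshold.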