hack/reduce is a community and hackspace for working with big data. It provides access to a computing cluster, holds regular hackathons, and lets members analyze datasets containing millions or billions of records with tools like Hadoop and MapReduce to find patterns and extract new information. The cluster offers 240 cores, 240GB of RAM, and 10TB of disk space for exploring open datasets such as government documents, weather records, and transportation data.
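To illustrate the MapReduce model those tools are built around, here is a minimal sketch of the classic word-count job in plain Python. The sample records are hypothetical; on the cluster the same two phases would be distributed by Hadoop across millions of records instead of a short in-memory list.

```python
from collections import defaultdict

# Hypothetical sample records standing in for lines of a large dataset.
records = [
    "rain in boston",
    "snow in boston",
    "rain in cambridge",
]

def mapper(record):
    # Map phase: emit a (key, 1) pair for every word in the record.
    for word in record.split():
        yield word, 1

def reducer(pairs):
    # Reduce phase: group pairs by key and sum the counts.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

# Run the map phase over every record, then reduce the combined output.
word_counts = reducer(pair for record in records for pair in mapper(record))
print(word_counts)  # e.g. {'rain': 2, 'in': 3, 'boston': 2, ...}
```

The appeal of the pattern is that the map and reduce phases are independent per key, so a framework like Hadoop can run them in parallel across many machines without changing the logic.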