This document summarizes a presentation on working with large datasets in R. By default, R loads all data into memory, which becomes a problem when a dataset exceeds available RAM. The presentation surveys several packages for handling "big data" in R, including bigmemory and ff, which access data from files on disk rather than loading them entirely into memory. It gives examples of using bigmemory to create large matrices that are stored on disk yet shared across multiple R sessions, and applies the package to analyze a large airline on-time performance dataset of roughly 120 million rows.
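The disk-backed, multi-session workflow described above can be sketched with bigmemory's documented API. This is a minimal illustration, not the presenter's actual code; the file names and dimensions are hypothetical:

```r
# Sketch using bigmemory's file-backed matrices; names/sizes are illustrative.
library(bigmemory)

# Create a file-backed big.matrix: the data live in a memory-mapped file
# on disk, so the matrix can be far larger than available RAM.
x <- filebacked.big.matrix(nrow = 1e6, ncol = 10, type = "double",
                           backingfile = "flights.bin",
                           descriptorfile = "flights.desc")
x[1, 1] <- 42  # reads and writes go through the memory-mapped backing file

# A second R session can attach the same matrix via its descriptor file,
# sharing the on-disk data rather than copying it:
# y <- attach.big.matrix("flights.desc")
# y[1, 1]
```

For a dataset like the airline on-time data, `read.big.matrix()` can import a large CSV directly into such a file-backed matrix, after which multiple sessions can analyze it concurrently without each holding a copy in memory.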