The Hadoop course covers hardware and software requirements and the installation and configuration of Linux, Hadoop, HDFS, MapReduce, Pig, Hive, HBase, and MongoDB. It also covers ETL processes that use Pentaho Data Integration to load and transform data between relational databases and Hadoop. Tutorials walk through virtual machines, Linux commands, Hadoop processes and architecture, developing MapReduce applications, Pig commands, Hive basics, and connecting data warehouse tools to both relational databases and Hadoop.
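To give a flavor of the MapReduce development covered above, here is a minimal word-count sketch in the style of Hadoop Streaming, where the mapper and reducer are ordinary scripts that read and write key/value lines. The function names and sample text are illustrative assumptions, not material from the course; the sort step below stands in for Hadoop's shuffle phase.

```python
#!/usr/bin/env python3
# Illustrative word-count sketch modeled on Hadoop Streaming.
# In a real job, mapper and reducer run as separate scripts over stdin/stdout;
# here they are functions so the whole flow can run locally.
from itertools import groupby

def mapper(lines):
    """Emit a (word, 1) pair for every word, as a Streaming mapper would."""
    for line in lines:
        for word in line.strip().split():
            yield word.lower(), 1

def reducer(pairs):
    """Sum counts per word; input must arrive sorted by key (the shuffle)."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    text = ["Hadoop stores data in HDFS",
            "MapReduce processes data in Hadoop"]
    shuffled = sorted(mapper(text))   # simulates Hadoop's sort/shuffle step
    for word, total in reducer(shuffled):
        print(f"{word}\t{total}")
```

Submitted as a real Streaming job, the same mapper and reducer scripts would be passed to the `hadoop jar hadoop-streaming.jar` command with `-mapper` and `-reducer` options, and HDFS paths would replace the in-memory list.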