The American West traditionally refers to the westernmost states of the United States, including those acquired through treaties and purchases. The region is geographically diverse, encompassing coastlines, mountains, deserts, and plains. Culturally, the West has been shaped by Native Americans, Spanish explorers, Asian immigrants, and Mormon settlers. Major industries include agriculture, mining, logging, and technology, and the region boasts many iconic national parks and destinations, such as Las Vegas and Los Angeles, that attract millions of tourists each year.