The Hollywood Sign traces its origins to real estate development: the community of Hollywood was laid out as a subdivision in the late 1880s, and the sign itself was erected in 1923 to advertise a housing development called Hollywoodland (the final four letters were removed in 1949, leaving the landmark as it stands today). Over time, the surrounding area grew into the center of the American film industry, thanks in part to its favorable climate. By the 1920s, major film studios such as Paramount, Warner Bros., RKO, and Columbia had established production facilities in and around Hollywood, cementing its status as the capital of American cinema. Today, major studios such as Universal, Disney, Fox, and DreamWorks continue to operate in the greater Hollywood area and produce many well-known films.