The document describes AffectNet, a new database of more than one million facial images collected from the Internet and annotated for facial expressions, valence, and arousal. About 450,000 of the images were manually annotated for the presence of seven discrete facial expressions (the categorical model) and for the intensity of valence and arousal (the dimensional model), making AffectNet the largest in-the-wild database of facial expressions annotated under both the categorical and dimensional models of affect. Two baseline deep neural networks are trained on the database: one classifies images by expression in the categorical model, and the other predicts continuous valence and arousal values in the dimensional model. Both baselines outperform conventional machine-learning methods and off-the-shelf facial expression recognition systems.
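As a rough illustration of this two-baseline setup, the sketch below (PyTorch, not taken from the paper) instantiates the same small convolutional backbone twice: once with a seven-way classification head trained with cross-entropy for the categorical model, and once with a two-output regression head trained with mean-squared error for valence and arousal. The backbone architecture, layer sizes, input resolution, and loss choices are placeholder assumptions made here for clarity; the paper's actual baselines are larger AlexNet-style networks trained on AffectNet itself.

```python
import torch
import torch.nn as nn

class BaselineCNN(nn.Module):
    """Small convolutional backbone with a task-specific head.

    out_dim=7 -> categorical head (logits over seven expressions)
    out_dim=2 -> dimensional head (valence, arousal regression)
    """
    def __init__(self, out_dim: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # global average pooling to a 64-d vector
        )
        self.head = nn.Linear(64, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

# Categorical baseline: seven-way expression classifier, cross-entropy loss.
clf, clf_loss = BaselineCNN(out_dim=7), nn.CrossEntropyLoss()

# Dimensional baseline: valence/arousal regressor, mean-squared-error loss.
reg, reg_loss = BaselineCNN(out_dim=2), nn.MSELoss()

# One illustrative forward pass on random stand-in data (real inputs would
# be cropped, resized face images; AffectNet's valence/arousal lie in [-1, 1]).
images = torch.randn(8, 3, 96, 96)                  # batch of 8 RGB faces
expr_labels = torch.randint(0, 7, (8,))             # expression class ids
va_targets = torch.empty(8, 2).uniform_(-1.0, 1.0)  # valence/arousal targets

print(clf_loss(clf(images), expr_labels).item())    # categorical loss
print(reg_loss(reg(images), va_targets).item())     # dimensional loss
```

Keeping the two baselines as separate networks, rather than one shared backbone with two heads, mirrors the document's description of two independent models for the categorical and dimensional tasks.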