This paper proposes FedBN, a federated learning method that uses local batch normalization to address statistical heterogeneity, specifically feature shift, across clients' non-IID datasets. Rather than averaging every parameter, FedBN trains the batch normalization layers locally on each client's data and excludes them from federated averaging, so only the remaining model parameters are aggregated into the shared global model. Experiments show that FedBN converges faster and more smoothly than FedAvg on heterogeneous datasets such as SVHN, and the paper supports this with a theoretical convergence analysis. Because the BN parameters and running statistics never leave the clients, FedBN also exposes less information about each client's local data distribution than baseline federated learning methods that share the full model.
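A minimal sketch of this aggregation rule is shown below, assuming PyTorch-style state dicts and the naming convention that batch normalization modules have "bn" in their parameter keys; the function name fedbn_aggregate and its argument names are illustrative, not taken from the paper's code.

```python
import copy

def fedbn_aggregate(client_states, client_weights):
    """Weighted-average all parameters across clients except batch-norm
    layers, which each client keeps local (the core idea of FedBN).

    client_states:  list of PyTorch state_dicts, one per client
    client_weights: relative weights summing to 1, e.g. proportional
                    to each client's local dataset size
    Returns a new list of state_dicts, one per client: shared layers
    are replaced by the global average, BN layers are left untouched.
    """
    new_states = [copy.deepcopy(s) for s in client_states]
    for key in client_states[0].keys():
        # Skip BN entries (affine weight/bias and running_mean/
        # running_var buffers): these stay client-specific.
        if "bn" in key:
            continue
        # FedAvg-style weighted average over all clients.
        avg = sum(w * s[key].float()
                  for w, s in zip(client_weights, client_states))
        for s in new_states:
            s[key] = avg.clone()
    return new_states
```

Each client then loads the state dict returned for it, so its BN affine parameters and normalization statistics remain client-specific across rounds and at test time, while every other layer is synchronized globally exactly as in FedAvg.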