This document discusses the application of stochastic gradient descent (SGD) to statistical inference, particularly in the context of detecting adversarial attacks in machine learning. It outlines how confidence intervals can be computed directly from SGD iterates, an approach that is more efficient than traditional bootstrap resampling on large-data problems because it reuses the optimization trajectory rather than refitting the model many times. The work emphasizes the practical advantage of using a single SGD procedure for both optimization and inference in statistical analysis.
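The idea of obtaining confidence intervals from SGD iterates can be sketched as follows. This is a hedged illustration, not the document's exact procedure: it runs constant-step SGD on a synthetic linear regression, discards a burn-in phase, splits the remaining trajectory into segments, and uses the spread of per-segment averages to form approximate intervals. All names (`sgd_segment_means`, the step size, segment counts) and the toy data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear regression: y = X @ theta_true + noise (illustrative data).
n, d = 5000, 3
theta_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, d))
y = X @ theta_true + rng.normal(scale=0.5, size=n)

def sgd_segment_means(X, y, lr=0.01, burn_in=1000, n_segments=20, seg_len=2000):
    """Run constant-step SGD on squared loss; return per-segment iterate averages."""
    theta = np.zeros(X.shape[1])
    means, seg_sum = [], np.zeros_like(theta)
    total = burn_in + n_segments * seg_len
    idx = rng.integers(0, len(y), size=total)  # sample one data point per step
    for t, i in enumerate(idx):
        grad = (X[i] @ theta - y[i]) * X[i]    # gradient of 0.5*(x·theta - y)^2
        theta -= lr * grad
        if t >= burn_in:
            seg_sum += theta
            if (t - burn_in + 1) % seg_len == 0:
                means.append(seg_sum / seg_len)
                seg_sum = np.zeros_like(theta)
    return np.array(means)

means = sgd_segment_means(X, y)
est = means.mean(axis=0)                       # point estimate from averaged iterates
se = means.std(axis=0, ddof=1) / np.sqrt(len(means))
lo, hi = est - 1.96 * se, est + 1.96 * se      # approximate 95% interval per coordinate
```

Because each segment mean is computed from the same single SGD pass, no model refitting is needed; this is the efficiency contrast with the bootstrap, which would repeat the entire fit on each resampled dataset. Note that adjacent segments are correlated in this simple sketch, so the intervals are approximate.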