The document discusses using MobileBERT and TensorFlow.js to run BERT models directly in the browser. It introduces MobileBERT as a lightweight variant of BERT, made efficient enough for mobile devices through architectural changes and improved training methods. It then demonstrates a TensorFlow.js question answering model that uses MobileBERT fine-tuned on SQuAD, deployed as a browser extension that lets users ask questions about the page they are viewing. While not perfect, the demo shows the potential of running NLP models entirely in the browser.
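To make the workflow concrete, here is a minimal sketch of in-browser question answering using the TensorFlow.js `@tensorflow-models/qna` package, which wraps a MobileBERT model fine-tuned on SQuAD. The `answerFromPage` helper and the sample question are illustrative assumptions, not code from the original document; a real browser extension would also need manifest and content-script plumbing that is omitted here.

```js
// Sketch: in-browser QA with TensorFlow.js and the qna (MobileBERT) model.
import '@tensorflow/tfjs';
import * as qna from '@tensorflow-models/qna';

// Hypothetical helper: answer a question using the current page's text.
async function answerFromPage(question) {
  // Use the visible text of the page as the passage to search.
  const passage = document.body.innerText;

  // Downloads the MobileBERT QnA model and runs inference locally,
  // with no server round-trips for the text being queried.
  const model = await qna.load();

  // Returns candidate answers with character offsets and scores.
  const answers = await model.findAnswers(question, passage);
  return answers; // e.g. [{ text, startIndex, endIndex, score }, ...]
}

// Example usage (question string is illustrative).
answerFromPage('What is MobileBERT?').then(console.log);
```

Because the model runs client-side, the page content never leaves the browser, which is one of the main attractions of this setup alongside the reduced model size that MobileBERT provides.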