The document discusses how pre-trained language models (PTLMs) such as BERT have transformed natural language processing (NLP) projects, emphasizing their influence on established methodologies and on the cost-efficiency of development. It highlights how PTLMs have shifted project effort away from extensive feature engineering and custom model development toward fine-tuning pre-trained models, making project execution more agile. It also raises concerns about the energy consumption of deep learning and the accessibility of such research, suggesting that capability is concentrating in industry because of the resource demands involved.
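The fine-tuning paradigm described above can be sketched in miniature: a fixed feature extractor stands in for the pre-trained model, and only a small classification head on top is trained. Everything here (the random "encoder", the toy data, the dimensions) is hypothetical, a stand-in for a real PTLM such as BERT rather than an actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pre-trained" encoder: a fixed projection from 20-dim
# inputs to 8-dim features. In a real project this would be a
# large pre-trained model whose weights are reused, not relearned.
W_enc = rng.normal(size=(20, 8))

def encode(x):
    # Fixed feature extractor; W_enc is never updated.
    return np.tanh(x @ W_enc)

# Toy binary-classification data (hypothetical task).
X = rng.normal(size=(200, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Trainable head: logistic regression on the frozen features.
w = np.zeros(8)
b = 0.0
lr = 0.5

feats = encode(X)
losses = []
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))   # sigmoid
    losses.append(-np.mean(y * np.log(p + 1e-9)
                           + (1 - y) * np.log(1 - p + 1e-9)))
    grad = p - y                                  # dL/dlogits
    w -= lr * feats.T @ grad / len(y)             # update head only
    b -= lr * grad.mean()

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Only `w` and `b` are updated; the encoder stays frozen, which is why fine-tuning is far cheaper than training from scratch, the cost-efficiency the document points to.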