The document examines cognitive biases in crowdsourcing, highlighting how they degrade annotation quality and why they are difficult to mitigate. To address this, it introduces a 12-item checklist, adapted for crowdsourcing settings, that helps practitioners measure, assess, and document these biases. An updated version of the checklist is available in a repository for further research and application.