September 12, 2014
Nate Gross, MD
60 E 3rd Ave #115
San Mateo, CA 94401
Dear Dr. Gross:
As leaders of the top organizations representing emergency medicine, we have been contacted by scores
of emergency physicians from around the country about a survey being conducted by Doximity and
publicized by U.S. News & World Report. We appreciate your recognition of emergency medicine as an
academic medical specialty with a unique core of knowledge and robust research agenda.
However, we are concerned that the sampling method chosen for this survey will fail to achieve its stated objective: to identify America’s top emergency medicine training
programs. Asking only physicians enrolled in a social media website to nominate their five most
preferred residencies will introduce egregious sample bias and cannot yield a scientifically
valid result. The results will be based solely on the opinions of physicians who have no first-hand
knowledge of any residency training program other than the one they attended themselves.
Response to Doximity #CORDAA16
2015-2016 Executive Committee
Douglas Franzen, MD
David Gordon, MD
Rahul Patwari, MD
Luan Lawson, MD
Nicholas Kman, MD
Kathy Hiller, MD
Julianna Jung, MD
Stacey Poznanski, DO
Matthew Tews, DO
June 8, 2015
RE: Doximity and U.S. News & World Report on the Top Medical Residency
Programs for 2015
You may have recently seen that Doximity has partnered with U.S. News & World
Report to generate a list of the Top Medical Residency Programs for 2015. As mentors
and counselors for students applying to residency, we appreciate the appeal of such a list
– it implies a measure of how “competitive” a residency program might be.
Unfortunately, nothing could be further from the truth.
Last year, the list was generated from a survey. The survey included only
physicians registered on Doximity and asked them to rank their “top 5” residency
programs. Unlike other U.S. News & World Report rank lists, no objective criteria were
used; indeed, no such criteria currently exist. In this era of crowdsourced opinions,
a survey might seem a valid way to create a rank list. However, the vast
majority of practicing emergency physicians have intimate knowledge of only one
residency program and no basis for comparing other programs. The small percentage
in academics might have detailed knowledge of one or two more. Thus, the 2014
survey results really reflected a ranking of the programs with the most alumni who were
members of Doximity and willing to vote in the survey.
• 461 students at 3 medical schools (10% applying to EM)
• 33% of EM applicants modified their lists, mostly by adding
CORD Medical Student Advising Taskforce
• How can we help students apply
smarter, not harder? (fit > rankings)
• What factors matter to students?
• What data will programs share?