
Perspectives 2018: Ara Tekian


The Process of Blueprinting National Examinations



  1. Dr. Ara Tekian, University of Illinois. Plenary II: The Process of Blueprinting National Examinations
  2. The Process of Blueprinting National Examinations
     Ara Tekian, PhD, MHPE, University of Illinois at Chicago, tekian@uic.edu
     TOUCHSTONE Institute Annual Symposium, January 22, 2018, Toronto
  3. Outline
     • Process of developing blueprints
     • Experience of the National Board of Medical Examiners (NBME) in blueprinting
       • Two committees of content experts
       • Five-step process
     • American Board of Internal Medicine (ABIM)
       • Blueprint for the MOC exam
       • Sample content distribution
       • Detailed blueprint with levels of importance
  4. The Blueprint
     • A document that describes the CONTENT to be included in an examination
       • Indicates the proportion of questions
       • General and specific content areas
     • It serves as a guide for the design of the examination and for developing a pool of potential test questions or items
  5. Components of a Faculty Development Program in Assessment (Chapter 16)
     1. Basic principles
     2. Methods and their alignment with competencies
     3. Blueprinting and test construction
     4. Assessor training
     5. Scoring and standard setting
  6. Example of an Objective Structured Clinical Examination (OSCE) Blueprint

     | System            | History           | Explanation             | Examination | Procedures                |
     |-------------------|-------------------|-------------------------|-------------|---------------------------|
     | Cardiovascular    | Chest pain        | Discharge drugs         | Cardiac     | Blood pressure management |
     | Respiratory       | Hemoptysis        |                         | Respiratory | Peak flow                 |
     | Gastro-intestinal | Abdominal pain    | Gastroscopy             | Abdominal   | Rectal examination        |
     | Reproductive      | Amenorrhea        | Abnormal smear          |             | Cervical smears           |
     | Nervous           | Headache          |                         | Eyes        | Ophthalmology             |
     | Musculo-skeletal  | Backache          |                         | Hip         |                           |
     | Generic           | Pre-op assessment | Consent for post-mortem |             | IV cannulation            |
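An OSCE blueprint like this is, in effect, a grid of body systems by station types. As a minimal sketch (not part of the presentation), the grid can be held as a nested dictionary and checked for coverage gaps; the dictionary below mirrors a few rows of the table, while the `uncovered` helper and its `required_types` parameter are hypothetical names chosen for illustration.

```python
# Sketch: an OSCE blueprint grid as systems x station types.
# Only a few rows of the table are reproduced; the coverage check
# is a hypothetical illustration, not the presenter's method.

blueprint = {
    "Cardiovascular": {"History": "Chest pain",
                       "Examination": "Cardiac",
                       "Procedures": "Blood pressure management"},
    "Respiratory": {"History": "Hemoptysis",
                    "Examination": "Respiratory",
                    "Procedures": "Peak flow"},
    "Gastro-intestinal": {"History": "Abdominal pain",
                          "Explanation": "Gastroscopy",
                          "Examination": "Abdominal"},
}

def uncovered(blueprint, required_types=("History", "Examination")):
    """Return (system, station_type) cells with no station planned."""
    return [(system, t) for system, stations in blueprint.items()
            for t in required_types if t not in stations]

print(uncovered(blueprint))  # [] -> every system samples the required types
```

A check like this makes the blueprint's sampling intent explicit: empty cells in the grid are deliberate choices, not oversights.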
  7. A Sample Blueprint for a National Examination
  8. Experience of the NBME in Blueprinting*
     Performing a job or task analysis is expensive and time consuming. Therefore:
     • If content experts can identify the tasks involved in performing a job,
       • they should be able to identify what knowledge one must have in order to perform that job, and
       • they should be able to identify the knowledge base that needs to be tested in order to show competency in performing the job.
     *NBME Blueprinting, 2016
  9. Acceptance of the rationale
     • Asking content experts:
       • What should be on the test?
       • How much of it?
     • Immediate appeal for the following reasons:
       • addressed the blueprint question
       • concerned only with concepts that could be measured by MCQs
       • relatively simple, practical, and cost-effective
  10. Before beginning to develop a content outline
     • Establish the primary purpose of the examination and the inferences to be made about examinees from their scores
     • Ensure that the content scope and breadth represented in the blueprint support appropriate score interpretations
       • Common inference: examinees who pass a certification examination should have demonstrated adequate content mastery to practice in a specialty or subspecialty area
  11. Main questions in developing a blueprint
     • What areas of knowledge should be included in the test?
     • How much of each area should be included?
     Note: While a practice analysis study can tell us what the target examinees do on the job, what actions they perform, and what decisions they must make, it does not directly address the question of what should be on the test.
  12. Recruitment of two committees of content experts
     • The Advisory Committee will develop the prototype, evaluate comments, and finalize the blueprint
       • academic and industry leaders in the subject area
       • approximately 7 members
     • Subject matter experts will review the prototype draft and suggest specific content areas to add, remove, or expand
       • provide overall comments on the breadth, depth, and balance of the content
       • 30 to 50 experts
  13. Expertise of the two committees
     • Advisory Committee
     • Subject matter experts
     • Experienced practitioners who are familiar with the knowledge, skills, and abilities needed by the target users (examinees / practitioners / educators / influential individuals in the field)
  14. 5-Step Process
     1. Information gathering
     2. Initial meeting and draft development
     3. Confirmation and review
     4. Data gathering and compilation
     5. Finalizing the blueprint
  15. 5-Step Process: 1. Information gathering
     • Compile background materials:
       • Teaching curricula
       • Job analyses
       • Relevant articles
       • Training manuals
       • Milestones and required competencies
       • Blueprints for similar existing exams
       • Any professional standards
     Note: this should be a collaborative process between the client and the "national board."
  16. 2. Initial meeting and draft development
     • The advisory committee convenes at a one-day meeting to go over the background materials and produce a draft of the content areas and relative weights that will make up the test blueprint (content specifications).
     • The activities of the meeting are:
       • Presentation/orientation by the psychometrician, explaining the tasks of the day
       • Review and discussion of the background materials by the advisory committee
  17. 2. Initial meeting and draft development (cont.)
     • The activities of the meeting are (cont.):
       • Discussion and definition of the major areas of content to be included in the examination (Test Development staff ensure the resulting draft blueprint meets their required guidelines)
       • Definition of specific sub-areas within each major area on which questions should be written
       • Assignment of weights to each major area and sub-area in terms of percentage and number of items to be included in the test from that area (more important areas will have higher percentages of items)
       • Review and finalization of the draft of the blueprint
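Turning the agreed percentage weights into whole numbers of items involves a small rounding problem, since truncated percentages rarely sum to the intended test length. A minimal sketch of one common approach, largest-remainder rounding, is below; the content areas and percentages are hypothetical, not taken from the presentation.

```python
# Sketch: converting draft blueprint weights into per-area item counts.
# Largest-remainder rounding is one common way to make the counts sum
# exactly to the intended test length; it is not a prescribed NBME step.

def items_per_area(weights, total_items):
    """Allocate total_items across areas in proportion to weights.

    weights: dict mapping area name -> percentage (should sum to 100).
    Returns: dict mapping area name -> whole number of items.
    """
    raw = {area: total_items * pct / 100 for area, pct in weights.items()}
    counts = {area: int(x) for area, x in raw.items()}
    # Hand out the items lost to truncation, largest fractional part first.
    leftover = total_items - sum(counts.values())
    by_remainder = sorted(raw, key=lambda a: raw[a] - counts[a], reverse=True)
    for area in by_remainder[:leftover]:
        counts[area] += 1
    return counts

draft_weights = {            # hypothetical draft blueprint
    "Cardiovascular": 14,
    "Respiratory": 10,
    "Gastrointestinal": 9,
    "All other areas": 67,
}
print(items_per_area(draft_weights, 200))
```

The guarantee that counts sum exactly to the test length matters in practice: item writers are commissioned against these counts, so an off-by-one in the blueprint propagates into form assembly.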
  18. 3. Confirmation and review
     • Send the blueprint draft assembled by the advisory committee to a group of subject matter experts for suggestions, comments, critiques, alternatives, and revisions of points and weights.
  19. 4. Data gathering and compilation
     • Consolidate the results of the survey into a single document and send it to the advisory committee for review.
  20. 5. Finalizing the blueprint
     • The advisory committee discusses the survey results and finalizes the blueprint (via a series of conference calls or other means).
     • This blueprint will then be used to analyze any existing item pools and to establish areas where items need to be developed, as well as to guide the assembly of the examination.
  21. Practice analyses (two parts)
     • An initial data generation phase with a focused group of experts (e.g., the blueprint development process utilizing an advisory committee)
     • A confirmation phase to validate the information by a larger group of stakeholders
     Note: This validation phase typically incorporates a survey sent out to identified individuals.
  22. Necessity of the confirmation phase
     • A small group of experts may be biased
     • The initial data generation phase is based on human judgment rather than objective reality, and is therefore not immune to the influence of raters who may be subject to numerous sources of inaccuracy
     Note: Most practice analyses include this subsequent questionnaire step to have a larger group of outside experts review the materials created by the initial advisory committee during the data generation phase.
  23. Survey
     • Members of the expert survey group are selected by the advisory committee as a broader coalition of stakeholders with the necessary expertise to provide input into the blueprint.
     • The expert survey group also serves as a vehicle to create buy-in from the larger professional community.
     • The survey itself provides the proposed blueprint along with instructions to the group to provide input. It should be stressed that the blueprint needs to reflect the content intended to be tested.
  24. American Board of Internal Medicine (ABIM)* Maintenance of Certification (MOC) Exam Blueprint
     • In 2015, ABIM invited certified general internists to provide ratings of the relative frequency and importance of blueprint topics in practice.
     • A sample of over 300 physicians, similar to the total invited population of internists in age, time spent in direct patient care, and practice setting, provided the blueprint topic ratings.
     • ABIM considered the average respondent ratings of topic frequency and importance in each of the content categories.
     • A second source of information was the relative frequency of patient conditions in the content categories, as seen by certified internists and documented by national health care data.
     *ABIM: MOC Exam Blueprint, 2015
  25. Purpose of the Internal Medicine MOC exam
     • Evaluate whether a certified internist has maintained competence and currency in the knowledge and judgment required for practice.
     • The exam emphasizes diagnosis and management of prevalent conditions, particularly in areas where practice has changed in recent years.
  26. Exam format
     • The exam is composed of 240 single-best-answer MCQs, of which 40 are new questions that do not count toward the examinee's score.
     • All questions describe clinical scenarios and ask about the work done by physicians in the course of practice:
       • Diagnosis
       • Testing
       • Treatment / Care decisions
       • Risk assessment / Prognosis / Epidemiology
       • Pathophysiology / Basic science
  27. How the blueprint ratings are used to assemble the MOC exam
     • Blueprint reviewers provided ratings of the relative frequency in practice of each detailed content topic in the blueprint, and of the relative importance of the topics for each of the tasks described.
     • In rating importance, reviewers are asked to consider factors such as:
       • High risk of a significant adverse outcome
       • Cost of care and stewardship of resources
       • Common errors in diagnosis or management
       • Effect on population health
       • Effect on quality of life
  28. Frequency and Importance
     • Both are rated on a three-point scale: low, medium, or high
     • Parameters for selecting MOC exam questions:
       • At least 75% of exam questions will address high-importance content (indicated in green)
       • No more than 25% of exam questions will address medium-importance content (indicated in yellow)
       • No exam questions will address low-importance content (indicated in red)
     • Independent of the importance and task ratings, no more than 18% of exam questions will address low-frequency content (indicated by "LF" following the topic description)
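Selection parameters like these can be checked mechanically when a draft exam form is assembled. The sketch below encodes the four rules from the slide as a simple validator; the question-record format, the `check_form` name, and its field names (`importance`, `low_frequency`) are hypothetical, chosen only for illustration.

```python
# Sketch: validating a drafted exam form against the slide's parameters:
#   >= 75% high-importance, <= 25% medium-importance,
#   no low-importance items, <= 18% low-frequency content.
# The question-record shape is an assumption, not an ABIM format.

def check_form(questions):
    """questions: list of dicts with 'importance' ('high'/'medium'/'low')
    and 'low_frequency' (bool). Returns a list of violated rules."""
    n = len(questions)
    high = sum(q["importance"] == "high" for q in questions)
    medium = sum(q["importance"] == "medium" for q in questions)
    low = sum(q["importance"] == "low" for q in questions)
    low_freq = sum(q["low_frequency"] for q in questions)

    violations = []
    if high / n < 0.75:
        violations.append("fewer than 75% high-importance items")
    if medium / n > 0.25:
        violations.append("more than 25% medium-importance items")
    if low > 0:
        violations.append("low-importance items present")
    if low_freq / n > 0.18:
        violations.append("more than 18% low-frequency items")
    return violations
```

For example, a 100-item form with 80 high-importance and 20 medium-importance questions, 10 of them low-frequency, satisfies all four rules and returns an empty list.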
  29. In Conclusion
     • Blueprinting of a national examination is extremely important
     • It might involve up to 5 steps and 2 committees (advisory committee and subject matter experts)
     • Practice analyses include two parts: an initial data generation phase and a confirmation phase to validate the information (by survey)
     • For the initial data generation phase, patient issues and problems could be drawn from national databases (e.g., Medicare, an ambulatory care database) or any billing database
     • Ratings of relative frequency and relative importance are essential considerations
  30. Bibliography
     • Connell, A. F., & Nord, W. R. (1996). The bloodless coup: The infiltration of organization science by uncertainty and values. Journal of Applied Behavioral Science, 32, 407-427.
     • Goldstein, I. L., Zedeck, S., & Schneider, B. (1993). An exploration of the job analysis-content validity process. In N. Schmitt & W. C. Borman (Eds.), Personnel selection in organizations. San Francisco, CA: Jossey-Bass.
     • Knapp, J. E., & Knapp, L. G. (1995). Practice analysis: Building the foundation for validity. In J. C. Impara (Ed.), Buros-Nebraska series on measurement and testing (pp. 93-116). Lincoln, NE: Buros Institute of Mental Measurements.
     • Morgeson, F. P., & Campion, M. A. (2004). A framework of potential sources of inaccuracy in job analysis. In M. Wilson, R. Harvey, G. Alliger, & W. Bennett (Eds.), The handbook of work analysis: The methods, systems, applications, and science of work measurement in organizations. New York: Psychology Press/Taylor and Francis Group.
     • Newman, L. S., Slaughter, R. C., & Taranath, S. N. (1999, April). The selection and use of rating scales in task surveys: A review of current job analysis practice. Paper presented at the meeting of the National Council on Measurement in Education, Montreal, Canada.
     • Raymond, M. R. (2002). A practical guide to practice analysis for credentialing examinations. Educational Measurement: Issues and Practice, 21, 25-37.
  31. 2018 Perspectives Symposium, January 22, 2018. #TSINPerspectives2018
