This document summarizes the key developments and improvements made to the MRCS examination. It discusses the implementation of the new OSCE format for Part B and revisions made to the syllabus and questions. The OSCE sub-group reviewed feedback and made changes such as examining more basic sciences and clinical content. The number of stations was increased from 16 to 18 and domains and content areas were simplified. Other subgroups discussed ongoing work to ensure question quality, standard setting, and quality assurance. Contributors provided overviews of their respective subgroups and examinations such as the DO-HNS.
ICBSE: The Intercollegiate Examiners' Newsletter
October 2009, Issue 2

In this issue: OSCE: The Future; Intercollegiate DO-HNS; IQA; Part A; No Fear!

Welcome to the second edition of the ICBSE intercollegiate examiners' newsletter.

Both this newsletter and the examinations continue to evolve. It was Charles Caleb Colton (1780-1832) who said 'Examinations are formidable even to the best prepared, for the greatest fool may ask more than the wisest man can answer'. However, in 2009, I believe that the MRCS is still a formidable, quality surgical examination, but that the examination is now much fairer and delivers reasonable, reliable and reproducible questions.

I am almost half way through my three-year chairmanship of ICBSE (the Intercollegiate Committee for Basic Surgical Examinations), and during this time the new MRCS Part B OSCE has been implemented and improved, the MRCS syllabus has been revised, and the whole examination re-submitted to PMETB for approval. We will know the outcome by January 2010.

This newsletter is to inform the intercollegiate examiners and bodies of some of the people and developments in ICBSE. In this edition, we have contributions from: Chris Butler, Chairman, Part B (OSCE) sub-group, on the future developments of the OSCE; Derek Skinner, Chairman, DO-HNS sub-group, defining the work and evolution of the Intercollegiate Diploma of Otolaryngology - Head & Neck Surgery; Mr Mark Fordham, Chairman, MCQ sub-group, outlining the hard work involved in getting the Part A MCQ exam together; and Mr Kevin Sherman, Chairman of the Internal Quality Assurance Committee (IQA), who tells us about some of his group's work. We hear from June Laird of her experience as an OSCE candidate, and from FY1 Ewan Goudie, who tells us the difference between a syllabus and a curriculum; it's surprising how many don't know the difference! Finally, we have a short briefing about the work of the ICBSE Office.

I would like to thank all the ICBSE and sub-group members and advisers, the Heads of Examinations of the Colleges, the ICBSE Office team and you, the ICBSE examiners, who have all helped to shape and run the MRCS exam so well, sometimes in quite severe adversity.

Finally, if anyone wishes to contribute to the next edition of the newsletter, contributions and feedback can be sent to Rezi Bocz, Intercollegiate MRCS Coordinator, at: rbocz@icbse.org.uk.

Chris Oliver
Chairman, ICBSE
cwoliver@icbse.org.uk
The OSCE of the Future

It was always the intention to review and modify the MRCS Part B (OSCE) examination when we had enough information. By May 2009 over 850 candidates had taken the examination and we had accumulated a wealth of statistical evidence and a lot of feedback from candidates, examiners and administrative staff.

All members of the ICBSE OSCE sub-group, including the trainee and patient representatives, were invited to submit suggestions for change. The sub-group also received suggestions from a number of other sources, including the RCSEng Court of Examiners. Constructive debate followed and there was broad consensus about the way forward.

It was agreed that the system basically worked: candidates felt they were fairly examined, the standard-setting method was standing up to scrutiny, and the majority of examiners thought the standard was appropriate. Necessary improvements included the need to examine more basic sciences, particularly pathology, and more clinical content. We needed to simplify the marking and question-writing process by reducing the number of domains and broad content areas, and to reduce candidate choice. It was also agreed that we should assess communication skills across more of the examination. A smaller group was tasked with working up detailed proposals.

In July, ICBSE debated these proposals and agreed to move to an OSCE of 18 examined stations. Rest stations will be replaced by examined stations. All stations will be manned. There will be a new surgical pathology station and two extra clinical stations, which will be hospital-based to give easier access to real patients. Candidates will have less choice between the four specialty-specific areas.

There will be four, rather than five, broad content areas: anatomy and surgical pathology; applied surgical science and critical care; communication skills (including history taking); and clinical and procedural skills. The domain structure will be simplified and reduced from six to four: clinical knowledge and its application; clinical and technical skills; communication; and professionalism. Domains will no longer be pass/fail criteria, and the standard-setting group can now use higher marks for the content area and overall pass thresholds. One new station will be piloted in each circuit. When these proposals are approved by the regulator for May 2010, the MRCS exam will be longer than the three-part MRCS.

I am grateful to all of you who have provided feedback and suggestions. The OSCE sub-group considered them all. The proposed changes represent a significant improvement. With seven hours of examining we will be able to get the standards right and make sure that the assessment is rigorous and fit to give successful candidates the right to membership and the feeling that they have undergone a proper postgraduate examination.

Christopher M Butler MS FRCS
Chairman, ICBSE Part B (OSCE) sub-group
IQA: From Behind Closed Doors

Several times a year the IQA sub-committee meets behind closed doors, wades through a (very) large volume of statistics and reports, and then makes comments, suggestions and reports to the ICBSE. So, what is the purpose of this arcane group?

The IQA sub-group's remit is to ensure that the examination is doing what is claimed for it. This involves several different aspects: the examination should test those attitudes, skills and aspects of knowledge set out in the syllabus; it should test these things in a way that is fair to the candidates; and it should ensure that the standard, as laid down by the appropriate body, is maintained in a consistent fashion. To put this into plain English: the examination should be appropriate, fair and reproducible. As in all things educational, the above statement can be re-translated into very technical-sounding educational jargon, designed to deter too many intrusive enquiries; thus, in educational terms, the examination must, amongst other things, have construct validity and reliability.

Fortunately, in matters involving numbers, categories and statistical strategy, the sub-group is ably 'aided and abetted' and, above all, educated on a regular basis by John Foulkes, whose analyses and explanations of the complexities of educational measurement are an essential component of the sub-group's work.

Have the examinations been achieving their stated aim? (I use the plural advisedly, as we clearly now have several different examinations, each designed for a different group of trainees.) It is really too early to be passing a definitive view on the 'OSCE style' examination, as there is not the same depth of historical data for this format of the examination as for the earlier formats, and measures such as Cronbach's alpha must be interpreted with caution as many factors may influence this score. The alpha score should therefore be seen as only one of a range of indicators of the quality of an examination; nevertheless, the alpha score has been running at approximately 0.8, which is generally deemed acceptable, although we will continue to strive to achieve a higher aspirational value. Feedback from the candidates taking the OSCE-style examination appears to indicate that they are, in general, satisfied that the examination is fair and appropriate for their training.

No curriculum should ever be fixed and immutable, and assessments and examinations must therefore be constantly reviewed and refined to reflect the changes that take place. The MRCS examination(s) will continue to evolve, and I anticipate that IQA will be kept very busy for the foreseeable future!

Kevin Sherman
Chairman, IQA sub-group
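For readers unfamiliar with the reliability statistic mentioned above, Cronbach's alpha can be computed from per-station scores. The following is a minimal illustrative sketch, not ICBSE's actual analysis; the candidate scores are invented, and, as the article notes, the resulting number is only one indicator among many:

```python
# Cronbach's alpha: internal-consistency reliability of an examination.
# alpha = k/(k-1) * (1 - sum(per-item variances) / variance of total scores)
# Rows are candidates, columns are stations; all numbers are invented.

def cronbach_alpha(scores):
    k = len(scores[0])                       # number of stations (items)

    def var(xs):                             # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

scores = [
    [3, 4, 4, 5],
    [2, 3, 3, 3],
    [4, 4, 5, 5],
    [1, 2, 2, 3],
]
print(round(cronbach_alpha(scores), 2))  # -> 0.98
```

A tiny, consistent dataset like this one yields an artificially high alpha; in a real OSCE the value reflects how consistently the stations rank candidates, which is why a figure around 0.8 across hundreds of candidates is considered respectable.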
Intercollegiate DO-HNS

The Diploma of Otolaryngology - Head & Neck Surgery (DO-HNS) became an intercollegiate examination in 2008 and is administered through ICBSE. It replaced the Diploma of Laryngology and Otology (DLO) in May 2003. The DLO, originally introduced in 1923, had been used as an assessment for ENT surgeons who were becoming consultants before specialty-specific FRCS examinations existed. The new DO-HNS examination was designed, piloted and launched in 2003, led by Maurice Hawthorne.

Lately, with MMC and the restructuring of surgical training, the DO-HNS has been used as a test of knowledge for ENT basic surgical trainees considering higher surgical training in ENT surgery before embarking on higher specialist training. The examination is also used as a specific and explicit standard of ENT knowledge for those doctors in non-career grades of ENT surgery and for general practitioners with a specific interest in ENT surgery.

The examination has generated considerable interest in the ENT surgery community of the United Kingdom and abroad. Indeed, the written paper can now be undertaken in Cairo and Damascus. As the reputation of the examination has grown, the DO-HNS became a pre-requisite for entry into higher surgical training for specialist registrars and, more recently, with the inception of specialty registrars, it has become a specific requirement for the recruitment process into ST3 for higher surgical training in ENT surgery.

The DO-HNS examination is in two parts: Part 1 is a two-hour written paper comprising extended matching questions and multiple true/false and single best answer questions, and Part 2 is an OSCE. It is now offered three times a year.

At present, the DO-HNS Diploma, with the addition of the Part A MRCS, allows the award of an MRCS Diploma. Although this was accepted by PMETB in 2008, issues of equivalence have now been raised, and we are now working towards an ENT-themed MRCS Diploma using the Part A MRCS and the DO-HNS OSCE to act as the MRCS requirement for ENT trainees entering ST3.

With this innovation being actively pursued, the curriculum for the DO-HNS is being brought into line with the themed ENT MRCS Diploma. However, it is anticipated that the DO-HNS will also continue as a stand-alone examination. The curriculum is the ISCP curriculum for the Early Years of Surgical Training.

Much work remains to be done over the next few months, and the DO-HNS sub-group fully appreciates the support of the ICBSE and the Examination Departments at all the Colleges.

Derek Skinner
Chairman, DO-HNS sub-group
Part A: An Exam of Two Halves

The Part A MRCS is an examination of two halves: two hours for each written paper, with a break in between. The first half consists of 135 single best answer questions testing applied basic science, and the second half of 135 items using extended matching questions, testing principles of surgery in general. Three areas are vital to make the examination reliable, fair and up-to-date: the questions, security and the standard.

Over 100 question writers, surgeons and basic scientists, meet in teams to write new questions, check poorly-performing ones and update ones where technology or practice has changed. A Question Quality Group (QQG), chaired by Peter Billings, does the final 'kite-marking' of questions, which are then entered into the computerised question bank by Greg Ayre, ICBSE Question Editor. New questions are pre-tested. A fixed number are included in each paper to assess how they perform but do not count towards the final mark.

The software uses a syllabus-based blueprint to randomly generate each new paper, which is scrutinised by the Paper Panel, chaired by Mark Fordham, for balance of topics.

Security is vital. UK centres must conform to approved arrangements; international centres liaise with the British Council, which oversees safe distribution. Each paper is produced in different question sequences identified by a coloured strip; adjacent candidates in the examination room work from different versions. The answer sheets are marked by an optical mark reader.

Then comes the challenge of setting the standard. When a new style of paper is developed, a standard-setting exercise is carried out with 20 or 30 experts judging what percentage of borderline candidates would get each question correct. Their judgements are pooled and analysed to give a measure of the difficulty of individual questions and of the paper overall.

Once candidates have sat the paper, each question is analysed to give it a level of discrimination. Questions have a positive discrimination value where high-performing candidates get them correct and low-performing candidates incorrect. Questions have a negative discrimination value where good candidates get them wrong and poor candidates right; these questions are reviewed by the QQG.

High positive discrimination questions are used as marker or 'benchmark' questions in subsequent papers and used to help calculate the pass mark. This analysis is prepared by ICBSE's professional educationalist John Foulkes, discussed at the Paper Panel, and a final pass mark decided. It is a requirement of passing that at least 50% is obtained in both sections of the paper. The final results are made available on College websites.

Mark Fordham
Chairman, MCQ sub-group
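The two analyses described above, pooling expert judgements of borderline performance to set a provisional pass mark (an Angoff-style method) and computing a per-question discrimination value, can be sketched numerically. This is an illustrative sketch only, not ICBSE's actual software: the data, the function names, and the simple upper-minus-lower discrimination index are all assumptions made for the example.

```python
# Illustrative sketch of standard setting and item analysis.
# All data and function names are invented for the example.

def angoff_cut_score(judgements):
    """judgements[e][q] = expert e's estimate of the proportion of
    borderline candidates who would answer question q correctly.
    Averaging across experts and summing over questions gives the
    expected score of a borderline candidate: the provisional pass mark."""
    n_experts = len(judgements)
    n_questions = len(judgements[0])
    per_question = [
        sum(expert[q] for expert in judgements) / n_experts
        for q in range(n_questions)
    ]
    return sum(per_question)

def discrimination(correct_by_candidate, totals, fraction=0.27):
    """Upper-minus-lower discrimination index for one question:
    proportion correct among the top-scoring candidates minus the
    proportion correct among the bottom-scoring candidates.
    Positive: strong candidates tend to get the question right.
    Negative: weak candidates outperform strong ones (flag for review)."""
    n = max(1, int(len(totals) * fraction))
    ranked = sorted(range(len(totals)), key=lambda i: totals[i])
    low, high = ranked[:n], ranked[-n:]
    p_high = sum(correct_by_candidate[i] for i in high) / n
    p_low = sum(correct_by_candidate[i] for i in low) / n
    return p_high - p_low

judgements = [
    [0.6, 0.4, 0.7],   # expert 1's estimates for questions 1-3
    [0.8, 0.6, 0.5],   # expert 2's estimates
]
print(round(angoff_cut_score(judgements), 2))  # -> 1.8 (out of 3 questions)
```

A question answered correctly by most high scorers and few low scorers gets a discrimination value near +1 and is a candidate 'benchmark' question; a negative value sends the question back for review, as the article describes.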
No Fear!

June Laird overcomes her exam nerves and writes about her experience sitting the MRCS OSCE.

I woke up in Edinburgh with that all too familiar pre-exam nausea and feeling of impending doom. As I was getting ready to head to the College, I thought of when I had attempted, unsuccessfully, to pass the 'old' MRCS viva five months earlier. That exam took the format of three oral stations: critical care and applied physiology, applied pathology and generic clinical skills, and applied and surgical anatomy. Each station was a 20-minute long interrogation covering a variety of topics. Similarly, with regard to the anatomical specimens presented, there were about a dozen on the table, but of the four I was subjected to, only one was of an area in which I had been operatively involved as a general surgical trainee. Unfortunately, the ground did not swallow me up there and then, and I was not allowed to progress to the clinical component. I consoled myself by thinking that now I knew what to expect and I would be better prepared next time.

This was not the case. By the time I was applying to re-sit, the format of the exam had changed and I was now going to be involved in the first sitting of the MRCS OSCE. Looking at the candidate guidance notes, it seemed to mimic the format of the assessment I was familiar with from undergraduate years. There were to be 16 stations covering six domains: clinical knowledge; clinical skill; technical skill; communication; decision making, problem solving, situational awareness and judgement; and organisation and planning.

In addition, I had to select two specialty contexts. Trunk and thorax was the obvious first choice for me but, after my previous skull/parotid/mandible based nightmare, I elected for limbs. There were four specialty context stations: one for anatomy and pathology, one for history taking and two for clinical examination. To pass, I had to achieve the overall pass mark and the pass in each of the domains and content areas.

The OSCE was much less intimidating than the viva: there was a feeling of 'safety in numbers' as there were more of us being assessed simultaneously; the style was more clinically oriented, reflecting what we do on a day-to-day basis; and I knew there would be a bell in nine minutes. I was also better prepared.

The stations included: data interpretation with a follow-up station involving communication with an 'on-call consultant' by telephone; suturing of a simulated wound while interacting with the patient; and looking at radiological investigations followed by discussion about relevant pathology.

The Quincentenary Hall was a perfect venue, with sufficient space to accommodate the stations. There were plenty of staff on hand to help keep candidates moving in the right direction.

Overall, I thought the exam was a better representation of modern specialty training, and there is no surprise as to which format I prefer.
What's the Difference?

We are all aware of the terms curriculum and syllabus. However, we may find it difficult to describe the differences between them, since they are often used loosely or interchangeably. It is, however, important for trainers and trainees to have a common understanding of what they mean, the difference between them, and their relevance to training. This is particularly the case since the implementation of the Intercollegiate Surgical Curriculum Programme (ISCP) (www.iscp.ac.uk).

The Postgraduate Medical Education and Training Board (PMETB) provided definitions of both terms (visit: http://www.pmetb.org.uk/fileadmin/user/QA/Assessment/Assessment_system_guidance), which were adopted by each of the Specialist Advisory Committees (SACs) when they outlined their modernised training programmes for the ISCP.

A curriculum is a statement of the aims and intended learning outcomes of an educational programme. It states the rationale, content, organisation, processes and methods of teaching, learning, assessment, supervision and feedback. Essentially, it is a strategic document that provides teachers, educationalists and supervisors with insight into the range of activities within an educational programme and gives them an indication of the resources required to deliver it. It gives the learner a detailed overview of the breadth and depth of a training course, and can provide prospective trainees with detailed knowledge of what a programme will entail and so inform choice.

A syllabus is a list or summary description of course contents or topics that might be tested in examinations. It identifies important, relevant, measurable outcomes which usefully reflect the performance of learners and helps to prioritise learning.

In modern medical education a syllabus should not be regarded as an adequate substitute for a curriculum.

Ewan B Goudie, FY1, Orthopaedic Surgery
Chris Oliver, Chairman, ICBSE
Edinburgh Royal Infirmary

The ICBSE Office

The ICBSE office is based at the Royal College of Surgeons of England. It co-ordinates and manages the development of the Intercollegiate MRCS and DO-HNS examinations and related question banks on behalf of, and in conjunction with, the four Royal Surgical Colleges. It also supports the work of the Chairman and all members of the ICBSE committee and its sub-groups by producing documents and policy papers (including examination papers, OSCE scenarios, regulations and guidance notes) and ensuring the smooth running of the Intercollegiate Membership Examinations.

The ICBSE office consists of the following staff:

Greg Ayre (gayre@icbse.org.uk), MCQ Question Editor
Rezi Bocz (rbocz@icbse.org.uk), Intercollegiate MRCS Coordinator
Aisha Hashmi (ahashmi@icbse.org.uk), Systems Coordinator
Catherine McEvoy (cmcevoy@icbse.org.uk), MRCS OSCE Question Editor

Full details of the MRCS and DO-HNS examinations can be found at www.intercollegiatemrcs.org.uk.

If you have any comments on this newsletter, or would like to contribute to the next edition, please email: rbocz@icbse.org.uk