JOURNAL OF APPLIED CLINICAL MEDICAL PHYSICS, VOLUME 14, NUMBER 5, 2013

Contents of this issue:
- The future of medical physics in the US health-care system (M. D. Mills, editorial)
- AAPM Medical Physics Practice Guideline 1.a: CT Protocol Management and Review (AAPM)
- Evaluation of 3D diode array dosimetry systems for VMAT QA (Li et al.)
- Development of a real-time motion verification system (Park et al.)
The Journal of Applied Clinical Medical Physics (JACMP) publishes articles that help clinical medical physicists carry out their responsibilities more effectively and efficiently, with greater benefit to the patient. The journal was founded in 2000, is an open-access journal, and is published twice a month.

The future of medical physics in the US health-care system

This issue, I would like to digress from the usual focus of the JACMP and discuss with you some personal thoughts on the US health-care system and the future of medical physics under the Affordable Care Act. My credentials for discussing this topic include a newly minted PhD in Health Management from the School of Public Health at the University of Louisville. I spent many rewarding hours in classes, seminars, and lectures learning about the changes in US health care and public health over the past 11 years. This is a difficult topic to explore in the limited space of an editorial, but I would like to point out some themes we need to bear in mind as we look forward.

1. Efficiency — It is expected that the Affordable Care Act will add approximately 30 million people to the rolls of the insured. Health-care workers will likely be expected to absorb the associated additional work with only modest increases in employment. Keep in mind that hundreds of billions of dollars are being diverted from the Medicare Program over the next decade to fund the services provided to these additional patients and consumers. Now add to this dynamic the large number of retiring baby-boom health-care workers and the loss of a large base of knowledge and experience: it is projected that up to 30% of the health-care workforce could retire within the next five years. While younger workers are usually less expensive to providers, they may also be less efficient in delivering quality services. Health delivery systems that can deliver services at community-standard-of-care quality will be at a competitive advantage.
For example, if a cancer center can treat patients presenting with a certain disease in fewer fractions than another while reporting similar and competitive outcomes, the gatekeepers of the systems will likely seek out and reward the more efficient provider.

2. Training — Barriers are increasing for those seeking to enter the health-care workforce. Medical physics MS programs charge up to $20,000 per year in tuition. Student loans for this training carry an annual interest rate of 7.5%. Financing a medical school education is even more challenging: if a student borrows $200,000 to attend medical school and then completes a residency program, it may be years before the young physician can begin to make a dent in this liability. Yes, some physicians earn significant financial rewards, but such success is by no means guaranteed. The physicians of tomorrow may well face student loan debt larger than their mortgages. Although medical physics training pathways are shorter and less expensive, the financial barriers are both significant and growing.

3. Reimbursement — If you look at business trends, health-care providers are merging into enormous entities composed of multiple hospitals and clinics. One primary purpose of this trend may be to become large enough to negotiate directly with the primary payers. It is no secret that the Centers for Medicare and Medicaid Services (CMS) appear to be moving away from Current Procedural Terminology (CPT)-based fee-for-service reimbursement for outpatient health services. Each year we see more services bundled under the broader Ambulatory Patient Classification (APC) categories, with the apparent ultimate goal of assigning a dollar value to each International Classification of Disease (ICD-9 or ICD-10) patient diagnosis. There are dozens of ICD codes for breast cancer (to take just one example), and for most other major cancers you can name.
Once the relative values of each code are assigned, it will be possible to apply a single multiplier to the entire table to determine the reimbursement for any patient presentation. The multiplier would be determined by the funds available and by political realities. This approach could replace the current fee-for-service reimbursement system. The bottom line is that the value of the work performed by medical physicists could easily get lost in the consolidation of the reimbursement and payment system, along with the trend of health-care providers merging into ever larger entities. We will need to redouble our efforts to be visible and relevant.
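The single-multiplier scheme described above reduces to one line of arithmetic: reimbursement equals the relative value of the diagnosis code times a global multiplier set by the payer. A minimal sketch follows; the relative values and multipliers are invented for illustration (CMS assigns no such values today), though the ICD-10 codes themselves are real.

```python
# Hypothetical sketch of the diagnosis-based reimbursement model the
# editorial describes: each ICD diagnosis code carries a relative value,
# and a single system-wide multiplier converts relative value to dollars.
# All relative values and multipliers below are invented for illustration.

relative_values = {
    "C50.911": 10.0,  # breast cancer dx code; relative value is hypothetical
    "C61": 8.5,       # prostate cancer dx code; relative value is hypothetical
}

def reimbursement(icd_code: str, multiplier: float) -> float:
    """Dollar reimbursement = relative value of the diagnosis x one global multiplier."""
    return relative_values[icd_code] * multiplier

# Tightening the budget then means changing one number, not thousands of fees:
print(reimbursement("C50.911", 1200.0))  # 12000.0
print(reimbursement("C50.911", 1100.0))  # 11000.0
```

The point of the sketch is the editorial's: under such a system, total payment for every presentation moves with a single politically determined number.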
4. Competition — For example, a large not-for-profit US hospital chain and a hospital chain from an Asian nation are together building a hospital in the Cayman Islands. Large investment banks fund this multibillion-dollar project. Ultimately the plans include a two-thousand-bed hospital, a medical school, a research center, a biotechnology park, and an assisted living complex. The project intends to seek Joint Commission International accreditation. This is only one of many such projects either under construction or planned in the Caribbean and Central America. The funding for such projects comes from those who anticipate that many will be willing to travel for quality services offered at lower cost. Some aspects of the Cayman Islands project include the importation of health-care professionals from other countries with no additional requirements to practice, no taxes on imported capital equipment, and the purchase of equipment and supplies at the rates that hospital groups in Asia pay. Considering the proximity of the Cayman Islands to Florida, Texas, and the other Gulf Coast states, it seems very likely to me that those 2,000 beds might not be enough.

My conclusion from what I have seen in the literature and heard from my instructors is that the US health-care system will undergo enormous changes over the next few years. These will be times of uncertainty and struggle for all health professions. Although medical physicists are in some ways better equipped than others to weather the storm, we too might face significant pressures. Are we ready for the challenge?

Michael D. Mills
Editor-in-Chief
AAPM Medical Physics Practice Guideline 1.a: CT Protocol Management and Review Practice Guideline

The American Association of Physicists in Medicine (AAPM) is a nonprofit professional society whose primary purposes are to advance the science, education, and professional practice of medical physics. The AAPM has more than 8,000 members and is the principal organization of medical physicists in the United States.

The AAPM will periodically define new practice guidelines for medical physics practice to help advance the science of medical physics and to improve the quality of service to patients throughout the United States. Existing medical physics practice guidelines will be reviewed for the purpose of revision or renewal, as appropriate, on their fifth anniversary or sooner. Each medical physics practice guideline represents a policy statement by the AAPM, has undergone a thorough consensus process in which it has been subjected to extensive review, and requires the approval of the Professional Council. The medical physics practice guidelines recognize that the safe and effective use of diagnostic and therapeutic radiology requires specific training, skills, and techniques, as described in each document. Reproduction or modification of the published practice guidelines and technical standards by those entities not providing these services is not authorized.

1. Introduction

The review and management of computed tomography (CT) protocols is a facility's ongoing mechanism of ensuring that exams being performed achieve the desired diagnostic image quality at the lowest radiation dose possible while properly exploiting the capabilities of the equipment being used. Therefore, protocol management and review are essential activities in ensuring patient safety and acceptable image quality.
These activities have been explicitly identified as essential by several states,(1-2) by regulatory and accreditation groups such as the American College of Radiology (ACR) CT Accreditation program,(3) and by the Joint Commission in its Sentinel Event Alert,(4) among others. The AAPM considers these activities essential to any quality assurance (QA) program for CT, and an ongoing investment in improved quality of patient care.

CT exam protocols are used to obtain the diagnostic image quality required for the exam while minimizing radiation dose to the patient and ensuring proper utilization of the scanner's features and capabilities. Protocol Review refers to the periodic evaluation of all aspects of CT exam protocols. These aspects include acquisition parameters, patient instructions (e.g., breathing instructions), the administration and amounts of contrast material (intravenous, oral, etc.), and postprocessing parameters. Protocol Management refers to the process of review, implementation, and verification of protocols within a facility's practice.

This is a complex undertaking in the present environment. The challenges in optimizing dose and image quality are compounded by the lack of an automated mechanism to collect and modify protocols system-wide. The manual labor involved in identifying, recording, and compiling all relevant parameters of active protocols for review and subsequent implementation is not inconsequential.(5) The clinical community needs effective protocol management tools and efficient methods to replicate protocols across different scanners in order to ensure consistent protocol parameters. The ability to quickly view and understand the myriad CT protocol parameters contained within a single exam type is critical to the success of protocol review. The ability to quickly identify an outlier protocol parameter would also be hugely beneficial to the CT protocol review process.
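The outlier check described above is straightforward to prototype once protocols are exported as structured data. Below is a minimal sketch, not an AAPM or vendor tool; the scanner names, the exam, and all parameter values are invented for illustration. It flags any parameter whose value on one scanner differs from the most common setting for the same protocol across the fleet.

```python
# Minimal sketch of an automated protocol-consistency check: for one exam
# protocol loaded on several scanners, flag any parameter that deviates from
# the majority value. Scanner names and settings are hypothetical.
from collections import Counter

protocols = {  # an "adult routine head" protocol as loaded on three scanners
    "scanner_A": {"kVp": 120, "rotation_time_s": 1.0, "pitch": 0.53},
    "scanner_B": {"kVp": 120, "rotation_time_s": 1.0, "pitch": 0.53},
    "scanner_C": {"kVp": 140, "rotation_time_s": 1.0, "pitch": 0.53},
}

def flag_outliers(protocols):
    """Return {parameter: {scanner: value}} for every value that deviates
    from the most common setting across the scanners."""
    outliers = {}
    params = next(iter(protocols.values())).keys()
    for p in params:
        values = {name: prot[p] for name, prot in protocols.items()}
        most_common, _ = Counter(values.values()).most_common(1)[0]
        odd = {n: v for n, v in values.items() if v != most_common}
        if odd:
            outliers[p] = odd
    return outliers

print(flag_outliers(protocols))  # {'kVp': {'scanner_C': 140}}
```

A majority vote is only a screening heuristic: a flagged value still needs review by the team, since the outlier scanner may legitimately require different settings.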
This MPPG only applies to CT scanners used for diagnostic imaging. It is not applicable to scanners used exclusively for:
a. Therapeutic radiation treatment planning or delivery;
b. Calculating attenuation coefficients for nuclear medicine studies; or
c. Image guidance for interventional radiologic procedures.

2. Definitions

a. CT Protocol – the collection of settings and parameters that fully describe a CT examination.(6) Protocols may be relatively simple for some body-part-specific systems or highly complex for full-featured, general-purpose CT systems.(7)
b. Qualified Medical Physicist – as defined by AAPM Professional Policy 1.(8)

3. Staffing Qualifications and Responsibilities

a. The Protocol Review and Management Team

Protocol review and management requires a team effort; this team must consist of at least a lead CT radiologist, a lead CT technologist, and a qualified medical physicist (QMP). In addition, a senior member of the facility administration team should also be involved. This could be the Chief Medical or Administrative Officer for the facility, or a dedicated Radiology Department Administrator/Manager, as determined by hospital leadership. If a senior member of the facility administration team is not a member of the Protocol Review and Management Team, there should be a clear delineation of the reporting structure.

This team must be responsible for protocol design and for review of all parameter settings. Each team member brings different expertise and may have different responsibilities in the Protocol Review and Management process. To be successful, it is very important that the expected roles and responsibilities of each member be clearly described. The ability to work together as a team will be an important attribute of each member of this group.
The flow chart in Appendix A is an example of how team members may work together and in parallel during the process.(5) Additional examples of protocol management based on one facility's experience are discussed in References 9 and 10. The team members, their qualifications, and expectations are described below.

i. Qualified Medical Physicist (QMP)
The first Professional Policy of the AAPM provides a comprehensive definition of a Qualified Medical Physicist (QMP).(8) The subfield of medical physics applicable to CT Protocol Management is Diagnostic Medical Physics. As stated by the Policy, "a [QMP] is an individual who is competent to independently provide clinical professional services in one or more of the subfields of medical physics" and meets each of the following credentials:
a. "Has earned a master's or doctoral degree in physics, medical physics, biophysics, radiological physics, medical health physics, or equivalent disciplines from an accredited college or university; and
b. Has been granted certification in the specific subfield(s) of medical physics with its associated medical health physics aspects by an appropriate national certifying body and abides by the certifying body's requirements for continuing education."
c. For Diagnostic Medical Physics, the acceptable certifying bodies as of 2012 are: the American Board of Radiology, the American Board of Medical Physics, and the Canadian College of Physicists in Medicine.
Journal of Applied Clinical Medical Physics, Vol. 14, No. 5, 2013
ii. Responsibilities of the QMP
In the context of CT Protocol Management and Review, the QMP's responsibilities may vary depending on the type of facility being supported; regardless, the QMP must be involved in the review of all protocols. These considerations should be balanced with adequate response times to facility inquiries. A QMP's time at a facility should include, but not be limited to:
a. meeting with the CT Protocol Management and Review team;
b. clinical observation and phantom measurements;
c. side-by-side image review with radiologist(s);
d. artifact review with technologist(s) and/or radiologist(s); and
e. discussion of equipment performance and operation, etc.
While regular dialogue is important, the QMP should also remember that facility personnel themselves, in particular the Lead CT Radiologist, should lead the CT Protocol Management and Review process; the QMP is an integral member of the team. The QMP may elect to perform baseline dose measurements and image quality tests at the outset of the project, particularly if the QMP does not have personal historical experience with the scanner(s) in the facility.

iii. In-house QMP
For the in-house QMP, this ongoing CT protocol review project may consume much of his/her time, so the QMP should be sure to communicate adequately with his/her supervisor(s), with other team members, and with department/hospital management in this regard. The facility should understand that the CT Protocol Management and Review process is an ongoing investment in improved quality of patient care. In-house QMPs may be able to arrange more frequent meetings with CT Protocol Management and Review team members than their consulting colleagues; six to twelve meetings annually may be more appropriate for facilities with in-house QMPs, with the meeting frequency likely decreasing as time goes on and the facility's protocols are sufficiently improved.

iv.
Consulting QMP
It is important to note that CT Protocol Management and Review services are above and beyond normal QMP consulting services (e.g., the annual physics survey), which have traditionally been limited to image quality, dosimetry, and basic protocol review for a few selected examinations. Consultant QMPs should make this clear to their clients and negotiate their services appropriately. QMPs providing consulting services should maintain regular dialogue with the facility via convenient means (e.g., email, phone, and perhaps text message, if appropriate). It may be beneficial to use a communication process that provides a log of these interactions. It is recommended that the consulting QMP discuss with each facility access to images, including, but not limited to, remote access to the facility's Picture Archiving and Communication System (PACS) for improved consultative capabilities. Consulting QMPs should work with the facility to arrange mutually agreeable times to visit the facility for CT protocol portfolio review activities. Three to four visits annually may be reasonable.
v. Qualifications and Expectations of the Lead CT Technologist
The American Society of Radiologic Technologists (ASRT) has developed a practice standard entitled The Practice Standards for Medical Imaging and Radiation Therapy – Computed Tomography Practice Standards, effective June 19, 2011, which describes the education and certification requirements and scopes of practice for CT technologists.(11) The Lead CT Technologist is expected to provide the interface between the patient, the staff, and the equipment. This includes workflow, the assembly and management of the CT protocol portfolio, and education of the technologist pool.

vi. Qualifications of the CT Radiologist
Facilities should refer to the ACR for guidance on physician requirements, such as those for accreditation, in the Practice Guideline for Performing and Interpreting CT,(12) and in the CT Accreditation Program Requirements.(13) The CT radiologist leads the CT Protocol Management and Review process and defines image quality requirements.(14)

4. The Protocol Management Review Process
It is important that the CT Protocol Review and Management team design and review all new or modified protocol settings for existing and new scanners to ensure that both image quality and radiation dose are appropriate. Each member of the CT Protocol Management team has a critical role, related to his or her specific area of expertise, in the evaluation, review, and implementation of protocols. The following elements should be considered for inclusion in a specific facility's protocol review process: • • •

While performing the review process, the CT Protocol Management team should pay particular attention to the oversight and review of existing protocols, along with the evaluation and implementation of new and innovative technologies that can improve image quality and/or lower patient dose in comparison to the older protocol.
Particular attention should be paid to the specific capabilities of each individual scanner (e.g., minimum rotation time, automatic exposure controls including both tube current modulation and kV selection technologies, iterative reconstruction, reconstruction algorithms, etc.) to ensure that the maximum performance of the system is achieved. In addition, consideration should be given to consolidating protocols or removing legacy protocols that are no longer current or applicable. The review process should include a review of the most current literature, such as the ACR practice guidelines,(12) the AAPM protocol list,(7) and peer-reviewed journals, to ensure that state-of-the-art protocols are being utilized. The following considerations are important during review of a protocol:

a. Recommendations for State and National Guidance
Local, state, and federal laws and regulations vary greatly depending on the state in which the facility is located. The QMP must be familiar with applicable federal law and the specific requirements for the state or local jurisdiction where the facility is located. Protocol review and management, while not always explicitly required by state law or regulation, may often facilitate compliance with many provisions within state laws and regulations relating to radiation dose from CT. Links to applicable state regulations can be found at: http://www.aapm.org/government_affairs/licensure/default.asp.

b. Frequency of Review
The review process must be consistent with federal, state, and local laws and regulations. If there is no specific regulatory requirement, the frequency of protocol review
should be no less frequent than every 24 months. This review should include all new protocols added since the last review. However, the best practice would be to review a facility's most frequently used protocols at least annually.

c. Clinically Significant Protocols that Require Annual Review
For every facility there are protocols that are used frequently or could result in significant doses. If a facility performs the following six clinical protocols, the CT Protocol Review and Management team must review them annually (or more frequently if required by the state or local regulatory body). Facilities that do not perform all of the exams listed below must select additional protocols at their facility, either the most frequently performed or higher-dose protocols, to a total of at least six for annual review. The six clinical protocols requiring annual review are:
i. Pediatric Head (1 year old) (if performed at the institution)
ii. Pediatric Abdomen (5 year old; 40-50 lb. or approx. 20 kg) (if performed at the institution)
iii. Adult Head
iv. Adult Abdomen (70 kg)
v. High Resolution Chest
vi. Brain Perfusion (if performed at the institution)

d. Protocol Naming
A facility should consider naming CT protocols in a manner consistent with the RadLex Playbook ID.(15) This would provide a more consistent experience for patients and referring physicians, and allow more direct comparison among various facilities. This practice may also allow more direct utilization of the ACR Dose Index Registry(16) tools and provide more efficient automated processes with postprocessing workstations. The standardization of protocol names between scanners, even when the scanners are of different makes and models, is also strongly encouraged. Appropriate protocol naming will likely result in fewer technologist errors and allow more efficient comparison of protocol parameters between scanners.
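Naming consistency across a fleet can also be checked mechanically. The sketch below is illustrative only (the protocol names are invented for the example, not actual RadLex Playbook IDs): it reports names that exist on some scanners but not on others, which is one way naming drift between scanners shows up.

```python
# Illustrative sketch with hypothetical data: check that protocol names
# are standardized across scanners, as the guideline encourages.
def naming_discrepancies(scanner_protocols):
    """scanner_protocols: {scanner_name: set of protocol names}.

    Returns {protocol_name: sorted list of scanners missing it} for every
    name that is present on some scanners but absent on others.
    """
    all_names = set().union(*scanner_protocols.values())
    return {
        name: sorted(s for s, names in scanner_protocols.items()
                     if name not in names)
        for name in sorted(all_names)
        if any(name not in names for names in scanner_protocols.values())
    }

# Hypothetical fleet: CT2 carries a nonstandard abdomen protocol name.
fleet = {
    "CT1": {"CT HEAD WO IV CONTRAST v2013-01",
            "CT ABDOMEN W IV CONTRAST v2013-01"},
    "CT2": {"CT HEAD WO IV CONTRAST v2013-01",
            "ABD W CONTRAST"},
}
print(naming_discrepancies(fleet))
```

Each mismatch reported this way is either a genuinely missing protocol or a naming inconsistency; either finding is useful input to the review meeting.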
A facility should consider incorporating version dates into protocol names so that the latest approved version can be easily confirmed.

e. Permissions
i. It is important that each facility establish a process for determining who has permission to access the protocol management systems. Each facility should decide and document who has permission to change protocol parameters on the scanner(s). If the scanner allows password protection of protocols, the facility is encouraged to use this important safety feature. Facilities should also decide how passwords are protected and archived.
ii. Each facility should decide on the process for making protocol adjustments and the frequency with which these adjustments should be made. This includes decisions as to what approvals need to be secured before a protocol adjustment may be made, and the documentation process (e.g., a change control log documenting the rationale for each change, as well as who authorized or motivated the change).
iii. Each facility should consider how to most effectively utilize the NEMA XR 26 standard (Access Controls for Computed Tomography)(17) when these tools become available on scanners at their facility.

f. Acquisition parameters, including kV, mA, rotation time, collimation or detector configuration, pitch, etc., should be reviewed to ensure they are appropriate for the diagnostic image quality (noise level, spatial resolution, etc.) necessary for the clinical indication(s) for the protocol, while minimizing radiation dose. For example, a slow
rotation time and/or low pitch value would not be appropriate for a chest CT exam due to breath-hold issues.
i. The facility should explicitly review the expected Volume Computed Tomography Dose Index (CTDIvol) values. For the limited set of protocols where reference values are available, the CTDIvol values should be compared to the reference values of the ACR CT Accreditation Program,(3) Dose Reference Levels (DRLs),(18) AAPM CT Protocols,(7) or other available reference values for the appropriate protocols. Note: these reference values may be exceeded for individual patient scans (such as for a very large patient, when the routine protocol is not used because of a different clinical indication, or when the reference value refers to only a single pass in a multipass study).
ii. For a facility's routine protocol for a standard-sized patient, the expected CTDIvol values should be below these reference values.

g. Reconstruction parameters, such as the width of the reconstructed image (image thickness), the distance between two consecutive reconstructed images (reconstruction interval), the reconstruction algorithm/kernel/filter, and the use of additional image planes (e.g., sagittal or coronal planes), should also be reviewed to ensure the diagnostic image quality (noise level, spatial resolution, etc.) necessary for the clinical indication(s) for the protocol. For example, a high-resolution chest exam typically generates thin (~ 1 mm) images using a sharp reconstruction filter.

h. Advanced dose reduction techniques should be considered when the use of such techniques is consistent with the goals of the exam. Depending on the capabilities of each specific scanner, consider use of the following, if available:
i. Automatic exposure control (e.g., tube current modulation or automatic kV selection) methods.
ii. Iterative reconstruction techniques.

i.
Acquisition parameters should be adjusted for patient size, either through a series of manual adjustments or through the use of automatic techniques (such as tube current modulation methods that adjust for patient size).

j. Radiation dose management tools fall under two related but different categories, and may provide CT dose data that can be used to determine facility reference dose ranges.
i. Radiation dose management tools that identify when potentially high-radiation-dose scans are being prescribed should be implemented when available. This includes dose reporting and tracking software, participation in dose registries, and methods as described in the MITA XR 25 standard ("Dose Check").(19)
ii. Radiation dose management tools may be used to monitor doses and collect data from routine exams. Statistical analysis of dose parameter values for a specific exam or clinical indication (e.g., average CTDIvol for a routine noncontrast head) can be provided. Participation in a national registry (such as the ACR Dose Index Registry)(16) and commercial dose tracking products are now available for this purpose.

k. Populating Protocols Across Scanners
Each facility should decide on the process by which protocol parameters are populated across additional scanners (whether this is done manually or by copy/paste, if the scanners allow). The facility should decide whether there are 'master' or 'primary' scanners in the facility where manual protocol adjustments are to be made and archived,
and that set of protocols then moved to the other similar scanners, or whether another strategy will be employed.

l. Documentation
The CT Protocol Review and Management team should maintain documentation of all changes to protocols, and historical protocols should be available for review. Documentation should include the rationale for changes (e.g., improve temporal resolution, reduce breath-hold time, reduce patient dose, etc.). The latest protocol should be readily and obviously available to users during clinical protocol selection. In some settings it may be helpful to maintain historical protocols on the scanner, in a less conspicuous location or clearly labeled as legacy protocols. The facility should decide and document who is responsible for maintaining the overall protocol description documentation. The facility should also describe whether the protocol description documentation is accessible to others for reference, how often it is updated, and how all protocols (on the scanners as well as in the protocol description documentation) are archived.

m. Periodic Vendor-specific Education/Refresher Sessions
The CT Protocol Management Process team is responsible for ensuring that each member is adequately trained for protocol review on each scanner used at his or her facility. Each member of the CT Protocol Management Process team should receive refresher training no less than annually, or whenever new technology is introduced that substantially impacts image quality or dose to the patient.
i. Available educational resources should be considered in order to keep staff updated on current best practices.
ii. Periodic refresher training should be scheduled for all members of the CT Protocol Management Process team.
iii. Attendance should be taken at initial and all refresher training sessions, and consequences identified for failure to complete training.

n.
Verification
Once a CT Protocol Management Process has been established, the CT Protocol Review and Management team must institute a regular review process of all protocols to be sure that no unintended changes have been applied that may degrade image quality or unreasonably increase dose. As a best practice, the CT Protocol Review and Management team should conduct a random survey of specific exam types to verify that the protocols used are acceptable and consistent with the protocols specified above. This should involve a limited review of recent patient cases to assess:
i. Acquisition and reconstruction parameters,
ii. Image quality, and
iii. Radiation dose.

5. Conclusion
CT protocol management and review is an important part of a CT facility's operation and is considered important by many state regulatory bodies, accrediting organizations, and professional organizations. Protocol parameter control and periodic review will help maintain the facility's image quality at acceptable levels, and will serve to assure patient safety and continuous improvement in the imaging practice.
ACKNOWLEDGMENTS
This guideline was developed by the Medical Physics Practice Guideline Task Group-225 of the Professional Council of the AAPM.

TG-225 Members:
Dianna D. Cody, Chair, PhD, FAAPM
Tyler S. Fisher, MS
Dustin A. Gress, MS
Rick Robert Layman, Jr., MS
Michael F. McNitt-Gray, PhD, FAAPM
Robert J. Pizzutiello, Jr., MS, FAAPM
Lynne A. Fairobent, AAPM Staff

AAPM Subcommittee on Practice Guidelines – AAPM Committee responsible for sponsoring the draft through the process:
Joann I. Prisciandaro, PhD, Chair
Maria F. Chan, PhD, Vice-Chair Therapy
Jessica B. Clements, MS
Dianna D. Cody, PhD, FAAPM
Indra J. Das, PhD, FAAPM
Nicholas A. Detorie, PhD, FAAPM
Vladimir Feygelman, PhD
Jonas D. Fontenot, PhD
Luis E. Fong de los Santos, PhD
David P. Gierga, PhD
Kristina E. Huffman, MMSc
David W. Jordan, PhD
Ingrid R. Marshall, PhD
Yildirim D. Mutaf, PhD
Arthur J. Olch, PhD, FAAPM
Robert J. Pizzutiello Jr., MS, FAAPM, FACMP, FACR
Narayan Sahoo, PhD, FAAPM
J. Anthony Seibert, PhD, FAAPM, FACR
S. Jeff Shepard, MS, FAAPM, Vice-Chair Imaging
Jennifer B. Smilowitz, PhD
James J. VanDamme, MS
Gerald A. White Jr., MS, FAAPM
Ning J. Yue, PhD, FAAPM
Lynne A. Fairobent, AAPM Staff

References
1. AB 510, Radiation control: health facilities and clinics: records. 2011-2012 Reg. Sess. (CA 2012). An act to amend Sections 115111, 115112, and 115113 of the Health and Safety Code, relating to public health, and declaring the urgency thereof, to take effect immediately. Available from: http://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=201120120AB510&search_keywords
2. Rules and Regulations – Radiation Control Program [Internet]. Austin, TX: Texas Department of State Health Services (DSHS). Last updated 25 July 2013. Available from: http://www.dshs.state.tx.us/radiation/rules.shtm#227
3. American College of Radiology. CT Accreditation Program. Available from: http://www.acr.org/Quality-Safety/Accreditation/CT
4.
The Joint Commission Sentinel Event Alert: Radiation risks of diagnostic imaging. Issue 47, August 24, 2011. Available from: http://www.jointcommission.org/assets/1/18/SEA_47.pdf
5. Siegelman JR, Gress DA. Radiology stewardship and quality improvement: the process and costs of implementing a CT radiation dose optimization committee in a medium-sized community hospital system. J Am Coll Radiol. 2013;10(6):416–22.
6. AAPM. CT Lexicon, ver. 1.3, 04/20/2012. Available from: http://www.aapm.org/pubs/CTProtocols/documents/CTTerminologyLexicon.pdf
7. AAPM. CT scan protocols. Available from: http://www.aapm.org/pubs/CTProtocols/
8. AAPM. Professional Policy Statement, PP 1-H: Definition of a Qualified Medical Physicist. Available from: http://www.aapm.org/org/policies/details.asp?id=316&type=PP
9. Siegle C, Kofler JM, Torkelson JE, Leitzen SL, McCollough CH. CT scan protocol management. Presented at the Radiological Society of North America Scientific Assembly and Annual Meeting, Chicago, Illinois, November 2004.
10. Kofler JM, McCollough CH, Vrieze TJ, Bruesewitz MR, Yu L, Leng S. Team-based methods for effectively creating, managing and distributing CT protocols. Presented at the Radiological Society of North America Scientific Assembly and Annual Meeting, Chicago, Illinois, November 2010.
11. ASRT. The practice standards for medical imaging and radiation therapy – computed tomography practice standards (effective June 19, 2011). Available from: http://www.asrt.org/main/standards-regulations/practice-standards/practice-standards
12. ACR. ACR practice guideline for performing and interpreting diagnostic computed tomography (CT). Available from: http://acr.org/~/media/ACR/Documents/PGTS/guidelines/CT_Performing_Interpreting.pdf
13. ACR. CT accreditation program requirements. Available from: http://acr.org/~/media/ACR/Documents/Accreditation/CT/Requirements.pdf
14. ACR. CT accreditation program clinical image quality guide. Available from: http://www.acr.org/~/media/ACR/Documents/Accreditation/CT/ImageGuide.pdf
15. RSNA. RadLex playbook.
Available from: http://rsna.org/RadLex_Playbook.aspx
16. ACR. National radiology data registry: dose index registry [website]. Available from: www.acr.org/nrdr
17. NEMA. Access controls for computed tomography: identification, interlocks, and logs. NEMA XR 26-2012. Rosslyn, VA: National Electrical Manufacturers Association; 2012.
18. McCollough C, Branham T, Herlihy V, et al. Diagnostic reference levels from the ACR CT Accreditation Program. J Am Coll Radiol. 2011;8(11):795–803.
19. NEMA. Computed tomography dose check. NEMA XR 25-2010. Rosslyn, VA: National Electrical Manufacturers Association; 2010.
APPENDIX
Appendix A: Example of how team members may work together and in parallel during the process.
JOURNAL OF APPLIED CLINICAL MEDICAL PHYSICS, VOLUME 14, NUMBER 5, 2013

Evaluation of the sensitivity of two 3D diode array dosimetry systems to setup error for quality assurance (QA) of volumetric-modulated arc therapy (VMAT)

Guangjun Li,1,2 Sen Bai,1,2 Nianyong Chen,1a Lansdale Henderson,3 Kui Wu,1,2 Jianghong Xiao,1,2 Yingjie Zhang,1,2 Qingfeng Jiang,1,2 Xiaoqin Jiang1,2
Department of Radiation Oncology,1 Cancer Center, West China Hospital, Sichuan University, Chengdu, Sichuan, China; Center for Radiation Physics and Technology,2 Cancer Center, West China Hospital, Sichuan University, Chengdu, Sichuan, China; Department of Neuroscience,3 University of Virginia, Charlottesville, VA, USA
nchenyy@gmail.com

Received 28 November, 2011; accepted 8 April, 2013

The purpose of this study is to evaluate the sensitivity of 3D diode arrays to setup error for patient-specific quality assurance (QA) of volumetric-modulated arc therapy (VMAT). Translational setup errors of ± 1, ± 2, and ± 3 mm in the RL, SI, and AP directions and rotational setup errors of ± 1° and ± 2° in the pitch, roll, and yaw directions were introduced in two phantom systems, ArcCHECK and Delta4, with VMAT plans for 11 patients. Cone-beam computed tomography (CBCT) followed by automatic correction using a HexaPOD 6D treatment couch ensured positioning accuracy. Dose distributions of the two phantoms were compared in order to evaluate the agreement between calculated and measured values using γ analysis with 3%/3 mm, 3%/2 mm, and 2%/2 mm criteria. To determine the impact of setup error on VMAT QA, we evaluated the sensitivity of the results acquired by both 3D diode array systems to setup errors in translation and rotation. For the VMAT QA of all patients, the pass rate with the 3%/3 mm criteria exceeded 95% using either phantom.
For setup errors of 3 mm and 2°, respectively, the pass rates with the 3%/3 mm criteria decreased by a maximum of 14.0% and 23.5% using ArcCHECK, and 14.4% and 5.0% using Delta4. Both systems are sensitive to setup error, and neither has a mechanism in its software to account for setup errors. The sensitivity of both VMAT QA systems was strongly dependent on the patient-specific plan. The sensitivity of ArcCHECK to rotational error was higher than that of Delta4. In order to achieve less than a 3% mean pass-rate reduction for VMAT plan QA with the 3%/3 mm criteria, a setup accuracy of 2 mm/1° and 2 mm/2° is required for the ArcCHECK and Delta4 devices, respectively. The cumulative effect of combined 2 mm translational and 1° rotational errors caused 3.8% and 2.4% mean pass-rate reductions with the 3%/3 mm criteria for the ArcCHECK and Delta4 systems, respectively. For QA of VMAT plans for nasopharyngeal cancer (NPC) using the ArcCHECK system, the setup should be more accurate.

PACS numbers: 87.55.ne, 87.55.Qr, 87.55.km
Key words: VMAT, setup error, patient-specific QA, 3D diode array

a Corresponding author: Nianyong Chen, Department of Radiation Oncology, Cancer Center, Sichuan University West China School of Medicine/West China Hospital, 37 Guoxuexiang, Wuhou District, Chengdu, Sichuan 610041, P.R. China; phone: (86) 28 8542 2952; fax: (86) 28 8542 2952; email: nchenyy@gmail.com
I. Introduction
Volumetric-modulated arc therapy (VMAT) is a new intensity-modulated radiotherapy (IMRT) technology with single or multiple gantry arcs that achieves appropriate dose-target conformity and permits critical organ sparing. VMAT delivers radiation via dynamic multileaf collimator (MLC) motion, and allows for variable dose rates, gantry speed modulation, and collimator rotation.(1) Thus far, VMAT has been used to treat various tumor sites, including head and neck,(2-5) lung,(5-8) prostate,(4,5,9-12) rectum,(5,13) cervix uteri,(14) spinal metastases,(15,16) and brain metastases.(17) However, the dose calculation and the implementation of VMAT plans are highly complex. It is, therefore, essential to perform patient-specific quality assurance (QA) of VMAT plans.(18)

Various types of dosimetry systems exist for dose verification, including gel dosimetry,(19,20) water phantoms with film and ion chambers,(21) online 2D detector arrays,(22-25) Monte Carlo-based frameworks,(26) and 3D diode arrays.(27-29) With the exception of the 3D diode arrays, these QA systems are limited either by single-plane measurements of the dose distribution or by increased off-line processing time for measured data.(27) Two commercial 3D diode array systems, ArcCHECK (Sun Nuclear, Melbourne, FL) and Delta4 (ScandiDos AB, Uppsala, Sweden), have been applied for dose verification in IMRT and VMAT.

In our clinical practice, we used ArcCHECK and Delta4 for QA of VMAT plans. During QA measurements, the phantoms were positioned such that the crosslines on the surface aligned with the room lasers. In practice, we noted that registration errors (e.g., 2 mm) between the room lasers and the radiation isocenter of the linacs contributed to phantom positioning error, and thereby noticeably influenced the dose verification of VMAT.
However, unlike most other commercial systems, the analysis software available with the ArcCHECK and Delta4 systems is unable to correct for positioning errors. Therefore, it is crucial to determine the sensitivity of the 3D detector arrays to setup error for QA of VMAT plans. Letourneau et al.(27) assessed the sensitivity of the prototype of the ArcCHECK dosimetry system to translational phantom setup error in the right–left and anterior–posterior directions, and the results demonstrated that the diode array's sensitivity to setup error is strongly dependent on the patient-specific VMAT plan. However, the effect of rotational setup error on ArcCHECK and other 3D detector arrays with various detector positions is not fully understood. In this study, we examined the sensitivities of ArcCHECK and Delta4 to translational and rotational setup errors in all directions for patient-specific QA of VMAT plans.

II. Materials and Methods

A. Patients' plan selection
Eleven patients requiring VMAT plans of differing complexity for cancers, including esophageal, prostate, cervix uteri, rectal, and nasopharyngeal cancer (NPC), were selected for this study. The VMAT plans were designed using a commercial 3D treatment planning system (Pinnacle v9.0, Philips Medical, Madison, WI) with the SmartArc optimization algorithm.(30) Patients' characteristics and planning states are summarized in Table 1. The plans for NPC had two full arcs with one control point per 4°, and the plans for the other cancer sites had only one full arc with one control point per 4°. All TPS calculations in this study were done with a dose grid resolution of 2 mm.
Table 1. Patient characteristics and planning states. Using the SIB technique, two or three dose levels are defined for each patient, except for patients with rectal cancer.

Disease Site    Number of Patients    Dose Levels    Monitor Units    Average Field Width (mm)    Average Leaf Travel(a) (mm)    Average Leaf Travel Speed(b) (mm/s)
Esophageal      3                     2              497±108          37±5                        579±75                        3.8±0.6
Prostate(c)     1                     2              1316             51                          848                           5.4
Cervix uteri    1                     2              906              49                          818                           5.0
Rectal          3                     1              1259±136         82±8                        610±57                        4.2±0.3
NPC(d)          3                     3              668±65           26±1                        1418±121                      4.5±0.3

(a) Average leaf travel is calculated over all leaves, excluding leaves which remain closed over all treatment.
(b) Average leaf travel speed is average leaf travel divided by delivery time.
(c) The prostate plan is a whole pelvis and prostate boost plan.
(d) The treatment volume for NPC included all sites: primary, upper, and lower neck with a single VMAT plan.

B. Measurement devices
Two 3D dosimetry systems, ArcCHECK and Delta4, were used for measurements. The ArcCHECK dosimetry system(27,31) consists of 1386 diodes, each with a 0.8 × 0.8 mm2 active measuring area, embedded in the cylindrical wall of the phantom. The Delta4 dosimetry system is based on two crossing arrays comprising 1069 diodes in a fixed cylindrical geometry, providing full coverage of the cross section for any beam direction.(28,29) The spatial locations of the detectors differ between the two dosimetry systems. The dose distribution measured by ArcCHECK forms a cylindrical distribution with a diameter of 21 cm, typically positioned in the region surrounding the tumor target volume. In the Delta4 system, the dose distribution is measured on two intersecting perpendicular planes that cut through the tumor target volume.

C. Delivery and patient-specific QA
The 11 VMAT plans were transferred to the reference CT images of the ArcCHECK and Delta4 phantoms and the dose distributions were recalculated.
Both 3D diode arrays were placed on the HexaPOD 6D robotic treatment couch (Elekta, Crawley, UK) for measurements. Beam attenuation by the treatment couch was accounted for in the plans by generating a couch model from the contour of the couch together with its density information. All tests were carried out on an Elekta Synergy accelerator at a nominal energy of 6 MV X-rays with a 1 cm leaf width MLCi and an RTD 7.01 controller system (Elekta). Before recording QA measurements, we performed quality control (QC) of the linac according to the TG-142 report,(32) ensuring that the coincidence of the lasers with the isocenter was within a radius of 1 mm. To minimize setup error, cone-beam computed tomography (CBCT) and the HexaPOD robotic treatment couch (HRTC) were used to set up the phantoms with residual errors of 0.5 mm and 0.5°, as described in the studies by Sharpe et al.(33) and Meyer et al.(34) The reference CT images were acquired on a CT scanner with a slice thickness of 1 mm, and the resulting CBCTs had a voxel resolution of 0.5 mm in all three dimensions of the reconstructed images. Registration between the reference CT and the CBCT was carried out automatically using the gray value match method built into XVI.

D. Setup error simulation
The ArcCHECK and Delta4 phantoms were translated in the right–left (RL), anterior–posterior (AP), and superior–inferior (SI) directions by ± 1, ± 2, and ± 3 mm and rotated in the pitch, roll, and yaw directions by ± 1° and ± 2° using the 6D treatment couch. Figure 1 shows the definition of pitch, roll, and yaw used in this study. The 11 VMAT plans were separately delivered to each of the two phantoms for dose verification; in total, 31 measurements (1 without positional error, 18 with translational errors, and 12 with rotational errors) were performed for each patient plan with each dosimetry system. For the combined
reproducibility of setup and measurement, the above procedure was repeated three times for one rectal cancer case with the ArcCHECK and Delta4 systems, respectively. The intervals between the reproducibility tests were more than one month.

Fig. 1.  Coordinate system used in the study. Arrows indicate positive rotation with respect to each axis.

We compared the measured dose distributions of each array with the calculated dose distributions generated by the planning system in order to analyze the effect of phantom setup errors on VMAT QA. We also analyzed the cumulative effect of combined 2 mm translational and 1° rotational errors, estimated with the quadratic summation method.(32) The pass rate of γ analysis was computed by comparing the calculated and measured dose distributions using the 3%/3 mm, 3%/2 mm, and 2%/2 mm criteria, respectively. Diode readings lower than 10% of the highest diode signal were ignored in the analysis. These ignored readings reflect low-dose, low-gradient regions, typically located under the jaws, where the diode response is less reliable and the signal-to-noise ratio is a concern.(35) The paired Student's t-test was used to compare the results obtained from ArcCHECK and Delta4. All tests were two-tailed, with p < 0.05 considered the threshold for statistical significance. Statistical analysis was performed with SPSS (v.14.0, Chicago, IL).

III. Results

A. QA for VMAT patients' plans
QA results of the 11 VMAT plans tested with ArcCHECK and Delta4 are shown in Table 2. All pass rates of γ analysis with the 3%/3 mm criteria are higher than 95% for both diode arrays. Except for NPC, all pass rates of γ analysis with the 3%/2 mm and 2%/2 mm criteria are higher than 95% and 90%, respectively. The mean pass rate of γ analysis with the 2%/2 mm criteria for NPC by ArcCHECK was 84.7%.
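The γ analysis used throughout the Results can be illustrated with a simplified one-dimensional global gamma computation. Commercial QA software performs this search in 2D/3D with sub-grid interpolation, so this is only a minimal sketch; the 10% low-dose cutoff mirrors the one described above:

```python
import numpy as np

def gamma_pass_rate(dose_ref, dose_meas, spacing_mm,
                    dose_crit=0.03, dist_crit_mm=3.0, threshold=0.10):
    """Simplified 1D global gamma analysis (default 3%/3 mm).

    dose_ref:  calculated (reference) dose profile
    dose_meas: measured dose profile sampled on the same grid
    Points below `threshold` of the measured maximum are excluded,
    mirroring the 10% cutoff used in the study.
    """
    dose_ref = np.asarray(dose_ref, dtype=float)
    dose_meas = np.asarray(dose_meas, dtype=float)
    x = np.arange(dose_ref.size) * spacing_mm
    dmax = dose_ref.max()  # global normalization
    eval_idx = np.flatnonzero(dose_meas >= threshold * dose_meas.max())
    gammas = []
    for i in eval_idx:
        dd = (dose_ref - dose_meas[i]) / (dose_crit * dmax)  # dose-difference term
        dx = (x - x[i]) / dist_crit_mm                       # distance-to-agreement term
        gammas.append(np.min(np.sqrt(dd ** 2 + dx ** 2)))
    return 100.0 * np.mean(np.asarray(gammas) <= 1.0)
```

With the default 3%/3 mm criteria, a profile compared against itself, or against a copy shifted by less than the distance criterion, passes at 100%.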
The lower results for NPC compared to the other cancer sites are due to the complexity of the target volume and the differences in the geometrical positions of the diodes in ArcCHECK and Delta4 (see Fig. 2). A significant difference in the pass rate with the 3%/3 mm criteria between the two systems (p = 0.004) indicates that the pass rates of Delta4 are higher than those of ArcCHECK.
Table 2.  QA results of the 11 VMAT plans using ArcCHECK and Delta4 phantoms, obtained without introducing any setup error.

                  γ (%)(a) of ArcCHECK              γ (%)(a) of Delta4
Cancer Site     3%/3 mm   3%/2 mm   2%/2 mm    3%/3 mm   3%/2 mm   2%/2 mm
Esophageal      98.5±0.3  96.8±0.4  91.2±0.7   99.7±0.3  98.2±1.0  95.3±1.7
Prostate        98.7      97.5      92.9       98.6      96.5      90.6
Cervix uteri    98.5      97.2      91.4       99.2      98.0      92.9
Rectal          98.8±0.9  97.2±1.8  93.6±2.4   99.4±0.4  97.8±1.2  95.8±1.0
NPC             95.6±0.8  92.5±0.9  84.7±1.0   98.5±0.5  95.3±1.9  90.1±2.0

(a) Gamma (γ) results are the percentage of points passing the 3%/3 mm, 3%/2 mm, and 2%/2 mm criteria, respectively.

Fig. 2.  The two systems measure different sections of the dose distribution because of the geometrical positions of the diodes: ArcCHECK and Delta4 diodes are positioned on the circular line and the cross lines, respectively.

B. Sensitivities of the two diode arrays to translational setup error
Figure 3 shows the impact of translational setup error on the 11 patient-specific VMAT QA plans. Setup error was separately introduced in the RL, SI, and AP directions, and the impact was measured using ArcCHECK and Delta4. When the translational setup errors were ± 1, ± 2, and ± 3 mm, respectively, the pass rates of γ analysis with the 3%/3 mm criteria decreased by a maximum of 2.5%, 6.4%, and 14.0% for ArcCHECK and 2.5%, 6.9%, and 12.2% for Delta4 in the RL direction; 6.1%, 8.4%, and 13.4% for ArcCHECK and 1.6%, 6.3%, and 14.4% for Delta4 in the SI direction; and 2.0%, 4.5%, and 9.5% for ArcCHECK and 1.7%, 5.1%, and 10.5% for Delta4 in the AP direction. To further test the difference in sensitivity to setup error between the two dosimetry systems, we compared the reductions in the γ pass rate with the 3%/3 mm criteria in each direction.
Significant differences in the pass rate of γ analysis in the RL and SI directions (p = 0.019 and p < 0.001, respectively) indicate a higher sensitivity of the ArcCHECK diodes than the Delta4 diodes to translational setup error in these directions; however, no significant difference was observed between the two systems in the AP direction (p = 0.074).
Fig. 3.  The impact of translational setup errors on dosimetric verification of the 11 VMAT plans using ArcCHECK and Delta4 phantoms in the (a) RL, (b) SI, and (c) AP directions. The simulated translational setup errors are 1, 2, 3, -1, -2, and -3 mm, respectively. The decreases in the pass rate of γ analysis from the original results are assessed with the 3%/3 mm criteria.

The results also indicated that the pass rate of γ analysis was most affected by translation in the RL and AP directions for NPC and esophageal cancer, but only by translation in the SI direction for prostate cancer. For ArcCHECK, the maximum standard deviations (error bars in Fig. 3), calculated over the three cases of each disease site for each setup error, were 4.0%, 3.3%, and 3.1%, respectively, for NPC, esophageal, and rectal cancer; for Delta4 they were 3.8%, 4.5%, and 4.3%. These results indicate that the effects of translational setup errors on VMAT QA are strongly dependent on the patient-specific plan, even within the same disease site.

C. Sensitivities of the two diode arrays to rotational setup error
Figure 4 shows the impact of rotational setup error on the 11 patient-specific VMAT QA plans. Setup error was separately introduced in the pitch, roll, and yaw directions, and the impact was measured using ArcCHECK and Delta4. When the rotational setup errors were ± 1° and ± 2°, respectively, the pass rates of γ analysis with the 3%/3 mm criteria decreased by a maximum of 5.5% and 9.9% for ArcCHECK and 2.5% and 5.0% for Delta4 in the pitch direction; 5.2% and 19.2% for ArcCHECK and 1.8% and 4.9% for Delta4 in the roll direction; and 8.4% and 23.5% for ArcCHECK and 1.7% and 4.9% for Delta4 in the yaw direction.
Significant differences between the two systems in all rotation directions (p < 0.001, p = 0.001, and p < 0.001 in the pitch, roll, and yaw directions, respectively) indicate that the configuration of the ArcCHECK
system is more sensitive to rotational setup error than the Delta4 system when determining VMAT QA, and emphasize the importance of accurate rotational positioning during measurements using ArcCHECK.

Fig. 4.  The impact of rotational setup errors on dosimetric verification of the 11 VMAT plans using ArcCHECK and Delta4 phantoms in the (a) pitch, (b) roll, and (c) yaw directions. The simulated rotational setup errors are 1°, 2°, -1°, and -2°, respectively. The decreases in the pass rate of γ analysis, compared to the original results, are assessed with the 3%/3 mm criteria.

From the results gathered by the two systems, we observed the greatest impact on the pass rate of γ analysis with the 3%/3 mm criteria in all directions for NPC and esophageal cancer. For ArcCHECK, the maximum standard deviations (error bars in Fig. 4) were 5.8%, 3.4%, and 2.5%, respectively, for NPC, esophageal, and rectal cancer; for Delta4 they were 1.1%, 2.2%, and 1.6%. These results indicate that the effects of rotational setup errors on VMAT QA are strongly dependent on the patient-specific plan, even within the same disease site.

D. Influence of setup error on the pass rate of γ analysis with various criteria
Table 3 shows the impact of translational and rotational setup errors on the pass rate of γ analysis with various criteria for ArcCHECK and Delta4. Stricter gamma criteria resulted in a greater impact of the setup error on the pass rate of γ analysis. For a translational setup error of 3 mm, the pass rates of γ analysis with the 2%/2 mm criteria decreased by an average of 13.2% ± 5.5% for ArcCHECK and 14.6% ± 6.7% for Delta4. For a rotational setup error of 2°, the pass rates of γ analysis with the 2%/2 mm criteria decreased by an average of 14.5% ± 6.6% for ArcCHECK and 7.0% ± 3.7% for Delta4.
Table 3.  Decrease (average ± standard deviation) in the pass rate of γ analysis with various criteria for translational and rotational setup errors.

                        ArcCHECK                        Delta4
Setup Error    3%/3 mm   3%/2 mm   2%/2 mm    3%/3 mm   3%/2 mm   2%/2 mm
Translation
  1 mm         0.7±1.0   1.4±1.5   2.0±2.0    0.6±0.6   1.6±1.5   2.1±2.2
  2 mm         2.8±1.7   5.0±2.9   6.7±3.5    2.3±1.9   5.4±3.3   7.3±4.2
  3 mm         6.9±3.3   10.0±4.9  13.2±5.5   6.1±3.8   11.1±5.5  14.6±6.7
Rotation
  1°           1.8±1.9   3.7±2.9   4.9±3.7    0.5±0.8   1.5±1.5   2.5±2.6
  2°           8.6±4.7   12.1±5.5  14.5±6.6   2.4±1.7   5.0±2.9   7.0±3.7

E. Combined reproducibility of setup and measurement
Figure 5 shows the standard deviation of pass rates with the 3%/3 mm criteria for the reproducibility tests for one rectal cancer case with the ArcCHECK and Delta4 systems, respectively. The mean standard deviations over all reproducibility tests in this case were 0.59% for ArcCHECK and 0.44% for Delta4. The mean standard deviations for 1, 2, and 3 mm translational setup errors, respectively, were 0.32%, 0.65%, and 0.83% for ArcCHECK and 0.12%, 0.41%, and 0.88% for Delta4; for 1° and 2° rotational setup errors, respectively, they were 0.41% and 0.75% for ArcCHECK and 0.29% and 0.54% for Delta4.

Fig. 5.  The standard deviation of pass rates with the 3%/3 mm criteria for the reproducibility tests for one rectal cancer case with the ArcCHECK and Delta4 systems, respectively. The tests contain all simulated (a) translational and (b) rotational setup errors for this case.
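The reproducibility figures above are standard deviations of the pass rate over repeated setup-and-measure cycles (three repeats per error setting). A trivial sketch, assuming the sample standard deviation (the paper does not state whether the sample or population form was used):

```python
import statistics

def reproducibility_sd(pass_rates):
    """Sample standard deviation of gamma pass rates from repeated
    setup-and-measurement cycles of the same plan and error setting."""
    return statistics.stdev(pass_rates)

# e.g., three hypothetical repeated deliveries of one plan:
sd = reproducibility_sd([97.8, 98.4, 98.1])  # pass rates in percent
```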
F. Cumulative effect of both translational and rotational error
Figure 6 shows the cumulative impact of 2 mm translational and 1° rotational setup errors on the 11 patient-specific VMAT QA plans using ArcCHECK and Delta4, respectively. For the ArcCHECK system, the average decreases in the pass rate of γ analysis with the 3%/3 mm criteria over all directions were 3.4%, 3.1%, 3.3%, 2.9%, and 5.6%, respectively, for esophageal, prostate, cervix uteri, rectal, and nasopharyngeal cancer; for Delta4 the average decreases were 2.9%, 3.3%, 1.7%, 1.4%, and 3.0%, respectively.

Fig. 6.  The cumulative impact of 2 mm translational and 1° rotational errors, evaluated as the reduction of the pass rate with the 3%/3 mm criteria for both the ArcCHECK and Delta4 systems. The average and standard deviation of the pass rate reduction were calculated over the combinations of translational and rotational errors in all directions.

IV. Discussion
Patient-specific dosimetric verification has been indispensable for IMRT QA, and achieving accurate QA results is critical for detecting discrepancies between delivery and planning. The characteristics of Delta4, as reported by Korreman et al.,(36) were determined by comparing consecutive deliveries of the same plan. The results indicated strong agreement in all cases for the accumulated dose, with dose deviations < 1% for all measurement points and cases. Letourneau et al.(27) assessed the combined reproducibility of the ArcCHECK dosimeter response and the linear accelerator (Elekta Synergy) for VMAT by repeatedly delivering head and neck plans. The results demonstrated strong performance and stability of both systems. In our study, we tested the combined reproducibility of setup and measurement. The mean standard deviations were 0.59% (0.06%–1.27%) for ArcCHECK and 0.44% (0.06%–1.06%) for Delta4, which shows good reproducibility.
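The text does not spell out the quadratic summation method used for the cumulative effect; a common reading is a root-sum-square combination of the pass-rate reductions from independent error components, sketched here under that assumption:

```python
import math

def quadratic_sum(reductions):
    """Root-sum-square combination of pass-rate reductions from independent
    setup-error components (an assumed reading of the 'quadratic summation
    method' referenced in the Methods; not the authors' actual code)."""
    return math.sqrt(sum(r * r for r in reductions))

# e.g., combining hypothetical reductions from a 2 mm translation
# and a 1 degree rotation along one axis each:
combined = quadratic_sum([2.8, 1.8])
```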
However, our results were slightly worse than those of the reports above, because the pass rate varied more strongly as the setup error increased. The dosimetric verification in this study indicated that the QA of VMAT assessed by both ArcCHECK and Delta4 met therapeutic quality requirements. However, the pass rate of γ analysis for Delta4 was higher than that for ArcCHECK in the measurements without setup error (p = 0.004). The reasons for the differences are as follows. First, the ArcCHECK and Delta4 dosimetry systems have very different spatial locations of their diode detectors. Thus, each system measures a different section of the total dose distribution and samples different dose gradients (see Fig. 2). Compared to Delta4, the dose distributions sampled by ArcCHECK are more complex. Moreover, ArcCHECK is more sensitive to dose delivery errors and the angular discretization effect because of its diode locations.(37) Second, the dose error normalization was different: given the different arrangements of the detectors, a different normalization method was used for each device. The Delta4 results were normalized at the isocenter, while the ArcCHECK results were normalized to the
maximum measured dose in the detector ring. Moreover, the 10% dose cutoff threshold, below which a voxel is excluded from analysis, may not mean the same on the ArcCHECK diode surface as it does on the Delta4 diode planes.(37) In all setup error simulations, the two 3D diode arrays exhibited marked sensitivity to translational and rotational setup errors in all axes for patient-specific QA of VMAT plans, with a strong dependence on the patient-specific plan. In general, we observed an impact of translational setup error on the QA results of complex VMAT plans and target volumes such as NPC, a cancer involving the upper and lower neck regions. We also observed a marked influence of rotational setup error on the QA results of VMAT plans with long target volumes, such as esophageal cancer. In this paper, we tested three cases for each of the sites of NPC, esophageal, and rectal cancer. Despite sharing a cancer site, the cases in each triplet differed in their sensitivity to setup errors because of variation in the patient-specific plans. Letourneau et al.(27) assessed the sensitivity of the prototype ArcCHECK dosimetry system to phantom setup error after CBCT image-guided setup. They found, specifically, that the diodes' sensitivity to setup error in the RL and AP directions was highly plan-dependent; the direction of the steepest dose gradients for a given plan did not necessarily correspond with the direction of the phantom setup error. In addition, the results in Figs. 3 and 4 for both sets of setup errors show differences on the order of 3%–4% in the reduction rates between the negative and positive setup errors. The differences are mainly due to two reasons. Firstly, residual setup errors of up to 0.5 mm and 0.5° were still present even when CBCT and the HRTC were used.
Secondly, plan-specific features, such as the dose gradient orientation, differed along each translational and rotational setup direction. On the other hand, the respective sensitivities of ArcCHECK and Delta4 to setup error were not uniform. Though the diode arrays demonstrated similar sensitivity to translational setup error, ArcCHECK was slightly more sensitive than Delta4 for the 3%/3 mm gamma criteria, likely because it measures a section of the dose distribution with more dose gradients. In addition, ArcCHECK diodes were more sensitive to rotational setup error than Delta4 diodes, due to the difference in the spatial locations of the two 3D diode arrays: for the same rotational setup error, the diodes of the ArcCHECK system shift a greater distance than those of Delta4. The AAPM TG-142 report recommended a tolerance of 1.5 mm for laser localization in IMRT.(32) For both the ArcCHECK and Delta4 systems, a 1° rotational error can cause an error of approximately 2 mm on the surface of the phantoms. Therefore, the cumulative effect of the combined 2 mm translational and 1° rotational errors was evaluated; the average pass rate reductions with the 3%/3 mm criteria were 3.8% and 2.4% for the ArcCHECK and Delta4 systems, respectively. The cumulative effect for the ArcCHECK system was larger than for the Delta4 system, mainly because of the higher sensitivity of ArcCHECK to rotational error. The cumulative effect was particularly pronounced for the NPC tests using ArcCHECK; thus, the ArcCHECK setup for NPC should be more accurate than for the other cancer sites. In addition, because of the difference in sensitivity between the two systems, their required setup accuracies differ. As shown in Table 3, in order to achieve less than a 3% mean pass rate reduction of VMAT plan QA with the 3%/3 mm criteria, setup accuracies of 2 mm/1° and 2 mm/2° are required for the ArcCHECK and Delta4 devices, respectively.
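The ~2 mm surface displacement quoted for a 1° rotation follows from the arc length s = rθ: for the ArcCHECK detector ring (21 cm diameter, so r = 105 mm), a 1° rotation moves a surface point by about 1.8 mm. A one-line check (illustrative only):

```python
import math

def surface_shift_mm(radius_mm, angle_deg):
    # Arc length s = r * theta (theta in radians) traveled by a point
    # on the phantom surface for a small rotation about the cylinder axis.
    return radius_mm * math.radians(angle_deg)

# ArcCHECK detector ring: 21 cm diameter -> 105 mm radius; 1 degree rotation
shift = surface_shift_mm(105.0, 1.0)  # ~1.8 mm, consistent with the ~2 mm quoted above
```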
V. Conclusions
In this study, both the ArcCHECK and Delta4 diode arrays showed high sensitivity to setup errors, and the sensitivity of both systems is strongly dependent on the patient-specific plan. The sensitivity of ArcCHECK to rotational error was higher than that of Delta4. In order to achieve less than a 3% mean pass rate reduction of VMAT plan QA with the 3%/3 mm criteria, a setup
accuracy of 2 mm/1° and 2 mm/2° is required for the ArcCHECK and Delta4 devices, respectively. The cumulative effect of the combined 2 mm translational and 1° rotational errors caused mean pass rate reductions with the 3%/3 mm criteria of 3.8% and 2.4%, respectively, for the ArcCHECK and Delta4 systems. For QA of VMAT plans for NPC using the ArcCHECK system, the setup should be more accurate.

Acknowledgments
Guangjun Li, Sen Bai, and Nianyong Chen contributed equally to this work. The Delta4 dosimetry system was provided by Beijing HGPT Technology & Trade Co. Ltd. This work was partially supported by the National Natural Science Foundation of China (Grant No. 81101697).

References
1. Otto K. Volumetric modulated arc therapy: IMRT in a single arc. Med Phys. 2008;35(1):310–17.
2. Verbakel WF, Cuijpers JP, Hoffmans D, Bieker M, Slotman BJ, Senan S. Volumetric intensity-modulated arc therapy vs. conventional IMRT in head-and-neck cancer: a comparative planning and dosimetric study. Int J Radiat Oncol Biol Phys. 2009;74(1):252–59.
3. Bertelsen A, Hansen CR, Johansen J, Brink C. Single arc volumetric modulated arc therapy of head and neck cancer. Radiother Oncol. 2010;95(2):142–48.
4. Yu C, Li X, Ma L, Chen D, et al. Clinical implementation of intensity-modulated arc therapy. Int J Radiat Oncol Biol Phys. 2002;53(2):453–63.
5. Cao D, Holmes T, Afghan M, Shepard D. Comparison of plan quality provided by intensity-modulated arc therapy and helical tomotherapy. Int J Radiat Oncol Biol Phys. 2007;69(1):240–50.
6. Bedford J, Nordmark Hansen V, McNair H, et al. Treatment of lung cancer using volumetric modulated arc therapy and image guidance: a case study. Acta Oncol. 2008;47(7):1438–43.
7. Verbakel WF, Senan S, Cuijpers JP, Slotman BJ, Lagerwaard FJ. Rapid delivery of stereotactic radiotherapy for peripheral lung tumors using volumetric intensity-modulated arcs. Radiother Oncol. 2009;93(1):122–24.
8.
McGrath SD, Matuszak MM, Yan D, Kestin LL, Martinez AA, Grills IS. Volumetric modulated arc therapy for delivery of hypofractionated stereotactic lung radiotherapy: a dosimetric and treatment efficiency analysis. Radiother Oncol. 2010;95(2):153–57.
9. Ma L, Yu C, Earl M, et al. Optimized intensity-modulated arc therapy for prostate cancer treatment. Int J Cancer. 2001;96(6):379–84.
10. Palma D, Vollans E, James K, et al. Volumetric modulated arc therapy for delivery of prostate radiotherapy: comparison with intensity-modulated radiotherapy and three-dimensional conformal radiotherapy. Int J Radiat Oncol Biol Phys. 2008;72(4):996–1001.
11. Wolff D, Stieler F, Welzel G, et al. Volumetric modulated arc therapy (VMAT) vs. serial tomotherapy, step-and-shoot IMRT and 3-D-conformal RT for treatment of prostate cancer. Radiother Oncol. 2009;93(2):226–33.
12. Guckenberger M, Richter A, Krieger T, Wilbert J, Baier K, Flentje M. Is a single arc sufficient in volumetric-modulated arc therapy (VMAT) for complex-shaped target volumes? Radiother Oncol. 2009;93(2):259–65.
13. Duthoy W, De Gersem W, Vergote K, et al. Clinical implementation of intensity-modulated arc therapy (IMAT) for rectal cancer. Int J Radiat Oncol Biol Phys. 2004;60(3):794–806.
14. Cozzi L, Dinshaw KA, Shrivastava SK, et al. A treatment planning study comparing volumetric arc modulation with RapidArc and fixed field IMRT for cervix uteri radiotherapy. Radiother Oncol. 2008;89(2):180–91.
15. Matuszak M, Sui H, Yan D. Potential impact of volumetric modulated arc therapy on the planning and delivery of radiation therapy. Int J Radiat Oncol Biol Phys. 2008;72(1 Suppl):S651.
16. Kuijper IT, Dahele M, Senan S, Verbakel W. Volumetric modulated arc therapy versus conventional intensity modulated radiation therapy for stereotactic spine radiotherapy: a planning study and early clinical data. Radiother Oncol. 2010;94(2):224–28.
17. Clark GM, Popple RA, Young PE, Fiveash JB.
Feasibility of single-isocenter volumetric modulated arc radiosurgery for treatment of multiple brain metastases. Int J Radiat Oncol Biol Phys. 2010;76(1):296–302.
18. Bortfeld T, Webb S. Single-arc IMRT? Phys Med Biol. 2009;54(1):N9–N20.
19. Low D, Dempsey J, Venkatesan R, et al. Evaluation of polymer gels and MRI as a 3-D dosimeter for intensity-modulated radiation therapy. Med Phys. 1999;26(8):1542–51.
20. Vergote K, De Deene Y, Duthoy W, et al. Validation and application of polymer gel dosimetry for the dose verification of an intensity-modulated arc therapy (IMAT) treatment. Phys Med Biol. 2004;49(2):287–305.
21. Pallotta S, Marrazzo L, Bucciolini M. Design and implementation of a water phantom for IMRT, arc therapy, and tomotherapy dose distribution measurements. Med Phys. 2007;34(10):3724–31.
22. Jursinic P and Nelms B. A 2-D diode array and analysis software for verification of intensity modulated radiation therapy delivery. Med Phys. 2003;30(5):870–79.
23. Spezi E, Angelini A, Romani F, Ferri A. Characterization of a 2D ion chamber array for the verification of radiotherapy treatments. Phys Med Biol. 2005;50(14):3361–73.
24. Poppe B, Blechschmidt A, Djouguela A, et al. Two-dimensional ionization chamber arrays for IMRT plan verification. Med Phys. 2006;33(4):1005–15.
25. Jursinic PA, Sharma R, Reuter J. MapCHECK used for rotational IMRT measurements: step-and-shoot, TomoTherapy, RapidArc. Med Phys. 2010;37(6):2837–46.
26. Bush K, Townson R, Zavgorodni S. Monte Carlo simulation of RapidArc radiotherapy delivery. Phys Med Biol. 2008;53(19):N359–N370.
27. Letourneau D, Publicover J, Kozelka J, Moseley DJ, Jaffray DA. Novel dosimetric phantom for quality assurance of volumetric modulated arc therapy. Med Phys. 2009;36(5):1813–21.
28. Bedford J, Lee Y, Wai P, South C, Warrington A. Evaluation of the Delta4 phantom for IMRT and VMAT verification. Phys Med Biol. 2009;54(9):N167–N176.
29. Sadagopan R, Bencomo J, Martin R, Nilsson G, Matzen T, Balter P. Characterization and clinical evaluation of a novel IMRT quality assurance system. J Appl Clin Med Phys. 2009;10(2):104–19.
30. Bzdusek K, Friberger H, Eriksson K, Hardemark B, Robinson D, Kaus M. Development and evaluation of an efficient approach to volumetric arc therapy planning. Med Phys. 2009;36(6):2328–39.
31. Yan G, Lu B, Kozelka J, Liu C, Li JG. Calibration of a novel four-dimensional diode array. Med Phys. 2010;37(1):108–15.
32. Klein EE, Hanley J, Bayouth J, et al. Task Group 142 report: quality assurance of medical accelerators. Med Phys. 2009;36(9):4197–212.
33. Sharpe MB, Moseley DJ, Purdie TG, Islam M, Siewerdsen J, Jaffray D. The stability of mechanical calibration for a kV cone beam computed tomography system integrated with linear accelerator. Med Phys. 2006;33(1):136–44.
34. Meyer J, Wilbert J, Baier K, et al. Positioning accuracy of cone-beam computed tomography in combination with a HexaPOD robot treatment table. Int J Radiat Oncol Biol Phys. 2007;67(4):1220–28.
35. Basran PS and Woo MK.
An analysis of tolerance levels in IMRT quality assurance procedures. Med Phys. 2008;35(6):2300–07.
36. Korreman S, Medin J, Kjaer-Kristoffersen F. Dosimetric verification of RapidArc treatment delivery. Acta Oncol. 2009;48(2):185–91.
37. Feygelman V, Zhang G, Stevens C, Nelms BE. Evaluation of a new VMAT QA device, or the "X" and "O" array geometries. J Appl Clin Med Phys. 2011;12(2):146–68.
JOURNAL OF APPLIED CLINICAL MEDICAL PHYSICS, VOLUME 14, NUMBER 5, 2013

Development of real-time motion verification system using in-room optical images for respiratory-gated radiotherapy

Yang-Kyun Park,1,2 Tae-geun Son,3 Hwiyoung Kim,2 Jaegi Lee,4 Wonmo Sung,2 Il Han Kim,1,2 Kunwoo Lee,3 Young-bong Bang,5 and Sung-Joon Ye1,2,4,5a
Department of Radiation Oncology,1 Seoul National University Hospital, Seoul; Interdisciplinary Program in Radiation Applied Life Science,2 Seoul National University, Seoul; Department of Mechanical and Aerospace Engineering,3 Seoul National University, Seoul; Program in Biomedical Radiation Sciences,4 Department of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Seoul; Advanced Institutes of Convergence Technology,5 Seoul National University, Suwon, Korea
sye@snu.ac.kr

Received 5 October, 2012; accepted 15 April, 2013

Phase-based respiratory-gated radiotherapy relies on the reproducibility of patient breathing during the treatment. To monitor the positional reproducibility of patient breathing against a 4D CT simulation, we developed a real-time motion verification system (RMVS) using optical tracking technology. The system in the treatment room was integrated with a real-time position management (RPM) system. To test the system, an anthropomorphic phantom mounted on a motion platform was driven with a programmed breathing pattern and then underwent a 4D CT simulation with RPM. The phase-resolved anterior surface lines were extracted from the 4D CT data to constitute 4D reference lines. In the treatment room, three infrared reflective markers were attached to the superior, middle, and inferior parts of the phantom along the body midline, and RMVS tracked these markers using an optical camera system. The real-time phase information extracted from RPM was delivered to RMVS via in-house network software.
Thus, the real-time anterior–posterior positions of the markers were simultaneously compared with the 4D reference lines. The technical feasibility of RMVS was evaluated by repeating the above procedure under several scenarios: an ideal case (identical motion parameters between simulation and treatment), cycle change, baseline shift, displacement change, and breathing type changes (abdominal or chest breathing). The system's ability to operate under irregular breathing was also investigated using real patient data. The evaluation results showed that RMVS is capable of detecting phase-matching errors between the patient's motion during treatment and the 4D CT simulation. Thus, we concluded that RMVS could be used as an online quality assurance tool for phase-based gating treatments.

PACS number: 87.55.Qr
Key words: gated radiotherapy, external marker tracking, quality assurance, 4D CT

a Corresponding author: Sung-Joon Ye, 101 Daehak-ro Jongno-gu, Seoul, Korea, 110-744; phone: (82) (2) 2072 2819; fax: (82) (2) 741 2819; email: sye@snu.ac.kr

I. Introduction
Respiration-induced motion has been a significant challenge in radiotherapy for thoracic and abdominal tumors.(1) To manage this motion, the respiratory gating technique was introduced and evaluated in previous studies.(2,3) In this technique, radiation is controlled by a beam delivery
system within a particular portion of the patient's breathing cycle (the so-called gating window).(4) Only the tumor motion within the gating window is taken into account in both the treatment planning and delivery processes. Therefore, with this technique tumor margins can be reduced, enabling tumor dose escalation without compromising normal tissue sparing.(5) One widely used gating system with external marker-based monitoring is the real-time position management (RPM) system (Varian Medical Systems, Palo Alto, CA). Several studies have been performed to evaluate the efficacy of RPM.(6,7) The RPM system provides two alternative methods to define the gating window: phase-based gating and amplitude-based gating.(8) It has been reported that amplitude-based gating results in lower residual motion than phase-based gating.(7,9) However, in some institutions phase-based gating is preferred for two reasons: (1) phase-based gating provides a stable duty cycle, whereas amplitude-based gating suffers from baseline shifts;(3,9) and (2) some CT systems correlate images only in terms of the respiration phase.(8,10) In the RPM phase-based gating technique, the reproducibility of respiratory motion (e.g., displacement as a function of respiratory phase) between simulation and treatment fractions is essential. However, in routine clinical practice, the RPM system in phase-based mode has not provided any means to quantitatively compare the displacements during delivery with those during the CT simulation. The most direct way to verify the reproducibility of the respiratory motion is X-ray imaging,(4) but this results in excessive radiation exposure if acquisitions are performed frequently during treatment. To verify the reproducibility of the external marker position, several methods using noninvasive optical tracking have been proposed.
Wong et al.(11) used the ExacTrac system (BrainLAB AG, Feldkirchen, Germany) to monitor a patient's abdominal surface position during the deep inhalation breath-hold (DIBH) technique. Venkat et al.(12) developed an audiovisual biofeedback system using a single infrared (IR) reflective marker to improve and verify the reproducibility of external marker positions between simulation and treatment. On the other hand, Plathow et al.(13) demonstrated that the correlation between internal tumor motion and external marker motion depends strongly on the breathing type (e.g., abdominal versus thoracic breathing). This finding supported the idea that motion monitoring with a single external marker cannot provide sufficient tracking information for tumor motion.(14,15) Therefore, to improve the internal–external correlation, several studies have proposed multiple external marker tracking(16) or markerless surface monitoring,(17-19) rather than single external marker tracking. For tracking of multiple markers or of a patient's surface, commercial products such as ExacTrac and GateRT (VisionRT, London, UK) are available. However, no study has yet attempted to use 4D CT data as the reference for motion monitoring to check the positional reproducibility of multiple external markers or of the patient's external surface.

This study aimed to develop a quality assurance technique to quantitatively compare a patient's external surface motion between 4D CT simulation and treatment for RPM phase-based gating. The developed technique involved stereocamera-based optical tracking, surface extraction from 4D CT simulation data, and a phase synchronization method with RPM. Phantom experiments were performed using a programmable respiratory motion platform to evaluate the performance of our system.

II. Materials and Methods

A. System overview

A schematic illustration of the proposed quality assurance method for RPM phase-based gating is shown in Fig. 1.
A conventional RPM-based gating technique uses a single IR camera, an IR reflective marker block, and a workstation connected to the beam delivery system. In order to acquire real-time images of patient surface motion according to the signals of the RPM system, the developed system consisted of two wall-mounted stereocameras and multiple IR markers

Journal of Applied Clinical Medical Physics, Vol. 14, No. 5, 2013
Fig. 1.  A schematic representation of the developed quality assurance method using the real-time motion verification system (RMVS).

on the patient's anterior surface, and a phase synchronization program (PSP). The motion error calculator (MEC) was programmed to quantitatively compare acquired real-time images of the multiple IR markers with phase-matched reference images extracted from 4D CT simulation data. Details of the developed system are given in Sections C, D, E, and F below.

B. Anthropomorphic phantom and motion platforms

An anthropomorphic phantom (Alderson Research Laboratories, New York, NY) and two different types of motion platform (A and B) were used to evaluate the system. Platform A was composed of an acrylic stage and two linear actuators spaced approximately 40 cm apart along the superior–inferior direction. The actuators oscillated the stage between two positions in the anterior–posterior direction; the two oscillation positions and the cycle can be programmed. In normal mode, the two actuators were synchronized to keep the stage horizontal. By fixing one actuator at a certain position, however, the platform can also simulate asymmetric motions, such as abdominal and chest breathing. Platform B, on the other hand, is a fully programmable motion platform that can simulate arbitrary motions in 3D space, such as patient respiratory data. It is composed of polycarbonate panels and four stepping motors to simulate 3D tumor and external marker motion. The positional accuracy of the platform had been evaluated using a high-resolution laser sensor (RF603, RIFTEK, Minsk, Belarus) with a spatial resolution of 0.03 mm and a temporal resolution of 0.01 ms.
It was determined to be 0.2 mm.(20) In this study, most of the phantom experiments were performed with platform A because of its operational simplicity and its unique ability to simulate asymmetric motions for breathing type change simulations. Platform B was used to evaluate the accuracy of the optical tracking system and to simulate real patient respiratory data.
C. Stereocamera system and IR marker tracking

An IR-based stereocamera enabled us to monitor the phantom's respiratory motion in real time. The system hardware consisted of two charge-coupled device (CCD) cameras (HVR2300C, Hi Vision System, Korea) with a universal serial bus (USB) 2.0 interface, two IR filters (B+W 092, Schneider-Kreuznach, Bad Kreuznach, Germany), multiple IR light-emitting diodes (LEDs), and a personal computer (PC) with a 2.8 GHz central processing unit (CPU). A custom-fabricated frame housing the cameras, IR filters, and IR LEDs was mounted on the inferior wall of the treatment room. The stereocamera system was calibrated with a checkerboard template and free software (Camera Calibration Toolbox, Imperial College, London, UK), following Zhang's method.(21) Three IR reflective markers (Scotchlite 3000X, 3M, St. Paul, MN) with a diameter of 6 mm were attached to the phantom surface along the body midline, although the system can track multiple external markers simultaneously, regardless of their positions and number. In our previous studies using the same marker tracking method, the tracking accuracy was found to be 0.4 ± 0.4 mm and 0.2 ± 0.4 mm in the 3D and vertical directions, respectively.(22,23)

D. Extraction of 4D CT-based reference lines

Phase-resolved anterior body midlines were extracted from the 4D CT simulation data and used as vertical displacement references, denoted as "4D reference lines". A workflow to obtain the 4D reference lines is shown in Fig. 2. Ten phase image sets of the phantom were acquired from a 4D CT scanner (Big Bore Brilliance, Philips Medical Systems, Bothell, WA) equipped with the RPM system.
The acquired images were transferred to a treatment planning system (Eclipse, Varian Medical Systems, Palo Alto, CA), and the external body surfaces were automatically contoured in the system using a CT number threshold of -450 HU. These body contours were then exported as DICOM-RT structure (RS) files.

Fig. 2.  A flow chart demonstrating how to generate the 4D reference lines from 4D CT simulation data.

Using in-house DICOM processing software, ten sets of anterior surface midlines, each tagged with a unique respiratory phase value, were extracted
from the RS files. The initially generated ten sets of body midlines were linearly interpolated to create 100 sets of data, so that each line was assigned an integer phase value ranging from 0 to 99. These 100 body midlines were defined as the "4D reference lines" in this study. Finally, the complete set of 4D reference lines was exported to a text file.

E. Respiratory phase synchronization with RPM

In RPM phase-based gating, beam-on and beam-off are controlled by the RPM real-time calculated phase information. It has been reported that the real-time calculated phase is error-prone and that retrospective phase calculation using an RPM log file (the "vxp" file) is more accurate.(8) Nevertheless, despite its imperfection, we chose the real-time calculated phase for online verification, since it allows errors to be detected by monitoring the relationship between phase and displacement in real time. Therefore, to compare the current positions of the tracked markers with the reference line at the same phase taken from the 100 sets of 4D reference lines, the phase synchronization program (PSP) was developed to provide this phase information in real time. The PSP was installed on the RPM workstation and operated simultaneously with RPM software version 1.7.5. As the RPM software provided a clock-shaped interface for displaying respiratory phase values, the PSP set a region of interest (ROI) at the center of the "clock" and processed the image of the ROI in real time, as shown in Fig. 3. The phase value calculated by the software was then transferred to the RMVS through a LAN.

Fig. 3.  An illustration of the procedure to acquire the respiratory phase value from the RPM workstation. The PSP processed the ROI set on the "phase clock" and transferred the calculated phase value to the RMVS over a network in real time.

F. Motion error calculator

The motion error calculator (MEC) is a software module integrated into the RMVS.
Three types of input data were required to run the MEC: the 3D positions of the external markers tracked by the stereocamera system in real time, the 4D reference lines, and the current phase value acquired from the PSP in real time. First, the MEC
imported a text file containing the 100 sets of 4D reference lines. Second, the current 3D positions of the tracked external markers were acquired by the stereocamera system and simultaneously transferred to the MEC in real time. Third, the MEC received the RPM-generated current phase information from the PSP and dynamically selected the reference line corresponding to the current phase value. Finally, the displacement error between the external marker positions and the reference line was calculated and displayed on the operator's monitor screen in real time.

Even though the stereocamera system can track the 3D coordinates of the markers, only the z-axis value was used to calculate errors. The x-axis value was not used because the 4D reference line was assumed to be a body midline with an x-axis value of 0. The y-axis value was used to find the corresponding projected points of the tracked markers on the 4D reference line. The MEC calculated signed and absolute errors, given as follows:

Signed error: Δ_i = P_i(t)_z − z_4DCT(φ(t), P_i(t)_y)	(1)

Absolute error = |Δ_i|	(2)

where i is the marker index, Δ_i is the signed error for the ith marker, and P_i(t)_y and P_i(t)_z are the y and z components of the 3D position of the ith marker at time t, respectively. z_4DCT is the anterior–posterior position of the 4D reference line, and φ(t) is the integer value of the respiratory phase at time t. The average error over all markers is defined as an overall error as follows:

Overall error (signed) = (1/N) Σ_{i=1}^{N} Δ_i	(3)

Overall error (absolute) = (1/N) Σ_{i=1}^{N} |Δ_i|	(4)

where N is the number of markers. If the motion error of an individual marker exceeds a user-defined tolerance, the MEC can display a warning message to the operator. The proposed point-to-line matching method removed the need for an external marker during the 4D CT scan.
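The error metrics of Eqs. (1)–(4) can be sketched in a few lines. This is an illustrative reimplementation, not the authors' MEC code; in particular, the linear interpolation used to project each marker onto the sampled reference line is an assumption.

```python
import numpy as np

def signed_errors(marker_pos, ref_line_y, ref_line_z):
    """Eq. (1): signed error Delta_i for each tracked marker.

    marker_pos : (N, 3) array of tracked positions (x, y, z); only the
                 y and z components are used, matching the paper's scheme.
    ref_line_y, ref_line_z : sampled points of the 4D reference line
                 selected for the current phase phi(t) (body midline, x = 0).
    """
    # Project each marker onto the reference line along y, then compare z.
    z_ref = np.interp(marker_pos[:, 1], ref_line_y, ref_line_z)
    return marker_pos[:, 2] - z_ref

def overall_errors(delta):
    """Eqs. (3) and (4): average signed and absolute error over all markers."""
    return delta.mean(), np.abs(delta).mean()
```

For example, with a flat reference line at z = 10 mm, markers sitting 2 mm above and 2 mm below it produce individual signed errors of +2 and -2 mm, an overall signed error near zero, and a nonzero overall absolute error.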
Even though a corresponding projected point on the reference line is not exactly matched with a tracked point (especially considering deformation of the patient surface), the proposed error metrics can provide quantitative error values when there are problems in positional reproducibility between simulation and treatment.

G. Phantom evaluation

G.1  Evaluation of system accuracy and precision

Prior to the phantom experiments with various motion parameters, the overall accuracy and precision of the system were evaluated. Even though the accuracy of the stereocamera system and platform B had been evaluated in our previous studies, a test experiment with an irregular breathing pattern was performed using both the stereocamera and platform B. Three IR markers on the RANDO phantom (The Phantom Laboratory, Salem, NY) were tracked in this experiment. To evaluate the accuracy of platform A, on the other hand, a test motion with a displacement of 31 mm and a cycle of 3.1 s was simulated with four equidistant markers and solid water slabs (i.e., one additional marker and a flat-surface phantom were used in this initial evaluation). The motion parameters were then compared with the tracking data acquired by the stereocamera system. Using the test motion, the accuracy of RPM and 4D CT was also evaluated for comparison.
Finally, the accuracy of phase synchronization was evaluated by comparing the respiratory phase data of RPM and RMVS, to confirm that both systems report the same phase value when given the same positional data.

G.2  Evaluation of interfractional changes in breathing motion

Three IR reflective markers at intervals of 6 cm were attached to the RANDO phantom surface (superior, middle, and inferior markers), and an RPM marker block was placed beside the middle marker. The phantom (mounted on platform A) oscillated regularly according to a reference motion pattern (hereafter "normal breathing", for convenience) with a displacement of 20 mm and a cycle of 3.1 s for all three markers. The phantom then underwent a 4D CT scan, and 4D reference lines were prepared as described in Section D above. In the treatment room, various motion scenarios were simulated to test whether our system could detect abnormalities when the motion pattern changed from that of the CT simulation. Figure 4 shows the experimental setup in the treatment room. Five different motion scenarios were designed to simulate possible clinical situations: ideal (identical motion parameters between CT simulation and treatment), cycle change, baseline shift, displacement change, and breathing type change. Figure 5 shows the five motion scenarios graphically. In the ideal case, motion errors were expected to be zero. No motion errors were expected in the cycle change scenario either, because the phase-based comparison is insensitive to a uniform change in the breathing cycle. In contrast, the baseline shift, displacement change, and breathing type change scenarios were expected to exhibit significant motion errors in our developed system. In total, eight sets of phantom experiments, the parameters of which are listed in Table 1, were performed in the treatment room. Each experiment took 100 s.
Mean signed errors (MSEs) and mean absolute errors (MAEs) between external marker positions and 4D reference lines were evaluated for each experiment.

Fig. 4.  Experimental setup for system evaluation on a motion phantom. Two types of motion platform were used for the simulations of breathing type changes (platform A) and irregular breathing (platform B).
Fig. 5.  Five motion scenarios were taken into account in the phantom experiments: (a) ideal, (b) cycle change, (c) baseline shift, (d) displacement change, and (e) breathing type change.

Table 1.  Motion parameters of the phantom experiments.

Scenario                Experiment No.   Description            Baseline (mm)   Displacement (mm)             Cycle (s)
                                                                                Superior   Middle   Inferior
Ideal case              1                Ideal case             0               20         20       20        3.1
Cycle change            2                Longer cycle           0               20         20       20        4.7
Baseline shift          3                Offset -5              -5              20         20       20        3.1
                        4                Offset +5              +5              20         20       20        3.1
Displacement change     5                Smaller displacement   0               10         10       10        3.1
                        6                Larger displacement    0               40         40       40        3.1
Breathing type change   7                Abdominal breathing    0               14.6       20       25.4      3.1
                        8                Chest breathing        0               25.4       20       14.6      3.1
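For illustration, the Table 1 scenarios can be generated as modified traces of a common reference motion. The raised-cosine model below is an assumption for plotting purposes only; the platform's actual two-actuator motion is not necessarily sinusoidal.

```python
import numpy as np

def scenario(t, baseline=0.0, displacement=20.0, cycle=3.1):
    """One marker's anterior-posterior trace for a Table 1 scenario,
    modelled as a raised cosine starting at the baseline position
    (an assumption; the real platform oscillates between two programmed
    positions)."""
    return baseline + 0.5 * displacement * (1 - np.cos(2 * np.pi * t / cycle))

t = np.arange(0, 100, 0.05)             # each experiment ran 100 s
reference = scenario(t)                 # experiment 1: ideal case
experiments = {
    2: scenario(t, cycle=4.7),          # longer cycle
    3: scenario(t, baseline=-5.0),      # offset -5 mm
    4: scenario(t, baseline=+5.0),      # offset +5 mm
    5: scenario(t, displacement=10.0),  # smaller displacement
    6: scenario(t, displacement=40.0),  # larger displacement
    # experiments 7 and 8 vary displacement per marker (14.6 / 20 / 25.4 mm)
}
```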
G.3  Evaluation of intrafractional changes in breathing motion

Evaluation of intrafractional motion changes was performed using a combination of phantom motion sequences, including the normal breathing, abdominal breathing, chest breathing, offset -5 mm, and larger displacement patterns. Approximately 60 s was allotted for each sequence, and approximately 5 s of motor initialization time was inserted between any two consecutive sequences. The most important difference from the interfractional change experiment was that the respiratory model initially established by the RPM system gradually changed during the experiment, resulting in a beam interruption signal produced by the RPM predictive filter. Therefore, the duty cycle was expected to change from its initial value (30%). The threshold of the predictive filter was set to the default value.(20)

G.4  Evaluation for real patient respiratory motion

To evaluate the capabilities of our system in a more realistic situation, additional experiments were performed using motion platform B in conjunction with two patients' respiratory data. The input data for the platform were produced by processing RPM log files acquired during the 4D CT simulation and one treatment session of each patient. It should be noted that only anterior–posterior movement was simulated in this experiment, because the RPM system supports vertical tracking only. The RANDO phantom mounted on the platform underwent 4D CT scans while the platform reproduced the patient's simulation session. Finally, the RMVS was tested while the phantom reproduced the patient's respiratory motion in the treatment room, following the same procedures as described above.

III. Results

A. Compatibility between RMVS and RPM

The phantom experiments demonstrated that all of the functions of the developed system (i.e., RMVS) were successfully executed.
The system worked together with RPM as intended, without any interference. The phase values generated by RPM were successfully transferred to the system with a minimum time delay of approximately 60 ms. A likely source of this time delay is the TCP/IP communication between RPM and RMVS. As the time delay was consistent, a constant phase shift was applied to the MEC.

B. Evaluation results for system accuracy and precision

As seen in Fig. 6, the respiratory curve obtained from the stereocamera system coincided perfectly with that of the input data for platform B. The mean absolute error over 300 s of simulation, calculated for a single marker, was 0.2 ± 0.2 mm, which validated our previous findings on the accuracy of the system. Table 2 summarizes the results of the accuracy evaluation for platform A and 4D CT. Platform A also moved accurately as programmed when compared with the tracking results of the stereocamera system. However, 4D CT slightly underestimated the displacement, an error that would propagate into our system. Details on the inaccuracy of 4D CT imaging follow in the Discussion section below. In phase synchronization, the phase values of the RMVS coincided well with those of the RPM system, which implied that the PSP worked well with both systems (Fig. 7). The mean absolute error over 180 s, calculated for all markers, was 1.4 ± 3.5%. It should be noted that a constant phase offset had been applied to the RMVS to account for the systematic time delay.
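A constant offset on the 0–99 integer phase scale can be derived from the measured delay and the breathing cycle. The rounding below is a sketch of this correction under stated assumptions, not necessarily the exact calculation used in the RMVS.

```python
def phase_offset(delay_s, cycle_s, n_phases=100):
    """Convert a measured communication delay into a constant offset on the
    0-99 integer phase scale used for the 4D reference lines (assumes the
    delay is small and the breathing cycle is approximately constant)."""
    return round(delay_s / cycle_s * n_phases) % n_phases

# e.g., the ~60 ms delay observed here with a 3.1 s breathing cycle
offset = phase_offset(0.060, 3.1)   # about 2 phase bins
```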
Fig. 6.  The programmed motion of platform B coincided perfectly with the marker motion obtained from the stereocamera system, confirming the accuracy of the systems.

Table 2.  Motion parameters of the reference motion independently measured by the stereocamera system, 4D CT, and RPM.

                      Programmed   Stereocamera(a)   4D CT(b)     RPM(a,c)
Displacement (mm)     31.0         30.9 ± 0.1        30.3 ± 0.4   31.7 ± 0.0
Mid position (mm)     22.5         22.3 ± 0.1        22.6 ± 0.1   23.2 ± 0.0
Cycle (s)             3.1          3.1 ± 0.1         N/A          3.1 ± 0.0

a Mean value of ten cycles and four marker positions.
b Mean value of four positions corresponding to each marker position on 4D CT.
c The baseline was corrected by using a reference position.
RPM = real-time position management system; 4D CT = four-dimensional computed tomography.

Fig. 7.  Phase comparison between the RPM and the RMVS. Good agreement between the two phase values demonstrated that the phase synchronization was carried out successfully and that the selected constant phase offset was optimal.
C. Phantom evaluation results

Figure 8 shows the 4D CT images of the phantom with normal breathing. For comparison, 4D CT images from abdominal and chest breathing motions are also shown. Even though the breathing type changed, the movement of the central region of the phantom remained similar to that of the reference breathing, which illustrates a potential pitfall of single marker-based monitoring.

Fig. 8.  Phantom sagittal images from 4D CT sets acquired during reference motion, abdominal breathing, and chest breathing. The dotted line was set parallel to the middle surface point on the end-of-exhale phase image.

C.1  Evaluation of interfractional changes in breathing motion

These experiments confirmed that RPM phase-based gating did not provide any warning or interruption signals, even when there were significant interfractional changes in the respiratory motion pattern. This supports the usefulness of our system as a respiratory QA tool. For each experiment, MSEs and MAEs between external marker positions and 4D reference lines were calculated; they are summarized in Tables 3 and 4 and shown graphically in Fig. 9. The MSE mode was useful for detecting baseline shifts along with their direction (overall errors of -5.0 ± 0.9 and 5.1 ± 0.9 mm for experiments 3 and 4, respectively). In MAE mode, the system detected errors even in the ideal and cycle change cases (overall errors of 0.8 ± 0.5 and 0.7 ± 0.5 mm for experiments 1 and 2, respectively), which turned out to reflect the systematic error of our system. Relatively large errors and deviations were observed in the displacement change experiments compared to the ideal case (overall errors of 2.7 ± 1.2 and 5.9 ± 3.6 mm for experiments 5 and 6, respectively).
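A stylized numerical example shows why both modes are useful: for a motion centered on its mid position, a baseline shift produces a nonzero mean signed error carrying the direction of the shift, whereas a pure displacement change produces signed errors that cancel in the mean but are caught by the mean absolute error. The traces below are hypothetical, not the experimental data.

```python
import numpy as np

t = np.arange(0, 31.0, 0.05)               # ten 3.1 s breathing cycles
ref = 10.0 * np.cos(2 * np.pi * t / 3.1)   # hypothetical reference motion about its mid position

shifted = ref - 5.0                        # baseline shift of -5 mm
larger = 2.0 * ref                         # displacement doubled

def mse_mae(delta):
    """Overall signed (MSE-like) and absolute (MAE-like) errors, as in Eqs. (3)-(4)."""
    return delta.mean(), np.abs(delta).mean()

# Baseline shift: signed error = -5 mm everywhere, so MSE = -5 mm and MAE = 5 mm.
# Displacement change: signed errors are symmetric about zero, so MSE ~ 0
# while MAE stays clearly nonzero, flagging the change.
```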
For the breathing type change cases, the errors of the middle marker were relatively small (0.9 ± 0.7 and 1.5 ± 1.0 mm for experiments 7 and 8, respectively) compared to those of the inferior and superior markers (inferior marker error of 2.5 ± 1.7 mm for experiment 7 and superior
