- 1 -
Managing for quality global physical rehabilitation
THE QUALITY
ASSURANCE SYSTEM AT
THE KOMPONG CHAM
(CAMBODIA) PHYSICAL
REHABILITATION
CENTRE:
ANALYSIS, AMENDMENTS AND
RECOMMENDATIONS
Wesley Pryor, 2012
- 2 -
Handicap International is an international organisation specialised in the field of
disability. Non-governmental, non-religious, non-political and non-profit making, it
works alongside people with disabilities, whatever the context, offering them
assistance and supporting them in their efforts to become self-reliant.
Since its creation, the organisation has set up programmes in approximately 60
countries and intervened in many emergency situations. It has a network of eight
national associations (Belgium, Canada, France, Germany, Luxembourg,
Switzerland, United Kingdom, USA) which provide human and financial
resources, manage projects and raise awareness of Handicap International’s
actions and campaigns.
Wesley Pryor
Regional Technical Advisor,
Rehabilitation (South Asia)
Handicap International – Technical
Resources Division, Rehabilitation
Services Unit
w: www.handicap-international.org
e: wpryor@handicap-international.asia
- 3 -
1 Table of contents
1  Table of contents........................................................................................................3 
2  Acronyms and abbreviations......................................................................................5 
3  Reading and using this report ....................................................................................6 
4  Terms of reference.....................................................................................................6 
5  Background................................................................................................................7 
5.1  HI and Physical Rehabilitation in Cambodia and Globally............................................... 7 
5.2  HI and Quality Management of Global Physical Rehabilitation ....................................... 7 
5.3  Learning from Quality Assurance in Cambodia ............................................................... 7 
5.4  QA at the Physical Rehabilitation Centre in Kampong Cham.............................. 8 
6  Scope and limitations of the report.............................................................................8 
7  Executive Summary of findings and recommendations .............................................9 
8  List of recommendations..........................................................................................11 
9  Methods and activities..............................................................................................12 
9.1  Audit of existing quality assurance system .................................................................... 12 
9.2  An examination of current QA indicators ....................................................................... 12 
9.3  Benchmarks – critique and re-definition according to relevant standards ..................... 12 
9.4  Operationalising indicators to compare against key benchmarks.................................. 12 
9.5  Refinement and improvement of QAS processes.......................................................... 13 
9.6  Development of a user-friendly composite tool.............................................................. 13 
9.7  An analysis of 2010/2011 findings ................................................................................. 13 
10  Results .....................................................................................................................14 
10.1  Audit of existing quality assurance system .................................................................... 14 
10.1.1  “Coffee bean analysis” – Audit of current indicators .................................................. 25 
10.1.2  A timeline of QAS development ................................................................................. 28 
10.2  An examination of current QA indicators ....................................................................... 30 
10.2.1  Validity explained ....................................................................................................... 30 
10.2.2  Results of analysis of process and validity ................................................................ 30 
10.2.3  Summary of findings on process and indicator validity.............................................. 37 
10.3  Benchmarks – critique and re-definition according to relevant standards ..................... 37 
10.4  Operationalising indicators to compare against key benchmarks.................................. 38 
10.5  Refinement and improvement of QAS processes.......................................................... 38 
10.5.1  Planning and definitions of indicators ........................................................................ 40 
10.5.2  Data collection............................................................................................................ 40 
Generating original data ............................................................................................................ 40 
Entering primary data ................................................................................................................ 41 
Centralisation – entering into database..................................................................................... 41 
Compilation, aggregation, disaggregation................................................................................. 41 
10.5.3  Monitoring, analysis and reporting ............................................................................. 41 
10.6  Development of a user-friendly composite tool.............................................................. 42 
10.7  An analysis of 2010/2011 findings ................................................................................. 43 
10.7.1  ‘Workshop’ results...................................................................................................... 43 
Indicator 1 & 2 - Adjustment during alignment and fitting.......................................................... 43 
Indicator 4.3 – device durability ................................................................................................. 44 
Indicators 6&7 – P&O and Benchworker production statistics .................................................. 45 
10.7.2  PT findings ................................................................................................................. 46 
Indicator 3 – Treatment planning............................................................................................... 46 
Indicator 4 – missed appointments at the PRC ......................................................................... 47 
Indicator 5 – Daily treatments per PT........................................................................................ 48 
10.8  Additional Analysis: Comparing the QAS against sustainability indicators.................... 49 
11  Analysis....................................................................................................................52 
11.1  Audit of existing quality assurance system .................................................................... 52 
11.1.1  Why aren’t indicators being collected? ...................................................................... 52 
11.2  An examination of current QA indicators ....................................................................... 53 
11.2.1  Why have these indicators been chosen? ................................................................. 53 
11.2.2  A way forward............................................................................................................. 54 
- 4 -
11.3  Benchmarks – critique and re-definition according to relevant standards ..................... 55 
11.3.1  Understanding benchmarks – why these ones haven’t worked................................. 55 
11.3.2  Operationalising indicators to compare against key benchmarks.............................. 55 
11.3.3  Where we are now: data collection and flow.............................................................. 55 
11.3.4  A way forward for complex data management requirement in a PRC....................... 56 
11.4  Refinement and improvement of QAS processes.......................................................... 56 
11.4.1  A proposed process for practical, simple and manageable QAS processes............. 56 
11.5  Development of a user-friendly composite tool.............................................................. 56 
11.5.1  Introducing a Rehabilitation Management System – a new investment in managing
for quality rehabilitation services................................................................................................ 56 
11.6  An analysis of 2010/2011 findings ................................................................................. 57 
11.6.1  A general look at quality at the PRC .......................................................................... 57 
11.6.2  Learning from the experience: The challenges of the current QAS reporting
processes................................................................................................................................... 57 
11.6.3  What can we say about the service based on the data we have?............................. 57 
P&O services ............................................................................................................................. 57 
The PT service........................................................................................................................... 58 
11.7  Additional analysis of sustainability indicators and the current QAS ............................. 58 
11.8  General recommendations............................................................................................. 58 
12  Concluding remarks .................................................................................................60 
- 5 -
2 Acronyms and abbreviations
QAS – Quality Assurance System
QA – Quality Assurance
RMS – Rehabilitation Management System (HI Internal procedures)
MoSVY – Ministry of Social Affairs, Veterans and Youth Rehabilitation (Royal Cambodian Government)
HI – Handicap International
PwD – Person/s with disability
PRC – Physical Rehabilitation Centre
PT – Physical Therapist/Physiotherapist
P&O – Prosthetist/Orthotist (person) or Prosthetics and Orthotics (the discipline)
- 6 -
3 Reading and using this report
This report is structured around the terms of reference. Each term of reference is
addressed in turn in section 10, starting on page 14. A subsequent section analyses the
findings around emergent themes. Presenting the results in this way allows a quick
orientation to the results of key questions, but develops a richer analysis of those
findings in a separate section and explores other areas that emerged during the
evaluation.
A list of recommendations is presented in section 8 on page 11.
4 Terms of reference
This section is a direct excerpt from the TOR document
The objective of the assignment is to establish an operational quality assurance system
utilising existing tools and indicators, identifying and applying benchmarks and
implementing a system of data collection, storage and reporting.
Expected outputs are as follow:
 An audit of the existing quality assurance system from 2010 and 2011 is
completed identifying indicators that are routinely collected and those that
are not.
 The indicators are appraised for their relevance against the goals of the
quality assurance system and refined as appropriate ie reconfirm the
significance of the indicator in terms of quality assurance, collection
methodologies, data storage and reporting to define a master list of key
indicators to proceed with.
 Benchmarks for the indicators are identified from national and international
standards, local laws and customs, MoSVY/PoSVY and PRC internal
practice and policies.
 Indicators and benchmarks are made operational to enable identification of
risk and safety concerns as well as identification of acceptable targets.
 Data collection, storage and reporting systems refined and/or developed
clarifying information source, frequency of collection and responsible
person.
 A user friendly composite tool is developed for data management.
 2010 indicators and first semester 2011 indicators are reviewed and a
report highlighting the main findings in terms of performance and quality is
produced.
- 7 -
5 Background1
5.1 HI and Physical Rehabilitation in Cambodia and Globally
Handicap International has been working in Cambodia since its inception in 1982.
Physical Rehabilitation has always been a substantial component of its activities.
Globally, HI has supported physical rehabilitation in more than 65 countries. In 2009, HI
supported services that delivered physical rehabilitation to nearly 100,000 people. Core
features of Handicap International's approach to physical rehabilitation include
supporting local, pre-existing services, using local human and material resources,
building capacity, and treating rehabilitation as only one part of a comprehensive,
systemic approach to addressing and upholding the rights of persons with disabilities.
Because of the scope of HI’s activities and focus and the inter-connectedness of its
domains of action, a single core operational methodology for physical rehabilitation has
not been defined.
5.2 HI and Quality Management of Global Physical Rehabilitation
Since around 2010, the rehabilitation unit of HI’s technical resource division has
emphasised ensuring access to quality rehabilitation services. This conceptual
approach recognises the importance of equitable access to mainstream services, a need
for specialised services, and that external agencies like HI need not necessarily directly
implement those services, but might seek to ensure they exist and are effective.
Achieving this is attempted through systematic approaches to measuring and improving
the overall sectoral response in physical rehabilitation (in concert with broader disability
actions), and focusing on understanding and improving the quality of physical
rehabilitation services. Quality improvement is addressed through methodologies such
as updating and re-emphasising management-related policy, emphasising a user-
focused approach and understanding clinical governance in global rehabilitation
services.
But while this approach is an evolution of decades of action, rather than a revolutionary
change, we are still in the early phases of these more systematic, repeatable and scalable
approaches to our work.
5.3 Learning from Quality Assurance in Cambodia
The efforts of HI Cambodia and its partner organisations in implementing a
comprehensive and systematic QAS pre-date this work at HQ. Consequently, HQ has
much to learn from the process. It also creates many opportunities for the organization to
invest in further development of the system that is in place, as an exercise in learning
from previous practice. It is in that spirit that this support visit was undertaken.
1
For a more comprehensive background to rehabilitation services in Cambodia, the reader is directed to HI-
Cambodia documentation and the original TOR for this report.
- 8 -
5.4 QA at the Physical Rehabilitation Centre in Kampong Cham
The team in Cambodia and particularly at the Kampong Cham PRC have been working
to establish a Quality Assurance System to ensure a quality, well managed service is
sustained after handover to local authorities. As this report identifies, the approach has
evolved since its inception, taking into account the many changes in management,
reporting requirements and so on.
6 Scope and limitations of the report
The original TORs presented an immense task. A comprehensive analysis of data,
starting from raw data, for some 42 QAS indicators, plus adjustments to the QAS in
light of contemporary changes in program HR, governance and project cycles, requires
a much longer time-frame. Simply reviewing the 42 indicators’ benchmarks and validity
against a range of literature is itself a huge task.
Consequently, it was agreed to target TORs 1-4 and 7, de-emphasising amendments
to the system.
However, very early in the process, it became clear that TORs 5 and 6 were probably the
more important outputs, since the QAS has evolved very quickly since its inception and
only a core set of indicators is currently in use. Consequently, this report examines the
existing QAS, the current effectiveness of the system, and outlines a course of action to
improve the system so that it is genuinely useful, efficient and realistic.
- 9 -
7 Executive Summary of findings and recommendations
“Not everything that can be counted counts, and
not everything that counts can be counted.”
The QAS is a strong foundation and has evolved, but it is not currently used effectively.
The current QAS is not used routinely. Only a small percentage of the indicators have
been collected at all, and fewer still have been collected routinely. The necessary data
for the MoSVY and project reporting requirements and strong operational processes
have been collected elsewhere, but not in the QAS per se. During 2011, a simple
decision to refine the overall QAS to focus only on a small number of indicators was
taken. The current QAS, then, is more efficient and manageable than the original
version, but may not meet its key objectives. Overall, while there is a positive shift
towards quality assurance processes, they are disconnected from ordinary operational
activities. This dichotomy has created much additional work for an already busy team.
The indicators of the QAS are comprehensive and reasonably divided between
different domains. However, they are complex to measure, not always related to
realistic or meaningful objectives, and work is needed to collect useful data more
effectively.
Many of the core indicators developed for the QAS are not well defined or
operationalised, and few have clear documentation or instrumentation for gathering
and using the data. Clinical indicators were the focus of this analysis. In those domains,
there are some strong key indicators that are giving value to the clinical team, are
reliable to collect and are used to make decisions. These can be built upon.
Benchmarks are not well linked to reasonable foundations and need to be
amended in concert with revisions of the key indicators.
Given the findings that the indicators are complex and often not appropriate, it is difficult
and somewhat redundant to examine the benchmarks in detail. In short, the benchmarks
are not well linked with strong foundations, and many would be better replaced with a
simple binary yes/no indicator, and strengthened centre policies.
The indicators are not well operationalised and there are few clear places for entry of
primary data or systematic approaches to aggregating and disaggregating them. A small
percentage of the overall system has been analysed monthly, but this analysis is limited
by the indicators used and their relationship to the real objectives of the QAS.
Strengthening the collection processes of the current QAS system is not considered the
most appropriate course of action at this time.
- 10 -
The QAS has given a framework for a focus on quality assurance, but has not
been matched with management training, or carefully staged implementation of a
new and complex system.
Processes for planning, training and revision of the QAS have not been described or
implemented. There are no systematic places for data compilation or usage and data are
not routinely aggregated and disaggregated. Data collection is considered very complex
and not proportionately helpful by the staff. Consequently, there are challenging
process issues that need to be rectified.
Recommendations and practical approaches to building on the current experience with
alternative approaches, towards an efficient, useful, user-friendly tool, are presented.
The QAS offers some insights into current practices, but needs much work to
optimise its potential.
Available 2010-2011 data are analysed and some modest findings on service delivery
can be taken from those data.
A summary of recommendations corresponding to overall terms of reference and
emergent themes is presented in a List of recommendations on page 11.
Overall, a strong technical commitment and further investment from HI and its partners
is warranted. The project, the program and the PRC – as well as the sector and HI itself –
stand to learn much from the implementation of a strong, clear, usable QAS. This attempt
has been extremely innovative, ahead of its time, and evidences the capacity and
commitment of the stakeholders involved. With additional time, support and a revision
of the QAS in concert with overall management and context changes, drawing on new
experiences, a system that meets its original objective of helping ensure sustained,
quality services in Kompong Cham is attainable.
- 11 -
8 List of recommendations
R 1 Overall, it is recommended to re-commit to a simpler, more efficient, useful QAS,
drawing on these experiences, building on the foundation in place, and learning from
emerging examples of good practice...............................................................................53 
R 2 Review which of the indicators reflect simple policy decisions and amend the QAS
and centre management documentation accordingly ......................................................53 
R 3 While continuing with the current QAS processes that are routinely implemented (PT
and P&O key data), review the overall system – in particular, re-defining key indicators
and their operationalization..............................................................................................53 
R 4 Invest in continued development of the overall management approach, ensuring a
new and stronger QAS is seen as the principal management instrument, rather than a
separate ‘project’..............................................................................................................53 
R 5 Incorporate Kompong Cham in the field-testing of HI's 'Rehabilitation Management
System' ............................................................................................................................54 
R 6 Invest in ongoing, systematic revision of data flow, including client cards,
aggregation of data etc. This should be iterative, be sensitive to the negative impacts of
rapid change and build on the initial overall assessments through the RMS...................55 
R 7 Plan and implement basic training in statistics, data types and usage and on quality
assurance in general........................................................................................................56 
R 8 Careful revision of the indicators is needed overall, but one area of immediate focus
might be to examine the number of missed appointments by ensuring a percentage is
reflected in reporting data. ...............................................................................................58 
R 9 Explore options for a sub-project with a specific project officer focusing on
implementing a QAS, including overseeing and supporting the necessary
training .............................................................................................................................58 
- 12 -
9 Methods and activities
This section briefly outlines the methods and activities used to address the key TORs
and to explore emerging themes and findings.
9.1 Audit of existing quality assurance system
To examine the usage of the quality assurance system, a simple data-collation exercise
was undertaken. Using the provided QAS files, a simple month-by-month matrix of which
indicators had been collected was created.
These data are reported as a ‘coffee bean analysis’ – or a graphical representation of
which data had been collected, used, analysed and acted upon.
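The collation behind such a matrix can be sketched as a short script. This is only an illustration of the approach; the indicator names and collection records below are hypothetical placeholders, not the PRC’s actual data:

```python
# Sketch of the month-by-month audit matrix ("coffee bean analysis"):
# for each indicator, mark the months in which data were actually collected.
# Indicator names and collection records are illustrative only.

MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]

# indicator -> set of months in which data were found in the QAS files
collected = {
    "Client satisfaction": {"Jan", "Mar", "Jun"},
    "Device durability": set(),
    "Missed appointments": {"Jan", "Feb", "Mar", "Apr", "May", "Jun"},
}

def audit_matrix(collected, months):
    """Return one row per indicator: a filled mark if data were collected
    that month, a dot if not."""
    rows = []
    for name, present in collected.items():
        marks = ["●" if m in present else "·" for m in months]
        rows.append(f"{name:<22} " + " ".join(marks))
    return rows

for row in audit_matrix(collected, MONTHS):
    print(row)
```

A glance down each row then shows immediately which indicators are collected routinely, sporadically, or not at all.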
Because it became immediately clear that very substantial changes to the original QAS
had happened, a simple timeline of changes to the system and the overall project and
program contexts was developed. This involved a simple retrospective look at key
changes to the project, drawing on discussions with key personnel, project reporting,
logical frameworks and work-plans.
9.2 An examination of current QA indicators
Building on the development of a simpler matrix of QA indicators and parameters from
the pre-existing narrative, this analysis sought to examine the validity of indicators.
Measuring validity – perhaps ironically – is very hard. But looking at face and construct
validity, that is, whether the indicators appear to measure the relevant construct, is
simple enough to do within the scope of the present analysis, and gives a good
orientation to the overall utility of the QA system.
In a systematic manner, the validity of each indicator was analysed and explained.
9.3 Benchmarks – critique and re-definition according to relevant
standards
Because the findings on the overall usage and validity of the current QAS, together with
an examination of the changing context of the PRC, suggested that the original QAS
concept was not being used, lacked valid indicators to measure the intended objectives,
and that the management and human resource context had changed, it was redundant to
develop benchmarks based on the current QAS. Rather, in subsequent
recommendations, approaches to identifying new and alternative benchmarks, drawing
on HI’s and other agencies’ recent experiences in managing physical rehabilitation, are
proposed.
9.4 Operationalising indicators to compare against key benchmarks
As for the benchmarks, operationalising the current indicators is, for now, premature
without a comprehensive re-evaluation of the overall quality assurance approach.
- 13 -
However, the MoSVY data were assessed in detail to understand their implications on
future developments and to ensure they were appropriate in the short-term.
9.5 Refinement and improvement of QAS processes
To understand current QAS processes, and particularly strong and weak areas of
practice, bottlenecks, gaps and repetitions in data collection and so on, a timeline of the
quality cycle was developed, and the relevant human resource domains were stratified
along it. Using this framework, strong points, weak points, comments and other
remarks were documented systematically to understand the overall process, with a view
to proposing changes.
Importantly, this analysis focused only on clinical personnel. As they had not been
involved in either the definition of the quality indicators or the subsequent processing of
data, their role was only in one section of the overall quality cycle. That, in itself, was
taken as an important finding (section 10.5, page 38), but also meant the analysis could
only explore data collection and entry.
The timeline/personnel template is presented as a potential tool for ongoing analysis of
the overall quality management cycle by the project and program teams.
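One way to see how such a template works is as a simple data structure mapping each stage of the quality cycle to the personnel involved. The stage names and role assignments below are hypothetical placeholders, not the PRC’s actual template:

```python
# Sketch of the timeline/personnel template: each quality-cycle stage
# is mapped to the roles involved, with room for free-text observations.
# Stage names and role assignments are illustrative only.

quality_cycle = [
    {"stage": "Define indicators", "personnel": ["Project Manager"], "notes": []},
    {"stage": "Collect data",      "personnel": ["PT", "P&O"],       "notes": []},
    {"stage": "Enter data",        "personnel": ["PT", "P&O"],       "notes": []},
    {"stage": "Aggregate/analyse", "personnel": ["Project Manager"], "notes": []},
    {"stage": "Report and act",    "personnel": ["Project Manager"], "notes": []},
]

def stages_involving(role):
    """List the quality-cycle stages in which a given role participates."""
    return [s["stage"] for s in quality_cycle if role in s["personnel"]]

# In this illustrative mapping, clinical staff appear only in the
# collection and entry stages of the cycle.
print(stages_involving("PT"))  # ['Collect data', 'Enter data']
```

Querying the structure by role makes gaps visible at once: any role that appears in only one segment of the cycle can contribute to, but not learn from, the quality process.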
9.6 Development of a user-friendly composite tool
Given the findings of earlier sections, this TOR was not completed. Rather, the report
makes a series of recommendations for redeveloping the hybrid QAS that has evolved
with MoSVY handover processes and the adoption of the Patient Management System.
9.7 An analysis of 2010/2011 findings
Building on the audit in the first TOR, available data were analysed and findings relevant
to the ongoing activities of the PRC were identified.
- 14 -
10 Results
10.1 Audit of existing quality assurance system
An audit of the existing quality assurance system from 2010 and 2011 is
completed identifying indicators that are routinely collected and those that are not.
An overview of the current indicators is presented on the subsequent pages. This
presentation was chosen to complement the current processes, which have been
piecemeal and therefore complex to use. A more straightforward presentation, with a
clear definition and operationalisation of each indicator, should aid future development
of the system.
The table outlines, according to the previously identified ‘work units’ (i.e., the different
management sections), the key indicators, their current benchmarks, the persons
responsible and so on.
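Operationalising an indicator means pairing each measured value with its benchmark bands. As a minimal sketch of that comparison, the snippet below uses the client-satisfaction thresholds defined in the QAS matrix (85% acceptable, 75–85% needs improvement, below 75% not acceptable); the function name itself is illustrative, not part of the QAS:

```python
def classify_satisfaction(pct):
    """Place a client-satisfaction percentage into the QAS benchmark bands:
    >= 85% acceptable, 75-85% needs improvement, < 75% not acceptable."""
    if pct >= 85:
        return "acceptable"
    if pct >= 75:
        return "needs improvement"
    return "not acceptable"

# Example: a 6-monthly survey result of 80% falls in the middle band.
print(classify_satisfaction(80))  # needs improvement
```

Once every indicator is expressed this way, a monthly report reduces to running each collected value through its bands, which is exactly the step the current QAS leaves undefined for most indicators.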
- 15 -
INSERT: MATRIX CURRENT QA INDICATORS
Insert: A comprehensive matrix of the 2009 QAS guidelines
To facilitate simpler and systematic analysis of the overall QAS approach, this matrix was extracted from the QAS guideline documentation, which was developed in
2009 and has been amended since. Subsequent analyses in this report refer to this matrix and the indicator numbers.
      Indicator  Objective 
Indicator Criteria 
Information 
Sources 
Data 
Collection 
Data 
analysis 
Data responsibility 
Acceptabl
e 
need to 
improve 
not 
acceptable 
Collection 
Analysis 
Project 
1 
Quality of life of PRCs 
client from the 1st to 
2nd assessment 
Improve quality of 
life of the clients 
 
 
Quality of life 
assessment  Daily  Annually 
Social 
Worker 
Project 
Manager 
2 
Level of client 
satisfaction 
To measure client 
satisfaction of the 
device at delivery  85% 
75 ‐ 85 
%  <75% 
satisfaction survey 
with questionnaire  6 monthly  Annually 
Head of 
Work 
Shop 
Head of 
Work 
Shop and 
PM 
3 
% of pathologies treated 
at K Cham PRC and % of 
pathologies treated at 
the 11 PRCs 
To compare the 
representativeness of 
the PRC to the 11 
PRCs  >7  5‐Jul  <5 
PRC and national 
statistic  Annually  Annually 
PM, 
DAC/Mo
SVY and 
other 
PRCs  PM 
- 16 -
Administration 
1  Office Supply to the PRC 
to compare the 
number /amount of 
office supply of 
equipment in the PRC          
Accounting book, 
livre de Bord and 
Cash Box  Monthly 
not 
specified 
Cashier 
and 
Head of 
support 
not 
specified 
2  Staff Leave 
To compare the 
number of leave 
record by 
administrative and 
the leave record by 
each unit 
100% 
followi
ng the 
Policy 
1% 
variatio
n 
>1% 
variatio
n 
Admin and each 
unit record of staff 
leave  Quarterly 
not 
specified 
head of 
support 
unit 
Project 
manager 
with 
head of 
each unit 
3 
Communication of PRC 
by phone  
To strengthen and 
accelerate cost 
effectiveness of 
communication 
through telephone           error in file  Monthly 
not 
specified 
head of 
support 
unit 
PM with 
heads of 
unit 
- 17 -
Administration Continued 
4 
Communication of PRC 
by mailing 
To strengthen and 
accelerate cost 
effectiveness through 
mailing 
>95% 
respon
ded 
and 
filed  95‐90  <90  error in file  quarterly 
not 
specified 
head of 
support 
unit  PM 
5 
Staff training 
(workshops, congresses 
and other refreshers) 
to manage and 
strengthen staff 
training and capacity 
building record 
100% 
recorde
d 
1% 
variatio
n 
>1% 
variatio
n  admin and record   quarterly 
not 
specified 
head of 
support 
unit  PM 
6 
Level of respect of 
working time 
measure level of staff 
commitment  
>7.5 
hours 
7‐7.5 
hours 
<7.5 
hours 
admin file, staff 
movement  Daily  Quarterly  Guard 
Head of 
Support 
Services 
7 
Daily staff presence at 
work / absenteeism 
Staff respects 
working time 
0 staff 
absenc
e 
withou
t notice  5%  >5% 
staff leave record 
with approval by 
line supervision and 
line manager  6 monthly 
not 
specified 
Guard 
and 
section 
heads 
Head of 
Support 
Services 
Accounting

# | Indicator | Objective | Acceptable | Needs improvement | Not acceptable | Source | Collection | Reporting | Collected by | Responsible
1 | Amount of money between accounting book and cash box | Compare the difference | 0% variation | erasures | any difference | Accounting book, livre de bord | Weekly or when needed | not specified | Cashier and head of support unit | not specified
2 | % of money forecast and expenditure | Compare money forecast with expenditure | -5% to 5% | 6-10% | >10% | Monthly treasury, Excel journal of accounting records | Monthly | not specified | PM, head of support unit and cashier, with support from the accountant | —
3 | Amount of money (USD) in cash box | Measure the minimum and maximum balance of money in the cash box | 500-1,000 | 300-500 or 1,000-3,000 | <300 or >3,000 | Cash box, through accounting | Monthly | not specified | Cashier and head of support unit | —
3.1 | Amount of money (Riel) in cash box | Measure the minimum and maximum balance of money in the cash box | 1m-4m | — | — | Cash box, through accounting | Monthly | not specified | Cashier and head of support unit | —
4 | Amount of money in bank | Measure the minimum and maximum balance of money in the bank | 5-7k | 3-5k or 7-10k | <3k or >10k | Bank record and record of cheques | Monthly | not specified | Head of support unit | —
5 | Date of salary payment to staff | Ensure that salary payment is on time | 25th-30th of the month | variation | — | Pay slips and bank transfer records | Monthly | not specified | Head of support unit and HR deputy manager | —
6 | Justification of each expenditure | Ensure clear justification of expenditures | 100% | variation | — | Records of invoices | Monthly | not specified | Cashier and head of support unit | —
7 | Number of cheques from the bank | Ensure that money withdrawal is done properly and regularly | 4 per month | 3 or 5 | <3 or >5 | Cheques recorded | Monthly | not specified | Head of support unit | —
Store Management

# | Indicator | Objective | Acceptable | Needs improvement | Not acceptable | Source | Collection | Reporting | Collected by | Responsible
1 | Quantity of items between stock cards and physical stock | Ensure that all items on the stock cards match the physical stock | 100% | 95% | <95% | Record of stock and cards | Various | not specified | Store keeper and head of social support | —
2 | Critical stock of items | Ensure that all items, especially imported items and consumables, are always available for the workshop | No case of stock-out, or 100% of imported items respect critical stock levels | 5% of items not respecting critical levels, or a maximum of 3 cases of stock-out | More than 5% | Stock Control Report | Quarterly | not specified | Head of support unit and store keeper | —
Workshop Unit

# | Indicator | Objective | Acceptable | Needs improvement | Not acceptable | Source | Collection | Reporting | Collected by | Responsible
1 | Adjustment of the alignment during gait training | Ensure the desired level of smooth gait and stability | <3 | 4-5 | >5 | Daily activities and checklist in client file | Daily | Quarterly | PO and PT | Head of workshop unit and PM
2 | Adjustment of the socket/orthosis | Ensure proper fit of the socket and stability of the prosthesis for the client, with no pain or pressure areas on the skin | <2 | 3-4 | >4 | Daily activities and checklist in client file | Daily | Quarterly | Head of workshop unit | Head of workshop unit and PM
3 | Number of mistakes during the manufacturing process | Measure the technical competency of the P&O | 2 | 3 | >3 | Daily activities, progress notes for P&O and checklist made by head of section | Daily | Quarterly | Head of workshop unit and head of section | Head of workshop unit and PM
4.1 | Prosthesis and orthosis life span / durability (existing clients) | Ensure the quality of the devices | >5 | 3-4 | <3 | Device records, PMS and client file | Monthly | Quarterly | Head of workshop unit | Head of workshop unit and PM
4.2 | Prosthesis and orthosis life span / durability (new clients) | Ensure the quality of the devices | >9 | 7-8 | <7 | Device records, PMS and client file | Monthly | Quarterly | Head of workshop unit | Head of workshop unit and PM
4.3 | Shoe raise and SFAB life span (months) | Ensure the quality of the devices | >5 | 3-5 | <3 | Device records, PMS and client file | Monthly | Quarterly | Head of workshop unit | Head of workshop unit and PM
4.4 | Wheelchair, tricycle, standing frame and seat life span | Ensure the quality of the devices | >85% | 75-85% | <74% | Device records, PMS and client file | Monthly | Quarterly | Head of workshop unit | Head of workshop unit and PM
4.5 | Trolley life span (months) | Ensure the quality of the devices | 9 | 6-8 | <6 | Device records, PMS and client file | Monthly | Quarterly | Head of workshop unit | Head of workshop unit and PM
5 | Level of client satisfaction with the device | Measure client satisfaction with the device at delivery | >85% | 75-85% | <74% | Satisfaction survey with questionnaire | Six-monthly | Annually | Head of workshop unit | Head of workshop unit and PM
6 | Number of devices delivered per month per P&O | Efficiency of P&O work | >33 | 30-33 | <30 | Monthly progress data collection and P&O record book | — | — | — | —
7 | Number of devices delivered per month per BT | Measure the efficiency of BT work | >17 | 15-17 | <15 | Monthly progress data collection and P&O record book | Monthly | Six-monthly | Head of workshop unit | Head of workshop unit and PM
PT Unit

# | Indicator | Objective | Acceptable | Needs improvement | Not acceptable | Source | Collection | Reporting | Collected by | Responsible
1 | Progress in functional skills of clients | Measure the level of technical competency of PTs | >10% | 5-10% | <5% | Treatment plan, progress notes and checklist from client files | Daily or weekly | Quarterly | Head of PT unit | Head of PT unit and PM
2 | Progress in functional skills of children | Measure the level of technical competency of PTs | >10% | 5-10% | <5% | GMFCS assessment form | Monthly | Six-monthly | Head of PT unit | Head of PT unit and PM
3 | Number of detailed treatment plans that include SMART goals | Measure treatment-planning skills | >90% | 80-90% | <80% | Treatment plan, progress notes and checklist from client files | Monthly | Quarterly | Head of PT unit | Head of PT unit and PM
4 | Number of missed appointments at the PRC | Measure the level of client participation in PT treatment | <50% | 50-60% | >60% | Database system, daily appointment schedule | Monthly | Quarterly | Head of PT unit | Head of PT unit and PM
5 | Number of treatment sessions per day per PT | Measure the efficiency of PTs' work | 12-14 | 10-11 or 15-16 | <10 or >16 | Database system, daily appointment schedule | Daily | Six-monthly | Head of PT unit | Head of PT unit and PM
Interdisciplinary approach

# | Indicator | Objective | Acceptable | Needs improvement | Not acceptable | Source | Collection | Reporting | Collected by | Responsible
1 | Number of check-outs by PT and PO | Measure the efficacy of multidisciplinary work (at delivery, all cases should be checked, whether or not the client has a device) | 100% | 95-99% | <95% | Check-out list | Monthly | Six-monthly | PM and heads of sections | PM
2.1 | Morning meeting | Measure the degree of collaboration among staff | >95% | 85-95% | <85% | File-checking system | — | Quarterly | PT unit | PM and head of PT
2.2 | Joint consultation, screening | Measure the degree of collaboration among staff | 100% | 95-99% | ≤94% | File-checking system | — | Quarterly | PT unit | PM and head of PT
2.3 | Joint assessment and prescription / number of MD meetings | Measure the degree of collaboration among staff | 100% | 95-99% | <95% | File-checking system | — | Quarterly | PT unit | PM and head of PT
2.4 | Complex cases meetings | Measure the degree of collaboration among staff | 100% | 95-99% | <95% | File-checking system | — | Quarterly | PT unit | PM and head of PT
2.5 | Daily client round | Measure the degree of collaboration among staff | 100% | 95-99% | <95% | File-checking system | — | Quarterly | PT unit | PM and head of PT
10.1.1 “Coffee bean analysis” – Audit of current indicators
Building on the clearer matrix developed in the previous sub-section, this section explores the current usage of the system. Specifically, an audit of data collected since October 2010 was undertaken, based on the records supplied.
This very simple analysis is not intended to explore the validity of the approach taken, or to interpret any results, but simply to audit whether or not the data have been systematically collected, aggregated and disaggregated. These terms are described below.
Not collected: Indicators that are pre-defined in the 2009 guideline document and reinforced in recent documents, but that were not collected in the specified monthly period. NB: data may have been collected elsewhere, but have not been collated in the QAS spreadsheets².
Collected: Data on the indicator, whether verified or not, are evident in the available QAS spreadsheets.
Aggregated: Data are compiled and averaged within the relevant time period.
Disaggregated: Data are examined for differences between sub-sections, and important comparisons are possible with the treatment of the data.
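To make the four statuses concrete, the audit logic can be expressed in a few lines of code. This is an illustrative sketch only, assuming a simple in-memory record per indicator per month; the field and function names are not taken from the QAS spreadsheets.

```python
from dataclasses import dataclass, field
from typing import Optional

# The four audit statuses, from weakest to strongest treatment of the data.
STATUSES = ("not collected", "collected", "aggregated", "disaggregated")

@dataclass
class IndicatorMonth:
    """One indicator in one monthly period (illustrative structure)."""
    raw_entries: list = field(default_factory=list)   # individual data points
    period_average: Optional[float] = None            # compiled within the period
    subgroups: dict = field(default_factory=dict)     # e.g. {"dorm": 4.2, "services": 3.8}

def audit_status(record: IndicatorMonth) -> str:
    """Classify how far along the collect -> aggregate -> disaggregate chain a record is."""
    if not record.raw_entries:
        return "not collected"
    if record.period_average is None:
        return "collected"
    if not record.subgroups:
        return "aggregated"
    return "disaggregated"

# Example: satisfaction scores collected and averaged, but not broken into subgroups.
record = IndicatorMonth(raw_entries=[4, 5, 3], period_average=4.0)
print(audit_status(record))  # aggregated
```

The point of the sketch is that each status is a strictly stronger condition than the one before it, which is what the 'coffee bean' glyphs encode.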
These findings are presented in the subsequent table.
² A key feature of a good QAS is that it is not entirely distinct from ordinary operational procedures. Here, that has been the case. This issue and some potential solutions are explored in subsequent chapters.
Table 1 - 'Coffee bean' analysis of current collection, processing and analysis of QAS indicators.
[In the original, this table is a month-by-month grid (October 2010 to December 2011) in which each cell carries a four-level 'coffee bean' glyph: not collected / collected / aggregated / disaggregated. The glyphs do not survive in this text version; the indicators audited are listed here by section.]
Project: 1 QoL change; 2 Level of client satisfaction; 3 Pathologies.
Admin: 1 Office supply to the PRC; 2 Staff leave; 3 Phone; 4 Mailing; 5 Staff training; 6 Level of respect of working time; 7 Presence.
Accounting: 1 Amount of money between accounting book and cash box; 2 % of money forecast and expenditure; 3.1 Cash box (USD); 3.2 Cash box (Riel); 4 Amount of money in bank; 5 Date of salary payment to staff; 6 Expenses justification; 7 Number of cheques from bank.
Store: 1 Stock audit; 2 Critical stock of items.
Workshop Unit: 1 Number of alignments; 2 Adjustments; 3 Errors; 4.1 P&O lifespan (existing clients); 4.2 P&O lifespan (new clients); 4.3 Shoe raise and SFAB life span; 4.4 Wheelchair etc. lifespan; 4.5 Trolley life span (months); 5 Satisfaction with P&O; 6 P&O output; 7 BT output.
PT Unit: 1 Functional skills of clients; 2 Functional skills of children; 3 SMART goals; 4 Missed appointments; 5 Number of sessions per PT.
ID team: 1 Number of team check-outs; 2 Morning meeting; 3 Joint consultation, screening; 4 Joint assessment; 5 Complex cases meetings; 6 Daily client round.
MoSVY: 1 Age; 2 Gender; 3 Pathology; 4 New vs existing; 5 Production; 6 Dorm usage; 7 Referrals to and from; 8 Follow-up; 9 Cost.
Other: 1 Waiting time.
The preceding table summarises how data have been collected and treated since late 2010.
Overall, a few key indicators have been reliably collected. Only MoSVY data have been
systematically disaggregated. Original planning was for a phased rollout of all the indicators,
including periodic review of the systems used, the relevance and complexity of indicators and
so on.
This simple analysis demonstrates that many of the more complex indicators have not been used at all, since no instruments to examine them have been developed. Instead, more immediately useful data, with clear systems for collection, aggregation, disaggregation and review, have been collected more successfully.
Consequently, the implementation of the QAS plan according to the specifications outlined in 2009 has not been completed. Rather, the system that has been implemented is one that has adjusted to the changing requirements of additional projects, the MoSVY handover, the availability of useable instruments and the workload constraints of staff.
10.1.2 A timeline of QAS development
Given the multiple pressures and constraints on the current project, the overall Kampong
Cham PRC, its staff and the technical personnel, it may have been complex to reconcile the
varying reporting requirements. This may, in part, explain the limited usage of the previously
developed QAS. To better understand these varied requirements, a simple timeline is
presented in Table 2 on the next page.
Kampong Cham QAS milestones, October 2009 to February 2012 and ongoing. [The original is a chart with one row per work stream; the milestones in each stream are listed here in chronological order.]

Management and admin: initial scoping exercise and QAS guideline/indicator development; initial review of indicators; admin and management indicators scheduled for mid-project implementation; change in project management; shift of emphasis to project-related admin procedures; introduction of MoSVY 'PMS' clinical forms; current review and re-planning; ongoing reflection on management-related QAS processes as clinical outcomes transition to the MoSVY system.

P&O and PT: basic production stats; plan in place for the gradual introduction of more complex indicators; changing emphasis towards MoSVY data; 'core set' of indicators used to ensure a fluid transition to MoSVY data; ongoing change to a simpler set of indicators that can be used to guide practice more effectively.

Finances: plan for implementation of key indicators; shift of emphasis to project-related admin procedures; ongoing re-examination of finance-related QAS tools, separate from project management and HI procedures.

MoSVY handover: MoSVY database in place and used; full reporting to MoSVY implemented via the 'PMS'; ongoing work to understand how MoSVY requirements can be built on for the internal QAS, and to explore cross-centre issues with other stakeholders.

External factors: start of development of the HI-wide 'Rehabilitation Management System'; ongoing testing and development of the system; HI release of the beta 'RMS'; HI shifts in continuing professional development approaches.

Table 2 - A timeline of relevant reporting requirements, QAS milestones and external parameters
This table plots a highly summarised version of key events within the implementation of a QAS, handover processes and concurrent external factors. It shows
that during 2011, handover processes and management changes introduced new and different reporting requirements that probably disrupted the evolution of
the QAS. As of February 2012, clinical reporting is stronger than management and financial processes, which coincides with changing management
structures and handover milestones. 2012 and ongoing activities are summarised.
10.2 An examination of current QA indicators
The indicators are appraised for their relevance against the goals of the quality
assurance system and refined as appropriate, i.e. reconfirming the significance of each
indicator in terms of quality assurance, collection methodologies, data storage
and reporting, to define a master list of key indicators to proceed with.
Building on the previous analyses – specifically, the construction of a clear matrix and the audit of collected data – this section analyses the relevance of the indicators. The analysis focuses on:
- Simple remarks on collection issues identified
- Issues identified with storage
- Comments on reporting
- Face validity
- Construct validity
- Meeting the objectives of the initial QAS design
10.2.1 Validity explained
Because this analysis describes an appraisal of the existing indicators against a set of
goals, it is useful to use common definitions of validity to do so. These are not
necessarily self-explanatory, so simple definitions are given here:
- Face validity asks whether, 'on its face', the indicator is a reasonable one for the construct (the thing, quality or quantity) being measured. Put simply: are we actually measuring what we are trying to measure?
- Construct validity extends the analysis a little further, and asks whether the operationalised indicators relate to what we know about the underlying phenomenon behind them. Put simply: if we see an improvement or deterioration in the indicator, will that reflect genuine gains or losses in what we are actually trying to measure?
10.2.2 Results of analysis of process and validity
Findings and remarks on collection processes and simple observations on validity are
presented in the following table.
Table 3 - Practical collection and storage issues, and remarks on the validity of the current QAS.
This table systematically examines the collection, storage and reporting issues of the current QAS, with remarks on face and construct validity. As with the more practical, user-friendly QAS matrix developed earlier, the table is organised by management section.
For each indicator, the table gives the benchmark criteria (acceptable / needs improvement / not acceptable), remarks on the process (collection, data storage, reporting) and remarks on validity (face and construct).
Project

1. QoL change (benchmarks not defined)
- Collection: SW took over, and then this was stopped during management changes.
- Storage: The data that have been collected have not been routinely aggregated in monthly reporting.
- Reporting: The data have not been reported in the audit period.
- Face validity: Overall, a quality-of-life indicator is much needed.
- Construct validity: The indicator and benchmark criteria are not well operationalised to match HI's emerging tool in this area. Much work is underway on this important tool. An earlier advisory suggested the TIGA instrument, and this should be followed up.

2. Level of client satisfaction (85% / 75-85% / <75%)
- Collection: Appears to have been collected only once, in late 2010.
- Storage: There is no clear place for monthly data to be aggregated.
- Reporting: There has been no attempt to understand who is satisfied, who isn't, and why. Careful disaggregation is needed. This should not be complex: each client will have a simple satisfaction entry, possibly reduced further into 'dorm, services, etc.', but with an overall aggregation.
- Face validity: A client satisfaction tool is an important part of any conceivable QAS approach.
- Construct validity: The benchmarks don't adequately reflect satisfaction. There is no explanation or obvious reason for the delineations between acceptable and not acceptable.

3. Pathologies (>7 / 5-7 / <5)
- Collection: MoSVY data are collected in these areas.
- Storage: In the MoSVY database.
- Reporting: Currently, reporting on this indicator is complex, as there are many groups and variables in a matrix.
Administration

Indicators 1-7 (1 Office supply to the PRC; 2 Staff leave: 100% following the policy / 1% variation / >1% variation; 3 Phone; 4 Mailing: >95% responded and filed / 90-95% / <90%; 5 Staff training: 100% recorded / 1% variation / >1% variation; 6 Level of respect of working time: >7.5 hours / 7-7.5 hours / <7.5 hours; 7 Presence: 0 staff absences without notice / 5% / >5%):
Not examined in the audit. No data, aggregations or analysis appear in the QAS data provided. Some of these indicators are examined in ordinary programme management but do not appear in the QAS.
Accounting

Indicators 1-7 (1 Amount of money between accounting book and cash box: 0% variation / erasures / any difference; 2 % of money forecast and expenditure: -5% to 5% / 6-10% / >10%; 3 Cash box, USD: 500-1,000 / 300-500 or 1,000-3,000 / <300 or >3,000; 3 Cash box, Riel: 1m-4m; 4 Amount of money in bank: 5-7k / 3-5k or 7-10k / <3k or >10k; 5 Date of salary payment to staff: 25th-30th of the month / variation; 6 Expenses justification: 100% / variation; 7 Number of cheques from bank: 4 per month / 3 or 5 / <3 or >5):
Not examined in the audit. No data, aggregations or analysis appear in the QAS data provided. Most of these indicators are more appropriately simple policies; that is, it should not be necessary to measure compliance with compulsory performance as a percentage.
Store Management

Indicators 1-2 (1 Stock audit: 100% / 95% / <95%; 2 Critical stock of items: no case of stock-out, or 100% of imported items respecting critical stock levels / 5% of items not respecting critical levels, or a maximum of 3 cases of stock-out / more than 5%):
Store management is in transition to MoSVY civil servants. Consequently, there appear to have been complexities in developing a systematic approach. This is a key target area for the immediate future.
Workshop Unit

1. Number of alignment adjustments (<3 / 4-5 / >5)
- Collection: Collecting these data appears to be very burdensome and complex.
- Storage: Data have been stored by the head of the P&O unit.
- Reporting: Reporting reduces the data to a simple histogram of the number of alignment changes made for each fitting. There is no attempt to distinguish between team members or prosthesis types.
- Face validity: To measure smooth gait and gait stability, those constructs should be examined, not the number of alignment changes.
- Construct validity: The stated objective doesn't match the indicator. Stability and smoothness are probably linearly related to the number of adjustments, rather than inversely as the indicators would suggest.

2. Adjustments of the socket/orthosis (<2 / 3-4 / >4)
- Collection: As above.
- Storage: There are some gaps in the data storage, and there are complexities in entering and understanding the data.
- Reporting: The data have been reported on occasions. There is no attempt to understand the reasons for high numbers of adjustments.
- Face validity: Proper fit of a device is measured by examining the fit, not by counting how many adjustments it took to reach it.
- Construct validity: The number of substantive socket adjustments might be a useful thing to measure, but it doesn't say very much about the quality of the final fit.

3. Errors during manufacturing (2 / 3 / >3)
- Collection: Because mistakes have only been vaguely defined, it is almost impossible to collect these data reliably.
- Storage: There is no clear place for monthly data to be aggregated.
- Reporting: An average number of mistakes has been reported on occasions, with little analysis.
- Face validity: The technical competency of the P&O is only very tenuously related to the number of mistakes during the manufacturing process; at best, it is one very small component of competence. Further, the errors defined are more properly the work of a bench worker.
- Construct validity: Competence is a complex thing to measure. We have experience with a number of tools for rapid analysis; however, a longer-term mentoring approach is a far more appropriate way to measure competence. That is a harder and longer-term management process change, but it should be explored. This would have been in place under previous management processes, but appears to have deteriorated and has possibly been confused by constant change in management approaches.
4.1 P&O lifespan, existing clients (>5 / 3-4 / <3)
- Collection: Not collected for new clients, only existing clients.
- Storage: This is a complex statistic to understand, even with reliable collection. As devices have mixed life-spans at the time of analysis, it is hard to understand retrospectively what went wrong.
- Reporting: An average device life span has been reported for most months since late 2010. It does not distinguish between prostheses and orthoses, or between devices of different complexity.
- Face validity: The objective needs to be reconsidered: durability and quality are not the same thing.
- Construct validity: The data analysis does not match the indicators, and the benchmarks need to be better defined. It is probably more sensible to use a self-benchmark, in which the data are compared with previous months, rather than an arbitrarily determined number intended to fit all types of devices, when different devices have vastly different expected life-spans. Possibly a more useful approach would be an ongoing audit of the kinds of failure seen in returned devices, with efforts to address recurrent issues.

4.2 P&O lifespan, new clients (>9 / 7-8 / <7)
- Collection: Not collected for new clients.
- Storage: There is no clear place for monthly data to be aggregated.
- Reporting: Not reported.
- Face validity: It is unclear why new and existing clients would have different device life-spans, other than differences related to fit and functional changes, and it is unclear how monitoring this statistic would help improve the service.

4.3 Shoe raise and SFAB life span (>5 / 3-5 / <3), 4.4 Wheelchair etc. lifespan (>85% / 75-85% / <74%) and 5.1 Trolley life span in months (9 / 6-8 / <6): not collected, not defined.

5.2 Satisfaction with P&O devices (>85% / 75-85% / <74%)
- Collection: There is no satisfaction form for devices, though there is one for the PRC overall. The statistic has not been collected.
- Storage: There is no clear place for monthly data to be aggregated.
- Reporting: Has not been reported.
- Face validity: Satisfaction with the device would be a useful measure to collect, provided a tool can be developed that is valid and reliable without being too complicated.
6. P&O output (>33 / 30-33 / <30)
- Collection: Doesn't count complex devices such as standing systems. It is unclear why there is a large difference between the P&Os and the bench workers, and the benchmarks don't match these data.
- Storage: The process of collecting and entering these data appears to be more complex than necessary, as the information has largely been collected in the MoSVY database anyway.
- Reporting: No attempt has been made to look at the range of staff output. The reason to do this would not be to critique low outputs, but to have a way of strategically planning who might have time for alternative activities, particularly related to CPD and career development.
- Face validity: The indicator says nothing about efficiency unless the units are better defined, that is, the net effort for a particular device. Month-to-month comparisons are meaningless if the complexity of devices changes.
- Construct validity: Efficiency is about much more than crude numbers of devices. That is not to say the index shouldn't be measured, but assuming it is related to efficiency is problematic. Costs, functional gains, quality and relevance of the prescription are all related to efficiency, and these are not currently explored.

7. BT output (>17 / 15-17 / <15): as above.
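The self-benchmark suggested for the life-span indicators above, in which each month's figure is compared with the centre's own recent months rather than with a fixed number meant to fit every device type, could be sketched as follows. This is an illustrative sketch; the 15% tolerance is an arbitrary example, not a recommended threshold.

```python
from statistics import mean

def self_benchmark(history, current, tolerance=0.15):
    """Compare this month's value with the trailing average of previous months.

    history   -- values from recent months (e.g. average device life-span)
    current   -- this month's value
    tolerance -- how far below its own baseline a value may fall before review
    """
    if not history:
        return "no baseline yet"
    baseline = mean(history)
    if current < baseline * (1 - tolerance):
        return "below own baseline - review returned devices"
    return "within own baseline"

# Average life span (months) of returned devices in recent months, then this month.
print(self_benchmark([6.0, 5.5, 6.2], 4.5))  # below own baseline - review returned devices
```

A design note: because the baseline is the centre's own history, the check adapts automatically to the local mix of devices, which a single fixed benchmark cannot do.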
PT Unit

1. Functional skills of clients (>10% / 5-10% / <5%)
- Collection: Not currently collected; very difficult to implement. Has not been collected to date.
- Validity: This is a useful construct to measure, but it does not necessarily say anything about the competence of the professionals.

2. Functional skills of children (>10% / 5-10% / <5%): see comments in file.

3. SMART goals (>90% / 80-90% / <80%)
- Collection: For each client, whether the goal is SMART has been determined; however, this has not been compared against the agreed benchmark.
- Storage: Currently stored in a separate client-by-client spreadsheet, with a simple yes/no indicator for SMART goals.
- Reporting: While the data have been collected and some effort has been made to average them, they haven't been compared with benchmarks.
- Face validity: If a decision is taken to use SMART goals in this context, just do it; there is hardly a need to measure compliance with a policy as a percentage.
- Construct validity: Whether or not goals are SMART doesn't really indicate whether the treatment plans are of a high standard, just that they use a particular documentation approach. A simple approach to measuring planning skills might be part of an overall approach to examining the competency of PTs.

4. Missed appointments at the PRC (<50% / 50-60% / >60%)
- Collection: Until now the figures were reversed, meaning that a high number was considered positive, when in fact the number counts MISSED appointments.
- Storage: Reported and stored in a separate spreadsheet; entered by administration.
- Reporting: Reported on occasions as a simple percentage, without disaggregation or further analysis.
- Face validity: This is probably a simple, useful and therefore valuable statistic, but it probably doesn't accurately measure the stated objective.
- Construct validity: This covers not only PT but the whole PRC.

5. Number of sessions per day per PT (12-14 / 10-11 or 15-16 / <10 or >16)
- Storage: A complex approach is currently used for entering these data into the QAS.
- Reporting: This has consistently been reported through both the QAS and MoSVY databases.
- Face validity: As for P&O, the simple number of sessions per therapist per day is probably too crude to measure the stated objective, but it needs to be measured anyway.
- Construct validity: Doesn't include all treatments, such as client education, and it is not clear whether the data can be disaggregated between therapists.
Interdisciplinary approach

1. Number of team check-outs (100% / 95-99% / <95%)
- Storage: Currently entered via a complex approach, without a clear definition of responsibility.
- Reporting: Not reported consistently, though the aggregated data are available for most months.
- Face validity: The index seems unrelated to the stated objective: simply checking out devices together says nothing at all about the efficacy of doing so.
- Construct validity: The need for this statistic is unclear. Like some others, it probably reflects what is really a simple policy and practice decision: if the policy is that all clients with an assistive device should be checked out by both PT and P&O, then that should just happen. The QAS should not necessarily measure to the percentage what happens, but should identify breakdowns in that system and seek to address them.

2.1 Morning meeting (>95% / 85-95% / <85%), 2.2 Joint consultation, screening (100% / 95-99% / ≤94%), 2.3 Joint assessment (100% / 95-99% / <95%), 2.4 Complex cases meetings (100% / 95-99% / <95%) and 3 Daily client round (100% / 95-99% / <95%): only vaguely and not reliably recorded; these should be replaced with a policy decision and practice change.
10.2.3 Summary of findings on process and indicator validity.
Overall, the introduction of a pilot QAS approach into the PRC has had many positive results: a focus on understanding quality, thinking about indicators and definitions of quality service, and ensuring a strong, focused discourse on quality during the handover to the MoSVY.
However, the analysis here suggests that there have been many complexities in the
introduction, and the result is a complex system with modest direct advantages. This
section examines some of the key issues.
In general, a few indicators are clearly useful to the staff, are collected systematically and
can be used to draw useful inferences about the quality of ongoing services.
There are, however, several recurrent themes for many of the indicators. These are
summarised as follows.
- Overall, QAS approaches have been considered relevant mostly to clinical areas, rather than to over-arching administrative, HR and logistics issues.
- Indicators are often a poor reflection of the parameter they are trying to measure.
- Many indicators are not well operationalised; that is, they do not measure what we are interested in.
- Many indicators simply seek to measure compliance with a policy, and should probably be replaced with a policy change and management to ensure compliance with it.
- There is no real mechanism to review data, reflect on issues, and plan and implement process change.
- The collection of data is seen as, and is, burdensome and not proportional to the value it offers.
These issues are further analysed in subsequent sections.
10.3 Benchmarks – critique and re-definition according to relevant
standards
Benchmarks for the indicators are identified from national and international
standards, local laws and customs, MoSVY/PoSVY and PRC internal practice and
policies.
Given the results outlined in section 10.1, deepening the analysis to examine the
benchmarks one by one is, for the most part, redundant. Because many of the
indicators themselves lack validity, seeking benchmarks for them from the relevant
literature is neither appropriate nor possible. Instead, changes to the quality
assurance approach at the PRC are needed, building on the strengths of the PMS
system, the MoSVY reporting requirements and the indicators that have so far been
collected efficiently and simply.
10.4 Operationalising indicators to compare against key benchmarks
Indicators and benchmarks are made operational to enable identification of risk
and safety concerns as well as identification of acceptable targets.
As in section 10.3 above, a deep analysis of how the indicators are operationalised, and
of how data are gathered using them, is probably far less relevant than a fresh look at
the overall quality assurance approach. For this reason, this TOR was approached
differently, by proposing alternative strategies that build on current QA activities in a
more efficient and effective manner.
10.5 Refinement and improvement of QAS processes
Data collection, storage and reporting systems refined and/or developed clarifying
information source, frequency of collection and responsible person.
Given the complexities and wider challenges facing the quality assurance system, refining
the current system without first revisiting its basic elements, such as the key indicators
and reporting requirements, and without developing the requisite skills in responsible
staff, is not feasible within the current timeframe. Rather, a programmatic response is
needed: one that considers this evaluation and its recommendations, and that plans and
implements further refinements, in light of the recent developments that have precipitated
major changes, over a longer period.
Irrespective of these findings, as a starting point, a simple process analysis focusing on
clinical services, rather than on management and administration (due to time and the
availability of relevant staff), was conducted.
A matrix of actors and processes was developed, simply by defining the key domains of
activity on one axis and approximating the quality cycle on the other. Doing so enables a
quick analysis of the different processes for each actor at each stage. In each cell, semi-
structured analysis can explore:
- Challenges
- Barriers
- External constraints
- Good practice
- Errors
- Other experiences
Doing so allows a structured examination of bottle-necks or strong points in the overall
process.
The matrix is presented in Table 4 below.
Table 4 - A process/actor matrix for examining the current approach to QAS implementation

Process stages (left-right axis, approximating the quality cycle):
- Plan: strategic planning process; unit-level planning of indicators; analyse external factors
- Define indicators: define indicators; propose and agree on benchmarks
- Data collection & use: generate original data; enter primary data; transfer data to central database; compile/aggregate/disaggregate
- Monitor: examine data; examine against indicators
- Analyse: analyse cause of variations; identify responsible staff
- Process change: implement change; process change to system

Actors (table rows, by domain and sub-category):
- Administration: management; office
- Finance: accounting; management
- Clinical: PT; P&O; social; wheelchair/etc.
- Support: store; guards; etc.
- User

This table plots the elements for analysis by domain and process. The broad time elements of a QAS process are approximated on the left-right axis, and the key
domains are presented in the table rows. Cells are left blank for completion during the analysis; in the original matrix, a highlighted box marked the key areas of focus in the present analysis.
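As a hypothetical sketch (the names and structure here are invented for illustration), the matrix in Table 4 could be held as a nested mapping, with one cell per actor and process stage collecting the semi-structured observations:

```python
# A sketch of Table 4 as a nested dictionary: one cell per
# (actor, stage) pair, each holding free-text observations
# under the semi-structured headings used in the analysis.
STAGES = ["plan", "define indicators", "data collection & use",
          "monitor", "analyse", "process change"]
ACTORS = ["Management", "Office", "Accounting", "PT", "P&O",
          "Social", "Store", "User"]
HEADINGS = ["challenges", "barriers", "external constraints",
            "good practice", "errors", "other experiences"]

matrix = {actor: {stage: {h: [] for h in HEADINGS} for stage in STAGES}
          for actor in ACTORS}

# Record one (hypothetical) observation for the P&O data-collection cell.
matrix["P&O"]["data collection & use"]["challenges"].append(
    "Data entered twice: client form, then a separate data form")

print(len(matrix["P&O"]["data collection & use"]["challenges"]))  # 1
```

A structure like this makes it easy to list, for any actor or stage, the cells where observations cluster, which is the bottleneck analysis the matrix is intended to support.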
Once this matrix had been developed, we made a simple analysis of key ‘cells’ – that is,
tasks and considerations for particular human resources at particular elements of the
cycle. The next sections divide the cycle into planning and defining indicators, gathering
and usage of data and monitoring, analysis and reporting.
Because the staff involved in this section of the analysis were most concerned with their
responsibilities for collecting the data, this was the main focus, but we also made general
observations on planning and analysis.
10.5.1 Planning and definitions of indicators
The planning and definition of indicators occurred mostly in 2009. Recommendations for
change were made in late 2010. Proposed adjustments have not been implemented.
There is no systematic approach to re-evaluating indicators. The present evaluation was
intended to re-shape and build upon the current approach but, as earlier sections have
suggested, there have been dramatic shifts in the project constraints, HR and reporting
requirements that require adjustments beyond the scope of this evaluation alone.
However, the matrix developed in Table 4 above offers a systematic approach to
re-developing the system if that is considered the most appropriate way forward.
The analysis of the validity of the indicators presented in 10.2 on page 30 suggests that
this is perhaps one of the biggest bottlenecks in the process. The data are not used
effectively because the indicators have not been refined, tested and reconsidered, and,
more importantly, there is currently no plan to do so. Just as critically, there is no point in
the QA processes where this can be done routinely. Because standards change and the
working context changes, not least through new and stronger skills in gathering, using
and interpreting QA data, it follows that the indicators and processes should also evolve.
Currently, that capacity is lacking in the QAS approach, and is probably largely
responsible for the limitations observed.
10.5.2 Data collection
Generating original data
The PT and P&O involved in this section of the analysis both expressed that the overall
number of data-entry requirements is very large. They also noted, though not in this
exercise, that there are too many steps in the process: data are entered into the client
form, and then into a separate data form. They also suggested that:
“the responsibility is changing sometimes. There is not always a clear
form or process for collecting some of the indicators”
In the P&O section, the respondent observed:
“For the core patient details this is done via a single form through a
database manager. For QA data, there is no central place. The unit
heads are keeping their own files and then occasionally they have been
compiled but right now the process has stopped.”
This highlights that the QA system is seen as conceptually different from ‘normal’ data
usage. It is consistent with the earlier findings that only a few of the indicators are
routinely collected and that there is no overarching process described for the QA
system.
The observation that ‘the responsibility is changing’ is consistent with the changing
external requirements, shifting management processes and a general lack of
experience with quality management systems, on the part not only of the specific
responsible persons but also of the program, HI and the wider sector in general.
Entering primary data
In addition to the complexities, described above, of knowing which data to
use, there are many challenges in entering relevant and accurate data into a QA
system.
Some of the problems included multiple handling of data:
“Entering the data into a software (sic). Sometimes we have to get the
statistics from a database. For example the staff working hours. “
…and bottlenecks in the data flow:
“Before we needed it (the data) from the admin but now we get it directly
from the database or the PM.”
These findings reinforce that the QA system is not only conceptually problematic in terms
of the chosen indicators, but that there are complex and inadequate systems for
collecting and entering data.
Centralisation – entering into database
Here, we observed that there are still further separations and complexities with the QA
system. Overall, data required for the MoSVY reporting system are systematically and
routinely entered into a central database. Data for the QA system, on the other hand,
must be entered separately by unit heads, and there is no strong oversight of the
process. The staff have not been supported to develop their skills in data usage and
processing, or even provided with reasonable templates. While more sophisticated
analyses are possible, even a simple assessment suggests that the multiple entry
sources, the incomplete development of templates and the workload involved in using
the system are critical factors, in addition to other aspects such as the validity of the
indicators and the separation of the QAS from ordinary management processes.
Compilation, aggregation, disaggregation
10.5.3 Monitoring, analysis and reporting
There are clear limitations in the way data are monitored and analysed. During the
previous phase, there was systematic reporting of a few key indicators, especially in the
clinical domains. There are no reports on management-related indicators. For the
present analysis, only the clinical indicators were explored, but the more important
finding is that only a few indicators are systematically developed.
10.6 Development of a user-friendly composite tool
A user-friendly composite tool is developed for data management.
While it is clear that this item is very much needed, re-developing the current system
within the scope of the present technical input is very complex indeed. It would probably
also be counterproductive, since the implementation, change and piecemeal development
of the current system are not consistent with sustained, satisfying and valued use of a
QAS in Kampong Cham.
Rather, it is proposed to extend and re-focus the emphasis of the quality management
system to take into account: the changes in management structure; the strong efforts of
the staff in using the current system; external developments in what we know about
managing for good physical rehabilitation services; and HI’s own development of a
robust and valid quality management process, together with a suite of research efforts
focused on sustainability, quality and governance. This approach takes into account the
elements found in a simple analysis of the QAS implementation process so far,
summarised earlier in Table 2 (a timeline of relevant reporting requirements, QAS
milestones and external parameters) in section 10.1.2.
Recommendations for an immediate course of action and short-term responses in
balance with the project requirements, practical options and good practices are
presented elsewhere.
10.7 An analysis of 2010/2011 findings
2010 indicators and first semester 2011 indicators are reviewed and a report
highlighting the main findings in terms of performance and quality is produced.
10.7.1 ‘Workshop’ results
Indicators 1 & 2 - Adjustment during alignment and fitting
Only a few months’ data were available to explore alignment and other adjustments.
Further, the validity of these indicators was challenged in section 10.2 on page 30.
Regardless, these data are examined here. Figure 1 presents the available data
graphically.
Almost all fittings in the early months were within benchmark targets. In later months,
there was deterioration against the alignment targets. Other adjustments (indicator 2 in
the workshop template) were not collected in May or June.
These findings probably suggest that the indicator is not sufficiently operationalised and
produces a ceiling effect; that is, it is not sensitive enough to detect the change and
variation needed to make useful decisions. However, the data also suggest an overall
low rate of alignment changes and adjustment before fitting. While a low rate is listed as
the appropriate benchmark, the contrary is probably true: more alignment changes
probably result in a better final alignment, and reflect systematic adjustment of limb
alignment as training progresses and function improves.
Figure 1 - Percentage of adjustments under target rate.
Monthly percentage of alignment (dark bars) and other adjustments (light bar) from January to
June, 2011.
[Chart omitted: monthly bars for alignment and other adjustments in each benchmark category (acceptable / needs improvement / unacceptable); y-axis: percentage of fittings under the target error rate, 0-100.]
Indicator 4.3 – device durability
Overall, the raw data do not allow a simple appraisal of device lifespan. The recorded
indicator is, for a given month, the number of devices falling into each of the three
benchmark duration categories.
The grand mean of devices that lasted for an ‘acceptable’ 9 months or more was 45
devices per month; 22.4 devices per month were in the ‘needs improvement’ range of 7
or 8 months; and 39.45 devices per month were unacceptable (Figure 2). Overall, then,
about 58% are not meeting the acceptable standard of durability.
Understanding those findings in more detail, based on the current raw data, is very
difficult or impossible. It is not possible to understand which devices are breaking more
than others. We would, for example, expect transtibial devices to fail before upper limb
devices. We almost always anticipate orthoses will last longer than prostheses. We can’t
examine whether, for instance, some devices failed after a week, since no range can be
measured from the reported data.
Overall, then, we can detect that there are probably some general issues with durability,
since over half of the devices are not lasting more than 9 months, and roughly 40% are
lasting less than 7 months. This warrants close investigation and careful disaggregation
of the data, to examine which devices are failing, for which users and under which
circumstances, so that potential remedial action can be taken.
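The headline shares can be recomputed directly from the three monthly grand means quoted above; a minimal sketch:

```python
# Grand means per month, from the durability data above.
acceptable = 45.0         # lasted 9 months or more
needs_improvement = 22.4  # lasted 7 or 8 months
unacceptable = 39.45      # lasted less than 7 months

total = acceptable + needs_improvement + unacceptable
share_below_standard = (needs_improvement + unacceptable) / total
share_failing_early = unacceptable / total

print(f"{share_below_standard:.0%}")  # 58%
print(f"{share_failing_early:.0%}")   # 37%
```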
Figure 2 - Number of devices within benchmark ranges per month
Number of devices within each benchmark range is presented month by month. These data
represent all devices (new, old, prostheses, orthoses, various levels).
[Chart omitted: number of devices in each benchmark category (acceptable / needs improvement / unacceptable), month by month from December to July, plus the mean; y-axis: number of devices, 0-80.]
Indicators 6 & 7 – P&O and benchworker production statistics
Only four months’ data were available for processing. P&Os achieved a rate that was
acceptable according to the pre-determined benchmarks in January and July of 2011.
The results were unacceptable in December and April, when only around 21 devices
were produced monthly.
For benchworkers, the production rate was only acceptable in January, and
unacceptable in all other recorded months. Again, though, precisely why and how these
benchmarks were chosen is unclear, and since we do not know the caseload, the
complexity of the devices or the other activities for those months, the analysis reveals
little about efficiency and productivity, other than that the production rates are highly
variable from month to month. Indeed, the rates approximate the averages in most other
contexts and, on their face, seem appropriate. A prosthetist might be expected, on
average, to deliver over one device per day, and one prosthetist might have two
benchworkers, meaning each benchworker’s rate is about half that. These numbers are
consistent with the findings here.
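Applying the stated benchmarks to the four reported monthly rates reproduces the bandings described above; a sketch (the treatment of rates of exactly 33 or 17 devices is an assumption, since the benchmark wording does not specify the edges):

```python
def band(rate, acceptable_min, unacceptable_below):
    """Classify a monthly production rate against the QAS benchmarks."""
    if rate >= acceptable_min:
        return "acceptable"
    if rate < unacceptable_below:
        return "unacceptable"
    return "needs improvement"

# Monthly P&O rates reported in Figure 3 (Dec, Jan, Apr, Jul).
po_rates = [21.41, 34.86, 21.15, 34]
print([band(r, 33, 30) for r in po_rates])
# ['unacceptable', 'acceptable', 'unacceptable', 'acceptable']

# Benchworker rates for the same months.
bw_rates = [7.57, 17.28, 9.49, 12]
print([band(r, 17, 15) for r in bw_rates])
# ['unacceptable', 'acceptable', 'unacceptable', 'unacceptable']
```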
Figure 3 - P&O and Benchworker Production Rates
P&O acceptable rate is >33 devices per month. For benchworkers, the rate is 17 per month.
Unacceptable rates are <30 and <15 respectively.
Reported monthly rates (devices per month):

Month   P&O     Benchworkers
Dec     21.41    7.57
Jan     34.86   17.28
Apr     21.15    9.49
Jul     34      12
10.7.2 PT findings
Indicator 3 – Treatment planning
For three out of four measured months, treatment plans included SMART goals. In June,
compliance was ‘unacceptable’.
Overall, these results probably suggest strong compliance with a decision to use
SMART goals for all treatment planning. No other meaningful inferences can be
drawn.
Figure 4 - Percentage of treatment plans with SMART goals per month
The benchmark ‘Acceptable’ rate is 90%. Unacceptable is <80%
[Chart omitted: percentage of treatment plans with SMART goals per month; the four recorded values were 90, 90.1, 78 and 90.91 per cent.]
Indicator 4 – missed appointments at the PRC
While data are recorded here, it is not possible to compare them against the
benchmarks, as only a raw number of missed appointments is reported, rather than a
percentage. The numbers may therefore represent a reasonable percentage of missed
appointments. No inferences can be drawn from these data.
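The fix implied above, reporting missed appointments as a percentage rather than a raw count, only requires the monthly appointment total as a denominator; a minimal sketch (the totals below are invented for illustration):

```python
def missed_rate(missed, total_appointments):
    """Express missed appointments as a percentage of those scheduled."""
    if total_appointments <= 0:
        raise ValueError("total_appointments must be positive")
    return 100.0 * missed / total_appointments

# 62 missed out of a hypothetical 800 scheduled appointments.
print(round(missed_rate(62, 800), 1))  # 7.8
```

A percentage computed this way could be compared directly against a benchmark, which the raw counts cannot.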
Figure 5 - Missed appointments at the PRC
The intention of this indicator is to understand the responsiveness of the service to the client
needs. Benchmark acceptable is <50
[Chart omitted: monthly number of missed appointments; the four recorded values were 62.2, 57.98, 69 and 62.02.]
Indicator 5 – Daily treatments per PT
On average, PTs delivered 13.3 treatments per day over the four months for which
reliable data were available, ranging from around 10 up to nearly 19. While a standard
deviation cannot be computed meaningfully from such a small dataset, the variation here
might reflect differences in the measurement process from month to month, as well as
differences in the available working time, rather than the efficiency of the therapists that
the indicator is meant to measure.
Overall, the average was within the ‘acceptable’ benchmark range, but each of the four
months taken individually was outside it: some fell in the ‘needs improvement’ range of
10-12 treatments per day and, in July, there was a major violation of the >16
‘unacceptable’ benchmark. It is important, then, to understand not only appropriate
benchmarks but also reasonable month-to-month variation.
Overall, while there are some limitations in these data, they suggest an overall trend of a
reasonable work rate for physical therapists.
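The point that the mean falls in the acceptable band while no individual month does can be reproduced from the four reported values; a sketch using the benchmark bands stated in Figure 6:

```python
def band(daily_treatments):
    """Band a monthly average of daily PT treatments: 12-14 acceptable,
    below 10 or above 16 unacceptable, otherwise needs improvement."""
    if daily_treatments < 10 or daily_treatments > 16:
        return "unacceptable"
    if 12 <= daily_treatments <= 14:
        return "acceptable"
    return "needs improvement"

monthly = [14.1, 10.03, 18.9, 10.03]  # the four reliable months
mean = sum(monthly) / len(monthly)

print(round(mean, 3))  # 13.265
print(band(mean))      # acceptable
print([band(m) for m in monthly])
# ['needs improvement', 'needs improvement', 'unacceptable', 'needs improvement']
```

This is exactly why month-to-month variation, not just the aggregate mean, needs a place in the benchmarks.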
Figure 6 - Daily Treatments per PT
The acceptable range is 12-14, unacceptable is less than 10 or greater than 16.
[Chart omitted: average daily treatment sessions per PT; the four recorded monthly values were 14.1, 10.03, 18.9 and 10.03, with an overall mean of 13.265.]
10.8 Additional Analysis: Comparing the QAS against sustainability
indicators
While sustainability of services has always been a key target for HI, and many of HI’s
operational methodologies have sustainable access to quality services as a centrepiece,
the organisation has re-focused its attention on sustainability of rehabilitation in the last
few years. Consequently, it has a stronger understanding of the predictors of sustained
delivery of rehabilitation services, and has experimented with a core set of indicators. It
is therefore very likely that a system for reflection on those indicators will be introduced
at some level, whether the service level or the governance level.
While this is positive and necessary, for staff working in a PRC there is a strong chance
it would further destabilise the current QA processes. Consequently, the technical team
in Cambodia proposed an additional analysis of the correlation between the QA system
and the sustainability indicators, to see which indicators are already being collected and
so minimise any necessary changes.
The findings of this simple analysis are presented in Table 5 below.
Table 5 - Cross analysis of indicators between KC QAS and HI sustainability
indicators
This table presents the draft sustainability indicators prepared by an earlier sustainability working
group in Cambodia. The analysis outlines whether the indicator is measured at the KC PRC
(whether in the QAS system or elsewhere) and whether the indicator is used in management
decisions. A dash indicates no entry in the source table.

Kh National Sustainability Indicators (defined by the sustainability working group)

Core indicators at PRC level (measured at KC PRC / in QAS? / used?):

C1 Health outcomes
- Number of old cases coming to PRC: yes / – / no
- Number of new cases coming to PRC: yes / – / no

C2 Health services provision
- Number of people treated: yes / – / no
- Number of devices produced: yes / yes / no
- Level of quality of services: no / attempted but no / –
- Number of referrals: yes / – / no
- Number of outreach activities: ? / – / no

C3 & C4 Viability and organizational capacity
- Availability of monitoring system: partly / yes but not functioning / –
- Availability of annual plan: no / – / no
- Availability of annual budget: yes / – / no
- % of PRC staff employed and paid by INGO: yes / – / no
- Percentage of PRC staff working for more than 4 hours/day: no / – / no
- Availability of raw material and consumables: unclear / yes / no
- Availability of administrative procedures: no / – / no
- Implementation of HR procedures: no / – / no
- Percentage of PRC staff replaced (same qualification): no / – / no
- Percentage of staff who follow national standards (ISPO): no / – / no
- Number of supervision visits of PoSVY directors to the PRCs: ? / – / no

C5 Community capacity
- Level of awareness of community people/PWDs of the PRC activities and services: no / – / no
- Level of technical referral/follow-up of clients at community level: no / – / no
- Number of people/PWDs (physical) (women and men) coming by themselves to the PRCs: yes / – / no

C6 Enabling environment
- Level of implementation of the National Action Plan of PWDs: no / – / no
- Level of implementation of Cambodia Disability Law: no / – / no

Core indicators at national system level (without putting them under specific components):
- Percentage of PRCs/Factory’s utility costs covered by national budget (largely covered by MoSVY database)
- Implementation of the MoU
- Level of implementation of the National Plan of PWDs
- Ratification of the UNCRPD
- Availability of a centralized database on PRC statistics, costs
- Level of funds invested by government and/or donors
- Adequate recognition of PO and PT qualification in the public salary scale
- Percentage of clients who pay out of pocket money to access PRC services
- Level of money received by PRC/National Component Factory compared to money allocated to PRCs according to set standards
- Number of supervision visits of 11 PRCs
- Number of people treated/given services by PRCs
- Level of financial viability of the National Component Factory
The table above demonstrates very clearly that the current QA approach does not
explore many of the sustainability indicators proposed by the working group. More
alarmingly, where they are collected, the data are not examined by management and
acted upon in either a strategic or systematic manner.
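Mechanically, a cross-analysis like Table 5 amounts to set comparisons between the indicator lists. A minimal sketch, with abbreviated and purely illustrative entries:

```python
# Illustrative subsets of the indicator lists in Table 5.
sustainability = {"old cases", "new cases", "people treated",
                  "devices produced", "annual plan", "annual budget"}
measured_at_prc = {"old cases", "new cases", "people treated",
                   "devices produced", "annual budget"}
in_qas = {"devices produced"}

# Gaps the cross-analysis is designed to expose.
not_measured = sustainability - measured_at_prc
measured_outside_qas = measured_at_prc - in_qas

print(sorted(not_measured))  # ['annual plan']
print(sorted(measured_outside_qas))
# ['annual budget', 'new cases', 'old cases', 'people treated']
```

Scripting the comparison this way would let the cross-check be re-run whenever either indicator list changes, rather than redone by hand.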
Bringing AI into a Mid-Sized Company: A structured ApproachBringing AI into a Mid-Sized Company: A structured Approach
Bringing AI into a Mid-Sized Company: A structured Approach
Brian Frerichs
 
CANSA support - Caring for Cancer Patients' Caregivers
CANSA support - Caring for Cancer Patients' CaregiversCANSA support - Caring for Cancer Patients' Caregivers
CANSA support - Caring for Cancer Patients' Caregivers
CANSA The Cancer Association of South Africa
 
Unlocking the Secrets to Safe Patient Handling.pdf
Unlocking the Secrets to Safe Patient Handling.pdfUnlocking the Secrets to Safe Patient Handling.pdf
Unlocking the Secrets to Safe Patient Handling.pdf
Lift Ability
 
FACIAL NERVE
FACIAL NERVEFACIAL NERVE
FACIAL NERVE
aditigupta1117
 
Luxurious Spa In Ajman Chandrima Massage Center
Luxurious Spa In Ajman Chandrima Massage CenterLuxurious Spa In Ajman Chandrima Massage Center
Luxurious Spa In Ajman Chandrima Massage Center
Chandrima Spa Ajman
 
Bath patient Fundamental of Nursing.pptx
Bath patient Fundamental of Nursing.pptxBath patient Fundamental of Nursing.pptx
Bath patient Fundamental of Nursing.pptx
MianProductions
 
NKTI Annual Report - Annual Report FY 2022
NKTI Annual Report - Annual Report FY 2022NKTI Annual Report - Annual Report FY 2022
NKTI Annual Report - Annual Report FY 2022
nktiacc3
 
Can coffee help me lose weight? Yes, 25,422 users in the USA use it for that ...
Can coffee help me lose weight? Yes, 25,422 users in the USA use it for that ...Can coffee help me lose weight? Yes, 25,422 users in the USA use it for that ...
Can coffee help me lose weight? Yes, 25,422 users in the USA use it for that ...
nirahealhty
 
DR SHAMIN EABENSON - JOURNAL CLUB - NEEDLE STICK INJURY
DR SHAMIN EABENSON - JOURNAL CLUB - NEEDLE STICK INJURYDR SHAMIN EABENSON - JOURNAL CLUB - NEEDLE STICK INJURY
DR SHAMIN EABENSON - JOURNAL CLUB - NEEDLE STICK INJURY
SHAMIN EABENSON
 
MBC Support Group for Black Women – Insights in Genetic Testing.pdf
MBC Support Group for Black Women – Insights in Genetic Testing.pdfMBC Support Group for Black Women – Insights in Genetic Testing.pdf
MBC Support Group for Black Women – Insights in Genetic Testing.pdf
bkling
 
Pneumothorax and role of Physiotherapy in it.
Pneumothorax and role of Physiotherapy in it.Pneumothorax and role of Physiotherapy in it.
Pneumothorax and role of Physiotherapy in it.
Vishal kr Thakur
 
GIT BS.pptx about human body their structure and
GIT BS.pptx about human body their structure andGIT BS.pptx about human body their structure and
GIT BS.pptx about human body their structure and
MuzafarBohio
 
PET CT beginners Guide covers some of the underrepresented topics in PET CT
PET CT  beginners Guide  covers some of the underrepresented topics  in PET CTPET CT  beginners Guide  covers some of the underrepresented topics  in PET CT
PET CT beginners Guide covers some of the underrepresented topics in PET CT
MiadAlsulami
 

Recently uploaded (20)

chatgptfornlp-230314021506-2f03f614.pdf. 21506-2f03f614.pdf
chatgptfornlp-230314021506-2f03f614.pdf. 21506-2f03f614.pdfchatgptfornlp-230314021506-2f03f614.pdf. 21506-2f03f614.pdf
chatgptfornlp-230314021506-2f03f614.pdf. 21506-2f03f614.pdf
 
Time line.ppQAWSDRFTGYUIOPÑLKIUYTREWASDFTGY
Time line.ppQAWSDRFTGYUIOPÑLKIUYTREWASDFTGYTime line.ppQAWSDRFTGYUIOPÑLKIUYTREWASDFTGY
Time line.ppQAWSDRFTGYUIOPÑLKIUYTREWASDFTGY
 
TEST BANK FOR Health Assessment in Nursing 7th Edition by Weber Chapters 1 - ...
TEST BANK FOR Health Assessment in Nursing 7th Edition by Weber Chapters 1 - ...TEST BANK FOR Health Assessment in Nursing 7th Edition by Weber Chapters 1 - ...
TEST BANK FOR Health Assessment in Nursing 7th Edition by Weber Chapters 1 - ...
 
Top Rated Massage Center In Ajman Chandrima Spa
Top Rated Massage Center In Ajman Chandrima SpaTop Rated Massage Center In Ajman Chandrima Spa
Top Rated Massage Center In Ajman Chandrima Spa
 
DECODING THE RISKS - ALCOHOL, TOBACCO & DRUGS.pdf
DECODING THE RISKS - ALCOHOL, TOBACCO & DRUGS.pdfDECODING THE RISKS - ALCOHOL, TOBACCO & DRUGS.pdf
DECODING THE RISKS - ALCOHOL, TOBACCO & DRUGS.pdf
 
PrudentRx: A Resource for Patient Education and Engagement
PrudentRx: A Resource for Patient Education and EngagementPrudentRx: A Resource for Patient Education and Engagement
PrudentRx: A Resource for Patient Education and Engagement
 
Michigan HealthTech Market Map 2024 with Policy Makers, Academic Innovation C...
Michigan HealthTech Market Map 2024 with Policy Makers, Academic Innovation C...Michigan HealthTech Market Map 2024 with Policy Makers, Academic Innovation C...
Michigan HealthTech Market Map 2024 with Policy Makers, Academic Innovation C...
 
Bringing AI into a Mid-Sized Company: A structured Approach
Bringing AI into a Mid-Sized Company: A structured ApproachBringing AI into a Mid-Sized Company: A structured Approach
Bringing AI into a Mid-Sized Company: A structured Approach
 
CANSA support - Caring for Cancer Patients' Caregivers
CANSA support - Caring for Cancer Patients' CaregiversCANSA support - Caring for Cancer Patients' Caregivers
CANSA support - Caring for Cancer Patients' Caregivers
 
Unlocking the Secrets to Safe Patient Handling.pdf
Unlocking the Secrets to Safe Patient Handling.pdfUnlocking the Secrets to Safe Patient Handling.pdf
Unlocking the Secrets to Safe Patient Handling.pdf
 
FACIAL NERVE
FACIAL NERVEFACIAL NERVE
FACIAL NERVE
 
Luxurious Spa In Ajman Chandrima Massage Center
Luxurious Spa In Ajman Chandrima Massage CenterLuxurious Spa In Ajman Chandrima Massage Center
Luxurious Spa In Ajman Chandrima Massage Center
 
Bath patient Fundamental of Nursing.pptx
Bath patient Fundamental of Nursing.pptxBath patient Fundamental of Nursing.pptx
Bath patient Fundamental of Nursing.pptx
 
NKTI Annual Report - Annual Report FY 2022
NKTI Annual Report - Annual Report FY 2022NKTI Annual Report - Annual Report FY 2022
NKTI Annual Report - Annual Report FY 2022
 
Can coffee help me lose weight? Yes, 25,422 users in the USA use it for that ...
Can coffee help me lose weight? Yes, 25,422 users in the USA use it for that ...Can coffee help me lose weight? Yes, 25,422 users in the USA use it for that ...
Can coffee help me lose weight? Yes, 25,422 users in the USA use it for that ...
 
DR SHAMIN EABENSON - JOURNAL CLUB - NEEDLE STICK INJURY
DR SHAMIN EABENSON - JOURNAL CLUB - NEEDLE STICK INJURYDR SHAMIN EABENSON - JOURNAL CLUB - NEEDLE STICK INJURY
DR SHAMIN EABENSON - JOURNAL CLUB - NEEDLE STICK INJURY
 
MBC Support Group for Black Women – Insights in Genetic Testing.pdf
MBC Support Group for Black Women – Insights in Genetic Testing.pdfMBC Support Group for Black Women – Insights in Genetic Testing.pdf
MBC Support Group for Black Women – Insights in Genetic Testing.pdf
 
Pneumothorax and role of Physiotherapy in it.
Pneumothorax and role of Physiotherapy in it.Pneumothorax and role of Physiotherapy in it.
Pneumothorax and role of Physiotherapy in it.
 
GIT BS.pptx about human body their structure and
GIT BS.pptx about human body their structure andGIT BS.pptx about human body their structure and
GIT BS.pptx about human body their structure and
 
PET CT beginners Guide covers some of the underrepresented topics in PET CT
PET CT  beginners Guide  covers some of the underrepresented topics  in PET CTPET CT  beginners Guide  covers some of the underrepresented topics  in PET CT
PET CT beginners Guide covers some of the underrepresented topics in PET CT
 

Quality assurance in Kampong Cham Physical Rehabilitation Centre

5.3  Learning from Quality Assurance in Cambodia ............................................................... 7
5.4  QA at the Physical Rehabilitation Centre in Kampong Cham ......................................... 8
6  Scope and limitations of the report ............................................................................. 8
7  Executive Summary of findings and recommendations ............................................. 9
8  List of recommendations .......................................................................................... 11
9  Methods and activities .............................................................................................. 12
9.1  Audit of existing quality assurance system .................................................................... 12
9.2  An examination of current QA indicators ....................................................................... 12
9.3  Benchmarks – critique and re-definition according to relevant standards ..................... 12
9.4  Operationalising indicators to compare against key benchmarks .................................. 12
9.5  Refinement and improvement of QAS processes .......................................................... 13
9.6  Development of a user-friendly composite tool .............................................................. 13
9.7  An analysis of 2010/2011 findings ................................................................................. 13
10  Results ..................................................................................................................... 14
10.1  Audit of existing quality assurance system .................................................................... 14
10.1.1  “Coffee bean analysis” – Audit of current indicators .................................................. 25
10.1.2  A timeline of QAS development ................................................................................. 28
10.2  An examination of current QA indicators ....................................................................... 30
10.2.1  Validity explained ....................................................................................................... 30
10.2.2  Results of analysis of process and validity ................................................................ 30
10.2.3  Summary of findings on process and indicator validity .............................................. 37
10.3  Benchmarks – critique and re-definition according to relevant standards ..................... 37
10.4  Operationalising indicators to compare against key benchmarks ................................. 38
10.5  Refinement and improvement of QAS processes .......................................................... 38
10.5.1  Planning and definitions of indicators ........................................................................ 40
10.5.2  Data collection ............................................................................................................ 40
  Generating original data ............................................................................................ 40
  Entering primary data ................................................................................................ 41
  Centralisation – entering into database ..................................................................... 41
  Compilation, aggregation, disaggregation ................................................................. 41
10.5.3  Monitoring, analysis and reporting ............................................................................. 41
10.6  Development of a user-friendly composite tool .............................................................. 42
10.7  An analysis of 2010/2011 findings ................................................................................. 43
10.7.1  ‘Workshop’ results ...................................................................................................... 43
  Indicators 1 & 2 – Adjustment during alignment and fitting ....................................... 43
  Indicator 4.3 – Device durability ................................................................................ 44
  Indicators 6 & 7 – P&O and Benchworker production statistics ................................ 45
10.7.2  PT findings ................................................................................................................. 46
  Indicator 3 – Treatment planning ............................................................................... 46
  Indicator 4 – Missed appointments at the PRC ......................................................... 47
  Indicator 5 – Daily treatments per PT ........................................................................ 48
10.8  Additional Analysis: Comparing the QAS against sustainability indicators ................... 49
11  Analysis .................................................................................................................... 52
11.1  Audit of existing quality assurance system .................................................................... 52
11.1.1  Why aren’t indicators being collected? ...................................................................... 52
11.2  An examination of current QA indicators ....................................................................... 53
11.2.1  Why have these indicators been chosen? ................................................................. 53
11.2.2  A way forward ............................................................................................................ 54
11.3  Benchmarks – critique and re-definition according to relevant standards ..................... 55
11.3.1  Understanding benchmarks – why these ones haven’t worked ................................. 55
11.3.2  Operationalising indicators to compare against key benchmarks .............................. 55
11.3.3  Where we are now: data collection and flow ............................................................. 55
11.3.4  A way forward for complex data management requirements in a PRC ..................... 56
11.4  Refinement and improvement of QAS processes .......................................................... 56
11.4.1  A proposed process for practical, simple and manageable QAS processes ............. 56
11.5  Development of a user-friendly composite tool .............................................................. 56
11.5.1  Introducing a Rehabilitation Management System – a new investment in managing for quality rehabilitation services ............................................................................................. 56
11.6  An analysis of 2010/2011 findings ................................................................................. 57
11.6.1  A general look at quality at the PRC .......................................................................... 57
11.6.2  Learning from the experience: The challenges of the current QAS reporting processes ................................................................................................................................ 57
11.6.3  What can we say about the service based on the data we have? ............................. 57
  P&O services ............................................................................................................. 57
  The PT service .......................................................................................................... 58
11.7  Additional analysis of sustainability indicators and the current QAS ............................. 58
11.8  General recommendations ............................................................................................ 58
12  Concluding remarks ................................................................................................. 60
2 Acronyms and abbreviations

QAS – Quality Assurance System
QA – Quality Assurance
RMS – Rehabilitation Management System (HI internal procedures)
MoSVY – Ministry of Social Affairs, Veterans and Youth Rehabilitation (Royal Cambodian Government)
HI – Handicap International
PwD – Person/s with disability
PRC – Physical Rehabilitation Centre
PT – Physical Therapist/Physiotherapist
P&O – Prosthetist/Orthotist (the person) or Prosthetics and Orthotics (the discipline)
3 Reading and using this report

This report is structured around the terms of reference. Each term of reference is addressed in turn in section 10, starting on page 14. A subsequent section analyses the findings around emergent themes. Presenting the results in this way allows a quick orientation to the results of key questions, while developing a richer analysis of those findings in a separate section and exploring other areas that emerged during the evaluation. A list of recommendations is presented in section 8 on page 11.

4 Terms of reference

This section is a direct excerpt from the TOR document.

The objective of the assignment is to establish an operational quality assurance system utilising existing tools and indicators, identifying and applying benchmarks and implementing a system of data collection, storage and reporting. Expected outputs are as follows:

 An audit of the existing quality assurance system from 2010 and 2011 is completed, identifying indicators that are routinely collected and those that are not.
 The indicators are appraised for their relevance against the goals of the quality assurance system and refined as appropriate, i.e. reconfirming the significance of each indicator in terms of quality assurance, collection methodologies, data storage and reporting, to define a master list of key indicators to proceed with.
 Benchmarks for the indicators are identified from national and international standards, local laws and customs, MoSVY/PoSVY and PRC internal practice and policies.
 Indicators and benchmarks are made operational to enable identification of risk and safety concerns as well as identification of acceptable targets.
 Data collection, storage and reporting systems are refined and/or developed, clarifying information source, frequency of collection and responsible person.
 A user-friendly composite tool is developed for data management.
 2010 indicators and first-semester 2011 indicators are reviewed and a report highlighting the main findings in terms of performance and quality is produced.
5 Background¹

5.1 HI and Physical Rehabilitation in Cambodia and Globally

Handicap International has been working in Cambodia since its inception in 1982, and physical rehabilitation has always been a substantial component of its activities there. Globally, HI has supported physical rehabilitation in some 65+ countries; in 2009, HI supported services that delivered physical rehabilitation to nearly 100,000 people.

Core operational methodologies of Handicap International in physical rehabilitation include a specific emphasis on supporting local, pre-existing services; using local human and material resources; an emphasis on capacity building; and treating rehabilitation as only one part of a comprehensive, systemic approach to addressing and upholding the rights of persons with disabilities. Because of the scope of HI’s activities and focus, and the inter-connectedness of its domains of action, a single core operational methodology for physical rehabilitation has not been defined.

5.2 HI and Quality Management of Global Physical Rehabilitation

Since around 2010, the rehabilitation unit of HI’s technical resources division has emphasised ensuring access to quality rehabilitation services. This conceptual approach recognises the importance of equitable access to mainstream services, the need for specialised services, and that external agencies like HI need not necessarily implement those services directly, but might instead seek to ensure they exist and are effective. Achieving this is attempted through systematic approaches to measuring and improving the overall sectoral response in physical rehabilitation (in concert with broader disability actions), and through understanding and improving the quality of physical rehabilitation services. Quality improvement is addressed through methodologies such as updating and re-emphasising management-related policy, emphasising a user-focused approach, and understanding clinical governance in global rehabilitation services. While this approach is an evolution of decades of action rather than a revolutionary change, we are still in the early phases of these more systematic, repeatable and scalable approaches to our work.

5.3 Learning from Quality Assurance in Cambodia

The efforts of HI Cambodia and its partner organisations in implementing a comprehensive and systematic QAS pre-date this work at HQ. Consequently, HQ has much to learn from the process. It also creates many opportunities for the organisation to invest in further development of the system that is in place, as an exercise in learning from previous practice. It is in that spirit that this support visit was undertaken.

¹ For a more comprehensive background to rehabilitation services in Cambodia, the reader is directed to HI-Cambodia documentation and the original TOR for this report.
5.4 QA at the Physical Rehabilitation Centre in Kampong Cham

The team in Cambodia, and particularly at the Kampong Cham PRC, have been working to establish a Quality Assurance System to ensure a quality, well-managed service is sustained after handover to local authorities. As this report identifies, the approach has evolved since its inception, taking into account the many changes in management, reporting requirements and so on.

6 Scope and limitations of the report

The challenge presented in the original TORs was immense. A comprehensive analysis, starting from raw data, of some 42 QAS indicators, plus adjustments to the QAS, taking into account contemporary changes in program HR, governance and project cycles, would require much longer time-frames. Simply reviewing the benchmarks and validity of the 42 indicators against a range of literature is itself a huge task. Consequently, it was agreed to target TORs 1-4 and 7, de-emphasising amendments to the system. However, very early in the process it became clear that TORs 5 and 6 were probably the more important outputs, since the QAS has evolved very quickly since its inception and only a core set of indicators is currently in use. Consequently, this report examines the existing QAS and its current effectiveness, and outlines a course of action to improve the system so that it is genuinely useful, efficient and realistic.
7 Executive Summary of findings and recommendations

“Not everything that can be counted counts, and not everything that counts can be counted.”

The QAS is a strong foundation and has evolved, but is not currently used effectively. The current QAS is not used routinely. Only a small percentage of the indicators have been collected at all, and fewer still have been collected routinely. The data necessary for MoSVY and project reporting requirements, and for strong operational processes, have been collected elsewhere, but not in the QAS per se. During 2011, a simple decision was taken to refine the overall QAS to focus on only a small number of indicators. The current QAS, then, is more efficient and manageable than the original version, but may not meet its key objectives. Overall, while there is a positive shift towards quality assurance processes, they are disconnected from ordinary operational activities. This dichotomy has created much additional work for an already busy team.

The indicators of the QAS are comprehensive and reasonably divided between different domains. However, they are complex to measure and not always related to realistic or meaningful objectives, and work is needed to collect useful data more effectively. Many of the core indicators developed for the QAS are not well defined and operationalised, and fewer still have clear documentation or instrumentation for gathering and using the data. Clinical indicators were the focus of this analysis. In those domains, there are some strong key indicators that are giving value to the clinical team, are reliable to collect and are used to make decisions. These can be built upon.

Benchmarks are not well linked to reasonable foundations and need to be amended in concert with revisions of the key indicators. Given the findings that the indicators are complex and often not appropriate, it is difficult and somewhat redundant to examine the benchmarks in detail. In short, the benchmarks are not well linked with strong foundations, and many would be better replaced with a simple binary yes/no indicator and strengthened centre policies.

The indicators are not well operationalised, and there are few clear places for entry of primary data or systematic approaches to aggregating and disaggregating them. A small percentage of the overall system has been analysed monthly, but this analysis is limited by the indicators used and their relationship to the real objectives of the QAS. Strengthening the collection processes of the current QAS is not considered the most appropriate course of action at this time.
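The suggestion that many numeric benchmarks would serve better as simple binary yes/no indicators backed by centre policy can be illustrated with a minimal sketch. The policy checks, record fields and the 14-day threshold below are hypothetical examples for illustration, not actual PRC policies or indicators:

```python
# Hypothetical policy checks: each returns True (compliant) or False.
def check_treatment_plan_on_file(record: dict) -> bool:
    """Yes/no: is a treatment plan recorded for this client?"""
    return bool(record.get("treatment_plan"))

def check_device_delivered_within_days(record: dict, limit_days: int = 14) -> bool:
    """Yes/no: was the device delivered within the policy limit?"""
    days = record.get("delivery_days")
    return days is not None and days <= limit_days

# Illustrative client record (not actual PRC data).
record = {"treatment_plan": "on file", "delivery_days": 10}

results = {
    "treatment plan on file": check_treatment_plan_on_file(record),
    "device delivered within 14 days": check_device_delivered_within_days(record),
}
for name, ok in results.items():
    print(f"{name}: {'yes' if ok else 'no'}")
```

A monthly QAS report would then state the proportion of records passing each check, rather than comparing raw figures against loosely grounded numeric benchmarks.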
The QAS has provided a framework for a focus on quality assurance, but has not been matched with management training or a carefully staged implementation of a new and complex system. Processes for planning, training and revision of the QAS have not been described or implemented. There are no systematic places for data compilation or usage, and data are not routinely aggregated and disaggregated. Staff consider data collection very complex and not proportionately helpful. Consequently, there are challenging process issues that need to be rectified. Recommendations and practical approaches to building on the current experiences with alternative approaches, towards an efficient, useful, user-friendly tool, are presented.

The QAS offers some insights into current practices, but needs much work to realise its potential. Available 2010-2011 data are analysed and some modest findings on service delivery can be taken from those data. A summary of recommendations corresponding to the overall terms of reference and emergent themes is presented in the List of recommendations on page 11.

Overall, a strong technical commitment and further investment from HI and partners is warranted. The project, the program and the PRC, as well as the sector and HI itself, stand to learn much from the implementation of a strong, clear, usable QAS. This attempt has been extremely innovative, ahead of its time, and evidences the capacity and commitment of the stakeholders involved. With additional time, support and a revision of the QAS in concert with overall management and context changes, drawing on new experiences, a system that meets its original objectives of helping ensure sustained, quality services in Kompong Cham is obtainable.
  • 11. - 11 - 8 List of recommendations R 1 Overall, it is recommended to re-commit to a simpler, more efficient, useful QAS, drawing on these experiences, building on the foundation in place, and learning from emerging examples of good practice...............................................................................53  R 2 Review which of the indicators reflect simple policy decisions and amend the QAS and centre management documentation accordingly ......................................................53  R 3 While continuing with the current QAS processes that are routinely implemented (PT and P&O key data), review the overall system – in particular, re-defining key indicators and their operationalization..............................................................................................53  R 4 Invest in continued development of the overall management approach, ensuring a new and stronger QAS is seen as the principle management instrument, rather than a separate ‘project’..............................................................................................................53  R 5 Incorporate Kompong Cham in the field-testing of HI's 'Rehabilitation Management System' ............................................................................................................................54  R 6 Invest in ongoing, systematic revision of data flow, including client cards, aggregation of data etc. 
This should be iterative, sensitive to the negative impacts of rapid change, and should build on the initial overall assessments through the RMS (page 55).
R 7 Plan and implement basic training in statistics, data types and their usage, and in quality assurance in general (page 56).
R 8 Careful revision of the indicators is needed overall, but one area of immediate focus might be to examine missed appointments, ensuring a percentage is reflected in reporting data (page 58).
R 9 Explore options for a sub-project with a specific project officer focusing on implementing a QAS, including overseeing and supporting the necessary training (page 58).
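The arithmetic behind R 8 is simple: expressing missed appointments as a percentage makes the figure comparable across months with different caseloads. A minimal sketch, using invented figures for illustration only:

```python
# Hypothetical monthly figures taken from the daily appointment schedule.
scheduled = 120  # appointments booked during the month
missed = 18      # appointments not attended

# A raw count of 18 says little on its own; as a percentage of the
# month's caseload it can be compared against a benchmark.
missed_rate = 100 * missed / scheduled
print(f"Missed appointments: {missed_rate:.1f}% of {scheduled} scheduled")
```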
- 12 - 9 Methods and activities
This section briefly outlines the methods and activities used to address the key TORs and to explore emerging themes and findings.
9.1 Audit of existing quality assurance system
To examine the usage of the quality assurance system, a simple data-collation exercise was undertaken. Using the provided QAS files, a simple month-by-month matrix of which indicators had been collected was created. These data are reported as a 'coffee bean analysis': a graphical representation of which data had been collected, used, analysed and acted upon. Because it became immediately clear that very substantive changes to the original QAS had occurred, a simple timeline of changes to the system and to the overall project and program contexts was developed. This involved a very simple retrospective look at key changes to the project, using discussions with key personnel, project reporting, logical frameworks and work-plans.
9.2 An examination of current QA indicators
Building on the development of a simpler matrix of QA indicators and parameters from the pre-existing narrative, this analysis sought to examine the validity of the indicators. Measuring validity, perhaps ironically, is very hard. But looking at face and construct validity, that is, whether the indicators appear to measure the relevant construct, is simple enough to do within the scope of the present analysis, and gives a good orientation to the overall utility of the QA system. In a systematic manner, the validity of each indicator was analysed and explained.
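The data-collation step described in section 9.1 (building a month-by-month matrix of which indicators were collected) can be sketched in a few lines. The indicator names, months and statuses below are invented for illustration, not drawn from the actual QAS files:

```python
from collections import defaultdict

# Hypothetical audit records: which indicator was found in the QAS
# spreadsheets for which month, and how far it was processed.
records = [
    ("PT: missed appointments", "2010-10", "collected"),
    ("PT: missed appointments", "2010-11", "aggregated"),
    ("P&O: devices per month", "2010-10", "disaggregated"),
]
months = ["2010-10", "2010-11", "2010-12"]

# Build the month-by-month matrix; absent cells mean 'not collected'.
matrix = defaultdict(dict)
for indicator, month, status in records:
    matrix[indicator][month] = status

# Print a simple text rendering of the matrix.
for indicator in sorted(matrix):
    row = [matrix[indicator].get(m, "not collected") for m in months]
    print(indicator.ljust(28), " | ".join(row))
```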
9.3 Benchmarks: critique and re-definition according to relevant standards
The findings on the overall usage and validity of the current QAS, and the examination of the changing context of the PRC, suggested that the original QAS concept was not being used, that it lacked valid indicators to measure the intended objectives, and that the management and human resource context had changed. It was therefore redundant to develop benchmarks based on the current QAS. Instead, in subsequent recommendations, approaches to new and alternative benchmarks are proposed, drawing on HI's and other agencies' recent experiences in managing physical rehabilitation.
9.4 Operationalising indicators to compare against key benchmarks
As for the benchmarks, operationalising the current indicators is, for now, premature without a comprehensive re-evaluation of the overall quality assurance approach.
- 13 - However, the MoSVY data were assessed in detail to understand their implications for future developments and to ensure they were appropriate in the short term.
9.5 Refinement and improvement of QAS processes
To understand current QAS processes, and particularly strong and weak areas of practice, bottlenecks, gaps and repetitions in data collection and so on, a timeline of the quality cycle was developed, and the relevant domains of human resources were mapped along it. Using this framework, strong points, weak points, comments and other remarks were documented systematically to understand the overall process, with a view to proposing changes. Importantly, this analysis focused only on clinical personnel. As they had not been involved in either the definition of the quality indicators or the subsequent processing of data, their role was limited to one section of the overall quality cycle. That, in itself, was taken as an important finding (section 10.5, page 38), but it also meant the analysis could only explore data collection and entry. The timeline/personnel template is presented as a potential tool for ongoing analysis of the overall quality management cycle by the project and program teams.
9.6 Development of a user-friendly composite tool
Given the findings of earlier sections, this TOR was not completed. Rather, the report makes a series of recommendations for redeveloping the hybrid QAS that has evolved with MoSVY handover processes and the adoption of the Patient Management System.
9.7 An analysis of 2010/2011 findings
Building on the audit in the first TOR, available data were analysed and findings relevant to the ongoing activities of the PRC are identified.
- 14 - 10 Results
10.1 Audit of existing quality assurance system
An audit of the existing quality assurance system from 2010 and 2011 was completed, identifying indicators that are routinely collected and those that are not. An overview of the current indicators is presented in the subsequent pages. This presentation has been selected to complement the current processes, which have been piecemeal and therefore complex to use. A more straightforward presentation, with clear definition and operationalisation of the indicators, should aid future development of the system. The table outlines, according to the previously identified 'work units' (i.e., the different management sections), the key indicators, their current benchmarks, persons responsible and so on.
- 15 - INSERT: MATRIX OF CURRENT QA INDICATORS
Insert: A comprehensive matrix of the 2009 QAS guidelines
To facilitate simpler and systematic analysis of the overall QAS approach, this matrix was extracted from the QAS guideline documentation, developed in 2009 and amended until now. Subsequent analyses in this report refer to this matrix and its indicator numbers. For each indicator, the benchmark criteria are given as acceptable / needs to improve / not acceptable, followed by information sources, data collection and analysis frequency, and responsibility for collection and analysis.
Project
1. Quality of life of PRC clients from the 1st to 2nd assessment
   Objective: improve quality of life of the clients. Criteria: not defined.
   Source: quality of life assessment. Collected daily; analysed annually.
   Responsibility: Social Worker (collection); Project Manager (analysis).
2. Level of client satisfaction
   Objective: to measure client satisfaction with the device at delivery. Criteria: 85% / 75-85% / <75%.
   Source: satisfaction survey with questionnaire. Collected six-monthly; analysed annually.
   Responsibility: Head of Workshop (collection); Head of Workshop and PM (analysis).
3. % of pathologies treated at Kompong Cham PRC and % of pathologies treated at the 11 PRCs
   Objective: to compare the representativeness of the PRC to the 11 PRCs. Criteria: >7 / 5-7 / <5.
   Source: PRC and national statistics. Collected annually; analysed annually.
   Responsibility: PM, DAC/MoSVY and other PRCs (collection); PM (analysis).
- 16 - Administration
1. Office supply to the PRC
   Objective: to compare the number/amount of office supplies and equipment in the PRC. Criteria: not defined.
   Source: accounting book, livre de bord and cash box. Collected monthly; analysis not specified.
   Responsibility: Cashier and Head of Support (collection); analysis not specified.
2. Staff leave
   Objective: to compare the leave recorded by administration and the leave recorded by each unit. Criteria: 100% following the policy / 1% variation / >1% variation.
   Source: admin and each unit's record of staff leave. Collected quarterly; analysis not specified.
   Responsibility: head of support unit (collection); Project Manager with head of each unit (analysis).
3. Communication of PRC by phone
   Objective: to strengthen and accelerate cost-effectiveness of communication by telephone. Criteria: not defined.
   Source: errors in file. Collected monthly; analysis not specified.
   Responsibility: head of support unit (collection); PM with heads of unit (analysis).
- 17 - Administration (continued)
4. Communication of PRC by mail
   Objective: to strengthen and accelerate cost-effectiveness of mailing. Criteria: >95% responded and filed / 90-95% / <90%.
   Source: errors in file. Collected quarterly; analysis not specified.
   Responsibility: head of support unit (collection); PM (analysis).
5. Staff training (workshops, congresses and other refreshers)
   Objective: to manage and strengthen staff training and capacity-building records. Criteria: 100% recorded / 1% variation / >1% variation.
   Source: admin and records. Collected quarterly; analysis not specified.
   Responsibility: head of support unit (collection); PM (analysis).
6. Level of respect of working time
   Objective: to measure level of staff commitment. Criteria: >7.5 hours / 7-7.5 hours / <7.5 hours.
   Source: admin file, staff movement. Collected daily; analysed quarterly.
   Responsibility: Guard (collection); Head of Support Services (analysis).
7. Daily staff presence at work / absenteeism
   Objective: staff respect working time. Criteria: 0 staff absences without notice / 5% / >5%.
   Source: staff leave record with approval by line supervisor and line manager. Collected six-monthly; analysis not specified.
   Responsibility: Guard and section heads (collection); Head of Support Services (analysis).
- 18 - Accounting
1. Amount of money in accounting book versus cash box
   Objective: to compare the difference. Criteria: 0% variation / erasures / any difference.
   Source: accounting book, livre de bord. Collected weekly or when needed; analysis not specified.
   Responsibility: Cashier and head of support unit (collection); analysis not specified.
2. % of money forecast versus expenditure
   Objective: to compare money forecast and expenditures. Criteria: -5% to 5% / 6-10% / >10%.
   Source: monthly treasury, Excel journal of accounting records. Collected monthly; analysis not specified.
   Responsibility: PM, head of support unit and cashier, with support from the accountant (collection).
3. Amount of money (USD) in cash box
   Objective: to measure the minimum and maximum balance of money in the cash box. Criteria: 500-1,000 / 300-500 or 1,000-3,000 / <300 or >3,000.
   Source: cash box through accounting. Collected monthly; analysis not specified.
   Responsibility: Cashier and head of support unit (collection).
3.1. Amount of money (Riel) in cash box
   Objective: as for 3. Criteria: 1m-4m / not defined / not defined.
   Source: cash box through accounting. Collected monthly; analysis not specified.
   Responsibility: Cashier and head of support unit (collection).
4. Amount of money in bank
   Objective: to measure the minimum and maximum balance of money in the bank. Criteria: 5-7k / 3-5k or 7-10k / <3k or >10k.
   Source: bank records and record of cheques. Collected monthly; analysis not specified.
   Responsibility: head of support unit (collection).
- 19 - Accounting (continued)
5. Date of salary payment to staff
   Objective: to make sure that salary payment is on time. Criteria: 25th-30th of the month / variation / not defined.
   Source: pay slips and bank transfer records. Collected monthly; analysis not specified.
   Responsibility: head of support unit and HR deputy manager (collection).
6. Justification of each expenditure
   Objective: ensure clear justification of expenditures. Criteria: 100% / variation / not defined.
   Source: records of invoices. Collected monthly; analysis not specified.
   Responsibility: Cashier and head of support unit (collection).
7. Number of cheques from the bank
   Objective: to make sure that money withdrawal is done properly and regularly. Criteria: 4 per month / 3 or 5 / <3 or >5.
   Source: cheque records. Collected monthly; analysis not specified.
   Responsibility: head of support unit (collection).
- 20 - Store Management
1. Quantity of items on stock cards versus physical stock
   Objective: to make sure that all items on the stock cards match the physical stock. Criteria: 100% / 95% / <95%.
   Source: record of stock and cards. Collection: various; analysis not specified.
   Responsibility: storekeeper and head of social support (collection).
2. Critical stock of items
   Objective: to make sure all items, especially imported items and consumables, are always available for the workshop. Criteria: no case of stockout, or 100% of imported items respect critical stock levels / 5% of items not respecting critical levels / more than 3 cases of stockout, or more than 5% of items below critical levels.
   Source: stock control report. Collected quarterly; analysis not specified.
   Responsibility: head of support unit and storekeeper (collection).
- 21 - Workshop Unit
1. Adjustments of alignment during gait training
   Objective: to ensure the desired level of smooth gait and stability. Criteria: <3 / 4-5 / >5.
   Source: daily activities and checklist in client file. Collected daily; analysed quarterly.
   Responsibility: PO and PT (collection); head of workshop unit and PM (analysis).
2. Adjustments of the socket/orthosis
   Objective: to ensure proper fit of the socket and stability of the prosthesis for the client, with no pain or pressure areas on the skin. Criteria: <2 / 3-4 / >4.
   Source: daily activities and checklist in client file. Collected daily; analysed quarterly.
   Responsibility: Head of Workshop Unit (collection); head of workshop unit and PM (analysis).
3. Number of mistakes during the manufacturing process
   Objective: to measure the technical competency of the P&O. Criteria: 2 / 3 / >3.
   Source: daily activities, progress notes for P&O and checklist made by head of section. Collected daily; analysed quarterly.
   Responsibility: head of workshop unit and head of section (collection); head of workshop unit and PM (analysis).
4.1. Prosthesis and orthosis lifespan/durability, existing clients (months)
   Objective: to ensure the quality of the devices. Criteria: >5 / 3-4 / <3.
   Source: device records, PMS and client file. Collected monthly; analysed quarterly.
   Responsibility: head of workshop unit (collection); head of workshop unit and PM (analysis).
4.2. Prosthesis and orthosis lifespan/durability, new clients (months)
   Objective: to ensure the quality of the devices. Criteria: >9 / 7-8 / <7.
   Source, collection, analysis and responsibility: as for 4.1.
4.3. Shoe raise and SFAB lifespan (months)
   Objective: to ensure the quality of the devices. Criteria: >5 / 3-5 / <3.
   Source, collection, analysis and responsibility: as for 4.1.
4.4. Wheelchair, tricycle, standing frame and seat lifespan
   Objective: to ensure the quality of the devices. Criteria: >85% / 75-85% / <74%.
   Source, collection, analysis and responsibility: as for 4.1.
- 22 - Workshop Unit (continued)
4.5. Trolley lifespan (months)
   Objective: to ensure the quality of the devices. Criteria: 9 / 6-8 / <6.
   Source: device records, PMS and client file. Collected monthly; analysed quarterly.
   Responsibility: head of workshop unit (collection); head of workshop unit and PM (analysis).
5. Level of client satisfaction with device
   Objective: to measure client satisfaction with the device at delivery. Criteria: >85% / 75-85% / <74%.
   Source: satisfaction survey with questionnaire. Collected six-monthly; analysed annually.
   Responsibility: head of workshop unit (collection); head of workshop unit and PM (analysis).
6. Number of devices delivered per month, per P&O
   Objective: efficiency of P&O work. Criteria: >33 / 30-33 / <30.
   Source: monthly progress data collection and P&O record book. Collection, analysis and responsibility: not specified.
7. Number of devices delivered per month, per BT
   Objective: to measure the efficiency of BT work. Criteria: >17 / 15-17 / <15.
   Source: monthly progress data collection and P&O record book. Collected monthly; analysed six-monthly.
   Responsibility: head of workshop unit (collection); head of workshop unit and PM (analysis).
- 23 - PT Unit
1. Progress in functional skills of clients
   Objective: to measure the level of technical competency of PT. Criteria: >10% / 5-10% / <5%.
   Source: treatment plan, progress notes, checklist from client files. Collected daily or weekly; analysed quarterly.
   Responsibility: head of PT unit (collection); head of PT unit and PM (analysis).
2. Progress in functional skills of children
   Objective: to measure the level of technical competency of PT. Criteria: >10% / 5-10% / <5%.
   Source: GMFCS assessment form. Collected monthly; analysed six-monthly.
   Responsibility: head of PT unit (collection); head of PT unit and PM (analysis).
3. Number of detailed treatment plans that include SMART goals
   Objective: to measure treatment planning skills. Criteria: >90% / 80-90% / <80%.
   Source: treatment plan, progress notes, checklist from client files. Collected monthly; analysed quarterly.
   Responsibility: head of PT unit (collection); head of PT unit and PM (analysis).
4. Number of missed appointments at the PRC
   Objective: to measure the level of client participation in PT treatment. Criteria: <50% / 50-60% / >60%.
   Source: database system, daily appointment schedule. Collected monthly; analysed quarterly.
   Responsibility: head of PT unit (collection); head of PT unit and PM (analysis).
5. Number of treatment sessions per day, per PT
   Objective: to measure the efficiency of PTs' work. Criteria: 12-14 / 10-11 or 15-16 / <10 or >16.
   Source: database system, daily appointment schedule. Collected daily; analysed six-monthly.
   Responsibility: head of PT unit (collection); head of PT unit and PM (analysis).
- 24 - Interdisciplinary approach
1. Number of check-outs by PT and PO
   Objective: to measure the efficacy of multidisciplinary work (at delivery, all cases should be checked, whether or not the client has a device). Criteria: 100% / 95-99% / <95%.
   Source: check-out list. Collected monthly; analysed six-monthly.
   Responsibility: PM and heads of sections (collection); PM (analysis).
2.1. Morning meeting
   Objective: to measure the degree of collaboration among staff. Criteria: >95% / 85-95% / <85%.
   Source: file checking system. Collection not specified; analysed quarterly.
   Responsibility: PT unit (collection); PM and head of PT (analysis).
2.2. Joint consultation and screening
   Objective: to measure the degree of collaboration among staff. Criteria: 100% / 95-99% / ≤94%.
   Source, collection, analysis and responsibility: as for 2.1.
2.3. Joint assessment and prescription / number of MD meetings
   Objective: to measure the degree of collaboration among staff. Criteria: 100% / 95-99% / <95%.
   Source, collection, analysis and responsibility: as for 2.1.
2.4. Complex cases meetings
   Objective and criteria: as for 2.3. Source, collection, analysis and responsibility: as for 2.1.
2.5. Daily client round
   Objective and criteria: as for 2.3. Source, collection, analysis and responsibility: as for 2.1.
- 25 - 10.1.1 "Coffee bean analysis": audit of current indicators
Building on the clearer matrix presented in the previous sub-section, this section explores the current usage of the system. Specifically, an audit of data since October 2010 was undertaken, based on the records supplied. This very simple analysis is not intended to explore the validity of the approach taken, or to interpret any results, but simply to audit whether or not the data have been systematically collected, aggregated and disaggregated. These terms are defined below.
Not collected: indicators that are pre-defined in the 2009 guideline document and reinforced in recent documents, but that were not collected in the specified monthly period. NB: data may have been collected elsewhere, but have not been collated in the QAS spreadsheets2.
Collected: data on the indicator, whether verified or not, are evident in the available QAS spreadsheets.
Aggregated: data are compiled and averaged within the relevant time period.
Disaggregated: data are examined for differences between sub-sections, making important comparisons possible.
These findings are presented in the subsequent table.
2 A key feature of good QAS systems is that they are not distinct from ordinary operational procedures; here, the QAS has remained separate from them. This issue and some potential solutions are explored in subsequent chapters.
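The difference between aggregation and disaggregation can be made concrete with a small sketch; the service areas and scores here are invented for the example:

```python
# Hypothetical satisfaction scores (out of 100) for one month, each
# tagged with the service area it concerns.
entries = [
    ("dorm", 80), ("dorm", 70),
    ("workshop", 90), ("workshop", 95),
    ("PT", 85),
]

# Aggregated: a single compiled figure for the period.
overall = sum(score for _, score in entries) / len(entries)

# Disaggregated: the same data examined by sub-section, so that
# differences between service areas become visible.
by_area = {}
for area, score in entries:
    by_area.setdefault(area, []).append(score)
averages = {area: sum(s) / len(s) for area, s in by_area.items()}

print(f"overall: {overall:.1f}")  # the single figure hides the spread
for area, avg in sorted(averages.items()):
    print(f"  {area}: {avg:.1f}")
```

The aggregated figure alone would suggest satisfactory performance, while disaggregation reveals the weaker sub-section.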
- 26 - Table 1 - 'Coffee bean' analysis of current collection, processing and analysis of QAS indicators.
For each indicator and each month from October 2010 to December 2011, the table marks the status of the data as one of: not collected, collected, aggregated or disaggregated. The indicators audited, by work unit, are:
Project: QoL change; level of client satisfaction; pathologies.
Admin: office supply to the PRC; staff leave; phone; mailing; staff training; level of respect of working time; presence.
Accounting: amount of money in accounting book versus cash box; % of money forecast versus expenditure; cash box (USD); cash box (Riel); amount of money in bank; date of salary payment to staff; expenses justification; number of cheques from bank.
- 27 - Store: stock audit; critical stock of items.
Workshop Unit: number of alignments; adjustments; errors; P&O lifespan (existing clients); P&O lifespan (new clients); shoe raise and SFAB lifespan; wheelchair etc. lifespan; trolley lifespan in months; satisfaction with P&O; P&O output; BT output.
PT Unit: functional skills of clients; functional skills of children; SMART goals; missed appointments; number of sessions per PT.
ID team: number of team check-outs; morning meeting; joint consultation and screening; joint assessment; complex cases meetings; daily client round.
[The graphical 'coffee bean' markers recording the monthly status of each indicator are not reproduced in this text version.]
- 28 - MoSVY: age; gender; pathology; new versus existing clients; production; dormitory usage; referrals to and from; follow-up; cost.
Other: waiting time.
The preceding table summarises how data have been collected and treated since late 2010. Overall, a few key indicators have been reliably collected. Only the MoSVY data have been systematically disaggregated. The original plan was for a phased rollout of all the indicators, including periodic review of the systems used, the relevance and complexity of indicators and so on. This simple analysis demonstrates that many of the more complex indicators have not been used at all, since no instruments to examine them have been developed. Instead, more immediately useful data with clear systems for collection, aggregation, disaggregation and review have been collected more successfully. Consequently, the implementation of the QAS plan according to the specifications outlined in 2009 has not been completed. Rather, what has been implemented is a system that has adjusted to the changing requirements of additional projects, the MoSVY handover, the availability of usable instruments and the workload constraints of staff.
10.1.2 A timeline of QAS development
Given the multiple pressures and constraints on the current project, the overall Kompong Cham PRC, its staff and the technical personnel, it may have been difficult to reconcile the varying reporting requirements. This may, in part, explain the limited usage of the previously developed QAS.
To better understand these varied requirements, a simple timeline is presented in Table 2 on the next page.
- 29 - Table 2 - A timeline of relevant reporting requirements, QAS milestones and external parameters (October 2009 to February 2012 and ongoing)
Management and admin: initial scoping exercise and QAS guideline/indicator development; initial review of indicators; admin and management indicators scheduled for mid-project implementation; change in project management; shift of emphasis to project-related admin procedures; introduction of MoSVY 'PMS' clinical forms; current review and re-planning; reflection on management-related QAS processes as clinical outcomes transition to the MoSVY system.
P&O: basic production statistics; plan in place for the gradual introduction of more complex indicators; changing emphasis towards MoSVY data; 'core set' of indicators used to ensure a fluid transition to MoSVY data; a change to a simpler set of indicators that can be used to guide practice more effectively.
PT: (no milestones recorded).
Finances: plan for implementation of key indicators; shift of emphasis to project-related admin procedures; re-examination of finance-related QAS tools separate from project management and HI procedures.
MoSVY handover: MoSVY database in place and used; full reporting to MoSVY implemented via the 'PMS'; understanding how MoSVY requirements can be built on for the internal QAS; exploring cross-centre issues with other stakeholders.
External factors: start of development of the HI-wide 'Rehabilitation Management System'; ongoing testing and development of the system; HI release of the beta 'RMS'; HI shifts in continuing professional development approaches.
This table plots a highly summarised version of key events in the implementation of the QAS, handover processes and concurrent external factors.
It shows that during 2011, handover processes and management changes introduced new and different reporting requirements that probably disrupted the evolution of the QAS. As of February 2012, clinical reporting is stronger than management and financial reporting, coinciding with changing management structures and handover milestones. Activities for 2012 and beyond are also summarised.
- 30 - 10.2 An examination of current QA indicators
The indicators are appraised for their relevance against the goals of the quality assurance system and refined as appropriate; i.e., reconfirming the significance of each indicator in terms of quality assurance, collection methodologies, data storage and reporting, in order to define a master list of key indicators to proceed with. Building on the previous analyses, specifically the construction of a clear matrix and the audit of collected data, this section analyses the relevance of the indicators. The analysis focuses on:
 Simple remarks on collection issues identified
 Issues identified with storage
 Comments on reporting
 Face validity
 Construct validity
 Meeting the objectives of the initial QAS design
10.2.1 Validity explained
Because this analysis appraises the existing indicators against a set of goals, it is useful to use common definitions of validity. These are not necessarily self-explanatory, so simple definitions are given here:
 Face validity asks whether, 'on its face', the indicator is a reasonable one for the construct (the thing, quality or quantity) being measured. Put simply: are we actually measuring what we are trying to measure?
 Construct validity extends the analysis a little further, and asks whether the operationalised indicators relate to what we know about the underlying phenomenon behind them. Put simply: if we see an improvement or deterioration in the indicator, will that be related to improvements or losses in what we are actually trying to measure?
10.2.2 Results of analysis of process and validity
Findings and remarks on collection processes, and simple observations on validity, are presented in the following table.
- 31 - Table 3 - Practical collection and storage issues, and remarks on validity of the current QAS.
This table systematically examines the collection, storage and reporting issues of the current QAS, with remarks on face and construct validity. As with the earlier development of a more practical, user-friendly QAS matrix, the table is organised by the relevant management sections.
Project
1. QoL change (criteria: not defined)
   Collection: the social worker took over, and then collection was stopped during management changes.
   Storage: the data that have been collected have not been routinely aggregated in monthly reporting.
   Reporting: the data have not been reported in the audit period.
   Face validity: overall, a quality of life indicator is much needed.
   Construct validity: the indicator and benchmark criteria are not well operationalised to match HI's emerging tool in this area. Much work is underway on this important tool. An earlier advisory suggested the TIGA instrument, and this should be followed up further.
2. Level of client satisfaction (criteria: 85% / 75-85% / <75%)
   Collection: appears to have been collected only once, in late 2010.
   Storage: there is no clear place for monthly data to be aggregated.
   Reporting: there has been no attempt to understand who is satisfied, who isn't and why. Careful disaggregation is needed. This should not be complex: each client would have a simple satisfaction entry, possibly reduced further into dorm, services etc., but with an overall aggregation.
   Face validity: a client satisfaction tool is an important part of any conceivable QAS approach.
   Construct validity: the benchmarks don't adequately reflect satisfaction. There is no explanation or obvious reason for the delineations between acceptable and not acceptable.
3. Pathologies (criteria: >7 / 5-7 / <5)
   Collection: MoSVY data are collected in these areas.
   Storage: in the MoSVY database.
   Reporting: currently, reporting on this indicator is complex, as there are many groups and variables in a matrix.
- 32 - Administration (indicators 1-7: office supply to the PRC; staff leave; phone; mailing; staff training; level of respect of working time; presence)
   Not examined in the audit. No data, aggregations or analysis appear in the QAS data provided. Some of these indicators are examined in ordinary programme management but do not appear in the QAS.
Accounting (indicators 1-7: accounting book versus cash box; % of money forecast versus expenditure; cash box in USD and in Riel; amount of money in bank; date of salary payment to staff; expenses justification; number of cheques from the bank)
   Not examined in the audit. No data, aggregations or analysis appear in the QAS data provided. Most of these indicators are more appropriately simple policies; that is, it should not be necessary to measure compliance with compulsory requirements as a percentage.
Store Management (indicators: stock audit; critical stock of items)
   Store management is in transition to MoSVY civil servants. Consequently, there appear to have been complexities in developing a systematic approach. This is a key target area for the immediate future.
- 33 - Workshop Unit
1. Number of alignments (criteria: <3 / 4-5 / >5)
   Collection: collecting these data appears to be very burdensome and complex.
   Storage: data have been stored by the head of the P&O unit.
   Reporting: reporting reduces the data to a simple histogram identifying the number of alignment changes made for each fitting. There is no attempt to distinguish between different team members or prosthesis types.
   Face validity: to measure smooth gait and gait stability, those constructs should be examined directly, not the number of alignment changes.
   Construct validity: the stated objective doesn't match the indicator. Stability and smoothness are probably positively related to the number of adjustments, rather than inversely as the indicator would suggest.
2. Adjustments (criteria: <2 / 3-4 / >4)
   Collection: as above.
   Storage: there are some gaps in the data storage, and complexities in entering and understanding the data.
   Reporting: the data have been reported on occasions. There is no attempt to understand the reasons for high numbers of adjustments.
   Face validity: proper fit of a device is measured by examining the fit, not by how many adjustments it took to reach it.
   Construct validity: the number of substantive socket adjustments might be a useful thing to measure, but it doesn't say very much about the quality of the final fit.
3. Errors (criteria: 2 / 3 / >3)
   Collection: because mistakes have been only vaguely defined, it is almost impossible to reliably collect these data.
   Storage: there is no clear place for monthly data to be aggregated.
   Reporting: an average number of mistakes has been reported on occasions, with little analysis.
   Face validity: the technical competency of the P&O is only very tenuously related to the number of mistakes during the manufacturing process. At best, it is one very small component of competence. Further, the errors defined are more properly the work of a bench worker.
   Construct validity: competence is a complex thing to measure. We have experience with a number of tools for rapid analysis. However, a longer-term mentoring approach is a far more appropriate way to measure competence. That is a harder and longer-term management process change, but it should be explored. This would have been in place under previous management processes, but appears to have deteriorated and has possibly been confused by constant change in management approaches.
Indicator 4.1 - P&O device lifespan, existing clients (acceptable >5; needs improvement 3-4; unacceptable <3)
- Collection: collected only for existing clients, not new clients. This is a complex statistic to understand even with reliable collection: because devices in service have mixed lifespans at the time of analysis, it is hard to work out retrospectively what went wrong.
- Reporting: an average device lifespan has been reported for most months since late 2010. It does not distinguish between prostheses and orthoses, or between devices of different complexity.
- Validity: the objective needs rethinking; durability and quality are not the same thing. The data analysis does not match the indicators, and the benchmarks need to be better defined. It is probably more sensible to self-benchmark, comparing the data to previous months, rather than to an arbitrarily determined number intended to fit all device types, when different devices have vastly different expected lifespans.
- Recommendation: a more useful approach might be an ongoing audit of the kinds of failure, with efforts to address recurrent issues seen in returned devices.

Indicator 4.2 - P&O device lifespan, new clients (acceptable >9; needs improvement 7-8; unacceptable <7)
- Collection: not collected for new clients. There is no clear place for monthly data to be aggregated.
- Reporting: not reported.
- Validity: it is unclear why new and existing clients would have devices with different lifespans, other than differences related to fit and functional change, and unclear how monitoring this statistic would help improve the service.

Indicator 4.3 - Shoe raise and SFAB lifespan (acceptable >5; needs improvement 3-5; unacceptable <3): not collected, not defined.

Indicator 4.4 - Wheelchair etc. lifespan (acceptable >85%; needs improvement 75-85%; unacceptable <74%): not collected, not defined.

Indicator 5.1 - Trolley lifespan in months (acceptable 9; needs improvement 6-8; unacceptable <6): not collected, not defined.

Indicator 5.2 - Satisfaction with P&O devices (acceptable >85%; needs improvement 75-85%; unacceptable <74%)
- Collection: there is no satisfaction form for devices, only one for the PRC overall; the statistic has not been collected, and there is no clear place for monthly data to be aggregated.
- Reporting: has not been reported.
- Validity: satisfaction with the device would be a useful measure to collect, provided a valid and reliable tool can be developed that is not too complicated.
Indicator 6 - P&O output (acceptable >33; needs improvement 30-33; unacceptable <30)
- Collection: does not count complex devices such as standing systems. It is unclear why there is a large difference between the P&Os and the benchworkers, and the benchmarks do not match these data. The process of collecting and entering these data appears more complex than necessary, as the information has largely been collected in the MoSVY database anyway.
- Reporting: no attempt is made to look at the range of staff output. The reason to do so would not be to critique low output, but to plan strategically who might have time for alternative activities, particularly those related to CPD and career development.
- Validity: the indicator says nothing about efficiency unless the units are better defined, that is, the net effort for a particular device. Month-to-month comparisons are meaningless if the complexity of devices changes. Efficiency is about much more than crude numbers of devices; that is not to say the index should not be measured, but assuming it relates to efficiency is problematic. Costs, functional gains, quality and the relevance of the prescription all relate to efficiency, and none of these is currently explored.

Indicator 7 - Benchworker output (acceptable >17; needs improvement 15-17; unacceptable <15): as above.

PT unit

Indicator 1 - Functional skills of clients (acceptable >10%; needs improvement 5-10%; unacceptable <5%): not currently collected and very difficult to implement. This is a useful construct to measure, but it does not necessarily say anything about the competence of the professionals.

Indicator 2 - Functional skills of children (acceptable >10%; needs improvement 5-10%; unacceptable <5%): see comments in file.

Indicator 3 - SMART goals (acceptable >90%; needs improvement 80-90%; unacceptable <80%)
- Collection: for each client it has been determined whether the goal is SMART, but this has not been compared against the agreed benchmark.
- Storage: currently stored in a separate client-by-client spreadsheet, with a simple yes/no indicator for SMART goals.
- Reporting: while the data have been collected and some effort made to average them, they have not been compared to benchmarks.
- Validity: if a decision is taken to use SMART goals in this context, just do it; there is hardly a need to measure compliance with a policy as a percentage. Whether or not goals are SMART does not really indicate whether the treatment plans are of a high standard, only that they use a particular documentation approach. A simple measure of planning skills might instead form part of an overall approach to examining the competency of PTs.

Indicator 4 - Missed appointments at the PRC (acceptable <50%; needs improvement 50-60%; unacceptable >60%)
- Collection: until now the figures were reversed, meaning a high number was considered positive, when in fact the number counts MISSED appointments.
- Storage: reported and stored in a separate spreadsheet, entered by administration.
- Reporting: reported on occasions as a simple percentage, without disaggregation or further analysis.
- Validity: probably a simple, useful and therefore valuable statistic, but it probably does not accurately measure the stated objective, and it covers the whole PRC, not only PT.

Indicator 5 - # PT sessions (acceptable 12-14; needs improvement 10-11 or 15-16; unacceptable <10 or >16)
- Storage: a complex approach is currently used for entering these data into the QAS.
- Reporting: consistently reported through both the QAS and MoSVY databases.
- Validity: as for P&O, the simple number of sessions per therapist per day is probably too crude to measure the stated objective, but it needs to be measured anyway. It does not include all treatments, such as client education, and it is not clear whether it can be disaggregated between therapists.
Interdisciplinary approach

Indicator 1 - # team check-outs (acceptable 100%; needs improvement 95-99%; unacceptable <95%)
- Storage: currently entered through a complex approach, without clear definition of responsibility.
- Reporting: not reported consistently, though aggregated data are available for most months.
- Validity: the index seems entirely unrelated to the stated objective; simply checking out devices together says nothing at all about the efficacy of doing so. The need for this statistic is unclear. Like several others, it reflects what is really a simple policy and practice decision: if the policy is that all clients with an assistive device should be checked out by both a PT and a P&O, then that should simply happen. The QAS should not measure to the percentage what happens, but identify breakdowns in that system and seek to address them.

Indicators 2.1-3 (morning meeting: acceptable >95%, needs improvement 85-95%, unacceptable <85%; joint consultation and screening: 100%, 95-99%, <95%; joint assessment: 100%, 95-99%, <95%; complex-case meetings: 100%, 95-99%, <95%; daily client round: 100%, 95-99%, <95%): only vaguely and not reliably recorded. These should be replaced with a policy decision and practice change.
10.2.3 Summary of findings on process and indicator validity

Overall, the introduction of a pilot QAS approach into the PRC has had many positive results: a focus on understanding quality, sustained thinking about indicators and definitions of quality service, and a strong, focused discourse on quality during handover to the MoSVY. However, the analysis here suggests that there have been many complexities in the introduction, and the result is a complex system with modest direct advantages. This section examines some of the key issues.

In general, a few indicators are clearly useful to the staff, are collected systematically and can be used to draw useful inferences about the quality of ongoing services. There are, however, several recurrent themes across many of the indicators, summarised as follows:

- QAS approaches have been considered relevant mostly to clinical areas, rather than to over-arching administrative, HR and logistics issues.
- Indicators are often a poor reflection of the parameter they are trying to measure.
- Many indicators are not well operationalised; that is, they do not measure what we are interested in.
- Many indicators simply seek to measure compliance with a policy, and should probably be replaced with a policy change and management to ensure compliance with it.
- There is no real mechanism to review data, reflect on issues, and plan and implement process change.
- The collection of data is seen as, and is, burdensome and not proportional to the value it offers.

These issues are further analysed in subsequent sections.
10.3 Benchmarks – critique and re-definition according to relevant standards

Benchmarks for the indicators are identified from national and international standards, local laws and customs, MoSVY/PoSVY and PRC internal practice and policies.

Given the results outlined in section 10.1, deepening the analysis to examine the benchmarks one by one is, for the most part, redundant. Since many of the indicators themselves lack validity, seeking benchmarks for them from the relevant literature is neither appropriate nor possible. What is needed instead is a set of changes to the quality assurance approach at the PRC, building on the strengths of the PMS system, the MoSVY reporting requirements, and the indicators that have been collected efficiently and simply so far.
10.4 Operationalising indicators to compare against key benchmarks

Indicators and benchmarks are made operational to enable identification of risk and safety concerns, as well as identification of acceptable targets.

As in section 10.3 above, a deep analysis of how indicators are operationalised, and how data are gathered using the current indicators, is probably far less relevant than a fresh look at the overall quality assurance approach. For this reason, this TOR item was approached differently, by proposing alternative strategies to build on current QA activities in a more efficient and effective manner.

10.5 Refinement and improvement of QAS processes

Data collection, storage and reporting systems refined and/or developed, clarifying information source, frequency of collection and responsible person.

Given the complexities and wider challenges facing the quality assurance system, refining the current system without first revisiting its basic elements, such as key indicators and reporting requirements, and developing the requisite skills in responsible staff, is not feasible within the current timeframe. Rather, a programmatic response is needed: one that considers this evaluation and its recommendations, and plans and implements further refinement of the approach, over a longer period, in light of the recent developments that have precipitated major changes.

Irrespective of these findings, as a starting point a simple process analysis was conducted, focusing on clinical services rather than on management and administration (owing to time and the availability of relevant staff). A matrix of actors and processes was developed by defining the key domains of activity on one axis and approximating the quality cycle on the other. This enables a quick analysis of the different processes for each actor at each stage.
In each cell, semi-structured analysis can explore:

- Challenges
- Barriers
- External constraints
- Good practice
- Errors
- Other experiences

Doing so allows a structured examination of bottlenecks and strong points in the overall process. The matrix is presented in Table 4 on the subsequent page.
Table 4 - A process/actor matrix for examining the current approach to QAS implementation

The columns approximate the broad time elements of a QAS cycle, left to right:
- Plan: strategic planning process; unit-level planning of indicators; analysis of external factors
- Define indicators: define indicators; propose and agree on benchmarks
- Data collection and use: generating original data; entering primary data; transferring data to the central database; compiling, aggregating and disaggregating
- Monitor: examining data; examining data against indicators
- Analyse: analysing causes of variation; identifying responsible staff
- Process change: implementing the change process; changing the system

The rows present the key domains and their sub-categories:
- Administration: management; office
- Finance: accounting; management
- Clinical: PT; P&O; social; wheelchair and other services
- Support: store; guards; others
- User

This table plots the elements for analysis by domain and process, with the cells left blank for analysis. The highlighted box represents the key areas of focus in the present analysis.
Once this matrix had been developed, we made a simple analysis of key 'cells', that is, the tasks and considerations for particular human resources at particular points in the cycle. The next sections divide the cycle into planning and defining indicators; gathering and using data; and monitoring, analysis and reporting. Because the staff involved in this section of the analysis were most interested in their responsibilities for collecting the data, this was the main focus, but we made general observations on planning and analysis.

10.5.1 Planning and definitions of indicators

The planning and definition of indicators occurred mostly in 2009. Recommendations for change were made in late 2010, but the proposed adjustments have not been implemented, and there is no systematic approach to re-evaluating indicators. The present evaluation was intended to re-shape and build upon the current approach, but as earlier sections have suggested, there have been dramatic shifts in the project constraints, HR and reporting requirements that demand adjustments beyond the scope of this evaluation alone. However, the matrix developed in Table 4, above, offers a systematic approach to re-developing the system if that is considered the most appropriate way forward.

The analysis of the validity of the indicators presented in section 10.2 on page 30 suggests that this is perhaps one of the biggest bottlenecks in the process. The data are not used effectively because the indicators have not been refined, tested and reconsidered, and, more importantly, there is no current plan to do so. Just as critically, there is no routine mechanism embedded in the QA processes where this can be done. Because standards change and the working context changes, not least through new and stronger skills in gathering, using and interpreting QA data, it follows that the indicators and processes should also evolve.
Currently, that is lacking in the QAS approach, and is probably largely responsible for the limitations observed.

10.5.2 Data collection

Generating original data

The PT and P&O involved in this section of the analysis both expressed that the overall number of data entry requirements is very large. They also noted, though not in this exercise, that there are too many steps in the process: data are entered into the client form, and then into a separate data form. They also suggested that:

"the responsibility is changing sometimes. There is not always a clear form or process for collecting some of the indicators"

In the P&O section, the respondent observed: "For the core patient details this is done via a single form through a database manager. For QA data, there is no central place. The unit
heads are keeping their own files and then occasionally they have been compiled but right now the process has stopped."

This highlights that the QA system is seen as conceptually different from 'normal' data usage. It is consistent with the earlier findings that only a few of the indicators are routinely collected and that no overarching process is described for the QA system. The observation that 'the responsibility is changing' is consistent with the changing external requirements, shifting management processes and a general lack of experience with quality management systems, on the part of the specific responsible persons but also of the programme, HI and the wider sector in general.

Entering primary data

In addition to the difficulty, described above, of knowing which data to use, there are many challenges in entering relevant and accurate data into a QA system. The problems include multiple handling of data:

"Entering the data into a software (sic). Sometimes we have to get the statistics from a database. For example the staff working hours."

...and bottlenecks in the data flow:

"Before we needed it (the data) from the admin but now we get it directly from the database or the PM."

These findings reinforce that the QA system is not only conceptually problematic in terms of the chosen indicators, but also relies on complex and inadequate systems for collecting and entering data.

Centralisation – entering into the database

Here, we observed still further separations and complexities in the QA system. Overall, data required for the MoSVY reporting system are systematically and routinely entered into a central database. Data for the QAS, on the other hand, must be entered separately by unit heads, and there is no strong oversight of the process.
The staff have not been supported to develop their skills in data usage and processing, or even provided with reasonable templates. While more sophisticated analyses are possible, even a simple assessment suggests that the multiple entry sources, the incomplete development of templates and the workload of planning to use the system are critical factors, in addition to other aspects such as the validity of the indicators and the separation of the QAS from ordinary management processes.

Compilation, aggregation, disaggregation

10.5.3 Monitoring, analysis and reporting
There are clear limitations in the way data are monitored and analysed. During the previous phase, there was systematic reporting of a few key indicators, especially in the clinical domains; there are no reports on management-related indicators. For the present analysis, only the clinical indicators were explored, but the more important finding is that only a few indicators are systematically developed.

10.6 Development of a user-friendly composite tool

A user-friendly composite tool is developed for data management.

While it is clear that this item is very much needed, within the scope of the present technical input it is very complex indeed to re-develop the current system. In fact, doing so would probably be counterproductive, since the implementation, change and piecemeal development of the current system are not consistent with sustained, satisfying and valued use of a QAS in Kampong Cham. Rather, it is proposed to extend and re-focus the emphasis on the quality management system to take into account: the changes in management structure; the strong efforts of the staff in using the current system; external developments in what we know about managing for good physical rehabilitation services; HI's own development of a robust and valid quality management process; and a suite of research efforts focused on sustainability, quality and governance.

This approach takes into account the elements found in a simple analysis of the process followed so far in implementing the QAS, summarised earlier in Table 2 (a timeline of relevant reporting requirements, QAS milestones and external parameters) in section 10.1.2. Recommendations for an immediate course of action and short-term responses, balanced against the project requirements, practical options and good practices, are presented elsewhere.
10.7 An analysis of 2010/2011 findings

2010 indicators and first-semester 2011 indicators are reviewed, and a report highlighting the main findings in terms of performance and quality is produced.

10.7.1 'Workshop' results

Indicators 1 & 2 - Adjustments during alignment and fitting

Only a few months' data were available to explore alignment and other adjustments, and the validity of these indicators was challenged in section 10.2 on page 30. Regardless, the data are examined here. Figure 1 presents them graphically. Almost all fittings in the early months were within benchmark targets; in later months there was deterioration against the alignment targets. Other adjustments (indicator 2 in the workshop template) were not collected in May or June.

These findings probably suggest that the indicator is not sufficiently operationalised and produces a ceiling effect; that is, it is not sensitive enough to measure the change and variation needed to make useful decisions. However, they also suggest an overall low rate of alignment changes and adjustments before fitting. While this is listed as meeting the benchmark, the contrary is probably true: more alignment changes probably result in a better final alignment, reflecting systematic adjustment of limb alignment as training progresses and function improves.

Figure 1 - Percentage of adjustments under the target rate. Monthly percentage of fittings under the target error rate for alignment (dark bars) and other adjustments (light bars), January to June 2011. (Chart not reproduced.)
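The three-band 'acceptable / needs improvement / unacceptable' logic used throughout these indicators can be sketched as a small helper function. This is a minimal illustration only: the function and its names are not part of any existing PRC tooling, and the thresholds shown are the alignment-change benchmarks from the workshop template (acceptable below 3, needs improvement 4-5, unacceptable above 5).

```python
def classify(value, acceptable, needs_improvement):
    """Classify an indicator value into the QAS three-band scheme.

    `acceptable` and `needs_improvement` are (low, high) inclusive ranges;
    anything falling outside both bands is 'unacceptable'.
    """
    lo, hi = acceptable
    if lo <= value <= hi:
        return "acceptable"
    lo, hi = needs_improvement
    if lo <= value <= hi:
        return "needs improvement"
    return "unacceptable"


# Workshop indicator 1: number of alignment changes per fitting.
for changes in (2, 4, 7):
    print(changes, "->", classify(changes, (0, 3), (4, 5)))
```

Expressing every benchmark in this uniform shape would also make the bands easy to review and re-define in one place, which the current spreadsheet-per-unit approach does not allow.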
Indicator 4.3 - Device durability

Overall, the raw data recorded do not make a clear appraisal of lifespan simple. The figure entered is, for a given month, the number of devices that fell into each of the three benchmark timeframes. The grand mean of devices lasting for an 'acceptable' 9 months was 45 devices per month, 22.4 devices per month were in the 'needs improvement' range of 7 or 8 months, and 39.45 devices per month were unacceptable (Figure 2). Overall, then, about 53% are not meeting the acceptable standard of durability.

Understanding these findings in more detail from the current raw data is very difficult or impossible. It is not possible to tell which devices are breaking more than others: we would, for example, expect transtibial devices to fail before upper-limb devices, and we almost always anticipate that orthoses will outlast prostheses. Nor can we examine whether, for instance, some devices failed after a week, since no range can be derived from the reported data. Overall, we can detect that there are probably some general issues with durability, since over half of devices are not lasting more than 9 months, and about 40% are lasting less than 7 months. This warrants close investigation and careful disaggregation of the data to examine which devices are failing, for which users, and under which circumstances, so that remedial action can be taken.

Figure 2 - Number of devices within benchmark ranges per month. Number of devices within each benchmark range, presented month by month. These data represent all devices (new, old, prostheses, orthoses, various levels).
(Monthly values are shown for December to July, plus the overall mean; chart not reproduced.)
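The disaggregation recommended above becomes straightforward if per-device return records are kept, rather than only pre-binned monthly counts. The sketch below is illustrative: the record fields, device names and month values are invented for the example, not taken from PRC data.

```python
from collections import defaultdict

# Hypothetical per-device return records: (device type, months in service).
returns = [
    ("transtibial prosthesis", 5),
    ("transtibial prosthesis", 11),
    ("AFO", 14),
    ("wheelchair", 8),
    ("transtibial prosthesis", 4),
]

# Group lifespans by device type so failures can be compared across types.
by_type = defaultdict(list)
for device, months in returns:
    by_type[device].append(months)

# Early-failure rate per type, using 9 months as the 'acceptable' threshold.
for device, lifespans in sorted(by_type.items()):
    early = sum(1 for m in lifespans if m < 9)
    print(f"{device}: {early}/{len(lifespans)} failed before 9 months")
```

With records in this shape, the questions the report raises (do transtibial devices fail before upper-limb devices? did any device fail within a week?) can be answered directly, and ranges and minima can be computed rather than lost to binning.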
Indicators 6 & 7 - P&O and benchworker production statistics

Only four months' data were available for processing. P&Os achieved a rate that was acceptable according to the pre-determined benchmarks in January and July of 2011. The results were unacceptable in December and April, when only around 21 devices were produced per month. For benchworkers, the production rate was acceptable only in January, and unacceptable in all other recorded months.

Again, though, precisely why and how these benchmarks were chosen is unclear. Since we do not know the caseload in those months, the complexity of the devices, or the staff's other activities, the analysis reveals little about efficiency and productivity, other than that production rates are highly variable from month to month. Indeed, the rates approximate the averages in most other contexts and, on their face, seem appropriate: a prosthetist might be expected, on average, to deliver over one device per day, and one prosthetist might have two benchworkers, meaning each benchworker's rate would be about half that. These numbers are consistent with the findings here.

Figure 3 - P&O and benchworker production rates. The P&O acceptable rate is >33 devices per month; for benchworkers, the rate is 17 per month. Unacceptable rates are <30 and <15 respectively. (Recorded rates across the four months: P&O 21.41, 34.86, 21.15 and 34; benchworkers 7.57, 17.28, 9.49 and 12. Chart not reproduced.)
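One way to make month-to-month output more comparable, as the analysis above suggests, is to weight devices by an approximate effort unit rather than counting each device equally. The sketch below is an illustration only: the device categories and effort weights are assumptions invented for the example, not PRC-agreed values.

```python
# Illustrative effort weights, taking a transtibial prosthesis as 1.0 unit.
# These numbers are assumptions for demonstration, not measured values.
EFFORT = {
    "transtibial": 1.0,
    "transfemoral": 1.5,
    "AFO": 0.5,
    "standing system": 2.5,
}


def weighted_output(devices):
    """Sum effort units for a month's list of produced device types."""
    return sum(EFFORT[d] for d in devices)


# Two hypothetical months with very different raw counts...
march = ["transtibial"] * 10 + ["AFO"] * 8      # 18 devices
april = ["transfemoral"] * 6 + ["standing system"] * 2  # 8 devices

print(weighted_output(march))  # 14.0 effort units from 18 devices
print(weighted_output(april))  # 14.0 effort units from only 8 devices
```

The two hypothetical months deliver the same effort despite raw counts of 18 and 8 devices, which is exactly why a crude device count is a weak proxy for efficiency.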
10.7.2 PT findings

Indicator 3 - Treatment planning

For three of the four measured months, treatment plans included SMART goals at an 'acceptable' rate. In June, compliance was 'unacceptable'. Overall, these results probably suggest strong compliance with a decision to use SMART goals for all treatment planning; no other meaningful inferences can be drawn.

Figure 4 - Percentage of treatment plans with SMART goals per month. The benchmark 'acceptable' rate is 90%; 'unacceptable' is <80%. (Recorded values across the four months: 90, 90.1, 78 and 90.91 per cent. Chart not reproduced.)
Indicator 4 - Missed appointments at the PRC

While data are recorded here, they cannot be compared against the benchmarks, because only a raw number of missed appointments is reported, rather than a percentage. The numbers may therefore represent a reasonable percentage of missed appointments; no inferences can be drawn from these data.

Figure 5 - Missed appointments at the PRC. The intention of this indicator is to understand the responsiveness of the service to client needs. The acceptable benchmark is <50%. (Recorded monthly counts across the four months: 62.2, 57.98, 69 and 62.02. Chart not reproduced.)
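Making the raw missed-appointment counts comparable against the percentage benchmark only requires recording the total number of scheduled appointments alongside the missed count. A minimal sketch follows; the scheduled totals are invented for illustration, since the current records do not capture them.

```python
def missed_rate(missed, scheduled):
    """Return missed appointments as a percentage of those scheduled."""
    if scheduled == 0:
        return 0.0
    return 100.0 * missed / scheduled


# Raw missed counts of the kind currently reported, paired with
# hypothetical scheduled totals.
for missed, scheduled in [(62, 180), (58, 110)]:
    rate = missed_rate(missed, scheduled)
    if rate < 50:
        band = "acceptable"
    elif rate <= 60:
        band = "needs improvement"
    else:
        band = "unacceptable"
    print(f"{missed}/{scheduled} -> {rate:.1f}% ({band})")
```

The same raw count of around 60 missed appointments can fall in different benchmark bands depending on the scheduled total, which is why the raw number alone supports no inference.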
Indicator 5 - Daily treatments per PT

On average, PTs delivered 13.3 treatments per day across the four months for which reliable data were available, ranging from around 10 up to nearly 19. While it is not possible to compute a meaningful standard deviation from so small a dataset, the variation here might reflect differences in the measurement process from month to month, as well as differences in available working time, rather than the efficiency of the therapists that the indicator aims to measure. The overall average was within the 'acceptable' benchmark range, but each of the four months taken individually was outside it: some in the 'needs improvement' range of 10-11 treatments per day, and, in July, a major breach of the >16 'unacceptable' benchmark. It is important, then, to understand not only appropriate benchmarks but also reasonable month-to-month variation. Overall, while there are some limitations in these data, they suggest a reasonable overall work rate for physical therapists.

Figure 6 - Daily treatments per PT. The acceptable range is 12-14; unacceptable is fewer than 10 or more than 16. (Recorded values: 14.1, 10.03, 18.9 and 10.03, mean 13.265. Chart not reproduced.)
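The month-to-month variation discussed above can be summarised with the standard library alone. The monthly figures below are those plotted in Figure 6; the script itself is an illustrative sketch, not part of the PRC's tooling, and with only four data points the spread statistic is indicative at best.

```python
import statistics

# Average daily treatments per PT for the four months with reliable data
# (values as plotted in Figure 6).
monthly = [14.1, 10.03, 18.9, 10.03]

mean = statistics.mean(monthly)
spread = statistics.stdev(monthly)  # sample SD; n=4, so indicative only

print(f"mean {mean:.3f}, SD {spread:.2f}")
for value in monthly:
    outside = value < 12 or value > 14
    print(value, "outside 12-14 acceptable band" if outside else "within band")
```

Run against these figures, every individual month falls outside the 12-14 band even though the mean sits inside it, which is the report's point: benchmarks need an accompanying notion of acceptable month-to-month variation.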
10.8 Additional analysis: comparing the QAS against sustainability indicators

While sustainability of services has always been a key target for HI, and many of HI's operational methodologies place sustainable access to quality services at their centre, the organisation has re-focused its attention on the sustainability of rehabilitation in the last few years. Consequently, it now has a stronger understanding of the predictors of sustained delivery of rehabilitation services, and has experimented with a core set of indicators. It is therefore very likely that, at either the service level or the governance level, a system for reflecting on those indicators will be introduced. While this is positive and necessary, for staff working in a PRC there is a strong chance it would further destabilise the current QA processes. Consequently, the technical team in Cambodia proposed an additional analysis of the correlation between the QA system and the sustainability indicators, to see where the indicators are already being captured and so minimise any necessary changes. The findings of this simple analysis are presented in Table 5 on the following page.
Table 5 - Cross-analysis of indicators between the KC QAS and HI sustainability indicators

This table presents the draft sustainability indicators prepared by an earlier sustainability working group in Cambodia. The analysis outlines whether each indicator is measured at the KC PRC (whether in the QAS system or elsewhere) and whether the indicator is used in management decisions.

Core indicators at PRC level:

| Component | Indicator (defined by sustainability working group) | Measured at KC PRC | In QAS? | Used? |
| C1 Health outcomes | Number of old cases coming to PRC | yes | | no |
| | Number of new cases coming to PRC | yes | | no |
| C2 Health services provision | Number of people treated | yes | | no |
| | Number of devices produced | yes | yes | no |
| | Level of quality of services | no | attempted | no |
| | Number of referrals | yes | | no |
| | Number of outreach activities | ? | | no |
| C3 & C4 Viability and organisational capacity | Availability of monitoring system | partly | yes, but not functioning | |
| | Availability of annual plan | no | | no |
| | Availability of annual budget | yes | | no |
| | % of PRC staff employed and paid by INGO | yes | | no |
| | Percentage of PRC staff working for more than 4 hours/day | no | | no |
| | Availability of raw material and consumables | unclear | yes | no |
| | Availability of administrative procedures | no | | no |
| | Implementation of HR procedures | no | | no |
| | Percentage of PRC staff replaced (same qualification) | no | | no |
| | Percentage of staff who follow national standards (ISPO) | no | | no |
| | Number of supervision visits of PoSVY directors to the PRCs | ? | | no |
| C5 Community capacity | Level of awareness of community people/PWDs of the PRC activities and services | no | | no |
| | Level of technical referral/follow-up of clients at community level | no | | no |
| | Number of people/PWDs (physical) (women and men) coming by themselves to the PRCs | yes | | no |
| C6 Enabling environment | Level of implementation of the National Action Plan of PWDs | no | | no |
| | Level of implementation of the Cambodia Disability Law | no | | no |
Core indicators at national system level (without putting them under specific components), which are largely covered by the MoSVY database:

- Percentage of PRCs'/Factory's utility costs covered by national budget
- Implementation of the MoU
- Level of implementation of the National Plan of PWDs
- Ratification of the UNCRPD
- Availability of a centralized database on PRC statistics and costs
- Level of funds invested by government and/or donors
- Adequate recognition of P&O and PT qualifications in the public salary scale
- Percentage of clients who pay out-of-pocket money to access PRC services
- Level of money received by the PRC/National Component Factory compared to money allocated to PRCs according to set standards
- Number of supervision visits to the 11 PRCs
- Number of people treated/given services by PRCs
- Level of financial viability of the National Component Factory

The table above demonstrates very clearly that the current QA approach does not explore many of the sustainability indicators proposed by the working group. More alarmingly, where indicators are collected, the data are not examined by management or acted upon in either a strategic or a systematic manner.