CES Conference Presentation, June 2006, J. Sheldon

Transcript

  • 1. Collaboration From a Distance: The Consequences of Implementing an Evaluation Study by Proxy Jeffrey Sheldon, M.A., Ed.M. School of Behavioral & Organizational Sciences Claremont Graduate University The Claremont Colleges June 2006
  • 2. Evaluation Overview Evaluation site: Private hospital, Durban, South Africa. Clients: Psychology Department. Evaluation component: Process and outcomes. Evaluand: Patient satisfaction with psychological services. Unit of analysis: Patients. Method/data source: Survey/patients. Comparison group: None. Data collection: Psychology Department staff. Sample size: 11. Evaluation period: 15–30 November 2005.
  • 3. Evaluation Needs Little is known about hospital patients’ thoughts, beliefs, and behaviors towards the psychological services received. Documentation and analysis of patient satisfaction with the processes and outcomes of psychological services.
  • 4. Evaluation Questions How satisfied are patients with Psychology Department services? Does high satisfaction correlate with high program quality?
  • 5. Evaluation Use Generate knowledge about the department’s effectiveness in providing services to patients. Program improvement (QIP: Quality Improvement Procedures). Primary intended users: Hospital Psychologist; Intern Clinical Psychologists; and Counseling Psychology Intern.
  • 6. Background Hundreds of outcome studies on the effectiveness of counselling and psychotherapy. Few carried out specifically in a primary care setting such as a hospital. Hemmings (1997) surveyed 96 patients in an in-house counselled group, asking their opinions about the services received. Hemmings as precedent: collected and analyzed data about patient satisfaction with the outcomes of a hospital-based psychological service. Point of departure: collected and analyzed data about the peripheral services and environment (processes) of those same services.
  • 7. Survey Development No formally articulated program theory to guide the evaluation; the survey was developed on the basis of: departmental criteria and standards by which counseling services are conducted; the department’s knowledge needs; and patient satisfaction and service quality theory from the extant literature.
  • 8. Patient Satisfaction Relationship w/ psychologist. Expected patient outcomes. Treatment: the counselling process. Structure: peripheral services and environmental factors (Donabedian, 1989). Helpfulness of counselling; being understood by counsellor; having enough time to talk to counsellor; and counsellor easy to talk with (Hemmings, 1997).
  • 9. Service Quality A function of overall satisfaction and prior experience in a therapy setting: what patients think and feel about their therapy session; attitudes and behaviors toward therapy; how well they understand and follow their psychologist’s instructions; and the outcome of their therapy.
  • 10. Theoretical Model The following is a conceptual model of the program theory: [Figure 1: service quality modeled as a function of general satisfaction (therapy session; attitudes & behaviors toward therapy; following instructions; outcome of therapy) and previous experience (knowledge of therapy; expectation of length of therapy).]
  • 11. Methods Initiated contact with the Intern Clinical Psychologist to ascertain the Psychology Department’s evaluation needs. Positive and enthusiastic response to conducting an evaluation. Needs articulated. Designation of an on-site co-researcher/coordinator. An Intern Clinical Psychologist with research experience was designated supervisor. Memos of expectations and instructions sent via email. Some input given on the survey instruments. Surveys were to be administered, per the written instructions, between 22 October and 22 November 2005. Summary of the evaluation with recommendations sent via email in January 2006.
  • 12. Participants English-speaking. 18 years and older. A current hospital patient (either in- or out-patient). Able to fully understand and give informed consent, with no psychological condition precluding them from doing so. Could not present with any acute and extreme psychological condition that would necessitate immediate intervention by a psychologist or render them incapable of actually filling out a survey. No other distinguishing characteristics.
  • 13. Sampling Venue-based convenience sampling. Patients could have arrived for counselling in different ways. Patients were to have been solicited for participation at the end of a counselling visit.
  • 14. Materials Informed consent form developed with input from the Hospital Ethics Committee. Survey: sixty-two Likert-scale items about patients’ experiences with counselling at the hospital; nine questions to compare the overall quality of care received with that provided at another institution (if applicable). Dillman’s (2000) principles guided survey item construction and formatting. (See the coding sketch below.)
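
A minimal sketch of how responses to a survey like this might be coded into per-construct scores. Everything here is an assumption for illustration, not the study's actual instrument: the file name, the item column names, the item-to-construct mapping, and the use of a mean score per construct.

```python
# Hypothetical coding of Likert-scale survey responses into
# per-construct mean scores; item names and the construct
# mapping are assumptions, not the study's actual instrument.
import pandas as pd

df = pd.read_csv("survey_raw.csv")  # hypothetical file: one row per patient

# Assumed mapping of items to constructs (e.g., rel_* = relationship).
constructs = {
    "relationship": ["rel_1", "rel_2", "rel_3"],
    "outcomes": ["out_1", "out_2", "out_3"],
}

# Mean of the available item responses for each construct.
scores = pd.DataFrame({
    name: df[items].mean(axis=1)
    for name, items in constructs.items()
})
print(scores.describe())
```
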
  • 15. Planned Analysis Strategy Factor analysis. Correlations: between constructs and with moderators; between moderators and service quality. Regression: of 6 constructs and moderators on service quality. ANOVA: of satisfaction (low & high) and previous experience (yes or no/good or bad) on 4 constructs; … previous experience on 2 constructs. Content analysis: of open-ended questions on 6 constructs of interest. (See the analysis sketch below.)
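
A hedged sketch of how the quantitative parts of this plan might look in practice, assuming a pandas DataFrame with one row per patient. All names (survey_scores.csv, the construct columns, service_quality, prev_experience) are hypothetical, and the one-way ANOVA below is a simplified stand-in for the planned satisfaction-by-previous-experience design.

```python
# Hypothetical sketch of the planned analyses: correlations,
# regression of constructs on service quality, and a one-way
# ANOVA by previous experience. Column names are assumptions.
import pandas as pd
import statsmodels.api as sm
from scipy import stats

df = pd.read_csv("survey_scores.csv")  # hypothetical file
constructs = ["relationship", "outcomes", "treatment", "structure"]

# Correlations among constructs and with perceived service quality.
print(df[constructs + ["service_quality"]].corr())

# Regression of the constructs on perceived service quality.
X = sm.add_constant(df[constructs])
model = sm.OLS(df["service_quality"], X, missing="drop").fit()
print(model.summary())

# One-way ANOVA: each construct by previous experience (yes/no).
for c in constructs:
    groups = [g[c].dropna() for _, g in df.groupby("prev_experience")]
    f, p = stats.f_oneway(*groups)
    print(f"{c}: F={f:.2f}, p={p:.3f}")
```
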
  • 16. Expected Results In keeping with studies of patient satisfaction with health services in general, it was expected that counseling services would be viewed positively (Hemmings, 1997). Studies have consistently found patients to report reasonable satisfaction with their mental health treatment (MacPherson et al., 1998).
  • 17. Actual Results Few of the evaluation questions the clients wanted answered were answered. The only “comfortable” result: patients perceive the six constructs positively. Therapy outcome had the strongest correlation with perceived service quality: the only construct found to contribute significantly to the prediction of service quality when all other constructs were accounted for, and the only supported hypothesis (from the extant literature). A positive perception of therapy outcomes was the strongest predictor of perceived service quality based on the variables in the study. This finding was based on an extreme response set and very low numbers.
  • 18. Unanticipated Results Provided information about the feasibility of conducting a full-scale evaluation of psychological services. Tested the survey instrument and the proxy process. Provided preliminary information about hospital patients’ thoughts, beliefs, and behaviors towards the psychological services received.
  • 19. A Different Kind of Collaboration Could not be physically present, so relied on a proxy to implement the evaluation. A belief that a distance evaluation could work, given the clients’ strong initial enthusiasm and commitment. Lack of total control over the process. Blurred lines of distinction between client and evaluator.
  • 20. A Different Kind of Collaboration Unforeseen logistical and collaborative challenges necessitated multiple procedural trade-offs. The trade-offs impinged upon the validity of the findings; the compromises made likely had some deleterious effect. It would have been easier to take a pass than continue, but…
  • 21. Communication Challenges Limited and inconsistent communication with program staff. Communication with only one staff member, who relayed information to key decision-makers and then relayed back the decisions made. Diminished communication frequency over time. The prohibitive cost of calling South Africa and the time difference made communication feasible only through email. One internet-connected PC available for staff use at the hospital.
  • 22. Implementation Challenges Detailed email memos were sent in hopes of negotiating in good faith, but in the end the staff implemented the survey without consulting us. No on-site training of the proxy. Reliance on written instructions to provide survey implementation training; unclear whether they were adhered to. No oversight of the implementation process: unable to see the survey process in action and make mid-course corrections to enhance the quality of implementation; no direct communication with the person responsible for survey planning and implementation, so anything we heard came too late to intervene.
  • 23. Implementation Challenges Staff workload distribution was inequitable. The primary decision-maker took a long time to make decisions and then convey the results. Availability of the contact person was limited at best. Never found out: how patients were recruited; what respondents were told about informed consent or the survey; who administered the surveys; where the surveys were completed; how long the surveys took to complete; or how they were collected and stored (for confidentiality).
  • 24. Bureaucratic Challenges Misunderstanding of intentions (applied vs. basic research) by the hospital’s Ethics Committee: delayed an affirmative decision; limited the sample size to ten; turned the planned evaluation into a pilot; wanted an unspecified reassessment after the “pilot”; wanted the survey translated into isiZulu; no incentives to respondents. Couldn’t do a wire transfer of funds for return postage or incentives.
  • 25. Logistical Challenges Survey packet delayed for two weeks by South African customs. Only two weeks to conduct the evaluation (half of the anticipated time-frame).
  • 26. Self-Imposed Challenges Unable to meet with Psychology Department staff to develop a clearly articulated program theory: the overall scale and subscales were tied to some of the more commonly operationalized constructs of patient satisfaction from the literature. No pilot-testing with a small sample of the patient population to assess cognitive processing and language difficulties. Population selection bias: the patient population was low income with no access to another health institution, influencing their responses and biasing results toward the positive (Clark et al., 2004).
  • 27. Self-Imposed Challenges No qualitative phase to better understand the underlying subscale constructs. Did not tie prior patient psychometric assessments to the survey to determine potential psycho-emotional confounds. No demographic data collected (e.g., age, education level, patients’ expectations, intention to recommend the hospital, etc.) that could have been used to control for response bias during the analysis.
  • 28. Analytical Challenges Small sample size and missing items. Could not run factor analysis: couldn’t be certain particular constructs were actually being measured, and further testing could not be based on an average rating score for each construct, since no factor analysis separated the items into constructs; correlations, regressions, and t-tests were not trustworthy. Potential sampling error and response biases. Aside from general descriptions of the data and calculated averages, results from the tests run were unreliable. (See the fallback sketch below.)
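
With n = 11, factor analysis is off the table. One hedged fallback, not described in the original study, is to report descriptives only and check each construct's internal consistency with Cronbach's alpha. A minimal sketch, assuming hypothetical item columns named rel_1 through rel_4 in the same assumed file as above:

```python
# Hypothetical fallback when the sample is too small for factor
# analysis: per-construct descriptives plus Cronbach's alpha.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for one construct's Likert items (rows = patients)."""
    items = items.dropna()
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
    return (k / (k - 1)) * (1 - item_vars / total_var)

df = pd.read_csv("survey_scores.csv")                 # hypothetical file
rel_items = df[["rel_1", "rel_2", "rel_3", "rel_4"]]  # assumed item names
print(rel_items.describe())                           # descriptives only
print(f"alpha = {cronbach_alpha(rel_items):.2f}")
```
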
  • 29. Distance Evaluation Might Work By… Setting up an effective, two-way communication structure. Having a longer, more realistic time-frame with appropriate, mutually agreed-upon deadlines. Having a signed M.O.A. Having adequate funding. Having realistic knowledge of the organization’s bureaucratic mechanisms. Using technology to train the proxy (availability dependent).
  • 30. Distance Evaluation Might Work By… Involving more staff and maintaining their involvement through incentives. Creating the program theory together online. Having greater control over the process by requiring frequent updates. Going through the site’s IRB first. Having a thorough understanding of the department’s or organization’s culture. Planning for delays. Making sure clients understand what is being asked of them as proxies.
  • 31. For more information contact: Jeffrey Sheldon, Ed.M. School of Behavioral & Organizational Sciences Claremont Graduate University 1.909.447.5474 jeffrey.sheldon@cgu.edu