2009 SIYM: Theory, Research, Practice, and Profession: Evidence (Presentation Transcript)

  • 1. Theory, Research, Practice, Profession: Institute objectives and themes Thomas Keller
  • 2. Introduction
    • Field of youth mentoring
    • Interplay of theory, research, practice
    • Ideas and aims of summer institute
    • Participation and feedback
    • Research principles and statistical analysis
    • Conceptual frameworks for mentoring and mentoring relationships
  • 3. Field of youth mentoring
    • Origins
      • Juvenile court probation (late 1800s)
      • Big Brothers Big Sisters (1904)
    • The Kindness of Strangers (Freedman, 1993)
      • Interest and growth
      • Positive youth development
      • Potential benefits
      • “Fervor without infrastructure”
    • Continued expansion
      • MENTOR (1990)
      • PPV study (1995)
      • OJJDP/gov’t funding (1990’s)
      • Handbook of Youth Mentoring (2005)
  • 4. Development of field
    • Attributes of a profession (Greenwood, 1957):
      • Skill based on theoretical knowledge
      • Period of formal training and education
      • Occupational organization
      • Code of ethics or conduct
      • Some form of community-sanctioned license
  • 5. Continuing challenges
    • Expanding and solidifying base of knowledge (practice knowledge and research)
    • Sharing knowledge and education (bi-directional communication)
    • MENTOR National Research & Policy Council (September, 2006):
      • “What do practitioners need to know, and what is the best way to disseminate needed information to them?”
  • 6. Interrelations of theory, research, practice Theory Research Practice
  • 7. Sequence
    • Define need/problem
    • Identify associated factors
    • Develop explanatory model
    • Create intervention based on model
    • Pilot test intervention
    • Refine intervention
    • Efficacy trial
    • Effectiveness trials
    • Dissemination
  • 8. Alternative sequence Theory Practice Research
  • 9. Common language Idea Action Observation
  • 10. Everyday example
    • What should I do?
      • You have a reason or rationale based on past experience and/or logic (explanation)
      • You expect a certain result (prediction)
    • Did it work?
      • Determine result
        • Confirming evidence (reinforces your idea/approach)
        • Disconfirming evidence (need to remember this and try something different next time)
    • Other examples
      • Learning and survival
      • Scientist in the crib
      • Internal working model
      • Reflective practice
  • 11. Personal vs. formal
    • Personal theory
      • Intuition
      • Common sense
    • Personal evaluation
      • Observations and recollections
      • Interpretations and meanings
    • Result: convince yourself (belief)
    • Formal theory
      • Stated propositions
      • Logical case
    • Formal evaluation
      • Systematic collection of data
      • Systematic analysis of data
    • Result: convince others (evidence)
  • 12. Cautions
    • The main difference:
      • Making it explicit, documentation
    • The common risk:
      • Want to hold on to our ideas/beliefs
        • Avoid any evidence
        • Pay attention to confirming evidence
        • Disregard disconfirming evidence
        • Twist and stretch theory to match disconfirming evidence
  • 13. Research and practice
    • Internal validity—study has solid design that permits strong conclusions (quality)
    • External validity—study conclusions apply beyond specific sample (relevance)
  • 14. Different roles Theory Practice Research
  • 15. Modified model Ideas Action Observation Action Observation Practice Research
  • 16. Real-life examples Idea Action Observation
  • 17. Example: School-based mentoring
    • Opportunities and potential (Herrera, 1999)
      • Recruitment of volunteers (time, convenience)
      • Reach children with greater needs/challenges (school referrals)
      • Fewer staff resources (less screening, easier supervision)
      • Focus on academic development
    • Rapid growth in BBBSA (Herrera, 2004)
      • 1999=27,000; 2002=90,000
      • 233% increase
    • Careful evaluation of effects (Herrera, et al., 2007)
  • 18. Example: Summer Institute
    • Goal: To improve services for youth who participate in formal youth mentoring programs
    • Premise: A sustained dialogue between experienced professionals and researchers stimulates research with relevance to the field and enhances its translation to practical application
    • Strategy: A direct relationship between researchers and professionals
    • Model: A series of highly interactive discussions that provide an in-depth view of the research and examine its implications for program policies and practices
  • 19. Model elaborated
    • People
      • Professionals eager for cutting edge research and exchange of information and ideas
      • Researchers who want work to be relevant and want findings translated into improvements in services for youth
    • Size
      • Small-group format to encourage active exchange
    • Time
      • Ample time to think critically and creatively about issues and explore opportunities for innovation
      • Get away and go back to school
    • Dynamics
      • Cohort effect
      • Mixing within and between roles
  • 20. Mixture model Advocates Program Leaders Researchers YOU
  • 21. Professional development and career
    • Training tied to transitions (Caplan & Curry, 2001)
      • Internship: Transition from student to worker
        • Transfer knowledge from class to real-life
      • Entry-level: Transition to professional
        • Learn skills and tasks of position
      • Leadership development: Transition to leader in organization
        • Preparation for supervision and management
      • Master practitioner: Transition to leader in the field
        • Special opportunities for experienced professionals
  • 22. Leaders
    • Wisdom and insight to share at institute
    • Wisdom and insight to share in communities
    • Transfer of learning
      • Hold positions of influence
        • Training and supervision of staff
        • Development of program models
        • Implementation of service delivery changes
    • Task
      • Think about new program models, program policies and procedures, training materials for program staff, training materials for volunteer mentors and youth participants
  • 23. Summer Institute aims
    • Contribute to the development of policy and practice in the field of youth mentoring
    • Convene leaders in the field for substantive discussion of practices, policies, and new directions
    • Create new networks of peer relationships among professionals and researchers from different programs and backgrounds
    • Promote professional identity and commitment of participants and researchers
  • 24. Moving forward Idea Action Observation
  • 25. Preliminary observations
    • Support of advocacy agencies providing training and technical assistance?
      • Yes! (support with plans and announcements)
    • Interest on part of researchers?
      • Yes! (eager to attend)
    • Interest on part of professionals?
      • Yes! (competitive application process)
    • Ability to reach target participants?
      • Yes! (look around)
  • 26. Study
    • Encourage reflection
    • Provide feedback
    • Respond to questions at end of institute
    • Respond to questions after 6-12 months
    • Informed consent
  • 27. Questions
    • What types of opportunities for collaboration among colleagues (research-practice, practice-practice, research-research) do you see emerging from the institute: a) this week and b) continuing into the future?
    • What types of initiatives or changes could you undertake upon returning to your program?
    • How will you share the new information and ideas with others in your agency or community?
  • 28. Assignment
      • Assignment for each presentation
        • Summarize (few sentences)
        • Follow-up questions (2-3)
        • Implications for program (example)
  • 29. SIYM Themes
    • Year 1: School-based Mentoring
      • Reprise in year 3:
        • Symposium on Monday
        • Guest speakers on Friday
    • Year 2: Diversity in Mentoring
    • Year 3: Use of Evidence for Practice
      • MENTOR—Elements of Effective Practice, 3rd ed.
      • BBBSA program models (SBM, CBM)
      • New Meta-analysis
      • Standards and accreditation discussions
  • 30. Research-Practice Survey
    • WT Grant Distinguished Fellows
      • Use of high quality research in practice
      • Marc Wheeler, VP Programs BBBS Alaska (SIYM alum) to PSU
      • David DuBois, to BBBSA
    • Web-based Survey
      • Opinion survey
      • Needs assessment
      • Descriptive data
  • 31. Survey: sample
    • Sample
      • Responses: n=455
      • Positions
        • ED/CEO (39%)
        • VP of Program/Program Director (18%)
        • Program Coordinator (25%)
      • Mostly with BBBS agencies (54%)
      • Average employment in field=8.9 yrs.
  • 32. Survey: current assessment
    • Current level of EB decision making in field
      • A lot more needed (30%)
      • Somewhat more needed (55%)
      • About right (10%)
      • Less (5%)
    • Personally comfortable using EB decision making
      • Very/somewhat uncomfortable (26%)
      • Neutral (9%)
      • Somewhat comfortable (33%)
      • Very comfortable (32%)
  • 33. Survey: type of data
    • Use of published research
      • Haven’t (11%)
      • Small steps (37%)
      • Creating systems (16%)
      • Have systems to routinely do this (37%)
    • Use of internal agency data
      • Haven’t (7%)
      • Small steps (31%)
      • Creating systems (22%)
      • Have systems to routinely do this (40%)
  • 34. Survey: evidence used
    • Use of the following (Extensive / Substantial / Somewhat / Little-none):
      • Performance data on program operation/outcome: 33% / 40% / 21% / 7%
      • Data on local trends/needs: 16% / 40% / 31% / 13%
      • Professional experience/expertise: 27% / 50% / 17% / 7%
      • Data on client/stakeholder preferences: 14% / 42% / 28% / 17%
      • Published external research: 11% / 34% / 37% / 18%
  • 35. Survey: reasons to use evidence
    • Goals (Very important / Somewhat important / Neutral / Unimportant):
      • Focus resources on effective areas: 59% / 33% / 4% / 4%
      • Design new programs for specific populations: 49% / 38% / 8% / 6%
      • Guide decisions of board and stakeholders: 47% / 41% / 6% / 6%
      • Prevent negative outcomes for youth: 60% / 26% / 8% / 7%
      • Demonstrate impact to funders: 78% / 14% / 3% / 4%
      • Improve youth outcomes: 83% / 11% / 1% / 5%
  • 36. Survey: Greatest needs (very imp.)
    • How to analyze data and report findings (59%)
    • Step-by-step guide for EB decision making (56%)
    • How to find/select measures/metrics (56%)
    • How to find, read, use existing research (54%)
    • Description of different types of evidence (42%)
    • How to collaborate with researchers/colleagues on EB decision making (38%)
    • Glossary of common research terms (37%)
    • Guidance on ethical issues with research (36%)
    • Description of scientific method and applications (29%)
  • 37. Research principles and statistical analysis Briefly….
  • 38. Classic questions
      • What works for whom under what circumstances? Why?
        • Does program work better/differently for certain types of mentees (age, gender, race, stress, aptitude)?
        • Does program work better/differently in certain settings (community, school, etc.)?
        • Does program work better/differently with certain types of volunteers (age, gender, occupation, personality)?
        • What are the essential processes that yield the results?
  • 39. Research paradigms
    • Qualitative approach
      • Complex data
      • Smaller samples
    • Strengths
      • Reflects complexity of experience
      • Captures contexts and processes
      • Good for discovering what is happening
      • Good for within-system view
    • Limitations
      • Subjective interpretations
      • Translating results to others
    • Quantitative approach
      • Defined data
      • Larger samples
    • Strengths
      • Reflects clear definitions and theories
      • Captures relations between variables
      • Good for demonstrating what is happening
      • Good for comparisons
    • Limitations
      • Less nuanced
      • Sampling biases
  • 40. Research Process
    • Introduction
      • Framing the issue
      • Motivation, rationale
      • Theory/model
      • Research questions
    • Method
      • Design
      • Sample
      • Procedures
      • Measures
      • Analysis plan
    • Results
      • Data
      • Findings
    • Discussion
      • Conclusions/insights
      • Interpretations
      • Limitations
      • Next steps
  • 41. Outcomes
    • Criteria for selecting “outcomes”
      • Outcome can reasonably be expected to change during period, given intensity of intervention
        • Proximal (intervening) vs distal (ultimate)
        • Number of other uncontrolled factors
        • Portion of variance explained
      • Outcome is measurable and assessment is sensitive enough to detect likely change
        • Clearly and narrowly defined
        • Reliable measure
        • Valid measure
  • 42. Measures (diagram illustrating reliability and validity of measurement)
  • 43. Sampling
    • Defining population of interest
    • Representative members
    • Random sampling (vs. assignment)
      • Each individual has equal chance of being selected
    • Population parameter vs. sample statistic
    • Inferences apply to population
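The parameter-versus-statistic distinction above can be sketched in a few lines of Python. This is a minimal sketch with simulated data; the population values and their scale are assumptions for illustration only:

```python
import random
import statistics

random.seed(42)

# Hypothetical population: an outcome score for 10,000 youth (assumed data)
population = [random.gauss(12, 2) for _ in range(10_000)]
mu = statistics.mean(population)        # population parameter

# Simple random sample: every individual has an equal chance of selection
sample = random.sample(population, 100)
x_bar = statistics.mean(sample)         # sample statistic

# The statistic estimates the parameter, give or take sampling error;
# inferences from the sample apply back to this defined population
print(f"population mean = {mu:.2f}, sample mean = {x_bar:.2f}")
```

The gap between `x_bar` and `mu` is the sampling error discussed on the next slide.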
  • 44. Sampling distribution
    • Unlikely to get a sample statistic exactly equal to population parameter (sampling error).
    • Imagine a hypothetical sampling distribution if you took multiple samples from population and plotted all the sample statistics.
    • Central Limit Theorem:
    • If a population has a mean of μ and a standard deviation of σ, then the sampling distribution of the mean for samples of size N has a mean of μ and a standard deviation of σ/√N, and approaches a normal distribution as N becomes larger (regardless of the population's distribution).
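The Central Limit Theorem is easy to verify by simulation. The sketch below (Python, simulated data; the exponential population is an arbitrary choice to show the theorem holds even for a non-normal population) draws repeated samples and checks that the sample means cluster around μ with spread close to σ/√N:

```python
import math
import random
import statistics

random.seed(0)

# A deliberately non-normal population: exponential with mean 1 (assumption)
population = [random.expovariate(1.0) for _ in range(50_000)]
mu = statistics.mean(population)
sigma = statistics.pstdev(population)

# Draw many samples of size N and record each sample mean
N = 30
sample_means = [
    statistics.mean(random.sample(population, N)) for _ in range(2_000)
]

# CLT: the sampling distribution of the mean has mean ~ mu
# and standard deviation ~ sigma / sqrt(N)
print(f"mean of sample means: {statistics.mean(sample_means):.3f} (mu = {mu:.3f})")
print(f"sd of sample means:   {statistics.stdev(sample_means):.3f} "
      f"(sigma/sqrt(N) = {sigma / math.sqrt(N):.3f})")
```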
  • 46. Evaluation Design
    • Logic
      • We can determine the true effect of a program (or experience) if we compare what happens to an individual who is in the program versus what would have happened to that individual if he or she were not in the program (impossible in one lifetime)
    • Problem
      • We always lack the ideal counterfactual (the outcome in the what if situation…)
    • Missing data solution
      • Compare participants to non-participants who are as similar as possible in every way except for having or not having the intervention
  • 47. Evaluation Design
    • How do we get comparison group?
      • Experimental design (optimal)
        • Researcher controls exposure to intervention through random assignment to intervention
      • Quasi-experimental designs
        • Research tries to get a non-participant comparison group as equal/similar as possible
  • 48. Random assignment means everyone has equal chance of being in program
    • Imagine one dimension is motivation to succeed: with random assignment, low, middle, and high motivation would be equally distributed between participants and the “control group”
    • With random assignment, this would be true on EVERY dimension (observed and unobserved)
    • Without random assignment, groups may differ on important dimensions; for example, program participants may have higher motivation to succeed than non-participants (that’s why they signed up)
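The balancing effect of random assignment can be demonstrated by simulation. In this sketch (Python), the “motivation” scores are hypothetical data invented for illustration:

```python
import random
import statistics

random.seed(1)

# Hypothetical applicant pool with a "motivation to succeed" score (assumed data)
applicants = [random.gauss(50, 10) for _ in range(1_000)]

# Random assignment: shuffle, then split, so every applicant has an
# equal chance of landing in the program or the control group
random.shuffle(applicants)
program, control = applicants[:500], applicants[500:]

# On average the groups match on motivation, and the same logic applies to
# every other dimension, observed or unobserved
diff = statistics.mean(program) - statistics.mean(control)
print(f"difference in mean motivation: {diff:.2f}")
```

Self-selected groups, by contrast, can differ systematically on exactly such dimensions.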
  • 49. Experimental design
    • Post-test only (assume equivalence): Program → Xp; Control → Xc
    • Pre-test and post-test: Program → X, Xp; Control → X, Xc
    • Test of effect = mean(Xp) – mean(Xc)
  • 50. Experimental comparison (chart: pre-test to post-test change for program group versus control group)
  • 51. No control group (chart: program group change from pre-test to post-test, with no control group trajectory for comparison)
  • 52. Comparing Group Means
    • State a null hypothesis (e.g., no difference between group means: μ1 – μ2 = 0)
    • Create sampling distribution for difference between means.
    • Compare observed difference between means to null hypothesis.
    • If difference is relatively small, it could be due to sampling error (p>.05, FAIL to reject null).
    • If difference is relatively large, it is unlikely due to sampling error (p<.05, REJECT null) and conclude actual difference exists.
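The steps above can be sketched in code. In this example (Python), the post-test scores are simulated, and a normal approximation stands in for the exact t distribution, which is reasonable at this sample size:

```python
import math
import random
import statistics

random.seed(7)

# Simulated post-test scores (assumed data): program group runs 5 points higher
program = [random.gauss(75, 10) for _ in range(200)]
control = [random.gauss(70, 10) for _ in range(200)]

# Observed difference between group means
diff = statistics.mean(program) - statistics.mean(control)

# Standard error of the difference, then a z statistic comparing the
# observed difference to the null hypothesis of zero difference
se = math.sqrt(statistics.variance(program) / len(program)
               + statistics.variance(control) / len(control))
z = diff / se

# Two-tailed p-value: chance of a difference this large if the null were true
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"diff = {diff:.2f}, z = {z:.2f}, p = {p:.4f}")
```

A small p (below .05) means the observed difference is unlikely to be sampling error, so the null is rejected.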
  • 53. Errors
    • Conclusions are based on probabilities
    • Conclusions can be incorrect
      • Reject null hypothesis when we shouldn’t (Type I error: an unusual, fluke sample)
      • Fail to reject null hypothesis when we should (Type II error: don’t detect actual difference)
        • Low statistical power, need larger sample
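The link between sample size and the risk of missing a real difference (statistical power) can also be shown by simulation. In this sketch (Python), the effect size and sample sizes are illustrative assumptions:

```python
import math
import random
import statistics

random.seed(3)

def reject_null(a, b, cutoff=1.96):
    """Two-sample z test: True if |z| exceeds the alpha = .05 cutoff."""
    diff = statistics.mean(a) - statistics.mean(b)
    se = math.sqrt(statistics.variance(a) / len(a)
                   + statistics.variance(b) / len(b))
    return abs(diff / se) > cutoff

def trial(n):
    # A real but modest effect: program mean is 0.5 SD above control (assumed)
    program = [random.gauss(0.5, 1) for _ in range(n)]
    control = [random.gauss(0.0, 1) for _ in range(n)]
    return reject_null(program, control)

# Power = proportion of simulated studies that detect the real effect
power_small = sum(trial(20) for _ in range(500)) / 500
power_large = sum(trial(100) for _ in range(500)) / 500
print(f"power with n=20: {power_small:.2f}; with n=100: {power_large:.2f}")
```

With small samples most studies fail to reject the null even though the effect is real, which is exactly the "need larger sample" point above.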
  • 56. Observational studies
    • No imposed difference between groups (naturalistic observation)
    • Can see how certain variables correspond
      • Correlation
      • Regression
      • Multiple regression (can evaluate one factor controlling for others)
    • Sampling distribution for each estimate (assumption of no association)
    • Causal inference depends on several considerations (temporal order, ruling out other explanations, etc.)
  • 60. Orienting frameworks Central ideas….
  • 61. Development
    • Universal processes vs. individual differences
    • Successive adaptations
      • Past experience
      • Current circumstances
    • Continuity and change
      • Change possible at any time
      • Change constrained by prior adaptation
    • Diversity in process and outcome
      • Equifinality
      • Multifinality
  • 62. Developmental adaptation Source: Sroufe, L.A. (1997). Psychopathology as an outcome of development. Development & Psychopathology, 9, 251-268.
  • 63. Mentoring relationships
    • What distinguishes relationships?
    • (Laursen & Bukowski, 1997)
      • Permanence
        • Voluntary, kinship, committed
      • Social power
        • Resources, experience/knowledge, rank
      • Gender
        • Male-male, female-female, cross-gender
  • 64. Relationship dimensions (2 × 2 grid)
    • Equal social power (horizontal): Friend (voluntary/mutual); Cousin (permanent/obligation)
    • Unequal social power (vertical): Mentor (voluntary/mutual); Parent (permanent/obligation)
  • 65. Systemic model
  • 66. Systemic model
    • Conceptual points
      • Wholeness and order
        • Parts are interconnected and interdependent
      • Hierarchical structure
        • Composed of sub-systems with boundaries
    • Practical points
      • Intervention goes beyond mentor-child relationship
      • Caseworker, parent, teacher contribute to success or failure of relationship
      • Mentoring effects can be indirect, through multiple pathways of influence
  • 67. Systemic model
    • Analytical uses
      • Direct (M → C)
      • Reciprocal (M ↔ C)
      • Transitive (W → M, M → C)
      • Parallel (W → M, W → C, M → C)
      • Circular (C → W, W → M, M → C)
  • 68. Developmental stages: Contemplation → Initiation → Growth & Maintenance → Decline & Dissolution → Redefinition
  • 69. Stage model (stage: conceptual features; program practices)
    • Contemplation: anticipating and preparing for relationship; recruiting, screening, training
    • Initiation: beginning relationship and becoming acquainted; matching, making introductions
    • Growth and maintenance: meeting regularly and establishing patterns of interaction; supervising and supporting, ongoing training
    • Decline and dissolution: addressing challenges to relationship or ending relationship; supervising and supporting, facilitating closure
    • Redefinition: negotiating terms of future contact or rejuvenating relationship; facilitating closure, rematching
  • 70. Final thoughts
    • Human beings of all ages are happiest and able to deploy their talents to best advantage when they are confident that, standing behind them, there are one or more trusted persons who will come to their aid should difficulties arise.
      • --John Bowlby