MEASURING IMPROPER PAYMENTS
IN SOCIAL SERVICES BLOCK GRANT (SSBG)
HURRICANE SANDY SUPPLEMENTAL FUNDS
Michael Salmon, M.A.
Research Manager
Stefan Bishop, B.A.
Research Assistant
TODAY’S PRESENTATION
• Brief Description of Social Services Block Grant (SSBG) and Hurricane Sandy Supplemental Funds (HSSF)
• Development of SSBG Improper Payment Methodology and Error Rate Reviews
• Development of Review Tools and Sampling Criteria
• Computation of Errors and Error Reporting for SSBG HSSF
• Conducting FY 2014 Error Rate Reviews and Applying Lessons Learned to FY 2015
SOCIAL SERVICES BLOCK GRANT (SSBG)
• Provides grants to 50 States, DC, PR, and Territories for social services
• States can identify usage of funds within 29 service categories
• No eligibility criteria, but State activities must be consistent with the general goals of the program
SSBG HURRICANE SANDY
The Disaster Relief Appropriations Act, 2013 (DRAA) provided $474.5 million in Hurricane Sandy Supplemental Funds (HSSF) to five SSBG States.
All DRAA-funded Federal programs were deemed susceptible to improper payments and are required to calculate an error rate.
State | Percentage Share of State FEMA Individual Assistance Registrants | SSBG Hurricane Sandy Supplemental Fund Allocation
New York | 49.62% | $235,434,600
New Jersey | 47.80% | $226,794,105
Connecticut | 2.23% | $10,569,192
Maryland | 0.25% | $1,185,675
Rhode Island | 0.11% | $516,428
Total | | $474,500,000
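The speaker notes state that allocations were based on each State's share of FEMA individual assistance registrants. As a minimal sketch (not the official allocation computation), the percentage column can be reproduced from the allocation column, since the slide's shares are rounded to two decimals:

```python
# Check that each State's displayed registrant share matches its share of the
# $474.5 million SSBG HSSF appropriation (slide figures round to 2 decimals).
allocations = {
    "New York": 235_434_600,
    "New Jersey": 226_794_105,
    "Connecticut": 10_569_192,
    "Maryland": 1_185_675,
    "Rhode Island": 516_428,
}
total = sum(allocations.values())  # 474,500,000
for state, amount in allocations.items():
    print(f"{state}: {100 * amount / total:.2f}% of ${total:,}")
```

Running this reproduces 49.62%, 47.80%, 2.23%, 0.25%, and 0.11%, and the amounts sum exactly to the $474,500,000 appropriation.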
TWO-FOLD (BIFURCATED) METHODOLOGY

Case Record Review
Direct benefit or payment amounts to or on behalf of individuals or households (i.e., cases) based on specific eligibility criteria

Vendor Payment Review
Indirect benefits or services, including group intervention services, expansion of service staffing levels, or grants for the repair/renovation/rebuilding of service facilities
TWO-FOLD (BIFURCATED) METHODOLOGY

Case Record Review
• New Jersey: 12 awards, $157 million, 70% of allocation
• New York: 34 awards, $32 million, 14% of allocation
• Connecticut: 1 award, $700,000, 7% of allocation

Vendor Payment Review
• New Jersey: 44 awards, $64 million, 28% of allocation
• New York: 226 awards, $113 million, 48% of allocation
• Connecticut: 4 awards, $9 million, 93% of allocation

Error rate reviews will encompass $378 million (80%) of all SSBG HSSF allocated to New York, New Jersey, and Connecticut.
TWO-FOLD (BIFURCATED) METHODOLOGY

Case Record Review
Sampling Unit
• individual, family, or household receiving a payment/benefit
Sampling Universe
• all cases served by selected vendors during review period
Sample Size
• 383 cases for New York
• 383 cases for New Jersey
• 73 cases for Connecticut

Vendor Payment Review
Sampling Unit
• State payment made to a vendor for HSSF expenditures
Sampling Universe
• all vendor payments made within selected contract awards during review period
Sample Size
• 341 payments for New York
• 224 payments for New Jersey
• 12 payments for Connecticut
ERROR RATE REVIEW TOOLS

Case Record Tools
Review client records for completeness of application and eligibility materials, and accuracy of service benefits

Vendor Payment Tools
Review payments and invoices according to State policies for approval
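The speaker notes for this slide describe "mandatory forms tables" that name each required form and the criteria (e.g., signed, dated) a form must satisfy to be valid. A minimal sketch of how such a table could drive a completeness check follows; the form names and criteria are hypothetical, not drawn from any State's actual table:

```python
# Hedged sketch of a mandatory-forms-table check. Each entry names a required
# form and the criteria it must meet to count as valid (per the speaker notes:
# e.g., signed, dated). All names here are illustrative.
MANDATORY_FORMS = {
    "intake_application": {"signed", "dated"},
    "proof_of_identity": {"dated"},
    "payment_approval": {"signed", "dated"},
}

def review_case(case_record: dict) -> list:
    """Return findings for one case record.

    `case_record` maps form name -> set of criteria actually satisfied.
    """
    findings = []
    for form, required in MANDATORY_FORMS.items():
        present = case_record.get(form)
        if present is None:
            findings.append(f"missing form: {form}")
        else:
            for criterion in sorted(required - present):
                findings.append(f"{form}: not {criterion}")
    return findings

# Example: intake form present but unsigned; payment approval form missing.
print(review_case({"intake_application": {"dated"},
                   "proof_of_identity": {"dated"}}))
```

A real review tool would add program-specific forms and payment-accuracy checks, but the table-driven structure mirrors how the notes say the tools were specialized per program.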
FY 2014 ERROR RATE DEVELOPMENT
Developmental questions and challenges:
• Approach
• Flexibility of Block Grant Funding
• Specialization of Review Tools
• Timeline for Implementation
• State Readiness
FY 2014 SSBG ERROR RATE IMPLEMENTATION
• FY 2014 SSBG improper payment reviews based only on case record review in New Jersey
• FY 2014 SSBG Error Rate: 13.5%
• Majority of funds in error (74%) resulted from a single vendor in a single service program
• State response too late to adjust error rate downward
APPLYING FY 2014 LESSONS TO FY 2015
Efforts to improve error rate reviews in FY 2015 include:
• Quarterly improper payment sampling in each State
• Remote reviews (where possible) to save time and burden
• 30-day State response periods
• Improved collection and organization of records
FY 2015 SSBG ERROR RATE DEVELOPMENT
Case record and vendor payment reviews for all three States—Connecticut, New Jersey, and New York
• One national error rate
Conducting six separate reviews
• State and Vendor Interviews
• Late Start for New York and Connecticut
• New Jersey’s decentralized payment approval processes
FY 2015 SSBG ERROR RATE IMPLEMENTATION
• Despite delays, great improvement in organization and efficiency of reviews
• Success of Improvements since FY 2014
• Final FY 2015 results still under review
• States’ extended use of funds through FY 2017
CONCLUSION
Today’s presentation focused on the challenges of implementing a new improper payment methodology for a flexible, multi-service block grant program, including:
• Developing multiple review types across several States
• Specializing review tools to State- and vendor-specific policies and procedures
• Computation of errors and error reporting
• Developing best practices for future reviews in light of lessons learned
CONCLUSION
Thank you very much for your time today.
Please let us know if you have any questions!
Editor's Notes
  1. Michael: Good afternoon, we are Michael Salmon and Stefan Bishop with WRMA, Inc., the contractor to the Office of Community Services, Administration for Children and Families, Department of Health and Human Services, providing the technical and data support for the Social Services Block Grant (SSBG). We’re pleased to be speaking with you today to describe the Improper Payments Methodology we developed for the Social Services Block Grant (SSBG) Hurricane Sandy Supplemental Funds. Today we will briefly review the key components of the methodology and are prepared to answer any questions you have. This presentation is designed to be interactive, so please feel free to ask questions as we go through it.
2. Michael: Today’s presentation will focus on the challenges of implementing a new improper payment methodology for a flexible, multi-service block grant program:
• We will begin by briefly reviewing the nature of the Social Services Block Grant (or SSBG), and the Hurricane Sandy Supplemental Funds for which we have developed the improper payment methodology.
• We will describe the development of multiple review types used to capture improper payment information across a variety of social service programs.
• We will discuss the development of review tools that are specialized to State- and vendor-specific policies and procedures.
• We will review how errors are computed and reported across multiple States and review types.
• And we will describe the development and implementation of best practices for future reviews in light of lessons learned.
3. Michael: SSBG is a block grant established through Title XX of the Social Security Act, which provides grants annually to 50 States, DC, PR, and Territories for social services. Since sequestration in 2013, SSBG has provided approximately $1.6 billion annually to States and territories. States have general autonomy to identify the types of social services to be provided and the populations to be served. SSBG funds may be used to pay for a range of social services, including day care, protective services, and home-based services. States use 29 service categories to report on expenditures for, and recipients of, SSBG-funded programs. There are no eligibility criteria outlined in Title XX, but State activities must be consistent with the general goals of the program. The goals generally speak to reducing dependency and improving self-sufficiency, and to protecting children, adults, and persons with disabilities from abuse, neglect, and exploitation. WRMA has been responsible for data collection and support of the States’ annual reporting, with reports either published or posted on the web starting in 1998. These are available on the Internet at: http://www.acf.hhs.gov/programs/ocs/ssbg. There is no error rate reporting in the base program; however, certain performance measures have been introduced in an effort to reduce funds used for State administrative costs and to ensure that States’ spending is consistent with their intended use plans.
4. Stefan: In October 2012, Hurricane Sandy made landfall on the North Atlantic Seaboard, causing massive damage to businesses and residences in several States along the coast. The Disaster Relief Appropriations Act, 2013 (DRAA), signed into law in January 2013, included an appropriation of $474.5 million in additional SSBG funds to address necessary expenses resulting from Hurricane Sandy. The Office of Community Services (OCS) allocated the additional $474.5 million of Hurricane Sandy Supplemental Funds to NY, NJ, CT, MD, and RI (New York, New Jersey, Connecticut, Maryland, and Rhode Island). Allocations were based on each State’s share of FEMA individual assistance registrants following the disaster. All Federal programs funded by the DRAA were deemed susceptible to significant improper payments and must calculate and report a single, national error rate. OCS selected three of the five States that received SSBG Hurricane Sandy Supplemental Funds (New York, New Jersey, and Connecticut) to calculate improper payment error rates, since their allocations represent 99 percent of all SSBG Hurricane Sandy Supplemental Funds. (Rhode Island and Maryland were collectively awarded less than 1 percent of SSBG Hurricane Sandy Supplemental Funds.)
Transition to next slide: Because the States determine the types of services and eligibility for these services, as permitted by the SSBG regulations, there is considerable variation among States in the application of these funds. To account for this variation, we have worked with OCS to develop a two-fold (bifurcated) improper payment methodology to review the use of SSBG Hurricane Sandy Supplemental Funds in Connecticut, New Jersey, and New York. All funds not subject to improper payment reviews (including those in MD and RI) are subject to increased monitoring and oversight activities, documenting and testing key controls, and coordinating grantees’ corrective actions.
5. Stefan: The bifurcated approach applies a review process for two types of payments. Depending on the nature of the services to be provided, or projects to be undertaken, States’ expenditures will be assessed in one of two ways: either a case record review process or a vendor payment review process.
Case record review: Where States provide a direct benefit or payment amount to or on behalf of individuals, families, or households (i.e., cases) based on specific eligibility criteria, we conduct a case record review. This methodology applies to programs in which services and benefits are specific to discernible cases, and does not include more generalized group intervention services or other programs that cannot trace specific benefit amounts to individual service recipients. This review examines whether individual cases provided the documentation required to meet the eligibility requirements for service, and determines the amount of payments made in error (both under- and overpayments). Examples: the Sandy Homeowner and Rental Assistance Program (SHRAP) in NJ: many residents suffered massive flooding and wind damage to their homes; households were forced to rent separate dwellings while maintaining mortgage payments on destroyed homes; they needed help replacing lost items and furniture, paying property taxes, making utility payments, etc. SHRAP steps in to fill gaps not covered by FEMA, private insurance, etc. Mental health and addiction services: natural disasters can be very disruptive to individuals already receiving these kinds of services, and the induced stress of the recovery can lead to an increase in need. Affected individuals can seek out counseling and detox services to help them cope while addressing other recovery needs. Job training services can help struggling parents who lost employment as a result of the Hurricane to receive training and apply for new jobs; child care subsidies can help to shoulder some of the responsibility parents would otherwise need to juggle while seeking new employment or attending to other recovery needs.
Vendor payment review: Where States provide indirect benefits or services to residents affected by Hurricane Sandy, including group intervention services, expansion of service staffing levels, or grants for the repair/renovation/rebuilding of service facilities, we conduct a vendor payment review. This review examines individual payments made to service vendors, and assesses whether the vendors provided adequate documentation (e.g., applications, authorizations) necessary to meet the eligibility requirements for these payments. We then calculate the amount of payments made in error (both under- and overpayments). Repair, renovation, and rebuilding work is crucial to all States’ recovery efforts, but naturally does not lend itself to the case record review described earlier. In this case, reviews examine the types of invoices and receipts accounting for the work to be done, ensuring that all proper approvals have been obtained and documented according to State policy. Expansion of service capacity (hiring more counselors and case managers for child protective services, domestic abuse and sexual assault prevention programs) is key to the provision of social services to those affected by Hurricane Sandy; however, the focus of payments is on salaries and other programmatic expenses, which cannot easily be calculated into a benefit amount for a given client.
Each year, following all reviews, OCS will consolidate the error measures resulting from each review type across all States into one national error rate for the SSBG Hurricane Sandy Supplemental Funds.
6. Stefan: Following extensive discussions with each State, we have classified different awards and programs across each review type for all three States. When you look at the number of awards per State and by review type, you can see the variance between States in how they are administering their funds. NJ is spending a considerable amount of its allocation on programs subject to the case record review, whereas the majority of awards in NY and CT are subject to the vendor payment review. In total, the error rate reviews will encompass $378 million, or 80%, of all SSBG HSSF allocated to NJ, NY, and CT. Using a bifurcated methodology allows us to cover the majority of funds being spent. As previously mentioned, all funds not subject to improper payment reviews are subject to increased monitoring and oversight activities, documenting and testing key controls, and coordinating grantees’ corrective actions. Now that we have an idea of the number of awards subject to each review type, Michael is going to talk about our sampling methodology.
Case record review: We identified 12 service programs in New Jersey subject to case record review, accounting for approximately $157 million (70%) of the State’s total allocation of SSBG HSSF. We identified 34 awards in New York that will be subject to case record review, accounting for approximately $32 million (14%) of the State’s total allocation of SSBG Supplemental Funds. We identified 1 program in Connecticut that will be subject to case record review, accounting for approximately $700,000 (7%) of the State’s total allocation of SSBG Supplemental Funds.
Vendor payment review: We identified 44 service programs in New Jersey subject to vendor payment review, accounting for approximately $64 million, or 28%, of NJ’s total SSBG Hurricane Sandy Supplemental Funds. We identified 226 awards in New York that will be subject to vendor payment review, accounting for approximately $113 million, or 48%, of NY’s total allocation of SSBG Supplemental Funds. We identified 4 awards in Connecticut that will be subject to vendor payment review, accounting for approximately $9 million, or 93%, of CT’s total SSBG Hurricane Sandy Supplemental Funds.
In total, these two error rate reviews will encompass at least $378 million (80%) of all SSBG Hurricane Sandy Supplemental Funds allocated to New York, New Jersey, and Connecticut. As previously mentioned, all funds not subject to improper payment reviews are subject to increased monitoring and oversight activities, documenting and testing key controls, and coordinating grantees’ corrective actions.
7. Michael: Sampling methods and sample sizes were developed according to each review type, and were dependent on the number of cases to be served and vendor payments to be made for selected programs in each State. Sampling units for both reviews were essentially the same across all States.
Sampling unit: Case reviews: an individual, family, or household receiving a payment or benefit during the review period for HSSF services; benefits must have a calculable dollar amount. Vendor payment reviews: a payment made by the State to a vendor for expenditures made during the review period.
Sampling universe: Sampling universes were developed for each State and were bound by the review period in question (i.e., July 1, 2014 – June 30, 2015). For each program or award under review, States were asked to provide listings of all relevant cases served or vendor payments made. Case reviews: all cases within the specific service programs or served by selected vendors during the review period, listed by ID number, service program, vendor, and county served. Vendor payment reviews: all vendor payments made within selected contract awards during the review period, listed by ID number, service program, vendor, and payment amount.
Sample size: Sample sizes in each State were determined by the total number of expected cases or vendor payments to be administered by selected programs within the review period, and were chosen to sufficiently estimate an improper payment rate with a 90 percent confidence interval of +/- 2.5 percent. Due to a lack of previous error rate measurements within SSBG, sample size estimates were generated assuming a base error rate of 10%. Case reviews: 383 cases for New York and New Jersey; 73 cases for Connecticut. Vendor payment reviews: 341 payments for NY; 224 payments for NJ; 12 payments for Connecticut. Samples were drawn using a random number table and a computed sampling interval. After samples were drawn, they were shared with the States for confirmation and to begin the process of assembling records for review. A second sample of replacement cases was drawn for instances in which a State noted a selected item was unfit for review (e.g., a case did not receive any benefit payments within the review period). Where available, cases are replaced within the same program.
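The design parameters above (90 percent confidence, +/- 2.5 percent precision, assumed 10% base error rate) can be turned into rough sample sizes with the standard formula for estimating a proportion. This is our illustration, not the contractor's actual computation: the infinite-population formula gives roughly 390 units, and a finite population correction for smaller universes (the universe sizes below are hypothetical) pulls figures down toward the reported State-specific samples. A systematic draw with a computed interval, as the note describes, is sketched alongside:

```python
import math
import random

def sample_size(p=0.10, moe=0.025, z=1.645, population=None):
    """Sample size to estimate a proportion p within +/- moe at 90% confidence
    (z = 1.645), with an optional finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / moe ** 2     # infinite-population size, ~390
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)  # finite population correction
    return math.ceil(n0)

print(sample_size())                 # large universe: 390 units
print(sample_size(population=500))   # small universe shrinks the sample to 220

def systematic_sample(universe, n):
    """Systematic selection: computed interval, random start (a software stand-in
    for the random number table mentioned in the notes)."""
    interval = len(universe) // n
    start = random.randrange(interval)
    return universe[start::interval][:n]
```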
8. Michael: Review tools have been built around broad criteria that can be further tailored to the specific programmatic needs and policies within each State. Discussions with States and vendors determined program-specific documentation and approval needs; through these in-depth discussions, we determined how States’ systems are used to process client and payment information. We created mandatory forms tables naming all required forms and detailing the criteria that must be completed in order for forms to be valid (e.g., signed, dated); States provide feedback and clarification on draft mandatory forms tables prior to finalization.
For case reviews, we review client records based on program definitions for accuracy and completeness of application and eligibility materials: Authorization Forms (e.g., intake/application forms); Eligibility for Service (e.g., proof of identity, proof of impact); and Accuracy of Payments and Payment Approvals. How do clients apply for and receive service benefits? What supporting documentation must clients submit to verify eligibility for benefits? What measures are taken to ensure non-duplication of payment, and have they been followed? Are all service benefits properly documented and approved? Are benefit amounts accurate given supporting documentation?
For vendor payment reviews, we examine payments and invoices submitted by vendors to the State for approval. The reviews examine whether payments were properly reviewed and approved according to State policies. Areas of review include Claims Submission (e.g., purchase order forms, vendor attestations); Invoice Approval (e.g., invoices and/or detailed payment records if seeking reimbursement, completed payment approval forms); and Accuracy of Payments. How are invoices reviewed and approved? What supporting documentation must vendors submit with invoices? What measures are taken to ensure non-duplication of payment, and have they been followed? Does supporting documentation confirm the invoiced amount? Does State review lead to the appropriate payment amount?
Upon completing a review, any errors are categorized according to guidance issued by the Office of Management and Budget. The latest guidance, issued in October 2014, lists 13 different potential error categories. These include errors due to administrative or process errors, failure to verify data relevant to payment, insufficient documentation to determine accuracy of payments, and inability to authenticate client eligibility. When reporting final results, OCS presents the number of overpayments and underpayments associated with each error category. At the end of each review period, total expenditures for programs and awards subject to review are gathered, and error findings are extrapolated to estimate total dollars in error within the review period for all selected programs. As error rate findings are calculated for each State, they are combined across all States to calculate a single, national error rate.
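The extrapolation step at the end of this note can be made concrete with a small sketch. We assume, as a simplification rather than OCS's published formula, that each review stratum's sampled error proportion is projected onto its total expenditures, and that the projected error dollars are pooled into one dollar-weighted national rate; all figures below are invented for illustration:

```python
# Hedged sketch: extrapolate sampled error findings to total dollars in error
# per review stratum (State x review type), then pool into one national rate.
strata = [
    # (sampled dollars reviewed, sampled dollars in error, total expenditures)
    (2_000_000, 270_000, 157_000_000),  # e.g., NJ case record review
    (1_500_000,  60_000, 113_000_000),  # e.g., NY vendor payment review
    (  300_000,   9_000,   9_000_000),  # e.g., CT vendor payment review
]

projected_error = sum(err / reviewed * total
                      for reviewed, err, total in strata)
total_outlays = sum(total for _, _, total in strata)
print(f"Estimated national improper payment rate: "
      f"{projected_error / total_outlays:.1%}")
```

With these made-up inputs the pooled rate is about 9.3%; the key point is that a stratum with a high sampled error rate and large expenditures (as with NJ's SHRAP findings in FY 2014) dominates the national figure.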
9. Stefan: Developing an improper payment methodology for FY 2014 brought with it a number of unique challenges and raised questions for our review team.
Approach: When developing our approach, it was important to ask who would conduct the reviews: the States or OCS? There were concerns that the reviews would be a burden upon the States; by having OCS conduct the reviews, OCS can gain more intimate knowledge of how programs operate at the service level.
Flexibility of block grant funding: We initially considered only a case record review, but we found States implementing many indirect service programs where case records are not kept or benefit amounts cannot be easily calculated at the client level; significant repair/renovation/rebuilding projects likewise have no case-level data. The bifurcated methodology is meant to examine client-level data where possible, and to ensure that other programs are meeting State-level criteria for payment.
Specialization of review tools by program: While broad elements of the review tools are similar across programs, extensive conversations with the State were required to determine the exact contents of case record files across different programs in order to ensure a client met all program-specific eligibility requirements before receiving services (e.g., how often must an attestation be submitted, what kinds of identification are sufficient to receive service). Similarly, processing of vendor payments varies across States and even within States (NJ has different payment processing criteria depending on the department administering each program).
Timeline for implementation: The timeframe for developing and approving the methodology was very tight going into FY 2014; we were commissioned by OCS to do this work midway through FY 2014 and were required to conduct all reviews and produce an error rate by early August 2014. This meant holding several calls with each State to determine the number and nature of services being offered, which programs were expending funds, and the nature of case record and vendor payment materials, as well as gathering data needed for sampling, developing tools, coordinating site visits, conducting reviews, and computing error findings, all within a period of several months.
State readiness: Discussions with States in early summer 2014 revealed wide variation in States’ expenditure levels to date: New Jersey was actively expending funds, but New York experienced delays due to a lengthy contracting process, and Connecticut was in need of a revised plan for the use of its funds.
Michael will now speak to how the error rate was calculated in FY 2014.
10. Michael: FY 2014 SSBG improper payment reviews were based only on the case record review in New Jersey. The SSBG national error rate was based on the NJ case record reviews due to insufficient applicable expenditures in Connecticut and New York; the vendor payment review was developed in June 2014, too late to implement in the 2014 reviews. We conducted an on-site review of 383 cases across nine programs, 80% of them from a single program (SHRAP). Other programs included repair grants to seniors and the disabled, detox and short-term residential clinical services, direct child care financial assistance, and respite programs for families with special needs children.
FY 2014 SSBG Error Rate: 13.5%. The majority of funds in error (74%) resulted from a single vendor in a single service program (SHRAP). The SHRAP program provides housing assistance to affected residents to help with mortgage/rental payments, utility or tax bills, replacement of essential home items, etc. Many residents’ homes were made unlivable by flooding/wind damage, people’s employment was affected, and households needed to rent apartments while keeping up with mortgage payments. Errors were largely due to missing or insufficient documentation necessary to confirm a client’s eligibility or to prove that a payment was correct. Service capacity was overwhelmed by the expressed needs of affected individuals, and maintenance of paperwork and case files suffered, leading to many detected errors (e.g., missing proof of identity, proof of residence at the time of the storm, and various payment approval forms or proof of payment verifying that need had been met). The State was able to locate much of the missing documentation, but not in time to affect the calculated error rate.
11. Stefan: We applied several changes to the FY 2015 work based on lessons learned during the FY 2014 error rate reviews.
First, to spread burden out over the course of the year, we began working with States to draw quarterly samples of case records and vendor payments for review. States already report expenditure and service recipient numbers on a quarterly basis, and we have found it easier for States to respond with sampling data periodically over the course of the year rather than waiting until year end to draw all data for a single sample. Quarterly reviews require less coordination between States and vendors in assembling documentation for review, and cause less disruption to service provision (e.g., holding case records for review while cases are active). A sketch of one way to split an annual sample across quarters follows this note.
In addition, where possible, we have been coordinating with States to provide documentation for remote review, as opposed to being on-site. This allows for more flexibility when reviewing materials, and eases the burden and cost of coordinating an on-site visit. Remote reviews tend to be easier for vendor payment reviews, as they are less prone to the kind of sensitive information that may be contained in client-level case record documentation.
We implemented a standard 30-day response period by which States could respond to initial error findings, either by providing missing documentation not present at the time of the review, or by clarifying policy procedures justifying why an initial finding should not be considered in error. All documentation included in State follow-up must have been in existence at the time of the review but not included in the file record due to administrative error. This is particularly helpful when conducting remote reviews, where States or vendors may be copying or scanning hundreds of pages of material for review. Policy clarifications must make sense in light of the mandatory forms table discussions. Combining 30-day response periods with quarterly reviews also gives States greater flexibility in responding to a smaller number of findings at any given time. Future quarterly reviews are also refined and streamlined as a result of policy clarifications, leading to fewer false positives.
More frequent reviews allow for increased State coordination in collecting and organizing records for review as all parties become familiar with the review process. Finally, sources of error can be identified and corrected earlier, leading to fewer error findings in subsequent reviews. Michael will now share how we’ve implemented these changes in conducting our reviews in FY 2015.
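One way to operationalize the quarterly draws described above is to allocate the annual sample across quarters in proportion to each quarter's reported caseload. The notes do not specify the allocation rule, so this is purely our illustration, with invented caseload figures:

```python
# Hedged sketch: split an annual sample of n across quarters in proportion to
# quarterly caseloads, using largest-remainder rounding so the parts sum to n.
def quarterly_allocation(n, caseloads):
    total = sum(caseloads)
    quotas = [n * c / total for c in caseloads]
    alloc = [int(q) for q in quotas]                  # floor each quota
    leftovers = sorted(range(len(quotas)),
                       key=lambda i: quotas[i] - alloc[i], reverse=True)
    for i in leftovers[:n - sum(alloc)]:              # hand out the remainder
        alloc[i] += 1
    return alloc

# 383 annual cases spread over four quarters with declining caseloads:
print(quarterly_allocation(383, [5200, 4100, 2600, 1900]))  # [144, 114, 72, 53]
```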
12. Michael: Working from our experience developing one review for one State in FY 2014, we found ourselves in FY 2015 developing and implementing five additional reviews across all three States: Connecticut, New Jersey, and New York. After all reviews are completed across all States, the results must be compiled into a single, national error rate applied across all three States. Developing and organizing two case record reviews and three vendor payment reviews has involved many separate and ongoing conversations with both States and service vendors in order to understand the nature of services being offered, how States and vendors set eligibility and payment approval processes across different programs and departments, and how to organize records for review. In the case of New York and Connecticut, many of these conversations began late, either because the State’s planned use of funds was not yet approved (in Connecticut), or because awards had not been finalized and programs were only just beginning to expend funding. This essentially meant beginning our work at the point of explaining the nature of the two reviews, and working with the States to understand how different programs and awards would best fit into one review or the other. With New Jersey, we did have the luxury of a case record review already being developed. However, New Jersey posed a unique challenge when approaching development of a vendor payment review. Specifically, whereas NY and CT have centralized vendor payment approval processes, New Jersey’s payment approvals are specific to each of the administering departments. So, whereas all vendor payments in New York are ultimately handled by a central lead agency, with uniform approval policies and documentation, we needed to work with each of three administering departments in NJ to understand the documentation standards in place for approving payments to vendors (at times, we had to work with different administering divisions within a single department).
13. Michael: Despite delays and challenges experienced in early FY 2015, we have seen dramatic improvement in States’ responsiveness and organization in conducting this year’s error rate reviews. We feel much of this success can be attributed to drawing more frequent, smaller samples of cases and vendor payments for review, reducing the burden put upon States and vendors at any one time, and allowing States to improve organization of records and institute programmatic improvements as a result of earlier, more frequent reviews. New Jersey’s case record review is a standout example of this improvement: as a result of a tumultuous first review in FY 2014, the State worked with OCS to implement much stronger controls for the organization and record-keeping practices of its programs. Allowing the State a standard 30-day response period has also allowed the State to provide any missing documentation that may not have been present at the time of review due to administrative error. While final data are still under review, we can unequivocally say that improper payment reviews of the SSBG Hurricane Sandy Funds have improved dramatically over the course of FY 2015. We look forward to publishing final results of the FY 2015 error rate reviews within the coming months. Finally, States have recently been approved for extended use of their Hurricane Sandy funds over the next few years, with all States now required to spend down their remaining allocations by September 30, 2017. This means that improper payment error rates will continue to be calculated for these SSBG supplemental funds for at least the next two years. We look forward to continually improving the error rate review process moving forward.
14. Stefan: Today’s presentation focused on the challenges of implementing a new improper payment methodology for a flexible, multi-service block grant program, including: developing multiple review types to capture improper payment information across a variety of social service programs in three separate States; specializing review tools to State- and vendor-specific policies and procedures; computing errors and reporting error rates across multiple States and review types; and developing best practices for future reviews in light of lessons learned.
  15. Thank you very much for your time today. Please let us know if you have any questions!