SOS and PCE Decentralization



Proposal to delegate approval of programmatic categorical exclusions (PCEs) to districts, based on a system of standards of submission for deliverables and a QA/QC program for monitoring program performance.

  • ENV proposes to address these and other interrelated issues by working to move PCE determination from ENV to the Districts. The shift to District determinations would be beneficial for two reasons. First, Districts should have more authority: PCEs are the routine work in most of TxDOT and by their nature are minor projects that do not require much finesse, and FHWA has been willing to move these determinations over to TxDOT. Second, ENV has critical missions that are hindered by the flood of PCEs. An essential prerequisite for moving PCE determination is to implement a QA/QC process that works to reduce error and misunderstanding. Additional benefits of a QA/QC system include opportunities to reduce back and forth, get the Districts and ENV on the same page with respect to regulatory adequacy, and reduce preventable errors overall. The QA/QC system would therefore include forward-looking elements that feed directly into a continuous improvement program. In addition, implementing a QA/QC system provides opportunities to start new patterns of cooperation that prevent QA/QC problems by ensuring that the Districts and ENV are reading from the same sheet of music, and by promoting interaction.
  • TxDOT’s agreement with FHWA defines the terms under which CEs are reviewed. The agreement establishes thresholds that qualify projects to be treated as PCEs. It further establishes that ENV will be responsible for making determinations that projects are PCEs by verifying that the thresholds have been met. If the thresholds are exceeded, the project is forwarded to FHWA as a CE. If the thresholds are not exceeded, the project is assumed to be approved by FHWA as a PCE, and it is processed by TxDOT. So the bottom line of the PCE agreement is that TxDOT must determine whether a project meets the thresholds for a PCE, and ENV is charged with making that determination.
  • We’re not starting from zero in gaining FHWA approval to move determinations out to the Districts. The primary problem is to work from the existing regulatory foundations to create a delegation FHWA is willing to risk. The groundwork for delegating PCE approval to the Districts is already there, but it needs a persuasive argument: because FHWA already provides for PCE determinations by ENV, the question is not whether to give this authority to TxDOT, but who within TxDOT should be able to make determinations. A congressionally approved model exists on which to build. Section 6005 of SAFETEA-LU enables states to assume FHWA’s NEPA responsibility and requires the implementation of a QA/QC structure subject to FHWA review. The 6005 model specifies the characteristics a QA/QC system must have, and it can be oversimplified to three fundamental principles: (a) a formal agreement makes TxDOT the accountable party; (b) FHWA participation is reduced to program audit; and (c) continuation of the delegation is linked to audit results. The bottom line for District PCE determinations: we have to give FHWA assurances that we are managing its legal exposure in an acceptable way. These assurances are built into the 6005 model, so moving PCE determinations to the Districts should be based on the 6005 model. The trick will be to persuade FHWA that the move is directly analogous to an FHWA-to-TxDOT delegation.
  • TxDOT should take advantage of the 6005 model because the 6005 process is already approved in general, and in particular FHWA has already demonstrated that it likes what TxDOT developed for the 6005 delegation. FHWA in both DC and Texas regarded our QA/QC/audit proposal as a model for other states; in fact, Alaska adopted it with minimal changes. FHWA in DC wanted the other states to adopt the District/ENV structure, in which QA/QC oversight is separated from project development. There was a sense that the combination of QA/QC review and self-audit with reporting to FHWA would allow TxDOT to monitor itself reliably. So, to emphasize again, adapting the 6005 model pre-positions TxDOT for FHWA approval by adopting a model with which FHWA is already comfortable, instead of re-inventing something FHWA may or may not accept and for which there is no established model. Both ENV and FHWA recognized immediately that the system also doubles as a CI process that could further increase reliability by eliminating sources of error and fostering successful approaches. Amadeo agreed that it was a good system and indicated that he wanted to adopt it even if the 6005 delegation did not occur, so the system has already been accepted by him as well.
  • Adapting the 6005 model to PCE determinations involves one important feature that does not apply to delegation under 6005: FHWA remains the defendant in federal court. FHWA has been comfortable with giving PCE determinations to ENV, but it is less comfortable with moving them to the Districts. To gain acceptance, we have to convince FHWA that the shift will not increase its exposure to risk. Elements that were effective in the 6005 project were: maintaining a firewall between project development and document approval (FHWA has relied on the District/ENV structure to provide assurance of independent review); balancing project- and program-level QA/QC (although 6005 did not say so explicitly, the model required states to determine failure rates for environmental compliance; it did not assume perfection, but emphasized that a successful program would demonstrably reduce the rate of project-specific error); and, as the central key, recognizing that FHWA is accountable to the public, resource agencies, and the courts, and that this accountability must pass down to the Districts and ENV.
  • (1) Finalize and roll out standards of submission (SOSs) that define production and review standards. The SOSs will establish a common standard for the Districts and ENV, and serve as the basis for QA/QC. (2) Finalize and roll out a QA/QC system that (a) gives producers a common target, and (b) gives producers and reviewers a common understanding of how review will occur. An ENV-to-District QA/QC system would be analogous to an FHWA-to-ENV system. (3) After a test drive to amass trial data, perform demonstration audits to objectively assess District/ENV performance and provide CI feedback on how to remedy systemic error and replicate innovation. An audit system would be analogous to the system by which FHWA would assure itself that the 6005 delegation was working. It also would maintain the District/ENV firewall, which FHWA believed was essential to successful compliance with the terms of the 6005 delegation. (4) After the initial audits, request FHWA approval to move PCE determination to the Districts. FHWA will determine whether to make the move at all, whether to do it district-by-district or department-wide, and whether to treat locally let projects the same as regular TxDOT projects. Because FHWA remains the designated defendant, implementation will depend on a high degree of reliability in the Districts. Basing the request to move determinations to the Districts on a demonstrable record would give FHWA an opportunity to decide whether TxDOT’s performance is consistent with its comfort levels.
  • The centerpiece of the demonstration is the establishment of standards of submission. ENV began a shift in this direction during negotiation of the SH 130 agreement with CZ. The motivation was to increase the predictability of outcomes in order to minimize costs resulting from losses of time. The SOSs for SH 130 define criteria for minimum acceptable performance. Although all ambiguity cannot be eliminated, the SOSs narrow the range of ambiguity and therefore reduce the likelihood of disagreement. Importantly, by establishing thresholds of acceptable performance with reduced ambiguity, CZ’s environmental contractors could use the SOSs as scopes of work which, if followed without error, would help guarantee that deliverables would be approved by ENV on the first submission. Because the SOSs were to be based on practices with demonstrated success, they also would maximize acceptance by resource agencies and FHWA. When this was proposed to CZ, it was recognized immediately as a device with universal application in TxDOT. The process is effectively undergoing a pilot study on SH 130, and it has resulted in high levels of success.
  • Emphasize that SOSs perform the desired functions by accomplishing these ends.
  • SOSs are grounded in statutory and regulatory requirements, supplemented by the requirements of guidance and policy. Whenever possible, they should be developed in consultation with resource agencies. SOSs also should point out features that facilitate stakeholder acceptance even if those features do not have a regulatory basis: a feature may not be technically required, but skipping it may still be unwise, especially with respect to building the good will that enables TxDOT to get resource agencies to help us get past jams. The SOSs define standards from which deviation is possible. Districts will retain the authority to deviate from SOSs, but at the risk of rejection or revision, so districts can choose in advance to trade predictability for other desired benefits. This feature provides opportunities to take risks that result in identifying valuable innovations, and it gives districts an incentive to reduce the risk of deviation by institutionalizing a method through which districts, ENV, and FHWA can identify approaches that are front-loaded for successful review. We’ll come back to this in the next slide. Importantly, SOSs are one way to do things; they happen to be a way that is pre-approved for acceptance. However, they cannot be permanent. Laws change, agencies change, practice reveals ambiguity, and consultants find ingenious ways to manipulate SOSs toward unacceptable outcomes. So SOSs will need to change from time to time, and one reason for change will be to improve them as criteria for acceptance the first time, every time.
  • SOSs are starting points for dealing with project-specific problems and opportunities for innovation. SOSs can be crafted to meet agency and FHWA expectations whether or not the agencies and FHWA formally approve them. They also can be points of departure for a project-specific solution, which in effect becomes a project-specific SOS if FHWA and ENV are made aware. Since successful review is a goal, districts can front-load project-specific solutions for success by involving FHWA and ENV. The primary benefit of agency and FHWA approval: districts often seek agency or FHWA approval of exceptions, and then cite the exceptions as the new standard. But agencies and FHWA can make exceptions without changing their overall expectations. So if an agency has approved an SOS, a history of exceptions provides an opportunity to consult with the agency to determine whether it is time to revisit the SOS. For example, FHWA frequently makes exceptions for individual projects. Districts use these exceptions as precedents for subsequent cases, but FHWA may regard the exceptions as one-time allowances, so when the exceptions keep showing up, FHWA may object that documents do not meet its expectations. FHWA approval of an SOS gives FHWA an opportunity to reply: yes, we made an exception, but we did not change the standard. A special case: consultation with FHWA and ENV can show that an exception is not even needed. Many deviations occur because districts are afraid to disclose impacts that are acceptable if disclosed and unacceptable if not disclosed. Negotiations can help districts identify strategies for disclosures that meet the hard-look standard, and thereby foreclose re-dos and revisions.
  • The flip side of SOSs as performance standards is their function as review standards. By establishing standards for acceptable performance, they also establish standards for acceptable review; as a result, they constitute QA/QC standards. Because the SOSs are founded on legal, guidance, and policy sources, they also identify the set of requirements that a document must meet if the district decides to try an alternative approach. The statutory and other foundations of an SOS therefore stand as a review standard. If the District, FHWA, and ENV agree on an alternative that meets the statutory requirements, then this project-specific SOS becomes the review and QA/QC standard for that project, provided the deviations are approved in advance and specified in sufficient detail. If a deviation is not approved in advance, the underlying statutory and other requirements referenced by the SOS still must be met, but the detailed reference standard for review and QA/QC disappears, and review in this manner is subject to the ambiguity and conflict that plague current practice.
  • Certification is a central element of the District QA/QC process.
  • Project-level QA/QC corrects errors for specific projects.

    1. Standards of Submission, Environmental QA/QC/CI, PCE Determinations: A Proposal to Decentralize Environmental Review (View in Notes Page mode to view script)
    2. Baseline Perceptions
       • District Perceptions
         ◦ Districts should have more authority
         ◦ ENV standards too high
       • ENV Perceptions
         ◦ Districts not reviewing NEPA documents, reports, and studies
         ◦ Districts reluctant to disclose impacts
         ◦ Consultants repeating same errors
       • Shared Perceptions
         ◦ Too much back and forth
         ◦ We’re here to build roads
    3. TxDOT’s Goal
       • To more efficiently build transportation improvements that are environmentally sensitive and supported by the public
    4. Proposal for Change
       • Initiate use of Standards of Submission
       • Develop and implement QA/QC process for TxDOT
       • CI/training to reduce error, replicate success
       • Cooperate to prevent error on critical projects
       • Move PCE determination to Districts
       • Audit performance to identify training and guidance needs
    5. PCE Agreement: What It Is
       • Not a delegation
       • Establishes thresholds for PCEs
       • ENV ensures/verifies thresholds are met
       • If thresholds are exceeded, the CE is forwarded to FHWA
       • If not exceeded, the PCE is assumed approved by FHWA and processed by ENV
    6. Persuading FHWA
       • FHWA not opposed to delegation
       • SAFETEA-LU Section 6005 provides a model:
         ◦ Transfers FHWA accountability to state DOTs
         ◦ Reduces FHWA oversight to program audit
         ◦ Links continuation of delegation to audit results
       • Bottom line: 6005 delegation requires assurances and accountability for FHWA
       • Use the 6005 model for PCE determination at the District level
    7. Persuading FHWA
       • ENV developed a 6005 program
       • FHWA already likes TxDOT’s 6005 model:
         ◦ Regarded TxDOT’s QA/QC/audit system as a model for other states
         ◦ Wanted other states to adopt TxDOT’s District/ENV structure
         ◦ Knows TxDOT can monitor itself reliably
         ◦ FHWA and ENV recognize the CI value
       • Mr. Saenz wanted our 6005 system regardless of the 6005 outcome
    8. Adapting the 6005 Model
       • Moving PCEs to Districts still leaves FHWA responsible (unlike 6005)
       • Keys to FHWA acceptance:
         ◦ Maintaining the District/ENV firewall
         ◦ Balancing project- vs. program-level QA/QC/audit
         ◦ Balancing program standards vs. project-specific exceptions
         ◦ Maintaining accountability:
           ▪ between TxDOT, FHWA, and stakeholders
           ▪ within Districts and ENV individually
    9. Implementation
       • Sequence of events:
         ◦ Centerpiece: finalize and roll out standards of submission (SOSs)
         ◦ Finalize, roll out, and test-drive the QA/QC/audit system
         ◦ Perform demonstration audits/CI reviews
         ◦ Request FHWA approval to move verification authority to Districts
         ◦ Repeat as necessary
    10. Standards of Submission
       • Created to increase CZ control over review time/outcome for SH 130
       • SOSs:
         ◦ Define minimum acceptable performance
         ◦ Reduce ambiguity to minimum levels
         ◦ Are statements of work for producers, references for reviewers
       • Maximize first-time acceptance by ENV, agencies, and FHWA
       • In place for SH 130
         ◦ Effectively a pilot program
    11. Standards of Submission
       • What they can do:
         ◦ Get producers and reviewers on the same page
         ◦ Front-load a project for a successful outcome
         ◦ Reduce preventable re-dos
         ◦ Increase consultant, District, and ENV accountability
         ◦ Provide a baseline for consistent QA/QC/audit
       • If producers meet an explicit standard, basic quality happens automatically
         ◦ If there is good reason for more, the standard is the place to start
    12. Standards of Submission
       • What they are:
         ◦ Criteria for acceptance: maximize the likelihood of acceptance of reports, documents, etc.
         ◦ Baseline for legal adequacy
         ◦ Point of departure for innovation and risk management
       • What they aren’t:
         ◦ The only acceptable way to do things
         ◦ Permanently unchanging
    13. Standards of Submission
       • Agency/FHWA approval useful, but not necessary
       • Even without approval:
         ◦ Can be adapted to agency/FHWA expectations
         ◦ Point of departure for project-specific exceptions negotiated by the District, FHWA, and ENV
       • If exceptions become common, adapt the SOSs to the changed agency/FHWA expectations
    14. Excerpt from Sample SOS
        Review Standards for Draft and Final Reports for Archeological Surveys (Individual Antiquities Permits). Each criterion is scored "Meets criterion? Yes / No / N/A":
        1. Report cover and title page include Hwy/limits, Counties, CSJ(s), District(s), Antiquities Permit #, Principal Investigator’s name, and investigative firm’s name. (Abstract form and curation form must accompany the final report.)
        2. Report includes a map of the area surveyed on a USGS 7.5’ Quadrangle, or an equivalent if the 7.5’ Quad is unavailable. (In the final report, the map of the area surveyed cannot include site locations. A map with site locations must be included as a separate enclosure.)
        3. Report includes project type/description/impacts and acreage of the area surveyed.
        4. Report defines the Area of Potential Effects (APE) in three dimensions, referring to project plans or to typical impacts for this class of project.
        5. Report includes discussion of previous work/sites within one kilometer of the area surveyed, with explicit reference to review of TARL files and THC or Historic Sites Atlas maps, and explicitly indicates trinomials of sites, or the absence of sites, within one kilometer.
        6. Report includes description of topography, soils, and geology. Report references soil survey maps and geological maps for the entire area surveyed, or indicates that none are published for the area surveyed.
        7. Report includes an estimate of surface visibility, description of land use, and general description of vegetation in and adjacent to the area surveyed.
    15. SOSs and QA/QC
       • The legal, guidance, and policy foundations of SOSs define:
         ◦ Performance standards for various environmental tasks
         ◦ Standards for documentation/disclosure
       • Therefore, SOSs define:
         ◦ Minimum standards for technical reports, documents, studies, etc.
         ◦ QA/QC/audit criteria that apply to both ENV and Districts
    16. District QA/QC
       • Check reports, studies, and documents against SOSs
         ◦ For consultants, Districts, and local governments
       • District performs QA/QC review
       • Preparer and District certify that:
         ◦ The report, study, or document meets the SOSs, or
         ◦ The report, study, or document deviates from the SOSs but meets:
           ▪ Other criteria pre-approved by FHWA and ENV/other agencies (i.e., a project-specific SOS)
           ▪ Legal requirements covered by the SOSs
       • Under PCE determination by Districts, SOSs become the standards for District review and approval
    17. Excerpt from Draft Certification
    18. ENV QA/QC/Audit
       • Check a certain percentage of reports, studies, and documents for conformance with:
         ◦ SOSs, or
         ◦ Pre-approved deviations, or
         ◦ Legal requirements covered by SOSs
       • Check environmental tasks for conformance with SOSs or legal requirements
       • Maintain a database of review results
       • Periodic audits to evaluate performance
       • Periodic meetings with agencies/FHWA to gauge their perspective on program issues
    19. Audit
       • The 6005 model relies on program-level audit to:
         ◦ Determine the success of the delegation
         ◦ Identify opportunities for mid-course change
         ◦ Monitor the effectiveness of mid-course change
       • Approach consistent with:
         ◦ FHWA’s approach in the LGPOTF
         ◦ TxDOT’s use of Independent Engineers in CDAs
         ◦ TxDOT’s general internal audit program
       • Base the audit of the PCE move to Districts on the 6005 model
    20. Audit: Proposed Model
       • Identify possible systemic error and success
         ◦ Use the QA/QC database
         ◦ Look for correlations involving specific districts, issues, reviewers, project types, etc.
       • Examine project files
         ◦ Is the correlation actual or only apparent?
         ◦ Can a cause be identified?
         ◦ Is there problematic or innovative deviation from SOSs, legal requirements of SOSs, or pre-approved deviations?
         ◦ SOSs become audit standards
    21. Audit & Continuous Improvement
       • Program-level audit to identify opportunities for program improvement
       • Program-level QA/QC audit directed toward identifying:
         ◦ Systemic causes of error
         ◦ Replicable innovations
         ◦ Proposed changes
       • Proposed changes function as CI recommendations
         ◦ Training, guidance, and technical assistance likely to be major CI items
    22. Maintaining the PCE Move to Districts
       • Principal key: maintain quality
         ◦ Meet FHWA comfort levels
         ◦ Maintain accountability
         ◦ Maintain relationships with resource agencies
         ◦ Document legal defensibility of the program
       • Prevent QA/QC problems
         ◦ Respond to CI recommendations
         ◦ Foreclose review problems through:
           ▪ Pre-approved deviations from SOSs
           ▪ ENV technical assistance for difficult tasks