STATE OF CALIFORNIA Edmund G. Brown Jr., Governor
PUBLIC UTILITIES COMMISSION
505 VAN NESS AVENUE
SAN FRANCISCO, CA 94102-3298
Commission
Staff Report
Lessons Learned From Summer 2012
Southern California Investor-Owned
Utilities’ Demand Response Programs
May 1, 2013
Performance of 2012 Demand Response programs of San Diego Gas and
Electric Company and Southern California Edison Company: report on
lessons learned, staff analysis, and recommendations for 2013-2014
program revisions in compliance with Ordering Paragraph 31 of Decision
13-04-017.
ACKNOWLEDGEMENT
The following Commission staff contributed to this report:
Bruce Kaneshiro
Scarlett Liang-Uejio
Tim Drew
Rajan Mutialu
Dorris Chow
Paula Gruendling
Taaru Chawla
Jennifer Caron
Alan Meck
TABLE OF CONTENTS
EXECUTIVE SUMMARY....................................................................................................... 1
Chapter 1: Introduction.................................................................................................. 5
I. 2012 Summer Reliability and Demand Response Programs..................................................5
II. Energy Division November 16, 2012 Letter and the Staff Report..........................................6
Chapter 2: Demand Response Program Load Impact...................................................... 8
I. Summary of Staff Analysis and Recommendations ...............................................................8
II. Different DR Load Impact Estimates ...................................................................................... 9
III. Comparison of DR Daily Forecast and Ex Post Results ..........................................................9
IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)..............................26
Chapter 3: Demand Response Program Operations...................................................... 32
I. Summary of Staff Analysis and Recommendations .............................................................32
II. 2012 DR Program Trigger Criteria and Event Triggers .........................................................32
III. DR Events Vs. Peaker Plant Service Hours ...........................................................................33
IV. Peaker Plant Comparison..................................................................................................... 34
V. Conclusions .......................................................................................................................... 35
Chapter 4: Residential Demand Response Programs .................................................... 36
I. Summary of Staff Analysis and Recommendations .............................................................36
II. Residential Peak Time Rebate (PTR) ....................................................................................36
III. Residential Air Conditioning (AC) Cycling.............................................................................51
Chapter 5: Non-Residential Demand Response Programs............................................. 57
I. Summary of Staff Analysis and Recommendations .............................................................57
II. Background and Summary of Utility Data............................................................................57
III. Commercial Air Conditioning (AC) Cycling...........................................................................59
IV. SCE’s Auto DR....................................................................................................... 63
V. SDG&E’s Demand Bidding Program (DBP) ...........................................................65
Chapter 6: Flex Alert Effectiveness ............................................................................... 67
I. Summary of Staff Analysis and Recommendations .............................................................67
II. Background .......................................................................................................................... 67
III. Utility Experience with Flex Alert.........................................................................................69
IV. Customer Experience ........................................................................................................... 69
V. The Future of Flex Alert........................................................................................................ 71
VI. DR Program Ex Post Load Impact Results on the Flex Alert Days........................................71
Chapter 7: Energy Price Spikes ..................................................................................... 73
I. Summary of Staff Analysis and Recommendations .............................................................73
II. Definition of Price Spikes ..................................................................................................... 73
III. DR Programs and Price Spikes.............................................................................................. 73
IV. Conclusion............................................................................................................................ 74
Chapter 8: Coordination with the CAISO ...................................................................... 75
I. Staff Recommendations....................................................................................................... 75
II. DR Reporting Requirements in Summer 2012.....................................................................75
III. DR Reporting Requirements for 2013-2014.........................................................76
Appendix A: Highlight of 2012 Summer Weather & Load Conditions.................................... 77
Appendix B: Energy Division November 16, 2012 Letter........................................................ 78
Appendix C: Descriptions of DR Load Impact Estimates......................................................... 79
Appendix D: SCE 2012 Monthly Average DR Program Load Impact (MW) ............................ 85
Appendix E: SCE 2012 DR Program Load Impact by Event (MW)........................................... 87
Appendix F: SDG&E 2012 Monthly Average DR Program Load Impact (MW) ....................... 91
Appendix G: SDG&E 2012 DR Program Load Impact by Event (MW)..................................... 92
Appendix H: SCE 2012 DR Program Overview ....................................................................... 93
Appendix I: SDG&E DR Program Overview............................................................................. 96
Appendix J: SCE Historical DR Event Hours............................................................................. 98
Appendix K: SCE Historical Number of DR Events .................................................................. 99
Appendix L: Summary of SCE’s Reasons for the 2012 DR Triggers....................... 100
Appendix M: SDG&E Historical DR Event Hours................................................................... 101
Appendix N: SDG&E Historical Number of DR Events .......................................................... 102
Appendix O: Utilities’ Peaker Plant Total Permissible vs. Actual Service Hours................... 103
Appendix P: Ex Post Demand Response Load Impact on Flex Alert Days ............................ 104
Appendix Q: CAISO Energy Price Spikes................................................................................ 105
Appendix R: Utilities’ Demand Response Reporting Requirements..................... 111
Appendix S: Additional Information .................................................................................... 113
EXECUTIVE SUMMARY
This report is prepared by Energy Division in compliance with Ordering Paragraph 31 of
D.13-04-017. The purpose of this report is to provide the lessons learned from the 2012
Demand Response (DR) programs operated by San Diego Gas and Electric Company (SDG&E)
and Southern California Edison Company (SCE) (together, the Utilities), and to recommend
program or operational revisions, including continuing, adding, or eliminating DR programs.
Highlighted conclusions and recommendations appear below; the full set of recommendations
appears in each chapter of the report.
In summary, Energy Division makes the following overarching conclusions about the
Utilities’ DR programs:
Forecast vs. Ex Post: While a few DR programs met or even exceeded their daily
forecasts when triggered, on average the ex post results for all program events
diverge from the daily forecasts by a considerable degree. The majority of programs
either provided a ‘mixed’ performance (the program both over- and under-
performed relative to its forecast) or were poor performers (consistently coming up
short relative to their forecasts). Of particular note are the Utilities’ Peak Time Rebate
program1 and SCE’s Summer Discount Plan.2 (Chapter 2)
The divergence between the ex post results and the daily forecasts can be traced to
a variety of causes, such as inadequate forecasting methods employed by the
Utilities, program design flaws, non-performance by program participants, and/or
program operations. A complete explanation of the reasons for divergence across
all programs, however, was not possible within the scope and timing of this report.
(Chapter 2)
2012 RA vs. Ex Post: Comparing the ex post results to the 2012 Resource Adequacy
(RA) forecast is not a good indicator of how well a DR program performs. RA
forecasts are intended for resource planning needs. Ex post load impacts reflect
demand reductions obtained in response to operational needs at the time the
program is triggered. Resource planning and operational planning have different
conditions and serve different purposes. (Chapter 2)
DR vs. Peaker Plants: The Utilities used their DR programs fewer times and hours
than the programs’ limits (each program is limited to a certain number of hours or
events). In contrast, the Utilities dispatched their peaker power plants far more
frequently in 2012 in comparison to 2006–2011 historical averages. (Chapter 3)
Energy Price Spikes: DR programs are not currently designed to effectively mitigate
price spikes in the CAISO’s energy market. On many days a DR event was called and
no price spikes occurred, and conversely there were days where price spikes
occurred and DR events were not called. The timing and scope of this report did not
permit a quantification of the cost of unmitigated price spikes to ratepayers, but in
theory, avoidance of these spikes would benefit ratepayers. (Chapter 7)
1 SCE’s marketing name for Peak Time Rebate is “Save Power Day”; SDG&E calls it “Reduce Your Use”.
2 Air conditioning (AC) cycling.
Energy Division also makes the following program-specific conclusions about the Utilities’
DR programs:
SCE’s AC Cycling Program Forecasting: SCE’s 2012 forecasting methodology for its
air conditioning (AC) Cycling program (the DR program that SCE triggered the most
in 2012) cannot be relied upon to effectively predict actual program load reductions.
(Chapter 2)
SCE’s AC Cycling Dispatch Strategy: SCE’s sub-group dispatch strategy for its AC
Cycling Program (also called Summer Discount Plan) created adverse ‘rebound’
effects, thereby reducing the effectiveness of the program during critical hot
weather days, e.g., 1-in-10 weather. (Chapter 2)
SDG&E’s Demand Bidding Program: SDG&E’s Demand Bidding Program produced on
average 5 MW of load reduction when triggered, although the US Navy did not
participate. The US Navy claimed certain program terms and conditions precluded
it from participating in the 2012 program. The Commission’s decision to modify the
program to a 30-minute trigger may further limit the US Navy’s ability to participate.
(Chapter 5)
Peak Time Rebate Awareness: SCE and SDG&E customers who received utility
notification of Peak Time Rebate (PTR) events had higher awareness of the program
when compared to customers who were not notified by the utility. More
importantly, customers who opted into receiving PTR alerts significantly reduced
load. All other customers in the program provided minimal load reduction. (Chapter
4)
Peak Time Rebate Free Ridership: The Utilities’ PTR program has a potentially large
‘free ridership’ problem, where customers receive incentives without significantly
reducing load. SCE paid $22 million (85% of total PTR incentives in 2012) in PTR bill
credits to customers whose load impact was not considered for forecast or ex post
purposes. 94% of SDG&E’s 2012 PTR incentives ($10 million) were paid to
customers who did not provide significant load reduction. The inaccuracy of the
settlement methodology (in comparison to the ex post results) is the main reason
for the ‘free ridership’ problem. The default nature of the program (everyone is
automatically eligible for the incentives) aggravates the problem. (Chapter 4)
Flex Alert: There is a lack of data to evaluate the effectiveness and value of the Flex
Alert campaign. Attribution of savings from Flex Alert is complicated by the fact
that load reduction from the Utilities’ DR programs on the two days Flex Alert was
triggered in 2012 contributed to reduced system peak load. A load impact
evaluation of Flex Alert is planned for 2013. (Chapter 6)
DR Reports: The Utilities’ DR daily and weekly reports were useful to the CAISO and
the Commission for purposes of up-to-date monitoring of DR resources throughout
the summer. (Chapter 8)
In light of the above findings, Energy Division recommends the following:
DR Evaluation: The Commission should require further evaluation of Utility DR
program operations in comparison to Utility operation of peaker plants for the
purpose of ensuring Utility compliance with the Loading Order. (Chapter 3)
Forecast Methods Generally: The Utilities’ daily forecasting methods for all DR
programs (especially AC cycling and other poor performers) should undergo
meaningful and immediate improvements so that day-ahead forecasting
becomes an effective and reliable tool for grid operators and scheduling
coordinators. (Chapter 2)
Forecasting for SCE’s AC Cycling Program: SCE should improve forecasting methods
for its residential AC Cycling Program with input from agencies and stakeholders.
SCE should also pilot more than one forecasting method for the program in 2013.
(Chapter 2)
Forecasting for SDG&E Programs: SDG&E’s forecasting methods for its AC Cycling
Program (Summer Saver) could be improved by running a test
event and including a correlation variable that accounts for customer fatigue.
SDG&E’s Capacity Bidding Program forecasting could be improved by including a
weather variable. (Chapter 2)
SCE’s Outreach for Commercial AC Cycling: Through its outreach and marketing
efforts, SCE should clearly communicate the new features of its commercial AC
cycling program to avoid customer dissatisfaction and dropout. (Chapter 5)
Auto DR: Future studies are necessary to explore the load impacts of Auto DR.
(Chapter 5)
SDG&E’s Demand Bidding Program: SDG&E should work collaboratively with the US
Navy to design a program to meet the unique needs of the Navy. Key attributes to
consider are a day-ahead trigger, aggregation of 8 billable meters, and a minimum
bid requirement of 3 megawatts (MW). (Chapter 5)
Peak Time Rebate Design Changes: The Utilities’ residential PTR program should be
changed from a default program to an opt-in program, so that bill credits are paid
only to customers who opt in. (Chapter 4)
SCE’s AC Cycling Dispatch Strategy: SCE should reconsider its current strategy of
calling groups of residential AC cycling customers in sequential one-hour cycling
events. Alternatively, if SCE retains its current strategy, it should modify the
program’s incentive structure so that customers who are willing to have their AC
units cycled for an entire event (as opposed to just one hour) are compensated
more than those who can tolerate only one hour of cycling. (Chapter 4)
DR Reports: The Utilities (and Pacific Gas & Electric) should submit daily and weekly
DR reports to the CAISO and the Commission for the summers of 2013 and 2014.
They should follow the same format and data requirements as the 2012 reports,
unless otherwise directed by the Commission or Commission staff. (Chapter 8)
Chapter 1: Introduction
I. 2012 Summer Reliability and Demand Response Programs
San Onofre Nuclear Generating Station (SONGS) Units 2 and 3 were taken out of service in
January 2012. By March 2012, the Commission determined that the outage of SONGS’ two
units could extend through summer 2012. Working closely with the Governor’s Office, the
California Independent System Operator (CAISO), and the California Energy Commission (CEC),
the Commission took immediate mitigation actions to ensure that lights stay on in California
with the loss of 2,200 MW of capacity provided by SONGS.3
In addition to adding new generation resources,4 an important action was to further
incorporate the Utilities’ Demand Response (DR) programs into the CAISO’s contingency
planning and daily grid operations during the summer. This included mapping the Utilities’ DR
programs to grid contingency plans and developing new daily and weekly DR reporting
requirements. In addition, the Commission also moved swiftly to approve three new DR
programs for summer 2012: SDG&E’s Peak Time Rebate (PTR) for commercial customers and
Demand Bidding Program (DBP); and SCE’s 10 for 10 conservation program for non-residential
customers.5
Because of the intensive interagency mitigation effort and relatively cool weather,
California grid reliability was not compromised in spite of the SONGS outage. Nevertheless,
southern California experienced several heat waves in August and September, with the highest
temperature reaching 109°F in SDG&E’s service area and 100°F for SCE on September 14.6 The
CAISO issued two Flex Alerts, on August 10 and 14. The Utilities triggered all of their DR
programs at least once and some on multiple occasions.
Throughout the summer, Energy Division (ED) staff monitored the Utilities’ DR program
events on a daily basis and provided weekly briefings to the Governor’s Office, the CAISO, and
the CEC. Staff observed that, for many event days, the load impact forecasts provided by the
Utilities to the CAISO and the Commission in their daily DR reports were inconsistent with the
results submitted seven days after each event (referred to as the “7-Day report”). In some cases,
the Utilities reported much lower load reduction results than they originally forecasted. In
addition, load impact forecasts provided by the Utilities throughout the summer were lower
than the capacity counted for the 2012 Resource Adequacy (RA) Requirement. This raised a
question as to whether the Commission might have overestimated DR load impact for RA
purposes or, rather, whether the Utilities might have under-utilized their DR programs.
Sometime in mid-summer, the Utilities began to experience price spikes in CAISO’s
wholesale energy market. Questions were raised on whether the DR programs could be used
to mitigate price spikes, and if so, whether they should be.
3 http://www.songscommunity.com/value.asp
4 Retired Huntington Beach Units 3 and 4 were brought back online temporarily.
5 Resolutions E-4502 and E-4511.
6 A 1-in-10 (or 10% probability) weather condition in any given year.
Some of the Utilities’ DR programs were triggered on as many as 23 events over the five
summer months, and many were triggered on two or three consecutive days. Appendix A
highlights the DR program load impact on the three hottest days and the three days when
SDG&E and SCE experienced their highest system peak load. Staff observed that SDG&E’s
system peak correlated to temperature, and the biggest DR load reduction happened on the
hottest day. SCE’s system peak load, on the other hand, did not consistently correlate to
weather: SCE’s system load reached its annual peak at 90°F, 10°F cooler than the
hottest day in its service territory. Counter-intuitively, DR program load impact on a cooler day
was actually higher than the amount delivered on the hottest day. This led to questions about
how the Utilities make decisions to trigger DR programs and whether aspects of the customers’
experience, such as expectations and fatigue, have an effect.
In August, CAISO issued two Flex Alerts when it determined there was a reliability risk due to
insufficient supply to meet demand. As expected, the Utilities triggered relatively large
amounts of DR programs on both days. CAISO reported that the actual peak load was
significantly lower than its hours-ahead forecasts and attributed the load drop to Flex Alert
events. This parallel dispatch situation raises important questions regarding the effectiveness
of the Flex Alert when overlapped with the Utilities’ DR program events and how customers
perceive these statewide alerts versus local utility DR notifications.
Based on the above experience, the Commission concluded that staff should evaluate DR
program performance and other lessons learned in order to seek answers to these and other
questions. Such lessons could help the Commission determine the extent of DR program
reliability and usefulness and, in turn, the extent to which DR resources can be counted on in
CAISO markets and operations.
II. Energy Division November 16, 2012 Letter and the Staff Report
On November 16, 2012, the Energy Division sent a letter (Energy Division Letter) to the
Utilities directing the Utilities to 1) file an application proposing DR program improvements for
2013 and 2014 to mitigate the SONGS outage and 2) provide data and responses to a set of
questions on lessons learned from 2012 DR programs. The questions were developed based on
the Utilities’’ 2012 demand response experience and fell into six categories:
1. DR Program Performance, which includes load impact and program
operations,
2. CAISO Market, covering price spikes and market analysis,
3. Customer Experience,
4. Coordination with the CAISO and Utility Operations,
5. Emergency DR Program Dispatch Order, and
6. Flex Alert Effectiveness.
The Energy Division Letter is attached in Appendix B of this report.
On December 21, 2012, the Utilities filed separate applications for the approval of the DR
program revisions for 2013 and 2014.7 The Utilities submitted data and responses to the
questions attached to the Energy Division Letter and subsequent Assigned Administrative Law
Judge (ALJ) rulings for developing the record.8 Decision (D.) 13-04-017 approved certain DR
program improvements for 2013-2014 and directed the Commission staff to develop a report
on the lessons learned from the DR programs in 2012.
This report is based on a snapshot of data and studies available at the time (i.e., ex post load
impact data, utility responses to Energy Division data requests, etc.). Ongoing and future (e.g.,
Flex Alert load impact analysis per D.13-04-021) evaluations will shed further light on the issues
raised in this report.
One point of emphasis in this report is the extent to which the current DR programs
delivered their forecasted savings when they were triggered by the utilities. It is important to
understand that there are a range of factors that can affect whether a program delivers its
forecasted savings targets. Some of these factors can be controlled through good program
design, operation and forecasting methodologies. Other factors that can impact program
performance are exogenous, or outside the utilities’ control, such as temperature, participant
enrollment fluctuations, and behavioral or technological changes by the participants.
While this report contains certain findings and recommendations for DR programs, we
caution against sweeping conclusions or generalizations about DR programs based on this
report. The point of this report is to find ways to improve existing DR programs so that they
are more useful to grid operators, utilities, ratepayers and participants.
7 A.12-12-016 (SDG&E) and A.12-12-017 (SCE).
8 On January 18, 2013 and February 21, 2013.
Chapter 2: Demand Response Program Load Impact
I. Summary of Staff Analysis and Recommendations
SCE
Most of the program event ex post results diverge from the daily forecast by a considerable
degree. The daily forecast should be more consistent with the ex post results in order for the
day-ahead forecasting to be valid and useful for grid operators. Staff recommends that the
daily forecasting methods for all programs undergo meaningful and substantial improvements,
including more thorough and transparent documentation and vetting through relevant agencies
and stakeholders.
The Summer Discount Plan (Residential AC Cycling) program forecasting methods in
particular require vetting by a broad panel of agencies and stakeholders. Staff also
recommends that SCE pilot more than one forecasting method and conduct interim
protocol-based load impact evaluations to identify the most reliable forecasting methods
throughout the 2013 summer season.
SCE should also be required to address Summer Discount Plan program operation issues
before the 2013 summer peak season begins, if possible. Specifically, the strategy of calling
groups of customers for sequential one-hour cycling events, rather than calling all the
customers for the duration of the full event (or other potential strategies), needs to be
reconsidered before the program is further deployed. As discussed in detail later in this
chapter, this strategy resulted in load increases during the latter hours of events, thereby
reducing the overall effectiveness of the program.
SDG&E
Similar to SCE, many of SDG&E’s program event ex post results also diverge from the daily
forecast by a considerable degree. The Demand Bidding Program daily forecast was accurate
and reliable in predicting ex post results, while the Summer Saver and Capacity Bidding
Day-Ahead and Day-Of program daily forecasts did not accurately or reliably predict ex post
results. The Peak Time Rebate Residential daily forecast was not accurate in predicting ex post
results, consistently underestimating them by approximately 80%. The Critical Peak
Pricing and Base Interruptible program forecasts did not accurately or reliably predict ex post
results, but consistently under-predicted ex post load impacts. Due to a weak price signal and
inelastic customer demand, the PTR commercial program ex post results were not significant.
The CPP-E rate was discontinued as of December 31, 2012.
Staff recommends (1) including only customers that opt in to receive e-mail or text alerts in
the PTR residential daily forecast model; (2) running a test event to measure the % load impact
per customer in order to improve CPP daily forecast estimates; (3) including a correlation
variable in the Summer Saver daily forecast model to account for customer fatigue during
successive event days; and (4) including a weather variable in the CBP daily forecast model in
order to have parity with the ex post regression model.
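Recommendation (4), adding a weather variable to the day-ahead forecast model, can be sketched in a few lines. The temperatures and megawatt figures below are hypothetical, not SDG&E data, and an actual daily forecast model would use a richer regression than this single-variable fit:

```python
# Minimal sketch of recommendation (4): a day-ahead forecast model with a
# weather variable. Numbers are illustrative only, not SDG&E data.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var
    a = mean_y - b * mean_x
    return a, b

# Hypothetical history: event-day max temperature (F) vs ex post load drop (MW).
temps = [88, 92, 95, 99, 103]
mw = [4.1, 5.0, 5.8, 6.9, 8.2]

a, b = fit_linear(temps, mw)

def forecast_mw(temp_f):
    """Day-ahead load-impact forecast given the forecast temperature."""
    return a + b * temp_f

print(round(forecast_mw(100), 1))  # about 7.3 MW at a 100 F forecast
```

A positive fitted slope means the model predicts larger load impacts on hotter event days, which is the parity with the ex post regression model that staff recommends.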
II. Different DR Load Impact Estimates
DR program load impacts are forecasted or estimated at different times for different
purposes. The following table summarizes the five different DR load impact estimates that are
discussed in this chapter. Detailed descriptions and methodologies for each DR program
measurement are provided in Appendix C.
Table 1: DR Load Impact Estimates

Ex Ante for RA (e.g., 2012 RA)
  General Description: A year-ahead monthly ex ante load impact potential attributed to individual programs under a 1-in-2 weather condition.
  Purpose: To determine the RA counting against the Load-Serving Entity’s system and local capacity requirements.

Daily Forecast
  General Description: The Utilities’ daily estimate of hourly load impact from DR programs during an event period.
  Purpose: To provide the CAISO, CPUC, and CEC the hourly MW provided by DR programs on each event day.

7-Day Report
  General Description: The Utilities’ preliminary estimate of hourly load reduction results from each triggered DR program.
  Purpose: To report to the CAISO the load reduction data from the triggered DR programs seven days after each DR event.

Ex Post Results
  General Description: The Utilities’ most accurate measurement of the load impact results from all of the DR programs triggered in a year. The ex post results are calculated using comprehensive regression models.
  Purpose: To report to the CPUC the actual results of the DR events.

Settlement
  General Description: A measurement of customers’ load reduction from their specific reference load using a baseline method.
  Purpose: To calculate customers’ incentive payments for billing purposes.
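The Settlement estimate in Table 1 rests on a baseline method. Baseline formulas vary by program and tariff; the sketch below is a generic average-of-prior-days illustration with made-up numbers, not the Utilities' actual settlement formula:

```python
# Generic sketch of a settlement baseline: average the customer's load over
# recent non-event days at the same hour, then credit the drop below that
# baseline during the event. Real tariffs differ (day counts, adjustments).

def baseline_kw(prior_day_loads_kw):
    """Reference load: average of the same hour on prior non-event days."""
    return sum(prior_day_loads_kw) / len(prior_day_loads_kw)

def settled_reduction_kw(prior_day_loads_kw, event_hour_load_kw):
    """Load reduction credited for settlement (floored at zero)."""
    return max(0.0, baseline_kw(prior_day_loads_kw) - event_hour_load_kw)

# Hypothetical customer: 4-6 pm load on five prior non-event days, then
# 3.1 kW during the event hour.
prior = [5.0, 5.2, 4.8, 5.1, 4.9]
print(round(settled_reduction_kw(prior, 3.1), 1))  # baseline 5.0 kW -> 1.9 kW
```

Because the reference load is an estimate rather than a measurement, a customer whose baseline overstates their typical usage can be credited for a "reduction" they did not make, which is one mechanism behind the free-ridership findings in Chapter 4.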
In this proceeding, the Utilities provided the above DR load impact estimates for their DR
programs, which are shown in Appendices D to G.
III. Comparison of DR Daily Forecast and Ex Post Results
A. Overall Program Performance
The following section draws on data provided by the Utilities on March 4, 2013,9 in response
to the February 21, 2013 ALJ ruling, which compares event-day forecasts (daily forecast or
day-ahead forecast) to the event-day ex post load reduction estimates. Detailed data and
methodological descriptions relevant to this chapter are provided in Appendices C and G.
Subsequent to its March 4 filing, SCE updated its ex post results for some of the DR program
events in its April 2 Load Impact Report but did not update its March 4 filing accordingly.
However, in most cases, the April 2, 2013 updated ex post results are even lower than the
March 4 preliminary data (e.g., the AC cycling results). Therefore, if the updated data were
used, they would further support staff’s findings.
9 SCE-03 and SGE-03.
On average, the ex post results for all program events diverge from the daily forecast by a
considerable degree. While some program events were forecasted more accurately and
consistently than others, Energy Division staff’s overall conclusion is that the daily forecasting
methods for all programs require meaningful and immediate improvements so that
day-ahead forecasting can become an effective and reliable tool for grid operators.
Some of the divergence between the ex post results and the daily forecast estimates can
possibly be explained by inadequate program design and program operations. This section
focuses on the observed differences between the ex post and the daily forecast with an eye
towards identifying improvements for day-ahead forecasting, and thus does not cover all
potential program improvements. Furthermore, many program design and operational
improvements that could lead to better ex post results may not be evident by simply inspecting
the daily forecast and ex post data.
The ex post analysis methods are guided by Commission-adopted load impact protocols10
and the study results are carefully documented in reports prepared by independent consultants
managed by SCE staff. However, there are currently no comparable standards and processes
guiding the methods for daily forecasting. Indeed, during the course of preparing this report,
Energy Division staff became aware that the day-ahead forecasting methods are far from
transparent, and in some cases lack the robust analysis that is expected of the Utilities. These
problems may be somewhat understandable, however, since the daily reports were only
formally instituted in 2012.
While this report is highly critical of the implementation of the day-ahead forecasting, it is
important to recognize that the 2012 DR events as a whole did indeed reduce participants’
loads, and some of the program load reductions were consistent with or better than the
day-ahead forecasts. To that end, staff has categorized the demand response programs into
three categories (good, mixed, and poor performance) based on how well the program events
performed relative to the day-ahead forecasts.
SCE
Programs that performed well yielded load impacts that were consistent with or better than
the day-ahead forecast. The Base Interruptible Program (BIP) and the Day-Of Capacity Bidding
Program events produced load reductions that were on par with the forecasts. It is worth
noting, however, that BIP, the single largest program, was triggered on only one occasion in
2012, and that was a test event.
Program events with mixed performance were not consistently in line with the day-ahead
forecast, but sometimes exceeded it. Staff includes the Day-Ahead Capacity Bidding,
Demand Bidding, and Residential Summer Discount Plan program events in this category
because these program events did occasionally exceed the day-ahead forecasts by a
significant margin. These programs are discussed in greater detail elsewhere in this section and
report. While considered mid-performing programs, they have many important issues that
deserve attention.
10 Decision 08-04-050
Program events that were consistently below the forecast are considered poor-performing
programs. All of the Critical Peak Pricing, Peak Time Rebate, Demand Response
Contracts, Commercial Summer Discount Plan, and Agricultural Pumping Interruptible program
events triggered during 2012 produced load reductions that were lower than forecasted.
Table 2: SCE's DR Overall Performance

Programs                                   No. of DR   Daily      Ex Post   Difference         %
                                           Events      Forecast
Good Performance:
  Capacity Bidding Program - Day Of           14          12         16      > +2              > +17%
  Base Interruptible Program                   1         514        573       +59               +12%
Mixed Performance:
  Capacity Bidding Program - Day Ahead        12        0.08       0.03     -0.29 to +0.08     -315% to +86%
  Demand Bidding Program                       8          84         76      -33 to +16        -40% to +21%
  Summer Discount Plan (AC Cycling) Res.      23         280        184     -603 to +92       -100% to +58%
Poor Performance:
  Critical Peak Pricing                       12          50         37      < -5              < -11%
  Peak Time Rebate                             7         108         20      < -11             < -11%
  Demand Response Contracts                    3         230        148      < -70             < -34%
  Summer Discount Plan (AC Cycling) Com.       2           5          3       -2               -35%
  Agricultural Pumping Interruptible           2          48         21      < -19             < -52%
(Daily Forecast and Ex Post are MW averaged over all events; Difference and % range from low to high; negative values indicate ex post results below forecast.)
SDG&E
Utilizing the same criteria for evaluating SCE DR programs, The Base Interruptible Program
and the Critical Peak Pricing Program were categorized as good performers, the Capacity
Bidding Day Ahead, Capacity Bidding Day Of, Demand Bidding, and Summer Saver (AC Cycling)
were categorized as mixed performers, and the Critical Peak Pricing Emergency and residential
Peak Time Rebate programs were categorized as poor performers. As stated above, DR
program design and operation characteristics also need to be taken into account for a complete
evaluation of DR program performance.
Table 3: SDG&E's DR Overall Performance

Programs                                  No. of    Daily      Ex Post   Difference          %
                                          Events    Forecast
Good Performance:
  Base Interruptible Program                 1        0.3        0.8      +0.5               +167%
  Critical Peak Pricing                      7         15         18      > +2.4             > +3.1%
Mixed Performance:
  Capacity Bidding Program - Day Ahead       7          8          6      -4.9 to +0.1       -32% to +12.2%
  Capacity Bidding Program - Day Of          5         12         10      -3.2 to +0.7       -27.4% to +6.0%
  Demand Bidding Program                     3          5          5      -0.4 to +0.1       -8.0% to +8.0%
  Summer Saver (AC Cycling)                  8         20         17     -12.3 to +3.5      -64.0% to +38.7%
Poor Performance:
  Peak Time Rebate - Residential             7         19          4      < -24              < -73.6%
  Critical Peak Pricing - Emergency          2          2          1      < -0.7             < -53.3%
(Daily Forecast and Ex Post are MW averaged over all events; Difference and % range from low to high; negative values indicate ex post results below forecast.)

B. Program Performance During Critical Event Days

The critical event days of August 10th, 13th, 14th, and September 14th were selected as a
focus because they occurred on Flex Alert days, the service area system peak day, or the
hottest days of the year. These are all conditions under which demand response resources are
most critical.

August 10, 2012

SCE

Two SCE programs were called on August 10th, a Flex Alert day: the Demand Bidding
Program and the Save Power Day program (also known as the Peak Time Rebate program).
The load reductions achieved during the Demand Bidding Program event surpassed the
forecast by 12%, while the Save Power Day event fell below the forecast by 11%.

Table 4: SCE's August 10, 2012 Demand Response Events

Program Name              Daily Forecast   Ex Post   Difference     % Difference
                          MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Demand Bidding Program        85.59         95.82      +10.23         +11.95%
Save Power Day               107.24 (11)    95.85      -11.39         -10.62%
Total                        192.83        191.67       -1.16

11 SCE did not provide a daily forecast for this event, so the comparison for this event is done with the 7-day report rather than the daily forecast.
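The Difference and % Difference columns in the event tables throughout this chapter follow one convention (C = B - A, D = C / A). A minimal sketch of that arithmetic, using the Demand Bidding figures from Table 4:

```python
def dr_event_performance(daily_forecast_mw, ex_post_mw):
    """Difference and percent difference as used in the event tables:
    A = daily forecast, B = ex post, C = B - A, D = C / A."""
    diff_mw = ex_post_mw - daily_forecast_mw
    pct_diff = diff_mw / daily_forecast_mw
    return diff_mw, pct_diff

# Demand Bidding Program on Aug. 10 (Table 4): 85.59 MW forecast, 95.82 MW ex post.
diff, pct = dr_event_performance(85.59, 95.82)  # diff is about +10.23 MW, pct about +12%
```

A positive result means the event out-performed its forecast; a negative result means it fell short.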
SDG&E

Three DR programs were called on August 10th. The Capacity Bidding Day-Ahead program
load reduction exceeded the forecast by 1%. Conversely, the Summer Saver and residential
Peak Time Rebate events fell short of their forecasts by 32% and 75%, respectively.
Table 5: SDG&E's August 10, 2012 Demand Response Events

Program Name                    Daily Forecast   Ex Post   Difference     % Difference
                                MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Capacity Bidding - Day Ahead         7.50          7.60       +0.10          +1.33%
Summer Saver (AC Cycling)           27.20         18.50       -8.70         -32.00%
Residential Peak Time Rebate        12.60          3.20       -9.40         -74.60%
Total                               47.30         29.30      -18.00
August 13, 2012

SCE

August 13, 2012 was the system peak day for the SCE service area, with a peak load of
22,428 MW. As shown in Table 6 below, the Critical Peak Pricing program, a dynamic pricing
program for commercial and industrial customers with demand over 200 kW, and the Day-Of
Capacity Bidding Program were triggered on this day. Again, the Capacity Bidding Program
exceeded the forecast by a few MW. The Critical Peak Pricing program event had satisfactory
performance, falling short of the forecast by 15%.
Table 6: SCE's August 13, 2012 Demand Response Events

Program Name                        Daily Forecast   Ex Post   Difference     % Difference
                                    MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Critical Peak Pricing                   50.54         42.96       -7.58         -15.00%
Capacity Bidding Program (Day Of)       12.30         15.70       +3.40         +27.60%
Total                                   62.84         58.66       -4.18
SDG&E

All three DR programs triggered on August 13th, Capacity Bidding Day-Of, Summer
Saver (AC Cycling), and Critical Peak Pricing, had ex post load impacts that fell below the
daily forecast by 27%, 45%, and 48%, respectively.
Table 7: SDG&E's August 13, 2012 Demand Response Events

Program Name                      Daily Forecast   Ex Post   Difference     % Difference
                                  MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Capacity Bidding - Day Of             11.70          8.50       -3.20         -27.33%
Summer Saver (AC Cycling)             33.30         21.40      -11.90         -45.35%
Critical Peak Pricing Emergency        2.30          1.20       -1.10         -47.83%
Total                                 47.30         31.10      -16.20
August 14, 2012

SCE

August 14, 2012 was another Flex Alert day, during which seven events were called across a
variety of DR programs. As shown in Table 8 below, all the events combined were forecasted to
reduce loads by 570 MW. However, the ex post load impact evaluations found that the actual
load reductions fell short of the total forecast by 155 MW. Sixty percent of that shortfall is
attributed to the Demand Response Contracts program. The Agricultural Pumping Interruptible
program event was short of the event forecast by 52%. Only the Capacity Bidding Program
exceeded its forecasted load reduction, but this amounted to only 4% of the Demand Response
Contracts program forecast and thus was insufficient to cover the overall event-day shortfall. It
is worth noting that the Demand Response Contracts and Capacity Bidding Programs are both
commercial aggregator programs; the reason for the difference in performance between them
requires further study. It should also be noted that SCE's Demand Response Contracts expired
on December 31, 2012 and have since been replaced by new contracts that expire at the end of
2014.12
Table 8: SCE's August 14, 2012 Demand Response Events

Program Name                              Daily Forecast   Ex Post   Difference     % Difference
                                          MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Demand Response Contracts                    275.00         182.05      -92.95         -33.80%
Demand Bidding Program                        94.09          61.76      -32.33         -34.36%
Agriculture Pumping Interruptible             36.00          17.29      -18.72         -51.99%
Summer Discount Plan (Res) Group 1           130.40         119.40      -11.00          -8.44%
Capacity Bidding Program (Day Of)             12.30          17.82       +5.52         +44.86%
Summer Discount Plan (Res) Reliability        17.42          13.50       -3.92         -22.49%
Summer Discount Plan (Com)                     4.77           3.10       -1.67         -35.04%
Total                                        569.98         414.91     -155.07
12 D.13-01-024, http://docs.cpuc.ca.gov/PublishedDocs/Published/G000/M046/K233/46233814.PDF
SDG&E

Four DR programs (Demand Bidding, Critical Peak Pricing, Capacity Bidding Day-Ahead,
and residential Peak Time Rebate) were called on August 14th. While the Demand Bidding and
Capacity Bidding Program ex post load impacts closely matched the daily forecast, the Critical
Peak Pricing and residential Peak Time Rebate impacts did not. Since the Critical Peak Pricing
and residential Peak Time Rebate programs are large-scale residential programs, it is possible
that the difference between the forecast and ex post load impacts reflects widely varying
customer behavior during DR events.
Table 9: SDG&E's August 14, 2012 Demand Response Events

Program Name                           Daily Forecast   Ex Post   Difference     % Difference
                                       MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Demand Bidding Program                      5.00           5.10       +0.10          +2.00%
Critical Peak Pricing                      14.30          25.90      +11.60         +81.12%
Capacity Bidding Program (Day Ahead)        7.50           7.50        0.00           0.00%
Residential Peak Time Rebate               12.50           1.10      -11.40         -91.20%
Total                                      39.30          39.60       +0.30
September 14, 2012

SCE

September 14, 2012 was the hottest day of the year in both the SCE and SDG&E service
areas (see Table 10 below). Understandably, SCE triggered its Summer Discount Plan
(residential AC cycling) programs on this day. The Capacity Bidding Program was also
triggered, with performance comparable to the other Capacity Bidding Program events on the
critical days discussed above.

The September 14 residential Summer Discount Plan events consisted of three separate
customer groups sequentially triggered for one-hour events. All three one-hour events fell
considerably short of the forecasted load reductions.
Table 10: SCE's September 14, 2012 Demand Response Events

Program Name                                        Daily Forecast   Ex Post   Difference     % Difference
                                                    MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Summer Discount Plan (Residential) Groups 5 and 6      135.61          20.70     -114.91         -84.74%
Summer Discount Plan (Residential) Groups 1 and 2      110.89          37.80      -73.09         -65.91%
Capacity Bidding Program (Day Of)                       11.90          16.21       +4.31         +36.18%
Summer Discount Plan (Residential) Groups 3 and 4       99.32          17.80      -81.52         -82.08%
Total                                                  357.72          92.51     -265.22
SDG&E

On September 14, 2012, the peak temperature in SDG&E's service territory was 109
degrees. The Demand Bidding, Summer Saver, and Base Interruptible Program ex post load
impacts were above the daily forecast by between 8% and 167%. Since the absolute value of
the Base Interruptible Program load impact is roughly 1 MW, a small increase or decrease in
the daily forecast prediction can produce high variability in the percent difference between
the two figures. Conversely, the Capacity Bidding Day-Of and Day-Ahead Programs and the
Critical Peak Pricing Emergency Program ex post load impacts were below the daily forecast
by between 12% and 44%.
Table 11: SDG&E's September 14, 2012 Demand Response Events

Program Name                           Daily Forecast   Ex Post   Difference     % Difference
                                       MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Capacity Bidding Program (Day Of)           9.00           5.70       -3.30         -36.67%
Capacity Bidding Program (Day Ahead)       12.10          10.60       -1.50         -12.40%
Demand Bidding Program                      5.00           5.40       +0.40          +8.00%
Summer Saver (AC Cycling)                  15.50          22.50       +7.00         +45.16%
Base Interruptible Program                  0.30           0.80       +0.50        +166.70%
Critical Peak Pricing Emergency             1.60           0.90       -0.70         -43.75%
Total                                      43.50          45.90       +2.40

C. Detailed Program Analysis

The following section discusses the programs and events that produced the load reductions
forecasted in the daily reports, as well as the programs that failed to produce the forecasted
load reductions. For this purpose, all programs and events that came within ±10% of the
forecasted load reductions are considered consistent with the daily forecast, and all programs
and events that were more than 50% above or below the forecasted load reductions are
considered to have failed to produce the forecasted load reductions.

SCE

There were a total of 104 separate events in the SCE service area in 2012. Only ten of these
events produced load reductions consistent with those forecasted in the daily reports. As
shown in Table 12 below, all of these events produced fairly sizable load reductions, ranging
from 59 to 130 MW, with the exception of one Capacity Bidding Program event, which
produced a very small load reduction.
Table 12: SCE's DR Events with Ex Post Results within ±10% of the Daily Forecast

Program Name                           Event Date   Daily Forecast   Ex Post   Difference     % Difference
                                                    MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Summer Discount Plan (Residential)      08/14/12       130.40         119.40      -11.00          -8.44%
Summer Discount Plan (Residential)      08/29/12        82.56          80.30       -2.26          -2.74%
Summer Discount Plan (Residential)      08/01/12        58.60          57.10       -1.50          -2.56%
Summer Discount Plan (Residential)      08/15/12        77.77          77.50       -0.27          -0.35%
Demand Bidding Program                  10/17/12        79.05          79.25       +0.20          +0.26%
Demand Bidding Program                  10/01/12        78.75          79.78       +1.03          +1.31%
Summer Discount Plan (Residential)      08/09/12       118.06         121.20       +3.14          +2.66%
Summer Discount Plan (Residential)      08/28/12        83.86          88.20       +4.34          +5.18%
Capacity Bidding Program (Day Ahead)    07/31/12        0.0700         0.0740      +0.00          +5.71%
Demand Bidding Program                  08/08/12        85.59          92.95       +7.36          +8.60%
Of the 104 events in 2012, thirty (about 29%) were more than 50% off of the day-ahead
forecast. Five of these events produced load reductions greater than the forecast, while the
remaining 25 were lower than the forecast. The three events with the highest percentage
difference below the forecast were very small Day-Ahead Capacity Bidding Program events,
and thus are not considered the most critical problem. Twenty-one of the remaining events
were Summer Discount Plan (AC Cycling) events, and these varied markedly from the
forecast.
Table 13: SCE's DR Events with Ex Post Results More Than ±50% from the Daily Forecast

Program Name                           Event Date   Daily Forecast   Ex Post   Difference     % Difference
                                                    MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Capacity Bidding Program (Day Ahead)    10/01/12         0.09          -0.20       -0.29        -315.22%
Capacity Bidding Program (Day Ahead)    10/02/12         0.09          -0.10       -0.20        -213.04%
Capacity Bidding Program (Day Ahead)    10/05/12         0.09          -0.07       -0.16        -170.65%
Save Power Days / Peak Time Rebates     09/07/12       108.66         -23.11     -131.77        -121.27%
Summer Discount Plan (Residential)      06/20/12       128.01           0.50     -127.51         -99.61%
Save Power Days / Peak Time Rebates     09/10/12       108.52           1.65     -106.87         -98.48%
Summer Discount Plan (Residential)      09/14/12       135.61          20.70     -114.91         -84.74%
Summer Discount Plan (Residential)      07/10/12       263.67          44.70     -218.97         -83.05%
Summer Discount Plan (Residential)      09/14/12        99.32          17.80      -81.52         -82.08%
Summer Discount Plan (Residential)      06/29/12       178.26          33.30     -144.96         -81.32%
Summer Discount Plan (Residential)      09/20/12        77.39          14.60      -62.79         -81.14%
Summer Discount Plan (Residential)      06/29/12       178.26          35.80     -142.46         -79.92%
Summer Discount Plan (Residential)      07/10/12       263.67          66.60     -197.07         -74.74%
Summer Discount Plan (Residential)      10/02/12       298.91          86.20     -212.71         -71.16%
Summer Discount Plan (Residential)      07/10/12       263.67          76.70     -186.97         -70.91%
Summer Discount Plan (Residential)      09/20/12        65.53          21.10      -44.43         -67.80%
Summer Discount Plan (Residential)      09/20/12        65.73          21.90      -43.83         -66.68%
Summer Discount Plan (Residential)      09/14/12       110.89          37.80      -73.09         -65.91%
Summer Discount Plan (Residential)      08/22/12       115.03          42.40      -72.63         -63.14%
Agriculture Pumping Interruptible       09/26/12        60.56          24.00      -36.56         -60.36%
Summer Discount Plan (Residential)      09/21/12       168.96          69.10      -99.86         -59.10%
Summer Discount Plan (Residential)      09/28/12        55.06          24.50      -30.56         -55.50%
Agriculture Pumping Interruptible       08/14/12        36.00          17.29      -18.72         -51.99%
Summer Discount Plan (Residential)      10/17/12       127.25          62.30      -64.95         -51.04%
Summer Discount Plan (Residential)      10/17/12       146.77          72.30      -74.47         -50.74%
Summer Discount Plan (Residential)      08/17/12       101.30         153.00      +51.70         +51.04%
Capacity Bidding Program (Day Ahead)    10/29/12         0.09           0.15       +0.06         +59.78%
Summer Discount Plan (Residential)      08/17/12        58.00          98.30      +40.30         +69.48%
Capacity Bidding Program (Day Ahead)    10/18/12         0.09           0.17       +0.08         +85.87%
Summer Discount Plan (Residential)      09/10/12        18.98          68.40      +49.42        +260.42%
Summer Discount Plan

The Summer Discount Plan event variability ranges from 121% below the forecast (with a
load increase rather than a load reduction) to 260% above the forecast. Overall, the AC Cycling
program exhibits the most variance13 of all the SCE DR programs. When the variances
for individual events are aggregated, the AC Cycling program represents 49% of the total
variance. The Pearson product-moment correlation between the daily forecast and the ex post
load impacts is 0.21, representing a very weak positive correlation.

13 Variance in this context refers specifically to the absolute difference between the daily forecast and the event-day ex post load reductions.
The Pearson correlation between the average event temperature14 and the event-level
variance (the difference between the daily forecast and the event-day ex post load reductions)
is 0.37, representing a moderately weak correlation. In practical terms, this means that SCE's
2012 Summer Discount Plan forecast method cannot be relied upon to effectively predict the
actual program load reductions. In addition, there appears to be little relationship between the
event-day temperature and the difference between the daily forecast and the event-day ex
post load reductions, potentially ruling out temperature as an explanatory factor for the
difference.
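The correlation statistics cited above are straightforward to reproduce once forecast and ex post pairs are tabulated. The sketch below uses hypothetical MW pairs, not SCE's actual event data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length series."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical daily-forecast vs. ex post load reductions (MW) for six events.
forecast_mw = [130.4, 82.6, 58.6, 135.6, 99.3, 110.9]
ex_post_mw = [119.4, 80.3, 57.1, 20.7, 17.8, 37.8]
r = pearson_r(forecast_mw, ex_post_mw)  # a value near 0 indicates a weak relationship
```

A correlation near zero between forecast and ex post values, as staff found for this program, means the forecast carries little information about the realized load reductions.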
The Summer Discount Plan was by far the most frequently triggered program in SCE's 2012 DR
portfolio. There were 23 separate events, including two early test events.15 Most of the 23
events were split into three customer segments such that each group of customers was triggered
for only a portion (i.e., one hour) of each event, which typically lasted three hours. Three
events, on 9/14, 9/20, and 9/28, deployed six customer segments. SCE operated the program in
this manner to avoid cycling its customers' air conditioners for more than one hour at a time.16
The purpose of this strategy is to minimize the impact on customers of losing one hour of AC
service, compared to multiple continuous hours, while in theory the utility would still be
able to reduce load when needed.
As shown in Table 14 below, however, the implementation of this strategy resulted in a
rebound effect: the groups curtailed in event hours 1 and 2 added load in hours 2 and 3 as
their AC units ran at above-normal capacity to return the participants' buildings to the original
temperature set points.17 The net effect was to dampen the average hourly load impact for the
entire event period, as illustrated in Table 14. It is possible that the daily forecasts were
prepared assuming that all customers would be curtailed at the same time over the entire
duration of the event. In that case, the average hourly load reductions would likely have
been larger, because all customers would be simultaneously curtailed and the rebound effect
would be delayed until after the event was over. This issue is further illustrated in Chapter 2,
Section IV, "Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)".
Table 14: SCE's Hourly Load Impact (MW) from a Sept. 14 Summer Discount Plan Event

                                 Event Hours (w/ Rebound)     Post-Event Rebound
Hour Ending:                      16       17       18         19       20
Group curtailed in event hour 1  39.6    -25.1    -17.0         --       --
Group curtailed in event hour 2    --    +27.1    -27.0      -39.6       --
Group curtailed in event hour 3    --       --    +21.3      -49.6    -37.8
Hour Total                       39.6     +2.0    -22.7      -89.2    -37.8
Event Hour Average (hours ending 16-18): 6.3
(Negative values indicate added load from the rebound effect.)
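The dampening effect described above can be verified with simple arithmetic. The sketch below reproduces the per-hour netting for a staggered one-hour cycling event using the Table 14 impacts with explicit signs (negative values are rebound load increases); the group labels are illustrative:

```python
# Per-group hourly load impacts (MW) for a staggered one-hour AC-cycling event.
# Positive = load reduction; negative = rebound (added load). Values follow Table 14.
group_impacts = {
    "group_curtailed_hour_16": {16: 39.6, 17: -25.1, 18: -17.0},
    "group_curtailed_hour_17": {17: 27.1, 18: -27.0, 19: -39.6},
    "group_curtailed_hour_18": {18: 21.3, 19: -49.6, 20: -37.8},
}

# Net system impact per hour: each group's reduction is partly cancelled by the
# rebound of the group(s) curtailed earlier.
hour_totals = {
    h: sum(g.get(h, 0.0) for g in group_impacts.values()) for h in range(16, 21)
}

event_hours = [16, 17, 18]  # hours 19-20 fall after the event window
event_avg = sum(hour_totals[h] for h in event_hours) / len(event_hours)
# event_avg is small (about 6.3 MW) even though each group alone delivered
# 20-40 MW in its curtailment hour: the within-event rebound dampens the average.
```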
14 SCE Final 2012 Ex Post Ex Ante Load Impacts for SCE's SDP, filed in R.07-01-041 on April 2, 2013.
15 The last two events in late October were not included in the ex post analysis.
16 SCE-01 Testimony at 11.
17 SCE Final 2012 Ex Post Ex Ante Load Impacts for SCE's SDP, filed in R.07-01-041 on April 2, 2013.
Another potential explanation for the suboptimal performance could be customers
exercising the override option in their enrollment contracts with SCE. However, SCE's A.12-12-016
testimony18 indicates that the proportion of customers with an override option is fairly
small (about 1% of the customers enrolled in SDP) and that these customers rarely exercise it.
Finally, it is possible that transitioning the Summer Discount Plan from an emergency program
to a price-responsive program introduced additional uncertainties that are not adequately
captured by the current forecasting methods. Regardless of the explanation for the
unexpectedly low load reductions during these events, it is critical that SCE improve the
day-ahead forecast for the SDP program as a whole.
Energy Division staff reviewed SCE's method for forecasting the Summer Discount Plan
program.19 The methodology, provided in Appendix C, is described in a 1986 internal SCE
memorandum and consists of a simple algorithm that estimates the load reduction per ton of
AC capacity based on the forecasted temperature. The equation coefficients were determined
by a 1985 load reduction study that SCE staff could not locate when requested to do so by
Energy Division staff. Without the 1985 study, Energy Division staff could not fully evaluate the
forecasting methodology. SCE did provide a revised algorithm that modifies the equation
structure, but the underlying methods for estimating its coefficients remain unexplained.
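Because the memorandum's coefficients could not be located, only the general shape of such a model can be sketched here. The functional form and every coefficient below are illustrative placeholders, not SCE's actual values:

```python
def forecast_sdp_load_reduction(forecast_temp_f, enrolled_ac_tons,
                                base_kw_per_ton=0.35,
                                slope_kw_per_ton_per_degf=0.02,
                                ref_temp_f=85.0):
    """Illustrative day-ahead AC-cycling forecast: load reduction per ton of
    enrolled AC capacity rises linearly with forecast temperature above a
    reference point. All coefficients are hypothetical placeholders, not the
    values from SCE's 1986 memorandum.  Returns MW."""
    excess_deg = max(0.0, forecast_temp_f - ref_temp_f)
    kw_per_ton = base_kw_per_ton + slope_kw_per_ton_per_degf * excess_deg
    return enrolled_ac_tons * kw_per_ton / 1000.0

# E.g., 400,000 hypothetical enrolled tons on a 95 F forecast day.
mw = forecast_sdp_load_reduction(95.0, 400_000)
```

The structural weakness staff identified is not the linear form itself but that the coefficients (here, the kW-per-ton terms) rest on a 1985 study that can no longer be examined.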
This evidence suggests a critical flaw in the way the Summer Discount Plan events are
forecasted, in the operation of the program, or both. The lack of a reliable day-ahead
forecasting method is a major weakness that undermines the ability to fully consider
AC cycling in CAISO grid operations. Even though the utilities' DR resources are not currently
bid into the CAISO market, ED recommends that SCE immediately document the forecasting
methods to be used for the 2013 season and thoroughly vet those methods with CPUC and
CAISO staff and relevant stakeholders to ensure they are reasonable and reliable. Throughout
the 2013 summer season (and longer if necessary), SCE should consider piloting more than one
forecasting method, tested against small ex post load impact evaluations to identify the most
reliable approach.
Base Interruptible Program

The Base Interruptible Program was triggered only once during the entire 2012 season, and
that was a test event. This single event produced 573 MW of load reductions on September 26,
59 MW more than the day-ahead forecast. It is worth noting that this single Base Interruptible
event produced more than three times the load reduction of any other SCE program event
during 2012, yet it was not triggered on any of the critical event days discussed earlier in this
section.

The Commission should explore a policy requiring more frequent deployments of this
program, since it appears to have significant yet underutilized potential.
18 SCE-01 Testimony at 11, Lines 3-5.
19 See Appendix S.
Capacity Bidding Program

The Capacity Bidding Program Day-Ahead events produced an average load reduction of
0.03 MW across all events. With the exception of three events in October (which were
associated with negative load reductions in the ex post analysis), most events produced the
relatively small load reductions forecasted by the daily report. None of the Capacity Bidding
Program Day-Ahead events occurred in August or September, when load reductions are
typically most needed.
By comparison, all of SCE's Capacity Bidding Program Day-Of events exceeded the
forecasted load reductions, by an average of 32%. The average load reduction for the
Day-Of events was 15.9 MW, over 500 times the load reduction produced by the Day-Ahead
events.

This evidence suggests that, unlike the Day-Of program, the Day-Ahead Capacity Bidding
Program may not be serving a useful function in SCE's DR portfolio.
Demand Bidding Program

The Demand Bidding Program was called on eight occasions during the summer of 2012,
five of them in August. The first two August events, on August 8 and August 10, produced load
reductions that exceeded the daily forecast by an average of 10%. The third and fourth events,
on August 14 and August 16, were 34% short of the forecasted load reductions, and the fifth
event, on August 29, was 40% below forecast, suggesting that a decline in customer
participation over successive events should be explored as a potential factor in the diminishing
returns.
Demand Response Contracts (DRC) - Nominated

Somewhat surprisingly, Demand Response Contracts were called for only two events. The
ex post load reductions for both events were roughly 35% below the daily forecast. Energy
Division was not able to examine why this program performed so poorly. As noted earlier,
SCE's DRCs expired on December 31, 2012, and have since been replaced by new contracts
approved by the Commission.
Save Power Days / Peak Time Rebates (PTR) - Price Responsive

SCE did not provide daily forecasts for the four PTR events that occurred in August, so
comparisons between the daily forecast and ex post results are possible only for the two
events on September 7 and September 10. Both September events were forecasted to reduce
loads by 109 MW. Ex post results, however, indicate that the PTR events had no impact at all;
in fact, the September 7 event was correlated with a fairly significant load increase of
23.11 MW.

Ex post load reductions were also estimated for the four August PTR events, for which SCE
did not provide day-ahead estimates; the 7-day reports were used as a proxy for the daily
forecast. As shown in Table 15 below, the estimated load reductions were between 107 and
109 MW, while the ex post load reductions ranged between 0.02 and 96 MW.
Table 15: SCE's Peak Time Rebate MW

Event Day     7-Day Report   Ex Post
8/10/2012      107.24 MW     95.85 MW
8/16/2012      107.61 MW     24.43 MW
8/29/2012      108.51 MW     21.93 MW
8/31/2012      108.73 MW      0.02 MW
Given the considerable variability in ex post results for the PTR program events, the
day-ahead forecasting and event reporting will need significant revision to account for these
discrepancies. If the PTR program is to continue, staff recommends that SCE prepare a
proposal for a viable forecast and submit it for staff review.
SDG&E

There were a total of 46 DR program events triggered on 14 event days in SDG&E's service
area from June 2012 through October 2012. Daily forecasts for twelve DR program events were
within ±10% of ex post load impacts. As depicted in Table 16, these events produced moderate
load reductions ranging from 5 to 17 MW. Three programs delivered accurate results with a
moderate degree of consistency: the Demand Bidding Program, Critical Peak Pricing, and the
Capacity Bidding Program Day-Of.
Table 16: SDG&E's DR Events with Ex Post Results within ±10% of the Daily Forecast

Program Name                           Event Date   Daily Forecast   Ex Post   Difference     % Difference
                                                    MW               MW        MW
Demand Bidding Program                  10/2/2012        5.0            4.6       -0.4           -8.00%
Capacity Bidding Program (Day Of)       8/8/2012        11.7           11.0       -0.7           -5.98%
Capacity Bidding Program (Day Ahead)    8/9/2012         7.5            7.5        0.0            0.00%
Capacity Bidding Program (Day Ahead)    8/14/2012        7.5            7.5        0.0            0.00%
Capacity Bidding Program (Day Ahead)    8/10/2012        7.5            7.6       +0.1           +1.33%
Demand Bidding Program                  8/14/2012        5.0            5.1       +0.1           +2.00%
Summer Saver (AC Cycling)               9/15/2012        8.6            8.8       +0.2           +2.33%
Critical Peak Pricing                   10/2/2012       16.0           16.5       +0.5           +3.13%
Critical Peak Pricing                   8/21/2012       16.5           17.2       +0.7           +4.24%
Critical Peak Pricing                   9/15/2012       13.7           14.5       +0.8           +5.84%
Demand Bidding Program                  9/14/2012        5.0            5.4       +0.4           +8.00%
Critical Peak Pricing                   8/30/2012       16.2           17.8       +1.6           +9.88%
A total of 19 DR program events had ex post load impacts that deviated from the daily
forecasts by more than 50%, as depicted in Table 17. In particular, the residential and
commercial Peak Time Rebate program ex post load impacts deviated from the daily forecasts
by more than 70%. According to SDG&E, the commercial Peak Time Rebate ex post load
impacts were not statistically significant; on this basis, SDG&E reported zero load impacts for
this program.
Table 17: SDG&E's DR Events with Ex Post Results More Than ±50% from the Daily Forecast

Program Name                           Event Date   Daily Forecast   Ex Post   Difference     % Difference
                                                    MW (A)           MW (B)    MW (C=B-A)     (D=C/A)
Commercial Peak Time Rebate             8/9/2012         1.2            0.0       -1.2         -100.00%
Commercial Peak Time Rebate             8/10/2012        1.1            0.0       -1.1         -100.00%
Commercial Peak Time Rebate             8/11/2012        0.8            0.0       -0.8         -100.00%
Commercial Peak Time Rebate             8/14/2012        1.2            0.0       -1.2         -100.00%
Commercial Peak Time Rebate             8/21/2012        1.2            0.0       -1.2         -100.00%
Commercial Peak Time Rebate             9/15/2012        0.9            0.0       -0.9         -100.00%
Residential Peak Time Rebate            8/14/2012       12.5            1.1      -11.4          -91.20%
Residential Peak Time Rebate            8/21/2012       25.0            3.0      -22.0          -88.00%
Residential Peak Time Rebate            8/11/2012       12.2            1.7      -10.5          -86.07%
Residential Peak Time Rebate            8/9/2012        13.1            3.3       -9.8          -74.81%
Residential Peak Time Rebate            8/10/2012       12.6            3.2       -9.4          -74.60%
Residential Peak Time Rebate            9/15/2012       32.3            8.3      -24.0          -74.30%
Residential Peak Time Rebate            7/20/2012       23.9            6.3      -17.6          -73.64%
Capacity Bidding Program (Day Ahead)    10/1/2012        9.0            4.1       -4.9          -54.44%
Capacity Bidding Program (Day Ahead)    10/2/2012        9.0            4.2       -4.8          -53.33%
Summer Saver (AC Cycling)               9/14/2012       15.5           22.5       +7.0          +45.16%
Critical Peak Pricing                   8/11/2012       11.7           18.4       +6.7          +57.26%
Critical Peak Pricing                   8/14/2012       14.3           25.9      +11.6          +81.12%
Base Interruptible Program              9/14/2012        0.3            0.8       +0.5         +166.67%
Capacity Bidding Program Day Ahead (CBP-DA)

The percent difference between the CBP-DA daily forecasts and ex post results ranged
from -32% to +12% (Table 3). Based upon this assessment, the daily forecasts for CBP-DA
were neither accurate nor consistent predictors of ex post results.

Since the CBP-DA daily forecast model does not include a variable that accounts for weather,
while the ex post models do, this methodological difference could account for the variability
between the two load impact measures. Another factor that could affect this difference is the
percent load impact per customer: although customers submit load impact bids prior to each
DR event, the actual load reduction on the event day may not coincide with the projected
reduction.
If weather affects event-day load reductions by CBP customers, the addition of a weather
variable to the daily forecast model could increase its accuracy. To address uncertainty in the
percent load reduction per CBP customer, DR test events could be scheduled to measure this
value on event-like days.
Capacity Bidding Program Day Of (CBP-DO)

Similar to the CBP-DA program, the CBP-DO daily forecasts were neither accurate nor
consistent predictors of ex post results, based upon the range of the difference between the
two load impact measures, -27.4% to +6.0% (Table 3). As stated above, inclusion of a weather
variable in the daily forecast model and measurement of the percent load reduction per
customer during test events could increase the accuracy and consistency of the daily forecast
model in predicting ex post load impacts.
Demand Bidding Program (DBP)

The percent difference between the DBP daily forecasts and ex post load impacts ranged
from -8.0% to +8.0% (Table 3) for the three DBP events called during the summer. Based upon
this result, the DBP daily forecast accurately and consistently predicted ex post load impacts.

One caveat for any general assessment of the DBP forecast model is that only one customer
provided load reduction bids for the summer DR events. To make such an assessment, staff
advises examining forecast and load impact data from at least 5-10 event days.
Commercial Peak Time Rebate

SDG&E reported zero ex post load impacts for this program in its March 4th filing. According
to SDG&E, the zero values do not imply that no load reduction occurred, but rather that the
load impacts were not statistically significant.20 Therefore, a comparison of daily forecasts and
ex post load impacts could not be performed.

Based upon conversations with SDG&E, the lack of effectiveness of the commercial Peak
Time Rebate program could be attributed to a weak price signal and inelastic customer demand
during event periods. SDG&E would be advised to discontinue the commercial Peak Time
Rebate program.
Residential Peak Time Rebate
The percent difference between daily forecast and ex post load impacts ranged from -91.2%
to -73.6% (Table 3). This implies that the residential Peak Time Rebate program daily forecast is
not an accurate predictor of ex post load impact. However, the residential Peak Time Rebate
program daily forecast consistently over-predicted the ex post results.
Since the ex post methodology modeled load impacts only for customers who signed up to
receive e-mail or text alerts, and the daily forecast model does not, it is possible that the
accuracy of the daily forecast model could improve if there were parity between the two
methodologies. Including only residential Peak Time Rebate opt-in customers in the daily
forecast model might resolve the discrepancy. Alternatively, since the daily forecast
consistently over-predicted the ex post results, SDG&E might consider derating its daily
forecasts by a factor of 0.7 to 0.9.
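As a rough illustration of the derating idea (all MW values are hypothetical), applying a factor in the suggested 0.7-0.9 range to a forecast that consistently over-predicts pulls the percent difference toward zero:

```python
# Illustration of derating a daily forecast that consistently
# over-predicts ex post load impacts.  MW values are hypothetical.

def percent_difference(ex_post, forecast):
    """(ex post - forecast) / forecast, expressed in percent."""
    return 100.0 * (ex_post - forecast) / forecast

forecast_mw = 10.0   # hypothetical daily forecast
ex_post_mw = 7.5     # hypothetical measured ex post impact
derate = 0.75        # derate factor in the suggested 0.7-0.9 range

raw_error = percent_difference(ex_post_mw, forecast_mw)
derated_error = percent_difference(ex_post_mw, forecast_mw * derate)
print(raw_error, derated_error)
```

In this contrived case the derated forecast matches the ex post value exactly; in practice the factor would be calibrated against several seasons of event data.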
Summer Saver (AC Cycling)
The range of the percent difference between daily forecast and ex post load impacts,
-64.0% to 38.7%, presented in Table 3 indicates that the daily forecast is not an accurate or
consistent predictor of ex post load impacts.
20 SCE 03 at 21.
It should be noted that both the residential and commercial Summer Saver ex post
methodologies (respectively, a randomized experiment and a panel vs. customer regression)
differed from prior years due to the availability of smart meter data.21 This could account for
the difference between daily forecast and ex post results. In addition, both ex post
methodologies utilized control and treatment groups, whereas the daily forecast
methodologies did not. Based on this assessment, it would be advisable to examine how the
daily forecast and ex post models could be harmonized.
Based upon a conversation with SDG&E, a temperature-squared variable is utilized in the
daily forecast model. Compared to SCE's current AC cycling daily forecast model, SDG&E's daily
forecast model therefore includes an additional term aimed at accuracy. However, to better
predict customer behavior on successive event days or during prolonged event hours, SDG&E
might consider including an autocorrelation variable in the daily forecast model.
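One simple way to encode the successive-event-day behavior is a lagged term that blends a static forecast with the previous event day's realized impact; the coefficient and MW values below are invented for illustration and are not SDG&E's actual model.

```python
# Hypothetical sketch of an autocorrelation (lagged) term: the forecast
# for event day t blends a static, weather-only estimate with the
# realized impact on day t-1, so fatigue on successive event days pulls
# the forecast down.  The coefficient rho and MW values are invented.

def forecast_with_lag(base, prior_impact, rho):
    """Blend the static forecast with the previous event day's impact."""
    return (1.0 - rho) * base + rho * prior_impact

base_mw = 20.0                  # static forecast, hypothetical
realized = [20.0, 16.0, 13.0]   # impacts over three consecutive event days
rho = 0.6                       # weight on the lagged impact, hypothetical

day2 = forecast_with_lag(base_mw, realized[0], rho)
day3 = forecast_with_lag(base_mw, realized[1], rho)
print(day2, day3)
```

Because day 2's realized impact fell below the static estimate, the day 3 forecast is pulled downward, capturing the fatigue pattern the static model would miss.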
Critical Peak Pricing
The percent difference between the daily forecast and ex post results ranged from 3.1% to
81.1%. This is the only program where the ex post results consistently outperformed the daily
forecast predictions.
According to SDG&E, the percent load impacts for the Critical Peak Pricing program in 2012
were lower than in 2011, which led to an underestimation in the daily forecast.22 Critical
Peak Pricing has approximately 1,000 customers and, as SDG&E claims, any variation in the
percent load reduction per customer could lead to high variation in the aggregate impact
estimates. This would also be the case for large-scale residential DR programs, including Peak
Time Rebate and Summer Saver (AC Cycling).
SDG&E also claims that measurement error might account for differences between load
impact category values. However, no explanation is provided to elucidate how the
measurement error occurred (e.g., whether, because smart meters were not fully deployed in
SDG&E's territory during summer 2012, load reductions measured from analog meters were
inaccurate).
Base Interruptible Program
The percent difference between the daily forecast and ex post load impact for the Base
Interruptible Program was 166.7%.
Since two large Base Interruptible Program customers dropped out of the program, SDG&E
was not able to accurately forecast the load impact from the remaining customers. It is
possible that further analysis with additional Base Interruptible Program load impact data might
shed light on the accuracy of the daily forecasting methods.
21 SDG&E Load Impact Filing Executive Summary, April 2, 2012, at 31.
22 SGE 03 at 19.
Critical Peak Pricing – Emergency
Due to decreasing customer subscription to this tariff, the CPP-E program was discontinued
as of December 31, 2012.23
D. Summary of Recommendations
Given the divergence between the daily forecast estimates and ex post load impact results,
staff makes the following recommendations:
- The daily forecasting methods for all programs must be improved.
- The daily forecasting methods should be better documented and should be developed
with relevant agencies and stakeholders.
- SCE should test a number of different forecasting methods for the Summer Discount
Plan program.
- SCE should change the Summer Discount Plan strategy of calling groups of customers
for sequential one-hour cycling events.
- SDG&E should include only opt-in customers in the residential PTR daily forecast
model.
- SDG&E should run a test event to improve CPP daily forecast estimates.
- SDG&E should account for customer behavior during successive event days in the
Summer Saver daily forecast model.
- SDG&E should include a weather variable in the CBP forecast model.
IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)
A. Summary of the Staff Analysis and Recommendations
Comparing the 2012 ex post results with the 2012 RA forecast is not an accurate method of
determining how the DR programs performed. The RA load forecast represents the maximum
capacity DR can provide under a set of conditions for resource planning needs. Ex post load
impacts reflect the demand reduction obtained during actual events in response to operational
planning needs. Resource planning and operational planning differ in terms of conditions (i.e.,
event hours, participation, and temperature) and purposes.
However, in summer 2012 the Utilities' DR programs were not utilized to their full capacity
even under extreme hot weather conditions. This raises the question of the usefulness of the
current RA forecast and whether the RA forecast should be changed to reflect a set of
conditions representing operational needs, including the utilities' day-to-day resource
availability limitations and DR dispatch strategies for optimal customer experience. A working
group consisting of the CPUC, CEC, CAISO, and the IOUs should be assembled to address the
forecast needs (i.e., resource planning, operational planning) and input assumptions (i.e.,
growth rate, dropout rate) used for forecasting RA.
23 SDG&E Load Impact Filing Executive Summary, April 2, at 61.
B. Background
The 2012 RA forecast represents the maximum capacity DR can provide under a set of
conditions for resource planning needs. The conditions entail a 1-in-2 weather year,24
portfolio-level accounting, full participation, a five-hour event window (1 p.m. to 6 p.m.), and
an enrollment forecast assumption.
The 2012 ex post load impacts reflect the demand reductions obtained during actual events
in response to operational needs. Operational needs on the event day may not require the full
capacity of DR because conditions do not warrant it. Utilities have the discretion to call a few
DR programs with shorter event hours or a smaller group of participants based on their
generation and DR resource dispatch strategies.25 This means an ex post impact may reflect
only a 1-hour event window, versus an RA forecast that assumes a 5-hour event window.
Similarly, the ex post impact may reflect only a segment of a program's participants, versus the
RA forecast that assumes the program's entire set of participants, and may reflect a lower
temperature than the RA forecast's 1-in-2 weather year condition.
C. Staff Analysis
Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate method
of assessing how well a program performs against its forecast.
The table below contains the August monthly average load impacts for the 2012 Resource
Adequacy (RA) forecast, as filed in the spring of 2011, and the ex post results that occurred in
2012. There are stark differences between what the Utilities forecasted a year ahead (RA) and
the actual results (ex post). On average for the month of August, the variability ranges from
485% (over-performance) to -95% (under-performance) for SCE and from 58% to -97% for
SDG&E. The main reason for the discrepancy is that the RA data are used to assist in resource
planning, which means the forecast is characterized as a 5-hour event in which all customers
are called for the entire period (1-6 p.m.) throughout the summer. Ex post results, by contrast,
reflect the impact of actual DR operations, which can be a 1-hour event in which some (not all)
customers are called for a short period of time. Other factors that contributed to the
discrepancy include temperature, enrollment, and dual participation.
24 Represents the monthly peak day temperature for an average year. Exhibit SGE 03, Page 14.
25 SGE 06, Page 6.
Table 18: SCE Demand Response Load Impact
2012 Resource Adequacy vs. 2012 Ex Post
August Average (MW)

Program Name | RA Forecast26 (A) | Ex Post27 (B) | Difference (C = B - A) | % Difference (D = C/A)
Demand Bidding Program | 12 | 72 | 60 | 485%
Demand Response Contracts | 105 | 182 | 77 | 74%
Base Interruptible Program28 | 548 | 573 | 25 | 5%
Capacity Bidding Program Day Of | 19 | 17 | -2 | -11%
Summer Advantage Incentive/Critical Peak Pricing | 69 | 39 | -30 | -44%
Agricultural Pumping Interruptible | 40 | 17 | -22 | -57%
Summer Discount Plan/AC Cycling – Residential | 500 | 212 | -288 | -58%
Save Power Days/Peak Time Rebates | 266 | 36 | -230 | -87%
Capacity Bidding Program Day Ahead29 | 1 | 0 | -1 | -94%
Summer Discount Plan/AC Cycling – Commercial | 62 | 3 | -59 | -95%
Table 19: SDG&E Demand Response Load Impact
2012 Resource Adequacy vs. 2012 Ex Post
August Average (MW)

Program Name | RA Forecast30 (A) | Ex Post31 (B) | Difference (C = B - A) | % Difference (D = C/A)
Critical Peak Pricing Default | 12 | 19 | 7 | 58%
Summer Saver/AC Cycling | 15 | 19 | 4 | 27%
Capacity Bidding Program Day Ahead | 10 | 8 | -2 | -20%
Capacity Bidding Program Day Of | 22 | 10 | -12 | -55%
Base Interruptible Program32 | 11 | 0.84 | -10.16 | -92%
Reduce Your Use/Peak Time Rebates | 69 | 2 | -67 | -97%
Demand Bidding Program | n/a33 | 5 | n/a | n/a
Critical Peak Pricing Emergency | n/a | 1 | n/a | n/a
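The difference columns in the tables follow C = B - A and D = C/A. As a quick arithmetic check against three of the SDG&E rows:

```python
# Arithmetic check of three SDG&E rows from Table 19 (August average,
# MW): difference C = B - A and percent difference D = C / A.

rows = {
    "Critical Peak Pricing Default": (12.0, 19.0),   # (RA forecast, ex post)
    "Summer Saver / AC Cycling": (15.0, 19.0),
    "Capacity Bidding Program Day Ahead": (10.0, 8.0),
}

results = {}
for name, (ra, ex_post) in rows.items():
    diff = ex_post - ra
    pct = round(100.0 * diff / ra)   # rounded percent difference
    results[name] = (diff, pct)

print(results)
```

The computed values (+7 MW / 58%, +4 MW / 27%, -2 MW / -20%) reproduce the corresponding table entries after rounding.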
26 Exhibit SCE 03, Table 1.
27 Exhibit SCE 03, Table 1.
28 Number based on September average because there were no events for the month of August.
29 Number based on July average because there were no events for the months of August and September.
30 Exhibit SDG 03, Table 1.
31 Exhibit SDG 03, Table 1.
32 Number based on September average because there were no events for the month of August.
33 DBP was not approved until the year after the 2012 RA forecast was filed.
Forecasting DR estimates for resource planning needs is different from forecasting for
operational needs.
Unlike resource planning needs, operational needs on the event day may not require the
full capacity of DR, because conditions do not warrant it or because the Utilities deployed
'optimal' dispatch strategies for customer experience. Utilities have the discretion to call
shorter event hours or a smaller group of participants if the system is adequately resourced for
that day. As discussed in Chapter 3, peaker or other generation resources may have been
dispatched instead of DR even though such operation would be contrary to the Loading
Order.34 For example, SCE can divide its residential Summer Discount Plan participants into
three groups and dispatch each group for one hour of an event, resulting in three consecutive
one-hour events (see chart below). Approximately one-third of the customers can be curtailed
in any given hour. Rebound from the groups curtailed in event hours 1 and 2 can reduce the
net impact in hours 2 and 3, lowering the average hourly impact for the entire event period. As
a result, the average impact per hour can be roughly 100 MW for operational needs. The
following figures illustrate the rebound effects from SCE's sub-group dispatch strategy for its
AC cycling.
Figure 1
Source: SCE April 11, 2013 Power Point Presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante
Briefing
34 http://www.cpuc.ca.gov/NR/rdonlyres/58ADCD6A-7FE6-4B32-8C70-7C85CB31EBE7/0/2008_EAP_UPDATE.PDF.
However, for the RA forecast, resource planning needs require the full capacity of DR. For
example, SCE assumed all residential Summer Discount Plan participants would be curtailed at
the same time to represent the full program capability during a reliability event (see chart
below). Subsequent hourly impacts can be larger because all customers are curtailed at once
and the rebound effect is delayed until the end of the entire event window. As a result, the
average impact per hour for the RA forecast can be roughly 300 MW, roughly three times the
ex post hourly impact.
Figure 2
Source: SCE April 11, 2013 Power Point Presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante
Briefing
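The two dispatch strategies can be contrasted with stylized arithmetic. The group size and rebound fraction below are illustrative assumptions, not SCE's measured values:

```python
# Stylized comparison of the two Summer Discount Plan dispatch strategies.
# Assumptions (illustrative, not SCE's measured values): three equal
# groups of 100 MW of curtailable load, and a rebound equal to 30% of a
# group's curtailment landing in the hour after it is released.

GROUP_MW = 100.0   # per-group curtailment (hypothetical)
REBOUND = 0.3      # rebound fraction (hypothetical)

# Staggered: group g is cycled in hour g; the previous group's rebound
# erodes the net impact in hours 2 and 3.
staggered = []
for hour in range(3):
    impact = GROUP_MW
    if hour > 0:
        impact -= REBOUND * GROUP_MW
    staggered.append(impact)
avg_staggered = sum(staggered) / len(staggered)

# Simultaneous (RA-style): all three groups cycled for the whole window,
# so the rebound is deferred until after the event.
simultaneous = [3 * GROUP_MW] * 3
avg_simultaneous = sum(simultaneous) / len(simultaneous)

print(avg_staggered, avg_simultaneous)
```

Even with these made-up numbers, the simultaneous strategy's average hourly impact is several times the staggered strategy's, mirroring the roughly 300 MW vs. 100 MW gap described above.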
The opposite extreme can also occur, where the ex post result is higher than the RA
forecast. In the case of SCE's Demand Bidding Program, the average ex post result is 72 MW,
six times the RA forecast of 12 MW (see Table 18). Dual participation was the major
contributor to the discrepancy. For customers enrolled in two programs, such as the Base
Interruptible Program and the Demand Bidding Program, the RA forecast counts the MW in
only one program (Base Interruptible Program) to avoid double counting.35 Had the two
programs been called on the same day, the ex post would have shown a much lower amount
for the Demand Bidding Program.
35 Portfolio level.
September 14, 2012 was considered a hot day (a 1-in-10 weather year condition36);
however, SCE still did not dispatch its entire residential Summer Discount Plan population.
Instead, SCE dispatched only a portion of its participants for one hour at a time, resulting in
five consecutive one-hour events. On average, SCE received only 6.3 MW37 for the event, a
significant underperformance in comparison to the RA forecast of 519 MW.38 This raises the
question: if SCE chose not to dispatch all of its Summer Discount Plan participants in the same
event hour during a 1-in-10 weather year condition, under what circumstances will SCE
dispatch its Summer Discount Plan to its full program capacity? The usefulness of the RA
forecast is in question if the utility never tests a DR program at its full capacity. Should the RA
forecast process be amended to include another ex ante forecast based on operational needs,
including optimal customer experience, and if so, what would that entail?
D. Conclusion and Recommendations
Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate method
of determining DR program performance, because the ex post results respond to operational
needs, which can be entirely different from resource planning needs. However, in 2012 the DR
programs were also not tested to the full capacity assumed in the RA forecast. This raises the
question of whether the RA forecast should be changed to reflect both planning needs and
operational needs. A working group consisting of the CPUC, CEC, CAISO, and the IOUs should
be assembled to address the forecast needs (i.e., resource planning, operational planning) and
should meet annually in December or January to agree on a set of input assumptions (i.e.,
growth rate, drop-out rate) used for forecasting DR estimates.
36 Represents the monthly peak temperatures for the highest year out of a 10-year span. Exhibit SGE 03, Page 14.
37 Christensen Associates Energy Consulting, 2012 Load Impact Evaluation of Southern California Edison's
Residential Summer Discount Plan (SDP) Program, April 1, 2013, Table 4-3d.
38 Exhibit SCE 03, Table 1, 2012 RA for the month of September.
Chapter 3: Demand Response Program Operations
I. Summary of Staff Analysis and Recommendations
The 2006 to 2011 data show that the Utilities historically triggered their DR programs far
below the program limits in terms of number of events and hours. Even with the SONGS
outage, the Utilities did not trigger their DR programs in summer 2012 as frequently as
anticipated. Almost all of the Utilities' 2012 DR program events and hours fall within the
historical averages or below the historical maximum. However, staff was surprised to find that
the Utilities dispatched their peaker power plants (peaker plants) three to four times more
frequently in 2012 than the historical averages. The peaker plant service hours were closer to
the plants' emission allowances than the DR event hours were to the program limits.
Staff observed a trend where some DR program events decreased from 2006 to 2012 and
yet peaker service hours increased over the same period. This trend raises a concern that the
Utilities under-utilized DR programs and over-relied on peaker plants. Under the "Loading
Order," DR is a preferred resource and is intended to avoid the building and dispatching of
peaker plants.
Due to time constraints and a lack of additional information, staff was unable to fully
address this question and the reasons behind these trends in this report. Therefore, staff
recommends that, in future DR program measurement and evaluation studies, the Commission
evaluate DR program operations and designs in comparison with peaker plant operations to
ensure the utilities' compliance with the Loading Order.
Specifically, the staff recommends that the Commission:
1. Require the Utilities to provide both DR event and peaker plant data, and explanations
for the disparity between historical DR event hours and peaker plant service hours, in
future DR evaluations and the next DR budget applications. The Utilities should include
the DR and peaker plant hourly data and explain why they did not trigger DR programs
during any of the hours when the peaker plants were dispatched. This information will
inform future DR program designs and improve the usefulness of DR.
2. Require that DR historical operations be reflected in the input assumptions for the Ex
Ante forecast and the evaluation of the program cost effectiveness.
3. Address the Loading Order policy in DR planning and operation and utilization of peaker
plants in the next DR Rulemaking and the Utilities’’ energy cost recovery proceedings.
II. 2012 DR Program Trigger Criteria and Event Triggers
Appendices H and I summarize the Utilities' 2012 DR program trigger criteria and the
event triggers. The DR program trigger criteria consist of a list of conditions that vary by
program type: e.g., Emergency Program triggers are based on system contingencies, while
non-Emergency Program triggers also include high temperature, heat rate (economic), and
resource limitations. The 2012 event triggers were the actual conditions that led to the
Utilities' decisions to call DR events.
While the DR trigger criteria provide some general idea of how DR programs are triggered,
there is a lack of transparent information on the Utilities' DR operations, e.g., when and how
the Utilities made decisions to trigger a DR program. It is necessary to evaluate DR
performance not only from a load impact perspective, but also from the DR operations
perspective, to determine the reliability and usefulness of DR as a resource. Staff analyzed the
2006-2012 DR event data and gained some understanding of how the Utilities have utilized DR
programs and how useful the programs were.
III. DR Events Vs. Peaker Plant Service Hours
How do the numbers compare to the 2012 limits and to historical levels?
As shown in Appendices J and K, SCE has a few DR programs with an unlimited number of
events or hours: Demand Bidding Program, Save Power Days (Peak Time Rebate), and Summer
Discount Plan – Commercial (Enhanced). Others have various event/hour limits ranging from
24 hours/month to 180 hours/year or 15 events/year.39
Of the DR programs with an event limit, most did not attain the maximum number of
events and/or hours, except for SCE's Summer Advantage Incentive (Critical Peak Pricing).40
In summer 2012, SCE triggered 12 events for its Critical Peak Pricing, which is within the range
of 9 to 15 events/year. Other DR programs' event hours were well below the limits. For
example, SCE's residential Summer Discount Plan (AC cycling) was the second-most-triggered
program, with 23 DR events and 24 event hours in 2012, still far below its 180-hour event limit
despite the SONGS outage. The Base Interruptible Program (BIP) had only one test event, for
two hours, in 2012.
However, SCE's DR program event hours were either within the programs' historical ranges
or below the 2006-2011 maximum, except for Agricultural Pumping Interruptible, with 7 hours
in 2012 compared to 0 to 2 hours from 2006 to 2011.
What were the reasons for the differences between the 2012 DR event numbers and hours
and the event limits?
SCE explained that the reasons for the differences between the 2012 DR event numbers
and hours vary for each program, as summarized in Appendix L.41 The reasons can be
characterized, for the three types of DR programs, as: 1) trigger conditions, 2) optimal
dispatches, and 3) no nominations.
As discussed above, DR program operations are based on the trigger criteria set for each
program. For the non-Emergency Programs, SCE indicated that optimizing performance and
minimizing customer fatigue are additional factors considered in its decision to trigger a DR
program. SCE's optimal dispatch strategy may have resulted in DR events and hours far below
the maximum hours and events for the programs. For example, SCE's Summer Discount Plan is
available for 180 hours annually. However, customers would probably never expect this
program to be triggered for close to 180 hours, based on their experience with the program to
date. As shown in Appendices M and N, staff finds a similar trend in SDG&E's DR event data.

39 SCE 02, Appendix E, Table 2-A at E-4 and E-5.
40 Id.
41 SCE 02, Appendix E, at E-6 and E-7.
IV. Peaker Plant Comparison
Most of SCE's non-Emergency Programs include resource limitation as a program trigger.
Therefore, in theory, one would expect that SCE would trigger DR programs before dispatching
its peaker plants, in accordance with the Loading Order. In light of the SONGS outage, the
Commission anticipated more SCE and SDG&E DR events in 2012, yet SCE dispatched peaker
plants substantially more than DR programs (compared to their historical averages, as
discussed below).
How do the historical DR events compare to the utilities' peaker plants?
SCE provided the permit and service hours for four of its own peaker plants, three of which
were located in the SONGS-affected areas, as shown in Appendix O.42 SCE historically
dispatched its peaker plants for about 9% to 16% of their permissible service hours annually.
As shown in the table below, during the same period, SCE triggered its non-Emergency DR
programs for 11 to 106 hours on average. In 2012, however, SCE dispatched its peaker plants
three to four times more than the historical average, while SCE's 2012 DR event hours fell
below the historical range. SDG&E's peaker plant and DR event data show a similar trend. For
example, SDG&E's Miramar plant ran 4,805 hours out of its 5,000-hour emission allowance. In
contrast, Critical Peak Pricing, its most-triggered DR program, was dispatched for 49 hours out
of a 126-hour annual limit.
Table 20: DR Event Hours vs. Peaker Plant Service Hours

                     2006-2011 Range     2012
SCE:
  Peaker Plants      96 - 129 Hours      405 - 465 Hours
  Non-Emergency DR   11 - 106 Hours      2 - 64 Hours
SDG&E:
  Peaker Plants      436 - 1,715 Hrs.    974 - 4,805 Hrs.
  Non-Emergency DR   19 - 39 Hrs.        14 - 49 Hrs.
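The utilization gap can be expressed as a share of each resource's annual hour allowance, using the Miramar and Critical Peak Pricing figures cited above:

```python
# Utilization of annual hour allowances, using the figures cited above:
# SDG&E's Miramar peaker ran 4,805 of 5,000 permitted hours in 2012,
# while Critical Peak Pricing, the most-dispatched DR program, was
# called for 49 of its 126 allowable hours.

def utilization_pct(used_hours, limit_hours):
    """Share of an annual hour allowance actually used, in percent."""
    return round(100.0 * used_hours / limit_hours, 1)

miramar = utilization_pct(4805, 5000)
cpp = utilization_pct(49, 126)
print(miramar, cpp)
```

The peaker ran at roughly 96% of its allowance while the most-used DR program reached under 40% of its limit, the disparity the staff analysis highlights.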
In addition, staff observed that the Utilities' highest DR event hours occurred in 2006 and
2007 during the summer heat storms, but the highest peaker plant hours occurred in 2012.
These data suggest that the Utilities under-utilized DR programs and over-relied on their
peaker plants, which is inconsistent with the Loading Order.
42 SCE 01, Appendix C, Tables 9 and 10 at Page 17.
In its comments on the 2013-2014 DR Proposed Decision, SCE disagreed with the suggestion
of "under-utilization" of DR programs based on the 2012 DR events. SCE argued that "(s)imply
because SCE did not dispatch all of the programs' available hours does not mean the programs
should have been dispatched more… Optimal utilization (of DR) ensures the necessary amount
of load drop to enable a reliable grid…"43 SCE should explain why it dispatched its peaker
plants substantially more last summer instead of DR, and whether SCE's optimal dispatch of
DR, the trigger criteria, or the program designs resulted in SCE's increased reliance on peaker
plants.
Due to the time constraint and the absence of the Utilities' explanations, staff is unable to
comprehensively address this issue in this report. The Utilities' data warrant further evaluation
to ensure the usefulness of the DR resource as a replacement for peaker plants and compliance
with the Loading Order.
V. Conclusions
Consistent with D.13 04 017, staff finds that most of SCE's DR programs did not attain the
maximum number of events and/or hours, except for SCE's Critical Peak Pricing. The Utilities'
total numbers of DR events and hours in 2012 were within the historical averages but far from
the program limits. In contrast, staff found that SCE-owned and -contracted peaker plants
were dispatched far more in 2012 than the historical averages. Some peakers came much
closer to their emission allowances than the DR hours came to their operating limits. Staff
reaches a similar conclusion for SDG&E's DR programs in comparison with its peaker plants.
If the Utilities have historically never triggered their DR programs close to the available
hours, there is a concern about how realistic these limits are. There is a reliability risk if the
Utilities are relying on a DR resource that has never been used to its full capacity. In addition,
DR cost-effectiveness should reflect the historical operations. Staff recommends that the
Commission address this issue in future DR evaluation and budget approval proceedings.
43 SCE Opening Comment, filed April 4, at 4-5.
Chapter 4: Residential Demand Response Programs
I. Summary of Staff Analysis and Recommendations
Analysis of residential programs included Peak Time Rebate (PTR) and AC Cycling. Overall,
customers seem satisfied with the programs based on utility reports and surveys. However,
staff identified problems with program design and operation that need to be addressed to
improve the reliability and effectiveness of the programs.
For PTR, staff found that customers who received utility notification of events had higher
awareness of the program than customers who were not notified by the utility or who
received indirect notification, such as mass media alerts. More importantly, data for both
utilities show that customers who opted in to receiving alerts were the only group that
significantly reduced load. For both utilities, customers defaulted into receiving alerts through
MyAccount did not reduce load significantly. However, the entire eligible customer class
qualifies for bill credits, which results in a problem of 'free ridership.' Both utilities should
modify PTR from a default to an opt-in program, where only customers opting to receive event
alerts would qualify for bill credits.
For SCE's residential AC Cycling, staff found that the current group dispatch strategy is
producing a rebound effect, which reduces the actual load reduction the program is capable of
delivering. Staff recommends that SCE (1) align the maximum program event duration with
customer preference for shorter events to improve forecasting, and (2) reconsider its incentive
structure to favor participation in longer event durations.
Finally, both utilities should take advantage of AMI infrastructure and related enabling
technology that could improve program delivery, reliability and customer experience.
II. Residential Peak Time Rebate (PTR)
A. Overall Customer Experience
For both utilities, customers were generally satisfied with the program. SCE's customers
seem satisfied with the level of incentives and the time between notification and event;
however, customers would like more information regarding the program and bill credits.
SDG&E's customers reported overall satisfaction with the program but, like SCE's customers,
would benefit from more information and outreach.
The level of awareness for both utilities seems higher among customers who chose to sign
up to receive notifications. This is reflected in the overall load reduction verified by ex post
data: only customers who signed up for event notification significantly reduced load.
For PTR, neither utility noticed evidence of customer fatigue, but this does not mean it did
not occur, only that it was not noticeable.
B. SCE’’s Peak Time Rebate/Save Power Day
1) Summary
Customers who received utility notification of events had higher awareness of the program
than customers who were not notified by the utility. More importantly, customers who opted
in to receiving alerts were the only group that significantly reduced load. Customers defaulted
into receiving alerts through MyAccount, and the remaining customers not directly notified by
the utility, did not reduce load significantly. SCE considered only customers who received
alerts in its forecast and ex post verification. However, the entire eligible customer class
qualifies for bill credits. Awareness of the program, reflected by the willingness to sign up to
receive alerts, seems to indicate greater willingness to reduce load; this factor should be
considered in program design. Staff identified an issue with 'free ridership,' where customers
are paid even though they did not significantly reduce load. Staff recommends changing PTR
from a default program to an opt-in program, paying bill credits only to customers who opt in
to participate.
2) Background
D.09 08 028 approved Save Power Day, SCE's Peak Time Rebate (PTR) rate. The decision
approved bill credits of 0.75c/kWh of load reduced, with an additional 0.50c/kWh for
customers with enabling technology.
This is a default program for residential customers with a smart meter and has been
available since 2012. The program provides incentives to eligible Bundled Service Customers
who reduce a measurable amount of energy consumption below their Customer Specific
Reference Level (CSRL) during PTR Events.44,45
The utility may call events throughout the year on any day, excluding weekends and
holidays. Events take place between 2 p.m. and 6 p.m. on days an event is called. Participants
receive day-ahead notification of the event. Bill credits are paid in each billing cycle based on
the sum of events called and usage reduction during the period.46 Bill credits are recovered
from the respective customer class through the Energy Resource Recovery Account (ERRA).
During 2012, SCE began defaulting customers on MyAccount into receiving email
notifications, with the remaining customers not directly notified by the utility. Alternatively,
customers may choose to opt in to receive alerts. As of November 30, approximately 4 million
customers are on PTR and 824,000 were signed up to receive notifications (via MyAccount).47
According to SCE, approximately 60,000 customers opted in to receive alerts during the 2012
summer months.48

44 SCE Schedule D – Domestic Service, Sheet 3.
45 CSRL ("peak average usage level"): the customer's average kWh usage during the 2:00 p.m. to 6:00 p.m. time
period of the three (3) highest kWh usage days of the five (5) non-event, non-holiday weekdays immediately
preceding the PTR Event. The CSRL is used to determine the customer's kWh reduction for each PTR Event in
order to calculate the rebate.
46 SCE Schedule D – Domestic Service, D.09 08 028 Att. C at 7.
47 SCE 01 Testimony at 27, lines 11, 18-19.
3) Lessons Learned
In support of its 2013-2014 Application, SCE provided data highlighting lessons learned from
the 2012 program year.
Customer awareness
Awareness of the program was higher among the customers whom the utility
notified of events: 66% of notified respondents were familiar with the program, compared with only 43%
of the group that was not notified.49 The same pattern appears for awareness of specific events:
72% of respondents in the notified group who were aware of the program claimed awareness of
specific events, compared to 40% in the group not receiving notifications. Counting both
customers who were already aware and those prompted with information about the program,
55% of the notified group was aware of events, but only 23% of the non-notified respondents were aware.50
Customer satisfaction
SCE's data contained no information on customer perceptions of the fairness of savings/incentive
levels; however, customers appear to link participation with an expectation of savings,
as 80% of respondents identified earning bill credits as important to participation.51 Moreover,
participants appear willing to participate even in the face of low savings.52
Event notification
The majority of respondents aware of the program found out about events via utility
notification (over 60% for the opt-in group). Close to 23% of respondents in the overall
population found out about events from the news.53
According to the customer surveys, about 90% of customers notified of the event,
and about 56% of customers not notified but aware of the event, were happy with the amount
of time between notification and the event.54 It appears that a day-ahead strategy could be
adequate; however, customers were not asked about their preference for a day-of reminder,
so the lessons learned do not indicate whether such a reminder could increase awareness and response. SCE
requested to add a day-of notification in its 2013-2014 Program Augmentation Application,
which the Commission denied due to lack of evidence of need.55
48 Email communication with SCE (4/5/2013).
49 SCE-02 Appendix A at 3. It is important to note that the surveys only represented results for two groups:
customers notified by the utility and customers who were not notified. Defaulted customers and customers not
defaulted into receiving notifications from the utility were bundled together under notified customers.
50 SCE-02 Appendix A at 4.
51 SCE-02 Appendix B at 24.
52 SCE-02 Appendix B at 36.
53 SCE-02 Appendix A at 5.
54 SCE-02 Appendix A – Save Power Day Incentive/Peak Time Rebate Post-Event Customer Survey at 15.
55 D.13-04-017 at 28.
StaffReport_2012DRLessonsLearned

  • 1. STATE OF CALIFORNIA Edmund G. Brown Jr., Governor PUBLIC UTILITIES COMMISSION 505 VAN NESS AVENUE SAN FRANCISCO, CA 94102 3298 Commission Staff Report Lessons Learned From Summer 2012 Southern California Investor Owned Utilities’’ Demand Response Programs May 1, 2013 Performance of 2012 Demand Response programs of San Diego Gas and Electric Company and Southern California Edison Company: report on lessons learned, staff analysis, and recommendations for 2013 2014 program revisions in compliance with Ordering Paragraph 31 of Decision 13 04 017.
  • 2. ACKNOWLEDGEMENT The following Commission staff contributed to this report: Bruce Kaneshiro Scarlett Liang Uejio Tim Drew Rajan Mutialu Dorris Chow Paula Gruendling Taaru Chawla Jennifer Caron Alan Meck
  • 3. i TABLE OF CONTENTS EXECUTIVE SUMMARY....................................................................................................... 1 Chapter 1: Introduction.................................................................................................. 5 I. 2012 Summer Reliability and Demand Response Programs..................................................5 II. Energy Division November 16, 2012 Letter and the Staff Report..........................................6 Chapter 2: Demand Response Program Load Impact...................................................... 8 I. Summary of Staff Analysis and Recommendations ...............................................................8 II. Different DR Load Impact Estimates ...................................................................................... 9 III. Comparison of DR Daily Forecast and Ex Post Results ..........................................................9 IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)..............................26 Chapter 3: Demand Response Program Operations...................................................... 32 I. Summary of Staff Analysis and Recommendations .............................................................32 II. 2012 DR Program Trigger Criteria and Event Triggers .........................................................32 III. DR Events Vs. Peaker Plant Service Hours ...........................................................................33 IV. Peaker Plant Comparison..................................................................................................... 34 V. Conclusions .......................................................................................................................... 35 Chapter 4: Residential Demand Response Programs .................................................... 36 I. 
Summary of Staff Analysis and Recommendations .............................................................36 II. Residential Peak Time Rebate (PTR) ....................................................................................36 III. Residential Air Conditioning (AC) Cycling.............................................................................51 Chapter 5: Non Residential Demand Response Programs............................................. 57 I. Summary of Staff Analysis and Recommendations .............................................................57 II. Background and Summary of Utility Data............................................................................57 III. Commercial Air Conditioning (AC) Cycling...........................................................................59 IV. SCE’’s Auto DR....................................................................................................................... 63 V. SDG&E’’s Demand Bidding Program (DBP) ...........................................................................65 Chapter 6: Flex Alert Effectiveness ............................................................................... 67 I. Summary of Staff Analysis and Recommendations .............................................................67 II. Background .......................................................................................................................... 67 III. Utility Experience with Flex Alert.........................................................................................69 IV. Customer Experience ........................................................................................................... 69 V. The Future of Flex Alert........................................................................................................ 71 VI. 
DR Program Ex Post Load Impact Results on the Flex Alert Days........................................71 Chapter 7: Energy Price Spikes ..................................................................................... 73
  • 4. ii I. Summary of Staff Analysis and Recommendations .............................................................73 II. Definition of Price Spikes ..................................................................................................... 73 III. DR Programs and Price Spikes.............................................................................................. 73 IV. Conclusion............................................................................................................................ 74 Chapter 8: Coordination with the CAISO ...................................................................... 75 I. Staff Recommendations....................................................................................................... 75 II. DR Reporting Requirements in Summer 2012.....................................................................75 III. DR Reporting Requirements for 2013 2014.........................................................................76 Appendix A: Highlight of 2012 Summer Weather & Load Conditions.................................... 77 Appendix B: Energy Division November 16, 2012 Letter........................................................ 78 Appendix C: Descriptions of DR Load Impact Estimates......................................................... 79 Appendix D: SCE 2012 Monthly Average DR Program Load Impact (MW) ............................ 85 Appendix E: SCE 2012 DR Program Load Impact by Event (MW)........................................... 87 Appendix F: SDG&E 2012 Monthly Average DR Program Load Impact (MW) ....................... 91 Appendix G: SDG&E 2012 DR Program Load Impact by Event (MW)..................................... 92 Appendix H: SCE 2012 DR Program Overview ....................................................................... 93 Appendix I: SDG&E DR Program Overview............................................................................. 
96 Appendix J: SCE Historical DR Event Hours............................................................................. 98 Appendix K: SCE Historical Number of DR Events .................................................................. 99 Appendix L: Summary of SCE’’s Reasons for the 2012 DR Triggers....................................... 100 Appendix M: SDG&E Historical DR Event Hours................................................................... 101 Appendix N: SDG&E Historical Number of DR Events .......................................................... 102 Appendix O: Utilities’’ Peaker Plant Total Permissible vs. Actual Service Hours................... 103 Appendix P: Ex Post Demand Response Load Impact on Flex Alert Days ............................ 104 Appendix Q: CAISO Energy Price Spikes................................................................................ 105 Appendix R: Utilities’’ Demand Response Reporting Requirements..................................... 111 Appendix S: Additional Information .................................................................................... 113
  • 5. 1 EXECUTIVE SUMMARY This report is prepared by Energy Division in compliance with Ordering Paragraph 31 of D.13 04 017. The purpose of this report is to provide the lessons learned from the 2012 Demand Response (DR) programs operated by San Diego Gas and Electric Company (SDG&E) and Southern California Edison Company (SCE) (Utilities), and to recommend program or operational revisions, including continuing, adding, or eliminating DR programs. Below are highlighted conclusions and recommendations in the report. To see all recommendations, please go to each chapter in the report. In summary, Energy Division makes the following overarching conclusions about the Utilities’’ DR programs: Forecast vs. Ex Post: While a few DR programs met or even exceeded their daily forecast when triggered, on average the ex post results for all program events diverge from the daily forecast by a considerable degree. The majority of programs either provided a ‘‘mixed’’ performance (the program both over and under performed relative to its forecast) or were poor performers (consistently coming up short relative to its forecast). Of particular note are the Utilities’’ Peak Time Rebate program1 and SCE’’s Summer Discount Plan.2 (Chapter 2) The divergence between the ex post results and the daily forecasts can be traced to a variety of causes, such as inadequate forecasting methods employed by the Utilities, program design flaws, non performance by program participants and/or program operations. A complete explanation of the reasons for divergence across all programs however, was not possible within the scope and timing of this report. (Chapter 2) 2012 RA vs. Ex Post: Comparing the ex post results to the 2012 Resource Adequacy (RA) forecast is not a good indicator as to how well a DR program performs. RA forecasts are intended for resource planning needs. Ex post load impacts reflect demand reductions obtained in response to operational needs at the time the program is triggered. 
Resource planning and operational planning have different conditions and serve different purposes. (Chapter 2) DR vs. Peaker Plants: The Utilities used their DR programs fewer times and hours than the programs’’ limits (each program is limited to a certain number of hours or events). In contrast, the Utilities dispatched their peaker power plants far more frequently in 2012 in comparison to 2006 –– 2011 historical averages. (Chapter 3) Energy Price Spikes: DR programs are not currently designed to effectively mitigate price spikes in the CAISO’’s energy market. On many days a DR event was called and 1 SCE’’s marketing name for Peak Time Rebate is ““Save Power Day”” , SDG&E calls it ““Reduce Your Use””. 2 Air conditioning (AC) cycling
  • 6. 2 no price spikes occurred, and conversely there were days where price spikes occurred and DR events were not called. The timing and scope of this report did not permit a quantification of the cost of unmitigated price spikes to ratepayers, but in theory, avoidance of these spikes would benefit ratepayers. (Chapter 7) Energy Division also makes the following program specific conclusions about the Utilities’’ DR programs: SCE’’s AC Cycling Program Forecasting: SCE’’s 2012 forecasting methodology for its air conditioning (AC) Cycling program (the DR program that SCE triggered the most in 2012) cannot be relied upon to effectively predict actual program load reductions. (Chapter 2) SCE’’s AC Cycling Dispatch Strategy: SCE’’s sub group dispatch strategy for its AC Cycling Program (also called Summer Discount Plan) created adverse ‘‘rebound’’ effects, thereby reducing the effectiveness of the program during critical hot weather days, e.g. 1 in 10 weather. (Chapter 2) SDG&E’’s Demand Bidding Program: SDG&E Demand Bidding Program produced on average 5 MW of load reduction when triggered, although the US Navy did not participate. The US Navy claimed certain program terms and conditions precluded it from participating in the 2012 program. The Commission’’s decision to modify the program to a 30 minute trigger may further limit the US Navy’’s ability to participate. (Chapter 5) Peak Time Rebate Awareness: SCE and SDG&E customers who received utility notification of Peak Time Rebate (PTR) events had higher awareness of the program when compared to customers who were not notified by the utility. More importantly, customers who opted into receiving PTR alerts significantly reduced load. All other customers in the program provided minimal load reduction. (Chapter 4) Peak Time Rebate Free Ridership: The Utilities’’ PTR program has a potentially large ‘‘free ridership’’ problem, where customers receive incentives without significantly reducing load. 
SCE paid $22 million (85% of total PTR incentives in 2012) in PTR bill credits to customers whose load impact was not considered for forecast or ex post purposes. 94% of SDG&E’’s 2012 PTR incentives ($10 million) were paid to customers who did not provide significant load reduction. The inaccuracy of settlement methodology (in comparison to the ex post results) is the main reason for the ‘‘free ridership’’ problem. The default nature of the program (everyone is automatically eligible for the incentives) aggravates the problem. (Chapter 4). Flex Alert: There is a lack of data to evaluate the effectiveness and value of the Flex Alert campaign. Attribution of savings from Flex Alert is complicated by the fact that load reduction from the Utilities’’ DR programs on the two days Flex Alert was
  • 7. 3 triggered in 2012 contributed to reduced system peak load. A load impact evaluation of Flex Alert is planned for 2013. (Chapter 6) DR Reports: The Utilities’’ DR daily and weekly reports were useful to the CAISO and the Commission for purposes of up to date monitoring of DR resources throughout the summer. (Chapter 8) In light of above findings, Energy Division recommends the following: DR Evaluation: The Commission should require further evaluation of Utility DR program operations in comparison to Utility operation of peaker plants for the purpose of ensuring Utility compliance with the Loading Order. (Chapter 3) Forecast Methods Generally: The Utilities’’ daily forecasting methods for all DR programs (especially AC cycling and other poor performers) should undergo meaningful and immediate improvements so that the day ahead forecasting becomes an effective and reliable tool for grid operators and scheduling coordinators. (Chapter 2) Forecasting for SCE’’s AC Cycling Program: SCE should improve forecasting methods for its residential AC Cycling Program with input from agencies and stakeholders. SCE should also pilot more than one forecasting method for the program in 2013. (Chapter 2) Forecasting for SDG&E Programs: SDG&E’’s forecasting methods for its AC Cycling Program (Summer Saver) could be improved doing the following: running a test event and including a correlation variable that accounts for customer fatigue. SDG&E’’s Capacity Bidding Program forecasting could be improved by including a weather variable. (Chapter 2) SCE’’s Outreach for Commercial AC Cycling: Through its outreach and marketing efforts, SCE should clearly communicate the new features of its commercial AC cycling program to avoid customer dissatisfaction and dropout. (Chapter 5) Auto DR: Future studies are necessary to explore the load impacts of Auto DR. 
(Chapter 5) SDG&E’’s Demand Bidding Program: SDG&E should work collaboratively with the US Navy to design a program to meet the unique needs of the Navy. Key attributes to consider are a day ahead trigger, aggregation of 8 billable meters and a minimum bid requirement of 3 megawatts (MW). (Chapter 5) Peak Time Rebate Design Changes: The Utilities’’ residential PTR program should be changed from a default program to an opt in program, so that bill credits are paid only to customers who opt in. (Chapter 4) SCE’’s AC Cycling Dispatch Strategy: SCE should reconsider its current strategy of calling groups of residential AC cycling customers in sequential one hour cycling events. Alternatively, if SCE retains its current strategy, it should modify the
program's incentive structure so that customers who are willing to have their AC units cycled for an entire event (as opposed to just one hour) are compensated more than those who can tolerate only one hour of cycling. (Chapter 4)
DR Reports: The Utilities (and Pacific Gas & Electric) should submit daily and weekly DR reports to the CAISO and the Commission for the summers of 2013 and 2014. They should follow the same format and data requirements as the 2012 reports, unless otherwise directed by the Commission or Commission staff. (Chapter 8)
Chapter 1: Introduction

I. 2012 Summer Reliability and Demand Response Programs

San Onofre Nuclear Generating Station (SONGS) Units 2 and 3 were taken out of service in January 2012. By March 2012, the Commission determined that the outage of SONGS' two units could extend through summer 2012. Working closely with the Governor's Office, the California Independent System Operator (CAISO), and the California Energy Commission (CEC), the Commission took immediate mitigation actions to ensure that the lights stayed on in California despite the loss of 2,200 MW of capacity provided by SONGS.3 Along with considering new generation resources,4 an important action was to further incorporate the Utilities' Demand Response (DR) programs into the CAISO's contingency planning and daily grid operations during the summer. This included mapping the Utilities' DR programs to grid contingency plans and developing new daily and weekly DR reporting requirements. In addition, the Commission moved swiftly to approve three new DR programs for summer 2012: SDG&E's Peak Time Rebate (PTR) for commercial customers and Demand Bidding Program (DBP), and SCE's 10 for 10 conservation program for non-residential customers.5 Because of the intensive interagency mitigation effort and relatively cool weather, California grid reliability was not compromised in spite of the SONGS outage. Nevertheless, southern California experienced several heat waves in August and September, with the highest temperature reaching 109°F in SDG&E's service area and 100°F for SCE on September 14.6 The CAISO issued two Flex Alerts, on August 10 and 14. The Utilities triggered all of their DR programs at least once, and some on multiple occasions. Throughout the summer, Energy Division (ED) staff monitored the Utilities' DR program events on a daily basis and provided weekly briefings to the Governor's Office, the CAISO, and the CEC.
Staff observed that, for many event days, the load impact forecasts provided by the Utilities to the CAISO and the Commission in their daily DR reports were inconsistent with the results submitted seven days after each event (referred to as the "7-Day report"). In some cases, the Utilities reported much lower load reduction results than they had originally forecast. In addition, load impact forecasts provided by the Utilities throughout the summer were lower than the capacity counted toward the 2012 Resource Adequacy (RA) Requirement. This raised the question of whether the Commission might have overestimated DR load impact for RA purposes or, rather, whether the Utilities might have under-utilized their DR programs. In mid-summer, the Utilities began to experience price spikes in the CAISO's wholesale energy market. Questions were raised about whether the DR programs could be used to mitigate price spikes and, if so, whether they should be. 3 http://www.songscommunity.com/value.asp 4 Retired Huntington Beach Units 3 and 4 were brought back on line temporarily. 5 Resolutions E-4502 and E-4511 6 A 1-in-10 (or 10% probability) weather condition in any given year.
Some of the Utilities' DR programs were triggered for as many as 23 events over the five summer months, and many were triggered on two or three consecutive days. Appendix A highlights the DR program load impacts on the three hottest days and the three days when SDG&E and SCE experienced their highest system peak loads. Staff observed that SDG&E's system peak correlated with temperature, and its biggest DR load reduction occurred on the hottest day. SCE's system peak load, by contrast, did not consistently correlate with weather: SCE's system load reached its annual peak at 90°F, 10°F cooler than the hottest day in its service territory. Counter-intuitively, DR program load impact on a cooler day was actually higher than the amount delivered on the hottest day. This led to questions about how the Utilities make decisions to trigger DR programs and whether aspects of the customers' experience, such as expectations and fatigue, have an effect. In August, the CAISO issued two Flex Alerts when it determined there was a reliability risk due to insufficient supply to meet demand. As expected, the Utilities triggered relatively large amounts of DR on both days. The CAISO reported that the actual peak load was significantly lower than its hours-ahead forecasts and attributed the load drop to the Flex Alert events. This parallel dispatch situation raises important questions regarding the effectiveness of the Flex Alert when it overlaps with the Utilities' DR program events and how customers perceived these statewide alerts versus local utility DR notifications. Based on the above experience, the Commission concluded that staff should evaluate DR program performance and other lessons learned in order to seek answers to these and other questions. Such lessons could help the Commission determine the extent of DR program reliability and usefulness and, in turn, the extent to which DR resources can be counted on in CAISO markets and operations. II.
Energy Division November 16, 2012 Letter and the Staff Report

On November 16, 2012, the Energy Division sent a letter (Energy Division Letter) to the Utilities directing them to 1) file an application proposing DR program improvements for 2013 and 2014 to mitigate the SONGS outage and 2) provide data and responses to a set of questions on lessons learned from the 2012 DR programs. The questions were developed based on the Utilities' 2012 demand response experience and fell into six categories: 1. DR Program Performance, which includes load impact and program operations; 2. CAISO Market, covering price spikes and market analysis; 3. Customer Experience; 4. Coordination with the CAISO and Utility Operations; 5. Emergency DR Program Dispatch Order; and 6. Flex Alert Effectiveness. The Energy Division Letter is attached as Appendix B of this report.
On December 21, 2012, the Utilities filed separate applications for approval of DR program revisions for 2013 and 2014.7 The Utilities submitted data and responses to the questions attached to the Energy Division Letter and to subsequent Assigned Administrative Law Judge (ALJ) rulings for developing the record.8 Decision (D.) 13-04-017 approved certain DR program improvements for 2013-2014 and directed Commission staff to develop a report on the lessons learned from the DR programs in 2012. This report is based on a snapshot of the data and studies available at the time (i.e., ex post load impact data, utility responses to Energy Division data requests, etc.). Ongoing and future evaluations (e.g., the Flex Alert load impact analysis per D.13-04-021) will shed further light on the issues raised in this report. One point of emphasis in this report is the extent to which the current DR programs delivered their forecasted savings when they were triggered by the utilities. It is important to understand that a range of factors can affect whether a program delivers its forecasted savings targets. Some of these factors can be controlled through good program design, operation, and forecasting methodologies. Other factors that can impact program performance are exogenous, or outside the utilities' control, such as temperature, participant enrollment fluctuations, and behavioral or technological changes by the participants. While this report contains certain findings and recommendations for DR programs, we caution against sweeping conclusions or generalizations about DR programs based on this report. The point of this report is to find ways to improve existing DR programs so that they are more useful to grid operators, utilities, ratepayers, and participants. 7 A.12-12-016 (SDG&E) and A.12-12-017 (SCE). 8 On January 18, 2013 and February 21, 2013.
Chapter 2: Demand Response Program Load Impact

I. Summary of Staff Analysis and Recommendations

SCE

Most of the program event ex post results diverge from the daily forecasts by a considerable degree. The daily forecast should be more consistent with the ex post results in order for day-ahead forecasting to be valid and useful for grid operators. Staff recommends that the daily forecasting methods for all programs undergo meaningful and substantial improvements, including more thorough and transparent documentation and vetting through relevant agencies and stakeholders. The Summer Discount Plan (Residential AC Cycling) program forecasting methods in particular require an audience with a broad panel of agencies and stakeholders. Staff also recommends that SCE pilot more than one forecasting method and conduct interim protocol-based load impact evaluations to identify the most reliable forecasting methods throughout the 2013 summer season. SCE should also be required to address Summer Discount Plan program operation issues before the 2013 summer peak season begins, if possible. Specifically, the strategy of calling groups of customers for sequential one-hour cycling events, rather than calling all the customers for the duration of the full event (or other potential strategies), needs to be reconsidered before the program is further deployed. As discussed in detail later in this chapter, this strategy resulted in load increases during the latter hours of events, thereby reducing the overall effectiveness of the program.

SDG&E

Similar to SCE, many of SDG&E's program event ex post results also diverge from the daily forecasts by a considerable degree. The Demand Bidding Program daily forecast was accurate and reliable in predicting ex post results, while the Summer Saver and Capacity Bidding Day Ahead and Day Of program daily forecasts did not accurately or reliably predict ex post results.
The Peak Time Rebate residential daily forecast was not accurate in predicting ex post results, consistently over-predicting them by approximately 80%. The Critical Peak Pricing and Base Interruptible program daily forecasts did not accurately or reliably predict ex post results, but consistently under-predicted ex post load impacts. Due to a weak price signal and inelastic customer demand, the PTR commercial program ex post results were not significant. The CPP-E was discontinued as of December 31, 2012. Staff recommends (1) including only customers that opt in to receive e-mail or text alerts in the PTR residential daily forecast model; (2) running a test event to measure percent load impact per customer in order to improve CPP daily forecast estimates; (3) including a correlation variable in the Summer Saver daily forecast model to account for customer fatigue during successive event days; and (4) including a weather variable in the CBP daily forecast model in order to have parity with the ex post regression model.
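The forecast-model recommendations above (a fatigue variable for Summer Saver, a weather variable for CBP) can be illustrated with a deliberately simplified sketch. Everything below is hypothetical: the function name, the 2%-per-degree weather coefficient, the 10%-per-day fatigue coefficient, and the 95°F reference temperature are assumptions for illustration, not values from the Utilities' forecasting models.

```python
# Hypothetical day-ahead load-impact estimator showing how a weather term
# and a customer-fatigue term could enter a daily forecast. All
# coefficients are illustrative assumptions, not the Utilities' values.

def daily_forecast_mw(enrolled_customers: int,
                      base_kw_per_customer: float,
                      forecast_temp_f: float,
                      prior_consecutive_event_days: int,
                      ref_temp_f: float = 95.0,
                      temp_coeff: float = 0.02,
                      fatigue_coeff: float = 0.10) -> float:
    """Day-ahead MW estimate adjusted for temperature and event fatigue."""
    # Weather term: per-customer impact scales ~2% per degree F relative
    # to a 95 F reference day (assumed coefficient).
    weather_adj = 1.0 + temp_coeff * (forecast_temp_f - ref_temp_f)
    # Fatigue term: each prior consecutive event day erodes the response
    # by ~10% (assumed coefficient), floored at zero.
    fatigue_adj = max(0.0, 1.0 - fatigue_coeff * prior_consecutive_event_days)
    kw = enrolled_customers * base_kw_per_customer * weather_adj * fatigue_adj
    return kw / 1000.0  # convert kW to MW

# First event day at 100 F versus the third consecutive day at the same
# temperature: the fatigue term lowers the estimate.
day1 = daily_forecast_mw(20000, 1.0, 100.0, 0)
day3 = daily_forecast_mw(20000, 1.0, 100.0, 2)
```

In practice the coefficients would be fit from historical event data (the ex post results); the point is only that forecast temperature and successive-day fatigue enter the day-ahead estimate explicitly, which is what staff recommends for CBP and Summer Saver respectively.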
II. Different DR Load Impact Estimates

DR program load impacts are forecast or estimated at different times for different purposes. The following table summarizes the five different DR load impact estimates that are discussed in this chapter. Detailed descriptions and methodologies for each DR program measurement are provided in Appendix C.

Table 1: DR Load Impact Estimates

DR Load Impact Estimate | General Description | Purpose
Ex Ante for RA (e.g., 2012 RA) | A year-ahead monthly ex ante load impact potential attributed to each individual program under a 1-in-2 weather condition. | To determine the RA counting against the Load Serving Entity's system and local capacity requirements.
Daily Forecast | The Utilities' daily estimate of hourly load impact from DR programs during an event period. | To provide the CAISO, CPUC, and CEC the hourly MW provided by DR programs on each event day.
7-Day Report | The Utilities' preliminary estimate of hourly load reduction results from each triggered DR program. | To report to the CAISO the load reduction data from the triggered DR programs seven days after each DR event.
Ex Post Results | The Utilities' most accurate measurement of the load impact results from all of the DR programs triggered in a year. The ex post results are calculated using comprehensive regression models. | To report to the CPUC the actual results of the DR events.
Settlement | A measurement of customers' load reduction from their specific reference load using a baseline method. | To calculate customers' incentive payments for billing purposes.

In this proceeding, the Utilities provided the above DR load impact estimates for their DR programs, which are shown in Appendices D to G.

III. Comparison of DR Daily Forecast and Ex Post Results

A.
Overall Program Performance

The following section draws on data provided by the Utilities on March 4, 2013,9 in response to the February 21, 2013 ALJ ruling, which compares event-day forecasts (daily forecast or day-ahead forecast) to the event-day ex post load reduction estimates. Detailed data and methodological descriptions relevant to this chapter are provided in Appendices C and G. Subsequent to its March 4 filing, SCE updated its ex post results for some of the DR program events in its April 2 Load Impact Report but did not update its March 4 filing accordingly. However, in most cases the April 2, 2013 updated ex post results are even lower than the March 4 preliminary data (e.g., for AC cycling). Therefore, if the updated data were used, it would further support staff's findings. 9 SCE-03 and SGE-03.
On average, the ex post results for all program events diverge from the daily forecasts by a considerable degree. While some program events were forecast more accurately and consistently than others, Energy Division staff's overall conclusion is that the daily forecasting methods for all programs require meaningful and immediate improvements so that day-ahead forecasting can become an effective and reliable tool for grid operators. Some of the divergence between the ex post results and the daily forecast estimates can possibly be explained by inadequate program design and program operations. This section focuses on the observed differences between the ex post and the daily forecast with an eye towards identifying improvements for day-ahead forecasting, and thus does not cover all potential program improvements. Furthermore, many program design and operational improvements that could lead to better ex post results may not be evident from simply inspecting the daily forecast and ex post data. The ex post analysis methods are guided by Commission-adopted load impact protocols10 and the study results are carefully documented in reports prepared by independent consultants managed by SCE staff. However, there are currently no comparable standards and processes guiding the methods for daily forecasting. Indeed, during the course of preparing this report, Energy Division staff became aware that the day-ahead forecasting methods are far from transparent, and in some cases lack the robust analysis that is expected of the Utilities. These problems may be somewhat understandable, however, since the daily reports were only formally instituted in 2012. While this report is highly critical of the implementation of the day-ahead forecasting, it is important to recognize that the 2012 DR events as a whole did indeed reduce participants' loads, and some of the program load reductions were consistent with or better than the day-ahead forecasts.
To that end, staff has categorized the demand response programs into three categories (good, mixed, and poor performance) based on how well the program events performed relative to the day-ahead forecasts.

SCE

Programs that performed well yielded load impacts that were consistent with or better than the day-ahead forecast. The Base Interruptible Program (BIP) and the Day Of Capacity Bidding Program events produced load reductions that were on par with the forecasts. It is worth noting, however, that BIP, the single largest program, was triggered on only one occasion in 2012, and that occasion was a test event. Program events with mixed performance were not consistent with the day-ahead forecast, but sometimes exceeded the forecast. Staff includes the Day Ahead Capacity Bidding, Demand Bidding, and Residential Summer Discount Plan program events in this category because these program events did indeed occasionally exceed the day-ahead forecasts by a significant margin. These programs are discussed in greater detail elsewhere in this section and report. While considered to be mid-performing programs, they have many important issues that deserve attention. 10 Decision 08-04-050
Program events that were consistently below the forecast are considered to be poor-performing programs. All of the Critical Peak Pricing, Peak Time Rebate, Demand Response Contracts, Commercial Summer Discount Plan, and Agricultural Pumping Interruptible program events triggered during 2012 produced load reductions that were lower than forecasted.

Table 2: SCE's DR Overall Performance
(Daily Forecast and Ex Post are averaged MW over all events; Difference and % range from low to high. Negative values indicate ex post results below the daily forecast.)

Programs | No. of DR Events | Daily Forecast | Ex Post | Difference | %
Good Performance:
Capacity Bidding Program – Day Of | 14 | 12 | 16 | >2 | >17%
Base Interruptible Program | 1 | 514 | 573 | 59 | 12%
Mixed Performance:
Capacity Bidding Program – Day Ahead | 12 | 0.08 | 0.03 | −0.29 to 0.08 | −315% to 86%
Demand Bidding Program | 8 | 84 | 76 | −33 to 16 | −40% to 21%
Summer Discount Plan (AC Cycling) Res. | 23 | 280 | 184 | −603 to 92 | −100% to 58%
Poor Performance:
Critical Peak Pricing | 12 | 50 | 37 | <−5 | <−11%
Peak Time Rebate | 7 | 108 | 20 | <−11 | <−11%
Demand Response Contracts | 3 | 230 | 148 | <−70 | <−34%
Summer Discount Plan (AC Cycling) Com. | 2 | 5 | 3 | −2 | −35%
Agricultural Pumping Interruptible | 2 | 48 | 21 | <−19 | <−52%

SDG&E

Utilizing the same criteria used for evaluating SCE's DR programs, the Base Interruptible Program and the Critical Peak Pricing Program were categorized as good performers; the Capacity Bidding Day Ahead, Capacity Bidding Day Of, Demand Bidding, and Summer Saver (AC Cycling) programs were categorized as mixed performers; and the Critical Peak Pricing Emergency and residential Peak Time Rebate programs were categorized as poor performers. As stated above, DR program design and operation characteristics also need to be taken into account for a complete evaluation of DR program performance.
Table 3: SDG&E's DR Overall Performance
(Daily Forecast and Ex Post are averaged MW over all events; Difference and % range from low to high. Negative values indicate ex post results below the daily forecast.)

Programs | Number of Events | Daily Forecast | Ex Post | Difference | %
Good Performance:
Base Interruptible Program | 1 | 0.3 | 0.8 | 0.5 | 167%
Critical Peak Pricing | 7 | 15 | 18 | >2.4 | >3.1%
Mixed Performance:
Capacity Bidding Program – Day Ahead | 7 | 8 | 6 | −4.9 to 0.1 | −32% to 12.2%
Capacity Bidding Program – Day Of | 5 | 12 | 10 | −3.2 to 0.7 | −27.4% to 6.0%
Demand Bidding Program | 3 | 5 | 5 | −0.4 to 0.1 | −8.0% to 8.0%
Summer Saver (AC Cycling) | 8 | 20 | 17 | −12.3 to 3.5 | −64.0% to 38.7%
Poor Performance:
Peak Time Rebate Residential | 7 | 19 | 4 | <−24 | <−73.6%
Critical Peak Pricing – Emergency | 2 | 2 | 1 | <−0.7 | <−53.3%

B. Program Performance During Critical Event Days

The critical event days of August 10th, 13th, 14th, and September 14th were selected as a focus because they occurred on Flex Alert days, the service area system peak day, or the hottest days of the year. These are all conditions when demand response resources are most critical.

August 10, 2012

SCE

Two SCE programs were called on August 10th, a Flex Alert day: the Demand Bidding Program and the Save Power Day (also known as the Peak Time Rebate) program. The load reductions achieved during the Demand Bidding Program event surpassed the forecast by 12%, while the Save Power Day event was below the forecast by 11%.

Table 4: SCE's August 10, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B − A) | % Difference (D = C/A)
Demand Bidding Program | 85.59 | 95.82 | 10.23 | 11.95%
Save Power Day | 107.24 (see note 11) | 95.85 | −11.39 | −10.62%
Total | 192.83 | 191.67 | −1.16 |

11 SCE did not provide a daily forecast for this event, so the comparison for this event is done with the 7-day report rather than the daily forecast.
SDG&E

Three DR programs were called on August 10th. The Capacity Bidding Day Ahead program load reduction exceeded the forecast by 1%. Conversely, the Summer Saver and residential Peak Time Rebate ex post results fell below the forecast by 32% and 75%, respectively.

Table 5: SDG&E's August 10, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B − A) | % Difference (D = C/A)
Capacity Bidding Day Ahead | 7.50 | 7.60 | 0.10 | 1.33%
Summer Saver (AC Cycling) | 27.20 | 18.50 | −8.70 | −32.00%
Residential Peak Time Rebate | 12.60 | 3.20 | −9.40 | −74.60%
Total | 47.30 | 29.30 | −18.00 |

August 13, 2012

SCE

August 13, 2012 was the system peak day for the SCE service area, with a peak load of 22,428 MW. As shown in Table 6 below, the Critical Peak Pricing program, a dynamic pricing program for commercial and industrial customers over 200 kW, and the Day Of Capacity Bidding Program were triggered during this day. Again, the Capacity Bidding Program exceeded the forecast by a few MW. The Critical Peak Pricing program event had satisfactory performance, falling short of the forecast by 15%.

Table 6: SCE's August 13, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B − A) | % Difference (D = C/A)
Critical Peak Pricing | 50.54 | 42.96 | −7.58 | −15.00%
Capacity Bidding Program (Day Of) | 12.30 | 15.70 | 3.40 | 27.60%
Total | 62.84 | 58.66 | −4.18 |

SDG&E

All three DR programs that were triggered on August 13th (Capacity Bidding Day Of, Summer Saver (AC Cycling), and Critical Peak Pricing) had ex post load impacts that were below the daily forecast predictions by 27%, 45%, and 48%, respectively.
Table 7: SDG&E's August 13, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B − A) | % Difference (D = C/A)
Capacity Bidding – Day Of | 11.70 | 8.50 | −3.20 | −27.33%
Summer Saver (AC Cycling) | 33.30 | 21.40 | −11.90 | −45.35%
Critical Peak Pricing Emergency | 2.30 | 1.20 | −1.10 | −47.83%
Total | 47.30 | 31.10 | −16.20 |

August 14, 2012

SCE

August 14, 2012 was another Flex Alert day, during which seven events were called using a variety of DR programs. As shown in Table 8 below, all the events combined were forecast to reduce loads by 570 MW. However, the ex post load impact evaluations found that the actual load reductions were short of the total forecast by 155 MW. Sixty percent of the 155 MW shortfall is attributed to the Demand Response Contract program. The Agriculture Pumping Interruptible program event was short of the event forecast by 52%. Only the Capacity Bidding Program exceeded its forecasted load reduction, but this only made up 4% of the Demand Response Contract program forecast, and thus was insufficient to cover the overall event-day shortfall. It is worth noting that the Demand Response Contract and Capacity Bidding Programs have something in common: they are both commercial aggregator programs. The reason for the difference in performance between these programs requires further study.
It should be noted that SCE's Demand Response Contracts expired on December 31, 2012 and have since been replaced by new contracts that expire at the end of 2014.12

Table 8: SCE's August 14, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B − A) | % Difference (D = C/A)
Demand Response Contracts | 275.00 | 182.05 | −92.95 | −33.80%
Demand Bidding Program | 94.09 | 61.76 | −32.33 | −34.36%
Agriculture Pumping Interruptible | 36.00 | 17.29 | −18.72 | −51.99%
Summer Discount Plan (Res) Group 1 | 130.40 | 119.40 | −11.00 | −8.44%
Capacity Bidding Program (Day Of) | 12.30 | 17.82 | 5.52 | 44.86%
Summer Discount Plan (Res) Reliability | 17.42 | 13.50 | −3.92 | −22.49%
Summer Discount Plan (Com) | 4.77 | 3.10 | −1.67 | −35.04%
Total | 569.98 | 414.91 | −155.07 |

12 D.13-01-024 http://docs.cpuc.ca.gov/PublishedDocs/Published/G000/M046/K233/46233814.PDF
SDG&E

Four DR programs (Demand Bidding, Critical Peak Pricing, Capacity Bidding Day Ahead, and residential Peak Time Rebate) were called on August 14th. While the Demand Bidding and Capacity Bidding Program ex post load impacts closely matched the daily forecast, the Critical Peak Pricing and residential Peak Time Rebate did not. Since the Critical Peak Pricing and residential Peak Time Rebate programs are large-scale residential programs, it is possible that the difference between the forecast and ex post load impacts reflects widely varying customer behavior during DR events.

Table 9: SDG&E's August 14, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B − A) | % Difference (D = C/A)
Demand Bidding Program | 5.00 | 5.10 | 0.10 | 2.00%
Critical Peak Pricing | 14.30 | 25.90 | 11.60 | 81.12%
Capacity Bidding Program (Day Ahead) | 7.50 | 7.50 | 0.00 | 0.00%
Residential Peak Time Rebate | 12.50 | 1.10 | −11.40 | −91.20%
Total | 39.30 | 39.60 | 0.30 |

September 14, 2012

SCE

September 14, 2012 was the hottest day of the year in both the SCE and SDG&E service areas (see Table 10 below). Understandably, SCE triggered its Summer Discount Plan (residential AC Cycling) programs on this day. The Capacity Bidding Program was also triggered, with performance comparable to the other Capacity Bidding Program events on the critical days discussed above. The September 14 residential Summer Discount Plan events consisted of three separate customer groups sequentially triggered for one-hour events. All three one-hour events fell considerably short of the forecasted load reductions.
Table 10: SCE's September 14, 2012 Demand Response Events

Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B − A) | % Difference (D = C/A)
Summer Discount Plan (Residential) Groups 5 and 6 | 135.61 | 20.70 | −114.91 | −84.74%
Summer Discount Plan (Residential) Groups 1 and 2 | 110.89 | 37.80 | −73.09 | −65.91%
Capacity Bidding Program (Day Of) | 11.90 | 16.21 | 4.31 | 36.18%
Summer Discount Plan (Residential) Groups 3 and 4 | 99.32 | 17.80 | −81.52 | −82.08%
Total | 357.72 | 92.51 | −265.22 |
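The difference columns in the event tables above follow one convention: C = B − A (ex post minus daily forecast, in MW) and D = C/A (percent difference relative to the forecast). A minimal sketch in Python, using the Table 5 (SDG&E, August 10) figures as input:

```python
# Compute the MW difference (column C) and percent difference (column D)
# between a daily forecast (column A) and an ex post result (column B).

def forecast_error(daily_forecast_mw: float, ex_post_mw: float):
    diff_mw = ex_post_mw - daily_forecast_mw        # C = B - A
    pct_diff = 100.0 * diff_mw / daily_forecast_mw  # D = C / A
    return diff_mw, pct_diff

# Daily forecast (A) and ex post (B) values from Table 5.
events = {
    "Capacity Bidding Day Ahead": (7.50, 7.60),
    "Summer Saver (AC Cycling)": (27.20, 18.50),
    "Residential Peak Time Rebate": (12.60, 3.20),
}
results = {name: forecast_error(a, b) for name, (a, b) in events.items()}
```

A negative D means the program delivered less than forecast: Summer Saver comes out roughly 32% below and the residential Peak Time Rebate roughly 75% below, matching the table.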
SDG&E

On September 14, 2012, the peak temperature in SDG&E's service territory was 109 degrees. The Demand Bidding, Summer Saver, and Base Interruptible Program ex post load impacts were above the daily forecast, in a range between 8% and 167%. Since the absolute value of the Base Interruptible Program load impact is approximately 1 MW, a small increase or decrease in the daily forecast prediction can result in high variability in the percent difference between these two figures. Conversely, the Capacity Bidding Day Of and Day Ahead Program and Critical Peak Pricing Emergency Program ex post load impacts were below the daily forecast, in a range between 12% and 44%.

C. Detailed Program Analysis

The following section discusses programs and events that produced the load reductions forecast in the daily reports, as well as programs that failed to produce the forecasted load reductions. For this purpose, all programs and events that came within 10% (+/−) of the forecasted load reductions are considered to be consistent with the daily forecast, and all programs and events that were more than 50% above or below the forecasted load reductions are considered to have failed to produce the forecasted load reductions.

SCE

There were a total of 104 separate events in the SCE service area in 2012. Only ten of these events produced load reductions consistent with those forecast in the daily reports. As shown in Table 12 below, all of these events produced fairly sizable load reductions, ranging from 59 to 130 MW, with the exception of one Capacity Bidding Program event, which produced a very small load reduction.

Table 11: SDG&E's September 14, 2012 Demand Response Events
Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B − A) | % Difference (D = C/A)
Capacity Bidding Program (Day Of) | 9.00 | 5.70 | −3.30 | −36.67%
Capacity Bidding Program (Day Ahead) | 12.10 | 10.60 | −1.50 | −12.40%
Demand Bidding Program | 5.00 | 5.40 | 0.40 | 8.00%
Summer Saver (AC Cycling) | 15.50 | 22.50 | 7.00 | 45.16%
Base Interruptible Program | 0.30 | 0.80 | 0.50 | 166.70%
Critical Peak Pricing Emergency | 1.60 | 0.90 | −0.70 | −43.75%
Total | 43.50 | 45.90 | 2.40 |
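The screening rule used in this section (within ±10% of the forecast counts as consistent; more than 50% off in either direction counts as a failure to deliver the forecast) can be sketched as a small classifier. The function name and the "intermediate" label for events between the two thresholds are assumptions for illustration, not the report's terms:

```python
# Classify a DR event against its daily forecast using the staff
# screening thresholds: within +/-10% is "consistent"; more than 50%
# off in either direction is "failed"; anything in between is labeled
# "intermediate" here (an assumed label).

def classify_event(daily_forecast_mw: float, ex_post_mw: float) -> str:
    pct_diff = 100.0 * (ex_post_mw - daily_forecast_mw) / daily_forecast_mw
    if abs(pct_diff) <= 10.0:
        return "consistent"
    if abs(pct_diff) > 50.0:
        return "failed"
    return "intermediate"

# Examples from the tables above: SCE's 08/14 residential Summer Discount
# Plan event (130.40 MW forecast, 119.40 MW ex post) passes the screen,
# while the 09/14 Groups 5 and 6 event (135.61 vs. 20.70) fails it.
```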
Table 12: SCE's DR Events with Ex Post Results within 10% of the Daily Forecast

Program Name | Event Date | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B − A) | % Difference (D = C/A)
Summer Discount Plan (Residential) | 08/14/12 | 130.40 | 119.40 | −11.00 | −8.44%
Summer Discount Plan (Residential) | 08/29/12 | 82.56 | 80.30 | −2.26 | −2.74%
Summer Discount Plan (Residential) | 08/01/12 | 58.60 | 57.10 | −1.50 | −2.56%
Summer Discount Plan (Residential) | 08/15/12 | 77.77 | 77.50 | −0.27 | −0.35%
Demand Bidding Program | 10/17/12 | 79.05 | 79.25 | 0.20 | 0.26%
Demand Bidding Program | 10/01/12 | 78.75 | 79.78 | 1.03 | 1.31%
Summer Discount Plan (Residential) | 08/09/12 | 118.06 | 121.20 | 3.14 | 2.66%
Summer Discount Plan (Residential) | 08/28/12 | 83.86 | 88.20 | 4.34 | 5.18%
Capacity Bidding Program (Day Ahead) | 07/31/12 | 0.0700 | 0.0740 | 0.0040 | 5.71%
Demand Bidding Program | 08/08/12 | 85.59 | 92.95 | 7.36 | 8.60%

Of the 104 events in 2012, thirty (or about 29%) were more than 50% off of the day-ahead forecast. Five of these events produced load reductions greater than the forecast, while the remaining 25 were lower than the forecast. The three events with the highest percentage difference below the forecast were very small Day Ahead Capacity Bidding Program events, and thus are not considered the most critical problem. Twenty-one of the remaining events were Summer Discount Plan (AC Cycling) events, and these varied markedly from the forecast.
Table 13: SCE's DR Events with Ex Post Results Differing by More than ±50% from the Daily Forecast

Program Name                           Event Date   Daily Forecast MW (A)   Ex Post MW (B)   Difference MW (C=B-A)   % Difference (D=C/A)
Capacity Bidding Program (Day Ahead)   10/01/12       0.09                   -0.20             -0.29                  -315.22%
Capacity Bidding Program (Day Ahead)   10/02/12       0.09                   -0.10             -0.20                  -213.04%
Capacity Bidding Program (Day Ahead)   10/05/12       0.09                   -0.07             -0.16                  -170.65%
Save Power Days / Peak Time Rebates    09/07/12     108.66                  -23.11           -131.77                  -121.27%
Summer Discount Plan (Residential)     06/20/12     128.01                    0.50           -127.51                   -99.61%
Save Power Days / Peak Time Rebates    09/10/12     108.52                    1.65           -106.87                   -98.48%
Summer Discount Plan (Residential)     09/14/12     135.61                   20.70           -114.91                   -84.74%
Summer Discount Plan (Residential)     07/10/12     263.67                   44.70           -218.97                   -83.05%
Summer Discount Plan (Residential)     09/14/12      99.32                   17.80            -81.52                   -82.08%
Summer Discount Plan (Residential)     06/29/12     178.26                   33.30           -144.96                   -81.32%
Summer Discount Plan (Residential)     09/20/12      77.39                   14.60            -62.79                   -81.14%
Summer Discount Plan (Residential)     06/29/12     178.26                   35.80           -142.46                   -79.92%
Summer Discount Plan (Residential)     07/10/12     263.67                   66.60           -197.07                   -74.74%
Summer Discount Plan (Residential)     10/02/12     298.91                   86.20           -212.71                   -71.16%
Summer Discount Plan (Residential)     07/10/12     263.67                   76.70           -186.97                   -70.91%
Summer Discount Plan (Residential)     09/20/12      65.53                   21.10            -44.43                   -67.80%
Summer Discount Plan (Residential)     09/20/12      65.73                   21.90            -43.83                   -66.68%
Summer Discount Plan (Residential)     09/14/12     110.89                   37.80            -73.09                   -65.91%
Summer Discount Plan (Residential)     08/22/12     115.03                   42.40            -72.63                   -63.14%
Agriculture Pumping Interruptible      09/26/12      60.56                   24.00            -36.56                   -60.36%
Summer Discount Plan (Residential)     09/21/12     168.96                   69.10            -99.86                   -59.10%
Summer Discount Plan (Residential)     09/28/12      55.06                   24.50            -30.56                   -55.50%
Agriculture Pumping Interruptible      08/14/12      36.00                   17.29            -18.72                   -51.99%
Summer Discount Plan (Residential)     10/17/12     127.25                   62.30            -64.95                   -51.04%
Summer Discount Plan (Residential)     10/17/12     146.77                   72.30            -74.47                   -50.74%
Summer Discount Plan (Residential)     08/17/12     101.30                  153.00             51.70                    51.04%
Capacity Bidding Program (Day Ahead)   10/29/12       0.09                    0.15              0.06                    59.78%
Summer Discount Plan (Residential)     08/17/12      58.00                   98.30             40.30                    69.48%
Capacity Bidding Program (Day Ahead)   10/18/12       0.09                    0.17              0.08                    85.87%
Summer Discount Plan (Residential)     09/10/12      18.98                   68.40             49.42                   260.42%

Summer Discount Plan

The Summer Discount Plan event variability ranges from 121% below the forecast (with a load increase rather than a load reduction) to 260% above the forecast. Overall, the AC Cycling program shows the most variance13 of all the SCE DR programs: when the variances for individual events are aggregated, the AC Cycling program accounts for 49% of the total variance. The Pearson product-moment correlation between the daily forecast and the ex post load impacts is 0.21, a very weak positive correlation.

13 Variance in this context refers to the absolute difference between the daily forecast and the event-day ex post load reductions.
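The two statistics used above, the per-event absolute variance and the Pearson product-moment correlation, can be computed as in the following sketch. The forecast and ex post values here are illustrative placeholders, not the actual event data.

```python
# Illustrative computation of the two statistics used in this section:
# per-event absolute variance |forecast - ex post| and the Pearson
# product-moment correlation. Event values are placeholders, not SCE data.
import math

forecast = [128.0, 108.7, 263.7, 60.6, 19.0]   # daily forecast MW (illustrative)
ex_post  = [0.5, -23.1, 44.7, 24.0, 68.4]      # ex post MW (illustrative)

def pearson(xs, ys):
    """Pearson product-moment correlation of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

variance = [abs(f - e) for f, e in zip(forecast, ex_post)]
print(round(pearson(forecast, ex_post), 2))
print(round(sum(variance), 1))   # total absolute variance across events -> 564.3
```

A per-program share of the total absolute variance (the 49% figure cited above for AC Cycling) follows by summing `variance` over each program's events and dividing by the portfolio total.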
The Pearson correlation between the average event temperature14 and the event-level variance (the difference between the daily forecast and the event-day ex post load reductions) is 0.37, a moderately weak correlation. In everyday language, this means that SCE's 2012 Summer Discount Plan forecast method cannot be relied upon to predict actual program load reductions. In addition, there appears to be little relationship between the event-day temperature and the difference between the daily forecast and the event-day ex post load reductions, potentially ruling out temperature as an explanatory factor for the difference. The Summer Discount Plan was by far the most frequently triggered program in SCE's 2012 DR portfolio, with 23 separate events, including two early test events.15 Most of the 23 events were split into three customer segments such that each group of customers was triggered for only a portion (i.e., one hour) of each event, which typically lasted three hours. Three events, on 9/14, 9/20, and 9/28, deployed six customer segments. SCE operated the program in this manner to avoid cycling its customers' air conditioners for more than one hour at a time.16 The purpose of this strategy is to minimize the impact on customers, who lose one hour of AC service rather than multiple continuous hours, while in theory still allowing the utility to reduce load when needed. As shown in Table 14 below, however, this strategy produced a rebound effect: the groups curtailed in event hours 1 and 2 added load in hours 2 and 3 as their AC units ran at above-normal capacity to return the participants' buildings to the original temperature set points.17 The net effect was to dampen the average hourly load impact for the entire event period, as illustrated in Table 14.
It is possible that the daily forecasts were prepared assuming that all customers would be curtailed at the same time over the entire duration of the event. In that case, the average hourly load reductions would likely have been larger, because all customers would be simultaneously curtailed and the rebound effect would be delayed until after the event was over. This issue is further illustrated in Chapter 2, Section IV, "Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)."

Table 14: SCE's Hourly Load Impact (MW) from a Sept 14 Summer Discount Plan Event

                                        Event Hours w/ Rebound     Post-Event Rebound
Event Hour Ending:                        16      17      18         19      20
Group curtailed hour beginning 15:      39.6   -25.1   -17.0
Group curtailed hour beginning 16:              27.1   -27.0       -39.6
Group curtailed hour beginning 17:                      21.3       -49.6   -37.8
Hour Total:                             39.6     2.0   -22.7       -89.2   -37.8

Event hour average: 6.3 MW

14 SCE Final 2012 Ex Post Ex Ante Load Impacts for SCE's SDP, filed in R.07-01-041 on April 2, 2013.
15 The last two events in late October were not included in the ex post analysis.
16 SCE-01 Testimony at 11.
17 SCE Final 2012 Ex Post Ex Ante Load Impacts for SCE's SDP, filed in R.07-01-041 on April 2, 2013.
Another potential explanation for the suboptimal performance could be customers exercising the override option in their enrollment contracts with SCE. However, SCE's A.12-12-016 testimony18 indicates that the proportion of customers with an override option is fairly small (about 1% of the customers enrolled in SDP) and that these customers rarely exercise the option. Finally, it is possible that transitioning the Summer Discount Plan from an emergency program to a price-responsive program introduced additional uncertainties that are not adequately captured by the current forecasting methods. Regardless of the explanation for the unexpectedly low load reductions during these events, it is critical that SCE improve the day-ahead forecast for the SDP program as a whole. Energy Division staff reviewed SCE's method for forecasting the Summer Discount Plan program.19 The methodology, provided in Appendix C, is described in a 1986 internal SCE memorandum and consists of a simple algorithm that estimates the load reduction per ton of AC based on the forecasted temperature. The equation coefficients were determined by a 1985 load reduction study that SCE staff could not locate when asked to do so by Energy Division staff. Without the 1985 load reduction study, Energy Division staff could not fully evaluate the forecasting methodology. SCE did provide a revised algorithm that modifies the equation structure, but the underlying methods for estimating its coefficients remain unexplained. This evidence suggests that there is a critical flaw in the way Summer Discount Plan events are forecasted, in the operation of the program, or both. The lack of a reliable day-ahead forecasting method is a major weakness that undermines the ability to fully consider AC Cycling in CAISO grid operations.
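The general form of such a temperature-driven algorithm can be sketched as follows. The coefficients, enrolled tonnage, and cycling share are hypothetical placeholders, not values from the 1986 memorandum, whose coefficients could not be verified against the missing 1985 study.

```python
# Hypothetical sketch of a temperature-driven AC-cycling forecast of the
# general form described for the 1986 memo: load reduction per ton of AC
# as a function of forecast temperature. All numbers are placeholders,
# NOT SCE's actual coefficients or enrollment figures.

A, B = -2.10, 0.035   # placeholder intercept and slope, kW per ton vs. deg F

def kw_per_ton(temp_f):
    """Estimated load drop per ton of cycled AC at a given temperature."""
    return max(0.0, A + B * temp_f)   # no impact below the breakeven temperature

def sdp_forecast_mw(temp_f, enrolled_tons, cycling_fraction=0.5):
    """Program-level forecast: per-ton impact x tonnage x cycling share."""
    return kw_per_ton(temp_f) * enrolled_tons * cycling_fraction / 1000.0

# Example: 95 deg F forecast day, 600,000 enrolled tons, 50% cycling
print(round(sdp_forecast_mw(95.0, 600_000), 1))
```

A model of this shape makes the weakness discussed above concrete: a single linear temperature term cannot capture staggered dispatch, rebound, or customer behavior, which is one reason staff recommends piloting and vetting alternative methods.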
Even if the utilities' DR resources are eventually bid into the CAISO market (they currently are not), ED recommends that SCE immediately document the forecasting methods to be used for the 2013 season and thoroughly vet those methods with CPUC and CAISO staff and relevant stakeholders to ensure they are reasonable and reliable. Throughout the 2013 summer season (and longer if necessary), SCE should consider piloting more than one forecasting method, tested with small ex post load impact evaluations, to identify the most reliable approach.

Base Interruptible Program

The Base Interruptible Program was triggered only once during the entire 2012 season, and that was a test event. This single event produced 573 MW of load reductions on September 26, which was 59 MW more than the day-ahead forecast. Notably, this single Base Interruptible event delivered more than three times the load reduction of any other SCE program event during 2012, and it was not triggered on one of the critical event days discussed earlier in this section. The Commission should explore a policy requiring more frequent deployment of this program, since it appears to have significant yet underutilized potential.

18 SCE-01 Testimony at 11, Lines 3-5.
19 See Appendix S.
Capacity Bidding Program

The Capacity Bidding Program Day Ahead events produced an average load reduction of 0.03 MW across all events. With the exception of three October events (associated with negative load reductions in the ex post analysis), most events produced the relatively small load reductions forecast in the daily report. None of the Capacity Bidding Program Day Ahead events occurred in August or September, when load reductions are typically most needed. By comparison, all of SCE's Capacity Bidding Program Day Of events exceeded the forecasted load reductions, by an average of 32%. The average load reduction for the Capacity Bidding Program Day Of events was 15.9 MW, over 500 times the load reductions produced by the Day Ahead events. This evidence suggests that, unlike the Day Of program, the Day Ahead Capacity Bidding Program may not be serving a useful function in SCE's DR portfolio.

Demand Bidding Program

The Demand Bidding contracts were called on eight occasions during the summer of 2012; five of these events occurred in August. The first two August events, on August 8 and August 10, produced load reductions that exceeded the daily forecast by an average of 10%. The third and fourth events, on August 14 and August 16, fell 34% short of the forecasted load reductions, and the fifth event, on August 29, was 40% below forecast, suggesting that a decline in customer participation over successive events could be explored as a potential factor in the diminishing returns.

Demand Response Contracts (DRC) - Nominated

Somewhat surprisingly, Demand Response Contracts were called for only two events. The ex post load reductions for both events were roughly 35% below the daily forecast. Energy Division was not able to examine why this program performed so poorly. As noted earlier, SCE's DRCs expired on December 31, 2012, and have since been replaced by new contracts approved by the Commission.
Save Power Days / Peak Time Rebates (PTR) - Price Responsive

SCE did not provide daily forecasts for the four PTR events that occurred in August, so comparisons between the daily forecast and ex post results are possible only for the two events on September 7 and September 10. Both September events were forecasted to reduce loads by roughly 109 MW. Ex post results, however, indicate that the PTR events had no impact at all; in fact, the September 7 event was correlated with a fairly significant load increase of 23.11 MW. Ex post load reductions were estimated for the four August PTR events, for which SCE did not provide day-ahead estimates; the 7-day reports were used as a proxy for the daily forecast. As shown in Table 15 below, the estimated load reductions were between 107 and 109 MW, while the ex post load reductions ranged from 0.02 to 96 MW.
Table 15: SCE's Peak Time Rebate MW

Event Day     7 Day Report   Ex Post
8/10/2012     107.24 MW      95.85 MW
8/16/2012     107.61 MW      24.43 MW
8/29/2012     108.51 MW      21.93 MW
8/31/2012     108.73 MW       0.02 MW

Given the considerable variability in ex post results for the PTR program events, the day-ahead forecasting and event reporting will need significant revision to account for these discrepancies. If the PTR program is to continue, staff recommends that SCE prepare a proposal for a viable forecast and submit it for staff review.

SDG&E

A total of 46 DR program events were triggered on 14 event days in SDG&E's service area from June 2012 through October 2012. Daily forecasts for twelve DR program events were within ±10% of ex post load impacts. As depicted in Table 16, these events produced moderate load reductions ranging from roughly 5 to 17 MW. Three programs delivered accurate results with a moderate degree of consistency: the Demand Bidding Program, Critical Peak Pricing, and the Capacity Bidding Program Day Of.
Table 16: SDG&E's DR Events with Ex Post Results within ±10% of the Daily Forecast

Program Name                           Event Date   Daily Forecast MW   Ex Post MW   Difference MW   % Difference
Demand Bidding Program                 10/2/2012      5.0                 4.6          -0.4            -8.00%
Capacity Bidding Program (Day Of)      8/8/2012      11.7                11.0          -0.7            -5.98%
Capacity Bidding Program (Day Ahead)   8/9/2012       7.5                 7.5           0.0             0.00%
Capacity Bidding Program (Day Ahead)   8/14/2012      7.5                 7.5           0.0             0.00%
Capacity Bidding Program (Day Ahead)   8/10/2012      7.5                 7.6           0.1             1.33%
Demand Bidding Program                 8/14/2012      5.0                 5.1           0.1             2.00%
Summer Saver (AC Cycling)              9/15/2012      8.6                 8.8           0.2             2.33%
Critical Peak Pricing                  10/2/2012     16.0                16.5           0.5             3.13%
Critical Peak Pricing                  8/21/2012     16.5                17.2           0.7             4.24%
Critical Peak Pricing                  9/15/2012     13.7                14.5           0.8             5.84%
Demand Bidding Program                 9/14/2012      5.0                 5.4           0.4             8.00%
Critical Peak Pricing                  8/30/2012     16.2                17.8           1.6             9.88%

A total of 19 DR program events had ex post load impacts that deviated by more than ±50% from the daily forecasts, as depicted in Table 17. In particular, the residential and commercial Peak Time Rebate program ex post load impacts deviated from the daily forecasts by more than 70%. According to SDG&E, the commercial Peak Time Rebate ex post load impacts were deemed not statistically significant; on this basis, SDG&E reported zero load impacts for this program.
Table 17: SDG&E's DR Events with Ex Post Results Differing by More than ±50% from the Daily Forecast

Program Name                           Event Date   Daily Forecast MW (A)   Ex Post MW (B)   Difference MW (C=B-A)   % Difference (D=C/A)
Commercial Peak Time Rebate            8/9/2012       1.2                    0.0              -1.2                   -100.00%
Commercial Peak Time Rebate            8/10/2012      1.1                    0.0              -1.1                   -100.00%
Commercial Peak Time Rebate            8/11/2012      0.8                    0.0              -0.8                   -100.00%
Commercial Peak Time Rebate            8/14/2012      1.2                    0.0              -1.2                   -100.00%
Commercial Peak Time Rebate            8/21/2012      1.2                    0.0              -1.2                   -100.00%
Commercial Peak Time Rebate            9/15/2012      0.9                    0.0              -0.9                   -100.00%
Residential Peak Time Rebate           8/14/2012     12.5                    1.1             -11.4                    -91.20%
Residential Peak Time Rebate           8/21/2012     25.0                    3.0             -22.0                    -88.00%
Residential Peak Time Rebate           8/11/2012     12.2                    1.7             -10.5                    -86.07%
Residential Peak Time Rebate           8/9/2012      13.1                    3.3              -9.8                    -74.81%
Residential Peak Time Rebate           8/10/2012     12.6                    3.2              -9.4                    -74.60%
Residential Peak Time Rebate           9/15/2012     32.3                    8.3             -24.0                    -74.30%
Residential Peak Time Rebate           7/20/2012     23.9                    6.3             -17.6                    -73.64%
Capacity Bidding Program (Day Ahead)   10/1/2012      9.0                    4.1              -4.9                    -54.44%
Capacity Bidding Program (Day Ahead)   10/2/2012      9.0                    4.2              -4.8                    -53.33%
Summer Saver (AC Cycling)              9/14/2012     15.5                   22.5               7.0                     45.16%
Critical Peak Pricing                  8/11/2012     11.7                   18.4               6.7                     57.26%
Critical Peak Pricing                  8/14/2012     14.3                   25.9              11.6                     81.12%
Base Interruptible Program             9/14/2012      0.3                    0.8               0.5                    166.67%

Capacity Bidding Program Day Ahead (CBP DA)

The percent difference between the CBP DA daily forecasts and ex post results ranged from -32% to 12% (Table 3). Based upon this assessment, the daily forecasts for CBP DA were not accurate or consistent predictors of ex post results. Since the CBP DA daily forecast model does not include a variable that accounts for weather, and the ex post models do, this methodological difference could account for the variability between the two load impact measures. Another factor that could affect this difference is the percent load impact per customer.
Although customers submit load impact bids prior to each DR event, the actual load reduction on the event day may not match the projected load reduction. If weather affects event-day load reductions by CBP customers, adding a weather variable to the daily forecast model could increase its accuracy. To address uncertainty in the percent load reduction per CBP customer, DR test events could be scheduled to measure this value on event-like days.

Capacity Bidding Program Day Of (CBP DO)

Similar to the CBP DA program, the CBP DO daily forecasts were neither accurate nor consistent predictors of ex post results, based upon the range of the difference between the two load impact measures, -27.4% to 6.0% (Table 2). As stated above, inclusion of a weather variable in the
daily forecast model and measurement of the percent load reduction per customer during test events could increase the accuracy and consistency of the daily forecast model in predicting ex post load impacts.

Demand Bidding Program (DBP)

The percent difference between the DBP daily forecasts and ex post load impacts ranged from -8.0% to 8.0% (Table 3) for the three DBP events called during the summer. Based upon this result, the DBP daily forecast accurately and consistently predicted ex post load impacts. One caveat in making a general assessment of the DBP forecast model is that only one customer provided load reduction bids for the summer DR events; a more general assessment would require forecast and load impact data from at least 5-10 event days.

Commercial Peak Time Rebate

SDG&E reported zero ex post load impacts for this program in its March 4th filing. According to SDG&E, the zero values do not imply that no load reduction occurred, but rather that the load impacts were not statistically significant.20 Therefore, a comparison of daily forecasts and ex post load impacts could not be performed. Based upon conversations with SDG&E, the lack of effectiveness of the commercial Peak Time Rebate program could be attributed to a weak price signal and inelastic customer demand during event periods. SDG&E would be advised to discontinue the commercial Peak Time Rebate program.

Residential Peak Time Rebate

The percent difference between daily forecast and ex post load impacts ranged from -91.2% to -73.6% (Table 3). This implies that the residential Peak Time Rebate daily forecast is not an accurate predictor of ex post load impacts. However, the daily forecast consistently over-predicted the ex post results.
Since the ex post methodology modeled load impacts only for customers who signed up to receive e-mail or text alerts, while the daily forecast model makes no such distinction, it is possible that the accuracy of the daily forecast model could improve if there were parity between the two methodologies. Including only residential Peak Time Rebate opt-in customers in the daily forecast model may resolve the discrepancy. As an alternative, since the daily forecast consistently over-predicted the ex post results, SDG&E might consider derating its daily forecasts by a factor of 0.7 to 0.9 to better match ex post load impacts.

Summer Saver (AC Cycling)

The range of the percent difference between daily forecast and ex post load impacts, -64.0% to 38.7% (Table 3), indicates that the daily forecast is not an accurate or consistent predictor of ex post load impacts.

20 SCE-03 at 21.
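Returning to the residential PTR derating idea above: the adjustment can be framed as choosing a single multiplier that best reconciles past daily forecasts with ex post results. The sketch below uses the residential PTR rows from Table 17 and the 0.7 to 0.9 range suggested in the text; the grid-search framing is ours, not SDG&E's.

```python
# Sketch of the suggested derating adjustment for the residential PTR daily
# forecast: scan candidate multipliers and keep the one with the smallest
# total absolute error against past events (data from Table 17).

forecasts = [12.5, 25.0, 12.2, 13.1, 12.6, 32.3, 23.9]  # daily forecast MW
ex_post   = [1.1, 3.0, 1.7, 3.3, 3.2, 8.3, 6.3]         # ex post MW

def total_abs_error(factor):
    """Total |derated forecast - ex post| across past events."""
    return sum(abs(f * factor - e) for f, e in zip(forecasts, ex_post))

candidates = [0.70, 0.75, 0.80, 0.85, 0.90]   # range suggested in the text
best = min(candidates, key=total_abs_error)
print(best)   # -> 0.7
```

Notably, over these events even the bottom of the 0.7 to 0.9 range leaves substantial error, since the Table 17 ex post impacts ran at roughly a quarter or less of forecast; widening the candidate range would be a natural extension of this check.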
It should be noted that both the residential and commercial Summer Saver ex post methodologies (a randomized experiment and a panel vs. customer regression, respectively) differed from prior years due to the availability of smart meter data.21 This could account for the difference between daily forecast and ex post results. In addition, both ex post methodologies utilized control and treatment groups, whereas the daily forecast methodologies did not. Based on this assessment, it would be advisable to examine how the daily forecast and ex post models could be harmonized. Based upon a conversation with SDG&E, a temperature-squared variable is utilized in the daily forecast model. Compared to SCE's current AC cycling daily forecast model, SDG&E's daily forecast model therefore includes an additional measure of accuracy. However, to better predict customer behavior on successive event days or during prolonged event hours, SDG&E might consider including an autocorrelation variable in the daily forecast model.

Critical Peak Pricing

The percent difference between the daily forecast and ex post results ranged from 3.1% to 81.1%. This is the only program where the ex post results consistently outperformed the daily forecast predictions. According to SDG&E, the percent load impacts for the Critical Peak Pricing program in 2012 were lower than in 2011, which led to an underestimation in the daily forecast.22 Critical Peak Pricing has approximately 1,000 customers and, as SDG&E claims, any variation in the percent load reduction per customer could lead to high variation in the aggregate impact estimates. This would also be the case for large-scale residential DR programs, including Peak Time Rebate and Summer Saver (AC Cycling). SDG&E also claims that measurement error might account for differences between load impact category values. However, no explanation is provided to elucidate how the measurement error occurred (e.g.,
since Smart Meters were not fully deployed in SDG&E's territory during summer 2012, measured load reductions obtained from analog meters were not accurate).

Base Interruptible Program

The percent difference between the daily forecast and ex post load impact for the Base Interruptible Program was 166.7%. Because two large Base Interruptible Program customers dropped out of the program, SDG&E was not able to accurately forecast the load impact from the remaining customers. Further analysis with additional Base Interruptible Program load impact data might shed light on the accuracy of the daily forecasting methods.

21 SDG&E load impact Filing Executive Summary, April 2, 2012, at 31.
22 SGE-03 at 19.
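Several of the fixes suggested in this section amount to adding weather terms to the daily forecast regression: a weather variable for CBP, and a temperature-squared term of the kind SDG&E already uses for Summer Saver. A minimal sketch of such a model, using synthetic data rather than SDG&E's and centering temperatures for numerical stability:

```python
# Sketch of a weather-aware daily-forecast regression (temperature and
# temperature-squared terms), as recommended in this section. Event data
# are synthetic placeholders, not SDG&E's.

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for p in range(k):                           # Gaussian elimination w/ pivoting
        piv = max(range(p, k), key=lambda r: abs(A[r][p]))
        A[p], A[piv] = A[piv], A[p]
        b[p], b[piv] = b[piv], b[p]
        for r in range(p + 1, k):
            f = A[r][p] / A[p][p]
            for c in range(p, k):
                A[r][c] -= f * A[p][c]
            b[r] -= f * b[p]
    coef = [0.0] * k
    for p in reversed(range(k)):                 # back substitution
        coef[p] = (b[p] - sum(A[p][c] * coef[c] for c in range(p + 1, k))) / A[p][p]
    return coef

temps = [78.0, 85.0, 90.0, 95.0, 101.0]          # event-day highs, deg F (synthetic)
mw    = [3.9, 6.2, 8.4, 11.1, 14.9]              # ex post load reductions (synthetic)

def features(t):
    dt = t - 90.0                                # centered temperature
    return [1.0, dt, dt * dt]                    # intercept, T, T^2 terms

b0, b1, b2 = ols([features(t) for t in temps], mw)

def forecast(temp_f):
    x = features(temp_f)
    return b0 * x[0] + b1 * x[1] + b2 * x[2]
```

Event-day weather joins the model as just another regressor, so the same scaffold accommodates the autocorrelation term suggested for Summer Saver by appending the prior event day's impact to `features`.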
Critical Peak Pricing - Emergency

Due to decreasing customer subscription to this tariff, the CPP-E program was discontinued as of December 31, 2012.23

D. Summary of Recommendations

Given the divergence between the daily forecast estimates and ex post load impact results, staff makes the following recommendations:

1. The daily forecasting methods for all programs must be improved; they should be better documented and should be developed with relevant agencies and stakeholders.
2. SCE should test a number of different forecasting methods for the Summer Discount Plan program.
3. SCE should change the Summer Discount Plan strategy of calling groups of customers for sequential one-hour cycling events.
4. SDG&E should include only opt-in customers in the residential PTR daily forecast model.
5. SDG&E should run a test event to improve CPP daily forecast estimates.
6. SDG&E should account for customer behavior during successive event days in the Summer Saver daily forecast model.
7. SDG&E should include a weather variable in the CBP forecast model.

IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)

A. Summary of the Staff Analysis and Recommendations

Comparing the 2012 ex post results with the 2012 RA forecast is not an accurate method of determining how the DR programs performed. The RA load forecast represents the maximum capacity DR can provide under a set of conditions defined for resource planning needs, while ex post load impacts reflect the demand reductions obtained during actual events in response to operational planning needs. Resource planning and operational planning differ in both their conditions (i.e., event hours, participation, and temperature) and their purposes. Even so, in summer 2012 the Utilities' DR programs were not utilized to their full capacity, even under extremely hot weather conditions.
This raises the question of the usefulness of the current RA forecast, and whether the RA forecast should be changed to reflect conditions representing operational needs, including the utilities' day-to-day resource availability limitations and DR dispatch strategies for optimal customer experience. A working group consisting of the CPUC, CEC, CAISO, and the IOUs should be assembled to address the forecast needs (i.e., resource planning, operational planning) and the input assumptions (i.e., growth rate, dropout rate) used for forecasting RA.

23 SDG&E load impact Filing Executive Summary, April 2, at 61.
B. Background

The 2012 RA forecast represents the maximum capacity DR can provide under a set of conditions defined for resource planning needs. These conditions entail a 1-in-2 weather year,24 portfolio-level accounting, full participation, a five-hour event window (1 p.m. to 6 p.m.), and an enrollment forecast assumption. The 2012 ex post load impacts reflect the demand reductions obtained during actual events in response to operational needs. Operational needs on an event day may not require the full capacity of DR because conditions do not warrant it. Utilities have the discretion to call a few DR programs with shorter event hours or smaller groups of participants based on their generation and DR resource dispatch strategies.25 This means an ex post impact may reflect a one-hour event window versus the five-hour window of the RA forecast; it may reflect only a segment of a program's participants versus the full set of participants assumed in the RA forecast; and it may reflect a lower temperature than the 1-in-2 weather year condition assumed in the RA forecast.

C. Staff Analysis

Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate measure of how well a program performed against its forecast. The tables below contain the August monthly average load impacts for the 2012 Resource Adequacy (RA) forecast as filed in the spring of 2011 and the ex post results that occurred in 2012. There are stark differences between what the Utilities forecasted a year ahead (RA) and the actual results (ex post). On average for the month of August, the variability ranges from 485% over-performance to 95% under-performance for SCE, and from 58% over-performance to 97% under-performance for SDG&E.
The main reason for the discrepancy is that the RA data are used to assist in resource planning, and are therefore characterized as a five-hour event in which all customers are called for the entire period (1-6 p.m.) throughout the summer. Ex post results, by contrast, reflect actual DR operations, which can be a one-hour event in which only some customers are called for a short period of time. Other factors that contributed to the discrepancy include temperature, enrollment, and dual participation.

24 Represents the monthly peak day temperature for an average year. Exhibit SGE-03, Page 14.
25 SGE-06, Page 6.
Table 18: SCE Demand Response Load Impact, 2012 Resource Adequacy vs. 2012 Ex Post, August Average (MW)

Program Name                                       RA Forecast26 (A)   Ex Post27 (B)   Difference (C=B-A)   % Difference (D=C/A)
Demand Bidding Program                                12                  72               60                   485%
Demand Response Contracts                            105                 182               77                    74%
Base Interruptible Program28                         548                 573               25                     5%
Capacity Bidding Program Day Of                       19                  17               -2                   -11%
Summer Advantage Incentive/Critical Peak Pricing      69                  39              -30                   -44%
Agricultural Pumping Interruptible                    40                  17              -22                   -57%
Summer Discount Plan/AC Cycling - Residential        500                 212             -288                   -58%
Save Power Days / Peak Time Rebates                  266                  36             -230                   -87%
Capacity Bidding Program Day Ahead29                   1                   0               -1                   -94%
Summer Discount Plan/AC Cycling - Commercial          62                   3              -59                   -95%

Table 19: SDG&E Demand Response Load Impact, 2012 Resource Adequacy vs. 2012 Ex Post, August Average (MW)

Program Name                                       RA Forecast30 (A)   Ex Post31 (B)   Difference (C=B-A)   % Difference (D=C/A)
Critical Peak Pricing Default                         12                  19                7                    58%
Summer Saver/AC Cycling                               15                  19                4                    27%
Capacity Bidding Program Day Ahead                    10                   8               -2                   -20%
Capacity Bidding Program Day Of                       22                  10              -12                   -55%
Base Interruptible Program32                          11                   0.84           -10.16                -92%
Reduce Your Use / Peak Time Rebates                   69                   2              -67                   -97%
Demand Bidding Program33                             n/a                   5              n/a                   n/a
Critical Peak Pricing Emergency                      n/a                   1              n/a                   n/a

26 Exhibit SCE-03, Table 1.
27 Exhibit SCE-03, Table 1.
28 Number based on the September average because there were no events in August.
29 Number based on the July average because there were no events in August or September.
30 Exhibit SDG-03, Table 1.
31 Exhibit SDG-03, Table 1.
32 Number based on the September average because there were no events in August.
33 DBP was not approved until the year after the 2012 RA forecast was filed.
Forecasting DR estimates for resource planning needs differs from forecasting for operational needs. Unlike resource planning, operational needs on an event day may not require the full capacity of DR, either because conditions do not warrant it or because the Utilities deployed "optimal" dispatch strategies for customer experience. Utilities have the discretion to call shorter event hours or smaller groups of participants if the system is adequately resourced for that day. As discussed in Chapter 3, peaker or other generation resources may have been dispatched instead of DR, even though such operation would be contrary to the Loading Order.34 For example, SCE can divide its residential Summer Discount Plan participants into three groups and dispatch each group for one hour of an event, resulting in three consecutive one-hour events (see Figure 1 below). Approximately one-third of the customers can be curtailed in any given hour. Rebound from the groups curtailed in event hours 1 and 2 can reduce the net impact in hours 2 and 3, lowering the average hourly impact for the entire event period. As a result, the average impact per hour can be roughly 100 MW for operational needs. The following figures illustrate the rebound effects of SCE's sub-group dispatch strategy for its AC cycling.

Figure 1
Source: SCE April 11, 2013 PowerPoint Presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante Briefing

34 http://www.cpuc.ca.gov/NR/rdonlyres/58ADCD6A-7FE6-4B32-8C70-7C85CB31EBE7/0/2008_EAP_UPDATE.PDF.
However, for the RA forecast, resource planning needs require the full capacity of DR. For example, SCE assumed all residential Summer Discount Plan participants would be curtailed at the same time, representing the full program capability in a reliability event (see Figure 2 below). The hourly impacts can be larger because all customers are curtailed at once and the rebound effect is delayed until the end of the entire event window. As a result, the average impact per hour for the RA forecast can be roughly 300 MW, roughly three times the ex post hourly impact.

Figure 2
Source: SCE April 11, 2013 PowerPoint Presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante Briefing

The opposite extreme can also occur, where the ex post result is higher than the RA forecast. In the case of SCE's Demand Bidding Program, the average ex post result is 72 MW, six times the RA forecast of 12 MW (see Table 18). Dual participation was the major contributor to this discrepancy. For customers enrolled in two programs, such as the Base Interruptible Program and the Demand Bidding Program, the RA forecast counts the MW in only one program (the Base Interruptible Program) to avoid double counting.35 Had the two programs been called on the same day, the ex post results would have shown a much lower amount for the Demand Bidding Program.

35 Portfolio level.
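The rebound arithmetic behind Table 14 and the figures above can be reproduced with a short script. The staggered-dispatch numbers are the per-group MW values from Table 14; the simultaneous-dispatch figure is a simplification that assumes each group's one-hour reduction is unchanged when all groups are called together, and is not SCE's RA method.

```python
# Reproduces the staggered-dispatch rebound arithmetic of Table 14, and
# contrasts it with a simplified simultaneous dispatch (an assumption,
# not SCE's RA methodology).

# Per-group hourly profile relative to the group's curtailment hour:
# [reduction while curtailed, rebound hour 1, rebound hour 2] (MW, Table 14)
GROUPS = [
    [39.6, -25.1, -17.0],  # group curtailed in event hour 1
    [27.1, -27.0, -39.6],  # group curtailed in event hour 2
    [21.3, -49.6, -37.8],  # group curtailed in event hour 3
]

def staggered_hourly_totals(groups):
    """Net hourly impact when group g starts in hour g (5 hours total)."""
    totals = [0.0] * (len(groups) + 2)
    for g, profile in enumerate(groups):
        for h, mw in enumerate(profile):
            totals[g + h] += mw
    return totals

totals = staggered_hourly_totals(GROUPS)
event_avg = sum(totals[:3]) / 3           # average over the 3 event hours
simultaneous = sum(p[0] for p in GROUPS)  # all groups curtailed at once

print([round(t, 1) for t in totals])      # [39.6, 2.0, -22.7, -89.2, -37.8]
print(round(event_avg, 1))                # 6.3 (matches Table 14)
print(round(simultaneous, 1))             # 88.0 in the single event hour
```

Even in this simplified form, simultaneous dispatch concentrates the full reduction in the event window and pushes the rebound outside it, which is the mechanism behind the roughly 100 MW vs. 300 MW contrast described above.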
September 14, 2012 was considered a hot day (a 1-in-10 weather year condition36), yet SCE still did not dispatch all of its residential Summer Discount Plan participants. Instead, SCE dispatched only a portion of its participants for one hour at a time, resulting in five consecutive one-hour events. On average, SCE received only 6.3 MW37 for the event, a huge underperformance compared to the RA forecast of 519 MW.38 This raises the question: if SCE chose not to dispatch all of its Summer Discount Plan participants in the same event hour during a 1-in-10 weather year condition, under what circumstances will SCE dispatch the Summer Discount Plan to its full program capacity? The usefulness of the RA forecast is in question if the utility does not test a DR program to its full capacity. Should the RA forecast process be amended to include another ex ante forecast based on operational needs, including optimal customer experience, and if so, what would that entail?

D. Conclusion and Recommendations

Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate method of determining DR program performance, because the ex post results respond to operational needs, which can be entirely different from resource planning needs. Moreover, in 2012 the RA forecast was not tested to its full capacity. This raises the question of whether the RA forecast should be changed to reflect both planning needs and operational needs. A working group consisting of the CPUC, CEC, CAISO, and the IOUs should be assembled to address the forecast needs (i.e., resource planning, operational planning) and the input assumptions (i.e., growth rate, dropout rate) used for forecasting DR estimates. This working group should meet annually in December/January to agree on the set of input assumptions.

36 Represents the monthly peak temperatures for the highest year out of a 10-year span.
Exhibit SGE 03, Page 14.
37 Christensen Associates Energy Consulting, 2012 Load Impact Evaluation of Southern California Edison's Residential Summer Discount Plan (SDP) Program, April 1, 2013, Table 4-3d.
38 Exhibit SCE-03, Table 1, 2012 RA for the month of September.
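The performance gap discussed in this chapter reduces to a simple delivery ratio: ex post load drop divided by the RA forecast. A minimal sketch using the September 14 figures cited above (the function name is illustrative, not part of any Commission methodology):

```python
def delivery_ratio(ex_post_mw: float, ra_forecast_mw: float) -> float:
    """Fraction of the RA-forecast capacity actually delivered during an event."""
    return ex_post_mw / ra_forecast_mw

# September 14, 2012 Summer Discount Plan event (figures from this report)
ratio = delivery_ratio(6.3, 519.0)
print(f"{ratio:.1%} of the RA forecast was delivered")  # 1.2%
```

At roughly 1.2% of forecast, the event illustrates why an untested RA forecast is of questionable planning value.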
Chapter 3: Demand Response Program Operations

I. Summary of Staff Analysis and Recommendations

The 2006-2011 data show that the Utilities historically triggered their DR programs far below the program limits in terms of number of events and hours. Even with the SONGS outage, the Utilities did not trigger their DR programs in summer 2012 as frequently as anticipated. Almost all of the Utilities' 2012 DR program events and hours fell within the historical averages or below the historical maximums. However, staff was surprised to find that the Utilities dispatched their peaker power plants (peaker plants) three to four times more frequently in 2012 than the historical averages. The peaker plant service hours were closer to the plants' emission allowances than the DR event hours were to the program limits. Staff observed a trend in which some DR program events decreased from 2006 to 2012 while peaker service hours increased over the same period. This trend raises a concern that the Utilities under-utilized DR programs and over-relied on peaker plants. Under the "Loading Order," DR is a preferred resource and is intended to avoid the building and dispatching of peaker plants. Due to time constraints and a lack of additional information, staff was unable to fully address this question and the reasons behind these trends in this report. Therefore, staff recommends that, in future DR program measurement and evaluation studies, the Commission evaluate DR program operations and designs in comparison with peaker plant operations to ensure the Utilities' compliance with the Loading Order. Specifically, staff recommends that the Commission:

1. Require the Utilities to provide both DR event and peaker plant data, along with explanations for the disparity between historical DR event hours and peaker plant service hours, in future DR evaluations and the next DR budget applications.
The Utilities should include the DR and peaker plant hourly data and explain why they did not trigger DR programs during any of the hours when the peaker plants were dispatched. This information will inform future DR program designs and improve DR usefulness.

2. Require that DR historical operations be reflected in the input assumptions for the ex ante forecast and in the evaluation of program cost effectiveness.

3. Address the Loading Order policy, in terms of DR planning and operation and the utilization of peaker plants, in the next DR rulemaking and the Utilities' energy cost recovery proceedings.

II. 2012 DR Program Trigger Criteria and Event Triggers

Appendices H and I summarize the Utilities' 2012 DR program trigger criteria and event triggers. The DR program trigger criteria consist of a list of conditions that are largely self-explanatory given the type of program; e.g., Emergency Program triggers are based on system contingencies, while non-Emergency Program triggers also include high temperature,
heat rate (economic), and resource limitations. The 2012 event triggers were the actual conditions that led to the Utilities' decisions to call DR events. While the DR trigger criteria provide a general idea of how DR programs are triggered, there is a lack of transparent information on the Utilities' DR operations, e.g., when and how the Utilities made decisions to trigger a DR program. It is necessary to evaluate DR performance not only from the load impact perspective but also from the operations perspective to determine DR reliability and usefulness as a resource. Staff analyzed the 2006-2012 DR event data and gained some understanding of how the Utilities have utilized DR programs and how useful the programs have been.

III. DR Events vs. Peaker Plant Service Hours

How do the numbers compare to the 2012 limits and to history? As shown in Appendices J and K, SCE has a few DR programs with an unlimited number of events or hours: Demand Bidding Program, Save Power Days (Peak Time Rebate), and Summer Discount Plan - Commercial (Enhanced). Others have various event/hour limits, ranging from 24 hours/month to 180 hours/year or 15 events/year.39 Among the DR programs with an event limit, most did not attain the maximum number of events and/or hours, except for SCE's Summer Advantage Incentive (Critical Peak Pricing).40 In summer 2012, SCE triggered 12 events for its Critical Peak Pricing, which is within the range of 9 to 15 events/year. Other DR programs' event hours were well below the limits. For example, SCE's residential Summer Discount Plan (AC cycling) was the second-most-triggered program, with 23 DR events and 24 event hours in 2012, still far below its 180-hour event limit despite the SONGS outage. The Base Interruptible Program (BIP) had only one test event, for two hours, in 2012.
However, SCE's DR program event hours were either within the programs' historical ranges or below the 2006-2011 maximums, except for Agricultural Pumping Interruptible, with 7 hours in 2012 compared to 0 to 2 hours from 2006 to 2011.

What were the reasons for the differences between the 2012 DR event numbers and hours and the event limits? SCE explained that the reasons vary for each program, as summarized in Appendix L.41 The reasons can be characterized for the three types of DR programs as: 1) trigger conditions, 2) optimal dispatches, and 3) no nominations.

As discussed above, DR program operations are based on the trigger criteria set for each program. For the non-Emergency Programs, SCE indicated that optimizing performance and minimizing customer fatigue are additional factors considered in its decision to trigger a DR program. SCE's optimal dispatch strategy may have resulted in the DR events and hours far

39 SCE-02, Appendix E, Table 2-A at E-4 and E-5.
40 Id.
41 SCE-02, Appendix E, at E-6 and E-7.
below the maximum hours and events for the programs. For example, SCE's Summer Discount Plan is available for 180 hours annually. However, customers would likely never expect this program to be triggered for anything close to 180 hours, based on their experience with the program to date. As shown in Appendices M and N, staff finds a similar trend in SDG&E's DR event data.

IV. Peaker Plant Comparison

Most of SCE's non-Emergency Programs include resource limitation as a program trigger. Therefore, in theory, one would expect SCE to trigger DR programs before dispatching its peaker plants, in accordance with the Loading Order. In light of the SONGS outage, the Commission anticipated more SCE and SDG&E DR events in 2012, yet SCE dispatched peaker plants substantially more than DR programs (compared to their historical averages, as discussed below).

How do the historical DR events compare to the Utilities' peaker plants? SCE provided the permit and service hours for four of its own peaker plants, three of which were located in the SONGS-affected areas, as shown in Appendix O.42 SCE historically dispatched its peaker plants for about 9% to 16% of the permissible service hours annually. As shown in the table below, during the same period SCE triggered its non-Emergency DR programs for 11 to 106 hours on average. However, in 2012, SCE dispatched its peaker plants three to four times more than the historical average. On the other hand, SCE's 2012 DR event hours were below the historical range. SDG&E's peaker plant and DR event data show a similar trend to SCE's. For example, SDG&E's Miramar plant ran 4,805 hours out of its 5,000-hour emission allowance. In contrast, SDG&E's Critical Peak Pricing, the program with the most triggered hours, was dispatched 49 hours out of its 126-hour annual limit.

Table 20: DR Event Hours vs. Peaker Plant Service Hours

                                2006-2011 Range      2012
SCE:    Peaker Plants           96-129 hours         405-465 hours
        Non-Emergency DR        11-106 hours         2-64 hours
SDG&E:  Peaker Plants           436-1,715 hours      974-4,805 hours
        Non-Emergency DR        19-39 hours          14-49 hours

In addition, staff observed that the Utilities' highest DR event hours occurred in 2006 and 2007, during the summer heat storms, but the highest peaker plant hours occurred in 2012. These data suggest that the Utilities under-utilized DR programs and over-relied on their peaker plants, which is inconsistent with the Loading Order.

42 SCE-01, Appendix C, Tables 9 and 10 at Page 17.
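The disparity in Table 20 can be made concrete by normalizing hours used against hours allowed. A minimal sketch using the Miramar and Critical Peak Pricing figures cited in this section (the function name is illustrative):

```python
def utilization(hours_used: float, hours_allowed: float) -> float:
    """Fraction of an annual hour allowance or program limit actually used."""
    return hours_used / hours_allowed

# Figures from this chapter: SDG&E's Miramar peaker vs. its Critical Peak Pricing program
miramar = utilization(4805, 5000)  # 5,000-hour emission allowance
cpp = utilization(49, 126)         # 126-hour annual program limit
print(f"Miramar: {miramar:.0%} of allowance; CPP: {cpp:.0%} of limit")
```

On this measure, the peaker ran at roughly 96% of its allowance while the most-used DR program ran at roughly 39% of its limit, which is the pattern staff identifies as inconsistent with the Loading Order.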
In its comments on the 2013-2014 DR Proposed Decision, SCE disagreed with the suggestion of "under-utilization" of DR programs based on the 2012 DR events. SCE argued that "(s)imply because SCE did not dispatch all of the programs' available hours does not mean the programs should have been dispatched more…Optimal utilization (of DR) ensures the necessary amount of load drop to enable a reliable grid…"43 SCE should explain why it dispatched its peaker plants substantially more last summer instead of DR, and whether SCE's optimal dispatch of DR, its trigger criteria, or its program designs resulted in the increased reliance on peaker plants. Due to time constraints and the absence of the Utilities' explanations, staff is unable to comprehensively address this issue in this report. The Utilities' data warrant further evaluation to ensure the usefulness of DR as a replacement for peaker plants and compliance with the Loading Order.

V. Conclusions

Consistent with D.13-04-017, staff finds that most of SCE's DR programs did not attain the maximum number of events and/or hours, except for SCE's Critical Peak Pricing. The Utilities' total numbers of DR events and hours in 2012 were within the historical averages but far from the program limits. In contrast, staff found that SCE-owned and contracted peaker plants were dispatched far more in 2012 than the historical averages. Some peakers were much closer to their emission allowances than the DR hours were to their operating limits. Staff reaches a similar conclusion for SDG&E's DR programs in comparison with its peaker plants. If the Utilities have historically never triggered their DR programs close to the available hours, there is a concern about how realistic these limits are. There is a reliability risk if the Utilities are relying on a DR resource that has never been used to its full capacity. In addition, DR cost effectiveness should reflect the historical operations.
Staff recommends that the Commission address this issue in future DR evaluation and budget approval proceedings.

43 SCE Opening Comment filed on April 4, at 4-5.
Chapter 4: Residential Demand Response Programs

I. Summary of Staff Analysis and Recommendations

The analysis of residential programs covered Peak Time Rebate (PTR) and AC Cycling. Overall, customers seem satisfied with the programs, based on utility reports and surveys. However, staff identified problems with program design and operation that need to be addressed to improve the reliability and effectiveness of the programs.

For PTR, staff found that customers who received utility notification of events had higher awareness of the program than customers who were not notified by the utility or who received indirect notification, such as mass media alerts. More importantly, data for both utilities show that customers who opted into receiving alerts were the only group that significantly reduced load. For both utilities, customers defaulted into alerts through MyAccount did not reduce load significantly. However, the entire eligible customer class qualifies for bill credits, which resulted in a problem of "free ridership." Both utilities should modify PTR from a default to an opt-in program, in which only customers opting to receive event alerts would qualify for bill credits.

For SCE's Residential AC Cycling, staff found that the current group dispatch strategy is producing a rebound effect. The rebound effect reduces the actual load reduction the program is capable of producing. Staff recommends that SCE (1) align the maximum program event duration with customer preference for shorter events to improve forecasting, and (2) reconsider its incentive structure to favor participation in longer-duration events. Finally, both utilities should take advantage of AMI infrastructure and related enabling technology that could improve program delivery, reliability, and customer experience.

II. Residential Peak Time Rebate (PTR)

A. Overall Customer Experience

For both utilities, customers were generally satisfied with the program.
For SCE, customers seem satisfied with the level of incentives and with the time between notification and event. However, customers would like more information regarding the program and bill credits. SDG&E's customers reported overall satisfaction with the program but, like SCE's customers, would benefit from more information and outreach. For both utilities, the level of awareness seems higher among customers who chose to sign up to receive notifications. This is reflected in the overall load reduction verified by ex post data: only customers who signed up for event notification significantly reduced load. For PTR, neither utility noticed evidence of customer fatigue, but this does not mean it did not occur, only that it was not noticeable.
B. SCE's Peak Time Rebate/Save Power Day

1) Summary

Customers who received utility notification of events had higher awareness of the program than customers who were not notified by the utility. More importantly, customers who opted into receiving alerts were the only group that significantly reduced load. Customers defaulted into alerts through MyAccount, and the remaining customers not directly notified by the utility, did not reduce load significantly. SCE considered only customers who received alerts in its forecast and ex post verification. However, the entire eligible customer class qualifies for bill credits. Awareness of the program, reflected by the willingness to sign up to receive alerts, seems to indicate greater willingness to reduce load. This factor should be considered in program design. Staff identified an issue of "free ridership," where customers are paid even though they did not significantly reduce load. Staff recommends changing PTR from a default program to an opt-in program, paying bill credits only to customers who opt in to participate.

2) Background

D.09-08-028 approved Save Power Day, SCE's Peak Time Rebate (PTR) rate. The decision approved bill credits of $0.75 per kWh reduced, with an additional $0.50/kWh for customers with enabling technology. This is a default program for residential customers with a smart meter and has been available since 2012. The program provides incentives to eligible Bundled Service Customers who reduce a measurable amount of energy consumption below their Customer Specific Reference Level (CSRL) during PTR events.44,45 The utility may call events throughout the year on any day, excluding weekends and holidays. Events take place between 2 p.m. and 6 p.m. on days an event is called. Participants receive a day-ahead notification of the event.
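The baseline and credit mechanics described in footnote 45 can be sketched as follows. This is a minimal illustration, not SCE's billing implementation; the usage figures are made up, and the $0.75/kWh base rate is an assumption drawn from this section:

```python
def csrl(peak_usage_kwh: list[float]) -> float:
    """Customer Specific Reference Level: the average 2-6 p.m. kWh usage on the
    three highest-usage days of the five non-event, non-holiday weekdays
    immediately preceding the event (per footnote 45)."""
    top_three = sorted(peak_usage_kwh, reverse=True)[:3]
    return sum(top_three) / 3

def ptr_credit(csrl_kwh: float, event_usage_kwh: float, rate: float = 0.75) -> float:
    """Bill credit for one event: rate ($/kWh) times the measured reduction
    below the CSRL; no credit if usage exceeds the baseline."""
    reduction = max(0.0, csrl_kwh - event_usage_kwh)
    return rate * reduction

# Hypothetical 2-6 p.m. usage for the five preceding eligible weekdays
baseline = csrl([5.0, 6.5, 4.0, 7.0, 5.5])  # averages the top three days
credit = ptr_credit(baseline, 4.0)          # usage of 4.0 kWh during the event
```

With these numbers, the baseline is about 6.33 kWh and the single-event credit is $1.75; per the tariff, credits for all events in a billing cycle are then summed.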
Bill credits are paid in each billing cycle based on the sum of events called and the usage reduction during the period.46 Bill credits are recovered from the respective customer class through the Energy Resource Recovery Account (ERRA). During 2012, SCE began defaulting customers on MyAccount to receive email notifications, with the remaining customers not directly notified by the utility. Alternatively, customers may choose to opt in to receive alerts. As of November 30th, approximately 4 million customers were on PTR and 824,000 were signed up to receive notifications (via MyAccount).47 According to SCE,

44 SCE Schedule D - Domestic Service, sheet 3.
45 CSRL: the "peak average usage level" is the customer's average kWh usage during the 2:00 p.m. to 6:00 p.m. time period on the three (3) highest kWh usage days of the five (5) non-event, non-holiday weekdays immediately preceding the PTR event. The CSRL is used to determine the customer's kWh reduction for each PTR event in order to calculate the rebate.
46 SCE Schedule D - Domestic Service, D.09-08-028, Att. C at 7.
47 SCE-01 Testimony at 27, lines 11, 18-19.
approximately 60,000 customers opted in to receive alerts during the 2012 summer months.48

3) Lessons Learned

In support of its 2013-2014 Application, SCE provided data highlighting lessons learned from the 2012 program year.

Customer awareness. Awareness of the program is higher among the group of customers whom the utility notified of events: 66% of notified respondents were familiar with the program, but only 43% in the group not notified were familiar.49 When prompted for awareness of events, the same pattern appears: 72% of respondents in the notified group who were aware of the program claimed awareness of specific events, compared to 40% in the group not receiving notifications. Counting both customers already aware and those prompted with information about the program, 55% of the notified group was aware but only 23% of the non-notified respondents were aware.50

Customer satisfaction. SCE's data contained no information regarding customer perception of the fairness of savings/incentive levels; however, customers seem to link participation with an expectation of savings, as 80% of respondents identified earning bill credits as important for participation.51 Moreover, participants seem willing to participate even in the face of low savings.52

Event notification. The majority of respondents aware of the program found out about events via utility notification (over 60% for the opt-in group). Close to 23% of respondents in the overall population found out about events in the news.53 According to the customer surveys, about 90% of customers notified of the event and about 56% of customers not notified but aware of the event were happy with the amount of time between notification and event.54 It appears that a day-ahead strategy could be adequate; however, customers were not asked about their preference for a day-of reminder, so it is not clear from the lessons learned whether such a reminder could increase awareness and response.
SCE requested to add a day-of notification in its 2013-2014 Program Augmentation Application, which the Commission denied due to lack of evidence of need.55

48 Email communication with SCE (4/5/2013).
49 SCE-02, Appendix A at 3. It is important to note that the surveys only represented results for two groups: customers notified by the utility and customers who were not notified. Defaulted customers and customers not defaulted into receiving notifications from the utility were bundled together under notified customers.
50 SCE-02, Appendix A at 4.
51 SCE-02, Appendix B at 24.
52 SCE-02, Appendix B at 36.
53 SCE-02, Appendix A at 5.
54 SCE-02, Appendix A, Save Power Day Incentive/Peak Time Rebate Post-Event Customer Survey, at 15.
55 D.13-04-017, at 28.