STATE OF CALIFORNIA Edmund G. Brown Jr., Governor
PUBLIC UTILITIES COMMISSION
505 VAN NESS AVENUE
SAN FRANCISCO, CA 94102-3298
Commission
Staff Report
Lessons Learned From Summer 2012
Southern California Investor-Owned
Utilities' Demand Response Programs
May 1, 2013
Performance of 2012 Demand Response programs of San Diego Gas and
Electric Company and Southern California Edison Company: report on
lessons learned, staff analysis, and recommendations for 2013-2014 program revisions in compliance with Ordering Paragraph 31 of Decision 13-04-017.
ACKNOWLEDGEMENT
The following Commission staff contributed to this report:
Bruce Kaneshiro
Scarlett Liang Uejio
Tim Drew
Rajan Mutialu
Dorris Chow
Paula Gruendling
Taaru Chawla
Jennifer Caron
Alan Meck
TABLE OF CONTENTS
EXECUTIVE SUMMARY....................................................................................................... 1
Chapter 1: Introduction.................................................................................................. 5
I. 2012 Summer Reliability and Demand Response Programs..................................................5
II. Energy Division November 16, 2012 Letter and the Staff Report..........................................6
Chapter 2: Demand Response Program Load Impact...................................................... 8
I. Summary of Staff Analysis and Recommendations ...............................................................8
II. Different DR Load Impact Estimates ...................................................................................... 9
III. Comparison of DR Daily Forecast and Ex Post Results ..........................................................9
IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)..............................26
Chapter 3: Demand Response Program Operations...................................................... 32
I. Summary of Staff Analysis and Recommendations .............................................................32
II. 2012 DR Program Trigger Criteria and Event Triggers .........................................................32
III. DR Events Vs. Peaker Plant Service Hours ...........................................................................33
IV. Peaker Plant Comparison..................................................................................................... 34
V. Conclusions .......................................................................................................................... 35
Chapter 4: Residential Demand Response Programs .................................................... 36
I. Summary of Staff Analysis and Recommendations .............................................................36
II. Residential Peak Time Rebate (PTR) ....................................................................................36
III. Residential Air Conditioning (AC) Cycling.............................................................................51
Chapter 5: Non-Residential Demand Response Programs............................................. 57
I. Summary of Staff Analysis and Recommendations .............................................................57
II. Background and Summary of Utility Data............................................................................57
III. Commercial Air Conditioning (AC) Cycling...........................................................................59
IV. SCE's Auto DR....................................................................................................... 63
V. SDG&E's Demand Bidding Program (DBP) ...........................................................65
Chapter 6: Flex Alert Effectiveness ............................................................................... 67
I. Summary of Staff Analysis and Recommendations .............................................................67
II. Background .......................................................................................................................... 67
III. Utility Experience with Flex Alert.........................................................................................69
IV. Customer Experience ........................................................................................................... 69
V. The Future of Flex Alert........................................................................................................ 71
VI. DR Program Ex Post Load Impact Results on the Flex Alert Days........................................71
Chapter 7: Energy Price Spikes ..................................................................................... 73
I. Summary of Staff Analysis and Recommendations .............................................................73
II. Definition of Price Spikes ..................................................................................................... 73
III. DR Programs and Price Spikes.............................................................................................. 73
IV. Conclusion............................................................................................................................ 74
Chapter 8: Coordination with the CAISO ...................................................................... 75
I. Staff Recommendations....................................................................................................... 75
II. DR Reporting Requirements in Summer 2012.....................................................................75
III. DR Reporting Requirements for 2013-2014.........................................................76
Appendix A: Highlight of 2012 Summer Weather & Load Conditions.................................... 77
Appendix B: Energy Division November 16, 2012 Letter........................................................ 78
Appendix C: Descriptions of DR Load Impact Estimates......................................................... 79
Appendix D: SCE 2012 Monthly Average DR Program Load Impact (MW) ............................ 85
Appendix E: SCE 2012 DR Program Load Impact by Event (MW)........................................... 87
Appendix F: SDG&E 2012 Monthly Average DR Program Load Impact (MW) ....................... 91
Appendix G: SDG&E 2012 DR Program Load Impact by Event (MW)..................................... 92
Appendix H: SCE 2012 DR Program Overview ....................................................................... 93
Appendix I: SDG&E DR Program Overview............................................................................. 96
Appendix J: SCE Historical DR Event Hours............................................................................. 98
Appendix K: SCE Historical Number of DR Events .................................................................. 99
Appendix L: Summary of SCE's Reasons for the 2012 DR Triggers....................... 100
Appendix M: SDG&E Historical DR Event Hours................................................................... 101
Appendix N: SDG&E Historical Number of DR Events .......................................................... 102
Appendix O: Utilities' Peaker Plant Total Permissible vs. Actual Service Hours................... 103
Appendix P: Ex Post Demand Response Load Impact on Flex Alert Days ............................ 104
Appendix Q: CAISO Energy Price Spikes................................................................................ 105
Appendix R: Utilities' Demand Response Reporting Requirements..................... 111
Appendix S: Additional Information .................................................................................... 113
EXECUTIVE SUMMARY
This report is prepared by Energy Division in compliance with Ordering Paragraph 31 of D.13-04-017. The purpose of this report is to provide the lessons learned from the 2012 Demand Response (DR) programs operated by San Diego Gas and Electric Company (SDG&E) and Southern California Edison Company (SCE) (Utilities), and to recommend program or operational revisions, including continuing, adding, or eliminating DR programs. Highlighted conclusions and recommendations appear below; the complete set of recommendations is presented in each chapter of the report.
In summary, Energy Division makes the following overarching conclusions about the
Utilities' DR programs:
Forecast vs. Ex Post: While a few DR programs met or even exceeded their daily forecast when triggered, on average the ex post results for all program events diverge from the daily forecast by a considerable degree. The majority of programs either provided a 'mixed' performance (the program both over- and under-performed relative to its forecast) or were poor performers (consistently coming up short relative to their forecasts). Of particular note are the Utilities' Peak Time Rebate program[1] and SCE's Summer Discount Plan.[2] (Chapter 2)

[1] SCE's marketing name for Peak Time Rebate is "Save Power Day"; SDG&E calls it "Reduce Your Use".
[2] Air conditioning (AC) cycling.
The divergence between the ex post results and the daily forecasts can be traced to a variety of causes, such as inadequate forecasting methods employed by the Utilities, program design flaws, non-performance by program participants, and/or program operations. A complete explanation of the reasons for divergence across all programs, however, was not possible within the scope and timing of this report. (Chapter 2)
2012 RA vs. Ex Post: The 2012 Resource Adequacy (RA) forecast is not a good benchmark for how well a DR program performs. RA forecasts are intended for resource planning needs, while ex post load impacts reflect demand reductions obtained in response to operational needs at the time the program is triggered. Resource planning and operational planning have different conditions and serve different purposes. (Chapter 2)
DR vs. Peaker Plants: The Utilities used their DR programs fewer times and for fewer hours than the programs' limits (each program is limited to a certain number of hours or events). In contrast, the Utilities dispatched their peaker power plants far more frequently in 2012 in comparison to 2006-2011 historical averages. (Chapter 3)
Energy Price Spikes: DR programs are not currently designed to effectively mitigate price spikes in the CAISO's energy market. On many days a DR event was called and no price spikes occurred, and conversely there were days when price spikes occurred and DR events were not called. The timing and scope of this report did not permit a quantification of the cost of unmitigated price spikes to ratepayers, but in theory, avoidance of these spikes would benefit ratepayers. (Chapter 7)
Energy Division also makes the following program-specific conclusions about the Utilities' DR programs:
SCE's AC Cycling Program Forecasting: SCE's 2012 forecasting methodology for its air conditioning (AC) Cycling program (the DR program that SCE triggered the most in 2012) cannot be relied upon to effectively predict actual program load reductions. (Chapter 2)
SCE's AC Cycling Dispatch Strategy: SCE's sub-group dispatch strategy for its AC Cycling Program (also called the Summer Discount Plan) created adverse 'rebound' effects, thereby reducing the effectiveness of the program during critical hot weather days, e.g., 1-in-10 weather. (Chapter 2)
SDG&E's Demand Bidding Program: SDG&E's Demand Bidding Program produced on average 5 MW of load reduction when triggered, although the US Navy did not participate. The US Navy claimed certain program terms and conditions precluded it from participating in the 2012 program. The Commission's decision to modify the program to a 30-minute trigger may further limit the US Navy's ability to participate. (Chapter 5)
Peak Time Rebate Awareness: SCE and SDG&E customers who received utility
notification of Peak Time Rebate (PTR) events had higher awareness of the program
when compared to customers who were not notified by the utility. More
importantly, customers who opted into receiving PTR alerts significantly reduced
load. All other customers in the program provided minimal load reduction. (Chapter
4)
Peak Time Rebate Free Ridership: The Utilities' PTR program has a potentially large 'free ridership' problem, where customers receive incentives without significantly reducing load. SCE paid $22 million (85% of total PTR incentives in 2012) in PTR bill credits to customers whose load impact was not considered for forecast or ex post purposes. Ninety-four percent of SDG&E's 2012 PTR incentives ($10 million) were paid to customers who did not provide significant load reduction. The inaccuracy of the settlement methodology (in comparison to the ex post results) is the main reason for the 'free ridership' problem. The default nature of the program (everyone is automatically eligible for the incentives) aggravates the problem. (Chapter 4)
Flex Alert: There is a lack of data to evaluate the effectiveness and value of the Flex Alert campaign. Attribution of savings from Flex Alert is complicated by the fact that load reduction from the Utilities' DR programs on the two days Flex Alert was triggered in 2012 contributed to reduced system peak load. A load impact evaluation of Flex Alert is planned for 2013. (Chapter 6)

DR Reports: The Utilities' DR daily and weekly reports were useful to the CAISO and the Commission for purposes of up-to-date monitoring of DR resources throughout the summer. (Chapter 8)
In light of the above findings, Energy Division recommends the following:
DR Evaluation: The Commission should require further evaluation of Utility DR
program operations in comparison to Utility operation of peaker plants for the
purpose of ensuring Utility compliance with the Loading Order. (Chapter 3)
Forecast Methods Generally: The Utilities' daily forecasting methods for all DR programs (especially AC cycling and other poor performers) should undergo meaningful and immediate improvements so that day-ahead forecasting becomes an effective and reliable tool for grid operators and scheduling coordinators. (Chapter 2)
Forecasting for SCE's AC Cycling Program: SCE should improve forecasting methods
for its residential AC Cycling Program with input from agencies and stakeholders.
SCE should also pilot more than one forecasting method for the program in 2013.
(Chapter 2)
Forecasting for SDG&E Programs: SDG&E's forecasting methods for its AC Cycling Program (Summer Saver) could be improved by running a test event and by including a correlation variable that accounts for customer fatigue. SDG&E's Capacity Bidding Program forecasting could be improved by including a weather variable. (Chapter 2)
SCE's Outreach for Commercial AC Cycling: Through its outreach and marketing
efforts, SCE should clearly communicate the new features of its commercial AC
cycling program to avoid customer dissatisfaction and dropout. (Chapter 5)
Auto DR: Future studies are necessary to explore the load impacts of Auto DR.
(Chapter 5)
SDG&E's Demand Bidding Program: SDG&E should work collaboratively with the US Navy to design a program that meets the unique needs of the Navy. Key attributes to consider are a day-ahead trigger, aggregation of 8 billable meters, and a minimum bid requirement of 3 megawatts (MW). (Chapter 5)
Peak Time Rebate Design Changes: The Utilities' residential PTR program should be changed from a default program to an opt-in program, so that bill credits are paid only to customers who opt in. (Chapter 4)
SCE's AC Cycling Dispatch Strategy: SCE should reconsider its current strategy of calling groups of residential AC cycling customers in sequential one-hour cycling events. Alternatively, if SCE retains its current strategy, it should modify the program's incentive structure so that customers who are willing to have their AC units cycled for an entire event (as opposed to just one hour) are compensated more than those who can tolerate only one hour of cycling. (Chapter 4)
DR Reports: The Utilities (and Pacific Gas & Electric) should submit daily and weekly
DR reports to the CAISO and the Commission for the summers of 2013 and 2014.
They should follow the same format and data requirements as the 2012 reports,
unless otherwise directed by the Commission or Commission staff. (Chapter 8)
Chapter 1: Introduction
I. 2012 Summer Reliability and Demand Response Programs
San Onofre Nuclear Generating Station (SONGS) Units 2 and 3 were taken out of service in January 2012. By March 2012, the Commission determined that the outage of SONGS' two units could extend through summer 2012. Working closely with the Governor's Office, the California Independent System Operator (CAISO), and the California Energy Commission (CEC), the Commission took immediate mitigation actions to ensure that the lights stayed on in California despite the loss of 2,200 MW of capacity provided by SONGS.[3]
Beyond considering the addition of new generation resources,[4] an important action was to further incorporate the Utilities' Demand Response (DR) programs into the CAISO's contingency planning and daily grid operations during the summer. This included mapping the Utilities' DR programs to grid contingency plans and developing new daily and weekly DR reporting requirements. The Commission also moved swiftly to approve three new DR programs for summer 2012: SDG&E's Peak Time Rebate (PTR) for commercial customers and Demand Bidding Program (DBP); and SCE's 10 for 10 conservation program for non-residential customers.[5]
Because of the intensive interagency mitigation effort and relatively cool weather, California grid reliability was not compromised in spite of the SONGS outage. Nevertheless, southern California experienced several heat waves in August and September, with the highest temperature reaching 109°F in SDG&E's service area and 100°F in SCE's on September 14.[6] The CAISO issued two Flex Alerts, on August 10 and 14. The Utilities triggered all of their DR programs at least once, and some on multiple occasions.
Throughout the summer, Energy Division (ED) staff monitored the Utilities' DR program events on a daily basis and provided weekly briefings to the Governor's Office, the CAISO, and the CEC. Staff observed that, for many event days, the load impact forecasts provided by the Utilities to the CAISO and the Commission in their daily DR reports were inconsistent with the results submitted seven days after each event (referred to as the "7-Day Report"). In some cases, the Utilities reported much lower load reduction results than they originally forecasted. In addition, load impact forecasts provided by the Utilities throughout the summer were lower than the capacity counted for the 2012 Resource Adequacy (RA) Requirement. This raised a question as to whether the Commission might have overestimated DR load impact for RA purposes or, rather, whether the Utilities might have under-utilized their DR programs.
Sometime in mid-summer, the Utilities began to experience price spikes in the CAISO's wholesale energy market. Questions were raised about whether the DR programs could be used to mitigate price spikes and, if so, whether they should be.
[3] http://www.songscommunity.com/value.asp
[4] Retired Huntington Beach Units 3 and 4 were brought back online temporarily.
[5] Resolutions E-4502 and E-4511.
[6] A 1-in-10 (or 10% probability) weather condition in any given year.
Some of the Utilities' DR programs were triggered for as many as 23 events over the five summer months, and many were triggered on two or three consecutive days. Appendix A highlights the DR program load impact on the three hottest days and the three days when SDG&E and SCE experienced their highest system peak loads. Staff observed that SDG&E's system peak correlated with temperature, and the biggest DR load reduction happened on the hottest day. SCE's system peak load, in contrast, did not consistently correlate with weather: SCE's system load reached its annual peak at 90°F, 10°F cooler than the hottest day in its service territory. Counter-intuitively, DR program load impact on a cooler day was actually higher than the amount delivered on the hottest day. This led to questions about how the Utilities make decisions to trigger DR programs and whether aspects of the customers' experience, such as expectations and fatigue, have an effect.
In August, CAISO issued two Flex Alerts when it determined that there was a reliability risk due to insufficient supply to meet demand. As expected, the Utilities triggered relatively large amounts of DR on both days. CAISO reported that the actual peak load was significantly lower than its hours-ahead forecasts and attributed the load drop to the Flex Alert events. This parallel dispatch situation raises important questions regarding the effectiveness of the Flex Alert when overlapped with the Utilities' DR program events and how customers perceived these statewide alerts versus local utility DR notifications.
Based on the above experience, the Commission concluded that staff should evaluate DR program performance and other lessons learned in order to seek answers to these and other questions. Such lessons could help the Commission determine the extent of DR program reliability and usefulness and, in turn, the extent to which DR resources can be counted on in CAISO markets and operations.
II. Energy Division November 16, 2012 Letter and the Staff Report
On November 16, 2012, the Energy Division sent a letter (Energy Division Letter) to the Utilities directing them to 1) file an application proposing DR program improvements for 2013 and 2014 to mitigate the SONGS outage and 2) provide data and responses to a set of questions on lessons learned from the 2012 DR programs. The questions were developed based on the Utilities' 2012 demand response experience and fell into six categories:
1. DR Program Performance, which includes load impact and program operations;
2. CAISO Market, covering price spikes and market analysis;
3. Customer Experience;
4. Coordination with the CAISO and Utility Operations;
5. Emergency DR Program Dispatch Order; and
6. Flex Alert Effectiveness.
The Energy Division Letter is attached in Appendix B of this report.
On December 21, 2012, the Utilities filed separate applications for the approval of DR program revisions for 2013 and 2014.[7] The Utilities submitted data and responses to the questions attached to the Energy Division Letter and subsequent Assigned Administrative Law Judge (ALJ) rulings for developing the record.[8] Decision (D.) 13-04-017 approved certain DR program improvements for 2013-2014 and directed the Commission staff to develop a report on the lessons learned from the DR programs in 2012.
This report is based on a snapshot of data and studies available at the time (i.e., ex post load impact data, utility responses to Energy Division data requests, etc.). Ongoing and future evaluations (e.g., the Flex Alert load impact analysis per D.13-04-021) will shed further light on the issues raised in this report.
One point of emphasis in this report is the extent to which the current DR programs delivered their forecasted savings when they were triggered by the utilities. It is important to understand that a range of factors can affect whether a program delivers its forecasted savings targets. Some of these factors can be controlled through good program design, operation, and forecasting methodologies. Other factors that can impact program performance are exogenous, or outside the utilities' control, such as temperature, participant enrollment fluctuations, and behavioral or technological changes by the participants.
While this report contains certain findings and recommendations for DR programs, we
caution against sweeping conclusions or generalizations about DR programs based on this
report. The point of this report is to find ways to improve existing DR programs so that they
are more useful to grid operators, utilities, ratepayers and participants.
[7] A.12-12-016 (SDG&E) and A.12-12-017 (SCE).
[8] On January 18, 2013 and February 21, 2013.
Chapter 2: Demand Response Program Load Impact
I. Summary of Staff Analysis and Recommendations
SCE
Most of the program event ex post results diverge from the daily forecast by a considerable degree. The daily forecast should be more consistent with the ex post results in order for day-ahead forecasting to be valid and useful for grid operators. Staff recommends that the daily forecasting methods for all programs undergo meaningful and substantial improvements, including more thorough and transparent documentation and vetting by relevant agencies and stakeholders.
The Summer Discount Plan (Residential AC Cycling) program forecasting methods in particular require review by a broad panel of agencies and stakeholders. Staff also recommends that SCE pilot more than one forecasting method and conduct interim protocol-based load impact evaluations to identify the most reliable forecasting methods throughout the 2013 summer season.
SCE should also be required to address Summer Discount Plan program operation issues
before the 2013 summer peak season begins, if possible. Specifically, the strategy of calling
groups of customers for sequential one-hour cycling events, rather than calling all the
customers for the duration of the full event (or other potential strategies), needs to be
reconsidered before the program is further deployed. As discussed in detail later in this
chapter, this strategy resulted in load increases during the latter hours of events, thereby
reducing the overall effectiveness of the program.
SDG&E
Similar to SCE, many of SDG&E's program event ex post results also diverge from the daily forecast by a considerable degree. The Demand Bidding Program daily forecast was accurate and reliable in predicting ex post results, while the Summer Saver and Capacity Bidding Day-Ahead and Day-Of program daily forecasts did not accurately or reliably predict ex post results. The Peak Time Rebate residential daily forecast was not accurate in predicting ex post results; it consistently overestimated ex post results by approximately 80%. The Critical Peak Pricing and Base Interruptible program daily forecasts did not accurately or reliably predict ex post results, but consistently under-predicted ex post load impacts. Due to a weak price signal and inelastic customer demand, the PTR commercial program ex post results were not significant. The CPP-E was discontinued as of December 31, 2012.
Staff recommends (1) including only customers that opt in to receive e-mail or text alerts in the PTR residential daily forecast model; (2) running a test event to measure the percent load impact per customer in order to improve CPP daily forecast estimates; (3) including a correlation variable in the Summer Saver daily forecast model to account for customer fatigue during successive event days; and (4) including a weather variable in the CBP daily forecast model in order to have parity with the ex post regression model.
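For illustration only, the sketch below shows one way a daily forecast regression could carry a weather term (recommendation 4) and a consecutive-event-day "fatigue" term (recommendation 3). The model form, variable names, and data are placeholder assumptions for discussion; they do not reproduce SDG&E's actual forecast models.

```python
# Hypothetical day-ahead forecast regression with a weather variable and a
# customer-fatigue variable for consecutive event days. All values are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

events = pd.DataFrame({
    "load_impact_mw":   [18.5, 21.4, 22.5, 8.8, 17.0, 14.2],  # ex post results per event
    "max_temp_f":       [98, 101, 109, 95, 99, 96],           # forecasted daily max temperature
    "consec_event_day": [1, 2, 1, 1, 2, 3],                   # 1 = first day of an event streak
})

model = smf.ols("load_impact_mw ~ max_temp_f + consec_event_day", data=events).fit()

# Day-ahead prediction for a hypothetical 103 F day that follows another event day.
print(model.predict(pd.DataFrame({"max_temp_f": [103], "consec_event_day": [2]})))
```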
II. Different DR Load Impact Estimates
DR program load impacts are forecasted or estimated at different times for different purposes. The following table summarizes the five different DR load impact estimates that are discussed in this chapter. Detailed descriptions and methodologies for each DR program measurement are provided in Appendix C.
Table 1: DR Load Impact Estimates

| DR Load Impact Estimate | General Description | Purpose |
|---|---|---|
| Ex Ante for RA (e.g., 2012 RA) | A year-ahead monthly ex ante load impact potential attributable to each individual program under a 1-in-2 weather condition. | To determine the RA counting against the Load Serving Entity's system and local capacity requirements. |
| Daily Forecast | The Utilities' daily estimate of hourly load impact from DR programs during an event period. | To provide the CAISO, CPUC, and CEC the hourly MW provided by DR programs on each event day. |
| 7-Day Report | The Utilities' preliminary estimate of hourly load reduction results from each triggered DR program. | To report to the CAISO the load reduction data from the triggered DR programs seven days after each DR event. |
| Ex Post Results | The Utilities' most accurate measurement of the load impact results from all of the DR programs triggered in a year. The ex post results are calculated using comprehensive regression models. | To report to the CPUC the actual results of the DR events. |
| Settlement | A measurement of customers' load reduction from their specific reference load using a baseline method. | To calculate customers' incentive payments for billing purposes. |
In this proceeding, the Utilities provided the above DR load impact estimates for their DR
programs, which are shown in Appendices D to G.
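To make the "Settlement" row of Table 1 concrete, the sketch below computes a settled load reduction against a simple average-of-prior-days baseline. The averaging window, hourly structure, and data are illustrative assumptions only, not the Utilities' tariff-specific baseline rules.

```python
# Minimal sketch of a baseline-style settlement calculation (illustrative only):
# the reference load for each event hour is the average of the same hour on a window
# of prior non-event days, and the settled reduction is baseline minus actual usage.
from statistics import mean

def settled_reduction_kw(prior_day_loads_kw: list[list[float]],
                         event_day_load_kw: list[float]) -> list[float]:
    """prior_day_loads_kw: one list of hourly kW per prior non-event day (same hours)."""
    baseline = [mean(day[h] for day in prior_day_loads_kw)
                for h in range(len(event_day_load_kw))]
    return [round(b - actual, 2) for b, actual in zip(baseline, event_day_load_kw)]

# Hypothetical 3-hour event window, 10 prior non-event days of identical usage for brevity.
prior_days = [[5.0, 5.2, 5.1]] * 10
print(settled_reduction_kw(prior_days, [3.8, 4.0, 4.3]))   # [1.2, 1.2, 0.8]
```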
III. Comparison of DR Daily Forecast and Ex Post Results
A. Overall Program Performance
The following section draws on data provided by the Utilities on March 4, 2013,[9] in response to the February 21, 2013 ALJ ruling, which compares event-day forecasts (daily forecast or day-ahead forecast) to the event-day ex post load reduction estimates. Detailed data and methodological descriptions relevant to this chapter are provided in Appendices C and G. Subsequent to its March 4 filing, SCE updated its ex post results for some of the DR program events in its April 2 Load Impact Report but did not update its March 4 filing accordingly. However, in most cases, the April 2, 2013 updated ex post results are even lower than the March 4 preliminary data, e.g., for AC cycling. Therefore, if the updated data were used, it would further support staff's findings.

[9] SCE 03 and SGE 03.
On average, the ex post results for all program events diverge from the daily forecast by a considerable degree. While some program events were forecasted more accurately and consistently than others, Energy Division staff's overall conclusion is that the daily forecasting methods for all programs require meaningful and immediate improvements so that day-ahead forecasting can become an effective and reliable tool for grid operators.
Some of the divergence between the ex post results and the daily forecast estimates can possibly be explained by inadequate program design and program operations. This section focuses on the observed differences between the ex post and the daily forecast with an eye towards identifying improvements for day-ahead forecasting, and thus does not cover all potential program improvements. Furthermore, many program design and operational improvements that could lead to better ex post results may not be evident by simply inspecting the daily forecast and ex post data.
The ex post analysis methods are guided by Commission-adopted load impact protocols[10] and the study results are carefully documented in reports prepared by independent consultants managed by SCE staff. However, there are currently no comparable standards and processes guiding the methods for daily forecasting. Indeed, during the course of preparing this report, Energy Division staff became aware that the day-ahead forecasting methods are far from transparent, and in some cases lack the robust analysis that is expected of the Utilities. These problems may be somewhat understandable, however, since the daily reports were only formally instituted in 2012.
While this report is highly critical of the implementation of the day-ahead forecasting, it is important to recognize that the 2012 DR events as a whole did indeed reduce participants' loads, and some of the program load reductions were consistent with or better than the day-ahead forecast. To that end, staff has categorized the demand response programs into three categories (good, mixed, and poor performance) based on how well the program events performed relative to the day-ahead forecasts.
SCE
Programs that performed well yielded load impacts that were consistent with or better than the day-ahead forecast. The Base Interruptible Program (BIP) and the Day-Of Capacity Bidding Program events produced load reductions that were on par with the forecasts. It is worth noting, however, that BIP, the single largest program, was triggered on only one occasion in 2012, and this was a test event.
Program events with mixed performance were not consistent with the day-ahead forecast, but sometimes exceeded the forecast. Staff includes the Day-Ahead Capacity Bidding, Demand Bidding, and Residential Summer Discount Plan program events in this category because these program events did occasionally exceed the day-ahead forecasts by a significant margin. These programs are discussed in greater detail elsewhere in this section and report. While considered to be mid-performing programs, they have many important issues that deserve attention.
[10] Decision 08-04-050.
Program events that were consistently below the forecast are considered to be poor-performing programs. All of the Critical Peak Pricing, Peak Time Rebate, Demand Response
Contracts, Commercial Summer Discount Plan, and Agricultural Pumping Interruptible program
events triggered during 2012 produced load reductions that were lower than forecasted.
Table 2: SCE's DR Overall Performance

| Programs | No. of DR Events | Daily Forecast (Avg MW over all events) | Ex Post (Avg MW over all events) | Difference MW (range, low to high) | % Difference (range, low to high) |
|---|---|---|---|---|---|
| Good Performance: | | | | | |
| Capacity Bidding Program – Day Of | 14 | 12 | 16 | > 2 | > 17% |
| Base Interruptible Program | 1 | 514 | 573 | 59 | 12% |
| Mixed Performance: | | | | | |
| Capacity Bidding Program – Day Ahead | 12 | 0.08 | 0.03 | -0.29 to 0.08 | -315% to 86% |
| Demand Bidding Program | 8 | 84 | 76 | -33 to 16 | -40% to 21% |
| Summer Discount Plan (AC Cycling) Res. | 23 | 280 | 184 | -603 to 92 | -100% to 58% |
| Poor Performance: | | | | | |
| Critical Peak Pricing | 12 | 50 | 37 | < -5 | < -11% |
| Peak Time Rebate | 7 | 108 | 20 | < -11 | < -11% |
| Demand Response Contracts | 3 | 230 | 148 | < -70 | < -34% |
| Summer Discount Plan (AC Cycling) Com. | 2 | 5 | 3 | -2 | -35% |
| Agricultural Pumping Interruptible | 2 | 48 | 21 | < -19 | < -52% |
SDG&E
Utilizing the same criteria used for evaluating SCE's DR programs, the Base Interruptible Program and the Critical Peak Pricing Program were categorized as good performers; the Capacity Bidding Day-Ahead, Capacity Bidding Day-Of, Demand Bidding, and Summer Saver (AC Cycling) programs were categorized as mixed performers; and the Critical Peak Pricing Emergency and residential Peak Time Rebate programs were categorized as poor performers. As stated above, DR program design and operation characteristics also need to be taken into account for a complete evaluation of DR program performance.
Table 3: SDG&E's DR Overall Performance

| Programs | Number of Events | Daily Forecast (Avg MW over all events) | Ex Post (Avg MW over all events) | Difference MW (low to high) | % Difference (low to high) |
|---|---|---|---|---|---|
| Good Performance: | | | | | |
| Base Interruptible Program | 1 | 0.3 | 0.8 | 0.5 | 167% |
| Critical Peak Pricing | 7 | 15 | 18 | > 2.4 | > 3.1% |
| Mixed Performance: | | | | | |
| Capacity Bidding Program – Day Ahead | 7 | 8 | 6 | -4.9 to 0.1 | -32% to 12.2% |
| Capacity Bidding Program – Day Of | 5 | 12 | 10 | -3.2 to -0.7 | -27.4% to -6.0% |
| Demand Bidding Program | 3 | 5 | 5 | -0.4 to 0.1 | -8.0% to 8.0% |
| Summer Saver (AC Cycling) | 8 | 20 | 17 | -12.3 to 3.5 | -64.0% to 38.7% |
| Poor Performance: | | | | | |
| Peak Time Rebate Residential | 7 | 19 | 4 | < -24 | < -73.6% |
| Critical Peak Pricing – Emergency | 2 | 2 | 1 | < -0.7 | < -53.3% |
B. Program Performance During Critical Event Days
The critical event days of August 10th, 13th, 14th, and September 14th were selected as a focus because they were Flex Alert days, service area system peak days, or the hottest days of the year. These are all conditions under which demand response resources are most critical.
August 10, 2012
SCE
Two SCE programs were called on August 10th, a Flex Alert day. The programs triggered
during that event were the Demand Bidding Program and the Save Power Day (also known as
the Peak Time Rebate program). The load reductions achieved during the Demand Bidding
Program event surpassed the forecast by 12%, while the Save Power Day event was below the
forecast by 11%.
Table 4: SCE's August 10, 2012 Demand Response Events

| Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|
| Demand Bidding Program | 85.59 | 95.82 | 10.23 | 11.95% |
| Save Power Day [11] | 107.24 | 95.85 | -11.39 | -10.62% |
| Total | 192.83 | 191.67 | -1.16 | |

[11] SCE did not provide a daily forecast for this event, so the comparison for this event is done with the 7-day report rather than the daily forecast.
SDG&E
Three DR programs were called on August 10th. The Capacity Bidding Day-Ahead program load reduction exceeded the forecast by 1%. Conversely, the Summer Saver and residential Peak Time Rebate ex post load reductions fell short of the forecast by 32% and 75%, respectively.
Table 5: SDG&E August 10, 2012 Demand Response Events

| Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|
| Capacity Bidding Day Ahead | 7.50 | 7.60 | 0.10 | 1.33% |
| Summer Saver (AC Cycling) | 27.20 | 18.50 | -8.70 | -32.00% |
| Residential Peak Time Rebate | 12.60 | 3.20 | -9.40 | -74.60% |
| Total | 47.30 | 29.30 | -18.00 | |
August 13, 2012
SCE
August 13, 2012 was the system peak day for the SCE service area, with a peak load of 22,428 MW. As shown in Table 6 below, the Critical Peak Pricing program, a dynamic pricing program for commercial and industrial customers over 200 kW, and the Day-Of Capacity Bidding Program were triggered during this day. Again, the Capacity Bidding Program exceeded the forecast by a few MW. The Critical Peak Pricing program event had satisfactory performance, falling short of the forecast by 15%.
Table 6: SCE's August 13, 2012 Demand Response Events

| Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|
| Critical Peak Pricing | 50.54 | 42.96 | -7.58 | -15.00% |
| Capacity Bidding Program (Day Of) | 12.30 | 15.70 | 3.40 | 27.60% |
| Total | 62.84 | 58.66 | -4.18 | |
SDG&E
All three DR programs that were triggered on August 13th, Capacity Bidding Day-Of, Summer Saver (AC Cycling), and Critical Peak Pricing Emergency, had ex post load impacts that were below daily forecast predictions by 27%, 45%, and 48%, respectively.
Table 7: SDG&E's August 13, 2012 Demand Response Events

| Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|
| Capacity Bidding – Day Of | 11.70 | 8.50 | -3.20 | -27.33% |
| Summer Saver (AC Cycling) | 33.30 | 21.40 | -11.90 | -45.35% |
| Critical Peak Pricing Emergency | 2.30 | 1.20 | -1.10 | -47.83% |
| Total | 47.30 | 31.10 | -16.20 | |
August 14, 2012
SCE
August 14, 2012 was another Flex Alert day, during which seven events were called using a variety of DR programs. As shown in Table 8 below, all the events combined were forecasted to reduce loads by 570 MW. However, the ex post load impact evaluations found that the actual load reductions were short of the total forecast by 155 MW. Sixty percent of the 155 MW shortfall is attributed to the Demand Response Contract program. The Agriculture Pumping Interruptible program event was short of the event forecast by 52%. Only the Capacity Bidding Program exceeded the forecasted load reduction, but this only made up 4% of the Demand Response Contract program forecast, and thus was insufficient to cover the overall event-day shortfall. It is worth noting that the Demand Response Contract and Capacity Bidding Programs have something in common: they are both commercial aggregator programs. The reason for the difference in performance between these programs requires further study. It should be noted that SCE's Demand Response Contracts expired on December 31, 2012 and have since been replaced by new contracts that expire at the end of 2014.[12]
Table 8: SCE's August 14, 2012 Demand Response Events

| Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|
| Demand Response Contracts | 275.00 | 182.05 | -92.95 | -33.80% |
| Demand Bidding Program | 94.09 | 61.76 | -32.33 | -34.36% |
| Agriculture Pumping Interruptible | 36.00 | 17.29 | -18.72 | -51.99% |
| Summer Discount Plan (Res) Group 1 | 130.40 | 119.40 | -11.00 | -8.44% |
| Capacity Bidding Program (Day Of) | 12.30 | 17.82 | 5.52 | 44.86% |
| Summer Discount Plan (Res) Reliability | 17.42 | 13.50 | -3.92 | -22.49% |
| Summer Discount Plan (Com) | 4.77 | 3.10 | -1.67 | -35.04% |
| Total | 569.98 | 414.91 | -155.07 | |

[12] D.13-01-024, http://docs.cpuc.ca.gov/PublishedDocs/Published/G000/M046/K233/46233814.PDF
SDG&E
Four DR programs, Demand Bidding, Critical Peak Pricing, Capacity Bidding Day-Ahead, and residential Peak Time Rebate, were called on August 14th. While the Demand Bidding and Capacity Bidding Program ex post load impacts closely matched the daily forecast, the Critical Peak Pricing and residential Peak Time Rebate impacts did not. Since the Critical Peak Pricing and residential Peak Time Rebate programs are large-scale residential programs, it is possible that the difference between the forecast and ex post load impacts reflects widely varying customer behavior during DR events.
Table 9: SDG&E's August 14, 2012 Demand Response Events

| Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|
| Demand Bidding Program | 5.00 | 5.10 | 0.10 | 2.00% |
| Critical Peak Pricing | 14.30 | 25.90 | 11.60 | 81.12% |
| Capacity Bidding Program (Day Ahead) | 7.50 | 7.50 | 0.00 | 0.00% |
| Residential Peak Time Rebate | 12.50 | 1.10 | -11.40 | -91.20% |
| Total | 39.30 | 39.60 | 0.30 | |
September 14, 2012
SCE
September 14, 2012 was the hottest day of the year in both the SCE and SDG&E service areas (see Table 10 below). Understandably, SCE triggered its Summer Discount Plan (residential AC Cycling program) on this day. The Capacity Bidding Program was also triggered, with performance comparable to the other Capacity Bidding Program events on critical days discussed above.
The September 14 residential Summer Discount Plan events consisted of three separate customer groups sequentially triggered for one-hour events. All three one-hour events fell considerably short of the forecasted load reductions.
Table 10: SCE's September 14, 2012 Demand Response Events

| Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|
| Summer Discount Plan (Residential) Groups 5 and 6 | 135.61 | 20.70 | -114.91 | -84.74% |
| Summer Discount Plan (Residential) Groups 1 and 2 | 110.89 | 37.80 | -73.09 | -65.91% |
| Capacity Bidding Program (Day Of) | 11.90 | 16.21 | 4.31 | 36.18% |
| Summer Discount Plan (Residential) Groups 3 and 4 | 99.32 | 17.80 | -81.52 | -82.08% |
| Total | 357.72 | 92.51 | -265.22 | |
SDG&E
On September 14, 2012, the peak temperature in SDG&E's service territory was 109 degrees. The Demand Bidding, Summer Saver, and Base Interruptible Program ex post load impacts were above the daily forecast in a range between 8% and 167%. Since the absolute value of the Base Interruptible Program load impact is approximately 1 MW, a small increase or decrease in the daily forecast prediction can result in high variability in the percent difference between these two figures. Conversely, the Capacity Bidding Day-Of and Day-Ahead Programs and the Critical Peak Pricing Emergency Program ex post load impacts were below the daily forecast in a range between 12% and 44%.
Table 11: SDG&E's September 14, 2012 Demand Response Events

| Program Name | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|
| Capacity Bidding Program (Day Of) | 9.00 | 5.70 | -3.30 | -36.67% |
| Capacity Bidding Program (Day Ahead) | 12.10 | 10.60 | -1.50 | -12.40% |
| Demand Bidding Program | 5.00 | 5.40 | 0.40 | 8.00% |
| Summer Saver (AC Cycling) | 15.50 | 22.50 | 7.00 | 45.16% |
| Base Interruptible Program | 0.30 | 0.80 | 0.50 | 166.70% |
| Critical Peak Pricing Emergency | 1.60 | 0.90 | -0.70 | -43.75% |
| Total | 43.50 | 45.90 | 2.40 | |
C. Detailed Program Analysis
The following section discusses programs and events that produced the load reductions forecasted by the daily reports, as well as programs that failed to produce the forecasted load reductions. For this purpose, all programs and events that came within 10% (+/-) of the forecasted load reductions are considered to be consistent with the daily forecast, and all programs and events that deviated from the forecasted load reductions by more than 50% (+/-) are considered to have failed to produce the forecasted load reductions.
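For illustration only, the short sketch below (not part of the Utilities' or staff's reporting tools) applies the +/-10% and +/-50% thresholds described above to a single event; the helper function and labels are hypothetical.

```python
# Classify a DR event by how far its ex post load reduction landed from the daily forecast,
# using the thresholds described in the text. Illustrative sketch only.
def classify_event(forecast_mw: float, ex_post_mw: float) -> str:
    """Return a performance label based on the percent difference, D = (B - A) / A."""
    pct_diff = (ex_post_mw - forecast_mw) / forecast_mw
    if abs(pct_diff) <= 0.10:
        return "consistent with daily forecast"
    if abs(pct_diff) > 0.50:
        return "failed to produce forecasted reduction"
    return "between 10% and 50% of forecast"

# Example: SCE's Save Power Day event on 8/10/12 (forecast 107.24 MW, ex post 95.85 MW),
# which came in about 10.6% below forecast.
print(classify_event(107.24, 95.85))   # -> "between 10% and 50% of forecast"
```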
SCE
There were a total of 104 separate events in the SCE service area in 2012. Only ten of these events produced load reductions consistent with those forecasted in the daily reports. As shown in Table 12 below, all of these events produced fairly sizable load reductions, ranging from 59 to 130 MW, with the exception of one Capacity Bidding Program event, which produced a very small load reduction.
Table 12: SCE's DR Events with Ex Post Results within +/- 10% of the Daily Forecast

| Program Name | Event Date | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|---|
| Summer Discount Plan (Residential) | 08/14/12 | 130.40 | 119.40 | -11.00 | -8.44% |
| Summer Discount Plan (Residential) | 08/29/12 | 82.56 | 80.30 | -2.26 | -2.74% |
| Summer Discount Plan (Residential) | 08/01/12 | 58.60 | 57.10 | -1.50 | -2.56% |
| Summer Discount Plan (Residential) | 08/15/12 | 77.77 | 77.50 | -0.27 | -0.35% |
| Demand Bidding Program | 10/17/12 | 79.05 | 79.25 | 0.20 | 0.26% |
| Demand Bidding Program | 10/01/12 | 78.75 | 79.78 | 1.03 | 1.31% |
| Summer Discount Plan (Residential) | 08/09/12 | 118.06 | 121.20 | 3.14 | 2.66% |
| Summer Discount Plan (Residential) | 08/28/12 | 83.86 | 88.20 | 4.34 | 5.18% |
| Capacity Bidding Program (Day Ahead) | 07/31/12 | 0.0700 | 0.0740 | 0.00 | 5.71% |
| Demand Bidding Program | 08/08/12 | 85.59 | 92.95 | 7.36 | 8.60% |
Of the 104 events in 2012, thirty (or about 29%) were more than 50% off of the day-ahead forecast. Five of these events produced load reductions that were greater than the forecast, while the remaining 25 were lower than the forecast. The three events with the highest percentage difference below the forecast were very small Day-Ahead Capacity Bidding Program events, and thus are not considered the most critical problem. Twenty-one of the remaining events were Summer Discount Plan (AC Cycling) events, and these varied markedly from the forecast.
Table 13: SCE's DR Events with Ex Post Results greater than +/- 50% of the Daily Forecast

| Program Name | Event Date | Daily Forecast MW (A) | Ex Post MW (B) | Difference MW (C = B - A) | % Difference (D = C/A) |
|---|---|---|---|---|---|
| Capacity Bidding Program (Day Ahead) | 10/01/12 | 0.09 | -0.20 | -0.29 | -315.22% |
| Capacity Bidding Program (Day Ahead) | 10/02/12 | 0.09 | -0.10 | -0.20 | -213.04% |
| Capacity Bidding Program (Day Ahead) | 10/05/12 | 0.09 | -0.07 | -0.16 | -170.65% |
| Save Power Days / Peak Time Rebates | 09/07/12 | 108.66 | -23.11 | -131.77 | -121.27% |
| Summer Discount Plan (Residential) | 06/20/12 | 128.01 | 0.50 | -127.51 | -99.61% |
| Save Power Days / Peak Time Rebates | 09/10/12 | 108.52 | 1.65 | -106.87 | -98.48% |
| Summer Discount Plan (Residential) | 09/14/12 | 135.61 | 20.70 | -114.91 | -84.74% |
| Summer Discount Plan (Residential) | 07/10/12 | 263.67 | 44.70 | -218.97 | -83.05% |
| Summer Discount Plan (Residential) | 09/14/12 | 99.32 | 17.80 | -81.52 | -82.08% |
| Summer Discount Plan (Residential) | 06/29/12 | 178.26 | 33.30 | -144.96 | -81.32% |
| Summer Discount Plan (Residential) | 09/20/12 | 77.39 | 14.60 | -62.79 | -81.14% |
| Summer Discount Plan (Residential) | 06/29/12 | 178.26 | 35.80 | -142.46 | -79.92% |
| Summer Discount Plan (Residential) | 07/10/12 | 263.67 | 66.60 | -197.07 | -74.74% |
| Summer Discount Plan (Residential) | 10/02/12 | 298.91 | 86.20 | -212.71 | -71.16% |
| Summer Discount Plan (Residential) | 07/10/12 | 263.67 | 76.70 | -186.97 | -70.91% |
| Summer Discount Plan (Residential) | 09/20/12 | 65.53 | 21.10 | -44.43 | -67.80% |
| Summer Discount Plan (Residential) | 09/20/12 | 65.73 | 21.90 | -43.83 | -66.68% |
| Summer Discount Plan (Residential) | 09/14/12 | 110.89 | 37.80 | -73.09 | -65.91% |
| Summer Discount Plan (Residential) | 08/22/12 | 115.03 | 42.40 | -72.63 | -63.14% |
| Agriculture Pumping Interruptible | 09/26/12 | 60.56 | 24.00 | -36.56 | -60.36% |
| Summer Discount Plan (Residential) | 09/21/12 | 168.96 | 69.10 | -99.86 | -59.10% |
| Summer Discount Plan (Residential) | 09/28/12 | 55.06 | 24.50 | -30.56 | -55.50% |
| Agriculture Pumping Interruptible | 08/14/12 | 36.00 | 17.29 | -18.72 | -51.99% |
| Summer Discount Plan (Residential) | 10/17/12 | 127.25 | 62.30 | -64.95 | -51.04% |
| Summer Discount Plan (Residential) | 10/17/12 | 146.77 | 72.30 | -74.47 | -50.74% |
| Summer Discount Plan (Residential) | 08/17/12 | 101.30 | 153.00 | 51.70 | 51.04% |
| Capacity Bidding Program (Day Ahead) | 10/29/12 | 0.09 | 0.15 | 0.06 | 59.78% |
| Summer Discount Plan (Residential) | 08/17/12 | 58.00 | 98.30 | 40.30 | 69.48% |
| Capacity Bidding Program (Day Ahead) | 10/18/12 | 0.09 | 0.17 | 0.08 | 85.87% |
| Summer Discount Plan (Residential) | 09/10/12 | 18.98 | 68.40 | 49.42 | 260.42% |
Summer Discount Plan
The Summer Discount Plan event variability ranges from 121% below the forecast (with a load increase rather than a load reduction) to 260% above the forecast. Overall, the AC Cycling program represents the most variance[13] of all the SCE DR programs. When all of the variances for individual events are aggregated, the AC Cycling program represents 49% of the total variance. The Pearson product-moment correlation between the daily forecast and the ex post load impacts is 0.21, representing a very weak positive correlation.

[13] Variance in this context refers specifically to the absolute difference between the daily forecast and the event-day ex post load reductions.
The Pearson correlation between the average event temperature[14] and the event-level variance (the difference between the daily forecast and the event-day ex post load reductions) is 0.37, representing a moderately weak correlation. In everyday language, the weak correlation between the daily forecasts and the ex post load impacts means that SCE's 2012 Summer Discount Plan forecast method cannot be relied upon to effectively predict the actual program load reductions. In addition, there appears to be little relationship between the event-day temperature and the difference between the daily forecast and the event-day ex post load reductions, potentially ruling out temperature as an explanatory factor for the difference.
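For illustration only, the sketch below shows the two correlation checks described above. The three events shown are drawn from Table 13, but the average event temperatures are hypothetical placeholders; the full event-level dataset from SCE's load impact report would be needed to reproduce the 0.21 and 0.37 figures.

```python
# Illustrative sketch of the correlation checks described above (not SCE's actual analysis).
import numpy as np

forecast_mw = np.array([130.40, 263.67, 135.61])   # daily forecast, MW
ex_post_mw  = np.array([119.40,  44.70,  20.70])   # ex post load reduction, MW
avg_temp_f  = np.array([  95.0,   98.0,  100.0])   # hypothetical average event temperatures

variance_mw = np.abs(forecast_mw - ex_post_mw)     # "variance" as defined in footnote 13

corr_forecast_expost = np.corrcoef(forecast_mw, ex_post_mw)[0, 1]
corr_temp_variance   = np.corrcoef(avg_temp_f, variance_mw)[0, 1]
print(round(corr_forecast_expost, 2), round(corr_temp_variance, 2))
```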
The Summer Discount Plan was (by far) the most often triggered program in SCE's 2012 DR portfolio. There were 23 separate events, including two early test events.[15] Most of the 23 events were split into three customer segments such that each group of customers was triggered for only a portion (i.e., one hour) of each event (typically lasting three hours). Three events, on 9/14, 9/20, and 9/28, deployed six customer segments. SCE operated the program in this manner to avoid cycling its customers' air conditioners for more than one hour at a time.[16] The purpose of this strategy is to minimize the impact on customers from the loss of one hour of AC service, compared to multiple continuous hours, while in theory still allowing the utility to reduce load when needed.
As shown in Table 14 below, the implementation of this strategy, however, resulted in a rebound effect: the groups curtailed in event hours 1 and 2 added load in hours 2 and 3 as AC units ran at above-normal capacity to return the participants' buildings to their original temperature set points.[17] The net effect was to dampen the average hourly load impact for the entire event period, as illustrated in Table 14. It is possible that the daily forecasts were prepared assuming that all customers would be curtailed at the same time over the entire duration of the event. In such a case, the average hourly load reductions would likely have been larger because all customers would be simultaneously curtailed and the rebound effect would be delayed until after the event was over. This issue is further illustrated in Chapter 2, Section IV, "Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)".
Table 14: SCE's Hourly Load Impact from a Sept 14 Summer Discount Plan Event
(MW; positive values are load reductions, negative values are rebound load increases. Event hours with rebound: hours ending 16-18; post-event rebound: hours ending 19-20.)

| Group (cycling hour beginning) | HE 16 | HE 17 | HE 18 | HE 19 | HE 20 | Event Hour Average |
|---|---|---|---|---|---|---|
| 15 | 39.6 | -25.1 | -17.0 | | | |
| 16 | | 27.1 | -27.0 | -39.6 | | |
| 17 | | | 21.3 | -49.6 | -37.8 | |
| Hour Total | 39.6 | 2.0 | -22.7 | -89.2 | -37.8 | 6.3 |

[14] SCE Final 2012 Ex Post Ex Ante Load Impacts for SCE's SDP, filed in R.07-01-041 on April 2, 2013.
[15] The last two events in late October were not included in the ex post analysis.
[16] SCE-01 Testimony at 11.
[17] SCE Final 2012 Ex Post Ex Ante Load Impacts for SCE's SDP, filed in R.07-01-041 on April 2, 2013.
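As a cross-check on Table 14, the short sketch below (not SCE's evaluation code) reproduces the hourly totals and the dampened event-hour average from the per-group figures, using the same sign convention as the table (positive MW = load reduction, negative MW = rebound).

```python
# How sequential one-hour cycling dampens the average hourly load impact: each group's
# reduction is partly offset by the rebound of the group(s) cycled before it.
hourly_impact_mw = {
    "HE16": [39.6],                  # group 1 cycled
    "HE17": [27.1, -25.1],           # group 2 cycled, group 1 rebounds
    "HE18": [21.3, -27.0, -17.0],    # group 3 cycled, groups 1 and 2 rebound
    "HE19": [-39.6, -49.6],          # post-event rebound only
    "HE20": [-37.8],
}

hour_totals = {hour: round(sum(vals), 1) for hour, vals in hourly_impact_mw.items()}
event_hours = ["HE16", "HE17", "HE18"]
event_hour_avg = round(sum(hour_totals[h] for h in event_hours) / len(event_hours), 1)

print(hour_totals)      # {'HE16': 39.6, 'HE17': 2.0, 'HE18': -22.7, 'HE19': -89.2, 'HE20': -37.8}
print(event_hour_avg)   # 6.3 MW average over the three event hours
```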
Another potential explanation for the suboptimal performance could be customers exercising the override option in their enrollment contracts with SCE. However, SCE's A.12-12-016 testimony[18] indicates that the proportion of customers with an override option is fairly small (about 1% of the customers enrolled in SDP) and that these customers rarely exercise the override option. Finally, it is possible that transitioning the Summer Discount Plan from an emergency program to a price-responsive program could have introduced some additional uncertainties that are not adequately captured by the current forecasting methods. Regardless of the explanation for the unexpectedly low load reductions during these events, it is critical that SCE improve the day-ahead forecast for the SDP program as a whole.
Energy Division staff reviewed SCE's method for forecasting the Summer Discount Plan program.[19] The methodology, provided in Appendix C, is described in a 1986 internal SCE memorandum and consists of a simple algorithm that estimates the load reduction per ton of AC based on the forecasted temperature. The equation coefficients were determined by a 1985 load reduction study that SCE staff could not locate when requested to do so by Energy Division staff. Without the 1985 load reduction study, Energy Division staff could not fully evaluate the forecasting methodology. SCE did provide a revised algorithm that modifies the equation structure, but the underlying methods for estimating its coefficients remain unexplained.
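Because the 1985 coefficients could not be located, the sketch below only illustrates the general shape of such an algorithm: a temperature-driven kW-per-ton estimate scaled by enrolled AC tonnage. The linear form, coefficients, and enrollment figure are placeholder assumptions, not SCE's actual method.

```python
# Hedged sketch of a temperature-based per-ton Summer Discount Plan forecast.
# The functional form and coefficients are illustrative placeholders only.
def sdp_daily_forecast_mw(forecast_temp_f: float, enrolled_tons: float,
                          intercept_kw_per_ton: float = -2.0,
                          slope_kw_per_ton_per_degf: float = 0.03) -> float:
    """Estimate event load reduction (MW) as a kW-per-ton response times enrolled tonnage."""
    kw_per_ton = max(0.0, intercept_kw_per_ton + slope_kw_per_ton_per_degf * forecast_temp_f)
    return kw_per_ton * enrolled_tons / 1000.0  # convert kW to MW

# Hypothetical inputs: 95 F forecast and 400,000 enrolled tons of cycled AC capacity.
print(round(sdp_daily_forecast_mw(95.0, 400_000), 1))
```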
This evidence suggests that there is a critical flaw in either the way the Summer Discount Plan events are forecasted or in the operation of the program, or both. The lack of a reliable day-ahead forecasting method is a major weakness that undermines the ability to fully consider AC Cycling in CAISO grid operations. Whether or not the Utilities' DR resources are eventually bid into the CAISO market (they currently are not), ED recommends that SCE immediately document the forecasting methods to be used for the 2013 season and thoroughly vet the methods with CPUC and CAISO staff and relevant stakeholders to ensure the proposed forecasting methods are reasonable and reliable. Throughout the 2013 summer season (and longer if necessary), SCE should consider piloting more than one forecasting method, which should be tested using small ex post load impact evaluations to identify the most reliable forecasting methods.
Base Interruptible Program
The Base Interruptible Program was triggered only once during the entire 2012 season, and this was a test event. This single event produced 573 MW of load reductions on September 26. The load reductions for this event were 59 MW more than the day-ahead forecast. It is worth
noting that the single Base Interruptible event was more than three times the load reduction of
any other SCE program event during 2012, and it was not triggered on one of the critical event
days discussed earlier in this section.
The Commission should explore a policy requiring more frequent deployments of this
program since it appears to have significant, yet underutilized, potential.
[18] SCE-01 Testimony at 11, Lines 3-5.
[19] See Appendix S.
Capacity Bidding Program
The Capacity Bidding Program Day-Ahead events produced an average load reduction of 0.03 MW across all events. With the exception of three events in October (which were associated with negative load reductions in the ex post analysis), most events produced the relatively small load reductions forecasted by the daily report. None of the Capacity Bidding Program Day-Ahead events occurred in August and September, when load reductions are typically most needed.
By comparison, all of SCE's Capacity Bidding Program Day-Of events exceeded the forecasted load reductions, by an average of 32%. The average load reduction for the Capacity Bidding Program Day-Of events was 15.9 MW, over 500 times the load reductions produced by the Day-Ahead events.
This evidence suggests that, unlike the Day-Of program, the Day-Ahead Capacity Bidding Program may not be serving a useful function in SCE's DR portfolio.
Demand Bidding Program
The Demand Bidding Program was called on eight occasions during the summer of 2012. Of these eight events, five occurred in August. The first two August events, on August 8 and August 10, resulted in load reductions that exceeded the daily forecast by an average of 10%. The third and fourth events, on August 14 and August 16, were 34% short of the forecasted load reductions, and the fifth event, on August 29, was 40% below forecast, suggesting that a decline in customer participation over successive events could be explored as a potential factor in the diminishing returns.
Demand Response Contracts (DRC) – Nominated
Somewhat surprisingly, there were only two events for which Demand Response Contracts
were called. The ex post load reductions for these two events were both around 35% below
the daily forecast. Energy Division was not able to examine why this program performed so
poorly. As noted earlier, SCE's DRCs expired on December 31, 2012, and have since been replaced by new contracts approved by the Commission.
Save Power Days / Peak Time Rebates (PTR) –– Price Responsive
Daily forecasts were not provided by SCE for the four PTR events that occurred in August; thus, comparisons between the daily forecast and ex post results are possible only for the two events on September 7 and September 10. Both September events were forecasted to reduce loads by 109 MW. Ex post results, however, indicate that the PTR events had no impact at all. In fact, the September 7 event was correlated with a fairly significant load increase of 23.11 MW.
Ex post load reductions were estimated for the four August PTR events, for which day-ahead estimates were not provided by SCE. As a proxy for the daily forecast, the 7-day reports were used. As shown in Table 15 below, the estimated load reductions were all approximately 107 to 109 MW, while the ex post load reductions ranged between 0.02 and 96 MW.
Table 15: SCE’’s Peak Time Rebate MW
Event Day 7 Day Report Ex Post
8/10/2012 107.24 MW 95.85 MW
8/16/2012 107.61 MW 24.43 MW
8/29/2012 108.51 MW 21.93 MW
8/31/2012 108.73 MW 0.02 MW
Given the considerable variability in ex post results for the PTR program events, the day
ahead forecasting and event reporting will need significant revision to account for these
discrepancies. If the PTR program is going to continue, staff recommends that SCE prepare a
proposal for a viable forecast and submit that for staff to review.
SDG&E
There were a total of 46 DR program events triggered on 14 event days in SDG&E's service area from June 2012 to October 2012. Daily forecasts for twelve DR program events were within ±10% of the ex post load impacts. As depicted in Table 16, moderate load reductions ranging from 5 to 17 MW were produced when these events were triggered. Three programs delivered accurate results with a moderate degree of consistency: the Demand Bidding Program, Critical Peak Pricing, and the Capacity Bidding Program Day-Of.
Table 16: SDG&E's DR Events with Ex Post Results within ±10% of the Daily Forecast
Program Name | Event Date | Daily Forecast (MW) | Ex Post (MW) | Difference, Forecast & Ex Post (MW) | % Difference Between Forecast & Ex Post
Demand Bidding Program 10/2/2012 5 4.6 0.4 8.00%
Capacity Bidding Program (Day Of) 8/8/2012 11.7 11 0.7 5.98%
Capacity Bidding Program (Day Ahead) 8/9/2012 7.5 7.5 0 0.00%
Capacity Bidding Program (Day Ahead) 8/14/2012 7.5 7.5 0 0.00%
Capacity Bidding Program (Day Ahead) 8/10/2012 7.5 7.6 0.1 1.33%
Demand Bidding Program 8/14/2012 5 5.1 0.1 2.00%
Summer Saver (AC Cycling) 9/15/2012 8.6 8.8 0.2 2.33%
Critical Peak Pricing 10/2/2012 16 16.5 0.5 3.13%
Critical Peak Pricing 8/21/2012 16.5 17.2 0.7 4.24%
Critical Peak Pricing 9/15/2012 13.7 14.5 0.8 5.84%
Demand Bidding Program 9/14/2012 5 5.4 0.4 8.00%
Critical Peak Pricing 8/30/2012 16.2 17.8 1.6 9.88%
A total of 19 DR program events had ex post load impacts that deviated from the daily forecasts by more than ±50%, as depicted in Table 17. In particular, the residential and commercial Peak Time Rebate program ex post load impacts deviated from the daily forecasts by greater than 70%. According to SDG&E, the commercial Peak Time Rebate ex post load impacts were deemed not statistically significant. On this basis, SDG&E reported zero load impacts for this program.
Table 17: SDG&E's DR Events with Ex Post Results Greater than ±50% of the Daily Forecast
Program Name | Event Date | Daily Forecast (MW) | Ex Post (MW) | Difference, Forecast & Ex Post (MW) | % Difference Between Forecast & Ex Post
A | B | C = B - A | D = C/A
Commercial Peak Time Rebate 8/9/2012 1.2 0 1.2 100.00%
Commercial Peak Time Rebate 8/10/2012 1.1 0 1.1 100.00%
Commercial Peak Time Rebate 8/11/2012 0.8 0 0.8 100.00%
Commercial Peak Time Rebate 8/14/2012 1.2 0 1.2 100.00%
Commercial Peak Time Rebate 8/21/2012 1.2 0 1.2 100.00%
Commercial Peak Time Rebate 9/15/2012 0.9 0 0.9 100.00%
Residential Peak Time Rebate 8/14/2012 12.5 1.1 11.4 91.20%
Residential Peak Time Rebate 8/21/2012 25 3 22 88.00%
Residential Peak Time Rebate 8/11/2012 12.2 1.7 10.5 86.07%
Residential Peak Time Rebate 8/9/2012 13.1 3.3 9.8 74.81%
Residential Peak Time Rebate 8/10/2012 12.6 3.2 9.4 74.60%
Residential Peak Time Rebate 9/15/2012 32.3 8.3 24 74.30%
Residential Peak Time Rebate 7/20/2012 23.9 6.3 17.6 73.64%
Capacity Bidding Program (Day Ahead) 10/1/2012 9 4.1 4.9 54.44%
Capacity Bidding Program (Day Ahead) 10/2/2012 9 4.2 4.8 53.33%
Summer Saver (AC Cycling) 9/14/2012 15.5 22.5 7 45.16%
Critical Peak Pricing 8/11/2012 11.7 18.4 6.7 57.26%
Critical Peak Pricing 8/14/2012 14.3 25.9 11.6 81.12%
Base Interruptible Program 9/14/2012 0.3 0.8 0.5 166.67%
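For reference, the comparison columns in Tables 16 and 17 follow the arithmetic shown in the Table 17 header (C = B - A and D = C/A, with the tables reporting magnitudes). A minimal sketch of that calculation, using the first Table 16 row as the example:

```python
# Sketch of the comparison arithmetic used in Tables 16 and 17:
# C = B - A (MW difference) and D = C / A (percent difference),
# where A is the daily forecast and B is the ex post load impact.
# Note that the tables themselves report the magnitudes of C and D.

def forecast_vs_ex_post(daily_forecast_mw: float, ex_post_mw: float):
    difference_mw = ex_post_mw - daily_forecast_mw          # C = B - A
    pct_difference = difference_mw / daily_forecast_mw      # D = C / A
    return difference_mw, pct_difference

# Example row from Table 16: Demand Bidding Program, 10/2/2012.
diff_mw, pct = forecast_vs_ex_post(daily_forecast_mw=5.0, ex_post_mw=4.6)
print(f"Difference: {diff_mw:+.1f} MW, {pct:+.2%}")   # -0.4 MW, -8.00%
```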
Capacity Bidding Program Day Ahead (CBP DA)
The percent difference between the CBP DA daily forecasts and ex post results ranged from -32% to 12% (Table 3). Based upon this assessment, the daily forecasts for CBP DA were not accurate or consistent predictors of ex post results.
Since the CBP DA daily forecast model does not have a variable that accounts for weather,
and the ex post models do, this methodological difference could account for the variability
between the two load impact measures. Another factor that could affect this difference is the
percent load impact per customer. Although customers submit load impact bids prior to each
DR event, the actual load reduction on the event day may not coincide with the projected load
reduction.
If weather affects event day load reduction by CBP customers, the addition of a weather
variable to the daily forecast model could increase its accuracy. In order to address uncertainty
in the percent load reduction per CBP customer, DR test events could be scheduled to measure
this value on event like days.
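A minimal sketch of what adding a weather variable to a day-ahead CBP forecast model might look like, assuming a simple least-squares fit of realized event impacts on the nominated bid quantity and forecast temperature; the data values and model form are illustrative assumptions, not SDG&E's actual model.

```python
# Minimal sketch: fit a day-ahead load-impact model that includes a weather
# variable (forecast max temperature) alongside the nominated bid quantity.
# Data values and model form are illustrative, not SDG&E's actual CBP model.
import numpy as np

bid_mw     = np.array([8.0, 7.5, 9.0, 7.0, 8.5])   # customer-nominated bids (MW)
temp_f     = np.array([88., 92., 95., 85., 90.])   # forecast max temperature (F)
ex_post_mw = np.array([6.5, 7.2, 8.8, 5.9, 7.6])   # realized load impacts (MW)

# Design matrix: intercept, bid, temperature.
X = np.column_stack([np.ones_like(bid_mw), bid_mw, temp_f])
coef, *_ = np.linalg.lstsq(X, ex_post_mw, rcond=None)

# Day-ahead prediction for the next event given its bid and forecast weather.
next_event = np.array([1.0, 8.0, 93.0])
print(f"Predicted load impact: {next_event @ coef:.1f} MW")
```

Test events scheduled on event-like days would supply additional observations with which to refit these coefficients and to pin down the per-customer load reduction term.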
Capacity Bidding Program Day Of (CBP DO)
Similar to the CBP DA program, the CBP DO daily forecasts were neither accurate nor consistent predictors of ex post results, based upon the range of the difference between the two load impact measures, -27.4% to 6.0% (Table 2). As stated above, inclusion of a weather variable in the daily forecast model and measurement of the percent load reduction per customer during test events could increase the accuracy and consistency of the daily forecast model in predicting ex post load impacts.
Demand Bidding Program (DBP)
The percent difference between the DBP daily forecasts and ex post load impacts ranged from -8.0% to 8.0% (Table 3) for the three DBP events that were called during the summer. Based upon this result, the DBP daily forecast accurately and consistently predicted ex post load impacts.
One caveat in making a general assessment of the DBP forecast model is that only one customer provided load reduction bids for the summer DR events. To make such an assessment, it would be advisable to examine forecast and load impact data from at least 5 to 10 event days.
Commercial Peak Time Rebate
SDG&E reported zero ex post load impacts for this program in its March 4th filing. According to SDG&E, zero values do not imply that no load reduction occurred but that the load impacts were not statistically significant.20 Therefore, a comparison of daily forecasts and ex post load impacts could not be performed.
Based upon conversations with SDG&E, the lack of effectiveness of the commercial Peak
Time Rebate program could be attributed to a weak price signal and inelastic customer demand
during event periods. SDG&E would be advised to discontinue the commercial Peak Time
Rebate program.
Residential Peak Time Rebate
The percent difference between daily forecast and ex post load impacts ranged from -91.2% to -73.6% (Table 3). This implies that the residential Peak Time Rebate program daily forecast is not an accurate predictor of ex post load impacts; however, the daily forecast consistently over-predicted the ex post results.
Since the ex post methodology only modeled load impacts for customers who signed up to receive e-mail or text alerts, and the daily forecast model does not, it is possible that the accuracy of the daily forecast model could improve if there were parity between the two methodologies. Including only residential Peak Time Rebate opt-in customers in the daily forecast model may resolve the discrepancy. As an alternative, since the daily forecast consistently over-predicted the ex post results, SDG&E might consider derating its daily forecasts by a factor of 0.7 to 0.9 to better align them with ex post load impacts.
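Because "derating by a factor of 0.7 to 0.9" can be read either as multiplying the forecast by 0.7 to 0.9 or as reducing it by 70 to 90 percent, the sketch below shows both readings against one example event from Table 17; only the second reading brings the forecast into the range of the observed ex post result.

```python
# Sketch of the derating idea.  "Derate by a factor of 0.7 to 0.9" can be read
# two ways; both are shown so the comparison with ex post results is explicit.
# The forecast and ex post values are the 8/21/2012 event from Table 17.

forecast_mw = 25.0
ex_post_mw = 3.0

for factor in (0.7, 0.8, 0.9):
    multiplied = forecast_mw * factor           # reading 1: multiply by factor
    reduced_by = forecast_mw * (1.0 - factor)   # reading 2: reduce by factor
    print(f"factor {factor:.1f}: x{factor:.1f} -> {multiplied:.1f} MW, "
          f"reduced by {factor:.0%} -> {reduced_by:.1f} MW")

# Ex post for this event was 3.0 MW; only the "reduce by 70-90 percent"
# reading brings the forecast into that neighborhood (2.5-7.5 MW).
```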
Summer Saver (AC Cycling)
The range of the percent difference between daily forecast and ex post load impacts, -64.0% to 38.7%, presented in Table 3 indicates that the daily forecast is not an accurate or consistent predictor of ex post load impacts.
20 SCE 03 at 21.
It should be noted that both the residential and commercial Summer Saver ex post methodologies (a randomized experiment and a panel vs. customer regression, respectively) differed from prior years due to the availability of smart meter data.21 This could account for the difference between daily forecast and ex post results. In addition, both ex post methodologies utilized control and treatment groups, whereas the daily forecast methodologies did not. Based on this assessment, it would be advisable to examine how the daily forecast and ex post models could be harmonized.
Based upon a conversation with SDG&E, a temperature-squared variable is utilized in the daily forecast model. Compared to SCE's current AC cycling daily forecast model, SDG&E's daily forecast model includes this additional measure of accuracy. However, in order to better predict customer behavior on successive event days or during prolonged event hours, SDG&E might consider including an autocorrelation variable in the daily forecast model.
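One way to express the suggestion, as a sketch rather than SDG&E's actual specification, is a daily forecast model that keeps the temperature-squared term and adds a lagged (autocorrelation) term for consecutive event days:

```latex
y_t = \beta_0 + \beta_1 T_t + \beta_2 T_t^{2} + \rho\, y_{t-1} + \varepsilon_t
```

Here y_t is the load impact on event day t, T_t is the forecast temperature, and y_{t-1} is the impact observed on the preceding event day; the coefficients would be estimated from historical event data, and the sign and size of the estimated rho would indicate how response on one event day carries over to the next.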
Critical Peak Pricing
The percent difference between the daily forecast and ex post results ranged from 3.1% to 81.1%. This is the only program where the ex post results consistently outperformed the daily forecast predictions.
According to SDG&E, the percent load impacts for the Critical Peak Pricing program in 2012 were lower than in 2011, which led to an underestimation in the daily forecast.22 Critical Peak Pricing has approximately 1,000 customers and, as SDG&E claims, any variation in the percent load reduction per customer could lead to high variation in the aggregate impact estimates. This would also be the case for large-scale residential DR programs, including Peak Time Rebate and Summer Saver (AC Cycling).
SDG&E also claims that measurement error might account for differences between load impact category values. However, no explanation is provided to elucidate how the measurement error occurred (e.g., whether, because Smart Meters were not fully deployed in SDG&E's territory during summer 2012, measured load reductions obtained from analog meters were inaccurate).
Base Interruptible Program
The percent difference between the daily forecast and ex post load impact for the Base
Interruptible Program was 166.7%.
Since two large Base Interruptible Program customers dropped out of the program, SDG&E
was not able to accurately forecast the load impact from the remaining customers. It is
possible that further analysis with additional Base Interruptible Program load impact data might
shed light on the accuracy of the daily forecasting methods.
21 SDG&E load impact Filing Executive Summary, April 2, 2012 at 31.
22 SGE 03 at 19.
Critical Peak Pricing –– Emergency
Due to decreasing customer subscription to this tariff, the CPP E program was discontinued
as of December 31, 2012.23
D. Summary of Recommendations
Given the divergence between the daily forecast estimates and ex post load impact results,
staff makes the following recommendations:
The daily forecasting methods for all programs must be improved.
The daily forecasting methods should be better documented and should be
developed with relevant agencies and stakeholders.
SCE should test a number of different forecasting methods for the Summer Discount
Plan program.
SCE should change the Summer Discount Plan program strategy of calling groups of
customers for sequential one hour cycling events.
SDG&E should include only opt in customers in the residential PTR daily forecast
model.
SDG&E should run a test event to improve CPP daily forecast estimates.
SDG&E should account for customer behavior during successive event days in the
Summer Saver daily forecast model.
SDG&E should include a weather variable in the CBP forecast model.
IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)
A. Summary of the Staff Analysis and Recommendations
Comparing the 2012 ex post results with the 2012 RA forecast is not an accurate method of determining how the DR programs performed. The RA load forecast represents the maximum capacity DR can provide under a set of conditions for resource planning needs. Ex post load impacts reflect the demand reductions obtained during actual events in response to operational planning needs. Resource planning and operational planning differ in terms of conditions (i.e., event hours, participation, and temperature) and purposes.
However, in summer 2012 the Utilities' DR programs were not utilized to their full capacity even under extreme hot weather conditions. This raises the question of the usefulness of the current RA forecast and whether the RA forecast should be changed to reflect conditions representative of operational needs, including the utilities' day-to-day resource availability limitations and DR dispatch strategies for optimal customer experience. A working group consisting of the CPUC, CEC, CAISO, and the IOUs should be assembled to address the forecast needs (i.e., resource planning, operational planning) and input assumptions (i.e., growth rate, dropout rate) used for forecasting RA.
23 SDG&E load impact Filing Executive Summary, April 2nd, at 61.
B. Background
The 2012 RA forecast represents the maximum capacity DR can provide under a set of conditions for resource planning needs. The conditions entail a 1-in-2 weather year,24 portfolio-level impacts, full participation, a five-hour event window (1 p.m. to 6 p.m.), and an enrollment forecast assumption.
The 2012 ex post load impacts reflect the demand reductions obtained during actual events in response to operational needs. Operational needs on the event day may not require the full capacity of DR because conditions do not warrant it. Utilities have the discretion to call a few DR programs with shorter event hours or a smaller group of participants based on their generation and DR resource dispatch strategies.25 This means an ex post impact may reflect only a one-hour event window versus an RA forecast that assumes a five-hour event window. Similarly, the ex post impact may reflect only a segment of a program's participants versus the RA forecast that assumed the program's entire set of participants, and the ex post impact may reflect a lower temperature versus the higher 1-in-2 weather year temperature assumed in the RA forecast.
C. Staff Analysis
Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate method of assessing how well a program performs against its forecast.
The table below contains the August monthly average load impacts for the 2012 Resource Adequacy (RA) forecast, as filed in the spring of 2011, and the ex post results that occurred in 2012. There are stark differences between what the Utilities forecasted a year ahead (RA) and the actual results (ex post). On average for the month of August, the variability ranges from 485% (over-performance) to 95% (under-performance) for SCE and from 58% to 97% for SDG&E.
The main reason for the discrepancy is that the RA data is used to assist in resource planning, which means it is characterized as a five-hour event in which all customers are called for the entire period (1-6 p.m.) for the summer, whereas the ex post results reflect the impact of actual DR operations, which can be a one-hour event in which some (not all) customers are called for a short period of time. Other factors that contributed to the discrepancy include temperature, enrollment, and dual participation.
24 Represents the monthly peak day temperature for an average year. Exhibit SGE 03, Page 14.
25 SGE 06, Page 6.
Table 18: SCE's Demand Response Load Impact
2012 Resource Adequacy vs. 2012 Ex Post, August Average (MW)
Program Name | RA Forecast26 | Ex Post27 | Difference, RA vs. Ex Post | % Difference, RA vs. Ex Post
A | B | C = B - A | D = C/A
Demand Bidding Program | 12 | 72 | 60 | 485%
Demand Response Contracts | 105 | 182 | 77 | 74%
Base Interruptible Program28 | 548 | 573 | 25 | 5%
Capacity Bidding Program Day Of | 19 | 17 | 2 | 11%
Summer Advantage Incentive/Critical Peak Pricing | 69 | 39 | 30 | 44%
Agricultural Pumping Interruptible | 40 | 17 | 22 | 57%
Summer Discount Plan/AC Cycling – Residential | 500 | 212 | 288 | 58%
Save Power Days/Peak Time Rebates | 266 | 36 | 230 | 87%
Capacity Bidding Program Day Ahead29 | 1 | 0 | 1 | 94%
Summer Discount Plan/AC Cycling – Commercial | 62 | 3 | 59 | 95%
Table 19: SDG&E Demand Response Load Impact
2012 Resource Adequacy vs. 2012 Ex Post, August Average (MW)
Program Name | RA Forecast30 | Ex Post31 | Difference, RA vs. Ex Post | % Difference, RA vs. Ex Post
A | B | C = B - A | D = C/A
Critical Peak Pricing Default | 12 | 19 | 7 | 58%
Summer Saver/AC Cycling | 15 | 19 | 4 | 27%
Capacity Bidding Program Day Ahead | 10 | 8 | 2 | 20%
Capacity Bidding Program Day Of | 22 | 10 | 12 | 55%
Base Interruptible Program32 | 11 | 0.84 | 10.16 | 92%
Reduce Your Use/Peak Time Rebates | 69 | 2 | 67 | 97%
Demand Bidding Program | n/a33 | 5 | n/a | n/a
Critical Peak Pricing Emergency | n/a | 1 | n/a | n/a
26 Exhibit SCE 03, Table 1.
27 Exhibit SCE 03, Table 1.
28 Number based on the September average because there were no events for the month of August.
29 Number based on the July average because there were no events for the months of August or September.
30 Exhibit SDG 03, Table 1.
31 Exhibit SDG 03, Table 1.
32 Number based on the September average because there were no events for the month of August.
33 DBP was not approved until the year after the 2012 RA forecast was filed.
Forecasting DR estimates for resource planning needs is different from forecasting for operational needs.
Unlike resource planning needs, operational needs on the event day may not require the full capacity of DR, either because conditions do not warrant it or because the Utilities deployed 'optimal' dispatch strategies for customer experience. Utilities have the discretion to call shorter event hours or a smaller group of participants if the system is adequately resourced for that day. As discussed in Chapter 3, peaker or other generation resources may have been dispatched instead of DR, even though such operation would be contrary to the Loading Order.34 For example, SCE can divide its residential Summer Discount Plan participants into three groups and dispatch each group for one hour of an event, resulting in three consecutive one-hour events (see chart below). Approximately one-third of the customers can be curtailed in any given hour. Rebound from the groups curtailed in event hours 1 and 2 can reduce the net impact in hours 2 and 3, lowering the average hourly impact for the entire event period. As a result, the average impact per hour can be roughly 100 MW for operational needs. The following figures illustrate the rebound effects of SCE's sub-group dispatch strategy for its AC cycling.
Figure 1. Source: SCE April 11, 2013 PowerPoint presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante Briefing.
34 http://www.cpuc.ca.gov/NR/rdonlyres/58ADCD6A-7FE6-4B32-8C70-7C85CB31EBE7/0/2008_EAP_UPDATE.PDF.
For the RA forecast, however, resource planning needs require the full capacity of DR. For example, SCE assumed all residential Summer Discount Plan participants would be curtailed at the same time to represent the full program capability during a reliability event (see chart below). The hourly impacts can be larger because all customers are curtailed at once and the rebound effect is delayed until the end of the entire event window. As a result, the average impact per hour for the RA forecast can be roughly 300 MW, roughly three times greater than the ex post hourly impact.
Figure 2. Source: SCE April 11, 2013 PowerPoint presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante Briefing.
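The contrast between the two dispatch patterns in Figures 1 and 2 can be illustrated with back-of-the-envelope arithmetic; the group size and rebound fraction below are placeholders chosen only to show the mechanism, not SCE model inputs.

```python
# Back-of-the-envelope comparison of staggered sub-group dispatch versus
# curtailing all Summer Discount Plan participants at once.  The group size
# and rebound fraction are illustrative placeholders, not SCE model inputs.

GROUP_MW = 100.0   # assumed curtailment per sub-group (three equal groups)
REBOUND = 0.25     # fraction of a released group's reduction that rebounds

# Staggered dispatch: one group per hour; rebound from the previously
# curtailed group offsets part of the current hour's reduction.
hour1 = GROUP_MW
hour2 = GROUP_MW - REBOUND * GROUP_MW
hour3 = GROUP_MW - REBOUND * GROUP_MW
staggered_avg = (hour1 + hour2 + hour3) / 3

# Simultaneous dispatch: all three groups curtailed for the full window;
# rebound is deferred until after the event ends.
simultaneous_avg = 3 * GROUP_MW

print(f"Staggered average impact:    {staggered_avg:.0f} MW per hour")
print(f"Simultaneous average impact: {simultaneous_avg:.0f} MW per hour")
```

Staggering caps the hourly impact near one group's worth of load and lets rebound erode it further, while simultaneous curtailment stacks all three groups and defers the rebound past the event window.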
The opposite extreme can also occur, where the ex post result is higher than the RA forecast. In the case of SCE's Demand Bidding Program, the average ex post result is 72 MW, six times the RA forecast of 12 MW (see Table 18). Dual participation was the major contributor to the discrepancy. For customers enrolled in two programs, such as the Base Interruptible Program and the Demand Bidding Program, the RA forecast counts the MW in only one program (the Base Interruptible Program) to avoid double counting.35 Had the two programs been called on the same day, the ex post would have shown a much lower amount for the Demand Bidding Program.
35 Portfolio level.
September 14, 2012 was considered a hot day (a 1-in-10 weather year condition36); however, SCE still did not dispatch all of its residential Summer Discount Plan participants. Instead, SCE dispatched only a portion of its participants for one hour at a time, resulting in five consecutive one-hour events. On average, SCE received only 6.3 MW37 for the event, a substantial underperformance compared to the RA forecast of 519 MW.38 This raises the question: if SCE chose not to dispatch all of its Summer Discount Plan participants in the same event hour during a 1-in-10 weather year condition, under what circumstances would SCE dispatch its Summer Discount Plan to its full program capacity? The usefulness of the RA forecast is in question if the utility does not test a DR program to its full capacity. Should the RA forecast process be amended to include another ex ante forecast based on operational needs, including optimal customer experience, and if so, what would that entail?
D. Conclusion and Recommendations
Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate method of determining DR program performance because the ex post results respond to operational needs, which can be entirely different from resource planning needs. However, in 2012 the DR programs were not tested to the full capacity assumed in the RA forecast. This raises the question of whether the RA forecast should be changed to reflect both planning needs and operational needs. A working group consisting of the CPUC, CEC, CAISO, and the IOUs should be assembled to address the forecast needs (i.e., resource planning, operational planning) and input assumptions (i.e., growth rate, dropout rate) used for forecasting RA. This working group should meet annually in December/January and agree on a set of input assumptions (i.e., growth rate, dropout rate) to be used for forecasting DR estimates.
36 Represents the monthly peak temperatures for the highest year out of a 10-year span. Exhibit SGE 03, Page 14.
37 Christensen Associates Energy Consulting, 2012 Load Impact Evaluation of Southern California Edison's Residential Summer Discount Plan (SDP) Program, April 1, 2013, Table 4-3d.
38 Exhibit SCE 03, Table 1, 2012 RA for the month of September.
Chapter 3: Demand Response Program Operations
I. Summary of Staff Analysis and Recommendations
The 2006 to 2011 data show that the Utilities historically triggered their DR programs far below the program limits in terms of number of events and hours. Even with the SONGS outage, the Utilities did not trigger their DR programs more frequently in summer 2012, as had been anticipated. Almost all of the Utilities' 2012 DR program events and hours fell within the historical averages or below the historical maximums. However, staff was surprised to find that the Utilities dispatched their peaker power plants (peaker plants) three to four times more frequently in 2012 than the historical averages. The peaker plant service hours were closer to the plants' emission allowances than the DR events were to the program limits.
Staff observed a trend in which some DR program events decreased from 2006 to 2012 while peaker service hours increased over the same period. This trend raises a concern that the Utilities under-utilized DR programs and over-relied on peaker plants. Under the "Loading Order," DR is a preferred resource and is intended to avoid the building and dispatching of peaker plants.
Due to time constraints and the lack of additional information, staff was unable to fully address this question and the reasons behind these trends in this report. Therefore, staff recommends that, in future DR program measurement and evaluation studies, the Commission evaluate DR program operations and designs in comparison with peaker plant operations to ensure the utilities' compliance with the Loading Order.
Specifically, the staff recommends that the Commission:
1. Require the Utilities to provide both DR event and peaker plant data and explanations
for the disparity between historical DR event hours and peaker plant service hours in
future DR evaluations and the next DR budget applications. The Utilities should include
the DR and peaker plant hourly data and explain why they did not trigger DR programs
during any of the hours when the peaker plant was dispatched. This information will
inform the future DR program designs to improve the DR usefulness.
2. Require that DR historical operations be reflected in the input assumptions for the Ex
Ante forecast and the evaluation of the program cost effectiveness.
3. Address the Loading Order policy in DR planning and operation and the utilization of peaker plants in the next DR Rulemaking and the Utilities' energy cost recovery proceedings.
II. 2012 DR Program Trigger Criteria and Event Triggers
Appendices H and I summarize the Utilities' 2012 DR program trigger criteria and event triggers. The DR program trigger criteria consist of a list of conditions that vary by program type; e.g., Emergency Program triggers are based on system contingencies, while non-Emergency Program triggers also include high temperature, heat rate (economic), and resource limitations. The 2012 event triggers were the actual conditions that led to the Utilities' decisions to call DR events.
While the DR trigger criteria provide some general idea of how DR programs are triggered, there is a lack of transparent information on the Utilities' DR operations, e.g., when and how the Utilities made decisions to trigger a DR program. It is necessary to evaluate DR performance not only from a load impact perspective but also from an operational perspective, in order to determine DR reliability and usefulness as a resource. Staff analyzed the 2006-2012 DR event data and gained some understanding of how the Utilities utilized DR programs and how useful the programs were.
III. DR Events Vs. Peaker Plant Service Hours
How do the numbers compare to the 2012 limits and to history?
As shown in Appendices J and K, SCE has a few DR programs with an unlimited number of events or hours: the Demand Bidding Program, Save Power Days (Peak Time Rebate), and Summer Discount Plan – Commercial (Enhanced). Others have various event/hour limits ranging from 24 hours/month to 180 hours/year or 15 events/year.39
For the DR programs with an event limit, most did not attain the maximum number of events and/or hours, except for SCE's Summer Advantage Incentive (Critical Peak Pricing).40 In summer 2012, SCE triggered 12 events for its Critical Peak Pricing, which is within the range of 9 to 15 events/year. Other DR programs' event hours were well below the limits. For example, SCE's residential Summer Discount Plan (AC cycling) was the second most frequently triggered program, with 23 DR events and 24 event hours in 2012, still far below its 180-hour event limit despite the SONGS outage. The Base Interruptible Program (BIP) had only one test event, lasting two hours, in 2012.
SCE's DR program event hours were otherwise either within the programs' historical ranges or below the 2006-2011 maximums, except for Agricultural Pumping Interruptible, with 7 hours in 2012 compared to 0 to 2 hours from 2006 to 2011.
What were the reasons for the differences between the 2012 DR event numbers and hours
and the event limits?
SCE explained that the reasons for the differences between the 2012 DR event numbers and hours and the event limits vary by program, as summarized in Appendix L.41 The reasons can be characterized for the three types of DR programs as: 1) trigger conditions, 2) optimal dispatch, and 3) no nominations.
As discussed above, DR program operations are based on the trigger criteria set for each
program. For the non Emergency Programs, SCE indicated that optimizing performance and
minimizing customer fatigue is an additional factor considered in its decision to trigger a DR
program. SCE's optimal dispatch strategy may have resulted in DR events and hours far below the maximum hours and events for the programs. For example, SCE's Summer Discount Plan is available for 180 hours annually; however, customers would probably never expect this program to be triggered for anything close to 180 hours, based on their experience with the program to date. As shown in Appendices M and N, staff finds a similar trend in SDG&E's DR event data.
39 SCE 02, Appendix E, Table 2-A at E-4 and E-5.
40 Id.
41 SCE 02, Appendix E, at E-6 and E-7.
IV. Peaker Plant Comparison
Most of SCE’’s non Emergency Programs include resource limitation as a program trigger.
Therefore, in theory, one would expect that SCE would trigger DR programs before dispatching
its peaker plants in accordance with the Loading Order. In light of the SONGS outage, the
Commission anticipated more SCE and SDG&E DR events in 2012, yet SCE dispatched peaker
plants substantially more than DR programs (compared to their historical averages as discussed
below.
How do the historical DR events compare to the utilities' peaker plants?
SCE provided the permit and service hours for four of its own peaker plants, three of which are located in the SONGS-affected areas, as shown in Appendix O.42 SCE historically dispatched its peaker plants for about 9% to 16% of the permissible service hours annually. As shown in the table below, during the same period SCE triggered its non-Emergency DR programs for 11 to 106 hours on average. In 2012, however, SCE dispatched its peaker plants three to four times more than the historical average, while SCE's 2012 DR event hours were lower than the historical range. SDG&E's peaker plant and DR event data show a similar trend to SCE's. For example, SDG&E's Miramar plant ran 4,805 hours out of its 5,000-hour emission allowance. In contrast, its Critical Peak Pricing program, which had the most triggered hours, was dispatched for 49 hours of its 126-hour annual limit.
Table 20: DR Event Hour Vs. Peaker Plant Service Hours
2006 2011 Range 2012
SCE:
Peaker Plants 96 –– 129 Hours 405 –– 465 Hours
Non Emergency DR 11 –– 106 Hours 2 –– 64 Hours
SDG&E:
Peaker Plants 436 –– 1715 Hrs. 974 –– 4805 Hrs.
Non Emergency DR 19 –– 39 Hrs. 14 –– 49 Hrs.
In addition, staff observed that the Utilities' highest DR event hours occurred in 2006 and 2007 during the summer heat storms, but the highest peaker plant hours occurred in 2012. This data suggests that the Utilities under-utilized DR programs and over-relied on their peaker plants, which is inconsistent with the Loading Order.
42 SCE 01, Appendix C, Tables 9 and 10 at Page 17.
In its comments on the 2013-2014 DR Proposed Decision, SCE disagreed with the suggestion of "under-utilization" of DR programs based on the 2012 DR events. SCE argued that "(s)imply because SCE did not dispatch all of the programs' available hours does not mean the programs should have been dispatched more… Optimal utilization (of DR) ensures the necessary amount of load drop to enable a reliable grid…"43 SCE should explain why it dispatched its peaker plants substantially more last summer instead of DR, and whether SCE's optimal dispatch of DR, or the trigger criteria or program designs, resulted in SCE's increased reliance on peaker plants.
Due to time constraints and the absence of the Utilities' explanations, staff is unable to comprehensively address this issue in this report. The Utilities' data warrant further evaluation to ensure the usefulness of DR resources as a replacement for peaker plants and compliance with the Loading Order.
V. Conclusions
Consistent with D.13 04 017, staff finds that most of SCE's DR programs did not attain the maximum number of events and/or hours, except for SCE's Critical Peak Pricing. The Utilities' total numbers of DR events and hours in 2012 were within the historical averages but far from the program limits. In contrast, staff found that SCE-owned and -contracted peaker plants were dispatched far more in 2012 than the historical averages. Some peakers were much closer to their emission allowances than the DR hours were to their operating limits. Staff reaches a similar conclusion for SDG&E's DR programs in comparison with its peaker plants.
If the Utilities have historically never triggered their DR programs close to the available hours, there is a concern about how realistic these limits are. There is a reliability risk if the Utilities are relying on a DR resource that has never been used to its full capacity. In addition, the DR cost effectiveness should reflect the historical operations. Staff recommends that the Commission address these issues in future DR evaluation and budget approval proceedings.
43 SCE Opening Comment filed on April 4, at 4-5.
Chapter 4: Residential Demand Response Programs
I. Summary of Staff Analysis and Recommendations
Analysis of residential programs included Peak Time Rebate (PTR) and AC Cycling. Overall, customers seem satisfied with the programs based on utility reports and surveys. However, staff encountered problems with program design and operation that need to be addressed to improve the reliability and effectiveness of the programs.
For PTR, staff found that customers who received utility notification of events had higher awareness of the program than customers who were not notified by the utility or who received only indirect notification such as mass media alerts. More importantly, data for both utilities show that customers who opted into receiving alerts were the only group that significantly reduced load. For both utilities, customers defaulted into alerts through MyAccount did not reduce load significantly. However, the entire eligible customer class qualifies for bill credits, which resulted in a problem of 'free ridership.' Both utilities should modify PTR from a default to an opt-in program, where only customers opting to receive event alerts would qualify for bill credits.
For SCE's residential AC Cycling, staff found that the current group dispatch strategy is producing a rebound effect, which reduces the actual load reduction the program is capable of producing. Staff recommends that SCE (1) align the maximum program event duration with customer preference for shorter events to improve forecasts, and (2) reconsider its incentive structure with respect to participation in longer event durations.
Finally, both utilities should take advantage of AMI infrastructure and related enabling technology that could improve program delivery, reliability, and customer experience.
II. Residential Peak Time Rebate (PTR)
A. Overall Customer Experience
For both utilities, customers were generally satisfied with the program. SCE's customers seem satisfied with the level of incentives and with the time between notification and the event; however, customers would like more information regarding the program and bill credits. SDG&E's customers reported overall satisfaction with the program but, similar to SCE's customers, would benefit from more information and outreach.
The level of awareness for both utilities seems higher among customers who chose to sign up to receive notifications. This is reflected in the overall load reduction verified by ex post data: only customers who signed up for event notification significantly reduced load.
For PTR, neither utility noticed evidence of customer fatigue, but this does not mean it did not occur; just that it was not noticeable.
B. SCE's Peak Time Rebate/Save Power Day
1) Summary
Customers who received utility notification of events had higher awareness of the program than customers who were not notified by the utility. More importantly, customers who opted into receiving alerts were the only group that significantly reduced load. Customers defaulted into alerts through MyAccount, and the remaining customers not directly notified by the utility, did not reduce load significantly. SCE considered only customers who received alerts in its forecast and ex post verification. However, the entire eligible customer class qualifies for bill credits. Awareness of the program, reflected by the willingness to sign up to receive alerts, seems to indicate greater willingness to reduce load. This factor should be considered in program design. Staff identified an issue with 'free ridership,' where customers are paid even though they did not significantly reduce load. Staff recommends changing PTR from a default program to an opt-in program, paying bill credits only to customers who opt in to participate.
2) Background
D.09 08 028 approved Save Power Day, SCE's Peak Time Rebate (PTR) rate. The decision approved bill credits of $0.75 per kWh of load reduced, with an additional $0.50 per kWh for customers with enabling technology.
This is a default program for residential customers with a smart meter and has been available since 2012. The program provides incentives to eligible Bundled Service Customers who reduce a measurable amount of energy consumption below their Customer Specific Reference Level (CSRL) during PTR events.44,45
The utility may call events throughout the year on any day, excluding weekends and holidays. Events take place between 2 p.m. and 6 p.m. on days an event is called, and participants receive a day-ahead notification of the event. Bill credits are paid in each billing cycle based on the sum of events called and usage reduction during the period,46 and are recovered from the respective customer class through the Energy Resource Recovery Account (ERRA).
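As a sketch of the settlement arithmetic only: the per-event credit is the customer's reduction below the CSRL multiplied by the applicable rate. The CSRL calculation follows the definition in footnote 45; the usage values are invented, and the rate simply reuses the base credit level discussed above.

```python
# Sketch of the Save Power Day settlement arithmetic.  The CSRL follows the
# definition in footnote 45: the average 2-6 p.m. usage of the three highest
# of the five non-event, non-holiday weekdays preceding the event.  The usage
# values are invented; the rate reuses the base credit level discussed above.

def customer_specific_reference_level(prior_weekday_usage_kwh):
    """Average of the three highest 2-6 p.m. usage totals (kWh) from the
    five qualifying weekdays immediately preceding the PTR event."""
    top_three = sorted(prior_weekday_usage_kwh, reverse=True)[:3]
    return sum(top_three) / len(top_three)

def ptr_bill_credit(csrl_kwh, event_usage_kwh, rate_per_kwh):
    """Credit is paid only on measurable reduction below the CSRL."""
    reduction_kwh = max(0.0, csrl_kwh - event_usage_kwh)
    return reduction_kwh * rate_per_kwh

prior_usage = [6.2, 5.8, 7.1, 6.5, 5.4]      # 2-6 p.m. kWh, preceding weekdays
csrl = customer_specific_reference_level(prior_usage)
credit = ptr_bill_credit(csrl, event_usage_kwh=4.0, rate_per_kwh=0.75)
print(f"CSRL: {csrl:.2f} kWh, event credit: ${credit:.2f}")
```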
During 2012, SCE started defaulting customers on MyAccount to receive email notifications, with the remaining customers not directly notified by the utility. Alternatively, customers may choose to opt in to receive alerts. As of November 30th, approximately 4 million customers were on PTR and 824,000 were signed up to receive notifications (via MyAccount).47 According to SCE, approximately 60,000 customers opted in to receive alerts during the summer months of 2012.48
44 SCE Schedule D – Domestic Service, sheet 3.
45 CSRL ("peak average usage level") is the customer's average kWh usage during the 2:00 p.m. to 6:00 p.m. time period of the three (3) highest kWh usage days of the five (5) non-event, non-holiday weekdays immediately preceding the PTR Event. The CSRL is used to determine the customer's kWh reduction for each PTR Event in order to calculate the rebate.
46 SCE Schedule D – Domestic Service, D.09 08 028 Att. C at 7.
47 SCE 01 Testimony at 27, lines 11, 18-19.
3) Lessons Learned
In support of its 2013 2014 Application, SCE provided data to highlight lessons learned from
the 2012 program year.
Customer awareness
Awareness of the program is higher amongst the group of customers whom the utility
notified of events: 66% of notified respondents were familiar with the program but only 43%
were familiar in the group not notified49
. When prompted for awareness of events, the same
pattern is noticeable. 72% of respondents in the group receiving notifications who were aware
of the program claimed awareness of specific events, compared to 40% in the group not
receiving notifications. When including customers aware and the ones prompted with
information about the program, 55% of the notified group was aware but only 23% of the non
notified respondents was aware.50
Customer satisfaction
SCE's data contained no information regarding customer perception of the fairness of savings/incentive levels; however, customers seem to link participation with an expectation of savings, as 80% of respondents identified earning bill credits as important for participation.51 Moreover, participants seem to be willing to participate even in the face of low savings.52
Event notification
The majority of respondents aware of the program found out about events via utility notification (over 60% for the opt-in group). Close to 23% of respondents in the overall population found out about events in the news.53
According to the results of the customer surveys, about 90% of customers notified of the event, and about 56% of customers not notified but aware of the event, were happy with the amount of time between notification and the event.54 It appears that a day-ahead strategy could be adequate; however, customers were not asked about their preference for a day-of reminder, so it is not clear from the lessons learned whether this could increase awareness and response. SCE requested to add a day-of notification in its 2013-2014 Program Augmentation Application, which the Commission denied due to lack of evidence of need.55
48 Email communication with SCE (4/5/2013).
49 SCE 02 Appendix A at 3. It is important to note that the surveys only represented results for two groups: customers notified by the utility and customers who were not notified. Defaulted customers and customers not defaulted into receiving notifications from the utility were bundled together under notified customers.
50 SCE 02 Appendix A at 4.
51 SCE 02 Appendix B at 24.
52 SCE 02 Appendix B at 36.
53 SCE 02 Appendix A at 5.
54 SCE 02 Appendix A – Save Power Day Incentive/Peak Time Rebate Post Event Customer Survey at 15.
55 D.13 04 017, at 28.
Customer preference
Another survey showed that customers would benefit from more information about the
program, most specifically in terms of expectations of savings. The majority of customers would
prefer to be notified by email and they believe that a reminder at the beginning of the summer
would help them to be more ready to participate.56
Program utilization
PTR has no limit on the number of events called, with a maximum of 4 hours per event. SCE called 7 events (28 total event hours) in 2012 and did not observe evidence of customer fatigue. The trigger criterion was temperature for all events.57 Although SCE explains the need to balance usefulness with preservation of the resource,58 the program appears underutilized in 2012. Still, this was the first year of the program.
Other findings
SCE states that third-party providers such as telecommunication companies, cable companies, security providers, retailers, and manufacturers of thermostats or providers of home automation services are potential partners for reaching untapped load reduction potential in the residential sector.59 As part of its 2013-2014 Program Augmentation Application, SCE proposed a pilot to explore this market, and the Commission has approved funding for this pilot.60
4) Analysis of settlement and ex post data
Ex post load impact
SCE only calculated ex post data for customers notified of events; it did not verify ex post load impact for customers not notified by the utility. This indicates that SCE did not expect this group to reduce load significantly. SCE's 2012 Load Impact Report found that customers who opted into event notifications reduced load by a statistically significant average of 0.07 kWh per hour.61 The same report found that customers defaulted into receiving notifications did not produce a statistically significant load impact.62
Incomplete data does not allow staff to verify with certainty the differences in load reduction between all participant groups (opt-in, defaulted into notification, and the remainder of the population). However, staff reviewed all the data SCE provided to look for evidence of what is most likely happening.
56 SCE 02 Appendix B – Save Power Days Research Study Results at 39.
57 SCE 03 – March 4, 2013 – Appendix B Table 4; SCE 01, Appendix C at 14.
58 SCE 01, Appendix C at 14.
59 Email communication with SCE, April 10, 2013.
60 D.13 04 017, OP 19.
61 '2012 Load Impact Evaluation of Southern California Edison's Peak Time Rebate Program,' Christensen Associates Energy Consulting (4/1/2013) at 1. This figure is slightly lower than the 0.097 kW reported in SCE 03, March 4, 2013 at 22.
62 Id. at 24.
It is interesting to note that for the first four events, defaulted customers did reduce load, although not significantly, while for the last three events their load in fact increased. In contrast, the opt-in group, to various degrees, reduced load for all events. The ex post results varied considerably between events, even though the temperature was fairly constant and not extreme. It would be worth investigating the source of this variability and how understanding it could improve ex post results and the reliability of the program. A more detailed analysis of impacts can be found in the sections above.
Table 21: 2012 Ex Post Load Impact by Group (MW)
(Average Event Hour)
Event Date | (a) Customers who opted into alerts | (b) Customers defaulted into email alerts, excluding opt-in alerts | (c) Customers not notified directly of events | (d) Temperature
7/12/12 | N/A | N/A | N/A | 80
8/10/12 | 39.60 | 56.25 | N/A | 89
8/16/12 | 11.17 | 13.25 | N/A | 89
8/29/12 | 21.22 | 0.71 | N/A | 92
8/31/12 | 6.37 | -6.35 | N/A | 86
9/7/12 | 0.17 | -23.28 | N/A | 84
9/10/12 | 6.04 | -4.39 | N/A | 89
Source: Email communication with SCE (3/25/2013); SCE 01 Appendix C, Table 1.
Settlement data analysis
In 2012, SCE paid a total of $27,349,008 in incentives to PTR residential customers.63 SCE provided full settlement data, which shows evidence of a potentially large 'free ridership' problem, where customers receive incentives without significantly reducing load.
63 Email communication with SCE (4/5/2013).
Table 22: Settlement Load Reductions (MW)
(Average Event Hour)
Event Date | (a) Customers who opted into alerts | (b) Customers defaulted into email alerts, excluding opt-in alerts | (c) Customers not notified directly of events | (d) Event Settlement
7/12/12 85.9 140.08 1,613 1,839
8/10/12 55.94 134.68 827 1,018
8/16/12 87.99 233.35 1,499 1,821
8/29/12 37.19 84.36 579 700
8/31/12 52.95 132.79 981 1,166
9/7/12 60.85 165.71 1,105 1,332
9/10/12 61.9 139.65 1,049 1,250
Average (MW) 63.2 147.2 1093 1304
% 4.9% 11.3% 83.9% 100.0%
Average
Participants 60,190 160,430 1,265,544 1,486,165
% 4% 11% 85% 100%
Source: Email communication with SCE (4/5/2013)
According to settlement data, 84% of bill credits were paid to customers whose load impact was not considered for forecast or ex post purposes. In addition, 11% of incentives were paid to customers who were defaulted into receiving notifications and did not produce a statistically significant load impact.64 This means that, in fact, 95% of all incentives were paid to customers who either were not expected to reduce load or did not reduce load significantly.
64 '2012 Load Impact Evaluation of Southern California Edison's Peak Time Rebate Program,' Christensen Associates Energy Consulting (4/1/2013) at 24.
Table 23: 2012 PTR Incentives Paid
Event Date | (a) Customers who opted into alerts (ex post MW) | (b) Customers defaulted into email alerts, excluding opt-in alerts (ex post MW reduction) | (c) Customers not notified by SCE | Total
7/12/12 $254,572 $419,794 $4,836,197 $5,510,563
8/10/12 $166,245 $403,752 $2,480,819 $3,050,816
8/16/12 $261,825 $699,568 $4,495,547 $5,456,940
8/29/12 $110,681 $252,931 $1,734,182 $2,097,794
8/31/12 $157,557 $398,093 $2,939,474 $3,495,124
9/7/12 $181,406 $496,648 $3,312,785 $3,990,840
9/10/12 $184,349 $418,665 $3,143,816 $3,746,830
Total $1,316,635 $3,089,451 $22,942,822 $27,348,908
% 5% 11% 84% 100%
Source: Email communication with SCE (4/5/2013)
As there is no ex post data for customers not directly notified by the utility (i.e., those who neither opted in nor were defaulted into notification), it is not possible to verify their actual impact or whether it would be significant. However, given that not even defaulted customers reduced load significantly, and given the findings from SDG&E (see next section), it is fair to assume that results for that group would not be significant.
Incentives and capacity cost
It is possible to notice difference in cost of capacity between the group who opted in to
receive notification and the group defaulted to receive notifications. In this report, staff
normally used the average event hour reductions. But as in SCE’’s case there is such variability in
ex post results, staff will use average hourly impact for all events as a simple way of showing
that the average capacity produced by the defaulted group is nearly six times more expensive
than the average capacity produced by the opt in group.
Table 24: 2012 PTR Cost of Capacity
Event Date | (a) Customers who opted into alerts (MW) | (b) Incentives paid to the opt-in group per event | (c) Customers defaulted into email alerts, excluding opt-in alerts (MW) | (d) Incentives paid to the defaulted group per event
7/12/12* | N/A | $254,572 | N/A | $419,794
8/10/12 | 39.60 | $166,245 | 56.25 | $403,752
8/16/12 | 11.17 | $261,825 | 13.25 | $699,568
8/29/12 | 21.22 | $110,681 | 0.71 | $252,931
8/31/12 | 6.37 | $157,557 | -6.35 | $398,093
9/7/12 | 0.17 | $181,406 | -23.28 | $496,648
9/10/12 | 6.04 | $184,349 | -4.39 | $418,665
Average | 14.10 | | 6.03 |
Total | | $1,062,064 | | $2,669,657
Cost of Capacity | | $75.34 | | $442.62
Source: Email communication with SCE (4/5/2013). Staff did not include the 7/12/12 event in the calculation because there is no ex post data for this event.
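The cost-of-capacity figures in Table 24 can be reproduced with simple division, on the assumption that the reported values are dollars per kW of average hourly impact (total incentives divided by the average MW, converted to kW); the sketch below uses the Table 24 values.

```python
# Reproduce the Table 24 cost-of-capacity figures, assuming "cost of capacity"
# means total incentives divided by the average hourly load impact, in $/kW.
# Values are taken from Table 24 (the 7/12/12 event is excluded, as above).

opt_in_mw         = [39.60, 11.17, 21.22, 6.37, 0.17, 6.04]
opt_in_dollars    = [166245, 261825, 110681, 157557, 181406, 184349]

defaulted_mw      = [56.25, 13.25, 0.71, -6.35, -23.28, -4.39]
defaulted_dollars = [403752, 699568, 252931, 398093, 496648, 418665]

def cost_of_capacity(mw_per_event, incentives):
    avg_kw = 1000.0 * sum(mw_per_event) / len(mw_per_event)
    return sum(incentives) / avg_kw

print(f"Opt-in group:    ${cost_of_capacity(opt_in_mw, opt_in_dollars):.2f}/kW")
print(f"Defaulted group: ${cost_of_capacity(defaulted_mw, defaulted_dollars):.2f}/kW")
# Roughly $75/kW versus $443/kW, i.e., nearly six times more expensive.
```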
5) Findings
Based on analysis of program design, settlement and ex post load impact and customer
participation data for the summer of 2012, staff has found the following:
The program, as approved in the decision, pays the same incentive to all customers enrolled in the program, with an additional incentive for customers who have enabling technology.
There are differences in performance, awareness and willingness to reduce load between
customers who were notified directly by the utility and customers who were not.
Customers are overall satisfied with notification mode, timing and level of incentives.
There is not enough information to determine if customer fatigue is an issue.
Ex post analysis shows that customers who opted into alerts significantly reduced their load in comparison to customers who were merely defaulted into alerts. This indicates that customer willingness to participate (indicated by the action of signing up for alerts) may help improve load reduction.
The incomplete ex post load impact results cover only customers notified by the utility, both those who signed up for and those defaulted into receiving alerts. No results were available for the entire population.
It is not possible to verify whether the incentives paid to non-notified customers resulted in significant load reduction, but the fact that SCE does not include this group in its forecast and ex post results indicates that their load impact is not significant.
There is a potential 'free ridership' issue in SCE's PTR.
C. SDG&E's Peak Time Rebate/Reduce Your Use
1) Summary
Overall, customers are satisfied with the program. There is a difference, however, in awareness and load reduction between customers who opted into receiving alerts and the rest of the population. Only customers who opted into receiving utility notifications significantly reduced load. However, the entire population qualifies for bill credits. Awareness of the program, reflected by the willingness to sign up to receive alerts, seems to indicate greater willingness to reduce load. Staff identified an issue with 'free ridership,' where customers are paid even though they did not significantly reduce load. Staff recommends changing PTR from a default program to an opt-in program, paying bill credits only to customers who opt in to participate.
2) Background
D.08 02 034 approved the Reduce Your Use program, SDG&E's Peak Time Rebate (PTR) rate, the first dynamic rate of this design approved by the Commission.65 The program has been available since the summer of 2012, with a pilot in 2011.
The program is implemented as proposed: 'A two-level PTR incentive with a higher level payment for customers who reduce electric usage below an established CRL [customer reference level]66 with enabling demand response technology, and a lower level payment to customers without such technology.'67
Customers receive a bill credit of $0.75 per kWh reduced, with an additional credit of $0.50 per kWh for customers with enabling technology. SDG&E's tariff lists programmable communicating thermostats (PCTs), AC cycling, and pool pump cycling as examples of technologies eligible for the $0.50 per kWh additional incentive.68 The Commission has approved the addition of In-Home Displays (IHDs) to the list of enabling technologies in SDG&E's tariff.69
The utility may call events throughout the year without limit on the number of events called. Events take place between 11 a.m. and 6 p.m. on days an event is called, and participants receive a day-ahead notification of the event. Bill credits are paid in each billing cycle based on the sum of events called and usage reduction during the period, and are recovered from the respective customer class through the Energy Resource Recovery Account (ERRA).70 The utility can call only one event per day, with a maximum of 7 hours.
65 SCE's Save Power Day program was approved in 2009 in D.09 08 028.
66 Defined as the 'total consumption for the PTR event period averaged over the three (3) highest days from within the immediately preceding five (5) similar non-holiday week days prior to the event. The highest days are defined to be the days with the highest total consumption between 11 a.m. and 6 p.m. The similar days will exclude weekends, holidays, other PTR event days, and will exclude other demand response program event days for customers participating in multiple demand response programs.' SDG&E PTR Tariff.
67 D.08 02 034 at 22.
68 SDG&E's PTR tariff defines enabling technologies as being 'initiated via a signal from the Utility, either directly to the customer or the customer's device, or via a third party provider to the customer or the customer's device, that will reduce electric energy end use for specific electric equipment or appliances, is included in a designated Utility demand response program, and that is acceptable to and approved by the Utility, subject to the verification of processes necessary to safeguard confidential and proprietary Utility and customer information.'
69 D.13 04 017, OP 22.
3) Lessons Learned
In support of its 2013 2014 Application, SDG&E provided data to highlight lessons learned
from the 2012 program year. For PTR, SDG&E conducted three post event surveys.
Customer Awareness
Results of the surveys showed differences in the level of awareness between the three main groups71 of customers participating in PTR: customers who actively opted into day-ahead event notifications (opt-in), customers registered on MyAccount and receiving event notifications (default), and customers not directly notified by the utility but notified via mass media (no MyAccount). In general, the opt-in group demonstrated the highest level of awareness of PTR events. About 83% of the opt-in group was aware of the program concept (events and bill credits), compared to 43% of respondents in the defaulted group and 40% in the no-MyAccount group.72
Customer Satisfaction
Customers are generally satisfied with the amount of incentives paid.73 Customers also seem generally satisfied with the number of notifications, although respondents did indicate that more promotion and information about the program would be beneficial.74 SDG&E indicated that it is working to resolve the notification issues encountered in 2012, as well as working to improve customer education on using online tools.75 Overall, customers responded positively to the program.
Program Utilization
In the summer of 2012, SDG&E called 7 events, a total of 49 event hours, and all events were called due to temperature.76 Given that this program has no limit on events, the program seems underutilized. However, SDG&E states that even if a temperature trigger point is reached, the program may not necessarily be called, as system need is assessed internally. This approach also takes customer experience into consideration.77
Customer Fatigue
SDG&E states that it is difficult to determine if customer fatigue is an issue, but ex post results show that when the program was called three days consecutively in August, the load impact was lowest on the last day.78 Temperature does not seem to be a factor, as the day with the lowest reduction had a similar temperature to the two preceding days. Still, the result does not seem conclusive.
70 SDG&E GRC Phase 2 Settlement at 8.
71 In its post-event surveys, SDG&E segmented customers into more groups than are analyzed in this report; to simplify the analysis, staff looked only at the three main groups of participants.
72 SGE 02, February 4, 2013, Attachment 6 (Table 5).
73 SGE 02 – Revised Appendix X at 20.
74 SGE 02, February 4, 2013, Att. 5 Table 13; Att. 6 Table 9.
75 SGE 02, Revised Attachment 1 – Revised Appendix X at 19.
76 SGE 02, Revised Attachment 1 – Revised Appendix X, Table 11.
77 SGE 02, Revised Attachment 1 – Revised Appendix X at 14.
Table 25: Customer Fatigue79
Event Date | Average Event Hour Reduction (MW) | Temperature (°F)
8/9/12 | 3.2 | 88
8/10/12 | 3.1 | 92
8/11/12 | 1.7 | 91
Enabling Technology
Enabling technology seems to be improving load reduction: preliminary results show that customers with an In-Home Display (IHD) saved 5% to 8% on average during events, while customers without one saved between 0% and 2%.80
Effort to reduce usage during events
As part of its post-event surveys, SDG&E investigated what actions customers would take on event days and the level of effort made to respond. While the actions were hypothetical, i.e., they do not reflect actions actually taken, respondents in all three groups seemed aware of possible ways to reduce load.
For instance, 38% of opt-in respondents and around 30% of both the MyAccount and no-MyAccount groups said they could unplug electronics, and 41% of the opt-in group, 23% of the MyAccount group and 19% of the no-MyAccount group would turn off the AC. When prompted about the effort made to reduce usage during the August 14 event, 33% of opt-in respondents indicated having made "a lot more effort than usual," compared to around 10% each for the MyAccount and no-MyAccount respondents. 54% of the opt-in respondents and around 40% each of the MyAccount and no-MyAccount groups said they made somewhat of an effort. Finally, 13% of the opt-in, 50% of the MyAccount and 44% of the no-MyAccount respondents made no more or less effort than usual to reduce load.81
The results seem to indicate that respondents in all groups, irrespective of IOU notification, may have made an effort to reduce load and knew what options they had to do so. Still, ex post load reduction shows that only the opt-in group, about 6% of the entire population, significantly reduced load, contradicting assumptions that mass media or defaulting customers into email alerts could generate significant reductions.
78 SGE-02, Revised Attachment 1, Revised Appendix X at 11.
79 Source: SGE-02, Attachment 1, Revised Appendix X, Table 2-6; SGE-03, March 4, Table 3.
80 SGE-02, February 4, 2013 at 5, lines 5-7.
81 SGE-02, February 4, 2013, Attachment 6 (Tables 11 and 12).
4) Analysis of settlement and ex post data
Ex post load impact
Awareness of the program and willingness to participate (in the form of signing up to receive alerts) appear to be important factors in load reduction. This is supported by analysis of ex post data: the opt-in group was the only group to produce statistically significant load reductions.82
Table 26: Ex Post Load Reductions83 (Average Event Hour MW)
Event Date | Customers who opted into alerts (a) | Customers on MyAccount excluding opt-in alerts (b) | Customers not on MyAccount excluding opt-in alerts (c) | Temperature (d)
7/20/12 | 6.1 | 0 | 0 | 87
8/9/12 | 3.2 | 0 | 0 | 88
8/10/12 | 3.1 | 0 | 0 | 92
8/11/12 | 1.7 | 0 | 0 | 91
8/14/12 | 1.1 | 0 | 0 | 88
8/21/12 | 3 | 0 | 0 | 83
9/15/12 | 8.2 | 0 | 0 | 104
Settlement analysis
Based on the average event-hour load reduction used for the settlement calculation, 94% of incentives were paid to customers either defaulted into email alerts through MyAccount or not on MyAccount at all, and 6% were paid to customers who opted into alerts.84 When compared to ex post data, only customers who opted into alerts, about 4% of the total population enrolled in PTR, significantly reduced load.85,86 This points to an issue of "free ridership," in which customers receive incentives without significantly reducing load.
82 SGE-01a at 3.
83 Source: SGE-02, Attachment 1, Revised Appendix X, Table 2-6; SGE-03, March 4, Table 3.
84 SGE-02, Attachment 1, Revised Appendix X, Table 3 and SGE-03, March 4, 2013, Table 3.
85 SGE-02, Attachment 1, Revised Appendix X at 4 and Table 3.
86 "For PTR residential and small commercial the participants represent the customers who proactively opted into alerts and the enrollment number represents all the customers who were eligible to receive a bill credit. Fifty percent of residential customers are enrolled in MyAccount and received an e-mail alert." SGE-02, Attachment 1, Revised Appendix X at 4.
Table 27: Settlement Load Reductions MW87 (Average Event Hour)
Event Date | Customers who opted into alerts (a) | Customers on MyAccount excluding opt-in alerts (default) (b) | Customers not on MyAccount excluding opt-in alerts (c) | Event Settlement (d)
7/20/12 | 10 | 79 | 71.2 | 160.1
8/9/12 | 13.7 | 100.2 | 89 | 202.8
8/10/12 | 12.6 | 97.1 | 87 | 197
8/11/12 | 12.7 | 117.3 | 101.2 | 231.1
8/14/12 | 14.8 | 118.4 | 106.7 | 240
8/21/12 | 29.9 | 270.3 | 258.4 | 559
9/15/12 | 17.3 | 151.1 | 129.8 | 298
Average (MW) | 15.9 | 133.3 | 120.5 | 269.7
% | 5.9% | 49.4% | 44.7% | 100.0%
Average Participants | 45,268 | 562,982 | 608,250 | 1,171,232
% | 4% | 48% | 52% | 100%
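The 94%/6% split cited in the settlement analysis above can be reproduced directly from the Table 27 averages; the short calculation below is only an illustration of that arithmetic.

# Settlement shares from the Table 27 average event-hour MW values.
opt_in_mw = 15.9
default_myaccount_mw = 133.3
no_myaccount_mw = 120.5

total_mw = opt_in_mw + default_myaccount_mw + no_myaccount_mw            # 269.7 MW
opt_in_share = opt_in_mw / total_mw                                      # about 5.9%
non_opt_in_share = (default_myaccount_mw + no_myaccount_mw) / total_mw   # about 94.1%
print(f"Opt-in share: {opt_in_share:.1%}; default/no-MyAccount share: {non_opt_in_share:.1%}")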
Incentives and capacity cost
In 2012, SDG&E paid out $10,134,879 in incentives to PTR residential customers.88 Assuming the estimated MW reported to the CAISO (7-Day Report), the program's maximum expected capacity was an average event-hour impact of 45.8 MW.89 This implies a capacity cost of approximately $221/kW. According to ex post data, the actual capacity delivered was an average event-hour impact of 8.2 MW, resulting in a cost of capacity of $1,232.7/kW. This cost will be recovered from the residential class of customers.
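The cost-of-capacity figures above follow from dividing incentives paid by the capacity credited. A minimal sketch of that arithmetic is shown below; the ex post result comes out near $1,236/kW rather than the $1,232.7/kW cited, apparently because Table 30 below uses a slightly different incentive total.

# Cost of capacity = incentives paid / capacity (kW), using the figures in the text above.
incentives_paid = 10_134_879          # $ paid to PTR residential customers in 2012
forecast_capacity_kw = 45.8 * 1000    # average event-hour MW reported to the CAISO
ex_post_capacity_kw = 8.2 * 1000      # average event-hour MW verified ex post

print(f"Forecast-based cost: ${incentives_paid / forecast_capacity_kw:,.0f}/kW")   # ~$221/kW
print(f"Ex post cost:        ${incentives_paid / ex_post_capacity_kw:,.0f}/kW")    # ~$1,236/kW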
5) Findings
Based on analysis of program design, settlement and ex post load impact, and customer participation data for the summer of 2012, staff has found the following:
The program, as approved in a Commission decision, pays the same incentive amount to all customers enrolled in the program, with an additional incentive for customers who have enabling technology.
87 Source: Adapted from SGE-02, Attachment 1, Revised Appendix X, Table 2-5; SGE-03, March 4, Table 3.
88 SDG&E AL 2420-E at 2.
89 SGE-02, Attachment 1, Revised Appendix X, Table 3.
There are differences in performance, awareness and willingness to reduce load among the three main groups of participants: customers who opted to sign up to receive alerts, customers defaulted into MyAccount to receive event email alerts, and customers not yet on MyAccount who are not directly notified by the IOU and find out about events via mass media.
There is not enough information to determine whether customer fatigue is an issue.
Ex post load impact results show that only customers who signed up to receive alerts significantly reduced load.
94% of incentives paid did not result in significant load reduction.
"Free ridership" is an issue in SDG&E's PTR: the majority of incentives were paid to customers who did not significantly reduce load.
Based on incentives paid during the summer of 2012, the cost of capacity is more than five times higher when forecasted load impact is adjusted to ex post load impact.
D. Staff Recommended Changes for PTR
It is clear that "free ridership" is an issue that needs to be addressed. It is an issue when forecasting load reduction: the forecasted impact would be much higher than what can be verified, and it results in additional costs to ratepayers. While "free ridership" in most cases is a baseline and settlement methodology issue, it could be partially alleviated by changes in program design.
Incentives should reward and encourage customer engagement. Therefore, staff recommends changing PTR from a default program to an opt-in program, eliminating incentives paid to customers not actively choosing to receive event alerts, keeping the current incentive level for customers who sign up to receive alerts, and paying an additional incentive to those who also use enabling technologies. Staff suggests the following incentive structure:
Table 28: Proposed Program Structure
Group | $/kWh
Opt in to receive alerts | 0.75
Opt in to receive alerts and enabling technologies | 1.25
Not opt in | Not a participant in the program
This approach to PTR would ensure that customers are rewarded for the level of action they
are prepared to take. If this proposed incentive structure had been in place in 2012, it could have reduced the amount of incentives paid by about 95%, as shown below.90
90 To simplify the calculation, staff ignored the additional $0.50/kWh for enabling technology. These incentives would be paid in addition to the $0.75/kWh.
Table 29: Illustration of Staff Proposed Changes for SCE
Current Incentive Structure
Group | Incentive Level ($/kWh) | Capacity (MW ex post*) | Total incentive paid ($) | Cost of capacity ($/kW)
All | 0.75 | 95.8 | 27,349,009 | 285.48
Proposed Incentive Structure
Opt in | 0.75 | 95.8 | 1,328,160 | 13.86
No opt in | 0 | n/a | n/a | n/a
Potential reduction: 95%
* Ex post for the entire program

Table 30: Illustration of Staff Proposed Changes for SDG&E
Current Incentive Structure
Group | Incentive Level ($/kWh) | Capacity (MW ex post*) | Total incentive paid ($) | Cost of capacity ($/kW)
All | 0.75 | 8.2 | 10,108,082 | 1,232.69
Proposed Incentive Structure
Opt in | 0.75 | 8.2 | 582,750 | 71.07
No opt in | 0 | n/a | n/a | n/a
Potential reduction: 94%
* Ex post for the entire program
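The figures in Tables 29 and 30 follow directly from the incentive totals and ex post capacities shown there; the short calculation below reproduces the cost-of-capacity and potential-reduction values and is offered only as a check on that arithmetic.

# Reproducing Tables 29 and 30 from their incentive totals and ex post capacities.
programs = {
    "SCE":   {"ex_post_mw": 95.8, "paid_all": 27_349_009, "paid_opt_in": 1_328_160},
    "SDG&E": {"ex_post_mw": 8.2,  "paid_all": 10_108_082, "paid_opt_in": 582_750},
}
for name, p in programs.items():
    kw = p["ex_post_mw"] * 1000
    current_cost = p["paid_all"] / kw       # $/kW under the current (pay-everyone) structure
    proposed_cost = p["paid_opt_in"] / kw   # $/kW paying only opt-in customers
    reduction = 1 - p["paid_opt_in"] / p["paid_all"]
    print(f"{name}: current ${current_cost:.2f}/kW, proposed ${proposed_cost:.2f}/kW, "
          f"potential reduction {reduction:.0%}")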
While issues of baseline and settlement methodology are outside the scope of this analysis and would demand a much more in-depth investigation, the impact of free ridership can be partially alleviated by limiting PTR bill credits to customers who actively opt in to participate.
Utilities should focus on encouraging customers to adopt enabling technologies. Perhaps some of the resources saved by having a three-tier incentive structure could be used to subsidize enabling technologies that allow direct load control. Also, utilities should explore alternatives for service delivery, such as third-party entities. SCE found that the interest of third parties is shifting towards the residential sector, and such opportunities should be seriously explored.
Finally, utilities should track, as part of their ex post verification efforts, whether the presence of enabling technologies significantly improves load reduction and whether there are differences among the technologies used. In addition, utilities should investigate whether customer fatigue is an issue, especially in view of the SONGS outage potentially increasing the frequency of PTR events.
III. Residential Air Conditioning (AC) Cycling
A. Overall Customer Experience
Customers were generally satisfied with the program. For SCE, 2012 was the year the program transitioned from an emergency trigger to a price trigger. SCE reports customers have kept a positive view of the program and regarded incentives as an important part of participating. Customers did report that they prefer shorter and more frequent events, as opposed to longer events.
SDG&E also reports overall customer satisfaction but points out that the majority of customer complaints were due to uncomfortable temperatures caused by the unit cycling on and off. SDG&E also reports customers were satisfied with the level of incentives.
Neither utility reported customer fatigue, although SDG&E had three events on consecutive days and load reduction dropped. However, without analyzing other factors that could have contributed to the lower load impact, such as humidity and customer perceptions of discomfort, it is not possible to say with certainty whether fatigue occurred.
A. SCE’’s Summer Discount Plan
1) Summary
SCE’’s AC Cycling changed its event trigger structure from emergency to price. Customers
seem satisfied with the current program design. Staff has identified that the program has issue
of ‘‘rebound effect’’ and recommends that the program design should be changed to include an
additional level of incentive that would cater to customers willing to cycle their unit for the
entire event duration in below.
2) Background
As part of D.11-11-002, SCE agreed to transition the Residential Summer Discount Plan (Res SDP) from an emergency to a price trigger and to bid Res SDP's load into the CAISO market for dispatch. D.11-11-002 authorized revisions to SCE's program to enable the changes agreed to in a settlement.91
As currently designed, Res SDP offers an annual incentive for customers who wish to participate in the program. The program offers two cycling options and gives customers the choice to override an event up to five times per year in exchange for slightly lower incentives. Incentives are calculated according to the size of the equipment, the cycling option and the override option:92
91 D.11-11-002 at 2-4.
92 SCE Schedule D-SDP, Sheet 1; SCE-01 Testimony, Table II-2.
Table 31: SCE Residential AC Cycling Incentives
Option | Incentive per Summer Saver day per ton | 100% cycling maximum savings (based on 4.5 ton unit) | 50% cycling maximum savings (based on 4.5 ton unit)
Standard Option | $0.36 (100% cycling), $0.18 (50% cycling) | $200 | $100
Override Option | $0.18 (100% cycling), $0.09 (50% cycling) | $100 | $50
SCE's Res SDP program has approximately 307,000 customers, with an expected load reduction of 466 MW.93 Events can be dispatched year-round, with a maximum of 180 event hours, and each event can last up to six hours. In 2012, SCE paid a total of $51,882,087 in incentives.
3) Lessons Learned
The 2012 summer season proved to be a transition year for this program. Customers had to shift from expecting little service reduction to expecting several disruptions throughout the year. Overall, SCE asserts that customers continue to have a positive view of the program.
Lessons learned from the transition in 2012 showed that bill savings are an important element of customer participation. The majority of customers opted for the Standard Option, preferring savings to override capability, and those who chose the Override Option rarely used it.94 Only 1.5% of customers who left the program did so because of the program changes.
Preliminary findings of customer surveys indicate that customers prefer shorter events, even if they are more frequent. SCE experimented with different event durations and found that as events got longer, customer dissatisfaction increased.
In 2012, SCE triggered 23 events for a total of 24 hours, for reasons of temperature, CAISO emergency and evaluation. Because the program's trigger condition and design changed in 2012, a historical comparison would not be accurate, but the data show that Res SDP was called more often than in previous years.95
B. SDG&E’’s Summer Saver
1) Summary
Customers seem satisfied with the program. The program performed in accordance with
past years. Staff does not recommend any changes to the program design.
93 SCE Schedule D-SDP, Sheet 1; SCE-01 Testimony at 9, lines 21-23. Load impact based on ex ante estimates from the Commission Monthly Report (12/21/2012).
94 SCE-01 Testimony at 11, lines 3-5.
95 SCE-03, March 4, 2013, Appendix B, Table 4.
2) Background
The Summer Saver program is a 15-year, long-term contract-based procurement run by Comverge.96 Comverge is responsible for installing, removing and servicing the AC cycling equipment. Summer Saver is a direct load control program in which a device is installed on the premises to cycle the AC unit when an event is called. It uses day-of notification, meaning customers receive event notification on the day of the event. The program runs May through October. Customers are eligible for annual incentives based on the cycling option, the size of the unit and the participation period:97
Table 32: Summer Saver incentives
Cycling Option | Residential ($/ton) | Business ($/ton)
30% | N/A | 9.00
50% | 11.50 | 15.00
100% | 38.00 | N/A
The Summer Saver program had around 28,500 residential and commercial customers enrolled in 2012.98 The majority of participants are residential customers (23,948 in 2012), and this distribution has been fairly consistent since 2009.99
The program has an event limit of 15 events, or 120 event hours, per year. The utility can call one event per day, and events run for a minimum of 2 hours and a maximum of 4 hours. Events can be called anytime from 12 p.m. to 8 p.m. on event days and are called based on temperature and system load.100 In 2012, the utility called 8 events, or 29 event hours, an average of 3.6 hours per event.
3) Lessons Learned
Residential customers were responsible for 84% of the load reduction during the summer of 2012. SDG&E paid $2.5 million in incentives to residential customers for an average event-hour reduction of 18.6 MW.
The majority of customer complaints were due to uncomfortable temperatures caused by the AC cycling.101 Overall, customers seem satisfied with the level of incentives, as SDG&E reported that less than 1% of customers who left did so because they considered the incentives unfair.102
96 http://www.comverge.com/residential-consumer/find-a-program
97 http://www.sdge.com/save-money/demand-response/summer-saver-program and email communication with SDG&E (3/4/2013).
98 SGE-02, Attachment 1, Revised Appendix X, Table 2.
99 Email communication with SDG&E (4/4/2013).
100 SGE-02, Attachment 1, Revised Appendix X, Table 8.
101 SGE-02, Attachment 1, Revised Appendix X at 19.
102 SGE-02, Attachment 1, Revised Appendix X at 20.
SDG&E did not report evidence of customer fatigue for Summer Saver, although it recognizes that this does not mean fatigue does not occur, just that it is not measurable.103 Ex post load impact results showed that when the program was called on three consecutive days, there was a drop in load reduction. However, SDG&E states that there is not enough information to suggest that this is a result of fatigue; humidity, an outside temperature lower on the last day than on the previous day, or other factors could have contributed to the lower load reduction on the last day.
Table 33: AC Cycling customer fatigue104
Date | Ex post average over event period, Res (MW) | Ex post average over event period, Res+Com (MW) | Temperature
9/13/12 | 12.0 | 12.6 | 81
9/14/12 | 18.6 | 22.5 | 109
9/15/12 | 8.2 | 8.8 | 104
The frequency of events called has been fairly consistent over the program's history (with a few exceptions, such as 2008), and in 2012 the program was called in line with the historical average. When compared to the program design limits, however, it seems underutilized. Still, the share of allowed events used is higher than the share of allowed event hours used, suggesting that events are relatively frequent but short.
103 SGE-01, Direct Testimony of Michelle Costello at 10.
104 SGE-02, Attachment 1, Revised Appendix X, Table 2-6; SGE-03, March 4, Table 2; email communication (4/3/2013).
Table 34: SDG&E Summer Saver Historical Comparison of Number of Events and Event Hours105
Year | Event hours (yearly limit) | Event hours called | Number of events (yearly limit) | Events called
2006 | 120 | 24 | 15 | 8
2007 | 120 | 43 | 15 | 12
2008 | 120 | 8 | 15 | 2
2009 | 120 | 30 | 15 | 7
2010 | 120 | 44 | 15 | 11
2011 | 120 | 22 | 15 | 6
2012 | 120 | 29 | 15 | 8
Average | 120 | 29 | 15 | 8
Average historical performance compared to design | | 24% | | 51%
2012 compared to historical average | | In line with average | | In line with average
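The utilization figures at the bottom of Table 34 are simple averages of the yearly values divided by the design limits; the short calculation below reproduces them.

# Average utilization of the Summer Saver event limits, from the Table 34 values.
design_hours, design_events = 120, 15
hours_called  = [24, 43, 8, 30, 44, 22, 29]   # 2006-2012
events_called = [8, 12, 2, 7, 11, 6, 8]

avg_hours = sum(hours_called) / len(hours_called)      # ~28.6 event hours per year
avg_events = sum(events_called) / len(events_called)   # ~7.7 events per year
print(f"Event-hour utilization:  {avg_hours / design_hours:.0%}")    # ~24%
print(f"Event-count utilization: {avg_events / design_events:.0%}")  # ~51%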
D. Staff Recommended Changes for AC Cycling
Staff does not recommend any changes to SDG&E's program design at this point. SDG&E's program is mature, and customers seem fairly satisfied with the offering.
SCE’’s program trigger just changed from emergency to price and customers seem satisfied
with the program overall. However, last summer SCE deployed a new dispatch strategy of
which it divided the customers into three to six subgroups with one hour per event per
subgroup instead of the whole group triggered for the entire event duration. While such
strategy is optimal for customers’’ comfort, as discussed in Chapter 2, such strategy caused a
‘‘rebound effect’’106
Program design should help correct this issue. First, the program as designed
states that events can last up to six hours, even though customers seem to prefer shorter event
durations and dissatisfaction went up as event duration increased107
. Also, SCE counts a total of
six hours per event for RA purposes. SCE needs to review the program proposal to reflect
105
Based on SGE 02, Attachment 1, Revised Appendix X Table 11.
106
‘‘Effects of an event in subsequent hours, when electricity usage may exceed the curtailed customers’’ reference
load, as air conditioners work to return residences to original temperature set points.’’ 2012 Load Impact
Evaluation of Southern California Edison’’s Residential Summer Discount Plan (SDP) Program at 13
107
Staff does not have more detailed information on customer preference or what would be the ideal event
duration before customers drop off the program.
56
customer preference –– if customers will not favor being cycled for 6 hours the program should
not have such long event duration proposal.
Moreover, SCE should explore new ways of delivering the program, for example using temperature control via a programmable communicating thermostat (PCT) instead of a switch on the equipment that cycles the unit off and on. This could allow for longer event durations while maintaining customer engagement, as the unit would never be completely off.108 In fact, both SDG&E and SCE should take advantage of AMI infrastructure and related enabling technologies that could improve program delivery, reliability and customer experience.
108 D.13-04-017 at 27 states that innovative approaches using PCTs and OpenADR could enable shorter event durations. At the time, the Commission did not have data reflecting the rebound effect, which may discourage short event durations. This issue should be taken into consideration when designing the approved pilot.
Chapter 5: Non Residential Demand Response Programs
I. Summary of Staff Analysis and Recommendations
The analysis of customer experience with DR programs for commercial customers focuses on three key commercial programs: AC Cycling, Auto DR and the Demand Bidding Program (DBP).
Staff recommends that the new features of SCE's AC Cycling program be clearly communicated to customers through outreach and marketing to avoid customer dissatisfaction and dropout. In addition, Staff finds only limited evidence that the Auto DR program coupled with the Critical Peak Pricing (CPP) rate provides greater load impact than customers on the CPP rate alone obtain. As a result, Staff recommends that future studies continue to explore the load impacts of Auto DR.
Staff recommends that SDG&E and the Navy collaboratively design a Navy-only DBP program to meet the unique needs of the Navy. Key attributes of the program would include a day-ahead trigger, aggregation of 8 billable meters and a minimum bid requirement of 3 MW.
II. Background and Summary of Utility Data
In response to the Energy Division letter, SCE and SDG&E provided data on the commercial customer experience and commercial customer participation in the non-residential DR programs. Customer enrollment and participation numbers during events were provided by program, along with the load impacts that those customers produced. In addition, SCE and SDG&E provided qualitative information on the commercial customer experience of the DR programs, including how customers felt about the incentives offered, whether customers were fatigued by consecutive DR events and whether customers felt that too many DR events were called. SCE and SDG&E also provided information on the efficacy and customer experience of DR event notification.
Overall, SDG&E reported that the customer experience was positive, and that it tried to
deliver notification to drop load earlier than required (for both commercial and residential
customers).
Among its various non-residential program offerings, SDG&E offers a Capacity Bidding Program (CBP) in which participants can choose between a day-ahead and a day-of option. Participants are required to reduce their usage by 20 kW or more. Program participants receive a capacity payment and an energy payment for the hours of reduction. However, the program also carries penalties for delivering less than 50% of the pledged reduction. The customer feedback for this program came from aggregators,109 who suggested that increasing the incentives could potentially increase enrollment in CBP.
SDG&E also offers a Demand Bidding Program and has two non-residential customers enrolled in this program. In 2012, the Demand Bidding Program was offered on a day-ahead basis, with incentives paid to customers for reducing their demand during an event. In response to SDG&E's questions about incentives, the DBP customers indicated that the incentives were not high enough. The Commission adopted SDG&E's proposal to change this program from a day-ahead to a day-of, 30-minute trigger program.110
109 An aggregator is an entity that aggregates, or combines, customer load and makes it available for interruption.
Another SDG&E non-residential program offering is the Peak Time Rebate (PTR) program (Commercial). On event days, participating customers are required to reduce their electricity use for the event duration, and customers can sign up to be notified of events in advance. Commercial customers signed up for alerts at a much lower rate than residential customers and also provided less load reduction, most likely because of their limited ability to reduce load between 11 a.m. and 6 p.m.
SDG&E included three post-event PTR surveys, which provided results on residential and small commercial customers' experiences with PTR events. Key trends were:
Small commercial customers were generally aware of Reduce Your Use days; however, event-specific awareness was lower.
Small commercial customers indicated that they face different challenges than residential customers in responding to PTR events. Feedback on program improvement from commercial customers included the following comments: commercial customers stated that they were not able to reduce more and were already doing what they could; small commercial customers indicated that they would benefit from advance notification; and, finally, they stated that responding to events would affect their business operations or customer comfort.
General program feedback from SDG&E indicated that estimating the effects of customer fatigue on load impacts is difficult. When event days are called in a row, there are so many varying factors, such as events falling on different days of the week and varying temperatures on event days, that it is difficult to determine whether the change in load impact is due to multiple consecutive event days or to other influencing factors. SDG&E describes its experience with PTR events called in quick succession and indicates that preliminary load impacts were lowest on the last day. This may be due to customer fatigue.
For the other programs, the load impacts did not show evidence of customer fatigue. Again, this does not conclusively show that customer fatigue is absent; it was simply not measurable relative to other variations in load impacts between events.
SCE launched a Summer Readiness Campaign in April 2012 to prepare customers for the upcoming summer. Overall, customers were responsive and the customer experience was positive.
SCE offers many non-residential DR programs similar to the programs offered by SDG&E. These include an AC Cycling program, which offers customers various cycling options under which the utility can directly turn off the customer's air conditioner when needed. Customers receive a credit based on several factors, including the cycling option they choose and the AC unit's cooling capacity. SCE has proposed changes to this program and has requested the ability to call events not just for emergency reasons, but also when prices are high.
110 D.13-04-017 at 15.
Another non-residential program offered is Auto DR. The program provides incentives to offset the cost of purchasing and installing technology that helps customers automatically manage their load. The customer determines a load control strategy and pre-sets it in the technology. With the technology in place, the program automates the process of reducing demand by sending a signal from a central system to the customer's automated control system, which then automatically reduces usage for the duration of a program event.
During the 2012 Summer Event Season, the Demand Response Help Desk (for large business customer DR programs) received 1,410 calls. 21% of those calls were related to program events; however, none of the callers indicated that there were too many events or that the incentive payments were inadequate. Non-event calls (79%) pertained to program eligibility, questions about enrollment, assistance with online tools and other general program information.
In mid August, SCE conducted market research to gauge customer awareness of enrollment
campaigns and SCE messaging, customer actions in response to the campaigns, and attitudes
towards SCE and energy conservation. Overall, most of the residential and small business
customers heard the campaign message and attempted to reduce their usage. Most of the
customers understood the need to conserve energy over the summer and attempted to do so.
Customer awareness was raised and the campaign prompted some customers to enroll in SCE
programs. Customer attitudes about reliability and avoiding outage remained strong.
SCE did not observe customer fatigue during the 2012 event season for DR programs in general. The programs are able to avoid events on multiple consecutive days because of the flexibility in their dispatch triggers.
III. Commercial Air Conditioning (AC) Cycling
A. SCE’’s Summer Discount Plan
In December 2012, SCE conducted a pilot telephone survey on several programs, including the Summer Discount Plan (SDP) program.111 The overall sample size was 200 business participants, though the sample size varied by the question being asked. Satisfaction with the program was moderate, and only 72% of participants were aware that their business was enrolled in SDP. D.13-04-017 approved SCE's proposed changes to the SDP Commercial program, and we examine the commercial customer experience, as presented through this survey, in detail below.
Overall, a large percentage of participants (81%) felt that the program was worthwhile. Of the three main touch points identified, billing was the key driver, and customers were moderately satisfied with this touch point. Relatively few participants (18%) had reasonably high familiarity with the program details; customers with high familiarity tended to be more satisfied, and customers who received targeted SDP communication were more familiar with, and more satisfied with, the program.
111 Service Delivery Satisfaction Recalibration: Summer Discount Plan 2012 Pilot Survey.
Most of the business participants were SCE customers at home (86%) and were
predominantly male (63%).
There were three main touch points that had a significant impact on satisfaction. Billing, enrollment and events were the key drivers, with billing the highest-priority driver and events the lowest. For billing, customers had difficulty identifying the discount and did not think the discount was fair given the effort it took to participate. For enrollment, customers were primarily concerned with delays in device installation. Customer reasons for event dissatisfaction were, specifically, the time, day and frequency of events, as well as perceptions of fairness.
Satisfaction with billing was 78%, which was considered moderate compared to other SDP touch points. 83% received paper bills, while 19% received electronic bills, and 17% could not easily find the discount on the SDP bill. The top two comments on the SDP bill were to provide a separate line item for the discount and to offer a bigger discount or lower rate. The bill currently includes a separate line item for the discount, but customers are not able to find it and need to be reminded that it is there. Reported problems with event attributes were low (8%).
Satisfaction with enrollment was modest (78%). 7% experienced problems with enrollment, most of them related to confusion (about the amount of the discount or the expected savings) or delays (waiting for the device to be installed, multiple visits and multiple phone calls).
The more satisfied customers were the ones that were aware that:
1. They receive a discount regardless of event.
2. The indicator light identifies an event in progress.
3. Events occur between June 1 and October 1.
4. The maximum duration of event is 6 hours.
Relatively few participants (18%) had reasonably high familiarity with the program details. Most business participants were aware of the 100% and 50% cycling options, but fewer than half of respondents were aware of the methods (the indicator light and SCE.com) for determining whether the device was currently cycling the AC off. Only 22% of respondents knew the correct start and end months of the program; many of the other customers did not know or did not provide correct answers to the question. Those who answered correctly tended to be more satisfied with the program.
Only 12% of respondents correctly identified the 6-hour maximum event duration; 21% said that there was no limit, and they were less likely to be satisfied with the program. The 57% of respondents who knew that they receive discounts whether or not events are called were more likely to be satisfied as a result. The number of events did not impact satisfaction.
When investigating the reasons for program satisfaction, 36% of respondents said they were happy with the program, and 17% responded that a good discount was provided. 19% of the comments were negative; 11% of this feedback related to financial reasons, such as the bill being too high, the bill increasing, or the discount being small. Bad customer service was another negative, at 5%.
Around a third of the customers provided suggestions for improving the SDP. In this
feedback, financial comments were paramount, with the following reasons being cited:
Lower rates (5%)
Bigger discount (5%)
Better communication (4%)
A large percentage of participants (77%) did not know the discount amount. Participants
who are most satisfied are likely to know the discount dollar amount.
Participants with moderate to high familiarity were more likely to have received recent
communication from SCE. All types of communication boosted familiarity though the written
method was the dominant form.
B. SDG&E’’s Summer Saver Program
The findings in this section are based on KEMA's process evaluation of the 2008 Summer Saver program.112 At the time of the evaluation, the program had 4,500 commercial participants (and an even greater number of residential participants). Commercial customers can choose between 30% and 50% cycling options and between a 5-day and a 7-day option. To the extent that commercial customer experience was reported separately, it is cited in this report; otherwise, general feedback is cited.
Since this information is dated, we use it primarily for general feedback on the Summer Saver program at SDG&E and as a means of comparing the AC Cycling programs of SCE and SDG&E.
A key conclusion of the report was to improve the program's marketing and informational materials to reduce program dropouts and attract more interested customers. Better information about cycling frequency could have resulted in less dissatisfaction and dropout; discomfort and program cycling were most often the top reasons for dropping out. Better marketing could also have reached customers who are interested in the program. The report recommended customizing marketing messages to customer subgroups: surveys of Summer Saver participants and non-participants found that bill-credit messages had greater appeal to lower-income customers, while environmental messages had greater appeal to higher-income customers.
With regard to cycling options, the report recommended reducing program complexity by reducing the number of cycling options. A related recommendation was not to increase the cycling frequency. The program currently cycles 10-12 times a year, and participants indicated that they were uncomfortable during Summer Saver control events. The key reason participants joined the program was the financial incentives.
112 Process Evaluation of SDG&E Summer Saver Program, March 19, 2009.
Key Lessons Learned from the AC Cycling Programs
When comparing the feedback received on the AC Cycling programs of SCE and SDG&E, a clear recommendation emerges: marketing, clear communication and managing expectations are key facets of the program. When customers know what to expect, they tend to be more satisfied with the program. SCE customers with high familiarity with program attributes fared better and were more satisfied. SDG&E could also improve its marketing and informational materials to reduce dropouts and attract interested new customers.113 Clarifying the program and making it less complex is important to attracting and retaining customers. Targeting those messages, by subgroup in the case of SDG&E, is another useful method of attracting customers based on their values and priorities, whether financial or environmental.
D.13-04-017 approved SCE's proposal to modify its commercial program from a reliability-based DR program to one that can be dispatched for economic purposes. The new trigger will allow the program to be called when wholesale market prices are high, which occurs during times of extreme temperature or when system demand is higher than expected. Additionally, SCE will consolidate the Base and Enhanced commercial programs into one program with different features and proposes that the SDP be made available year-round. The key program changes are outlined below:
Table 35: Key Program Changes
Program Element | Current Design | Approved Design
Curtailment Event Trigger | Emergency Only | Emergency and Economic
Program Availability | Events can occur June 1 - September 30 | Events can be called year-round, with a maximum of 180 event hours during a calendar year
Event Duration | 6 hours | Multiple events may occur in a single day, with varying durations; maximum 6-hour interruption in a day
Customer Cycling Options | 30%, 40%, 50% and 100% | 30%, 50%, 100%
With the move to an economic-based program and the new program features, it is paramount that the marketing campaign clearly explain the changes, such as event durations, which are now expected to be shorter even though the program can be called year-round. Billing changes should also be made to assist customers in identifying discounts, since SCE customer feedback on program improvement was largely financial. SCE's new program design has new incentive levels: the proposed enhanced program will pay a greater incentive per ton per month than the current enhanced program, and the new incentive should be communicated clearly to participants, whether through clear bill presentation, marketing efforts or a combination of the two.
113 This survey is dated, so SDG&E may have made marketing modifications to alleviate some of the concerns presented above.
In 2012, the Summer Discount Plan Commercial was triggered once, for 5.6 hours.114 With its move to an economic-based program, which can be called when wholesale market prices are high, it is likely to be called more frequently. The Capacity Bidding Program Day-Ahead (CBP-DA) had a heat-rate trigger condition in 2012, and that program can be used as an example of how frequently a non-emergency program may be called: in 2012, the CBP-DA was called 12 times.
The proposed changes to SCE's AC Cycling program may provide the needed megawatts this summer and will also benefit customers, who are often financially motivated. However, these expected changes need to be communicated clearly to avoid customer dissatisfaction and possible dropout from the program. If the marketing program is managed carefully, SCE's Summer Discount Plan for commercial customers can be a useful source of load impact for the summers of 2013 and 2014.
IV. SCE’’s Auto DR
Auto DR is a technology program whereby customers receive an incentive to install equipment that improves their ability to reduce load during a DR event. Auto DR is considered to provide better load shed, as described in D.12-04-045:
"Limited data suggests that ADR customers have a higher participation rate in DR programs and provide better load shed. Data also suggests that customers on dynamic rates perform better with ADR."
SCE's Auto DR customers pre-program their level of DR participation, and when a DR event is called, the Auto DR technology enables the facility to participate automatically. This method reduces the need for a manual response. All non-residential customers must have an interval meter and participate in an available price-responsive DR program. As of 2013, customers are paid 60% of the technology incentive at the installation/testing/verification stage; the remaining 40% of eligible incentives is paid according to participation in a DR program.115
By the end of 2012, SCE's Auto DR program funding was 100% subscribed.116 In the Application, SCE requested approval of an additional $5 million for Auto DR, which would be earmarked for projects in the Target Region; the majority of the funding ($4,200,000) was for technology incentive payments.
The key questions that arose were what the customer experience with the Auto DR program was and how effective customers were in shedding load when events were called. The Application does not provide a breakdown of DR program participation for those customers who participate in Auto DR. To understand the customer experience better, we refer to other studies on the efficacy of Auto DR and on customer feedback on the program.
CPP is a rate that sets a higher price for electricity during critical peak days. In return, the
customer receives a reduction in the non peak energy charge, demand charge or both charges.
114 SCE-03.
115 SCE-03 at 25.
116 SCE-02 at 18.
A report on the non-residential CPP rate presents the estimated ex post load impacts of Technology Incentive and Auto DR participants, on average, for 2011 CPP events.117
SCE called 12 CPP events in 2011, between June and September. On average, each event had 3,006 participants. For SCE's CPP customers in 2011, the percentage load impact was 5.7% for the average event, and the average load impact was 11.6 kW per customer. There were 35 CPP customers on Auto DR, and they provided a percentage load impact of 21%, with an average load impact of 103 kW per customer. Based on this information, customers on Auto DR and CPP provide greater load reduction than customers on the CPP rate alone.
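A rough per-customer comparison of the figures above illustrates the gap; this is only an arithmetic illustration of the cited averages, not an estimate of the aggregate impact attributable to Auto DR.

# Per-customer comparison of the 2011 CPP figures cited above.
cpp_avg_kw, cpp_avg_pct = 11.6, 0.057          # average CPP participant
autodr_avg_kw, autodr_avg_pct = 103.0, 0.21    # average of the 35 Auto DR + CPP participants

print(f"Per-customer kW impact ratio: {autodr_avg_kw / cpp_avg_kw:.1f}x")    # ~8.9x
print(f"Percentage impact ratio:      {autodr_avg_pct / cpp_avg_pct:.1f}x")  # ~3.7x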
To further understand Auto DR and its potential to provide load impacts, we examine a study by EnerNOC on CPP.118
SCE and SDG&E offer Technology Assistance and Technology Incentives programs. SDG&E's Technical Assistance program provides customers with energy audit services to identify potential for energy cost reduction and to encourage participation in DR and EE programs. SDG&E's Technology Incentive program provides financial incentives and interest-free on-bill financing for customer adoption and installation of DR measures and enabling technologies.
The EnerNOC study outlines the barriers to responding to an event. The main barrier for the bottom performers, or low-performing participants, is an inability to reduce demand because of business needs: responding to an event would negatively impact business functions. Examples of these limitations included the need to maintain a temperature to prevent produce from spoiling, to protect sensitive equipment from damage, or simply to keep staff comfortable. A related barrier is a lack of knowledge about how to reduce load. Additional barriers included a lack of enabling technology; however, this was not identified as a top concern.
Most of the top responders, or high-performing participants, are able to easily shift their processes or shut down energy-intensive equipment and respond to events. However, few of the top responders use technology to automate their response.
The main barrier to responding to events is the ability to respond without suffering negative business consequences. For businesses that have the capacity to respond without being negatively impacted, technology is a possible solution that can be explored. Due to the small population size, only 16 technology-enabled customers were interviewed, the majority of them SCE customers. This is a small sample, and the feedback should be interpreted with caution. Half of these customers said that the technology was important to their response, and the other half stated that they would have stayed on the CPP rate without the technology. Four of the customers interviewed do not use the installed technology; three of these customers still respond to events.
Select feedback includes: "The load we shed is entirely enabled by the Auto DR technology" (SCE technology-enabled customer).
117 2011 California Statewide Non-Residential Critical Peak Pricing Evaluation, p. 41.
118 California Statewide CPP Research on Improving Customer Response, December 3, 2012.
Most of the bottom responders do not use technology to respond and are not aware of
options in this regard.
From the quantitative data, Auto DR customers on the CPP rate provide greater load impacts than CPP customers without enabling technology. However, the data above, though limited by the small sample size, can provide direction for continued research. With the additional Auto DR funding requested in SCE's application, the participant pool continues to grow, and with this growth it is possible to conduct better studies with more robust results. Studies can attempt to isolate the benefits provided by Auto DR, in particular the load impact that can be attributed to Auto DR and that, in its absence, could not have been achieved.
V. SDG&E’’s Demand Bidding Program (DBP)
The Commission approved SDG&E's Demand Bidding Program in the middle of summer 2012 as part of the mitigation efforts to address the SONGS outage.119 SDG&E called three day-ahead DBP events and obtained load reductions of 5.1 MW, 5.4 MW and 4.6 MW. During 2012, SDG&E spent $44,192 on this program, which was minimal compared to its residential PTR program costs of $10 million for only 4 MW of load reduction. The recent DR decision (D.13-04-017) approves SDG&E's proposed continuation of its Demand Bidding Program, modified from a day-ahead to a day-of, 30-minute product. The purpose of this modification was to align the program with the Energy Division letter and provide programs with quick-response capability. In its comments, the Navy stated that a DBP with a 30-minute trigger would only permit participation by entities with automated demand response systems, in effect reducing participation, and requested continuation of the day-ahead program.
The Navy states that the 2012 DBP did not allow for the Navy's participation and that the change to a day-of, 30-minute program will further limit its ability to participate.
Instead, the Navy proposes a day-ahead program with some modifications. First, the Navy proposes that the customer be allowed to aggregate 8 billable meters. Second, it proposes to lower the minimum bid requirement from 5 MWh to 3 MWh. The Navy states that it may not be able to produce 5 MWh at a single geographic location and cites its experience of August 2012, when, during a Demand Reduction test, the Navy shed 4 MWh from a multitude of shore facilities at three Navy installations.
SDG&E responded to the Navy's comments and explained why it believes the Navy did not participate in the 2012 DBP: SDG&E understands that the Navy will participate in an emergency program; however, the DBP is not a day-ahead emergency program.
In its response, SDG&E indicated its willingness to work with the Navy to create a demand response program that meets the Navy's unique needs.
119 In Resolution E-4511 on July 12, 2012.
Staff recommends that SDG&E and the Navy collaboratively develop a Navy-only DBP program that addresses the following issues raised in the Navy's comments on the 2013-2014 DR proposed decision:120
1. A day-ahead trigger to enable the Navy to plan appropriately for the event.
2. The ability to aggregate 8 billable meters.
3. A lower minimum bid requirement of 3 MW instead of 5 MW.
Experience from the Demand Reduction test demonstrates that the Navy has the ability to reduce load and can be a useful DR resource for SDG&E's system during the summer.
120 Filed on April 9, 2013 in A.12-12-016.
Chapter 6: Flex Alert Effectiveness
I. Summary of Staff Analysis and Recommendations
The Flex Alert campaign has not been evaluated since 2008. Earlier evaluations from 2004-2008 suggest that the impacts of an emergency alert have ranged from an estimated 45 MW to 282 MW. The utilities have identified areas for improving communication between the CAISO and the utilities when alerts are triggered and cancelled. The utilities also question whether customers are confused about the difference between a Flex Alert event and local Peak Time Rebate events. The utilities cite several reasons to consider transitioning Flex Alert from a utility-funded program to a CAISO-led and CAISO-funded program.
Staff finds that there is a lack of data to evaluate the effectiveness and value of the Flex Alert campaign and agrees with the utilities that an evaluation in the current program cycle is needed. Staff also finds merit in the utilities' proposal to terminate Flex Alert as a ratepayer-funded and utility-led activity after 2015. Rather than providing recommendations in this report, staff defers to the proceeding that is currently reviewing the utilities' statewide marketing, education and outreach applications (A.12-08-007, A.12-08-008, A.12-08-009 and A.12-08-010) and to the Phase I Decision in that proceeding, D.13-04-021.
II. Background
"Flex Alert" is the current name of a statewide marketing campaign that asks Californians to conserve or shift electricity use when the CAISO determines that there is a risk that electricity supply may not be adequate to meet demand.121 The alert campaign is approved through a CPUC decision, and the CPUC authorizes the three investor-owned electric utilities to provide the total budget. One utility acts as the lead utility and contracts with a marketing agency to develop TV and radio ads and purchase advertising time. The marketing agency purchases advertising slots throughout the state to run the ads during the summer season, when demand is likely to be highest and the grid is more likely to be constrained. The CAISO triggers an alert based on grid conditions and informs the utilities and the marketing agency. The marketing agency then swaps out informational advertisements for emergency alert messages, calling a "Flex Alert" and asking Californians to do three things during a six-hour window on a specific day: 1) turn off unnecessary lights, 2) set air conditioners to 78 degrees, and 3) wait until after 7 p.m. to use major appliances. Individuals and businesses also have the opportunity to sign up to receive email or text messages notifying them that there will be an alert.
Flex Alert Performance in 2012
In 2012, two Flex Alert events were called, on August 10 and August 14. Initially, Flex Alerts were triggered on August 9 for August 10-12; however, the alerts for August 11 and 12 were later cancelled. A formal evaluation of Flex Alert was not conducted in 2012, and the utilities did not conduct any analysis to estimate the impacts that resulted from either Flex Alert event.
121 From 2001-2004 the name for emergency alerts was Power Down, and from 2004-2007 they were referred to as Flex Your Power Now.
SDG&E was concerned that customers would not recall the difference between Reduce
Your Use, which provides customers a bill credit, and Flex Alert, which provides no monetary
benefits. In the event that customers did not understand the difference between Flex Alert and
Reduce Your Use, SDG&E wanted to avoid customer frustration that could occur when
customers reduced their usage during a Flex Alert but were not paid for it. To mitigate
confusion, SDG&E triggered its Reduce Your Use program on the same days when Flex Alerts
were called. There were three days when the utility triggered a Reduce Your Use event when
there was no Flex Alert. However, the utility claims that the weather on the three Reduce Your
Use event days was atypical, and therefore the utility cannot determine the load reductions
attributable to Flex Alert by comparing Flex Alert days with the days when Reduce Your Use was
called and Flex Alert was not.122
SCE also states that, with the limited data available, it cannot determine the effect of a Flex Alert. SCE did a basic comparison of two days with similar conditions when the same DR programs were dispatched, one day with a Flex Alert and one day without, and concluded that Flex Alert could be counterproductive because SCE's total system load was higher on the day the Flex Alert was called.123
In comparison, there have been three evaluations of the alert campaign over its history: 2004-2005, 2006-2007 and 2008. The 2004-2005 evaluation did not estimate the impact of an alert event. The 2006-2007 evaluation reported that the system-wide demand response impact on Flex Alert days (including all other demand response programs that were called) ranged from 200 MW to 1,100 MW; the impact from Flex Alert was a portion of this total. The 2006-2007 evaluation also estimated the load impacts associated with alert events, specifically from customers adjusting their air conditioner settings in response to the ads. Although the study estimated impacts ranging from 93 MW to 495 MW,124 in 2008 the consultant redid its analysis with revised assumptions and adjusted the estimate to between 45 MW and 75 MW.125 The 2008 evaluation estimated load impacts based on customers turning off lights and adjusting air conditioners; it estimated that impacts from alert events in 2008, based on customers taking these two actions, ranged from 222 MW to 282 MW.126
Flex Alert has not been evaluated since 2008. In 2009 and 2010, no Flex Alert events were triggered; in 2011 there was one event, but there was no evaluation. Given this long gap, and given that the utilities seem unable to draw any conclusions about load impacts attributable to Flex Alert events, it is reasonable to plan an evaluation for the summer of 2013. The Commission issued a Decision on the utilities' statewide marketing application on April 18, 2013; the Decision includes a directive to evaluate the program.127
122 SGE-02 at 25.
123 SCE-01 at 59.
124 2006-2007 Flex Your Power Now! Evaluation Report, Summit Blue Consulting, May 22, 2008, p. 126. A link to this report is provided in Appendix S.
125 2008 Flex Alert Campaign Evaluation Report, Summit Blue Consulting, December 10, 2008, p. 102. A link to this report is provided in Appendix S.
126 Id.
III. Utility Experience with Flex Alert
SCE identified three weaknesses in the implementation of Flex Alert. First, the utility states that challenges exist because neither a utility nor the PUC owns the trademark to the name Flex Alert. Second, the utilities did not receive advance notice from the CAISO when a Flex Alert was triggered or cancelled. Third, the CAISO's inability to accurately forecast the duration of an alert resulted in confusion when an alert was cancelled.128
The utilities state that they were contacted by the CAISO at the same time that the news media and the general public were informed about a Flex Alert. The CAISO held weekly calls with the utilities to discuss weather forecasts and the likelihood that a Flex Alert would be called. However, when an alert was triggered, the utilities learned of the event through a robo-call, automated email or text message, the same methods used to inform residential customers and media outlets. The utilities would prefer advance notification so that they can strategically coordinate the initiation of their own DR programs and proactively communicate with customers.
The cancellation of the weekend alerts on August 11 and 12 also caused confusion. SDG&E claims that both internal staff and local media were confused about whether or not conservation and Reduce Your Use days were still necessary.129 SCE acknowledges that it is inefficient and costly to re-contact media outlets to cancel alerts.
inefficient and costly to re contact media outlets to cancel alerts. Flex Alert radio and television
commercials continued to air throughout the weekend, because the marketing agency was not
able to give the media stations adequate time to switch the messages before the ads were
locked in for the weekend. To add to the confusion, CAISO’’s website continued to indicate
there was a Flex Alert even though the agency had issued a press release stating the weekend
alert events were cancelled.130
Prior to the start of the 2013 Flex Alert season, the utilities, CAISO, and the marketing
agency should discuss the weaknesses the utilities identified in 2012. The organizations
should use their expertise and the recommendations from past Flex Alert evaluations to
identify methods to improve the timeliness of communication and ensure that implementation
is as efficient and effective as possible.
IV. Customer Experience
Only one utility, SCE, conducted a survey in 2012 to determine customer awareness of and
reaction to the 2012 Flex Alert campaign. Although SDG&E did not conduct a survey, the utility
raised concerns that both customers and the media seemed confused about the difference
between Reduce Your Use events and Flex Alert.131 SCE found that 10 percent of surveyed
127 D.13-04-021, Ordering Paragraph 14.
128 SCE-01 at 60.
129 SGE-02, Attachment 1 at 26.
130 SCE-01 at 60.
131 SGE-02, Attachment 1 at 26.
customers were confused about the difference between the utility's Peak Time Rebate and Flex
Alert.132
SCE reported the following results from its survey of 400 customers:133
• Nearly 60% of residential customers reported hearing or seeing a Flex Alert advertisement;
• 54% of small business customers reported hearing or seeing a Flex Alert message;
• 25% of residential customers surveyed reported that they took steps to reduce electricity use on a Flex Alert day; and
• 21% of small business customers reported taking steps to reduce electricity use when a Flex Alert was called.
Compared with past evaluations, customer recall of Flex Alert ads has increased from one
evaluation period to the next. In 2004-2005, 12 percent of customers could recall hearing or seeing
an ad, compared to 15 percent in 2005-2006 and 23 percent in 2008.134 A formal evaluation in 2013
can help determine whether the jump in awareness reflected in SCE's survey results is an accurate
reflection of the trend. The 2013 evaluation should take into account the variety of
mechanisms used to relay information about alerts to customers. For example, 2012 was the
first year that the utilities conducted outreach through Community Based Organizations to help
prepare customers for a Flex Alert event.
Another highlight from SCE's survey is that 25 percent of residential customers took action
in response to an alert. This percentage is also an increase over past evaluation results: in past
years, between 10 and 21 percent of residential customers reported taking action in response to
the ads.135 However, SCE's survey did not determine whether customers accurately
understood the message that a Flex Alert is intended to convey. All three prior evaluations
found that customers did not understand that they were supposed to adjust their behavior for
just the day of the Flex Alert event. Instead, customers reported continuing to conserve during
afternoon hours every day after the event had been called.136 While conservation has its own
benefits, the purpose of a Flex Alert is for customers to shift load during a brief peak event. It
will be important for the utilities, the CAISO, and the marketing agency to continue to strive to
accurately relay this concept, and for an evaluation to determine whether the right message is
getting through to customers.
The utilities made one specific recommendation to improve the program in 2013-2014:
they proposed to continue community outreach partnership efforts in 2013 and 2014 in the
132 SCE-01 at 61.
133 SCE-01 at 61.
134 Process Evaluation of the 2004/2005 Flex Your Power Now! Statewide Marketing Campaign, Opinion Dynamics Corporation, July 24, 2006, p. 5; 2006-2007 Flex Your Power Now! Evaluation Report, Summit Blue Consulting, May 22, 2008, p. 90; 2008 Flex Alert Campaign Evaluation Report, Summit Blue Consulting, December 10, 2008, p. 83. Links to these reports are provided in Appendix S.
135 Id.
136 Id.
demand response proceeding. The Commission adopted a Decision on April 18, 2013, which
approves these requests.
V. The Future of Flex Alert
SDG&E cites a passage from SCE's testimony in its Statewide Marketing Application
that identifies several reasons the Commission should consider having CAISO take full
control of the statewide emergency alert campaign starting in 2015. SDG&E states that it
supports SCE's recommendation. SCE's testimony states that since 2004 the utilities
have funded alerts through ratepayer dollars. However, when alerts are called, the results
benefit customers outside of the utilities' service territories as well, yet neither CAISO nor
non-utility Load Serving Entities contribute to the funding. SCE also pointed out that from 2007
to 2011 only one alert was triggered. Increased growth in utility demand response programs has
positively impacted grid reliability, the utility states. SCE found that balancing utility-specific
regulatory constraints with the CAISO's desired scope of the program was challenging. As an
example, CAISO requested sharing emergency alert messaging with Baja Mexico to promote
energy conservation in that region. SCE's testimony goes on to state that the utilities do not
have discretion over when to trigger the program. SCE recommended that, since CAISO
triggers the program, the CAISO should assume total ownership of, and authority over, it. SCE
requests that this recommendation be approved during 2013-2014 so that CAISO has the
opportunity to seek funding in its GMC cost recovery.137
The Commission adopted a Decision on Phase 1 of the utilities' statewide marketing
application on April 18, 2013. The Decision authorizes a total of $20 million to be spent on Flex
Alerts between now and the end of 2014. The Decision also directs the program to be
evaluated. The Decision includes a directive for the utilities to work with CAISO to develop a
proposal for the transfer of the administration and funding of the Flex Alert program to the
CAISO or another entity, effective in 2015. The Decision directs SCE to submit the proposal in
the Statewide Marketing Proceeding by March 31, 2014.138
VI. DR Program Ex Post Load Impact Results on the Flex Alert Days
As shown in the tables below, all three utilities triggered a DR event for some of their DR
programs during the two Flex Alert days, with a total of 739 MW of load reduction from 4:00 to
5:00 p.m. on August 10, 2012 and 432 MW from 3:00 to 4:00 p.m. on August 14, 2012. The CAISO
reported that the actual system peak load during the peak hours between 3:00 p.m. and 5:00
p.m. was significantly lower than its forecasts and attributed the load drops to its Flex Alerts.
However, the data suggest that some, and possibly a large, portion of the load reduction came
from the DR programs. Appendix P shows the ex post load impact for each of the utilities' DR
programs on the two Flex Alert days.
137 SGE-02, Attachment 1 at 28.
138 D.13-04-021 at 25-27.
Table 36: Utilities' DR Program Ex Post Load Impact on the Flex Alert Days139

                      Ex Post (MW)
Utility      3:00-4:00 p.m.   4:00-5:00 p.m.
8/10/12:
  SCE             194              185
  SDG&E             8               27
  PG&E            459              527
  Total           661              739
8/14/12:
  SCE             394              242
  SDG&E            38               38
  PG&E         No Events        No Events
  Total           432              280
139 Provided to staff through emails. Data source: the utilities' April 2, 2013 Load Impact Reports (links to the reports are provided in Appendix S).
Chapter 7: Energy Price Spikes
I. Summary of Staff Analysis and Recommendations
Because most DR programs are dispatched a day ahead or several hours ahead of events, it
is difficult for the utilities to use DR programs effectively in response to real-time price spikes.
There were many days when price spikes occurred but DR programs were not called; conversely,
there were days when DR programs were called but no price spikes occurred. DR programs with
30-minute or 15-minute notice could respond to price spikes much more efficiently.
II. Definition of Price Spikes
For the purposes of this report, a price spike day was defined as any day on which the average
hourly real-time price reached $150/MWh or more in three or more hours from HE12 through
HE18 (hour ending 12 through hour ending 18). This definition is designed to evaluate only those
hours in which DR could respond. Restricting the definition to HE12-HE18 limits it to the hours
when DR can be called, and restricting it to days with three or more hours above $150/MWh
eliminates days with momentary jumps in price that DR could not reasonably be expected to
respond to.
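The detection rule above can be expressed in a few lines of code. The sketch below is illustrative only (it is not staff's analysis tool), and the sample prices are hypothetical:

```python
# Flag "price spike days": three or more hours between HE12 and HE18 whose
# average real-time price is at or above $150/MWh.
from collections import defaultdict

SPIKE_THRESHOLD = 150.0          # $/MWh
WINDOW = range(12, 19)           # hour ending 12 through hour ending 18
MIN_SPIKE_HOURS = 3

def find_price_spike_days(hourly_prices):
    """hourly_prices: iterable of (date, hour_ending, average_price) tuples."""
    spike_hours = defaultdict(int)
    for date, hour_ending, price in hourly_prices:
        if hour_ending in WINDOW and price >= SPIKE_THRESHOLD:
            spike_hours[date] += 1
    return sorted(day for day, count in spike_hours.items() if count >= MIN_SPIKE_HOURS)

# Hypothetical data for illustration; not actual 2012 CAISO prices.
sample = [("2012-08-10", 15, 162.4), ("2012-08-10", 16, 171.0),
          ("2012-08-10", 17, 155.3), ("2012-08-11", 16, 149.9)]
print(find_price_spike_days(sample))   # ['2012-08-10']
```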
III. DR Programs and Price Spikes
Using the definition above, SCE had 67 hours that averaged $150/MWh or more across the
hour, with 7 days on which three or more hours averaged $150/MWh or more.
SDG&E had 126 hours that averaged $150/MWh or more across the hour, with 18 days on which
three or more hours averaged $150/MWh or more.
DR events overlapped real-time price spikes with varying success: SCE was successful on 2
out of 7 days, whereas SDG&E was successful on 4 out of 18 days.140
Table 37: Number of Days with Energy Price Spikes

                                                                         SCE    SDG&E
Days that DR events successfully overlapped price spike days
  (3 or more hours of $150/MWh) between HE12-HE18                          2        4
Number of price spike days (3 or more hours of $150/MWh)
  between HE12-HE18                                                        7       18
Days that DR events were called                                           43       15
Days that DR events were called but without a price spike
  ($150/MWh) occurring                                                    31        6
Days with at least 1 price spike of $150/MWh                              36       60
Most of the utilities' price-responsive DR programs are currently designed to be called a full
day ahead of when the load reductions are needed. The existing programs therefore do not
use real-time hourly prices as a trigger; they are triggered by other market indicators such as
heat rates and forecasted temperature. According to SCE, price spikes occur with as little as 2.5
minutes of notice, and any resource that could be used to mitigate price spikes would have to be
already bid into CAISO's market, awaiting CAISO dispatch instructions.141 DR programs are
currently not bid into CAISO's market.
140 For a more complete chart, see Appendix Q.
To the extent that DR programs were triggered when price spikes occurred, it is outside the
scope of this report to quantify the impact of the DR programs on those price spikes. Quantifying
those impacts would require some method of modeling what the prices would have been but
for the load impacts of the DR programs. In theory, DR should have had some impact on prices,
given that DR events overlapped price spike days on a few occasions; demand response on those
days likely exerted some downward pressure on the equilibrium price (i.e., mitigating the price
spikes).
IV. Conclusion
DR programs are not able to address real-time price spikes because of their current
design, and because the programs are not yet bid into CAISO markets. The Utilities should
design new DR programs that enable them to mitigate real-time price spikes, in anticipation that
these programs will be bid into CAISO markets.
141 A.12-12-017, SCE Exhibit 1, pp. 48-49.
Chapter 8: Coordination with the CAISO
I. Staff Recommendations
Because the Utilities' current DR programs are not integrated into the CAISO wholesale
energy market, there is no market mechanism to inform the CAISO how much DR capacity
exists in the system on a daily and hourly basis. Such information is important for the CAISO's
operational considerations. The utilities' Weekly and Daily DR reports developed in summer
2012 are a valuable alternative for making their DR resources more visible to the CAISO. Staff
appreciates the Utilities' efforts in the development and submission of the Daily and Weekly DR
reports. Staff agrees with the CAISO that all three utilities should submit the Daily and Weekly
DR reports in summers 2013 and 2014. The utilities' (including PG&E's142) DR reporting
requirements for 2013-2014 are summarized in Appendix R.
II. DR Reporting Requirements in Summer 2012
As discussed above, prior to summer 2012, under the oversight of the Governor's Office,
the Commission worked closely and intensively with the CAISO, the CEC, and the Utilities on
contingency planning to mitigate the potential effects of the SONGS outage and ensure
system reliability throughout the summer. One of the initial steps was to identify the Utilities'
DR resources available to address the five different types of system contingencies (such as
transmission, voltage collapse, and generation deficiency), which is referred to as the mapping of
the DR programs.
The next step was to develop a mechanism to inform the CAISO how much Day-Ahead and
Day-Of DR capacity is available on a daily and hourly basis. Unlike other generation resources,
DR is currently not integrated in the CAISO's wholesale energy market. Under the CAISO's DR
Resource User Guide,143 the Utilities are required to submit only the forecast and results for
triggered DR programs. Therefore, if no DR program is triggered, the CAISO is blind to how
much DR capacity exists in the system. With the exception of the Emergency Program, the DR
programs are dispatched by the utilities, not the CAISO. This arrangement, as well as the
reporting requirements set in the CAISO's guide since 2007, had not presented any problem in
the past when the system had sufficient resources.
However, in light of the SONGS outage, CAISO emphasized the importance of daily
communication on the Utilities' DR programs so the CAISO's grid operator could request the
Utilities to dispatch their DR programs if and when they were needed. Working cooperatively
with CAISO and Commission staff, the Utilities developed and submitted the Daily DR
reports from June 1, 2012 to October 31, 2012. The Utilities continued to submit the results of
the DR events seven days after each event (referred to as the "7-Day Report"), consistent with the
142 As staff guidance only, because PG&E is not subject to this proceeding.
143 DRAFT Version 1.0, August 30, 2007. http://www.caiso.com/1c4a/1c4a9ef949620.pdf.
CAISO’’s guidance. Staff provided to the Governor’’s Office the data from the Daily and the 7
Day reports in weekly briefings during the summer 2012.
III. DR Reporting Requirements for 2013-2014
In its 2013-2014 DR application, SCE proposed to eliminate the Weekly and Daily DR
reporting requirements because it did not find that these reports provided value to SCE. SCE
recommends transitioning back to the 2007 CAISO User Guide but suggests that the CAISO
update and publish the guide for all DR providers.144 In its protest to SCE's application, the CAISO
objects to SCE's proposal and requests that the Utilities resume the Daily DR reports after the
winter season ends. The CAISO contends that "(t)he underlying purpose of the date forecasting and
publication was to benefit the system operator rather than the IOUs themselves. The ISO finds
good value in the daily demand response reports. Because the report mechanism, the ISO is no
longer blind to how much DR capability exists in the system in a daily and hourly basis, if and
when it is needed."145
Staff finds that these reports have value not only to the CAISO but also to the Commission.
Through the Daily and 7-Day reports, staff was able to monitor DR status and provide timely updates
to the Governor's Office throughout the summer. A number of lessons learned also led to the
development of the comprehensive questions on DR performance. Therefore, staff recommends
the continuation, for 2013-2014, of all of the DR reports submitted in 2012, as summarized in
Appendix R.
144 A.12-12-017, SCE-1 at p. 54.
145 A.12-12-017, CAISO's Comments filed on January 18, 2013.
Appendix A: Highlight of 2012 Summer Weather & Load Conditions146

SCE
Date         Max Temp (°F)   Max RA Temp (°F)   DR Ex Post Load Impact (MW)   Peak Load (MW)
8/10/2012         89               93                     192                    22,282
8/13/2012         90               95                      59                    22,428
9/14/2012        100               97                      93                    21,799
10/1/2012         95              N/A                      80                    21,355
10/17/2012        97               88                     270                    17,609

SDG&E
Date         Max Temp (°F)   Max RA Temp (°F)   Ex Post Load Impact (MW)      Peak Load (MW)
8/13/2012         91               88                      31                     4,266
8/17/2012         94               88                      23                     4,266
9/14/2012        109               96                      46                     4,592
9/15/2012        104               96                      32                     4,313
10/2/2012         98               96                      25                     4,146.3
146 Includes event days with the top three highest temperatures and peak loads.
Appendix B: Energy Division November 16, 2012 Letter
Provided in a separate PDF file
Appendix C: Descriptions of DR Load Impact Estimates
2012 RA
The 2012 Resource Adequacy (RA) load is a monthly forecast estimate of the load reduction attributed
to individual DR programs under a 1-in-2 weather year condition. This value is utilized in load-resource
planning, and it is based on a year-ahead forecast of customer enrollment.
SCE’’s Methodology
In SCE’’s A. 12 12 017 March 4th
Response To ALJ’’s February 21, 2013 Ruling Requesting Applicants To
Provide Additional Information, 2012 RA MW is based on SCE’’s ex ante load impact results under a 1 in 2
weather year condition, portfolio level, and average hourly impacts from 1pm to 6pm in May Oct. and
from 4pm to 9pm in Nov. Apr.
The PTR and Residential and Commercial Summer Discount Plan (AC Cycling) methodologies are outlined
in the following steps:
1. Defining data sources
2. Estimating ex ante regressions and simulating reference loads by customer and scenario
3. Calculating percentage load impacts from ex post results
4. Applying percentage load impacts to the reference loads; and
5. Scaling the reference loads using enrollment forecasts147
SDG&E’’s Methodology
The 2012 Resource Adequacy MW is based on SDG&E’’s ex ante load impact results under a 1 in 2
weather year condition, portfolio level, and average hourly impacts from 1pm to 6pm in May Oct. and
from 4pm to 9pm in Nov. Apr. The forecast is calculated in accordance with the load impact protocols149
.
The forecast is calculated by multiplying (1) historical load impact per participant as a function of weather
and (2) SDG&E’’s forecast of the number of participants per program.
Load Impact Per Participant150
147 Details of RA protocols obtained from SCE DRAFT 2012 Ex Post Ex Ante Load Impact for SCE's PTR, p. 16. http://www3.sce.com/law/cpucproceedings.nsf/vwOtherProceedings?Openview&Start=1&Count=25
148 Details of RA protocols obtained from SCE DRAFT 2012 Ex Post Ex Ante Load Impact for SCE's SDP. http://www3.sce.com/law/cpucproceedings.nsf/vwOtherProceedings?Openview&Start=1&Count=25
149 D.08-04-050.
150 Detailed information on RA protocols obtained from "San Diego Gas & Electric Company Response to Administrative Law Judge's Ruling Requesting Applicants to Provide Additional Information," p. 14, and communication with Kathryn Smith, SDG&E.
2012 SCE Resource Adequacy Protocols - Program Details148
Summer Discount Plan (AC Cycling) & Peak Time Rebate (PTR):
• Time of day (hour)
• Day of week
• Variables for Monday and Friday
• Month
• Cooling degrees
• Heating degrees
The first step in the process is the development of a regression model. The model used in the analysis
includes the following input variables: temperature, day of week, month, and participant loads prior to
the DR event (i.e., participant loads at 10 a.m.). A 1-in-2 weather year condition was used as an input
variable in the regression model; it represents the monthly peak day temperature for an average year.
SDG&E utilized 2003-2011 historical weather data to calculate monthly system peak temperatures. In the
event that DR program enrollment, baselines, or the number of DR events changed significantly, data
from prior years was utilized. Regression variable coefficients from the 2011 ex post model were utilized
in the 2012 RA forecast model.
After the impact-per-participant regression model is developed, the model is re-run with average
monthly peak temperature values. The output is the historical load impact per participant as a function of
weather.
SDG&E’’s Forecast of the Number of Participants per Program
The forecasted number of participants per DR program is obtained by examining historical trends and
program designed change.
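As a rough illustration of the arithmetic described above (a staff-style sketch, not SDG&E's model), the monthly RA value for a program is the per-participant impact predicted at the 1-in-2 monthly peak temperature multiplied by the enrollment forecast:

```python
# Illustrative sketch of the RA forecast arithmetic; the function name and the
# example numbers are hypothetical, not SDG&E values.
def ra_forecast_mw(impact_per_participant_kw, forecast_participants):
    """Monthly RA load impact (MW) for one DR program."""
    return impact_per_participant_kw * forecast_participants / 1000.0

# Example: a 0.35 kW per-participant impact and 120,000 forecasted participants.
print(ra_forecast_mw(0.35, 120_000))   # 42.0 MW
```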
2012 SDG&E Resource Adequacy Protocols - Program Details151

ACSAVER:
• 1-in-2 weather data for the monthly system peak day
• Enrollment estimates by customer type (residential and commercial) and by cycling option (Res - 50%, 100% cycling; Com - 30%, 50% cycling)

BIP A:
• Time of day, day of week, month, temperature (shape and trend variables, and interaction terms, designed to track variation in load across days of the week and hours of the day)
• Forecasted load in the absence of a DR event (i.e., the reference load)
• Participant's Firm Service Level
• Estimates of over- or under-performance
• TOU period variables (binary variables representing when the underlying TOU rates changed during the day and season)

CBP DA/DO:
• Simulated per-customer reference loads under a 1-in-2 weather year condition and event type scenarios (e.g., typical event, or monthly system peak day)
• Estimates of reference loads and percentage load impacts, on a per-enrolled-customer basis, based on modified versions of the ex post load impact regressions
• Estimated percentage load impacts combined with program enrollment forecasts from SDG&E to develop alternative forecasts of aggregate load impacts
• Forecasts were developed at the program and program type (e.g., DA and DO) level

CPP-D:
• Load impacts for existing CPP-D customers were prepared for 2010-2020 based on per-customer reference loads and load impact estimates from the ex post evaluation, and enrollment forecasts
• The enrollment forecast for CPP-D is calculated using opt-out rates by NAICS

CPP-E:
• Forecast is based on prior event data and accounts for temperature and customer growth
151 Details of RA protocols obtained from "Executive Summary of the 2011 SDG&E Measurement and Evaluation Load Impact Reports," http://www.sdge.com/sites/default/files/regulatory/SDGE_PY2011_LoadImpactFiling_ExecutiveSummary%20final.pdf
PTR Com & Res:
There are five major assumptions required to compute the expected PTR load reduction from
residential customers: 1) the meter deployment rate, 2) the rebate price, 3) the participation
rates, 4) the average load, and 5) the elasticities, which determine the percent impact per customer
when combined with the prices.
Average load is based upon SDG&E's load research and daily load profile data.
• Average daily energy use per hour in the peak and off-peak periods
• Elasticity of substitution between peak and off-peak energy use
• Average price during the peak and off-peak pricing periods
• Change in elasticity of substitution due to weather sensitivity
• Average cooling degrees per hour during the peak period
• Change in elasticity of substitution due to the presence of central air conditioning
2012 Adjusted RA
The DR load impact for 2012 Adjusted RA is a monthly estimate of the expected load reduction
attributed to individual DR programs that accounts for current customer enrollment. This value is utilized
in load resource planning.
SCE’’s Methodology
Adjusted RA is calculated by taking the 2012 RA value and dividing by the 2012 RA enrollment to get
the average RA load impact per customer. The average RA load impact per customer is multiplied by the
number of ex post customers that were dispatched. The adjusted RA value accounts for the difference
between the number of customers forecasted for RA and the number of customers actually enrolled
during the ex post events; i.e. the adjusted RA represents what RA would have been if SCE had had
perfect knowledge of enrollment for 2012
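A minimal sketch of this adjustment, using hypothetical numbers rather than SCE's actual RA values or enrollments, is shown below:

```python
# Adjusted RA = (RA MW / RA enrollment) x customers actually dispatched.
def adjusted_ra_mw(ra_mw, ra_enrollment, dispatched_customers):
    impact_per_customer_mw = ra_mw / ra_enrollment
    return impact_per_customer_mw * dispatched_customers

# Example: a 500 MW RA forecast built on 400,000 customers, with 360,000
# customers actually enrolled and dispatched during ex post events.
print(adjusted_ra_mw(500.0, 400_000, 360_000))   # 450.0 MW
```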
SDG&E’’s Methodology
The adjusted 2012 RA load forecast is obtained by multiplying the 2012 RA impact per customer by
the number of current enrolled customers. SDG&E did not adjust its 2012 RA load forecast for weather or
other variables.
DR Daily Forecast and CAISO's 7-Day Report
The daily forecast is intended to provide an estimate of the expected hourly load reduction per DR
program during an event period.
The CAISO's 7-Day Reports provide load reduction data that is calculated and reported to the CAISO
seven days after a DR event.
SCE’’s Methodology
AC Cycling
SCE’’s daily forecast for the Summer Discount Plan is calculated using an algorithm derived from a 1985
AC cycling load reduction analysis report. The algorithm is a linear equation:
MW Reduction = [a + b x (T x k)] x t
Where:
T = Temperature forecasted for the following business day in Covina, CA
t = Air conditioner tonnage available for cycling
k = Temperature adjustment factor
a = Constant adjustment factor
b = Slope adjustment factor
When the temperature in Covina is below 70 degrees, the assumption is that no AC Cycling DR is
available and thus no forecast is made. Specific values for a, b, and k are disclosed in a 1986 SCE internal
memo for four SCE service-area weather zones and for the 50% and 100% cycling strategies.152 Adjustments
are made to the algorithm based on the air conditioner tonnage available for cycling. The algorithm
is only valid for event-day temperatures between 90 and 116 degrees.
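The sketch below illustrates the shape of this forecast equation. The coefficient values shown are placeholders chosen only for illustration; SCE's actual a, b, and k values come from its 1986 internal memo and were not available to staff for this draft:

```python
# MW Reduction = [a + b * (T * k)] * t, applied per weather zone and cycling strategy.
def sdp_daily_forecast_mw(forecast_temp_f, tonnage, a, b, k):
    """Forecast MW reduction for one zone/cycling strategy for the next business day."""
    if forecast_temp_f < 70:
        # Below 70 degrees, no AC cycling DR is assumed to be available.
        return 0.0
    return (a + b * (forecast_temp_f * k)) * tonnage

# Placeholder coefficients and a hypothetical 50,000 tons of enrolled AC load
# at a 95-degree Covina forecast:
print(sdp_daily_forecast_mw(95.0, 50_000, a=-0.0005, b=0.00002, k=1.0))   # approximately 70 MW
```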
As of this draft, the 1985 AC cycling load reduction analysis report has not been provided to ED staff.
Consequently, ED staff has not been able to examine the specific slope, constant, and temperature
adjustment values.
SCE used a modification of this algorithm to accommodate the hourly forecasts requested by the CAISO
prior to August 28, 2012. The modified methodology calculates program load reduction estimates using a
temperature input of 100 degrees and scales them based on actual temperatures below 100 degrees. Toward
the end of the summer, the legacy algorithm was built into a system in which the temperatures could be
applied by hour across the different zones requested by the CAISO.
SCE’’s 7 day report for the Summer Discount Plan is calculated using the AC cycling load reduction
algorithm with a temperature input based on actual temperatures in Covina CA. When the temperature in
Covina CA is below 70 degrees, the assumption is that no AC Cycling DR is available. Adjustments are made
based on enrollment and temperature.
SDG&E’’s 7 day results reports for the AC Saver program are calculated using a one or two day baseline
with adjustments based on same day or historical days with the most similar weather conditions to the
event day. The 7 day report results provided to the CAISO are hourly but the event day results average
results from 1p.m. 6 p.m. for events including those hours and the average results over the event period for
events not including all of the hours 1p.m. 6p.m.
Peak Time Rebate
SCE's daily forecast and 7-Day Report for the Save Power Day peak time rebate program are calculated by
multiplying the population of residential customers actively enrolled in Save Power Day event notification
by a forecasted average load drop of 0.229 kW per participant.
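This is a single multiplication; the sketch below uses a hypothetical enrollment figure purely to show the arithmetic:

```python
# Save Power Day daily forecast: notified enrollment x 0.229 kW average drop.
AVERAGE_DROP_KW = 0.229

def save_power_day_forecast_mw(notified_customers):
    return notified_customers * AVERAGE_DROP_KW / 1000.0

# Hypothetical: 470,000 residential customers enrolled in event notification.
print(save_power_day_forecast_mw(470_000))   # about 107.6 MW
```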
SDG&E’’s methods for developing the daily forecast and 7 day report for the residential peak time
rebate program are the same as those described above for the AC Cycling program.
DR Contracts
SCE’’s daily forecast and 7 day report for DR Contracts program are calculated as the current month's
contract capacity with no adjustments are made for enrollment, temperature, or other factors.
152 See Appendix S.
SDG&E’’s Methodology
Daily Forecast
The daily forecast is calculated in two steps.
The first step is the creation of a regression model that predicts the entire load of participating
customers. Model input variables include temperature, day of week and month. Temperature inputs
utilized in the regression model are the monthly peak temperatures from the prior year. In some instances,
the load forecast may be scaled up or down according to the number of currently enrolled participants and
their impact on on peak load. In some instances, if large customers leave a program, the load forecast
regression is re run with participants that are still enrolled in the program.
The second step in the process is to multiply the estimated load of participating customers by a fixed
percentage load reduction that is based upon ex post results from the previous year.
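A minimal sketch of these two steps is shown below. The regression is replaced with a simple placeholder function and the inputs are hypothetical; staff has not inspected SDG&E's actual model:

```python
# Two-step daily forecast: (1) predict total participant load, (2) apply a fixed
# percentage reduction taken from the prior year's ex post results.
def predicted_program_load_mw(temperature_f, day_of_week, month):
    # Placeholder standing in for SDG&E's regression of participating-customer load;
    # a real model would be fit to historical interval data.
    return 50.0 + 1.2 * max(temperature_f - 75.0, 0.0)

def daily_forecast_mw(temperature_f, day_of_week, month, pct_reduction):
    return predicted_program_load_mw(temperature_f, day_of_week, month) * pct_reduction

# Hypothetical: a 92-degree prior-year monthly peak and a 15% ex post reduction.
print(round(daily_forecast_mw(92.0, "Tue", 8, pct_reduction=0.15), 2))   # 10.56 MW
```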
CAISO’’s 7 day Report
Load reductions detailed in CAISO’’s 7 day Report are calculated by subtracting an estimated baseline
from the measured load during DR event hours. SDG&E utilizes 10 working days prior to an event to
calculate an estimated baseline for its CPP, CBP, CPP E, and BIP programs. For its residential programs,
SDG&E utilizes 1 to 2 days to calculate its estimated baseline. The exception is that if the PTR event occurs
on a Monday, then data from the prior work week (excluding event days) is used.
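The day-matching baseline arithmetic described above can be sketched as follows (illustrative only, with made-up interval data; this is not SDG&E's settlement code):

```python
# Baseline for an event hour = average load in that hour over the prior
# baseline days (10 working days for CPP/CBP/CPP-E/BIP; 1-2 days for
# residential programs). Load reduction = baseline - metered event-hour load.
def baseline_kw(prior_day_loads_kw):
    return sum(prior_day_loads_kw) / len(prior_day_loads_kw)

def load_reduction_kw(prior_day_loads_kw, event_hour_load_kw):
    return baseline_kw(prior_day_loads_kw) - event_hour_load_kw

# Hypothetical hour-ending-16 loads (kW) on ten prior working days, and a
# 4,050 kW metered load during the event hour:
prior = [4510, 4480, 4620, 4390, 4555, 4600, 4475, 4530, 4490, 4450]
print(load_reduction_kw(prior, 4050))   # 460.0 kW of load reduction
```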
As of this draft, Energy Division staff has not inspected SDG&E's regression model, model inputs, or the
cases where comparisons and judgment were applied to scale forecasts up or down.
Ex Post Results and Settlement Data
Ex Post Results
An ex post result is the measurement of MW delivered, estimated using regression methods. Regression
methods use an entire season's data, and data across multiple events, to improve the accuracy of impact
estimates. They rely on historical information about customer loads and focus on understanding the
relationship between load, or load impacts, during hours of interest and other predictor variables (e.g.,
temperature, population characteristics, resource effects, and observed loads in the hours preceding the
DR event). Whenever ex ante estimation is required, regression analysis is generally the preferred method
because it can incorporate the impact of a wide variety of key drivers of DR.
DR load impact estimates are determined directly from the regression model. Decision 08-04-050
adopts protocols for estimating the impact of DR activities for resource planning.
The purpose of the ex post results is to inform DR resource planning and program design.
Settlement Data
Day matching is the primary approach used to calculate customer settlement for DR options involving
large commercial and industrial customers. Settlements refer to the methods of paying customers for
participating in a DR program, and they are an important component of DR program design and
implementation. The need to produce estimates in a short time frame after an event, so that payments
can be made promptly, limits the amount of data collected. The ability to forecast future impacts of DR
events is also limited because day matching does not collect data on influential variables (e.g., weather
conditions, seasonal factors, customer population characteristics) that would cause impacts to vary in the
future.
SCE Methodology
Load impact is calculated as the difference between the reference load (baseline) and the observed load
(usage). The purpose of the settlement data is to calculate payment to customers.
Appendix D: SCE 2012 Monthly Average DR Program Load Impact (MW)153
with RA Measurement Hours (1 6 p.m.)
Month
2012 RA
(1)
2012 Adjusted RA
Daily
Forecast
7 Day
Report
Year End Ex PostEnrollment
(2)
Enrollment
&
Weather(3)
Monthly Nominated Programs
Capacity Bidding Program (Day Ahead)
June 1.19 No Events N/A No Events No Events No Events No Events
July 1.24 0.60 N/A 0.07 0.07 0.08 0.07
August 1.27 No Events N/A No Events No Events No Events No Events
September 1.23 No Events N/A No Events No Events No Events No Events
October 1.18 0.50 N/A 0.09 0.09 0.04 0.01
Capacity Bidding Program (Day Of)
June 17.56 No Events N/A No Events No Events No Events No Events
July 18.21 14.10 N/A 11.74 11.74 14.84 15.28
August 18.63 12.85 N/A 12.30 12.30 15.38 16.66
September 18.49 12.51 N/A 11.90 11.90 14.65 16.21
October 17.25 11.47 N/A 11.72 11.72 15.02 14.78
Demand Bidding Program
June 11.49 No Events N/A No Events No Events No Events No Events
July 12.05 2.91 N/A 74.65 85.59 96.09 90.21
August 12.39 3.02 N/A 88.35 77.14 76.81 72.43
September 12.24 No Events N/A No Events No Events No Events No Events
October 12.27 2.90 N/A 78.90 71.67 90.33 79.52
Demand Response Contracts (Day Ahead & Day Of)
June 99.15 No Events N/A No Events No Events No Events No Events
July 102.51 No Events N/A No Events No Events No Events No Events
August 104.74 166.28 N/A 275.00 275.00 174.79 182.05
September 103.56 No Events N/A No Events No Events No Events No Events
October 100.22 139.10 N/A 185.00 185.00 122.11 114.90
Other Price Responsive Programs
Save Power Days / Peak Time Rebates
June 207.89 No Events N/A No Events No Events No Events No Events
July 256.82 74.02 N/A N/A 58.76 58.76 N/A
August 265.60 120.13 N/A N/A 108.02 108.02 35.56
September 238.08 107.66 N/A 108.59 108.62 108.62 10.73
October 202.43 No Events N/A No Events No Events No Events No Events
153
SCE 03, Table 1.
Appendix D: SCE 2012 Monthly Average DR Program Load Impact (MW) (Cont.)
with RA Measurement Hours (1 6 p.m.)
Month
2012 RA
(1)
2012 Adjusted RA
Daily
Forecast
7 Day
Report
Year End Ex PostEnrollment
(2)
Enrollment
&
Weather(3)
Summer Advantage Incentive Program / Critical Peak Pricing (CPP)
June 66.49 63.23 N/A 42.59 49.00 49.93 27.68
July 69.31 65.64 N/A 52.40 51.72 61.65 39.95
August 68.57 65.14 N/A 52.00 42.46 46.72 38.50
September 65.08 61.66 N/A 46.76 40.09 42.69 35.90
October 62.86 No Events N/A No Events No Events No Events No Events
Summer Discount Plan (Residential)
June 462.15 168.00 N/A 161.51 137.87 137.87 69.60
July 545.82 538.42 N/A 263.67 158.81 158.81 188.00
August 500.00 454.95 N/A 227.94 162.46 150.95 211.90
September 519.53 514.47 N/A 254.06 254.06 254.06 133.02
October 0.00 N/A N/A 292.62 292.62 292.62 101.95
Emergency Programs
Summer Discount Plan (Commercial)
June 33.62 No Events N/A No Events No Events No Events No Events
July 48.30 No Events N/A No Events No Events No Events No Events
August 62.43 4.99 N/A 4.77 3.43 3.43 3.10
September 53.70 No Events N/A No Events No Events No Events No Events
October 0.00 N/A N/A No Events No Events No Events No Events
Agriculture Pumping Interruptible
June 41.26 No Events N/A No Events No Events No Events No Events
July 39.66 No Events N/A No Events No Events No Events No Events
August 39.78 15.57 N/A 36.00 36.00 15.34 17.29
September 37.71 23.73 N/A 60.56 60.56 28.39 24.00
October 39.58 No Events N/A No Events No Events No Events No Events
Base Interruptible Program
June 553.24 No Events N/A No Events No Events No Events No Events
July 542.67 No Events N/A No Events No Events No Events No Events
August 542.52 No Events N/A No Events No Events No Events No Events
September 548.21 558.24 N/A 513.78 520.91 441.46 573.01
Appendix E: SCE 2012 DR Program Load Impact by Event (MW)
Daily Average by Event Hours
Event Date
Daily
Forecast
7 Day
Report
Year End Ex Post
Monthly Nominated Programs
Capacity Bidding Program (Day Ahead)
7/23/2012 0.07 0.07 0.03 0.04
7/24/2012 0.07 0.07 0.08 0.09
7/25/2012 0.07 0.07 0.08 0.08
7/30/2012 0.07 0.07 0.11 0.06
7/31/2012 0.07 0.07 0.10 0.07
10/1/2012 0.09 0.09 0.24 0.20
10/2/2012 0.09 0.09 0.33 0.10
10/3/2012 0.09 0.09 0.12 0.05
10/5/2012 0.09 0.09 0.18 0.07
10/17/2012 0.09 0.09 0.00 0.07
10/18/2012 0.09 0.09 0.02 0.17
10/29/2012 0.09 0.09 0.19 0.15
Capacity Bidding Program (Day Of)
7/20/2012 11.74 11.74 14.84 15.28
8/7/2012 12.30 12.30 14.92 16.46
8/13/2012 12.30 12.30 15.22 15.70
8/14/2012 12.30 12.30 16.01 17.82
9/14/2012 11.90 11.90 14.65 16.21
10/2/2012 11.72 11.72 14.24 15.80
10/18/2012 11.72 11.72 15.80 13.76
Demand Bidding Program
7/12/2012 74.65 85.59 96.09 90.21
8/8/2012 85.59 102.63 100.67 92.95
8/10/2012 85.59 94.84 98.76 95.82
8/14/2012 94.09 70.89 66.96 61.76
8/16/2012 94.35 55.50 56.16 62.70
8/29/2012 82.15 61.84 61.51 48.94
10/1/2012 78.75 80.85 98.54 79.78
10/17/2012 79.05 62.49 82.12 79.25
Demand Response Contracts (Day Ahead & Day Of)
8/14/2012 275.00 275.00 174.79 182.05
10/2/2012 185.00 185.00 122.11 114.90
Appendix E: SCE 2012 DR Program Load Impact by Event (MW) (Cont.)
Daily Average by Event Hours
Event Date
Daily
Forecast
7 Day
Report
Year End Ex Post
Other Price Responsive Programs
Save Power Days / Peak Time Rebates
7/12/2012 N/A 58.76 58.76
8/10/2012 N/A 107.24 107.24 95.85
8/16/2012 N/A 107.61 107.61 24.43
8/29/2012 N/A 108.51 108.51 21.93
8/31/2012 N/A 108.73 108.73 0.02
9/7/2012 108.66 108.66 108.66 23.11
9/10/2012 108.52 108.57 108.57 1.65
Summer Advantage Incentive Program / Critical Peak Pricing (CPP)
6/29/2012 42.59 49.00 49.93 27.68
7/12/2012 49.00 62.40 80.14 41.53
7/23/2012 55.79 41.05 43.17 38.36
8/7/2012 50.91 48.57 54.29 33.48
8/9/2012 50.91 53.07 59.96 39.14
8/13/2012 50.54 46.70 55.98 42.96
8/20/2012 53.21 44.04 44.52 45.19
8/27/2012 53.21 23.59 25.02 34.41
8/29/2012 53.21 38.79 40.55 35.85
9/10/2012 47.36 48.60 52.04 42.26
9/20/2012 47.36 26.92 30.09 27.42
9/28/2012 45.55 44.75 45.95 38.00
Summer Discount Plan (Residential)
6/20/2012 Group 1 128.01 8.23 8.23 0.50
6/29/2012 Group 1 178.26 41.89 41.89 35.80
6/29/2012 Group 2 178.26 87.75 87.75 33.30
7/10/2012 Group 1 263.67 29.17 29.17 44.70
7/10/2012 Group 2 263.67 41.89 41.89 66.60
7/10/2012 Group 3 263.67 87.75 87.75 76.70
8/1/2012 Group 1 60.50 29.17 29.17 49.10
8/1/2012 Group 2 46.40 29.56 29.56 56.40
8/1/2012 Group 3 58.60 46.63 46.63 57.10
8/3/2012 Group 1 60.50 29.17 29.17 35.70
8/3/2012 Group 2 54.90 21.83 21.83 65.60
8/3/2012 Group 3 58.60 46.63 46.63 46.00
8/8/2012 Group 1 135.52 67.69 67.69 104.60
8/8/2012 Group 2 133.55 66.33 66.33 100.00
8/8/2012 Group 3 151.14 98.88 98.88 128.40
Appendix E: SCE 2012 DR Program Load Impact by Event (MW) (Cont.)
Daily Average by Event Hours
Event Date
Daily
Forecast
7 Day
Report
Year End Ex Post
Other Price Responsive Programs
Summer Discount Plan (Residential) (cont.)
8/9/2012 Group 1 151.14 67.69 67.69 125.90
8/9/2012 Group 2 121.12 66.33 66.33 107.20
8/9/2012 Group 3 118.06 98.88 98.88 121.20
8/14/2012 Group 1 130.40 194.47 61.14 119.40
8/14/2012 Reliability 17.42 8.15 3.43 13.50
8/15/2012 Group 1 116.01 88.62 88.62 74.30
8/15/2012 Group 2 75.10 42.35 42.35 84.20
8/15/2012 Group 3 77.77 40.44 40.44 77.50
8/17/2012 Group 1 101.30 102.53 102.53 153.00
8/17/2012 Group 2 58.00 42.25 42.25 98.30
8/21/2012 Group 1 61.87 53.44 53.44 72.70
8/21/2012 Group 2 62.65 29.93 29.93 83.40
8/21/2012 Group 3 50.70 29.39 29.39 57.50
8/22/2012 Group 1 115.03 29.39 29.39 42.40
8/22/2012 Group 2 75.11 29.93 29.93 67.20
8/22/2012 Group 3 101.25 47.12 47.12 58.50
8/28/2012 Group 1 129.54 129.54 129.54 76.30
8/28/2012 Group 2 83.86 83.86 83.86 88.20
8/28/2012 Group 3 71.90 71.90 71.90 81.30
8/29/2012 Group 1 82.56 82.60 82.60 80.30
8/29/2012 Group 2 66.42 66.40 66.40 91.70
8/29/2012 Group 3 108.42 108.40 108.40 125.90
9/10/2012 Group 1 72.72 72.72 72.72 92.40
9/10/2012 Group 2 77.52 77.52 77.52 69.00
9/10/2012 Group 3 18.98 18.98 18.98 68.40
9/14/2012 Group 1
110.89 110.89 110.89 37.80
9/14/2012 Group 2
9/14/2012 Group 3
99.32 99.32 99.32 17.80
9/14/2012 Group 4
9/14/2012 Group 5
135.61 135.61 135.61 20.70
9/14/2012 Group 6
9/20/2012 Group 1
65.73 65.73 65.73 21.90
9/20/2012 Group 2
9/20/2012 Group 3
77.39 77.39 77.39 14.60
9/20/2012 Group 4
9/20/2012 Group 5 65.53 65.53 65.53 21.10
9/20/2012 Group 6
Appendix E: SCE 2012 DR Program Load Impact by Event (MW) (Cont.)
Daily Average by Event Hours
Event Date
Daily
Forecast
7 Day
Report
Year End Ex Post
Other Price Responsive Programs
Summer Discount Plan (Residential) (cont.)
9/21/2012 Group 1 130.98 130.98 130.98 67.00
9/21/2012 Group 2 168.96 168.96 168.96 69.10
9/21/2012 Group 3 105.16 105.16 105.16 77.10
9/28/2012 Group 1
43.16 43.16 43.16 29.30
9/28/2012 Group 2
9/28/2012 Group 3
55.06 55.06 55.06 24.50
9/28/2012 Group 4
9/28/2012 Group 5
43.28 43.28 43.28 34.40
9/28/2012 Group 6
10/2/2012 Group 1 298.91 298.91 298.91 86.20
10/2/2012 Group 2 198.32 198.32 198.32 130.90
10/17/2012 Group 1 127.25 127.25 127.25 62.30
10/17/2012 Group 2 146.77 146.77 146.77 72.30
10/17/2012 Group 3 92.50 92.50 92.50 56.10
10/18/2012 Group 1 154.37 154.37 154.37 N/A
10/18/2012 Group 2 58.71 58.71 58.71 N/A
10/26/2012 Group 1 38.65 38.65 38.65 N/A
10/26/2012 Group 2 47.23 47.23 47.23 N/A
10/26/2012 Group 3 7.77 7.77 7.77 N/A
Emergency Programs
Summer Discount Plan (Commercial)
8/14/2012 4.77 3.43 3.43 3.10
Agriculture Pumping Interruptible
8/14/2012 36.00 36.00 15.34 17.29
9/26/2012 60.56 60.56 28.39 24.00
Base Interruptible Program
9/26/2012 513.78 520.91 441.46 573.01
Appendix F: SDG&E 2012 Monthly Average DR Program Load Impact (MW)
with RA Measurement Hours (1 6 p.m.)
Program
Month 2012
RA
2012 Adjusted RA
Daily
Forecast
7 Day
Report
Ex
Post
Settlement
Enrollment
Enrollment &
Weather
Emergency Programs
BIP A 6 10 3 N/A N/A N/A N/A N/A
BIP A 7 11 4 N/A N/A N/A N/A N/A
BIP A 8 10 3 N/A N/A N/A N/A N/A
BIP A 9 11 3 N/A 0.34 1.3 0.84 N/A
BIP A 10 10 3 N/A N/A N/A N/A N/A
Monthly Nominated
CBP DA 6 9 8 N/A N/A N/A N/A N/A
CBP DA 7 10 8 N/A N/A N/A N/A N/A
CBP DA 8 10 9 N/A 8 9 8 9
CBP DA 9 10 8 N/A 9 7 7 7
CBP DA 10 10 8 N/A 9 8 4 8
CBP DO 6 20 10 N/A N/A N/A N/A N/A
CBP DO 7 22 10 N/A N/A N/A N/A N/A
CBP DO 8 22 10 N/A 12 11 10 11
CBP DO 9 23 11 N/A 12 10 11 10
CBP DO 10 23 10 N/A 12 10 9 10
Price Responsive
ACSAVER 6 7 7 N/A N/A N/A N/A N/A
ACSAVER 7 12 12 N/A N/A N/A N/A N/A
ACSAVER 8 15 14 N/A 27 18 19 N/A
ACSAVER 9 17 18 N/A 13 12 15 N/A
ACSAVER 10 18 18 N/A 15 9 18 N/A
CPP 6 12 16 N/A N/A N/A N/A N/A
CPP 7 15 18 N/A N/A N/A N/A N/A
CPP 8 12 15 N/A 14 20 19 N/A
CPP 9 12 14 N/A 14 6 14 N/A
CPP 10 14 16 N/A 16 16 16 N/A
DBP 6 N/A N/A N/A N/A N/A N/A N/A
DBP 7 N/A N/A N/A N/A N/A N/A N/A
DBP 8 N/A N/A N/A 5 8 5 8
DBP 9 N/A N/A N/A 5 9 5 9
DBP 10 N/A N/A N/A 5 8 5 8
PTR Com 6 N/A N/A N/A N/A N/A N/A N/A
PTR Com 7 N/A N/A N/A 2 0 0 31
PTR Com 8 N/A N/A N/A 1 4 0 37
PTR Com 9 N/A N/A N/A 1 0 0 33
PTR Com 10 N/A N/A N/A N/A N/A N/A N/A
PTR Res 6 46 46 N/A N/A N/A N/A N/A
PTR Res 7 70 70 N/A 24 13 6 160
PTR Res 8 69 69 N/A 15 21 2 286
PTR Res 9 63 63 N/A 32 46 8 298
PTR Res 10 52 52 N/A N/A N/A N/A 286
Appendix G: SDG&E 2012 DR Program Load Impact by Event (MW)
Daily Average by Event Hours
Program Event Date Daily Forecast 7 Day Report Ex Post Settlement
Emergency Programs
BIP A 9/14/2012 0.3 1.3 0.8 N/A
CPPE 8/13/2012 2.3 1.5 1.2 N/A
CPPE 9/14/2012 1.6 1.4 0.9 N/A
Monthly Nominated
CBP DA 8/9/2012 7.5 9.3 7.5 9.4
CBP DA 8/10/2012 7.5 9.5 7.6 9.5
CBP DA 8/14/2012 7.5 8.3 7.5 8.5
CBP DA 9/14/2012 9 5.8 5.7 5.9
CBP DA 9/17/2012 9 8 7.9 8.4
CBP DA 10/1/2012 9 7 4.1 7.3
CBP DA
10/2/2012 9 8 4.2 8.7
CBP DO 8/8/2012 11.7 11.2 11 11.5
CBP DO 8/13/2012 11.7 10.6 8.5 10.6
CBP DO 9/13/2012 12.1 10.5 10.6 10.7
CBP DO 9/14/2012 12.1 9.9 10.6 10.1
CBP DO 10/1/2012 12.1 9.5 9.2 9.5
Price Responsive
ACSAVER 8/8/2012 26.3 13.7 14 N/A
ACSAVER 8/10/2012 27.2 19.8 18.5 N/A
ACSAVER 8/13/2012 33.3 18.2 21.4 N/A
ACSAVER 8/17/2012 19.3 20.6 22.7 N/A
ACSAVER 9/13/2012 16 12.8 12.6 N/A
ACSAVER 9/14/2012 15.5 21.5 22.5 N/A
ACSAVER 9/15/2012 8.6 3.1 8.8 N/A
ACSAVER 10/1/2012 14.5 9.2 18 N/A
CPP 8/9/2012 13.5 20.9 15.9 N/A
CPP 8/11/2012 11.7 12.3 18.4 N/A
CPP 8/14/2012 14.3 27.1 25.9 N/A
CPP 8/21/2012 16.5 20 17.2 N/A
CPP 8/30/2012 16.2 20.3 17.8 N/A
CPP 9/15/2012 13.7 5.5 14.5 N/A
CPP 10/2/2012 16 16.1 16.5 N/A
PTR Com 7/20/2012 2 0.1 0 31.2
PTR Com 8/9/2012 1.2 0.3 0 27.4
PTR Com 8/10/2012 1.1 8 0 37.5
PTR Com 8/11/2012 0.8 0 0 26.2
PTR Com 8/14/2012 1.2 4.8 0 29.8
PTR Com 8/21/2012 1.2 4.5 0 62
PTR Com 9/15/2012 0.9 0 0 32.8
PTR Res 7/20/2012 23.9 13.3 6.3 160.1
PTR Res 8/9/2012 13.1 26.1 3.3 202.8
PTR Res 8/10/2012 12.6 28.1 3.2 197
PTR Res 8/11/2012 12.2 33.6 1.7 231.1
PTR Res 8/14/2012 12.5 6.9 1.1 240
PTR Res 8/21/2012 25 10 3 559
PTR Res 9/15/2012 32.3 45.8 8.3 298
Appendix H: SCE 2012 DR Program Overview
Program Type
Program
Season
Available
Annual
Events/Hours
Available
Monthly
Events/Hours
Available
Weekly
Events/Hours
Available
Daily
Events/Hours
# of Events
Triggered/
# of Hours
Available
Remaining
Available
Trigger Criteria
2012
Trigger
Condition
Agricultural
Pumping
Interruptible (API)
Day Of
Year Round
(excluding
Holidays)
150 Hours 25 Events 4 Events
1 Event
6 Hours Max
2 Events
7.1 Hours
143 Hours
•• CAISO Stage 1 Alert
•• CAISO Stage 2 Alert
•• SCE Grid Control
Center Discretion
•• Measurement &
Evaluation
•• System
Emergency
(San Joaquin
Valley)
•• Measurement
& Evaluation
Base Interruptible
Program (BIP)
Day Of
Year Round
(excluding
Holidays)
180 Hours 10 Events No Limit
1 Event
6 Hours Max
1 Event
2 Hours
178 Hours
•• CAISO Stage 1 Alert
•• CAISO Stage 2 Alert
•• SCE Grid Control
Center Discretion
•• Measurement &
Evaluation
•• Measurement
& Evaluation
Capacity Bidding
Program
Day
Ahead
May –– Oct
(excluding
Holidays)
No Limit 24 Hours Mon Fri
1 Event
8 Hours
(11am –– 7pm)
12 Events
July –– 17
Hrs
Oct –– 22
Hrs
May –– 24
Hrs
June –– 24
Hrs
July –– 7 Hrs
Aug –– 24 Hrs
Sep –– 24 Hrs
Oct –– 2 Hrs
•• High temperature
•• Resource
limitations
•• A generating unit
outage
•• Transmission
constraints
•• CAISO Alert or
Warning
•• SCE System
Emergency
•• Measurement &
Evaluation
•• Heat Rate
Appendix H: SCE 2012 DR Program Overview (Cont.)
Program Type
Program
Season
Available
Annual
Events/Hours
Available
Monthly
Events/Hours
Available
Weekly
Events/Hours
Available
Daily
Events/Hours
# of Events
Triggered/
# of Hours
Available
Remaining
Available
Trigger Criteria
2012
Trigger
Condition
Capacity Bidding
Program
Day Of
May –– Oct
(excluding
Holidays)
No Limit 24 Hours No Limit
1 Event
4,6, or 8 hour
event
duration
options
7 Events
July –– 3 Hrs
Aug –– 12 Hrs
Sept –– 6 Hrs
Oct –– 10 Hrs
May –– 24 Hrs
Jun –– 24 Hrs
Jul –– 21 Hrs
Aug –– 12 Hrs
Sep –– 18 Hrs
Oct –– 14 Hrs
•• High temperature
•• Resource
limitations
•• A generating unit
outage
•• Transmission
constraints
•• CAISO Alert or
Warning
•• SCE System
Emergency
•• Measurement &
Evaluation
•• Heat Rate
Demand Bidding
Program
Day
Ahead
Year Round
(excluding
Holidays)
No Limit No
No Limit
Mon Fri
1 Event
8 hours
8 Events
64 Hours
No Limit
•• CAISO Alert or
Warning
•• Day Ahead load
and/or Price Forecast
•• Extreme or unusual
temperature
conditions
•• SCE Procurement
needs
•• Measurement &
Evaluation
•• Heat Rate
DR Contracts
Day
Ahead
Varies
Varies by
Contract
Varies by
Contract
Varies by
Contract
Varies by
Contract
1 Event
2 Hours
Varies by
Contract
Varies by Contract
•• Peak Load
Forecast
DR Contracts Day Of Varies
Varies by
Contract
Varies by
Contract
Varies by
Contract
Varies by
Contract
2 Events
5 Hours
Varies by
Contract
Varies by Contract
•• Energy Prices
•• Peak Load
Forecast
Appendix H: SCE 2012 DR Program Overview (Cont.)
Program Type
Program
Season
Available
Annual
Events/Hour
s
Available
Monthly
Events/Hour
s
Available
Weekly
Events/Hour
s
Available
Daily
Events/Hour
s
# of Events
Triggered/
# of Hours
Available
Remaining
Available
Trigger Criteria
2012
Trigger
Condition
Save Power Day
Day
Ahead
Year Round
(excluding
Holidays)
No Limit No Limit No Limit
1 Event
4 Hours
(2pm –– 6pm)
7 Events
28 Hours
No Limit •• Temperature
••
Temperature
Summer
Advantage
Incentive
Day Of
June –– Sep
(excluding
Holidays)
60 Hours
Min: 9 Events
Max: 15
Events
No Limit No Limit
1 Event
4 Hours
(2pm –– 6pm)
12 Events
48 Hours
3 Events
•• Temperature
•• CAISO Alert or
Warning
•• SCE System
Forecast
•• Extreme or unusual
temperature
conditions
•• Day Ahead load
and/or Price
Forecast
•• High
Temperature
•• Peak Load
Forecast
•• Day Ahead
load and/or
Price Forecast
Summer Discount
Plan Residential
Day Of
Year Round
(excluding
Holidays)
Unlimited
Events
180 Hours
No Limit No Limit
Unlimited
Events
6 Hours
23 Events
24 Hours
156 Hours
•• CAISO Alert or
Warning
•• CAISO Discretion
•• SCE Grid Control
Center Discretion
•• SCE Energy
Operations Center
Discretion
•• Measurement &
Evaluation
•• CAISO
Emergency
•• Heat Rate
••
Measurement
& Evaluation
Summer Discount
Plan –– Commercial
Day Of
Year Round
(excluding
Holidays)
Base –– 90
Hours
Enhanced ––
Unlimited
No Limit No Limit 6 Hours
1 Event
5.6 Hours
No Limit
•• CAISO Stage 1 Alert
•• CAISO Stage 2 Alert
•• SCE Grid Control
Center Discretion
•• Measurement &
Evaluation
•• CAISO
Emergency
Appendix I: SDG&E DR Program Overview
Program Type Program Season
Available Annual
Events/Hours
Available
Monthly
Events/Hours
Available Weekly
Events/Hours
Available Daily
Events/Hours
# of Events
Triggered
Available
Remaining
Trigger Criteria Trigger Condition
1 Event
Temperature and
system load
Always
*Monday: 86 ; 3472
MW
7 Hours
*Tues Fri: 84 ; 3837
MW
(11am 6pm)
*Saturday: 86 ; 3837
MW
May Oct 1 Event 7 Events Price:
Mon Fri Up to 8 Hours
Aug: 12 Hours
(3 events)
Aug: 32 Hours *Mon Friday only
(11am 7pm)
Sep: 8 Hours
(2 events)
Sep: 36 Hours
*Market Price equal
to or greater than
15,000 btu/kWh heat
rate
Oct: 8 Hours
(2 events)
Oct: 36 Hours
*Other Statewide or
local system
conditions
May Oct 1 Event 5 Events Price:
Day Of Mon Fri Up to 8 Hours
Aug:7 Hours
(2 events)
Aug: 37 Hours *Mon Friday only
(11am 7pm)
Sep: 8 Hours
(2 events)
Sep: 36 Hours
*Market Price equal
to or greater than
15,000 btu/kWh heat
rate
Oct: 4 hours
(1 event)
Oct: 40 Hours
*Other Statewide or
local system
conditions
1 Event 1 Event
CAISO forecasts a
Stage 1
1 ComplianceTest
Up to 4 Hours 4 Hours
CAISO declares a
Stage 2
2 Met trigger
criteria
CAISO calls for
interruptible load
Extreme weather or
system demands or
at SDGE discretion.
116 Hours
Base Interruptible
Program (BIP)
Day Of 30
minute
Year Round 120 Hours 10 Events
Capacity Bidding
Program (CBP)
No Limit 44 Hours No Limit
Mitigate potential
price spikes and
load forecast
above 4000 MW
and/or Real Time
Load came in
higher than Day
Ahead forecast
Mitigate potential
price spikes and
load forecast
above 4000 MW
Critical Peak
Pricing Default
(CPP D)
Day Ahead Year Round 18 Events No Limit No Limit 7 Events 11 Events
Met trigger criteria
for all 7 events
Capacity Bidding
Program (CBP)
Day Ahead No Limit 44 Hours No Limit
Appendix I: SDG&E DR Program Overview (Cont.)
Program Type Program Season
Available Annual
Events/Hours
Available
Monthly
Events/Hours
Available Weekly
Events/Hours
Available Daily
Events/Hours
# of Events
Triggered
Available
Remaining
Trigger Criteria Trigger Condition
May Oct 15 Events 1 Event 8 Events
Temperature and
system load
Holidays Excluded or Noon to 8 pm
Aug: 15 Hours
(4 events)
Aug: 25 Hours
*Monday Friday:
3800 MW
120 Hours
Min 2/Max 4
Hours
Sep: 10 Hours
(3 events)
Sep: 30 Hours
*Saturday Sunday
Optional
Participation
Oct: 4 Hours
(1 events)
Oct: 36 Hours *CAISO Stage 1 or 2
Annual 91 Hours
*Local or system
emergency
1 Event
Temperature and
system load
Always
*Monday: 86 ; 3472
MW
7 Hours
*Tues Fri: 84 ; 3837
MW
(11am 6pm)
*Saturday: 86 ; 3837
MW
Day Of 2 Events
Aug:1 Event
(5 Hours)
Terminates Dec
31
30 minute
Sep:1 Event
(4 Hours)
Jul Dec 3 Events
CAISO 1,2,or 3
Emergency
2012 only 14 Hours
Transmission or
imminent system
emergency or as
warranted by the
utility
08/10/12
08/14/12
Conditions
warranted by
Utility
Flex Alerts in
Effect
71 Hours
Local utility
emergency with
intent to avoid any
firm load curtailment
CAISO calls for
Conditions
warranted by
Utility
Demand Bidding Day Ahead No Limit No Limit No Limit No Limit N/A
Critical Peak
Pricing Emergency
(CPP E) Year Round 80 Hours 40 Hours 4 Events 1 Event
Mitigate potential
price spikes and
load forecast
above 4000 MW
and/or Real Time
Load came in
higher than Day
Ahead forecast
Reduce Your Use Day Ahead Year Round No Limit No Limit No Limit 7 Events No Limit
Met trigger criteria
for all 7 events
Summer Saver Day Of 40 Hours 3 Events
Appendix J: SCE Historical DR Event Hours
DR Programs Event Limits
Max
Event
Duration
2012
2006
2011
Average
2006
2011
Max
2011 2010 2009 2008 2007 2006
Monthly Nominated
Capacity Bidding Program Day Ahead (1 4) 24 Hrs./Mo 4 hrs. 39 53 72 48 47 72 47 53
Capacity Bidding Program Day Ahead (2 6) 24 Hrs./Mo 6 hrs. 0 51 71 23 49 71 53 58
Capacity Bidding Program Day Ahead (4 8) 24 Hrs./Mo 8 hrs. 0 14 42 0 0 28 42 0
Capacity Bidding Program Day Of (1 4) 24 Hrs./Mo 4 hrs. 23 18 40 8 31 8 3 40
Capacity Bidding Program Day Of (2 6) 24 Hrs./Mo 6 hrs. 33 12 40 8 40 8 3 0
Capacity Bidding Program Day Of (4 8) 24 Hrs./Mo 8 hrs. 0 11 49 0 0 8 0 49
Demand Bidding Program Unlimited 8 hrs. 64 106 172 40 72 116 101 172 136
Demand Response Contracts Day Ahead Various 4 hrs. 2 19 71 8 8 6 71 0
Demand Response Contracts Day Of Various 4 hrs. 12 11 16 14 16 6 7 14
Other Price Responsive
Save Power Days / Peak Time Rebates Unlimited 4 hrs. 28
Summer Advantage Incentive / Critical Peak
Pricing (CPP)
15 Events/Yr. 4 hrs. 48 57 70 48 48 70 60
Summer Discount Plan Residential &
Commercial Base
15 Events/
Summer Season
6 hrs./day 15 38 0 22 5 0 38 24
Summer Discount Plan –– Residential &
Commercial Enhanced
Unlimited
Events/ Summer
Season
6 hrs./day 15 39 0 22 9 0 39 18
Summer Discount Plan Commercial –– Base 15 Events/
Summer Season
6 hrs./day
6
Summer Discount Plan Commercial
Enhanced
Unlimited
Events/ Summer
Season
6 hrs./day
6
Summer Discount Plan –– Residential 180 Hours/Yr. 6 hrs./day 24
Emergency
Agricultural Pumping Interruptible (API) 1/Day
4/Wk.
25/Mo.
6 hrs./Day
40 hrs./Mo
150 hrs./Yr.
7 1 2 1 2 0 1 0 0
Base Interruptible Program (BIP) 1/Day
10/Mo.
6 hrs./Day
180 hrs./Yr.
2 1 3 2 0 2 0 0 3
Appendix K: SCE Historical Number of DR Events
DR Programs Event Limits 2012
2006
2011
Average
2006
2011
Max
2011 2010 2009 2008 2007 2006
Monthly Nominated Programs
Capacity Bidding Program Day Ahead (1 4) 24 Hrs./Mo 12 20 26 19 18 26 20 15
Capacity Bidding Program Day Ahead (2 6) 24 Hrs./Mo 0 16 22 10 16 22 19 13
Capacity Bidding Program Day Ahead (4 8) 24 Hrs./Mo 0 3 11 0 0 6 11 0
Capacity Bidding Program Day Of (1 4) 24 Hrs./Mo 7 5 11 3 9 2 2 11
Capacity Bidding Program Day Of (2 6) 24 Hrs./Mo 7 3 8 2 8 2 2 0
Capacity Bidding Program Day Of (4 8) 24 Hrs./Mo 0 2 9 0 0 2 0 9
Demand Bidding Program Unlimited 8 14 22 5 9 15 15 22 17
Demand Response Contracts Day Ahead Various 1 5 18 2 2 1 18 0
Demand Response Contracts Day Of Various 2 3 5 5 2 1 3 3
Other Price Responsive
Save Power Days / Peak Time Rebates Unlimited 7
Summer Advantage Incentive / Critical Peak
Pricing (CPP)
15 Events/Yr.
12 12 12 12 12 12 12
Summer Discount Plan Residential &
Commercial Base
15 Events/
Summer Season
5 11 11 6 3 0 5 2
Summer Discount Plan –– Residential &
Commercial Enhanced
Unlimited
Events/ Summer
Season
8 22 10 22 5 0 6 2
Summer Discount Plan Commercial Base 15 Events/
Summer Season
1
Summer Discount Plan Commercial
Enhanced
Unlimited
Events/ Summer
Season
1
Summer Discount Plan –– Residential 180 Hours/Yr. 23
Emergency Programs
Agricultural Pumping Interruptible (API) 1/Day, 4/Wk.
25/Mo.
2 1 2 1 2 1 1 0 0
Base Interruptible Program (BIP) 1/Day,10/Mo. 1 1 1 1 0 1 0 0 1
Appendix L: Summary of SCE's Reasons for the 2012 DR Triggers
DR Program Category Programs Reasons
Monthly Nominated Capacity Bidding Program
Demand Bidding Program
DR Contracts
No nomination or trigger conditions
Trigger conditions plus SCE's discretion to
optimize performance & minimize
participant fatigue
Trigger conditions
Price responsive Save Power Day (PTR)
Summer Advantage Incentive (CPP)
Summer Discount Plan (SDP) –– Res.
SCE discretion to optimize performance &
minimize participant fatigue
Optimal dispatch
Transitioned to price trigger starting June
2012. Remaining hours reserved for
contingencies.
Emergency Agricultural Interruptible Program
Base Interruptible Program
Local transmission contingency
No emergency, test event only
Appendix M: SDG&E Historical DR Event Hours154
DR Programs Event Limits 2012
2006
2011
Average
2006
2011
Max
2011 2010 2009 2008 2007 2006
Monthly Nominated Programs
Capacity Bidding Program Day
Ahead
24 Hrs./Mo 24 19 38 19 28 24 4 38 0
Capacity Bidding Program Day Of 24 Hrs./Mo 20 28 50 28 50 37 6 45 0
Price Responsive Programs
Peak Time Rebate Unlimited 49 32 32 32
Critical Peak Pricing Default 98 Hrs. ('06 '07)
126 Hrs. ('08 '12)
49 39 70 14 28 56 0 63 70
Demand Bidding Program Unlimited 14 29 41 41 16
Summer Saver 120 Hrs./Yr. 30 29 44 22 44 30 8 43 24
Emergency Programs
Base Interruptible Program (BIP) 120 Hrs./Yr. 4 2 4 4 4 0 0 4 2
Critical Peak Pricing Emergency 80 Hrs./Yr. 9 4 14 0 0 0 0 14 7
154
Source for the 2006 2012 data: SGE 02, Attachment 1, Revised Appendix X, Tables 8 11.
Appendix N: SDG&E Historical Number of DR Events155
DR Programs Event Limits 2012
2006
2011
Average
2006
2011 Max
2011 2010 2009 2008 2007 2006
Monthly Nominated Programs
Capacity Bidding Program Day Ahead Unlimited 7 5 8 5 7 6 1 8 0
Capacity Bidding Program Day Of Unlimited 5 7 12 7 12 7 1 12 0
Price Responsive Programs
Peak Time Rebate Unlimited 7 5 5 5
Critical Peak Pricing –– Default 12 ('06 '07)
18 ('08 '12) 7 6 10 2 4 8 0 9 10
Demand Bidding Program Unlimited 3 7 9 9 4
Summer Saver 15/Yr. 8 8 12 6 11 7 2 12 8
Emergency Programs
Base Interruptible Program (BIP) 10/Mo. 1 1 1 1 1 0 0 1 1
Critical Peak Pricing –– Emergency Unlimited 2 1 3 0 0 0 0 3 2
155
Source for 2006 2012 data: SGE 02, Attachment 1, Revised Appendix X, Tables 8 11.
103
Appendix O: Utilities' Peaker Plant Total Permissible vs. Actual Service Hours
SCE-Owned Peaker Plants Within SONGS-Affected Areas

                            Center   Barre   Grapeland   Mira Loma
Permissible Service Hours     1096     955        1073         700
Actual Service Hours:
  Sept.-Dec. 2007               93     123          87         104
  Jan.-Dec. 2008               120     118         125         119
  Jan.-Dec. 2009                93      83          46          70
  Jan.-Dec. 2010               156     174         137         148
  Jan.-Dec. 2011               163     149          85         127
  2007-2011 Average            125     129          96         114
  % of Permitted               11%     14%          9%         16%
  Jan.-Oct. 2012               459     465         403         413
  % of Permitted               42%     49%         38%         59%
  % of 2007-2011 Avg.         367%    359%        420%        364%
SDG&E-Owned Peaker Plants

                            Cuyamaca   El Cajon Energy Center   Miramar   Orange Grove
Permissible Service Hours        N/A                     2500      5000           6400
Actual Service Hours:
  2006                                                             200
  2007                                                             250
  2008                           373                               671
  2009                           625                              1919
  2010                           481                      439     2946
  2011                           667                      433     4306
  Historical Average             537                      436     1715
  % of Permitted                 N/A                      17%      34%            N/A
  2012                          1621                      974     4805           2148
  % of Permitted                 N/A                      39%      96%            34%
  % of Historical Avg.          302%                     223%     280%            N/A
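The utilization figures in the tables above follow directly from the service-hour data. The sketch below re-derives the "% of Permitted" and "% of 2007-2011 Avg." rows from the SCE annual hours; it is an illustrative verification only, with all figures copied from the SCE table.

    # Illustrative recalculation of the SCE peaker plant utilization rows above.
    # Annual service hours are copied from the table; verification sketch only.

    permissible = {"Center": 1096, "Barre": 955, "Grapeland": 1073, "Mira Loma": 700}
    hours_2007_2011 = {            # Sept.-Dec. 2007 through Jan.-Dec. 2011
        "Center":    [93, 120, 93, 156, 163],
        "Barre":     [123, 118, 83, 174, 149],
        "Grapeland": [87, 125, 46, 137, 85],
        "Mira Loma": [104, 119, 70, 148, 127],
    }
    hours_2012 = {"Center": 459, "Barre": 465, "Grapeland": 403, "Mira Loma": 413}

    for plant, history in hours_2007_2011.items():
        avg = sum(history) / len(history)
        print(f"{plant}: 2007-2011 avg {avg:.0f} hrs "
              f"({avg / permissible[plant]:.0%} of permitted); "
              f"2012 {hours_2012[plant]} hrs "
              f"({hours_2012[plant] / permissible[plant]:.0%} of permitted, "
              f"{hours_2012[plant] / avg:.0%} of the historical average)")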
104
Appendix P: Ex Post Demand Response Load Impact on Flex Alert Days
Programs / Ex Post (MW): 3:00-4:00 p.m., 4:00-5:00 p.m.
8/10/12:
SCE
Demand Bidding Program 107 107
Save Power Day/Peak Time Rebate 87 78
Subtotal 194 185
SDG&E
Capacity Bidding Program 8 8
Summer Saver/AC Cycling (Res. & Com.) 19
Subtotal 8 27
PG&E
Capacity Bidding Program 41 41
Aggregator Managed Program 174 172
Peak Day Pricing/Critical Peak Pricing 22 24
Peak Choice 3 2
SmartAC 65
Base Interruptible Program 220 222
Subtotal 459 527
TOTAL 661 739
8/14/12:
SCE
Capacity Bidding Program
Demand Bidding Program 72 71
Demand Response Contract 184 180
Summer Discount Plan/AC Cycling Res. & Com. 137 22
Agricultural Pumping Interruptible 14
Subtotal 394 242
SDG&E
Capacity Bidding Program 8 8
Critical Peak Pricing 24 25
Peak Time Rebate 1 0
Demand Bidding Program 6 5
Subtotal 38 38
PG&E No Events
TOTAL 432 280
105
Appendix Q: CAISO Energy Price Spikes
SCE Price Spikes156
156
Source: SCE 03, SCE's Response to ALJ February 21, 2013 Ruling, Appendix B (Excel Data Tables in Response, Table 9)
106
107
108
SDG&E Price Spikes157
157
Source: SGE 02, SDG&E's Response to the ALJ February 4, 2013 Scoping Memo, Attachment 3.
109
110
111
Appendix R: Utilities' Demand Response Reporting Requirements158
(2013-2014)
1. DR Weekly Forecast
The utilities should continue to submit a 7-day (Monday to Sunday)159 DR forecast (MW) to the CAISO/CPUC_ED/CEC by noon every Monday, highlighting the DR programs that they anticipate triggering.
Daily Value
For the DR programs whose hourly forecasts vary, the utilities use slightly different methods to determine the daily value, as described below. If an averaging method is used, the daily value may be higher or lower than the MW in a given hour, such as the peak hours in the CAISO's demand forecast. Energy Division staff uses an averaging method over the actual event hours for its reports on historical DR events.
Utility Methods for the Daily Value
SCE: Average over the available event hours in the tariffs, which vary from program to program.
SDG&E: Program and period averaged:
  Day Ahead: 11 a.m.-6 p.m.
  Day Of: 1 p.m.-6 p.m. (like RA)
PG&E: Program and period averaged:
  BIP: 1:00 p.m.-6:00 p.m.
  PDP: 2:00 p.m.-6:00 p.m. (no significant enrollment/load 12-2 p.m.)
  SmartRate: 2:00 p.m.-6:00 p.m.
  SmartAC: 1:00 p.m.-6:00 p.m.
  For AMP, CBP, DBP, and PeakChoice, the hourly forecast does not vary; therefore, PG&E will continue to submit the same hourly forecast amount for the given month.
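As a simple illustration of the averaging methods above, the sketch below computes a daily value as the mean of an hourly MW forecast over a program's averaging window. The function name and the sample forecast values are hypothetical; the 1 p.m.-6 p.m. window follows the SDG&E Day Of convention in the table above.

    # Minimal sketch of the averaging convention described above: the daily value for a
    # program is the mean of its hourly MW forecast over that program's averaging window.
    # The sample forecast values below are hypothetical.

    def daily_value(hourly_forecast_mw, start_hour, end_hour):
        """Average the hourly MW forecast over hours [start_hour, end_hour)."""
        window = [hourly_forecast_mw[h] for h in range(start_hour, end_hour)]
        return sum(window) / len(window)

    # Hypothetical hour-beginning forecast (24 values) for a Day-Of program.
    forecast = {h: 0.0 for h in range(24)}
    forecast.update({13: 8.0, 14: 10.0, 15: 12.0, 16: 12.0, 17: 9.0})

    # SDG&E Day-Of convention: average over 1 p.m.-6 p.m. (hour-beginning hours 13-17).
    print(daily_value(forecast, 13, 18))  # (8 + 10 + 12 + 12 + 9) / 5 = 10.2 MW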
2. Daily DR Reporting to the CAISO (by 8 a.m. weekdays & weekends)
For the non-summer months (January 1 to April 30 and November 1 to December 31), the utilities should submit their Daily DR Report to the CAISO/CPUC_ED/CEC only when they intend to trigger a DR program for that day, and the submission email should identify the triggered DR program(s). If there is no DR event, the utilities do not need to submit this report.
For the summer months (May 1 to October 31), the utilities should submit their Daily DR Report to the CAISO/CPUC_ED/CEC on a daily basis, as they did in 2012.
This report is based on a common template developed by the CAISO and is submitted as an Excel spreadsheet. In this report, the utilities provide the scheduled (as of 8 a.m.) and available MW for the current day and the next day for all of their event-based DR programs (including Day Ahead and Day Of) on an aggregated basis. SCE has also added the MW for each DR program. SDG&E added the MW only for the DR program(s) triggered for the day or the next day.
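The submission rule described in this section can be summarized compactly. The sketch below is an illustrative encoding of that rule only; the function and its inputs are hypothetical and are not part of any utility or CAISO system.

    # Illustrative encoding of the daily-reporting rule above: in the summer months the
    # Daily DR Report is always due; in the non-summer months it is due only when the
    # utility intends to trigger a DR program that day. Sketch only, not utility code.

    from datetime import date

    def daily_report_required(day: date, intends_to_trigger: bool) -> bool:
        summer = 5 <= day.month <= 10          # May 1 through October 31
        return True if summer else intends_to_trigger

    print(daily_report_required(date(2013, 7, 15), intends_to_trigger=False))  # True
    print(daily_report_required(date(2013, 12, 3), intends_to_trigger=False))  # False
    print(daily_report_required(date(2013, 12, 3), intends_to_trigger=True))   # True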
158
These requirements apply to SCE and SDG&E, which are subject to this proceeding (A.12 12 016 et al.); for PG&E they are staff guidance only. However, staff includes the reporting requirements for PG&E as guidance consistent with what is required of SCE and SDG&E.
159
This changes SCE's and PG&E's current forecast week from Tuesday-Monday to Monday-Sunday.
112
3. Updated Reporting to the CAISO/CPUC_ED/CEC (by COB weekdays for DR events called after 8 a.m.)
PG&E:
PG&E continues to send the DR forecast for all Day Ahead and Day Of events triggered throughout the day to the CAISO and CPUC in an Excel spreadsheet, as it did prior to summer 2012. These reports provide the forecasted MW for each DR program.
SCE:
At the end of the event day, SCE sends a revised Daily DR Report that includes the Day Of events called after 8 a.m. and the forecasted MW by program. The submission email should identify the triggered DR program(s).
SDG&E:
SDG&E also sends a revised Daily DR Report at the end of the event day that includes all DR events called after 8 a.m. and the forecasted MW by program.
4. Reports on DR Results to the CAISO/CPUC_ED/CEC (Seven Days After the Events)
All three utilities should continue to provide the DR results in an Excel spreadsheet seven days after each DR event (CAISO 7-Day Report). The 7-Day Report should also include the DR results to date for each year.160
The utilities should submit the Weekly DR Forecasts (No. 1) to the following email addresses:
Entity/Individual Email Address
CAISO John Goodin jgoodin@caiso.com
CPUC Bruce Kaneshiro bruce.kaneshiro@cpuc.ca.gov
CPUC Scarlett Liang-Uejio scarlett.liang-uejio@cpuc.ca.gov
CPUC Dorris Chow dorris.chow@cpuc.ca.gov
CPUC Paula Gruendling paula.gruendling@cpuc.ca.gov
CEC Margaret Sheridan msherida@energy.ca.gov
The utilities should submit the Daily DR Reports, revisions, and Results (Nos. 2-4) to the following email addresses:
Entity/Individual Email Address
CAISO Shift Supervisors shiftsupervisors@caiso.com
CAISO Market Operations ISODAM@caiso.com
CAISO John Goodin jgoodin@caiso.com
CAISO Glen Perez gperez@caiso.com
CAISO Market Monitoring Keith Collins kcollins@caiso.com
CPUC Scarlett Liang-Uejio scarlett.liang-uejio@cpuc.ca.gov
CPUC Bruce Kaneshiro bruce.kaneshiro@cpuc.ca.gov
CPUC Dorris Chow dorris.chow@cpuc.ca.gov
CPUC Paula Gruendling paula.gruendling@cpuc.ca.gov
CEC Margaret Sheridan msherida@energy.ca.gov
160
See SCE's 2012 7-Day Reports as an example.
113
Appendix S: Additional Information
Provided in separate PDF files

StaffReport_2012DRLessonsLearned

  • 1.
    STATE OF CALIFORNIAEdmund G. Brown Jr., Governor PUBLIC UTILITIES COMMISSION 505 VAN NESS AVENUE SAN FRANCISCO, CA 94102 3298 Commission Staff Report Lessons Learned From Summer 2012 Southern California Investor Owned Utilities’’ Demand Response Programs May 1, 2013 Performance of 2012 Demand Response programs of San Diego Gas and Electric Company and Southern California Edison Company: report on lessons learned, staff analysis, and recommendations for 2013 2014 program revisions in compliance with Ordering Paragraph 31 of Decision 13 04 017.
  • 2.
    ACKNOWLEDGEMENT The following Commissionstaff contributed to this report: Bruce Kaneshiro Scarlett Liang Uejio Tim Drew Rajan Mutialu Dorris Chow Paula Gruendling Taaru Chawla Jennifer Caron Alan Meck
  • 3.
    i TABLE OF CONTENTS EXECUTIVESUMMARY....................................................................................................... 1 Chapter 1: Introduction.................................................................................................. 5 I. 2012 Summer Reliability and Demand Response Programs..................................................5 II. Energy Division November 16, 2012 Letter and the Staff Report..........................................6 Chapter 2: Demand Response Program Load Impact...................................................... 8 I. Summary of Staff Analysis and Recommendations ...............................................................8 II. Different DR Load Impact Estimates ...................................................................................... 9 III. Comparison of DR Daily Forecast and Ex Post Results ..........................................................9 IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)..............................26 Chapter 3: Demand Response Program Operations...................................................... 32 I. Summary of Staff Analysis and Recommendations .............................................................32 II. 2012 DR Program Trigger Criteria and Event Triggers .........................................................32 III. DR Events Vs. Peaker Plant Service Hours ...........................................................................33 IV. Peaker Plant Comparison..................................................................................................... 34 V. Conclusions .......................................................................................................................... 35 Chapter 4: Residential Demand Response Programs .................................................... 36 I. Summary of Staff Analysis and Recommendations .............................................................36 II. Residential Peak Time Rebate (PTR) ....................................................................................36 III. Residential Air Conditioning (AC) Cycling.............................................................................51 Chapter 5: Non Residential Demand Response Programs............................................. 57 I. Summary of Staff Analysis and Recommendations .............................................................57 II. Background and Summary of Utility Data............................................................................57 III. Commercial Air Conditioning (AC) Cycling...........................................................................59 IV. SCE’’s Auto DR....................................................................................................................... 63 V. SDG&E’’s Demand Bidding Program (DBP) ...........................................................................65 Chapter 6: Flex Alert Effectiveness ............................................................................... 67 I. Summary of Staff Analysis and Recommendations .............................................................67 II. Background .......................................................................................................................... 67 III. Utility Experience with Flex Alert.........................................................................................69 IV. 
Customer Experience ........................................................................................................... 69 V. The Future of Flex Alert........................................................................................................ 71 VI. DR Program Ex Post Load Impact Results on the Flex Alert Days........................................71 Chapter 7: Energy Price Spikes ..................................................................................... 73
  • 4.
    ii I. Summary ofStaff Analysis and Recommendations .............................................................73 II. Definition of Price Spikes ..................................................................................................... 73 III. DR Programs and Price Spikes.............................................................................................. 73 IV. Conclusion............................................................................................................................ 74 Chapter 8: Coordination with the CAISO ...................................................................... 75 I. Staff Recommendations....................................................................................................... 75 II. DR Reporting Requirements in Summer 2012.....................................................................75 III. DR Reporting Requirements for 2013 2014.........................................................................76 Appendix A: Highlight of 2012 Summer Weather & Load Conditions.................................... 77 Appendix B: Energy Division November 16, 2012 Letter........................................................ 78 Appendix C: Descriptions of DR Load Impact Estimates......................................................... 79 Appendix D: SCE 2012 Monthly Average DR Program Load Impact (MW) ............................ 85 Appendix E: SCE 2012 DR Program Load Impact by Event (MW)........................................... 87 Appendix F: SDG&E 2012 Monthly Average DR Program Load Impact (MW) ....................... 91 Appendix G: SDG&E 2012 DR Program Load Impact by Event (MW)..................................... 92 Appendix H: SCE 2012 DR Program Overview ....................................................................... 93 Appendix I: SDG&E DR Program Overview............................................................................. 96 Appendix J: SCE Historical DR Event Hours............................................................................. 98 Appendix K: SCE Historical Number of DR Events .................................................................. 99 Appendix L: Summary of SCE’’s Reasons for the 2012 DR Triggers....................................... 100 Appendix M: SDG&E Historical DR Event Hours................................................................... 101 Appendix N: SDG&E Historical Number of DR Events .......................................................... 102 Appendix O: Utilities’’ Peaker Plant Total Permissible vs. Actual Service Hours................... 103 Appendix P: Ex Post Demand Response Load Impact on Flex Alert Days ............................ 104 Appendix Q: CAISO Energy Price Spikes................................................................................ 105 Appendix R: Utilities’’ Demand Response Reporting Requirements..................................... 111 Appendix S: Additional Information .................................................................................... 113
  • 5.
    1 EXECUTIVE SUMMARY This reportis prepared by Energy Division in compliance with Ordering Paragraph 31 of D.13 04 017. The purpose of this report is to provide the lessons learned from the 2012 Demand Response (DR) programs operated by San Diego Gas and Electric Company (SDG&E) and Southern California Edison Company (SCE) (Utilities), and to recommend program or operational revisions, including continuing, adding, or eliminating DR programs. Below are highlighted conclusions and recommendations in the report. To see all recommendations, please go to each chapter in the report. In summary, Energy Division makes the following overarching conclusions about the Utilities’’ DR programs: Forecast vs. Ex Post: While a few DR programs met or even exceeded their daily forecast when triggered, on average the ex post results for all program events diverge from the daily forecast by a considerable degree. The majority of programs either provided a ‘‘mixed’’ performance (the program both over and under performed relative to its forecast) or were poor performers (consistently coming up short relative to its forecast). Of particular note are the Utilities’’ Peak Time Rebate program1 and SCE’’s Summer Discount Plan.2 (Chapter 2) The divergence between the ex post results and the daily forecasts can be traced to a variety of causes, such as inadequate forecasting methods employed by the Utilities, program design flaws, non performance by program participants and/or program operations. A complete explanation of the reasons for divergence across all programs however, was not possible within the scope and timing of this report. (Chapter 2) 2012 RA vs. Ex Post: Comparing the ex post results to the 2012 Resource Adequacy (RA) forecast is not a good indicator as to how well a DR program performs. RA forecasts are intended for resource planning needs. Ex post load impacts reflect demand reductions obtained in response to operational needs at the time the program is triggered. Resource planning and operational planning have different conditions and serve different purposes. (Chapter 2) DR vs. Peaker Plants: The Utilities used their DR programs fewer times and hours than the programs’’ limits (each program is limited to a certain number of hours or events). In contrast, the Utilities dispatched their peaker power plants far more frequently in 2012 in comparison to 2006 –– 2011 historical averages. (Chapter 3) Energy Price Spikes: DR programs are not currently designed to effectively mitigate price spikes in the CAISO’’s energy market. On many days a DR event was called and 1 SCE’’s marketing name for Peak Time Rebate is ““Save Power Day”” , SDG&E calls it ““Reduce Your Use””. 2 Air conditioning (AC) cycling
  • 6.
    2 no price spikesoccurred, and conversely there were days where price spikes occurred and DR events were not called. The timing and scope of this report did not permit a quantification of the cost of unmitigated price spikes to ratepayers, but in theory, avoidance of these spikes would benefit ratepayers. (Chapter 7) Energy Division also makes the following program specific conclusions about the Utilities’’ DR programs: SCE’’s AC Cycling Program Forecasting: SCE’’s 2012 forecasting methodology for its air conditioning (AC) Cycling program (the DR program that SCE triggered the most in 2012) cannot be relied upon to effectively predict actual program load reductions. (Chapter 2) SCE’’s AC Cycling Dispatch Strategy: SCE’’s sub group dispatch strategy for its AC Cycling Program (also called Summer Discount Plan) created adverse ‘‘rebound’’ effects, thereby reducing the effectiveness of the program during critical hot weather days, e.g. 1 in 10 weather. (Chapter 2) SDG&E’’s Demand Bidding Program: SDG&E Demand Bidding Program produced on average 5 MW of load reduction when triggered, although the US Navy did not participate. The US Navy claimed certain program terms and conditions precluded it from participating in the 2012 program. The Commission’’s decision to modify the program to a 30 minute trigger may further limit the US Navy’’s ability to participate. (Chapter 5) Peak Time Rebate Awareness: SCE and SDG&E customers who received utility notification of Peak Time Rebate (PTR) events had higher awareness of the program when compared to customers who were not notified by the utility. More importantly, customers who opted into receiving PTR alerts significantly reduced load. All other customers in the program provided minimal load reduction. (Chapter 4) Peak Time Rebate Free Ridership: The Utilities’’ PTR program has a potentially large ‘‘free ridership’’ problem, where customers receive incentives without significantly reducing load. SCE paid $22 million (85% of total PTR incentives in 2012) in PTR bill credits to customers whose load impact was not considered for forecast or ex post purposes. 94% of SDG&E’’s 2012 PTR incentives ($10 million) were paid to customers who did not provide significant load reduction. The inaccuracy of settlement methodology (in comparison to the ex post results) is the main reason for the ‘‘free ridership’’ problem. The default nature of the program (everyone is automatically eligible for the incentives) aggravates the problem. (Chapter 4). Flex Alert: There is a lack of data to evaluate the effectiveness and value of the Flex Alert campaign. Attribution of savings from Flex Alert is complicated by the fact that load reduction from the Utilities’’ DR programs on the two days Flex Alert was
  • 7.
    3 triggered in 2012contributed to reduced system peak load. A load impact evaluation of Flex Alert is planned for 2013. (Chapter 6) DR Reports: The Utilities’’ DR daily and weekly reports were useful to the CAISO and the Commission for purposes of up to date monitoring of DR resources throughout the summer. (Chapter 8) In light of above findings, Energy Division recommends the following: DR Evaluation: The Commission should require further evaluation of Utility DR program operations in comparison to Utility operation of peaker plants for the purpose of ensuring Utility compliance with the Loading Order. (Chapter 3) Forecast Methods Generally: The Utilities’’ daily forecasting methods for all DR programs (especially AC cycling and other poor performers) should undergo meaningful and immediate improvements so that the day ahead forecasting becomes an effective and reliable tool for grid operators and scheduling coordinators. (Chapter 2) Forecasting for SCE’’s AC Cycling Program: SCE should improve forecasting methods for its residential AC Cycling Program with input from agencies and stakeholders. SCE should also pilot more than one forecasting method for the program in 2013. (Chapter 2) Forecasting for SDG&E Programs: SDG&E’’s forecasting methods for its AC Cycling Program (Summer Saver) could be improved doing the following: running a test event and including a correlation variable that accounts for customer fatigue. SDG&E’’s Capacity Bidding Program forecasting could be improved by including a weather variable. (Chapter 2) SCE’’s Outreach for Commercial AC Cycling: Through its outreach and marketing efforts, SCE should clearly communicate the new features of its commercial AC cycling program to avoid customer dissatisfaction and dropout. (Chapter 5) Auto DR: Future studies are necessary to explore the load impacts of Auto DR. (Chapter 5) SDG&E’’s Demand Bidding Program: SDG&E should work collaboratively with the US Navy to design a program to meet the unique needs of the Navy. Key attributes to consider are a day ahead trigger, aggregation of 8 billable meters and a minimum bid requirement of 3 megawatts (MW). (Chapter 5) Peak Time Rebate Design Changes: The Utilities’’ residential PTR program should be changed from a default program to an opt in program, so that bill credits are paid only to customers who opt in. (Chapter 4) SCE’’s AC Cycling Dispatch Strategy: SCE should reconsider its current strategy of calling groups of residential AC cycling customers in sequential one hour cycling events. Alternatively, if SCE retains its current strategy, it should modify the
  • 8.
    4 program’’s incentive structureso that customers who are willing to have their AC units cycled for an entire event (as opposed to just one hour) are compensated more than those who can tolerate only one hour of cycling. (Chapter 4) DR Reports: The Utilities (and Pacific Gas & Electric) should submit daily and weekly DR reports to the CAISO and the Commission for the summers of 2013 and 2014. They should follow the same format and data requirements in the 2012 reports, unless otherwise directed by the Commission or Commission staff. (Chapter 8)
  • 9.
    5 Chapter 1: Introduction I.2012 Summer Reliability and Demand Response Programs San Onofre Nuclear Generating Station (SONGS) Units 2 and 3 were taken out of service in January 2012. By March 2012, the Commission determined that the outage of SONGS’’ two units could extend through summer 2012. Working closely with the Governor’’s Office, the California Independent System Operator (CAISO), and the California Energy Commission (CEC), the Commission took immediate mitigation actions to ensure that lights stay on in California with the loss of 2,200 MW of capacity provided by SONGS.3 When considering adding new generation resources,4 an important action was to further incorporate the Utilities’’ Demand Response (DR) programs into the CAISO’’s contingency planning and daily grid operations during the summer. This included mapping the Utilities’’ DR programs to grid contingency plans and developing new daily and weekly DR reporting requirements. In addition, the Commission also moved swiftly to approve three new DR programs for summer 2012: SDG&E’’s Peak Time Rebate (PTR) for commercial customers and Demand Bidding Program (DBP); and SCE’’s 10 for 10 conservation program for non residential customers.5 Because of the intensive interagency mitigation effort and relatively cool weather, California grid reliability was not compromised in spite of the SONGS outage. Nevertheless, southern California experienced several heat waves in August and September with the highest temperature reaching 109°F in SDG&E’’s service area and 100°F for SCE on September 14.6 The CAISO issued two Flex Alerts: on August 10 and 14. The Utilities triggered all of their DR programs at least once and some on multiple occasions. Throughout the summer, Energy Division (ED) staff monitored the Utilities’’ DR program events on a daily basis and provided weekly briefings to the Governor’’s Office, the CAISO, and the CEC. Staff observed that, for many event days, the load impact forecasts provided by the Utilities to the CAISO and the Commission in their daily DR reports were inconsistent with the results submitted seven days after each event (referred as the ““7 Day report””). In some cases, the Utilities reported much lower load reduction results than they originally forecasted. In addition, load impact forecasts provided by the Utilities throughout the summer were lower than the capacity counted for the 2012 Resource Adequacy (RA) Requirement. This raised a question as to whether the Commission might have overestimated DR load impact for RA purposes or, rather, if the Utilities might have under utilized their DR programs. Sometime in mid summer, the Utilities began to experience price spikes in CAISO’’s wholesale energy market. Questions were raised on whether the DR programs could be used to mitigate price spikes, and if so, should they be. 3 http://www.songscommunity.com/value.asp 4 Retired Huntington Beach Units 3 and 4 were brought back on line temporarily. 5 Resolutions E 4502 and E 4511 6 A 1 in 10 (or 10% probability) weather condition in any given years.
  • 10.
    6 Some of theUtilities’’ DR programs were triggered on as many as 23 events over the five summer months, and many were triggered on two or three consecutive days. Appendix A highlights the DR program load impact on the three hottest days and the three days when SDG&E and SCE experienced highest system peak load. Staff observed that SDG&E’’s system peak correlate to temperature and biggest DR load reduction happened on the hottest day. On the other hand, SCE’’s system peak load did not consistently correlate to weather. In contrast, SCE’’s system load reached its annual peak at 90°F temperature, 10°F cooler than the hottest day in its service territory. Counter intuitively, DR program load impact on a cooler day was actually higher than the amount delivered on the hottest day. This led to questions how the Utilities make decisions to trigger DR programs and whether aspects of the customers’’ experience, such as expectations and fatigue have an effect. In August, CAISO issued two Flex Alerts when it determined a reliability risk due to insufficient supply to meet demand. As expected, the Utilities triggered relatively large amounts of DR programs on both days. CAISO reported that the actual peak load was significantly lower than its hours ahead forecasts and attributed the load drop to Flex Alert events. This parallel dispatch situation raises important questions regarding the effectiveness of the Flex Alert when overlapped with the Utilities’’ DR program events and how customers perceived with these statewide alerts versus local utility DR notifications. Based on the above experience, the Commission concluded that staff should evaluate DR program performance and other lessons learned in order to seek answers to these and other questions. Such lessons could help the Commission to determine the extent of DR program reliability and usefulness and in turn, to the extent to which DR resources can be counted on in CAISO markets and operations. II. Energy Division November 16, 2012 Letter and the Staff Report On November 16, 2012, the Energy Division sent a letter (Energy Division Letter) to the Utilities directing the Utilities to 1) file an application proposing DR program improvements for 2013 and 2014 to mitigate the SONGS outage and 2) provide data and responses to a set of questions on lessons learned from 2012 DR programs. The questions were developed based on the Utilities’’ 2012 demand response experience and fell into six categories: 1. DR Program Performance, which include load impact and program operations, 2. CAISO Market, covering price spikes and market analysis 3. Customer Experience, 4. Coordination with the CAISO and Utility Operations 5. Emergency DR Program Dispatch Order, and 6. Flex Alert Effectiveness The Energy Division Letter is attached in Appendix B of this report.
  • 11.
    7 On December 21,2012, the Utilities filed separate applications for the approval of the DR program revisions for 2013 and 2014.7 The Utilities submitted data and responses to the questions attached to the Energy Division Letter and subsequent Assigned Administrative Law (ALJ) rulings for developing the record.8 Decision (D.)13 04 017 approved certain DR program improvements for 2013 2014 and directed the Commission staff to develop a report on the lessons learned from the DR programs in 2012. This report is based on a snapshot of data and studies available at the time (i.e. ex post load impact data, utility responses to Energy Division data requests, etc.) On going and future (e.g. Flex Alert load impact analysis per D.13 04 021) evaluations will shed further light on the issues raised in this report. One point of emphasis in this report is the extent to which the current DR programs delivered their forecasted savings when they were triggered by the utilities. It is important to understand that there are a range of factors that can affect whether a program delivers its forecasted savings targets. Some of these factors can be controlled through good program design, operation and forecasting methodologies. Other factors that can impact program performance are exogenous or outside the utilities’’ control such as temperature, participant enrollment fluctuations, and behavioral or technological changes by the participants. While this report contains certain findings and recommendations for DR programs, we caution against sweeping conclusions or generalizations about DR programs based on this report. The point of this report is to find ways to improve existing DR programs so that they are more useful to grid operators, utilities, ratepayers and participants. 7 A.12 12 016 (SDG&E) and A.12 12 017 (SCE). 8 On January 18, 2013 and February 21, 2012.
  • 12.
    8 Chapter 2: DemandResponse Program Load Impact I. Summary of Staff Analysis and Recommendations SCE Most of the program event ex post results diverge from the daily forecast by a considerable degree. The daily forecast should be more consistent with the ex post results in order for the day ahead forecasting to be valid and useful for grid operators. Staff recommends that the daily forecasting methods for all programs undergo meaningful and substantial improvements, including more thorough and transparent documentation and vetting through relevant agencies and stakeholders. The Summer Discount Plan (Residential AC Cycling) program forecasting methods in particular requires an audience with a broad panel of agencies and stakeholders. Staff also recommends that SCE pilot more than one forecasting method and conduct interim protocol based load impact evaluations to identify the most reliable forecasting methods throughout the 2013 summer season. SCE should also be required to address Summer Discount Plan program operation issues before the 2013 summer peak season begins, if possible. Specifically, the strategy of calling groups of customers for sequential one hour cycling events, rather than calling all the customers for the duration of the full event (or other potential strategies), needs to be reconsidered before the program is further deployed. As discussed in detail later in this chapter, this strategy resulted in load increases during the latter hours of events, thereby reducing the overall effectiveness of the program. SDG&E Similar to SCE, many of SDG&E’’s program event ex post results also diverge from the daily forecast by a considerable degree. The Demand Bidding Program daily forecast was accurate and reliable in predicting ex post results, while the Summer Saver and Capacity Bidding Day Ahead and Day Of program daily forecasts did not accurately nor reliably predict ex post results. The Peak Time Rebate Residential daily forecast was not accurate in predicting ex post results, but consistently underestimated ex post results by approximately 80%. The Critical Peak Pricing and Base Interruptible program did not accurately or reliably predict ex post results, but consistently under predicted ex post load impacts. Due to a weak price signal and inelastic customer demand, the PTR commercial program ex post results were not significant. The CPP E was discontinued as of December 31, 2012. Staff recommends (1) including only customers that opt in to receive e mail or text alerts in the PTR residential daily forecast model (2) running a test event to measure % load impact per customer in order to improve CPP daily forecast estimates (3) including a correlation variable in the Summer Saver daily forecast model to account for customer fatigue during successive event days (4) including a weather variable in the CBP daily forecast model in order to have parity with the ex post regression model.
  • 13.
    9 II. Different DRLoad Impact Estimates DR programs load impact are forecasted or estimated at different times for different purposes. The following table summarizes the five different DR load impact estimates that are discussed in this chapter. Detail descriptions and methodologies for each DR program measurement are provided in Appendix C. Table 1: DR Load Impact Estimates DR Load Impact Estimates General Description Purpose Ex Ante for RA (e.g., 2012 RA) A year ahead monthly ex ante load impact potential attributed by individual program under a 1 in 2 weather condition. To determine the RA counting against the Load Serving Entity’’s system and local capacity requirements. Daily Forecast The Utilities’’ daily estimate of hourly load impact from DR programs during an event period. To provide the CAISO, CPUC, and CEC the hourly MW provided by DR programs on each event day. 7 Day Report The Utilities’’ preliminary estimate of hourly load reduction results from each triggered DR program To report to the CAISO the load reduction data from the triggered DR programs seven days after each DR event. Ex Post Results The Utilities’’ most accurate measurement of the load impact results from all of the DR programs triggered in a year. The ex post results are calculated using comprehensive regression models. To report to the CPUC the actual results of the DR events Settlement A measurement of customers’’ load reduction from their specific reference load using a baseline method. To calculate customers’’ incentive payments for billing purpose. In this proceeding, the Utilities provided the above DR load impact estimates for their DR programs, which are shown in Appendices D to G. III. Comparison of DR Daily Forecast and Ex Post Results A. Overall Program Performance The following section draws on data provided by the Utilities on March 4, 20139 in response to the Feb 21, 2013 ALJ ruling, which compares event day forecasts (daily forecast or day ahead forecast) to the event day ex post load reduction estimates. Detailed data and methodological descriptions relevant to this chapter are provided in Appendices C and G. Subsequent to its March 4 filing, SCE updated its ex post results for some of the DR program events in its April 2 Load Impact Report but did not update its March 4 filing accordingly. However, in most cases, the April 2, 2013 updated ex post results are even lower than the March 4 preliminary data, e.g., the AC cycling. Therefore, if the updated data was used, it would further support staff’’s findings. 9 SCE 03 and SGE 03.
  • 14.
    10 On average, theex post results for all program events diverge from the daily forecast by a considerable degree. While some program events were forecasted more accurately and consistently than others, Energy Division staff’’s overall conclusion is that the daily forecasting methods for all programs requires meaningful and immediate improvements in order for the day ahead forecasting can become an effective and reliable tool for grid operators. Some of the divergence between the ex post results and the daily forecast estimates can possibly be explained by inadequate program design and program operations. This section focuses on the observed differences between the ex post and the daily forecast with an eye towards identifying improvements for day ahead forecasting, and thus does not cover all potential program improvements. Furthermore, many program design and operational improvements that could lead to better ex post results may not be evident by simply inspecting the daily forecast and ex post data. The ex post analysis methods are guided by Commission adopted load impact protocols10 and the study results are carefully documented in reports prepared by independent consultants managed by SCE staff. However, there are currently no comparable standards and processes guiding the methods for daily forecasting. Indeed, during the course of preparing this report, Energy Division staff became aware that the day ahead forecasting methods are far from transparent, and in some cases lack the robust analysis that is expected of the Utilities. These problems may be somewhat understandable, however, since the daily reports were only formally instituted in 2012. While this report is highly critical of the implementation of the day ahead forecasting, it is important to recognize that the 2012 DR events as a whole did indeed reduce participants loads, and some of the program load reductions were consistent with or better than the day ahead forecast. To that end, staff has categorized the demand response programs into three categories (good, mixed, and poor performance) based on how well the program events performed relative to the day ahead forecasts. SCE Programs that performed well yielded load impacts that were consistent with or better than the day ahead forecast. The Base Interruptible Program (BIP) and the Day of Capacity Bidding Program events produced load reductions that were on par with the forecasts. It is worth noting that BIP, the single largest program, was triggered on only one occasion in 2012 however, and this was test event. Program events with mixed performance were not consistent with the day ahead forecast, but sometimes exceeded the forecast. Staff includes the Day ahead Capacity Bidding, Demand Bidding, and the Residential Summer Discount Plan program events in this category because these program events did indeed occasionally exceed the day ahead forecasts by a significant margin. These programs are discussed in greater detail elsewhere in this section and report. While considered to be mid performing programs, they do have many important issues that deserve attention. 10 Decision 08 04 050
  • 15.
    11 Program events thatwere consistently below the forecast are considered to be poor performing programs. All of the Critical Peak Pricing, Peak Time Rebate, Demand Response Contracts, Commercial Summer Discount Plan, and Agricultural Pumping Interruptible program events triggered during 2012 produced load reductions that were lower than forecasted. Table 2: SCE’’s DR Overall Performance Programs No. of DR Events Daily Forecast Ex Post Difference % Good Performance: Capacity Bidding Program –– Day of 14 12 16 >2 >17% Base Interruptible Program 1 514 573 59 12% Mixed Performance: Capacity Bidding Program –– Day Ahead 12 0.08 0.03 0.29 to 0.08 315% to 86% Demand Bidding Program 8 84 76 33 to 16 40% to 21% Summer Discount Plan (AC Cycling) Res. 23 280 184 603 to 92 100% to 58% Poor Performance: Critical Peak Pricing 12 50 37 < 5 < 11% Peak Time Rebate 7 108 20 < 11 < 11% Demand Response Contracts 3 230 148 < 70 < 34% Summer Discount Plan (AC Cycling) Com. 2 5 3 2 35% Agricultural Pumping Interruptible 2 48 21 < 19 < 52% (Averaged MW over All Events) (Range from Low to High) SDG&E Utilizing the same criteria for evaluating SCE DR programs, The Base Interruptible Program and the Critical Peak Pricing Program were categorized as good performers, the Capacity Bidding Day Ahead, Capacity Bidding Day Of, Demand Bidding, and Summer Saver (AC Cycling) were categorized as mixed performers, and the Critical Peak Pricing Emergency and residential Peak Time Rebate programs were categorized as poor performers. As stated above, DR program design and operation characteristics also need to be taken into account for a complete evaluation of DR program performance.
  • 16.
    12 Table 3: SDG&E’’sDR Overall Performance B. Program Performance During Critical Event Days The critical event days of August 10th, 13th, 14th, and September 14th were selected as a focus because they occurred on Flex Alert days, the service area system peak day, or the hottest days of the year. These are all conditions when demand response resources are most critical. August 10, 2012 SCE Two SCE programs were called on August 10th, a Flex Alert day. The programs triggered during that event were the Demand Bidding Program and the Save Power Day (also known as the Peak Time Rebate program). The load reductions achieved during the Demand Bidding Program event surpassed the forecast by 12%, while the Save Power Day event was below the forecast by 11%. Table 4: SCE’’s August 10, 2012 Demand Response Events Program Name Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C=B A D=C/A Demand Bidding Program 85.59 95.82 10.23 11.95% Save Power Day 107.24 11 95.85 11.39 10.62% Total 192.83 191.67 1.16 11 SCE did not provide a daily forecast for this event, so the comparison for this event is done with the 7 day report rather than the daily forecast. Programs Number of Events Daily Forecast Ex Post Difference % (Averaged MW over All Events) (Low To High) Good Performance: Base Interruptible Program 1 0.3 0.8 0.5 167% Critical Peak Pricing 7 15 18 > 2.4 >3.1% Mixed Performance: Capacity Bidding Program –– Day Ahead 7 8 6 4.9 to 0.1 32% to 12.2% Capacity Bidding Program –– Day Of 5 12 10 3.2 to 0.7 27.4% to 6.0% Demand Bidding Program 3 5 5 0.4 to 0.1 8.0% to 8.0% Summer Saver (AC Cycling) 8 20 17 12.3 to 3.5 64.0 to 38.7% Poor Performance: Peak Time Rebate Residential 7 19 4 < 24 < 73.6% Critical Peak Pricing –– Emergency 2 2 1 < 0.7 < 53.3%
  • 17.
    13 SDG&E Three DR programswere called on August 10th . The Capacity Bidding Day Ahead program load reduction exceeded the forecast by 1%. Conversely, the Summer Saver and residential Peak Time Rebate forecasts under predicted the forecast by 32% and 75%. Table 5: SDG&E August 10, 2012 Demand Response Events Program Name Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C = B A D=C/A Capacity Bidding Day Ahead 7.50 7.60 0.10 1.33% Summer Saver (AC Cycling) 27.20 18.50 8.70 32.00% Residential Peak Time Rebate 12.60 3.20 9.40 74.60% Total 47.30 29.30 18.00 August 13, 2012 SCE August 13, 2012 was the system peak day for the SCE service area, with a peak load of 22,428 MW. As shown in Table 6 below, the Critical Peak Pricing program, a dynamic pricing program for commercial and industrial customers over 200 kW, and the Day Of Capacity Bidding Program were triggered during this day. Again, the Capacity Bidding Programs exceeded the forecast by a few MW. The Critical Peak Pricing program event had satisfactory performance, falling short of the forecast by 15%. Table 6: SCE’’s August 13, 2012 Demand Response Events Program Name Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C=B A D=C/A Critical Peak Pricing 50.54 42.96 7.58 15.00% Capacity Bidding Program (Day Of) 12.30 15.70 3.40 27.60% Total 62.84 58.66 4.18 SDG&E All three DR programs that were triggered on August 13th, Capacity Bidding Day Of, Summer Saver (AC Cycling), and Critical Peak Pricing, had ex post load impacts that were respectively below daily forecast predictions by 27%, 45%, and 48%.
  • 18.
    14 Table 7: SDG&E’’sAugust 13, 2012 Demand Response Events Program Name Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C= B/A D= C/A Capacity Bidding –– Day Of 11.70 8.50 3.20 27.33% Summer Saver (AC Cycling) 33.30 21.40 11.90 45.35% Critical Peak Pricing Emergency 2.30 1.20 1.10 47.83% Total 47.30 31.10 16.20 August 14, 2012 SCE August 14, 2012 was another Flex Alert day, during which seven events were called, using a variety of DR programs. As shown in Table 8 below, all the events combined were forecasted to reduce loads by 570 MW. However, the ex post load impact evaluations found that the actual load reductions were short of the total forecast by 155 MW. 60% of the 155 MW shortfall is attributed to the Demand Response Contract program. The Agriculture Pumping Interruptible program event was short of the event forecast by 52%. Only the Capacity Bidding Program exceeded the forecasted load reduction, but this only made up 4% of the Demand Response Contract program forecast, and thus was insufficient to cover the overall event day shortfall. It is worth noting that the Demand Response Contract and Capacity Bidding Programs share something in common in that they are both commercial aggregator programs. The reason for the difference in performance between these programs requires further study. It should be noted that SCE’’s Demand Response Contracts expired on December 31, 2012 and have since been replaced by new contracts that that expire at the end of 2014.12 Table 8: SCE’’s August 14, 2012 Demand Response Events Program Name Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C=B A D=C/A Demand Response Contracts 275.00 182.05 92.95 33.80% Demand Bidding Program 94.09 61.76 32.33 34.36% Agriculture Pumping Interruptible 36.00 17.29 18.72 51.99% Summer Discount Plan (Res) Group 1 130.40 119.40 11.00 8.44% Capacity Bidding Program (Day Of) 12.30 17.82 5.52 44.86% Summer Discount Plan (Res) Reliability 17.42 13.50 3.92 22.49% Summer Discount Plan (Com) 4.77 3.10 1.67 35.04% Total 569.98 414.91 155.07 12 D.13 01 024 http://docs.cpuc.ca.gov/PublishedDocs/Published/G000/M046/K233/46233814.PDF
  • 19.
    15 SDG&E Four DR programs,Demand Bidding, Critical Peak Pricing, Capacity Bidding Day Ahead, and residential Peak Time Rebate, were called on August 14th . While the Demand Bidding and Capacity Bidding Program ex post load impacts closely matched the daily forecast, the Critical Peak Pricing and residential Peak Time Rebate did not. Since the Critical Peak Pricing and residential Peak Time Rebate programs are large scale residential programs it is possible that the difference between the forecast and ex post load impacts reflect widely varying customer behavior during DR events. Table 9: SDG&E’’s August 14, 2012 Demand Response Events Program Name Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C=B A D=C/A Demand Bidding Program 5.00 5.10 0.10 2.00% Critical Peak Pricing 14.30 25.90 11.60 81.12% Capacity Bidding Program (Day Ahead) 7.50 7.50 0.00 0.00% Residential Peak Time Rebate 12.50 1.10 11.40 91.20% Total 39.30 39.60 0.30 September 14, 2012 SCE September 14, 2012 was the hottest day of the year in both the SCE and SDG&E service areas (see Table 10 below). Understandably, SCE triggered their Summer Discount Plan (residential AC Cycling Programs) during this day. The Capacity Bidding Program was also triggered, with performance comparable to the other Capacity Bidding Program events on critical days discussed above. The September 14 residential Summer Discount Plan events consisted of three separate customer groups sequentially triggered for one hour events. All three one hour events fell considerably short of the forecasted load reductions. Table 10: SCE’’s September 14, 2012 Demand Response Events Program Name Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C=B A D=C/A Summer Discount Plan (Residential) Groups 5 and 6 135.61 20.70 114.91 84.74% Summer Discount Plan (Residential) Groups 1 and 2 110.89 37.80 73.09 65.91% Capacity Bidding Program (Day Of) 11.90 16.21 4.31 36.18% Summer Discount Plan (Residential) Groups 3 and 4 99.32 17.80 81.52 82.08% Total 357.72 92.51 265.22
  • 20.
    16 SDG&E On September 14,2012, the peak temperature in SDG&E’’s service territory was 109 degrees. The Demand Bidding, Summer Saver, and Base Interruptible Programs ex post load impacts were above the daily forecast in a range between 8% and 167%. Since the absolute value of the Base Interruptible Program load impact is ~ 1 MW, a small increase or decrease in the daily forecast prediction can result in high variability in the percent difference between these two figures. Conversely, the Capacity Bidding Day Of and Day Ahead Programs and the Critical Peak Pricing Emergency Program daily forecasts were below the daily forecast in a range between 12% and 44%. Table 11: SDG&E’’s September 14, 2012 Demand Response Events C. Detailed Program Analysis The following section discusses programs and events that produced load reductions forecasted by the daily reports, as well as programs that failed to produce the forecasted load reductions. For this purpose, all programs and events that came within 10% (+/ ) of the forecasted load reductions are considered to be consistent with the daily forecast and all programs and events that were more or less than 50% of the forecasted load reductions are considered to have failed to produce the forecasted load reductions. SCE There were a total of 104 separate events in the SCE service area in 2012. Only ten of these events produced the load reductions consistent with those forecasted in the daily reports. As shown in Table 12 below, all of these events produced fairly sizable load reductions, ranging from 59 to 130 MW, with the exception of one Capacity Bidding Program event, which produced a very small load reduction. Program Name Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C=B A D=C/A Capacity Bidding Program (Day Of) 9.00 5.70 3.30 36.67% Capacity Bidding Program (Day Ahead) 12.10 10.60 1.50 12.40% Demand Bidding Program 5.00 5.40 0.40 8.00% Summer Saver (AC Cycling) 15.50 22.50 7.00 45.16% Base Interruptible Program 0.30 0.80 0.50 166.70% Critical Peak Pricing Emergency 1.60 0.90 0.70 43.75% Total 43.50 45.90 2.40
  • 21.
    17 Table 12: SCE’’sDR Events with Ex Post Results within 10% of the Daily Forecast Program Name Event Date Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C=B A D=C/A Summer Discount Plan (Residential) 08/14/12 130.40 119.40 11.00 8.44% Summer Discount Plan (Residential) 08/29/12 82.56 80.30 2.26 2.74% Summer Discount Plan (Residential) 08/01/12 58.60 57.10 1.50 2.56% Summer Discount Plan (Residential) 08/15/12 77.77 77.50 0.27 0.35% Demand Bidding Program 10/17/12 79.05 79.25 0.20 0.26% Demand Bidding Program 10/01/12 78.75 79.78 1.03 1.31% Summer Discount Plan (Residential) 08/09/12 118.06 121.20 3.14 2.66% Summer Discount Plan (Residential) 08/28/12 83.86 88.20 4.34 5.18% Capacity Bidding Program (Day Ahead) 07/31/12 0.0700 0.0740 0.00 5.71% Demand Bidding Program 08/08/12 85.59 92.95 7.36 8.60% Of the 104 events in 2012, thirty (or about 29%) of the events were more than 50% off of the day ahead forecast. Five of these events produced load reductions that were greater than the forecast, while the remaining 25 were lower than the forecast. The three events with the highest percentage difference below the forecast were very small Day Ahead Capacity Bidding Program events, and thus are not considered the most critical problem. Twenty one of the remaining events were Summer Discount Plan (AC Cycling) events, and these varied markedly off the forecast.
  • 22.
    18 Table 13: SCE’’sDR Events with Ex Post Results greater than + 50% of the Daily Forecast Program Name Event Date Daily Forecast MW Ex Post MW Difference Forecast & Ex Post MW % Difference Forecast & Ex Post A B C=B A D=C/A Capacity Bidding Program (Day Ahead) 10/01/12 0.09 0.20 0.29 315.22% Capacity Bidding Program (Day Ahead) 10/02/12 0.09 0.10 0.20 213.04% Capacity Bidding Program (Day Ahead) 10/05/12 0.09 0.07 0.16 170.65% Save Power Days / Peak Time Rebates 09/07/12 108.66 23.11 131.77 121.27% Summer Discount Plan (Residential) 06/20/12 128.01 0.50 127.51 99.61% Save Power Days / Peak Time Rebates 09/10/12 108.52 1.65 106.87 98.48% Summer Discount Plan (Residential) 09/14/12 135.61 20.70 114.91 84.74% Summer Discount Plan (Residential) 07/10/12 263.67 44.70 218.97 83.05% Summer Discount Plan (Residential) 09/14/12 99.32 17.80 81.52 82.08% Summer Discount Plan (Residential) 06/29/12 178.26 33.30 144.96 81.32% Summer Discount Plan (Residential) 09/20/12 77.39 14.60 62.79 81.14% Summer Discount Plan (Residential) 06/29/12 178.26 35.80 142.46 79.92% Summer Discount Plan (Residential) 07/10/12 263.67 66.60 197.07 74.74% Summer Discount Plan (Residential) 10/02/12 298.91 86.20 212.71 71.16% Summer Discount Plan (Residential) 07/10/12 263.67 76.70 186.97 70.91% Summer Discount Plan (Residential) 09/20/12 65.53 21.10 44.43 67.80% Summer Discount Plan (Residential) 09/20/12 65.73 21.90 43.83 66.68% Summer Discount Plan (Residential) 09/14/12 110.89 37.80 73.09 65.91% Summer Discount Plan (Residential) 08/22/12 115.03 42.40 72.63 63.14% Agriculture Pumping Interruptible 09/26/12 60.56 24.00 36.56 60.36% Summer Discount Plan (Residential) 09/21/12 168.96 69.10 99.86 59.10% Summer Discount Plan (Residential) 09/28/12 55.06 24.50 30.56 55.50% Agriculture Pumping Interruptible 08/14/12 36.00 17.29 18.72 51.99% Summer Discount Plan (Residential) 10/17/12 127.25 62.30 64.95 51.04% Summer Discount Plan (Residential) 10/17/12 146.77 72.30 74.47 50.74% Summer Discount Plan (Residential) 08/17/12 101.30 153.00 51.70 51.04% Capacity Bidding Program (Day Ahead) 10/29/12 0.09 0.15 0.06 59.78% Summer Discount Plan (Residential) 08/17/12 58.00 98.30 40.30 69.48% Capacity Bidding Program (Day Ahead) 10/18/12 0.09 0.17 0.08 85.87% Summer Discount Plan (Residential) 09/10/12 18.98 68.40 49.42 260.42% Summer Discount Plan The Summer Discount Plan event variability ranges from 121% below the forecast (with a load increase rather than a load reduction) to 260% above the forecast. Overall, the AC Cycling program represents the most variance13 of all the SCE DR programs. When all of the variances for individual events are aggregated, the AC Cycling program represents 49% of the total variance. The Pearson Product Moment Correlation between the daily forecast and the ex post load impacts is 0.21, representing a very weak positive correlation. 13 Variance in this context specifically refers to the absolute difference between the daily forecast and the event day ex post load reductions.
  • 23.
    19 The Pearson correlationbetween the average event temperature14 and the event level variance (difference between the daily forecast and the event day ex post load reductions) is 0.37, representing a moderately weak correlation. In everyday language this means that SCE’’s 2012 Summer Discount Plan forecast method cannot be relied upon to effectively predict the actual program load reductions. In addition, there appears to be little relationship between the event day temperature and the difference between the daily forecast and the event day ex post load reductions, potentially ruling out temperature as an explanatory factor for the difference. The Summer Discount Plan was (by far) the most often triggered program in SCE’’s 2012 DR portfolio. There were 23 separate events, including two early test events15 . Most of the 23 events were split into 3 customer segments such that each group of customers was triggered for only a portion (i.e. one hour) of each event (typically lasting three hours). Three events on 9/14, 9/20, and 9/28 deployed 6 customer segmentations. SCE operated the program in this manner to avoid cycling their customers’’ air conditioners for more than one hour at a time16 . The purpose of this strategy is so customers will be minimally impacted by the loss of one hour of AC services, compared to multiple continuous hours, and in theory the utility would still be able to reduce load when needed. As shown in Table 14 below, the implementation of this strategy, however, resulted in a rebound effect from the groups curtailed in event hours 1 & 2 that added load in hours 2 & 3 as AC units ran at above normal capacity to return the participants’’ buildings to the original temperature set points17 . The net effect was to dampen the average hourly load impact for the entire event period, as illustrated in Table 14. It is possible that the daily forecasts were prepared assuming that all customers would be curtailed at the same time over the entire duration of the event. In such a case, the average hourly load reductions would likely have been larger because all customers would be simultaneously curtailed and the rebound effect would be delayed until after the event was over. This issue is further illustrated in Chapter 2, Section IV ““Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)””. Table 14: SCE’’s Hourly Load Impact from a Sept 14 Summer Discount Plan event Event Hour Ending: Event Hours w/ Rebound Post Event Rebound Event Hour Average 16 17 18 19 20 15 39.6 25.1 17.0 16 27.1 27.0 39.6 17 21.3 49.6 37.8 Hour Total 39.6 2.0 22.7 89.2 37.8 6.3 14 SCE Final 2012 Ex Post Ex Ante Load Impacts for SCEs SDP filed in R.07 01 041 on April 2, 2013. 15 The last two events in late October were not included in the ex post analysis. 16 SCE 01 Testimony at 11. 17 SCE Final 2012 Ex Post Ex Ante Load Impacts for SCEs SDP filed in R.07 01 041 on April 2, 2013.
Another potential explanation for the suboptimal performance could be customers exercising the override option in their enrollment contracts with SCE. However, SCE's A.12-12-016 testimony18 indicates that the proportion of customers with an override option is fairly small (about 1% of the customers enrolled in SDP) and that these customers rarely exercise the option. Finally, it is possible that transitioning the Summer Discount Plan from an emergency program to a price-responsive program introduced additional uncertainties that are not adequately captured by the current forecasting methods. Regardless of the explanation for the unexpectedly low load reductions during these events, it is critical that SCE improve the day-ahead forecast for the SDP program as a whole.

Energy Division staff reviewed SCE's method for forecasting the Summer Discount Plan program.19 The methodology, provided in Appendix C, is described in a 1986 internal SCE memorandum and consists of a simple algorithm that estimates the load reduction per ton of AC based on the forecasted temperature. The equation coefficients were determined by a 1985 load reduction study that SCE staff could not locate when requested to do so by Energy Division staff. Without the 1985 load reduction study, Energy Division staff could not fully evaluate the forecasting methodology. SCE did provide a revised algorithm that modifies the equation structure, but the underlying methods for estimating its coefficients remain unexplained.

This evidence suggests that there is a critical flaw in the way the Summer Discount Plan events are forecasted, in the operation of the program, or both. The lack of a reliable day-ahead forecasting method is a major weakness that undermines the ability to fully consider AC Cycling in CAISO grid operations. Whether or not the utilities' DR resources are eventually bid into the CAISO market (they currently are not), ED recommends that SCE immediately document the forecasting methods to be used for the 2013 season and thoroughly vet them with CPUC and CAISO staff and relevant stakeholders to ensure the proposed methods are reasonable and reliable. Throughout the 2013 summer season (and longer if necessary), SCE should consider piloting more than one forecasting method, testing each with small ex post load impact evaluations to identify the most reliable approach.

Base Interruptible Program

The Base Interruptible Program was triggered only once during the entire 2012 season, and that was a test event. This single event produced 573 MW of load reductions on September 26, 59 MW more than the day-ahead forecast. It is worth noting that this single Base Interruptible event produced more than three times the load reduction of any other SCE program event during 2012, and it was not triggered on one of the critical event days discussed earlier in this section. The Commission should explore a policy requiring more frequent deployments of this program, since it appears to have significant, yet underutilized, potential.

18 SCE-01 Testimony at 11, Lines 3-5.
19 See Appendix S.
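For illustration, the type of calculation described above for the Summer Discount Plan forecast (load reduction per ton of cycled AC as a function of forecasted temperature) can be sketched as follows. The coefficients, tonnage, and function name are placeholders only; they are not the values from SCE's 1986 memorandum or its revised algorithm, which staff could not verify.

# Illustrative structure only: a simple temperature-driven estimate of AC
# cycling load reduction of the general form described above (kW of load
# reduction per ton of cycled AC as a linear function of forecasted
# temperature). The coefficients below are placeholders, not SCE's values.
def forecast_sdp_reduction_mw(forecast_temp_f: float,
                              enrolled_tons: float,
                              a: float = -1.5,           # placeholder intercept (kW per ton)
                              b: float = 0.02) -> float:  # placeholder slope (kW per ton per deg F)
    """Return an estimated program load reduction in MW."""
    kw_per_ton = max(a + b * forecast_temp_f, 0.0)   # kW of reduction per ton of AC
    return kw_per_ton * enrolled_tons / 1000.0       # convert total kW to MW


# Hypothetical example: 400,000 tons of cycled AC on a 95 degree F day.
print(f"{forecast_sdp_reduction_mw(95.0, 400_000):.0f} MW")

Documenting the actual coefficients, their derivation, and the conditions under which the equation holds is the kind of vetting staff recommends above.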
Capacity Bidding Program

The Capacity Bidding Program Day Ahead events produced an average load reduction of 0.03 MW across all events. With the exception of three events in October (which were associated with negative load reductions in the ex post analysis), most events produced the relatively small load reductions forecasted by the daily report. None of the Capacity Bidding Program Day Ahead events occurred in August or September, when the load reductions are typically most needed. By comparison, all of SCE's Capacity Bidding Program Day Of events exceeded the forecasted load reductions, by an average of 32%. The average load reduction for the Capacity Bidding Program Day Of events was 15.9 MW, over 500 times the load reductions produced by the Day Ahead events. This evidence suggests that, unlike the Day Of program, the Day Ahead Capacity Bidding Program may not be serving a useful function in SCE's DR portfolio.

Demand Bidding Program

The Demand Bidding contracts were called on eight occasions during the summer of 2012. Of these eight events, five occurred in August. The first two August events, on August 8 and August 10, resulted in load reductions that exceeded the daily forecast by an average of 10%. The third and fourth events, on August 14 and August 16, were 34% short of the forecasted load reductions, and the fifth event, on August 29, was 40% below forecast, suggesting that a decline in customer participation in events could be explored as a potential factor in the diminishing returns.

Demand Response Contracts (DRC) – Nominated

Somewhat surprisingly, there were only two events for which Demand Response Contracts were called. The ex post load reductions for these two events were both around 35% below the daily forecast. Energy Division was not able to examine why this program performed so poorly. As noted earlier, SCE's DRCs expired on December 31, 2012, and have since been replaced by new contracts approved by the Commission.

Save Power Days / Peak Time Rebates (PTR) – Price Responsive

SCE did not provide daily forecasts for the four PTR events that occurred in August; thus, comparisons between the daily forecast and ex post results are possible only for the two events on September 7 and September 10. Both of the September events were forecasted to reduce loads by about 109 MW. Ex post results, however, indicate that the PTR events had no impact at all. In fact, the September 7 event was correlated with a fairly significant load increase of 23.11 MW.

Ex post load reductions were estimated for the four August PTR events, for which day-ahead estimates were not provided by SCE. As a proxy for the daily forecast, the 7-day reports were used. As shown in Table 15 below, the estimated load reductions were between 107 and 108 MW, while the ex post load reductions ranged between 0.02 and 96 MW.
Table 15: SCE's Peak Time Rebate (MW)

Event Day | 7-Day Report | Ex Post
8/10/2012 | 107.24 MW | 95.85 MW
8/16/2012 | 107.61 MW | 24.43 MW
8/29/2012 | 108.51 MW | 21.93 MW
8/31/2012 | 108.73 MW | 0.02 MW

Given the considerable variability in ex post results for the PTR program events, the day-ahead forecasting and event reporting will need significant revision to account for these discrepancies. If the PTR program is going to continue, staff recommends that SCE prepare a proposal for a viable forecast and submit it for staff review.

SDG&E

A total of 46 DR program events were triggered on 14 event days in SDG&E's service area from June through October 2012. Daily forecasts for twelve DR program events were within ±10% of ex post load impacts. As depicted in Table 16, moderate load reductions ranging from 5 to 17 MW were produced when these events were triggered. Three programs delivered accurate results with a moderate degree of consistency: the Demand Bidding Program, Critical Peak Pricing, and the Capacity Bidding Program Day Of.

Table 16: SDG&E's DR Events with Ex Post Results within ±10% of the Daily Forecast

Program Name | Event Date | Daily Forecast MW | Ex Post MW | Difference Forecast & Ex Post MW | % Difference Between Forecast & Ex Post
Demand Bidding Program | 10/2/2012 | 5 | 4.6 | -0.4 | -8.00%
Capacity Bidding Program (Day Of) | 8/8/2012 | 11.7 | 11 | -0.7 | -5.98%
Capacity Bidding Program (Day Ahead) | 8/9/2012 | 7.5 | 7.5 | 0 | 0.00%
Capacity Bidding Program (Day Ahead) | 8/14/2012 | 7.5 | 7.5 | 0 | 0.00%
Capacity Bidding Program (Day Ahead) | 8/10/2012 | 7.5 | 7.6 | 0.1 | 1.33%
Demand Bidding Program | 8/14/2012 | 5 | 5.1 | 0.1 | 2.00%
Summer Saver (AC Cycling) | 9/15/2012 | 8.6 | 8.8 | 0.2 | 2.33%
Critical Peak Pricing | 10/2/2012 | 16 | 16.5 | 0.5 | 3.13%
Critical Peak Pricing | 8/21/2012 | 16.5 | 17.2 | 0.7 | 4.24%
Critical Peak Pricing | 9/15/2012 | 13.7 | 14.5 | 0.8 | 5.84%
Demand Bidding Program | 9/14/2012 | 5 | 5.4 | 0.4 | 8.00%
Critical Peak Pricing | 8/30/2012 | 16.2 | 17.8 | 1.6 | 9.88%
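Tables 16 and 17 group events by how far the ex post result deviates from the daily forecast, using the percent difference D = (B - A)/A. The short sketch below illustrates that banding; the two example rows are taken from Table 16 above and Table 17 below, and the function name is illustrative.

# Sketch of the accuracy banding used in Tables 16 and 17: percent
# difference D = (ex post - forecast) / forecast, binned by band.
def classify(forecast_mw: float, ex_post_mw: float) -> str:
    pct_diff = (ex_post_mw - forecast_mw) / forecast_mw
    if abs(pct_diff) <= 0.10:
        return f"within +/-10% ({pct_diff:+.2%})"
    if abs(pct_diff) > 0.50:
        return f"greater than +/-50% ({pct_diff:+.2%})"
    return f"between +/-10% and +/-50% ({pct_diff:+.2%})"


print(classify(16.5, 17.2))  # Critical Peak Pricing, 8/21/2012 (Table 16)
print(classify(25.0, 3.0))   # Residential Peak Time Rebate, 8/21/2012 (Table 17)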
A total of 19 DR program events had ex post load impacts that deviated from the daily forecasts by more than ±50%, as depicted in Table 17. In particular, the residential and commercial Peak Time Rebate program ex post load impacts deviated from the daily forecasts by more than 70%. According to SDG&E, the commercial Peak Time Rebate ex post load impacts were deemed not statistically significant. On this basis, SDG&E reported zero load impacts for this program.

Table 17: SDG&E's DR Events with Ex Post Results Greater than ±50% of the Daily Forecast

Program Name | Event Date | Daily Forecast MW (A) | Ex Post MW (B) | Difference Forecast & Ex Post MW (C = B - A) | % Difference Between Forecast & Ex Post (D = C/A)
Commercial Peak Time Rebate | 8/9/2012 | 1.2 | 0 | -1.2 | -100.00%
Commercial Peak Time Rebate | 8/10/2012 | 1.1 | 0 | -1.1 | -100.00%
Commercial Peak Time Rebate | 8/11/2012 | 0.8 | 0 | -0.8 | -100.00%
Commercial Peak Time Rebate | 8/14/2012 | 1.2 | 0 | -1.2 | -100.00%
Commercial Peak Time Rebate | 8/21/2012 | 1.2 | 0 | -1.2 | -100.00%
Commercial Peak Time Rebate | 9/15/2012 | 0.9 | 0 | -0.9 | -100.00%
Residential Peak Time Rebate | 8/14/2012 | 12.5 | 1.1 | -11.4 | -91.20%
Residential Peak Time Rebate | 8/21/2012 | 25 | 3 | -22 | -88.00%
Residential Peak Time Rebate | 8/11/2012 | 12.2 | 1.7 | -10.5 | -86.07%
Residential Peak Time Rebate | 8/9/2012 | 13.1 | 3.3 | -9.8 | -74.81%
Residential Peak Time Rebate | 8/10/2012 | 12.6 | 3.2 | -9.4 | -74.60%
Residential Peak Time Rebate | 9/15/2012 | 32.3 | 8.3 | -24 | -74.30%
Residential Peak Time Rebate | 7/20/2012 | 23.9 | 6.3 | -17.6 | -73.64%
Capacity Bidding Program (Day Ahead) | 10/1/2012 | 9 | 4.1 | -4.9 | -54.44%
Capacity Bidding Program (Day Ahead) | 10/2/2012 | 9 | 4.2 | -4.8 | -53.33%
Summer Saver (AC Cycling) | 9/14/2012 | 15.5 | 22.5 | 7 | 45.16%
Critical Peak Pricing | 8/11/2012 | 11.7 | 18.4 | 6.7 | 57.26%
Critical Peak Pricing | 8/14/2012 | 14.3 | 25.9 | 11.6 | 81.12%
Base Interruptible Program | 9/14/2012 | 0.3 | 0.8 | 0.5 | 166.67%

Capacity Bidding Program Day Ahead (CBP DA)

The percent difference between the CBP DA daily forecasts and ex post results ranged from -32% to 12% (Table 3). Based upon this assessment, the daily forecasts for CBP DA were not accurate or consistent predictors of ex post results. Since the CBP DA daily forecast model does not have a variable that accounts for weather, and the ex post models do, this methodological difference could account for the variability between the two load impact measures. Another factor that could affect this difference is the percent load impact per customer. Although customers submit load impact bids prior to each DR event, the actual load reduction on the event day may not coincide with the projected load reduction. If weather affects event-day load reduction by CBP customers, the addition of a weather variable to the daily forecast model could increase its accuracy. To address uncertainty in the percent load reduction per CBP customer, DR test events could be scheduled to measure this value on event-like days.

Capacity Bidding Program Day Of (CBP DO)

Similar to the CBP DA program, the CBP DO daily forecasts were neither accurate nor consistent predictors of ex post results, based upon the range of the difference between the two load impact measures, -27.4% to 6.0% (Table 2).
As stated above, inclusion of a weather variable in the daily forecast model and measurement of the percent load reduction per customer during test events could increase the accuracy and consistency with which the daily forecast model predicts ex post load impacts.

Demand Bidding Program (DBP)

The percent difference between the DBP daily forecasts and ex post load impacts ranged from -8.0% to 8.0% (Table 3) for the three DBP events called during the summer. Based upon this result, the DBP daily forecast accurately and consistently predicted ex post load impacts. One caveat in making a general assessment of the DBP forecast model is that only one customer provided load reduction bids for the DR summer events. To make such an assessment, staff would advise examining forecast and load impact data from at least 5 to 10 event days.

Commercial Peak Time Rebate

SDG&E reported zero ex post load impacts for this program in its March 4 filing. According to SDG&E, zero values do not imply that no load reduction occurred, but rather that the load impacts were not statistically significant.20 Therefore, a comparison of daily forecasts and ex post load impacts could not be performed. Based upon conversations with SDG&E, the lack of effectiveness of the commercial Peak Time Rebate program could be attributed to a weak price signal and inelastic customer demand during event periods. SDG&E would be advised to discontinue the commercial Peak Time Rebate program.

Residential Peak Time Rebate

The percent difference between daily forecast and ex post load impacts ranged from -91.2% to -73.6% (Table 3). This implies that the residential Peak Time Rebate daily forecast is not an accurate predictor of ex post load impacts. However, the residential Peak Time Rebate daily forecast consistently over-predicted the ex post results. Since the ex post methodology modeled load impacts only for customers who signed up to receive e-mail or text alerts, and the daily forecast model does not, it is possible that the accuracy of the daily forecast model could improve if there were parity between the two methodologies. Including only residential Peak Time Rebate opt-in customers in the daily forecast model may resolve the discrepancy. As an alternative, since the daily forecast consistently over-predicted the ex post results, SDG&E might consider derating daily forecasts by a factor of 0.7 to 0.9 when estimating ex post load impacts.

20 SCE-03 at 21.
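The derating adjustment suggested above for the residential PTR daily forecast can be sketched as follows. "Derating by a factor of 0.7 to 0.9" is read here as reducing the forecast by 70 to 90 percent, consistent with the observed over-prediction of roughly 74 to 91 percent; both that reading and the example values are assumptions for illustration, not a recommended setting.

# Sketch of a derating adjustment to the residential PTR daily forecast.
# The derate fraction is interpreted as the share of the forecast to remove;
# the value used here is illustrative only.
def derate_forecast(daily_forecast_mw: float, derate_fraction: float = 0.8) -> float:
    if not 0.0 <= derate_fraction < 1.0:
        raise ValueError("derate fraction should be between 0 and 1")
    return daily_forecast_mw * (1.0 - derate_fraction)


# Hypothetical example: the 9/15/2012 residential PTR daily forecast was 32.3 MW;
# removing 80 percent brings it to about 6.5 MW (the ex post result was 8.3 MW).
print(f"{derate_forecast(32.3, 0.8):.1f} MW")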
Summer Saver (AC Cycling)

The range of the percent difference between daily forecast and ex post load impacts, -64.0% to 38.7%, presented in Table 3 indicates that the daily forecast is not an accurate or consistent predictor of ex post load impacts. It should be noted that both the residential and commercial Summer Saver ex post methodologies (respectively, a randomized experiment and a panel versus customer regression) differed from prior years due to the availability of smart meter data.21 This could account for the difference between daily forecast and ex post results. In addition, both ex post methodologies utilized control and treatment groups, whereas the daily forecast methodologies did not. Based on this assessment, staff would advise examining how the daily forecast and ex post models could be harmonized. Based upon a conversation with SDG&E, a temperature-squared variable is utilized in the daily forecast model. Compared to SCE's current AC cycling daily forecast model, SDG&E's daily forecast model therefore includes an additional measure of accuracy. However, in order to better predict customer behavior on successive event days or over prolonged event hours, SDG&E might consider including an autocorrelation variable in the daily forecast model.

Critical Peak Pricing

The percent difference between the daily forecast and ex post results ranged from 3.1% to 81.1%. This is the only program for which the ex post results consistently outperformed the daily forecast predictions. According to SDG&E, the percent load impacts for the Critical Peak Pricing program in 2012 were lower than in 2011, which led to an underestimation in the daily forecast.22 Critical Peak Pricing has approximately 1,000 customers and, as SDG&E claims, any variation in the percent load reduction per customer could lead to high variation in the aggregate impact estimates. This would also be the case for large-scale residential DR programs, including Peak Time Rebate and Summer Saver (AC Cycling). SDG&E also claims that measurement error might account for differences between load impact category values. However, no explanation is provided to elucidate how the measurement error occurred (e.g., whether measured load reductions obtained from analog meters were inaccurate because Smart Meters were not fully deployed in SDG&E's territory during summer 2012).

Base Interruptible Program

The percent difference between the daily forecast and the ex post load impact for the Base Interruptible Program was 166.7%. Because two large Base Interruptible Program customers dropped out of the program, SDG&E was not able to accurately forecast the load impact from the remaining customers. Further analysis with additional Base Interruptible Program load impact data might shed light on the accuracy of the daily forecasting methods.

21 SDG&E load impact Filing Executive Summary, April 2, 2012 at 31.
22 SGE-03 at 19.
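The daily forecast structure discussed under Summer Saver above, a temperature and temperature-squared term plus a possible autocorrelation term for successive event days, can be illustrated with a small regression sketch. The observations, coefficients, and enrollment figure below are fabricated placeholders; this is not SDG&E's model.

# Illustrative sketch of a per-customer daily forecast regression with
# temperature, temperature squared, and an indicator for an event on the
# previous day (a simple stand-in for an autocorrelation/fatigue term).
# All data below are placeholders.
import numpy as np

temps = np.array([88.0, 92.0, 91.0, 95.0, 90.0, 93.0])   # event-day temperature (deg F)
prev_event = np.array([0, 1, 1, 0, 0, 1], dtype=float)   # 1 if an event was called the prior day
y = np.array([0.09, 0.07, 0.05, 0.12, 0.10, 0.06])       # per-customer reduction (kW), placeholder

t = temps - 90.0  # center temperature to keep the design matrix well conditioned
X = np.column_stack([np.ones_like(t), t, t ** 2, prev_event])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b_t, b_t2, b_prev = coef
print(f"prior-day event coefficient (fatigue proxy): {b_prev:.3f} kW per customer")

# Forecast for a 94 deg F day that follows an event day, scaled to a
# hypothetical 10,000 enrolled customers.
t_new = 94.0 - 90.0
per_customer_kw = b0 + b_t * t_new + b_t2 * t_new ** 2 + b_prev * 1.0
print(f"Program forecast: {per_customer_kw * 10_000 / 1000:.2f} MW")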
Critical Peak Pricing – Emergency

Due to decreasing customer subscription to this tariff, the CPP-E program was discontinued as of December 31, 2012.23

D. Summary of Recommendations

Given the divergence between the daily forecast estimates and the ex post load impact results, staff makes the following recommendations:

• The daily forecasting methods for all programs must be improved.
• The daily forecasting methods should be better documented and should be developed with relevant agencies and stakeholders.
• SCE should test a number of different forecasting methods for the Summer Discount Plan program.
• SCE should change the Summer Discount Plan strategy of calling groups of customers for sequential one-hour cycling events.
• SDG&E should include only opt-in customers in the residential PTR daily forecast model.
• SDG&E should run a test event to improve CPP daily forecast estimates.
• SDG&E should account for customer behavior during successive event days in the Summer Saver daily forecast model.
• SDG&E should include a weather variable in the CBP forecast model.

IV. Comparison of the 2012 Ex Post to the 2012 Resource Adequacy (RA)

A. Summary of the Staff Analysis and Recommendations

Comparing the 2012 ex post results with the 2012 RA forecast is not an accurate method of determining how the DR programs performed. The RA load forecast represents the maximum capacity DR can provide under a set of conditions for resource planning needs. Ex post load impacts reflect the demand reduction obtained during actual events in response to operational planning needs. Resource planning and operational planning differ in their conditions (i.e., event hours, participation, and temperature) and purposes. However, in summer 2012 the Utilities' DR programs were not utilized to their full capacity even under extreme hot weather conditions. This raises questions about the usefulness of the current RA forecast and whether the RA forecast should be changed to reflect conditions based on operational needs, including the utilities' day-to-day resource availability limitations and DR dispatch strategies for optimal customer experience. A working group consisting of the CPUC, CEC, CAISO, and the IOUs should be assembled to address the forecast needs (i.e., resource planning, operational planning) and input assumptions (i.e., growth rate, dropout rate) used for forecasting RA.

23 SDG&E load impact Filing Executive Summary, April 2, at 61.
B. Background

The 2012 RA forecast represents the maximum capacity DR can provide under a set of conditions for resource planning needs. The conditions entail a 1-in-2 weather year,24 portfolio-level impacts, full participation, a five-hour event window (1 p.m. to 6 p.m.), and an enrollment forecast assumption. The 2012 ex post load impacts reflect the demand reductions obtained during actual events in response to operational needs. Operational needs on the event day may not require the full capacity of DR because conditions do not warrant it. Utilities have the discretion to call a few DR programs with shorter event hours or a smaller group of participants based on their generation and DR resource dispatch strategies.25 This means an ex post impact may reflect only a one-hour event window, versus the RA forecast's five-hour event window. Similarly, the ex post impact may reflect only a segment of a program's participants, versus the RA forecast's assumption that the program's entire set of participants is curtailed, and the ex post impact may reflect a lower temperature than the higher, 1-in-2 weather year temperature assumed in the RA forecast.

C. Staff Analysis

Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate method of determining how well a program performs against its forecast. The table below contains the August monthly average load impacts for the 2012 Resource Adequacy (RA) forecast, as filed in the spring of 2011, and the ex post results that occurred in 2012. There are stark differences between what the Utilities forecasted a year ahead (RA) and the actual results (ex post). On average for the month of August, the variability ranges from +485% (over-performance) to -95% (under-performance) for SCE and from +58% to -97% for SDG&E. The main reason for the discrepancy is that the RA data is used to assist in resource planning, which means it is characterized as a five-hour event in which all customers are called for the entire period (1-6 p.m.) throughout the summer. Ex post results, by contrast, reflect the impact of actual DR operations, which can be a one-hour event in which some (not all) customers are called for a short period of time. Other factors that contributed to the discrepancy include temperature, enrollment, and dual participation.

24 Represents the monthly peak day temperature for an average year. Exhibit SGE-03, Page 14.
25 SGE-06, Page 6.
Table 18: SCE Demand Response Load Impact, 2012 Resource Adequacy vs. 2012 Ex Post, August Average (MW)

Program Name | RA Forecast26 (A) | Ex Post27 (B) | Difference RA vs. Ex Post (C = B - A) | % Difference RA vs. Ex Post (D = C/A)
Demand Bidding Program | 12 | 72 | 60 | 485%
Demand Response Contracts | 105 | 182 | 77 | 74%
Base Interruptible Program28 | 548 | 573 | 25 | 5%
Capacity Bidding Program Day Of | 19 | 17 | -2 | -11%
Summer Advantage Incentive/Critical Peak Pricing | 69 | 39 | -30 | -44%
Agricultural Pumping Interruptible | 40 | 17 | -22 | -57%
Summer Discount Plan/AC Cycling Residential | 500 | 212 | -288 | -58%
Save Power Days / Peak Time Rebates | 266 | 36 | -230 | -87%
Capacity Bidding Program Day Ahead29 | 1 | 0 | -1 | -94%
Summer Discount Plan/AC Cycling – Commercial | 62 | 3 | -59 | -95%

Table 19: SDG&E Demand Response Load Impact, 2012 Resource Adequacy vs. 2012 Ex Post, August Average (MW)

Program Name | RA Forecast30 (A) | Ex Post31 (B) | Difference RA vs. Ex Post (C = B - A) | % Difference RA vs. Ex Post (D = C/A)
Critical Peak Pricing Default | 12 | 19 | 7 | 58%
Summer Saver/AC Cycling | 15 | 19 | 4 | 27%
Capacity Bidding Program Day Ahead | 10 | 8 | -2 | -20%
Capacity Bidding Program Day Of | 22 | 10 | -12 | -55%
Base Interruptible Program32 | 11 | 0.84 | -10.16 | -92%
Reduce Your Use / Peak Time Rebates | 69 | 2 | -67 | -97%
Demand Bidding Program | n/a33 | 5 | n/a | n/a
Critical Peak Pricing Emergency | n/a | 1 | n/a | n/a

26 Exhibit SCE-03, Table 1.
27 Exhibit SCE-03, Table 1.
28 Number based on the September average because there were no events for the month of August.
29 Number based on the July average because there were no events for the months of August or September.
30 Exhibit SDG-03, Table 1.
31 Exhibit SDG-03, Table 1.
32 Number based on the September average because there were no events for the month of August.
33 DBP was not approved until the year after the 2012 RA forecast was filed.
Forecasting DR estimates for resource planning needs is different from forecasting for operational needs. Unlike resource planning needs, operational needs on the event day may not require the full capacity of DR, either because conditions do not warrant it or because the Utilities deployed "optimal" dispatch strategies for customer experience. Utilities have the discretion to call shorter event hours or a smaller group of participants if the system is adequately resourced for that day. As discussed in Chapter 3, peaker or other generation resources may have been dispatched instead of DR even though such operation would be contrary to the Loading Order.34

For example, SCE can divide its residential Summer Discount Plan participants into three groups and dispatch each group for one hour of an event, resulting in three consecutive one-hour events (see chart below). Approximately one-third of the customers can be curtailed in any given hour. Rebound from the groups curtailed in event hours 1 and 2 can reduce the net impact in hours 2 and 3, lowering the average hourly impact for the entire event period. As a result, the average impact per hour can be roughly 100 MW for operational needs. The following figures illustrate the rebound effects from SCE's sub-group dispatch strategy for its AC cycling.

Figure 1. Source: SCE April 11, 2013 PowerPoint Presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante Briefing

34 http://www.cpuc.ca.gov/NR/rdonlyres/58ADCD6A-7FE6-4B32-8C70-7C85CB31EBE7/0/2008_EAP_UPDATE.PDF.
However, for the RA forecast, resource planning needs require the full capacity of DR. For example, SCE assumed all residential Summer Discount Plan participants would be curtailed at the same time to represent the full program capabilities during a reliability event (see chart below). The hourly impacts can be larger because all customers are curtailed at once and the rebound effect is delayed until the end of the entire event window. As a result, the average impact per hour for the RA forecast can be roughly 300 MW, roughly three times the ex post hourly average.

Figure 2. Source: SCE April 11, 2013 PowerPoint Presentation on 2012 Residential Summer Discount Program Ex Post vs. Ex Ante Briefing

The opposite extreme can also occur, where the ex post result is higher than the RA forecast. In the case of SCE's Demand Bidding Program, the average ex post result is 72 MW, six times the RA forecast of 12 MW (see Table 18). Dual participation was the major contributor to the discrepancy. For customers who enrolled in two programs, such as the Base Interruptible Program and the Demand Bidding Program, the RA forecast counts the MW in only one program (the Base Interruptible Program) to avoid double counting.35 Had the two programs been called on the same day, the ex post would have shown a much lower amount for the Demand Bidding Program.

35 Portfolio level.
September 14, 2012 was considered a hot day (a 1-in-10 weather year condition36); however, SCE still did not dispatch its entire set of residential Summer Discount Plan participants. Instead, SCE dispatched only a portion of its participants for one hour at a time, resulting in five consecutive one-hour events. On average, SCE received only 6.3 MW37 for the event, a substantial underperformance in comparison to the RA forecast of 519 MW.38 This raises the question: if SCE chose not to dispatch all of its Summer Discount Plan participants in the same event hour during a 1-in-10 weather year condition, under what circumstances will SCE dispatch the Summer Discount Plan to its full program capacity? The usefulness of the RA forecast is in question if the utility does not test a DR program to its full capacity. Should the RA forecast process be amended to include another ex ante forecast that is based on operational needs, including optimal customer experience, and if so, what would that entail?

D. Conclusion and Recommendations

Comparing the 2012 ex post results to the 2012 RA load forecast is not an accurate method of determining DR program performance because the ex post results respond to operational needs, which can be entirely different from resource planning needs. However, in 2012 the DR programs were never tested to the full capacity assumed in the RA forecast. This raises the question of whether the RA forecast should be changed to reflect both planning needs and operational needs. A working group consisting of the CPUC, CEC, CAISO, and the IOUs should be assembled to address the forecast needs (i.e., resource planning, operational planning) and input assumptions (i.e., growth rate, dropout rate) used for forecasting RA. This working group should meet annually in December/January and agree on a set of input assumptions to be used for forecasting DR estimates.

36 Represents the monthly peak temperatures for the highest year out of a 10-year span. Exhibit SGE-03, Page 14.
37 Christensen Associates Energy Consulting, 2012 Load Impact Evaluation of Southern California Edison's Residential Summer Discount Plan (SDP) Program, April 1, 2013, Table 4-3d.
38 Exhibit SCE-03, Table 1, 2012 RA for the month of September.
Chapter 3: Demand Response Program Operations

I. Summary of Staff Analysis and Recommendations

The 2006 to 2011 data show that the Utilities historically triggered their DR programs far below the program limits in terms of the number of events and hours. Even with the SONGS outage, the Utilities did not trigger their DR programs more frequently in summer 2012, as had been anticipated. Almost all of the Utilities' 2012 DR program events and hours fall within the historical averages or below the historical maximum. However, staff was surprised to find that the Utilities dispatched their peaker power plants (peaker plants) three to four times more frequently in 2012 than the historical averages. The peaker plant service hours were closer to the plants' emission allowances than the DR events were to the program limits. Staff observed a trend in which some DR program events decreased from 2006 to 2012 while peaker service hours increased over the same period. This trend raises a concern that the Utilities under-utilized DR programs and over-relied on peaker plants. Under the "Loading Order," DR is a preferred resource and is intended to avoid the building and dispatching of peaker plants. Due to time constraints and the lack of additional information, staff was unable to fully address this question and the reasons behind these trends in this report. Therefore, staff recommends that in future DR program measurement and evaluation studies the Commission evaluate DR program operations and designs in comparison with peaker plant operations to ensure the utilities' compliance with the Loading Order. Specifically, staff recommends that the Commission:

1. Require the Utilities to provide both DR event and peaker plant data, and explanations for the disparity between historical DR event hours and peaker plant service hours, in future DR evaluations and the next DR budget applications. The Utilities should include the DR and peaker plant hourly data and explain why they did not trigger DR programs during any of the hours when the peaker plants were dispatched. This information will inform future DR program designs to improve DR usefulness.

2. Require that DR historical operations be reflected in the input assumptions for the ex ante forecast and the evaluation of program cost-effectiveness.

3. Address the Loading Order policy regarding DR planning and operation and the utilization of peaker plants in the next DR Rulemaking and the Utilities' energy cost recovery proceedings.

II. 2012 DR Program Trigger Criteria and Event Triggers

Appendices H and I summarize the Utilities' 2012 DR program trigger criteria and event triggers. The DR program trigger criteria consist of a list of conditions that are largely self-explanatory and depend on the type of program; e.g., Emergency Program triggers are based on system contingencies, while non-Emergency Program triggers also include high temperature,
heat rate (economic), and resource limitations. The 2012 event triggers were the actual conditions that led to the Utilities' decisions to call DR events. While the DR trigger criteria provide some general idea of how DR programs are triggered, there is a lack of transparent information on the Utilities' DR operations, e.g., when and how the Utilities made decisions to trigger a DR program. It is necessary to evaluate DR performance not only from a load impact perspective but also from the perspective of DR operations, in order to determine DR reliability and usefulness as a resource. Staff analyzed the 2006-2012 DR event data and gained some understanding of how the Utilities utilized DR programs and how useful the programs were.

III. DR Events vs. Peaker Plant Service Hours

How do the numbers compare to the 2012 limits and to history?

As shown in Appendices J and K, SCE has a few DR programs with an unlimited number of events or hours: the Demand Bidding Program, Save Power Days (Peak Time Rebate), and Summer Discount Plan – Commercial (Enhanced). Others have various event/hour limits ranging from 24 hours/month to 180 hours/year or 15 events/year.39 For the DR programs with an event limit, most did not attain the maximum number of events and/or hours, except for SCE's Summer Advantage Incentive (Critical Peak Pricing).40 In summer 2012, SCE triggered 12 events for its Critical Peak Pricing, which is within the range of 9 to 15 events/year. Other DR programs' event hours were well below the limits. For example, SCE's residential Summer Discount Plan (AC cycling) was the second most heavily triggered program, with 23 DR events and 24 event hours in 2012, still far below its 180-hour event limit despite the SONGS outage. The Base Interruptible Program (BIP) had only one test event, for two hours, in 2012. However, SCE's DR program event hours were either within the programs' historical ranges or below the 2006-2011 maximum, except for Agricultural Pumping Interruptible, with 7 hours in 2012 compared to 0 to 2 hours from 2006 to 2011.

What were the reasons for the differences between the 2012 DR event numbers and hours and the event limits?

SCE explained that the reasons for the differences between the 2012 DR event numbers and hours and the event limits vary for each program, as summarized in Appendix L.41 The reasons can be characterized for the three types of DR programs as: 1) trigger conditions, 2) optimal dispatches, and 3) no nomination.

As discussed above, DR program operations are based on the trigger criteria set for each program. For the non-Emergency Programs, SCE indicated that optimizing performance and minimizing customer fatigue are additional factors considered in its decision to trigger a DR program. SCE's optimal dispatch strategy may have resulted in DR events and hours far below the maximum hours and events for the programs.

39 SCE-02, Appendix E, Table 2-A at E-4 and E-5.
40 Id.
41 SCE-02, Appendix E, at E-6 and E-7.
For example, SCE's Summer Discount Plan is available for 180 hours annually. However, customers would probably never expect this program to be triggered for close to 180 hours, based on their experience with the program to date. As shown in Appendices M and N, staff finds a similar trend in SDG&E's DR event data.

IV. Peaker Plant Comparison

Most of SCE's non-Emergency Programs include resource limitation as a program trigger. Therefore, in theory, one would expect SCE to trigger DR programs before dispatching its peaker plants, in accordance with the Loading Order. In light of the SONGS outage, the Commission anticipated more SCE and SDG&E DR events in 2012; yet SCE dispatched peaker plants substantially more than DR programs (compared to their historical averages), as discussed below.

How do the historical DR events compare to the utilities' peaker plants?

SCE provided the permit and service hours for four of its own peaker plants, three of which were located in the SONGS-affected areas, as shown in Appendix O.42 SCE historically dispatched its peaker plants for about 9% to 16% of the permissible service hours annually. As shown in the table below, during the same period SCE triggered its non-Emergency DR programs for 11 to 106 hours on average. In 2012, however, SCE dispatched its peaker plants three to four times more than the historical average. On the other hand, SCE's 2012 DR event hours were below the historical range. SDG&E's peaker plant and DR event data show a trend similar to SCE's. For example, SDG&E's Miramar plant ran 4,805 hours out of its 5,000-hour emission allowance. In contrast, SDG&E's Critical Peak Pricing, the program with the most triggered hours, was dispatched for 49 hours out of its 126-hour annual limit.

Table 20: DR Event Hours vs. Peaker Plant Service Hours

 | 2006-2011 Range | 2012
SCE: Peaker Plants | 96-129 Hours | 405-465 Hours
SCE: Non-Emergency DR | 11-106 Hours | 2-64 Hours
SDG&E: Peaker Plants | 436-1,715 Hours | 974-4,805 Hours
SDG&E: Non-Emergency DR | 19-39 Hours | 14-49 Hours

In addition, staff observed that the Utilities' highest DR event hours occurred in 2006 and 2007, during the summer heat storms, while the highest peaker plant hours occurred in 2012. This data suggests that the Utilities under-utilized DR programs and over-relied on their peaker plants, which is inconsistent with the Loading Order.

42 SCE-01, Appendix C, Tables 9 and 10 at Page 17.
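The comparison above reduces to simple utilization ratios. The short sketch below repeats the arithmetic for the two SDG&E figures cited in this section; the function name is illustrative.

# Quick check of the utilization comparison discussed above: SDG&E's Miramar
# peaker ran 4,805 of its 5,000 allowed hours in 2012, while the most-used
# non-Emergency DR program (Critical Peak Pricing) was dispatched for 49 of
# its 126 allowed hours.
def utilization(used_hours: float, allowed_hours: float) -> float:
    return used_hours / allowed_hours


print(f"Miramar peaker plant: {utilization(4805, 5000):.0%} of emission allowance")
print(f"SDG&E Critical Peak Pricing: {utilization(49, 126):.0%} of annual event-hour limit")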
In its comments on the 2013-2014 DR Proposed Decision, SCE disagreed with the suggestion of "under-utilization" of DR programs based on the 2012 DR events. SCE argued that "(s)imply because SCE did not dispatch all of the programs' available hours does not mean the programs should have been dispatched more... Optimal utilization (of DR) ensures the necessary amount of load drop to enable a reliable grid..."43 SCE should explain why it dispatched its peaker plants substantially more last summer instead of DR, and whether SCE's optimal dispatch of DR, the trigger criteria, or the program designs resulted in SCE's increased reliance on peaker plants. Due to the time constraint and the absence of the Utilities' explanations, staff is unable to comprehensively address this issue in this report. The Utilities' data warrant further evaluation to ensure the usefulness of the DR resource as a replacement for peaker plants and compliance with the Loading Order.

V. Conclusions

Consistent with D.13-04-017, staff finds that most of SCE's DR programs did not attain the maximum number of events and/or hours, except for SCE's Critical Peak Pricing. The Utilities' total numbers of DR events and hours in 2012 were within the historical averages but far from the program limits. In contrast, staff found that SCE-owned and SCE-contracted peaker plants were dispatched far more in 2012 than the historical averages. Some peakers came much closer to their emission allowances than the DR hours came to their operating limits. Staff reaches a similar conclusion for SDG&E's DR programs in comparison with its peaker plants.

If the Utilities have historically never triggered their DR programs close to the available hours, there is a concern about how realistic these limits are. There is a reliability risk if the Utilities are relying on a DR resource that has never been used to its full capacity. In addition, DR cost-effectiveness should reflect the historical operations. Staff recommends that the Commission address these issues in future DR evaluation and budget approval proceedings.

43 SCE Opening Comment filed on April 4, at 4-5.
Chapter 4: Residential Demand Response Programs

I. Summary of Staff Analysis and Recommendations

Analysis of residential programs included Peak Time Rebate (PTR) and AC Cycling. Overall, customers seem satisfied with the programs, based on utility reports and surveys. However, staff identified problems with program design and operation that need to be addressed to improve the reliability and effectiveness of the programs.

For PTR, staff found that customers who received utility notification of events have higher awareness of the program than customers who were not notified by the utility or who received indirect notification, such as mass media alerts. More importantly, data for both utilities show that customers who opted into receiving alerts were the only group that significantly reduced load. For both utilities, customers defaulted through MyAccount into receiving alerts did not reduce load significantly. However, the entire eligible customer class qualifies for bill credits, which resulted in a problem of "free ridership." Both utilities should modify PTR from a default to an opt-in program, where only customers opting to receive event alerts would qualify for bill credits.

For SCE's residential AC Cycling, staff found that the current group dispatch strategy is resulting in a rebound effect. The rebound effect reduces the actual load reduction the program is capable of producing. Staff recommends that SCE (1) align the maximum program event duration with customer preference for shorter events to improve forecasting, and (2) reconsider its incentive structure to favor participation in longer event durations. Finally, both utilities should take advantage of AMI infrastructure and related enabling technology that could improve program delivery, reliability, and customer experience.

II. Residential Peak Time Rebate (PTR)

A. Overall Customer Experience

For both utilities, customers were generally satisfied with the program. SCE's customers seem satisfied with the level of incentives and with the time between notification and event; however, customers would like more information regarding the program and bill credits. SDG&E's customers reported overall satisfaction with the program but, similar to SCE's customers, would benefit from more information and outreach.

The level of awareness for both utilities seems higher among customers who chose to sign up to receive notifications. This is reflected in the overall load reduction verified by ex post data: only customers who signed up for event notification significantly reduced load. For PTR, neither utility noticed evidence of customer fatigue, but this does not mean it did not occur, only that it was not noticeable.
    37 B. SCE’’s PeakTime Rebate/Save Power Day 1) Summary Customers who received utility notification of events have higher awareness of the program when compared to customers who were not notified by the utility. More importantly, customers who opted into receiving alerts were the only group that significantly reduced load. Customers defaulted on MyAccount to receive alerts and the remaining customers not directly notified by the utility did not reduce load significantly. SCE considered only customers who received alerts in their forecast and ex post verification. However, the entire eligible customer class qualifies for bill credits. Awareness of the program, reflected by the willingness to sign up for receiving alerts, seems to indicate more willingness to reduce load. This factor should be considered in program design. Staff identified an issue with ‘‘free ridership’’, where customers are paid even though they didn’’t significantly reduce any load. Staff recommends changing PTR from a default program to an opt in program, paying bill credits only to customers who opt in to participate. 2) Background D.09 08 028 approved Save Power Day, SCE’’s Peak Time Rebate (PTR) rate. The decision approved bill credits of 0.75c/kWh reduced with an additional 0.50c/kWh for customers with enabling technology. This is a default program for residential customers with a smart meter and has been available since 2012. The program provides incentives to eligible Bundled Service Customers, who reduce a measurable amount of energy consumption below their Customer Specific Reference Level (CSRL) during PTR Events.44,45 The utility may call events throughout the year on any day, excluding weekends and holidays. Events will take place between 2pm and 6pm on days an event is called. Participants receive a day ahead notification of the event. Bill credits will be paid in each billing cycle based on the sum of events called and usage reduction during the period.46 Bill credits will be recovered from the respective customer class through the Energy Resource Recovery Account (ERRA). During 2012, SCE started defaulting customers on MyAccount to receive email notifications, with the remaining customers not directly notified by the utility. Alternatively, customers may choose to opt in to receive alerts. As of November 30th, approximately 4 million customers are on PTR and 824,000 were signed up to receive notifications (via MyAccount).47 According to SCE, 44 SCE Schedule D –– Domestic Service, sheet 3 45 CSLR: peak average usage level”” is the customer’’s average kWh usage during the 2:00 p.m. to 6:00 p.m. time period of the three (3) highest kWh usage days of the five (5) non event, non holiday weekdays immediately preceding the PTR Event. The CSRL is used to determine the customers kWh reduction for each PTR Event in order to calculate the rebate. 46 SCE Schedule D –– Domestic Service, D.09 08 028 Att. C at 7. 47 SCE 01 Testimony at 27, lines 11, 18 19.
During 2012, SCE began defaulting customers on MyAccount into receiving email notifications, with the remaining customers not directly notified by the utility. Alternatively, customers may choose to opt in to receive alerts. As of November 30, approximately 4 million customers are on PTR and 824,000 were signed up to receive notifications (via MyAccount).47 According to SCE, approximately 60,000 customers have opted in to receive alerts in 2012 during the summer months.48

3) Lessons Learned

In support of its 2013-2014 Application, SCE provided data to highlight lessons learned from the 2012 program year.

Customer awareness

Awareness of the program is higher among the group of customers whom the utility notified of events: 66% of notified respondents were familiar with the program, compared to only 43% in the group not notified.49 When prompted for awareness of events, the same pattern is noticeable: 72% of respondents in the notified group who were aware of the program claimed awareness of specific events, compared to 40% in the group not receiving notifications. When counting both customers who were already aware and those prompted with information about the program, 55% of the notified group was aware, but only 23% of the non-notified respondents were aware.50

Customer satisfaction

There was no information in SCE's data regarding customer perception of the fairness of savings/incentive levels. However, customers seem to link participation with an expectation of savings, as 80% of respondents identified earning bill credits as important for participation.51 Moreover, participants seem willing to participate even in the face of low savings.52

Event notification

The majority of respondents aware of the program found out about events via utility notification (over 60% for the opt-in group). Close to 23% of respondents in the overall population found out about events in the news.53 According to the results of the customer surveys, about 90% of customers notified of the event, and about 56% of customers not notified but aware of the event, were happy with the amount of time between notification and the event.54 It appears that a day-ahead strategy could be adequate; however, customers were not asked about their preference for a day-of reminder, so it is not clear from the lessons learned whether this could increase awareness and response. SCE requested to add a day-of notification in its 2013-2014 Program Augmentation Application, which the Commission denied due to lack of evidence of need.55

47 SCE-01 Testimony at 27, lines 11, 18-19.
48 Email communication with SCE (4/5/2013).
49 SCE-02, Appendix A at 3. It is important to note that the surveys only represented results for two groups: customers notified by the utility and customers who were not notified. Customers defaulted into notifications and customers who opted in to receiving notifications from the utility were bundled together under notified customers.
50 SCE-02, Appendix A at 4.
51 SCE-02, Appendix B at 24.
52 SCE-02, Appendix B at 36.
53 SCE-02, Appendix A at 5.
54 SCE-02, Appendix A – Save Power Day Incentive/Peak Time Rebate Post-Event Customer Survey at 15.
55 D.13-04-017, at 28.
Customer preference

Another survey showed that customers would benefit from more information about the program, most specifically in terms of expectations of savings. The majority of customers would prefer to be notified by email, and they believe that a reminder at the beginning of the summer would help them be more ready to participate.56

Program utilization

PTR has no limit on the number of events called, with a maximum of 4 hours per event. SCE called 7 events, for 28 total event hours, in 2012 and did not observe evidence of customer fatigue. The trigger criterion was temperature for all events.57 Although SCE explains the need to balance usefulness with preservation of the resource,58 the program appears underutilized in 2012. Still, this was the first year of the program.

Other findings

SCE states that third-party providers such as telecommunication companies, cable companies, security providers, retailers, and manufacturers of thermostats or providers of home automation services are potential partners to reach untapped load reduction potential in the residential sector.59 As part of its 2013-2014 Program Augmentation Application, SCE has proposed a pilot to explore this market, and the Commission has approved funding for this pilot.60

4) Analysis of settlement and ex post data

Ex post load impact

SCE only calculated ex post data for customers notified of events; it did not verify ex post load impact for customers not notified by the utility. This indicates that SCE did not expect this group to reduce load significantly. SCE's 2012 Load Impact Report found that customers who opted into event notifications reduced a statistically significant average of 0.07 kWh per hour.61 The same report found that customers defaulted into receiving notifications did not produce a statistically significant load impact.62

Incomplete data does not allow staff to verify with certainty the differences in load reduction between all participant groups (opt-in, defaulted into notification, and the remainder of the population). However, staff reviewed all the data SCE provided to look for evidence of what is most likely happening.

56 SCE-02, Appendix B – Save Power Days Research Study Results at 39.
57 SCE-03, March 4, 2013, Appendix B, Table 4; SCE-01, Appendix C at 14.
58 SCE-01, Appendix C at 14.
59 Email communication with SCE, April 10, 2013.
60 D.13-04-017, OP 19.
61 "2012 Load Impact Evaluation of Southern California Edison's Peak Time Rebate Program," Christensen Associates Energy Consulting (4/1/2013) at 1. This figure is slightly lower than the 0.097 kW reported in SCE-03, March 4, 2013, at 22.
62 Id. at 24.
It is interesting to note that for the first four events, defaulted customers did reduce load, although not significantly, while for the last three events their load in fact increased. In contrast, the opt-in group reduced load, to varying degrees, for all events. The ex post results varied considerably between events, even though the temperature was fairly constant and not extreme. It would be worthwhile to investigate the cause of this variability and how understanding it could help improve ex post results and the reliability of the program. A more detailed analysis of impacts can be found in the sections above.

Table 21: 2012 Ex Post Load Impact by Group, MW (Average Event Hour)

Event Date | Customers who opted into alerts (a) | Customers defaulted into email alerts, excluding opt-in alerts (b) | Customers not notified directly of events (c) | Temperature (d)
7/12/12 | N/A | N/A | N/A | 80
8/10/12 | 39.60 | 56.25 | N/A | 89
8/16/12 | 11.17 | 13.25 | N/A | 89
8/29/12 | 21.22 | 0.71 | N/A | 92
8/31/12 | 6.37 | -6.35 | N/A | 86
9/7/12 | 0.17 | -23.28 | N/A | 84
9/10/12 | 6.04 | -4.39 | N/A | 89

Source: Email communication with SCE (3/25/2013); SCE-01, Appendix C, Table 1.

Settlement data analysis

In 2012, SCE paid a total of $27,349,008 in incentives to PTR residential customers.63 SCE provided full settlement data, which shows evidence of a potentially large "free ridership" problem, where customers receive incentives without significantly reducing load.

63 Email communication with SCE (4/5/2013).
Table 22: Settlement Load Reductions, MW (Average Event Hour)

Event Date | Customers who opted into alerts (a) | Customers defaulted into email alerts, excluding opt-in alerts (b) | Customers not notified directly of events (c) | Event Settlement (d)
7/12/12 | 85.9 | 140.08 | 1,613 | 1,839
8/10/12 | 55.94 | 134.68 | 827 | 1,018
8/16/12 | 87.99 | 233.35 | 1,499 | 1,821
8/29/12 | 37.19 | 84.36 | 579 | 700
8/31/12 | 52.95 | 132.79 | 981 | 1,166
9/7/12 | 60.85 | 165.71 | 1,105 | 1,332
9/10/12 | 61.9 | 139.65 | 1,049 | 1,250
Average (MW) | 63.2 | 147.2 | 1,093 | 1,304
% | 4.9% | 11.3% | 83.9% | 100.0%
Average Participants | 60,190 | 160,430 | 1,265,544 | 1,486,165
% | 4% | 11% | 85% | 100%

Source: Email communication with SCE (4/5/2013)

According to settlement data, 84% of bill credits were paid to customers whose load impact was not considered for forecast or ex post purposes. In addition, 11% of incentives were paid to customers who were defaulted into receiving notifications and did not produce a statistically significant load impact.64 This means that, in fact, 95% of all incentives were paid to customers who either were not expected to reduce load significantly or did not do so.

64 "2012 Load Impact Evaluation of Southern California Edison's Peak Time Rebate Program," Christensen Associates Energy Consulting (4/1/2013) at 24.
Table 23: 2012 PTR Incentives Paid

Event Date | Customers who opted into alerts (a) | Customers defaulted into email alerts, excluding opt-in alerts (b) | Customers not notified by SCE (c) | Total
7/12/12 | $254,572 | $419,794 | $4,836,197 | $5,510,563
8/10/12 | $166,245 | $403,752 | $2,480,819 | $3,050,816
8/16/12 | $261,825 | $699,568 | $4,495,547 | $5,456,940
8/29/12 | $110,681 | $252,931 | $1,734,182 | $2,097,794
8/31/12 | $157,557 | $398,093 | $2,939,474 | $3,495,124
9/7/12 | $181,406 | $496,648 | $3,312,785 | $3,990,840
9/10/12 | $184,349 | $418,665 | $3,143,816 | $3,746,830
Total | $1,316,635 | $3,089,451 | $22,942,822 | $27,348,908
% | 5% | 11% | 84% | 100%

Source: Email communication with SCE (4/5/2013)

As there is no ex post data for customers not directly notified by the utility (those who neither opted in nor were defaulted into notification), it is not possible to verify their actual impact and whether it would be significant. However, given that not even defaulted customers reduced load significantly, and given the findings from SDG&E (see next section), it is fair to assume that results for that group would not be significant.

Incentives and capacity cost

There is a noticeable difference in the cost of capacity between the group that opted in to receive notifications and the group defaulted into receiving notifications. In this report, staff normally uses the average event-hour reductions. But because there is such variability in SCE's ex post results, staff will use the average hourly impact across all events as a simple way of showing that the average capacity produced by the defaulted group is nearly six times more expensive than the average capacity produced by the opt-in group.
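The cost-of-capacity figures in Table 24 below follow from dividing the total incentives paid to a group over the six events with ex post data by the group's average hourly impact; the units appear to be dollars per kW of average impact. The sketch below repeats that arithmetic; small rounding differences from the table's values are expected.

# Quick check of the Table 24 arithmetic: total incentives paid to a group,
# divided by the group's average hourly ex post impact, expressed per kW.
def cost_of_capacity_per_kw(total_incentives_usd: float, avg_impact_mw: float) -> float:
    return total_incentives_usd / (avg_impact_mw * 1000.0)


opt_in_mw = [39.60, 11.17, 21.22, 6.37, 0.17, 6.04]
defaulted_mw = [56.25, 13.25, 0.71, -6.35, -23.28, -4.39]

print(f"Opt-in group:    ${cost_of_capacity_per_kw(1_062_064, sum(opt_in_mw) / 6):.2f} per kW")
print(f"Defaulted group: ${cost_of_capacity_per_kw(2_669_657, sum(defaulted_mw) / 6):.2f} per kW")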
Table 24: 2012 PTR Cost of Capacity

Event Date | Customers who opted into alerts (MW) (a) | Incentives paid to the group per event (b) | Customers defaulted into email alerts, excluding opt-in alerts (MW) (c) | Incentives paid to the group per event (d)
7/12/12* | N/A | $254,572 | N/A | $419,794
8/10/12 | 39.60 | $166,245 | 56.25 | $403,752
8/16/12 | 11.17 | $261,825 | 13.25 | $699,568
8/29/12 | 21.22 | $110,681 | 0.71 | $252,931
8/31/12 | 6.37 | $157,557 | -6.35 | $398,093
9/7/12 | 0.17 | $181,406 | -23.28 | $496,648
9/10/12 | 6.04 | $184,349 | -4.39 | $418,665
Average | 14.10 | | 6.03 |
Total | | $1,062,064 | | $2,669,657
Cost of Capacity | | $75.34 | | $442.62

Source: Email communication with SCE (4/5/2013). *Staff did not include the 7/12/12 event in the calculation, as there is no ex post data for this event.

5) Findings

Based on analysis of program design, settlement and ex post load impact data, and customer participation data for the summer of 2012, staff has found the following:

• The program, as approved in the decision, pays the same amount of incentives to all customers enrolled in the program. There is an additional incentive for customers who have enabling technology.
• There are differences in performance, awareness, and willingness to reduce load between customers who were notified directly by the utility and customers who were not.
• Customers are overall satisfied with the notification mode, timing, and level of incentives.
• There is not enough information to determine whether customer fatigue is an issue.
• Ex post analysis shows that customers who opted into alerts significantly reduced their load in comparison to customers only defaulted into alerts. This indicates that customer willingness to participate (indicated by the action of signing up for alerts) may help improve load reduction.
• Incomplete ex post load impact results show load reductions for customers notified by the utility – both those who signed up for and those defaulted into receiving alerts. No results were available for the entire population. It is not possible to verify whether incentives paid to non-notified customers resulted in significant load reduction, but the fact that SCE does not include this group in its forecast and ex post results indicates that their load impact is not significant.
• There is a potential "free ridership" issue in SCE's PTR.

C. SDG&E's Peak Time Rebate/Reduce Your Use

1) Summary

Overall, customers are satisfied with the program. There is a difference, however, in awareness and load reduction between customers who opted into receiving alerts and the rest of the population. Only customers who opted into receiving utility notification significantly reduced load. However, the entire population qualifies for bill credits. Awareness of the program, reflected by the willingness to sign up to receive alerts, seems to indicate more willingness to reduce load. Staff identified an issue with "free ridership," where customers are paid even though they did not significantly reduce load. Staff recommends changing PTR from a default program to an opt-in program, paying bill credits only to customers who opt in to participate.

2) Background

D.08-02-034 approved the Reduce Your Use program, SDG&E's Peak Time Rebate (PTR) rate, the first dynamic rate of this design approved by the Commission.65 The program has been available since the summer of 2012, following a pilot in 2011. The program is implemented as proposed: "A two-level PTR incentive with a higher-level payment for customers who reduce electric usage below an established CRL [customer reference level]66 with enabling demand response technology, and a lower-level payment to customers without such technology."67 Customers receive a bill credit of $0.75/kWh, with an additional credit of $0.50/kWh for customers with enabling technology. SDG&E's tariff lists programmable communicating thermostats (PCTs), AC cycling, and pool pump cycling as examples of technologies eligible for the $0.50/kWh additional incentive.68 The Commission has approved the addition of In-Home Displays (IHDs) to the list of enabling technologies in SDG&E's tariff.69

The utility may call events throughout the year without limit on the number of events called. Events take place between 11 a.m. and 6 p.m. on days an event is called, and participants receive a day-ahead notification of the event.

65 SCE's Save Power Day program was approved in 2009 in D.09-08-028.
66 Defined as the "total consumption for the PTR event period averaged over the three (3) highest days from within the immediately preceding five (5) similar non-holiday weekdays prior to the event. The highest days are defined to be the days with the highest total consumption between 11 a.m. and 6 p.m. The similar days will exclude weekends, holidays, other PTR event days, and will exclude other demand response program event days for customers participating in multiple demand response programs." SDG&E PTR Tariff.
67 D.08-02-034 at 22.
68 SDG&E's PTR tariff defines enabling technologies as those "initiated via a signal from the Utility, either directly to the customer or the customer's device, or via a third party provider to the customer or the customer's device, that will reduce electric energy end use for specific electric equipment or appliances, is included in a designated Utility demand response program, and that is acceptable to and approved by the Utility, subject to the verification of processes necessary to safeguard confidential and proprietary Utility and customer information."
69 D.13-04-017, OP 22.
Bill credits will be paid in each billing cycle based on the sum of events called and usage reduction during the period. Bill credits will be recovered from the respective customer class through the Energy Resource Recovery Account (ERRA).70 The utility can call only one event per day, with a maximum of 7 hours.
3) Lessons Learned
In support of its 2013-2014 Application, SDG&E provided data to highlight lessons learned from the 2012 program year. For PTR, SDG&E conducted three post-event surveys.
Customer Awareness
Results of the surveys showed differences in level of awareness between the three main groups71 of customers participating in PTR: customers who actively opted into day-ahead event notifications (opt-in), customers registered on MyAccount and receiving event notifications (default), and customers not directly notified by the utility but notified via mass media (no MyAccount). In general, the opt-in group demonstrated the highest level of awareness of the PTR events. About 83% of the opt-in group was aware of the program concept (events and bill credit), compared to 43% of respondents in the defaulted group and 40% in the no MyAccount group.72
Customer Satisfaction
Customers are generally satisfied with the amount of incentives paid.73 Customers also seem generally satisfied with the number of notifications, although respondents did indicate that more promotion and information about the program would be beneficial.74 SDG&E indicated that it is working to resolve issues of notification encountered in 2012 as well as working to improve customer education for using online tools.75 Overall, customers responded positively to the program.
Program Utilization
In the summer of 2012, SDG&E called 7 events, a total of 49 event hours, and all events were called due to temperature.76 Given that this program has no limit on events, the program seems underutilized. However, SDG&E states that even if a temperature trigger point is reached, an event may not necessarily be called, as system need is assessed internally. This approach also takes into consideration customer experience.77
Customer Fatigue
SDG&E states that it is difficult to determine if customer fatigue is an issue, but ex post results show that when the program was called three days consecutively in August, the load impact was lowest on the last day.78
70 SDG&E GRC Phase 2 Settlement at 8.
71 SDG&E's post-event surveys segmented customers into more groups than those analyzed in this report, but to simplify the analysis, staff looked only at the three main groups of participants.
72 SGE-02, February 4, 2013, Attachment 6 (Table 5).
73 SGE-02, Revised Appendix X at 20.
74 SGE-02, February 4, 2013, Attachment 5 Table 13, Attachment 6 Table 9.
75 SGE-02, Revised Attachment 1, Revised Appendix X at 19.
76 SGE-02, Revised Attachment 1, Revised Appendix X Table 11.
77 SGE-02, Revised Attachment 1, Revised Appendix X at 14.
78 SGE-02, Revised Attachment 1, Revised Appendix X at 11.
Temperature does not seem to be a factor, as the day with the lowest reduction had a similar temperature to the two preceding days. Still, the result does not seem conclusive.
Table 25: Customer Fatigue79
Event Date | Average Event Hour Reduction (MW) | Temperature (°F)
8/9/12 | 3.2 | 88
8/10/12 | 3.1 | 92
8/11/12 | 1.7 | 91
Enabling Technology
Enabling technology seems to be improving load reduction, as preliminary results show that customers with an In-Home Display (IHD) saved 5% to 8% on average during events, while customers without one saved between 0% and 2%.80
Effort to reduce usage during events
As part of its post-event surveys, SDG&E investigated what actions customers would take on event days and the level of effort made to respond. While the actions taken were hypothetical, i.e., they do not reflect reported actions actually taken, respondents in all three groups seem aware of possible actions to reduce load. For instance, 38% of opt-in respondents, and around 30% each of MyAccount and no MyAccount respondents, said they could unplug electronics. 41% of the opt-in, 23% of the MyAccount and 19% of the no MyAccount respondents said they would turn off their AC. When prompted about the effort made to reduce usage during the August 14th event, 33% of opt-in respondents indicated having made "a lot more effort than usual," in comparison to around 10% each for MyAccount and no MyAccount respondents. 54% of the opt-in respondents and around 40% each of the MyAccount and no MyAccount groups said they made somewhat of an effort. Finally, 13% of the opt-in, 50% of the MyAccount and 44% of the no MyAccount respondents made no more or less effort than usual to reduce load.81
The results seem to indicate that respondents in all groups, irrespective of IOU notification, may have made an effort to reduce load and did know what options they had to do so. Still, ex post load reduction shows that only the opt-in group, about 6% of the entire population, significantly reduced load, contradicting assumptions that mass media or defaulting customers into email alerts could generate significant reduction.
79 Source: SGE-02, Attachment 1, Revised Appendix X, Table 2-6; SGE-03, March 4th, Table 3.
80 SGE-02, February 4, 2013, at 5, Lines 5-7.
81 SGE-02, February 4, 2013, Attachment 6 (Tables 11 and 12).
4) Analysis of settlement and ex post data
Ex post load impact
Awareness of the program and willingness to participate (in the form of signing up to receive alerts) seem to be an important factor in load reduction. This is supported by analysis of ex post data. The opt-in group was the only group to produce statistically significant load reductions.82
Table 26: Ex Post Load Reductions83 (Average Event Hour MW)
Event Date | Customers who opted into alerts (a) | Customers on MyAccount excluding opt-in alerts (b) | Customers not on MyAccount excluding opt-in alerts (c) | Temperature (°F) (d)
7/20/12 | 6.1 | 0 | 0 | 87
8/9/12 | 3.2 | 0 | 0 | 88
8/10/12 | 3.1 | 0 | 0 | 92
8/11/12 | 1.7 | 0 | 0 | 91
8/14/12 | 1.1 | 0 | 0 | 88
8/21/12 | 3 | 0 | 0 | 83
9/15/12 | 8.2 | 0 | 0 | 104
Settlement analysis
Based on average hour load reduction used for the settlement calculation, 94% of incentives were paid to customers either defaulted to receive email alerts on MyAccount or not on MyAccount, and 6% were paid to customers that opted into alerts.84 When compared to ex post data, only customers who opted into alerts, or about 4% of the total population enrolled on PTR, significantly reduced load.85,86 This points to an issue of "free ridership," where customers receive incentives without significantly reducing load.
82 SGE-01a at 3.
83 Source: SGE-02, Attachment 1, Revised Appendix X, Table 2-6; SGE-03, March 4th, Table 3.
84 SGE-02, Attachment 1, Revised Appendix X Table 3, and SGE-03, March 4, 2013, Table 3.
85 SGE-02, Attachment 1, Revised Appendix X at 4 and Table 3.
86 "For PTR residential and small commercial the participants represent the customers who proactively opted into alerts and the enrollment number represents all the customers who were eligible to receive a bill credit. Fifty percent of residential customers are enrolled in MyAccount and received an e-mail alert." SGE-02, Attachment 1, Revised Appendix X at 4.
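For reference, the settlement credits discussed above are computed against the customer reference level (CRL) baseline defined in SDG&E's tariff (see footnote 66): the average event-window usage of the three highest of the five preceding similar non-holiday weekdays, with a credit of $0.75/kWh (plus $0.50/kWh with enabling technology) for measured usage below that baseline. The sketch below illustrates that arithmetic with hypothetical interval data; the function names and usage values are illustrative assumptions, not SDG&E's actual settlement code.

    from datetime import date

    def customer_reference_level(similar_day_usage_kwh):
        # Average event-window (11 a.m.-6 p.m.) consumption of the 3 highest of
        # the 5 preceding similar non-holiday weekdays (tariff definition in footnote 66).
        top_three = sorted(similar_day_usage_kwh.values(), reverse=True)[:3]
        return sum(top_three) / 3.0

    def ptr_bill_credit(crl_kwh, event_kwh, has_enabling_tech=False):
        # $0.75/kWh of reduction below the CRL, plus $0.50/kWh with enabling
        # technology; no credit if event-day usage is at or above the CRL.
        reduction_kwh = max(crl_kwh - event_kwh, 0.0)
        rate = 0.75 + (0.50 if has_enabling_tech else 0.0)
        return reduction_kwh * rate

    # Hypothetical 11 a.m.-6 p.m. usage (kWh) on the five preceding similar days.
    similar_days = {
        date(2012, 8, 2): 16.4, date(2012, 8, 3): 12.5, date(2012, 8, 6): 14.2,
        date(2012, 8, 7): 15.8, date(2012, 8, 8): 13.1,
    }
    crl = customer_reference_level(similar_days)   # (16.4 + 15.8 + 14.2) / 3
    credit = ptr_bill_credit(crl, event_kwh=11.0)  # reduction of about 4.5 kWh at $0.75/kWh
    print(f"CRL = {crl:.2f} kWh, bill credit = ${credit:.2f}")

Because the credit is paid for any measured drop below the CRL, a customer whose event-day usage falls below the baseline for reasons unrelated to the event still earns a credit, which is the settlement-side mechanism behind the "free ridership" issue described above.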
Table 27: Settlement Load Reductions (MW)87 (Average Event Hour)
Event Date | Customers who opted into alerts (a) | Customers on MyAccount excluding opt-in alerts (default) (b) | Customers not on MyAccount excluding opt-in alerts (c) | Event Settlement (d)
7/20/12 | 10 | 79 | 71.2 | 160.1
8/9/12 | 13.7 | 100.2 | 89 | 202.8
8/10/12 | 12.6 | 97.1 | 87 | 197
8/11/12 | 12.7 | 117.3 | 101.2 | 231.1
8/14/12 | 14.8 | 118.4 | 106.7 | 240
8/21/12 | 29.9 | 270.3 | 258.4 | 559
9/15/12 | 17.3 | 151.1 | 129.8 | 298
Average (MW) | 15.9 | 133.3 | 120.5 | 269.7
% | 5.9% | 49.4% | 44.7% | 100.0%
Average Participants | 45,268 | 562,982 | 608,250 | 1,171,232
% | 4% | 48% | 52% | 100%
Incentives and capacity cost
In 2012, SDG&E paid out $10,134,879 in incentives to PTR residential customers.88 Assuming the estimated MW reported to the CAISO (7-Day Report), the program's maximum expected capacity was an average event hour impact of 45.8 MW.89 This implies a capacity cost of approximately $221/kW. According to ex post data, the actual capacity delivered was an average event hour impact of 8.2 MW, resulting in a cost of capacity of $1,232.7/kW. This cost will be recovered from the residential class of customers.
5) Findings
Based on analysis of program design, settlement and ex post load impact, and customer participation data for the summer of 2012, staff has found the following:
• The program, as approved in a Commission decision, pays the same amount of incentives to all customers enrolled in the program. There is an additional incentive for customers who have enabling technology.
87 Source: Adapted from SGE-02, Attachment 1, Revised Appendix X, Table 2-5; SGE-03, March 4th, Table 3.
88 SDG&E AL 2420-E at 2.
89 SGE-02, Attachment 1, Revised Appendix X Table 3.
• There are differences in performance, awareness and willingness to reduce load between the three main groups of participants: customers who opted in to receive alerts, customers defaulted into MyAccount to receive event email alerts, and customers not yet on MyAccount who are not directly notified by the IOU and find out about events via mass media.
• There is not enough information to determine if customer fatigue is an issue.
• Ex post load impact results show that only customers who signed up to receive alerts significantly reduced load. 94% of incentives paid did not result in significant load reduction.
• "Free ridership" is an issue in SDG&E's PTR, where the majority of incentives were paid to customers who did not significantly reduce load.
• Based on incentives paid during the summer of 2012, the cost of capacity is five times higher when adjusting the forecasted load impact by the ex post load impact.
D. Staff Recommended Changes for PTR
It is clear that "free ridership" is an issue that needs to be addressed. It is an issue when forecasting load reduction: the forecasted impact would be much higher than what could be verified, and it results in additional costs to ratepayers. While "free ridership" is in most cases a baseline and settlement methodology issue, it could be partially alleviated by changes in program design.
Incentives should reward and encourage customer engagement. Therefore, staff recommends changing PTR from a default program to an opt-in program, eliminating incentives paid to customers not actively choosing to receive event alerts and keeping the current incentive levels for customers who sign up to receive alerts and use enabling technologies. Staff suggests the following incentive structure:
Table 28: Proposed Program Structure
Group | $/kWh
Opt in to receive alerts | 0.75
Opt in to receive alerts and enabling technologies | 1.25
Not opted in | Not a participant in the program
This approach to PTR would ensure that customers are rewarded for the level of action they are prepared to take. If this proposed level of incentives had been in place in 2012, it could have reduced the amount of incentives paid by about 95%, as shown below.90
90 To simplify the calculation, staff ignored the additional $0.50/kWh for enabling technology. These incentives would be paid in addition to the $0.75/kWh.
Table 29: Illustration of Staff Proposed Changes for SCE
Current Incentive Structure
Group | Incentive Level ($/kWh) | Capacity (MW ex post*) | Total incentive paid ($) | Cost of capacity ($/kW)
All | 0.75 | 95.8 | 27,349,009 | 285.48
Proposed Incentive Structure
Opt in | 0.75 | 95.8 | 1,328,160 | 13.86
No opt in | 0 | | |
Potential reduction: 95%
* Ex post for the entire program

Table 30: Illustration of Staff Proposed Changes for SDG&E
Current Incentive Structure
Group | Incentive Level ($/kWh) | Capacity (MW ex post*) | Total incentive paid ($) | Cost of capacity ($/kW)
All | 0.75 | 8.2 | 10,108,082 | 1,232.69
Proposed Incentive Structure
Opt in | 0.75 | 8.2 | 582,750 | 71.07
No opt in | 0 | | |
Potential reduction: 94%
* Ex post for the entire program

While issues of baseline and settlement methodology are outside the scope of this analysis and would demand a much more in-depth investigation, it is possible to attempt to alleviate the impact of free ridership by limiting PTR bill credits to customers who opt in to participate. Utilities should focus on encouraging customers to adopt enabling technologies. Perhaps some of the resources saved by having a three-tier incentive structure could be used to subsidize enabling technologies that allow direct load control. Also, utilities should explore alternatives for service delivery, such as third-party entities. SCE found that the interest of third parties is shifting towards the residential sector, and such opportunities should be seriously explored.
Finally, utilities should track, as part of their ex post verification efforts, whether the presence of enabling technologies significantly improves load reduction and whether there are differences between the technologies used. In addition, utilities should investigate whether customer fatigue is an issue, especially in view of the SONGS outage potentially increasing the frequency with which PTR events are triggered.
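The cost-of-capacity and potential-reduction figures in Tables 29 and 30 follow from simple arithmetic: incentives paid divided by ex post capacity (converted to kW), and the share of 2012 incentive spending avoided under the proposed opt-in-only structure. A minimal sketch reproducing those figures is shown below; the function and variable names are illustrative.

    def cost_of_capacity(total_incentives_usd, ex_post_mw):
        # Cost of capacity in $/kW: incentives paid divided by verified ex post capacity.
        return total_incentives_usd / (ex_post_mw * 1000.0)

    def potential_reduction(current_usd, proposed_usd):
        # Share of 2012 incentive spending avoided under the opt-in-only structure.
        return 1.0 - proposed_usd / current_usd

    # Figures from Tables 29 and 30.
    programs = {
        "SCE":   {"ex_post_mw": 95.8, "current": 27_349_009, "proposed": 1_328_160},
        "SDG&E": {"ex_post_mw": 8.2,  "current": 10_108_082, "proposed": 582_750},
    }
    for name, p in programs.items():
        print(name,
              f"current ${cost_of_capacity(p['current'], p['ex_post_mw']):,.2f}/kW,",
              f"proposed ${cost_of_capacity(p['proposed'], p['ex_post_mw']):,.2f}/kW,",
              f"reduction {potential_reduction(p['current'], p['proposed']):.0%}")
    # SCE:   current $285.48/kW,   proposed $13.86/kW, reduction 95%
    # SDG&E: current $1,232.69/kW, proposed $71.07/kW, reduction 94%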
III. Residential Air Conditioning (AC) Cycling
A. Overall Customer Experience
Customers were generally satisfied with the program. For SCE, 2012 was the year the program transitioned from an emergency trigger to a price trigger. SCE reports that customers have kept a positive view of the program and regarded incentives as an important part of participating in the program. Customers did report that they prefer shorter and more frequent events as opposed to longer events.
SDG&E also reports overall customer satisfaction but notes that the majority of customer complaints were due to uncomfortable temperatures caused by the unit cycling on and off. Also, SDG&E reports customers were satisfied with the level of incentives.
Neither utility reported customer fatigue, although SDG&E had three events on consecutive days and load reduction dropped. However, without analyzing other factors that could have contributed to the reduced load impact, such as humidity and customer perceptions of discomfort, it is not possible to say with certainty whether fatigue occurred.
B. SCE's Summer Discount Plan
1) Summary
SCE's AC Cycling program changed its event trigger from emergency to price. Customers seem satisfied with the current program design. Staff has identified that the program has a "rebound effect" issue and recommends that the program design be changed to include an additional level of incentive that would cater to customers willing to cycle their unit for the entire event duration, as described below.
2) Background
As part of D.11-11-002, SCE agreed to transition the Residential Summer Discount Plan (Res SDP) from an emergency to a price trigger and to bid Res SDP's load into the CAISO market for dispatch. D.11-11-002 authorized revisions to SCE's program to enable the changes agreed to in a settlement.91 As currently designed, Res SDP offers an annual incentive for customers who wish to participate in the program. The program offers two choices of cycling duration and also gives customers the choice to override an event up to five times per year in exchange for slightly lower incentives. Incentives are calculated according to the size of the equipment, cycling duration and override option:92
91 D.11-11-002 at 2-4.
92 SCE Schedule D-SDP, Sheet 1; SCE-01 Testimony, Table II-2.
Table 31: SCE Residential AC Cycling Incentives
Option | Incentive per Summer Saver day per ton | 100% cycling maximum savings (based on 4.5-ton unit) | 50% cycling maximum savings (based on 4.5-ton unit)
Standard Option | $0.36 (100% cycling), $0.18 (50% cycling) | $200 | $100
Override Option | $0.18 (100% cycling), $0.09 (50% cycling) | $100 | $50
SCE's Res SDP program has approximately 307,000 customers with an expected load reduction of 466 MW.93 Events can be dispatched year-round with a maximum of 180 event hours, and each event can last up to six hours. In 2012, SCE paid a total of $51,882,087 in incentives.
3) Lessons Learned
The 2012 summer season proved to be a transition year for this program. Customers had to transition from an expectation of little service reduction to expecting several disruptions throughout the year. Overall, SCE asserts that customers continue to have a positive view of the program.
Lessons learned from the transition in 2012 showed that bill savings are an important element of customer participation. The majority of customers opted for the Standard Option, preferring savings to override capability, and the ones who chose the override option rarely used it.94 Only 1.5% of customers who left the program did so due to the program changes. Preliminary findings of customer surveys found that customers prefer shorter events, even if they are more frequent. SCE experimented with different event durations and found that as events got longer, customer dissatisfaction increased.
In 2012, SCE triggered 23 events for a total of 24 hours, for reasons of temperature, CAISO emergency and evaluation. Because the program changed its trigger condition and design in 2012, a historical comparison would not be accurate. But the data shows that Res SDP was called more often than in previous years.95
C. SDG&E's Summer Saver
1) Summary
Customers seem satisfied with the program. The program performed in line with past years. Staff does not recommend any changes to the program design.
93 SCE Schedule D-SDP, Sheet 1; SCE-01 Testimony at 9, Lines 21-23. Load impact based on ex ante estimates from the Commission Monthly Report (12/21/2012).
94 SCE-01 Testimony at 11, Lines 3-5.
95 SCE-03, March 4, 2013, Appendix B Table 4.
2) Background
The Summer Saver program is a 15-year, long-term contract-based procurement run by Comverge.96 Comverge is responsible for installing, removing and servicing the AC cycling device. Summer Saver is a direct load control program where a device is installed on the premises to cycle the AC unit when an event is called. It has day-of notification, meaning customers receive event notification on the day of the event. The program runs May through October. Customers are eligible for annual incentives based on the cycling option, the size of the unit and the participation period:97
Table 32: Summer Saver Incentives
Cycling Option | Residential | Business
30% | N/A | $9.00 per ton
50% | $11.50 per ton | $15.00 per ton
100% | $38.00 per ton | N/A
The Summer Saver program had around 28,500 residential and commercial customers enrolled in 2012.98 The majority of participants are residential customers (23,948 in 2012), and this distribution has been fairly consistent since 2009.99
The program has an event limit of 15 events or 120 event hours. The utility can call one event per day, and events run for a minimum of 2 hours and a maximum of 4 hours. Events can be called anytime from 12 p.m. to 8 p.m. on event days. In 2012, the utility called 8 events, or 29 event hours, an average of 3.6 hours per event. Events are called based on temperature and system load.100
3) Lessons Learned
Residential customers were responsible for 84% of load reduction during the 2012 summer. SDG&E paid $2.5 million in incentives to residential customers for 18.6 MW of average event hour load reduction. The majority of customer complaints were due to uncomfortable temperatures caused by the AC cycling.101 Overall, customers seem satisfied with the level of incentives, as SDG&E reported that less than 1% of customers who left did so due to unfair incentives.102
96 http://www.comverge.com/residential-consumer/find-a-program
97 http://www.sdge.com/save-money/demand-response/summer-saver-program and email communication with SDG&E (3/4/2013).
98 SGE-02, Attachment 1, Revised Appendix X Table 2.
99 Email communication with SDG&E (4/4/2013).
100 SGE-02, Attachment 1, Revised Appendix X Table 8.
101 SGE-02, Attachment 1, Revised Appendix X at 19.
102 SGE-02, Attachment 1, Revised Appendix X at 20.
SDG&E did not report evidence of customer fatigue for Summer Saver, although it recognizes that this does not mean fatigue does not occur, just that it is not measurable.103 Ex post load impact results showed that when the program was called three days consecutively, there was a drop in load reduction. However, SDG&E states that there is not enough information to suggest that this is a result of fatigue. Humidity, or the outside temperature being lower on the last day than on the previous days, among other factors, could have contributed to the lower load reduction on the last day.
Table 33: AC Cycling Customer Fatigue104
Date | Ex post average over event period, Res (MW) | Ex post average over event period, Res+Com (MW) | Temperature (°F)
9/13/12 | 12.0 | 12.6 | 81
9/14/12 | 18.6 | 22.5 | 109
9/15/12 | 8.2 | 8.8 | 104
The frequency of events called has been fairly consistent over the program's availability (with a few exceptions, such as 2008), with the program being called in 2012 in line with the historical average. But when compared to the program design, it seems underutilized. Still, there is a higher incidence of events relative to event hours, suggesting events are more frequent but shorter.
103 SGE-01, Direct Testimony of Michelle Costello, at 10.
104 SGE-02, Attachment 1, Revised Appendix X, Table 2-6; SGE-03, March 4th, Table 2; email communication (4/3/2013).
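Table 34 below compares the event hours and events called each year against the program's design limits of 120 event hours and 15 events. The utilization percentages in that table can be reproduced with the short calculation sketched here, using the yearly figures from the table; the variable names are illustrative.

    # Yearly (event hours called, events called) from Table 34, against annual
    # design limits of 120 event hours and 15 events.
    DESIGN_HOURS, DESIGN_EVENTS = 120, 15
    history = {
        2006: (24, 8), 2007: (43, 12), 2008: (8, 2), 2009: (30, 7),
        2010: (44, 11), 2011: (22, 6), 2012: (29, 8),
    }
    avg_hours = sum(h for h, _ in history.values()) / len(history)   # ~28.6 hours/year
    avg_events = sum(e for _, e in history.values()) / len(history)  # ~7.7 events/year
    print(f"Event-hour utilization: {avg_hours / DESIGN_HOURS:.0%}")    # ~24%
    print(f"Event-count utilization: {avg_events / DESIGN_EVENTS:.0%}")  # ~51%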
Table 34: SDG&E Summer Saver Historical Comparison of Number of Events and Event Hours105
Year | Event hours (annual limit) | Event hours called | Number of events (annual limit) | Events called
2006 | 120 | 24 | 15 | 8
2007 | 120 | 43 | 15 | 12
2008 | 120 | 8 | 15 | 2
2009 | 120 | 30 | 15 | 7
2010 | 120 | 44 | 15 | 11
2011 | 120 | 22 | 15 | 6
2012 | 120 | 29 | 15 | 8
Average | 120 | 29 | 15 | 8
Average historical performance compared to design | | 24% | | 51%
2012 compared to historical average | | Consistent with average | | Consistent with average
D. Staff Recommended Changes for AC Cycling
Staff does not have any recommended changes to program design for SDG&E at this point. SDG&E's is a mature program, and customers seem fairly satisfied with the offerings.
SCE's program trigger just changed from emergency to price, and customers seem satisfied with the program overall. However, last summer SCE deployed a new dispatch strategy in which it divided customers into three to six subgroups, cycling each subgroup for one hour per event instead of cycling the whole group for the entire event duration. While such a strategy is optimal for customers' comfort, as discussed in Chapter 2, it caused a "rebound effect."106
Program design should help correct this issue. First, the program as designed states that events can last up to six hours, even though customers seem to prefer shorter event durations and dissatisfaction went up as event duration increased.107 Also, SCE counts a total of six hours per event for RA purposes.
105 Based on SGE-02, Attachment 1, Revised Appendix X Table 11.
106 "Effects of an event in subsequent hours, when electricity usage may exceed the curtailed customers' reference load, as air conditioners work to return residences to original temperature set points." 2012 Load Impact Evaluation of Southern California Edison's Residential Summer Discount Plan (SDP) Program at 13.
107 Staff does not have more detailed information on customer preference or what the ideal event duration would be before customers drop off the program.
SCE needs to review the program proposal to reflect customer preference: if customers will not favor being cycled for six hours, the program should not propose such a long event duration. Moreover, SCE should explore new ways of delivering the program, e.g., using temperature control via a PCT instead of a switch on the equipment that cycles the unit off and on. This could allow for longer event durations while maintaining customer engagement, as the unit would never be off completely.108 In fact, both SDG&E and SCE should take advantage of AMI infrastructure and related enabling technology that could improve program delivery, reliability and customer experience.
108 D.13-04-017 at 27 states that innovative approaches using PCTs and OpenADR could enable shorter event durations. At the time, the Commission did not have data reflecting the rebound effect, which may discourage short event durations. This issue should be taken into consideration when designing the approved pilot.
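The subgroup dispatch strategy discussed above, in which three to six subgroups are each cycled for one hour of an event rather than the whole population being cycled for the full duration, can be illustrated with a simple rotation schedule. The sketch below is a hypothetical illustration of that kind of rotation, not SCE's actual dispatch logic; the group names and event window are assumptions.

    def rotation_schedule(subgroups, event_start_hour, event_hours):
        # Assign each subgroup a one-hour cycling slot within the event window,
        # rotating through the subgroups in order (one subgroup cycled per hour).
        schedule = []
        for offset in range(event_hours):
            group = subgroups[offset % len(subgroups)]
            hour = event_start_hour + offset
            schedule.append((group, f"{hour}:00-{hour + 1}:00"))
        return schedule

    # Hypothetical six-hour event, 1 p.m. to 7 p.m., split across six subgroups.
    groups = [f"subgroup_{i}" for i in range(1, 7)]
    for group, window in rotation_schedule(groups, event_start_hour=13, event_hours=6):
        print(group, window)

Because each subgroup's air conditioners resume normal operation while the event is still in progress, the snapback of each subgroup overlaps the event itself, which is consistent with the "rebound effect" described in footnote 106.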
Chapter 5: Non-Residential Demand Response Programs
I. Summary of Staff Analysis and Recommendations
The analysis of customer experience for DR programs for commercial customers focuses on three key commercial programs: AC Cycling, Auto DR and the Demand Bidding Program (DBP). Staff recommends that the new features of SCE's AC Cycling program be clearly communicated to customers through outreach and marketing to avoid customer dissatisfaction and dropout. In addition, Staff finds that there is limited evidence that the Auto DR program coupled with the Critical Peak Pricing (CPP) rate provides greater load impacts than those obtained by customers on the CPP rate alone. As a result, Staff recommends that future studies continue to explore the load impacts of Auto DR. Staff recommends that SDG&E and the Navy collaboratively design a Navy-only DBP program to meet the unique needs of the Navy. Key attributes of the program would include a day-ahead trigger, aggregation of 8 billable meters and a minimum bid requirement of 3 MW.
II. Background and Summary of Utility Data
In response to the Energy Division letter, SCE and SDG&E provided data on the commercial customer experience and commercial customer participation in the non-residential DR programs. Customer enrollment and participation numbers during events were provided by program, as well as the load impacts that those customers produced. In addition, SCE and SDG&E provided qualitative information on the commercial customer experience of the DR programs, including how customers felt about the incentives offered, whether customers were fatigued by consecutive DR events and whether customers felt that too many DR events were called. SCE and SDG&E also provided information on the efficacy and customer experience of DR event notification.
Overall, SDG&E reported that the customer experience was positive and that it tried to deliver notifications to drop load earlier than required (for both commercial and residential customers).
Among its various non-residential program offerings, SDG&E offers a Capacity Bidding Program (CBP) where participants can choose between a day-ahead and a day-of program. Participants are required to reduce their usage by 20 kW or more. Program participants receive a capacity payment and an energy payment for the hours of reduction. However, the program also carries penalties for reductions of less than 50% of the pledged amount. The customer feedback for this program came from aggregators,109 who suggested that increasing the incentives could potentially increase enrollment in CBP.
SDG&E offers a Demand Bidding Program and has two non-residential customers enrolled in this program. In 2012, the Demand Bidding Program was offered on a day-ahead basis, with incentives to customers for reducing their demand during an event.
109 An aggregator is an entity that aggregates or combines customer load and makes it available for interruption.
In response to SDG&E's questions about incentives, the DBP customers indicated that the incentives were not high enough. The Commission adopted SDG&E's proposal to change this program from a day-ahead to a day-of, 30-minute trigger program.110
Another SDG&E non-residential program offering is the commercial Peak Time Rebate (PTR) program. On event days, participating customers are required to reduce their electricity use during the event duration. Customers can sign up to be notified of events in advance. Commercial customers signed up for alerts at a much lower rate than residential customers and also provided less load reduction. Most likely, this is due to their limited ability to reduce load between 11 a.m. and 6 p.m.
SDG&E included three post-event PTR surveys, which provided results on residential and small commercial customers' experiences with PTR events. Key trends were:
• Small commercial customers were generally aware of Reduce Your Use days. However, event-specific awareness was lower.
• Small commercial customers indicated that they face different challenges than residential customers in responding to PTR events.
• Feedback on program improvement from commercial customers included the following comments: commercial customers stated that they were not able to reduce more and were already doing what they could; small commercial customers indicated that they would benefit from advance notification; and finally, they stated that responding to events would affect their business operations or customer comfort.
General program feedback from SDG&E indicated that estimating the effects of customer fatigue on load impacts is difficult. When event days are called in a row, there are so many varying factors, such as events being called on different days of the week and varying temperatures on event days, that it is difficult to determine whether the change in load impact is due to multiple event days or other influencing factors. SDG&E describes its experience with PTR events called in quick succession and indicates that preliminary load impacts were lowest on the last day. This may be due to customer fatigue. For the other programs, the load impacts did not show evidence of customer fatigue. Again, this is not conclusive evidence that customer fatigue is not present; customer fatigue was simply not measurable relative to other variations in load impacts between events.
SCE launched a Summer Readiness Campaign in April 2012 in order to prepare customers for the upcoming summer. Overall, the customer response was positive. SCE offers many non-residential DR programs similar to the programs offered by SDG&E. These programs include an AC Cycling program, which offers customers various cycling options under which the utility can directly turn off the customer's air conditioner when needed. Customers receive a credit based on several factors, including the program cycling options that they choose and the AC unit cooling capacity.
110 D.13-04-017 at 15.
SCE has proposed changes to this program and has requested the ability to call events not just for emergency reasons, but also when prices are high.
Another non-residential program offered is the Auto DR program. The program provides incentives to offset the cost of purchasing and installing technology that helps the customer automatically manage their load. The customer determines their load control strategy and pre-sets it in the technology. With the technology in place, the program automates the process of reducing demand by sending a signal from a central system to the customer's automated control system, which then automatically reduces usage for the duration of a program event.
During the 2012 summer event season, the Demand Response Help Desk (for large business customer DR programs) received 1,410 calls. 21% of those calls were related to program events; however, none of the callers indicated that there were too many events or that the incentive payments were inadequate. Non-event calls (79%) pertained to program eligibility, questions about enrollment, assistance with online tools and other general program information.
In mid-August, SCE conducted market research to gauge customer awareness of enrollment campaigns and SCE messaging, customer actions in response to the campaigns, and attitudes towards SCE and energy conservation. Overall, most of the residential and small business customers heard the campaign message and attempted to reduce their usage. Most of the customers understood the need to conserve energy over the summer and attempted to do so. Customer awareness was raised, and the campaign prompted some customers to enroll in SCE programs. Customer attitudes about reliability and avoiding outages remained strong.
SCE did not observe customer fatigue during the 2012 event season for DR programs in general. The programs are able to avoid multiple consecutive days of events through flexibility in the dispatch triggers of the programs.
III. Commercial Air Conditioning (AC) Cycling
A. SCE's Summer Discount Plan
In December 2012, SCE conducted a pilot telephone survey on several programs, including the Summer Discount Plan (SDP) program.111 The overall sample size was 200 business participants, though the sample size varied by question. Satisfaction with the program was moderate, with only 72% of the participants aware that their business was enrolled in SDP. The Decision (D.13-04-017) approved SCE's proposed changes to the SDP commercial program, and we examine the commercial customer experience, as presented through this survey, in detail. Overall, a larger percentage (81%) of participants felt that the program was worthwhile. Of the three main touch points identified, billing was the key driver, and customers were moderately satisfied with this touch point. Relatively few participants (18%) had reasonably high familiarity with the program details. Customers who had high familiarity tended to be more satisfied. Customers who received targeted SDP communication were more familiar with and more satisfied with the program.
111 Service Delivery Satisfaction Recalibration: Summer Discount Plan 2012 Pilot Survey.
Most of the business participants were SCE customers at home (86%) and were predominantly male (63%). There were three main touch points that had a significant impact on satisfaction. Billing, enrollment and events were key drivers, with billing being the highest priority driver and events being the lowest priority driver. Customers had difficulty identifying the discount and did not think the discount was fair given the effort it took to participate. For enrollment, customers were primarily concerned with delays in device installation. Customer reasons for event dissatisfaction were, specifically, the time, day and frequency of events, as well as a perception of fairness.
Satisfaction with billing was 78%, which was considered moderate compared to other SDP touch points. 83% received paper bills, while 19% received electronic bills. 17% could not easily find the discount on the SDP bill. The top two comments on the SDP bill were to provide a separate line item for the discount and to offer a bigger discount/lower rate. The bill currently includes a separate line item for the discount, but customers are not able to find it and need to be reminded that it is there. Problems with event attributes were low (8%).
Satisfaction with enrollment was modest (78%). 7% experienced problems with enrollment, with most of the problems related to confusion (about the amount of the discount or the expected savings) or delays (waiting for the device to be installed, multiple visits and multiple phone calls).
The more satisfied customers were the ones that were aware that:
1. They receive a discount regardless of events.
2. The indicator light identifies an event in progress.
3. Events occur between June 1 and October 1.
4. The maximum duration of an event is 6 hours.
Relatively few participants (18%) had reasonably high familiarity with the program details. Most business participants were aware of the 100% and 50% cycling options. Awareness of the methods (indicator light, SCE.com) of determining whether the device was currently cycling the AC off was at less than half of respondents. Only 22% of the respondents knew the correct start and end months of the program; many of the other customers did not know or did not provide correct answers to the question. Those that were correct tended to be more satisfied with the program. Only 12% of the respondents identified the correct 6-hour maximum duration of an event. 21% said that there was no limit, and they were less likely to be satisfied with the program. The 57% of respondents who knew that they receive discounts whether or not events were called were more likely to be satisfied as a result. The number of events did not impact satisfaction.
When investigating the reasons for program satisfaction, 36% of the respondents were happy with the program, with 17% responding that there was a good discount provided. 19% of the comments were negative. 11% of this feedback was related to financial reasons, such as the bill being too high, the bill increasing, or the discount being small. Bad customer service was another negative, at 5%.
Around a third of the customers provided suggestions for improving SDP. In this feedback, financial comments were paramount, with the following reasons being cited:
• Lower rates (5%)
• Bigger discount (5%)
• Better communication (4%)
A large percentage of participants (77%) did not know the discount amount. Participants who are most satisfied are likely to know the discount dollar amount. Participants with moderate to high familiarity were more likely to have received recent communication from SCE. All types of communication boosted familiarity, though the written method was the dominant form.
B. SDG&E's Summer Saver Program
The findings in this section are based on KEMA's process evaluation of the 2008 Summer Saver program.112 At the time of the evaluation, the program had 4,500 commercial participants (and an even greater number of residential participants). Commercial customers can choose between 30% and 50% cycling options and between a 5-day and a 7-day option. To the extent that commercial customer experience was provided, it is cited in this report. In other cases, general feedback is cited. Since this information is dated, we used it primarily for general feedback on the Summer Saver program at SDG&E and as a means of comparing the AC Cycling programs of SCE and SDG&E.
A key conclusion of the report was that improving the program's marketing and informational materials could reduce program dropouts and attract more interested customers. Better information about cycling frequency could have resulted in less dissatisfaction and dropout. Discomfort and program cycling were most often the top reasons for dropout. Better marketing could have reached customers who are interested in the program. The report recommended customizing marketing messages to customer subgroups. Surveys of Summer Saver participants and non-participants found that bill credit messages had greater appeal to lower income customers, while environmental messages had greater appeal to higher income customers.
With regard to cycling options, the report recommended reducing program complexity by reducing the number of cycling options. A related cycling recommendation was to not increase the cycling frequency. Currently the program cycles 10-12 times a year, and participants indicated that they were uncomfortable during Summer Saver control events. The key reason participants joined the program was the financial incentives.
112 Process Evaluation of SDG&E Summer Saver Program, March 19, 2009.
Key Lessons Learned from the AC Cycling Programs
When comparing the feedback received on the AC Cycling programs of SCE and SDG&E, a clear recommendation emerges: marketing, clear communication and managing expectations are key facets of these programs. When customers know what to expect, they tend to be more satisfied with the program. SCE customers with high familiarity with program attributes fared better and were more satisfied. SDG&E could also improve marketing and informational materials to reduce dropouts and attract interested new customers.113 Clarifying the program and making it less complex is important to attract and retain customers. Targeting those messages, by subgroup in the case of SDG&E, is another useful method of attracting customers based on their values and priorities, whether financial or environmental.
D.13-04-017 approved SCE's proposal to modify its commercial program from a reliability-based DR program to a program that can be dispatched for economic purposes. The new trigger will allow the program to be called when there are high wholesale market prices, which occur during times of extreme temperature or when system demand is higher than expected. Additionally, SCE will consolidate the Base and Enhanced commercial programs into one program with different features, and proposes that SDP be made available year-round. The key program changes are outlined below:
Table 35: Key SDP Commercial Program Changes
Program Element | Current Design | Approved Design
Curtailment Event Trigger | Emergency only | Emergency and economic
Program Availability | Events can occur June 1 - September 30 | Events can be called year-round with a maximum of 180 event hours during a calendar year
Event Duration | 6 hours | Multiple events may occur in a single day, with varying durations; maximum 6-hour interruption in a day
Customer Cycling Options | 30%, 40%, 50%, and 100% | 30%, 50%, 100%
With the movement to an economically triggered program and the new program features, it is paramount that the marketing campaign clearly explain the changes, such as the duration of events, which is now expected to be shorter even though the program can be called year-round. Billing changes should also be made to assist customers in identifying discounts. SCE customer feedback on program improvement was largely financial. SCE's new program design has new incentive levels. The new enhanced program will pay a greater incentive per ton per month than the current enhanced program, and the new incentive should be communicated clearly to participants, whether through clear bill presentation, marketing efforts or a combination of the two.
113 This survey is dated, so SDG&E may have made marketing modifications to alleviate some of the concerns presented above.
In 2012, the Summer Discount Plan Commercial was triggered once, for 5.6 hours.114 With its movement to an economically triggered program, which can be called when wholesale market prices are high, it is likely to be called more frequently. The Capacity Bidding Program Day-Ahead (CBP-DA) had a heat rate trigger condition in 2012, and that program can be used as an example of how frequently a non-emergency program may be called. In 2012, the CBP-DA was called 12 times.
The proposed changes to SCE's AC Cycling program may provide needed megawatts this summer and will also benefit customers, who are often financially motivated. However, these changes need to be communicated clearly to avoid customer dissatisfaction and possible dropout from the program. If the marketing program is managed carefully, SCE's Summer Discount Plan for commercial customers can be a useful source of load impact for the summers of 2013 and 2014.
IV. SCE's Auto DR
Auto DR is a technology program whereby customers receive an incentive to install equipment that improves their ability to reduce load during a DR event. Auto DR is considered to provide a better load shed, as described in the Decision (D.12-04-045): "Limited data suggests that ADR customers have a higher participation rate in DR programs and provide better load shed. Data also suggests that customers on dynamic rates perform better with ADR."
SCE's Auto DR customers pre-program their level of DR participation, and when a DR event is called, the Auto DR technology enables the facility to participate automatically. This method reduces the need for a manual response. All non-residential customers must have an interval meter and participate in an available price-responsive DR program. As of 2013, customers are paid 60% of the technology incentives upon installation testing verification; the remaining 40% of eligible incentives is paid according to participation in a DR program.115 By the end of 2012, SCE's Auto DR program funding was 100% subscribed.116
In the Application, SCE requested approval of an additional $5 million for Auto DR, which would be earmarked for projects in the Target Region. The majority of the funding ($4,200,000) is for technology incentive payments. The key questions that arose were what the customer experience was with the Auto DR program and how effective customers were in shedding load when events were called. In the Application, SCE does not provide a breakdown of DR programs by those customers who participate in Auto DR. To understand customer experience better, we refer to other studies on the efficacy of Auto DR and customer feedback on the program.
CPP is a rate that sets a higher price for electricity during critical peak days. In return, the customer receives a reduction in the non-peak energy charge, the demand charge, or both.
114 SCE-03.
115 SCE-03 at 25.
116 SCE-02 at 18.
A report on the non-residential CPP rate presents the estimated ex post load impacts of Technology Incentives and Auto DR participants, on average, for 2011 CPP events.117 SCE called 12 CPP events in 2011 between June and September. On average, each event had 3,006 participants. For SCE's CPP customers in 2011, the percentage load impact was 5.7% for the average event, and the average load impact was 11.6 kW. There were 35 CPP customers on Auto DR, and they provided a percentage load impact of 21%. Their average load impact was 103 kW. Based on this information, customers on Auto DR and CPP provide a greater load reduction than customers on the CPP rate alone. To further understand Auto DR and its potential to provide load impacts, we examine a study by EnerNOC on CPP.118
SCE and SDG&E offer Technology Assistance and Technology Incentives programs. SDG&E's Technical Assistance program provides customers with energy audit services to identify potential for energy cost reduction and to encourage participation in DR and EE programs. The Technology Incentive program at SDG&E provides financial incentives and on-bill financing (interest free) for customer adoption and installation of DR measures and enabling technologies.
The EnerNOC study outlines the barriers to responding to an event. The main barrier for the bottom performers, or low performing participants, is a lack of ability to reduce demand because of business needs: responding to an event would negatively impact business functions. Examples of these limitations included a need to maintain a temperature to prevent produce from spoiling or to protect sensitive equipment from damage, or simply the comfort of staff. A related barrier to response is a lack of knowledge of how to reduce load. Additional barriers included a lack of enabling technology; however, this was not identified as a top concern.
Most of the top responders, or high performing participants, are able to easily shift their processes or shut down energy-intensive equipment and respond to events. However, few of the top responders use technology to automate their response. The main barrier to responding to events is the ability to respond without suffering negative business consequences. For businesses that have the capacity to respond without being negatively impacted, technology is a possible solution that can be explored.
Due to the small population size, only 16 technology-enabled customers were interviewed. The majority of those interviewed were SCE customers. This is a small sample size, and the feedback should be interpreted with caution. Half of these customers said that the technology was important for their response, and the other half stated that they would have stayed on the CPP rate without the technology. Four of the customers interviewed do not utilize the technology installed; 3 of these customers respond to events. Select feedback includes: "The load we shed is entirely enabled by the Auto DR technology" (SCE, technology enabled).
117 2011 California Statewide Non-Residential Critical Peak Pricing Evaluation, p. 41.
118 California Statewide CPP Research on Improving Customer Response, December 3, 2012.
Most of the bottom responders do not use technology to respond and are not aware of options in this regard.
From the quantitative data, Auto DR customers on the CPP rate provide greater load impacts than customers on CPP without enabling technology. The data above, though limited due to the small sample size, can provide direction for continued research. With the additional Auto DR funding requested in SCE's application, the participant pool continues to grow. With this growth, it is possible to conduct better studies with more robust results. Studies can attempt to isolate the benefits provided by Auto DR, in particular the load impact that can be attributed to Auto DR and that, in its absence, could not have been achieved.
V. SDG&E's Demand Bidding Program (DBP)
The Commission approved SDG&E's Demand Bidding Program in the middle of summer 2012 as part of the mitigation efforts to address the SONGS outage.119 SDG&E called 3 day-ahead DBP events and obtained load reductions of 5.1 MW, 5.4 MW and 4.6 MW. During the course of 2012, SDG&E spent $44,192 on this program, which was minimal compared to its residential PTR program costs of $10 million for only 4 MW of load reduction. The recent DR decision (D.13-04-017) approves SDG&E's proposed continuation of its Demand Bidding Program, modified from a day-ahead to a day-of, 30-minute product. The purpose of this modification was to align the program with the Energy Division letter and provide programs with quick response capability.
In its comments, the Navy stated that a DBP with a 30-minute trigger would only permit participation from entities with automated demand response systems, in effect reducing participation. The Navy requests the continuation of the day-ahead program. The Navy states that the 2012 DBP did not allow for the Navy's participation, and the change to a day-of, 30-minute program will further limit the Navy's ability to participate. Instead, the Navy proposes a day-ahead program with some modifications. The Navy proposes that the customer be allowed to aggregate 8 billable meters. The second proposal is to lower the minimum bid requirement from 5 MWh to 3 MWh. The Navy states that it may not be able to produce 5 MWh at a single geographic location and cites its experience of August 2012, when, during a demand reduction test, the Navy shed 4 MWh from a multitude of shore facilities on three Navy installations.
SDG&E responded to the Navy's comments and explained why it believes the Navy did not participate in the 2012 DBP. SDG&E understands that the Navy will participate in an emergency program; however, the DBP is not a day-ahead emergency program. In its response, SDG&E indicated its willingness to work with the Navy to create a demand response program that meets the Navy's unique needs.
119 Resolution E-4511, July 12, 2012.
Staff recommends that SDG&E and the Navy collaboratively develop the Navy-only DBP program, which will address the following issues raised in the Navy's comments on the 2013-2014 DR Proposed Decision:120
1. A day-ahead trigger to enable the Navy to appropriately plan for the event.
2. The ability to aggregate 8 billable meters.
3. A lower minimum bid requirement, from 5 MW to 3 MW.
Experience from the demand reduction test demonstrates that the Navy has the ability to reduce load and be a useful DR resource for SDG&E's system during the summer.
120 Filed on April 9, 2013, in A.12-12-016.
Chapter 6: Flex Alert Effectiveness
I. Summary of Staff Analysis and Recommendations
The Flex Alert campaign has not been evaluated since 2008. Earlier evaluations from 2004-2008 suggest that the impacts of an emergency alert have ranged from an estimated 45 MW to 282 MW. The utilities have identified areas to improve communication between the CAISO and the utilities when alerts are triggered and cancelled. The utilities also question whether customers are confused about the differences between a Flex Alert event and local Peak Time Rebate events. The utilities cite several reasons to consider transitioning Flex Alert from a utility-funded program to a CAISO-led and CAISO-funded program.
Staff finds that there is a lack of data to evaluate the effectiveness and value of the Flex Alert campaign. Staff agrees with the utilities that an evaluation in the current program cycle is needed. Staff finds that there is merit to the utilities' proposal to terminate Flex Alert as a ratepayer-funded and utility-led activity after 2015. Rather than providing recommendations in this report, staff defers to the proceeding that is currently reviewing the utilities' statewide marketing, education and outreach applications (A.12-08-007, A.12-08-008, A.12-08-009, and A.12-08-010) and the Phase I Decision in that proceeding, D.13-04-021.
II. Background
"Flex Alert" is the current name of a statewide marketing campaign that asks Californians to conserve or shift electricity use when the CAISO determines that there is a risk that electricity supply may not be adequate to meet demand.121 This alert campaign is approved through CPUC decision, and the CPUC authorizes the three investor-owned electric utilities to provide the total budget. One utility acts as the lead utility and contracts with a marketing agency to develop TV and radio ads and purchase advertising time. The marketing agency purchases advertising slots throughout the state to run the ads during the summer season, when demand is likely to be highest and the grid is more likely to be constrained. The CAISO triggers an alert based on grid conditions and informs the utilities and the marketing agency. The marketing agency swaps out informational advertisements with emergency alert messages, calling a "Flex Alert" and asking Californians to do three things during a six-hour window of time on a specific day: 1) turn off unnecessary lights, 2) set air conditioners to 78 degrees, and 3) wait until after 7 p.m. to use major appliances. Individuals and businesses also have the opportunity to sign up to receive emails or texts notifying them that there will be an alert.
Flex Alert Performance in 2012
In 2012, two Flex Alert events were called, on August 10 and August 14. Initially, Flex Alerts were triggered on August 9 for August 10-12. However, the alerts for August 11 and 12 were later cancelled. A formal evaluation of Flex Alert was not conducted in 2012.
121 From 2001-2004 the name for emergency alerts was Power Down, and from 2004-2007 they were referred to as Flex Your Power Now.
The utilities did not conduct any analysis to estimate the impacts that resulted from either Flex Alert event. SDG&E was concerned that customers would not recall the difference between Reduce Your Use, which provides customers a bill credit, and Flex Alert, which provides no monetary benefits. In the event that customers did not understand the difference between Flex Alert and Reduce Your Use, SDG&E wanted to avoid the customer frustration that could occur when customers reduced their usage during a Flex Alert but were not paid for it. To mitigate confusion, SDG&E triggered its Reduce Your Use program on the same days when Flex Alerts were called. There were three days when the utility triggered a Reduce Your Use event when there was no Flex Alert. However, the utility claims that the weather on the three Reduce Your Use event days was atypical, and therefore the utility cannot determine the load reductions attributable to Flex Alert by comparing Flex Alert days with the days when Reduce Your Use was called and Flex Alert was not.122
SCE also states that, with the limited data available, the utility cannot determine the effect of a Flex Alert. SCE did a basic comparison of two days with similar conditions when the same DR programs were dispatched, one day with a Flex Alert and one day without, and concluded that Flex Alert could be counterproductive because SCE's total system load was higher on the day the Flex Alert was called.123
In comparison, there have been three evaluations of the alert campaign in its history: 2004-2005, 2006-2007, and 2008. The 2004-2005 evaluation did not estimate the impact of an alert event. The 2006-2007 evaluation reported that the system-wide demand response impact on Flex Alert days (including all other demand response programs that were called) ranged from 200 MW to 1,100 MW. The impact from Flex Alert was a portion of this total. The 2006-2007 evaluation estimated the load impacts associated with alert events, specifically from customers adjusting their air conditioner settings in response to the ads. Although the study estimated impacts ranging from 93 MW to 495 MW,124 in 2008 the consultant redid its analysis with revised assumptions and adjusted the estimate to between 45 MW and 75 MW.125 The 2008 evaluation estimated load impacts based on customers turning off lights and adjusting air conditioners. The study estimated that impacts from alert events in 2008, based on customers taking these two actions, ranged from 222 MW to 282 MW.126
Since 2008 there has been a long gap in evaluating Flex Alert. In 2009 and 2010 no Flex Alert events were triggered. In 2011 there was one event, but there was no evaluation. Given that Flex Alert has not been evaluated since 2008, and the utilities seem unable to draw any conclusions about load impacts attributable to Flex Alert events, it is reasonable to plan an evaluation for the summer of 2013.
122 SGE-02 at 25.
123 SCE-01 at 59.
124 2006-2007 Flex Your Power Now! Evaluation Report, Summit Blue Consulting, May 22, 2008, p. 126. A link to this report is provided in Appendix S.
125 2008 Flex Alert Campaign Evaluation Report, Summit Blue Consulting, December 10, 2008, p. 102. A link to this report is provided in Appendix S.
126 Id.
The Commission issued a Decision on the utilities' statewide marketing application on April 18, 2013. The Decision includes a directive to evaluate the program.127

III. Utility Experience with Flex Alert

SCE identified three weaknesses in the implementation of Flex Alert. First, the utility states that challenges exist because neither a utility nor the CPUC owns the trademark to the name Flex Alert. Second, the utilities did not receive advance notice from the CAISO when a Flex Alert was triggered or cancelled. Third, the CAISO's inability to accurately forecast the duration of an alert resulted in confusion when an alert was cancelled.128

The utilities state that they were contacted by the CAISO at the same time that the news media and the general public were informed about a Flex Alert. The CAISO held weekly calls with the utilities to discuss weather forecasts and the likelihood that a Flex Alert would be called. However, when an alert was triggered, the utilities learned of the event through a robo-call, automated email, or text message, which are the same methods used to inform residential customers and media outlets. The utilities would prefer advance notification so that they can strategically coordinate the initiation of their own DR programs and proactively communicate with customers.

The cancellation of the weekend alerts on August 11 and 12 also caused confusion. SDG&E claims that both internal staff and local media were confused about whether conservation and Reduce Your Use days were still necessary.129 SCE acknowledges that it is inefficient and costly to re-contact media outlets to cancel alerts. Flex Alert radio and television commercials continued to air throughout the weekend because the marketing agency was not able to give the media stations adequate time to switch the messages before the ads were locked in for the weekend. To add to the confusion, the CAISO's website continued to indicate there was a Flex Alert even though the agency had issued a press release stating that the weekend alert events were cancelled.130

Prior to the start of the 2013 Flex Alert season, the utilities, the CAISO, and the marketing agency should discuss the weaknesses identified by the utilities in 2012. The organizations should use their expertise and the recommendations from past Flex Alert evaluations to identify methods to improve the timeliness of communication and ensure that implementation is as efficient and effective as possible.

IV. Customer Experience

Only one utility, SCE, conducted a survey in 2012 to determine customer awareness of and reaction to the 2012 Flex Alert campaign. Although SDG&E did not conduct a survey, the utility raised concerns that both customers and the media seem confused about the difference between Reduce Your Use events and Flex Alert.131

127 D.13-04-021, Ordering Paragraph 14.
128 SCE-01 at 60.
129 SGE-02, Attachment 1 at 26.
130 SCE-01 at 60.
131 SGE-02, Attachment 1 at 26.
SCE found that 10 percent of surveyed customers were confused about the difference between the utility's Peak Time Rebate and Flex Alert.132 SCE reported the following results from its survey of 400 customers:133

• Nearly 60% of residential customers reported hearing or seeing a Flex Alert advertisement.
• 54% of small business customers reported hearing or seeing a Flex Alert message.
• 25% of residential customers surveyed reported that they took steps to reduce electricity use on a Flex Alert day.
• 21% of small business customers reported taking steps to reduce usage when a Flex Alert was called.

Compared with past evaluations, customer recall of Flex Alert ads has increased from one evaluation period to the next. In 2004-2005, 12 percent of customers could recall hearing or seeing an ad, compared to 15 percent in 2005-2006 and 23 percent in 2008.134 A formal evaluation in 2013 can help determine whether the jump in awareness reflected in SCE's survey results accurately reflects the trend. The 2013 evaluation should take into account the variety of mechanisms used to relay information about alerts to customers. For example, 2012 was the first year that the utilities conducted outreach through Community Based Organizations to help prepare customers for a Flex Alert event.

Another highlight from SCE's survey is that 25 percent of residential customers took action in response to an alert. This percentage is also an increase from past evaluation results; in past years, between 10 and 21 percent of residential customers reported taking action in response to the ads.135 However, SCE's survey did not determine whether customers accurately understood the message that a Flex Alert is intended to convey. All three prior evaluations found that customers did not understand that they were supposed to adjust their behavior only on the day of the Flex Alert event; instead, customers reported continuing to conserve during afternoon hours every day after the event had been called.136 While conservation has its own benefits, the purpose of a Flex Alert is for customers to shift load during a brief peak event. It will be important for the utilities, the CAISO, and the marketing agency to continue to strive to accurately relay this concept, and for an evaluation to determine whether the right message is getting through to customers.

The utilities made one specific recommendation to improve the program in 2013-2014.

132 SCE-01 at 61.
133 SCE-01 at 61.
134 Process Evaluation of the 2004/2005 Flex Your Power Now! Statewide Marketing Campaign, Opinion Dynamics Corporation, July 24, 2006, p. 5; 2006-2007 Flex Your Power Now! Evaluation Report, Summit Blue Consulting, May 22, 2008, p. 90; 2008 Flex Alert Campaign Evaluation Report, Summit Blue Consulting, December 10, 2008, p. 83. Links to these reports are provided in Appendix S.
135 Id.
136 Id.
They proposed to continue community outreach partnership efforts in 2013 and 2014 in the demand response proceeding. The Commission adopted a Decision on April 18, 2013, which approves these requests.

V. The Future of Flex Alert

SDG&E cites a passage from SCE's testimony in the Statewide Marketing Application that identifies several reasons the Commission should consider having the CAISO take full control of the statewide emergency alert campaign starting in 2015. SDG&E states that it supports SCE's recommendation.

SCE's testimony states that since 2004 the utilities have funded alerts through ratepayer dollars. However, when alerts are called, the results benefit customers outside of the utilities' service territories as well, yet neither the CAISO nor non-utility Load Serving Entities contribute to the funding. SCE also pointed out that from 2007-2011 only one alert was triggered, and the utility states that growth in utility demand response programs has positively impacted grid reliability. SCE found it challenging to balance utility-specific regulatory constraints with the CAISO's desired scope of the program; as an example, the CAISO requested to share emergency alert messaging with Baja Mexico to promote energy conservation in that region. SCE's testimony goes on to state that the utilities do not have discretion over when to trigger the program. SCE recommended that, since the CAISO triggers the program, the CAISO should assume total ownership of, and authority over, it. SCE requests that this recommendation be approved during 2013-2014 so that the CAISO has the opportunity to seek funding in its GMC cost recovery.137

The Commission adopted a Decision on Phase 1 of the utilities' statewide marketing application on April 18, 2013. The Decision authorizes a total of $20 million to be spent on Flex Alerts between now and the end of 2014 and directs the program to be evaluated. The Decision also includes a directive for the utilities to work with the CAISO to develop a proposal for the transfer of the administration and funding of the Flex Alert program to the CAISO or another entity, effective in 2015, and directs SCE to submit the proposal in the Statewide Marketing Proceeding by March 31, 2014.138

VI. DR Program Ex Post Load Impact Results on the Flex Alert Days

As shown in the table below, all three utilities triggered a DR event for some of their DR programs during the two Flex Alert days, with a total of 739 MW of load reduction from 4:00-5:00 p.m. on August 10, 2012 and 432 MW from 3:00-4:00 p.m. on August 14, 2012. The CAISO reported that the actual system peak loads during the peak hours between 3:00 p.m. and 5:00 p.m. were significantly lower than its forecasts and attributed the load drops to its Flex Alerts. However, the data suggest that at least some portion of the load reduction came from the DR programs. Appendix P shows the ex post load impact for each of the utilities' DR programs on the two Flex Alert days.

137 SGE-02, Attachment 1 at 28.
138 D.13-04-021 at 25-27.
Table 36: Utilities' DR Program Ex Post Load Impact on the Flex Alert Days139

                          Ex Post (MW)
Utility        3:00-4:00 p.m.    4:00-5:00 p.m.
8/10/12:
  SCE               194               185
  SDG&E               8                27
  PG&E              459               527
  Total             661               739
8/14/12:
  SCE               394               242
  SDG&E              38                38
  PG&E           No Events         No Events
  Total             432               280

139 Provided to staff through emails. Data source: the utilities' April 2, 2013 Load Impact Reports (links to the reports are provided in Appendix S).
Chapter 7: Energy Price Spikes

I. Summary of Staff Analysis and Recommendations

Because most DR programs are dispatched a day ahead or several hours ahead of events, it is difficult for the utilities to use DR programs effectively in response to real-time price spikes. There were many days on which price spikes occurred but DR programs were not called, and conversely there were days on which DR programs were called but no price spikes occurred. DR programs with 30-minute or 15-minute notice could respond to price spikes much more efficiently.

II. Definition of Price Spikes

For the purposes of this report, a price spike day is defined as any day on which the average hourly real-time price reached $150/MWh or more in 3 or more hours from HE12 through HE18. This definition is designed to evaluate only those hours in which DR could respond: restricting the definition to HE12-HE18 considers only the hours when DR can be called, and restricting it to days with 3 or more hours above $150/MWh eliminates days with momentary price jumps that DR could not reasonably be expected to respond to.

III. DR Programs and Price Spikes

Using the definition above, SCE had 67 hours that averaged $150/MWh or more across the hour, with 7 days on which 3 or more hours averaged $150/MWh or more. SDG&E had 126 such hours, with 18 days on which 3 or more hours averaged $150/MWh or more. DR events overlapped real-time price spikes with varying success: SCE was successful on 2 out of 7 days, whereas SDG&E was successful on 4 out of 18 days.140

Table 37: Number of Days with Energy Price Spikes

                                                                      SCE    SDG&E
Days that DR events successfully overlapped price spike days
(3 or more hours of $150/MWh) between HE12-HE18                        2        4
Number of price spike days (3 or more hours of $150/MWh)
between HE12-HE18                                                      7       18
Days that DR events were called                                       43       15
Days that DR events were called but without a price spike
($150/MWh) occurring                                                  31        6
Days with at least 1 price spike of $150/MWh                          36       60

Most of the utilities' price-responsive DR programs are currently designed to be called a full day ahead of when the load reductions are needed. The existing programs therefore do not use real-time hourly prices as a trigger; they are triggered by other market indicators such as heat rates and forecasted temperature.

140 For a more complete chart, see Appendix Q.
According to SCE, price spikes occur with 2.5 minutes of notice, and any resource that could be used to mitigate them would have to be already bid into the CAISO's market awaiting CAISO dispatch instructions.141 DR programs are currently not bid into the CAISO's market.

To the extent that DR programs were triggered when price spikes occurred, it is outside the scope of this report to quantify the impact of DR programs on those price spikes. Quantifying those impacts would require some method of modeling what prices would have been but for the load impacts of the DR programs. In theory, because DR events overlapped price spike days on a few occasions, demand response on those days probably had some downward impact on the equilibrium price (i.e., it mitigated the price spikes).

IV. Conclusion

DR programs are not able to address real-time price spikes because of their current design and because the programs are not yet bid into CAISO markets. The utilities should design new DR programs that enable them to mitigate real-time price spikes, in anticipation that these programs will be bid into CAISO markets.

141 A.12-12-017, SCE Exhibit 1, pages 48-49.
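To make the screen in Section II concrete, the sketch below flags price spike days from hourly real-time prices and tallies how many of them overlap with DR event days, the same comparison summarized in Table 37. It is a minimal illustration rather than the analysis staff performed; the $150/MWh threshold and the HE12-HE18 window come from this chapter, while the data layout, the function names, and the `dr_event_days` input are assumptions made only for illustration.

```python
from collections import defaultdict

# Screen from Chapter 7, Section II: a "price spike day" has 3 or more hours
# from HE12 through HE18 whose average real-time price is >= $150/MWh.
SPIKE_PRICE = 150.0      # $/MWh threshold used in this report
WINDOW = range(12, 19)   # hour-ending 12 through 18
MIN_HOURS = 3            # hours at/above the threshold needed to count the day


def find_price_spike_days(hourly_prices):
    """hourly_prices: iterable of (date, hour_ending, avg_price_usd_per_mwh)."""
    hours_over = defaultdict(int)
    for date, hour_ending, price in hourly_prices:
        if hour_ending in WINDOW and price >= SPIKE_PRICE:
            hours_over[date] += 1
    return {d for d, n in hours_over.items() if n >= MIN_HOURS}


def overlap_summary(hourly_prices, dr_event_days):
    """Count spike days, DR event days, and their overlap (cf. Table 37)."""
    spike_days = find_price_spike_days(hourly_prices)
    dr_days = set(dr_event_days)
    return {
        "price_spike_days": len(spike_days),
        "dr_event_days": len(dr_days),
        "dr_events_overlapping_spikes": len(spike_days & dr_days),
        "dr_events_without_spikes": len(dr_days - spike_days),
    }


# Hypothetical usage with made-up data (not actual 2012 CAISO prices):
if __name__ == "__main__":
    prices = [("2012-08-10", h, 180.0) for h in range(12, 16)] + \
             [("2012-08-11", 14, 200.0)]
    print(overlap_summary(prices, dr_event_days={"2012-08-10", "2012-08-12"}))
```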
Chapter 8: Coordination with the CAISO

I. Staff Recommendations

Because the utilities' current DR programs are not integrated into the CAISO wholesale energy market, there is no market mechanism to inform the CAISO how much DR capacity exists in the system on a daily and hourly basis. Such information is important for the CAISO's operational considerations. The utilities' Weekly and Daily DR reports developed in summer 2012 are a valuable alternative for making their DR resources more visible to the CAISO. Staff appreciates the utilities' efforts in developing and submitting the Daily and Weekly DR reports and agrees with the CAISO that all three utilities should submit them in the summers of 2013 and 2014. The utilities' (including PG&E's142) DR reporting requirements for 2013-2014 are summarized in Appendix R.

II. DR Reporting Requirements in Summer 2012

As discussed above, prior to summer 2012 and under the oversight of the Governor's Office, the Commission worked closely and intensively with the CAISO, the CEC, and the utilities on contingency planning to mitigate the potential effects of the SONGS outage and ensure system reliability throughout the summer. One of the initial steps was to identify the utilities' DR resources available to address five different types of system contingencies (transmission, voltage collapse, generation deficiency, etc.), which is referred to as the mapping of the DR programs. The next step was to develop a mechanism to inform the CAISO how much Day-Ahead and Day-Of DR capacity is available on a daily and hourly basis.

Unlike other generation resources, DR is currently not integrated into the CAISO's wholesale energy market. Under the CAISO's DR Resource User Guide,143 the utilities are required to submit forecasts and results only for triggered DR programs. Therefore, if no DR program is triggered, the CAISO is blind to how much DR capacity exists in the system. With the exception of the emergency programs, the DR programs are dispatched by the utilities, not the CAISO. This operation, as well as the reporting requirements set in the CAISO's guide since 2007, had not presented any problem in the past when the system had sufficient resources. However, in light of the SONGS outage, the CAISO emphasized the importance of daily communication on the utilities' DR programs so that the CAISO's grid operator could request the utilities to dispatch their DR programs if and when needed.

Working cooperatively with the CAISO and Commission staff, the utilities developed and submitted the Daily DR reports from June 1, 2012 to October 31, 2012.

142 As staff guidance only, because PG&E is not subject to this proceeding.
143 DRAFT Version 1.0, August 30, 2007. http://www.caiso.com/1c4a/1c4a9ef949620.pdf.
The utilities continued to submit the results of DR events seven days after each event (referred to as the "7-Day Report"), consistent with the CAISO's guidance. Staff provided the data from the Daily and 7-Day reports to the Governor's Office in weekly briefings during summer 2012.

III. DR Reporting Requirements for 2013-2014

In its 2013-2014 DR application, SCE proposed to eliminate the Weekly and Daily DR reporting requirements because it did not find that these reports provided value for SCE. SCE recommends a transition back to the 2007 CAISO User Guide, but suggests that the CAISO should update it and publish it for all DR providers.144 In its protest to SCE's application, the CAISO objects to SCE's proposal and requests that the utilities resume the Daily DR reports after the winter season ends. The CAISO contends that "(t)he underlying purpose of the date forecasting and publication was to benefit the system operator rather than the IOUs themselves. The ISO finds good value in the daily demand response reports. Because the report mechanism, the ISO is no longer blind to how much DR capability exists in the system in a daily and hourly basis, if and when it is needed."145

Staff finds that these reports have value not only to the CAISO but also to the Commission. Through the Daily and 7-Day reports, staff was able to monitor DR status and provide timely updates to the Governor's Office throughout the summer. A number of lessons learned from these reports led to the development of the comprehensive questions on DR performance. Therefore, staff recommends the continuation for 2013-2014 of all of the DR reports submitted in 2012, as summarized in Appendix R.

144 A.12-12-017, SCE-1, at p. 54.
145 A.12-12-017, CAISO's Comments filed on January 18, 2013.
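Chapter 8 describes the Daily DR report as a per-program, per-hour statement of available Day-Ahead and Day-Of DR capacity. The report's actual template is not reproduced in this document, so the sketch below is only a hypothetical illustration of the kind of record such a report might carry; the field names and example values are assumptions, not the utilities' or the CAISO's actual format.

```python
from dataclasses import dataclass


@dataclass
class DailyDRReportRow:
    """One hypothetical row of a Daily DR availability report.

    Chapter 8 only says the report tells the CAISO how much Day-Ahead and
    Day-Of DR capacity is available by day and hour; the fields below are
    an illustrative guess at what that implies, not the real template.
    """
    utility: str          # e.g., "SCE" or "SDG&E"
    program: str          # e.g., "Capacity Bidding Program (Day-Of)"
    date: str             # operating date, ISO format
    hour_ending: int      # 1-24
    day_ahead_mw: float   # DR capacity available with day-ahead notice
    day_of_mw: float      # DR capacity available with same-day notice


def total_available_mw(rows, date, hour_ending):
    """Sum Day-Of capacity across programs for one operating hour."""
    return sum(r.day_of_mw for r in rows
               if r.date == date and r.hour_ending == hour_ending)


# Hypothetical usage with made-up numbers:
rows = [
    DailyDRReportRow("SCE", "Summer Discount Plan (Residential)",
                     "2012-08-14", 16, 0.0, 250.0),
    DailyDRReportRow("SDG&E", "Summer Saver", "2012-08-14", 16, 0.0, 20.0),
]
print(total_available_mw(rows, "2012-08-14", 16))  # 270.0
```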
Appendix A: Highlights of 2012 Summer Weather & Load Conditions146

SCE
Date          Max Temp (°F)   Max RA Temp (°F)   DR Ex Post Load Impact (MW)   Peak Load (MW)
8/10/2012          89                93                     192                   22,282
8/13/2012          90                95                      59                   22,428
9/14/2012         100                97                      93                   21,799
10/1/2012          95               N/A                      80                   21,355
10/17/2012         97                88                     270                   17,609

SDG&E
Date          Max Temp (°F)   Max RA Temp (°F)   Ex Post Load Impact (MW)      Peak Load (MW)
8/13/2012          91                88                      31                    4,266
8/17/2012          94                88                      23                    4,266
9/14/2012         109                96                      46                    4,592
9/15/2012         104                96                      32                    4,313
10/2/2012          98                96                      25                    4,146.3

146 Includes event days with the top three highest temperatures and peak loads.
Appendix B: Energy Division November 16, 2012 Letter

Provided in a separate PDF file.
Appendix C: Descriptions of DR Load Impact Estimates

2012 RA

The 2012 Resource Adequacy (RA) load is a monthly forecast estimate of the load reduction attributed to individual DR programs under a 1-in-2 weather year condition. This value is used in load resource planning, and it is based on year-ahead forecasted customer enrollment.

SCE's Methodology

In SCE's A.12-12-017 March 4th Response to the ALJ's February 21, 2013 Ruling Requesting Applicants to Provide Additional Information, the 2012 RA MW is based on SCE's ex ante load impact results under a 1-in-2 weather year condition, at the portfolio level, using average hourly impacts from 1 p.m. to 6 p.m. in May-October and from 4 p.m. to 9 p.m. in November-April. The PTR and Residential and Commercial Summer Discount Plan (AC Cycling) methodologies follow these steps:
1. Defining data sources
2. Estimating ex ante regressions and simulating reference loads by customer and scenario
3. Calculating percentage load impacts from ex post results
4. Applying percentage load impacts to the reference loads; and
5. Scaling the reference loads using enrollment forecasts147

2012 SCE Resource Adequacy Protocols – Program Details148
Summer Discount Plan (AC Cycling) & Peak Time Rebate (PTR): time of day (hour), day of week, variables for Monday and Friday, month, cooling degrees, heating degrees.

SDG&E's Methodology

The 2012 Resource Adequacy MW is based on SDG&E's ex ante load impact results under a 1-in-2 weather year condition, at the portfolio level, using average hourly impacts from 1 p.m. to 6 p.m. in May-October and from 4 p.m. to 9 p.m. in November-April. The forecast is calculated in accordance with the load impact protocols149 by multiplying (1) the historical load impact per participant as a function of weather by (2) SDG&E's forecast of the number of participants per program.

147 Details of RA protocols obtained from SCE DRAFT 2012 Ex Post Ex Ante Load Impact for SCE's PTR, p. 16. http://www3.sce.com/law/cpucproceedings.nsf/vwOtherProceedings?Openview&Start=1&Count=25
148 Details of RA protocols obtained from SCE DRAFT 2012 Ex Post Ex Ante Load Impact for SCE's SDP. http://www3.sce.com/law/cpucproceedings.nsf/vwOtherProceedings?Openview&Start=1&Count=25
149 D.08-04-050.
150 Detailed information on RA protocols obtained from "San Diego Gas & Electric Company Response to Administrative Law Judge's Ruling Requesting Applicants to Provide Additional Information," p. 14, and communication with Kathryn Smith, SDG&E.
Load Impact Per Participant150

The first step in the process is the development of a regression model. The model used in the analysis includes the following input variables: temperature, day of week, month, and participant loads prior to the DR event (i.e., participant loads at 10 a.m.). A 1-in-2 weather year condition was used as an input variable in the regression model; it represents the monthly peak day temperature for an average year. SDG&E used 2003-2011 historical weather data to calculate monthly system peak temperatures. In the event that DR program enrollment, baselines, or the number of DR events changed significantly, data from prior years was used. Regression variable coefficients from the 2011 ex post model were used in the 2012 RA forecast model. After the impact-per-participant regression model is developed, the model is re-run with average monthly peak temperature values. The output is the historical load impact per participant as a function of weather.

SDG&E's Forecast of the Number of Participants per Program

The forecasted number of participants per DR program is obtained by examining historical trends and program design changes.

2012 SDG&E Resource Adequacy Protocols – Program Details151

ACSAVER: 1-in-2 weather data for the monthly system peak day; enrollment estimates by customer type (residential and commercial) and by cycling option (Res – 50%, 100% cycling; Com – 30%, 50% cycling).

BIP A: Time of day, day of week, month, temperature (shape and trend variables and interaction terms designed to track variation in load across days of the week and hours of the day); forecasted load in the absence of a DR event (i.e., the reference load); participant's Firm Service Level; estimates of over- or under-performance; TOU period variables (binary variables representing when the underlying TOU rates changed during the day and season).

CBP DA/DO: Simulated per-customer reference loads under a 1-in-2 weather year condition and event type scenarios (e.g., a typical event, or the monthly system peak day); estimates of reference loads and percentage load impacts, on a per-enrolled-customer basis, based on modified versions of the ex post load impact regressions; estimated percentage load impacts combined with program enrollment forecasts from SDG&E to develop alternative forecasts of aggregate load impacts. Forecasts were developed at the program and program type (e.g., DA and DO) level.

CPP-D: Load impacts for existing CPP-D customers were prepared for 2010-2020 based on per-customer reference loads and load impact estimates from the ex post evaluation, and enrollment forecasts. The enrollment forecast for CPP-D is calculated using opt-out rates by NAICS.

CPP-E: The forecast is based on prior event data and accounts for temperature and customer growth.

151 Details of RA protocols obtained from "Executive Summary of the 2011 SDG&E Measurement and Evaluation Load Impact Reports." http://www.sdge.com/sites/default/files/regulatory/SDGE_PY2011_LoadImpactFiling_ExecutiveSummary%20final.pdf
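To illustrate the two-factor calculation described above for SDG&E's RA forecast (historical load impact per participant as a function of weather, multiplied by the forecast of participants per program), here is a minimal sketch. The coefficients, temperatures, enrollments, and function names are assumptions made only for illustration; SDG&E's actual regressions include additional variables (day of week, month, pre-event load) and are estimated from historical interval data.

```python
# Minimal sketch of the RA-style forecast logic described above:
# (impact per participant as a function of weather) x (forecast enrollment).
# The coefficients and temperatures below are illustrative assumptions only.

def impact_per_participant_kw(peak_temp_f, intercept=-2.0, slope=0.035):
    """Toy stand-in for the per-participant impact regression.

    A real model would include day of week, month, and pre-event load;
    here the impact simply grows with the monthly 1-in-2 peak-day temperature.
    """
    return max(0.0, intercept + slope * peak_temp_f)


def ra_forecast_mw(monthly_peak_temps_f, enrollment_forecast):
    """Monthly RA MW = per-participant kW impact x forecast participants."""
    forecast = {}
    for month, temp in monthly_peak_temps_f.items():
        kw = impact_per_participant_kw(temp)
        forecast[month] = kw * enrollment_forecast[month] / 1000.0  # kW -> MW
    return forecast


# Hypothetical usage (made-up 1-in-2 peak temperatures and enrollments):
temps = {"Aug": 95.0, "Sep": 93.0}
enrollment = {"Aug": 30000, "Sep": 31000}
print(ra_forecast_mw(temps, enrollment))
```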
PTR Com & Res: There are five major assumptions required to compute the expected PTR load reduction from residential customers: 1) the meter deployment rate, 2) the rebate price, 3) the participation rates, 4) the average load, and 5) the elasticity, which determines the percent impact per customer when combined with the prices. Average load is based on SDG&E's load research and daily load profile data. The model inputs include:
• Average daily energy use per hour in the peak and off-peak periods
• Elasticity of substitution between peak and off-peak energy use
• Average price during the peak and off-peak pricing periods
• Change in elasticity of substitution due to weather sensitivity
• Average cooling degrees per hour during the peak period
• Change in elasticity of substitution due to the presence of central air conditioning

2012 Adjusted RA

The DR load impact for 2012 Adjusted RA is a monthly estimate of the expected load reduction attributed to individual DR programs that accounts for current customer enrollment. This value is used in load resource planning.

SCE's Methodology

Adjusted RA is calculated by dividing the 2012 RA value by the 2012 RA enrollment to obtain the average RA load impact per customer, and then multiplying that average by the number of ex post customers that were dispatched. The adjusted RA value accounts for the difference between the number of customers forecasted for RA and the number of customers actually enrolled during the ex post events; i.e., the adjusted RA represents what the RA value would have been if SCE had had perfect knowledge of enrollment for 2012.

SDG&E's Methodology

The adjusted 2012 RA load forecast is obtained by multiplying the 2012 RA impact per customer by the number of currently enrolled customers. SDG&E did not adjust its 2012 RA load forecast for weather or other variables.

DR Daily Forecast and CAISO's 7-Day Report

The daily forecast is intended to provide an estimate of the expected hourly load reduction per DR program during an event period. The CAISO's 7-Day Reports provide load reduction data that is calculated and reported to the CAISO seven days after a DR event.

SCE's Methodology

AC Cycling: SCE's daily forecast for the Summer Discount Plan is calculated using an algorithm derived from a 1985 AC cycling load reduction analysis report. The algorithm is a linear equation:

MW Reduction = [a + b × (T × k)] × t
Where:
T = Temperature forecasted for the following business day in Covina, CA
t = Air conditioner tonnage available for cycling
k = Temperature adjustment factor
a = Constant adjustment factor
b = Slope adjustment factor

When the temperature in Covina is below 70 degrees, the assumption is that no AC Cycling DR is available and thus no forecast is made. Specific values for a, b, and k are disclosed in a 1986 SCE internal memo for four SCE service area weather zones and for the 50% and 100% cycling strategies.152 Adjustments are made to the algorithm based on the air conditioner tonnage available for cycling. This particular algorithm is only valid for event day temperatures between 90 and 116 degrees. As of this draft, the 1985 AC cycling load reduction analysis report has not been provided to ED staff; consequently, ED staff has not been able to examine the specific slope, constant, and temperature adjustment values.

SCE used a modification of this algorithm to accommodate the hourly forecasts requested by the CAISO prior to August 28, 2012. The modified methodology takes the program load reduction estimate at a temperature input of 100 degrees and scales it based on actual temperatures below 100 degrees. Toward the end of the summer, the legacy algorithm was built into a system in which the temperatures could be applied by hour across the different zones requested by the CAISO.

SCE's 7-day report for the Summer Discount Plan is calculated using the AC cycling load reduction algorithm with a temperature input based on actual temperatures in Covina, CA. When the temperature in Covina is below 70 degrees, the assumption is that no AC Cycling DR is available. Adjustments are made based on enrollment and temperature.

SDG&E's 7-day results reports for the AC Saver program are calculated using a one- or two-day baseline, with adjustments based on the same day or on historical days with weather conditions most similar to the event day. The 7-day report results provided to the CAISO are hourly, but the event day results average the results from 1 p.m. to 6 p.m. for events that include those hours, and average the results over the event period for events that do not include all of the hours from 1 p.m. to 6 p.m.

Peak Time Rebate: SCE's daily forecast and 7-day report for the Save Power Day peak time rebate program are calculated by multiplying the population of residential customers actively enrolled in Save Power Day event notification by a forecasted average load drop of 0.229 kW per participant. SDG&E's methods for developing the daily forecast and 7-day report for the residential peak time rebate program are the same as those described above for the AC Cycling program.

DR Contracts: SCE's daily forecast and 7-day report for the DR Contracts program are calculated as the current month's contract capacity; no adjustments are made for enrollment, temperature, or other factors.

152 See Appendix S.
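The sketch below expresses SCE's legacy AC cycling forecast equation, MW Reduction = [a + b × (T × k)] × t, along with the 70-degree floor and the 90-116 degree validity range noted above, plus the stated 0.229 kW-per-participant Save Power Day average. Because the 1986 memo's actual a, b, and k values were not provided to ED staff, the coefficients, tonnage, and enrollment in the example are placeholders, not SCE's actual figures.

```python
# Sketch of the legacy SCE Summer Discount Plan daily-forecast equation
# described above: MW Reduction = [a + b * (T * k)] * t.
# The a, b, and k values below are PLACEHOLDERS; the real zone-specific
# values come from a 1986 SCE memo that was not available for this report.

def sdp_daily_forecast_mw(temp_f, tonnage, a=0.0002, b=0.000005, k=1.0):
    """Forecast AC-cycling load reduction (MW) for one weather zone.

    temp_f  : forecast temperature for the next business day (Covina, CA)
    tonnage : air-conditioner tonnage available for cycling
    """
    if temp_f < 70:
        return 0.0  # below 70 degrees, no AC Cycling DR is assumed available
    if not 90 <= temp_f <= 116:
        # The legacy algorithm is only stated to be valid for 90-116 degrees.
        raise ValueError("algorithm valid only for event-day temps of 90-116 F")
    return (a + b * (temp_f * k)) * tonnage


def save_power_day_forecast_mw(enrolled_customers, kw_per_participant=0.229):
    """Save Power Day (PTR) forecast: enrollment x 0.229 kW average drop."""
    return enrolled_customers * kw_per_participant / 1000.0  # kW -> MW


# Hypothetical usage (made-up tonnage and enrollment):
print(sdp_daily_forecast_mw(temp_f=98.0, tonnage=100000))
print(save_power_day_forecast_mw(enrolled_customers=400000))  # ~91.6 MW
```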
SDG&E's Methodology

Daily Forecast: The daily forecast is calculated in two steps. The first step is the creation of a regression model that predicts the entire load of participating customers. Model input variables include temperature, day of week, and month. The temperature inputs used in the regression model are the monthly peak temperatures from the prior year. In some instances, the load forecast may be scaled up or down according to the number of currently enrolled participants and their impact on on-peak load; in other instances, if large customers leave a program, the load forecast regression is re-run with the participants that are still enrolled in the program. The second step is to multiply the estimated load of participating customers by a fixed percentage load reduction based on ex post results from the previous year.

CAISO's 7-Day Report: Load reductions detailed in the CAISO's 7-Day Report are calculated by subtracting an estimated baseline from the measured load during DR event hours. SDG&E uses the 10 working days prior to an event to calculate an estimated baseline for its CPP, CBP, CPP-E, and BIP programs. For its residential programs, SDG&E uses 1 to 2 days to calculate its estimated baseline; the exception is that if a PTR event occurs on a Monday, data from the prior work week (excluding event days) is used. As of this draft, Energy Division staff has not inspected SDG&E's regression model, model inputs, or the cases where comparisons and judgment were applied to scale forecasts up or down.

Ex Post Results and Settlement Data

Ex Post Results: The ex post result is the measurement of MW delivered, estimated using regression methods. Regression methods use an entire season's data, and data across multiple events, to improve the accuracy of impact estimates. They rely on historical information about customer loads and focus on the relationship between load, or load impacts, during the hours of interest and other predictor variables (i.e., temperature, population characteristics, resource effects, and observed loads in the hours preceding the DR event). Whenever ex ante estimation is required, regression analysis is generally the preferred method because it can incorporate the impact of a wide variety of key drivers of DR. DR load impact estimates are determined directly from the regression model. Decision 08-04-050 adopts protocols for estimating the impact of DR activities for resource planning. The purpose of the ex post results is to inform DR resource planning and program design.

Settlement Data: Day matching is the primary approach used to calculate customer settlement for DR options involving large commercial and industrial customers. Settlement refers to the methods of paying customers for participating in a DR program, and it is an important component of DR program design and implementation. The need to produce estimates in a short time frame after an event, for prompt payment, limits the amount of data collected.
Forecasting future impacts of DR events from settlement data is limited because day matching does not collect data on the influential variables (i.e., weather conditions, seasonal factors, customer population characteristics) that would cause impacts to vary in the future.

SCE's Methodology: Load impact is calculated as the difference between the reference load (baseline) and the observed load (usage). The purpose of the settlement data is to calculate payments to customers.
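To illustrate the baseline arithmetic described in this appendix (a day-matching baseline built from the working days before an event, with the settled load reduction taken as baseline minus observed load), here is a minimal sketch. The ten-day window mirrors SDG&E's description for its CPP, CBP, CPP-E, and BIP programs; the data layout, function names, and the simple unweighted average are assumptions for illustration and omit the same-day adjustments the utilities actually apply.

```python
# Minimal day-matching baseline sketch: average the load in each event hour
# over the prior non-event working days, then settle on baseline - observed.
# The 10-day window follows the description above for SDG&E's C&I programs;
# the unweighted average and data layout are simplifying assumptions.

def day_matching_baseline(history, event_hours, window=10):
    """history: list of (date, {hour: kW}) for prior non-event working days,
    most recent first. Returns {hour: baseline_kW} averaged over the window."""
    days = history[:window]
    if not days:
        raise ValueError("no prior days available to build a baseline")
    return {
        h: sum(loads[h] for _, loads in days) / len(days)
        for h in event_hours
    }


def settled_reduction_kwh(baseline, observed):
    """Credit only positive hourly reductions: max(baseline - observed, 0)."""
    return sum(max(baseline[h] - observed[h], 0.0) for h in baseline)


# Hypothetical usage with made-up loads for a 2-hour event (HE16-HE17):
history = [(f"day-{i}", {16: 500.0, 17: 520.0}) for i in range(12)]
baseline = day_matching_baseline(history, event_hours=[16, 17])
observed = {16: 410.0, 17: 430.0}
print(baseline)                                   # {16: 500.0, 17: 520.0}
print(settled_reduction_kwh(baseline, observed))  # 180.0
```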
    85 Appendix D: SCE2012 Monthly Average DR Program Load Impact (MW)153 with RA Measurement Hours (1 6 p.m.) Month 2012 RA (1) 2012 Adjusted RA Daily Forecast 7 Day Report Year End Ex PostEnrollment (2) Enrollment & Weather(3) Monthly Nominated Programs Capacity Bidding Program (Day Ahead) June 1.19 No Events N/A No Events No Events No Events No Events July 1.24 0.60 N/A 0.07 0.07 0.08 0.07 August 1.27 No Events N/A No Events No Events No Events No Events September 1.23 No Events N/A No Events No Events No Events No Events October 1.18 0.50 N/A 0.09 0.09 0.04 0.01 Capacity Bidding Program (Day Of) June 17.56 No Events N/A No Events No Events No Events No Events July 18.21 14.10 N/A 11.74 11.74 14.84 15.28 August 18.63 12.85 N/A 12.30 12.30 15.38 16.66 September 18.49 12.51 N/A 11.90 11.90 14.65 16.21 October 17.25 11.47 N/A 11.72 11.72 15.02 14.78 Demand Bidding Program June 11.49 No Events N/A No Events No Events No Events No Events July 12.05 2.91 N/A 74.65 85.59 96.09 90.21 August 12.39 3.02 N/A 88.35 77.14 76.81 72.43 September 12.24 No Events N/A No Events No Events No Events No Events October 12.27 2.90 N/A 78.90 71.67 90.33 79.52 Demand Response Contracts (Day Ahead & Day Of) June 99.15 No Events N/A No Events No Events No Events No Events July 102.51 No Events N/A No Events No Events No Events No Events August 104.74 166.28 N/A 275.00 275.00 174.79 182.05 September 103.56 No Events N/A No Events No Events No Events No Events October 100.22 139.10 N/A 185.00 185.00 122.11 114.90 Other Price Responsive Programs Save Power Days / Peak Time Rebates June 207.89 No Events N/A No Events No Events No Events No Events July 256.82 74.02 N/A N/A 58.76 58.76 N/A August 265.60 120.13 N/A N/A 108.02 108.02 35.56 September 238.08 107.66 N/A 108.59 108.62 108.62 10.73 October 202.43 No Events N/A No Events No Events No Events No Events 153 SCE 03, Table 1.
    86 Appendix D: SCE2012 Monthly Average DR Program Load Impact (MW) (Cont.) with RA Measurement Hours (1 6 p.m.) Month 2012 RA (1) 2012 Adjusted RA Daily Forecast 7 Day Report Year End Ex PostEnrollment (2) Enrollment & Weather(3) Summer Advantage Incentive Program / Critical Peak Pricing (CPP) June 66.49 63.23 N/A 42.59 49.00 49.93 27.68 July 69.31 65.64 N/A 52.40 51.72 61.65 39.95 August 68.57 65.14 N/A 52.00 42.46 46.72 38.50 September 65.08 61.66 N/A 46.76 40.09 42.69 35.90 October 62.86 No Events N/A No Events No Events No Events No Events Summer Discount Plan (Residential) June 462.15 168.00 N/A 161.51 137.87 137.87 69.60 July 545.82 538.42 N/A 263.67 158.81 158.81 188.00 August 500.00 454.95 N/A 227.94 162.46 150.95 211.90 September 519.53 514.47 N/A 254.06 254.06 254.06 133.02 October 0.00 N/A N/A 292.62 292.62 292.62 101.95 Emergency Programs Summer Discount Plan (Commercial) June 33.62 No Events N/A No Events No Events No Events No Events July 48.30 No Events N/A No Events No Events No Events No Events August 62.43 4.99 N/A 4.77 3.43 3.43 3.10 September 53.70 No Events N/A No Events No Events No Events No Events October 0.00 N/A N/A No Events No Events No Events No Events Agriculture Pumping Interruptible June 41.26 No Events N/A No Events No Events No Events No Events July 39.66 No Events N/A No Events No Events No Events No Events August 39.78 15.57 N/A 36.00 36.00 15.34 17.29 September 37.71 23.73 N/A 60.56 60.56 28.39 24.00 October 39.58 No Events N/A No Events No Events No Events No Events Base Interruptible Program June 553.24 No Events N/A No Events No Events No Events No Events July 542.67 No Events N/A No Events No Events No Events No Events August 542.52 No Events N/A No Events No Events No Events No Events September 548.21 558.24 N/A 513.78 520.91 441.46 573.01
    87 Appendix E: SCE2012 DR Program Load Impact by Event (MW) Daily Average by Event Hours Event Date Daily Forecast 7 Day Report Year End Ex Post Monthly Nominated Programs Capacity Bidding Program (Day Ahead) 7/23/2012 0.07 0.07 0.03 0.04 7/24/2012 0.07 0.07 0.08 0.09 7/25/2012 0.07 0.07 0.08 0.08 7/30/2012 0.07 0.07 0.11 0.06 7/31/2012 0.07 0.07 0.10 0.07 10/1/2012 0.09 0.09 0.24 0.20 10/2/2012 0.09 0.09 0.33 0.10 10/3/2012 0.09 0.09 0.12 0.05 10/5/2012 0.09 0.09 0.18 0.07 10/17/2012 0.09 0.09 0.00 0.07 10/18/2012 0.09 0.09 0.02 0.17 10/29/2012 0.09 0.09 0.19 0.15 Capacity Bidding Program (Day Of) 7/20/2012 11.74 11.74 14.84 15.28 8/7/2012 12.30 12.30 14.92 16.46 8/13/2012 12.30 12.30 15.22 15.70 8/14/2012 12.30 12.30 16.01 17.82 9/14/2012 11.90 11.90 14.65 16.21 10/2/2012 11.72 11.72 14.24 15.80 10/18/2012 11.72 11.72 15.80 13.76 Demand Bidding Program 7/12/2012 74.65 85.59 96.09 90.21 8/8/2012 85.59 102.63 100.67 92.95 8/10/2012 85.59 94.84 98.76 95.82 8/14/2012 94.09 70.89 66.96 61.76 8/16/2012 94.35 55.50 56.16 62.70 8/29/2012 82.15 61.84 61.51 48.94 10/1/2012 78.75 80.85 98.54 79.78 10/17/2012 79.05 62.49 82.12 79.25 Demand Response Contracts (Day Ahead & Day Of) 8/14/2012 275.00 275.00 174.79 182.05 10/2/2012 185.00 185.00 122.11 114.90
    88 Appendix E: SCE2012 DR Program Load Impact by Event (MW) (Cont.) Daily Average by Event Hours Event Date Daily Forecast 7 Day Report Year End Ex Post Other Price Responsive Programs Save Power Days / Peak Time Rebates 7/12/2012 N/A 58.76 58.76 8/10/2012 N/A 107.24 107.24 95.85 8/16/2012 N/A 107.61 107.61 24.43 8/29/2012 N/A 108.51 108.51 21.93 8/31/2012 N/A 108.73 108.73 0.02 9/7/2012 108.66 108.66 108.66 23.11 9/10/2012 108.52 108.57 108.57 1.65 Summer Advantage Incentive Program / Critical Peak Pricing (CPP) 6/29/2012 42.59 49.00 49.93 27.68 7/12/2012 49.00 62.40 80.14 41.53 7/23/2012 55.79 41.05 43.17 38.36 8/7/2012 50.91 48.57 54.29 33.48 8/9/2012 50.91 53.07 59.96 39.14 8/13/2012 50.54 46.70 55.98 42.96 8/20/2012 53.21 44.04 44.52 45.19 8/27/2012 53.21 23.59 25.02 34.41 8/29/2012 53.21 38.79 40.55 35.85 9/10/2012 47.36 48.60 52.04 42.26 9/20/2012 47.36 26.92 30.09 27.42 9/28/2012 45.55 44.75 45.95 38.00 Summer Discount Plan (Residential) 6/20/2012 Group 1 128.01 8.23 8.23 0.50 6/29/2012 Group 1 178.26 41.89 41.89 35.80 6/29/2012 Group 2 178.26 87.75 87.75 33.30 7/10/2012 Group 1 263.67 29.17 29.17 44.70 7/10/2012 Group 2 263.67 41.89 41.89 66.60 7/10/2012 Group 3 263.67 87.75 87.75 76.70 8/1/2012 Group 1 60.50 29.17 29.17 49.10 8/1/2012 Group 2 46.40 29.56 29.56 56.40 8/1/2012 Group 3 58.60 46.63 46.63 57.10 8/3/2012 Group 1 60.50 29.17 29.17 35.70 8/3/2012 Group 2 54.90 21.83 21.83 65.60 8/3/2012 Group 3 58.60 46.63 46.63 46.00 8/8/2012 Group 1 135.52 67.69 67.69 104.60 8/8/2012 Group 2 133.55 66.33 66.33 100.00 8/8/2012 Group 3 151.14 98.88 98.88 128.40
    89 Appendix E: SCE2012 DR Program Load Impact by Event (MW) (Cont.) Daily Average by Event Hours Event Date Daily Forecast 7 Day Report Year End Ex Post Other Price Responsive Programs Summer Discount Plan (Residential) (cont.) 8/9/2012 Group 1 151.14 67.69 67.69 125.90 8/9/2012 Group 2 121.12 66.33 66.33 107.20 8/9/2012 Group 3 118.06 98.88 98.88 121.20 8/14/2012 Group 1 130.40 194.47 61.14 119.40 8/14/2012 Reliability 17.42 8.15 3.43 13.50 8/15/2012 Group 1 116.01 88.62 88.62 74.30 8/15/2012 Group 2 75.10 42.35 42.35 84.20 8/15/2012 Group 3 77.77 40.44 40.44 77.50 8/17/2012 Group 1 101.30 102.53 102.53 153.00 8/17/2012 Group 2 58.00 42.25 42.25 98.30 8/21/2012 Group 1 61.87 53.44 53.44 72.70 8/21/2012 Group 2 62.65 29.93 29.93 83.40 8/21/2012 Group 3 50.70 29.39 29.39 57.50 8/22/2012 Group 1 115.03 29.39 29.39 42.40 8/22/2012 Group 2 75.11 29.93 29.93 67.20 8/22/2012 Group 3 101.25 47.12 47.12 58.50 8/28/2012 Group 1 129.54 129.54 129.54 76.30 8/28/2012 Group 2 83.86 83.86 83.86 88.20 8/28/2012 Group 3 71.90 71.90 71.90 81.30 8/29/2012 Group 1 82.56 82.60 82.60 80.30 8/29/2012 Group 2 66.42 66.40 66.40 91.70 8/29/2012 Group 3 108.42 108.40 108.40 125.90 9/10/2012 Group 1 72.72 72.72 72.72 92.40 9/10/2012 Group 2 77.52 77.52 77.52 69.00 9/10/2012 Group 3 18.98 18.98 18.98 68.40 9/14/2012 Group 1 110.89 110.89 110.89 37.80 9/14/2012 Group 2 9/14/2012 Group 3 99.32 99.32 99.32 17.80 9/14/2012 Group 4 9/14/2012 Group 5 135.61 135.61 135.61 20.70 9/14/2012 Group 6 9/20/2012 Group 1 65.73 65.73 65.73 21.90 9/20/2012 Group 2 9/20/2012 Group 3 77.39 77.39 77.39 14.60 9/20/2012 Group 4 9/20/2012 Group 5 65.53 65.53 65.53 21.10
    90 9/20/2012 Group 6 AppendixE: SCE 2012 DR Program Load Impact by Event (MW) (Cont.) Daily Average by Event Hours Event Date Daily Forecast 7 Day Report Year End Ex Post Other Price Responsive Programs Summer Discount Plan (Residential) (cont.) 9/21/2012 Group 1 130.98 130.98 130.98 67.00 9/21/2012 Group 2 168.96 168.96 168.96 69.10 9/21/2012 Group 3 105.16 105.16 105.16 77.10 9/28/2012 Group 1 43.16 43.16 43.16 29.30 9/28/2012 Group 2 9/28/2012 Group 3 55.06 55.06 55.06 24.50 9/28/2012 Group 4 9/28/2012 Group 5 43.28 43.28 43.28 34.40 9/28/2012 Group 6 10/2/2012 Group 1 298.91 298.91 298.91 86.20 10/2/2012 Group 2 198.32 198.32 198.32 130.90 10/17/2012 Group 1 127.25 127.25 127.25 62.30 10/17/2012 Group 2 146.77 146.77 146.77 72.30 10/17/2012 Group 3 92.50 92.50 92.50 56.10 10/18/2012 Group 1 154.37 154.37 154.37 N/A 10/18/2012 Group 2 58.71 58.71 58.71 N/A 10/26/2012 Group 1 38.65 38.65 38.65 N/A 10/26/2012 Group 2 47.23 47.23 47.23 N/A 10/26/2012 Group 3 7.77 7.77 7.77 N/A Emergency Programs Summer Discount Plan (Commercial) 8/14/2012 4.77 3.43 3.43 3.10 Agriculture Pumping Interruptible 8/14/2012 36.00 36.00 15.34 17.29 9/26/2012 60.56 60.56 28.39 24.00 Base Interruptible Program 9/26/2012 513.78 520.91 441.46 573.01
    91 Appendix F: SDG&E2012 Monthly Average DR Program Load Impact (MW) with RA Measurement Hours (1 6 p.m.) Program Month 2012 RA 2012 Adjusted RA Daily Forecast 7 Day Report Ex Post Settlement Enrollment Enrollment & Weather Emergency Programs BIP A 6 10 3 N/A N/A N/A N/A N/A BIP A 7 11 4 N/A N/A N/A N/A N/A BIP A 8 10 3 N/A N/A N/A N/A N/A BIP A 9 11 3 N/A 0.34 1.3 0.84 N/A BIP A 10 10 3 N/A N/A N/A N/A N/A Monthly Nominated CBP DA 6 9 8 N/A N/A N/A N/A N/A CBP DA 7 10 8 N/A N/A N/A N/A N/A CBP DA 8 10 9 N/A 8 9 8 9 CBP DA 9 10 8 N/A 9 7 7 7 CBP DA 10 10 8 N/A 9 8 4 8 CBP DO 6 20 10 N/A N/A N/A N/A N/A CBP DO 7 22 10 N/A N/A N/A N/A N/A CBP DO 8 22 10 N/A 12 11 10 11 CBP DO 9 23 11 N/A 12 10 11 10 CBP DO 10 23 10 N/A 12 10 9 10 Price Responsive ACSAVER 6 7 7 N/A N/A N/A N/A N/A ACSAVER 7 12 12 N/A N/A N/A N/A N/A ACSAVER 8 15 14 N/A 27 18 19 N/A ACSAVER 9 17 18 N/A 13 12 15 N/A ACSAVER 10 18 18 N/A 15 9 18 N/A CPP 6 12 16 N/A N/A N/A N/A N/A CPP 7 15 18 N/A N/A N/A N/A N/A CPP 8 12 15 N/A 14 20 19 N/A CPP 9 12 14 N/A 14 6 14 N/A CPP 10 14 16 N/A 16 16 16 N/A DBP 6 N/A N/A N/A N/A N/A N/A N/A DBP 7 N/A N/A N/A N/A N/A N/A N/A DBP 8 N/A N/A N/A 5 8 5 8 DBP 9 N/A N/A N/A 5 9 5 9 DBP 10 N/A N/A N/A 5 8 5 8 PTR Com 6 N/A N/A N/A N/A N/A N/A N/A PTR Com 7 N/A N/A N/A 2 0 0 31 PTR Com 8 N/A N/A N/A 1 4 0 37 PTR Com 9 N/A N/A N/A 1 0 0 33 PTR Com 10 N/A N/A N/A N/A N/A N/A N/A PTR Res 6 46 46 N/A N/A N/A N/A N/A PTR Res 7 70 70 N/A 24 13 6 160 PTR Res 8 69 69 N/A 15 21 2 286 PTR Res 9 63 63 N/A 32 46 8 298 PTR Res 10 52 52 N/A N/A N/A N/A 286
    92 Appendix G: SDG&E2012 DR Program Load Impact by Event (MW) Daily Average by Event Hours Program Event Date Daily Forecast 7 Day Report Ex Post Settlement Emergency Programs BIP A 9/14/2012 0.3 1.3 0.8 N/A CPPE 8/13/2012 2.3 1.5 1.2 N/A CPPE 9/14/2012 1.6 1.4 0.9 N/A Monthly Nominated CBP DA 8/9/2012 7.5 9.3 7.5 9.4 CBP DA 8/10/2012 7.5 9.5 7.6 9.5 CBP DA 8/14/2012 7.5 8.3 7.5 8.5 CBP DA 9/14/2012 9 5.8 5.7 5.9 CBP DA 9/17/2012 9 8 7.9 8.4 CBP DA 10/1/2012 9 7 4.1 7.3 CBP DA 10/2/2012 9 8 4.2 8.7 CBP DO 8/8/2012 11.7 11.2 11 11.5 CBP DO 8/13/2012 11.7 10.6 8.5 10.6 CBP DO 9/13/2012 12.1 10.5 10.6 10.7 CBP DO 9/14/2012 12.1 9.9 10.6 10.1 CBP DO 10/1/2012 12.1 9.5 9.2 9.5 Price Responsive ACSAVER 8/8/2012 26.3 13.7 14 N/A ACSAVER 8/10/2012 27.2 19.8 18.5 N/A ACSAVER 8/13/2012 33.3 18.2 21.4 N/A ACSAVER 8/17/2012 19.3 20.6 22.7 N/A ACSAVER 9/13/2012 16 12.8 12.6 N/A ACSAVER 9/14/2012 15.5 21.5 22.5 N/A ACSAVER 9/15/2012 8.6 3.1 8.8 N/A ACSAVER 10/1/2012 14.5 9.2 18 N/A CPP 8/9/2012 13.5 20.9 15.9 N/A CPP 8/11/2012 11.7 12.3 18.4 N/A CPP 8/14/2012 14.3 27.1 25.9 N/A CPP 8/21/2012 16.5 20 17.2 N/A CPP 8/30/2012 16.2 20.3 17.8 N/A CPP 9/15/2012 13.7 5.5 14.5 N/A CPP 10/2/2012 16 16.1 16.5 N/A PTR Com 7/20/2012 2 0.1 0 31.2 PTR Com 8/9/2012 1.2 0.3 0 27.4 PTR Com 8/10/2012 1.1 8 0 37.5 PTR Com 8/11/2012 0.8 0 0 26.2 PTR Com 8/14/2012 1.2 4.8 0 29.8 PTR Com 8/21/2012 1.2 4.5 0 62 PTR Com 9/15/2012 0.9 0 0 32.8 PTR Res 7/20/2012 23.9 13.3 6.3 160.1 PTR Res 8/9/2012 13.1 26.1 3.3 202.8 PTR Res 8/10/2012 12.6 28.1 3.2 197 PTR Res 8/11/2012 12.2 33.6 1.7 231.1 PTR Res 8/14/2012 12.5 6.9 1.1 240 PTR Res 8/21/2012 25 10 3 559 PTR Res 9/15/2012 32.3 45.8 8.3 298
    93 Appendix H: SCE2012 DR Program Overview Program Type Program Season Available Annual Events/Hours Available Monthly Events/Hours Available Weekly Events/Hours Available Daily Events/Hours # of Events Triggered/ # of Hours Available Remaining Available Trigger Criteria 2012 Trigger Condition Agricultural Pumping Interruptible (API) Day Of Year Round (excluding Holidays) 150 Hours 25 Events 4 Events 1 Event 6 Hours Max 2 Events 7.1 Hours 143 Hours •• CAISO Stage 1 Alert •• CAISO Stage 2 Alert •• SCE Grid Control Center Discretion •• Measurement & Evaluation •• System Emergency (San Joaquin Valley) •• Measurement & Evaluation Base Interruptible Program (BIP) Day Of Year Round (excluding Holidays) 180 Hours 10 Events No Limit 1 Event 6 Hours Max 1 Event 2 Hours 178 Hours •• CAISO Stage 1 Alert •• CAISO Stage 2 Alert •• SCE Grid Control Center Discretion •• Measurement & Evaluation •• Measurement & Evaluation Capacity Bidding Program Day Ahead May –– Oct (excluding Holidays) No Limit 24 Hours Mon Fri 1 Event 8 Hours (11am –– 7pm) 12 Events July –– 17 Hrs Oct –– 22 Hrs May –– 24 Hrs June –– 24 Hrs July –– 7 Hrs Aug –– 24 Hrs Sep –– 24 Hrs Oct –– 2 Hrs •• High temperature •• Resource limitations •• A generating unit outage •• Transmission constraints •• CAISO Alert or Warning •• SCE System Emergency •• Measurement & Evaluation •• Heat Rate
    94 Appendix M (Cont.) SCE2012 DR Program Overview (Cont.) Program Type Program Season Available Annual Events/Hours Available Monthly Events/Hours Available Weekly Events/Hours Available Daily Events/Hours # of Events Triggered/ # of Hours Available Remaining Available Trigger Criteria 2012 Trigger Condition Capacity Bidding Program Day Of May –– Oct (excluding Holidays) No Limit 24 Hours No Limit 1 Event 4,6, or 8 hour event duration options 7 Events July –– 3 Hrs Aug –– 12 Hrs Sept –– 6 Hrs Oct –– 10 Hrs May –– 24 Hrs Jun –– 24 Hrs Jul –– 21 Hrs Aug –– 12 Hrs Sep –– 18 Hrs Oct –– 14 Hrs •• High temperature •• Resource limitations •• A generating unit outage •• Transmission constraints •• CAISO Alert or Warning •• SCE System Emergency •• Measurement & Evaluation •• Heat Rate Demand Bidding Program Day Ahead Year Round (excluding Holidays) No Limit No No Limit Mon Fri 1 Event 8 hours 8 Events 64 Hours No Limit •• CAISO Alert or Warning •• Day Ahead load and/or Price Forecast •• Extreme or unusual temperature conditions •• SCE Procurement needs •• Measurement & Evaluation •• Heat Rate DR Contracts Day Ahead Varies Varies by Contract Varies by Contract Varies by Contract Varies by Contract 1 Event 2 Hours Varies by Contract Varies by Contract •• Peak Load Forecast DR Contracts Day Of Varies Varies by Contract Varies by Contract Varies by Contract Varies by Contract 2 Events 5 Hours Varies by Contract Varies by Contract •• Energy Prices •• Peak Load Forecast
    95 Appendix M (Cont.) SCE2012 DR Program Overview (cont.) Program Type Program Season Available Annual Events/Hour s Available Monthly Events/Hour s Available Weekly Events/Hour s Available Daily Events/Hour s # of Events Triggered/ # of Hours Available Remaining Available Trigger Criteria 2012 Trigger Condition Save Power Day Day Ahead Year Round (excluding Holidays) No Limit No Limit No Limit 1 Event 4 Hours (2pm –– 6pm) 7 Events 28 Hours No Limit •• Temperature •• Temperature Summer Advantage Incentive Day Of June –– Sep (excluding Holidays) 60 Hours Min: 9 Events Max: 15 Events No Limit No Limit 1 Event 4 Hours (2pm –– 6pm) 12 Events 48 Hours 3 Events •• Temperature •• CAISO Alert or Warning •• SCE System Forecast •• Extreme or unusual temperature conditions •• Day Ahead load and/or Price Forecast •• High Temperature •• Peak Load Forecast •• Day Ahead load and/or Price Forecast Summer Discount Plan Residential Day Of Year Round (excluding Holidays) Unlimited Events 180 Hours No Limit No Limit Unlimited Events 6 Hours 23 Events 24 Hours 156 Hours •• CAISO Alert or Warning •• CAISO Discretion •• SCE Grid Control Center Discretion •• SCE Energy Operations Center Discretion •• Measurement & Evaluation •• CAISO Emergency •• Heat Rate •• Measurement & Evaluation Summer Discount Plan –– Commercial Day Of Year Round (excluding Holidays) Base –– 90 Hours Enhanced –– Unlimited No Limit No Limit 6 Hours 1 Event 5.6 Hours No Limit •• CAISO Stage 1 Alert •• CAISO Stage 2 Alert •• SCE Grid Control Center Discretion •• Measurement & Evaluation •• CAISO Emergency
    96 Appendix I: SDG&EDR Program Overview Program Type Program Season Available Annual Events/Hours Available Monthly Events/Hours Available Weekly Events/Hours Available Daily Events/Hours # of Events Triggered Available Remaining Trigger Criteria Trigger Condition 1 Event Temperature and system load Always *Monday: 86 ; 3472 MW 7 Hours *Tues Fri: 84 ; 3837 MW (11am 6pm) *Saturday: 86 ; 3837 MW May Oct 1 Event 7 Events Price: Mon Fri Up to 8 Hours Aug: 12 Hours (3 events) Aug: 32 Hours *Mon Friday only (11am 7pm) Sep: 8 Hours (2 events) Sep: 36 Hours *Market Price equal to or greater than 15,000 btu/kWh heat rate Oct: 8 Hours (2 events) Oct: 36 Hours *Other Statewide or local system conditions May Oct 1 Event 5 Events Price: Day Of Mon Fri Up to 8 Hours Aug:7 Hours (2 events) Aug: 37 Hours *Mon Friday only (11am 7pm) Sep: 8 Hours (2 events) Sep: 36 Hours *Market Price equal to or greater than 15,000 btu/kWh heat rate Oct: 4 hours (1 event) Oct: 40 Hours *Other Statewide or local system conditions 1 Event 1 Event CAISO forecasts a Stage 1 1 ComplianceTest Up to 4 Hours 4 Hours CAISO declares a Stage 2 2 Met trigger criteria CAISO calls for interruptible load Extreme weather or system demands or at SDGE discretion. 116 Hours Base Interruptibile Program (BIP) Day Of 30 minute Year Round 120 Hours 10 Events Capacity Bidding Program (CBP) No Limit 44 Hours No Limit Mitigate potential price spikes and load forecast abolve 4000 MW and/or Real Time Load came in higher than Day Ahead forecast Mitigate potential price spikes and load forecast above 4000 MW Critical Peak Pricing Default (CPP D) Day Ahead Year Round 18 Events No Limit No Limit 7 Events 11 Events Met trigger criteria for all 7 events Capacity Bidding Program (CBP) Day Ahead No Limit 44 Hours No Limit
    97 Appendix N: SDG&EDR Program Overview (Cont.) Program Type Program Season Available Annual Events/Hours Available Monthly Events/Hours Available Weekly Events/Hours Available Daily Events/Hours # of Events Triggered Available Remaining Trigger Criteria Trigger Condition May Oct 15 Events 1 Event 8 Events Temperature and system load Holidays Excluded or Noon to 8 pm Aug: 15 Hours (4 events) Aug: 25 Hours *Monday Friday: 3800 MW 120 Hours Min 2/Max 4 Hours Sep: 10 Hours (3 events) Sep: 30 Hours *Saturday Sunday Optional Participation Oct: 4 Hours (1 events) Oct: 36 Hours *CAISO Stage 1 or 2 Annual 91 Hours *Local or system emergency 1 Event Temperature and system load Always *Monday: 86 ; 3472 MW 7 Hours *Tues Fri: 84 ; 3837 MW (11am 6pm) *Saturday: 86 ; 3837 MW Day Of 2 Events Aug:1 Event (5 Hours) Terminates Dec 31 30 minute Sep:1 Event (4 Hours) Jul Dec 3 Events CAISO 1,2,or 3 Emergency 2012 only 14 Hours Transmission or imminent system emergency or as warranted by the utility 08/10/13 08/14/13 Conditions warranted by Utility Flex Alerts in Effect 71 Hours Local utility emergency with intent to avoid any firm load curtailment CAISO calls for Conditions warranted by Utility Demand Bidding Day Ahead No Limit No Limit No Limit No Limit N/A Critical Peak Pricing Emergency (CPP E) Year Round 80 Hours 40 Hours 4 Events 1 Event Mitigate potential price spikes and load forecast abolve 4000 MW and/or Real Time Load came in higher than Day Ahead forecast Reduce Your Use Day Ahead Year Round No Limit No Limit No Limit 7 Events No Limit Met trigger criteria for all 7 events Summer Saver Day Of 40 Hours 3 Events
    98 Appendix J: SCEHistorical DR Event Hours DR Programs Event Limits Max Event Duration 2012 2006 2011 Average 2006 2011 Max 2011 2010 2009 2008 2007 2006 Monthly Nominated Capacity Bidding Program Day Ahead (1 4) 24 Hrs./Mo 4 hrs. 39 53 72 48 47 72 47 53 Capacity Bidding Program Day Ahead (2 6) 24 Hrs./Mo 6 hrs. 0 51 71 23 49 71 53 58 Capacity Bidding Program Day Ahead (4 8) 24 Hrs./Mo 8 hrs. 0 14 42 0 0 28 42 0 Capacity Bidding Program Day Of (1 4) 24 Hrs./Mo 4 hrs. 23 18 40 8 31 8 3 40 Capacity Bidding Program Day Of (2 6) 24 Hrs./Mo 6 hrs. 33 12 40 8 40 8 3 0 Capacity Bidding Program Day Of (4 8) 24 Hrs./Mo 8 hrs. 0 11 49 0 0 8 0 49 Demand Bidding Program Unlimited 8 hrs. 64 106 172 40 72 116 101 172 136 Demand Response Contracts Day Ahead Various 4 hrs. 2 19 71 8 8 6 71 0 Demand Response Contracts Day Of Various 4 hrs. 12 11 16 14 16 6 7 14 Other Price Responsive Save Power Days / Peak Time Rebates Unlimited 4 hrs. 28 Summer Advantage Incentive / Critical Peak Pricing (CPP) 15 Events/Yr. 4 hrs. 48 57 70 48 48 70 60 Summer Discount Plan Residential & Commercial Base 15 Events/ Summer Season 6 hrs./day 15 38 0 22 5 0 38 24 Summer Discount Plan –– Residential & Commercial Enhanced Unlimited Events/ Summer Season 6 hrs./day 15 39 0 22 9 0 39 18 Summer Discount Plan Commercial –– Base 15 Events/ Summer Season 6 hrs./day 6 Summer Discount Plan Commercial Enhanced Unlimited Events/ Summer Season 6 hrs./day 6 Summer Discount Plan –– Residential 180 Hours/Yr. 6 hrs./day 24 Emergency Agricultural Pumping Interruptible (API) 1/Day 4/Wk. 25/Mo. 6 hrs./Day 40 hrs./Mo 150 hrs./Yr. 7 1 2 1 2 0 1 0 0 Base Interruptible Program (BIP) 1/Day 10/Mo. 6 hrs./Day 180 hrs./Yr. 2 1 3 2 0 2 0 0 3
    99 Appendix K: SCEHistorical Number of DR Events DR Programs Event Limits 2012 2006 2011 Average 2006 2011 Max 2011 2010 2009 2008 2007 2006 Monthly Nominated Programs Capacity Bidding Program Day Ahead (1 4) 24 Hrs./Mo 12 20 26 19 18 26 20 15 Capacity Bidding Program Day Ahead (2 6) 24 Hrs./Mo 0 16 22 10 16 22 19 13 Capacity Bidding Program Day Ahead (4 8) 24 Hrs./Mo 0 3 11 0 0 6 11 0 Capacity Bidding Program Day Of (1 4) 24 Hrs./Mo 7 5 11 3 9 2 2 11 Capacity Bidding Program Day Of (2 6) 24 Hrs./Mo 7 3 8 2 8 2 2 0 Capacity Bidding Program Day Of (4 8) 24 Hrs./Mo 0 2 9 0 0 2 0 9 Demand Bidding Program Unlimited 8 14 22 5 9 15 15 22 17 Demand Response Contracts Day Ahead Various 1 5 18 2 2 1 18 0 Demand Response Contracts Day Of Various 2 3 5 5 2 1 3 3 Other Price Responsive Save Power Days / Peak Time Rebates Unlimited 7 Summer Advantage Incentive / Critical Peak Pricing (CPP) 15 Events/Yr. 12 12 12 12 12 12 12 Summer Discount Plan Residential & Commercial Base 15 Events/ Summer Season 5 11 11 6 3 0 5 2 Summer Discount Plan –– Residential & Commercial Enhanced Unlimited Events/ Summer Season 8 22 10 22 5 0 6 2 Summer Discount Plan Commercial Base 15 Events/ Summer Season 1 Summer Discount Plan Commercial Enhanced Unlimited Events/ Summer Season 1 Summer Discount Plan –– Residential 180 Hours/Yr. 23 Emergency Programs Agricultural Pumping Interruptible (API) 1/Day, 4/Wk. 25/Mo. 2 1 2 1 2 1 1 0 0 Base Interruptible Program (BIP) 1/Day,10/Mo. 1 1 1 1 0 1 0 0 1
Appendix L: Summary of SCE's Reasons for the 2012 DR Triggers

DR Program Category   Program                                Reason
Monthly Nominated     Capacity Bidding Program               No nomination or trigger conditions
                      Demand Bidding Program                 Trigger conditions plus SCE's discretion to optimize performance & minimize participant fatigue
                      DR Contracts                           Trigger conditions
Price Responsive      Save Power Day (PTR)                   SCE discretion to optimize performance & minimize participant fatigue
                      Summer Advantage Incentive (CPP)       Optimal dispatch
                      Summer Discount Plan (SDP) – Res.      Transitioned to price trigger starting June 2012; remaining hours reserved for contingencies
Emergency             Agricultural Interruptible Program     Local transmission contingency
                      Base Interruptible Program             No emergency; test event only
Appendix M: SDG&E Historical DR Event Hours [154]

| DR Programs | Event Limits | 2012 | 2006-2011 Average | 2006-2011 Max | 2011 | 2010 | 2009 | 2008 | 2007 | 2006 |
|---|---|---|---|---|---|---|---|---|---|---|
| Monthly Nominated Programs | | | | | | | | | | |
| Capacity Bidding Program Day Ahead | 24 Hrs./Mo | 24 | 19 | 38 | 19 | 28 | 24 | 4 | 38 | 0 |
| Capacity Bidding Program Day Of | 24 Hrs./Mo | 20 | 28 | 50 | 28 | 50 | 37 | 6 | 45 | 0 |
| Price Responsive Programs | | | | | | | | | | |
| Peak Time Rebate | Unlimited | 49 | 32 | 32 | 32 | | | | | |
| Critical Peak Pricing Default | 98 Hrs. ('06-'07), 126 Hrs. ('08-'12) | 49 | 39 | 70 | 14 | 28 | 56 | 0 | 63 | 70 |
| Demand Bidding Program | Unlimited | 14 | 29 | 41 | 41 | 16 | | | | |
| Summer Saver | 120 Hrs./Yr. | 30 | 29 | 44 | 22 | 44 | 30 | 8 | 43 | 24 |
| Emergency Programs | | | | | | | | | | |
| Base Interruptible Program (BIP) | 120 Hrs./Yr. | 4 | 2 | 4 | 4 | 4 | 0 | 0 | 4 | 2 |
| Critical Peak Pricing Emergency | 80 Hrs./Yr. | 9 | 4 | 14 | 0 | 0 | 0 | 0 | 14 | 7 |

[154] Source for the 2006-2012 data: SGE-02, Attachment 1, Revised Appendix X, Tables 8-11.
Appendix N: SDG&E Historical Number of DR Events [155]

| DR Programs | Event Limits | 2012 | 2006-2011 Average | 2006-2011 Max | 2011 | 2010 | 2009 | 2008 | 2007 | 2006 |
|---|---|---|---|---|---|---|---|---|---|---|
| Monthly Nominated Programs | | | | | | | | | | |
| Capacity Bidding Program Day Ahead | Unlimited | 7 | 5 | 8 | 5 | 7 | 6 | 1 | 8 | 0 |
| Capacity Bidding Program Day Of | Unlimited | 5 | 7 | 12 | 7 | 12 | 7 | 1 | 12 | 0 |
| Price Responsive Programs | | | | | | | | | | |
| Peak Time Rebate | Unlimited | 7 | 5 | 5 | 5 | | | | | |
| Critical Peak Pricing - Default | 12 ('06-'07), 18 ('08-'12) | 7 | 6 | 10 | 2 | 4 | 8 | 0 | 9 | 10 |
| Demand Bidding Program | Unlimited | 3 | 7 | 9 | 9 | 4 | | | | |
| Summer Saver | 15/Yr. | 8 | 8 | 12 | 6 | 11 | 7 | 2 | 12 | 8 |
| Emergency Programs | | | | | | | | | | |
| Base Interruptible Program (BIP) | 10/Mo. | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 1 |
| Critical Peak Pricing - Emergency | Unlimited | 2 | 1 | 3 | 0 | 0 | 0 | 0 | 3 | 2 |

[155] Source for the 2006-2012 data: SGE-02, Attachment 1, Revised Appendix X, Tables 8-11.
Appendix O: Utilities' Peaker Plant Total Permissible vs. Actual Service Hours

SCE-Owned Peaker Plants Within SONGS-Affected Areas

| | Center | Barre | Grapeland | Mira Loma |
|---|---|---|---|---|
| Permissible Service Hours | 1096 | 955 | 1073 | 700 |
| Actual Service Hours: | | | | |
| Sept.-Dec. 2007 | 93 | 123 | 87 | 104 |
| Jan.-Dec. 2008 | 120 | 118 | 125 | 119 |
| Jan.-Dec. 2009 | 93 | 83 | 46 | 70 |
| Jan.-Dec. 2010 | 156 | 174 | 137 | 148 |
| Jan.-Dec. 2011 | 163 | 149 | 85 | 127 |
| 2007-2011 Average | 125 | 129 | 96 | 114 |
| % of Permitted | 11% | 14% | 9% | 16% |
| Jan.-Oct. 2012 | 459 | 465 | 403 | 413 |
| % of Permitted | 42% | 49% | 38% | 59% |
| % of 2007-2011 Avg. | 367% | 359% | 420% | 364% |

SDG&E-Owned Peaker Plants

| | Cuyamaca | El Cajon Energy Center | Miramar | Orange Grove |
|---|---|---|---|---|
| Permissible Service Hours | N/A | 2500 | 5000 | 6400 |
| Actual Service Hours: | | | | |
| 2006 | | | 200 | |
| 2007 | | | 250 | |
| 2008 | 373 | | 671 | |
| 2009 | 625 | | 1919 | |
| 2010 | 481 | 439 | 2946 | |
| 2011 | 667 | 433 | 4306 | |
| Historical Average | 537 | 436 | 1715 | |
| % of Permitted | N/A | 17% | 34% | N/A |
| 2012 | 1621 | 974 | 4805 | 2148 |
| % of Permitted | N/A | 39% | 96% | 34% |
| % of Historical Avg. | 302% | 223% | 280% | N/A |
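For illustration only, the sketch below shows how the derived rows in the SCE portion of Appendix O relate to the underlying service hours, using the Center plant column from the table above. The percentages appear to be simple ratios rounded to whole percent; the variable names are hypothetical.

```python
# Illustrative sketch (not from the source report): derived rows for the SCE
# "Center" peaker plant column in Appendix O, assuming simple averages and
# ratios rounded to whole percent.

permissible_hours = 1096
actual_by_year = {2007: 93, 2008: 120, 2009: 93, 2010: 156, 2011: 163}  # Sept.-Dec. 2007 onward
hours_2012 = 459  # Jan.-Oct. 2012

avg_2007_2011 = sum(actual_by_year.values()) / len(actual_by_year)      # 125.0
pct_of_permit_hist = 100 * avg_2007_2011 / permissible_hours            # ~11%
pct_of_permit_2012 = 100 * hours_2012 / permissible_hours               # ~42%
pct_of_hist_avg = 100 * hours_2012 / avg_2007_2011                      # ~367%

print(round(avg_2007_2011), f"{pct_of_permit_hist:.0f}%",
      f"{pct_of_permit_2012:.0f}%", f"{pct_of_hist_avg:.0f}%")
```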
Appendix P: Ex Post Demand Response Load Impact on Flex Alert Days

8/10/12

| Programs | Ex Post MW, 3:00-4:00 p.m. | Ex Post MW, 4:00-5:00 p.m. |
|---|---|---|
| SCE | | |
| Demand Bidding Program | 107 | 107 |
| Save Power Day/Peak Time Rebate | 87 | 78 |
| Subtotal | 194 | 185 |
| SDG&E | | |
| Capacity Bidding Program | 8 | 8 |
| Summer Saver/AC Cycling (Res. & Com.) | | 19 |
| Subtotal | 8 | 27 |
| PG&E | | |
| Capacity Bidding Program | 41 | 41 |
| Aggregator Managed Program | 174 | 172 |
| Peak Day Pricing/Critical Peak Pricing | 22 | 24 |
| Peak Choice | 3 | 2 |
| SmartAC | | 65 |
| Base Interruptible Program | 220 | 222 |
| Subtotal | 459 | 527 |
| TOTAL | 661 | 739 |

8/14/12

| Programs | Ex Post MW, 3:00-4:00 p.m. | Ex Post MW, 4:00-5:00 p.m. |
|---|---|---|
| SCE | | |
| Capacity Bidding Program | | |
| Demand Bidding Program | 72 | 71 |
| Demand Response Contract | 184 | 180 |
| Summer Discount Plan/AC Cycling (Res. & Com.) | 137 | 22 |
| Agricultural Pumping Interruptible | | 14 |
| Subtotal | 394 | 242 |
| SDG&E | | |
| Capacity Bidding Program | 8 | 8 |
| Critical Peak Pricing | 24 | 25 |
| Peak Time Rebate | 1 | 0 |
| Demand Bidding Program | 6 | 5 |
| Subtotal | 38 | 38 |
| PG&E | No events | |
| TOTAL | 432 | 280 |
Appendix Q: CAISO Energy Price Spikes

SCE Price Spikes [156]

[156] Source: SCE-03, SCE's Response to the ALJ February 21, 2013 Ruling, Appendix B (Excel Data Tables in Response, Table 9).
SDG&E Price Spikes [157]

[157] Source: SGE-02, SDG&E's Response to the ALJ February 4, 2013 Scoping Memo, Attachment 3.
Appendix R: Utilities' Demand Response Reporting Requirements (2013-2014) [158]

1. DR Weekly Forecast

The utilities should continue to submit a seven-day (Monday to Sunday) [159] DR forecast (MW) to the CAISO/CPUC_ED/CEC by noon every Monday, highlighting the DR programs they anticipate triggering.

Daily Value: For DR programs whose forecast varies by hour, the utilities use slightly different methods to determine the daily value, as described below (an illustrative sketch follows later in this appendix). If an averaging method is used, the daily value may be higher or lower than the MW in a given hour, such as the peak hours in the CAISO's demand forecast. Energy Division staff uses an averaging method over the actual event hours for its reports on historical DR events.

Utility Methods for the Daily Value

| Utility | Program | Period Averaged |
|---|---|---|
| SCE | All programs | Average over the available event hours in the tariffs, which vary from program to program |
| SDG&E | Day Ahead | 11 a.m. - 6 p.m. |
| SDG&E | Day Of | 1 p.m. - 6 p.m. (like RA) |
| PG&E | BIP | 1:00 p.m. - 6:00 p.m. |
| PG&E | PDP | 2:00 p.m. - 6:00 p.m. (no significant enrollment/load 12-2 p.m.) |
| PG&E | SmartRate | 2:00 p.m. - 6:00 p.m. |
| PG&E | SmartAC | 1:00 p.m. - 6:00 p.m. |

For AMP, CBP, DBP, and PeakChoice, the hourly forecast does not vary; therefore, PG&E will continue to submit the same hourly forecast amount for the given month.

2. Daily DR Reporting to the CAISO (by 8 a.m., weekdays and weekends)

For the non-summer months (January 1 to April 30 and November 1 to December 31), the utilities should submit their Daily DR Report to the CAISO/CPUC_ED/CEC only when they intend to trigger a DR program for that day, identifying the triggered DR program(s) in the submission email. If there is no DR event, the utilities do not need to submit this report.

For the summer months (May 1 to October 31), the utilities should submit their Daily DR Report to the CAISO/CPUC_ED/CEC every day, as they did in 2012. This report is based on a common template developed by the CAISO and is submitted as an Excel spreadsheet. In it, the utilities provide the MW scheduled as of 8 a.m. and the MW available for the current day and the next day for all of their event-based DR programs (both Day Ahead and Day Of) on an aggregated basis. SCE has also added the MW by individual DR program; SDG&E has added the MW only for the DR program(s) triggered for the current or next day.

[158] For SCE and SDG&E. Staff guidance only for PG&E because it is subject to this proceeding (A.12-12-016 et al.); however, staff includes the reporting requirements for PG&E as guidance consistent with what is required of SCE and SDG&E.
[159] This changes SCE's and PG&E's current forecast week from Tuesday-Monday to Monday-Sunday.
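For illustration only, the sketch below (not from the utilities' filings) shows one way the daily value described in item 1 can be computed by averaging an hourly DR forecast over a program-specific window. The 1 p.m.-6 p.m. window follows SDG&E's Day Of convention in the table above; the function name and the hourly MW figures are hypothetical.

```python
# Illustrative sketch (not a utility system): compute a single "daily value"
# from an hourly DR forecast by averaging over a program-specific window,
# as described in item 1 above. Hourly MW figures are made up.

def daily_value(hourly_mw, start_hour, end_hour):
    """Average the forecast MW over hour-beginning hours [start_hour, end_hour)."""
    window = [mw for hour, mw in hourly_mw.items() if start_hour <= hour < end_hour]
    return sum(window) / len(window)

# Hypothetical hourly forecast for a Day Of program (hour-beginning, 24-hour clock).
forecast = {13: 40.0, 14: 45.0, 15: 50.0, 16: 50.0, 17: 45.0}

print(f"{daily_value(forecast, 13, 18):.1f} MW")  # 46.0 MW
```

A utility averaging over a different window, such as 11 a.m.-6 p.m. for Day Ahead programs, would simply change the start and end hours.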
3. Updated Reporting to the CAISO/CPUC_ED/CEC (by close of business on weekdays for DR events called after 8 a.m.)

PG&E: PG&E continues to send the DR forecast for all Day Ahead and Day Of events triggered to the CAISO and CPUC throughout the day, in an Excel spreadsheet, as it did prior to summer 2012. These reports provide the forecasted MW for each DR program.

SCE: SCE sends a revised Daily DR Report at the end of the event day that includes the Day Of events called after 8 a.m. and the forecasted MW by program, identifying the triggered DR program(s) in the submission email.

SDG&E: SDG&E also sends a revised Daily DR Report at the end of the event day that includes all DR events called after 8 a.m. and the forecasted MW by program. (The submission timing in items 2 and 3 is summarized in the sketch at the end of this appendix.)

4. Reports on DR Results to the CAISO/CPUC_ED/CEC (seven days after the events)

All three utilities should continue to provide the DR results in an Excel spreadsheet seven days after each DR event (CAISO 7-Day Report). The 7-Day Report should also include the DR results to date in each year. [160]

The utilities should submit the DR Weekly Forecasts (No. 1) to the following email addresses:

| Entity/Individual | Email Address |
|---|---|
| CAISO, John Goodin | jgoodin@caiso.com |
| CPUC, Bruce Kaneshiro | bruce.kaneshiro@cpuc.ca.gov |
| CPUC, Scarlett Liang-Uejio | scarlett.liang-uejio@cpuc.ca.gov |
| CPUC, Dorris Chow | dorris.chow@cpuc.ca.gov |
| CPUC, Paula Gruendling | paula.gruendling@cpuc.ca.gov |
| CEC, Margaret Sheridan | msherida@energy.ca.gov |

The utilities should submit the Daily DR Reports, revisions, and results (No. 2 through No. 4) to the following email addresses:

| Entity/Individual | Email Address |
|---|---|
| CAISO, Shift Supervisors | shiftsupervisors@caiso.com |
| CAISO, Market Operations | ISODAM@caiso.com |
| CAISO, John Goodin | jgoodin@caiso.com |
| CAISO, Glen Perez | gperez@caiso.com |
| CAISO Market Monitoring, Keith Collins | kcollins@caiso.com |
| CPUC, Scarlett Liang-Uejio | scarlett.liang-uejio@cpuc.ca.gov |
| CPUC, Bruce Kaneshiro | bruce.kaneshiro@cpuc.ca.gov |
| CPUC, Dorris Chow | dorris.chow@cpuc.ca.gov |
| CPUC, Paula Gruendling | paula.gruendling@cpuc.ca.gov |
| CEC, Margaret Sheridan | msherida@energy.ca.gov |

[160] See SCE's 2012 7-Day Reports as an example.
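For illustration only, the sketch below encodes the submission timing described in items 2 and 3 above: a Daily DR Report is due every day from May 1 through October 31, is due on non-summer days only when a DR event is triggered, and an updated report is due by close of business for events called after 8 a.m. The function names, dates, and flags are hypothetical.

```python
# Illustrative sketch (not a utility system): the submission rules described in
# items 2 and 3 above, encoded as simple checks.

from datetime import date, time

def daily_report_required(day: date, event_triggered: bool) -> bool:
    """Daily DR Report is due every day May 1-Oct. 31; otherwise only on event days."""
    summer = date(day.year, 5, 1) <= day <= date(day.year, 10, 31)
    return summer or event_triggered

def updated_report_required(event_called_at: time) -> bool:
    """An updated report is due by close of business for events called after 8 a.m."""
    return event_called_at > time(8, 0)

print(daily_report_required(date(2013, 7, 15), event_triggered=False))  # True (summer)
print(daily_report_required(date(2013, 12, 3), event_triggered=False))  # False (non-summer, no event)
print(updated_report_required(time(14, 30)))                            # True
```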
Appendix S: Additional Information

Provided in separate PDF files.