1. Mine-Weather 2.0: Cloud-Based &
Extending Assessment Capability to Include
Outage Prediction Model
G.N. Shah
D. Yenigun
J. Singer
R.O. Mueller
MACROSOFT NEW PRODUCT
ANNOUNCEMENT
May 5, 2017
2. Overview of Mine-Weather 2.0
We are introducing a new cloud-based Mine-Weather platform
• CLOUD-BASED SYSTEM
The new system is fully cloud-based so users can begin using its capabilities right after they sign up.
• WEATHER FORECAST ASSESSMENTS
M-W 2.0 provides analytical and visualization capabilities similar to those of M-W 1.0 for assessing the accuracy of
weather forecast data, but does so from the cloud. It also offers many more scoring algorithms for users
to choose from.
• OUTAGE PREDICTION MODEL ASSESSMENTS
In addition, M-W 2.0 provides NEW capabilities to help utilities assess the accuracy of their Outage
Prediction Model(s). That assessment can be done in two different ways, as described below.
• ENHANCEMENTS TO OUTAGE PREDICTION MODEL
Once the baseline assessment is done for a utility’s OPM, we offer an add-on service where we work with
the utility team to identify and implement enhancements to their OPM system. As the utility team makes
these enhancements to the OPM, we will then use M-W 2.0 to assess the improvements in accuracy
achieved from each set of enhancements.
3. Advantages of Cloud Version
• The system is ready to use. There is no development process needed to
customize and configure Mine-Weather for a particular utility.
• Since the system is cloud-based, there is no IT testing and implementation
process needed to validate and approve the system for internal use behind the
utility’s firewall.
• The cloud system already has built-in links to the NOAA data sources for
both forecast and observational data.
• Using the built-in dashboard, a user can evaluate the accuracy of forecasts for a
single weather station, or for an ensemble of weather stations that cover the
utility’s footprint.
• As the cloud version of Mine-Weather continues to be enhanced, the user will
automatically have access to the new improved features, without having to
schedule and do another full IT testing and implementation process.
4. Assessing Weather Forecasts - 1
• Users can log in anytime and assess the accuracy of weather forecast data for
their particular location and for the specific time frame of interest.
• The system has pre-configured links into NOAA weather forecast and
observational data.
• We are in the process of adding other observational sources, including
WeatherBug, state weather networks, and more.
• Once a user registers and logs into the system, they can choose the subset of
NOAA weather station locations of interest, either by selecting individual sites
from the full list or by selecting all sites in one or more particular counties or
states.
• Using the M-W Dashboard, the user can then access weather data for the
selected set of NOAA stations and start examining how accurate the forecast
data is during different time frames.
• In particular, the user can examine forecast accuracy during storm events.
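The station-selection step described above can be sketched as follows. This is a minimal illustration, not the actual M-W interface: the field names and the handful of station records are made up for the example.

```python
# Hypothetical station records; the real system would load the full NOAA
# station list. Field names here are illustrative, not the M-W schema.
stations = [
    {"id": "KEWR", "name": "Newark Liberty Intl", "state": "NJ", "county": "Essex"},
    {"id": "KJFK", "name": "JFK Intl", "state": "NY", "county": "Queens"},
    {"id": "KALB", "name": "Albany Intl", "state": "NY", "county": "Albany"},
]

def select_stations(stations, states=None, counties=None, ids=None):
    """Return the subset of stations matching individual ids, or matching
    the given states (optionally narrowed to particular counties)."""
    subset = []
    for s in stations:
        if ids and s["id"] in ids:
            subset.append(s)
        elif states and s["state"] in states and (not counties or s["county"] in counties):
            subset.append(s)
    return subset

ny_stations = select_stations(stations, states={"NY"})
print([s["id"] for s in ny_stations])  # ['KJFK', 'KALB']
```

The same helper covers both selection modes the text mentions: picking individual sites by id, or sweeping in every site in a county or state.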
5. Assessing Weather Forecasts - 2
• There are three major use cases for the system.
• The first use of the system is the straightforward one of using it to evaluate the
accuracy of the NOAA weather forecasts for the user’s particular geographical
footprint.
• The user identifies the NOAA weather stations of interest to initiate an analysis.
The set of identified weather stations can be changed by the user as new
analyses are performed.
• The system then collects the NOAA weather forecast and actual observation
data for the selected weather stations for as long a historical period as possible
(usually at least one year or longer).
• The system then calculates all the accuracy measures already built into the
system, and the user can then view the results via a variety of visualization
templates.
• It is expected that the primary purpose of this tool will be to evaluate weather
forecast accuracy during storm events, which the user defines in the system via
a start and stop hour/date.
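A storm-window accuracy score of the kind described above can be sketched as follows. Mean absolute error stands in for one of the system's many scoring algorithms; the hourly wind values are invented for illustration.

```python
from datetime import datetime

# Made-up hourly wind gusts (mph) for a hypothetical storm event.
observations = {
    datetime(2017, 3, 14, 6): 38.0,
    datetime(2017, 3, 14, 7): 45.0,
    datetime(2017, 3, 14, 8): 52.0,
    datetime(2017, 3, 14, 9): 41.0,
}
forecasts = {
    datetime(2017, 3, 14, 6): 35.0,
    datetime(2017, 3, 14, 7): 40.0,
    datetime(2017, 3, 14, 8): 55.0,
    datetime(2017, 3, 14, 9): 44.0,
}

def storm_mae(forecasts, observations, start, stop):
    """Mean absolute forecast error over the hours in [start, stop],
    i.e. the user-defined storm event window."""
    errors = [abs(forecasts[t] - observations[t])
              for t in observations if start <= t <= stop]
    return sum(errors) / len(errors)

mae = storm_mae(forecasts, observations,
                datetime(2017, 3, 14, 6), datetime(2017, 3, 14, 9))
print(mae)  # 3.5
```

Restricting the scoring to the user's start/stop window is what turns a generic forecast-accuracy metric into a storm-event assessment.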
6. Assessing a Utility’s Outage Prediction Model
• M-W 2.0 can support a utility in assessing the accuracy of their OPM in two
different ways.
• WEATHER SNIPPETS
In the first way, M-W provides the utility with extracts of weather data for their geography for
specific storm events. The utility then uses this weather data as input to their OPM, and runs the
OPM model on the utility’s local environment to determine the accuracy.
• OPM BASELINE RESULTS
In the second approach, we set up the utility’s OPM model on a high-security partition of our cloud
environment. We then run the tests of their OPM system for all of the storm weather events
identified by the user. We provide the user with a thorough assessment of the accuracy of their
current OPM system.
7. Assessing a Utility’s OPM- Weather Snippets
• In this first way, M-W provides the utility with extracts of weather data for their
geography for specific storm events identified by the utility.
• The utility user tests their OPM model predictions in the user’s local environment given
these storm weather snippets provided by M-W.
• Given a specified start and stop hour for a storm event, M-W creates an output file for
the user to download to their local environment.
• A standard format for the output weather data is provided to the user, and contains both
forecast and actual data.
• The system also provides the user with the overall accuracy measures of the weather
forecasts for this particular storm, as determined using M-W.
• With the weather data in hand the user can then use that weather data as input to the
utility’s outage prediction model.
• The user can then assess the accuracy of the OPM results based on this weather
data.
• Users will also be able to break out the OPM forecast errors into those that are
specific to the OPM model itself and those that are due to inaccuracies in the
weather data inputs.
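The error breakout described above can be sketched by running an outage model twice, once on forecast weather and once on observed weather. The toy one-line OPM and all numbers below are purely illustrative stand-ins, not the utility's actual model.

```python
def toy_opm(wind_mph):
    """Stand-in outage prediction model: predicted outage count grows
    linearly with wind speed above a 30 mph threshold. Illustrative only."""
    return max(0.0, 2.5 * (wind_mph - 30.0))

# Hypothetical storm: the forecast under-called the wind.
forecast_wind, observed_wind = 40.0, 52.0
actual_outages = 60.0  # what really happened (made-up number)

pred_from_forecast = toy_opm(forecast_wind)  # what the utility would have predicted
pred_from_observed = toy_opm(observed_wind)  # prediction with perfect weather input

# Error intrinsic to the OPM (perfect weather in, still off by this much):
model_error = pred_from_observed - actual_outages
# Error introduced by the weather forecast input:
weather_error = pred_from_forecast - pred_from_observed
# The two components sum to the total error of the operational prediction.
total_error = pred_from_forecast - actual_outages
assert abs(total_error - (model_error + weather_error)) < 1e-9
print(model_error, weather_error, total_error)  # -5.0 -30.0 -35.0
```

In this toy case most of the miss comes from the weather input, not the model itself, which is exactly the distinction the breakout is meant to surface.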
8. Assessing a Utility’s OPM-
M-W Does the OPM Assessment
• In the second way, the M-W team does the assessment of the utility’s OPM
system.
• We work with the utility to setup the OPM system in our highly secure cloud
testing environment.
• Then we do all the steps outlined in the prior chart.
• In particular, we run the tests of their OPM system for all of the storm weather
events identified by the user.
• At the conclusion, we provide the user with a thorough assessment of the
accuracy of their current OPM system.
• As a key part of our report, we break out the errors in the final OPM system
results into those intrinsic to the OPM forecast model itself and those errors that
are introduced via the weather forecast data.
9. Enhancements to a Utility’s OPM System
• Once M-W has completed its baseline assessment of a utility’s current OPM system, the
M-W team can then work with the utility to test out various enhancements to the OPM
system.
• The process could work somewhat like the following. A utility makes a first set of
enhancements to their OPM system, and when completed it is then uploaded to the
utility’s secure M-W cloud environment.
• We then perform a new set of tests on the new model and provide an update report on
the accuracy results comparing the new model to the original model.
• If the results are positive – that is, the new model provides much better predictions –
then presumably the utility will adopt the new model as its new baseline standard.
• If, on the other hand, the new results are no better, or perhaps worse, the utility will
presumably stay with their current model or consider a new round of enhancements.
• We can continue this iterative process, for each major new release the utility makes to
their OPM system.
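One iteration of the adopt-or-retry decision described above can be sketched as follows. The per-storm error values and the mean-error comparison rule are illustrative assumptions; a real assessment would use the full suite of accuracy measures over the agreed storm set.

```python
# Made-up per-storm absolute errors for the same three storms, scored
# once with the baseline OPM and once with the enhanced release.
baseline_errors = [12.0, 8.0, 15.0]
enhanced_errors = [9.0, 7.0, 11.0]

def mean(xs):
    return sum(xs) / len(xs)

# Simple decision rule (an assumption for this sketch): adopt the new
# release only if its mean error over the storm set improves.
if mean(enhanced_errors) < mean(baseline_errors):
    decision = "adopt enhanced model as new baseline"
else:
    decision = "keep current baseline or try another round of enhancements"
print(decision)
```

Whichever model wins becomes the baseline for the next round, so the same comparison repeats for each major OPM release.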