eFrame® for Insurance Solvency II Internal Model (SecondFloor)
In addition to the risk and finance data challenges of the Standard Formula, the Internal Model approach brings with it the challenge of model validation and governance.
Also, dry runs are highlighting the logistical challenges of running some Solvency infrastructures, even with the support of the project team that built it, which will disperse in the near future. The next step beyond compliance is efficiency in a business-as-usual environment.
This solution is already in production at a number of large insurers, and is founded on experience with insurers who pioneered risk and economic capital programmes long before the regulations were as clear as they are today. As such, it is adaptable to regulatory changes and to the evolution of the insurer's IT and business landscape.
For more information please visit: http://www.secondfloor.com/solution/eframe-for-insurance-solvency-ii-internal-model
A Refreshing Start: Pre-Issue Documentation, Best's Review, December 2008 (Gates Ouimette)
#Scanning as part of the overall #lifeinsurance policy #workflow and process.
Pre-issue #process visioning, both strategic and tactical, is critical to the #policylifecycle.
The decision to invest in systems or #offshore resources ultimately should get evaluated within the context of current #businessprocesses.
CMMI High Maturity Best Practices HMBP 2010: CMMI® FOR SERVICES: INSIGHTS AND BEYOND (QAI)
by Rajesh Naik, QAI.
Presented at the 1st International Colloquium on CMMI High Maturity Best Practices, held on May 21, 2010, organized by QAI.
bpmEdge - Enterprise Process Automation Ecosystem (PERICENT)
A flexible, state-of-the-art BPM platform built on a cutting-edge technology framework, crafted by Pericent engineers to adapt to ever-changing business scenarios in the enterprise. bpmEdge's modern, sophisticated architecture makes it a preferred choice for CXOs to use in their current or future technology stack.
Rich dashboards and a powerful ad hoc, wizard-based report studio support quick decisions. Design, manage and deploy sophisticated or complex business processes in less time with bpmEdge, developed with a "less coding, more integration" philosophy.
Trusted by well-known brands and reputed companies such as Niyogin and USHA International Limited.
Featured in the "10 Best Performing BPM Solution Provider List 2017".
Everything is Process - bpmEdge
Transition to online AMS reduces maintenance costs and improves operations (Emerson Exchange)
Presented by David Rabon and Mike Ruhle
at the 2011 Emerson Exchange in Nashville.
Abstract: The utilities department at Pfizer Inc. in Groton, CT has leveraged an investment in SMART technologies to move from a failure/preventive maintenance process to a reliability-based process. With a DeltaV upgrade, online AMS and the ValveLink snap-on were installed. The software, along with training/mentoring and a change in work practices, has improved maintenance efficiency and costs. In the past, work was performed only when operations identified a process-related alert. Now the technicians help identify issues, generate work orders, and resolve them before they become process issues.
Infosys - Enterprise System Integration Software | White Paper (Infosys)
"
Enterprise system integration requires a clear understanding of information flow across the enterprise and shop floor for any given software process"
An introductory presentation on the periodic table that I gave to my children's 5th-grade classes. If you enjoy it, please use it in your own class if you are a teacher, or use it to give a similar talk at your children's elementary school.
How to design with science and not destroy the magic (Joe Leech)
By @mrjoe http://mrjoe.uk
The poet John Keats famously blamed scientists experimenting with light for 'unweaving the magic of the rainbow'.
Joe will look at applying science to design to make our apps and websites better.
We'll look at different types of data, from user research and analytics to psychology: how to research, collect, source, assess and, most importantly, design using data without losing the magic.
The Mysterious Process of Business Analysis Solved (HeadChannel)
When you hear the words "Business Analyst", a big company with a complex hierarchical structure most probably comes to mind. However, severe market demands challenge both small and big companies to meet the criteria of mess-free process management and facilitated communication. This can all be achieved by implementing strong business analysis skills.
Getting to the core: requirements gathering in the wild (Femke Goedhart)
Session slides as delivered on March 18th 2014 at Engage in Breda, The Netherlands by Sophie Lavignac-Le Madec & Femke Goedhart
Abstract: The basis of any good project is good requirements. Knowing what it is you are going to build or get determines whether your project will be a success or a flat-out failure. In reality, though, the requirements phase is often trivialized or even forgotten. This session will give you tips and tricks and explain the basic techniques for effectively getting to the core of the requirements, identifying ways of prioritizing them, and some core concepts of functional and technical design. Coming from a requirements-gathering as well as a development and customer point of view, Femke and Sophie will take you through real-life examples they have come across and the many do's and don'ts they have seen (and despaired over).
The 8-step business analysis process that you can apply whether you are in an agile environment or a traditional one, whether you are purchasing off-the-shelf software or building custom code, and whether you are responsible for a multi-million-dollar project or a one-week project.
Depending on the size and complexity of your project, you can go through these steps quickly or slowly, but to reach a successful outcome you must go through them.
Business process analysis and optimization: A pragmatic approach to business ... (Mozammel Hoque)
The rapidly changing economic and socio-economic environment forces us to think about how to keep business processes continuously optimized in highly uncertain and unexpected markets. This turbulent market situation has brought two major challenges: a socio-cultural (behavioural) challenge and a technical (IT) challenge. Current industry practice and academic research try to address this from the technology and business-model end, asking "how" to manage continuous change with a focus on flexibility and speed, maintainability and scalability, and cost. As a result, numerous business process modelling techniques have been proposed by researchers and the technology industry that capture both approaches, quantitative (objective) analysis and qualitative (subjective) analysis, though each approach has its own drawbacks. (It is not the purpose of this seminar to dwell on those drawbacks.) The socio-cultural challenge, however, is largely ignored, even though our investigation reveals that information behaviour changes faster than information systems, which is what drove us to work on it. The aim of this seminar is therefore to demonstrate how socio-cultural factors have a significant impact, i.e. why it matters, on the success of business process optimization.
A Questionnaire to Identify Failures in the Business Analysis Phase of ERP Projects (Varuna Harshana)
Identifying business needs and designing solutions is done through the process of business analysis. Many solutions are developed to meet business needs, including the implementation of ERP systems. ERP systems cut across many business processes and hence create many complexities in their design, and the probability of these complexities turning into failures is high. The business analyst is mostly responsible for handling and reducing these issues. Business analysis includes the following phases:
Enterprise/Company analysis
Requirements planning and management
Requirements elicitation
Requirements analysis and documentation
Requirements communication
Solution assessment and validation
The above-mentioned phases need to be executed in order to do a proper business analysis. Many aspects need to be considered and standards followed, so, as mentioned earlier, there is a high probability that these phases will not function in the expected manner, and the potential process failures need to be identified.
This can be done by preparing a questionnaire that monitors important elements of each of the above phases. The questions address aspects such as the standards used, the tools used, the parties responsible, and the causes of actions. In this manner the questionnaire can identify failures that could occur while carrying out the business analysis stage.
The questionnaire we prepared clearly indicates how effectively anyone could point out potential failures of the business analysis stage.
Process wind tunnel - A novel capability for data-driven business process imp... (Sudhendu Rai)
A talk I gave recently on data-driven process improvement methodology and techniques with applications and results from insurance and finance processes
Business Case Calculator for DevOps Initiatives - Leading credit card service... (Capgemini)
The 2015 World Quality Report reveals that 61% of respondents rate time-to-market as very important, which is the key reason for the proliferation of DevOps. The biggest ingredient is speed, based on efficiencies upstream and in operations. Technology leaders now need to wear a business hat and build their strategy on the cost of achieving the desired velocity, as opposed to cost savings.
Join MasterCard and Capgemini to learn about a real, time-to-market-driven DevOps business case calculator with technology, process and tool components.
Presented at HPE Discover Las Vegas 2016.
Brave New World - A wider perspective of our opportunities (Jayathirtha Rao)
The world has changed. We feel it in the work environment. We see it in the numbers. We smell it in the questions. Much that once was, is lost, for few remember it, even less still use it. In the name of agility and lean processes, we have turned a blind eye to quality, favouring quick execution over customer delight. And some things that should not have been forgotten were lost.
Join Jay Rao and Vishal Anand to explore the new world of "agile delivery". Understand the myths vs. the real picture and the anti-patterns to watch out for, and look forward to new measurements for delivery and the various functions within, from architecture to deployment, and where each of us has opportunities that align with common sense and outcomes.
Transforming Technology Transfer and Recipe Management: From Spreadsheets to ... (guest070fdd)
Presented by Paul Wlodarczyk at Documentation and Training Life Sciences, June 23-26, 2008 in Indianapolis.
The creation and management of formulation and control recipes is a process that is overdue for transformation. Today, most pharmaceutical companies still rely on error-prone, manual recipe-management approaches, in which master recipes are treated as static and disconnected documents. These outdated approaches lead to delays in technology transfer and introduce errors as formulations are entered into execution and quality management systems. Inefficient technology transfer, in turn, leads to delays in commercialization, waste or poor yield, compliance challenges, and risks to product quality.
Recipe standardization and management can improve every aspect of the product lifecycle, from late-stage discovery through clinical and commercial manufacturing. As pharmaceutical companies increasingly implement Quality by Design principles, recipe standardization will ensure that critical process parameters and their ranges are documented in a uniform fashion, from the earliest phases of process development and then managed effectively through all stages of manufacturing.
This slide deck explores new approaches for standardizing recipe management to mitigate risk and accelerate time to market. You will see case studies and be provided with a framework for understanding how to migrate to standards-based recipe-management practices.
Can ML help software developers? (TEQnation 2022), by Maurício Aniche
We have all heard of the amazing things machine learning can do. It can drive cars, it can detect whether people are using safety equipment, it can play games. But... can it help software developers in, say, finding bugs or refactoring code?
In this talk, I'll go through the different research projects I was involved in the area. I'll show that machine learning models can, in fact, learn a lot about how we develop software, and recommend interesting things to developers. In particular, I'll talk about models that recommend refactoring (done together with ING), models that find bugs and models that recommend which methods to log (both done together with Adyen).
You don't need to be an expert in machine learning to follow this talk.
IJERA (International Journal of Engineering Research and Applications) is an international, online, ... peer-reviewed journal. For more details, or to submit your article, please visit www.ijera.com
Opportunity Assessment and Advanced Control (Jim Cahill)
Gregory K. McMillan (http://www.modelingandcontrol.com) presents the process of assessing opportunities to apply advanced process control (APC) and their potential benefits, and exposes some common myths.
Performance measurement and exception management in investment processing (NIIT Technologies)
This paper analyzes a system that monitors the complete back-office operations involved in trade processing, and evaluates the robustness of processes and the efficiency of employed resources to manage the complete transaction lifecycle and yield optimum results at competitive cost. Despite significant IT investments, there is a lot of room for improvement at every stage of trade processing, from order initiation to settlement.
From Data to Insights: How IT Operations Data Can Boost Quality (Cognizant)
By leveraging highly analyzed operational data (the voice of customers, machines and tests), quality assurance (QA) and IT groups can derive major gains in application quality and user experience.
Predictive Analytics: Business Process Analysis And Optimization a CRM Case Study
1. Statistical Analytics Approach To Business Process Analysis And Optimization

Triangulation analysis of high-volume business processes, combined with large sample sizes, reliable results and known uncertainties, allows for finding and predicting business process patterns/characteristics.

The analysis results allow for making "predictions" as to which business process characteristics will yield "best practice" results, without having to analyze transactional processes in detail.

Analysis methods:
- Variational analysis: mean, median, standard deviation, data distributions
- Clustering analysis: pattern recognition, dominant characteristics
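The variational measures named on this slide (mean, median, standard deviation, and the resulting variation figure) can be sketched in a few lines. The sample values below are purely illustrative, not the deck's data, and reading "variation" as the coefficient of variation is an assumption:

```python
import statistics

# Hypothetical call-handle times (seconds) for one business process;
# the deck's real data came from a BPM application and is not shown here.
aht_samples = [320, 410, 545, 610, 287, 730, 512, 455, 398, 905]

mean_aht = statistics.mean(aht_samples)
median_aht = statistics.median(aht_samples)
stdev_aht = statistics.stdev(aht_samples)

# "AHT variation" is read here as the coefficient of variation:
# standard deviation relative to the mean (an assumption).
variation = stdev_aht / mean_aht

print(f"mean={mean_aht:.0f}s median={median_aht:.0f}s "
      f"stdev={stdev_aht:.0f}s variation={variation:.0%}")
```

With real per-call data, the same three statistics per process are enough to reproduce the comparison table on the next slide.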
2. CRM Case Study Example: CRM "Process" Data Visible Through BPM Application Allows For Variation Analysis

Metrics for the two processes, Activate Receiver (AR) and Schedule/Reschedule Service Call (SR):

- # calls analyzed: AR 98, SR 45
- AHT: AR 545 s, SR 325 s (AHT variation > 60%)
- Average # of screens per process: AR 7, SR 8
- Process variation (as defined by AHT variance): AR 92%, SR 63%
- Following internal best practice: AR 55%, SR 36% (supports the conclusion of a more constrained workflow, but the number of calls not following best practice is still significant)
- User error rate: ~16% for both (considerably lower than the 47% average for the entire population of calls)
- System error rate: ~2% for both (compared to ~17% for the entire population of calls)
- % main process screen used correctly: ~90% for both (further indication of a stable process; the main source of process variation is likely unnecessary screen jumping, where agents may be viewing other screens for information that may or may not be useful)
- Talk/wait time without system interaction (as % of AHT): ~48% for both
- CRM : KM breakdown (as % of AHT): AR CRM 82% / KM 6%, SR CRM 72% / KM 11%
3. Variation Analysis Results Give First Interesting Insights into Process Performance (I)

The number of successfully analyzed calls (45 for Schedule/Reschedule Service Call and 98 for Activate Receiver) makes the results below more or less statistically significant (statistical error ~15% and ~10% respectively).

Though AHT is significantly, and expectedly, different for the two processes (Activate Receiver AHT ~545 sec. vs. 325 sec. for the other process), both show significant AHT variation (> 60%).

However, process variation as seen in e.g. AHT and number of screens used (~60%) is significantly less than the variation seen in the BPM system for the top 10 CRM screens (~100-400%), indicating that both processes have better-constrained workflows than other processes. Still, both the mean and median number of screens per process are high (7-8).

This notion is confirmed when clustering workflow sequences and comparing them to internal best-practice workflow sequences: 55% of Activate Receiver and 36% of Schedule/Reschedule Service Call calls followed internal best practice, so the number of processes not following best practice is still significant.

Also, when comparing user error rates, one finds that both processes have an error rate per call of ~16%. Very high, but still about 3 times lower than the ~47% error rate per call seen in the general population.
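The "statistical error ~15% and ~10%" quoted above is consistent with the common rule-of-thumb relative sampling error of 1/sqrt(n) for sample sizes of 45 and 98 calls. A quick check (the assumption being that the deck used exactly this rule):

```python
import math

def relative_sampling_error(n: int) -> float:
    """Rule-of-thumb relative error for a sample of n observations."""
    return 1 / math.sqrt(n)

print(f"n=45: {relative_sampling_error(45):.0%}")  # ≈ 15% (Schedule/Reschedule)
print(f"n=98: {relative_sampling_error(98):.0%}")  # ≈ 10% (Activate Receiver)
```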
4. Variation Analysis Results Give First Interesting Insights into Process Performance (II)

The system error rate per call of ~2% is significantly lower than the ~17% system error rate per call in the general population, indicating more stable, though still not satisfactory, system performance.

Another indicator of a more robust process is the percentage of times the main process screen(s) were not only used but also used properly, about 90% for both processes, indicating that the main source of process variation originates from unnecessary screen jumping.

Significant talk and wait times without system interaction, the latter especially for the Activate Receiver process at ~48% of AHT, indicate substantial improvement opportunities.

Usage of CRM, KM, OMS and other apps as a percentage of AHT is quite different between the two processes (e.g. KM: 11% for Schedule/Reschedule vs. 6% for Activate Receiver, and CRM: 72% vs. 82%), but indicates that the majority of time is spent in the CRM system.

The frequency and average time of system usage per process for KM (9% and 49 sec. vs. 19% and 172 sec.) and OMS (22% and 154 sec. vs. 11% and 72 sec. for Activate Receiver) indicate, not unexpectedly, substantial usage variation by process, which could be better understood with BPM system templates.
5. Advanced data mining methods give even deeper insight

Business process example used: Activate Receiver process.

Used a statistical analysis tool similar to SPSS. Required usage of more advanced data mining methods, e.g.:
- HAC (hierarchical agglomerative clustering) over call handle time in KM, CRM and OMS, time in other apps, and time talking/waiting without system interaction
- Group characterization
- Principal Component Analysis
- Multiple Correspondence Analysis

These could be easily utilized without additional development effort after extracting the relevant business process data from the systems.
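As an illustration of the HAC step, here is a minimal average-linkage agglomerative clustering over toy (call duration, CRM time) vectors. The data and the linkage choice are assumptions for the sketch; the deck only names the method and its input features:

```python
import math

# Toy feature vectors per call: (call duration, time in CRM), in seconds.
# Hypothetical numbers; the deck used an SPSS-like tool on real BPM data.
calls = [(290, 240), (310, 255), (305, 250),
         (840, 610), (860, 640), (1550, 1200)]

def hac(points, k):
    """Minimal hierarchical agglomerative clustering (average linkage):
    repeatedly merge the two closest clusters until k clusters remain."""
    clusters = [[p] for p in points]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Average pairwise Euclidean distance between clusters i and j.
                d = sum(math.dist(a, b) for a in clusters[i] for b in clusters[j])
                d /= len(clusters[i]) * len(clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters.pop(j)
    return clusters

for cluster in hac(calls, 3):
    print(sorted(cluster))
```

On this toy input the three short calls, the two mid-length calls, and the single very long call separate cleanly, mirroring the kind of grouping the deck reports.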
6. Results of the advanced statistical/"predictive" analysis

Example used: Activate Receiver process.

- Outliers (4 of 98) had to be removed to yield descriptive results.
- Clustering resulted in 5 clusters, with 2 clusters encompassing ~80% of all processes (Cluster 1: 59 and Cluster 2: 17 processes).
- Cluster 1 is characterized by significantly lower means for all system and call handle times, as well as lower standard deviations for AHT and CRM time, compared to the entire population.
- More processes in Cluster 1 are free of user errors and follow internal best practice than in the entire population.
- Cluster 2 is almost the inverse of Cluster 1 in terms of system and AHT times and standard deviations; fewer of its processes are free of user errors or follow internal best practice compared to the entire population.
- Drilling into Cluster 1 further yields 5 clusters, with 2 clusters containing nearly 75% of processes. The main differentiators between the two are AHT and time in CRM, with New Cluster 1 having significantly lower durations and lower variation.
- While New Cluster 1 contained no user errors vs. about 20% in New Cluster 2, the distribution of processes following internal best practice was about the same.
- More detailed screen sequencing data is necessary to better correlate business process effects (AHT, system time and errors).
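The outlier step above ("4 of 98 had to be removed") can be reproduced in spirit with a standard rule such as Tukey's fences. The deck does not state which rule it actually used, and the sample values below are made up:

```python
import statistics

def drop_outliers(values):
    """Tukey's fences: drop points outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
    One common convention; the deck does not state its outlier rule."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if lo <= v <= hi]

# Hypothetical AHT sample (seconds) with two extreme calls mixed in.
aht = [280, 285, 290, 295, 300, 305, 310, 315, 320, 5400, 6100]
clean = drop_outliers(aht)
print(f"removed {len(aht) - len(clean)} outliers")  # removed 2 outliers
```

An IQR-based rule is robust here: a mean/standard-deviation cutoff can be "masked" by the very outliers it is meant to remove, while quartiles are barely affected by them.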
7. Summary of Statistical Business Process Analysis

- Both processes show better process stability/performance (variation in e.g. number of CRM screens used, error rates, % following internal best practice) compared to the results of the December analysis: a better-constrained CRM workflow.
- The Activate Receiver process shows better process stability but a significant amount of wait/non-value-add time: a significant process improvement opportunity.
- Though usage of CRM/KM/OMS differs between the two processes, agents spent significantly more time in CRM (70-80% of AHT) than in either KM or OMS, and less than 25% of all analyzed processes used either KM or OMS.
- Detailed clustering analysis for the Activate Receiver process showed:
  - Several distinct process groups/clusters with, in part, very different process characteristics.
  - The main points of distinction were very different means and variations in AHT, CRM and KM time, as well as non-system interaction time.
  - Best-performing vs. poorer-performing groups were also distinguished by no or few user errors, a significant % of internal best practice, and no technical troubleshooting.
8. Appendix: Detailed Clustering Results

Cluster sizes: Cluster 4: 59 examples (62.8%); Cluster 2: 17 (18.1%); Cluster 3: 8 (8.5%); Cluster 5: 8 (8.5%); Cluster 1: 2 (2.1%).

Continuous attributes per cluster, listed as: test value; group mean (std. dev.) vs. overall mean (std. dev.):

Cluster 4 (59 examples, 62.8%)
- Time in DORIS: -1.95; 4.44 (24.93) vs. 13.34 (57.03)
- Time in other applications: -3.44; 7.24 (21.95) vs. 26.03 (68.41)
- Time in OMS: -3.51; 0.00 (0.00) vs. 4.64 (16.57)
- Talk/wait time w/o system interaction: -5.6; 58.02 (73.70) vs. 146.11 (197.10)
- Time in RIO: -6.22; 279.71 (116.72) vs. 430.65 (304.00)
- Call duration: -6.73; 291.39 (127.41) vs. 474.66 (341.04)

Cluster 2 (17 examples, 18.1%)
- Time in RIO: 5.95; 829.94 (204.18) vs. 430.65 (304.00)
- Call duration: 4.87; 841.12 (203.17) vs. 474.66 (341.04)
- Talk/wait time w/o system interaction: 3.19; 285.06 (236.57) vs. 146.11 (197.10)
- Time in DORIS: -0.26; 10.12 (20.71) vs. 13.34 (57.03)
- Time in OMS: -1.27; 0.00 (0.00) vs. 4.64 (16.57)
- Time in other applications: -1.65; 1.06 (4.37) vs. 26.03 (68.41)

Cluster 3 (8 examples, 8.5%)
- Time in other applications: 8.65; 227.25 (54.27) vs. 26.03 (68.41)
- Talk/wait time w/o system interaction: 4.59; 453.38 (313.41) vs. 146.11 (197.10)
- Call duration: 2.5; 764.13 (489.49) vs. 474.66 (341.04)
- Time in RIO: 0.92; 526.13 (460.92) vs. 430.65 (304.00)
- Time in OMS: -0.16; 3.75 (10.61) vs. 4.64 (16.57)
- Time in DORIS: -0.33; 7.00 (14.71) vs. 13.34 (57.03)

Cluster 5 (8 examples, 8.5%)
- Time in OMS: 8.19; 50.75 (29.38) vs. 4.64 (16.57)
- Talk/wait time w/o system interaction: 0.46; 177.25 (99.41) vs. 146.11 (197.10)
- Call duration: 0.13; 489.38 (195.14) vs. 474.66 (341.04)
- Time in other applications: -0.13; 23.00 (65.05) vs. 26.03 (68.41)
- Time in RIO: -0.22; 407.63 (185.11) vs. 430.65 (304.00)
- Time in DORIS: -0.28; 8.00 (16.14) vs. 13.34 (57.03)

Cluster 1 (2 examples, 2.1%)
- Time in DORIS: 8.39; 350.00 (155.56) vs. 13.34 (57.03)
- Call duration: 4.48; 1549.50 (154.86) vs. 474.66 (341.04)
- Time in RIO: 3.6; 1199.50 (310.42) vs. 430.65 (304.00)
- Talk/wait time w/o system interaction: 0.46; 210.00 (296.98) vs. 146.11 (197.10)
- Time in OMS: -0.4; 0.00 (0.00) vs. 4.64 (16.57)
- Time in other applications: -0.54; 0.00 (0.00) vs. 26.03 (68.41)

Discrete attributes (listed in the source as [recall] accuracy vs. overall) cover: whether technical troubleshooting was necessary during the call, system errors during the call, user errors during the call, whether the process followed best practice, whether the main process screen was used as intended, and whether the request was completed in the screen designed for it. The strongest discriminators were: Cluster 4 is marked by no technical troubleshooting (test value 3.69 for troubleshooting = N), Clusters 2, 5 and 1 by troubleshooting being required (test values 2.2, 2.58 and 1.95 for troubleshooting = Y), and Cluster 3 by system errors during the call (test value 3.28).