These slides were presented at the useR! 2011 Conference, held at the University of Warwick (Coventry) in August 2011, as part of the Lightning Talks session.
Six Sigma Quality Using R: Tools and Training - Emilio L. Cano
This document discusses using the statistical software R for Six Sigma quality improvement projects. It introduces Six Sigma methodology and the DMAIC problem-solving strategy. The presentation outlines how R can be used for Six Sigma tools and analysis, highlights useful R packages for Six Sigma, and describes efforts to spread the use of R for quality improvement.
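The Six Sigma tools this talk covers in R (packages such as SixSigma and qcc) include process capability analysis in the Measure phase. As a rough illustration of the underlying calculation, here is a minimal Python sketch of the Cp and Cpk indices; the function name and sample measurements are invented for the example:

```python
import statistics

def capability_indices(data, lsl, usl):
    """Compute Cp and Cpk from sample data and lower/upper spec limits."""
    mean = statistics.fmean(data)
    sd = statistics.stdev(data)          # sample standard deviation
    cp = (usl - lsl) / (6 * sd)          # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sd)  # capability accounting for centering
    return cp, cpk

# Illustrative measurements of a process with spec limits 9.5 - 10.5
measurements = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0]
cp, cpk = capability_indices(measurements, lsl=9.5, usl=10.5)
```

For a perfectly centered process, as in this sample, Cpk equals Cp; in R the same analysis would typically be done with `qcc::process.capability`.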
The document discusses using a card wall to visualize team workflow and track progress on agile projects. It explains how a card wall displays cards representing tasks or user stories that move through different stages like "To Do", "In Progress", and "Done". This allows teams to see bottlenecks, work distribution, and ensure smooth flow of work through the development process. Key benefits of the card wall include establishing shared vision, facilitating communication, and helping maintain a sustainable pace.
Calidad Seis Sigma con R: Competitividad e Innovación - Emilio L. Cano
This document presents the Six Sigma methodology and its application with the R software. It explains the fundamentals of Six Sigma, including the DMAIC cycle and the roles within projects. It also describes the statistical and planning tools that can be used in each phase, as well as the commercial software available. The goal is to spread the use of Six Sigma together with R to improve the quality and competitiveness of companies.
Decision Making under Uncertainty: R implementation for Energy Efficient Buil... - Emilio L. Cano
The document describes a decision support system for operators of energy efficient buildings, developed by Emilio L. Cano and Javier M. Moguerza of Rey Juan Carlos University. It presents an R implementation for modeling decision making under uncertainty that symbolically specifies optimization problems, generates solver input files, and analyzes solutions to support strategic and operational energy management decisions in public buildings.
The presented work aims to contribute to the standardization and interoperability of the Future Internet through an open and scalable architecture design. We present S3OiA, a syntactic/semantic Service-Oriented Architecture that allows the integration of any type of object or device on the Internet of Things, regardless of its nature. Moreover, the architecture makes it possible to use underlying heterogeneous resources as a substrate for the automatic composition of complex applications through a semantic Triple Space paradigm. Created applications are dynamic and adaptive, since they are able to evolve depending on the context in which they are executed. The validation scenario for this architecture encompasses areas that involve human beings in order to promote personal autonomy, such as home-care automation environments and Ambient Assisted Living.
The document provides examples of resources, activities, and costs for several startups, including a life science company, a robotic agriculture startup, a medical device company, and Apple. Each example outlines key resources, activities, partnerships, costs, and potential revenue streams for the business model.
Advanced Project Analysis: An Introduction to Fuse 3.0 - Acumen
This document summarizes an advanced project analysis presentation. It discusses (1) why sound planning is important for project success, (2) using metrics for project analysis, and (3) a case study demonstrating a project analysis software called Fuse 3.0. Fuse 3.0 provides enterprise project analysis through modules for metric, logic, and forensic analysis. It integrates with various scheduling and project management software to improve planning and execution through schedule validation, bid analysis, and regulatory compliance assessments.
100% R and More: Plus What's New in Revolution R Enterprise 6.0 - Revolution Analytics
R users already know why the R language is the lingua franca of statisticians today: because it's the most powerful statistical language in the world. Revolution Analytics builds on the power of open source R, and adds performance, productivity and integration features to create Revolution R Enterprise. In this webinar, author and blogger David Smith will introduce the additional capabilities of Revolution R Enterprise.
VP of Product Development, Dr. Sue Ranney, will also provide an overview of the features introduced in Revolution R Enterprise 6.0, including:
1. Big Data Generalized Linear Model, the new RevoScaleR function that provides a fast, scalable, distributable implementation of generalized linear models, offering impressive speed-ups relative to glm on in-memory data frames
2. Platform LSF Cluster Support, which allows you to create a distributed compute context for the Platform LSF workload manager
3. Azure Burst support added to RxHpcServer
4. Updated R engine (R 2.14.2)
5. Ability to use RevoScaleR analysis functions with non-xdf data sources such as SAS, SPSS or text
6. New methods for RxXdfData data sources including head, tail, names, dim, colnames, length, str, and formula
7. New function rxRoc for generating ROC curves
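Item 7's rxRoc belongs to the proprietary RevoScaleR R package, but the quantity it computes, points on a ROC curve, can be sketched generically. The following Python fragment (function name, labels, scores, and thresholds are all invented for the example) computes the false positive rate and true positive rate at each threshold:

```python
def roc_points(labels, scores, thresholds):
    """Return (false positive rate, true positive rate) pairs, one per threshold.
    labels are 0/1 ground truth; scores are classifier outputs."""
    pos = sum(labels)               # number of actual positives
    neg = len(labels) - pos         # number of actual negatives
    points = []
    for t in thresholds:
        tp = sum(1 for y, s in zip(labels, scores) if s >= t and y == 1)
        fp = sum(1 for y, s in zip(labels, scores) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))
    return points

labels = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]
pts = roc_points(labels, scores, thresholds=[0.1, 0.5, 0.95])
```

Sweeping the threshold from high to low traces the curve from (0, 0) to (1, 1); the area under it is the usual summary of classifier quality.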
Design for People, Effective Innovation and Sustainability - Musstanser Tinauli
Presentation for the thesis titled "Designing for people, effective innovation and sustainability: Introducing experiential factors in an observational framework to evaluate technology assisted systems".
Tools that can generate automatic test scripts from requirements will become more prevalent as a way to verify requirements and reduce testing effort.
Existing challenge:
Bridging the gap between natural language requirements and automated testing.
S-functions Presentation: The S-parameters for nonlinear components - Measure... - NMDG NV
The document discusses S-functions, which are proposed as behavioral models for nonlinear components and applications, analogous to how S-parameters are used for linear components and applications. S-functions aim to simplify design and testing of nonlinear RF/microwave circuits by providing a uniform characterization approach, as S-parameters do for linear circuits. By extracting S-functions from a component, its nonlinear behavior can be modeled and its performance can be simulated, enabling more efficient system-level design and easier comparison to measurements during manufacturing testing. The document outlines benefits of the S-function approach and similarities to the established S-parameter methodology.
We at Revolution Analytics are often asked "What is the best way to learn R?" While acknowledging that there may be as many effective learning styles as there are people, we have identified three factors that greatly facilitate learning R. For a quick start:
- Find a way of orienting yourself in the open source R world
- Have a definite application area in mind
- Set an initial goal of doing something useful and then build on it
In this webinar, we focus on data mining as the application area and show how anyone with just a basic knowledge of elementary data mining techniques can become immediately productive in R. We will:
- Provide an orientation to R’s data mining resources
- Show how to use the "point and click" open source data mining GUI, rattle, to perform the basic data mining functions of exploring and visualizing data, building classification models on training data sets, and using these models to classify new data.
- Show the simple R commands to accomplish these same tasks without the GUI
- Demonstrate how to build on these fundamental skills to gain further competence in R
- Move away from small test data sets and show how, with the same level of skill, one could analyze some fairly large data sets with RevoScaleR
Data scientists and analysts using other statistical software as well as students who are new to data mining should come away with a plan for getting started with R.
Colored Petri nets theory and applications - Abu Hussein
This document discusses colored Petri nets (CP-nets) and their applications. CP-nets combine Petri nets with programming languages to model systems involving concurrency, communication, and resource sharing. They allow for simulation and formal verification. The document provides examples of CP-net applications in various domains including protocols, software, hardware, control systems, and military systems. It also describes how CP-net models can be used to automatically generate code for system implementations.
Revolution Analytics provides Revolution R, which adds functionality to the open source R programming language for statistical analysis and predictive modeling. Revolution R improves R's productivity, performance, and ability to handle large datasets. It features an interactive development environment, multi-threaded math for faster computation, and tools to perform distributed, parallel analytics on big data in Hadoop and databases.
ILAP (held at the 3rd Digital Oilfield Summit) - André Torkveen
The presentation I gave back in October ’15. The conference was held in London by ACI (see http://www.wplgroup.com/aci/event/digital-oilfield-summit-europe/)
Deploying Functional Qualification at STMicroelectronics - DVClub
The document discusses STMicroelectronics' deployment of functional qualification methodologies using Certitude mutation analysis. It began as a collaboration with Certess in 2004 and has now expanded to cover 80% of ST's IPs. The methodology focuses on using Certitude metrics to measure verification quality and guide improvement efforts. Case studies demonstrate detecting issues in third-party IP and improving verification of a video codec through hierarchical fault dropping and incremental detection strategies.
Revolution R is a commercial product that adds functionality to the open source R programming language. It provides an integrated development environment, improved performance through multi-threaded math, and capabilities for handling big data through interfaces with Hadoop and Netezza. Revolution R also includes tools for interactive debugging, organizing code and data, and performing distributed analytics on large datasets using algorithms in RevoScaleR.
1) The document discusses enterprise optimization through analytics that go beyond traditional business intelligence (BI) and spreadsheets.
2) It promotes the benefits of TIBCO's analytics solutions, including clarity of visualization, freedom of spreadsheets, relevance of applications, and confidence in statistics.
3) TIBCO's analytics can help organizations better analyze processes and events in real-time to improve decision making and business outcomes.
Revolution Analytics provides an advanced analytics platform called Revolution R Enterprise that allows users to leverage the open source R language for big data analytics. The presentation discusses how R can be used to extract value from large, complex datasets through data exploration, visualization, and predictive modeling. It also outlines best practices for implementing an advanced analytics stack and how Revolution R Enterprise optimizes R for distributed computing across multiple data platforms like Hadoop and databases. The key benefits of the Revolution R platform are that it makes R scalable for big data, provides an enterprise-ready environment, and allows organizations to leverage R's flexibility for analytics innovation.
The document summarizes a conference on revamping the audit approach using XBRL-tagged accounting equation data. It discusses modeling the audit using a "top-cycle" approach, developing a domain-specific language for auditing, and applying XBRL tagging to all phases of a new 5-phase audit process for continuous, real-time auditing and reporting. The conference brings together academics and practitioners to advance this new computational auditing approach using XBRL data processing and modeling.
This document compares the software quality analysis tools CAST and SONAR. It finds that CAST covers more functionality overall (80% of all functionality versus 60% for SONAR), while SONAR has some advantages in testing capabilities. Both tools cover the main technologies used at Amadeus, such as Java and C++, but CAST supports more technologies and adds capabilities like generating action plans and estimating effort, whereas SONAR can execute unit tests and analyze code coverage. While CAST has more features, it also has higher license costs compared to the open source SONAR.
This document provides an executive summary of a white paper that reviews SAP Sybase IQ 15.4, a database platform designed to support business analytics and big data workloads. The white paper was sponsored by Sybase Inc. and conducted by independent analyst firm WinterCorp. Key points covered in the executive summary include:
- SAP Sybase IQ 15.4 aims to make the entire analytics process work smoothly and cost-effectively for both structured and unstructured data.
- It features a new analytic services layer, parallel processing with Hadoop, support for the R language, and expanded ecosystem support from third parties.
- At its core is a mature columnar database with data compression and query optimization capabilities designed for
The document outlines the key aspects of a project to develop an Android application called HandSimDroid, with sections on project overview, demonstrations, operations and processes, and planning and risks. The application allows running simulations of models built with Ptolemy, an open-source modeling tool, on handheld devices. It aimed to serve as a proof of concept and inspire innovation at Bosch, the client. The team tracked hours, identified process improvements, and released multiple versions to complete requirements and adapt to changes.
Probe Card Cost Drivers from Architecture to Zero Defects - IEEE Semiconductor Wafer Test Workshop 2011 presentation by Ira Feldman (www.hightechbizdev.com)
R and Shiny to support real estate appraisers: An expert algorithm implementa... - Emilio L. Cano
This document describes an expert algorithm implementation for Automated Valuation Models (AVM) to help real estate appraisers value properties. The algorithm collects property characteristics and sale prices from online listings to find comparable properties, much as a human appraiser would. It uses a modified inverse distance weighting estimator to determine a property's value based on the comparable properties found. The algorithm is implemented using R and Shiny to allow configuration of rules and provide an interactive interface for appraisers to explore estimation results. Ongoing work aims to improve precision through machine learning and geostatistics models.
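The summary does not specify how the estimator is modified, so the following is only a sketch of plain (unmodified) inverse distance weighting, written in Python for brevity even though the original work is in R; the function name, feature vectors, and prices are made up for illustration:

```python
def idw_estimate(target, comparables, power=2):
    """Estimate a property's value as an inverse-distance-weighted average of
    comparable sale prices; distance is Euclidean over the feature vectors."""
    weights, values = [], []
    for features, price in comparables:
        d = sum((a - b) ** 2 for a, b in zip(target, features)) ** 0.5
        if d == 0:
            return price            # exact match: use its price directly
        weights.append(1 / d ** power)
        values.append(price)
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / total

# Comparables as ((surface_m2, rooms), sale_price) -- illustrative data only
comps = [((80, 3), 200_000), ((100, 4), 260_000), ((60, 2), 150_000)]
estimate = idw_estimate((90, 3), comps)
```

Closer comparables dominate the weighted average, which mirrors how an appraiser gives more weight to the most similar recent sales.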
Generación de materiales didácticos multiformato con bookdown - Emilio L. Cano
This document describes how to use the bookdown tool to generate teaching materials in multiple formats. The goal is to guide students and to provide attractive, easily updatable resources across several platforms. R Markdown is used to create the documents, which are then compiled into book form with bookdown. This makes it possible to include R code and dynamic results. The outcome is a website with the course notes that students can use on different devices.
Unattended SVM parameters fitting for monitoring nonlinear profilesEmilio L. Cano
This document discusses using support vector machines (SVM) for unattended parameter fitting to monitor nonlinear profiles. It presents an illustrative example of using SVM regression to smooth measured density profiles of engineered wood boards. The key points are:
1) SVM regression requires selecting parameters C (regularization parameter) and ε (width of insensitive zone), which control the complexity and deviations of the model.
2) Methods are presented for unattended selection of C and ε based on properties of the input noise and data.
3) The SVM model is applied to smooth individual nonlinear profiles from measured wood board density data and identify potential outliers.
Six Sigma as a Quality Improvement Tool for Academic ProgramsEmilio L. Cano
The document discusses using Six Sigma as a quality improvement tool for academic programs. It aims to design and improve an Internal System Quality Assurance for a university to comply with accreditation standards. The authors extend the Six Sigma methodology, which uses the DMAIC strategy of Define, Measure, Analyze, Improve, Control to industrial quality processes, to academic processes. They develop a catalog of process typologies and apply Six Sigma to examples like defining quality policies and student selection. The goal is to systematically identify variations and continuously improve procedures.
Appling Scrum to Organize University Degrees CourseworkEmilio L. Cano
The document discusses applying the Scrum framework to organize university coursework. Scrum is an agile project management framework typically used for software development. It involves sprints, daily stand-up meetings, product backlogs and user stories. The authors applied Scrum concepts like sprints and user stories to structure practical work for a university course. Students worked in scrum teams on assignments divided into sprints. They found Scrum helped organize their work and the teachers found it improved classroom work organization and planning.
Monitoring nonlinear profiles with {R}: an application to quality controlEmilio L. Cano
This document discusses using R to analyze nonlinear profiles. It introduces the SixSigma package for smoothing nonlinear profiles using support vector machines. An example is provided using particle board density data to create a prototype profile and identify out-of-control boards. Nonlinear profiles allow more complex quality characteristics to be modeled and can be used with Shewhart control charts.
Energy-efficient technology investments using a decision support system frame...Emilio L. Cano
This document presents an integrated framework for decision support systems using R. It describes using R and related packages to represent stochastic energy optimization problems, generate input files for solvers, analyze results, and produce reproducible reports. Stochastic models are developed and solved within this framework. The framework allows statistical analysis, graphical output, model equations, solver inputs/outputs, and comprehensive reports to be combined for modeling, analysis, and stakeholder communication.
Generación y corrección automática de trabajos evaluables personalizados con ...Emilio L. Cano
El documento describe un método para generar trabajos evaluables personalizados para estudiantes individuales utilizando el software R. El método genera datos y enunciados únicos para cada estudiante, crea archivos de trabajo en formato Excel, evalúa automáticamente los trabajos terminados y califica a los estudiantes de forma eficiente. El objetivo es proporcionar una evaluación justa y diferenciada que promueva métodos de enseñanza innovadores.
Talentyon: how to turn R expertise into business within the collaborative eco...Emilio L. Cano
Talentyon is a platform that connects data analytics experts with businesses seeking their expertise. It aims to address challenges like the talent crunch in analytics and the rise of freelancers by building a network of verified experts. The case study describes how Talentyon matched an industrial manager at a food company with an R expert, providing statistical training and support to improve the company's industrial processes. As a result, the company benefited from ongoing improvement projects using statistical methods, while the expert earned remuneration through the Talentyon network.
Las normas ISO como puerta de entrada de la Estadística en la empresaEmilio L. Cano
Una norma ISO está reconocida y aceptada internacionalmente. Son desarrolladas por expertos de todo el mundo a través de comités técnicos a los que pertenecen entidades de normalización nacionales como AENOR, que canaliza la participación española en la elaboración de normas. El subcomité AENOR de métodos estadísticos CTN66/SC3 participa en el comité técnico de ISO TC69 ``Applications of statistical methods''. El subcomité CTN66/SC3 participa en el desarrollo y adopción de normas internacionales en estadística, así como su traducción y adopción a nivel nacional como normas UNE-ISO. Algunas de las normas adoptadas como normas UNE-ISO tratan sobre Seis Sigma(serie ISO 13053), gráficos de control (serie ISO 7870), inspección por muestreo (series ISO 2589 e ISO 3951), vocabulario (serie ISO 3534), entre otras. La normalización proporciona beneficios directos a las empresas, y una manera de llevar la Estadística a las empresas es a través de las normas.
Las 7 herramientas básicas de la calidad con REmilio L. Cano
Este documento presenta las 7 herramientas básicas de la calidad con R. Describe cada una de las herramientas, incluyendo el diagrama de causa-efecto, la hoja de verificación, el gráfico de control, el histograma, el gráfico de Pareto, el gráfico de dispersión y la estratificación. Muestra cómo crear cada una de estas herramientas estadísticas básicas utilizando paquetes de R como qcc y SixSigma.
Análisis de inversiones energéticas en el ámbito del edificioEmilio L. Cano
El documento analiza las inversiones energéticas en edificios bajo condiciones de incertidumbre. Explica que los enfoques deterministas conducen a riesgos al no considerar la variabilidad. Propone el uso de modelos de optimización estocástica para gestionar el riesgo. Presenta el Sistema de Ayuda a la Decisión EnRiMa desarrollado para apoyar la toma de decisiones estratégicas a largo plazo en condiciones de incertidumbre.
Standardisation on Statistics: ISO Standards and R ToolsEmilio L. Cano
This document discusses standardization in statistics through ISO standards and how R statistical software can support them. It provides an overview of ISO/TC 69, which develops standards for statistical applications, and AENOR, Spain's standards body involved in adopting and managing statistical standards. The document concludes that data scientists can benefit from understanding both standards and using R, as R code is open source and can be easily verified, meeting requirements of standards like ISO 9001.
An integrated Solver Manager: using R and Python for energy systems optimizationEmilio L. Cano
1) Decision support systems are needed to address new challenges for building managers around energy planning given global changes and local needs.
2) A Solver Manager was developed to integrate optimization models and solvers in a flexible and extensible way for use in decision support systems.
3) An example energy systems optimization model is presented involving minimizing costs subject to capacity and demand constraints. The model is specified, an instance is generated with data, and the solution is obtained.
Calidad Seis Sigma con R: Aplicación a la docenciaEmilio L. Cano
This document discusses using R software to support Six Sigma methodology. It introduces reproducible research approaches for statistical training, provides examples using Sweave documents to integrate R code and LaTeX, and outlines an EADAPU training program covering Six Sigma phases and tools. The document also describes using R for process mapping, loss function analysis, and measurement system analysis for quality improvement projects.
Strategic Energy Systems Planning under UncertaintyEmilio L. Cano
The document discusses a decision support system (DSS) called EnRiMa that was developed for operators of energy-efficient buildings. The DSS uses a strategic model to make long-term decisions about technology installations and a linked operational model to determine short-term energy dispatching. The model accounts for uncertainty through a scenario tree and stochastic optimization. An example application to a building evaluating photovoltaic and combined heat and power technologies under different demand scenarios is presented.
Reproducible Operations Research. An Application to Energy Systems OptimizationEmilio L. Cano
This document discusses reproducible operations research using an integrated framework in R. It presents a case study on the EnRiMa project, which developed a decision support system for energy systems optimization. The key components discussed include a symbolic model specification to represent optimization models mathematically, a solver manager to generate solver input and output documentation, and reporting of results. The goal is to tie specific instructions to data analysis and models so results can be recreated and better understood.
A Solver Manager for energy systems planning within a Stochastic Optimization...Emilio L. Cano
The document describes an energy systems planning model within a stochastic optimization framework. It includes both strategic decisions about technology deployment and operational decisions about energy system usage. A solver manager is proposed to integrate different optimization solvers to solve the strategic and operational subproblems. The model is being developed as part of the EnRiMa project to create a decision support system for efficient energy management in buildings.
Skybuffer SAM4U tool for SAP license adoptionTatiana Kojar
Manage and optimize your license adoption and consumption with SAM4U, an SAP free customer software asset management tool.
SAM4U, an SAP complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
Generating privacy-protected synthetic data using Secludy and MilvusZilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU und die Lizenzen nach dem CCB- und CCX-Modell sind für viele in der HCL-Community seit letztem Jahr ein heißes Thema. Als Notes- oder Domino-Kunde haben Sie vielleicht mit unerwartet hohen Benutzerzahlen und Lizenzgebühren zu kämpfen. Sie fragen sich vielleicht, wie diese neue Art der Lizenzierung funktioniert und welchen Nutzen sie Ihnen bringt. Vor allem wollen Sie sicherlich Ihr Budget einhalten und Kosten sparen, wo immer möglich. Das verstehen wir und wir möchten Ihnen dabei helfen!
Wir erklären Ihnen, wie Sie häufige Konfigurationsprobleme lösen können, die dazu führen können, dass mehr Benutzer gezählt werden als nötig, und wie Sie überflüssige oder ungenutzte Konten identifizieren und entfernen können, um Geld zu sparen. Es gibt auch einige Ansätze, die zu unnötigen Ausgaben führen können, z. B. wenn ein Personendokument anstelle eines Mail-Ins für geteilte Mailboxen verwendet wird. Wir zeigen Ihnen solche Fälle und deren Lösungen. Und natürlich erklären wir Ihnen das neue Lizenzmodell.
Nehmen Sie an diesem Webinar teil, bei dem HCL-Ambassador Marc Thomas und Gastredner Franz Walder Ihnen diese neue Welt näherbringen. Es vermittelt Ihnen die Tools und das Know-how, um den Überblick zu bewahren. Sie werden in der Lage sein, Ihre Kosten durch eine optimierte Domino-Konfiguration zu reduzieren und auch in Zukunft gering zu halten.
Diese Themen werden behandelt
- Reduzierung der Lizenzkosten durch Auffinden und Beheben von Fehlkonfigurationen und überflüssigen Konten
- Wie funktionieren CCB- und CCX-Lizenzen wirklich?
- Verstehen des DLAU-Tools und wie man es am besten nutzt
- Tipps für häufige Problembereiche, wie z. B. Team-Postfächer, Funktions-/Testbenutzer usw.
- Praxisbeispiele und Best Practices zum sofortigen Umsetzen
Your One-Stop Shop for Python Success: Top 10 US Python Development Providersakankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
GraphRAG for Life Science to increase LLM accuracyTomaz Bratanic
GraphRAG for life science domain, where you retriever information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...Jason Yip
The typical problem in product engineering is not bad strategy, so much as “no strategy”. This leads to confusion, lack of motivation, and incoherent action. The next time you look for a strategy and find an empty space, instead of waiting for it to be filled, I will show you how to fill it in yourself. If you’re wrong, it forces a correction. If you’re right, it helps create focus. I’ll share how I’ve approached this in the past, both what works and lessons for what didn’t work so well.
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
"Choosing proper type of scaling", Olena SyrotaFwdays
Imagine an IoT processing system that is already quite mature and production-ready and for which client coverage is growing and scaling and performance aspects are life and death questions. The system has Redis, MongoDB, and stream processing based on ksqldb. In this talk, firstly, we will analyze scaling approaches and then select the proper ones for our system.
How information systems are built or acquired puts information, which is what they should be about, in a secondary place. Our language adapted accordingly, and we no longer talk about information systems but applications. Applications evolved in a way to break data into diverse fragments, tightly coupled with applications and expensive to integrate. The result is technical debt, which is re-paid by taking even bigger "loans", resulting in an ever-increasing technical debt. Software engineering and procurement practices work in sync with market forces to maintain this trend. This talk demonstrates how natural this situation is. The question is: can something be done to reverse the trend?
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite efforts to optimize hashtables, that go as far as sacrificing core functionality, state-of-the-art designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state-of-the-art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. In a commodity server and a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
Digital Banking in the Cloud: How Citizens Bank Unlocked Their MainframePrecisely
Inconsistent user experience and siloed data, high costs, and changing customer expectations – Citizens Bank was experiencing these challenges while it was attempting to deliver a superior digital banking experience for its clients. Its core banking applications run on the mainframe and Citizens was using legacy utilities to get the critical mainframe data to feed customer-facing channels, like call centers, web, and mobile. Ultimately, this led to higher operating costs (MIPS), delayed response times, and longer time to market.
Ever-changing customer expectations demand more modern digital experiences, and the bank needed to find a solution that could provide real-time data to its customer channels with low latency and operating costs. Join this session to learn how Citizens is leveraging Precisely to replicate mainframe data to its customer channels and deliver on their “modern digital bank” experiences.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/how-axelera-ai-uses-digital-compute-in-memory-to-deliver-fast-and-energy-efficient-computer-vision-a-presentation-from-axelera-ai/
Bram Verhoef, Head of Machine Learning at Axelera AI, presents the “How Axelera AI Uses Digital Compute-in-memory to Deliver Fast and Energy-efficient Computer Vision” tutorial at the May 2024 Embedded Vision Summit.
As artificial intelligence inference transitions from cloud environments to edge locations, computer vision applications achieve heightened responsiveness, reliability and privacy. This migration, however, introduces the challenge of operating within the stringent confines of resource constraints typical at the edge, including small form factors, low energy budgets and diminished memory and computational capacities. Axelera AI addresses these challenges through an innovative approach of performing digital computations within memory itself. This technique facilitates the realization of high-performance, energy-efficient and cost-effective computer vision capabilities at the thin and thick edge, extending the frontier of what is achievable with current technologies.
In this presentation, Verhoef unveils his company’s pioneering chip technology and demonstrates its capacity to deliver exceptional frames-per-second performance across a range of standard computer vision networks typical of applications in security, surveillance and the industrial sector. This shows that advanced computer vision can be accessible and efficient, even at the very edge of our technological ecosystem.
“How Axelera AI Uses Digital Compute-in-memory to Deliver Fast and Energy-eff...
Six Sigma is Possible with R
1. Six Sigma is Possible with R
And Even Better
Emilio Lopez
Department of Statistics and Operations Research
Rey Juan Carlos University (Madrid)
University of Warwick, August 2011
The R User Conference 2011 - Lightning Talk 1/15
2. The DMAIC Cycle
3. The Scientific Method
http://electroncafe.wordpress.com/2011/05/04/scientific-process-rage/
4. The Scientific Method & Six Sigma
DMAIC Cycle: Define, Measure, Analyze, Improve, Control
Scientific Method: Ask a question; Do some background research; Construct a hypothesis; Test the hypothesis with an experiment; Analyze the data and draw conclusions; Communicate results
5. The key to success
Science is organized knowledge.
Herbert Spencer
Six Sigma is a quality paradigm which translates the complicated scientific terminology into a simple way to apply the scientific method within every organization.
8. R Challenges
Why not?
Six Sigma uses Statistics.
Six Sigma is based on the Scientific Method.
Six Sigma should use R!
Outstanding advantages
Every statistical tool, even in the base installation
Extension possibilities
Powerful graphics
9. Packages
qcc: Shewhart quality control charts for continuous, attribute and count data.
IQCC: Builds statistical control charts with exact limits for univariate and multivariate cases.
qualityTools: A package for teaching statistical methods in the field of Quality Science [. . . ]; the focus is on teaching [. . . ].
SixSigma: Our effort for spreading the R thinking among Six Sigma practitioners.
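As a taste of these packages, a Shewhart x-bar chart takes only a few lines with qcc. A minimal sketch, with data simulated purely for the example:

```r
# Minimal sketch: an x-bar control chart with the qcc package.
# The measurements are simulated for illustration only.
library(qcc)

set.seed(1234)
# 20 rational subgroups of 5 measurements each (rows = subgroups)
diameters <- matrix(rnorm(100, mean = 74, sd = 0.01), ncol = 5)

# Shewhart x-bar chart; plot = FALSE returns the object without drawing
q <- qcc(diameters, type = "xbar", plot = FALSE)

q$center   # estimated process center (grand mean)
q$limits   # lower and upper control limits
summary(q)
```

Calling `plot(q)` then draws the familiar control chart with the center line and 3-sigma limits.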
10. Six Sigma with R - Springer Use R! Series
Features
Title: Six Sigma with R
Due 2012
Approx. 350 pages
Wide background scope
Examples, a case study and practices
11. Process Map
[Figure: Six Sigma process map for the Paper Helicopter Project. Overall inputs X (operators, tools, raw material, facilities) feed four steps (INSPECTION, ASSEMBLY, TEST, LABELING) that produce the output Y (helicopter). Each step lists its own inputs, parameters (x) and features (y), classified in the legend as (C)ontrollable, (Cr)itical, (N)oise or (P)rocedure.]
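The map shown here is the paper helicopter example that ships with the SixSigma package, whose ss.pMap() function draws this kind of diagram. A shortened sketch follows; the argument structure is reproduced from memory, so treat the names as assumptions and check ?ss.pMap before relying on them:

```r
# Sketch of a process map with SixSigma::ss.pMap().
# Structure follows the package's paper helicopter example;
# argument names are from memory (assumption), check ?ss.pMap.
library(SixSigma)

inputs.overall  <- c("operators", "tools", "raw material", "facilities")
outputs.overall <- c("helicopter")
steps <- c("INSPECTION", "ASSEMBLY", "TEST", "LABELING")

# Inputs/outputs of each individual step
io <- list(c("sheets"), c("sheets"), c("helicopter"), c("helicopter"))

# x parameters per step, each tagged (C)ontrollable, (N)oise,
# (P)rocedure or N(C) as in the slide's legend (abbreviated here)
param <- list(list(c("width", "NC"), c("operator", "C")),
              list(c("operator", "C"), c("cut", "P")),
              list(c("operator", "C"), c("throw", "P")),
              list(c("operator", "C"), c("label", "P")))

# y features per step
feat <- list(list("ok"), list("weight"), list("time"), list("label"))

ss.pMap(steps, inputs.overall, outputs.overall,
        io, param, feat,
        sub = "Paper Helicopter Project")
```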
12. Gage R&R
[Figure: Six Sigma Gage R&R study for the Helicopter Project: three operators (op #1 to op #3) each measure three prototypes (prot #1 to prot #3). Panels: Components of Variation (%Contribution and %Study Var for G.R&R, Repeatability, Reproducibility and Part-to-Part), Var by Part, R chart by appraiser, Var by appraiser, x-bar chart by appraiser, and Part*appraiser interaction.]
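A chart set like this one is what the SixSigma package's Gage R&R function produces in a single call. A minimal sketch using the package's bundled helicopter data; the function and dataset names are as I recall them (assumptions), so check ?ss.rr:

```r
# Sketch of a Gage R&R study with the SixSigma package.
# ss.data.rr ships with the package: three operators measure
# flight times of three helicopter prototypes (names assumed).
library(SixSigma)

gage <- ss.rr(var = time1, part = prototype, appr = operator,
              data = ss.data.rr,
              main = "Six Sigma Gage R&R Study",
              sub  = "Helicopter Project")

# The call draws the six-panel chart and returns an object with
# the ANOVA table and the variance components behind the panels
```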
13. Capability Analysis
[Figure: Six Sigma capability analysis study for the Winery Project. Histogram with short- and long-term density curves against the specifications (LSL: 740, Target: 750, USL: 760). Normality checks: Shapiro-Wilk test p-value 0.07506; Lilliefors (K-S) test p-value 0.2291; normality accepted when p-value > 0.05.

Short Term: Mean 749.7625, SD 2.1042, n 20, Zs 3.14, Cp 1.5841 CI [1.1, 2.1], Cpk 1.5465 CI [1.1, 2.1]
Long Term: Mean 752.8443, SD 2.9577, n 40, Zs 2.42, Pp 1.1270 CI [0.9, 1.4], Ppk 0.8065 CI [0.9, 1.4]]
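The full study on this slide is produced by the SixSigma package (its ss.study.ca() function); comparable capability indices can also be computed with qcc's process.capability(). A minimal sketch of the latter, on simulated volumes (made-up data; only the spec limits are taken from the slide):

```r
# Minimal sketch: capability indices against the Winery Project
# specifications (LSL 740, target 750, USL 760). Data are simulated.
library(qcc)

set.seed(666)
# 20 subgroups of 5 simulated fill volumes
volume <- matrix(rnorm(100, mean = 750, sd = 2.1), ncol = 5)

q <- qcc(volume, type = "xbar", plot = FALSE)

# Prints Cp, Cp_l, Cp_u, Cp_k with confidence intervals
# and draws a histogram against the specification limits
process.capability(q, spec.limits = c(740, 760), target = 750)
```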
14. Open Platform for Quality Methodologies
Open Platform for Quality Methodologies
Improving the European Factory
FP7 PPP Funding Scheme
Looking for Partners
Other Projects
We are available for other projects that need partners in this area.
15. Conclusion
Thanks
We hope we will be able to convince Six Sigma practitioners that not only is it possible with R, but it is BETTER with R.
@emilopezcano | emilio.lopez@urjc.es