Revisiting the Challenges in Aligning RE and V&V: Experiences from the Public... (Markus Borg)
Paper presented at 1st International Workshop on Requirements Engineering and Testing, Karlskrona, Sweden, 2014.
Successful coordination of Requirements Engineering and Testing (RET) is crucial in large-scale software engineering. If the activities involved in RET are not aligned, effort is inevitably wasted, and the probability of delivering high quality software products in time decreases. Previous work has identified sixteen challenges in aligning RET in a case study of six companies. However, all six case companies selected for the study are active in proprietary software engineering. In this experience report, we discuss to what extent the identified RET alignment challenges apply to the development of a large information system for managing grants from the European Union. We confirm that most of the findings from previous work also apply to the public sector, including the challenges of aligning goals within an organization, specifying high-quality requirements, and verifying quality aspects. Furthermore, we emphasize that the public sector might be impacted by shifting political power, and that several RET alignment challenges are amplified in multi-project environments.
Recommendation Systems for Issue Management (Markus Borg)
Presentation of research on recommendation systems for bug management in large software engineering projects. Contains a background and some highlights from my own research. Target audience: industry practitioners
Conference presentation from CSMR 2013, Genova, Italy.
Abstract: Completely analyzed and closed issue reports in software development projects, particularly in the development of safety-critical systems, often carry important information about issue-related change locations. These locations may be in the source code, as well as traces to test cases affected by the issue, and related design and requirements documents. In order to help developers analyze new issues, knowledge about issue clones and duplicates, as well as other relations between the new issue and existing issue reports, would be useful. This paper analyzes, in an exploratory study, issue reports contained in two Issue Management Systems (IMSs) containing approximately 20,000 issue reports. The purpose of the analysis is to gain a better understanding of relationships between issue reports in IMSs. We found that link-mining explicit references can reveal complex networks of issue reports. Furthermore, we found that textual similarity analysis might have the potential to complement the explicitly signaled links by recommending additional relations. In line with work in other fields, links between software artifacts have the potential to improve search and navigation in large software engineering projects.
Findability through Traceability - A Realistic Application of Candidate Tr... (Markus Borg)
Conference presentation from ENASE 2012 in Wroclaw, Poland.
Abstract: Since software development is dynamic in nature, impact analysis is an inevitable work task. Traceability is known as one factor that supports this task, and several researchers have proposed traceability recovery tools to suggest trace links in an existing system. However, these semi-automatic tools have not yet proven useful in industrial applications. Based on an established automation model, we analyzed the potential value of such a tool. We based our analysis on a pilot case study of an impact analysis process in a safety-critical development context, and argue that traceability recovery should be considered an investment in findability. Moreover, several risks involved in an increased level of impact analysis automation are already plaguing the state-of-practice workflow. Consequently, deploying a traceability recovery tool involves a lower degree of change than has previously been acknowledged.
Automation in the Bug Flow - Machine Learning for Triaging and Tracing (Markus Borg)
Issue management is a costly part of software development. In large projects, the continuous inflow of issue reports contributes to the information overload in a project, i.e., "a state where individuals do not have time or capacity to process all available information". In issue triaging, an initial step in issue management, a developer must be able to overview existing issue reports and easily navigate the software engineering project landscape. In this presentation, we present support for two work tasks involved in issue management: 1) issue assignment and 2) change impact analysis. We use machine learning to harness the ever-growing number of issue reports, by training recommendation systems on previous issues. Our industrial evaluations on 50,000+ issue reports in two large software development organizations indicate that automated issue assignment performs in line with current manual work. Moreover, we present how traceability from already resolved issue reports to various artifacts can be reused to jump start change impact analyses for newly submitted issues. Finally, we speculate on future ways to tame information overload into helpful software engineering recommendations.
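The automated issue assignment described above can be illustrated with a minimal text classifier trained on previously resolved issues. The sketch below implements a multinomial Naive Bayes model from scratch; the team names and reports are hypothetical, and the industrial evaluations used far larger datasets and more mature learners:

```python
import math
from collections import Counter, defaultdict

# Hypothetical training data: resolved issue reports with the team that fixed them.
history = [
    ("kernel panic on driver load", "drivers"),
    ("driver crash after suspend", "drivers"),
    ("login page layout broken", "frontend"),
    ("css styling wrong on login form", "frontend"),
]

def train(examples):
    """Count per-team word frequencies for multinomial Naive Bayes."""
    word_counts = defaultdict(Counter)
    team_counts = Counter()
    vocab = set()
    for text, team in examples:
        words = text.lower().split()
        word_counts[team].update(words)
        team_counts[team] += 1
        vocab.update(words)
    return word_counts, team_counts, vocab

def assign(text, model):
    """Pick the team with the highest log-probability (add-one smoothing)."""
    word_counts, team_counts, vocab = model
    total = sum(team_counts.values())
    best_team, best_score = None, -math.inf
    for team in team_counts:
        score = math.log(team_counts[team] / total)
        denom = sum(word_counts[team].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((word_counts[team][word] + 1) / denom)
        if score > best_score:
            best_team, best_score = team, score
    return best_team

model = train(history)
print(assign("driver hangs on resume", model))
```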
Testing Quality Requirements of a System-of-Systems in the Public Sector - Ch... (Markus Borg)
This document discusses challenges in testing quality requirements for a system-of-systems project in the public sector and proposes potential solutions. It identifies five key challenges: (1) requirements evolving during testing, (2) testers needing business knowledge, (3) requirements not being quantified, (4) lack of prioritization of requirements, and (5) difficulty simulating all operational states. Potential solutions proposed include adopting integrated requirements engineering, extending the Twin Peaks model, using the QUPER model for requirements quantification, focusing on architecturally significant requirements, and implementing virtual plumblines.
Agility in Software 2.0 - Notebook Interfaces and MLOps with Buttresses and R... (Markus Borg)
Keynote at the 6th Int’l. Conference on Lean and Agile Software Development, January 22, 2022
Artificial intelligence through machine learning is increasingly used in the digital society. Solutions based on machine learning, coined "Software 2.0," bring both great opportunities and great challenges for the engineering community to tackle. Due to the experimental approach used by data scientists when developing machine learning models, agility is an essential characteristic. In this keynote address, we discuss two contemporary development phenomena that are fundamental in machine learning development: notebook interfaces and MLOps. First, we present a solution that can remedy some of the intrinsic weaknesses of working in notebooks by supporting easy transitions to integrated development environments. Second, we propose reinforced engineering of AI systems by introducing metaphorical buttresses and rebars in the MLOps context. Machine learning-based solutions are dynamic in nature, and we argue that reinforced continuous engineering is required to quality assure the trustworthy AI systems of tomorrow.
Quality Assurance of Generative Dialog Models in an Evolving Conversationa... (Markus Borg)
Presented at CAIN22 - 1st Conference on AI Engineering – Software Engineering for AI
Due to the migration megatrend, efficient and effective second-language acquisition is vital. One proposed solution involves AI-enabled conversational agents for person-centered interactive language practice. We present results from ongoing action research targeting quality assurance of proprietary generative dialog models trained for virtual job interviews. The action team elicited a set of 38 requirements, and for 15 of them, those of particular interest to the evolving solution, we designed corresponding automated test cases. Our results show that six of the test case designs can detect meaningful differences between candidate models. While quality assurance of natural language processing applications is complex, we provide initial steps toward an automated framework for machine learning model selection in the context of an evolving conversational agent. Future work will focus on model selection in an MLOps setting.
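A test harness of the kind described, scoring candidate dialog models against automated test cases, might look as follows. The prompts, required phrases, and stub models are invented for illustration and are not the paper's actual framework:

```python
def run_test_suite(model, test_cases):
    """Score a candidate dialog model: each test case checks that the reply
    to a prompt contains all required phrases. Returns the pass rate."""
    passed = 0
    for prompt, required in test_cases:
        reply = model(prompt).lower()
        if all(phrase in reply for phrase in required):
            passed += 1
    return passed / len(test_cases)

# Two stub candidate models for a virtual job-interview agent.
def model_a(prompt):
    return "Tell me about your previous experience and your strengths."

def model_b(prompt):
    return "ok"

tests = [
    ("start interview", ["experience"]),
    ("probe strengths", ["strengths"]),
]
print(run_test_suite(model_a, tests), run_test_suite(model_b, tests))
```

In an MLOps setting, a pass-rate gap like this would feed into automated model selection before deployment.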
Test Automation with Grad-CAM Heatmaps - A Future Pipe Segment in MLOps for V... (Markus Borg)
This document discusses using Grad-CAM heatmaps to test machine learning models in an MLOps context. Grad-CAM visualizes the areas of an image that a deep learning model focuses on to make predictions. The document proposes using Grad-CAM heatmaps during model validation and monitoring to help ensure models are operating as intended and meeting requirements for trustworthy AI like transparency and accountability. Heatmaps could provide evidence for assurance cases by demonstrating a model's decisions are explained and allow for human oversight. Overall, Grad-CAM may be a useful technique for testing perception models as part of continuous integration and deployment in MLOps.
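The core Grad-CAM computation is simple enough to sketch on toy data: each channel's weight is the global average of its gradients, and the heatmap is the ReLU of the weighted sum of activation maps. The activations and gradients below are made up; in practice they come from a convolutional layer of a trained network:

```python
def grad_cam(activations, gradients):
    """Grad-CAM on toy data: global-average-pool the gradients to get
    per-channel weights, then take the ReLU of the weighted sum of
    activation maps to obtain the heatmap."""
    h, w = len(activations[0]), len(activations[0][0])
    weights = [sum(sum(row) for row in grad) / (h * w) for grad in gradients]
    return [
        [
            max(0.0, sum(wk * activations[k][i][j] for k, wk in enumerate(weights)))
            for j in range(w)
        ]
        for i in range(h)
    ]

# Two hypothetical 2x2 activation maps and their gradients.
acts = [[[1.0, 0.0], [0.0, 2.0]], [[0.0, 1.0], [1.0, 0.0]]]
grads = [[[0.5, 0.5], [0.5, 0.5]], [[-0.5, -0.5], [-0.5, -0.5]]]
print(grad_cam(acts, grads))
```

A test in an MLOps pipeline could then assert that the heatmap mass lies inside the annotated object's bounding box.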
Digital Twins Are Not Monozygotic - Cross-Replicating ADAS Testing in Two Ind... (Markus Borg)
The increasing levels of software- and data-intensive driving automation call for an evolution of automotive software testing. As a recommended practice of the Verification and Validation (V&V) process of ISO/PAS 21448, a candidate standard for safety of the intended functionality for road vehicles, simulation-based testing has the potential to reduce both risks and costs. There is a growing body of research on devising test automation techniques using simulators for Advanced Driver-Assistance Systems (ADAS). However, how similar are the results if the same test scenarios are executed in different simulators? We conduct a replication study of applying a Search-Based Software Testing (SBST) solution to a real-world ADAS (PeVi, a pedestrian vision detection system) using two different commercial simulators, namely, TASS/Siemens PreScan and ESI Pro-SiVIC. Based on a minimalistic scene, we compare critical test scenarios generated using our SBST solution in these two simulators. We show that SBST can be used to effectively and efficiently generate critical test scenarios in both simulators, and the test results obtained from the two simulators can reveal several weaknesses of the ADAS under test. However, executing the same test scenarios in the two simulators leads to notable differences in the details of the test outputs, in particular, related to (1) safety violations revealed by tests, and (2) dynamics of cars and pedestrians. Based on our findings, we recommend future V&V plans to include multiple simulators to support robust simulation-based testing and to base test objectives on measures that are less dependent on the internals of the simulators.
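The search-based generation of critical test scenarios can be illustrated with a toy stand-in for a simulator: hill climbing over scenario parameters to minimize the car-pedestrian separation distance. The dynamics and parameter ranges below are invented and far simpler than anything PreScan or Pro-SiVIC would model:

```python
import random

def simulate(ped_speed, ped_start, steps=100, dt=0.1):
    """Toy stand-in for a driving simulator: a car drives along x while a
    pedestrian crosses at x=50 along y; return the minimum separation."""
    car_x, car_speed = 0.0, 10.0
    ped_x, ped_y = 50.0, ped_start
    min_dist = float("inf")
    for _ in range(steps):
        car_x += car_speed * dt
        ped_y += ped_speed * dt
        dist = ((car_x - ped_x) ** 2 + ped_y ** 2) ** 0.5
        min_dist = min(min_dist, dist)
    return min_dist

def search_critical_scenario(iterations=300, seed=1):
    """Hill climbing that minimizes separation distance, i.e., searches
    for a near-collision scenario (the fitness function in SBST terms)."""
    rng = random.Random(seed)
    best = (rng.uniform(0.5, 3.0), rng.uniform(-30.0, -5.0))
    best_fit = simulate(*best)
    for _ in range(iterations):
        cand = (best[0] + rng.gauss(0, 0.3), best[1] + rng.gauss(0, 2.0))
        fit = simulate(*cand)
        if fit < best_fit:
            best, best_fit = cand, fit
    return best, best_fit

scenario, min_distance = search_critical_scenario()
print(f"critical scenario {scenario}, min distance {min_distance:.2f} m")
```

Running the same searched scenario in two different simulators, as in the study, would then expose how much the reported separation depends on simulator internals.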
Illuminating a Blind Spot in Digitalization - Software Development in Sweden’... (Markus Borg)
This document summarizes research on software development in Sweden's private and public sectors. A survey of over 3,000 companies found that 35% do in-house software development, with higher rates at larger companies. Demand for programming skills is increasing across all industry sectors as digitalization grows. In the public sector, a study of government agencies found 39% conduct software development, with varying practices. The researcher concludes digitalization requires an evidence-based policy approach to meet the escalating demand for development skills.
While Deep Neural Networks (DNN) have revolutionized applications that rely on computer vision, their characteristics introduce substantial challenges to automotive safety engineering. The behavior of a DNN is not explicitly expressed by an engineer in source code, instead enormous amounts of annotated data are used to learn a mapping between input and output. Functional safety as defined by ISO 26262 is not sufficient to match the needs for the new generation of data-driven software.
Earlier this year, ISO/PAS 21448 Safety of the Intended Functionality (SOTIF) was published by ISO. SOTIF is a Publicly Available Specification (PAS), a response to a pressing need for an automotive safety standard appropriate for machine learning. A PAS is a stepping stone toward a new ISO standard, and SOTIF is intended to complement conventional functional safety as defined in ISO 26262.
In this presentation, we introduce the SOTIF process and present our contributions on how to support safety of the intended function. First, we present search-based software testing to efficiently and effectively identify test scenarios that cause safety violations in simulated environments. Second, we present a safety cage architecture that helps perception systems reject input that does not resemble the training data.
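The safety cage idea, rejecting input that does not resemble the training data, can be sketched as a nearest-neighbor novelty check. The feature vectors and threshold below are invented; production safety cages use far more elaborate out-of-distribution detection:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class SafetyCage:
    """Minimal novelty-rejection sketch: accept an input only if it lies
    close enough to at least one training sample."""
    def __init__(self, training_data, threshold):
        self.training_data = training_data
        self.threshold = threshold

    def accepts(self, sample):
        return min(euclidean(sample, t) for t in self.training_data) <= self.threshold

# Hypothetical 2-D feature vectors standing in for image embeddings.
train = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
cage = SafetyCage(train, threshold=0.8)
print(cage.accepts((0.4, 0.2)))  # near the training data
print(cage.accepts((5.0, 5.0)))  # far from all training samples
```

When the cage rejects an input, the system would fall back to a safe behavior rather than trust the perception model's output.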
SZZ Unleashed: An Open Implementation of the SZZ Algorithm (Markus Borg)
Numerous empirical software engineering studies rely on detailed information about bugs. While issue trackers often contain information about when bugs were fixed, details about when they were introduced to the system are often absent. As a remedy, researchers often rely on the SZZ algorithm as a heuristic approach to identify bug-introducing software changes. Unfortunately, as reported in a recent systematic literature review, few researchers have made their SZZ implementations publicly available. Consequently, there is a risk that research effort is wasted as new projects based on SZZ output need to initially reimplement the approach. Furthermore, there is a risk that newly developed (closed source) SZZ implementations have not been properly tested, thus conducting research based on their output might introduce threats to validity. We present SZZ Unleashed, an open implementation of the SZZ algorithm for git repositories. This paper describes our implementation along with a usage example for the Jenkins project, and concludes with an illustrative study on just-in-time bug prediction. We hope to continue evolving SZZ Unleashed on GitHub, and warmly invite the community to contribute.
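The heuristic core of SZZ can be sketched in a few lines: the commits that last touched the lines modified by a bug-fix commit become candidate bug-introducing changes. The blame data below is invented; a full implementation additionally maps lines across history and filters candidates, e.g., by issue creation date:

```python
def szz_bug_introducers(fix_changed_lines, blame_before_fix):
    """Simplified core of SZZ: flag as candidate bug-introducing changes
    the commits that last modified the lines a bug-fix commit touched."""
    return {blame_before_fix[line] for line in fix_changed_lines
            if line in blame_before_fix}

# Hypothetical git-blame output for the file version just before the fix:
# line number -> hash of the commit that last modified that line.
blame = {10: "a1b2c3", 11: "a1b2c3", 42: "d4e5f6", 99: "0f9e8d"}
# The bug-fix commit modified lines 10, 11, and 42.
print(szz_bug_introducers([10, 11, 42], blame))
```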
Explainability First! Cousteauing the Depths of Neural Networks (Markus Borg)
Markus Borg is a senior researcher at RISE Research Institutes of Sweden who focuses on software engineering for machine learning. He presented on the importance of explainability for neural networks used in safety-critical systems like autonomous vehicles. Specifically, he discussed how to provide evidence for safety certification by tracing safety requirements to elements of a deep learning model, its training process, and validation tests. He also showed how fault tree analysis can be used to identify failure modes and ensure the system has mechanisms for graceful degradation if failures occur.
Test Automation Research... Is That Really Needed in 2018? (Markus Borg)
We run a 21,752 k€ EU project to push the test automation research front. This talk motivates why this is (tax) money well spent and presents some research highlights: 1) test result visualization, 2) mutation testing, and 3) AI-assisted bug assignment.
Supporting Change Impact Analysis Using a Recommendation System - An Industri... (Markus Borg)
Journal first presentation at ICSE'17 in Buenos Aires, Argentina.
M. Borg, K. Wnuk, B. Regnell, and P. Runeson. Supporting Change Impact Analysis Using a Recommendation System: An Industrial Case Study in a Safety-Critical Context, IEEE Transactions on Software Engineering, 43(6), pp. 675-700, 2017.
Component Source Origin Decisions in Practice - A Survey of Decision Making i... (Markus Borg)
The document summarizes a survey on decision making for component sourcing options in software engineering. It finds that companies typically consider developing components in-house as well as using commercial off-the-shelf (COTS) components, open source software (OSS), and outsourcing. Decisions are mainly based on expert judgment, with functionality being the most important criterion. Other important qualities include reliability, maintainability, performance, and security. Estimating component performance and reliability takes more time than assessing the other qualities.
Recommendation Systems for Issue ManagementMarkus Borg
Presentation of research on recommendation systems for bug management in large software engineering projects. Contains a background and some highlights from my own research. Target audience: industry practitioners
The document discusses the benefits of exercise for mental health. Regular physical activity can help reduce anxiety and depression and improve mood and cognitive function. Exercise stimulates the production of endorphins in the brain which elevate mood and reduce stress levels.
The document discusses the history and evolution of oral interpretation as a field of study from ancient Greece to modern times. It describes how oral interpretation originated with ancient Greek rhapsodists but was later subsumed under oratory and rhetoric. In the 18th-19th centuries, actors and ministers received training in interpretation. In the 19th century, James Rush and Francois Delsarte made influential contributions by developing systems for vocal technique and bodily expression. By the 20th century, interpretation was increasingly taught in colleges alongside elocution, though instruction varied between teachers. Theories continued to develop regarding naturalism vs. mechanics. Contemporary interpretation celebrates multi-vocal texts and diverse performance venues and styles.
Nine Oakland University students made a difference by volunteering their time to raise money and collect food for Baldwin Center visitors. Dishing Out to the "D" raised $430 after hosting three fundraising events and spent six hours at the Baldwin Center on two occassions serving soup to the less fortunate, organizing the church, and putting food and clothes into boxes. To find out more about Dishing Out to the "D," please contact Corinna Muntean at munteancorinna@gmail.com.
Conference presentation from CSMR 2013, Genova, Italy.
Abstract: Completely analyzed and closed issue reports in
software development projects, particularly in the development of safety-critical systems, often carry important information about issue-related change locations. These locations may be in the source code, as well as traces to test cases affected by the issue, and related design and requirements documents. In order
to help developers analyze new issues, knowledge about issue clones and duplicates, as well as other relations between the new issue and existing issue reports would be useful. This paper analyses, in an exploratory study, issue reports contained in two Issue Management Systems (IMS) containing approximately 20.000 issue reports. The purpose of the analysis is to gain a better understanding of relationships between issue reports
in IMSs. We found that link-mining explicit references can
reveal complex networks of issue reports. Furthermore, we
found that textual similarity analysis might have the potential to complement the explicitly signaled links by recommending additional relations. In line with work in other fields, links between software artifacts have a potential to improve search and navigation in large software engineering projects.
Findability through Traceability - A Realistic Application of Candidate Tr...Markus Borg
Conference presentation from ENASE 2012 in Wroclaw, Poland.
Abstract: Since software development is of dynamic nature, the impact analysis is an inevitable work task. Traceability is known as one factor that supports this task, and several researchers have proposed traceability recovery tools to propose trace links in an existing system. However, these semi-automatic tools have not yet proven useful in industrial applications. Based on an established automation model, we analyzed the potential value of such a tool. We based our analysis on a pilot case study of an impact analysis process in a safety-critical development context, and argue that traceability recovery should be considered an investment in ndability. Moreover, several risks involved in an increased level of impact analysis automation are already plaguing the state-
of-practice workflow. Consequently, deploying a traceability recovery tool involves a lower degree
of change than has previously been acknowledged.
Automation in the Bug Flow - Machine Learning for Triaging and TracingMarkus Borg
Issue management is a costly part of software development. In large projects, the continuous inflow of issue reports contributes to the information overload in a project, i.e., "a state where individuals do not have time or capacity to process all available information". In issue triaging, an initial step in issue management, a developer must be able to overview existing issue reports and easily navigate the software engineering project landscape. In this presentation, we present support for two work tasks involved in issue management: 1) issue assignment and 2) change impact analysis. We use machine learning to harness the ever-growing number of issue reports, by training recommendation systems on previous issues. Our industrial evaluations on 50,000+ issue reports in two large software development organizations indicate that automated issue assignment performs in line with current manual work. Moreover, we present how traceability from already resolved issue reports to various artifacts can be reused to jump start change impact analyses for newly submitted issues. Finally, we speculate on future ways to tame information overload into helpful software engineering recommendations.
Testing Quality Requirements of a System-of-Systems in the Public Sector - Ch...Markus Borg
This document discusses challenges in testing quality requirements for a system-of-systems project in the public sector and proposes potential solutions. It identifies five key challenges: (1) requirements evolving during testing, (2) testers needing business knowledge, (3) requirements not being quantified, (4) lack of prioritization of requirements, and (5) difficulty simulating all operational states. Potential solutions proposed include adopting integrated requirements engineering, extending the Twin Peaks model, using the QUPER model for requirements quantification, focusing on architecturally significant requirements, and implementing virtual plumblines.
This document discusses key aspects of a school system including general education subjects, different types of learners, providing feedback to students, in-class discussions, and assigning practical homework. It emphasizes that teachers should focus on all types of learners by incorporating kinesthetic, auditory, social, tactile and visual activities. Feedback should identify what students did correctly and patterns of mistakes, and discussions should present multiple sides of issues. Homework assignments aim to prepare students for life after graduation.
Agility in Software 2.0 - Notebook Interfaces and MLOps with Buttresses and R...Markus Borg
Keynote at the 6th Int’l. Conference on Lean and Agile Software Development, January 22, 2022
Artificial intelligence through machine learning is increasingly used in the digital society. Solutions based on machine learning bring both great opportunities, thus coined "Software 2.0," but also great challenges for the engineering community to tackle. Due to the experimental approach used by data scientists when developing machine learning models, agility is an essential characteristic. In this keynote address, we discuss two contemporary development phenomena that are fundamental in machine learning development, i.e., notebook interfaces and MLOps. First, we present a solution that can remedy some of the intrinsic weaknesses of working in notebooks by supporting easy transitions to integrated development environments. Second, we propose reinforced engineering of AI systems by introducing metaphorical buttresses and rebars in the MLOps context. Machine learning-based solutions are dynamic in nature, and we argue that reinforced continuous engineering is required to quality assure the trustworthy AI systems of tomorrow.
Quality Assurance Of Generative Dialog Models in an evolving Conversationa...Markus Borg
Presented at CAIN22 - 1st Conference on AI Engineering – Software Engineering for AI
Due to the migration megatrend, efficient and effective second-language acquisition is vital. One proposed solution involves AI-enabled conversational agents for person-centered interactive language practice. We present results from ongoing action research targeting quality assurance of proprietary generative dialog models trained for virtual job interviews. The action team elicited a set of 38 requirements, and we designed automated test cases for the 15 of particular interest to the evolving solution. Our results show that six of the test case designs can detect meaningful differences between candidate models. While quality assurance of natural language processing applications is complex, we provide initial steps toward an automated framework for machine learning model selection in the context of an evolving conversational agent.
Future work will focus on model selection in an MLOps setting.
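As a rough illustration of what automated test cases for model selection can look like, the sketch below scores candidate models against requirement-based checks and picks the best one. The requirement names, test cases, and toy "models" are invented for illustration, not the 38 requirements elicited in the study.

```python
# Hypothetical sketch of requirement-based model selection for a dialog agent.
# Each test case encodes one (invented) requirement as a predicate on a reply.

def no_repetition(reply: str) -> bool:
    """REQ-1 (illustrative): no word repeated three times in a row."""
    words = reply.lower().split()
    return all(not (words[i] == words[i + 1] == words[i + 2])
               for i in range(len(words) - 2))

def asks_follow_up(reply: str) -> bool:
    """REQ-2 (illustrative): an interview turn should end with a question."""
    return reply.strip().endswith("?")

TEST_CASES = [no_repetition, asks_follow_up]

def score(model, prompts):
    """Fraction of (prompt, test case) pairs the model passes."""
    replies = [model(p) for p in prompts]
    passed = sum(tc(r) for r in replies for tc in TEST_CASES)
    return passed / (len(replies) * len(TEST_CASES))

def select(candidates, prompts):
    """Pick the candidate model with the highest test pass rate."""
    return max(candidates, key=lambda name: score(candidates[name], prompts))
```

In an MLOps setting, `select` would run in the pipeline each time a retrained candidate model becomes available.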
Test Automation with Grad-CAM Heatmaps - A Future Pipe Segment in MLOps for V...Markus Borg
This document discusses using Grad-CAM heatmaps to test machine learning models in an MLOps context. Grad-CAM visualizes the areas of an image that a deep learning model focuses on to make predictions. The document proposes using Grad-CAM heatmaps during model validation and monitoring to help ensure models are operating as intended and meeting requirements for trustworthy AI like transparency and accountability. Heatmaps could provide evidence for assurance cases by demonstrating a model's decisions are explained and allow for human oversight. Overall, Grad-CAM may be a useful technique for testing perception models as part of continuous integration and deployment in MLOps.
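The core Grad-CAM arithmetic is compact enough to sketch directly. The snippet below is a minimal sketch on precomputed arrays; in a real pipeline the feature maps and gradients would come from framework hooks on a trained network. It computes the heatmap as the ReLU of a gradient-weighted sum of feature maps.

```python
import numpy as np

def grad_cam(feature_maps: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Grad-CAM heatmap for one image.

    feature_maps, gradients: shape (channels, H, W), taken from the last
    convolutional layer and the gradient of the class score w.r.t. it.
    Returns an (H, W) heatmap normalized to [0, 1]."""
    # Channel weights: global average pooling of the gradients.
    alpha = gradients.mean(axis=(1, 2))                            # (channels,)
    # Weighted sum of feature maps, then ReLU to keep positive evidence only.
    cam = np.maximum((alpha[:, None, None] * feature_maps).sum(axis=0), 0.0)
    if cam.max() > 0:
        cam /= cam.max()
    return cam
```

A test in the pipeline could then assert, for instance, that the heatmap mass falls inside the annotated bounding box of the detected object.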
Digital Twins Are Not Monozygotic - Cross-Replicating ADAS Testing in Two Ind...Markus Borg
The increasing levels of software- and data-intensive driving automation call for an evolution of automotive software testing. As a recommended practice of the Verification and Validation (V&V) process of ISO/PAS 21448, a candidate standard for safety of the intended functionality for road vehicles, simulation-based testing has the potential to reduce both risks and costs. There is a growing body of research on devising test automation techniques using simulators for Advanced Driver-Assistance Systems (ADAS). However, how similar are the results if the same test scenarios are executed in different simulators? We conduct a replication study of applying a Search-Based Software Testing (SBST) solution to a real-world ADAS (PeVi, a pedestrian vision detection system) using two different commercial simulators, namely, TASS/Siemens PreScan and ESI Pro-SiVIC. Based on a minimalistic scene, we compare critical test scenarios generated using our SBST solution in these two simulators. We show that SBST can be used to effectively and efficiently generate critical test scenarios in both simulators, and the test results obtained from the two simulators can reveal several weaknesses of the ADAS under test. However, executing the same test scenarios in the two simulators leads to notable differences in the details of the test outputs, in particular, related to (1) safety violations revealed by tests, and (2) dynamics of cars and pedestrians. Based on our findings, we recommend future V&V plans to include multiple simulators to support robust simulation-based testing and to base test objectives on measures that are less dependent on the internals of the simulators.
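The search loop at the heart of SBST can be sketched independently of any particular simulator. In the toy example below, a closed-form kinematic stand-in replaces the commercial simulators, and the scenario parameters and fitness function are illustrative, not those of the PeVi case study.

```python
import random

def min_distance(car_speed, ped_speed, ped_start):
    """Stand-in 'simulator': minimum car-pedestrian gap over a 10 s episode.

    The car drives along x; the pedestrian crosses the lane at x = 30 m,
    starting ped_start meters from the crossing point. Purely illustrative."""
    gap, t = 1e9, 0.0
    while t < 10.0:
        car_x = car_speed * t
        ped_y = -ped_start + ped_speed * t
        gap = min(gap, ((car_x - 30.0) ** 2 + ped_y ** 2) ** 0.5)
        t += 0.05
    return gap

def search(iterations=200, seed=1):
    """Random search for the scenario minimizing the safety margin."""
    rng = random.Random(seed)
    best, best_fit = None, 1e9
    for _ in range(iterations):
        scenario = (rng.uniform(5, 20),    # car speed [m/s]
                    rng.uniform(0.5, 2.5), # pedestrian speed [m/s]
                    rng.uniform(1, 6))     # pedestrian start offset [m]
        fit = min_distance(*scenario)
        if fit < best_fit:
            best, best_fit = scenario, fit
    return best, best_fit
```

Replacing `min_distance` with calls into two different simulators is exactly where the replication study observed diverging outputs.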
Illuminating a Blind Spot in Digitalization - Software Development in Sweden’...Markus Borg
This document summarizes research on software development in Sweden's private and public sectors. A survey of over 3,000 companies found that 35% do in-house software development, with higher rates at larger companies. Demand for programming skills is increasing across all industry sectors as digitalization grows. In the public sector, a study of government agencies found 39% conduct software development, with varying practices. The researcher concludes digitalization requires an evidence-based policy approach to meet the escalating demand for development skills.
While Deep Neural Networks (DNN) have revolutionized applications that rely on computer vision, their characteristics introduce substantial challenges to automotive safety engineering. The behavior of a DNN is not explicitly expressed by an engineer in source code, instead enormous amounts of annotated data are used to learn a mapping between input and output. Functional safety as defined by ISO 26262 is not sufficient to match the needs for the new generation of data-driven software.
Earlier this year, ISO/PAS 21448 Safety of the Intended Functionality (SOTIF) was published by ISO. SOTIF is a Publicly Available Specification (PAS), a response to a pressing need for an automotive safety standard appropriate for machine learning. A PAS is a stepping stone toward a new ISO standard, and SOTIF is intended to complement conventional functional safety as defined in ISO 26262.
In this presentation, we introduce the SOTIF process and present our contributions on how to support safety of the intended function. First, we present search-based software testing to efficiently and effectively identify test scenarios that cause safety violations in simulated environments. Second, we present a safety cage architecture that helps perception systems reject input that does not resemble the training data.
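One common way to realize a safety cage is an out-of-distribution check in feature space. The sketch below is an illustrative Mahalanobis-distance gate, not necessarily the architecture used in the presentation: it rejects inputs whose feature vectors lie far from the training distribution.

```python
import numpy as np

class SafetyCage:
    """Illustrative safety cage: accept only inputs resembling training data."""

    def __init__(self, train_features: np.ndarray, quantile: float = 0.99):
        # Fit a Gaussian summary of the training feature distribution.
        self.mean = train_features.mean(axis=0)
        cov = np.cov(train_features, rowvar=False)
        self.inv_cov = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))
        # Calibrate the rejection threshold on the training data itself.
        d = np.array([self._dist(x) for x in train_features])
        self.threshold = np.quantile(d, quantile)

    def _dist(self, x):
        """Mahalanobis distance of x from the training distribution."""
        diff = x - self.mean
        return float(np.sqrt(diff @ self.inv_cov @ diff))

    def accepts(self, x) -> bool:
        """True if x is close enough to the training distribution."""
        return self._dist(x) <= self.threshold
```

Rejected inputs would be routed to a safe fallback behavior rather than to the perception model.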
SZZ Unleashed: An Open Implementation of the SZZ AlgorithmMarkus Borg
Numerous empirical software engineering studies rely on detailed information about bugs. While issue trackers often contain information about when bugs were fixed, details about when they were introduced to the system are often absent. As a remedy, researchers often rely on the SZZ algorithm as a heuristic approach to identify bug-introducing software changes. Unfortunately, as reported in a recent systematic literature review, few researchers have made their SZZ implementations publicly available. Consequently, there is a risk that research effort is wasted as new projects based on SZZ output need to initially reimplement the approach. Furthermore, there is a risk that newly developed (closed source) SZZ implementations have not been properly tested, thus conducting research based on their output might introduce threats to validity. We present SZZ Unleashed, an open implementation of the SZZ algorithm for git repositories. This paper describes our implementation along with a usage example for the Jenkins project, and concludes with an illustrative study on just-in-time bug prediction. We hope to continue evolving SZZ Unleashed on GitHub, and warmly invite the community to contribute.
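The heuristic itself is simple to state: blame the lines deleted by a bug-fixing commit on the commits that last modified them. The toy sketch below runs that logic over an in-memory history model rather than a real git repository (SZZ Unleashed itself operates on git); the data layout is invented for illustration.

```python
# Toy SZZ heuristic: the bug-introducing candidates for a fix are the commits
# that last touched the lines the fix deleted (the "blame" step of SZZ).

def blame(history, fixed_file, line_no):
    """Last commit (before the fix) that modified the given file/line.

    history: chronologically ordered list of commits preceding the fix, each
    a dict with an "id" and a set of (file, line) pairs it modified."""
    last = None
    for commit in history:
        if (fixed_file, line_no) in commit["modified_lines"]:
            last = commit["id"]
    return last

def szz(history, fix):
    """Map a bug-fixing commit to its set of bug-introducing candidates."""
    return {blame(history, f, n) for (f, n) in fix["deleted_lines"]} - {None}
```

A real implementation additionally tracks line movement across commits and filters cosmetic changes, which is where most of the engineering effort lies.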
Explainability First! Cousteauing the Depths of Neural NetworksMarkus Borg
Markus Borg is a senior researcher at RISE Research Institutes of Sweden who focuses on software engineering for machine learning. He presented on the importance of explainability for neural networks used in safety-critical systems like autonomous vehicles. Specifically, he discussed how to provide evidence for safety certification by tracing safety requirements to elements of a deep learning model, its training process, and validation tests. He also showed how fault tree analysis can be used to identify failure modes and ensure the system has mechanisms for graceful degradation if failures occur.
Test Automation Research... Is That Really Needed in 2018?Markus Borg
We run an EU project with a budget of k€ 21,752 to push the test automation research front. This talk motivates why this is (tax) money well spent and presents some research highlights: 1) test result visualization, 2) mutation testing, and 3) AI-assisted bug assignment.
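Mutation testing, the second highlight, can be demonstrated in a few lines: seed artificial faults into a program and measure how many the test suite detects. The sketch below mutates max/min calls in a toy clamp function; the function and test suites are invented for illustration.

```python
import ast, copy

SOURCE = "def clamp(x, lo, hi):\n    return max(lo, min(x, hi))\n"
SWAP = {"max": "min", "min": "max"}

def mutants(source):
    """Yield one mutated source per swappable built-in call (max <-> min)."""
    tree = ast.parse(source)
    names = [n for n in ast.walk(tree) if isinstance(n, ast.Name) and n.id in SWAP]
    for i in range(len(names)):
        m = copy.deepcopy(tree)
        targets = [n for n in ast.walk(m) if isinstance(n, ast.Name) and n.id in SWAP]
        targets[i].id = SWAP[targets[i].id]
        yield ast.unparse(m)

def mutation_score(source, test_suite):
    """Fraction of mutants the test suite kills (detects)."""
    killed = total = 0
    for mutated in mutants(source):
        total += 1
        ns = {}
        exec(mutated, ns)               # build the mutant function
        if not test_suite(ns["clamp"]):
            killed += 1                 # the suite detected the mutant
    return killed / total
```

A low mutation score flags a weak test suite even when all tests are green, which is the technique's whole point.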
Supporting Change Impact Analysis Using a Recommendation System - An Industri...Markus Borg
Journal first presentation at ICSE'17 in Buenos Aires, Argentina.
M. Borg, K. Wnuk, B. Regnell, and P. Runeson. Supporting Change Impact Analysis Using a Recommendation System: An Industrial Case Study in a Safety-Critical Context, IEEE Transactions on Software Engineering, 43(6), pp. 675-700, 2017.
Component Source Origin Decisions in Practice - A Survey of Decision Making i...Markus Borg
The document summarizes a survey on decision making for component sourcing options in software engineering. It finds that companies typically consider developing components in-house as well as using commercial off-the-shelf (COTS) components, open source software (OSS), and outsourcing. Decisions are mainly based on expert judgment, with functionality being the most important criterion. Other important qualities include reliability, maintainability, performance, and security, with performance and reliability being the most time-consuming to estimate.
Enabling Visual Analytics with Unity - Exploring Regression Test Results in A...Markus Borg
Automation in ASIC verification generates large amounts of test results that can overload engineers. The document proposes addressing this by using visual analytics through a game engine to help engineers focus verification efforts. It describes a prototype tool built in Unity that visualizes test results across files and commits as an interactive "test execution cityscape". Future work includes evaluating the tool with additional engineers and students.
Sexuality - Issues, Attitude and Behaviour - Applied Social Psychology - Psyc...PsychoTech Services
A proprietary approach developed by bringing together the best of learning theories from Psychology, design principles from the world of visualization, and pedagogical methods from over a decade of training experience, that enables you to: Learn better, faster!
Microbial interaction
Microorganisms interact with each other and can be physically associated with other organisms in a variety of ways.
One organism can be located on the surface of another organism as an ectobiont, or within another organism as an endobiont.
Microbial interactions may be positive, such as mutualism, proto-cooperation, and commensalism, or negative, such as parasitism, predation, and competition.
Types of microbial interaction
Positive interaction: mutualism, proto-cooperation, commensalism
Negative interaction: Ammensalism (antagonism), parasitism, predation, competition
I. Mutualism:
It is defined as a relationship in which each organism in the interaction benefits from the association. It is an obligatory relationship in which mutualist and host are metabolically dependent on each other.
A mutualistic relationship is very specific: one member of the association cannot be replaced by another species.
Mutualism requires close physical contact between the interacting organisms.
A mutualistic relationship allows organisms to exist in habitats that could not be occupied by either species alone.
A mutualistic relationship allows the partners to act as a single organism.
Examples of mutualism:
i. Lichens:
Lichens are an excellent example of mutualism.
They are an association of specific fungi and certain genera of algae. In a lichen, the fungal partner is called the mycobiont and the algal partner the phycobiont.
II. Syntrophism:
It is an association in which the growth of one organism either depends on, or is improved by, a substrate provided by another organism.
In syntrophism, both organisms in the association benefit.
Compound A → (utilized by population 1) → Compound B → (utilized by population 2) → Compound C → products (utilized by both populations 1 and 2)
In this theoretical example of syntrophism, population 1 can utilize and metabolize compound A, forming compound B, but cannot metabolize beyond compound B without the cooperation of population 2. Population 2 is unable to utilize compound A but can metabolize compound B, forming compound C. Together, populations 1 and 2 carry out a metabolic sequence leading to an end product that neither population could produce alone.
Examples of syntrophism:
i. Methanogenic ecosystem in sludge digester
Methane production by methanogenic bacteria depends upon interspecies hydrogen transfer from other, fermentative bacteria.
Anaerobic fermentative bacteria utilize carbohydrates to generate CO2 and H2, which are then used by methanogenic bacteria (e.g., Methanobacter) to produce methane.
ii. Lactobacillus arabinosus and Enterococcus faecalis:
In minimal medium, Lactobacillus arabinosus and Enterococcus faecalis are able to grow together but not alone.
The synergistic relationship arises because E. faecalis requires folic acid, which is produced by L. arabinosus, while L. arabinosus requires phenylalanine, which is produced by E. faecalis.
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub...Sérgio Sacani
Context. The observation of several L-band emission sources in the S cluster has led to a rich discussion of their nature. However, a definitive answer to the classification of the dusty objects requires an explanation for the detection of compact Doppler-shifted Brγ emission. The ionized hydrogen in combination with the observation of mid-infrared L-band continuum emission suggests that most of these sources are embedded in a dusty envelope. These embedded sources are part of the S-cluster, and their relationship to the S-stars is still under debate. To date, the question of the origin of these two populations has been vague, although all explanations favor migration processes for the individual cluster members. Aims. This work revisits the S-cluster and its dusty members orbiting the supermassive black hole SgrA* on bound Keplerian orbits from a kinematic perspective. The aim is to explore the Keplerian parameters for patterns that might imply a nonrandom distribution of the sample. Additionally, various analytical aspects are considered to address the nature of the dusty sources. Methods. Based on the photometric analysis, we estimated the individual H−K and K−L colors for the source sample and compared the results to known cluster members. The classification revealed a noticeable contrast between the S-stars and the dusty sources. To fit the flux-density distribution, we utilized the radiative transfer code HYPERION and implemented a young stellar object Class I model. We obtained the position angle from the Keplerian fit results; additionally, we analyzed the distribution of the inclinations and the longitudes of the ascending node. Results. The colors of the dusty sources suggest a stellar nature consistent with the spectral energy distribution in the near- and mid-infrared domains. Furthermore, the evaporation timescales of dusty and gaseous clumps in the vicinity of SgrA* are much shorter (≲2 yr) than the epochs covered by the observations (≈15 yr).
In addition to the strong evidence for the stellar classification of the D-sources, we also find a clear disk-like pattern following the arrangements of S-stars proposed in the literature. Furthermore, we find a global intrinsic inclination for all dusty sources of 60 ± 20◦, implying a common formation process. Conclusions. The pattern of the dusty sources manifested in the distribution of the position angles, inclinations, and longitudes of the ascending node strongly suggests two different scenarios: the main-sequence stars and the dusty stellar S-cluster sources share a common formation history or migrated with a similar formation channel in the vicinity of SgrA*. Alternatively, the gravitational influence of SgrA* in combination with a massive perturber, such as a putative intermediate mass black hole in the IRS 13 cluster, forces the dusty objects and S-stars to follow a particular orbital arrangement. Key words. stars: black holes– stars: formation– Galaxy: center– galaxies: star formation
Discovery of An Apparent Red, High-Velocity Type Ia Supernova at 𝐳 = 2.9 wi...Sérgio Sacani
We present the JWST discovery of SN 2023adsy, a transient object located in the host galaxy JADES-GS+53.13485−27.82088 with a host spectroscopic redshift of 2.903 ± 0.007. The transient was identified in deep James Webb Space Telescope (JWST)/NIRCam imaging from the JWST Advanced Deep Extragalactic Survey (JADES) program. Photometric and spectroscopic follow-up with NIRCam and NIRSpec, respectively, confirm the redshift and yield UV-NIR light-curve, NIR color, and spectroscopic information all consistent with a Type Ia classification. Despite its classification as a likely SN Ia, SN 2023adsy is both fairly red (E(B−V) ∼ 0.9) despite a host galaxy with low extinction and has a high Ca II velocity (19,000 ± 2,000 km/s) compared to the general population of SNe Ia. While these characteristics are consistent with some Ca-rich SNe Ia, particularly SN 2016hnk, SN 2023adsy is intrinsically brighter than the low-z Ca-rich population. Although such an object is too red for any low-z cosmological sample, we apply a fiducial standardization approach to SN 2023adsy and find that the SN 2023adsy luminosity distance measurement is in excellent agreement (≲1σ) with ΛCDM. Therefore unlike low-z Ca-rich SNe Ia, SN 2023adsy is standardizable and gives no indication that SN Ia standardized luminosities change significantly with redshift. A larger sample of distant SNe Ia is required to determine if SN Ia population characteristics at high-z truly diverge from their low-z counterparts, and to confirm that standardized luminosities nevertheless remain constant with redshift.
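For readers unfamiliar with "fiducial standardization", the sketch below shows the standard machinery: the Tripp relation for SN Ia distance moduli and the flat-ΛCDM distance modulus it is compared against. All parameter values are illustrative defaults, not those fitted to SN 2023adsy.

```python
import math

C_KM_S = 299792.458  # speed of light [km/s]

def mu_tripp(m_b, x1, c, alpha=0.14, beta=3.1, M_B=-19.36):
    """Tripp estimator: mu = m_B - M_B + alpha*x1 - beta*c.

    alpha, beta, M_B are illustrative fiducial nuisance parameters."""
    return m_b - M_B + alpha * x1 - beta * c

def mu_lcdm(z, H0=70.0, Om=0.3, steps=2000):
    """Distance modulus predicted by flat LambdaCDM, via midpoint integration."""
    # Comoving distance: D_C = (c / H0) * integral_0^z dz' / E(z')
    E = lambda zp: math.sqrt(Om * (1 + zp) ** 3 + (1 - Om))
    dz = z / steps
    integral = sum(dz / E((i + 0.5) * dz) for i in range(steps))
    d_l = (1 + z) * C_KM_S / H0 * integral       # luminosity distance in Mpc
    return 5 * math.log10(d_l) + 25
```

The quoted ≲1σ agreement amounts to mu_tripp (from the measured light curve) matching mu_lcdm at the host redshift within its uncertainty.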
Evidence of Jet Activity from the Secondary Black Hole in the OJ 287 Binary S...Sérgio Sacani
We report the study of a huge optical intraday flare on 2021 November 12 at 2 a.m. UT in the blazar OJ287. In the binary black hole model, it is associated with an impact of the secondary black hole on the accretion disk of the primary. Our multifrequency observing campaign was set up to search for such a signature of the impact based on a prediction made 8 yr earlier. The first I-band results of the flare have already been reported by Kishore et al. (2024). Here we combine these data with our monitoring in the R-band. There is a big change in the R–I spectral index by 1.0 ± 0.1 between the normal background and the flare, suggesting a new component of radiation. The polarization variation during the rise of the flare suggests the same. The limits on the source size place it most reasonably in the jet of the secondary BH. We then ask why we have not seen this phenomenon before. We show that OJ287 was never before observed with sufficient sensitivity on the night when the flare should have happened according to the binary model. We also study the probability that this flare is just an oversized example of intraday variability using the Krakow data set of intense monitoring between 2015 and 2023. We find that the occurrence of a flare of this size and rapidity is unlikely. In machine-readable Tables 1 and 2, we give the full orbit-linked historical light curve of OJ287 as well as the dense monitoring sample of Krakow.
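A two-band spectral index of the kind quoted above is computed directly from the flux densities in the two bands. The sketch below uses the F_ν ∝ ν^(−α) convention; the example numbers in the test are illustrative, not the OJ 287 measurements.

```python
import math

def spectral_index(f1, nu1, f2, nu2):
    """Two-band spectral index alpha, assuming F_nu proportional to nu^(-alpha).

    f1, f2: flux densities measured at frequencies nu1, nu2."""
    return -math.log(f1 / f2) / math.log(nu1 / nu2)
```

A change of this index by 1.0 between quiescence and flare means the flaring flux has a markedly different frequency dependence, which is why the paper argues for a new radiation component.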
When I was asked to give a companion lecture in support of 'The Philosophy of Science' (https://shorturl.at/4pUXz), I decided not to walk through the details of the many methodologies in order of use. Instead, I employed a long-standing, and ongoing, scientific development as an exemplar: the ever-evolving story of Thermodynamics, a scientific investigation at its best.
Conducted over a period of >200 years, Thermodynamics R&D, and application, benefitted from the highest levels of professionalism, collaboration, and technical thoroughness. New layers of application, methodology, and practice were made possible by the progressive advance of technology. In turn, this has seen measurement and modelling accuracy continually improved at a micro and macro level.
Perhaps most importantly, Thermodynamics rapidly became a primary tool in the advance of applied science/engineering/technology, spanning micro-tech, to aerospace and cosmology. I can think of no better a story to illustrate the breadth of scientific methodologies and applications at their best.
(June 12, 2024) Webinar: Development of PET theranostics targeting the molecu...Scintica Instrumentation
Targeting Hsp90 and its pathogen Orthologs with Tethered Inhibitors as a Diagnostic and Therapeutic Strategy for cancer and infectious diseases with Dr. Timothy Haystead.
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...Sérgio Sacani
Context. With a mass exceeding several 10⁴ M⊙ and a rich and dense population of massive stars, supermassive young star clusters represent the most massive star-forming environment that is dominated by the feedback from massive stars and gravitational interactions among stars.
Aims. In this paper we present the Extended Westerlund 1 and 2 Open Clusters Survey (EWOCS) project, which aims to investigate the influence of the starburst environment on the formation of stars and planets, and on the evolution of both low- and high-mass stars. The primary targets of this project are Westerlund 1 and 2, the closest supermassive star clusters to the Sun.
Methods. The project is based primarily on recent observations conducted with the Chandra and JWST observatories. Specifically, the Chandra survey of Westerlund 1 consists of 36 new ACIS-I observations, nearly co-pointed, for a total exposure time of 1 Msec. Additionally, we included 8 archival Chandra/ACIS-S observations. This paper presents the resulting catalog of X-ray sources within and around Westerlund 1. Sources were detected by combining various existing methods, and photon extraction and source validation were carried out using the ACIS-Extract software.
Results. The EWOCS X-ray catalog comprises 5963 validated sources out of the 9420 initially provided to ACIS-Extract, reaching a photon flux threshold of approximately 2 × 10⁻⁸ photons cm⁻² s⁻¹. The X-ray sources exhibit a highly concentrated spatial distribution, with 1075 sources located within the central 1 arcmin. We have successfully detected X-ray emission from 126 out of the 166 known massive stars of the cluster, and we have collected over 71 000 photons from the magnetar CXO J164710.20-455217.
JAMES WEBB STUDY THE MASSIVE BLACK HOLE SEEDSSérgio Sacani
The pathway(s) to seeding the massive black holes (MBHs) that exist at the heart of galaxies in the present and distant Universe remains an unsolved problem. Here we categorise, describe, and quantitatively discuss the formation pathways of both light and heavy seeds. We emphasise that the most recent computational models suggest that, rather than a bimodal mass spectrum with light seeds at one end and heavy seeds at the other, a continuum exists: light seeds are more ubiquitous, while heavier seeds become less and less abundant owing to the rarer environmental conditions required for their formation. We therefore examine the different mechanisms that give rise to different seed mass spectra. We show how and why the mechanisms that produce the heaviest seeds are also among the rarest events in the Universe and are hence extremely unlikely to be the seeds for the vast majority of the MBH population. We quantify, within the limits of the current large uncertainties in the seeding processes, the expected number densities of the seed mass spectrum. We argue that light seeds must be at least 10³ to 10⁵ times more numerous than heavy seeds to explain the MBH population as a whole. Based on our current understanding of the seed population, this makes light seeds a significantly more likely pathway, given that heavy seeds (Mseed > 10³ M⊙) have an abundance pattern that is close to, and likely in excess of, 10⁻⁴ compared to light seeds. Finally, we examine the current state-of-the-art in numerical calculations and recent observations and plot a path forward for near-future advances in both domains.
PPT on Alternate Wetting and Drying presented at the three-day 'Training and Validation Workshop on Modules of Climate Smart Agriculture (CSA) Technologies in South Asia' workshop on April 22, 2024.