This presentation is about a lecture I gave within the "Software systems and services" immigration course at the Gran Sasso Science Institute, L'Aquila (Italy): http://cs.gssi.infn.it/.
http://www.ivanomalavolta.com
Qualitative Research Questions and Methodology (Levelwing)
Big Data isn't just about numbers and charts; qualitative research provides rich insight to help with any business question you may have. This presentation provides an overview of qualitative research methodology and the importance and process of developing scalable research questions. Learn more about Levelwing's research capabilities: http://ow.ly/gcSXU
HCI LAB MANUAL
1. Understand the difficulties of interacting with machines; redesign the interfaces of home appliances.
2. Design a system based on a user-centered approach.
3. Understand the principles of good screen design.
4. Redesign an existing graphical user interface, taking screen complexity into account.
5. Design a web user interface based on Gestalt theory.
6. Implement different kinds of menus.
7. Implement different kinds of windows.
8. Design a system with proper guidelines for icons.
Data science has grown in popularity as a result of the rise of programming languages such as Python and techniques for collecting, analyzing, and interpreting data.
Making sense of data, on the other hand, has a long history and has been a subject of debate among scientists, statisticians, librarians, computer scientists, and others for many years. This infographic portrays the evolution of the phrase "Data Science" over time, as well as attempts to define it and related terms. For more details visit: https://www.careerera.com/blog/history-of-data-science
RESEARCH DESIGN AND METHODOLOGY - MAZPA EJIKEM NIMSA.pptx (TORASIF)
Research Design:
The overall structure or plan of the research project: your plan to answer the research question.
Research Methodology:
The study of the scientific steps a researcher adopts in conducting research. Largely theoretical.
Research Methods:
The steps and techniques used to conduct research, especially data collection and analysis. Narrower in scope than methodology.
Find a handheld device (e.g., a PDA, mobile phone) and examine how it has been designed, paying particular attention to how the user is meant to interact with it.
(a) From your first impressions, write down what first comes to mind as to what is good or bad about the way the device works. Then list (i) its functionality and (ii) the range of tasks a typical user would want to do using it. Is the functionality greater, equal, or less than what the user wants to do?
(b) Based on your study, compile your own set of usability and user experience goals that you think will be useful in evaluating the device. Decide which are the most important ones and why.
(c) Translate the core usability and user experience goals you have selected into two or three questions. Then use these questions to assess your device (sample questions: what mechanisms have been used to ensure safety? Is it fun to use, etc.).
(d) Evaluate the device using user-centered design principles.
(e) Discuss possible improvements that can be made to the interface to improve its usability based on your evaluation.
Quantitative Research: Surveys and Experiments (Martin Kretzer)
- Example lecture of the course "Methods and Theories in Information Systems"
- Target group: students who want to get an impression of the course before joining it
How to Perform Experiments: Basic Concepts CSCI .docx (drennanmicah)
How to Perform Experiments:
Basic Concepts
CSCI 783: Empirical Software Engineering
Empirical Software Engineering: how do we use empirical research in software engineering?
Repetition of empirical studies is necessary!
Definition
Planning and Design
Execution
Analysis
Packaging
Definition: Determine the study goal(s) and research hypothesis(es).
Planning and Design: Select the type of empirical study to be employed. Operationalize the study goals and hypotheses. Make a study plan: what needs to be done, by whom, and when. Prepare the material required to conduct the study.
Execution: Run study according to plan and collect required data
Analysis: Analyze collected data to answer operationalized study goals and hypotheses
Packaging: Report your studies
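As a minimal illustration of the Analysis phase, the sketch below compares two treatments with Welch's t statistic using only the Python standard library. The defect-detection scenario and all sample values are invented for illustration; a real analysis would also need degrees of freedom and a p-value (e.g., via scipy.stats.ttest_ind with equal_var=False).

```python
# Hypothetical Analysis-phase sketch: compare two treatments
# (e.g., inspection vs. testing) on a measured outcome using
# Welch's t statistic, computed with the standard library only.
import math
import statistics

# Invented sample data: defect-detection rates per subject.
treatment_a = [0.62, 0.71, 0.58, 0.66, 0.74, 0.69]
treatment_b = [0.55, 0.49, 0.61, 0.52, 0.57, 0.60]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    # Standard error of the difference of means (unequal variances).
    se = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / se

t = welch_t(treatment_a, treatment_b)
print(f"mean A = {statistics.mean(treatment_a):.3f}")
print(f"mean B = {statistics.mean(treatment_b):.3f}")
print(f"Welch t = {t:.2f}")
```

A large positive t here would suggest treatment A outperforms treatment B on this outcome, which the Packaging phase would then report together with the study design and threats to validity.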
Empiricism in Software Engineering
• Confirmation of more or less accepted hypotheses. For example: object orientation is good for reuse.
• Evaluation of methods. For example: whether Java produces higher-quality code than C++.
• Identification of relationships. For example: find a relationship between fault-prone components and design concepts.
• Validation of models and measures. For example: validate a specific cost-estimation model.
• Understanding of methods, techniques, and models. For example: understand the relationship between inspections and testing.
• Guidance/Control to help in management (for example: as input to assigning personnel to software inspections) and to support decision-making with respect to changes (for example: whether or not to introduce a new development tool).
The overall aim: change and improve practice.
Experimentation in software engineering
Experiment Objective (diagram): at the level of theory, a cause construct is related to an effect construct through a cause-effect construct. At the level of observation, a treatment is related to an outcome through a treatment-outcome construct. In the experiment operation, the treatment corresponds to the independent variable and the outcome to the dependent variable.
What is Empirical Software Engineering Research?
What kinds of questions are "interesting"?
What kinds of results help to answer these questions, and what research methods can produce these results?
What kinds of evidence can demonstrate the validity of a result, and how can we distinguish good results from bad ones?
Types of Research Questions
• Method or means of development: How can we do/create (or automate doing) X? What is a better way to do/create X?
• Method for analysis: How can I evaluate the quality/correctness of X? How do I choose between X and Y?
• Design, evaluation, or analysis of a particular instance: What is a (better) design or implementation for application X? What is property X of artifact/method Y? How does X compare to Y? What is the current state of X / practice of Y?
• Generalization or characterization: Given X, what will Y (necessarily) be? What, exactly, do we mean by X? What are the important characteristics of X? What is a good formal/empirical model for X? What are the varieties of X, and how are they related?
• Feasibility: Is it possible to accomplish X at all?
Paul Gerrard - Advancing Testing Using Axioms - EuroSTAR 2010 (TEST Huddle)
EuroSTAR Software Testing Conference 2010 presentation on Advancing Testing Using Axioms by Paul Gerrard. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Empirical Methods in Software Engineering - an Overview (alessio_ferrari)
A first introductory lecture on empirical methods in software engineering. It includes:
1) Motivation for empirical software engineering studies
2) How to define research questions
3) Measures and data collection methods
4) Formulating theories in software engineering
5) Software engineering research strategies
Find the videos at: https://www.youtube.com/playlist?list=PLSKM4VZcJjV-P3fFJYMu2OhlTjEr9Bjl0
Modeling Framework to Support Evidence-Based Decisions (Albert Simard)
Describes a framework for modelling in a regulatory environment, founded on sound scientific and knowledge-management concepts. It includes 1) demand (issue-driven) and supply (model-driven) approaches to modelling, 2) balancing modeler, manager, and user perspectives, 3) documentation to demonstrate due diligence, and a 700-term glossary.
Lecture on case study design and reporting in empirical software engineering. The lecture touches on the topics of units of analysis, data collection, data analysis, validity procedures, and collaboration with industry.
This is a presentation from the video 'Introduction to Operations Research', available at the end of this presentation and directly at https://youtu.be/PSOW3_gX2OU
It covers the history of Operations Research (OR) as well as the role, scope, characteristics, and attributes of OR.
The video also discusses models of Operations Research, classified by:
• Degree of abstraction
o Mathematical models
o Language models
o Concrete models
• Function
o Descriptive models
o Predictive models
o Normative models
• Time Horizon
o Static models
o Dynamic models
• Structure
o Iconic or physical models
o Analog or schematic models
o Symbolic or mathematical models
• Nature of environment
o Deterministic models
o Probabilistic models
• Extent of generality
o General model
o Specific models
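To make the "nature of environment" distinction above concrete, here is a small Python sketch (all numbers invented) contrasting a deterministic model, which always returns the same output for the same inputs, with a probabilistic model, whose random outcomes must be summarized by an expected value.

```python
# Deterministic vs. probabilistic models, in miniature.
import random

def deterministic_profit(units, price, unit_cost):
    # Deterministic: the same inputs always give the same output.
    return units * (price - unit_cost)

def probabilistic_profit(price, unit_cost, mean_demand, sd, trials=10_000):
    # Probabilistic: demand is a random variable, so we estimate
    # the expected profit by Monte Carlo simulation.
    rng = random.Random(42)  # fixed seed for reproducibility
    total = 0.0
    for _ in range(trials):
        demand = max(0.0, rng.gauss(mean_demand, sd))
        total += demand * (price - unit_cost)
    return total / trials

print(deterministic_profit(100, 12.0, 7.0))  # always 500.0
print(round(probabilistic_profit(12.0, 7.0, 100, 15), 1))
```

The probabilistic estimate hovers around the deterministic answer but varies with the seed and number of trials, which is exactly the practical difference between the two model classes.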
Systematic Literature Reviews and Systematic Mapping Studies (alessio_ferrari)
Lecture slides on Systematic Literature Reviews and Systematic Mapping Studies in software engineering. It describes the different steps, discusses differences between the two methods, and gives guidelines on how to conduct these types of study.
Conducting Experiments on the Software Architecture of Robotic Systems (QRARS... (Ivano Malavolta)
Slides of my invited talk at the 2nd workshop on Quality and Reliability Assessment of Robotic Software Architectures and Components (QRARSAC), co-located with the International Conference on Robotics and Automation (ICRA 2023).
Abstract of the talk. Today robotic systems are central to many industrial sectors, such as logistics, autonomous warehousing, and healthcare. While ROS helps roboticists by providing a standardized communication platform for robotic systems, ROS systems are getting larger and more complex, making it extremely difficult to ensure their level of quality, e.g., in terms of performance, security, energy efficiency, testability, and maintainability. Improving the quality of robotic systems is not a new activity, but in this talk we tackle it from a different perspective: we look at them from a software architecture perspective. I will walk you through a series of experiments we conducted at the Vrije Universiteit Amsterdam targeting the architecture of ROS systems, discuss some architectural tactics for ROS systems, and close with an overview of our open-source tool for automatically executing experiments on robotics software.
The slides of a short presentation I gave about my experience working in the context of EU grants. It contains tips and tricks for the before/during/after phases of an EU project.
The Green Lab - Research cocktail @Vrije Universiteit Amsterdam (October 2020) (Ivano Malavolta)
The slides of my presentation about the Green Lab at the event called Research Cocktail (October 2020). The event is organized by the Computer Science Department of the Vrije Universiteit Amsterdam.
The source code of our tools and the replication package of our experiments performed in the Green Lab can be found here: https://github.com/S2-group
For further details about the Green Lab and all our activities around it, you can contact me at i.malavolta@vu.nl
Navigation-aware and Personalized Prefetching of Network Requests in Android ... (Ivano Malavolta)
Slides of my presentation at the NIER track of the 41st International Conference on Software Engineering (ICSE 2019).
The paper is available here: http://www.ivanomalavolta.com/files/papers/ICSE_2019_NAPPA.pdf
How Maintainability Issues of Android Apps Evolve [ICSME 2018] (Ivano Malavolta)
Slides of my presentation at the Research track of the 34th International Conference on Software Maintenance and Evolution (ICSME 2018).
The full paper is available here: http://www.ivanomalavolta.com/files/papers/ICSME_2018.pdf
Collaborative Model-Driven Software Engineering: a Classification Framework a... (Ivano Malavolta)
Slides of my presentation at the Journal first track of the 40th International Conference on Software Engineering (ICSE 2018).
The accompanying extended abstract is available here: http://www.ivanomalavolta.com/files/papers/ICSE_2018_JournalFirst.pdf
The original TSE paper is available here: http://www.ivanomalavolta.com/files/papers/TSE_2017.pdf
Modeling behaviour via UML state machines [Software Design] [Computer Science... (Ivano Malavolta)
This presentation is about a lecture I gave within the "Software Design" course of the Computer Science bachelor program, of the Vrije Universiteit Amsterdam.
http://www.ivanomalavolta.com
Requirements engineering with UML [Software Design] [Computer Science] [Vrije... (Ivano Malavolta)
This presentation is about a lecture I gave within the "Software Design" course of the Computer Science bachelor program, of the Vrije Universiteit Amsterdam.
http://www.ivanomalavolta.com
Modeling and abstraction, software development process [Software Design] [Com... (Ivano Malavolta)
This presentation is about a lecture I gave within the "Software Design" course of the Computer Science bachelor program, of the Vrije Universiteit Amsterdam.
http://www.ivanomalavolta.com
[2017/2018] AADL - Architecture Analysis and Design Language (Ivano Malavolta)
This presentation is about a lecture I gave within the "Software systems and services" immigration course at the Gran Sasso Science Institute, L'Aquila (Italy): http://cs.gssi.infn.it/.
http://www.ivanomalavolta.com
[2017/2018] Introduction to Software Architecture (Ivano Malavolta)
This presentation is about a lecture I gave within the "Software systems and services" immigration course at the Gran Sasso Science Institute, L'Aquila (Italy): http://cs.gssi.infn.it/.
http://www.ivanomalavolta.com
4. Software engineering research
Some contents of this part of the lecture are taken from Ivica Crnkovic's lecture on software engineering research at Mälardalen University (Sweden).
5. What makes good research?
Is it HARD? Is it USEFUL? Is it ELEGANT?
These are all orthogonal and equally respectable qualities. There is very little chance that you will excel on all three axes.
We are young researchers: don't refuse usefulness. Why limit your impact to dusty publications?
http://goo.gl/d1YM9v
6. My vision about research
Research draws on four pillars: theory, programming, industrial projects, and experimentation.
Ivano Malavolta. Research Statement. November 2013. http://goo.gl/99N5AS
8. Research objectives
From a real-world practical PROBLEM to a real-world practical SOLUTION.
Key objectives:
• Quality → utility as well as functional correctness
• Cost → both of development and of use
• Timeliness → a good-enough result, when it's needed
Address problems that affect practical software.
11. Research strategy
A real-world practical PROBLEM is abstracted into an IDEALIZED PROBLEM in the research setting. Solving the idealized problem yields a research product (technique, method, model, system, …) and a SOLUTION to the idealized problem, which is then carried back to a real-world practical SOLUTION.
13. Validation of the results
Validation task 1: does the research product (technique, method, model, system, …) solve the idealized problem?
14. Validation of the results
Validation task 1: does the research product solve the idealized problem?
Validation task 2: does the research product help to solve the real-world practical problem?
17. Types of research questions
FEASIBILITY
CHARACTERIZATION
METHOD/MEANS
GENERALIZATION
DISCRIMINATION
Does X exist, and what is it?
Is it possible to do X at all?
What are the characteristics of X?
What exactly do we mean by X?
What are the varieties of X, and how are they
related?
How can we do X?
What is a better way to do X?
How can we automate doing X?
Is X always true of Y?
Given X, what will Y be?
How do I decide whether X or Y?
18. Example: software architecture
The software architecture of a program or computing system is the
structure or structures of the system, which comprise software
components, the externally visible properties of those components and
the relationships among them
System
subsystem Subsystem
component component component
L. Bass, P. Clements, R. Kazman, Software Architecture in Practice, Addison Wesley, 1998
19. Example: SA research questions
FEASIBILITY
CHARACTERIZATION
METHOD/MEANS
GENERALIZATION
DISCRIMINATION
Is it possible to automatically generate code
from an architectural specification?
What are the important concepts for
modeling software architectures?
How can we exploit domain knowledge to
improve software development?
What patterns capture and explain a
significant set of architectural constructs?
How can a designer make tradeoff choices
among architectural alternatives?
21. Research results
Real world
practical PROBLEM
Real world
practical SOLUTION
Research setting
IDEALIZED PROBLEM
Research product
(technique, method,
model, system, …)
22. Types of research results
QUALITATIVE &
DESCRIPTIVE
MODELS
TECHNIQUES
SYSTEM
EMPIRICAL
MODELS
ANALYTIC
MODELS
Report interesting observations
Generalize from (real-life) examples
Structure a problem area; ask good questions
Invent new ways to do some tasks, including
implementation techniques
Develop ways to select from alternatives
Embody result in a system, using the system
both for insight and as carrier of results
Develop empirical predictive models from
observed data
Develop structural models that permit formal
analysis
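One research result in the list above, the empirical model, can be made concrete with a toy sketch. The data below is invented purely for illustration: we fit an ordinary least-squares line predicting defect count from module size, the kind of empirical predictive model one might develop from observed project data.

```python
from statistics import mean

# Hypothetical observations: module size (KLOC) vs. defects found.
sizes   = [1.0, 2.5, 4.0, 5.5, 7.0, 8.5]
defects = [3,   6,   9,   14,  16,  21]

# Ordinary least-squares fit of: defects = a + b * size
mx, my = mean(sizes), mean(defects)
b = sum((x - mx) * (y - my) for x, y in zip(sizes, defects)) / \
    sum((x - mx) ** 2 for x in sizes)
a = my - b * mx

print(f"defects ≈ {a:.2f} + {b:.2f} * KLOC")

# The fitted model can now be used predictively (with due caution
# about extrapolating beyond the observed range):
predicted = a + b * 10.0  # predicted defects for a 10 KLOC module
```

Such a model is only as good as the data behind it, which is exactly why the validation tasks discussed later in the lecture matter.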
23. Example: SA research results
QUALITATIVE &
DESCRIPTIVE
MODELS
TECHNIQUES
SYSTEM
EMPIRICAL
MODELS
ANALYTIC
MODELS
Early architectural models
Architectural patterns
Domain-specific software architectures
UML to support object-oriented design
Architectural languages
Communication metrics as indicator of impact
on project complexity
Formal specification of higher-level
architecture for simulation
25. Research validation
Real world
practical PROBLEM
Real world
Validation task 2
Does the result
help to solve the practical problem?
practical SOLUTION
Research setting
IDEALIZED PROBLEM
Research setting
SOLUTION to
IDEALIZED PROBLEM
Validation task 1
Does the product
solve the idealized problem?
Research product
(technique, method,
model, system, …)
26. Types of research validation
PERSUASION
IMPLEMENTATION
EVALUATION
ANALYSIS
Formal model
Empirical model
EXPERIENCE
Qualitative model
Decision criteria
Empirical model
I thought hard about this, and I believe…
Here is a prototype of a system that…
Given these criteria, the object rates as…
Given the facts, here are consequences…
Rigorous derivation and proof
Data on use in controlled situation
Report on use in practice
Narrative
Comparison of systems in actual use
Data, usually statistical, on practice
27. Example: SA research validation
PERSUASION
IMPLEMENTATION
EVALUATION
ANALYSIS
Formal model
Empirical model
EXPERIENCE
Qualitative model
Decision criteria
Empirical model
Early architectural models
Early architectural languages
Taxonomies, performance improvement
Formal schedulability analysis
User interface structure
Architectural patterns
Domain-specific architectures
Communication and project
complexity
28. “NO-NO”s for software engineering
research
• Assume that a result demonstrated for a 10K-line system
will scale to a 500K-line system
• Expect everyone to do things “my way”
• Believe functional correctness is sufficient
• Assume the existence of a complete, consistent
specification
• Just build things without extracting enduring lessons
• Devise a solution in ignorance of how the world really
works
29. Building blocks for research
Question Result Validation
Feasibility
Characterization
Method/means
Generalization
Selection
Qualitative model
Technique
System
Empirical model
Analytic model
Persuasion
Implementation
Evaluation
Analysis
Experience
30. Is this a good plan?
Question Result Validation
Feasibility
Characterization
Method/means
Generalization
Selection
Qualitative model
Technique
System
Empirical model
Analytic model
Persuasion
Implementation
Evaluation
Analysis
Experience
31. A common good plan
Question Result Validation
Feasibility
Characterization
Can X be
done better?
Generalization
Selection
Qualitative model
Technique
Build Y
Empirical model
Analytic model
Persuasion
Implementation
Measure Y,
compare to X
Analysis
Experience
32. Is this a good plan?
Question Result Validation
Feasibility
Characterization
Method/means
Generalization
Selection
Qualitative model
Technique
System
Empirical model
Analytic model
Persuasion
Implementation
Evaluation
Analysis
Experience
33. A common, but bad, plan
Question Result Validation
Feasibility
Characterization
Method/means
Generalization
Selection
Qualitative model
Technique
System
Empirical model
Analytic model
Persuasion
Implementation
Evaluation
Analysis
Experience
34. Two other good plans
Question Result Validation
Can X be done
at all?
Characterization
Method/means Evaluation
Is X always
true of Y?
Selection
Qualitative model
Technique
Build a Y
that does X
Empirical model
Formally model
Y, prove X
“Look it works!”
Implementation
Check proof
Experience
35. How do you trust research, then?
Real world
practical PROBLEM
Real world
practical SOLUTION
?
1. What are the problems from the real world?
– Are they general?
– What are their elements?
2. Are the solutions general? What are their limits?
EMPIRICAL SOFTWARE ENGINEERING
36. *We will have a dedicated course on this topic
Empirical strategies*
Some contents of this part of the lecture are extracted from Matthias Galster’s tutorial
titled “Introduction to Empirical Research Methodologies” at ECSA 2014
37. Empirical software engineering
Scientific use of quantitative and qualitative data to
– understand and
– improve
software products and software development processes
[Victor Basili]
Data is central to addressing any research question
Issues related to validity addressed continuously
38. Why empirical studies?
Anecdotal evidence or “common sense” is often not good
enough
– Anecdotes often insufficient to support decisions in the industry
– Practitioners need better advice on how and when to use
methodologies
Evidence important for successful technology transfer
– systematic gathering of evidence
– wide dissemination of evidence
39. Dimensions of empirical studies
“In the lab” versus “in the wild” studies
Qualitative versus quantitative studies
Primary versus secondary studies
40. “In the lab” versus “in the wild” studies
Common “in the lab” methods
– Controlled experiments
– Literature reviews
– Simulations
Common “in the wild” methods
– Quasi-experiments
– Case studies
– Survey research
– Ethnographies
– Action research
42. Qualitative versus quantitative studies
Qualitative research
studying objects in their natural setting and letting the
findings emerge from the observations
– inductive process
– the subject is the person
They are
complementary
Quantitative research
quantifying a relationship or comparing two or more groups
with the aim of identifying a cause-effect relationship
– fixed implied factors
– focus on collected quantitative data → promotes comparison and
statistical analyses
44. Primary versus secondary studies
Primary studies
empirical studies in which we directly make measurements
or observations about the objects of interest, whether by
surveys, experiments, case studies, etc.
Secondary studies
empirical studies that do not generate any data from direct
measurements, but:
– analyze a set of primary studies
– usually seek to aggregate the results from these in order to
provide stronger forms of evidence about a phenomenon
48. Survey
Def: a system for collecting information from or about people
to describe, compare or explain their knowledge, attitudes
and behavior
Often an investigation performed in retrospect
Interviews and questionnaires are the primary means of
gathering qualitative or quantitative data
This is done by taking a sample that is representative
of the population to be studied
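The sampling step above can be sketched in a few lines. This is a minimal, hypothetical illustration (the sampling frame of 500 practitioners and the sample size of 50 are invented): a simple random sample drawn without replacement from a list of candidate respondents. Real surveys often need stratified or cluster sampling, but the principle is the same.

```python
import random

# Hypothetical sampling frame: 500 practitioners we could contact.
population = [f"practitioner-{i}" for i in range(500)]

random.seed(42)  # fix the seed so the draw is reproducible
sample = random.sample(population, k=50)  # simple random sample, no duplicates

print(len(sample), sample[:3])
```

Because `random.sample` draws without replacement, every member of the frame has an equal chance of selection and nobody is surveyed twice.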
49. Example: our survey on arch. languages
1. ALs Identification
– Definition of a preliminary set of ALs
– Systematic search
2. Planning the Survey
3. Designing the survey
4. Analyzing the Data
– vertical analysis (and coding) + horizontal analysis
50. Case study
Def: an empirical enquiry to investigate one instance (or a
small number of instances) of a contemporary software
engineering phenomenon within its real-life context,
especially when the boundary between phenomenon and
context cannot be clearly specified
Observational study
Data collected to track a specific attribute or establishing
relationships between different attributes
Multivariate statistical analysis is often applied
52. Experiment
Def: an empirical enquiry that manipulates one factor or
variable of the studied setting.
1. Identify and understand the variables that play a role in software
development, and the connections between variables
2. Learn cause-effect relationships between the development
process and the obtained products
3. Establish laws and theories about software construction that
explain development behaviour
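To make "manipulating one factor" concrete, here is a minimal, entirely hypothetical sketch (the group names, data, and effect are invented for illustration): two groups of developers complete the same task, a control group with the existing technique and a treatment group with the new one, and we compare completion times using Welch's t statistic, computed by hand with the standard library.

```python
import math
from statistics import mean, variance

# Hypothetical completion times (minutes) for the same task.
control   = [42, 38, 45, 50, 41, 44, 39, 47]  # existing technique
treatment = [35, 33, 40, 36, 34, 38, 31, 37]  # new technique (the manipulated factor)

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

t = welch_t(control, treatment)
print(f"mean(control)={mean(control):.1f}  mean(treatment)={mean(treatment):.1f}  t={t:.2f}")
```

A large positive t here would suggest the treatment group was faster; in a real experiment one would also compute a p-value, check assumptions, and control for confounding variables, which is exactly what the experimental design is for.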
56. How to have an impact in reality?
This is called technology transfer
57. Writing good software
engineering papers
Contents of this part of the lecture are extracted from Ivica Crnkovic’s lecture on
writing software engineering research papers at Mälardalen University (Sweden)
58. Research Papers
The basic and most important activity of the research
• Visible results, quality stamp
• Means for communications with other researchers
59. A good research paper should
answer a number of questions
What, precisely, was your contribution?
– What question did you answer?
– Why should the reader care?
– What larger question does this address?
What is your new result?
– What new knowledge have you contributed that the reader can use
elsewhere?
– What previous work (yours or someone else’s) do you build on? What do
you provide a superior alternative to?
– How is your result different from and better than this prior work?
– What, precisely and in detail, is your new result?
Why should the reader believe your result?
– What standard should be used to evaluate your claim?
– What concrete evidence shows that your result satisfies your claim?
If you answer these questions clearly, you’ll probably
communicate your result well
60. Let’s reconsider our SE research
process…
Research
questions
Research
results
Research
validation
61. What do program committees
look for?
The program committee looks for
Research
questions
– a clear statement of the specific problem you solved
– the question about software development you answered
– an explanation of how the answer will help solve an important
software engineering problem
You'll devote most of your paper to describing your result,
but you should begin by explaining what question you're
answering and why the answer matters
63. Research results
Explain precisely
– what you have contributed to the store of software engineering
knowledge
– how this is useful beyond your own project
66. What do program committees look
for?
The program committee looks for
– interesting, novel, exciting results that significantly enhance our
ability
• to develop and maintain software
• to know the quality of the software we develop
• to recognize general principles about software
• or to analyze properties of software
You should explain your result in such a way that someone
else could use your ideas
67. What do program committees look
for? What’s new here?
Use verbs that show
RESULTS, not only effort
69. What has been done before? How is
your work different or better?
• What existing technology does your research build on?
• What existing technology or prior research does your
research provide a superior alternative to?
• What’s new here compared to your own previous work?
• What alternatives have other researchers pursued?
• How is your work different or better?
71. What, precisely, is the result?
• Explain what your result is and how it works. Be concrete
and specific. Use examples.
– Example: system implementation
• If the implementation demonstrates an implementation
technique, how does it help the reader use the technique
in another setting?
• If the implementation demonstrates a capability or
performance improvement, what concrete evidence does
it offer to support the claim?
• If the system is itself the result, in what way is it a
contribution to knowledge? Does it, for example, show you
can do something that no one has done before?
72. Why should the reader believe your
result?
Show evidence that your result is valid—that it actually helps
to solve the problem you set out to solve
74. What do program committees look for? Why
should the reader believe your result?
• If you claim to improve on prior art, compare your result
objectively to the prior art
• If you used an analysis technique, follow the rules of that
analysis technique
• If you offer practical experience as evidence for your result,
establish the effect your research has. If at all possible, compare
similar situations with and without your result
• If you performed a controlled experiment, explain the
experimental design. What is the hypothesis? What is the
treatment? What is being controlled?
• If you performed an empirical study, explain what you
measured, how you analyzed it, and what you concluded
75. A couple of words on the abstract of
a paper
People judge papers by their abstracts and read the abstract
in order to decide whether to read the whole paper.
It's important for the abstract to tell the whole story
Don't assume, though, that simply adding a sentence about
analysis or experience to your abstract is sufficient; the paper
must deliver what the abstract promises
76. Example of an abstract structure:
1. Two or three sentences about the current state of the art,
identifying a particular problem
2. One or two sentences about what this paper contributes to
improving the situation
3. One or two sentences about the specific result of the paper
and the main idea behind it
4. A sentence about how the result is demonstrated or defended
77. Coming back to the initial example…
✓✗ ✓ ✗ ✓
State of the art
Overall contribution
Specific results
Validation
78. Second try…
State of the art
Overall contribution
Specific results
Validation
80. Homework
ICSE 2014 features a "Future of Software Engineering" track,
which provides delegates with a unique opportunity to
assess the current status of software engineering and to
indicate where the field is heading in the future.
FOSE is an invitation-only ICSE track that is held
approximately every seven or more years at ICSE
An international group of leading experts has been invited to
report on different topics, to provide a broad and in-depth
view of the evolution of the field.
http://2014.icse-conferences.org/fose
81. Homework
GOALS:
1. to have the chance to study a specific area of software
engineering that may be of interest to you
2. to be exposed to recurrent and important problems in
software engineering
TASKS:
1. Pick an article from the FOSE 2014 proceedings
2. Carefully read it and analyse it in terms of:
– its research domain, its evolution over time, and its future challenges
– [where possible] understand which research strategies have been
applied either in the paper or in the research area in general
3. give a presentation (max 25 slides) to the class
– other post-docs and students will attend the presentations
82. What does this lecture mean to you?
You now know how to carry out research in SE
Don’t focus on the “size” of the problem, but on
– the relevance (the practical, but also the theoretical!)
– the accuracy in the investigation (problem and evaluation research)
When conducting empirical research, don’t make claims you
cannot eventually measure
Finally, don’t think in black and white only
– don’t divide the world into methods, analyses, case studies, etc.
– don’t be afraid to look also at other disciplines → we are software
engineers in any case ☺
83. Suggested readings
1. Checking App Behavior Against App Descriptions (Alessandra Gorla,
Ilaria Tavecchia, Florian Gross, Andreas Zeller), In Proceedings of the
36th International Conference on Software Engineering, ACM, 2014.
2. Linares-Vásquez, M., Bavota, G., Bernal-Cárdenas, C., Oliveto, R., Di
Penta, M., and Poshyvanyk, D., "Mining Energy-Greedy API Usage
Patterns in Android Apps: an Empirical Study", in Proceedings of 11th
IEEE Working Conference on Mining Software Repositories (MSR'14),
Hyderabad, India, May 31- June 1, 2014, pp. 2-11
3. Shaw, M. (2003), “Writing Good Software Engineering Research Papers”,
in Lori A. Clarke, Laurie Dillon & Walter F. Tichy, eds., Proceedings of
ICSE 2003, IEEE Computer Society, pp. 726-737.
4. Shaw, M. (2002), “What makes good research in software
engineering?”, STTT 4(1), pp. 1-7.