Requirements Engineering – Writing the Software Requirements Specification (SRS). A CRI Group Workshop. The requirements engineering approach employed successfully in the EHR4CR process is shown and discussed in order to extract lessons learned and to use it for new projects.
Requirements engineering is the process of eliciting stakeholder needs and desires and developing them into an agreed set of detailed requirements. It serves as the basis for all subsequent software development activities.
In general, a project begins with the requirements acquisition phase and ends with the specification of requirements in the form of the Software Requirements Specification (SRS). The requirements specification may even be used to manage the consistency of the entire system.
Learning from the requirements engineering process in the EU project EHR4CR. In particular, the topics of requirements scenarios in the process of requirements gathering and the iterative writing and validation of the software requirements specification (SRS) document can be applied to new projects. The requirements process consists of 4 steps: Requirements Elicitation – the art of obtaining meaningful requirements; Requirements Analysis – iterative improvement of the quality of requirements; Writing the requirements specification document (Software Requirements Specification); and Requirements Validation – this is also done iteratively, with several workshops.
Novel is the introduction of an iterative process for requirements engineering: start with only a subset of software requirements and iterate the collection and validation until the full system is implemented. In each iteration, design modifications are made and new functional capabilities are added. The following tools for requirements gathering were used: use cases, descriptions of the current situation and workflow, context diagrams, stakeholder interviews, scenarios and use case workshops.
A novel scenario-based approach for requirements engineering is introduced: the domain scenario is used to estimate probable effects (situation analysis and long-range planning). The domain scenario is broken down into high-level "Usage Scenarios". Usage scenarios describe critical business interactions and their anticipated operations; they serve as context for the use cases and the generation of requirements; they make sure the requirements are complete.
Development of the SRS with involvement of scenarios: 1. Begin with domain scenarios; 2. Development of usage scenarios; 3. Software Requirements Specification document. Several rounds of change management were employed while writing the SRS. This possibility for correction and improvement ensured that the requirements are of high quality and applicability.
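The scenario hierarchy described above (domain scenario broken down into usage scenarios, which trace to requirements) can be sketched as a simple data model; all names and IDs below are illustrative, not taken from the EHR4CR SRS:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    rid: str
    text: str

@dataclass
class UsageScenario:
    name: str                      # a critical business interaction
    requirements: list = field(default_factory=list)

@dataclass
class DomainScenario:
    name: str                      # situation analysis / long-range planning
    usage_scenarios: list = field(default_factory=list)

    def all_requirements(self):
        # Completeness check: every requirement should trace back to a scenario.
        return [r for us in self.usage_scenarios for r in us.requirements]

domain = DomainScenario("Protocol feasibility assessment across hospital EHRs")
query = UsageScenario("Investigator checks eligible patient counts")
query.requirements.append(
    Requirement("REQ-001", "The system shall return aggregated patient counts."))
domain.usage_scenarios.append(query)
print(len(domain.all_requirements()))  # 1
```

Walking the hierarchy from the domain scenario downwards makes gaps visible: a usage scenario with no attached requirements signals incomplete elicitation.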
Software Requirements and Specifications – vustudent1
CS510 - SRS handouts for Computer Science students of Virtual University of Pakistan.
Prepared by ForumVU.com Staff from the updated lectures and PowerPoint slides of CS510 - Software Requirements and Specifications in VU LMS.
This presentation describes the importance of trade-offs between software architecture quality attributes (NFRs). It explains Performance, Security, Availability and Scalability in depth and other attributes briefly.
Presented at a tech talk @ DFN Technology.
Introduction to Test Engineering Training – Part 1 – Mesut Günes
A document to be used in preparation for "Test Specialist" exams such as ISTQB and ISEB Foundation Level. It is also a Turkish-language resource for anyone who wants to learn about software test engineering.
Requirements analysis, also called requirements engineering, is the process of determining user expectations for a new or modified product. These features, called requirements, must be quantifiable, relevant and detailed. In software engineering, such requirements are often called functional specifications. Requirements analysis is an important aspect of project management.
Use of personalized medicine tools for clinical research networks – Wolfgang Kuchinke
Patient-centric clinical trials can gain enormously from the employment of personalised medicine tools. Here we address software tools created by the p-medicine network, which developed the ObTiMA data management system, Patient Empowerment Tool, data mining, data warehousing, biobank access, decision support, image annotation (DrEye) and simulation (Oncosimulator). We evaluated some of these tools for their suitability to perform clinical trials. Does their usage conform to regulations and standards (GCP, GDPR, GAMP, computer system validation)? Can these tools be integrated into the existing systems (IT infrastructure / organisational framework) of an international clinical trials network (ECRIN)?
Kuchinke - Personalized Medicine tools for clinical research networks – Wolfgang Kuchinke
Personalized medicine for clinical trials networks.
The p-medicine project is presented. It deals with the creation of an integrative infrastructure for Personalised Medicine, which aims to accelerate personalized medicine and personal clinical research. For this purpose p-medicine developed a comprehensive set of software tools, including the ObTiMA data management system, Patient Empowerment Tool, data mining, data warehousing, biobank access, decision support, image annotation (DrEye) and simulation (Oncosimulator). Here we show the evaluation of some of the p-medicine tools for their suitability to perform clinical trials. Does their usage conform to regulations and standards (GCP, GDPR, GAMP, computer system validation)? Can these tools be integrated into the existing systems (IT infrastructure / organisational framework) of an international clinical trials network (ECRIN)? To perform clinical trials, a legal and ethical framework based on international requirements and approved concepts for data security must be adopted. GCP (Good Clinical Practice) is such an international ethical and scientific quality standard for designing, recording and reporting trials that involve the participation of human subjects.
The evaluation of the usability of p-medicine software tools for clinical trials was done with two surveys: (1) a survey of p-medicine tools in the ECRIN network and (2) a p-medicine developer survey. The tool integration topics contained questions about the employment of the right Clinical Data Management System (CDMS) at the many ECRIN centres. There is competition between different solutions, like VISTA (EORTC), MACRO, secuTrial, RAVE and OpenClinica. A CDMS should be usable for all types of trials, and its usability in clinical trials must be demonstrated by integration of biobank access / safety functions. Only ObTiMA is able to specifically address the challenges of personalised medicine clinical trials. The results of the evaluation were that compliance gaps exist for quality management during software development, that complete GCP compliance has not yet been reached, and that a robust business model for software sustainability is missing. To address the latter, a Reciprocal Integration approach was developed to integrate p-medicine tools into clinical research networks.
Kuchinke - Clinical Trials Networks supported by tools and services – Wolfgang Kuchinke
Clinical Trials Networks supported by Tools and Services from Infrastructure Projects.
International clinical trials are a challenge to management. Although the number of clinical trials worldwide is increasing by around 10% per year, approvals for new molecular entities and biomedical licenses show little long-term increase. The main challenges are the need to recruit and retain sufficient numbers of patients and the successful implementation of e-Clinical Trials technologies, especially for trials incorporating ePRO (patient reported outcome) and eRecruitment services. We suggest that clinical trials networks should cooperate with infrastructure projects to enable the implementation of eTrials and patient-centric trials.
Clinical trials systems can be optimised by coordination through information sharing and collaboration and by building networks. Here, infrastructures can function as enablers through the provision of software tools, especially for patient-centric trials, ePRO (Patient Reported Outcome), data collection and recruitment using EHRs (Electronic Health Records), and the implementation of the necessary data protection, privacy protection and identity management. ECRIN is addressed as an example of a clinical trials network. ECRIN is a public, non-profit organisation that links scientific partners and networks across Europe to facilitate multinational clinical research. We suggest the integration of clinical research at ECRIN with several infrastructure services developed by BBMRI, EATRIS, EUDAT, TRANSFoRm, p-medicine, BioMedBridges, etc., resulting in an increase in the interoperability of clinical data management, biobanking, genetic databases, Electronic Health Records (EHR), query systems, data warehouses, data repositories and imaging data.
The SUPERSEDE project will provide advancements in several research areas, from end-user feedback and contextual data analysis, to decision making support in software evolution and adaptation. But the major novel contribution will be in integrating methods and tools from the mentioned areas, thus providing a new solution framework for software evolution and adaptation for data-intensive applications.
Evaluation of the importance of standards for data and metadata exchange for ... – Wolfgang Kuchinke
Electronic Data Capture (EDC) in clinical research can ensure high-quality, clean data capture. The capture of repeating measurements, like Adverse Events, is especially easy.
Here, a project is described that evaluates the importance of data and metadata exchange for clinical research, with a focus on EDC systems, the standards that are relevant, and the Computer System Validation (CSV) of EDC systems.
The project creates process descriptions, as well as the necessary documents and checklists for any GCP-compliant system validation (Computer System Validation, CSV), like the Traceability Matrix.
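A Traceability Matrix of this kind can be sketched as a requirements-to-tests mapping; the requirement and test-case IDs below are hypothetical, not taken from the project's actual documents:

```python
# Each functional requirement must be covered by at least one test case
# for a GCP-compliant Computer System Validation.
trace = {
    "REQ-001": ["TC-101", "TC-102"],  # e.g. audit trail is recorded
    "REQ-002": ["TC-103"],            # e.g. electronic signature enforced
    "REQ-003": [],                    # no test case yet
}

# The matrix makes coverage gaps immediately visible.
uncovered = [req for req, tests in trace.items() if not tests]
print(uncovered)  # requirements lacking test coverage
```

In practice the matrix also records the reverse direction (which requirement each test verifies), so that every test result can be traced back during an audit.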
Research networks are made aware of the necessity and importance of using EDC systems and conducting computer system validation for regulatory compliance. The important role of the CDISC standard, the standard for the collection, exchange, submission to authorities and archiving of data from clinical studies, is discussed in detail.
Health Care: Cost Reductions through Data Insights - The Data Analysis Group – James Karis
An overview of the cost reduction opportunities for a Health Care provider. These opportunities can be identified, quantified and optimised through data-driven insights. The slide pack also provides a strategic overview of how one would set up such a project within a large organisation, whilst mitigating patient-care concerns.
Standards for clinical research data - steps to an information model (CRIM) – Wolfgang Kuchinke
Standards for clinical research data: an introduction to the CDISC standards CDASH, SHARE, PRM and BRIDG and their evaluation to create an information model for clinical research (CRIM). In particular, CRIM should allow the integrative usage of medical care data together with clinical research data; it should support the processes of the Learning Health System (LHS).
CDASH is Clinical Data Acquisition Standards Harmonization; it identifies the basic data collection fields needed from a clinical, scientific and regulatory perspective to enable more efficient data collection at the Investigator sites. SHARE is a globally accessible electronic library built on a common information model, which enables precise and standardized data element definitions that can be used in studies and applications to improve biomedical research. SHARE is intended to be a healthcare‐biomedical research enriched data dictionary. The Protocol Representation Model (PRM) focuses on the characteristics of a clinical study and the definitions and association of activities within the protocols and defines over 100 common protocol elements. The BRIDG Model is an instance of the Domain Analysis Model. The dynamic component of BRIDG defines the various processes and dynamic behaviour of the domain; the static component describes the concepts, attributes, and relationships of the static constructs which collectively define a domain-of-interest.
The CRIM was developed based on activity models and use cases. CRIM specifies the necessary information objects, their relationships and associated activities. It is required to fully support the development of TRANSFoRm project's tools for the Learning Health System. All activity objects of the workflows were defined and characterized according to their data requirements and information needs and mapped to the concepts of established information models including the above mentioned CDISC standards.
The best mapping results were achieved with PCROM and it was decided to use PCROM as the basis for the development of CRIM. The comparison of PCROM with BRIDG found a significant overlap of concepts but also several areas important to research that were either not yet represented or represented quite differently in BRIDG. Adaptation of PCROM to the needs of CRIM was achieved by adding 14 information object types from BRIDG, two extensions of existing objects and the introduction of two new high-ranking concepts (CARE area and ENTRY area).
Ontologies for Clinical Research - Assessment and Development – Wolfgang Kuchinke
Ontologies for Clinical Research. Ontologies are representations with names and categories, properties and relations between concepts and entities of a domain of knowledge; they show the properties of a subject area and how they are related. The purpose of an ontology is to limit the complexity of information and to organize data into information and knowledge. My presentation deals with ontology-based data integration for clinical research. Data from clinical trials need to be reused and shared for secondary research purposes. Integrated clinical terminologies are necessary for an efficient clinical trial system. The following ontologies for clinical research were assessed: Clinical Trial Ontology (CTO), Ontology of Clinical Research (OCRe), Ontology for Biomedical Investigations (OBI), Cochrane PICO Ontology, ACGT Master Ontology, Basic Formal Ontology (BFO). The aim of our efforts was to find ontologies that enable and simplify data reuse and data sharing between pre-clinical research, phase I – phase III clinical trials and health data research. Existing ontologies are insufficient for this purpose, and there is a need for an ontology for clinical trial data integration. We want to create such an ontology by joining PICO, OCRe and OBI to form a new, more comprehensive ontology.
Temporal relations in queries of EHR data for research – Wolfgang Kuchinke
Temporal Relations in Queries of Electronic Patient Records. The main usage scenario for queries covers the patient identification and recruitment process for clinical trials. For this purpose, an extension of the EHR4CR workbench to support patient recruitment was created.
This workbench covers the following requirements: the need for built-in privacy protection. Patient identification and recruitment tracking tools have been made available to the clinical sites in the form of the workbench. Each participating clinical site has its own installation, used only locally (patient data don't leave the hospital).
One important requirement for the workbench is the ability to generate queries with temporal relations and constraints for eligibility criteria.
Queries in EHRs often have a temporal component, but available user interfaces allow only simple queries with simple temporal expressions. Time points and time intervals are the main concepts that must be considered, being related to instantaneous events (e.g. a single myocardial infarction) or to situations lasting for a time span (e.g. a drug therapy for 2 weeks). Intervals can be represented using time points by their upper and lower temporal boundaries: the start and end time points. Temporal relations (e.g. before, after) can be expressed via anchors. The dates of these anchor events can be retrieved, and event dates relative to these anchor events can be calculated. EHR4CR decided to build the workbench upon a simple, time-stamped database: each patient attribute was assigned a time-stamp corresponding to the time of the attribute's occurrence. The processing of temporal intervals was necessary since many questions dealing with inclusion / exclusion criteria involve complex temporal periods. A graphical interface that uses boxes for querying with temporal relations was created to simplify query generation. We think that the easiest way to specify temporal operators is with a user interface based on the combination of boxes. Temporal operators based on Allen's algebra were included. Expressions are displayed as graphic boxes and combined by operators. Events are specified and a temporal operator is selected from a predefined list.
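A minimal sketch of such Allen-style temporal operators over intervals with start and end time points (illustrative names only, not the EHR4CR implementation):

```python
from datetime import date
from dataclasses import dataclass

@dataclass
class Interval:
    start: date
    end: date

# A few of Allen's interval relations, usable to express eligibility
# criteria such as "drug therapy completed before myocardial infarction".
def before(a: Interval, b: Interval) -> bool:
    return a.end < b.start

def during(a: Interval, b: Interval) -> bool:
    return b.start < a.start and a.end < b.end

def overlaps(a: Interval, b: Interval) -> bool:
    return a.start < b.start < a.end < b.end

# An instantaneous event (e.g. a myocardial infarction) is an interval
# whose start and end time points coincide.
mi = Interval(date(2023, 5, 1), date(2023, 5, 1))
therapy = Interval(date(2023, 4, 1), date(2023, 4, 14))  # 2-week drug therapy
print(before(therapy, mi))  # True: the therapy ended before the MI
```

Each graphical box in the workbench interface would correspond to one such interval, and connecting two boxes with an operator evaluates one of these relations.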
Computer validation of e-source and EHR in clinical trials - Kuchinke – Wolfgang Kuchinke
Clinical Trials in the Learning Health System (LHS): Computer System Validation of eSource and EHR Data.
The question that was addressed: How to make a clinical trial data management system that uses EHR data, Patient Reported Outcome (PRO) and eSource data as part of the Learning Health System compliant with regulations and with Good Clinical Practice (GCP)?
The Learning Health System (LHS) connects health care with translational and clinical research. It generates new medical knowledge as a by-product of the care process, and its aim is to improve the health and safety of patients. The LHS generates and applies knowledge. For this purpose, clinical research, which is research involving humans, must be part of the LHS. Two general types of research exist: observational studies and clinical trials.
Clinical data drive the LHS, because results from randomized controlled trials are seen as “gold standard” for medical evidence. For this reason the concept of using data gathered directly from the patient care environment has enormous potential for accelerating the rate at which useful knowledge is generated.
All computer systems involved in clinical trials must undergo Computer System Validation (CSV). For this process, a legal framework for the TRANSFoRm project was developed. It was used for data privacy analysis of the data flow in two research use cases: an epidemiological cohort study on Diabetes and a randomised clinical trial about different GORD treatment regimes.
Computerized system validation is the documented process to produce evidence that a computerized system does exactly what it is designed to do in a consistent and reproducible manner. The validation of electronic source data in clinical trials presents many challenges because of the blurring of the border between care and research. Here we present our approach for the validation of eSource data capture and the developed documentation for the CSV of the complete data flow in the LHS developed by the TRANSFoRm project. An important part here was played by the GORD Valuation Study.
Personalized medicine tools for clinical trials - Kuchinke – Wolfgang Kuchinke
Tools for personalised medicine in clinical trials.
The implementation of clinical trials in personalized medicine is a different way of doing clinical research, compared to the standard way of large clinical trials aiming for statistical significance. Personalized medicine uses a medical model that separates people into different groups, with medical decisions, practices, drugs and interventions being tailored to the individual patient based on their predicted response. The basis for this approach is the progress in the study of the human genome and its variation over the last two decades, especially advancements in automated DNA sequencing, PCR technologies and the use of expressed sequence tags (ESTs), cDNAs, antisense molecules and small interfering RNAs (siRNAs).
But the adoption of personalized medicine requires an active, flexible and highly integrated infrastructure, which must allow the joining of many different competences and technologies. We asked the question: can the tools developed for personalized medicine in the p-medicine project be employed effectively in a clinical trials network to support personalised clinical trials? We conducted an analysis of tool integration and an evaluation of tool usage requirements. Based on the survey results, the tendency for the clinical trial network ECRIN is to use software as a service in the form of SaaS or ASP. ECRIN data centres will (probably) not install and employ p-medicine tools in one of their data centres. A robust business model for the provision of services and the implementation and employment of tools does not yet exist.
How can the personalized medicine infrastructure p-medicine and the clinical trials network ECRIN gain from each other to allow the conduct of personalized clinical trials? We suggest a business model, in which personalized medicine infrastructures and clinical trials networks exchange their services to gain jointly from each other. An integration of networks by reciprocal exchange of services may be the solution. Not only software as a service will be exchanged, but also knowledge, personnel and staff trainings.
Kuchinke - The Learning Health System (LHS) Introduction and meeting 1 – Wolfgang Kuchinke
We present the results of the conference "The Learning Health System (LHS) in Europe", held in Brussels. A comparison between the US and the European approach to implementing a LHS was made. The perspective from the US was given by Charles Friedman; the European point of view by Federico Paoli. C. Friedman also introduced the concept of learning by virtuous cycles.
An example of implementing the LHS in Europe was demonstrated in detail: the TRANSFoRm project, which aims to develop and demonstrate methods, models, standards and a digital infrastructure for three specific components of the LHS:
– Genotype-phenotype epidemiological studies
– Randomised Clinical Trials (RCTs) with both data and trial processes embedded within the functionality of EHRs
– Decision support for diagnosis, based on clinical prediction rules
Other topics presented were: Sustainability and Business Development of the LHS. Knowledge Management & Data Standards in the LHS. Data privacy and security in the LHS (EHR4CR solutions for privacy protection, HIPAA - US experiences with privacy protection, the TRANSFoRm Zone Model for data privacy protection, the IMI code of practice for the secondary use of health data). Knowledge Translation and Decision Support in the LHS (decision support for interventions in the LHS, TRANSFoRm decision support for diagnosis, the TRANSFoRm extensible model for diagnostic evidence, CareWell - a learning integrated care system, improving quality of care with routine data: the perspective of a statutory health insurer, the impact on the LHS).
CDISC and clinical trial standards in the LHS, pan-European platforms for the re-use of EHRs, TRANSFoRm and CDISC standards, provenance and GCP for real-world clinical trials, using health data for innovative trial designs, ...
Importance of data standards and system validation of software for clinical r... – Wolfgang Kuchinke
We present our evaluation of existing data standards for clinical trials. For this purpose, a survey about the importance of data standards for clinical trial centers and EDC software companies was conducted. Electronic data capture in clinical trials uses a computerized system designed for the collection of clinical data in electronic form in Case Report Forms (CRF). It also covers medical data captured during clinical trials, safety data related to clinical trials, and patient reported outcomes. The degree of implementation of standards, like CDISC ODM, in available EDC software products was evaluated. Failure to establish data standards will make it difficult or impossible to connect data between different systems for efficient clinical study execution. The next step after purchasing a software solution is computer system validation. Validation is about bringing computerized systems into regulatory compliance and making them compliant with GCP, GLP, GMP and other regulations (e.g. data protection). The basis standard for validation is provided by the GAMP Good Practice Guide, which provides a framework of best practices to ensure that computer systems are suitable for use and compliant with the legislation. The newest version uses a risk-based approach to computer system validation: a system is evaluated and assigned to a predefined category based on its intended use and complexity. For validation, one should define how all elements of the computer system are supposed to work (functional requirements), then develop corresponding scripts and test routines to validate that it is functioning as it should.
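The risk-based category assignment can be sketched roughly as follows; this is a simplified illustration of the GAMP 5 software categories, not the guide's actual decision procedure, which also covers infrastructure software (category 1) and a documented risk assessment:

```python
def gamp_category(custom_code: bool, configured: bool) -> int:
    """Simplified, illustrative assignment to GAMP 5 software categories."""
    if custom_code:
        return 5  # custom (bespoke) applications: full life-cycle validation
    if configured:
        return 4  # configured products: validate the configuration and use
    return 3      # non-configured products: supplier assessment, verify use

# An EDC system with study-specific CRF configuration but no custom code:
print(gamp_category(custom_code=False, configured=True))  # 4
```

The higher the category, the more extensive the validation effort: a category 5 system needs design-level documentation and testing, while a category 3 product mainly needs verification against its intended use.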
Legal and Ethical Issues of International Clinical Trials – Wolfgang Kuchinke
An analysis of the regulatory, ethical and international aspects of clinical trials is presented, covering all relevant regulatory and ethical requirements for the conduct of international clinical trials. Our analysis is extended by discussing certain legal and ethical issues that are of importance for personalised medicine, especially for the use of software tools dealing with sensitive patient data and supporting patients in their decision making. The following ethical issues connected to the use of personalised medicine tools are discussed:
Tools that have been completely developed and tested for use in a medical environment and are GCP compliant; tools that capture patient data, which have to be accurate, reliable and correct; compliant clinical data management systems (CDMS) that are based on computer system validation; the ethical environment for GCP compliance in data management (audits, subject identification codes, the importance of retention of sponsor-specific essential documents); security of the portal so that unauthorised persons do not have access; the issue of personal data used in clinical trials and "directly" or "indirectly" identifiable data; legal and ethical issues arising from the deep integration of tools for clinical trials in personalised medicine; and issues caused by software that falls under medical device law. An important aspect is to ensure the patient's autonomy during the use of integrated tools for Patient Empowerment.
Computer System Validation - The Validation Master Plan – Wolfgang Kuchinke
Computer System Validation (CSV) is the process used to ensure and document that a computer-based system is operating according to predefined requirements. CSV is necessary when replacing paper records, like Case Report Forms for clinical trials, with an electronic system within the highly regulated data zone that impacts public health and safety. Necessary validation documents are, for example, the Standard Operating Procedures (SOPs), which outline how the computer system should be used. Here, we describe in detail the System Validation Master Plan, the most important document in Computer System Validation. It contains topics like: Validation Policy, Definition of Validation, Rules and Regulations in CSV, Legal basis, FDA 21 CFR Part 11, FDA Guidance for Industry, ICH GCP Guideline, Annex 11 EU-GMP, Validation Philosophy, Organisation validation document, Audit Reports, Organisation guidelines, Organisation quality management handbook, etc.
The steps of the Validation Life Cycle are: 1. System Specification, 2. System Classification, 3. Validation Planning, 4. Establishing the validated state, 5. Maintaining the validated state, 6. System Retirement.
Legal Assessment Tool (LAT) - interactive help for data sharingWolfgang Kuchinke
We implemented an approach where relevant concepts derived from computer science were adapted to define legal requirements for data bridges. “Legal interoperability” is defined as an extension of the general interoperability concept for diverse systems. In this concept the data bridges between infrastructures become “interfaces”. The LAT presents a number of questions, which the user answers and LAT assesses the answers to show corresponding legal texts, data access rules and regulations.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
3. Purpose of the workshop
• The requirements engineering approach
employed successfully in the EHR4CR
process is shown in order to use it for new
projects
4. Requirements engineering
• Process of eliciting stakeholder needs and
desires and developing them into an
agreed-upon set of detailed requirements
• Serves as basis for all subsequent
development activities
• Makes the problem clear and complete and
ensures that the developed solution is
correct, reasonable, and effective
W. Kuchinke (2017)
5. Begin with requirements
acquisition
• In general, a project begins with the
requirements acquisition phase
• Ends with the specification of requirements
• Often projects begin with the analysis phase
• Sometimes some kind of requirements
specification exists, often in the form of functional
specifications
• Requirements specification may be used to
manage the consistency of the entire system
W. Kuchinke (2017)
6. Aim
• The Requirements Engineering process played a
major part in the EHR4CR project
• It was successfully conducted
• It applied Requirements Scenarios and an iterative
approach for Requirements Engineering and
writing the Software Requirements Specification
(SRS)
• Prototyping was used
• Therefore, it is very suitable to be used as a model
to improve Requirements Engineering in other
projects
7. Overview
• Learning from the Requirements Engineering
process in the EU project EHR4CR
• Ability to use the requirement gathering process
for other EU projects
• Topics
– Requirements Engineering Process
– Role of Requirements Scenarios in the process of
requirement gathering
– Introduction to software requirements specification
(SRS) document
W. Kuchinke (2017)
9. Project Objectives
• Promotion of re-use of EHRs to accelerate
regulated clinical trials, across Europe
• EHR4CR project produced
– Requirements specification for EHR systems to support
clinical research and for integrating information across
hospitals in different countries
– EHR4CR Technical Platform (consisting of tools and
services)
– Pilots for validating the solutions
– EHR4CR business model
• Requirements were generated for the use of EHR
for clinical trials
W. Kuchinke (2017)
10. Electronic Health Records for
Clinical Research
• Providing adaptable, reusable and
scalable solutions (tools and services) for
reusing data from EHR systems for clinical
research
• The EHR offers significant opportunities
for the advancement of medical research,
the improvement of healthcare, and the
enhancement of patient safety
W. Kuchinke (2017)
11. The EHR4CR Scenarios
• Protocol feasibility
• Patient identification and recruitment
• EHR-EDC integration
• Pharmaco-vigilance
• Scenarios act
– Across different therapeutic areas: oncology,
inflammatory diseases, neuroscience, diabetes,
cardiovascular diseases etc.
– Across different EU countries under different legal
frameworks
12. The Setting
• Central role: Study Manager
– Needs to monitor site recruitment performance
• Role: Investigator
– Needs to identify local patients that meet protocol
inclusion / exclusion criteria
– Patients can be recruited for clinical trial participation
• This may require the research physician to
reach out to local treating physicians for
candidates / referrals
13. Electronic Health Records for
Clinical Research
• EHR4CR project set out to find ways to allow researchers
running clinical trials to search medical records in hospitals
across Europe
• Discover potentially suitable patients for clinical trials
• Assessment of the number of potential trial patients from
the hospitals’ electronic records
• Guarantee privacy protection of sensitive data
– https://www.imi.europa.eu/projects-results/project-factsheets/ehr4cr
– Final Report: https://www.imi.europa.eu/sites/default/files/uploads/documents/projects/EHR4CR_summary_final_report.pdf
14. EHR4CR Technical Platform
• Feasibility, exploration, design and execution of clinical studies
• Long-term surveillance of patient populations
• Trial eligibility and recruitment criteria must be expressed in
ways that permit searching for suitable patients across
different EHR systems
• Access to multiple heterogeneous and distributed EHR
systems
• Integration with existing clinical trials infrastructures (e.g. EDC
systems for data collection)
• Improvement of data quality to enable routine clinical data to
contribute to clinical trials
W. Kuchinke (2017)
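The requirement above – expressing eligibility criteria so they can be evaluated against different EHR systems – can be illustrated with a minimal sketch. All names here (the `Patient` shape, the ICD-10 codes chosen) are illustrative assumptions, not the EHR4CR platform's actual data model:

```python
from dataclasses import dataclass

# Hypothetical, highly simplified patient record; real EHR systems use
# heterogeneous schemas and terminologies (e.g. ICD-10, SNOMED CT).
@dataclass
class Patient:
    age: int
    diagnoses: set

# Eligibility criteria expressed as predicates, so one protocol definition
# can be evaluated against differently shaped EHR exports.
inclusion = [
    lambda p: p.age >= 18,
    lambda p: "E11" in p.diagnoses,  # type 2 diabetes mellitus (ICD-10)
]
exclusion = [
    lambda p: "I50" in p.diagnoses,  # heart failure (ICD-10)
]

def is_eligible(p: Patient) -> bool:
    """A patient must meet all inclusion and no exclusion criteria."""
    return all(c(p) for c in inclusion) and not any(c(p) for c in exclusion)

def feasibility_count(patients) -> int:
    """Protocol feasibility: count potentially suitable patients per site."""
    return sum(is_eligible(p) for p in patients)

patients = [
    Patient(45, {"E11"}),         # eligible
    Patient(70, {"E11", "I50"}),  # excluded (heart failure)
    Patient(16, {"E11"}),         # too young
]
print(feasibility_count(patients))  # 1
```

Returning only aggregate counts, as in `feasibility_count`, is one way the privacy-protection requirement can be reconciled with cross-hospital searching.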
17. The four Steps of the
Requirements Process Model
• Requirements Elicitation – the art of
obtaining meaningful requirements
• Requirements Analysis – iterative
improvement of quality of requirements
• Writing the Requirements Specification
document (Software Requirement
Specification)
• Requirements Validation – this is also
done iteratively with several workshops
18. Requirements Engineering
Process Model
• Requirements Elicitation
– Identification of stakeholders & users
– Understanding user and stakeholder needs
– Surveys, interviews, …
• Requirements Analysis
– Conceptual modeling
– Classification, prioritization
• Requirements Specification
– Requirements Specification Document
• Requirements Validation
– Reviews / workshops
– Stakeholder issues
– Legal framework
– Developer / IT engineer issues
• Requirements Management accompanies all four steps
19. Not only requirements, but
quality requirements
• Aim is not only to gather requirements, but
quality requirements
• A quality requirement refers to a condition or
capability that must be present in a
requirement
• Represent what is needed to validate the
successful completion of a project deliverable
• Quality requirements provide the means for
validating the acceptability of the requirements
20. Requirements engineering is a
cyclic process
• Requirements gathering
• Analysis
• Implementation
• Software Testing
• Evaluation of requirements
• Improvement of software / creation of new
requirements
W. Kuchinke (2017)
22. Iterative process of
requirements engineering
• Develop a system through repeated cycles
• Start with only a subset of software requirements,
iterate until the full system is implemented
• In each iteration, design modifications are made
and new functional capabilities are added
• Topics to be considered
– Protocol Feasibility
– Patient Recruitment
– Trial Execution, Clinical Data Collection, Adverse Events
Reporting
W. Kuchinke (2017)
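The iterative cycle above can be sketched in a few lines. The topics and requirement names are illustrative assumptions; only the mechanism – implementing a subset per iteration until the full system is covered – reflects the process described:

```python
# Hedged sketch of the iterative process: each iteration takes one topic's
# subset of requirements, adds its functional capabilities, and the cycle
# repeats until the full platform is implemented.
backlog = {
    "protocol feasibility": ["R1: query patient counts", "R2: anonymise results"],
    "patient recruitment": ["R3: match eligibility criteria"],
    "trial execution": ["R4: export collected data to the EDC system"],
}

implemented = []
for iteration, (topic, requirements) in enumerate(backlog.items(), start=1):
    # design modifications and new functional capabilities per iteration
    implemented.extend(requirements)
    print(f"Iteration {iteration} ({topic}): "
          f"{len(implemented)} requirements implemented so far")
```

In the real process each iteration also feeds validation results back into the backlog, which is why the requirement set grows rather than being fixed up front.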
23. Tools for requirements
gathering
• Use Cases
• Current situation and workflow
• Context diagram
• Stakeholder interviews
• Concentration on the essence of the work /
software
• Use Case workshop
– Scenarios, rules, analysis and discussion
24. A novel scenario based
approach
• Starting with a subset of the software requirements
• Iteration by addition of requirements until the full platform is specified
• Each iteration step
– Design modifications are made
– New functional capabilities are added
– The domain scenario is used to estimate probable effects (situation analysis
and long-range planning)
• The domain scenario describes the entire domain
– It is broken down into high-level ‘Usage Scenarios’
• Usage Scenarios describe critical business interactions and their
anticipated operations
– They serve as context for the use cases and the generation of requirements
– They make sure requirements are complete
W. Kuchinke (2017)
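The scenario hierarchy above (domain scenario → usage scenarios → use cases → requirements) is essentially a traceability tree. A minimal sketch, with assumed names and nesting that are not taken from the EHR4CR deliverables, shows how completeness can be checked mechanically:

```python
# Assumed structure: the domain scenario is broken down into usage
# scenarios, which give context to use cases, which generate requirements.
domain_scenario = {
    "name": "Reuse of EHR data for clinical research",
    "usage_scenarios": [
        {"name": "Protocol feasibility",
         "use_cases": [
             {"name": "Count eligible patients",
              "requirements": ["The system shall return patient counts per site."]},
         ]},
        {"name": "Patient identification and recruitment",
         "use_cases": [
             {"name": "Identify local candidates",
              "requirements": ["The system shall list patients matching the criteria."]},
         ]},
    ],
}

def all_requirements(scenario):
    """Completeness check: every use case must yield at least one requirement."""
    reqs = []
    for usage in scenario["usage_scenarios"]:
        for use_case in usage["use_cases"]:
            assert use_case["requirements"], \
                f"use case without requirements: {use_case['name']}"
            reqs.extend(use_case["requirements"])
    return reqs

print(len(all_requirements(domain_scenario)))  # 2
```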
27. Findings of the stakeholder
interviews
• Requirements elaborated based on
interviews with pharma & academic
domain experts
• Challenges were identified for the
generation of queries and the handling of
temporal relations in EHR4CR
• Differences were identified between
pharma and academia
W. Kuchinke (2017)
31. Development of Use Cases
• Diagram: the problem environment is partitioned
into business processes (Process 1–4)
• Each process is described as a use case
• Business Use Cases are refined into Product Use Cases
• A use case connects an actor with the system
32. Writing the requirements
• Requirements Specification Document
– SRS
– Volere template
– Open issues, risks, costs, training
• Quality control of requirements
– Consistent terminology
– Completeness
– Meaningfulness
– Traceability of relevance to purpose
– Viable within constraints
W. Kuchinke (2017)
33. Writing good requirements
• Requirements should be unambiguous
• Requirements should be short
• Requirements must be feasible
• Requirements should be prioritized
• Requirements should be testable
• Requirements should be consistent
• Requirements use "shall"
• See: Appendix C. How to Write a Good Requirement –
https://www.nasa.gov/seh/appendix-c-how-to-write-a-good-requirement
W. Kuchinke (2017)
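Several of these rules are mechanically checkable. The following toy linter is an illustrative assumption – it was not part of the EHR4CR tooling – showing how "use shall", "keep it short", and "avoid ambiguity" can be enforced on draft requirement statements:

```python
import re

# Ambiguous terms that make a requirement untestable (illustrative list).
AMBIGUOUS = {"fast", "user-friendly", "appropriate", "flexible", "etc"}

def check_requirement(text: str) -> list:
    """Return a list of rule violations for one requirement statement."""
    issues = []
    words = re.findall(r"[a-zA-Z-]+", text.lower())
    if "shall" not in words:
        issues.append('does not use "shall"')
    if len(words) > 30:
        issues.append("too long (aim for one short sentence)")
    for word in words:
        if word in AMBIGUOUS:
            issues.append(f"ambiguous term: {word!r}")
    return issues

print(check_requirement("The platform shall respond within 2 seconds."))  # []
print(check_requirement("The UI should be fast and user-friendly."))
```

Prioritization, feasibility, and consistency still need human review; a linter only catches the surface-level defects.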
34. Prototyping
• Use simulations
– Help to find requirements
– Validation of requirements
• Prototyping of requirements for
– User Interface
– Design and build of software
– Testing UI in real user environment
W. Kuchinke (2017)
35. Resources
• https://www.volere.org/templates/volere-requirements-specification-template/
• Book: Mastering the Requirements Process by
Suzanne Robertson and James Robertson
• Contents
– Project Drivers, Constraints, Functional Requirements,
Non-functional Requirements, Performance
Requirements, Operational and Environmental
Requirements, Maintainability and Support
Requirements, Security Requirements, Legal
Requirements, Project Issues, Open Issues, Risks,
Costs, User Documentation and Training
36. The final step is the
development of the complete
SRS document
37. Purpose of the SRS in EHR
project
• Description of the expected functionalities
of the EHR4CR platform
• Focus is on the envisaged functionality
provided by EHR4CR to identify individual
subjects that match a set of pre-defined
criteria
• Support of further follow-up and possible
enrolment of the subject in clinical studies
W. Kuchinke (2017)
38. Development of the SRS with
involvement of Scenarios
W. Kuchinke (2017)
1. Begin with Domain Scenarios
2. Development of Usage Scenarios
3. Software Requirements Specification
Document
– Is the basis for building the software
– Begins with the Capabilities Description Document,
a high-level description of the envisaged system
that is extensively discussed by all stakeholders
– Contains also mockups, workflows and use cases
39. SRS of EHR4CR - Overview of
Content
• Tools and methods used for the specification of the
EHR4CR platform
• Actors, brief description and associated responsibilities of
actors / roles involved
• Use cases specify the envisaged usage of the EHR4CR
system in terms of a conceptual model
• Functional requirements, which document and specify
required functionalities of the envisaged system
• Non-functional requirements
• Data Requirements
• Appendix: GUI mock-up
W. Kuchinke (2017)
40. Change management for
requirements
• Several rounds of change management were
employed
• Extensive change management during
writing the SRS
• Extensive discussions and at least two
iterations
• This possibility for correction and
improvement ensured that the requirements
had a high quality
41. Change management during
writing the SRS
Problems, Defects,
Innovation, Tuning
identified
Change Request
• Report to advisory
board for
Requirements
Management
Analysis &
Prioritisation
• Analysis and prioritisation
of change requests
• Estimation of efforts
Planning
• Planning for improvement
• Change advisory board
judgement
• Next steps
Release, Version
• Documentation of Changes
• Refusal causes delay and new iteration step
• Acceptance means new version of SRS
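The flow on this slide – change request, analysis and prioritisation, planning, advisory-board judgement, then release or a new iteration – can be condensed into a small sketch. Function and stage names are assumptions for illustration:

```python
# Hedged sketch of SRS change management: an accepted change request
# yields a new SRS version; a refused one causes delay and a new
# iteration step, as described on the slide.
def process_change_request(accepted: bool, srs_version: int) -> tuple:
    """Return (resulting SRS version, outcome) for one change request."""
    # Preceding stages (not modelled in detail): report to the advisory
    # board, analysis & prioritisation with effort estimation, planning.
    if accepted:
        return srs_version + 1, "new SRS version released, changes documented"
    return srs_version, "refused: delay and new iteration step"

print(process_change_request(True, 3))
print(process_change_request(False, 3))
```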
42. Validation of Requirements
• A validation workshop is well suited for discussing
the requirements
• 2 review iterations were conducted
• A document was written containing all remarks,
questions and comments on the requirements
provided by reviewers, together with the responses
from the requirements engineering team
• This document makes it easier to generate a high-
quality Requirements Specification Document
(SRS)
43. Enhancement to Requirements
Engineering
• Several enhancements were introduced
into the requirements engineering process
– Use of GUI Mock-ups to envisage workflows and main
use cases
– Prototyping of most important requirements
– Requirements Workshops with many different Working
Groups
– Inclusion of a working group for legal/ethical
requirements
– Inclusion of many Domain Experts (Patient Identification &
Recruitment) from hospitals and pharma industry
– Involvement with the Validation of Requirements
W. Kuchinke (2017)
45. New projects to apply the
requirement engineering process
• CORBEL project
– Stakeholder Needs and Requirements Document
– Sharing and re-use of individual participant data from clinical
trials
• BioMedBridges project
– Legal requirements specification
– Building data bridges between biological and medical
infrastructures in Europe
– Legal and privacy requirements during data sharing to guarantee
legal interoperability
• p-medicine project
– Generation of legal and ethical requirement clusters
W. Kuchinke (2017)
47. Example: Requirements
engineering for LAT
For the
BioMedBridges
project
requirements
engineering was
used to generate
legal requirements
for data sharing
W. Kuchinke (2017)
48. Contact
Wolfgang Kuchinke
UDUS, Duesseldorf, Germany
This presentation contains additional explanatory material for
Q&A and workshop
wolfgang.kuchinke@uni-duesseldorf.de
wokuchinke@outlook.de
Presentation motive from freepik.com