Data quality is context-dependent. That is, the quality of data cannot
be assessed without contextual knowledge about the production or the use of
data. As expected, context-based data quality assessment requires a formal model
of context. Accordingly, we propose a model of context that addresses quality
concerns that are related to the production and use of data.
Here we follow and extend a context model for the assessment of the quality
of a database instance that was proposed in a previous work [1]. In that framework,
the context takes the form of a possibly virtual database or data integration
system into which a database instance under quality assessment is mapped, for
additional analysis and processing, enabling quality data extraction. In this work,
we extend contexts with dimensions and, by doing so, make multidimensional
data quality assessment possible. Multidimensional contexts are represented
as ontologies written in Datalog±. We use this language to represent dimensional
constraints and dimensional rules, and to perform query answering
based on dimensional navigation, which becomes an important auxiliary activity
in the assessment of data. We illustrate the ideas and mechanisms by means of examples.
RuleML2015: Ontology-Based Multidimensional Contexts with Applications to Qua... (RuleML)
Data quality assessment and data cleaning are context-dependent
activities. Starting from this observation, in previous work
a context model for the assessment of the quality of a database was
proposed. A context takes the form of a possibly virtual database or
a data integration system into which the database under assessment is
mapped, for additional analysis, processing, and quality data extraction.
In this work, we extend contexts with dimensions, and by doing so, multidimensional
data quality assessment becomes possible. At the core of
multidimensional contexts we find ontologies written as Datalog± programs
with provably good properties in terms of query answering. We
use this language to represent dimension hierarchies, dimensional constraints,
and dimensional rules, and to specify quality data. Query answering
relies on and triggers dimensional navigation, and becomes an important
tool for the extraction of quality data.
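The abstract above describes query answering that navigates a dimension hierarchy to decide which data counts as quality data. The following is a minimal sketch of that idea in Python; the hierarchy (ward, unit, hospital), the predicate names, and the sample records are all invented for illustration and are not taken from the paper.

```python
# Parent links encode a dimension hierarchy: member -> parent member.
ROLLUP = {
    "ward-W1": "unit-standard",
    "ward-W2": "unit-intensive",
    "unit-standard": "hospital-H1",
    "unit-intensive": "hospital-H1",
}

def ancestors(member):
    """Navigate upward through the dimension, yielding every ancestor."""
    while member in ROLLUP:
        member = ROLLUP[member]
        yield member

# A toy "dimensional rule": a temperature reading is quality data only
# if the ward where it was taken rolls up to an intensive-care unit.
def is_quality(measurement):
    return "unit-intensive" in ancestors(measurement["ward"])

readings = [
    {"patient": "p1", "ward": "ward-W1", "temp": 37.1},
    {"patient": "p2", "ward": "ward-W2", "temp": 38.4},
]
quality_data = [r for r in readings if is_quality(r)]
print([r["patient"] for r in quality_data])  # only p2's reading qualifies
```

In the paper this navigation is expressed declaratively as Datalog± rules over an ontology rather than procedurally as above; the sketch only shows why rolling up through the hierarchy is the key auxiliary step in answering a quality query.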
Applying ‘best fit’ frameworks to systematic review data extraction (Andrea Miller-Nesbitt)
Presented at the 7th International Conference on Qualitative and Quantitative Methods in Libraries, Paris, France. Application of the ‘best fit’ framework synthesis methodology to systematic review data extraction.
SDTM (Study Data Tabulation Model) defines a standard for organizing and formatting data to streamline the collection, management, analysis, and reporting of human clinical trial data tabulations and of non-clinical study data tabulations that are to be submitted as part of a product application (IND or NDA) to a regulatory authority such as the United States Food and Drug Administration (FDA) or Japan's PMDA.
SDTM (Study Data Tabulation Model) defines a standard structure for human clinical trial (study) data tabulations and for nonclinical study data tabulations that are to be submitted as part of a product application to a regulatory authority such as the United States Food and Drug Administration (FDA).
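To make the "standard structure" concrete, here is a toy sketch of what an SDTM-style tabulation looks like: one row per subject in a Demographics (DM) domain. The variable names follow the common SDTM convention (STUDYID, DOMAIN, USUBJID, ...), but the study identifier and values are invented for illustration.

```python
# A minimal DM-domain tabulation: a list of uniform records.
dm_domain = [
    {"STUDYID": "STUDY01", "DOMAIN": "DM", "USUBJID": "STUDY01-001",
     "AGE": 34, "SEX": "F"},
    {"STUDYID": "STUDY01", "DOMAIN": "DM", "USUBJID": "STUDY01-002",
     "AGE": 52, "SEX": "M"},
]

# Every record in a domain carries the same two-letter domain code;
# that uniform structure is what makes the submission machine-checkable.
assert all(row["DOMAIN"] == "DM" for row in dm_domain)
print(sorted(row["USUBJID"] for row in dm_domain))
```

A real submission would hold such tabulations as SAS transport files with many more standardized variables; the sketch only shows the one-row-per-observation, fixed-variable-name shape that SDTM mandates.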
Tribhuvan University
M.A. Population Studies
Research Methods for Population Analysis
Data Processing, Editing and Coding
If you find any mistakes, please suggest how I can improve it.
Thank you.
I hope it is useful for all :)
· On the basis of what you learned in the readings, define the t... (LesleyWhitesidefv)
· On the basis of what you learned in the readings, define the terms "sample" and "population" and describe some of the advantages and disadvantages of using a sample compared to a population. Support your reasoning with examples.
· A researcher is studying the effects of caffeine on exam scores of college students. In this study:
· What would be the population and sample of this research? Extrapolate your views on the advantages and disadvantages of samples and populations to this example.
· Why would the researcher want to use a sample or a population in this study?
· When responding to your classmates' posts, comment on the examples that they provided to illustrate the advantages and disadvantages of a sample compared to a population. What advantages and disadvantages can you add?
· Do you agree with your classmates regarding what should be used in this study—a sample or a population? Comment on the reasons they provided and explain why you agree or disagree with them.
CLASSMATE RESPONSE:
The critical piece here is that the sample must represent the population.
If my population is high school students in Florida and my sample comes from the local high school, does the sample represent the same kids as those who attend high school in Miami on the other side of the state? Probably not. If I sample only those in the 9th grade, does this represent "high school students" in Florida, or even in the same high school? No, because 9th graders may not be like 12th graders.
It is a natural tendency to think of the population as "everyone," but it may be a single school, a single grade or program at a school, a single business, a community, a state, people who represent a specific identity within a specific community, etc.
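The trade-off the discussion asks about can be shown numerically: a modest random sample estimates a population statistic closely, at a fraction of the measurement cost. The exam-score data below are synthetic and the numbers are purely illustrative.

```python
import random

random.seed(42)
# Synthetic "population": 100,000 exam scores centered near 75.
population = [random.gauss(75, 10) for _ in range(100_000)]
# A simple random sample of 500 students.
sample = random.sample(population, 500)

pop_mean = sum(population) / len(population)
sample_mean = sum(sample) / len(sample)

# The sample mean lands close to the population mean. Note that a
# biased sample (say, only 9th graders) would not; representativeness,
# not size alone, is what makes the estimate trustworthy.
print(abs(pop_mean - sample_mean) < 2.0)
```

This is the advantage of sampling in one line: 500 measurements instead of 100,000, with an estimate typically within a fraction of a point of the true mean.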
Week 7: Data Analytics
Student’s name
Instructor
Course
Date
Do you recommend that the data analyst examine aggregate data, detailed data, or both, to investigate this quality issue? Please explain your rationale.
As a data analyst, I believe that in this situation, when the goal is to improve quality, the analyst should examine both aggregate and detailed data. Aggregate data provides the "big picture" (Campbell, 2018): it shows whether a problem exists and how widespread it is. Detailed data analysis would then reveal where and why procedures failed; it is considerably more informative to examine the individual transactional records than to collapse them into broad demographic categories (Campbell, 2018).
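The aggregate-plus-detail workflow argued for above can be sketched in a few lines: the aggregate view flags which unit has a problem, and the detailed records show exactly which transactions failed. The records, unit names, and field names below are invented for illustration.

```python
from collections import Counter

records = [
    {"unit": "ER",  "order_id": 1, "error": False},
    {"unit": "ER",  "order_id": 2, "error": True},
    {"unit": "ICU", "order_id": 3, "error": False},
    {"unit": "ER",  "order_id": 4, "error": True},
]

# Aggregate view: error rate per unit (the "big picture").
totals, errors = Counter(), Counter()
for r in records:
    totals[r["unit"]] += 1
    errors[r["unit"]] += r["error"]
rates = {u: errors[u] / totals[u] for u in totals}

# Detail view: drill into the unit the aggregate flagged.
worst = max(rates, key=rates.get)
detail = [r["order_id"] for r in records if r["unit"] == worst and r["error"]]
print(worst, detail)  # ER [2, 4]
```

Neither view alone suffices: the aggregate alone says "ER has a problem" without saying where, and the detail alone is thousands of rows with no indication of which unit to examine first.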
Do you recommend that the data analyst use a retrospective data warehouse, clinical data store, or both, to investigate the mortality rate? Please explain your rationale.
According to Campbell (2018), medical trial data collection is currently a time-consuming, error-prone, and sometimes incomplete process due to the complexity of the data. To increase data quality and minimize data collecting times, new and more reliable procedures are required if info ...
A discriminative-feature-space-for-detecting-and-recognizing-pathologies-of-t... (Damian R. Mingle, MBA)
Each year it has become more and more difficult for healthcare providers to determine if a patient has a pathology
related to the vertebral column. There is great potential to become more efficient and effective in terms of the quality
of care provided to patients through the use of automated systems. However, in many cases automated systems
can allow for misclassification and force providers to review more cases than necessary. In this study, we
analyzed methods to increase the true positives and lower the false positives while comparing them against state-of-the-art
techniques in the biomedical community. We found that by applying the studied techniques of a data-driven
model, the benefits to healthcare providers are significant and align with the methodologies and techniques utilized
in the current research community.
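The trade-off this abstract describes (raising true positives while lowering false positives) is conventionally tracked with a confusion matrix. The sketch below uses generic made-up labels, not the study's actual data or method.

```python
y_true = [1, 1, 1, 0, 0, 0, 0, 1]  # 1 = pathology actually present
y_pred = [1, 1, 0, 0, 1, 0, 0, 1]  # classifier output

# Tally the confusion-matrix cells relevant to the trade-off.
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

precision = tp / (tp + fp)  # high precision = fewer needless provider reviews
recall = tp / (tp + fn)     # high recall = fewer missed pathologies
print(tp, fp, round(precision, 2), round(recall, 2))  # 3 1 0.75 0.75
```

Lowering false positives raises precision, which directly addresses the abstract's concern that automated systems "force providers to review more cases than necessary."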
Course Project Part 3—Translating Evidence Into PracticeIn Part.docx (buffydtesurina)
Course Project: Part 3—Translating Evidence Into Practice
In Part 3 of the Course Project, you consider how the evidence you gathered during Part 2 can be translated into nursing practice.
Now that you have located available research on your PICOT question, you will examine what the research indicates about nursing practices. Connecting research evidence and findings to actual decisions and tasks that nurses complete in their daily practice is essentially what evidence-based practice is all about. This final component of the Course Project asks you to translate the evidence and data from your literature review into authentic practices that can be adopted to improve health care outcomes. In addition, you will also consider possible methods and strategies for disseminating evidence-based practices to your colleagues and to the broader health care field.
To prepare:
Consider Parts 1 and 2 of your Course Project. How does the research address your PICOT question?
PLEASE REFER TO FILES ATTACHED BELOW
With your PICOT question in mind, identify at least one nursing practice that is supported by the evidence in two or more of the articles from your literature review. Consider what the evidence indicates about how this practice contributes to better outcomes.
Explore possible consequences of failing to adopt the evidence-based practice that you identified.
Consider how you would disseminate information about this evidence-based practice throughout your organization or practice setting. How would you communicate the importance of the practice?
To complete:
In a 3- to 4-page paper:
1) Restate your PICOT question and its significance to nursing practice.
My PICOT question is:
Does hand washing and appropriate staff dressing among surgical ward nurses reduce cross-infection during patient management?
2) Summarize the findings from the articles you selected for your literature review. Describe at least one nursing practice that is supported by the evidence in the articles. Justify your response with specific references to at least 2 of the articles. Please refer to the articles below:
Aiken, A. M., Karuri, D. M., Wanyoro, A. K., & Macleod, J. (2012). Interventional studies for preventing surgical site infections in sub-Saharan Africa. International Journal of Surgery, 242-249. doi:10.1016/j.ijsu.2012.04.004
Al-Khawaldeh, O., Al-Hussami, M., & Darawad, M. (2015). Influence of nursing students' handwashing knowledge, beliefs, and attitudes on their handwashing compliance. Scientific Research Publishing. doi:http://dx.doi.org.ezp.waldenulibrary.org/10.4236/health.2015.75068
Bukhari, S., Hussain, W., Banjar, A., Almaimani, W., Karima, T., & Fatani, M. (2011). Hand hygiene compliance rate among healthcare professionals. PubMed - NCBI. Ncbi.nlm.nih.gov. Retrieved 1 April 2016, from http://www.ncbi.nlm.nih.gov/pubmed/21556474
3) Explain how the evidence-based practice that you identified contributes to better outcomes. In addi.
#35321 Topic Discussion Validity in Quantitative Research Design.docx (AASTHA76)
#35321 Topic: Discussion: Validity in Quantitative Research Designs
Number of Pages: 1 (Double Spaced)
Number of sources: 3
Writing Style: APA
Type of document: Essay
Academic Level: Master
Category: Nursing
VIP Support: N/A
Language Style: English (U.S.)
Order Instructions: ATTACHED
Validity in research refers to the extent researchers can be confident that the cause and effect they identify in their research are in fact causal relationships. If there is low validity in a study, it usually means that the research design is flawed and the results will be of little or no value. Four different aspects of validity should be considered when reviewing a research design: statistical conclusion validity, internal validity, construct validity, and external validity. In this Discussion, you consider the importance of each of these aspects in judging the validity of quantitative research.
To prepare:
Review the information in Chapter 10 of the course text on rigor and validity.
Read the method section of one of the following quasi-experimental studies (also located in this week's Learning Resources). Identify at least one potential concern that could be raised about the study's internal validity.
Metheny, N. A., Davis-Jackson, J., & Stewart, B. J. (2010). Effectiveness of an aspiration risk-reduction protocol. Nursing Research, 59(1), 18–25.
Padula, C. A., Hughes, C., & Baumhover, L. (2009). Impact of a nurse-driven mobility protocol on functional decline in hospitalized older adults. Journal of Nursing Care Quality, 24(4), 325–331.
Yuan, S., Chou, M., Hwu, L., Chang, Y., Hsu, W., & Kuo, H. (2009). An intervention program to promote health-related physical fitness in nurses. Journal of Clinical Nursing, 18(10), 1404–1411.
Consider strategies that could be used to strengthen the study's internal validity and how this would impact the three other types of validity.
Think about the consequences of an advanced practice nurse neglecting to consider the validity of a research study when reviewing the research for potential use in developing an evidence-based practice.
By Day 3
Post the title of the study that you selected and your analysis of the potential concerns that could be raised about the study's internal validity. Propose recommendations to strengthen the internal validity and assess the effect your changes could have with regard to the other three types of validity. Discuss the dangers of failing to consider the validity of a research study.
Required Readings
Polit, D. F., & Beck, C. T. (2017). Nursing research: Generating and assessing evidence for nursing practice (10th ed.). Philadelphia, PA: Wolters Kluwer.
Chapter 10, "Rigor and Validity in Quantitative Research"
This chapter introduces the concept of validity in research and describes the different types of validity that must be addressed. Key threats to validity are also explored.
Chapter 11, "Specific Types of Quantitative Research"
This chapter focuses on th.
Designing and launching the Clinical Reference Library (Kerstin Forsberg)
Presentation for the European Clinical Data Forum conference, 24 May 2011, describing the business problems and drivers behind the design of an ISO 11179-based metadata registry for clinical data, and introducing the features of the CRL application.
Ontology-Driven Clinical Intelligence: Removing Data Barriers for Cross-Disci... (Remedy Informatics)
The presentation describes how Remedy Informatics is advocating and innovating "flexible standardization" through an ontology-driven approach to clinical research. You will see in greater detail how a foundational, standardized Mosaic Ontology can be extended for more specific research applications and even more specific and focused disease research.
Aiken, A. M., Karuri, D. M., Wanyoro, A. K., & Macleod, J. (2012). Interventional studies for preventing surgical site infections in sub-Saharan Africa.
International Journal of Surgery
, 242-249. Doi: 10.1016/
j
.ijsu.2012.04.004
Al-Khawaldeh, O., Al-Hussami, M., & Darawad, M. (2015).
Influence of Nursing Students Handwashing Knowledge, Beliefs, and Attitudes on Their Handwashing Compliance
.
Scientific Research Publishing
. Doi: http://dx.doi.org.ezp.waldenulibrary.org/10.4236/health.2015.75068
Bukhari, S., Hussain, W., Banjar, A., Almaimani, W., Karima, T., & Fatani, M. (2011).
Hand hygiene compliance rate among healthcare professionals.
PubMed - NCBI
.
Ncbi.nlm.nih.gov
. Retrieved 1 April 2016, from http://www.ncbi.nlm.nih.gov/pubmed/21556474
3) Explain how the evidence-based practice that you identified contributes to better outcomes. In addition, identify potential negative outcom.
Doctoral Consortium @ RuleML 2015 - Multidimensional Ontologies for Contextual Quality Data Specification and Extraction
1. Multidimensional Ontologies for Contextual Quality Data Specification and Extraction
Mostafa Milani
Supervisor: Prof. Leopoldo Bertossi
School of Computer Science, Carleton University, Ottawa, Canada
(Carleton University) Ontology-Based Multidimensional Contexts 1 / 15
2. Problem Statement: Introduction
Multidimensional Contexts and Data Quality
The Measurements table contains the temperatures of patients at a hospital:
Measurements
Time          Patient     Value
Sep/5-12:10   Tom Waits   38.2
Sep/6-11:50   Tom Waits   37.1
Sep/7-12:15   Tom Waits   37.7
Sep/9-12:00   Tom Waits   37.0
Sep/6-11:05   Lou Reed    37.5
Sep/5-12:05   Lou Reed    38.0
A doctor expects the table to contain:
"The body temperatures of Tom Waits for September 5, taken around noon with a thermometer of brand B1."
But Measurements does not contain the information needed to make this assessment.
6. Problem Statement: Introduction
Multidimensional Contexts and Data Quality
An external context can provide that information, making it possible to assess the given data.
Context is modeled as a relational database (Bertossi et al., BIRTE 2010).
The database under assessment is mapped into the contextual database for further data quality analysis and cleaning.
Context is commonly of a multidimensional nature.
The dimensional aspects of context are not considered in (Bertossi et al., BIRTE 2010).
11. Multidimensional Context: Extended HM Data Model
Extending Context with Multidimensional Data
We can see the context as an ontology, containing:
An MD data model/instance:
  PatientWard: a table containing the locations of patients
  Hospital dimension: represents the hierarchy of locations
Information such as a hospital guideline:
  "Temperature measurements for patients in a standard care unit have to be taken with thermometers of brand B1."
Basis data model: the HM model (Hurtado and Mendelzon, 2005)
We extend the HM model (Maleki et al., AMW 2012)
18. Multidimensional Context: Extended HM Data Model
Extending Context with Multidimensional Data
Informally, some of the new ingredients in MD contexts:
Dimensions, as in the HM model
Categorical relations: generalize fact tables; not necessarily numerical values; linked to different levels of dimensions; possibly incomplete
Dimensional rules: generate data where it is missing
Dimensional constraints: constraints on (combinations of) categorical relations, involving values from dimension categories
Dimensional rules and constraints can support and restrict upward/downward navigation
24. Multidimensional Context: Extended HM Data Model
Extending Context with Multidimensional Data
Example
Ward and Unit: categories of the Hospital dimension
UnitWard(unit, ward): a parent/child relation
PatientUnit
id  Unit       Day    Patient
1   Standard   Sep/5  Tom Waits
2   Standard   Sep/6  Tom Waits
3   Intensive  Sep/7  Tom Waits
4   Intensive  Sep/6  Lou Reed
5   Standard   Sep/5  Lou Reed
PatientWard
id  Ward  Day    Patient
1   W1    Sep/5  Tom Waits
2   W1    Sep/6  Tom Waits
3   W3    Sep/7  Tom Waits
4   W3    Sep/6  Lou Reed
5   W2    Sep/5  Lou Reed
[Figure: the Hospital dimension, with categories AllHospital > Institution > Unit > Ward and members allHospital; H1, H2; Standard, Intensive, Terminal; W1, W2, W3, W4 - and the Time dimension, with categories AllTime > Year > Month > Day > Time]
PatientWard: a categorical relation, with categorical attributes Ward and Day taking values from dimension categories
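As a concrete data-structure sketch, the example's parent/child relation and categorical relation can be written down directly as Python sets of tuples. This is illustrative only, not the authors' implementation; the edge assigning W4 to the Terminal unit is an assumption, since the figure does not pair every ward with its unit.

```python
# Illustrative encoding of the example (assumption: not from the paper).
# UnitWard(unit, ward): parent/child edges between the Unit and Ward
# categories of the Hospital dimension. W4 -> Terminal is assumed.
unit_ward = {
    ("Standard", "W1"),
    ("Standard", "W2"),
    ("Intensive", "W3"),
    ("Terminal", "W4"),
}

# Categorical relation PatientWard: the categorical attributes Ward and Day
# take values from dimension categories; Patient is non-categorical.
patient_ward = {
    (1, "W1", "Sep/5", "Tom Waits"),
    (2, "W1", "Sep/6", "Tom Waits"),
    (3, "W3", "Sep/7", "Tom Waits"),
    (4, "W3", "Sep/6", "Lou Reed"),
    (5, "W2", "Sep/5", "Lou Reed"),
}

def unit_of(ward):
    """Navigate one level up the Hospital dimension: Ward -> Unit."""
    for unit, w in unit_ward:
        if w == ward:
            return unit
    return None
```

With this encoding, upward navigation is just a lookup in the parent/child relation, e.g. `unit_of("W1")` yields `"Standard"`.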
28. Multidimensional Context: Extended HM Data Model
Dimensional Constraints
Example
Categorical relations are subject to dimensional constraints:
A referential constraint restricting units in PatientUnit to elements of the Unit category, as a negative constraint:
⊥ ← PatientUnit(u, d; p), ¬Unit(u)
"All thermometers used in a unit are of the same type":
t = t′ ← Thermometer(w, t; n), Thermometer(w′, t′; n′), UnitWard(u, w), UnitWard(u, w′)   (an EGD)
"No patient in the intensive care unit in August 2005":
⊥ ← PatientWard(w, d; p), UnitWard(Intensive, w), MonthDay(August/2005, d)
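Both kinds of constraints above can be checked mechanically over an instance. The following is a minimal Python sketch (an assumption, not from the paper); the Thermometer facts, thermometer types, and nurse names are hypothetical, chosen only to give the EGD something to inspect.

```python
# Illustrative constraint checking (assumption: not the authors' code).
unit_category = {"Standard", "Intensive", "Terminal"}   # members of category Unit

patient_unit = {
    ("Standard", "Sep/5", "Tom Waits"),
    ("Standard", "Sep/6", "Tom Waits"),
    ("Intensive", "Sep/7", "Tom Waits"),
    ("Intensive", "Sep/6", "Lou Reed"),
    ("Standard", "Sep/5", "Lou Reed"),
}

def referential_violations(pu, units):
    """Negative constraint: bottom <- PatientUnit(u, d; p), not Unit(u).
    Returns the tuples whose unit value is not in the Unit category."""
    return [t for t in pu if t[0] not in units]

# Hypothetical Thermometer(ward, type; nurse) facts plus UnitWard edges,
# to illustrate the EGD "all thermometers used in a unit are of the same type".
unit_ward = {("Standard", "W1"), ("Standard", "W2"), ("Intensive", "W3")}
thermometer = {("W1", "Oral", "Ann"), ("W2", "Oral", "Joe"), ("W3", "Tympanal", "Sue")}

def egd_violations(therm, uw):
    """EGD: t = t' <- Thermometer(w, t; n), Thermometer(w', t'; n'),
    UnitWard(u, w), UnitWard(u, w'); report ward pairs where t != t'."""
    bad = []
    for (w, t, _) in therm:
        for (w2, t2, _) in therm:
            if t != t2 and any((u, w) in uw and (u, w2) in uw for (u, _x) in uw):
                bad.append((w, w2))
    return bad
```

On this instance both checks come back empty: every unit value is a member of the Unit category, and the two wards of the Standard unit use the same (hypothetical) thermometer type.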
35. Multidimensional Context: Extended HM Data Model
Dimensional Rules
Example
Data in PatientWard generate data about patients for the higher-level categorical relation PatientUnit:
PatientUnit(u, d; p) ← PatientWard(w, d; p), UnitWard(u, w)
Since the relation schemas "match", no ∃-variable is needed in the head.
The rule is used to navigate upward from PatientWard.Ward to PatientUnit.Unit via UnitWard.
Once at the level of Unit, it is possible to take advantage of a guideline, in the form of a rule, stating that:
"Temperatures of patients in a standard care unit are taken with oral thermometers."
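Operationally, the upward-navigation rule above is a join of PatientWard with the parent/child relation UnitWard. A minimal sketch, assuming relations encoded as sets of tuples (illustrative, not the authors' implementation):

```python
# Applying the dimensional rule
#   PatientUnit(u, d; p) <- PatientWard(w, d; p), UnitWard(u, w)
# as a join: upward navigation from Ward to Unit.
unit_ward = {("Standard", "W1"), ("Standard", "W2"),
             ("Intensive", "W3"), ("Terminal", "W4")}  # W4 -> Terminal assumed
patient_ward = {
    ("W1", "Sep/5", "Tom Waits"),
    ("W1", "Sep/6", "Tom Waits"),
    ("W3", "Sep/7", "Tom Waits"),
    ("W3", "Sep/6", "Lou Reed"),
    ("W2", "Sep/5", "Lou Reed"),
}

def apply_dimensional_rule(pw, uw):
    """Derive PatientUnit tuples; no existential variable is needed
    because the schemas match (every head attribute is bound in the body)."""
    return {(u, d, p) for (w, d, p) in pw for (u, w2) in uw if w2 == w}

patient_unit = apply_dimensional_rule(patient_ward, unit_ward)
```

The derived set matches the PatientUnit table shown earlier: each ward tuple is lifted to its unit, e.g. (W1, Sep/5, Tom Waits) becomes (Standard, Sep/5, Tom Waits).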
41. Multidimensional Context: Ontological Representation of the Extended MD Model
Datalog± as Representation Language
We use Datalog± as our representation language (Calì et al., 2009):
An extension of Datalog for ontology building, with efficient access to underlying data sources
A family of languages with different syntactic restrictions on rules to guarantee decidability
The chase (which forward-propagates data through rules) may not terminate
Our MD contexts have the general forms of dimensional rules and constraints captured by Datalog± TGDs, EGDs, and negative constraints
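To make the chase concrete, here is a deliberately naive forward-chaining loop. This is an illustrative assumption, not the algorithm used in the work: with existential-free rules like the dimensional rule in this deck it reaches a fixpoint, but in general Datalog± needs syntactic restrictions on rules precisely because such a loop may not terminate once existential variables introduce fresh values.

```python
# A naive chase sketch (assumption, for exposition only): repeatedly apply
# every rule to the current set of facts until nothing new is derived.
def naive_chase(facts, rules, max_steps=1000):
    """Each rule is a function mapping a set of facts to derived facts.
    Returns the fixpoint, or raises if the step bound is exceeded."""
    facts = set(facts)
    for _ in range(max_steps):
        new = set()
        for rule in rules:
            new |= rule(facts) - facts
        if not new:            # fixpoint reached: the chase terminates
            return facts
        facts |= new
    raise RuntimeError("chase did not terminate within the step bound")

# Example rule: UnitWard edges propagate PatientWard facts up to PatientUnit.
def up_rule(facts):
    uw = {t[1:] for t in facts if t[0] == "UnitWard"}
    pw = {t[1:] for t in facts if t[0] == "PatientWard"}
    return {("PatientUnit", u, d, p)
            for (w, d, p) in pw for (u, w2) in uw if w2 == w}

base = {("UnitWard", "Standard", "W1"),
        ("PatientWard", "W1", "Sep/5", "Tom Waits")}
closure = naive_chase(base, [up_rule])
```

Here the loop stops after one productive round; the syntactic classes of Datalog± (sticky, weakly-sticky, etc.) exist to keep query answering decidable even when no such fixpoint is reached.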
46. Multidimensional Context: Ontological Representation of the Extended MD Model
Properties of MD Ontologies and Query Answering
Our Datalog± MD ontologies become weakly-sticky Datalog± programs (Calì et al., 2012)
It is crucial that repeated variables in TGDs are for categorical attributes (which can take only a finite number of values)
Weak-stickiness guarantees tractability of conjunctive query answering (QA): only an initial portion of the chase has to be inspected
A non-deterministic algorithm, WeaklySticky-QAns, for weakly-sticky Datalog± (Calì et al., 2012)
We proposed a deterministic version of the algorithm for weakly-sticky programs and studied optimization techniques (Milani and Bertossi, AMW 2015)
51. Multidimensional Context: MD Context for Quality Data Assessment
MD Contexts and Quality Query Answering: The Gist
The MD ontology M becomes part of the context for data quality assessment.
The original instance D of schema S is to be assessed or cleaned through the context, by mapping D into the contextual schema/instance C.
Example
A dimensional rule in M:
PatientUnit(u, t; p) ← PatientWard(w, d; p), DayTime(d, t), UnitWard(u, w)
A quality predicate:
TakenWithTherm(t, p, b) ← PatientUnit(u, t; p), u = Standard, b = B1
59. Multidimensional Context: MD Context for Quality Data Assessment
MD Contexts and Quality Query Answering: The Gist
Example
Quality version Measurements^q:
Measurements^q(t, p, v) ← Measurements′(t, p, v), TakenWithTherm(t, p, b), b = B1, y = certified
A doctor asks for the body temperatures of Tom Waits for September 5, taken around noon:
Q(t, v): Measurements(t, Tom Waits, v) ∧ Sep/5-11:45 ≤ t ≤ Sep/5-12:15
The doctor expects that the measurements were taken with a thermometer of brand B1.
64. Multidimensional Context: MD Context for Quality Data Assessment
MD Contexts and Quality Query Answering: The Gist
Example
Replacing the predicates of S in Q with their quality versions in S^q:
Q^q(t, v): Measurements^q(t, Tom Waits, v) ∧ Sep/5-11:45 ≤ t ≤ Sep/5-12:15
Applying the definition of the quality versions:
Q^C(t, v): Measurements′(t, p, v) ∧ TakenWithTherm(t, p, B1) ∧ p = Tom Waits ∧ Sep/5-11:45 ≤ t ≤ Sep/5-12:15
Unfolding the definition of the quality predicates in P:
Q^M(t, v): Measurements′(t, p, v) ∧ PatientUnit(u, t; p) ∧ u = Standard ∧ p = Tom Waits ∧ Sep/5-11:45 ≤ t ≤ Sep/5-12:15
70. Multidimensional Context: MD Context for Quality Data Assessment
MD Contexts and Quality Query Answering: The Gist
Example
Measurements′ has the same extension as Measurements.
PatientUnit is computed by QA on M.
The first, second, and last measurements have the expected quality:
Measurements
Time          Patient     Value
Sep/5-12:10   Tom Waits   38.2
Sep/6-11:50   Tom Waits   37.1
Sep/7-12:15   Tom Waits   37.7
Sep/9-12:00   Tom Waits   37.0
Sep/6-11:05   Lou Reed    37.5
Sep/5-12:05   Lou Reed    38.0
75. Multidimensional Context MD Context for Quality Data Assessment
MD Contexts and Quality Query Answering: The Gist
Example
Measurements has the same extension of Measurements
PatientUnit is computed by QA on M
The first second and last
measurements have the
expected quality
The first measurement is a
clean answer to Q:
t = Sep/5-12:10 and v=38.2
Measurements
Time Patient Value
Sep/5-12:10 Tom Waits 38.2
Sep/6-11:50 Tom Waits 37.1
Sep/7-12:15 Tom Waits 37.7
Sep/9-12:00 Tom Waits 37.0
Sep/6-11:05 Lou Reed 37.5
Sep/5-12:05 Lou Reed 38.0
(Carleton University) Ontology-Based Multidimensional Contexts 14 / 15
76. Multidimensional Context MD Context for Quality Data Assessment
MD Contexts and Quality Query Answering: The Gist
Example
Measurements in the context has the same extension as the original Measurements
PatientUnit is computed by query answering (QA) on M
The first, second, and last measurements have the expected quality
The first measurement is a clean answer to Q: t = Sep/5-12:10 and v = 38.2
Measurements
Time         Patient    Value
Sep/5-12:10  Tom Waits  38.2
Sep/6-11:50  Tom Waits  37.1
Sep/7-12:15  Tom Waits  37.7
Sep/9-12:00  Tom Waits  37.0
Sep/6-11:05  Lou Reed   37.5
Sep/5-12:05  Lou Reed   38.0
(Carleton University) Ontology-Based Multidimensional Contexts 14 / 15
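Evaluating the unfolded query Q^M over the Measurements table above can be sketched in a few lines. The PatientUnit facts below are an assumption for illustration: they encode the slide's statement that only the first, second, and last measurements were taken in the Standard unit, whereas in the framework they are produced by query answering on M.

```python
from datetime import datetime

# Measurements relation from the example: (Time, Patient, Value).
measurements = [
    ("Sep/5-12:10", "Tom Waits", 38.2),
    ("Sep/6-11:50", "Tom Waits", 37.1),
    ("Sep/7-12:15", "Tom Waits", 37.7),
    ("Sep/9-12:00", "Tom Waits", 37.0),
    ("Sep/6-11:05", "Lou Reed", 37.5),
    ("Sep/5-12:05", "Lou Reed", 38.0),
]

# Hypothetical PatientUnit facts (u, t, p), as would be obtained by QA on M:
# only the first, second, and last measurements come from the Standard unit.
patient_unit = {
    ("Standard", "Sep/5-12:10", "Tom Waits"),
    ("Standard", "Sep/6-11:50", "Tom Waits"),
    ("Standard", "Sep/5-12:05", "Lou Reed"),
}

def parse(t):
    # "Sep/5-12:10" -> comparable datetime (the year is irrelevant here).
    return datetime.strptime(t, "%b/%d-%H:%M")

def quality_answers(patient, t_from, t_to):
    """Evaluate Q^M: join Measurements with PatientUnit, keeping only
    tuples taken in the Standard unit within the time window."""
    lo, hi = parse(t_from), parse(t_to)
    return [(t, v) for (t, p, v) in measurements
            if p == patient
            and lo <= parse(t) <= hi
            and ("Standard", t, p) in patient_unit]

print(quality_answers("Tom Waits", "Sep/5-11:45", "Sep/5-12:15"))
# -> [('Sep/5-12:10', 38.2)]
```

As on the slide, the only clean answer to Q is t = Sep/5-12:10 with v = 38.2: the Sep/5-12:05 tuple is for Lou Reed, and Tom Waits's other measurements fall outside the time window.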
83. Conclusions
Conclusions
Multidimensional contexts are represented as Datalog± ontologies
They allow us to specify data quality conditions, and to retrieve quality data
Development and implementation of the query-answering algorithms are ongoing work
Several extensions:
Uncertain downward navigation in dimensional rules
Checking dimensional constraints not only on the result of the chase, but also during data generation
Relaxing the assumption of complete categorical data, and studying its effect on dimensions
(Carleton University) Ontology-Based Multidimensional Contexts 15 / 15