This presentation was provided by Bert Carelli of TrendMD, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
This presentation was provided by Stephanie Roth of Temple University, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
CIRPA 2016: It's Show Time: Are Your Data Ready to be the "Next Big Thing"? – Stephen Childs
Stephen Childs gave a presentation on institutional data and analysis. He discussed how institutional research offices can support external research projects by providing data and a research assistant. For projects to be successful, there needs to be clear agreements between the institution and researchers, with commitments of resources from both sides. The research assistant plays a key role in project success but their contributions are often unseen. Communication between institutional analysts and researchers is important to address technical challenges and ensure consistency of the data provided.
There are four main types of data for research: primary, secondary, qualitative, and quantitative. Primary data involves collecting information directly through surveys or questionnaires, but requires conducting the research yourself. Secondary data analyzes previously collected data, which is quicker and often free to access, but the information may not be tailored to your needs or accurate. Qualitative data describes characteristics through flexible interviews with small samples, allowing for depth, but the findings cannot be generalized to larger audiences. Quantitative data involves statistically valid surveys that generalize results to populations, but is limited in its ability to probe answers and is subject to response biases.
Developing a Workplace Health and Safety Action Plan with NVivo – QSR International
See how data was gathered from multiple sources, including consultation sessions, focus groups and a survey. See how the thematic analysis was conducted, including how NVivo features such as auto-coding, word frequency queries, and matrix coding queries were used to inform the analysis.
Christina engaging the biomedical researchers – djmichael156
- A survey was conducted of 416 biomedical researchers to understand their information behaviors and needs. The majority were research scientists.
- The survey found that researchers primarily store documents on personal computers and share via email. They use reference managers like EndNote but few collaboration tools.
- Researchers expressed needs for better ways to find funding, collaborate online, access tools and data, and disseminate research outputs. Next steps include further engagement activities to develop services addressing these needs.
Taming the regulatory tiger with JWG and Smartlogic – Ann Kelly
From CEOs to board members to operational managers, regulatory compliance is an ongoing concern. In a rapidly changing marketplace where complex regulations come from multiple regulatory bodies, the consequences of non-compliance can be costly to the enterprise in time, money and damage to their reputation.
JWG, a London think tank, has created RegDelta – a state-of-the-art regulatory change management platform – that allows individual stakeholders to quickly understand the impact of regulations and maintain a single source of truth for their regulatory obligations.
Hear Elliot Burgess, Head of Product and Client Services at JWG and Paul Gunstone, Sales Director at Smartlogic discuss the challenges organizations face identifying and complying with relevant regulations, JWG’s approach to taming the regulatory tiger with semantics and see a demo of the JWG RegDelta platform.
Electronic platforms like SurveyMonkey, Qualtrics, and QuestionPro can be useful tools for assessment, baseline studies, and simple information gathering, but have limitations for complex longitudinal studies, monitoring, and evaluation. While offering friendly interfaces for survey design and data collection, they provide only basic statistical analysis capabilities and require exporting data to other tools like SPSS or Excel for more robust analysis. These platforms reduce costs compared to printed surveys and make accessing internet-connected samples easier, but they are not full monitoring and evaluation systems and may not be suitable for following indicators over time in complex, high-impact studies. Researchers should carefully consider a platform's functions and licensing options to determine what level of data collection, analysis, and reporting it can adequately support for a given study.
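The export-and-analyze workflow described above can be sketched with the standard library: read a CSV exported from a survey platform and compute simple descriptive statistics before moving to a heavier tool. The file contents and column names here are invented for illustration.

```python
# Hedged sketch: analyze a survey export with only the standard library.
# A real export would be read from disk, e.g. open("survey_export.csv").
import csv
import io
import statistics

# Stand-in for an exported file with one Likert-scale question.
exported = io.StringIO("respondent,satisfaction\nr1,4\nr2,5\nr3,3\nr4,4\n")

scores = [int(row["satisfaction"]) for row in csv.DictReader(exported)]
print("n =", len(scores))                          # n = 4
print("mean =", statistics.mean(scores))           # mean = 4
print("stdev =", round(statistics.stdev(scores), 2))
```

For anything beyond such summaries (regression, longitudinal comparisons), the data would typically be handed off to SPSS, Excel, or a statistics library, as the summary notes.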
Advanced Keyword Research SMX Toronto March 2013 – BrightEdge
The document discusses several concepts for ranking search results, including:
- Documents containing more query terms are considered more relevant. Longer documents are discounted.
- Hilltop introduced the concept of "authority" to determine relevance based on the number of unaffiliated pages linking to a page on the same subject.
- Google artificially inflates Wikipedia results because it views Wikipedia as an authoritative resource. However, Wikipedia is not infallible.
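The first ranking idea above (more query-term matches raise relevance, longer documents are discounted) can be sketched as a toy scoring function. The function name and the logarithmic discount are illustrative assumptions, not any search engine's actual formula.

```python
# Toy relevance scoring: count query-term matches, discount by length.
# The weighting scheme is invented for illustration only.
import math

def relevance_score(query_terms, doc_terms):
    """Score a document by query-term matches, discounted by its length."""
    matches = sum(1 for term in query_terms if term in doc_terms)
    # Longer documents are discounted so sheer length doesn't win.
    length_discount = 1.0 / math.log(len(doc_terms) + 2)
    return matches * length_discount

query = ["data", "science"]
short_doc = ["data", "science", "survey"]
long_doc = ["data"] + ["filler"] * 97 + ["science", "survey"]

# Both documents contain both query terms, but the short one scores higher.
print(relevance_score(query, short_doc) > relevance_score(query, long_doc))  # True
```

The Hilltop idea in the next bullet would add a separate signal on top of this: counting unaffiliated pages on the same subject that link to the document.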
Closing the Knowledge Gap Between Evaluators and Stakeholders – CesToronto
This document summarizes a presentation on closing the knowledge gap between evaluators and stakeholders. It discusses useful evaluator attitudes, aptitudes and skills, as well as evaluation methodologies that integrate opportunities for learning. Specifically, it presents the concentric circles methodology and snowball methodology, highlighting how each approach allows evaluators to gradually build knowledge and refine their evaluation. It also examines how computer-assisted qualitative data analysis software (CAQDAS) can enhance evaluator learning and evaluation quality, but notes its underutilization in the evaluation field. The presentation demonstrates the capabilities of Atlas.ti software.
Using NVivo in Healthcare - Steps for a More Effective Strategy – QSR International
Illustrating the potential uses of NVivo in healthcare settings and going beyond academic research. See how the effective use of NVivo can improve the quality and efficiency of multiple analysis activities in healthcare settings.
Access Lab 2020: Context aware unified institutional knowledge services: an open architecture for digital libraries to offer a seamless user journey to content
Alvet Miranda, senior manager for South/West Asia, Oceania and Africa, EBSCO
Founder's Story: AiCure by Adam Hanina, CEO & Co-Founder – ams345
Cornell Health Tech Conference, held on March 4, 2016 at Cooper Union (https://healthconference2016.splashthat.com/), included Founder's Story: Adam Hanina, CEO & Co-Founder of AiCure. Here is the video that accompanies this slide deck:
https://youtu.be/Cm3MtgsKbrM
Presenter: Adam Hanina-CEO & Co-Founder, AiCure
This document discusses the key steps and considerations for conducting a survey:
1) Determine the purpose and objectives of the survey and what questions need to be asked.
2) Decide who will carry out the different roles for implementing the survey such as supervisors, interviewers, and data entry staff.
3) Plan the logistics of carrying out field work such as sampling approach, survey team structure, materials, and costs.
Holistic Monitoring and Evaluation Data Driven and Gender Sensitive Mixed Met... – QSR International
Holistic M&E uses a gender-sensitive, data-driven approach with mixed methodologies to improve accountability, partnerships, and program learning. Key outcomes include increased transparency, sustainable results, and capacity building. NVivo software helps code and analyze qualitative data, while STATA can import codes and run complementary quantitative analysis. The document provides tips on using these tools at different stages of the M&E process.
This document discusses foresight studies and their role in strategic planning. It provides an overview of the key phases and aspects of conducting foresight studies, including:
1) Defining the scope and question to be addressed, potential solutions, and governance strategy.
2) Analyzing trends and future alternatives to understand how systems may evolve and identify disruptive solutions.
3) Prioritizing solutions and converging on a collective vision of the future, then developing an implementation strategy and monitoring plan.
4) The document also outlines tools that can be used to support foresight studies, such as knowledge management platforms, social network analysis, and technology assessment.
Embedding ORCID across researcher career paths – ORCID, Inc
Northumbria University is exploring broader implementation of ORCID identifiers across researcher career paths and the research cycle through a partnership. This will involve encouraging graduate students to get an ORCID ID during enrollment and induction, capturing IDs for the annual HESA return, and integrating ORCID into the professional development and student records systems. Next steps involve promoting ORCID in research training and using ORCID membership features to streamline sign-up and identifier capture.
ImpactSense Research Bootcamp - Maximising Your Research Practices for 2020 a... – Veronica Massoud
We hosted a Bootcamp on real-world research practices and applicable strategies to implement into your business. It's a unique opportunity to explore decades of gathered insight from CX professionals - looking at everything from quant and qual techniques to using machine learning to filter through data.
You can speak to us on this topic at:
hello@impactsense.com
+44 207 164 6489
www.impactsense.com
Predictive analytics uses data mining, statistics, modeling, machine learning and artificial intelligence to analyze current and historical facts to make predictions about future or otherwise unknown events. This presentation provides an overview of predictive analytics, including its business applications such as customer retention, risk management and operational optimization. Common predictive analytics methods and tools are also discussed.
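The customer-retention application mentioned above can be illustrated with a minimal, self-contained sketch: score churn risk from historical features with a logistic model. The features, coefficients, and threshold are invented for illustration; in practice they would be fitted from data with a library such as scikit-learn.

```python
# Hedged sketch of predictive analytics for customer retention.
# Coefficients are assumed, not fitted: (intercept, months_inactive, support_tickets).
import math

COEF = (-2.0, 0.8, 0.5)

def churn_probability(months_inactive, support_tickets):
    """Map customer features to a churn probability via a logistic link."""
    z = COEF[0] + COEF[1] * months_inactive + COEF[2] * support_tickets
    return 1.0 / (1.0 + math.exp(-z))  # squashes z into (0, 1)

customers = {"alice": (0, 1), "bob": (4, 3)}
for name, features in customers.items():
    p = churn_probability(*features)
    flag = "at risk" if p > 0.5 else "ok"
    print(f"{name}: churn p={p:.2f} ({flag})")
```

The same shape applies to the other applications listed (risk management, operational optimization): historical facts become features, and a fitted model turns them into a prediction about a future event.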
This document provides a rubric to evaluate student performance on essays across four levels: beginning to develop, progressing toward the standard, meets the standard, and exceeds the standard. The rubric includes criteria in three categories: application of skills (30%), processing skills (30%), and following instructions (10%). Each criterion lists descriptions of the type of performance expected at each level. A fourth category, reflection (20%), focuses on a student's ability to assess their own performance and recommend improvements. The rubric provides a detailed framework to assess and differentiate student work at various performance levels.
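Combining the rubric's weighted categories into a single score can be sketched as below. The weights are those stated in the summary; the per-category level scores (1-4 for the four performance levels) and the example student marks are illustrative assumptions.

```python
# Minimal sketch: weighted rubric scoring.
# Weights as stated in the summary (note they sum to 0.90, so we normalize).
WEIGHTS = {
    "application_of_skills": 0.30,
    "processing_skills": 0.30,
    "following_instructions": 0.10,
    "reflection": 0.20,
}

def weighted_rubric_score(levels):
    """Combine per-category levels (1-4) into a weighted average on the 1-4 scale."""
    total_weight = sum(WEIGHTS.values())
    return sum(WEIGHTS[c] * levels[c] for c in WEIGHTS) / total_weight

# Hypothetical student: 1=beginning, 2=progressing, 3=meets, 4=exceeds.
marks = {
    "application_of_skills": 3,
    "processing_skills": 4,
    "following_instructions": 3,
    "reflection": 2,
}
print(round(weighted_rubric_score(marks), 2))  # 3.11
```

Normalizing by the weight total keeps the result on the same 1-4 scale as the individual criteria, which makes the combined score directly comparable to the rubric's level descriptions.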
The document outlines the distributed science value proposition, which includes better science through improved reproducibility, cheaper research through increased return on investment, and faster medical breakthroughs by reducing administrative delays. It notes current issues like a lack of reproducibility in 20% of U.S. health research and the high costs of non-replicable studies. Blockchain and related technologies could help address these problems by enabling greater transparency, standardization, and data sharing to improve research quality while reducing costs and speeding up the research process.
Blockchain for Health Research - HHS PCOR Manion – Sean Manion PhD
Blockchain for Health Research presentation by Sean Manion on 16 Dec 2019 for the U.S. Dept of Health and Human Services Asst Secretary for Programs & Evaluation, Patient Centered Outcomes Research Trust Fund Webinar
Data Science in Biomedicine - Where Are We Headed? – Philip Bourne
The document discusses the future of data science in biomedicine. It notes that we are currently at a point of deception, where digitization is occurring but disruption has not yet fully happened. It outlines implications such as open collaborative science becoming more important, and data/analytics increasing in scholarly value. Initiatives like the Precision Medicine Initiative and Big Data to Knowledge are aiming to improve research efficiency and enable precision medicine through large datasets and new methodologies. The future will require cooperation across funders and changes to training to address new skills needed.
Blockchain in Health Research Overview - Manion – Sean Manion PhD
Blockchain in Health Research 2019 was the 2nd annual summit hosted at Georgetown University on 27 Apr 2019 by Sean Manion, Science Distributed and Gilles Hilary, Georgetown University.
Blockchain for a TBI Research Network - Manion – Sean Manion PhD
Blockchain in Health Research 2019 was the 2nd annual summit hosted at Georgetown University on 27 Apr 2019 by Sean Manion, Science Distributed and Gilles Hilary, Georgetown University.
The document summarizes NIH's approach to data science and the ADDS mission. It discusses establishing a data ecosystem through community, policy, and infrastructure. The goals are to foster sustainability, efficiency, collaboration, reproducibility, and accessibility. NIH plans to seed the ecosystem through existing resources and funding. Example initiatives include establishing a data commons, standards, and training programs to develop a diverse data science workforce. The overall aim is to support a "digital enterprise" that enhances biomedical research and health outcomes.
John Koch presents on Merck's creation of an agile and scalable data science platform to improve information findability and accessibility for research and development. The platform addresses the challenges of thousands of people, information types, and repositories and hundreds of teams and decisions. It combines internal and external data using NoSQL tools on a flexible platform. This enables faster, integrated analysis across data sets compared to previous fragmented systems. The first analytics built on the platform were a scientific search application and a pharmacology knowledgebase. The flexible platform and integrated multi-disciplinary teams allow adding more data sources and analytics over time to further improve decision making.
This document summarizes a panel discussion on ResearchGate, a social media network for researchers. The panelists answered 7 questions about using ResearchGate and how it compares to other research networks. Key points addressed include: ResearchGate's business model and use of user data; copyright and open access issues; who uses ResearchGate; reliability of answers on the site; differences between ResearchGate and networks like Academia.edu and LinkedIn; issues with ResearchGate metrics and scores; and how to integrate ResearchGate with profiles on other sites like ORCID. The takeaway message is to focus engagement on networks where your target audience is active and to use ORCID to maintain a complete publication profile across sites.
A PhD research proposal should be written in such a way that it makes a positive and powerful first impression of your potential to become a good researcher and allows the university to assess whether you are a good match for the mentors or supervisors and their areas of research expertise.
Check out the scope for future research proposal topics in big data 2023 - https://rb.gy/6yoy0
Slide presentation from Day Two of the PCORnet Partners meeting. The January 21-22, 2014 meeting took place at the Brookings Institution. This event launched the development of the nation's most ambitious and promising clinical research network aimed at delivering high quality care through patient-centered outcomes research.
Gather evidence to demonstrate the impact of your research – IUPUI
This workshop is the 3rd in a series of 4 titled "Maximize your impact" offered by the IUPUI University Library Center for Digital Scholarship. Faculty must provide strong evidence of impact in order to achieve promotion and tenure. Having strong evidence in year 5 is made easier by strategic dissemination early in your tenure track. In this hands-on workshop, we will introduce key sources of evidence to support your case, demonstrate strategies for gathering this evidence, and provide a variety of examples. These sources include citation metrics, article level metrics, and altmetrics as indicators of impact to support your narrative of excellence.
Pandemic Tech: How Emerging Technologies Can Mitigate the Impact of Covid-19 ... – Tory Cenaj
This document summarizes how emerging technologies like blockchain can help address challenges posed by the COVID-19 pandemic and future pandemics. It discusses how blockchain can enable self-sovereign identity and health records, improve supply chain management, sustain economies, create medical professional registries, and incentivize responsible behavior. The document also provides an overview of Sean Manion's presentation on how blockchain can accelerate medical research by improving data sharing, tracking of methods/analyses, peer review processes, and publishing models.
This document discusses a SWOT analysis of data science at the National Institutes of Health (NIH). It outlines opportunities and threats, as well as strengths and weaknesses. Key points include that data science will become more central to biomedical research but the transition is taking longer than needed. Opportunities include open science and patient-centered healthcare, while threats include insufficient resources and issues with data access and privacy. Strengths include large datasets and diverse data types available, as well as collaboration, but weaknesses include gender/race inequality and sustainability challenges. The Big Data to Knowledge (BD2K) program aims to address these through initiatives like the NIH Commons and policy recommendations.
Emerging Participatory Research Methods for Digital Health Communications – Alex Bornkessel
This presentation was given as a part of the Health Communication Capacity Collaborative's (HC3) Health Communication Innovation Webinar Series. Many of the research methods highlighted however can also apply beyond the realm of just health communication and into other areas and across a diverse set of population groups.
Recording is located at: http://www.healthcommcapacity.org/blog/hc3-innovation-webinar-2-research-methodologies
A Big Picture in Research Data Management – Carole Goble
A personal view of the big picture in Research Data Management, given at GFBio - de.NBI Summer School 2018 Riding the Data Life Cycle! Braunschweig Integrated Centre of Systems Biology (BRICS), 03 - 07 September 2018
This document outlines the key steps and considerations for determining a research design, including identifying a research problem, assessing available information, developing a theoretical framework, and writing a research proposal. The main steps are to identify the research problem, determine the purpose of the research, develop a theoretical framework, define the research question/hypothesis, identify any limitations or delimitations of the study, and decide on an appropriate methodology. Good research requires a clear statement of objectives, an appropriate methodology, unbiased conduct, sufficient resources, and adherence to ethical standards.
The document discusses research design and measurement. It defines key concepts in research design such as the different types of scales used in measurement (nominal, ordinal, interval, ratio), sources of measurement error, and criteria for evaluating measurement tools. It also outlines the different descriptors of research design including the degree of question crystallization, data collection methods, time dimensions, research environment, and purpose of studies.
This document discusses the NIH's efforts to create a sustainable ecosystem for biomedical data sharing and analysis. It outlines Philip Bourne's perspective as Associate Director for Data Science at NIH. Key points include:
1) NIH spends an estimated $1 billion per year on data but has little understanding of how much it should spend.
2) ADDS aims to foster a digital biomedical research enterprise through developing infrastructure, policy, training and business models.
3) Example initiatives include establishing a "Commons" for shared data storage and analysis, harmonizing clinical data standards, and creating sustainable business models for data sharing.
4) The goals are to improve efficiency, collaboration, reproducibility and discovery through better management and sharing of data.
Qualitative research relies on textual data rather than numbers, focusing on accurate information. Quantitative research depends on objective numeric data suitable for statistics. It is important for media researchers to collect both qualitative and quantitative data to provide depth, cover a wide range of needed data effectively, and make results more reliable and understandable for audiences. Collecting both surveyed and textual information creates a more effective research project than only one type, as each provides different valuable insights.
Similar to Blockchain, Science Publishing, and Replicability (20)
Validation of Clinical Artificial Intelligence: Where We Are and Where We Are... (Sean Manion PhD)
This is the deck from a presentation I gave to the Pittsburgh Industrial Statisticians Association (PISA) for their PISA23 event in a session on Artificial Intelligence and Machine Learning.
The deck itself is not intended to be stand alone without the accompanying verbal presentation, however many of the slides contain key elements with references, and my contact information is available at the end if anyone has questions.
How much is that data in the window: Healthcare data valuation (Sean Manion PhD)
Presentation on healthcare data valuation, data confidence fabrics, layers of trust in healthcare, and health data marketplaces as part of the Health Data Valuation event, Session 10 of the IEEE Healthcare: Blockchain & AI Virtual Series on 25 August 2021
Overview of Library & Systematic Review (LASYR) Infrastructure for Blockchain and Emerging Technologies project at IEEE Healthcare: Blockchain & AI event - 07 April 2021
"Your Health App may be Illegal" IEEE 3 Feb 2021, Manion (Sean Manion PhD)
This document discusses some of the key ethical issues related to the use of artificial intelligence and blockchain in healthcare. It outlines principles of ethics like autonomy, beneficence, non-maleficence, and justice. It also examines specific ethical issues for AI like consent, data privacy, bias and fairness, transparency, and safety. For blockchain, it looks at issues like job loss, wealth creation, and potential to facilitate crime or be overhyped. The document advocates that regulatory frameworks may need to be developed to provide oversight of AI systems, such as through institutional review boards, to help address ethical challenges.
Researchers and data safety monitoring boards currently provide oversight of research data and evidence. However, future projects aim to utilize blockchain and other technologies to establish more transparent, verifiable, and crowd-sourced methods of ensuring data integrity, conducting peer review of datasets and evidence, and developing clinical practice guidelines. These include initiatives from ConsenSys Health, Intel, Dell, Microsoft, and others to create decentralized data marketplaces and fabrics for verifying research artifacts.
Nicole Tay, The Blockchain Future: Society and the Self (Sean Manion PhD)
Blockchain in Health Research Summit 2019 Georgetown University 27 Apr hosted by Gilles Hilary, Georgetown University and Sean Manion, Science Distributed
Design Thinking: Blockchain for Research - El Seed (Sean Manion PhD)
Blockchain in Health Research 2019 was the 2nd annual summit hosted at Georgetown University on 27 Apr 2019 by Sean Manion, Science Distributed and Gilles Hilary, Georgetown University.
Blockchain and Patient-Centered Outcomes Measures - Goldwater (Sean Manion PhD)
Blockchain has the potential to transform how patient-reported outcome measures (PROMs) are developed and used. By decentralizing clinical data collection and giving patients control over their personal health information, blockchain addresses current challenges in PROMs around representation, participation, and data integration. Quantified data streams from smartphones and other devices could provide real-time, patient-centered insights to develop more relevant PROMs and measure treatment effectiveness. A blockchain-based system is proposed where patients use apps to collect health data, which builds an immutable record of progress that is validated by patients and providers and can be used to refine PROMs over time through feedback.
The document summarizes key ideas from Carl Jung, Martin Heidegger, and Jane Bennett regarding technology and its impact on society and the self. Carl Jung saw technology leading to self-destruction if not balanced by consciousness. Martin Heidegger viewed modern technology as destructive but believed humans could influence their relationship to it through questioning and creativity. Jane Bennett analyzed things as empowered "actants" within complex systems, rejecting the notion that humans are the sole agents of change. The short story "Byzantine Empathy" explores these themes through an activist using virtual reality to promote empathy and humanitarian funding.
Distributed Ledger Tech Applications - Health Report V1-12 (Sean Manion PhD)
This document provides an overview of distributed ledger technology applications in healthcare. It discusses using blockchain to improve value and outcomes in health research by more efficiently allocating research funds and facilitating data sharing between researchers. It proposes a system called Value Based Health Research that would standardize and analyze research administration data using blockchain to speed up the research process and better link research funding to health outcomes. The document also provides a top 10 list of blockchain events in healthcare in 2018.
Distributed Ledger Tech Applications - Health Report V1.6 (Sean Manion PhD)
This newsletter provides updates on applications of blockchain and distributed ledger technology in healthcare. It discusses several healthcare organizations working on blockchain projects related to credentialing and genetic data. Upcoming events are also highlighted, including a webinar on blockchain compliance and cybersecurity from Indiana University Health and Sentara Healthcare, and a blockchain bootcamp at the Node Digital Medicine Conference in December.
Distributed Ledger Tech Applications - Health Report V1.5 (Sean Manion PhD)
This document provides a summary of recent developments in applying distributed ledger technology (DLT) like blockchain to healthcare. It discusses several articles about using blockchain for medical record sharing, clinical trials, and scientific research. Upcoming events are also mentioned, including conferences on applying blockchain in healthcare and a "Blockchain Bootcamp" being held on the topic.
Distributed Ledger Tech Applications - Health Report V1.4 (Sean Manion PhD)
The document is a newsletter about applications of distributed ledger technology in healthcare called DLTA-H. It discusses Siemens investing $681 million in a blockchain study center in Berlin and growing career opportunities in blockchain healthcare. Upcoming events relating to blockchain in healthcare are also listed, including conferences in Nashville, Washington D.C., London, and Glasgow in November 2018.
Distributed Ledger Tech Applications - Health Report V1.3 (Sean Manion PhD)
This document provides a summary of recent news and upcoming events related to applications of distributed ledger technology in healthcare. Key highlights include Pierre Fabre launching a blockchain patient engagement pilot, the CDC wanting to use blockchain to identify responders during crises faster, and HHS planning to launch a blockchain acquisition platform by Thanksgiving that is expected to provide an 800% return on investment. Upcoming events focus on blockchain in healthcare are also listed.
Distributed Ledger Tech Applications - Health Report V1.2 (Sean Manion PhD)
The document is a newsletter about distributed ledger technology applications in health. It provides summaries of recent blockchain and healthcare news stories, including Blackberry announcing healthcare applications on its Spark platform, Dubai using blockchain for licensing health staff, and a survey finding most hospitals are learning about blockchain but over half may pilot it in the next two years. Upcoming blockchain and healthcare conferences are also listed.
Final Issue. Blockchain Healthcare Situation Report (BC/HC SITREP) Volume 2 Issue 26, 25 Jun - 01 Jul 2018. A weekly newsletter curating news and events relating to blockchain and healthcare by Sean Manion, CEO of Science Distributed.
United Nations, Blockchain for Impact Edition. Blockchain Healthcare Situation Report (BC/HC SITREP) Volume 2 Issue 22, 28 May - 04 Jun 2018. A weekly newsletter curating news and events relating to blockchain and healthcare by Sean Manion, CEO of Science Distributed.
Blockchain Healthcare Situation Report (BC/HC SITREP) Volume 2 Issue 20, 14 - 20 May 2018. A weekly newsletter curating news and events relating to blockchain and healthcare by Sean Manion, CEO of Science Distributed.
This PowerPoint presentation covers the major details of the micronucleus test, its significance, and the assays used to conduct it. The test detects micronucleus formation inside the cells of nearly every multicellular organism; micronuclei form during chromosomal separation at anaphase.
The technology uses reclaimed CO₂ as the dyeing medium in a closed-loop process. When pressurized and heated above its critical point, CO₂ becomes supercritical (SC-CO₂). In this state CO₂ has very high solvent power, allowing the dye to dissolve easily.
Phenomics-assisted breeding in crop improvement (IshaGoswami9)
As the global population grows toward roughly 9 billion by 2050, and with climate change compounding resource shortages, it is becoming difficult to meet the food requirements of such a large population. Crop yield and quality need to be improved in a sustainable way over the coming decades, and genetic improvement by breeding is the best way to increase crop productivity. With the rapid progress of functional genomics, an increasing number of crop genomes have been sequenced and dozens of genes influencing key agronomic traits have been identified. However, current genome sequence information has not been adequately exploited for understanding complex, multi-gene traits, owing to a lack of crop phenotypic data. Efficient, automatic, and accurate technologies and platforms that can capture phenotypic data linkable to genomics information across all growth stages have become as important as genotyping; high-throughput phenotyping has thus become the major bottleneck restricting crop breeding. Plant phenomics has been defined as the high-throughput, accurate acquisition and analysis of multi-dimensional phenotypes during crop growing stages at the cell, tissue, organ, individual plant, plot, and field levels. With the rapid development of novel sensors, imaging technology, and analysis methods, numerous infrastructure platforms have been developed for phenotyping.
Authoring a personal GPT for your research and practice: How we created the Q... (Leonel Morgado)
Thematic analysis in qualitative research is a time-consuming and systematic task, typically done using teams. Team members must ground their activities on common understandings of the major concepts underlying the thematic analysis, and define criteria for its development. However, conceptual misunderstandings, equivocations, and lack of adherence to criteria are challenges to the quality and speed of this process. Given the distributed and uncertain nature of this process, we wondered if the tasks in thematic analysis could be supported by readily available artificial intelligence chatbots. Our early efforts point to potential benefits: not just saving time in the coding process but better adherence to criteria and grounding, by increasing triangulation between humans and artificial intelligence. This tutorial will provide a description and demonstration of the process we followed, as two academic researchers, to develop a custom ChatGPT to assist with qualitative coding in the thematic data analysis process of immersive learning accounts in a survey of the academic literature: QUAL-E Immersive Learning Thematic Analysis Helper. In the hands-on time, participants will try out QUAL-E and develop their ideas for their own qualitative coding ChatGPT. Participants that have the paid ChatGPT Plus subscription can create a draft of their assistants. The organizers will provide course materials and slide deck that participants will be able to utilize to continue development of their custom GPT. The paid subscription to ChatGPT Plus is not required to participate in this workshop, just for trying out personal GPTs during it.
The use of nauplii and metanauplii of Artemia (brine shrimp) in aquaculture (MAGOTI ERNEST)
Although Artemia has been known to man for centuries, its use as a food for the culture of larval organisms apparently began only in the 1930s, when several investigators found that it made an excellent food for newly hatched fish larvae (Litvinenko et al., 2023). As aquaculture developed in the 1960s and ‘70s, the use of Artemia also became more widespread, due both to its convenience and to its nutritional value for larval organisms (Arenas-Pardo et al., 2024). The fact that Artemia dormant cysts can be stored for long periods in cans, and then used as an off-the-shelf food requiring only 24 h of incubation makes them the most convenient, least labor-intensive, live food available for aquaculture (Sorgeloos & Roubach, 2021). The nutritional value of Artemia, especially for marine organisms, is not constant, but varies both geographically and temporally. During the last decade, however, both the causes of Artemia nutritional variability and methods to improve poorquality Artemia have been identified (Loufi et al., 2024).
Brine shrimp (Artemia spp.) are used in marine aquaculture worldwide. Annually, more than 2,000 metric tons of dry cysts are used for cultivation of fish, crustacean, and shellfish larva. Brine shrimp are important to aquaculture because newly hatched brine shrimp nauplii (larvae) provide a food source for many fish fry (Mozanzadeh et al., 2021). Culture and harvesting of brine shrimp eggs represents another aspect of the aquaculture industry. Nauplii and metanauplii of Artemia, commonly known as brine shrimp, play a crucial role in aquaculture due to their nutritional value and suitability as live feed for many aquatic species, particularly in larval stages (Sorgeloos & Roubach, 2021).
ESPP presentation to EU Waste Water Network, 4th June 2024: "EU policies driving nutrient removal and recycling and the revised UWWTD (Urban Waste Water Treatment Directive)"
Unlocking the mysteries of reproduction: Exploring fecundity and gonadosomati... (AbdullaAlAsif1)
The pygmy halfbeak Dermogenys colletei, is known for its viviparous nature, this presents an intriguing case of relatively low fecundity, raising questions about potential compensatory reproductive strategies employed by this species. Our study delves into the examination of fecundity and the Gonadosomatic Index (GSI) in the Pygmy Halfbeak, D. colletei (Meisner, 2001), an intriguing viviparous fish indigenous to Sarawak, Borneo. We hypothesize that the Pygmy halfbeak, D. colletei, may exhibit unique reproductive adaptations to offset its low fecundity, thus enhancing its survival and fitness. To address this, we conducted a comprehensive study utilizing 28 mature female specimens of D. colletei, carefully measuring fecundity and GSI to shed light on the reproductive adaptations of this species. Our findings reveal that D. colletei indeed exhibits low fecundity, with a mean of 16.76 ± 2.01, and a mean GSI of 12.83 ± 1.27, providing crucial insights into the reproductive mechanisms at play in this species. These results underscore the existence of unique reproductive strategies in D. colletei, enabling its adaptation and persistence in Borneo's diverse aquatic ecosystems, and call for further ecological research to elucidate these mechanisms. This study lends to a better understanding of viviparous fish in Borneo and contributes to the broader field of aquatic ecology, enhancing our knowledge of species adaptations to unique ecological challenges.
Describing and Interpreting an Immersive Learning Case with the Immersion Cub... (Leonel Morgado)
Current descriptions of immersive learning cases are often difficult or impossible to compare. This is due to a myriad of different options on what details to include, which aspects are relevant, and on the descriptive approaches employed. Also, these aspects often combine very specific details with more general guidelines or indicate intents and rationales without clarifying their implementation. In this paper we provide a method to describe immersive learning cases that is structured to enable comparisons, yet flexible enough to allow researchers and practitioners to decide which aspects to include. This method leverages a taxonomy that classifies educational aspects at three levels (uses, practices, and strategies) and then utilizes two frameworks, the Immersive Learning Brain and the Immersion Cube, to enable a structured description and interpretation of immersive learning cases. The method is then demonstrated on a published immersive learning case on training for wind turbine maintenance using virtual reality. Applying the method results in a structured artifact, the Immersive Learning Case Sheet, that tags the case with its proximal uses, practices, and strategies, and refines the free text case description to ensure that matching details are included. This contribution is thus a case description method in support of future comparative research of immersive learning cases. We then discuss how the resulting description and interpretation can be leveraged to change immersion learning cases, by enriching them (considering low-effort changes or additions) or innovating (exploring more challenging avenues of transformation). The method holds significant promise to support better-grounded research in immersive learning.
The ability to recreate computational results with minimal effort and actionable metrics provides a solid foundation for scientific research and software development. When people can replicate an analysis at the touch of a button using open-source software, open data, and methods to assess and compare proposals, it significantly eases verification of results, engagement with a diverse range of contributors, and progress. However, we have yet to fully achieve this; there are still many sociotechnical frictions.
Inspired by David Donoho's vision, this talk aims to revisit the three crucial pillars of frictionless reproducibility (data sharing, code sharing, and competitive challenges) with the perspective of deep software variability.
Our observation is that multiple layers — hardware, operating systems, third-party libraries, software versions, input data, compile-time options, and parameters — are subject to variability that exacerbates frictions but is also essential for achieving robust, generalizable results and fostering innovation. I will first review the literature, providing evidence of how the complex variability interactions across these layers affect qualitative and quantitative software properties, thereby complicating the reproduction and replication of scientific studies in various fields.
I will then present some software engineering and AI techniques that can support the strategic exploration of variability spaces. These include the use of abstractions and models (e.g., feature models), sampling strategies (e.g., uniform, random), cost-effective measurements (e.g., incremental build of software configurations), and dimensionality reduction methods (e.g., transfer learning, feature selection, software debloating).
I will finally argue that deep variability is both the problem and solution of frictionless reproducibility, calling the software science community to develop new methods and tools to manage variability and foster reproducibility in software systems.
Invited talk at the Journées Nationales du GDR GPL 2024
1. Blockchain, Science Publishing, and Replicability
Blockchain in Healthcare Webinar Series – 28 Jun 2018
Sean T Manion, PhD
Science Distributed
2. Distributed Science Value Proposition
▪ Better Science (for Scientists)
▪ Problem: Reproducibility Issues; 20% of U.S. health science research can’t be replicated/reproduced*
▪ Solution: Improved reproducibility through transparency and immutable audit trail for research data;
better quality data from standardization; improved materials; increased meta-analysis capabilities
▪ Cheaper Research (for Funders)
▪ Problem: Expensive; decreasing ROI; $30 billion in U.S. health science on non-replicable research*
▪ Solution: Increased return on investment for research dollars spent; reduced data management costs
through blockchain/smart contracts, amplified with machine learning/AI; cheaper administration
▪ Faster Miracles (for Everyone)
▪ Problem: 17 years from bench to bedside; 2-5 years on administrative processes (my estimate)**
▪ Solution: Faster time from idea to treatment; improved outcomes with accelerated research and higher
quality data; improved tracking of individual contribution allowing for expanded permissioned access
of data to more researchers; faster administrative processes (e.g. IRB, grant review)
* "The Economics of Reproducibility in Preclinical Research," Freedman et al., PLoS Biology 13(6): e1002165, 2015
** “Enhancing Federal Research: Traumatic Brain Injury & Blockchain Technology - Part 1.5, The Why.” Manion, Feb 2018
https://www.linkedin.com/pulse/enhancing-federal-research-traumatic-brain-injury-part-sean-manion-1/
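The "immutable audit trail for research data" in the value proposition can be sketched minimally as a hash-chained log. This is an illustrative in-memory sketch (the class and field names are assumptions, not the deck's actual design); a real deployment would anchor the entry hashes on a distributed ledger.

```python
import hashlib
import json


def _hash(entry: dict) -> str:
    """Deterministic SHA-256 over a JSON-serialized entry."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()


class AuditTrail:
    """Append-only, hash-chained log of research-data events (sketch)."""

    def __init__(self):
        self.entries = []

    def append(self, actor: str, action: str, data_digest: str) -> dict:
        # Each entry commits to the previous entry's hash, forming a chain.
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"actor": actor, "action": action,
                "data_digest": data_digest, "prev": prev}
        entry = {**body, "hash": _hash(body)}
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash; tampering with any entry breaks the chain."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("actor", "action", "data_digest", "prev")}
            if e["prev"] != prev or _hash(body) != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Editing any recorded field after the fact makes `verify()` return False, which is the transparency property the slide appeals to.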
Sean T Manion, PhD - Science Distributed - stmanion@gmail.com - Blockchain in Healthcare Webinar Series - Science Publishing & Replicability - 28 Jun 2018
3. Questions to consider across the steps in science:
▪ 1. Form Hypotheses: What hypotheses have already been tested? By whom? When? What happened? What was the approach? What was the result? Current status?
▪ 2. Plan Research: What methods are available? What variations have been developed? Are skilled collaborators available? Where? Are other resources (e.g. space, equipment) available?
▪ 3. Get Funding: Where are funds? Does my plan match programmatic need? Have I applied here before? Why am I filling in all this information again? How is the money being tracked?
▪ 4. Regulatory Approval: Can parts of this process be automated? Will that be faster and more reliable for researchers and regulators? Can multiple IRBs be aligned? Can parts of IRB review be crowdsourced? Can audits be easier?
▪ 5. Collect Data: What are the data standards? Will it be mergeable with other studies? What are the quality assurance steps? Is PHI secure? Is there existing data that can be used? How accessible is the data? What is the ROI for the data?
▪ 6. Analyze & Interpret: Are analyses tied to original hypotheses? Are analyses outlined in protocol? Justification of new analyses? Record of attempted analyses? Statistical power? Lit basis of interpretation? Quality of refs? Replicable? Retractions?
▪ 7. Publish & Present: What are the pre-publication presentations? What is the quality or confidence level? How is feedback captured? Gray lit? What are the comments? Are there objective criteria? Crowd-sourced peer review?
Distributed Science Opportunities?
• Where can things be improved in each area?
• Can blockchain/DLT facilitate and/or incentivize?
• What is the cost/benefit of implementation?
• If it can be achieved w/o blockchain/DLT, why isn't it?
4. Distributed Science Publishing Opportunities?
▪ 1. Publication Incentives: What are the incentives for publication? How is contribution and IP tracked? Transparency on fees? Funding agencies? Incentives for reviewers?
▪ 2. Peer Review: How many reviewers? How big is the pool? What is the balance for bias? Incentives for review? Weighting for review? Is crowd-sourcing feasible?
▪ 3. Comments & Edits: How are comments captured? Are responses and edits available? Can unreported changes inform what should or shouldn't be done in follow-up research? Can this be done in condensed time?
▪ 4. Publication Record: How accessible are abstracts and full manuscripts? Reference tracking for appropriateness? Retractions tracked and alerts to authors referencing? Funding tracked by dollar through publication? Confidence score gradient?
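One of the publication-record questions, alerting authors who reference retracted work, reduces to a lookup over a citation index. A minimal sketch, with hypothetical function name and data shapes (the deck does not specify an implementation):

```python
def retraction_alerts(citations: dict[str, list[str]],
                      retracted: set[str]) -> dict[str, list[str]]:
    """Map each paper to the retracted works it cites.

    citations: paper id -> list of cited paper ids
    retracted: ids of papers known to be retracted
    Returns only papers citing at least one retracted work,
    so their authors can be alerted.
    """
    return {paper: hits
            for paper, refs in citations.items()
            if (hits := [r for r in refs if r in retracted])}
```

For example, `retraction_alerts({"p1": ["r1", "x"], "p2": ["x"]}, {"r1"})` returns `{"p1": ["r1"]}`. On a ledger-backed publication record, the same scan could run automatically whenever a retraction is posted.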
5. Crowd-Sourced Peer Review Pilot
▪ Goal: to develop weighted crowd-sourced peer review of published and pre-published
material to provide confidence score related to replicability
▪ Crowd-sourcing + standardized, systematic review + distributed ledger
▪ DARPA announcement for systematized confidence in open research and evidence
▪ Science Distributed with Blockchain in Healthcare Global, ConsenSys, and advisors from
academia, government, and industry
▪ Interested in additional partners from academia and Open Science groups. Contact us.
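A weighted crowd-sourced confidence score like the one the pilot describes could, in its simplest form, be a weighted average of reviewer ratings. The rating scale and weighting scheme below are assumptions for illustration, not the pilot's actual scoring design:

```python
def confidence_score(reviews: list[tuple[float, float]]) -> float:
    """Weighted average of reviewer replicability ratings.

    reviews: (rating, weight) pairs, where rating in [0, 1] estimates
    how likely the result is to replicate, and weight reflects the
    reviewer's track record (both scales are illustrative assumptions).
    """
    total_weight = sum(w for _, w in reviews)
    if total_weight == 0:
        raise ValueError("no review weight")
    return sum(r * w for r, w in reviews) / total_weight
```

A senior reviewer's rating counts proportionally more: `confidence_score([(0.9, 2.0), (0.4, 1.0)])` yields about 0.733 rather than the unweighted mean of 0.65.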