This presentation gives an overview of the answers that interviewees in the ACUMEN project gave to questions about their data-deposition practices. It was my 'farewell' presentation at DANS.
Scott Edmunds' talk at ODHK.meet.26: Open Science Data = Open Data (a rant in ...
The document discusses the benefits of open science data and argues that open data is important for addressing issues like climate change, disease outbreaks, and environmental problems. It provides an example where open genomic data from an E. coli outbreak in Germany was released under an open license and analyzed by researchers around the world, leading to important findings that helped control the outbreak. The document advocates for more open access and open data policies in Hong Kong to maximize the benefits of research and address issues like a lack of transparency in China.
A presentation about the visualizations I 'envisaged', wished for, and made to deal with the academic-CV data and interview transcripts of the ACUMEN project.
From paper to screen: Putting maps on the web, by Petr Pridal
This document provides an overview of a workshop on putting maps online from paper sources. The main goal of the workshop is to present a complete workflow for digitizing, georeferencing, and publishing a 1912 map of Edinburgh online. The workshop structure includes sessions on digitizing paper maps, web presentation of high-resolution images using tiled viewers, georeferencing maps, and online publishing of maps. Georeferencing involves adding control points to align a scanned map to a known coordinate system, allowing integration with other geospatial data. Dynamic and pregenerated tile approaches to online map delivery are also discussed.
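The georeferencing step described above can be illustrated with a minimal sketch: fitting a 2D affine transform from three ground-control points that link pixel positions on the scanned map to known map coordinates. This is not taken from the workshop materials, and all coordinates below are hypothetical; it only shows the underlying idea, using plain Python with no GIS libraries.

```python
def solve3(m, v):
    """Solve a 3x3 linear system m*x = v by Cramer's rule."""
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
              - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
              + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    out = []
    for col in range(3):
        mc = [row[:] for row in m]
        for r in range(3):
            mc[r][col] = v[r]
        out.append(det(mc) / d)
    return out

def fit_affine(gcps):
    """gcps: list of ((px, py), (X, Y)) control points.
    Returns (a, b, c, d, e, f) with X = a*px + b*py + c, Y = d*px + e*py + f."""
    m = [[px, py, 1.0] for (px, py), _ in gcps]
    a, b, c = solve3(m, [X for _, (X, _) in gcps])
    d, e, f = solve3(m, [Y for _, (_, Y) in gcps])
    return a, b, c, d, e, f

def pixel_to_map(t, px, py):
    """Apply a fitted affine transform to one pixel coordinate."""
    a, b, c, d, e, f = t
    return (a * px + b * py + c, d * px + e * py + f)

# Three hypothetical control points: pixel -> map coordinates.
gcps = [((0, 0), (100, 200)), ((100, 0), (200, 200)), ((0, 100), (100, 100))]
t = fit_affine(gcps)
print(pixel_to_map(t, 50, 50))  # → (150.0, 150.0)
```

In practice tools such as GDAL fit this transform (often with more than three control points, by least squares) and then warp the scanned image so it overlays correctly on other geospatial data.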
Agile lessons learned in the Microsoft ALM Rangers, by Robert MacLean
The document discusses lessons learned from the Microsoft ALM Rangers team regarding agile practices. It provides an overview of scrum basics including that the product owner owns the backlog, the team completes work in sprints, and sprints end with a review and retrospective. It also notes some key lessons learned such as the importance of passion, priority definitions, light ceremonies, time as an engineering constraint, communication over metrics, and video not being a nice-to-have.
On Telecoms is a leading triple play operator in Greece offering fixed telephony, broadband, TV, and other services. It can leverage four key success factors: strong management experience in the Greek market, a proven business model, maximizing customer value through higher prices and lower churn, and differentiation through triple play offerings. The presentation discusses On Telecoms' services and business drivers, challenges, demands from customers, and proposes a convergent billing and revenue management platform as the solution.
The document discusses statutory compliance and why it is important for organizations. It notes that statutory compliance means complying with applicable laws and regulations. On average, organizations have to comply with around 120 laws. Non-compliance can result in penalties and loss of market credibility. The document outlines how to establish a statutory compliance program, including identifying applicable laws, conducting audits, and issuing compliance certificates. It also discusses the key benefits such as avoiding penalties and adding value to the organization.
Improving Education by Learning Analytics (EADTU-EU Summit 2017), by EADTU
Tinne De Laet presented on improving education through learning analytics. She defined learning analytics as the measurement, collection, analysis and reporting of learner data to understand and optimize learning. She provided examples of interventions using academic performance data, digital traces, and survey data. Her recommendations included focusing on available data, ensuring insights are actionable, involving various experts, evaluating tools, and scaling solutions. European collaboration was emphasized for advancing learning analytics.
The impact of evaluations on developments in academic careers, by Frank van der Most
A brief overview of the state of affairs in my part of the ACUMEN project. The slide on p. 7 has been my core slide during the project. If you are interested in the outcomes of the project, you can download the deliverable here: http://research-acumen.eu/wp-content/uploads/D1.11-Impact%20of%20evaluations%20on%20academic%20careers_v1.0.pdf
The document summarizes the results of a question harmonization experiment conducted by the Research on Open Educational Resources for Development (ROER4D) project. The experiment aimed to harmonize research questions across ROER4D's 12 sub-projects and with other OER studies. Over 9 months, researchers from different regions participated in 15 online sessions to discuss and refine a bank of questions. This process helped develop a set of well-harmonized questions, increase understanding of key concepts, build a sense of community among researchers, and increase the research capacity of many participants. However, participation was uneven and some technical and process issues were encountered. Lessons included the benefits of regular sessions and collaborative work, but also the limitations of a purely voluntary model.
This document discusses learning analytics dashboards and how to design them effectively. It provides examples of existing learning analytics dashboards such as SNAPP, GISMO, and the Student Activity Meter. Common issues with dashboards are outlined, such as having too many screens, inadequate data context, and poor visualizations. The document recommends designing dashboards by reducing non-data elements, enhancing data visualization, and organizing information to support its intended meaning and use.
Group discussion and collaboration were the most engaging activities, while finding materials and school presentations were the most disengaging. Group work and data analysis were found to be the most helpful activities, while data errors and protocols were the most confusing. Participants enjoyed group discussion, videos, and presenting findings the most. Suggestions to improve future workshops included grouping teachers by content, providing more feedback during small groups, and incorporating goal setting and strategy sessions.
This document provides information about an R21 Boot Camp being held by the College of Pharmacy to foster research ideas and grant submissions. The Boot Camp will involve two informational sessions in October 2021 to discuss exploratory or developmental R21 grants. It encourages faculty to identify new ideas that could solve an important problem or gap. The Interim Associate Dean for Research will work to pair faculty with mentors and foster collaborations. The goal is to increase the number of grants submitted by providing support and resources to move ideas forward from initial concepts to full proposals.
The document describes a test fest conducted by librarians at the University of North Carolina to address a backlog of usability issues. It involved running 5 simultaneous usability tests on topics like database access, research videos, the catalog, and discovery tools. 8 participants took part in the first round, with 2 additional participants completing 2 tests later. The tests used methods like task analysis, surveys, and sketching. Affinity diagramming was used to analyze the 44 tests, though not all data could be incorporated. Outcomes so far include reports, catalog changes, and discovery tool decisions. The discussion focuses on managing backlogs and improving the test fest approach.
Presentation at Empirical Librarians 2018 in Knoxville, TN.
At UNC Chapel Hill, the User Experience and Assessment department regularly runs usability tests to inform our decision making and prioritize our users’ perspectives as we make changes. But there are more things to test than there are hours in the day. Our projects have a variety of stakeholders who are very interested in improving their services, and we found ourselves with a long list of tests we wanted to run.
To catch up, we adapted Harvard Libraries’ Test Fest model: five tests run simultaneously, with five participants rotating through the set of tests. Over a span of two hours, we completed 25 individual usability tests. In this one event, we caught up on much of our testing backlog.
This session will outline how we planned and executed Test Fest and what we learned from using this approach. We’ll also discuss how we approached analyzing the large amount of qualitative data that was gathered during testing, via affinity diagrams and lots of post-it notes.
The focus of this session is on our methodologies with an aim to include time for attendees to discuss how they would have approached the backlog, setting up Test Fest, and analyzing the data.
Program evaluation and outdoor education: An overview, by James Neill
This document provides an overview of program evaluation in outdoor education. It discusses what program evaluation is, why it's important to do, and different evaluation methods and tools. The presentation considers example evaluation studies and allows time to workshop individual program needs. Program evaluation aims to systematically determine a program's value by answering questions about needs, feasibility, process, outcomes, costs and generalizability. Common data collection methods include questionnaires, interviews, documentation review, observation, and focus groups. The evaluation process involves defining the purpose and audience, identifying objectives and stakeholders, gathering and analyzing data, and reporting/disseminating results.
Bioscience Laboratory Workforce Skills - part II, by bio-link
This document discusses developing core skill standards for bioscience laboratory work. It provides examples of existing skill standard formats and proposes a new format. The new format includes critical work functions, key activities, and performance criteria for each activity. It also suggests developing authentic assessments that require students to complete real-world tasks instead of just knowing information. Groups are asked to brainstorm assessments for sample laboratory tasks. The goal is to develop a consensus skill standard format and identify assessments that ensure students gain the essential skills for bioscience laboratory careers.
Test Fest: Catching up on Your Usability Testing Backlog, by Sarah Joy Arnold
Presentation at North Carolina Librarians' Association Biennial 2017 in Winston-Salem, NC. Part of "So Many Users, Not Enough Time: Large Scale Usability Testing Methods" with Chad Haefele and Scott Goldstein.
This presentation will discuss the process that Appalachian State University Libraries used to measure and test website usability during its recent redesign and migration to a new Drupal theme. It will emphasize how we recruited a large number of users and how large sample sizes promote better design decisions. While web usability research is well known for its flexibility in needing only about a dozen users to discover most problems, robust data-driven decisions are best supported by datasets that are large and significant. Attendees will learn techniques for surveying and testing more users without greatly compromising the richness of data collected.
At UNC-Chapel Hill, the User Experience and Assessment department regularly runs usability tests to inform our decision-making and prioritize our users' perspective as we make changes. But there are more things to test than there are hours in the day. Our projects have a variety of stakeholders who are very interested in improving their services, and we found ourselves with a long list of tests we wanted to run. To catch up, we adapted Harvard Libraries' Test Fest model: five tests run simultaneously, with five participants rotating through the set of tests. Over a span of two hours, we completed 25 individual usability tests. In this one event, we caught up on much of our testing backlog. This session will outline how we planned and executed Test Fest, how we recruited participants, and what we learned from using this approach. We'll also discuss our methodologies and briefly look at the results of each test.
This document outlines the research process from start to finish. It begins by defining research as a careful investigation aimed at discovering new information or revising current understanding. It then distinguishes between quantitative and qualitative research approaches. The document describes each step of the research process in detail, including refining an idea based on background research, conducting experiments or investigations, documenting work, evaluating results, and presenting findings. The overall process involves starting with an idea, investigating previous work, refining the idea, doing the core investigative work, evaluating outcomes, identifying future work, and disseminating the research.
Learning analytics research informed institutional practice, by Yi-Shan Tsai
The document summarizes learning analytics research and initiatives at the University of Edinburgh. It discusses early MOOC and VLE analytics projects that aimed to understand student behaviors and identify patterns. It also describes the Learning Analytics Map of Activities, Research and Roll-out (LAMARR) and efforts to build institutional capacity for learning analytics. Challenges discussed include the effort required to analyze raw data and involve stakeholders. The document advocates developing critical and participatory approaches to educational data analysis.
Post-it Up: Qualitative Data Analysis of a Test Fest, by Sarah Joy Arnold
Presentation at Southeastern Library Assessment Conference 2017 in Atlanta, GA.
This session will outline how we planned and executed five simultaneous usability tests and what we learned from using this method. We'll also discuss how we approached analyzing the large amount of qualitative data that was gathered during testing via affinity diagrams and lots of post-it notes. The focus of this session is on our methodologies, though we'll briefly look at the results of each test.
This document provides an overview of the Educational Quality Assessment (EQA) Inventory given to 11th grade students in Pennsylvania. The EQA examines various non-academic domains including social habits, self-esteem, values, creativity, coping skills, and career planning. Test questions are designed to measure attitudes in these domains and how strongly students demonstrate behaviors related to responsible citizenship. Student responses are analyzed to provide information on how each school ranks relative to statewide averages and similar schools, as well as the proportion of students meeting minimum standards in various attitudes. The goal is to use this information to guide decision making around educational programs and policies at federal, state, and local levels.
1) The document outlines plans to evaluate and modify special education processes and data systems at Achievement First schools.
2) It discusses conducting a needs assessment, evaluating current referral, evaluation, IEP and review processes, and modifying guidelines and documents.
3) Key deliverables include process flowcharts, intervention guides, training staff on new procedures, and piloting changes at one school before full network rollout.
This presentation was given by Peter Karlberg of the National Agency for Education (Skolverket) of Sweden at the GCES Conference on Education Governance: The Role of Data in Tallinn on 13 February during the afternoon session workshop on Learning Analytics.
The document provides information about Professional Learning Communities (PLCs) for Helena Public Schools. It discusses the district's commitment to PLCs and creating an online workspace to facilitate collaboration. The main goal is to have more students learning more through ensuring timely communication and effective implementation of PLC initiatives. The document outlines what a PLC is, why schools should implement them, how to create a PLC, how to do the work of a PLC, and provides various resources to support PLCs.
Scalable, Actionable, and Ethical Learning Dashboards: a reality check, by Tinne De Laet
Keynote presentation at Edmedia 2018 conference: https://www.aace.org/conf/edmedia/speakers/.
Results of Erasmus+ projects ABLE (www.ableproject.eu) and STELA (www.stela-project.eu) on learning dashboards for supporting first-year students.
This document provides guidance on analyzing and interpreting data from programs. It discusses the importance of having an analysis plan early on to ensure the data collected will answer evaluation questions. Both quantitative and qualitative data analysis techniques are covered, including descriptive statistics, coding themes from qualitative data, and discussing limitations. The document emphasizes starting with a plan, cleaning and organizing data, analyzing both numbers and narratives, interpreting results, and reflecting on what was learned as well as limitations.
This document provides an overview of wound healing, its functions, stages, mechanisms, factors affecting it, and complications.
A wound is a break in the integrity of the skin or tissues, which may be associated with disruption of the structure and function.
Healing is the body’s response to injury in an attempt to restore normal structure and functions.
Healing can occur in two ways: Regeneration and Repair
There are 4 phases of wound healing: hemostasis, inflammation, proliferation, and remodeling. This document also describes the mechanism of wound healing. Factors that affect healing include infection, uncontrolled diabetes, poor nutrition, age, anemia, the presence of foreign bodies, etc.
Complications of wound healing like infection, hyperpigmentation of scar, contractures, and keloid formation.
How to Setup Warehouse & Location in Odoo 17 InventoryCeline George
In this slide, we'll explore how to set up warehouses and locations in Odoo 17 Inventory. This will help us manage our stock effectively, track inventory levels, and streamline warehouse operations.
A workshop hosted by the South African Journal of Science aimed at postgraduate students and early career researchers with little or no experience in writing and publishing journal articles.
Main Java[All of the Base Concepts}.docxadhitya5119
This is part 1 of my Java Learning Journey. This Contains Custom methods, classes, constructors, packages, multithreading , try- catch block, finally block and more.
it describes the bony anatomy including the femoral head , acetabulum, labrum . also discusses the capsule , ligaments . muscle that act on the hip joint and the range of motion are outlined. factors affecting hip joint stability and weight transmission through the joint are summarized.
How to Manage Your Lost Opportunities in Odoo 17 CRMCeline George
Odoo 17 CRM allows us to track why we lose sales opportunities with "Lost Reasons." This helps analyze our sales process and identify areas for improvement. Here's how to configure lost reasons in Odoo 17 CRM
Reimagining Your Library Space: How to Increase the Vibes in Your Library No ...Diana Rendina
Librarians are leading the way in creating future-ready citizens – now we need to update our spaces to match. In this session, attendees will get inspiration for transforming their library spaces. You’ll learn how to survey students and patrons, create a focus group, and use design thinking to brainstorm ideas for your space. We’ll discuss budget friendly ways to change your space as well as how to find funding. No matter where you’re at, you’ll find ideas for reimagining your space in this session.
This presentation includes basic of PCOS their pathology and treatment and also Ayurveda correlation of PCOS and Ayurvedic line of treatment mentioned in classics.
বাংলাদেশের অর্থনৈতিক সমীক্ষা ২০২৪ [Bangladesh Economic Review 2024 Bangla.pdf] কম্পিউটার , ট্যাব ও স্মার্ট ফোন ভার্সন সহ সম্পূর্ণ বাংলা ই-বুক বা pdf বই " সুচিপত্র ...বুকমার্ক মেনু 🔖 ও হাইপার লিংক মেনু 📝👆 যুক্ত ..
আমাদের সবার জন্য খুব খুব গুরুত্বপূর্ণ একটি বই ..বিসিএস, ব্যাংক, ইউনিভার্সিটি ভর্তি ও যে কোন প্রতিযোগিতা মূলক পরীক্ষার জন্য এর খুব ইম্পরট্যান্ট একটি বিষয় ...তাছাড়া বাংলাদেশের সাম্প্রতিক যে কোন ডাটা বা তথ্য এই বইতে পাবেন ...
তাই একজন নাগরিক হিসাবে এই তথ্য গুলো আপনার জানা প্রয়োজন ...।
বিসিএস ও ব্যাংক এর লিখিত পরীক্ষা ...+এছাড়া মাধ্যমিক ও উচ্চমাধ্যমিকের স্টুডেন্টদের জন্য অনেক কাজে আসবে ...
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UPRAHUL
This Dissertation explores the particular circumstances of Mirzapur, a region located in the
core of India. Mirzapur, with its varied terrains and abundant biodiversity, offers an optimal
environment for investigating the changes in vegetation cover dynamics. Our study utilizes
advanced technologies such as GIS (Geographic Information Systems) and Remote sensing to
analyze the transformations that have taken place over the course of a decade.
The complex relationship between human activities and the environment has been the focus
of extensive research and worry. As the global community grapples with swift urbanization,
population expansion, and economic progress, the effects on natural ecosystems are becoming
more evident. A crucial element of this impact is the alteration of vegetation cover, which plays a
significant role in maintaining the ecological equilibrium of our planet.Land serves as the foundation for all human activities and provides the necessary materials for
these activities. As the most crucial natural resource, its utilization by humans results in different
'Land uses,' which are determined by both human activities and the physical characteristics of the
land.
The utilization of land is impacted by human needs and environmental factors. In countries
like India, rapid population growth and the emphasis on extensive resource exploitation can lead
to significant land degradation, adversely affecting the region's land cover.
Therefore, human intervention has significantly influenced land use patterns over many
centuries, evolving its structure over time and space. In the present era, these changes have
accelerated due to factors such as agriculture and urbanization. Information regarding land use and
cover is essential for various planning and management tasks related to the Earth's surface,
providing crucial environmental data for scientific, resource management, policy purposes, and
diverse human activities.
Accurate understanding of land use and cover is imperative for the development planning
of any area. Consequently, a wide range of professionals, including earth system scientists, land
and water managers, and urban planners, are interested in obtaining data on land use and cover
changes, conversion trends, and other related patterns. The spatial dimensions of land use and
cover support policymakers and scientists in making well-informed decisions, as alterations in
these patterns indicate shifts in economic and social conditions. Monitoring such changes with the
help of Advanced technologies like Remote Sensing and Geographic Information Systems is
crucial for coordinated efforts across different administrative levels. Advanced technologies like
Remote Sensing and Geographic Information Systems
9
Changes in vegetation cover refer to variations in the distribution, composition, and overall
structure of plant communities across different temporal and spatial scales. These changes can
occur natural.
Liberal Approach to the Study of Indian Politics.pdf
Depositing and evaluating datasets: a sketch of four disciplines in four countries
1. Depositing and evaluating datasets: a sketch of four disciplines in four countries
Frank van der Most
eHumanities and DANS
DANS lunch-presentation, 4 March 2014
2. ACUMEN
Academic Careers Understood through
Measurement and Norms
www.research-acumen.eu
“Currently, there is a discrepancy between the criteria
used in performance assessment and the broader
social and economic function of scientific and
scholarly research.” (proposal, p. 3)
3. ACUMEN
Additional problems
✦ “lack of resources for qualitative evaluation
due to increased scale of research”
✦ “quantitative measures are often not
applicable at the individual level”
✦ “lack of recognition for new types of work”
(ACUMEN description of work, p. 3)
4. ACUMEN
✦ So, ACUMEN will develop
•a portfolio for researchers, and
•criteria and guidelines for Good
Evaluation Practices
✦ Done … see the website
6. Research Question
✦ How do evaluations influence the development of
individuals’ careers in academia?
(What is the impact of evaluations?)
•Wide notion of evaluations
•Careers + ‘work’
7. Method
✦ Quite structured, in-depth interviews
✦ 4 countries: UK, Germany, the Netherlands, Poland
✦ 4 disciplines: Astronomy/astrophysics (AA), Environmental engineering (EE), Philosophy (+) (P+), Public Health (PH)
✦ Individual academics
•3 levels of seniority: PhD + 5 to 10 years, + 11 to 25 years, + 26 years or more (or equivalents)
✦ Deans, HoDs, or HR managers
8. Method
Number of interviews done

                 AA        EE        P+        PH        Total
UK               4 (3+1)   4 (3+1)   3 (2+1)   4 (3+1)   15 (11+4)
Germany          2 (2+0)   3 (3+0)   3 (2+1)   3 (3+0)   11 (10+1)
The Netherlands  3 (3+0)   4 (3+1)   3 (2+1)   4 (3+1)   14 (11+3)
Poland           2 (1+1)   1 (1+0)   4 (3+1)   1 (1+0)    8 (6+2)
Total            11 (9+2)  12 (10+2) 13 (9+4)  12 (10+2) 48 (38+10)

Numbers in brackets: individuals + deans/HR managers
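The row, column, and grand totals of the interview matrix can be cross-checked with a short script. This is just a sketch: the nested counts are the (individuals, deans/HR) pairs copied from the table above, and the variable names are my own.

```python
# Interview counts per country and discipline, as (individuals, deans/HR) pairs,
# copied from the slide-8 table.
counts = {
    "UK":              {"AA": (3, 1), "EE": (3, 1), "P+": (2, 1), "PH": (3, 1)},
    "Germany":         {"AA": (2, 0), "EE": (3, 0), "P+": (2, 1), "PH": (3, 0)},
    "The Netherlands": {"AA": (3, 0), "EE": (3, 1), "P+": (2, 1), "PH": (3, 1)},
    "Poland":          {"AA": (1, 1), "EE": (1, 0), "P+": (3, 1), "PH": (1, 0)},
}

# Row totals: all interviews per country.
row_totals = {c: sum(i + d for i, d in disc.values()) for c, disc in counts.items()}

# Column totals: all interviews per discipline.
col_totals = {
    d: sum(counts[c][d][0] + counts[c][d][1] for c in counts)
    for d in ("AA", "EE", "P+", "PH")
}

grand_total = sum(row_totals.values())
print(row_totals)   # UK 15, Germany 11, The Netherlands 14, Poland 8
print(col_totals)   # AA 11, EE 12, P+ 13, PH 12
print(grand_total)  # 48
```

Running this reproduces the marginal totals shown on the slide (15 + 11 + 14 + 8 = 48 interviews).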
9. Questions relevant for DANS
Academic individuals (AI)
✦ Have you ever deposited your research data, or
have you ever reviewed a data set?
•Could you describe the occasions in a few
words?
✦ In which evaluations is data deposition, creation,
maintenance, or review addressed?
10. AI: Have you ever deposited your research data?

Answer (coded)                                   AA  EE  P+  PH  Total
No                                                1   2   1   4      8
No, because we do not produce data                1   1   5   1      8
Yes, it is done automatically by the facilities   5                  5
11. Example: European Southern Observatory archives
ESO / B. Tafreshi (twanight.org)
http://www.eso.org/public/archives/images/screen/potw1239a.jpg
Accessed 4 March 2014
15. AI: Have you ever deposited your research data?

Answer (coded)                                                AA  EE  P+  PH  Total
No                                                             1   2   1   4      8
No, because we do not produce data                             1   1   5   1      8
Yes, it is done automatically by the facilities                5                  5
Question not asked                                                                1
No, but I store privately                                                         1
No, but I know that we should                                                     2
No in general, but Yes, one time
Yes, we produce catalogues
No, but financing parties claim the data
Yes, some data is deposited at a central place in the group                       1
Yes, in the library of the university                                             1
Total                                                          9  10   9  10     38
16. AI: Have you ever reviewed a dataset?
✦ No, except …
✦ In AA: implicit reviewing when re-using a
dataset
•‘Reduction’ of data / processing of data
•Need to know about the instrument and
trust the data
✦ Exception of one specific project (EE)
17. AI: In which evaluations is data deposition, creation, maintenance, or review addressed?

Answer (coded)                                  Total
Do not know of any                                 24
Question not asked                                  5
The process is evaluated                            2
Incidentally, when data is re-processed             2
Through peer review of publications                 2
No, it should be done but probably is not           1
Through peer review of claims to discoveries        1
Yes, through pipeline processing                    1
Total                                              38  (AA 9, EE 10, P+ 9, PH 10)
21. Questions relevant for DANS
Deans, HR managers and Heads of Departments (DH)
✦ Is your institution promoting data deposition
and data review?
Since when?
How?
✦ In which evaluations is data deposition,
creation, maintenance, or review addressed?
22. DH: Is your institution promoting data deposition and data review?

Answer (coded)                   Total
Question not asked                   4
Yes, university promotes             3
No promotion, but it may happen      1
No promotion                         2
Total                               10  (AA 2, EE 2, P+ 4, PH 2)
23. Dean in EE
✦ Motivation for researchers?
•Feels like extra bureaucracy
•Sit on their data
•What’s the point once you finish the project?
•Checking for fraud
✦ Problems of depositing and re-use
•Meta-data > experimental setting
24. Dean in PH
✦ Yes, the university is promoting
✦ Many questions to answer
•Should we store everything? Capacity
•Consent from patients, anonymity
✦ External institute or at the university?
•Trust in institution
•Technical reliability (15 years), back-ups etc.
25. DH: In which evaluations is data deposition, creation, maintenance, or review addressed?

Answer (coded)                            AA  EE  P+  PH  Total
Do not know of any                         1   2   1   2      6
Question not asked                         1       1          2
No evaluations, but interesting question           2          2
Total                                      2   2   4   2     10
26. Conclusions
✦ Differences between disciplines, but also within
them
✦ DANS may get busier than expected
•Outside pressure from research councils and through incidents such as the Stapel case
•Or less busy … ?
✦ How to ‘sell’ the effort needed?
•How to turn data deposition into academic credit?