Beyond Citation Counts - The Potential of Academic Social Network Sites for S... (Christoph Lutz)
Millions of researchers around the world have profiles on academic social network sites such as ResearchGate, Academia.edu, or Mendeley. Still, these channels are hardly used for impact assessment. While scientific impact has traditionally been measured with bibliometrics, social media provide new avenues for influence measurement (Altmetrics). We focus on one specific type of social media, namely academic social network sites. How can such platforms provide insights into scientific impact and add to Altmetrics? To answer this question, we rely on a social network analysis of a research community on ResearchGate. The underlying data was provided by the platform provider and contains detailed interaction and publication information on 55 faculty members of a Swiss public university. We apply a structural perspective and use centrality measures as core indicators of influence within the network.
Our analysis proceeds in three steps. First, we describe the network structure in terms of classical SNA metrics. Second, we analyze whether researchers’ network centrality is associated with other metrics of influence, namely (a) activity on the platform, (b) traditional metrics of scholarly influence (i.e., mainly bibliographic criteria), and (c) academic position. Third, we compare the network structure with that of participants’ co-authorship patterns.
Our findings show that activity on the platform is the best predictor of impact within the network, while publication success and academic position play less of a role. Implications for research and practice are provided.
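The centrality-based approach described above can be sketched with standard SNA tooling. A minimal illustration, assuming the networkx library; the edge list and researcher names below are invented, not the study's data:

```python
import networkx as nx

# Hypothetical directed interaction network: an edge A -> B means
# researcher A interacted with (e.g. followed or commented on) B.
interactions = [
    ("ana", "ben"), ("ana", "cai"), ("ben", "cai"),
    ("cai", "ana"), ("dia", "cai"), ("dia", "ben"),
]
G = nx.DiGraph(interactions)

# Centrality measures as indicators of influence within the network.
in_degree = nx.in_degree_centrality(G)      # being reached by others
betweenness = nx.betweenness_centrality(G)  # brokerage position

most_central = max(in_degree, key=in_degree.get)
print(most_central, round(in_degree[most_central], 2))  # cai 1.0
```

Scores like these can then be correlated against platform activity, bibliometric indicators, or academic position, as in the three-step analysis described above.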
Assignment planner and reference tracking (dbslibrary)
This presentation describes and demonstrates two tools that support and inform the delivery of education services to higher education students: 1) a customised online tool that can help improve understanding of the library's role in information literacy and provides opportunities for librarians and faculty to collaborate (statistics and a survey can yield information on usage and usefulness); 2) a re-purposed online tool (Google Forms) to track library patrons’ reference interactions via various contact points, including IM, phone, email and the reference desk. The purpose of tracking reference services is to inform the efficient design of library and information reference services.
Computing degrees at University of Brighton 2014-15 (lp22)
A brief introduction to the undergraduate degree courses offered by the University of Brighton on its Brighton campus. These include Computer Science, Computer Science (Games), Digital Media, Digital Media Development, Business Information Systems, Software Engineering and Business Computer Systems. We also offer three MComp degrees. Our Hastings campus also offers Internet Computing, Computer Systems with Networking and Digital Games Production.
Slides from my contribution to the panel convened by Jeremy Roschelle at the International Society for the Learning Sciences: Engaging Learning Scientists in Policy Challenges: AI and the Future of Learning
Aligning Learning Analytics with Classroom Practices & Needs (Simon Knight)
The Learning Analytics Research Network (LEARN) invites you to join us for a talk about the exciting ways in which the University of Technology Sydney is using participatory design to augment existing classroom practices with learning analytics. Simon Knight, a LEARN Visiting Scholar from the University of Technology Sydney, will introduce a variety of projects, including their work developing analytics to support student writing.
Come meet others at NYU interested in learning analytics while learning from the examples of leading work in Australia. A light lunch will be served and the talk will be followed by a short Q&A. RSVP is required.
About Simon Knight
Simon Knight is a lecturer at the University of Technology Sydney in the Faculty of Transdisciplinary Innovation. His research investigates how people find and evaluate evidence, particularly in the context of learning and educator practices. Dr Knight received his Bachelor’s degree in Philosophy and Psychology from the University of Leeds before completing a teacher education program and a Philosophy of Education MA at the UCL Institute of Education. After teaching high school social sciences, Dr Knight completed an MPhil in Educational Research Methods at Cambridge, and a PhD in Learning Analytics at the UK Open University.
About Simon’s Talk
How do we make use of data about our students to support their learning, and where does learning analytics fit into that? Educators are increasingly asked to work with data and technologies such as learning analytics to support and provide evidence of student learning. However, what learning analytics developers should design for, and how educators will implement analytics, is unclear. Learning analytics risks the same low uptake and implementation as many other educational technologies if it does not align with educator practice and needs. How then do we tackle this gap, to support and develop technologies that are implemented in practice, for impact on learning?
At the University of Technology Sydney, we have taken a participatory design-based approach to designing and implementing learning analytics in practice, and to understanding their impact. In our work we have identified existing practices with which learning analytics may be aligned in order to augment them. This talk introduces some of these projects, particularly drawing on our work developing analytics to support student writing (writing analytics), giving examples of how analytics were aligned with existing pedagogic practices to support learning. Through this augmentation, supported by design-based approaches, we argue we can develop research and practice in tandem.
Materials introducing adaptive learning and learning analytics, as well as efforts toward interoperability standardization. These slides cover the basic concept of adaptive learning, a reference model of learning analytics, data APIs for learning analytics, and the topic list of the standardization community (ISO/IEC JTC1 SC36).
24/7 Instant Feedback on Writing: Integrating AcaWriter into your Teaching (Simon Buckingham Shum)
https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-2-dec/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists in a monthly session, giving you the chance to learn about AcaWriter and, specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions).
This briefing will demo AcaWriter and show how it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
IoT-based students interaction framework using attention-scoring assessment i... (eraser Juan José Calderón)
IoT-based students interaction framework using attention-scoring assessment in eLearning. Muhammad Farhan, Sohail Jabbar, Muhammad Aslam, Mohammad Hammoudeh, Mudassar Ahmad, Shehzad Khalid, Murad Khan, Kijun Han.
SimCon01: The benefits of Conceptual Modelling for Construction Simulation, b... (Mani Poshdar)
On 31 October 2017, the joint research group of the University of Auckland and the Auckland University of Technology (SimCon) held a two-hour workshop to share new insight into the impacts and benefits of using simulation in built environment engineering.
The learning objectives were as follows:
- The use of simulation to lift productivity and competitiveness within the construction industry
- The use of simulation for project planning, control and diagnosis
- The cutting-edge research being developed by SimCon research group
These are slides from the third presentation of the workshop, which focused on the importance of conceptual modelling in improving the quality of simulation studies in construction.
The Generative AI System Shock, and some thoughts on Collective Intelligence ... (Simon Buckingham Shum)
Keynote Address: Team-based Learning Collaborative Asia Pacific Community (TBLC-APC) Symposium (“Impact of emerging technologies on learning strategies”) 8-9 February 2024, Sydney https://tbl.sydney.edu.au
Deliberative Democracy as a strategy for co-designing university ethics aro... (Simon Buckingham Shum)
Buckingham Shum, S. (2021). Deliberative Democracy as a strategy for co-designing university ethics around analytics and AI in education. AARE2021: Australian Association for Research in Education, 28 Nov. – 2 Dec. 2021
Deliberative Democracy as a Strategy for Co-designing University Ethics Around Analytics and AI in Education
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
Universities can see an increasing range of student and staff activity as it becomes digitally visible in their platform ecosystems. The fields of Learning Analytics and AI in Education have demonstrated the significant benefits that ethically responsible, pedagogically informed analysis of student activity data can bring, but such services are only possible because they are undeniably a form of “surveillance”, raising legitimate questions about how the use of such tools should be governed.
Our prior work has drawn on the rich concepts and methods developed in human-centred system design and participatory/co-design to design, deploy and validate practical tools that give a voice to non-technical stakeholders (e.g. educators, students) in shaping such systems. We are now expanding the depth and breadth of engagement that we seek, looking to the Deliberative Democracy movement for inspiration. This is a response to the crisis of confidence in how typical democratic systems engage citizens in decision making. A hallmark is the convening of a Deliberative Mini-Public (DMP), which may work at different scales (organisation, community, region, nation) and can take diverse forms (e.g. Citizens’ Juries, Citizens’ Assemblies, Consensus Conferences, Planning Cells, Deliberative Polls). A DMP’s combination of stratified random sampling to ensure authentic representation, neutrally facilitated workshops, balanced expert briefings, and real support from organisational leaders has been shown to cultivate high-quality dialogue in sometimes highly conflicted settings, leading to a strong sense of ownership of the DMP’s final outputs (e.g. policy recommendations).
This symposium contribution will describe how the DMP model is informing university-wide consultation on the ethical principles that should govern the use of analytics and AI around teaching and learning data.
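Stratified random sampling, mentioned above as a hallmark of the DMP model, can be sketched in a few lines. The stakeholder pool, strata and quota rule below are invented for illustration, not part of the consultation described here:

```python
import random

# Hypothetical stakeholder pool; 'stratum' could encode role, faculty,
# study stage, etc. Stratified random sampling keeps the mini-public's
# composition proportional to each stratum's share of the population.
population = (
    [{"id": i, "stratum": "undergrad"} for i in range(600)]
    + [{"id": i, "stratum": "postgrad"} for i in range(600, 900)]
    + [{"id": i, "stratum": "staff"} for i in range(900, 1000)]
)

def stratified_sample(pop, size, key="stratum", seed=42):
    rng = random.Random(seed)
    strata = {}
    for person in pop:
        strata.setdefault(person[key], []).append(person)
    sample = []
    for members in strata.values():
        # Proportional allocation, with at least one seat per stratum.
        quota = max(1, round(size * len(members) / len(pop)))
        sample.extend(rng.sample(members, quota))
    return sample

panel = stratified_sample(population, size=30)
print(len(panel))  # 30, with proportions mirroring the population
```

Real DMPs layer further constraints (e.g. cross-cutting quotas for gender or age) on top of this basic proportional allocation.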
March 2021 • 24/7 Instant Feedback on Writing: Integrating AcaWriter into yo... (Simon Buckingham Shum)
Slides accompanying the monthly UTS educator briefing https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-18-march/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists in a monthly session, giving you the chance to learn about AcaWriter and, specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions). This briefing will demo AcaWriter and show how it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
ICQE20: Quantitative Ethnography Visualizations as Tools for Thinking (Simon Buckingham Shum)
Slides for this keynote talk to the 2nd International Conference on Quantitative Ethnography
http://simon.buckinghamshum.net/2021/02/icqe2020-keynote-qe-viz-as-tools-for-thinking/
An introduction to argumentation for UTS:CIC PhD students (with some Learning Analytics examples, but potentially of wider interest to students/researchers)
Webinar: Learning Informatics Lab, University of Minnesota
Replay the talk: https://youtu.be/dcJZeDIMr2I
Learning Informatics
AI • Analytics • Accountability • Agency
Simon Buckingham Shum
Professor of Learning Informatics
Director, Connected Intelligence Centre
University of Technology Sydney
Abstract:
“Health Informatics”. “Urban Informatics”. “Social Informatics”. Informatics offers systemic ways of analyzing and designing the interaction of natural and artificial information processing systems. In the context of education, I will describe some Learning Informatics lenses and practices which we have developed for co-designing analytics and AI with educators and students. We have a particular focus on closing the feedback loop to equip learners with competencies to navigate a complex, uncertain future, such as critical thinking, professional reflection and teamwork. En route, we will touch on how we build educators’ trust in novel tools, our design philosophy of “embracing imperfection” in machine intelligence, and the ways that these infrastructures embody values. Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences spark productive reflection as the UMN Learning Informatics Lab builds its program.
Biography:
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he serves as inaugural director of the Connected Intelligence Centre. CIC is a transdisciplinary innovation centre, using analytics to provide new insights for university teams, with particular expertise in educational data science. Simon’s career-long fascination with software’s ability to make thinking visible has seen him active in communities including Computer-Supported Cooperative Work, Hypertext, Design Rationale, Scholarly Publishing, Semantic Web, Computational Argumentation, Educational Technology and Learning Analytics. The challenge of visualizing contested knowledge has produced several books: Visualizing Argumentation, Knowledge Cartography, and Constructing Knowledge Art. He has been active over the last decade in shaping the field of Learning Analytics, co-founding the Society for Learning Analytics Research, and catalyzing several strands: Social Learning Analytics, Discourse Analytics, Dispositional Analytics and Writing Analytics. http://Simon.BuckinghamShum.net
Despite AI’s potential for beneficial use, it creates important risks for Australians. AI, big data, and AI-informed decision making can cause exclusion, discrimination, skill loss, and economic impact; and can affect privacy, security of critical infrastructure and social well-being. What types of technology raise particular human rights concerns? Which human rights are particularly implicated?
Abstract: The emerging configuration of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards). The idea that we may be transitioning into significantly new ways of knowing – about learning and learners, teaching and teachers – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. What should we see when we open the black box powering analytics? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? This isn’t just interesting to ponder academically: your school or university will be buying products that are being designed now. Or perhaps educational institutions should take control, building and sharing their own open source tools? How are universities accelerating the transition from analytics innovation to infrastructure? Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data (Simon Buckingham Shum)
Vanessa Echeverria, Roberto Martinez-Maldonado, and Simon Buckingham Shum. 2019. Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data. In Proceedings of the ACM CHI Conference (CHI’19). ACM, New York, NY, USA, Paper 39, 16 pages. https://doi.org/10.1145/3290605.3300269
Collocated, face-to-face teamwork remains a pervasive mode of working, which is hard to replicate online. Team members’ embodied, multimodal interaction with each other and artefacts has been studied by researchers, but due to its complexity, has remained opaque to automated analysis. However, the ready availability of sensors makes it increasingly affordable to instrument work spaces to study teamwork and groupwork. The possibility of visualising key aspects of a collaboration has huge potential for both academic and professional learning, but a frontline challenge is the enrichment of quantitative data streams with the qualitative insights needed to make sense of them. In response, we introduce the concept of collaboration translucence, an approach to make visible selected features of group activity. This is grounded both theoretically (in the physical, epistemic, social and affective dimensions of group activity), and contextually (using domain-specific concepts). We illustrate the approach from the automated analysis of healthcare simulations to train nurses, generating four visual proxies that fuse multimodal data into higher order patterns.
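The fusion of quantitative sensor streams into higher-order patterns can be illustrated with a toy windowed-alignment step. The streams, window width and labels below are invented for illustration, not the paper's actual pipeline:

```python
# Two hypothetical sensor streams: (timestamp_seconds, reading).
positions = [(0.2, "bedside"), (1.1, "bedside"), (2.4, "monitor")]
speech = [(0.5, True), (1.3, False), (2.1, True)]

def windowed(stream, width=1.0):
    # Bucket readings into fixed-width time windows so that
    # modalities sampled at different rates can be compared.
    buckets = {}
    for t, value in stream:
        buckets.setdefault(int(t // width), []).append(value)
    return buckets

# Fuse both modalities per window: a crude higher-order feature
# such as "talking while at the bedside".
pos_w, sp_w = windowed(positions), windowed(speech)
for w in sorted(set(pos_w) | set(sp_w)):
    print(w, pos_w.get(w, []), sp_w.get(w, []))
```

Qualitative, domain-specific labels (e.g. which phase of the simulation a window belongs to) are what turn such aligned windows into the interpretable visual proxies the paper describes.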
Panel held at LAK13: 3rd International Conference on Learning Analytics & Knowledge
http://simon.buckinghamshum.net/2013/03/lak13-edu-data-scientists-scarce-breed
Educational Data Scientists: A Scarce Breed
The Educational Data Scientist is currently a poorly understood, rarely sighted breed. Reports vary: some are known to be largely nocturnal, solitary creatures, while others have been reported to display highly social behaviour in broad daylight. What are their primary habits? How do they see the world? What ecological niches do they occupy now, and will predicted seismic shifts transform the landscape in their favour? What survival skills do they need when running into other breeds? Will their numbers grow, and how might they evolve? In this panel, the conference will not only hear and debate broad perspectives on the terrain, but will also be exposed to some real-life specimens, and catch glimpses of the future ecosystem.
Keynote Address, International Conference of the Learning Sciences, London Festival of Learning
Transitioning Education’s Knowledge Infrastructure:
Shaping Design or Shouting from the Touchline?
Abstract: Bit by bit, a data-intensive substrate for education is being designed, plumbed in and switched on, powered by digital data from an expanding sensor array, data science and artificial intelligence. The configurations of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards).
The idea that we may be transitioning into significantly new ways of knowing – about learning and learners – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. For instance, assuming that we want to shape this infrastructure, how do we engage with the teams designing the platforms our schools and universities may be using next year? Who owns the data and algorithms, and in what senses can an analytics/AI-powered learning system be ‘accountable’? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? If we want to work in “Pasteur’s Quadrant” (Donald Stokes), we must go beyond learning analytics that answer research questions, to deliver valued services to frontline educational users: but how are universities accelerating the analytics innovation to infrastructure transition?
Wrestling with these questions, the learning analytics community has evolved since its first international conference in 2011, at the intersection of learning and data science, and an explicit concern with those human factors, at many scales, that make or break the design and adoption of new educational tools. We are forging open source platforms, links with commercial providers, and collaborations with the diverse disciplines that feed into educational data science. In the context of ICLS, our dialogue with the learning sciences must continue to deepen to ensure that together we influence this knowledge infrastructure to advance the interests of all stakeholders, including learners, educators, researchers and leaders.
Speaking from the perspective of leading an institutional analytics innovation centre, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
Kirsty Kitto, Simon Buckingham Shum, and Andrew Gibson. (2018). Embracing Imperfection in Learning Analytics. In Proceedings of LAK18: International Conference on Learning Analytics and Knowledge, March 5–9, 2018, Sydney, NSW, Australia, pp.451-460. (ACM, New York, NY, USA). https://doi.org/10.1145/3170358.3170413
Open Access: http://simon.buckinghamshum.net/2018/01/embracing-imperfection-in-learning-analytics
Abstract: Learning Analytics (LA) sits at the confluence of many contributing disciplines, which brings the risk of hidden assumptions inherited from those fields. Here, we consider a hidden assumption derived from computer science, namely, that improving computational accuracy in classification is always a worthy goal. We demonstrate that this assumption is unlikely to hold in some important educational contexts, and argue that embracing computational “imperfection” can improve outcomes for those scenarios. Specifically, we show that learner-facing approaches aimed at “learning how to learn” require more holistic validation strategies. We consider what information must be provided in order to reasonably evaluate algorithmic tools in LA, to facilitate transparency and realistic performance comparisons.
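The paper's core point, that higher classification accuracy is not always a worthy goal, can be illustrated with a toy example (the labels and counts below are invented): a classifier that never flags the rare "reflective" category scores high accuracy yet gives a learner nothing to act on.

```python
# Toy labels: 1 = sentence is reflective (rare), 0 = not.
y_true = [1] * 10 + [0] * 90

# Classifier A: always predicts the majority class.
pred_a = [0] * 100
# Classifier B: imperfect, but actually detects reflection.
pred_b = [1] * 8 + [0] * 2 + [1] * 10 + [0] * 80

def accuracy(truth, pred):
    return sum(t == p for t, p in zip(truth, pred)) / len(truth)

def recall(truth, pred, cls=1):
    # Of the truly reflective sentences, how many were flagged?
    hits = sum(t == p == cls for t, p in zip(truth, pred))
    return hits / sum(t == cls for t in truth)

print(accuracy(y_true, pred_a), recall(y_true, pred_a))  # 0.9 0.0
print(accuracy(y_true, pred_b), recall(y_true, pred_b))  # 0.88 0.8
```

Classifier A "wins" on accuracy while being useless for learner-facing feedback, which is why the paper argues for more holistic validation strategies.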
Opening to the inaugural workshop on Learning Analytics in Schools held at LAK18: International Conference on Learning Analytics & Knowledge, Sydney. http://lak18.solaresearch.org
On moving from a theory to a learning analytics application
1. Simon Buckingham Shum
Connected Intelligence Centre • University of Technology Sydney
@sbuckshum • http://utscic.edu.au • http://Simon.BuckinghamShum.net
Panel: Exploring practical impacts of learning analytics on student learning and pedagogical design
http://itali.uq.edu.au/alasi2017
On moving from a theory to a learning analytics application:
Automated feedback on reflective writing as an example
3. Conceptual Framework from the scholarship of teaching and learning reflective writing
Elegant analytical framework, but too complex for the end-user: need a simpler user model…
Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C. and Knight, S. (2017). Reflective Writing Analytics for Actionable Feedback. Proceedings of LAK17: 7th International Conference on Learning Analytics & Knowledge, March 13-17, 2017, Vancouver, BC, Canada. ACM Press. DOI: http://dx.doi.org/10.1145/3027385.3027436
For details see https://utscic.edu.au/tools/awa (LAK17 paper)
7. Integrating AWA into meaningful assessment: guide students through a writing exercise workflow (AWA Tutor)
Shibani, A., Knight, S., Buckingham Shum, S., & Ryan, P. (2017). Design and Implementation of a Pedagogic Intervention Using Writing Analytics.
In W. Chen et al. (Eds.). Proceedings of the 25th International Conference on Computers in Education, New Zealand. http://antonetteshibani.com/research