Analytic and strategic challenges of serious games (David Gibson)
How higher education learning and teaching can learn from serious game developers. Keynote at the 5th annual SeGAH conference, held concurrently with WWW 2017 in Perth, Western Australia.
UT-Austin Guest Lecture, "Patterns and Outcomes of Youth Engagement in Colla..." (Rebecca Reynolds)
Reports results of a program of game design learning that explores how students use information resources to solve programming challenges. Middle- and high-school students take a game design class daily, for credit and a grade, for a full year, and use a learning management system stocked with information resources to support their programming and game design. Results highlight the types of inquiry they conduct, which strategies were more and less successful, and how their resource uses appear to connect to their learning outcomes. The results are discussed in relation to the overall landscape of educational technologies, considering the issue of structure.
How we deployed the Piwik web analytics system to handle a huge amount of unpredicted traffic, adding some cloud and modern scalability techniques. Files: https://github.com/lorieri/piwik-presentation
Piwik: a description of the software, why it is better than the competition, and screenshots.
Piwik is open-source (GPL-licensed) web analytics software. It gives interesting reports on your website visitors, your popular pages, the search engine keywords they used, the language they speak, and much more.
Piwik aims to be an open-source alternative to Google Analytics.
1. Open APIs
2. Plugins Architecture
3. Data abstraction layer
4. Customizable dashboard
5. Innovative User Interface
Privacy Regulations and Your Digital Setup (Piwik PRO)
How Will the New Privacy Regulations Affect Your Digital Set-up? In less than two years, Europe’s new data privacy law will come into effect, changing the way organizations handle their users’ information. The General Data Protection Regulation (GDPR) will heavily impact the use of digital tools for customer insights and analytics.
This presentation was created by the Piwik PRO Team for a webinar session with Aurelie Pols. Webinar recording is available on: https://youtu.be/dPOvbbZ3vdo
A Comparison of Analytics and Tag Management Suites by Piwik PRO and Google (Piwik PRO)
A combination of Google Tag Manager and Google Analytics has become the standard in the everyday activities of a data-driven marketer. But what if you’re looking for a more secure and privacy-compliant alternative? Welcome to Piwik PRO Tag Manager, which comes with built-in templates and integrations for Piwik along with a range of popular marketing and web analytics tags. Self-hosted and robust, it complies with the strictest privacy laws. See how the Piwik PRO analytics and tag management suite compares to Google and how you can make it work for your business.
JavaScript Tracking or Web Log Analytics? (Piwik PRO)
The JavaScript tracking method is used to collect data by the majority of analytics solutions today. It is valued for delivering high-quality insights, but also for extensive customization capabilities. However, for a variety of reasons, JavaScript tracking may not be the right fit for everyone. The most popular alternative is web log analytics, which lets you collect web server log files and import them into your platform for viewing, analyzing, and reporting. Log analytics and JavaScript tracking differ considerably in how data is processed and stored. See how they compare and choose the right tracking method for your organization.
Learning analytics and how to use it in educational or serious games to improve how the games are used
game traces
evidence-based education
Talk at the École Normale Supérieure de Lyon, France
The aim of our study is to extract profiles of students’ activities, performed during the training sessions of a course on logic networks, and to relate those activities to the students’ performance on intermediate verification tests. In this course, undergraduate students learn and practice the concepts of logic networks with the Deeds simulator.
Deeds, which stands for "Digital Electronics Education and Design Suite", is a set of educational tools for digital electronics. It is used in Electronic Engineering courses at DITEN, UNIGE.
By applying learning analytics methods to the data captured from activity logs and questionnaires, we aim to understand the learning behavior of students.
This project was presented at Learning Analytics Data Sharing – LADS14 Workshop at EC-TEL.
Education must capitalize on the trend within technology toward big data, as new types of data become available. From evidence-centered approaches to xAPI and the whole Training and Learning Architecture (TLA), big data is the foundation of it all.
Gaming Learning Analytics: using data to improve the applicability of serious games
This talk will present how to use a data-driven approach to improve how serious games are applied in different domains, and how the H2020 RAGE project is trying to simplify these analytics processes by creating an open software infrastructure.
[DSC Europe 22] Machine learning algorithms as tools for student success pred... (DataScienceConferenc1)
The goal of higher education institutions is to provide quality education to students. Predicting academic success, and intervening early to help at-risk students, is an important task for this purpose. This talk explores the possibilities of applying machine learning to develop predictive models of academic performance. What factors lead to success at university? Are there differences between students of different generations? Answers are given by applying machine learning algorithms to a data set of 400 students across three generations of IT studies. The results show differences between students with regard to student responsibility and regularity of class attendance, and great potential for applying machine learning in developing predictive models.
Presentation to BILETA 2017, Universidade do Minho, co-authored with Dirk Rodenburg, Queen's University, Ontario, and Robert Clapperton, Ametros Learning.
Applying participatory learning to STEM
E. Shaw, M. La, R. Phillips, and E. Reilly, “PLAY Minecraft! Assessing Secondary Engineering Education using Game Challenges within a Participatory Learning Environment,” in Proceedings of the 2014 ASEE Annual Conference, Indianapolis, IN, June 2014, Session W447.
At a recent "capstone" convening, our Wave II grantees shared updates on their projects through descriptive posters. Click through the slides to see snapshots of their work developing and scaling interactive learning modules aligned to the Common Core. For more about Wave II, visit: http://www.nextgenlearning.org/wave-ii
On Information Quality: Can Your Data Do The Job? (SCECR 2015 Keynote) (Galit Shmueli)
Slides by Galit Shmueli for keynote presentation at 2015 Statistical Challenges in eCommerce Research (SCECR) symposium, Addis Ababa, Ethiopia (www.scecr.org)
Designing a Survey Study to Measure the Diversity of Digital Learners (IJITE)
This article describes the design of a quantitative study that aims to gather empirical data on the different types of digital learners in a student population, inclusive of the elusive digital natives who purportedly exist in settings laden with digital technology. The design of this study revolves around the impetus for mapping the diversity of digital learners, followed by elucidations of the research design and methods to be employed, the accompanying data analysis, ethical considerations, and an elaboration of the measures taken to ensure validity and reliability.
3. KOM – Multimedia Communications Lab
What is Learning Analytics?
http://edtechreview.in/event/87-webinar/835-can-learning-analytics-enable-personalized-learning
“Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.” (George Siemens, 2011)
4. Motivation
http://www.openequalfree.org/gamification-versus-game-based-learning-in-the-classroom/10082
Why Learning Analytics for Serious Games?
•Evaluation of Serious Games
  •Justifying the expense in learning contexts
  •An objective and cost-effective approach
•Evaluation with Serious Games
  •Games provide a large amount of gameplay data
  •Their interactive and engaging nature enables stealth assessment
  •Enable insight into learner attributes and learning progress
6. Modelling for Learning Analytics in SG
https://www.linkedin.com/pulse/article/20140320222540-1265384-show-what-you-know-the-future-of-competency-based-learning
•Competence-Based Knowledge Space Theory (CbKST)
  •Requires the learning domain to be modelled as a prerequisite competency structure
  •Knowledge states are inferred from this structure
•Narrative Game-Based Learning Objects (NGLOB)
  •Additionally consider player type and narrative aspects
  •Triple vector: narrative context, gaming context, learning context
•Evidence-Centered Design (ECD)
  •Competency model, evidence model and action model
•Open Learner Model (OLM)
  •Presents the learner with an understandable visualization of their current knowledge state
  •Proven to improve learning outcomes
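The prerequisite-structure idea behind CbKST can be sketched in a few lines: the feasible knowledge states are exactly the subsets of competencies that are closed under prerequisites. The domain below (a tiny digital-electronics example) and its competency names are invented for illustration, not taken from any particular game or course.

```python
from itertools import combinations

# Illustrative competencies and prerequisite pairs (a, b): mastering b requires a.
COMPETENCIES = ["AND-gate", "OR-gate", "half-adder", "full-adder"]
PREREQS = [("AND-gate", "half-adder"), ("OR-gate", "half-adder"),
           ("half-adder", "full-adder")]

def is_feasible(state):
    """A knowledge state is feasible if it is closed under prerequisites."""
    return all(a in state for a, b in PREREQS if b in state)

def feasible_states():
    """Enumerate all subsets of competencies that respect the prerequisites."""
    states = []
    for r in range(len(COMPETENCIES) + 1):
        for subset in combinations(COMPETENCIES, r):
            if is_feasible(set(subset)):
                states.append(frozenset(subset))
    return states

states = feasible_states()
print(len(states))  # 6
```

With these three prerequisite pairs, only 6 of the 16 possible subsets are feasible knowledge states; this pruning is what makes inferring a learner's state from observed behavior tractable.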
8. Choosing Data for Learning Analytics in SG
Depends on learning goals, setting, tasks, game genre, mechanics and platform
•Intensive vs. Extensive Data
  •Extensive data: higher quantity
  •Intensive data: higher quality
•Single-Player vs. Multiplayer
  •Multiplayer adds a social component
  •Data can be fed into social network analysis to identify aspects of collaborative learning
•Generic vs. Game-Specific Traces
  •Generic:
    •Identify strengths and weaknesses of learning games
    •Compare different learning games
  •Game-specific:
    •Designing games “with analytics in mind”
    •More tailored to individual games
StoryPlay Learning Analytics Tool
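The generic-vs-game-specific distinction can be illustrated with a small sketch: a game-specific event is mapped onto a generic actor/verb/object trace (loosely modelled on the shape of xAPI statements) so that different learning games can share one analytics pipeline. All field and event names here are assumptions for illustration, not from any specific game or standard profile.

```python
# A game-specific event as a game might log it internally (field names assumed).
game_event = {"player": "p42", "action": "placed_gate", "gate": "AND",
              "level": 3, "t": 152.8}

def to_generic_trace(event):
    """Map a game-specific event to a generic actor/verb/object trace."""
    return {
        "actor": event["player"],
        "verb": "interacted",
        "object": f"level-{event['level']}/{event['action']}",
        "timestamp": event["t"],
        # Game-specific detail survives in an extensions bag.
        "extensions": {k: v for k, v in event.items()
                       if k not in ("player", "action", "level", "t")},
    }

trace = to_generic_trace(game_event)
print(trace["object"])  # level-3/placed_gate
```

The generic part of the trace supports comparison across games, while the extensions keep the game-specific detail needed for tailored analyses.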
10. Capturing Data for Learning Analytics in SG
Depends on data modalities and interactions
•Activity logs
  •Widely employed
  •Record interaction data in the form of log files
•Multimodal Learning Analytics
  •Includes biometric and other multimodal data for assessing motivation, fun and collaboration aspects in learning settings
  •Introduces its own challenges for aligning data
•Mobile and Ubiquitous Learning Analytics
  •Data from mobile learners, suitable for mobile games
  •Interaction with mobile devices
  •Considers contextual information
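A minimal sketch of the activity-log approach above: each interaction is appended as one JSON object per line, tagged with a session identifier so that logs from many machines can be joined later. The class name, file name, and logged actions are illustrative assumptions, not taken from any specific tool.

```python
import json
import time
import uuid

class ActivityLogger:
    """Minimal activity logger: one JSON object per line, tagged with a
    session identifier for later aggregation across machines."""

    def __init__(self, path):
        self.session = str(uuid.uuid4())
        self.file = open(path, "a", encoding="utf-8")

    def log(self, actor, action, **details):
        entry = {"session": self.session, "ts": time.time(),
                 "actor": actor, "action": action, "details": details}
        self.file.write(json.dumps(entry) + "\n")
        self.file.flush()  # flush so a crash loses at most the current event

logger = ActivityLogger("gameplay.log")
logger.log("p42", "start_level", level=3)
logger.log("p42", "wrong_answer", question="q7")
```

The JSON-lines layout keeps the log append-only and trivially parseable, which matters when events stream in during play.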
12. Aggregating Data for Learning Analytics in SG
Depends on data sources and sample size
•Extensive data aggregation across users
  •Log data is joined into a central database after preprocessing, using session identifiers
  •Log files generated on all machines should use the same data format
  •Need for standardized XML formats
  •“Aggregation model”: semantic rules map game actions or states to meaningful expressions under which similar events are grouped
•Intensive data aggregation across modalities
  •Multimodal data synchronization is needed to observe behavior across multimodal data channels
  •Some tools exist: Replayer, ChronoViz
ChronoViz.com
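The extensive-aggregation step can be sketched as follows: raw entries are grouped by session identifier, and an "aggregation model" of semantic rules maps raw game actions to meaningful groups. The action names and rules below are invented for illustration.

```python
from collections import defaultdict

# Raw log entries from several machines; the session id is the join key.
logs = [
    {"session": "s1", "action": "place_AND_gate"},
    {"session": "s1", "action": "place_OR_gate"},
    {"session": "s2", "action": "delete_wire"},
    {"session": "s2", "action": "place_AND_gate"},
]

# Aggregation model: semantic rules grouping similar raw actions.
RULES = [
    (lambda a: a.startswith("place_"), "construction"),
    (lambda a: a.startswith("delete_"), "revision"),
]

def semantic_label(action):
    for predicate, label in RULES:
        if predicate(action):
            return label
    return "other"

# Join entries by session and count semantic event groups per session.
per_session = defaultdict(lambda: defaultdict(int))
for entry in logs:
    per_session[entry["session"]][semantic_label(entry["action"])] += 1

print(dict(per_session["s1"]))  # {'construction': 2}
```

Grouping under semantic labels rather than raw action strings is what makes sessions from different games, or different builds of the same game, comparable.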
14. Analyzing Data for Learning Analytics in SG
Depends on learning context and application
•By the instructor
  •This step is not done by the system; the instructor intervenes according to visualized statistics
•Automatic analysis
  •For intelligent tutoring systems and adaptive Serious Games
  •Measures to be derived:
    •Gaming: general in-game performance, in-game learning, in-game strategies, player type
    •Learning: general traits and abilities of the learner, general knowledge, situation-specific state, learning behaviors, learning outcomes
  •Rules govern the interpretation of in-game sources of evidence to infer competencies
  •Algorithms are applied during learning sessions to update competency models
  •Data mining and machine learning approaches can be used to identify solution strategies, error patterns and player goals
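A toy sketch of the rule-based interpretation step above: each in-game observation nudges a competency estimate up or down. The competency name, event names, and weights are invented for illustration; a real evidence model (e.g. in ECD or CbKST) would be derived from the domain model, not hand-picked like this.

```python
# Illustrative evidence rules: observation -> (competency, estimate delta).
EVIDENCE_RULES = {
    "solved_without_hint": ("circuit-design", +0.2),
    "solved_with_hint":    ("circuit-design", +0.1),
    "wrong_wiring":        ("circuit-design", -0.1),
}

def update_competencies(observations, estimates=None):
    """Apply evidence rules during a session to update competency estimates."""
    estimates = dict(estimates or {})
    for obs in observations:
        if obs in EVIDENCE_RULES:
            comp, delta = EVIDENCE_RULES[obs]
            # Start unknown competencies at 0.5 and clamp to [0, 1].
            estimates[comp] = min(1.0, max(0.0, estimates.get(comp, 0.5) + delta))
    return estimates

session = ["wrong_wiring", "solved_with_hint", "solved_without_hint"]
result = update_competencies(session)
print(result)  # competency estimate ends near 0.7
```

Even this crude update rule shows the shape of the pipeline: evidence rules turn raw events into competency deltas, and the running estimate is what adaptation later consumes.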
16. Deploying Results for Learning Analytics in SG
Depends on learning context and application
•Visualization
  •Visualizations of narrative structure, player model and skill tree
  •Graphs, Hasse diagrams, heat maps
  •For games, a special need for real-time operation, extensibility and interoperability
•Adaptation
  •Macro-adaptivity: the system responds by choosing the appropriate next learning object or narrative event
  •Micro-adaptivity: adjusting aspects within a learning task, such as task difficulty or feedback type
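Macro-adaptivity as described above can be sketched as a selection rule: among learning objects whose prerequisites are already mastered, pick the one whose difficulty best matches the learner's current competency estimate. The learning objects, difficulty values, and matching rule below are illustrative assumptions.

```python
# Illustrative learning objects with prerequisites and difficulty in [0, 1].
LEARNING_OBJECTS = [
    {"id": "intro-gates", "prereqs": [],              "difficulty": 0.2},
    {"id": "half-adder",  "prereqs": ["intro-gates"], "difficulty": 0.5},
    {"id": "full-adder",  "prereqs": ["half-adder"],  "difficulty": 0.8},
]

def next_learning_object(mastered, competency):
    """Macro-adaptive step: choose the next learning object, or None if done."""
    candidates = [lo for lo in LEARNING_OBJECTS
                  if lo["id"] not in mastered
                  and all(p in mastered for p in lo["prereqs"])]
    if not candidates:
        return None
    # Prefer the candidate whose difficulty is closest to current competency.
    return min(candidates, key=lambda lo: abs(lo["difficulty"] - competency))

print(next_learning_object({"intro-gates"}, 0.45)["id"])  # half-adder
```

Micro-adaptivity would instead adjust parameters inside the chosen object (hint frequency, task difficulty) using the same competency estimate.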
17. Popular Analytics Tools
Piwik
Google Analytics
OpenSim Analytics for Virtual Worlds
19. KOM – Multimedia Communications Lab
19
References
•N. R. Aljohani and H. C. Davis. Learning analytics in mobile and ubiquitous learning environments. In 11th World Conference on Mobile and Contextual Learning: mLearn 2012, Helsinki, Finland, 2012.
•R. S. J. D. Baker and K. Yacef. The State of Educational Data Mining in 2009: A Review and Future Visions. 1(1):3-16, 2009.
•P. Blikstein. Multimodal learning analytics. Proceedings of the Third International Conference on Learning Analytics and Knowledge - LAK '13, page 102, 2013.
•S. Bull, Y. C. Y. Cui, A. T. McEvoy, E. Reid, and W. Y. W. Yang. Roles for mobile learner models. The 2nd IEEE International Workshop on Wireless and Mobile Technologies in Education, 2004. Proceedings., pages 124-128, 2004.
•S. Bull and J. Kay. Open learner models. In Advances in Intelligent Tutoring Systems, pages 301-322. Springer, 2010.
•G. K. Chung and D. S. Kerr. A Primer on Data Logging to Support Extraction of Meaningful Information from Educational Games: An Example from Save Patch. CRESST Report 814. National Center for Research on Evaluation, Standards, and Student Testing (CRESST), page 27, 2012.
•A. Cooper. Learning Analytics Interoperability – The Big Picture In Brief. Learning Analytics Community Exchange, 2014.
•E. Duval. Attention Please!: Learning Analytics for Visualization and Recommendation. LAK '11, pages 9-17, New York, NY, USA, 2011. ACM.
•A. Dyckhoff and D. Zielke. Design and Implementation of a Learning Analytics Toolkit for Teachers. Journal of . . . , 15:58-76, 2012.
•G. Dyke, K. Lund, and J.-J. Girardot. Tatiana: An Environment to Support the CSCL Analysis Process. In Proceedings of the 9th International Conference on Computer Supported Collaborative Learning - Volume 1, CSCL'09, pages 58-67. International Society of the Learning Sciences, 2009.
•E. Y. Ha, J. Rowe, B. Mott, and J. Lester. Recognizing Player Goals in Open-Ended Digital Games with Markov Logic Networks. Plan, Activity and Intent Recognition: Theory and Practice, 2014.
•J.-C. Falmagne, D. Albert, C. Doble, D. Eppstein, and X. Hu. Knowledge Spaces: Applications in Education. Springer Science & Business, 2013.
•I. Garcia, A. Duran, and M. Castro. Comparing the effectiveness of evaluating practical capabilities through hands-on on-line exercises versus conventional methods. In Frontiers in Education Conference, 2008. FIE 2008. 38th Annual, pages F4H-18-F4H-22, Oct. 2008.
•S. Goebel, M. Gutjahr, and S. Hardy. Evaluation of Serious Games. Serious Games and Virtual Worlds in Education, Professional Development, and Healthcare, page 105, 2013.
•S. Goebel, V. Wendel, C. Ritter, and R. Steinmetz. Personalized, adaptive digital educational games using narrative game-based learning objects. pages 438-445. Springer, 2010.
•J. Plass, B. Homer, et al. Assessment mechanics: design of learning activities that produce meaningful data. New York University, CREATE Lab, Apr. 2013.
•J. Grafsgaard, J. Wiggins, K. E. Boyer, E. Wiebe, and J. Lester. Predicting Learning and Affect from Multimodal Data Streams in Task-Oriented Tutorial Dialogue. In Seventh International Conference on Educational Data Mining, 2014.
•D. S. Kerr and G. K. W. K. Chung. Using Cluster Analysis to Extend Usability Testing to Instructional Content. CRESST Report 816. National Center for Research on Evaluation, Standards, and Student Testing (CRESST), May 2012.
•M. D. Kickmeier-Rust and D. Albert. An Alien's Guide to Multi-Adaptive Educational Computer Games. Informing Science, 2012.
•M. D. Kickmeier-Rust and D. Albert. Learning analytics to support the use of virtual worlds in the classroom. In Human-Computer Interaction and Knowledge Discovery in Complex, Unstructured, Big Data, pages 358-365. Springer, 2013.
•K. R. Koedinger, R. Baker, K. Cunningham, A. Skogsholm, B. Leber, and J. Stamper. A data repository for the EDM community: The PSLC DataShop. Handbook of educational data mining, pages 43-55, 2010.
•J. Konert, K. Richter, F. Mehm, S. Göbel, R. Bruder, and R. Steinmetz. Pedale - a peer education diagnostic and learning environment. Journal of Educational Technology & Society, 2012.
•K. Korossy. Modeling knowledge as competence and performance. Knowledge spaces: Theories, empirical research, and applications, pages 103-132, 1999.
•J. Lester, B. Mott, J. Rowe, and J. Sabourin. Learner modeling to predict real-time affect in serious games. Design Recommendations for Intelligent Tutoring Systems, page 201, 2013.
•S. M. Letourneau. Intensive data at CREATE - Research approaches & future directions. In Simulations and Games Learning Analytics and Data Mining Workshop, 2013.
•R. Levy. Dynamic Bayesian Network Modeling of Game Based Diagnostic Assessments. CRESST Report, 2014.
•C. Martin, O. Galibert, M. Michel, F. Mougin, and V. Stanford. NIST Smart Flow System User's guide.
•M. Minovic and M. Milovanovic. Real-time learning analytics in educational games. pages 245-251. ACM, 2013.
•R. J. Mislevy, R. G. Almond, and J. F. Lukas. A brief introduction to evidence-centered design. CSE Report 632. US Department of Education, 2004.
•A. Mitrovic and B. Martin. Evaluating the effect of open student models on self-assessment. International Journal of Artificial Intelligence in Education, 17(2):121-144, 2007.
•A. Morrison, P. Tennent, and M. Chalmers. Coordinated visualisation of video and system log data. In Coordinated and Multiple Views in Exploratory Visualization, 2006. Proceedings. International Conference on, pages 91-102. IEEE, 2006.
•K. Nurmela, E. Lehtinen, and T. Palonen. Evaluating CSCL log files by social network analysis. In Proceedings of the 1999 conference on Computer support for collaborative learning, page 54. International Society of the Learning Sciences, 1999.
•H. Ogata, M. Li, B. Hou, N. Uosaki, M. M. El-Bishouty, and Y. Yano. SCROLL: Supporting to share and reuse ubiquitous learning log in the context of language learning. Research & Practice in Technology Enhanced Learning, 6(2), 2011.
•S. Oviatt, A. Cohen, and N. Weibel. Multimodal learning analytics: Description of math data corpus for ICMI grand challenge workshop. In Proceedings of the 15th ACM on International conference on multimodal interaction, pages 563-568. ACM, 2013.
•J. L. Plass, B. D. Homer, C. K. Kinzer, Y. K. Chang, J. Frye, W. Kaczetow, K. Isbister, and K. Perlin. Metrics in Simulations and Games for Learning. In Game Analytics, pages 697-729. Springer, 2013.
•M. Prensky. Computer games and learning: Digital game-based learning. Handbook of computer game studies, 18:97-122, 2005.
•C. Reuter, F. Mehm, S. Göbel, and R. Steinmetz. Evaluation of Adaptive Serious Games using Playtraces and Aggregated Play Data. Proceedings of the 7th European Conference on Game Based Learning (ECGBL) 2013, (October):504-511, 2013.
•A. Serrano, E. J. Marchiori, A. del Blanco, J. Torrente, and B. Fernandez-Manjon. A framework to improve evaluation in educational games. In Global Engineering Education Conference (EDUCON), 2012 IEEE, pages 1-8. IEEE, 2012.
•A. Serrano-Laguna, J. Torrente, P. Moreno-Ger, and B. Fernandez-Manjon. Application of learning analytics in educational videogames. Entertainment Computing, 2014.
•V. J. Shute, M. Ventura, M. Bauer, and D. Zapata-Rivera. Melding the power of serious games and embedded assessment to monitor and foster learning. Serious games: Mechanisms and effects, pages 295-321, 2009.
•G. Siemens and R. S. J. Baker. Learning Analytics and Educational Data Mining: Towards Communication and Collaboration. 2010.
•K. K. Tatsuoka. Architecture of knowledge structures and cognitive diagnosis: A statistical pattern recognition and classification approach. Cognitively diagnostic assessment, pages 327-359, 1995.
•T. P. Vendlinski, G. C. Delacruz, R. E. Buschang, G. Chung, and E. L. Baker. Developing high-quality assessments that align with instructional video games (CRESST Report 774). National Center for Research on Evaluation, Standards, and Student Testing, 2010.
•A. V. Wallop. CS 229 Final Project Report: Learning to Predict Disengagement in Educational iPad Application. pages 1-5, 2010.
•J.-D. Zapata-Rivera and J. E. Greer. Interacting with inspectable bayesian student models. International Journal of Artificial Intelligence in Education, 14(2):127-163, 2004.