Presentation given by Jackie Cohen on April 24, 2014 at the Open CourseWare Consortium Global Conference in Slovenia.
Abstract - http://conference.ocwconsortium.org/2014/ai1ec_event/a-framework-to-integrate-public-dynamic-metrics-into-an-oer-platform-jaclyn-cohen-and-kathleen-omollo/?instance_id=542
Paper - http://openpraxis.org/index.php/OpenPraxis/article/view/118 or http://deepblue.lib.umich.edu/handle/2027.42/106587
2. the dynamic metrics and analytics project
...was initiated with the goals of
publicly sharing Open.Michigan usage data,
identifying interesting patterns in OER use,
and updating displays of that data dynamically.
4. a repository and referatory
we host and help publish OER
(http://open.umich.edu)
we provide links to third-party platforms
and services which host relevant OER
5. three guides
● goals of sharing: how can we open data easily?
● user research: what do users want?
● technical design: what can/should we do
with what we have?
6. interesting corollary questions
● how to share usage data? what context is
needed?
● what does "use" mean for us?
● how can we share metrics in a
sustainable and useful way?
7. how do we organize OER?
Hierarchical system,
Drupal-based
(https://open.umich.edu/wiki/OERbit)
Image source: Open.Michigan Initiative, CC-BY
8. how do we organize OER?
A hierarchy of units, courses, resources...
Image source: Open.Michigan Initiative medical collection, open.umich.edu/education/med, CC BY 4.0, http://creativecommons.org/licenses/by/4.0
10. how do we measure content use?
Google Analytics, third-party-platform
analytics interfaces, access to APIs...
Image source: Google Analytics, charts
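Lifetime totals for a resource come from combining counts that arrive from several places (a Google Analytics export, a third-party platform's report). As an illustration only — the resource IDs, source names, and function below are hypothetical, not the project's actual code — merging per-source counts into lifetime totals could look like:

```python
from collections import Counter

def merge_usage_counts(*sources):
    """Merge per-resource view counts from multiple analytics sources
    into lifetime totals. Each source is a dict of resource_id -> views.
    """
    totals = Counter()
    for source in sources:
        totals.update(source)  # Counter.update adds counts, not replaces
    return dict(totals)

# Hypothetical per-source exports for two OER resources
ga = {"med/anatomy-101": 1200, "med/histology": 430}
slideshare = {"med/anatomy-101": 310}

print(merge_usage_counts(ga, slideshare))
# {'med/anatomy-101': 1510, 'med/histology': 430}
```

A plain summing merge like this is the simplest sustainable option: adding a new data source means adding one more dict, with no change to the display layer.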
11. together: technical architecture
… informs information architecture.
(e.g. our hierarchical site structure
allowed us to easily answer some
individual questions with the first
phase release of this project.)
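The point above — that a hierarchical site structure makes some metrics questions easy — can be sketched: if every resource is keyed by its unit and course, per-course and per-unit totals fall out of a simple rollup. The data shapes and names here are hypothetical, not the project's actual schema.

```python
from collections import defaultdict

# Hypothetical per-resource pageviews, keyed by (unit, course, resource)
resource_views = {
    ("med", "anatomy-101", "lecture-1.pdf"): 800,
    ("med", "anatomy-101", "lecture-2.pdf"): 400,
    ("med", "histology", "slides.pdf"): 430,
}

def rollup(views):
    """Aggregate resource-level counts up the unit > course > resource
    hierarchy, returning (per-course totals, per-unit totals)."""
    course_totals = defaultdict(int)
    unit_totals = defaultdict(int)
    for (unit, course, _resource), n in views.items():
        course_totals[(unit, course)] += n
        unit_totals[unit] += n
    return dict(course_totals), dict(unit_totals)

courses, units = rollup(resource_views)
print(units)  # {'med': 1630}
```

Because the aggregation keys mirror the site hierarchy, "how many views did this course get?" is one dictionary lookup rather than a bespoke query.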
13. user research
● changes in action/use over time
● visual clarity
● ease of access
● availability of totals (lifetime numbers)
● evidence of reach
● evidence of use
15. overall
our user research suggests there is
qualitatively measurable value
in providing individual metrics
in clear ways
16. more conclusions
● tech architecture informs information architecture
● metrics can be used to justify costs of OER
production, and serve to strengthen relationships
● user research may lead to "better" analytics
● more to come!
17. so not only is data exciting
...it also helps support
wider sharing of OER.
18. future project endeavors
● more data sharing: SlideShare, Amara.org
● continual development of visuals
● movement toward increased analytics
● increased data export options
20. thanks!
to all who provided assistance with and support for this
project, especially:
Kevin Coffman, Pieter Kleymeer, Susan Topol, Trisha Paul, Bob
Riddle, Stephanie Dascola, Margaret Ann Murphy, Emily Puckett
Rodgers, Pierre Clement, Michael Hess, Karen Kost;
all interviewees and Open.Michigan Initiative collaborators