How to Curate: Putting Curation into Practice for L&D (David Kelly)
These slides are used to support a talk exploring curation. Here's a sample description of such a talk:
Curation is a term that is becoming more and more common in the learning industry. Unfortunately, most learning professionals do not understand what it is, let alone how to leverage it in their organizations. And yet, in an age of exponentially increasing information, the need for quality curation has never been greater. During this session we'll explore how to put curation into practice in your organization. We will discuss the key value-adds that are critical in quality curation, and where they fit into the curation workflow. We'll also explore examples of organizations and individuals that have used curation for learning, and some of the tools that are used for curation purposes.
Creating a Data-Driven Organization, Crunchconf, October 2015 (Carl Anderson)
What does it mean for an organization to be data-driven? How does an organization get there? Many organizations think that they are data-driven, but the reality is that few genuinely are and that we could all do better. In this talk, I cover what it truly means to be data-driven. The answer, it turns out, is not about the latest tools and technologies (although they can help) but about having an appropriate data culture that spans the whole organization, where data is broadly accessible, embedded into operations and processes, and enables effective decision making. In this presentation, I dissect what an effective data-driven culture entails, covering facets such as data leadership, data literacy, and A/B testing, illustrating concepts with examples from different industries as well as personal experience.
Ensuring Data Quality in My Business Intelligence Environment (Mary Arcia)
Data quality assurance is the process that demands the most time, people, and money in our BI projects. Given the key effect that business intelligence has on decision making, data quality cannot be treated as an afterthought. In this session we will see how, by following a data quality methodology, Microsoft SQL Server's Data Quality Services can help us save time and guarantee sound, correct data for our BI systems.
Data Quality Strategy: A Step-by-Step Approach (FindWhitePapers)
Learn about the importance of having a data quality strategy and setting overall goals. The six factors of data are also explained in detail, along with how to tie them together for implementation.
The New Model of Moves Management for Effective Fundraising (Orankashaw)
Moves management focuses on using targeted efforts to shift influential donors from passive, one-time contributors to active participants in the organization.
David Dunlop of Cornell University, who developed Moves Management, describes it as "changing people's attitudes so they want to give."
Learn more about how non-profit organizations and charities can nurture long-term relationships with their key influencers by viewing the slideshow or visiting http://fundraising.avectra.com/solutions/moves-management.php.
In this lecture we discuss data quality, and data quality in Linked Data in particular. This 50-minute lecture was given to master's students at Trinity College Dublin (Ireland) and covered the following:
1) Defining Quality
2) Defining Data Quality - What, Why, Costs
3) Identifying problems early - using a simple semantic publishing process as an example
4) Assessing Linked (big) Data quality
5) Quality of LOD cloud datasets
References can be found at the end of the slides
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA 4.0) International License.
Data Catalogs Are the Answer – What is the Question? (DATAVERSITY)
Organizations with governed metadata made available through their data catalog can answer questions their people have about the organization’s data. These organizations get more value from their data, protect their data better, gain improved ROI from data-centric projects and programs, and have more confidence in their most strategic data.
Join Bob Seiner for this lively webinar where he will talk about the value of a data catalog and how to build the use of the catalog into your stewards’ daily routines. Bob will share how the tool must be positioned for success and viewed as a must-have resource that is a steppingstone and catalyst to governed data across the organization.
Keys to Creating an Analytics-Driven Culture (DATAVERSITY)
Changing company culture takes time, energy and focus, as well as consistent reinforcement long after the breakrooms’ company culture posters start to fade. Creating an analytics-driven culture may be even harder to grow and sustain. Yet the rewards are vast for companies whose culture embodies an analytics-first mindset – and for those who use the derived insights to improve operational efficiency and decision-making, generate new revenue and prevent risk and fraud.
This webinar will offer advice and real-world examples on how to:
- Develop and utilize an analytics-focused vision statement
- Engage senior leaders to support analytics as a business problem-solver
- Communicate best practices to engage participants in the culture change
- Use tried-and-tested best practices and approaches to build an analytics-driven culture
Building an Analytics CoE (Center of Excellence) (Rahul Saxena)
This deck is from a workshop I conducted at the Indian Institute of Management, Bangalore (IIMB) on 20th July, 2013.
Agenda:
* What does the organization want to do with analytics? What is the role of the CoE that they envision?
* What is the organizational context? Current providers of analytics? Leadership support?
* What will the Analytics CoE need to be like (now and in the future, up to the planning horizon)?
* Where do we stand with analytics capabilities now, compared to what we need?
* How will we evolve the CoE? Set expectations, drive the evolution, establish the value.
We have heard about Industry 4.0 and how new technologies are changing the industry landscape. In this talk, I shared my perspective on a Continuous QMS that uses new tools and technologies related to Big Data, Cloud, AI/ML, and others to constantly improve the Quality Management System (QMS). A QMS is not just a process asset library in a central place. It is about a learning organization that constantly improves its processes, people, and technology, and thereby its products, platforms, and services. Quality by Discovery is about collecting real-time information from various data sources across the organization and being able to make quick decisions.
Data-centric design and the knowledge graph (Alan Morrison)
The #knowledgegraph (smart data that can describe your business and its domains) is now eating software. We won't be able to scale AI or other emerging tech without knowledge graphs, because those techs all require a transformed data foundation, large-scale integration, and shared data infrastructure.
Key to knowledge graphs are #semantics, #graphdatabase technology and a Tinker Toy-style approach to adding the missing verbs (which provide connections and context) back into your data. A knowledge graph foundation provides a means of contextualizing business domains, your content and other data, for #AI at scale.
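The "missing verbs" idea can be made concrete with a toy sketch (not any specific product or vocabulary; all entity and predicate names here are made up): a knowledge graph is a set of subject-predicate-object triples, where the predicates are the verbs that give data its connections and context.

```python
# Toy knowledge graph as subject-predicate-object triples.
# The predicates ("sells", "madeIn", ...) are the verbs that
# connect and contextualize entities.
triples = [
    ("AcmeCorp", "sells", "WidgetA"),
    ("WidgetA", "madeIn", "PlantX"),
    ("PlantX", "locatedIn", "Chicago"),
    ("AcmeCorp", "headquarteredIn", "Chicago"),
]

def neighbors(entity, graph):
    """Return every (predicate, object) pair linked from an entity."""
    return [(p, o) for s, p, o in graph if s == entity]

def path_exists(start, end, graph, seen=None):
    """Follow predicates outward to see whether two entities are connected."""
    seen = seen or set()
    if start == end:
        return True
    seen.add(start)
    return any(
        o not in seen and path_exists(o, end, graph, seen)
        for _, o in neighbors(start, graph)
    )

print(neighbors("AcmeCorp", triples))
print(path_exists("AcmeCorp", "Chicago", triples))
```

Even this tiny graph shows the point: because the verbs are explicit, questions about connections and context become simple traversals rather than bespoke joins.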
This is from a talk I gave at the Data Centric Design for SMART DATA & CONTENT Enthusiasts meetup on July 31, 2019 at PwC Chicago. Thanks to Mary Yurkovic and Matt Turner for a very fun event!
Data centric business and knowledge graph trends (Alan Morrison)
The deck for my kickoff keynote at the Data-Centric Architecture Forum, February 3, 2020. Includes related data, content, and architecture definitions and fundamental explanations, knowledge graph trends, market outlook, transformation case studies and benefits of large-scale, cross-boundary integration/interoperation.
Customer-centricity is the new imperative, but most organizations are not prepared to transform the way they work to deliver a relevant, personalized customer experience at scale. Designed for those who have been exposed to Journey Mapping, this interactive workshop will share Accenture’s Customer Journey Management framework for guiding the omni-channel customer experience with agility and at scale. During the session you will assess your organization’s design, governance and operating model dimensions to identify capability gaps in delivering on your vision of customer-centricity.
In a working session you will prioritize the gaps in your organization’s capabilities to implement the Customer Journey Management framework. The workshop will help you visualize how to manage the dramatic increase in data, segments, content, collaboration, and compliance that come with high-fidelity journey mapping and omni-channel marketing. We will discuss your specific challenges, as well as real world examples of operating model innovations from companies across industries and levels of maturity. This session will help you prepare your company to identify and respond to customer experience opportunities with new levels of agility and scale.
A framework that discusses the various elements of Data Monetization that organizations could leverage to improve their Information Management journey.
Creating a clearly articulated data strategy—a roadmap of technology-driven capability investments prioritized to deliver value—helps ensure from the get-go that you are focusing on the right things, so that your work with data has a business impact. In this presentation, the experts at Silicon Valley Data Science share their approach for crafting an actionable and flexible data strategy to maximize business value.
Master Data Management's Place in the Data Governance Landscape (CCG)
For many organizations, Master Data Management is a necessity to ensure consistency and accuracy of essential business entities. It further plays alongside data architecture, metadata management, data quality, security & privacy, and program management in the Data Governance ecosystem.
Join CCG's data governance subject matter experts as they present an overview of the fundamentals of Master Data Management at our Atlanta-based Data Analytics Meetup. This event will discuss how to enable components of data governance within your organization and review how to best leverage Microsoft's SQL Server Master Data Services.
Suresh Menon, Vice President, Product Management - Information Quality Solutions at Informatica, shares how to master your data and your business from the 2015 Informatica Government Summit.
To take a “ready, aim, fire” tactic to implement Data Governance, many organizations assess themselves against industry best practices. The process is not difficult or time-consuming and can directly assure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
Recommender Systems from A to Z – The Right Dataset (Crossing Minds)
In recent years, much progress has been made in machine learning and in the tools that support its developer community. Still, implementing a recommender system is very hard.
That is why at Crossing Minds, we decided to create a series of 4 meetups to discuss how to implement a recommender system end-to-end:
Part 1 – The Right Dataset
Part 2 – Model Training
Part 3 – Model Evaluation
Part 4 – Real-Time Deployment
This first meetup will be about building the right dataset and doing all the preprocessing needed to create different models. We will talk about explicit vs. implicit feedback, dataset analysis, likes/dislikes vs. ratings, user and item features, normalization, and similarities.
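As a rough illustration of the kind of preprocessing discussed above, here is a minimal sketch (the event log, event weights, and user names are all hypothetical) that turns an implicit-feedback event log into weighted user profiles and then compares two users with cosine similarity:

```python
import math
from collections import defaultdict

# Hypothetical implicit-feedback log: (user, item, event) tuples.
events = [
    ("u1", "book", "view"), ("u1", "film", "purchase"),
    ("u2", "book", "view"), ("u2", "song", "view"),
    ("u3", "film", "purchase"), ("u3", "song", "view"),
]

# One common step: convert raw events into weighted user vectors,
# weighting stronger signals (purchases) above weaker ones (views).
WEIGHTS = {"view": 1.0, "purchase": 3.0}
profiles = defaultdict(lambda: defaultdict(float))
for user, item, event in events:
    profiles[user][item] += WEIGHTS[event]

def cosine(a, b):
    """Cosine similarity between two sparse user vectors (dicts)."""
    dot = sum(w * b.get(i, 0.0) for i, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

print(cosine(profiles["u1"], profiles["u3"]))
```

Normalizing by vector length, as cosine similarity does, keeps heavy users from dominating the similarity scores — one of the dataset-preparation choices the session covers.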
Modernizing the Legacy Data Warehouse – What, Why, and How, 1.23.19 (Cloudera, Inc.)
Join us to learn about the challenges of legacy data warehousing, the goals of modern data warehousing, and the design patterns and frameworks that help to accelerate modernization efforts.
The increase in the amount of structured data published using the principles of Linked Data means that it is now more likely to find resources on the Web of Data that describe real-life concepts. However, discovering resources related to any given resource is still an open research area. This thesis studies recommender systems that use Linked Data as a source for generating recommendations, exploiting the large number of available resources and the relationships between them. Accordingly, a framework named AlLied for executing recommendation algorithms is proposed. This framework can be used as the main recommendation component in a given architecture because it allows application developers to execute and evaluate recommendation algorithms in different contexts. Two implementations of this framework are presented and compared: the first relies on graph-based algorithms and the second on machine learning algorithms. Finally, a new recommendation algorithm that adapts dynamically to the linking features of the datasets used is also proposed.
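As an illustration of the graph-based flavor of such algorithms (this is not the actual AlLied implementation; the triples and the scoring scheme are invented for the example), one simple approach ranks resources by how many Linked Data properties they share with a target resource:

```python
from collections import Counter

# Hypothetical Linked Data fragment: (subject, predicate, object) triples.
triples = [
    ("MovieA", "director", "DirectorX"),
    ("MovieB", "director", "DirectorX"),
    ("MovieA", "genre", "SciFi"),
    ("MovieB", "genre", "SciFi"),
    ("MovieC", "genre", "Drama"),
    ("MovieC", "director", "DirectorY"),
]

def related(resource, graph, top=3):
    """Rank other resources by how many (predicate, object) links they share."""
    links = {(p, o) for s, p, o in graph if s == resource}
    scores = Counter()
    for s, p, o in graph:
        if s != resource and (p, o) in links:
            scores[s] += 1
    return scores.most_common(top)

print(related("MovieA", triples))
```

Here MovieB shares both the director and the genre with MovieA, so it ranks first; real graph-based recommenders refine this idea with weighted predicates and longer paths through the graph.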
Making Data Timelier and More Reliable with Lakehouse Technology (Matei Zaharia)
Enterprise data architectures usually contain many systems—data lakes, message queues, and data warehouses—that data must pass through before it can be analyzed. Each transfer step between systems adds a delay and a potential source of errors. What if we could remove all these steps? In recent years, cloud storage and new open source systems have enabled a radically new architecture: the lakehouse, an ACID transactional layer over cloud storage that can provide streaming, management features, indexing, and high-performance access similar to a data warehouse. Thousands of organizations including the largest Internet companies are now using lakehouses to replace separate data lake, warehouse and streaming systems and deliver high-quality data faster internally. I’ll discuss the key trends and recent advances in this area based on Delta Lake, the most widely used open source lakehouse platform, which was developed at Databricks.
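To make the transactional-layer idea concrete, here is a deliberately tiny sketch of a commit log over plain storage (an illustration of the concept only, not Delta Lake's actual format or API): readers reconstruct the table from committed log entries, so a data file that was written but never committed is simply invisible.

```python
import json
import os
import tempfile

class TinyLog:
    """Toy append-only commit log: the core idea behind a transactional
    layer over cloud/object storage, reduced to its simplest form."""

    def __init__(self, root):
        self.root = root
        os.makedirs(root, exist_ok=True)

    def commit(self, added_files):
        """Record which data files now belong to the table, as one
        numbered log entry per commit."""
        version = len(os.listdir(self.root))
        path = os.path.join(self.root, f"{version:08d}.json")
        with open(path, "w") as f:
            json.dump({"add": added_files}, f)

    def snapshot(self):
        """Replay every commit in order to get the current set of
        live data files — uncommitted files never appear here."""
        files = []
        for name in sorted(os.listdir(self.root)):
            with open(os.path.join(self.root, name)) as f:
                files.extend(json.load(f)["add"])
        return files

log = TinyLog(tempfile.mkdtemp())
log.commit(["part-0001.parquet"])
log.commit(["part-0002.parquet"])
print(log.snapshot())
```

A real lakehouse log adds removals, schema metadata, atomic commit protocols, and checkpointing, but the reader-replays-the-log pattern is the same.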
2. Content
A. Concept of Content Curation
B. Models of Content Curation
C. Value of Content Curation
3. Concept of Content Curation
There are three types of online content: Creation, Aggregation, and Curation.
4. Concept of Content Curation
Creation is like planting seeds from scratch.
(The seed metaphor was inspired by Corinne Weisgerber, "Building Thought Leadership through Content Curation.")
5. Concept of Content Curation
Aggregation is like simply gathering seeds together, which machines (or birds) can do.
6. Concept of Content Curation
Curation is like sorting seeds out, putting them into context, and framing them so that they make sense.
7. Concept of Content Curation
"Content Curation is a term that describes the act of finding, grouping, organizing or sharing the best and most relevant content on a specific issue."
Rohit Bhargava
SVP, Global Strategy & Planning | Ogilvy
Author, Likeonomics, Personality Not Included
Professor, Global Marketing | Georgetown University
8. Model of Content Curation
Curation goes through a 3-S process: Seeking (taking in information from various sources), Sensing (adding value through Aggregation, Distillation, Elevation, Mashup, or Chronology), and Sharing (pushing curated content out through different social channels). It's a process of adding value for the audience.
(Source: 3-S model from Beth Kanter; Sensing model from Rohit Bhargava.)
9. Model of Content Curation
Seeking
• Define objectives and audience
• Organize sources
• Use discovery tools
• Scan more than you capture
Tools: Google Alerts, RSS, Twitter, Scoop.it, YouTube, etc.
10. Model of Content Curation
Sensing - Aggregation
• Aggregation is the act of curating the most relevant information about a particular topic into a single location.
• It is the most common form of content curation.
Examples: Link2Asia Week of July 13; Golden Bridges: China NPO Weibo Digest (October)
Notes: 1. Sensing is where value-adding takes place in curation. 2. The five models can be used alone or together.
11. Model of Content Curation
Sensing - Distillation
• Distillation is the act of curating information into a simpler format in which only the most important or relevant ideas are shared.
• Methods: summarizing, introduction
Examples: Academic Research Digest, October 2011; The Media Discusses Social Organization Management
12. Model of Content Curation
Sensing - Elevation
• Elevation refers to curation with a mission of identifying a larger trend or insight from smaller daily musings posted online.
• It is one of the hardest forms of content curation, requiring more expertise and analytical ability.
Examples: Philanthropy Advising Series: Influences from the West; Chinese Philanthropy Beyond Rapidly Declining Number
13. Model of Content Curation
Sensing - Mashups
• Mashups are unique curated juxtapositions in which existing content is merged to create a new point of view.
Example: Converting the ideas in an article into a PowerPoint presentation is, in my opinion, a mashup.
14. Model of Content Curation
Sensing - Chronology
• Chronology is a form of curation that brings together historical information, organized by time, to show an evolving understanding of a particular topic.
Examples: Memolane; SOPA Timeline
15. Model of Content Curation
Sharing
• Feed your network with a steady diet of good stuff
• Comment on other people's stuff
• Collaborative sensemaking
Tools: Blogging; Twitter, Scoop.it, Pinterest, Storify, etc.
16. Value of Content Curation
(Photo credit: Will Lion on Flickr)
17. Value of Content Curation
Information overload is overwhelming!
18. Value of Content Curation
Hence come the values of content curation.
Value No. 1: Curation reduces information overload by providing filtered information.
19. Value of Content Curation
Value No. 2: Curation brings clarity to chaos by making sense of information.
20. Value of Content Curation
Value No. 3: Curation extends the shelf life of information.
21. Value of Content Curation
Value No. 4: For curators, curation is an efficient way to build knowledge, skills, thought leadership, and a network. For organizations, curation helps build staff expertise, improve branding, and increase impact!
You are what you curate.
22. Value of Content Curation
But be aware! There is good curation and bad curation!
(Source: Beth Kanter, "Good Curation vs. Bad Curation," http://www.bethkanter.org/good-curation-vs-bad-curation/)
23. Value of Content Curation
"The key element that makes curation work is the competence and focus of the curator and of the topic he has selected. Repeated efforts to create curated channels that mix and match broad and highly competitive topics are bound to see a very short life."
Robin Good
Master of Content Curation
24. Value of Content Curation
Are you ready to be a curator?
"Those who can, curate. Those who can't, review. Those who can't review, tweet. Those who can't tweet, retweet."
via Beth Kanter
Author, Beth's Blog, The Networked Nonprofit
Expert on using technology at nonprofits