Digital methods allow for the computational analysis of social media data through three main steps: data extraction via platform APIs, data processing and aggregation through extraction software, and data analysis and visualization using analysis software. While promising access to behavioral data at scale, social media analysis requires an understanding of each platform's data formalizations and technical limitations. Different analytical gestures can be applied through statistics, graph theory, and other methods to investigate patterns in content, users, and their relations.
Analyzing Social Media with Digital Methods. Possibilities, Requirements, and Limitations
1. Analyzing Social Media with Digital Methods
Possibilities, Requirements, and Limitations
Bernhard Rieder
Universiteit van Amsterdam
Mediastudies Department
2. The starting point
Social media are playing important roles in contemporary society, from the
very personal to the very public.
Many disciplines have begun to study social media, applying various
methodologies (ethnography, questionnaires, etc.), but there is an
explosion in data-driven research that relies on the computational analysis
of data gleaned from social media platforms.
The promise is (cheap and detailed) access to what people do, not what
they say they do; to their behavior, exchange, ideas, and sentiments.
3. This presentation
This talk introduces social media analysis using digital methods from a
theoretically involved yet "practical" perspective.
Instead of laying out an overarching "logic" of social media data analysis, I
focus on the basic setup and the rich reservoir of analytical gestures that
constitute the practice of data analysis.
1 / A (long) introduction
2 / Three examples covering Facebook, Twitter, and YouTube
3 / Some conclusions and recommendations
4. 1 / Introduction
Social media services host an increasing number of relevant phenomena,
including everyday practices, political presentation and debate, social and
political activism, disaster communication, etc.
A number of preliminary remarks:
☉ The phenomena one is interested in may not happen or resonate on social media;
many things happen elsewhere.
☉ Even if one's research focus is on social media, one may not get the data.
☉ One requires at least some technical competence and the willingness to confront
and learn about a number of technical matters.
☉ Every social media "platform" (Gillespie 2013) is different and requires a different
approach; cf. "medium-specificity".
5. 1 / Introduction
Hypothetico-deductive approaches are certainly possible, but this
presentation espouses inductive "exploratory data analysis" (Tukey 1962)
that emphasizes iteration, methodological flexibility, adjustment of
questions, and "grounded theory" (Glaser & Strauss 1965).
"Far better an approximate answer to the right question, which is often
vague, than an exact answer to the wrong question, which can always be
made precise. Data analysis must progress by approximate answers, at
best, since its knowledge of what the problem really is will at best be
approximate." (Tukey 1962)
6. 1 / Introduction
How does social media analysis with digital methods work?
[Diagram: four layers of technical mediation that one might want to think about]
1. social media platform (e.g. Twitter, Facebook): users communicate, interact,
express, publish, etc. through "grammars of action" (forms and functions)
rendered in software
2. API: technical interface to the data, defined in technical, legal, and
logistical terms
3. extraction software (e.g. DMI-TCAT, Netvizz): makes calls to the API, creates
"views" by combining data into specific sets or metrics, produces outputs;
output type 1 is a widget that provides a visual or textual representation of a
view (e.g. an interactive chart), output type 2 is a file with data in a
standard format (e.g. CSV)
4. analysis software (e.g. Excel, Gephi): allows analyzing files in various
ways, e.g. statistics, graph theory
7. 1 / Introduction - a / the platform
Social media services channel communication, interaction, etc. through
"grammars of action" (forms and functions) rendered in software; users
appropriate these affordances.
Every service is different. Every service changes over time, both in terms
of technology and user practices.
Homogeneous interfaces do not mean homogeneous practices. Platforms
strive to capture large audiences and leave important margins to users.
8. Social media platforms are organized
around instances of predefined
types of entities (users, messages,
hashtags, posts, etc.) and
connections between them.
They formalize and channel
expression, exchange, and
coordination and data fields
are closely related to these
formalizations.
10. Social media are different from the "open" Web because most data is
formalized in fields and a "semantic data model".
The more detailed the formalization, the more salient the data.
Social media platforms are essentially large databases.
1 / Introduction - a / the platform
11. Very large numbers and variety in users,
contents, purposes, arrangements, etc.
12. Social media are built around simple
point-to-point principles; this allows
for a variety of configurations to
emerge over time.
Every account is the same, but there
are vast differences in scale. We need
to begin with technical fieldwork and
conceptualization of the platform.
13. 1 / Introduction - b / the APIs
There are two possibilities to collect data automatically from social media
platforms: scraping the user interface or collecting via specified
application programming interfaces (APIs).
APIs specify (technically, legally, logistically):
☉ What data can be retrieved (certain fields may be inaccessible or incomplete);
☉ How much data can be retrieved (all APIs have rate limits);
☉ The span of coverage (temporal limitations apply often);
☉ The perceptivity of coverage (privacy or personalization can skew access);
For example, Facebook (currently) provides these variables for each post:

                       comment   like   share
count                  yes       yes    yes
individual user list   yes       yes    no
time-stamp             yes       no     no
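These constraints shape what extraction software has to do in practice: collection means paging through API results while staying within rate limits. A minimal sketch of that loop in Python, with a stand-in `fetch_page` function (invented data and parameters) in place of a real platform API:

```python
import time

def fetch_page(cursor):
    """Stand-in for a real API call; returns (items, next_cursor)."""
    data = {0: ([1, 2, 3], 1), 1: ([4, 5], 2), 2: ([6], None)}
    return data[cursor]

def collect_all(max_calls_per_window=2, window_seconds=0.1):
    """Page through an API, pausing when the rate limit is reached."""
    items, cursor, calls = [], 0, 0
    while cursor is not None:
        if calls and calls % max_calls_per_window == 0:
            time.sleep(window_seconds)  # wait out the rate window
        page, cursor = fetch_page(cursor)
        items.extend(page)
        calls += 1
    return items

print(collect_all())  # all items gathered across three pages
```

Real collectors (Netvizz, DMI-TCAT) add authentication, error handling, and field selection on top of this basic pattern.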
14. Social media users produce
detailed data traces; data
pools in social media are
centralized and retrievable.
Structure of APIs is closely
related to given formalizations.
In order to select, process, and
interpret data we need to
understand the platform:
entities, relations, modes of
aggregation, metrics, etc.
Every platform is different and
we thus need medium-specific
data analysis.
15. 1 / Introduction - c / the extraction software
Extraction software are the programs that connect to the APIs, retrieve
data, and produce specific outputs.
Can range from custom-written scripts to one-click visualization widgets.
These programs work with API data, but add their own "epistemological
twist", i.e. produce particular views on the data. Sampling is often
difficult, therefore n = all is the norm.
Extraction software can be very simple and completely free or have steep
technical, logistical, and financial requirements.
18. Example of an open-source analytics suite: DMI-TCAT
19. Example of an open-source analytics suite: DMI-TCAT
20. There are many different tools out there, with different conceptual
underpinnings, ease of use, depth, etc.
Data analysis (statistics): Excel, SPSS, Tableau, Wizard, Mondrian, …
Data analysis (graph): Gephi, NodeXL, Pajek, …
Data analysis (other): Rapidminer, SentiStrength, Wordij, …
Data analysis (custom): R, Python (NLTK, NumPy & SciPy), …
This presentation relies mostly on R (R Core Team 2014) and Gephi
(Bastian, Heymann, Jacomy 2009).
1 / Introduction - d / the analysis software
21. 1 / Introduction - d / the analysis software
Analysis software provides analytical gestures to apply to the data; it may
be integrated into the extraction software or not.
We investigate the structure of data by creating "views" of the data.
Analytical gestures produce orderings, lists, tables, charts, coefficients etc.
that are saying something about the data and thus the phenomenon.
Flusser (1991) describes gestures as having convention and structure, but
as different from reflexes, because they translate a moment of freedom.
The notion of gesture indicates that data does not speak for itself, we
approach it with particular epistemic techniques (methods) related to a
sense of purpose, a "will to know" (Foucault 1976).
22. Analytical gestures develop from the tension between a "research
purpose" (question, exploration, etc.) and the available data:
The technical dimension of data (via platform, API, extractions software):
☉ Available units, variables, etc.
☉ Temporal coverage, completeness, perceptivity, etc.
☉ Technical formats, available "views", etc.
The semantic dimension of data (aspects of practice):
☉ Demographic (age, sex, income, etc.)
☉ Post-demographic (tastes, preferences, etc.)
☉ Behavioral (trajectories, interaction, etc.)
☉ Expressive (messages, comments, etc.)
☉ Technical (informing on the platform's functioning)
1 / Introduction - d / the analysis software
23. Statistics
Observed: objects and properties ("cases")
Data representation: the table
Visual representation: quantity charts
Inferred: relations between properties
Grouping: class (similar properties)
Graph theory
Observed: objects and relations
Data representation: the adjacency matrix
Visual representation: network diagrams
Inferred: structure of relations between objects
Grouping: clique (dense relations)
1 / Introduction - d / the analysis software
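The two formalizations on this slide can be illustrated with the same toy material: interactions as a case table ("cases" with properties) for statistics, and as an adjacency matrix for graph analysis. A minimal Python sketch with invented users and counts:

```python
# Statistics: one row per user ("case"), columns are properties.
cases = [
    {"user": "a", "posts": 10, "comments": 4},
    {"user": "b", "posts": 2,  "comments": 9},
    {"user": "c", "posts": 7,  "comments": 1},
]
mean_posts = sum(row["posts"] for row in cases) / len(cases)

# Graph theory: rows/columns are users, cells mark relations
# (here: 1 if the row user commented on the column user).
users = ["a", "b", "c"]
adj = [
    [0, 1, 1],
    [1, 0, 0],
    [0, 1, 0],
]
out_degree = {u: sum(adj[i]) for i, u in enumerate(users)}

print(mean_posts)   # a relation between properties across cases
print(out_degree)   # the structure of relations between objects
```

The same practice thus yields two quite different analytical objects, depending on which formalization one chooses.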
24. Quetelet 1827, Galton 1885, Pearson 1901
Regression, PCA, etc. are potentially useful.
1 / Introduction - d / the analysis software
25. Entities seem straightforward because data is well structured, but
variations in scale and practice require being careful.
Descriptive statistics for social media often profit from attention to the form of a distribution;
visualization, multi-point summaries, and metrics like kurtosis or skewness are very useful.
1 / Introduction - d / the analysis software
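Why the form of a distribution matters: engagement counts on social media are typically heavy-tailed, so the mean and the median diverge sharply and a single summary number misleads. A small Python sketch with invented like counts (the skewness formula is the standard adjusted Fisher-Pearson sample estimator):

```python
import statistics

# Invented like counts: many small values, a few very large ones,
# as is typical for engagement metrics.
likes = [0, 1, 1, 2, 2, 3, 3, 5, 8, 12, 250, 4000]

mean = statistics.mean(likes)
median = statistics.median(likes)

# Sample skewness (adjusted Fisher-Pearson), computed by hand.
n = len(likes)
s = statistics.stdev(likes)
skew = (n / ((n - 1) * (n - 2))) * sum(((x - mean) / s) ** 3 for x in likes)

print(mean, median)    # the mean is dragged far above the median
print(round(skew, 2))  # strongly positive: a long right tail
```

A multi-point summary (min, quartiles, max) plus a shape metric like this tells far more than the mean alone.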
26. 1 / Introduction - d / the analysis software
Moreno 1934, Forsythe and Katz 1946
Graph theory, "a mathematical model for any
system involving a binary relation" (Harary 1969)
29. Nine measures of centrality (Freeman 1979)
Network statistics (e.g.
degrees, distances, density,
etc.) can help describe and
compare networks.
Graph theory also provides
many mathematical tools to
derive metrics from the
structure of a network (e.g.
"centrality", "influence",
"authority", etc.), to identify
groupings, etc.
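Two of Freeman's measures (degree for communication activity, closeness for independence/efficiency) can be sketched in pure Python on an invented five-node network; tools like Gephi or NodeXL compute these and many more at scale:

```python
from collections import deque

# A small undirected network: "a" bridges two clusters.
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("a", "d"), ("d", "e")]
nodes = sorted({n for e in edges for n in e})
adj = {n: set() for n in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

# Degree centrality: an index of communication activity.
degree = {n: len(adj[n]) for n in nodes}

def distances(start):
    """Shortest-path distances from start via breadth-first search."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Closeness centrality: an index of independence or efficiency.
closeness = {
    n: (len(nodes) - 1) / sum(distances(n).values()) for n in nodes
}

print(degree)
print(max(closeness, key=closeness.get))  # "a" is the most central node
```

Different measures can disagree on the same network, which is precisely why the choice of metric is an analytical gesture and not a neutral operation.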
30. "Facebook Likes can be used to automatically and
accurately predict a range of highly sensitive
personal attributes including: sexual orientation,
ethnicity, religious and political views,
personality traits, intelligence, happiness, use of
addictive substances, parental separation, age,
and gender." (Kosinski, Stillwell, Graepel 2013)
There are many new(ish) techniques
coming from computer science for
automatic classification, prediction,
sentiment analysis, etc.
1 / Introduction - d / the analysis software
31. 1 / Introduction - conclusion
Four layers of technical mediation to take into account: the platform itself,
the API, the extraction software, the analytical techniques.
To do productive work, attention to these four layers needs to be
combined with theoretical resources and case knowledge.
Bringing this together requires iteration and flexibility; it's “detective work
– numerical detective work – or counting detective work – or graphical
detective work” (Tukey, 1977).
32. 2 / Examples - a / Facebook
Facebook is the largest social media platform with 1.5B monthly active
users. It incorporates networked communication (friend-to-friend), group
communication (Facebook Groups), and "mass" communication (Facebook
Pages).
A lot of analytical possibilities disappeared in April 2015 due to a
comprehensive push for more privacy; open FB Groups and FB Pages are
now the main entryways.
Extraction tool used: Netvizz (Rieder 2013)
Main example: Kullena Khaled Said Page (Rieder et al. 2015)
33. FB Pages allow for retrieval
of historical data without
time limit.
14K posts, 1.9M active
users, 6.8M comments
(99.9% Arabic), 32M likes
Kullena Khaled Said was
created in June 2010 by
Wael Ghonim after Khaled
Said was beaten to death
by Egyptian police.
                       comment   like   share
count                  yes       yes    yes
individual user list   yes       yes    no
time-stamp             yes       no     no
There is a lot of material for
analysis, but these numbers
need extensive data critique.
35. Data quality is high but the
platform is complex and
changing over time.
Is the linked content part
of the data?
These elements can drown
in a large data set and
skew it.
The quantitative is full of
qualitative considerations.
37. [Chart: Kullena Khaled Said, June 2010 – July 2013, comments per post over
time (timescatter), y-scale log10; points colored by post type: link, music,
photo, question, status, video]
53. 2 / Examples – a / Facebook
For Kullena Khaled Said, we were not only able to confirm the importance
of the page for the Egyptian revolution, but gain a much better
understanding of the dynamics of "connective action" (Bennett &
Segerberg) and what we called "connective leadership".
For the SIOTW network of self-declared affiliations, we were able to
nuance the complicated and skewed relationship between right-wing anti-
Islamism and Israeli actors and institutions.
While API-based research into private relations and interactions on
Facebook has become practically impossible, there are many opportunities
for investigating public (Pages) and semi-public (Groups) settings.
54. 2 / Examples – b / Twitter
While Twitter has fewer users than Facebook (320M MAU), it is used a lot
in the context of media debate, political conversation, and activism.
Twitter has very few privacy limitations, but data needs to be captured in
real time. To access the archive, one has to pay. But there is a 1% sample.
Extraction tool used: DMI-TCAT (Borra & Rieder 2014)
Main example: #gamergate
55. #gamergate project preliminary exploration:
is it about "ethics in game journalism" or a
neo-conservative hate movement?
56. There are counts everywhere,
but anything here can be
exploited for analysis.
Because of temporal limitations,
Twitter analysis means creating
databases of collected tweets.
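Because capture has to happen in real time, tools like DMI-TCAT write incoming tweets into a database as they arrive, and analysis then runs against that archive. A minimal sketch of the pattern using Python's built-in sqlite3, with invented records standing in for a live stream:

```python
import sqlite3

# In-memory database standing in for a persistent tweet archive.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE tweets (
        id INTEGER PRIMARY KEY,
        created_at TEXT,
        user TEXT,
        text TEXT
    )"""
)

# Invented records standing in for tweets arriving from a stream.
incoming = [
    (1, "2014-10-01T12:00:00", "user_a", "#gamergate example one"),
    (2, "2014-10-01T12:00:05", "user_b", "#gamergate example two"),
    (3, "2014-10-01T12:00:09", "user_b", "#gamergate example three"),
]
conn.executemany("INSERT INTO tweets VALUES (?, ?, ?, ?)", incoming)
conn.commit()

# The stored collection can then be queried for analysis,
# e.g. tweets per user as a first activity metric.
per_user = dict(
    conn.execute("SELECT user, COUNT(*) FROM tweets GROUP BY user")
)
print(per_user)
```

The database, rather than the platform, becomes the object one analyzes; every count thus inherits the limits of the capture period.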
71. 2 / Examples – b / Twitter
Twitter is a very open platform, the main problem is the requirement to
anticipate or react quickly since historical tweets are costly.
Since tweets can be easily sent by bots and automators, we have to be
very careful with metrics and always check from a number of different
perspectives.
For #gamergate, first findings show a very densely connected community
organized around a group of highly active and visible accounts.
Hashtag use (discounting bots) is dominated by outrage against perceived
"minority favoritism", "social justice warriors", and anti-abuse measures;
"ethics in journalism" is not prominent at all.
72. 2 / Examples - c / YouTube
YouTube is maybe the most understudied (with digital methods) of the
large social media platforms (1B+ users).
YouTube is probably the most open social media platform, with very few
limitations on the API level.
YouTube Data Tools (YTDT), a new tool, is an attempt to facilitate data-
driven research.
83. 3 / Conclusions
Social media analysis with digital methods relies on the "natively digital
objects" (Rogers 2013) that platforms are built around; technical
mediation intervenes in all stages of the research process.
Despite the promise of easy access to well-structured data, there are
considerable difficulties and limitations.
Digital methods is not a one-click type of research, but requires
considerable time and critical interrogation to produce robust results:
which objects to take into account, how to create a sample / collection,
how to analyze it, how to interpret, how to make findings.
84. 3 / Conclusions
In order to deal with big and complex datasets, we need exploratory
approaches that combine micro/macro and qualitative/quantitative in
various ways:
☉ Investigate the platform in detail to account for technical pitfalls.
☉ Qualify quantities.
☉ Gain a sense of practices to orient quantitative methods.
☉ Use quantitative indicators to decide on qualitative focus.
☉ Read content to understand outliers.
☉ Make explicit plausibility tests based on reading.
☉ Interpret the small in relation to the large and the other way round.
Because n = all, these articulations have become much more feasible.
Every analytical gesture shows different things, combination completes the
picture. We need "flexibility of attack, willingness to iterate" (Tukey 1962).
85. 3 / Conclusions
There is a lot of excitement about social media data analysis, but our
techniques are often still experimental and far from standardized.
We need interrogation and critiques of methodology that are developed
from engagement and historical / conceptual investigation.
We need analytical gestures that are more closely tied to concepts from
the humanities and social sciences.
Visualization and simple tools are very interesting, but require technical
and conceptual literacy to deliver more than (deceptive) illustrations.
86. 3 / Conclusions
Data analysis for social media requires (in my view):
☉ Robust understanding of the social media platform;
☉ A sense of purpose;
☉ Conceptual understanding of methods and analytical gestures;
☉ Knowledge of software tools for data analysis;
☉ Considerable domain expertise;
If you think that these approaches can be interesting for your research, I
would recommend to simply try out some of the tools to get a first-hand
impression.
Data can be thought of as a kind of "observation", as opposed to survey-based research.
This is what you can do with a tweet.
https://twitter.com/ICIJorg/status/321585235491962880 / https://api.twitter.com/1/statuses/show/321585235491962880.json
People do a lot of different things on Twitter, Facebook, etc. – and just because you and your immediate vicinity seem to have coherent practices, this does not mean others have.
Entities and types of relation are formalized in "domain specific ways" => FB social graph
Differentiation of scales (topological forms) is produced through technical means and emerge through social dynamics. Variations in scale are less institutional and more topological. (example: big Twitter accounts.)
The idea that this would foster equality comes from the fact that indeed, everybody is a node. We think in terms of properties, not in terms of structure/dynamics. Status is not what you are, but how you are connected.
=> Variety in topics, variety in scales. Size is the main differentiator.
Very large scale systems with very diverse uses on the one side, but highly concentrated data repositories on the other.
http://hashtagify.me/hashtag/gamergate
http://topsy.com/s?q=gamergate
Instead of getting an interface, you're getting a file.
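"Getting a file" means getting structured JSON rather than a rendered page. A minimal sketch of reading such a document in Python – the sample object below is invented for illustration and only loosely follows the classic Twitter v1 tweet format:

```python
import json

# Invented sample, loosely mimicking the classic v1 tweet object.
raw = '''
{
  "id_str": "321585235491962880",
  "text": "Example tweet text",
  "created_at": "Tue Apr 09 12:00:00 +0000 2013",
  "user": {"screen_name": "ICIJorg", "followers_count": 12345},
  "retweet_count": 42
}
'''

tweet = json.loads(raw)

# Once parsed, every field is addressable – the basis for aggregation.
print(tweet["user"]["screen_name"], tweet["retweet_count"])
```

The point is that the same structure holds for millions of tweets, which is what makes computational aggregation possible in the first place.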
R Core Team (2014). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL http://www.R-project.org/.
Bastian M., Heymann S., Jacomy M. (2009). Gephi: an open source software for exploring and manipulating networks. International AAAI Conference on Weblogs and Social Media.
This is where we start bringing together our knowledge of the platform, the case, etc.
Allows for all kinds of folding, combinations, etc. – Math is not homogeneous, but sprawling!
Different forms of reasoning, different modes of aggregation.
These are already analytical frameworks, different ways of formalizing.
There is a fast growing variety of analytical gestures focusing on large numbers of formalized and classed objects.
Regression analysis is a statistical technique for estimating the relationships among variables. (correlation)
A probabilistic relationship: height and weight are correlated – if you are very tall, there is a good chance that you also weigh more; a statistical, not a deterministic, relationship.
Erosion of determinism in the 19th century
Title: Recherches sur la population, les naissances, les décès, les prisons, les dépôts de mendicité, etc., dans le royaume des Pays-Bas, par M. A. Quételet, 1827 (research on population, births, deaths, prisons, poorhouses, etc., in the Kingdom of the Netherlands)
http://gallica.bnf.fr/ark:/12148/bpt6k81568v.r=.langEN
Forsythe and Katz, 1946 – "adjacency matrix", Moreno, 1934
Visualization is, again, one type of analysis.
Which properties of the network are "made salient" by an algorithm?
http://thepoliticsofsystems.net/2010/10/one-network-and-four-algorithms/
Models behind: spring simulation, simulated annealing (http://wiki.cns.iu.edu/pages/viewpage.action?pageId=1704113)
Visual / spatial analysis is already very interesting, but graph theory allows us to do much more. Networks are eminently calculable.
All in all, this process resulted in the specification of nine centrality measures based on three conceptual foundations. Three are based on the degrees of points and are indexes of communication activity. Three are based on the betweenness of points and are indexes of potential for control of communication. And three are based on closeness and are indexes either of independence or efficiency.
(Freeman 1979)
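Freeman's three conceptual foundations translate directly into standard graph measures. A pure-Python sketch of two of them (degree and closeness) on an invented toy graph; betweenness needs more machinery (e.g. Brandes' algorithm) and, like the other two, is readily available in libraries such as networkx:

```python
from collections import deque

# Invented toy graph as an adjacency list: node b bridges two clusters.
graph = {
    "a": ["b"], "b": ["a", "c", "d"], "c": ["b"],
    "d": ["b", "e"], "e": ["d"],
}
n = len(graph)

# Degree centrality: Freeman's index of communication activity.
degree = {v: len(nbrs) / (n - 1) for v, nbrs in graph.items()}

def distances(src):
    """BFS shortest-path distances from src to every reachable node."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        v = queue.popleft()
        for w in graph[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

# Closeness centrality: index of independence / efficiency.
closeness = {v: (n - 1) / sum(distances(v).values()) for v in graph}

print(max(degree, key=degree.get), max(closeness, key=closeness.get))
```

In this toy case the bridging node b dominates both measures – which is the kind of structural salience the quote is about.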
What concepts are they based on?
Graph shows prediction accuracy from likes. But this is still based on our "direct data", i.e. the things I liked.
Kosinski, Michal, David Stillwell, and Thore Graepel. "Private traits and attributes are predictable from digital records of human behavior." Proceedings of the National Academy of Sciences 110.15 (2013): 5802-5805.
B. Rieder (2013). Studying Facebook via data extraction: the Netvizz application. In WebSci '13 Proceedings of the 5th Annual ACM Web Science Conference (pp. 346-355). New York: ACM.
Rieder, B., Abdulla, R., Poell, T., Woltering, R., & Zack, L. (forthcoming). Data Critique and Analytical Opportunities for Very Large Facebook Pages. Lessons Learned from Exploring “We Are All Khaled Said”. Big Data & Society.
http://www.facebook.com/ElShaheeed
Khaled Said was beaten to death by the Egyptian police in Alexandria on June 6 2010
Page created by Wael Ghonim (Google Employee), considered to be a central place for the sparking of the Egyptian Revolution of 2011 (second man: AbdelRahman Mansour)
We are interested in a number of questions, in particular the role of the page in the Egyptian Revolution. (broad question)
And although we think that this is basically about getting data out of a database, it's simply not that easy.
Activity on posts continues; because we have a timestamp on comments, we can cut, but not on likes.
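Cutting by timestamp amounts to a simple filter over the comment records; the records and the cutoff below are invented for illustration (likes carry no timestamp, so no equivalent cut exists for them):

```python
from datetime import datetime

# Invented comment records with ISO timestamps.
comments = [
    {"id": 1, "created": "2011-01-20T10:00:00"},
    {"id": 2, "created": "2011-02-11T18:30:00"},
    {"id": 3, "created": "2011-03-05T09:15:00"},
]

cutoff = datetime(2011, 2, 12)  # illustrative cut-off date

# Keep only comments made before the cutoff.
before = [c for c in comments
          if datetime.fromisoformat(c["created"]) < cutoff]

print([c["id"] for c in before])
```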
Numbers need to be qualified on different levels.
Issues: data access, changing FB platform (e.g. threaded comments)
The communicational situation on this page is that only the admins can post.
Comments can no longer be read for quantitative and logistical reasons.
One of our research angles concerned polling as proto-democratic practice, so this is important.
For some things we can correct, for others we can't.
Simply plotting events is an analytical gesture. (=> pattern)
Visualization is great for getting a first overview, maybe also finding out problems.
Notice the dip of photos in February 2011. Photos are really the drivers of motivation.
In the whole period only 19 days without post.
Shared content but meticulous curation.
Start of a revolutionary dynamic when a threshold is crossed. We can see that in the comments of these days, when many declare they no longer care about their safety.
The revolutionary phase is followed by a phase of reflection leading towards the constitutional referendum.
Interestingly, we do not have a power law. The highly active group is larger than a power law would indicate.
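A quick way to eyeball a power law is the slope of log(count) against log(rank): under a power law the relation is roughly linear, and a top segment that is "too flat" deviates upward. A sketch with invented activity counts:

```python
import math

# Invented comment counts per user, sorted descending (rank 1 = top user).
counts = sorted([400, 350, 300, 120, 60, 30, 15, 8, 4, 2], reverse=True)

# Log-log rank/count pairs.
pairs = [(math.log(rank), math.log(c))
         for rank, c in enumerate(counts, start=1)]

# Crude slope estimate between the first and last point.
(x0, y0), (x1, y1) = pairs[0], pairs[-1]
slope = (y1 - y0) / (x1 - x0)
print(round(slope, 2))
```

A proper fit would use regression over all points (or dedicated power-law estimators), but even this crude slope makes deviations in the head of the distribution visible.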
We're not limited to merely quantitative perspectives, but there are so many comments! Two "distant reading" tools.
This is really the limit of what one can do with our resources.
Here, one needs to understand the layout algorithm to make interesting readings.
Top user commented on nearly 4K different posts
The topology indicates that the top users have different priorities.
We could qualify the most active users on the page
From DMI Workshop on Anti-Islamism.
Pages can like each other, a kind of declaration of affinity.
Starting point: stop Islamization of the World. Color: modularity algorithm (community detection)
What does this mean?
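The coloring comes from maximizing modularity. As a sketch of what the score itself measures, here is Newman's modularity computed for a hand-picked two-community partition of an invented toy graph (not the actual page network):

```python
# Invented undirected graph: two triangles joined by a single edge.
edges = [("a", "b"), ("b", "c"), ("a", "c"),
         ("d", "e"), ("e", "f"), ("d", "f"),
         ("c", "d")]

# Hand-picked partition into two communities.
community = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1, "f": 1}

m = len(edges)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1

# Newman's modularity: fraction of within-community edges minus the
# fraction expected if edges were rewired at random (degree-preserving).
q = 0.0
for u, v in edges:
    if community[u] == community[v]:
        q += 1 / m
for c in set(community.values()):
    d = sum(deg for v, deg in degree.items() if community[v] == c)
    q -= (d / (2 * m)) ** 2

print(round(q, 3))
```

Community-detection algorithms search over partitions for a high value of this score; the colors on the slide mark the partition found.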
I am using this case to walk you through some of the things one can do with DMI-TCAT.
User and network statistics give us a good idea, here.
We have a very dense community, with a number of highly active and visible top users.
#sjw appears 3573 times, #journalism 120 times => "ethics in gaming journalism"?
The #gamedev tweets come from hashtag hijacking via IFTTT
Cascade Interface, typical qual-quant
Temporal and retweet patterns as means to detect bots.
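One simple (and certainly not sufficient) temporal heuristic: flag accounts whose inter-tweet gaps are suspiciously regular. A sketch on invented timestamps, using the coefficient of variation of the gaps:

```python
import statistics

# Invented posting timestamps (seconds): a human-looking account and
# one posting every 600 seconds like clockwork.
accounts = {
    "human": [0, 340, 1900, 2410, 5300, 5320],
    "maybe_bot": [0, 600, 1200, 1800, 2400, 3000],
}

def regularity(timestamps):
    """Coefficient of variation of inter-tweet gaps (low = very regular)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return statistics.stdev(gaps) / statistics.mean(gaps)

# Flag accounts whose gaps vary by less than 10% of the mean gap.
flagged = [name for name, ts in accounts.items() if regularity(ts) < 0.1]
print(flagged)
```

In practice one would combine several signals (retweet ratios, client metadata, content repetition) rather than rely on a single threshold.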
We again see a one-sided association: the gamergaters connect to mainstream gaming channels, but those rarely link back.
But no subscriptions taken into account!
It features not only channels but also subscriptions! But with subscriptions, one quickly arrives at a much larger network.