Online media is now a significant carrier for the rapid and ubiquitous diffusion of information. Any social media user can post content, write news blogs, and engage in debate or opinion. Most information posted on social media is useful, while some is fallacious and insulting to others. Keeping the promise of freedom of speech while showing no tolerance for hate speech often becomes a challenge for hosting services. Some automated tools have been developed for content filtering in industry, and companies hire specialized reviewers for accurate and unbiased reporting. However, these approaches are not achieving the goal as expected; on the contrary, new strategies are being adopted to game the automated systems. To address this situation, we propose a smart crowdsourcing-based content review technique that provides trustworthy and unbiased reviews of online shared content. In this technique, we design an intelligent, self-learning crowdsourcing strategy to efficiently select an appropriate set of reviewers, ensuring the reviewers' diversity, availability, quality, and familiarity with the news topic. To evaluate the proposed method, we developed a mobile app similar to popular social media (e.g., Facebook).
doi.org/10.1007/978-3-030-04648-4_44
Smart Crowdsourcing Based Content Review System (SCCRS): An Approach to Improve Trustworthiness of Online Contents
1. Smart Crowdsourcing Based Content Review System (SCCRS): An Approach to Improve Trustworthiness of Online Contents
Authors
Kishor Datta Gupta, Dipankar Dasgupta, and Sajib Sen
2. Motivation: Content review
• Content reviewed before publication
  - Scientific journals (IEEE, ACM, ...)
  - Well-established news organizations (BBC, CNN, ...)
  - Google Play, iTunes, Coursera, Udemy, ThemeForest, Netflix
• Content reviewed by readers after publication
  - Wikipedia, Stack Overflow, Quora, ResearchGate
• Content reviewed only after a complaint is reported
  - Facebook, YouTube, Twitter, Instagram
3. Drawbacks of existing review processes
• Use of (internal) paid reviewers
  - Only viable for a small amount of content
  - Hard to maintain diversity among reviewers
  - Costly, as reviewers are paid
• Current use of crowdsourcing
  - Reviewers may not follow organizational policy
  - Reviewers can misuse their power
  - "Majority wins" creates opportunities for low-quality decisions
4. Proposed crowdsourcing methodology
• A crowdsourcing methodology in which:
  - Reviewers cannot abuse their power
  - Reviewer bias is taken into account
  - Reviews are weighted by reviewer quality
  - Diversity of reviewers is ensured
5. Related crowdsourcing work
• Content-based -> NLP + ML
• Source-based -> weight-based decision
• Review-based -> majority vote
6. Drawbacks of related work
• Sarcasm is hard to detect
• People tend to be abusive (trolling) online
• Opinions are hard to classify
7. Our Content Review Approach
Classify the context of the content
Get a list of context-related reviewers
Normalize the reviewer reports
Check for bias
Classify the content for publication
9. Selection of Reviewers
• Attributes of reviewers
  – Availability: Is the reviewer available?
  – Quality: Does the reviewer have good analytical skills?
  – Familiarity: Is the reviewer familiar with all parts of the news content?
• So the goal is to find a reviewer set R_S that satisfies the availability, quality, and familiarity constraints C_aqf of reviewers and can generate the trustworthiness of the news C_news with respect to the weight of news topics T_wj
• Why a group of multiple reviewers?
  - If some of them are absent, the process can still proceed
  - The chance of missing any flaw in the content is very low
  - To ensure diversity
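The constraint check above can be sketched in a few lines. This is a minimal illustration, not the paper's MOGA: the field names, thresholds, and the top-k scoring rule are all assumptions.

```python
# Hypothetical sketch: filter the reviewer pool by the three constraints
# (availability, quality, familiarity), then take the top-k by combined score.

def select_reviewers(pool, k, min_avail=0.5, min_quality=0.5, min_famil=0.5):
    """Return up to k reviewers satisfying all three constraints,
    ranked by the sum of their attribute scores (illustrative ranking)."""
    eligible = [r for r in pool
                if r["availability"] >= min_avail
                and r["quality"] >= min_quality
                and r["familiarity"] >= min_famil]
    eligible.sort(key=lambda r: r["availability"] + r["quality"] + r["familiarity"],
                  reverse=True)
    return eligible[:k]

pool = [
    {"id": 1, "availability": 0.9, "quality": 0.8, "familiarity": 0.7},
    {"id": 2, "availability": 0.4, "quality": 0.9, "familiarity": 0.9},  # fails availability
    {"id": 3, "availability": 0.8, "quality": 0.6, "familiarity": 0.6},
]
print([r["id"] for r in select_reviewers(pool, k=2)])  # [1, 3]
```

In the full system this greedy filter would be replaced by the multi-objective genetic search described later in the deck.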
10. Reviewer Availability
Daily & Recent login rate
Recent and total session rate
Recent and total response rate
Total Response rate in common topics
Recent Response rate in common topics
Other network activity
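The availability signals above can be folded into a single score, for example with a weighted sum. The weights and signal names below are illustrative assumptions, not values from the paper.

```python
# Hypothetical weights for the availability signals listed on the slide;
# each signal is assumed to be pre-normalized into [0, 1].
AVAILABILITY_WEIGHTS = {
    "daily_login_rate": 0.2,
    "recent_login_rate": 0.2,
    "session_rate": 0.15,
    "response_rate": 0.2,
    "topic_response_rate": 0.15,
    "other_activity": 0.1,
}

def availability_score(signals: dict) -> float:
    """Weighted sum of the reviewer's availability signals; missing
    signals count as 0."""
    return sum(AVAILABILITY_WEIGHTS[k] * signals.get(k, 0.0)
               for k in AVAILABILITY_WEIGHTS)

print(availability_score({"daily_login_rate": 1.0, "response_rate": 0.5}))  # 0.3
```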
11. Reviewer Quality & Familiarity with the Subject
Average time taken for all reviews
Average time taken for all similar reviews
Average review processing time
Failed review rate
Successful review rate
Undecided review rate
Other attributes
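A quality score can be derived from the success, failure, and undecided rates above. The half-credit for undecided reviews is an illustrative choice, not taken from the paper.

```python
def quality_score(successful: int, failed: int, undecided: int) -> float:
    """Fraction of a reviewer's past reviews judged successful.
    Undecided reviews count half (an assumed convention); returns 0.0
    for reviewers with no history."""
    total = successful + failed + undecided
    if total == 0:
        return 0.0
    return (successful + 0.5 * undecided) / total

print(quality_score(8, 1, 1))  # 0.85
```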
14. Generating a Diverse Set of Reviewers
Let F denote a feature topic and F_sw the sensitivity weight of each feature.
• We used the Gini-Simpson index: with λ = Σ_{i=1}^{n} p_i², the index is 1 - λ = 1 - Σ_{i=1}^{n} p_i² = 1 - 1/²D, computed for each of the diversity factors
• Calculated the diversity of each reviewer set, and
• Started a Multi-Objective Genetic Algorithm (MOGA) model
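The Gini-Simpson index above is straightforward to compute from category proportions. A minimal sketch:

```python
from collections import Counter

def gini_simpson(labels) -> float:
    """Gini-Simpson diversity: 1 - sum(p_i^2) over the proportions p_i
    of each category (e.g., age group, gender, or race of a reviewer set).
    0 means all reviewers fall in one category; values near 1 mean high diversity."""
    counts = Counter(labels)
    n = sum(counts.values())
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# A balanced reviewer set scores higher than a skewed one.
print(gini_simpson(["F", "M", "F", "M"]))  # 0.5
print(gini_simpson(["M", "M", "M", "F"]))  # 0.375
```

The MOGA would then treat each factor's index as one objective to maximize while satisfying the availability/quality/familiarity constraints.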
15. Generating Reviewer Set
[Table: per-content report tally — X1 of reported, X2 of true positive, X3 of true negative, X4 of true negative; total ΣX]
After receiving a reported content, we obtain the context of the n reports and generate a list of candidate reviewers.
We then create a web form where all the content is presented.
Reviewers rate the news from 0 to K; the higher the rating, the higher the approval value.
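The 0-to-K ratings can be aggregated into an approval decision, for instance as a weighted mean normalized by K. The value of K, the threshold, and the weighting scheme below are assumptions for illustration.

```python
def approval(ratings, weights, k=5, threshold=0.5):
    """Aggregate reviewers' 0..k ratings, weighting each reviewer,
    into a normalized score in [0, 1]; approve when it meets the
    threshold. k=5 and threshold=0.5 are illustrative choices."""
    score = sum(w * r for r, w in zip(ratings, weights)) / (k * sum(weights))
    return score, score >= threshold

score, approved = approval([5, 4, 1], [1.0, 1.0, 0.5])
print(score, approved)  # 0.76 True
```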
16. Using Reviewers' Decisions
[Flow: individual reviewer points → individual normalization → bias-weight tune-in → reviewers' weights in the set → merge all feedback → threshold decision]
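The merge step can be sketched end to end: normalize each reviewer's point, damp each reviewer's weight by detected bias, then take a weighted mean against a threshold. All names and the linear bias penalty are assumptions, not the paper's exact formulas.

```python
def merge_feedback(points, weights, biases, threshold=0.5):
    """Sketch of the slide's flow: individual normalization, bias
    tune-in (weight damped linearly by bias in [0, 1]), weighted
    merge, and a threshold decision. Assumes at least one reviewer
    keeps nonzero tuned weight."""
    max_p = max(points) or 1
    normed = [p / max_p for p in points]                     # individual normalization
    tuned = [w * (1 - b) for w, b in zip(weights, biases)]   # bias tune-in
    score = sum(n * t for n, t in zip(normed, tuned)) / sum(tuned)
    return score >= threshold

# A heavily biased dissenting reviewer barely moves the decision.
print(merge_feedback([4, 5, 1], [1, 1, 1], [0.0, 0.0, 0.8]))  # True
```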
17. Implementation
• A steady-state heuristic Multi-Objective Genetic Algorithm model has been used
• Visual Studio C# was used to build our demo application
• Picked a reviewer database of 9999 reviewers, which already had prior availability, familiarity, and quality values
• Normalized each value to between 1 and 100 using the equation X_i = (X_i - min(x)) / (max(x) - min(x))
• After that, calculated the diversity factors for age, gender, and race for each reviewer set using the equation D_i = (f_min_d × d_groupcount) / 100
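The min-max normalization above, scaled onto the slide's 1-to-100 range, looks like this (the demo was built in C#; this Python sketch is only illustrative):

```python
def normalize_1_100(values):
    """Min-max scale so the minimum maps to 1 and the maximum to 100;
    constant inputs all map to 1 to avoid division by zero."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    return [1.0 + 99.0 * (v - lo) / (hi - lo) for v in values]

print(normalize_1_100([10, 20, 30]))  # [1.0, 50.5, 100.0]
```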
24. Conclusion
The proposed approach has the following advantages over existing approaches:
• Reviewer bias detection
• Reviewer abuse detection
• Reviewer diversity ensured
• Reviewer quality ensured
Further work will:
• Use a blockchain-based reward system