Detecting, Understanding and Countering Online Harms

  Posted on September 28, 2020

Information for the Special Issue

Submission Deadline: Sun 15 Nov 2020
Journal Name : Online Social Networks and Media
Journal Publisher: Elsevier
Website for the Special Issue: https://www.journals.elsevier.com/online-social-networks-and-media/call-for-papers/detecting-understanding-and-countering-online-harms
Journal & Submission Website: https://www.journals.elsevier.com/online-social-networks-and-media

Special Issue Call for Papers:

Online Social Networks and Media have revolutionized society, and are now a key part of how most people work, live, socialize, find information and entertain themselves. But whilst they have generated huge benefits, leading to unprecedented connectivity across the globe, online social networks have also enabled the spread of hazardous and dangerous behaviours. Such ‘online harms’ are now a pressing concern of policymakers, regulators and big tech companies. Building deep knowledge about the scope, nature, prevalence, origins and dynamics of online harms is crucial for ensuring we can clean up online spaces. This, in turn, requires innovation and advances in methods, data, theory and research design, as well as the development of multi-domain and multi-disciplinary approaches. In particular, there is a real need for methodological research that develops high-quality methods for detecting online harms in a robust, fair and explainable way.

This special issue seeks high-quality scientific articles (including data-driven, experimental and theoretical research) which examine harmful behaviours, communities, discourses and ideas in online social networks and media. We welcome submissions on any online harm but particularly encourage papers which focus on online hate, misinformation, disinformation, extremism and terrorism. Data-driven approaches, supported by publicly available datasets, are strongly encouraged.

Areas of interest are (1) detecting and measuring online harms, (2) analysing online harms through the use of advanced modelling techniques and (3) developing and interrogating ways to tackle online harms. Topics include, but are not limited to:

  • The prevalence of online harms, either on one online platform or several.
  • The efficacy, usability and appropriateness of different countermeasures to tackle online harms, including both policies and new technologies.
  • The impact of major trigger events, such as COVID-19 or the murder of George Floyd.
  • Niche and smaller online platforms, including how they differ from mainstream spaces.
  • Modelling and analysis techniques to predict online harms, as well as their dynamics and associated factors.
  • Machine learning (e.g. natural language processing and computer vision) to detect and categorise online harms.
  • The prevalence and role of counter speech online.
  • Biases in methods and analyses, including how explainable, accessible, fair, transparent and interpretable they are.
  • Integrated analysis of different online harms (e.g. studying how misinformation, hate and extremism intersect).
  • Cross-platform and inter-platform dynamics, such as user migration from mainstream to niche spaces.
  • Strategies for online harm dissemination used by malicious actors and others.
  • Community-based detection methods.
  • The ethical and social implications of socio-technical research to study and target online harms.

Guest Editor Team

Arkaitz Zubiaga, Queen Mary University of London <a.zubiaga@qmul.ac.uk>

Bertie Vidgen, Alan Turing Institute <bvidgen@turing.ac.uk>

Miriam Fernandez, Open University <miriam.fernandez@open.ac.uk>

Nishanth Sastry, University of Surrey <n.sastry@surrey.ac.uk>

Timeline [proposed]

  • Manuscript submission deadline: November 15th, 2020
  • First notification: January 15th, 2021
  • Submission of revised paper: February 15th, 2021
  • Notification of acceptance: March 15th, 2021
  • Publication: Summer 2021
