tags: from_paper, society, research

Stewardship of Global Collective Behaviour

src: [@BakColeman:2021wt]

A well-written call to arms for an interdisciplinary effort to understand the possibly pernicious effects of social media and, more broadly, of the rapid pace of technological change affecting the way we communicate, form groups, and digest information, and hopefully to provide guidance on how to address these problems (e.g. by writing pieces specifically for regulators).

They use the term crisis discipline, which I like; the canonical example is climate (change) science. You have an incredibly complicated system that needs urgent research and attention (for catastrophic reasons), but you don't necessarily have the time, and given the system's complexity it may simply not be possible, to be entirely systematic and certain about your conclusions. In other words, these kinds of disciplines call for a much more agile form of research.

What's interesting to me is that they frame all of this through the lens of [[complexity-theory]]. The idea is that once you connect half the world's population through the internet and social media, you're going to get unaccounted-for emergent behaviour, much like the phenomena studied in complexity science, except that there the subjects are usually natural processes (swarms of locusts, schools of fish), whereas now we're dealing with humans and social interactions.
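As a reminder to myself of what "emergent behaviour from local rules" looks like, here is a tiny Vicsek-style flocking sketch (my own toy illustration, not from the paper; the model choice and all parameter values are arbitrary): each agent only aligns with its nearby neighbours plus some noise, yet global order emerges.

```python
# Toy Vicsek-style flocking model: agents align with nearby neighbours plus
# noise, and global order (everyone heading the same way) emerges from purely
# local rules. Parameters are arbitrary; this is only an illustration.
import numpy as np

rng = np.random.default_rng(0)

N, L = 200, 10.0                 # number of agents, size of periodic square arena
speed, radius, noise = 0.1, 1.0, 0.3

pos = rng.uniform(0, L, size=(N, 2))     # agent positions
theta = rng.uniform(-np.pi, np.pi, N)    # agent headings

def order_parameter(theta):
    """Magnitude of the mean heading vector: ~0 = disordered, ~1 = fully aligned."""
    return np.abs(np.mean(np.exp(1j * theta)))

for step in range(200):
    # pairwise displacements with periodic boundaries
    diff = pos[:, None, :] - pos[None, :, :]
    diff -= L * np.round(diff / L)
    neighbours = (diff ** 2).sum(-1) < radius ** 2   # boolean N x N, includes self

    # each agent adopts the average heading of its neighbours, plus noise
    mean_sin = (neighbours * np.sin(theta)[None, :]).sum(1)
    mean_cos = (neighbours * np.cos(theta)[None, :]).sum(1)
    theta = np.arctan2(mean_sin, mean_cos) + noise * rng.uniform(-np.pi, np.pi, N)

    # move forward at constant speed and wrap around the arena
    pos = (pos + speed * np.column_stack((np.cos(theta), np.sin(theta)))) % L

    if step % 50 == 0:
        print(f"step {step:3d}  order = {order_parameter(theta):.2f}")
```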

A good example here is the flow of information. Usually when we think of information flow we think of communication networks, where we're sending bits of data around. However, the information flow networks that matter most right now, from a catastrophic-risk perspective, are the ones we humans create when we read and share news over social media, which enable the incredible propagation of fake news we see permeating the world today. And this isn't a simple process: once you incorporate humans (and human judgement) into the network, it becomes far more complicated to model and predict.
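To make that contrast concrete for myself, here is a toy sketch (my own, not from the paper; the network, the probabilities, and the share_prob rule are all made up) comparing a fixed-probability diffusion with a judgement-like rule where people mostly reshare only once several friends already have. The social-reinforcement rule is what makes the dynamics nonlinear and hard to predict.

```python
# Toy comparison of "plain" diffusion vs. a judgement-driven sharing rule on a
# random friendship network. Everything here is illustrative, not a real model.
import random

random.seed(1)

def random_network(n, k):
    """Give each person k random friends (undirected, for simplicity)."""
    friends = {i: set() for i in range(n)}
    for i in range(n):
        for j in random.sample([x for x in range(n) if x != i], k):
            friends[i].add(j)
            friends[j].add(i)
    return friends

def spread(friends, seeds, share_prob):
    """Run a cascade: share_prob(num_sharing_friends) decides whether someone reshares."""
    shared = set(seeds)
    frontier = set(seeds)
    while frontier:
        next_frontier = set()
        for person in set().union(*(friends[p] for p in frontier)) - shared:
            exposure = len(friends[person] & shared)
            if random.random() < share_prob(exposure):
                next_frontier.add(person)
        shared |= next_frontier
        frontier = next_frontier
    return len(shared)

net = random_network(n=1000, k=4)
seeds = random.sample(range(1000), 5)

# "Plain" diffusion: a fixed chance of passing the item on, regardless of context.
plain = spread(net, seeds, lambda exposure: 0.2)

# "Human" rule: single exposures are mostly ignored, but people pile on once
# several friends have shared (social reinforcement) -- the source of nonlinearity.
social = spread(net, seeds, lambda exposure: 0.05 if exposure < 2 else 0.6)

print(f"plain diffusion reached {plain} people; socially reinforced reached {social}")
```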

I definitely feel like this is something I've been trying to articulate, so I'm happy to see it laid out so clearly (unlike the way my brain organises its information, if it does that at all). It also has the same sort of flavour as my [[fairness-project]] work.