Social Network Naming

“Social Network Naming” by jurvetson is licensed under CC BY 2.0

Some internet-based social control techniques, explained

Brigading

What it looks like
A large number of people suddenly appear in a space to downvote, report, harass, overwhelm, or silence a person or group.

What’s really happening
Brigading is coordinated pressure, not organic disagreement.

It usually starts elsewhere:

  • a private chat
  • a forum
  • a social feed
  • a group with shared identity or grievance

Someone signals a target (“look at this”, “this person is dangerous”, “go report this”), and the group acts together.

Why systems enable it

  • Platforms treat many small actions as independent, even when they’re coordinated (a detection sketch follows this list).
  • Visibility and metrics amplify perceived legitimacy (“everyone agrees!”).
  • Speed matters more than reflection.
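
The first point above is the key blind spot: each report or downvote is scored on its own. As a rough sketch of the coordination check that independent scoring misses, consider grouping actions on one target inside a short time window (the event format, names, and thresholds here are hypothetical, not any platform’s real pipeline):

```python
from collections import defaultdict

WINDOW_SECONDS = 600  # hypothetical look-back window
MIN_ACTORS = 25       # hypothetical number of distinct actors that marks a burst

def find_bursts(events):
    """events: iterable of (actor_id, target_id, unix_ts), sorted by time.
    Flags targets hit by many distinct actors inside one short window --
    the coordination signal that is lost when each action is scored
    independently."""
    recent = defaultdict(list)  # target_id -> [(ts, actor_id), ...]
    flagged = set()
    for actor, target, ts in events:
        window = [(t, a) for (t, a) in recent[target] if ts - t <= WINDOW_SECONDS]
        window.append((ts, actor))
        recent[target] = window
        if len({a for _, a in window}) >= MIN_ACTORS:
            flagged.add(target)
    return flagged
```

Twenty-five strangers disagreeing over a week is organic; twenty-five arriving within ten minutes from the same off-site thread is brigading.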

Psychological driver
People feel morally justified because responsibility is diluted. No one feels like the attacker – they’re “just one of many.”

Why it kills trust
It replaces dialogue with force. The target isn’t engaged; they’re overwhelmed. Truth becomes irrelevant once numbers decide outcomes.

Spam floods

What it looks like
A space is filled with repetitive, low-quality, automated, or semi-automated content until meaningful interaction becomes impossible.

This can be:

  • advertisements
  • propaganda
  • copy-pasted arguments
  • AI-generated noise
  • deliberate nonsense

What’s really happening
Spam floods are not about persuasion – they’re about denial of meaning.

The goal is to:

  • exhaust moderators
  • drown out signal
  • make conversation feel pointless

Why systems enable it

  • Posting is cheap (see the friction sketch after this list).
  • Accounts are disposable.
  • Quantity is easier than quality.
  • Automated systems reward activity.
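
“Posting is cheap” is the economic root. A minimal sketch of the kind of friction that changes those economics, assuming a per-identity token bucket (the class name and parameters are invented for illustration):

```python
import time

class PostBucket:
    CAPACITY = 5           # hypothetical burst allowance
    REFILL_PER_SEC = 0.02  # roughly one post every 50 seconds at steady state

    def __init__(self):
        self.tokens = float(self.CAPACITY)
        self.last = time.monotonic()

    def try_post(self) -> bool:
        # Refill based on elapsed time, capped at CAPACITY.
        now = time.monotonic()
        self.tokens = min(self.CAPACITY,
                          self.tokens + (now - self.last) * self.REFILL_PER_SEC)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # flood attempts stall here instead of reaching the space
```

With disposable accounts, of course, an attacker simply registers more identities – which is why friction only works when identity itself is costly, a point the AnonNet section below returns to.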

Psychological driver
Spam doesn’t need belief – just repetition. It exploits fatigue, not reason.

Why it kills trust
People stop investing attention. Once attention collapses, community collapses shortly after.

Reputation gaming

What it looks like
Users learn how to “win” the system:

  • saying popular things instead of true things
  • farming likes or karma
  • signalling allegiance to dominant views
  • avoiding nuance because it performs poorly

Over time, the most visible voices are not the most thoughtful, but the most system-savvy.

What’s really happening
Reputation becomes detached from understanding.

The system rewards:

  • conformity
  • timing
  • emotional triggers
  • certainty

Rather than:

  • insight
  • reflection
  • good faith disagreement

Why systems enable it

  • Single-dimensional scores flatten complexity (illustrated after this list).
  • Public metrics encourage performance.
  • Rewards are immediate; understanding is slow.
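
The first point deserves a concrete picture. Compare a scalar score with a record that keeps dimensions apart – none of these field names come from a real platform schema; they are purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class ScalarReputation:
    karma: int = 0  # an insightful reply and a pile-on upvote are
                    # indistinguishable once they are counted

@dataclass
class MultiAxisReputation:
    insight: float = 0.0      # peers found the content informative
    reliability: float = 0.0  # claims held up over time
    good_faith: float = 0.0   # disagreement stayed civil
```

Everything a community might value gets compressed into one integer, so optimising that integer becomes the game.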

Psychological driver
People adapt to incentives. If visibility equals value, behaviour shifts to maximise visibility.

Why it kills trust
You can no longer tell whether someone believes what they say, or is just optimising for approval.

Performative outrage

What it looks like
Highly emotional, morally charged reactions that:

  • escalate quickly
  • demand immediate alignment
  • punish hesitation or nuance
  • disappear once attention moves on

Outrage becomes a ritual.

What’s really happening
Outrage becomes a signal, not a response.

It signals:

  • belonging
  • moral alignment
  • group loyalty

Often without proportional understanding of the issue itself.

Why systems enable it

  • Outrage spreads faster than explanation.
  • Emotional content is prioritised.
  • Algorithms reward intensity (see the ranking sketch after this list).
  • Silence is interpreted as guilt.
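
The third point is easy to make concrete. A hypothetical engagement-weighted ranker (the weights are invented, but the shape is typical) shows why the loudest post wins:

```python
def engagement_score(post):
    # Intense reactions and shares dominate; slow, careful reading barely registers.
    return (3.0 * post["angry_reactions"]
            + 2.0 * post["shares"]
            + 1.0 * post["replies"]
            + 0.1 * post["reading_time_sec"])

posts = [
    {"id": "nuanced",  "angry_reactions": 2,   "shares": 3,  "replies": 8,  "reading_time_sec": 240},
    {"id": "outraged", "angry_reactions": 140, "shares": 90, "replies": 25, "reading_time_sec": 20},
]
posts.sort(key=engagement_score, reverse=True)
print([p["id"] for p in posts])  # ['outraged', 'nuanced'] -- intensity tops the feed
```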

Psychological driver
Fear of exclusion. People perform outrage to avoid becoming targets themselves.

Why it kills trust
Good faith disappears. People stop asking questions and start declaring positions. Complexity is treated as betrayal.

The common thread – social control and people-policing

All four are emergent behaviours, not individual moral failings.

They arise when systems:

  • reward speed over reflection
  • amplify numbers over coherence
  • conflate visibility with value
  • make identity and reputation fragile
  • encourage particular political narratives

In those conditions, people adapt in predictable ways: they fall into black-and-white thinking, blame others, and dehumanise or mislabel entire groups of people. Very often it is a political narrative, combined with media-driven fear, that triggers such behaviour.

Why AnonNet resists these by design

Without repeating everything we’ve discussed:

No central reputation score: each server an individual connects to holds its own score for that individual. Some of that trust can carry over to a new server in the network, but a would-be scammer has to earn trust over and over before they can cause any significant impact. Reputation gaming therefore collapses.
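
A minimal sketch of that per-server model, assuming a simple carryover cap (the class name, the 20% figure, and the method names are illustrative assumptions, not AnonNet’s actual parameters):

```python
class TrustLedger:
    CARRYOVER = 0.2  # assumed fraction of existing trust a new server will honour

    def __init__(self):
        self.scores = {}  # server_id -> {user_id -> locally earned trust}

    def earn(self, server, user, amount):
        # Trust is earned locally, through interaction on that server.
        per_server = self.scores.setdefault(server, {})
        per_server[user] = per_server.get(user, 0.0) + amount

    def join(self, new_server, user):
        # A newcomer is seeded from a fraction of their best existing score,
        # so standing elsewhere helps a little but never transfers whole.
        best = max((s.get(user, 0.0) for s in self.scores.values()), default=0.0)
        self.scores.setdefault(new_server, {})[user] = self.CARRYOVER * best
```

Because only a fraction carries over, an attacker pays the trust-building cost again on every server they target – which is exactly what makes the next point hold.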

High effort for coordinated action: as a result of this decentralised, individually held trust model, brigading becomes much more expensive. A group cannot import its numbers; every account has to build standing on every server it wants to influence.

Containment spaces + friction: the system will allow for clever reporting backed by automated spam-detection responses. If, say, five high-trust hosts on the network call out a spammer, the system responds, and spam floods lose their leverage.
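
A sketch of that response rule, reusing the five-host example above (the trust threshold and function names are assumptions):

```python
HIGH_TRUST = 0.8  # assumed trust level at which a host's report counts
QUORUM = 5        # the five high-trust hosts from the example above

def should_throttle(reporting_hosts, host_trust):
    """reporting_hosts: host ids that flagged the account.
    host_trust: host_id -> trust score on this network.
    Only a quorum of high-trust hosts can trigger the automated
    response, so a brigade of fresh accounts cannot weaponise it."""
    credible = [h for h in reporting_hosts if host_trust.get(h, 0.0) >= HIGH_TRUST]
    return len(credible) >= QUORUM
```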

No algorithmic amplification: with no system-wide advertising or other spam-style amplification of social-control signals, outrage attacks stop being profitable and become much harder to coordinate at scale.
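
Concretely, “no amplification” can be as simple as a feed that depends only on time and the reader’s own subscriptions – nothing an outrage campaign can push on. A sketch under those assumptions:

```python
def build_feed(posts, subscriptions):
    # No engagement counters in the ordering: nothing to game, nothing to amplify.
    mine = [p for p in posts if p["author"] in subscriptions]
    return sorted(mine, key=lambda p: p["timestamp"], reverse=True)
```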

Most importantly: Trust grows from experience, not performance – when systems stop rewarding noise, noise fades.

