Political economies in self-moderating communities…

Derek Powazek wrote an interesting piece last year about rating-based moderation systems: Gaming the system: How moderation tools can backfire. One of the most important lessons in online community management is that top-down management is seldom successful in forcing people to act in a certain way. Certainly if the image of the community that the administrators wish to enforce is radically different from what the community itself wants, then the site is more likely to rip itself apart than to fall in line. Online communities are not made social by the presence of administrators, nor is the quality of their social interaction defined by them. Groups of people self-organise, self-maintain and even – to an extent – self-moderate. An administrator's job is to facilitate (more or less intrusively or architecturally) the creation and maintenance of these self-organising aspects and to build a community space that suits them.

Moderation systems based upon rating schemes are radical attempts to help people self-organise by surfacing people and content worth reading and by ostracising bad users. The best-known sites of this kind are Slashdot and Kuro5hin. But Derek's piece reminds us that even these attempts to replicate concepts of reputation and rating online can have problems:

Still, it’s important to remember this essential truth: Any complicated moderation system that makes its algorithms public is eventually going to fall victim to gaming. So my advice is, if you’re going to use a community moderation system, make it as invisible as possible. No karma numbers, no contests, no bribes. Rely on social capital and quality content to get your community talking, and develop a system that helps you moderate without a lot of fanfare. The bottom line is, if you take away the scores, it’s hard to play the game.

I think Derek's position on this is fundamentally correct – albeit a little strongly worded. I suppose my biggest problem is working out to what extent gaming the system is an abuse of these moderation schemes or a fundamental aspect of the real-world reputation systems they're modelled on. Perhaps the problem is not that the social structures built within the game are too complex and detract from normal human interaction, but that they're simply not gameable enough. The online political economy of a site like Slashdot seems to me to have clear analogies with the problems of inflation in the virtual economies of MMORPGs. It could take many years for the UI and the 'market' to come together in gameably useful ways…

5 replies on “Political economies in self-moderating communities…”

Cracking the Virtual Economy
Great post from Tom Coates on plasticbag.org about the ways that online communities respond to — or evade, or subvert — moderation: I suppose my biggest problem is trying to work out to what extent gaming the system is an…

I've been a frontline administrator / moderator / hitler / leader / guidance counsellor of online communities in one form or another for over 8 years (i.e. before mainstream blogging) now – and I agree wholeheartedly.
My experience (which I’m not claiming is exhaustive or correct by any means) has shown me that moderators do not control online communities. In fact, the more moderation you apply, the less effective it is.
You either a) moderate so much that you lose the community (a community does not consist of a single voice), or b) lose control of the community altogether.
Moderation is a big thick grey line. Sure, community members are aware of golden rules, in as much as the general public is aware that murder is wrong in 'civilised' societies. Beyond this, though, the moderator will find it difficult to enforce rules and maintain a state of parity.
Moreover, a successful moderator (in my experience) does not cast rules down to a community. The most successful moderation lies in peerage – in bending majority opinion with peer authority.
To be crass and patronising, it's the sheep rule – leadership through example is the best form of moderation.
Of course, I would never belittle any community that I have moderated in this way. Successful real-world moderation lies as much in the flexibility of the authority as it does in the pliancy of the community itself.

The thing I advocate is obfuscating the mechanism, not hiding the rules/laws.
There are two things at work here. One is the laws by which the space works: not the codes of conduct or social norms, but the social laws that, in a social space, function like physical laws. If you post something good it gets seen more; if you post something bad it gets removed – onwards and upwards to karma systems and collaborative moderation. These should function just like your understanding that if you throw a rock up in the air, it falls back down to hit you on the head.
Then there are mechanisms. These are the exact ways the rules/laws are implemented: g = GM/d² for gravitational attraction. People don't need to know this formula to work out that rocks fall. They don't need to know, and if they do know, then a select few will try to game the system. If the mechanism is made public, its exact nature will eventually be empirically determined and taken advantage of.
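To make that concrete, here's a minimal sketch in Python of what such a hidden mechanism might look like; the weights, thresholds and names are invented for illustration, and the point is that users only ever observe the coarse outcome, never the formula:
```python
# Hypothetical sketch of a "hidden mechanism" karma system. Users observe
# only the coarse outcome (shown, buried, or removed), never the formula.
from dataclasses import dataclass


@dataclass
class Post:
    author_karma: float  # accumulated reputation of the poster
    upvotes: int = 0
    downvotes: int = 0
    flags: int = 0       # reports from collaborative moderation


def visibility_score(post: Post) -> float:
    # The "mechanism": exact weights live server-side and are never
    # published, so the "law" (good posts rise, bad posts sink) is
    # observable while remaining hard to game precisely.
    return (post.upvotes
            - 1.5 * post.downvotes
            - 4.0 * post.flags
            + 0.1 * post.author_karma)


def moderate(post: Post) -> str:
    # Map the hidden score onto the only states users ever see.
    score = visibility_score(post)
    if score < -5:
        return "removed"
    if score < 0:
        return "buried"
    return "shown"


# Example: a flagged post from a low-karma author gets buried.
print(moderate(Post(author_karma=2.0, upvotes=3, downvotes=2, flags=1)))
```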
This may sound draconian; I want to stress that it completely and utterly is. Most if not all online communities are run by single entities – companies or people, or groups of people like-minded enough that it doesn't matter. The best of these communities are the ones seen by their users as benevolent dictatorships. And they always will be: even if the creator steps down, they usually still pay the bills to the ISP, and thus retain ultimate control.
I think there is something to do with equality and status and the obfuscation that is required. Communities where everyone is equal don't need to hide the rules. Communities where people can gain status do. Slashdot and EverQuest need to hide the mechanisms of their rule systems. A game of chess or a heavily moderated messageboard don't. There's more to this last paragraph; I'll have to think about it.

How Moderation can backfire
A nice article on the side-effects of publishing reputation data — you turn it into a game. This is a good explanation of why I’m looking at limiting access to reputation data in systems I’m designing, so that only the admins have access to all of the…

Comments are closed.