This reminds me of the Stanford Prison Experiment, as described in The Lucifer Effect.
It's also something I've noticed in online games. I had many friends who started hacking after it became common knowledge that cheating the game was possible (and that admins weren't doing much about it).
This phenomenon is particularly frustrating because an orderly development life cycle takes time: you write code, test, then deploy. It's not something you can do in a matter of hours. Addressing game-mechanics abuse, on the other hand, requires immediate action to prevent damage to the community's collective perception of the game.
Having hacked items polluting the database isn't a big deal in itself, but an entire community thinking it's okay to google for hacks can hurt the bottom line in several ways: users quit; you end up competing against gold-selling sites and shady eBay auctions (this applies mostly to micropayment models, like Nexon's or Ntreev's games); the time it takes to get from level 1 to a considerably high level becomes so short that the social value of the game diminishes; new players are turned off by a sour, greedy community; and so on. It can really spiral out of control.
Some might think that hiring more moderators would fix the social problem, but in reality that's like a company adding more people to a call center and expecting customer satisfaction to go up. Relying on a horde of minimum-wage workers to save a business is just foolish.
I suppose a better solution would start in the design phase of the game, but judging from the quality of many games out there, it's safe to say it's incredibly hard to get right.