You threw my daughter out of Animal Jam when she was making plans to meet a friend for lunch, which I thought was a pretty impressive catch. She thought she had mistyped “duck.” What other kinds of behaviors or activities have you interrupted?
Thanks – I always love it when we can turn a disciplinary action in the game into a positive parent interaction. As far as other behaviors and activities, where do I start? Obviously, for COPPA compliance, players trying to give out personal information is a very high priority – that means addresses, emails, and phone numbers, but also Skype and other instant messaging usernames, FaceTime handles, and any other method players could use to communicate outside of the game and potentially share that personal info.
While there is no law around it (which most parents are surprised to learn), we are also very diligent regarding inappropriate behavior and conversations, including cyber dating, drugs/alcohol, violence, vulgar language, cyberbullying, and anything else we have deemed inappropriate for our brand and the younger demographic we attract.
Read more here: At Least 17 Reasons Why Your Kid May Be Playing Animal Jam
This past month, I had the honor of being interviewed by Patrick O’Keefe for his community management-focused podcast, Community Matters. You can download it on iTunes or your favorite podcasting app, or stream it here:
On the podcast, we talk about many things, but our focus was on the treatment of the staff on the front lines of community – the moderators and engagement staff who actually interact with customers. I feel very strongly that while some of the burden of choosing and keeping a potentially toxic job is on the employee, an equal – and in some cases larger – portion of that responsibility falls on the employers and brands hiring those individuals.
Oftentimes, they are highly marginalized team members – many are contractors with little or no interaction with the larger team or the client/brand team. They are usually paid very low wages, even stateside, and are told they should be “happy” with their work-from-home status.
And that’s just when the content they are handling isn’t toxic. On most moderation teams, moderators have to screen out all the “bad” content so that the audience doesn’t see it. But the moderators still see it, and they are usually not given the support required to handle emotionally volatile content. Even in communities for children, moderators can come across triggering content, and some teams do not prepare their staff for that possibility. “Becoming numb to it” is an awful skill to have to develop on the job.
I also worry about the increasing trend of offshoring moderation work to low-wage countries. As an employer, I understand the urge, but it is difficult to maintain high quality with non-native speakers, not to mention the difficulty of overseeing procedures that protect the emotional well-being of those moderators. Just because they are offshore doesn’t mean negative content won’t affect them all the same.
I am interested to hear your thoughts on this topic. Let me know what you think in the comments or on Twitter.