Posted in online community, Safety/Privacy, virtual worlds, web business

My Interview on the Community Signal Podcast

This past month, I had the honor of being interviewed by Patrick O’Keefe for his community management-focused podcast, Community Signal.  You can download it on iTunes or your favorite podcasting app, or stream it here:

On the podcast, we talk about many things, but our focus was on the treatment of the staff on the front lines of community – the moderators and engagement staff who actually interact with customers.  I feel very strongly that while some of the burden of choosing and keeping a potentially toxic job falls on the employee, an equal, and in some cases larger, portion of that responsibility falls on the employers and brands hiring those individuals.

Oftentimes, they are highly marginalized team members – many are contractors with little or no interaction with the larger team or the client/brand team.  They are usually paid very low wages, even stateside, and told that they should be “happy” with their work-from-home status.

And that’s just when the content they are handling isn’t toxic.  On most moderation teams, moderators have to screen out all the “bad” content so that the audience doesn’t see it.  But the moderators still see it, and they are usually not given the support required to handle emotionally volatile content.  Even in communities for children, moderators can come across triggering content, and some teams do not prepare their staff for that possibility.  “Becoming numb to it” is an awful skill to have to develop on the job.

I also worry about the increasing trend of offshoring moderation work to low-wage countries.  As an employer, I understand the urge, but it is difficult to maintain high quality with non-native speakers, not to mention the difficulty of overseeing procedures regarding the emotional well-being of those moderators.  Just because they are offshore doesn’t mean negative content won’t affect them just the same.

I am interested to hear your thoughts on this topic.  Let me know what you think in the comments or via Twitter.
Posted in marketing, online community, Safety/Privacy, tween, virtual worlds, web business

Facebook and livestreaming tragedies

Great NPR Marketplace story on the conundrum of “free speech” and expression on social media during these tense times and how to moderate content for brands and different audience consumption.

http://www.marketplace.org/2016/07/07/life/facebook-and-livestreaming-tragedies

On the conversations happening at Facebook in the aftermath of Philando Castile’s death:

I imagine there are conversations around content moderation. You know, how do we treat events like this? Should they be subject to the normal rules surrounding violence or is there some kind of special dispensation that should be created for videos about news events, or videos that depict injustice. I think it’s a very tough line to straddle. –Deepa Seetharaman, reporter covering Facebook and other social media at the Wall Street Journal

Posted in gaming, kids, marketing, mobile, online advertising, online community, Safety/Privacy, trends, virtual worlds, web business

COPPA/FTC- you make it so difficult to help you out

So I have a reputation as somewhat of a COPPA compliance warrior – so much so that my co-workers have been known to groan when I mention registration or database needs.  It’s an odd thing to have attributed to you, especially with no legal experience SLASH a general frustration with legal risk aversion.  But when you have been working on kids’ websites and communities for coming up on 15 years, you kinda get the hang of the ins and outs of the laws that surround them (or at least you SHOULD).

That’s the thing.  COPPA is required. It’s not a choice.  You have to comply.  If you don’t, you get in trouble – like super expensive trouble – in dollars and negative PR.  So you just do it.  I always find it odd when people brag about it or add it as a tagline to their branding.  It’s akin to saying – “My name’s Harry and just so you know, I definitely DON’T punch random strangers in the face.” Duh, Harry, but thanks for letting me know.

But COPPA compliance has become a lot like speeding on the highway.  Many people obey the speed limit, but many more edge a bit over the line.  When they see a cop, they pull back and pretend that 55 is totes what they were driving the whole time.  But then they inch back up to 65 or 70 the second they are in the clear.  I’ve heard some people actually see a speeding ticket every now and then as a valid cost of driving – a tax they are willing to budget for.

Much the same, companies have started inching over the compliance line on COPPA.  I actually have been in meetings with kids’ brand execs (NOT my current ones 😉 ) who considered having a slush fund set aside in case there was a sanction levied against them.  But even Pollyanna-well-intentioned brands sometimes find themselves inching toward or even over the COPPA line.  You know why?  Because it’s SUPER hard to comply with in the internet/digital culture that we are in right now.

COPPA was put in place to protect kiddos from nefarious marketers who wanted to sell personal info.  It was not about predators or decency or teaching personal accountability in identity protection.  But, with our culture of fear, those are the things people think it’s in place for.

Is it good that a byproduct of this rather draconian law – imposed on site operators so they don’t profit from the sale of kids’ info – ALSO helps prevent kids from distributing personal details about themselves in public forums?  Maybe – but I’m not sure that this remote and avoidable byproduct outweighs the other hurdles the law imposes.

You see, the whole thing is predicated on parents being super engaged in their children’s online lives.  Ask a parent about this and they will undoubtedly say:

“YES! Of course, I want to know what is going on with my child online AND to help them make good decisions accordingly!  I am an amazing parent!”

This is evidenced in tons of surveys.  But do those surveys follow up with the parent (and I mean REALLY follow up – not just ask the parent in another survey) with a

“Ok, parent, but do you REALLY? Are you ACTUALLY the super engaged parent you painted yourself to be?”

Chances are, if that followup actually happened, the answer would be dodged with an excuse about lack of time or understanding, a lament about the speed of tech advancements, or a bald-faced lie.

Truth is that, anecdotally (albeit with my use-cases in the thousands), parents don’t know about COPPA and their assumed required involvement.  So we can demand verified parental consent til the cows come home, but if parents don’t understand that this is something that is needed, all the FTC is protecting is a child’s ingenuity to lie about their age, while simultaneously making it harder for an ethical site operator to pay their staff while providing good content for kids.
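To make that concrete: the standard mitigation is a “neutral” age screen.  Below is a hypothetical sketch (all class and method names are invented for illustration, not taken from any real registration system) of the core idea – ask for a birth date without hinting at the cutoff, and lock the session after a failed attempt so a child can’t just hit back and type a better year.

```python
# Hypothetical sketch of a "neutral" age screen - names invented for illustration.
from datetime import date

class AgeGate:
    """Ask for a birth date without hinting at the cutoff, and remember a
    failed attempt for the session so the age can't simply be re-entered."""

    CUTOFF_AGE = 13

    def __init__(self):
        self.locked = False  # flipped once this session fails the gate

    def check(self, birth_date, today=None):
        today = today or date.today()
        if self.locked:
            return "blocked"  # no second try with a "better" birthday
        # Compute age, accounting for whether the birthday has passed this year.
        age = today.year - birth_date.year - (
            (today.month, today.day) < (birth_date.month, birth_date.day)
        )
        if age < self.CUTOFF_AGE:
            self.locked = True
            return "needs_parental_consent"  # route to a verifiable-consent flow
        return "ok"
```

In practice the lock would live in a session cookie or server-side session rather than an in-memory flag, but the point stands: the gate only raises the bar slightly against a determined kid.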

Unless these impositions on site operators are coupled with a robust (and effective) campaign to explain to parents WHY and HOW they need to be involved, COPPA is simply discouraging smaller brands from providing quality content for kids, encouraging children and parents to learn truth-dodging techniques in registrations, and forcing the nefarious operators deeper into the shadows to avoid detection.  Only bigger brands can afford the legal counsel needed to check that they are in the right.  The ones who can’t afford it will simply not offer the content or, worse, slap an “over 13” stamp on it and skirt their responsibility.

The internet is based on communication channels – especially in the age of social media that is now the norm.  By starting from the false axiom of parental involvement and prohibiting use of the now-standard means of communication until this involvement is verified, you are setting up either a web of lies OR limiting our next generation’s ability to learn how to use these channels correctly.  Both are horrible choices.

And don’t even get me started on how most of the mobile rules don’t even have a path to compliance…

Instead, we should flip the paradigm:

  • For the operators – we make compliance voluntary and, therefore, honestly brag-worthy.  Make it like shopping on a secure site – you get the security so that your customers feel safe.  If a site doesn’t have that seal or badge or OK from the FTC, parents and kids would think twice before using it.
  • For parents – we give them back their parenting responsibilities.  If they think their child shouldn’t be giving out info online, the parents should be punishing the children for breaking their house rules, not blaming the sites for making it too easy for their child to give out info.  And we should be helping parents understand this and how to do this – not assuming they are already there.
  • For kids – we teach them media awareness, basic stranger/danger skills and critical thinking.  If they aren’t ready for it – their parents shouldn’t be letting them use those sites – whether they are 8, 12, 15 or 17 years old.

I am not naive; I know this isn’t going to happen this year or even next.  But I am optimistic that it will happen at some point.  Until then, I will remain the compliance warrior, marching and marching on.  But I have 10 million+ kids and parents on my compliant site – so you better believe I’m gonna start the first steps toward a more rational model now.

Posted in Animal Jam, kids, Safety/Privacy

Safer Internet Day!

Yeah, I know.  First, Community Manager Appreciation Day (#CMAD) and now Safer Internet Day?  How many trumped-up days do my peers need to feel good and noticed?  (The answer is infinite – we are passionate, insatiable service providers, so we will take whatever we can get.)

That said – the Twitterspheres I pay attention to were all over Safer Internet Day yesterday.  For really great resources, run a Twitter search on #SaferInternetDay or #SID2013.

At Animal Jam, we launched a new Parent Video series that we are calling Animal Jam Parent Connection.  We will go through tips for frequent issues parents deal with and review some of the offerings we have at Animal Jam.  This first one focuses on teaching good password habits.

I host the first one, and my aversion to being on video aside, I think it came out OK.  Let me know what you think and if you have ideas for other topics we should cover, like parental controls, bullying, naiveté, safe chat habits, etc.  🙂

Posted in gaming, kids, marketing, mobile, online community, Safety/Privacy, trends, virtual worlds, web business

Digital Kids and 2013 Predictions

I am excited to be involved in this year’s Digital Kids Conference as an emcee for the first day’s talks.  It’s co-located with Toy Fair again in NYC, so we should have a nice crowd.  Tonda, Chris and crew are promoting on the regular social channels – Facebook, Twitter, etc.

There is also an affiliated Digital Kids Safety Summit as well – you know I’ll be THERE too.

In prep for the conference, I was interviewed about my predictions for our industry in 2013.  Here’s a snippet:

“The new COPPA articulations have changed the digital climate. COPPA requires a level of parental engagement and involvement that many families don’t realize. Parents don’t understand how much parental consent they have to give, and new online safety and privacy articulations are going to play an important role in online parenting.”

You can see the whole interview here.

Hope to see you all there!

 

Posted in marketing, online community, Safety/Privacy, virtual worlds, web business

Network makers responsibility

I think a great deal about my responsibility as a kids community producer in establishing and maintaining a healthy and safe culture in the communities I manage.  This has been the case since I began in 2000 and is a constant driving force in my career.

With the addition of Social Media to the landscape and the mass market adoption of the online world, I often feel the personal responsibility to act as a steward to this Digital Citizenship/Netizen culture with those I interact with online – be it on Facebook, Twitter, Pinterest or whatever new site anyone is jumping into at the moment.

So danah boyd’s article on online networks and their role in our society was very exciting for me to stumble upon this week.  It’s a couple months old now, but the concepts are textbook caliber and really made me reflect on the more conceptual aspects of my day to day work.  I am proud that her final statement in the article is one that I remind myself of on a regular basis.

One thing’s clear: it’s high time we examined the values that are propagated through our tools. We all need to think critically about the information we create, consume and share. We all need to take responsibility for helping shape the world around us. – danah boyd

Definitely worth the read for anyone interested in the shift our collective societies are taking and the unique concerns that apply to the digital space, as well as the ones from the offline space that have followed us online.

Posted in kids, Safety/Privacy, web business

COPPA musings

The annual FOSI conference held in DC last week really helped to articulate for me some of the current ambiguity in the COPPA legislation, specifically with its intention and its enforcement.

Currently, the law is written in such a way that it clearly intends to protect children’s personally identifiable information (PII) from being used for nefarious purposes by the websites collecting it or their third-party partners.  Some of the changes being proposed (public comments are due by the end of Nov) help to update and articulate this point and make the criteria a bit more salient with today’s tech climate (e.g., adding geo-location, behavioral advertising, etc.).

One point that is hotly debated is Email Plus.  Currently, sites can use this method (sending notification emails to a parent informing them of a child’s intent to share PII), but the FTC is trying to remove it.  The reasoning is that sites should, by and large, not be soliciting PII from children in the first place, and if they are, they should be complying with the more rigid parental verification models detailed in the law.  As Amy Pritchard from Metaverse Modsquad articulated to me, “Email plus is being eliminated as a way to collect PII and use it internally, as most sites had used it as a best practice parental notification method.  In order to allow sites to continue to do this, the proposed changes allow for sites to collect the parent email address for purposes of notifying the parent that the child has become a member of [or registered for] the site.”
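As a rough sketch of what that notification-only use of a parent’s email looks like in code (the function names and message wording here are my own invention, not from the law or any actual site):

```python
# Hypothetical sketch of an "email plus"-style parent notification.
# notify_parent and send_email are invented names for illustration.
def notify_parent(parent_email, child_username, send_email):
    """Use the parent's email address solely to tell them their child
    registered - not to solicit or store any further PII."""
    subject = "Your child has registered on our site"
    body = (
        "The account '{}' was just created. ".format(child_username)
        + "Visit your parent dashboard to review or revoke access."
    )
    send_email(to=parent_email, subject=subject, body=body)
```

The key constraint lives in the docstring: the address is collected for notification, full stop.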

The informal debates that I heard and participated in at the FOSI conference dealt mostly in the intent of the law.  Most of us agreed that the law should protect a child’s PII from being used for anything other than to make the game play better.  For the most part, the consensus is that, except for specific situations, like contests, DOB and gender are really the only 2 pieces of child PII a site needs to collect, and these are allowed currently under COPPA.

The finer point that I recognized in our sometimes spirited debates was between solicited PII and passively collected PII.   A site should not solicit PII from kids, such as in the registration process, as most of this information is not needed for normal game-play (unless, again, they get verifiable parental consent).   But what if kids give PII freely, such as in chat or on forums/boards?  What, if any, sanctions should be levied on the site in these scenarios?  The informal consensus was that the site should at least employ means of screening and moderating such content so as to make sure that this PII is not easily given and read on the site – but that this should not be legislated as part of COPPA.
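A toy version of that screening layer might look like the sketch below.  The patterns are deliberately simplistic stand-ins; a production filter combines far richer pattern lists, leetspeak/spelling normalization, and human moderators.

```python
# A minimal, illustrative sketch of pre-moderation screening for freely-typed
# PII in chat.  These patterns are toy examples, not a real filter.
import re

PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),        # US phone number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),              # email address
    re.compile(r"\b\d+\s+\w+\s+(street|st|ave|avenue|rd|road)\b", re.I),  # street address
]

def screen_message(text):
    """Replace anything that looks like PII before the message is shown."""
    for pattern in PII_PATTERNS:
        text = pattern.sub("***", text)
    return text
```

Even this toy shows why 100% effectiveness is out of reach: a kid who types “five five five…” sails right past the regexes, which is exactly the argument for pairing automation with trained moderators.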

Anne Collier wrote about this recently (http://www.netfamilynews.org/?p=30775) – “The proposed [COPPA] changes respond to the advent of social media (social network sites, virtual worlds, online games, apps, etc.) in that sites can “allow children to participate in interactive communities without parental consent so long as the operators take reasonable measures to delete all or virtually all children’s personal information before it is made public,” and companies will also have to hold third parties such as app providers to the same privacy standards their services are held to.”

I do not think that the intention of the law should be to teach kids to be safe with their PII and protect them when they aren’t.  While this is an ethical and moral imperative that companies targeting this demographic should abide by, I fall pretty firmly on the side that it should not be federally mandated.  Many of us, myself included, believe that the free market – and hopefully vocal parent groups and watchdog organizations – should be more of the gauge as to whether this is being done on individual sites.  In theory, educating and protecting kids from sharing PII in chat is a great idea, but those of us who have to DO that work realize how difficult, and sometimes impossible, it is to be 100% effective.  I do not see how the government could keep up with or track how effective sites are at that.

This was the 5th Annual FOSI conference, and it was very good to see more representation from practitioners, rather than just lobbyists, marketers, safety advocates, researchers and bloggers.  Hopefully, those of us with real-world/front-line experience in implementing these sorts of laws can gain influence in the conversations so laws can be written practically the first time, rather than amended after the fact (or not at all).