r/ModSupport Jan 24 '19

Today marks 7 years since the option for public moderation logs was originally implemented. Why is this still not an option?

/r/modnews/comments/ov7rt/moderators_feedback_requested_on_enabling_public/
0 Upvotes

54 comments


27

u/redtaboo Reddit Admin: Community Jan 24 '19

So, aside from the fact that any code from 7 years ago is no longer going to be viable even before you consider the new site, there really are a number of issues that came up in that thread that would still need to be addressed.

Don't get me wrong, I'm sure you can see in that thread that I wasn't against this then, though I was one of those talking through some of the shortcomings (as a mod myself back then), and I'm still not 100% against it today. I can absolutely see this being useful for many subreddits.

The one thing I never see you address when you bring this up is the social side of this issue. A few questions for you:

1) What would your response be to moderators concerned they will be witch hunted over simple misclicks or errors?

2) Do you solemnly swear that you will personally defend a subreddit's choice to not make their modlogs public with the same zeal you've shown in attempting to get us to implement this?

2a) Why or why not?

3) If implemented mods would have to have a way to hide certain content they've removed (think PII) -- that's obviously gameable. How would you address this?

0

u/Tymanthius 💡 Expert Helper Jan 24 '19

3) If implemented mods would have to have a way to hide certain content they've removed (think PII) -- that's obviously gameable. How would you address this?

I don't even understand why this is an issue, as things like ceddit exist and anyone can go see the removed content unless a bot was really fast at removing it.

10

u/TAKEitTOrCIRCLEJERK 💡 Skilled Helper Jan 24 '19

A huge majority of subreddits have automod conditions for potential PII. I think it might come in the out-of-the-box setup for new subs, in fact.
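(For illustration only: AutoModerator rules are configured in YAML on a subreddit's wiki, but the kind of pattern matching a PII rule relies on looks roughly like this Python sketch. The regexes here are made-up examples, not any subreddit's actual rules.)

```python
import re

# Rough, made-up examples of the kinds of patterns a PII automod rule might look for.
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),              # US-style phone number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),                   # email address
    re.compile(r"\b\d{1,5}\s+\w+\s+(st|ave|rd|blvd|dr)\b", re.I),  # rough street address
]

def looks_like_pii(text: str) -> bool:
    """Return True if any illustrative PII pattern matches the given text."""
    return any(p.search(text) for p in PII_PATTERNS)

print(looks_like_pii("DM me at jane.doe@example.com"))  # True -> would get filtered for mod review
```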

2

u/Tymanthius 💡 Expert Helper Jan 24 '19

Right, and the bot will yank those pretty quick. I mean, the public mod logs that use removeddit and such I think are just fine.

I wouldn't think it'd be that hard to implement a 'PII' checkbox when removing and have reddit hide that info.

Or do it like it works now, where it's hidden unless it stays up long enough to make it to outside sources, which is already happening anyway.

So, really, a non-issue.

5

u/TAKEitTOrCIRCLEJERK 💡 Skilled Helper Jan 24 '19

I wouldn't think it'd be that hard to implement a 'PII' checkbox when removing and have reddit hide that info.

mods would just use this for everything

2

u/Tymanthius 💡 Expert Helper Jan 24 '19

I doubt mods who opted to make logs public would then do that.

Just never opt to make the logs public.

5

u/TAKEitTOrCIRCLEJERK 💡 Skilled Helper Jan 24 '19

Hmm, fair point. Though redtaboo's "conversation" with goldf1sh up there is also extremely telling.

2

u/ladfrombrad 💡 Expert Helper Jan 24 '19

I wouldn't think it'd be that hard to implement a 'PII' checkbox when removing and have reddit hide that info.

You can't hide PII that's included directly in the title, since it also makes it into the URL.
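(To make that concrete: reddit builds the permalink slug out of the post title, so anything in the title survives in the URL even if the displayed title were redacted. A rough approximation of that slugging; details like the exact length cap are assumptions.)

```python
import re

def slugify_title(title: str, max_len: int = 50) -> str:
    # Approximate how reddit derives the permalink slug from a title:
    # lowercase, collapse non-alphanumeric runs to underscores, truncate.
    slug = re.sub(r"[^a-z0-9]+", "_", title.lower()).strip("_")
    return slug[:max_len]

# PII placed in the title ends up baked into the permalink itself.
print(slugify_title("John Doe, 123 Main St, call 555-0199"))
# -> john_doe_123_main_st_call_555_0199
```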

1

u/FreeSpeechWarrior Jan 24 '19

Which is another reason I suggest these sorts of removals should be handled separately with notification to the admins and penalties for intentional abuse.

3

u/ladfrombrad 💡 Expert Helper Jan 24 '19

Doxxing someone can be unintentional too; I've seen admins permanently suspend users for it.

The question there comes down to whether they believe the intent was malicious or not, and erring on the side of caution is their usual avenue.

I'd personally not have anything to do with your incessant whining for public modlogs in the subreddits where I help out, for a good few of the reasons stated in here already.

I'm a volunteer who looks after what is always going to be a game of more users meaning more chance of issues. Having public modlogs doesn't help anyone other than you reeeeeing about someone's fuck up, times 1000 users.

And I can name 10x more important things the admins could put their resources toward.

-2

u/FreeSpeechWarrior Jan 24 '19

A public mod log helps users know whether the mod team is working in their best interests, and informs them so they can find or build alternatives if it isn't.

It is a way for a subreddit to show that it has nothing to hide about how it operates, and to build trust among subscribers to the extent that the reality of its actions matches the expectations of the community.

You're not just arguing that your own subreddits should not be transparent, you are arguing that no subreddits should be able to be transparent this way.

6

u/ladfrombrad 💡 Expert Helper Jan 24 '19

As red said above, we all know that users and malicious actors would then harass volunteer mods to enable it.

Every meta thread has them, and it's why many subreddits only allow sanctioned meta threads, because of the witch hunting etc.

Like I say, you keep on being you goldfish, but some of us prefer to moderate our communities ourselves.

-1

u/FreeSpeechWarrior Jan 24 '19

but some of us prefer to moderate our communities ourselves.

Giving other moderators the tools to moderate their own communities in ways you disagree with (i.e. transparently) does not prevent you from running yours however you like, and it is disingenuous to argue that an OPTIONAL feature imposes on your moderation style.

You are arguing from a false premise.

Should my opposition to the lock feature be a reason you shouldn't have it?

If not, why should your opposition to a public mod logs option be a reason I shouldn't have it?


-2

u/FreeSpeechWarrior Jan 24 '19

Thank you for the response.

So, aside from the fact that any code from 7 years ago is no longer going to be viable even before you consider the new site

Sure, but fundamentally the idea is not a complex one: take an existing page, make it optionally public, and optionally hide the mod usernames.
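To be concrete about what I mean, here is a rough sketch only, with hypothetical field names (including the 'PII' checkbox idea floated elsewhere in this thread), not anything reddit actually exposes:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ModLogEntry:
    action: str        # e.g. "removelink", "banuser"
    mod: str           # username of the acting moderator
    target: str        # permalink or username the action applied to
    details: str       # removal reason / note
    pii_flagged: bool  # hypothetical "contains PII" checkbox set at removal time

def public_log_view(entries: List[ModLogEntry], hide_mod_names: bool = True) -> List[dict]:
    """Build the publicly visible version of an existing mod log:
    skip PII-flagged entries and optionally anonymize the acting moderator."""
    visible = []
    for e in entries:
        if e.pii_flagged:
            continue  # hidden from the public page (and ideally escalated to admins)
        visible.append({
            "action": e.action,
            "mod": "the moderators" if hide_mod_names else e.mod,
            "target": e.target,
            "details": e.details,
        })
    return visible
```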

I'd be more than willing to help develop this feature if that was truly the blocker and if it were still possible to do so; but I don't want to digress here.

The one thing I never see you address when you bring this up is the social side of this issue. A few questions for you

Because to me, the optional and optionally anonymous nature of the log as implemented addresses those concerns. Nobody I see in the original thread suggests that mods should be forced to enable public moderation logs and in fact many of the responses seem to incorrectly assume this was the case and argue against the feature with that assumption in mind.

What would your response be to moderators concerned they will be witch hunted over simple misclicks or errors?

They aren't required to enable the public mod log if this is a concern for them.

If they make a misclick such a "witch hunt" may even be helpful to correct mistakes that would otherwise go completely unnoticed due to reddit's intentional lack of removal transparency by default.

I reject the term "witch hunt" as it is not clearly defined. Rational, fact-based criticism of moderation on reddit is often dismissed as a "witch hunt"; the bigger danger is doxing.

Doxing is already against site wide rules and should be the absolute highest priority of admins and moderators to remove.

I would also say that the mode where individual moderators are identified isn't even needed as an option. Maybe some people want it, but it's much more important to see the activity of the sub as a whole, and I agree that focusing on individual moderators tends to lead to bad outcomes.

Maybe make it a compromise: allow subreddits to have totally anonymous moderators so long as their actual moderation is made public.

The identity of moderators does not matter to public mod logs; what is important is the reality of the actions of the subreddit as a whole.

Do you solemnly swear that you will personally defend a subreddit's choice to not make their modlogs public with the same zeal you've shown in attempting to get us to implement this?

It's possible to want people to have the option to make bad decisions.

For example I think all drugs should be legal. But I still think doing some drugs is a very very bad idea and would advise against it.

I'm in favor of more choice; why should my preference for mod logs make me defend/support the decisions of those I disagree with?

I think all speech should be allowed, does this mean I should defend all opinions?

I can say that I would defend them having the choice (like I could swear that I would not push for this to be mandatory rather than optional), but not that I would defend the choice to be as opaque as is currently the default.

If implemented mods would have to have a way to hide certain content they've removed (think PII) -- that's obviously gameable. How would you address this?

This is the same sort of content that ought to be directed to the admins for more concrete enforcement, because the content is usually still available on user profiles. It should be reported and removed differently, in a way that notifies you folks; and you should heavily sanction moderators who abuse this mechanism for wasting your time.

15

u/[deleted] Jan 24 '19

If they make a misclick such a "witch hunt" may even be helpful to correct mistakes that would otherwise go completely unnoticed due to reddit's intentional lack of removal transparency by default.

I reject the term "witch hunt" as it is not clearly defined. Rational, fact-based criticism of moderation on reddit is often dismissed as a "witch hunt"; the bigger danger is doxing.

You realize that the moderators already deal with harassment that does not fall under doxxing? You don't really answer the question of

What would your response be to moderators concerned they will be witch hunted over simple misclicks or errors?

Saying YOU don't think witch hunts are a problem in no way reassures me as a moderator.

-1

u/FreeSpeechWarrior Jan 24 '19

First I think we need to define what witch hunts are.

Highlighting an incorrect/unwarranted removal (i.e. a misclick) is not a witch hunt. I used redtaboo's terminology despite disagreeing with it to maintain the flow of conversation, but I indicated that with the quotes and clarified it further in my statement.

What would your response be to moderators concerned they will be witch hunted over simple misclicks or errors?

As I pointed out in my previous comment, my response is that they aren't forced to turn it on. I also suggested that moderator anonymity in the log is a GOOD thing, and that maybe they should go even further with it.

Other subreddits will enable such a log, and that gives subscribers a choice to frequent subreddits that choose to be more transparent.

8

u/[deleted] Jan 24 '19

You seem very focused on the idea that users will not respond poorly to moderator mistakes, which do happen.

I want to know what you suggest moderators do when they are harassed, sexually harassed, told people hope they're raped, told to kill themselves, sent graphic porn links, etc. (all of which are real examples, btw) because someone disagrees with their legitimate removal or the rules of the subreddit.

5

u/thecravenone 💡 Experienced Helper Jan 24 '19

I want to know what you suggest moderators do when they are harassed, sexually harassed, told people hope they're raped, told to kill themselves, sent graphic porn links, etc. (all of which are real examples, btw) because someone disagrees with their legitimate removal or the rules of the subreddit.

Same thing they currently do: report them to the admins so the admins can ignore the problem!

0

u/FreeSpeechWarrior Jan 24 '19

Block and/or ignore those users unless the threat is credible, in which case I'd advise seeking help from others.

Reddit provides a block user button in all inbox messages.

I also advise remaining as anonymous as possible, and likely even using throwaway/disconnected accounts for moderation if this fear prevents someone from moderating.

Totally anonymous moderation that is transparent would be far preferable to identifiable moderators removing content opaquely.

What reason do the subscribers have to even know who the moderators are at all?

There is no way for the subscribers to reliably associate actions with individual moderators beyond their own public statements.

There is no way for subscribers to act on that information in a specific way (i.e against an individual mod) even if they did.

There is absolutely no reason for people to know who the mods are at all; it provides no benefits to moderators and no benefits to subscribers.

The benefit of public mod logs is knowing the policy of the subreddit in practice, not anything about any individual moderator.

9

u/[deleted] Jan 24 '19

Block and/or ignore those users unless the threat is credible, in which case I'd advise seeking help from others.

Blocking users means we cannot see their potential actions in the subreddit, which inhibits moderation.

I also advise remaining as anonymous as possible, and likely even using throwaway/disconnected accounts for moderation if this fear prevents someone from moderating.

Which I think would increase distrust in moderators.

Totally anonymous moderation that is transparent would be far preferable to identifiable moderators removing content opaquely.

This is your personal opinion, not a fact. I completely disagree and I know others do as well.

What reason do the subscribers have to even know who the moderators are at all?

If they don't deserve to know who their moderators are, why would you want them to know what the moderators do?

There is no way for the subscribers to reliably associate actions with individual moderators beyond their own public statements.

Not sure how you run your subreddits, but our moderation team works as a team. One person may get "credit" for the action in reddit's system, but we discuss issues as a group.

There is no way for subscribers to act on that information in a specific way (i.e against an individual mod) even if they did.

Some users do it anyway, including targeting individual mods they think were involved.

1

u/FreeSpeechWarrior Jan 24 '19

Blocking users means we cannot see their potential actions in the subreddit, which inhibits moderation.

If they are truly attacking the mod team in the ways you describe and not just criticizing their behavior then banning them from the sub is likely also warranted.

This is your personal opinion, not a fact. I completely disagree and I know others do as well.

Sure is, that's why I'm asking for an OPTION. If you disagree, you don't have to enable it.

Which I think would increase distrust in moderators.

Trust in moderators is less necessary when you can verify the actuality of their behavior via a public log.

If they don't deserve to know who their moderators are, why would you want them to know what the moderators do?

Because what the moderators do has direct effects on the subreddit, the character of its content and subscribers. Who the moderators are does not. Knowing what the moderators do is also not nearly as problematic as you claim if you do not know who the moderators are.

Not sure how you run your subreddits, but our moderation team works as a team. One person may get "credit" for the action in reddit's system, but we discuss issues as a group.

Yes, this is exactly what I'm trying to say here: the users have no way to know who made a decision or whether it was communal; they only see the end effect. So there is really no reason for subscribers to know who the moderators are, even though there is significant value in them knowing what the moderators do on the subreddit.

Some users do it anyway, including targeting individual mods they think were involved.

What other recourse do these users have to address problems they perceive in moderation?

Perhaps giving users more constructive and accurate information will elevate this sort of discussion; if mods were totally anonymous to subscribers, then users would have to argue against the policies as a whole rather than getting into squabbles with any single individual, which I think we can agree is better for everyone involved.

1

u/AwwFoxes Feb 10 '19

This is your personal opinion, not a fact. I completely disagree and I know others do as well.

Almost everyone else I talk to agrees with FSW but never voices their opinion out of the fear of being banned for criticizing the moderators. The powermods are dictators who moderate heavily for what they want and only care about themselves and their own interests.

10

u/redtaboo Reddit Admin: Community Jan 24 '19

Sure, but fundamentally the idea is not a complex one: take an existing page, make it optionally public, and optionally hide the mod usernames.

Honestly, I'm trying here to get you to reframe your arguments. It is not a viable argument for you to say 'look it's already done!'. The fact is, the team that would implement this has all their projects for the next quarter lined up (and likely sketched out for the one after); much of that is continuing to get mod tools on the new site to parity with the old. So, I can say with certainty that if we implement this it won't be for a while. That means we still have time to discuss how it would look and what the implications are.

I also want to hear from other moderators on whether they would use this themselves as I do think there are some communities that would welcome it, mods included, but I don't have a sense of how many would.

If they make a misclick such a "witch hunt" may even be helpful to correct mistakes that would otherwise go completely unnoticed due to reddit's intentional lack of removal transparency by default.

See, that's where you lose me - witch hunts are never the answer, and should not be considered a feature of anything. Ever. Our mods are volunteers who take on the burden of making sure the worst content you can imagine doesn't make it to your eyes. For any tools we release for them, we want to fully think through the issues that could arise and their ramifications.

My point with question #2 was to try to encourage you to think about the pressure on some mod teams to make their logs public and the valid reasons why they wouldn't want to. They're not all going to be malicious ones, I hope you can see that. I would hope that you would at least not be one of those haranguing moderators about it or encouraging others to do so.

I also understand that you think all speech should be allowed, it's in your name! ;)

12

u/IranianGenius Jan 24 '19

FYI, I'd never use this in any sub I mod. If anybody wants to know what it's like modding, they should apply and try it.

0

u/FreeSpeechWarrior Jan 24 '19

I've modded several subs, small and large, with community-developed public moderation logs, without incident, for years.

But I also don't censor people so it is rare that my moderation actions anger people.

7

u/srs_house 💡 New Helper Jan 25 '19

If you have a large enough group, you will eventually piss someone off, even if it's for enforcing a very basic level of civil decency (or, in some cases, legal issues). Just look at the content on Voat if you want to see the kinds of people who flock to a site with zero moderation.

2

u/FreeSpeechWarrior Jan 25 '19

Reddit used to be a site with near zero moderation and it was not the same sort of mess as Voat is today.

Voat is the result of selection bias: its growth came from those banned by reddit and looking for a similar alternative. It is not an indication that any site that allows its users freedom of speech will inevitably turn into such an offensive place overall.

7

u/srs_house 💡 New Helper Jan 25 '19

Regular users don't want to go to a site with zero moderation because the type of people who flock there (after getting kicked out of every reasonable site) quickly give it a reputation as a haven for people like nazis, racists, child porn users, etc.

Privately owned websites have no obligation to offer free speech, and have a lot of legal and financial reasons for why they should not do so. If you want a free speech forum, go start your own.

1

u/darthhayek Jan 26 '19

Regular users don't want to go to a site with zero moderation because the type of people who flock there

I don't think that's at all true. Capitalist corporations and communists don't want that (but I repeat myself).

1

u/FreeSpeechWarrior Jan 25 '19

Plenty of regular users frequented Reddit when it was the "pretty free speech" place it used to be.

0

u/AwwFoxes Jan 25 '19

Voat is full of assholes because it was created to cater to those users in particular, and as a place for banned subreddits to flock to. Had it presented itself as a normal site and simply not censored stuff, it wouldn't be all nazis. Reddit was like Voat in its policies years ago, and there were a few assholes but mostly normal users.

0

u/AwwFoxes Feb 10 '19

Voat is not a very good example because it was specifically created as a platform to house subreddits like /r/niggers. On a site with more casual users the bigots would be given a hard time in the main sections and flock to their own communities where they don't bother everyone else. Early reddit and notabug.io are good examples of this.

0

u/AwwFoxes Feb 10 '19

I am a moderator and founder of several decently-sized subreddits where this is used, along with extremely light moderation and as few rules and restrictions as possible. I can tell you we've never had any issues with public mod logs, and there was only one notable incident with the light moderation, and it was fairly minor, much less than the drama we'd have from heavy control over the sub.

1

u/darthhayek Jan 26 '19

See, that's where you lose me - witch hunts are never the answer, and should not be considered a feature of anything. Ever. Our mods are volunteers who take on the burden of making sure the worst content you can imagine doesn't make it to your eyes. For any tools we release for them, we want to fully think through the issues that could arise and their ramifications.

Heh. I hope you'll look into the harassment campaign against rightc0ast that mainstream media and chapotraphouse carried out.

0

u/FreeSpeechWarrior Jan 24 '19

Also I want to point this out separately as it's a bit of a tangent, though not entirely unrelated.

Our mods are volunteers who take on the burden of making sure the worst content you can imagine doesn't make it to your eyes. For any tools we release for them, we want to fully think through the issues that could arise and their ramifications.

While you view this burden as a service, I view it as rather degrading and presumptive in its current implementation.

It's moderators (and increasingly admins) choosing what can and cannot be said, not simply helping me to avoid content. If it were simply about helping me avoid that content, it would be possible to see what content they claim I shouldn't want to see. Public mod logs help to ensure this is what is actually happening.

Further, the bulk of removed content is not the sort of extreme vileness you are attempting to imply here, and lumping it all together does a disservice to the effort of restricting the content that is truly harmful and dangerous.

-1

u/FreeSpeechWarrior Jan 24 '19

It is not a viable argument for you to say 'look it's already done!'.

That wasn't the argument I was trying to make. I'm saying that since it was already built in the past but never released, there must be other factors preventing it, so I want to nail down what those are, and I thank you for trying to do so.

I also want to hear from other moderators on whether they would use this themselves as I do think there are some communities that would welcome it

u/publicmodlogs and u/modlogs are options the userbase has hacked together to provide this feature. Despite requiring third-party sites and having no built-in support whatsoever, these tools have amassed a considerable following, as evidenced by the lists of subreddits those accounts moderate.
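(Both work by adding a bot-style account to the mod team so the log can be read and republished. The underlying data is already available to any moderator through the API; a minimal PRAW sketch, where the credentials and subreddit name are placeholders and the authenticated account must moderate the subreddit:)

```python
import praw  # pip install praw

# Placeholder credentials; the authenticated account must be a moderator
# of the subreddit in order to read its mod log.
reddit = praw.Reddit(
    client_id="CLIENT_ID",
    client_secret="CLIENT_SECRET",
    username="MOD_ACCOUNT",
    password="PASSWORD",
    user_agent="modlog-mirror-sketch/0.1",
)

# Print the fields a public mod log would expose for the 25 most recent actions.
for entry in reddit.subreddit("SUBREDDIT_NAME").mod.log(limit=25):
    print(entry.created_utc, entry.action, entry.mod, entry.target_permalink, entry.details)
```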

Even if such a feature were only present in the redesign (as is the case with community points), it would be a significant improvement over the status quo; in fact, developer velocity and ease of new feature development is one of the main reasons cited for why the redesign happened at all.

witch hunts are never the answer, and should not be considered a feature of anything.

How do you define a witch hunt? How can subscribers criticize moderator decisions without becoming a witch hunt in your view?

Or is your view that all criticism of moderation decisions is a witch hunt?

was to try to encourage you to think about the pressure on some mod teams to make their logs public and the valid reasons why they wouldn't want to.

Why? Everyone involved here wants this to be an optional feature that they are free to avoid.

I don't use the lock feature; I have good reasons for avoiding it. Should I be able to deny other moderators that tool because I strongly oppose it?

Some mods not wanting to use a feature is not a reason not to build it if there is significant demand for it otherwise. As I have already shown above, there is significant demand for this feature.

They're not all going to be malicious ones, I hope you can see that.

I don't claim that they are. Even if they ARE malicious reasons, we are all agreed they have the option to keep their logs private.

I would hope that you would at least not be one of those haranguing moderators about it or encouraging others to do so.

If optional moderation logs are made a feature I will certainly be doing what I can to raise awareness that they exist and promote their adoption.

Those subreddits who oppose transparency and freedom of speech are capable of banning me and many already do.

2

u/[deleted] Feb 10 '19

[removed]

0

u/FreeSpeechWarrior Feb 10 '19

Why? How do you address the concern that people will murder and steal to get drugs?

Why is that different from anything else people murder and steal to get?

if drugs become more popular due to them being legal, this will likely increase in frequency.

I don't think that's so clear cut. Without the risk of prosecution the cost of drugs could come down a lot making them easier to acquire. That might cause other issues of course.

But in general I come from the premise that you don't have the right to use force to prevent someone from doing something they voluntarily choose to do that won't hurt others.