That sounds like the same basic idea -- moderators putting a mark on posts that basically says, "An upvote for this is a negative contribution to the community." Is the problem with the general idea, or with one specific detail which could be modified in implementation (banning versus ignoring votes)?
It's not fully transparent -- not to the moderators and meta-moderators, at any rate -- but it's an effective check and balance for "policing the police."
The entire idea is problematic, massively, in that it allows for covert manipulation of a community.
Why are a small handful of unaccountable people determining what is a "negative contribution", on the sly no less, when the whole reason for the existence of karma systems is to do that organically and by consensus?
More importantly, what does such a system accomplish via deception that flattening such posts in a visible and accountable way does not?
Because karma systems have systematic weaknesses that make them incapable of doing that in many situations.
Easy example: Let's say there are 10000 people who like cat pictures and 150 people who like long, in-depth articles. I'm one of the people who likes long, in-depth articles, so I make a forum for people to post and discuss long, in-depth articles. 10% of the cat-picture-lovers happen upon my forum and are delighted to see a new place that they can post cat pictures. My 149 friends and I downvote these cat pictures with all our hearts, but we simply can't outvote the 1000 cat-picture-lovers. Even worse, a cat-picture-lover can look at and upvote a hundred cat pictures in the time it takes me to read one long-form article, so the content that the forum is intended for is buried before even the 150 long-content-lovers get to see it. So the complete destruction of my long-article forum happens organically and by consensus, just like karma systems are supposed to work -- but it's not a desirable outcome!
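To put rough numbers on that, here's a back-of-the-envelope sketch in Python. Every figure is just the hypothetical one from the example above (the per-hour voting rates are my own guesses, not data from any real site):

```python
# Hypothetical numbers from the scenario above; the per-hour rates are guesses.
cat_fans = 1000              # 10% of the 10,000 cat-picture lovers who found the forum
article_fans = 150           # the forum's intended audience
cat_votes_per_hour = 100     # a cat picture takes seconds to judge
article_votes_per_hour = 1   # a long article takes roughly an hour to read

cat_weight = cat_fans * cat_votes_per_hour              # 100,000 votes/hour behind cat pictures
article_weight = article_fans * article_votes_per_hour  # 150 votes/hour behind long articles

print(cat_weight / article_weight)  # ~667x more voting pressure on the cat pictures
```

Even if every single long-article reader downvotes every cat picture they see, that ratio never gets anywhere near even.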
Basically, karma systems are good at finding something that some group of people will like, but if that's all you want, subreddits are a bad idea, because the entire point of them is to focus on a particular thing and exclude other things that very likely more people would like to see.
Like I said, all the true* subreddits show that there are people who want to have more focused subreddits. The idea that focus should be impossible if a sufficient number of people who don't have that focus stumble upon the subreddit just doesn't seem very reasonable to me.
> Why are a small handful of unaccountable people determining what is a "negative contribution", on the sly no less
On some of the smaller (but not tiny) subreddits, a submission can be both against the rules and plainly a bad idea, in that it works against the subreddit's goals.
But because voting is anonymous, no matter how much everyone complains about them being upvoted, no matter who weighs in and admonishes people to stop upvoting the garbage, it still occurs.
Over time, people submitting good content stop doing so, and people who submit bad content start doing it more. Vicious circle.
Mods can't just delete those posts either, because this reddit gets pissy that they're playing favorites.
It would be better if the mods were able to say "if you think this is upvoteworthy for this subreddit, you are no longer welcome here".
Instead of each subreddit being some big generic free-for-all, it could stay on topic. Could get rid of a lot of fluff. And you could do it without those who were banned being able to claim it was out of personal animosity.
> Mods can't just delete those posts either, because this reddit gets pissy that they're playing favorites.
You absolutely can, and it's easy to do so. I moderate a small sub, and you can remove individual posts with the click of a button if you don't think they're a quality contribution.
But beyond that, AutoModerator allows you to automatically remove posts according to criteria that you set up in advance. You can ban certain sites, titles with certain keywords, certain types of content, posts from certain users, and so on. It makes moderation much, much easier.
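(Purely as an illustration of the idea, not how AutoModerator is actually configured: its real rules are YAML on the subreddit wiki. Here's a rough sketch of the same kind of rule-based removal done with PRAW, the Python Reddit API wrapper; the subreddit name, credentials, domain, and keywords are all made-up placeholders.)

```python
import praw

# Placeholder credentials; a real bot would load these from config or env vars.
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    username="YOUR_MOD_ACCOUNT",
    password="YOUR_PASSWORD",
    user_agent="filter-sketch/0.1",
)

BANNED_DOMAINS = {"example-blogspam.com"}    # hypothetical
BANNED_KEYWORDS = {"cat picture", "meme"}    # hypothetical

# Watch new submissions and remove anything matching the criteria --
# the same effect as an AutoModerator rule with `action: remove`.
for submission in reddit.subreddit("longformarticles").stream.submissions():
    title = submission.title.lower()
    if submission.domain in BANNED_DOMAINS or any(k in title for k in BANNED_KEYWORDS):
        submission.mod.remove()
```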
There is always the problem that users will discover the filters and revolt if the mods haven't been transparent about them. /r/technology users threw a fit when they discovered that the mods had been filtering stories with NSA in the title. Lots migrated over to /r/futurology as a result.
> It would be better if the mods were able to say "if you think this is upvoteworthy for this subreddit, you are no longer welcome here".
Banning hundreds or thousands of users in one fell swoop for having upvoted a single low-effort post would be far more disruptive and unjust than simply removing the offending post.
Lock the thread (with AutoModerator) and explain why. No need to disappear it.
> Over time, people submitting good content stop doing so, and people who submit bad content start doing it more. Vicious circle.
Is this really the case? It's a common complaint, but it's usually presented without any backing. Why is the content necessarily "bad" if the community has chosen to feature it?
> Mods can't just delete those posts either, because this reddit gets pissy that they're playing favorites.
Why do you suppose that is?
> It would be better if the mods were able to say "if you think this is upvoteworthy for this subreddit, you are no longer welcome here".
Leaving aside the fact that you literally just advocated for thought crime, the problem is that an upvote can mean multiple things.
(Don't say "reddiquette" either, I'm talking about use in reality)
An upvote can mean "like", or "on topic", or "contributes to the discussion", or about 20 other positive things. A downvote can have just as many other meanings in the other direction. The other thing is that with multireddits and /r/all, "on-topic-ness" isn't even going to be evaluated by the users unless they're looking at the front page of the community itself.
I really don't see how more covert moderation is a valid answer to abuse of non-covert moderation.
Removing the single submission doesn't fix the problem.
The people who upvoted things that shouldn't be upvoted are the problem. They may never notice that it was removed anyway.
So now the subreddit (which may only have 15,000 users) has to staff dozens of moderators just to keep track of the shit.
And those posts still polluted the subreddit. They can't be instantaneously removed.
> Why do you suppose that is?
Because in groups of thousands and tens of thousands of people, there's always someone who thinks not only that the meme post was funny, but that everyone should lighten up and allow it anyway.
They argue publicly in a pseudo-anonymous forum. And you can't make them stop either, or even more backlash occurs. Nor can you remain silent, or out-argue them.
So a flame war erupts, and everything goes off-topic even more.
> An upvote can mean "like", or "on topic", or "contributes to the discussion", or about 20 other positive things. A downvote can have just as many other meanings in the other direction. The other thing is that with multireddits and /r/all, "on-topic-ness" isn't even going to be evaluated by the users
Unless those users are punished for failing to evaluate.
Guess we're going to have to agree to disagree on this - but let me know if you ever start a community website, because I want absolutely no part of a place where I can be punished for not liking things the staff wants me to like.
If you're going to run a community like that, why even have the people there in the first place? What you've just described is a blog.
I think what they meant was more along the lines of "for upvoting the things that are supposed to not be what the website is about".
If you are referring to "Unless those users are punished for failing to evaluate":
I think that was meant to mean "unless those users are punished for not distinguishing whether the thing was in an appropriate location, and therefore upvoting something which does not belong where it is."
Not sure, but seems like a likely interpretation to me.