
Possibility for pre-moderation of proposals and comments #2503

Closed
ahukkanen opened this issue Jan 16, 2018 · 15 comments
Labels
module: comments, module: proposals, type: feature (PRs or issues that implement a new feature)

Comments

@ahukkanen
Contributor

This is a Feature Proposal

🎩 Description

Some public organizations are wary of publishing anything that hasn't gone through them first. Spam attacks on their sites may cause unsolicited content to appear under the public organization's brand. This is not a concern when user accounts are verified before users are allowed to post content, but it is a real concern when no such verification is done. Smaller organizations in particular may want to verify user accounts manually.

Therefore, it should be possible to pre-moderate the content that appears on the site: when anyone posts content (ideas, comments), it would first go into a hidden state and only appear on the site after a proper moderation process.

I believe this should be applied at least to ideas and comments.

Technically this would work as follows:

  • A new idea or comment is posted on the site
  • The system checks whether pre-moderation is enabled (I believe a global setting would be enough)
  • If so, a new report is added directly against this content with a new reason flag, "automated". A new moderation is also added against this content with the hidden_at field set to the current time. This hides the content from the site until it has been moderated (see the sketch after this list).
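
As a rough illustration, a minimal sketch of this flow could look like the snippet below. It reuses the existing Decidim::Report and Decidim::Moderation models and assumes the proposed enable_premoderation setting plus a new "automated" report reason; the premoderate! helper name is purely illustrative.

# Hypothetical helper, not existing Decidim code: hide newly created content
# immediately by reusing the built-in reporting/moderation models.
# Association names may differ between Decidim versions.
def premoderate!(reportable, author)
  return unless Decidim.enable_premoderation # proposed global setting

  moderation = Decidim::Moderation.create!(
    reportable: reportable,
    participatory_space: reportable.participatory_space,
    hidden_at: Time.current, # hidden until an admin reviews ("publishes") it
    report_count: 1
  )

  Decidim::Report.create!(
    moderation: moderation,
    user: author,
    reason: "automated" # new reason flag proposed above
  )
end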

Another required change would be to apply this to the moderation views in the admin section. This is not strictly necessary for the functionality, but otherwise admin users could be confused by the "report" terminology. I would suggest that when pre-moderation is enabled, the following changes are made in the admin section:

  • The default filter for the moderations would be the hidden moderations (since this is where to start from with pre-moderation)
  • The "Unreport" term would change to "Publish"

The new global configuration could be applied in Decidim's initializer and could look like this:

Decidim.configure do |config|
  # ...
  config.enable_premoderation = true
  # ...
end

@mrcasals added the module: proposals, module: comments, and type: feature labels on Jan 16, 2018
@mrcasals
Contributor

Hi @ahukkanen! I like the idea and the suggested implementation. It's simple, it reuses most of the logic we already have, and the changes seem small. I'd like the people from @decidim/product to chime in on this, but the rationale behind this seems pretty clear to me.

Unfortunately, I don't think we have time to work on this as of now. Would you like to send us a PR? 😄

@ahukkanen
Contributor Author

@mrcasals We would like to send a PR if we get hired to work on this feature. I understand it's not a priority for you, but it's good to have it here so that it has been mentioned somewhere (and I can point to this conversation).

We've heard this need from a couple of sources already, but not yet from anyone who'd be willing to invest in this feature.

@mrcasals
Contributor

Sure, I understand the issues, no problem at all!

@oriolgual
Contributor

I'm actually 👎 on this, regardless of the funds. Yes, pre-moderation can be useful to prevent spam, but it can also be used as a tool to censor content that the admins don't like. We already have a feature to report spam or malicious content.

If admins really want to know about everything, one could add a callback or something similar to a model and send emails each time some content is created.
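
For what it's worth, a minimal sketch of that alternative could look like this (plain Rails callbacks; the module and mailer names are illustrative, not existing Decidim code):

# Hypothetical concern sketching the "callback + email" alternative;
# AdminNotificationMailer is an illustrative name, not a Decidim class.
module NewContentNotifier
  extend ActiveSupport::Concern

  included do
    after_create :notify_admins_of_new_content
  end

  private

  def notify_admins_of_new_content
    AdminNotificationMailer.new_content(self).deliver_later
  end
end

# It could then be mixed into the relevant models, for example:
# Decidim::Comments::Comment.include(NewContentNotifier)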

@ahukkanen
Contributor Author

@oriolgual Yes, this would use the same reporting feature that's already built in, just the other way around.

I understand the concern that it may also be used as a censoring tool, but I would like to emphasize that for some organizations it is a real concern that any content can appear under their name. Sometimes the moderation process is delayed (e.g. due to public holidays), which may leave unsolicited content on the website for multiple days, in the worst case even weeks. With pre-moderation this would not happen.

I believe that with post-moderation, too, content can be censored as the maintainer wishes: they can report the content themselves and then moderate it. For example, Facebook is currently actively moderating public content that does not meet its criteria for publicly appropriate content (even if it would otherwise meet their standards).

The difference between the two is that with pre-moderation, the content does not become publicly available under the brand's name unless it has gone through a moderator. Every organization can of course set its standards for moderation and publicly communicate them to the users of the system.

@virgile-dev
Contributor

Hey guys!
We implemented this for a client. We can send you a PR @mrcasals.
The code can be found here: https://github.com/opensourcepolitics/decidim/tree/f-moderation
I'll try to add screenshots to help you understand how it works.
Best,

@oriolgual
Contributor

I agree that post-moderation can also be used as a censoring tool, but at least there's some trace of it; with pre-moderation there's no way to tell 🤷‍♀️

@virgile-dev
Contributor

virgile-dev commented Jan 16, 2018

We were not super psyched about developing a priori moderation, as we feel the same way as you about it. We started off using the standard a posteriori procedure already implemented in Decidim, but with some clients it was not debatable, for practical and legal reasons expressed in their terms and conditions on how they conduct public debate.

We implemented a priori moderation as a feature setting, so it's not activated by default.
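
For illustration, a minimal sketch of such a setting, assuming the feature-settings DSL that Decidim features use (the attribute name is illustrative and not part of core Decidim):

# Hypothetical sketch: an opt-in flag on the proposals feature manifest.
# In practice this would be added inside the existing
# Decidim.register_feature(:proposals) block rather than a new registration.
Decidim.register_feature(:proposals) do |feature|
  feature.settings(:global) do |settings|
    settings.attribute :premoderation_enabled, type: :boolean, default: false
  end
end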

Here is a logical workflow, and screens in French.

@ahukkanen
Contributor Author

This may also become important in Germany because of the new NetzDG law, which requires sites to remove "obviously illegal" content within 24 hours of notice. The fines for non-compliance may be up to €50m.

Obviously this 24-hour limit may be tough for smaller organizations, so they might prefer pre-moderation.

@xabier
Contributor

xabier commented Jan 18, 2018

Hi @ahukkanen,

A quick comment: from the @decidim/product side we don't agree with pre-moderation; it is a very dangerous functionality that dishonest governments could use to pre-censor proposals. Let me carefully read the whole thread and get back to this.

@xabier
Contributor

xabier commented Jan 18, 2018

More on this:

  • Cases of censorship with post-moderation have been combated with screen captures and other means of demonstrating that the content was moderated, so having post-moderation activated does not mean that pre-moderation is the same and thus equally acceptable.
  • There are Participatory eXperience (PX) reasons to avoid pre-moderation: it is a really awful experience to submit a comment or a proposal and have to wait to see it published. It would be suicidal to do this; Twitter, Facebook or any other social network would never have survived with pre-moderation.
  • If a country has a law that prevents freedom of speech, it is not Decidim's mission to adapt to anti-democratic laws. I am sorry to say it so plainly or rudely, but this is at the core of this project: Decidim is a technopolitical project for participatory democracy.

This being said, we really appreciate the feedback and we understand it is hard to meet your clients' requirements. We need to find ways to enforce democratic quality and make the software usable for as many people as possible, but, honestly, on this particular matter I don't see a good way to include pre-moderation in the official repo.

We need to open the meta.decidim community in English so we can have these discussions in a more democratic context.

@ahukkanen
Contributor Author

@xabier I completely understand your take on this, and I appreciate you explaining the reasoning behind it. I also appreciate you pushing for openness.

@virgile-dev Would it make sense to discuss further the idea of extracting this feature into its own module/gem? Where should we move this conversation?

@virgile-dev
Contributor

Hey @ahukkanen,
We feel the same as the Decidim team about a priori moderation.
We've implemented this as an optional feature because for some clients it became a deal breaker for using the platform. But once they have run one participatory process, we think they'll realize they are hardly moderating anything (like 0.5%) and will switch naturally to the a posteriori process.
It'd be great to find a way to implement it as a module/gem; we couldn't figure out a way of doing this. The best place to talk about this is to file an issue here.

@ahukkanen
Contributor Author

@virgile-dev OK, I'll move the discussion there, thanks for the input!

Closing this one.

@ahukkanen
Contributor Author

The discussion has moved to a new thread in the @OpenSourcePolitics repository: OpenSourcePolitics#33

Please give your input there if you are interested in this topic!
