Poll: Do you support these proposed changes?
Yes: 8 votes (88.89%)
Yes, but with modifications (please share them below): 1 vote (11.11%)
No: 0 votes (0%)
No, but I have an alternative proposal (please share it below): 0 votes (0%)
Total: 9 vote(s), 100%

Proposed paedophilia-exclusionary changes to the Code of Conduct
#21
I think I said previously that we need to be conscious that English is not everyone's first language, and things can easily be misinterpreted on both sides. It should also be remembered that in some cultures abuse is more common and viewed differently than in the Western world. No, that doesn't make it okay, but it does mean there can be a degree of ignorance about the material and its availability online, given that it isn't something you would seek out.

However, I am unsure why this requires a discussion, or why this proposal is necessary. We need to be mindful that children can access content here, and the risks go beyond the support section: users promoting this content can share it in private messages (which we have no real way of monitoring), in links in signatures, in avatar uploads and so on. I don't think any of us would want the community forums to become a platform for sharing child abuse material.

On that basis, for someone running a site with the content in question, the only suitable protections would be to ban avatar uploads, suspend signature use, suspend access to private messages and block new posts. That would be the only way we could have these users on the forums with no risk to others. We shouldn't allow anyone sharing content of that nature on the community forums; instead, we could reply to their thread suggesting they seek support elsewhere and permanently suspend their account.

Another factor to consider: if a law enforcement agency were conducting surveillance on one of these websites and gathering the IP addresses of visitors, there is a very real possibility that the link could be clicked without staff or a member being aware of the content. The only way to avoid that would be to rely on the author to tell everyone what the content is, which I'm sure we can agree is highly unlikely, even with massive red banners.

Staff and users can access the community forums using workplace wifi/computers. Could we imagine the consequences of such material being accessed at work? A lost job, law enforcement referrals, social services involvement with people's children?

It could also be traumatic with potentially lasting effects if anyone were to access the material from here.

Finally, five small points:

1. The three members of staff who have replied to this thread are from different backgrounds, countries and cultures. It is really important to remember my opening paragraph.

2. I couldn't in good conscience allow someone sharing child pornography/abuse material to remain on this forum, for the reasons I've shared above. If I become aware of such a website, I always report it to as many agencies as possible: the NCA, FBI, IWF, etc.

3. The debate is respectful, as they always are here, but let's keep it that way. We can do more collectively (staff + ex-staff + community members) than we can alone. We need to work together: staff can't act without the reports, and users rely on staff to act on them.

4. If we feel there needs to be a change to the CoC then we should go ahead and amend it, but I feel it should be a given that this material isn't allowed.

5. Merry Christmas to everyone who celebrates it and our thoughts go out to some of the team members who gave the project so much but are no longer with us.
-Ben

Please do not PM me for support. I am looking to be hired for paid services.

You can view my paid services here.
#22
(2024-12-05, 02:24 AM)Laird Wrote: MyBB is a voluntary community. Communities, especially voluntary communities, have a right to collectively decide who, due to moral deficiency, they do not want to include.

It is, then, very relevant what the majority of this voluntary community wants.

I understand that people have a right to voice their opinion, and sometimes even to have a say in the matter (say, by "democracy"). But this "right" has limitations, regardless of the proportion supporting any such stance (in favour or against).

Regardless of the clarifications you kindly provided, my understanding of the textual modifications you suggest gives me the impression that they will motivate and enable discrimination.

(2024-12-05, 02:24 AM)Laird Wrote: On what basis do you consider that response to be sufficient? You obviously accept that the behaviour was unacceptable enough for support to be denied. Why, in your opinion, did it not rise to the level of a bannable offence?

The community that caused this discussion was questionable, but I have no grounds to claim that anything they shared here or in their community breaks any law; perhaps only moral expectations. I doubt anyone in fact reported the site, and if anyone did, it appears to have resulted in no legal action being taken, even if the authorities had the will to take it.

(2024-12-05, 02:24 AM)Laird Wrote: How welcome do you think any children reading this exchange feel, realising that they may be forced to share space with paedophiles known to be sharing images of children like them in a sexual context, and that MyBB staff will not lift a finger to help?

To put it simply, I don't think your proposed changes will help prevent this, and with them you are also taking away the means for people to contribute to, or communicate with, a community they are also a part of.

From the multiple exchanges, I can see this is a disagreement on fundamental aspects of our individual morality, or our expectations of morality.

For example, you seem open to the idea that administrators or owners of "lolicon"- or "shotacon"-related forums would be affected. For me, that would be an absurd standard for any policy change we make.

I think it is reasonable to deny support to users from such sites, but banning or blocking them entirely by default seems excessive, considering they can already be banned or blocked on a case-by-case basis.

Finally, Devilshakerz's response raises interesting topics I didn't know about. Perhaps we could promote awareness by implementing some "off-platform activity" policy.

But even so, judging from the example pages from Twitch and Discord, I'm not sure that the site that caused this discussion, or the manga genres I mentioned earlier, would be affected in the way I understand is intended.
Support in Spanish


Discord at omar.gonzalez (Omar G.#6117); Telegram at @omarugc;
#23
(Yesterday, 09:57 AM)Omar G. Wrote: Regardless of the clarifications you kindly provided, my understanding of the textual modifications you suggest gives me the impression that they will motivate and enable discrimination.

Discrimination? You can't be serious.
If we report spammers here, are we discriminating against them too? Or if we report forums that violate the existing support guidelines, are we discriminating against them?

That's the end of it for me; I'm saying goodbye to supporting MyBB before I discriminate against any more innocent mass murderers, fraudsters or rapists.

good bye
Lu
support ended 
#24
apologism

(Yesterday, 12:25 AM)Laird Wrote: Regarding "apologism" versus "advertisement", "encouragement", "incentive", "inducement", and "solicitation", it needn't be one or another; multiple descriptors could be used to cover as broad a range of behaviour as possible. I agree though that this is the lesser in severity of the two behaviours.

That's true for enforcement, especially when the text equips maintainers with actions without limiting the available options.

The policy itself must use well-understood terms and not wander into unclear scope. Its numerous open-ended clauses already exist to cover highly interpretative activity.

If the listed, well-defined terms seem satisfactory for the intended purpose of covering apologism, they're already referenced in law (other CoC reasons may also apply; "illegal" leaves the least room for interpretation).

Nonetheless, if you feel the current language needs more redundancy to reiterate what is illegal, some generalized examples are provided here.


final language

When considering the policies, we want to avoid an elaboration on 1 covered category, with 1 concrete scenario, informed by public effects of 1 case, with 1 technical outcome. It may be a good case study or enforcement guide - and seem appealing - but probably doesn't belong in an open-ended document that's expected to handle future cases, not past ones.

If it seems reasonable that the rule should apply:
  • to all members, not just those seeking technical support
  • to all, or other significantly harmful activities
(why the limitation if potential harm is the same or higher?)

then:
  • on-platform activity is covered - for both:
    • more direct activities
    • various encouragement
  • off-platform activity with:
    • reasonable suspicion of on-platform activity is covered (see above)
    • limited relevance to managed platforms may be subject to our connecting clause for negative effects on external environment (or future explicit mention of serious off-platform issues)


scope & enforcement

The large platforms linked as examples explicitly expand their scope to off-platform activity, but with careful caveats (even with their extensive resources). Disregarding the need for those caveats while restricting enforcement (by specifying technical outcomes vulnerable to technical workarounds that may ultimately impair enforcement) makes the response purely procedural rather than effective.

To understand the balance better: the guiding principle in addressing material violations in the CoC and SEP is appropriate and effective harm reduction.

As the CC spells out (and, in its latest version, mandates) appropriate and fair corrective action in response to unacceptable behavior, outcomes are determined by risk, likelihood, and negative effects. These factors and risk-limiting tools are not always available publicly (or can't be), and maintainers will be the most familiar with dispute status and enforcement problems.

Numerous options and account restrictions are thus available, including banning, but it may not always be most effective. The CC leaves matching violations and consequences open to make enforcement easier, not harder.


Both code and policies should be informed by best practices. Feel free to dispute them with references (CoC templates, their interpretation, off-platform activity standard).
devilshakerz.com/pgp (DF3A 34D9 A627 42E5 BC6A 6750 1F2F B8AA 28FF E1BC) ▪ keybase.io/devilshakerz

