Topic: Thoughts on a certain Twitch policy...
ParanoidObsessive
06/05/22 5:58:36 PM
#17:


Zareth posted...
This is up there with suspending a kid who got his ass kicked at school, for being "involved in a fight"

Well, like most things in life, context is key. Did the kid get jumped because he was being a provocative abusive shit? Yes? Then fair, next. Was he getting bullied through no fault of his own? Then yeah, that's unfair and he shouldn't be punished because other people are assholes.

The problem is, actually disentangling what the context IS is hard when you're dealing with a bunch of kids, and the only potential witnesses are also kids (who are shitty in general and very apt to lie) or bitter, tenured teachers (who probably don't care enough to pay attention). So it's way easier for administrators to just knee-jerk toss everyone and call it a day as opposed to actually figuring out what really happened.

In Twitch's case, the problem is this:

adjl posted...
Of course, that would require actual effort on Twitch's part and not just an algorithm that does all the work for free, so they probably don't like that idea.

On some level, it becomes almost impossible to actually have human oversight on every facet of content, because of the sheer amount of content being produced on a daily basis. More minutes of video get uploaded to YouTube every day than any given human could watch in their entire lifetime. Twitch probably isn't much better - there are fewer content producers there, but most of them stream far more regularly, and for far longer, than any YouTuber uploads (I've heard estimates that on average it's something like 2.5 million hours of content per day - and still growing).

To be even remotely effective, both YouTube and Twitch would probably have to hire thousands of overseers whose sole job is to try and watch content and mediate disputes fairly, and it still wouldn't be enough. Yes, they lean into just relying on the various algorithms because it's easy and cheap, but it's also one of the few viable ways to handle that much data without bloating operating costs to the point where it isn't worth running the service in the first place.

It's far more profitable to just unfairly punish (or ban) the occasional innocent content producer and shrug at the collateral damage than it is to come up with an actually fair, rational, human-centric system.

And let's be honest, it's not like having humans oversee moderation is going to automatically result in a fairer system anyway. People here can't seem to shut the fuck up about how terrible and biased the mods are and how unfairly persecuted they are every time they borderline break the ToS (or blatantly break the ToS). It wouldn't necessarily be any different on YouTube or Twitch. You'd just have a situation where, instead of an uncaring computer program dismissing your appeals and upholding your ban, you'd have humans doing it, and it would just piss people off more.

---
"Wall of Text'D!" --- oldskoolplayr76
"POwned again." --- blight family