• 0 Posts
  • 120 Comments
Joined 1 month ago
Cake day: May 24th, 2024

  • ssj2marx@lemmy.ml to Lemmy Shitpost@lemmy.world · Stay Mad, Tankies · 7 hours ago

    but I don’t believe authoritarianism is the best way to go about it.

    Humor me for a moment: which of the following do you consider authoritarian?

    • asking your boss for better wages
    • using the power of a union to force your boss to give your coworkers better wages
    • using the power of the state to force all bosses to pay all workers better wages



  • ssj2marx@lemmy.ml to Lemmy Shitpost@lemmy.world · Stay Mad, Tankies · edited · 11 hours ago

    we can’t fix our current situation in one election.

    We can never fix the current situation in one election. Fixing the American system, within the parameters set forth by that system, requires a dedicated voting bloc that holds out across multiple elections, refusing to vote for the Dems until they shift far enough left to appease it. As long as you are focused only on the next election, your prescription for fixing American politics is just as unrealistic as a random Twitter tankie declaring a general strike.

  • Who will be the judge?

    The same people who should judge every criminal proceeding. Of course it’s not going to be perfect, but this is a case of not letting the perfect be the enemy of the good. Allowing generated or drawn images of sexualized children to exist imposes external costs on society by normalizing the concept.

    The argument that making generated or drawn CSAM illegal is bad because the feds might plant such images on an activist is incoherent. If you’re worried about that, why not worry that they’ll plant actual CSAM on your computer?


  • there cannot be developed a scale or spectrum to judge where the fake stops and real starts

    Ah, but my definition didn’t rely at all on whether the images were “real” or “fake”, did it? An image is not merely an arrangement of pixels in a jpeg, you understand - an image has a social context that tells us what it is and why it was created. It doesn’t matter whether there were real actors or not: if it’s an image of a child and it’s being sexualized, it should be considered CSAM.

    And yes, I understand that this will always be a subjective judgement with a grey area, but not every law needs a perfectly defined line where the legal becomes the illegal. A justice system should not be a computer program that simply runs the numbers and delivers an output.