I’m rather curious to see how the EU’s privacy laws are going to handle this.

(Original article is from Fortune, but Yahoo Finance doesn’t have a paywall)

  • stealthnerd@lemmy.world · 10 months ago

    This is an article about unlearning data, not about not consuming it in the first place.

    LLMs are not storing learned data in its raw, original form. They are ingesting it and building an understanding of language based on it.

    Attempting to peel out that knowledge would be incredibly difficult, if not impossible, because there’s really no way to identify it.

    • Eccitaze@yiffit.net · 10 months ago

      And we’re saying that if peeling out knowledge that someone has a right to have forgotten is difficult or impossible, that knowledge should not have been used to begin with. If enforcement means big tech companies have to throw out models because they used personal information without knowledge or consent, boo fucking hoo, let me find a Lilliputian to build a violin for me to play.