• takeda@lemmy.world · 3 months ago

    I find it surprising that anyone is surprised by it. This was my initial reaction when I learned about it.

    I thought that since they know the subject better than I do, they must have figured this one out and I simply didn't understand it. But if a model needs training data to learn to create something, you can't just use the model's own output as that data. It's similar to not being able to generate truly random numbers algorithmically without some external input.
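    A toy illustration of the claim (my own sketch, not from the comment): fit a Gaussian to some data, sample fresh "training data" from the fitted model, refit, and repeat. Estimation noise compounds across generations and the fitted spread drifts toward collapse, which is the basic failure mode of training a model on its own output.

```python
import random
import statistics

random.seed(0)

def fit(samples):
    """Estimate (mean, stddev) of a 1-D Gaussian from samples."""
    return statistics.fmean(samples), statistics.pstdev(samples)

def generate(mean, std, n):
    """Draw n samples from the fitted model."""
    return [random.gauss(mean, std) for _ in range(n)]

data = generate(0.0, 1.0, 50)          # "real" data from N(0, 1)
std_history = []
for generation in range(200):
    mean, std = fit(data)              # train on the current data
    std_history.append(std)
    data = generate(mean, std, 50)     # next generation trains on model output

print(f"initial std ≈ {std_history[0]:.3f}, final std ≈ {std_history[-1]:.3f}")
```

    Each refit slightly underestimates the spread on average, and there is no external input to pull the estimate back, so the distribution narrows generation after generation.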

    • aes@programming.dev · 3 months ago

      Sounds reasonable, but a lot of recent advances come from letting the machine train against itself, or against a twin/opponent, without human involvement.

      As an example of just running the thing on itself, consider a neural network given the objective of re-creating its input through a narrow layer in the middle (an autoencoder). The bottleneck forces a compressed description (e.g. age/sex/race/facing left or right/whatever) of the feature space.
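      A minimal sketch of that bottleneck idea (my own example, assuming a linear autoencoder and synthetic data): 5-dimensional inputs that really live on a 2-D subspace are squeezed through a 2-unit middle layer, and plain gradient descent on the reconstruction error learns the compressed description.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data that truly lies on a 2-D subspace of R^5.
latent = rng.normal(size=(200, 2))
basis = rng.normal(size=(2, 5))
X = latent @ basis

d, k = 5, 2                                   # input width, bottleneck width
W_enc = rng.normal(scale=0.1, size=(d, k))    # encoder weights
W_dec = rng.normal(scale=0.1, size=(k, d))    # decoder weights

lr = 0.01
losses = []
for step in range(500):
    H = X @ W_enc                 # narrow hidden layer: shape (200, 2)
    X_hat = H @ W_dec             # reconstruction: shape (200, 5)
    err = X_hat - X
    losses.append(float(np.mean(err ** 2)))
    # Gradient steps on the mean squared reconstruction error.
    g_dec = H.T @ err / len(X)
    g_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

print(f"reconstruction loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

      The only "supervision" is the input itself; the narrow layer is what forces the network to discover structure rather than copy the data through.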

      Another is the GAN, where a generator and a spot-the-fake discriminator are run against each other until the fakes get good.
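      A stripped-down sketch of that fake-vs-spot-the-fake loop (my own toy example, not from the comment): a one-parameter "generator" g(z) = z + b tries to make its samples look like real data from N(3, 1), while a logistic "discriminator" D(x) = sigmoid(w·x + c) tries to tell them apart. Both take alternating gradient steps, so the shift b is pushed toward the real mean with no human labels involved.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

w, c = 0.0, 0.0        # discriminator parameters
b = 0.0                # generator shift (the real mean is 3.0)
lr_d, lr_g = 0.1, 0.1

for step in range(1500):
    real = rng.normal(3.0, 1.0, size=128)
    fake = rng.normal(0.0, 1.0, size=128) + b

    # Discriminator: gradient ascent on log D(real) + log(1 - D(fake)).
    p_real = sigmoid(w * real + c)
    p_fake = sigmoid(w * fake + c)
    w += lr_d * (np.mean((1 - p_real) * real) - np.mean(p_fake * fake))
    c += lr_d * (np.mean(1 - p_real) - np.mean(p_fake))

    # Generator: gradient ascent on log D(fake) (non-saturating loss),
    # i.e. move the fakes toward where the discriminator says "real".
    p_fake = sigmoid(w * fake + c)
    b += lr_g * np.mean(1 - p_fake) * w

print(f"generator shift b ≈ {b:.2f} (real mean is 3.0)")
```

      Real GANs use deep networks on both sides, but the adversarial structure is the same: each side's training signal comes entirely from the other side.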