• 0 Posts
  • 853 Comments
Joined 11 months ago
Cake day: August 21st, 2024


  • lime!@feddit.nu to 196@lemmy.blahaj.zone · Destination rule
    23 hours ago

    it’s like that Mythbusters experiment where they fire a football from a truck. if something falls off of a vehicle, it’s still moving at the speed of the vehicle.

    now being behind an unevenly packed lumber carriage when approaching a tunnel, that would make me nervous.


  • lime!@feddit.nu to Fuck AI@lemmy.world · Definitely Not Stealing (Art by DragonsofWales)
    edited · 2 days ago

    it still is, which is why shit like this is so crazy: it happens a lot.

    the model retains basically nothing of the original image, less than a single bit per work it ingests (the LAION-5B dataset is almost 6 billion images, most models train on more than double that, the models themselves are 5-6 GB, and each image is 1024x1024 pixels), so the images can't be in there. and yet most of these models can manage to stitch their input data back together almost perfectly. it's like the model splits images into their constituent parts and rebuilds them the same.
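    a rough back-of-envelope in Python using the figures above (model size, dataset size, and image dimensions are all approximations pulled from the comment, so treat the result as an order of magnitude, not an exact bit count):

```python
# Back-of-envelope: how much storage budget does a model have per training image?
# All figures are rough assumptions from the comment:
#   - model weights: ~5.5 GB (comment says 5-6 GB)
#   - training set: roughly double LAION-5B's ~5.85 billion images
#   - each image: 1024x1024 RGB, uncompressed

model_bytes = 5.5e9
num_images = 2 * 5.85e9
image_bytes = 1024 * 1024 * 3  # uncompressed RGB bytes per image

bytes_per_image = model_bytes / num_images
bits_per_image = bytes_per_image * 8
retention_ratio = bytes_per_image / image_bytes

print(f"weight budget per image: {bits_per_image:.1f} bits")
print(f"fraction of each image the weights could hold: {retention_ratio:.1e}")
```

    depending on the exact figures this lands anywhere from under a bit to a few bits per image, i.e. roughly one ten-millionth of each image's raw pixel data, which is the point: verbatim storage is impossible at that ratio.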

    from a technical standpoint it’s amazingly unlikely. from a human perspective it’s scummy. from a legal perspective it’s 100% plagiarism.