Skanky@lemmy.world to No Stupid Questions@lemmy.world · 1 year ago

From a legal/evidence perspective, what is going to happen when it becomes impossible to tell the difference between a video generated by AI and the real thing?
BlameThePeacock@lemmy.ca · 1 year ago

Even that isn't possible. Hashing can only confirm that a file hasn't been modified since the hash was created. If you created an entirely new file, there's no way to prove it wasn't faked before the signature was applied.
Buddahriffic@lemmy.world · 1 year ago

Also, if you do modify an existing file, what's to stop you from generating new hash information that is consistent with the modified media?
BlameThePeacock@lemmy.ca · 1 year ago

The timestamp of the original hash being sent to a central server. That's the whole point of sending that hash close to when the recording happens.
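The scheme being discussed can be sketched roughly like this. This is a minimal illustration, not anything the commenters actually built: the `fingerprint` function and its fields are hypothetical, and in a real deployment the timestamp would be assigned and signed by the trusted server (e.g. an RFC 3161 timestamping authority), not by the client.

```python
import hashlib
from datetime import datetime, timezone

def fingerprint(media_bytes: bytes) -> dict:
    # SHA-256 digest of the raw file: any later modification,
    # however small, produces a completely different digest.
    digest = hashlib.sha256(media_bytes).hexdigest()
    # Record when the hash was taken. NOTE: a client-side timestamp
    # proves nothing by itself; the point of the central server is
    # that *it* records when the hash arrived.
    return {
        "sha256": digest,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

record = fingerprint(b"frame data from the camera")
print(record["sha256"])
```

Note that this only establishes "this exact file existed no later than time T", which is the first commenter's point: it says nothing about whether the file was genuine or AI-generated before the hash was submitted.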