How will the upcoming AI legislation around the world, like voice cloning prevention and disclosure requirements for technical details of models, affect open source or self-hosted models?
How do you implement voice cloning prevention? Human voices aren’t that unique. Also, AI voice cloning isn’t perfect. So… at what threshold is a voice considered “cloned” from a legal perspective?
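To make that threshold problem concrete: any enforcement scheme would presumably boil down to comparing speaker embeddings against some cutoff. Here’s a rough sketch of what that check would look like; the embeddings would come from some speaker-encoder model, and the 0.75 cutoff is completely made up, which is exactly my point.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Standard cosine similarity between two speaker embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_cloned(original_embedding: np.ndarray,
              suspect_embedding: np.ndarray,
              threshold: float = 0.75) -> bool:
    """Naive 'is this a clone?' test: similarity above an arbitrary cutoff.

    Sound-alike humans can score above any threshold you pick, and a
    deliberately tweaked clone can score below it.
    """
    return cosine_similarity(original_embedding, suspect_embedding) >= threshold

# Toy usage: random vectors standing in for real voice embeddings.
rng = np.random.default_rng(0)
voice_a = rng.normal(size=256)
voice_b = voice_a + rng.normal(scale=0.3, size=256)  # a "similar" voice
print(is_cloned(voice_a, voice_b))
```

Wherever you set that number, you’re either catching people who just naturally sound alike or letting deliberately tweaked clones through.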
I mean, people couldn’t tell the difference between Scarlett Johansson and OpenAI’s “Sky” voice, which was not cloned.
If you’re selling or publishing a voice in a way that impersonates another person without their consent, that may be identifiable and prosecutable. “Generate with X’s voice.” “Talk to X.” Etc. The exact wording isn’t necessary if the intent is evident from pictures or evasive descriptions that make an obvious implication.
If the prosecution can find evidence of cloning/training, that can also serve as a basis.
In these ways it doesn’t have to be about the similarity or quality of the produced voice, or about sound-alike people, at all.
You make AI voice generation sound like it’s a one-step process, “clone voice X.” While you can do that, here’s where it’s heading in reality:
“Generate a voice that sounds like a male version of Scarlett Johansson.”
“That sounds good, but I want it to sound smoother.”
“Ooh that’s close! Make it slightly higher pitch.”
In a process like that, do you think Scarlett Johansson would have legal standing to sue?
What if you started with cloning your own voice but after many tweaks the end result ends up sounding similar to Taylor Swift? Does she have standing?
In court, you’d have expert witnesses saying they don’t sound the same. “They don’t even have the same inflection or accent!” You’d have voice analysis experts saying their voice patterns don’t match. Not even a little bit.
But about half the jury would be like, “yeah, that does sound similar.” And you could convict a completely innocent person.
You mean my second point does? Would you agree with my first point? Do you see it as independent of the process and the act of such a creation?
The same applies to the creation or training process. If they trained with voice samples or have a collection of voice samples for matching, then those could serve as evidence or indications.
Like I said initially, how do we legally define “cloning”? I don’t think it’s possible to write a law that prevents it without also creating vastly more unintended consequences (and problems).
Let’s take a step back for a moment to think about a more fundamental question: do people even have the right to NOT have their voice cloned? To me, that is impersonation, which is perfectly legal (in the US) as long as you don’t claim that it’s the actual person. That is, if you impersonate someone, you can’t claim it’s actually that person, because that would be fraud.
In the US—as far as I know—it’s perfectly legal to clone someone’s voice and use it however TF you want. What you can’t do is claim that it’s actually that person because that would be akin to a false endorsement.
Realistically—from what I know about human voices—this is probably fine. Voice clones aren’t that good. The most effective method is to clone a voice and use it in a voice changer, with a voice actor who can mimic the original person’s accent and inflection. But even that has flaws that a trained ear will pick up.
Ethically speaking, there’s really nothing wrong with cloning a voice. Because—from an ethics standpoint—it is N/A: There’s no impact. It’s meaningless; just a different way of speaking or singing.
It feels like it might be bad to sing a song using something like Taylor Swift’s voice but in reality it’ll have no impact on her or her music-related business.
I think the main idea is to codify the act as illegal, so if it’s discovered that someone used voice cloning (for, like, a telephone scam or something), then they can be charged for that too. But yeah, it might be hard to prove without a lot of evidence.
Human voices aren’t that unique.
Duuuuuuuddeeeeee lol… come on now.
My ex had an uncle with exactly my voice. Cadence, accent, inflection … it was uncanny.
From the perspective of human perception, people’s voices are only unique to roughly one in a few thousand. There are a few outliers with much more distinctive voices, but believe it or not, there are a lot of people walking around on this earth who sound just like Morgan Freeman, James Earl Jones, and other voices people think are one of a kind.
I view an anti-cloning law as too risky: It sounds exactly like the type of thing that would prevent Grandma from cloning her own voice before going down for surgery because it just so happens to sound a lot like a famous person.
One country was already setting up copyright on your voice so AI can be served takedown notices. Voices are quite unique; it’s how my bank verifies who I am. If somebody clones my voice via AI, it could fool that login system.
I work for a huge bank and we tested voice recognition technology: Even under the best circumstances (high quality microphone with no ambient noise in a sound booth), it was far, far too easy to copy someone else’s voice by simply playing back a sliced up recording a la Sneakers (the movie). We ruled it out as an option over a decade ago.
The problem was fundamental and had nothing to do with the quality of the technology. If your bank is using your voice as a unique identifier they had better be using something else in addition to it! Because it’s super insecure.
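If anyone wants to see what I mean, here’s a bare-bones sketch (the names and numbers are all made up, not our actual system) of why the voice score can only ever be one factor: a replayed or cloned recording of the real customer scores just as high as the real customer, so an independent factor has to do the real work.

```python
import hmac

def verify_login(voice_score: float,
                 presented_otp: str,
                 expected_otp: str,
                 voice_threshold: float = 0.8) -> bool:
    """Illustrative multi-factor check.

    voice_score is whatever similarity the biometric system reports.
    A playback attack reproduces the enrolled voice almost exactly, so
    the voice check is trivially passed by an attacker with a recording;
    only the second factor (OTP, hardware key, etc.) stops them.
    """
    voice_ok = voice_score >= voice_threshold
    otp_ok = hmac.compare_digest(presented_otp, expected_otp)  # constant-time compare
    return voice_ok and otp_ok

# A "perfect" replay attack: the voice matches, but the login still fails
# without the second factor.
print(verify_login(voice_score=0.99, presented_otp="123456", expected_otp="654321"))
```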
There are other criteria like account number, etc. But for the voice part, they ask you specific questions live. But I get it. That’s why I have a hardware key for platforms that support it.
Your bank is run by fucking morons if they’ve allowed voice verification at any point after ~2 years ago.
It’s a kind of profound, logarithmic stupidity that increases exponentially every day as voice cloning technology gets better and better.
They are fucking stupid and don’t give one minuscule fuck about the security of your account.
Well yes, they are a bank. Lol. I moved my regular accounts to a credit union because I was sick of the bank’s problems. But I still have a disability retirement fund with the bank because it’s a special government account for my child.
I would suggest that AI regulation should affect ALL models. No one should be exempt. I would also love AI regulation that makes it mandatory to tag AI-generated or AI-assisted content as such.
Before any of that can happen we need some non-ambiguous definitions of what “AI” is.
Jordan Peterson: “But define AI first”
“upcoming AI legislation around the world”
this is so broad that it is impossible to answer.
if you can point to an individual piece of legislation and its actual text (in other words, not just a politician saying “we should regulate such-and-such” but actually writing out the proposed law) then it would be possible to read the text and at least try to figure it out.
@spit_evil_olive_tips@beehaw.org One example is the EU AI Act. Its requirements for open source models are very lenient, only requiring a summary of the training data and disclosure of training details like the computational power used. Proprietary models, on the other hand, are required to implement content filtering, etc.
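For a self-hosted or open-weight model, that obligation basically amounts to shipping a disclosure document alongside the weights. Something like the sketch below; the field names and numbers are my own invention for illustration, since the Act describes what has to be disclosed but doesn’t prescribe a schema.

```python
# Hypothetical machine-readable training disclosure for an open-weight model.
# Field names and values are illustrative, not a format prescribed by the EU AI Act.
training_disclosure = {
    "model_name": "example-open-llm-7b",
    "training_data_summary": (
        "Publicly available web text and permissively licensed code; "
        "no private or biometric data."
    ),
    "compute": {
        "hardware": "512x A100 80GB GPUs",
        "training_time_hours": 21_000,
        "estimated_flops": 1.2e23,
    },
    "license": "Apache-2.0",
}
```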
That’s the neat part, it won’t






