Anyone know how many hours of training data it takes to build a convincing model of someone’s voice? It was tens of hours when I did a bit of research a year ago… the article says social media is the likely source of training data for these scams, but that seems unlikely at this point.
A current state-of-the-art AI model from Microsoft can achieve acceptable quality with about 3 seconds of audio. Commercially available tools like ElevenLabs need about 30 minutes. Quality will obviously vary heavily, but then again they’re working over a low-quality phone call, so maybe that doesn’t matter much.
With that little, they may be able to recreate the timbre of someone’s voice, but speech carries a multitude of other identifiers and idiosyncrasies that they’re unlikely to get with that little audio, like personal vocabulary (we don’t choose the same words and phrasings for things), specific pronunciations (e.g. “library” vs “libary”), voice inflections, etc. Obviously, the more training data you have, the better the output.
That’s downright scary :-) I think it took longer in the last Mission Impossible.
30 minutes is still pretty minimal for the kind of targeted attack it sounds like this is used for. I suppose we all need to work with our families on code words or something.
I went in thinking the article was a bit alarmist, but that’s clearly not the case. Thanks for the insight.
BADONK!
Good luck criminals. I ignore nearly every call.
Whoever is stupid enough to think that Tom Hanks is calling them personally probably needs a court-appointed guardian.
Unless you actually know Tom Hanks personally and are expecting a call from him, of course.
I cloned my own voice to prank a friend, and… Wow, it was a gut-dropping moment when I understood just how dangerous this tool is for precisely this type of scam.
It’s one thing to hear about it, but to actually experience it… Terrifying.
Mind sharing more info about the prank? Sounds like an interesting story
Oh, it was nothing more than just showing off the technology, really. It wasn’t a committed bit.
I cloned my voice and then left a voicemail that said something like: “Hey buddy, it’s me. My car broke down and I’m at… Actually, I don’t know where I’m at. I walked to the gas station and borrowed this guy’s phone. He said he’ll give me a ride into town if I can get him $50. Could you Venmo it to him at @franks_diner? I’ll pay you back as soon as I can find my phone. … By the way, this is really me, definitely not a bot pretending to be me.”