- Scammers can use generative artificial intelligence to imitate the voice of someone you know and communicate with you in real time.
- According to McAfee, about 52% of Americans share their voice online, making it easy for scammers to copy it.
- This is called an interactive voice response (IVR) and is used in a type of scam called voice phishing, or “vishing.”
Many of us now know that tax offices, car warranty companies and the like will not call us about urgent fines or fees that we have to pay in prepaid cards. However, the nearly $300 million fine the Federal Communications Commission levied against a massive transnational robocalling operation shows just how widespread this problem has become.
But what about when the voice of someone you know is on the other end of the line — your CEO, your spouse or your grandchild — urgently asking for money to help them get out of trouble?
With generative artificial intelligence now able to imitate the voice of someone you know and converse with you in real time, even that call is no longer inherently reliable.
The phone system was built on trust, says Jonathan Nelson, director of product management at call analytics and software company Hiya Inc. “Before, we could assume that if your phone rang, there was a physical copper wire that we could follow all the way between those two points, and that’s gone,” Nelson said. “But the trust that it implied hasn’t.”
Now the only call you can truly trust is the one you place yourself. And with roughly a quarter of all calls from numbers outside your contacts reported as spam (whether outright fraudulent or simply a nuisance), according to Hiya’s Global Call Threat Report Q2 2023, that’s a lot of verifying.
A report on AI and cybersecurity from digital security company McAfee says that 52% of Americans share their voice online, which gives scammers the main ingredient needed to create a digitally generated version of a person’s voice and use it to victimize the people they know. This is called an interactive voice response (IVR) and is used in a type of scam called voice phishing, or “vishing.” While this kind of spear phishing used to take a lot of time and money, Nelson said “generative AI can kind of take what was once a very specialized spam attack and make it much more commonplace.”
According to McAfee CTO Steve Grobman, these types of calls are likely to remain rarer than other, more obvious spam calls, at least for now. However, “they put the victim in a more precarious situation where they are more likely to act, which is why it’s important to be prepared,” Grobman said.
That preparation depends on a combination of consumer education and technological warfare — or, more precisely, white hat AI fighting black hat AI.
Companies like McAfee and Hiya are on the front lines of this fight, spotting AI-driven scam schemes — for instance, by analyzing call-history patterns that work somewhat like a credit history for phone numbers — and finding ways to block them.
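To make the "credit history for phone numbers" analogy concrete, here is a minimal, hypothetical sketch of how a call-reputation score might be computed from observed calling behavior. The `CallHistory` fields, the weights, and the scoring heuristic are all illustrative assumptions; Hiya's actual system is proprietary and far more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class CallHistory:
    number: str
    calls_placed: int       # outbound calls observed for this number
    calls_answered: int     # how many recipients picked up
    avg_call_seconds: float # average duration of answered calls
    spam_reports: int       # user-submitted spam complaints

def reputation_score(h: CallHistory) -> float:
    """Return a 0-100 score for a phone number; higher means more trustworthy."""
    if h.calls_placed == 0:
        return 50.0  # no history: neutral, like a "thin file" in credit scoring
    answer_rate = h.calls_answered / h.calls_placed
    report_rate = h.spam_reports / h.calls_placed
    score = 50.0
    score += 30.0 * answer_rate                            # people answer numbers they trust
    score += 10.0 * min(h.avg_call_seconds / 120.0, 1.0)   # longer conversations look legitimate
    score -= 200.0 * report_rate                           # spam reports weigh heavily
    return max(0.0, min(100.0, score))

# A number with normal calling behavior vs. a high-volume robocaller
legit = CallHistory("+15550100", calls_placed=40, calls_answered=32,
                    avg_call_seconds=180.0, spam_reports=0)
robocaller = CallHistory("+15550199", calls_placed=5000, calls_answered=250,
                         avg_call_seconds=8.0, spam_reports=400)
print(round(reputation_score(legit)), round(reputation_score(robocaller)))
```

The design mirrors the article's analogy: like a credit score, the number starts at a neutral baseline, builds trust through a track record of normal behavior, and is penalized sharply by derogatory marks (spam reports).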
Although the U.S. federal government spearheaded the investigation into the IRS scam (see the 2023 podcast Chameleon: Scam Likely for a deep dive into the logistics of that investigation), experts say its response to the rise of AI in robocalls has been disorganized.
Kristofor Healey is a former Department of Homeland Security special agent who now works in the private sector as CEO of Black Bear Security Consultants. He spent his time in the federal government investigating large-scale money laundering organizations and led the team that dismantled the IRS scam, the largest wire fraud case in the history of the United States.
Healey argues that government and law enforcement are inherently reactive systems, but that AI as a tool for businesses such as call centers (“whether they’re good call centers or bad call centers,” he said) will multiply the number of cases that must be reacted to.
Ultimately, technology can only be so proactive, because cybercriminals are always taking things to the next level. Experts say business and consumer education is the only truly proactive approach available, and it requires spreading awareness of how people can protect themselves and those around them.
For businesses, this may mean incorporating training on AI-faked audio calls into required cybersecurity training for employees. For individuals, it may mean being more discerning about what you post online. “Sometimes risky behavior will be more likely to impact someone around you than it will impact you directly,” Grobman said. Criminals could use what we post on social media in an AI-generated voice-cloning call to build trust with other victims.
Meanwhile, identity protection and personal data cleanup services will continue to be useful to consumers. Company policies on how employees should handle calls from unknown numbers, and on what they share online — even on their personal profiles — could become increasingly common.
Grobman recommends that families choose a safe word or verification phrase they can use to confirm it really is a loved one on the other end of the line. It’s like a spoken password; just as with typed passwords, avoid the names of your pets or children, or any other readily available information.
What if someone calls claiming to be from a business? Hang up, look up the company’s contact information (don’t just call back the number that called you), and call them yourself to verify. “It is extremely important to independently validate through a trusted channel,” Grobman said.
For his part, Healey acts as a sort of vigilante when it comes to phone fraud, always picking up the phone when a spam number pops up on the screen. He doesn’t give them any confirming information, nor does he tell them who he is or anything else about himself. He simply keeps them on the line as long as possible, which costs them money as their voice-over-IP technology runs.
“Keeping them on the phone is an effective way to keep them from harming anyone else,” Healey said.
The widespread IRS scam that Healey investigated, and that the podcast Chameleon: Scam Likely covered, had tangible consequences for victims: shame, loss of financial security, loss of relationships, even loss of life. To the trained ear, spam calls may sound silly, but people such as the elderly, or those in a vulnerable state of mind, have fallen for the charade, and continue to fall for it.
With AI technology imitating the voices of our acquaintances, friends and loved ones, the game becomes more deeply anchored in the psyche. And it is a game. At some point, Chameleon notes, it’s no longer about money, but about success, adrenaline and power. But as education about this ever-evolving threat gains ground, technology is helping to fight back.