
AI is also good for blocking scams, thanks to Gmail


All over the world, scammers are using AI to craft scams targeting last-minute Christmas shoppers. While that's unfortunate, Google has some good news for us. According to the company, the AI models it uses in Gmail have helped block a large number of these scams.

Christmas is just around the corner, which means last-minute shoppers are flocking to e-shops across the internet. For scammers, that means Christmas is coming early. They can easily take advantage of desperate shoppers trying to get gifts in before the big day. Scammers have long used AI to target the masses, but it gets so much worse around the holidays.

Thankfully, Google uses AI models to stop scams through Gmail

Many scams occur through email. Scammers disguise themselves as real retailers like Amazon or Best Buy and send fake promotions to unsuspecting people. These emails contain links to malicious sites that steal users' sensitive data. That's just one of the many types of scams out there.

Google explained three other types of scams popping up this holiday season. Invoice scams are when scammers send fake invoices to victims and pressure them to pay.

Next, celebrity scams are when scammers contact victims pretending to be a celebrity, or claiming to offer a celebrity endorsement. Under that pretense, victims will do pretty much anything the "celebrity" asks.

The third type is the scariest. The scammer sends the victim personal information, like their home address or a picture of their house, and then threatens physical violence if the victim doesn't pay up.

Google for the win

Google said that the AI models it uses in Gmail have reduced email scams by 35% compared to this time last year. The company was able to block millions of emails from ever reaching people's inboxes, which most likely saved victims millions of dollars.

One LLM trained on phishing scams blocked 20% more scams than before. It's also able to scan 1,000 times more user-reported scams every day.
