|
|
Not OK, Cupid: when your dating profile trains AI
|
|
|
When you sign up for a dating app, you expect some risks — awkward matches, maybe even scammers. What you don’t expect is your photos and personal data being quietly handed off to an AI company you’ve never heard of.
That’s exactly what happened with OkCupid. According to regulators, millions of user photos, along with other sensitive data, were shared without consent and used to train facial recognition tools. Users had no idea any of this was happening, and when regulators finally stepped in, the company wasn’t even fined. Nor is this an isolated case: across apps and platforms, user data is increasingly reused in ways people never explicitly agreed to, from ad targeting to AI training.
|
|
|
|
What else is going on
|
Why does the AdGuard UI read like this?
|
|
Privacy tools often rely on complex technology, but for users they should remain simple and intuitive. In this post, AdGuard UX writer Sofia Orlova explains how the AdGuard UX team balances clarity, control, and usability: from clean interfaces and layered settings to interface copy that works for beginners and advanced users alike.
|
|
|
|
|
|
|
Your data is not yours anymore
|
Your data lives in AI forever
|
|
From personal photos to medical images, vast amounts of data scraped from across the web are being fed into AI models, often without people’s knowledge. Once it’s in, there’s little control over how it’s used, and even less chance of getting it back.
|
|
|
|
|
|
Your vacuum may know more than it should
|
|
Robot vacuums are designed to map your home and avoid obstacles, but in doing so they can also capture and share far more than you’d expect, including your sensitive data.
|
|
|
|
|
|
Alexa, don't steal my data
|
|
Voice commands that used to stay private can now be sent straight to the cloud and used to train AI. With recent changes, some devices no longer offer a way to opt out.
|
|
|
|
|
|
|
What caught our eye
|
|
|
AdGuard news
|
|
|
|