NoGoolag
Live free!

πŸ“‘ @NoGoolag

FAQ:
http://t.me/NoGoolag/169

β˜…Group:
https://t.me/joinchat/nMOOE4YJPDFhZjZk

πŸ“‘ @Libreware

πŸ“‘ @TakeBackOurTech

🦊 @d3_works

πŸ“š @SaveAlexandria

πŸ’― % satire OSINT
CNAME Cloaking, the dangerous disguise of third-party trackers

How come AdBlock, Adblock Plus, uBlock Origin, Ghostery, Brave and Firefox are letting a third-party tracker from Eulerian, a leading tracking company, execute its script freely on fortuneo.fr, one of the biggest online banks in France?

How come the same thing is happening on thousands of other popular websites worldwide?
What has been happening over the last few months in the world of third-party tracking is having a major impact on people’s privacy, and it has all stayed pretty much under the radar.
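
The trick behind CNAME cloaking: the site delegates one of its own subdomains to the tracker via a DNS CNAME record, so the tracker's script loads from what looks like a first-party hostname and slips past blockers that key on third-party domains. One way to spot it is to resolve a suspicious subdomain and see where its CNAME actually points. Below is a minimal sketch of that check, assuming the dnspython package; the subdomain and helper names are illustrative, not taken from the article:

```python
# Minimal CNAME-cloaking check (assumes dnspython 2.x is installed;
# the example subdomain is hypothetical).
import dns.resolver

def cname_target(hostname):
    """Return the CNAME target of hostname, or None if it has no CNAME."""
    try:
        answer = dns.resolver.resolve(hostname, "CNAME")
        return str(answer[0].target).rstrip(".")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return None

def looks_cloaked(hostname, site_domain):
    """Flag a subdomain whose CNAME points outside the site's own domain."""
    target = cname_target(hostname)
    return target is not None and not target.endswith("." + site_domain)

# A "first-party" subdomain that actually aliases to a third-party host
# would return True here.
print(looks_cloaked("metrics.example.com", "example.com"))
```

This is also why DNS-level blocking catches what in-browser blockers miss: the resolver sees the full CNAME chain, while the browser extension only sees the first-party hostname.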

πŸ‘‰πŸΌ Read more πŸ‡¬πŸ‡§:
https://medium.com/nextdns/cname-cloaking-the-dangerous-disguise-of-third-party-trackers-195205dc522a

πŸ‘‰πŸΌ Read more πŸ‡©πŸ‡ͺ:
https://www.kuketz-blog.de/vorsicht-neue-art-des-trackings-via-cname-cloaking/

#CNAME #Cloaking #tracker #dns #AdBlock #AdblockPlus #uBlock #Ghostery #Brave #Firefox #Eulerian
πŸ“‘@cRyPtHoN_INFOSEC_DE
πŸ“‘@cRyPtHoN_INFOSEC_EN
πŸ“‘@cRyPtHoN_INFOSEC_ES
Image "Cloaking" for Personal Privacy

2020 is a watershed year for machine learning. It has seen the true arrival of commoditized machine learning, where deep learning models and algorithms are readily available to Internet users. GPUs are cheaper and more readily available than ever, and new training methods like transfer learning have made it possible to train powerful deep learning models using smaller sets of data.

But accessible machine learning also has its downsides. A recent New York Times article by Kashmir Hill profiled clearview.ai, an unregulated facial recognition service that has now downloaded over 3 billion photos of people from the Internet and social media, using them to build facial recognition models for millions of citizens without their knowledge or permission. Clearview.ai demonstrates just how easy it is to build invasive tools for monitoring and tracking using deep learning.

So how do we protect ourselves against unauthorized third parties building facial recognition models to recognize us wherever we may go? Regulations can and will help restrict usage of machine learning by public companies, but will have negligible impact on private organizations, individuals, or even other nation states with similar goals.

The SAND Lab at the University of Chicago has developed Fawkes, an algorithm and software tool (running locally on your computer) that gives individuals the ability to limit how their own images can be used to track them. At a high level, Fawkes takes your personal images and makes tiny, pixel-level changes to them that are invisible to the human eye, in a process we call image cloaking. You can then use these "cloaked" photos as you normally would: sharing them on social media, sending them to friends, printing them or displaying them on digital devices, the same way you would any other photo. The difference, however, is that if and when someone tries to use these photos to build a facial recognition model, the "cloaked" images will teach the model a highly distorted version of what makes you look like you. The cloak effect is not easily detectable and will not cause errors in model training. However, when someone tries to identify you using an unaltered image (e.g. a photo taken in public), they will fail.
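
Fawkes' actual optimization is more involved, but the core idea, nudging an image's feature-space representation toward a different identity while keeping the pixel change within a tiny budget, can be sketched roughly as follows. This is an illustrative sketch assuming PyTorch; `extractor` stands for any pre-trained face feature extractor, and the function name, epsilon and step values are assumptions, not the Fawkes release:

```python
# Illustrative sketch of feature-space cloaking (not the Fawkes code).
# Assumptions: PyTorch; `extractor` is a pre-trained face feature
# extractor; images are float tensors in [0, 1].
import torch
import torch.nn.functional as F

def cloak(image, target_image, extractor, epsilon=0.03, steps=100, lr=0.01):
    """Perturb `image` within an L-infinity budget so its features move
    toward those of `target_image`, a different identity."""
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        target_feat = extractor(target_image)  # features to imitate
    for _ in range(steps):
        opt.zero_grad()
        feat = extractor((image + delta).clamp(0, 1))
        loss = F.mse_loss(feat, target_feat)   # pull features toward target
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)    # keep the change invisible
    return (image + delta).detach().clamp(0, 1)
```

A model trained on photos cloaked this way associates your name with the wrong region of feature space, which is why an unaltered photo of you then fails to match.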

πŸ‘€ πŸ‘‰πŸΌ http://sandlab.cs.uchicago.edu/fawkes/

#Fawkes #image #cloaking #facialrecognition #privacy
πŸ“‘@cRyPtHoN_INFOSEC_DE
πŸ“‘@cRyPtHoN_INFOSEC_EN
πŸ“‘@BlackBox_Archiv
πŸ“‘@NoGoolag