NoGoolag
Is Big Brother Watching You? - BBC Click
Police deployment of facial recognition is happening across the globe, but the technology is fraught with issues, including threats to civil liberties.
https://www.youtube.com/watch?v=KqFyBpcbH9A

Big Brother Britain: Police FINE pedestrian £90 for disorderly behaviour after he tries to cover his face from facial recognition camera on the streets of London
https://www.dailymail.co.uk/news/article-7036141/Police-fine-pedestrian-90-facial-recognition-camera-row.html

Comments:
https://news.ycombinator.com/item?id=19928231


#uk #biometrics #facialrecognition #cctv #why
France Set to Roll Out Nationwide Facial Recognition ID Program

Digital identity enrollment app to be rolled out in November
Privacy, absence of consent and security among concerns raised

France is poised to become the first European country to use facial recognition technology to give citizens a secure digital identity -- whether they want it or not.

👉🏼 Read more:
https://www.bloomberg.com/news/articles/2019-10-03/french-liberte-tested-by-nationwide-facial-recognition-id-plan

#france #id #FacialRecognition #nationwide #thinkabout #why #video
📡@cRyPtHoN_INFOSEC_DE
📡@cRyPtHoN_INFOSEC_EN
📡@cRyPtHoN_INFOSEC_ES
This is how you kick facial recognition out of your town

Bans on the technology have mostly focused on law enforcement, but there’s a growing movement to get it out of school, parks, and private businesses too.

In San Francisco, a cop can’t use facial recognition technology on a person arrested. But a landlord can use it on a tenant, and a school district can use it on students.

This is where we find ourselves, smack in the middle of an era when cameras on the corner can automatically recognize passersby, whether they like it or not. The question of who should be able to use this technology, and who shouldn’t, remains largely unanswered in the US. So far, American backlash against facial recognition has been directed mainly at law enforcement. San Francisco and Oakland, as well as Somerville, Massachusetts, have all banned police from using the technology in the past year because the algorithms aren’t accurate for people of color and women. Presidential candidate Bernie Sanders has even called for a moratorium on police use.

Private companies and property owners have had no such restrictions, and facial recognition is increasingly cropping up in apartment buildings, hotels, and more. Privacy advocates worry that constant surveillance will lead to discrimination and have a chilling effect on free speech—and the American public isn’t very comfortable with it either. According to a recent survey by Pew Research, people in the US actually feel better about cops using facial recognition than they do about private businesses.

👉🏼 Read more:
https://www.technologyreview.com/s/614477/facial-recognition-law-enforcement-surveillance-private-industry-regulation-ban-backlash/

#surveillance #facialrecognition #lawenforcement #regulation #thinkabout
📡@cRyPtHoN_INFOSEC_DE
📡@cRyPtHoN_INFOSEC_EN
📡@cRyPtHoN_INFOSEC_ES
Enhancing digital privacy by hiding images from AI

Researchers develop a new technique that will keep your online photos safe from facial recognition algorithms. The research, which has been ongoing for more than six months, is targeted at countering the facial-recognition algorithms of big tech firms such as Facebook and Google. Professor Kankanhalli and her team from NUS Computer Science have developed a technique that safeguards sensitive information in photos by making subtle visual distortions that are almost imperceptible to humans but render selected features undetectable by known algorithms.

https://news.nus.edu.sg/research/enhancing-digital-privacy-hiding-images-ai
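
The article does not publish the team's algorithm, but the general idea behind this family of techniques can be sketched as follows, purely for illustration (this is not the NUS method; the resnet18 stand-in, the epsilon budget and the step counts are all assumptions): a small, bounded perturbation is found by gradient steps so that a feature extractor's output for the image drifts away from the original, while each pixel changes by at most a few intensity levels.

```python
# Illustrative sketch only -- not the NUS method. Finds a tiny perturbation
# that pushes a feature extractor's output away from the clean image's
# features while staying visually imperceptible (at most epsilon per pixel).
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

# Stand-in feature extractor with random weights; a real tool would target
# the actual face recognition models it aims to defeat.
extractor = models.resnet18(weights=None).eval()

def perturb(path, epsilon=4 / 255, steps=20, step_size=1 / 255):
    img = TF.to_tensor(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        original_feat = extractor(img)              # features of the clean image
    delta = torch.zeros_like(img, requires_grad=True)
    for _ in range(steps):
        # Minimizing the negative distance maximizes the feature distance.
        loss = -torch.nn.functional.mse_loss(extractor(img + delta), original_feat)
        loss.backward()
        with torch.no_grad():
            delta -= step_size * delta.grad.sign()  # step away from the original features
            delta.clamp_(-epsilon, epsilon)         # keep the change imperceptible
            delta.grad.zero_()
    return (img + delta).clamp(0, 1).squeeze(0)     # perturbed image tensor in [0, 1]
```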

#AI #facialRecognition
PimEyes: A Polish company is abolishing our anonymity

Research by netzpolitik.org shows the potential for abuse of PimEyes, a free search engine for 900 million faces. Anyone with photos on the internet could already be part of its database.

Dylan smiles into the camera, arm in arm with the other guests of a queer boat party. Behind them, glasses glisten on the shelves of a bar. Eight years ago a party photographer uploaded this snapshot to the internet. Dylan had already forgotten it - until today. Because with a reverse search engine for faces, anyone can find this old party photo of Dylan. All they have to do is upload his profile picture from the Xing career network, free of charge and without registration. But Dylan wants to keep his private and professional lives separate: during the day he works as a banker in Frankfurt am Main.

The name of the search engine is PimEyes. It analyses masses of faces on the internet for individual characteristics and stores the biometric data. When Dylan tests the search engine with his profile picture, it compares the picture with the database and returns similar faces, along with a preview image and the domain where each picture was found. Dylan was recognized even though, unlike today, he did not even have a beard back then.

Our research shows: PimEyes is a wholesale attack on anonymity and possibly illegal. A snapshot may be enough to identify a stranger using PimEyes. The search engine does not directly provide the name of the person you are looking for, but if it finds matching faces, in many cases the displayed websites can be used to find out their name, profession and much more.
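
The report does not detail PimEyes' internals, but the basic mechanics of any reverse face search can be sketched with the open-source face_recognition library: every indexed photo is reduced to a numeric face embedding, and a query face is matched against that index by distance. This is a conceptual sketch, not PimEyes' implementation; the 0.6 threshold is simply that library's default matching tolerance.

```python
# Conceptual sketch of a reverse face search -- not PimEyes' code.
import face_recognition
import numpy as np

def build_index(photo_paths):
    """Map each photo to the embeddings of the faces found in it."""
    index = []
    for path in photo_paths:
        image = face_recognition.load_image_file(path)
        for encoding in face_recognition.face_encodings(image):
            index.append((path, encoding))
    return index

def search(index, query_path, threshold=0.6):
    """Return indexed photos whose faces are close to the query face."""
    if not index:
        return []
    query_image = face_recognition.load_image_file(query_path)
    query_encodings = face_recognition.face_encodings(query_image)
    if not query_encodings:
        return []
    distances = face_recognition.face_distance(
        np.array([enc for _, enc in index]), query_encodings[0]
    )
    return [(index[i][0], d) for i, d in enumerate(distances) if d < threshold]
```

At web scale the linear scan above would be replaced by an approximate nearest-neighbour index, but the principle stays the same: one uploaded snapshot is enough to query every face ever scraped.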

👀 👉🏼 🇬🇧 PimEyes: A Polish company is abolishing our anonymity
https://netzpolitik.org/2020/pimeyes-face-search-company-is-abolishing-our-anonymity/

👀 👉🏼 🇩🇪: https://netzpolitik.org/2020/gesichter-suchmaschine-pimeyes-schafft-anonymitaet-ab/

👀 👉🏼 🇬🇧 https://www.bbc.com/news/technology-53007510

👀 👉🏼 🇬🇧 https://petapixel.com/2020/06/11/this-creepy-face-search-engine-scours-the-web-for-photos-of-anyone/

👀 👉🏼 🇩🇪 Automated facial recognition: Finally enforce our data protection rights!
https://netzpolitik.org/2020/automatisierte-gesichtserkennung-setzt-unsere-datenschutzrechte-endlich-auch-durch/

#PimEyes #facialrecognition #searchengine #privacy #anonymity #ourdata #thinkabout
📡@cRyPtHoN_INFOSEC_DE
📡@cRyPtHoN_INFOSEC_EN
📡@BlackBox_Archiv
📡@NoGoolag
Image "Cloaking" for Personal Privacy

2020 is a watershed year for machine learning. It has seen the true arrival of commoditized machine learning, where deep learning models and algorithms are readily available to Internet users. GPUs are cheaper and more readily available than ever, and new training methods like transfer learning have made it possible to train powerful deep learning models using smaller sets of data.

But accessible machine learning has its downsides as well. A recent New York Times article by Kashmir Hill profiled clearview.ai, an unregulated facial recognition service that has now downloaded over 3 billion photos of people from the Internet and social media, using them to build facial recognition models for millions of citizens without their knowledge or permission. Clearview.ai demonstrates just how easy it is to build invasive tools for monitoring and tracking using deep learning.

So how do we protect ourselves against unauthorized third parties building facial recognition models to recognize us wherever we may go? Regulations can and will help restrict usage of machine learning by public companies, but will have negligible impact on private organizations, individuals, or even other nation states with similar goals.

The SAND Lab at the University of Chicago has developed Fawkes, an algorithm and software tool (running locally on your computer) that gives individuals the ability to limit how their own images can be used to track them. At a high level, Fawkes takes your personal images and makes tiny, pixel-level changes to them that are invisible to the human eye, in a process we call image cloaking. You can then use these "cloaked" photos as you normally would, sharing them on social media, sending them to friends, printing them or displaying them on digital devices, the same way you would any other photo. The difference, however, is that if and when someone tries to use these photos to build a facial recognition model, the "cloaked" images will teach the model a highly distorted version of what makes you look like you. The cloak effect is not easily detectable and will not cause errors in model training. However, when someone tries to identify you using an unaltered image of you (e.g. a photo taken in public), they will fail.
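
A toy numeric example of the effect described above (an illustration only, not the Fawkes algorithm; the 2-D "embeddings", cluster positions and threshold are all invented): because the tracker only ever trains on cloaked photos, it learns your identity at the wrong place in feature space, and an unaltered photo of you no longer matches anything it knows.

```python
# Toy illustration of cloaking's effect on a scraped training set.
# The 2-D "embeddings", cluster positions and threshold are made up.
import numpy as np

rng = np.random.default_rng(42)

# Your real photos embed near (0, 0); another person's embed near (5, 5).
your_real = rng.normal([0.0, 0.0], 0.3, size=(50, 2))
other_person = rng.normal([5.0, 5.0], 0.3, size=(50, 2))

# Cloaking shifts the *embeddings* of the photos you share toward the other
# cluster, even though the photos still look like you to a human.
your_cloaked = your_real + np.array([5.0, 5.0])

# A tracker scrapes your shared (cloaked) photos and learns one centroid per identity.
learned = {"you": your_cloaked.mean(axis=0),        # ends up near (5, 5)
           "other": other_person.mean(axis=0)}

# A new, unaltered photo of you taken in public embeds near your real cluster.
query = np.array([0.1, -0.2])

name, dist = min(((n, np.linalg.norm(query - c)) for n, c in learned.items()),
                 key=lambda pair: pair[1])
print(name if dist < 2.0 else "no confident match")  # prints: no confident match
```

Fawkes produces this mismatch by perturbing the pixels of the photos you share, so the model trained on them learns a distorted version of your features rather than your real ones.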

👀 👉🏼 http://sandlab.cs.uchicago.edu/fawkes/

#Fawkes #image #cloaking #facialrecognition #privacy
📡@cRyPtHoN_INFOSEC_DE
📡@cRyPtHoN_INFOSEC_EN
📡@BlackBox_Archiv
📡@NoGoolag
Forwarded from GJ `°÷°` 🇵🇸🕊 (t ``~__/>_GJ06)
Facial Recognition Failures Are Locking People Out of Unemployment Systems
https://www.vice.com/en/article/5dbywn/facial-recognition-failures-are-locking-people-out-of-unemployment-systems

People around the country are furious after being denied their #unemployment benefits due to apparent problems with facial recognition technology that claims to prevent fraud.

Unemployment recipients have been complaining for months about the #identity verification service ID.me, which uses a combination of biometric information and official documents to confirm that applicants are who they claim to be. The complaints reached another crescendo this week after Axios published a “deep dive” article about the threat of unemployment fraud based on statistics provided to the outlet by ID.me.

#facialrecognition #id #IDme
Police accused over use of facial recognition at King Charles’s coronation | King Charles coronation | The Guardian – https://www.theguardian.com/uk-news/2023/may/03/metropolitan-police-live-facial-recognition-in-crowds-at-king-charles-coronation

The largest previous live facial recognition (LFR) deployment was the 2017 Notting Hill carnival, when 100,000 faces were scanned.
Fussey said: “A surveillance deployment for the coronation would likely be the biggest live facial recognition operation ever conducted by the MPS, and probably the largest ever seen in Europe.”

#UK #FacialRecognition
Ian Dunt (@IanDunt): "The purpose of the Public Order Act is to make the trigger for criminal penalties so broad, and the meaning of key terms so nebulous, that it will be hard for a protester to ever really know they are abiding by the law. https://inews.co.uk/opinion/most-draconian-assault-free-speech-living-memory-now-law-2313273" | nitter – https://nitter.net/IanDunt/status/1654034279641350150#m

#UK #FacialRecognition
#Clearview fined again in France for failing to comply with privacy orders | TechCrunch – https://techcrunch.com/2023/05/10/clearview-ai-another-cnil-gspr-fine/

Clearview AI, the U.S. startup that’s attracted notoriety in recent years for a massive privacy violation after it scraped selfies off the internet and used people’s data to build a facial recognition tool it pitched to law enforcement and others, has been hit with another fine in France over non-cooperation with the data protection regulator.
#facialrecognition #ai
The Israeli authorities are using a facial recognition system known as Red Wolf to track Palestinians and automate harsh restrictions on their freedom of movement. Red Wolf is part of an ever-growing surveillance network that entrenches the Israeli government’s control over Palestinians and helps to maintain Israel’s system of apartheid. It is deployed at military checkpoints in the city of Hebron in the occupied West Bank, where it scans Palestinians’ faces and adds them to vast surveillance databases without their consent.

#facialrecognition technology supports a dense network of CCTV cameras to keep Palestinians under near-constant observation. Automated #Apartheid shows how this #surveillance is part of a deliberate attempt by Israeli authorities to create a hostile and coercive environment for Palestinians.

https://www.amnesty.org/en/latest/news/2023/05/israel-opt-israeli-authorities-are-using-facial-recognition-technology-to-entrench-apartheid/

#Palestine #Israel #RedWolf
Documenting the rise of facial recognition in the UK
https://invidious.snopyta.org/watch?v=bX-Yxy1ESAQ&local=true

Facial recognition surveillance turns us into walking ID cards, and treats members of the public like suspects in a high-tech police line up.

Our new detailed report, Biometric Britain: The Expansion of Facial Recognition Surveillance, lays out how police, retailers, tech companies and even some schools are investing huge sums of money into this intrusive technology.

#UK #BigBrother #FacialRecognition #Surveillance #biometric #BigBrotherWatchUK
Israel Is Using a Vast Network of Biometric Cameras to Terrorize Palestinians - Truthout – June 2023

The facial recognition surveillance system violates Palestinians’ human rights to freedom of movement and privacy.

Israel is deepening its system of apartheid in the occupied Palestinian territories by using artificial intelligence-powered biometric facial recognition technology to track and restrict the movements of Palestinian people.

Facial recognition technology identifies and categorizes people on the basis of their physical features, including race, ethnicity, gender, age and disability status.

#Israel #Palestine #Apartheid #Technology #FacialRecognition #surveillance #AI #Biometrics
Sony, Honda, Ford, Genesis, and Mullen Automotive nod toward facial recognition tech | Biometric Update –

A slew of the world’s largest automakers, including Sony, Honda, Ford, Genesis, and Mullen Automotive, have all either recently announced or patented facial recognition technologies.

A newly unveiled prototype car from Sony and Honda, called “Afeela,” is set to employ facial recognition to unlock the vehicle and open its door.

The semiconductors and chipsets set to underpin this biometric tech will be provided by electronics giant Qualcomm.

The firms will start taking orders in 2025, with U.S. deliveries set to start in 2026.

#Automobile #FacialRecognition
In Mannheim, an automated system reports hugs to the police - AlgorithmWatch –

Mannheim, a large city on the Rhine, deployed a video system that claims to automatically detect physical violence in some streets. It can confuse hugging with strangling, and it is unclear whether it can actually prevent violence.

At the Alter Messplatz in Mannheim, four men sit on benches by the water while kids race scooters around the square. Earlier, two women thought the building behind the men was a public toilet, but it is a performance space, part of a new community center that houses a bar, basketball court and sports equipment rental.


But there's something else going on in the square: it is also a testing ground for what Baden-Württemberg’s interior ministry describes as Europe's first ever intelligent video surveillance system, which is now about to be tested in Hamburg. 

#VSA #Surveillance #FacialRecognition #Germany
#Mannheim #Hamburg #EU
Israel's automated occupation Part 1/2 - 2023
How Israel automated occupation in #Hebron

Palestinians in Hebron are some of the most heavily monitored and controlled people on the planet.

In the first episode of a two-part special, Tariq Nafi reports from the occupied West Bank on the previously unknown facial recognition system ‘Red Wolf’, uncovered by Amnesty International and Breaking the Silence.

Contributors:
Izzat Karaki — activist; volunteer, Youth Against Settlements
Sophia Goodfriend — journalist; researcher, Duke University
Matt Mahmoudi — researcher and adviser, Amnesty International
Former Major General B — ex-Israeli military officer who spoke to Al Jazeera on condition of anonymity.

#15minuteCity #surveillance #RedWolf #FacialRecognition #Apartheid #Palestine #unit8200
Inside Israel’s surveillance machine part 2/2 - 2023

Palestinian existence in Jerusalem is under threat - carefully watched, recorded and restricted. In the Old City’s narrow streets and alleys, cameras are inescapable.

In the second episode of a two-part special, Tariq Nafi reports from the occupied West Bank, on how Israel’s surveillance machine infiltrates the lives of Palestinians. He also speaks with a former Israeli lieutenant in Unit 8200 - the elite intelligence unit responsible for spying on Palestinians.

Contributors:
Rula Jamal - Head of Monitoring & Documentation, Al Haq
Jalal Abukhater - Writer
Amal Sumarin - Silwan resident
Helga Tawil-Souri - Associate Professor, NYU
Israeli former Lieutenant Eli - Unit 8200

#15minuteCity #surveillance #Unit8200 #Palestine #Apartheid #Surveillance #RedWolf #FacialRecognition
In November 2021 the Washington Post published a stunning exposé on the use of 'Blue Wolf', a new mass surveillance system being operated by soldiers in the West Bank to photograph and collect sensitive personal information on local Palestinians. The story, which was brought to light thanks to several testimonies given by IDF soldiers to Breaking the Silence, represents a massive escalation in Israel's pursuit of control over the Palestinian civilian population in the West Bank, and raises some serious questions on the role of technology within the context of the occupation.
Breaking the Silence

#BlueWolf #RedWolf #Apartheid #Palestine #FacialRecognition #Surveillance #BreakingTheSilence

#Documentaire
About Face (Recognition) | EFF

Is your face truly your own, or is it a commodity to be sold, a weapon to be used against you? A company called #Clearview AI has scraped the internet to gather (without consent) 30 billion images to support a tool that lets users identify people by picture alone. Though it’s primarily used by law enforcement, should we have to worry that the eavesdropper at the next restaurant table, or the creep who’s bothering you in the bar, or the protestor outside the abortion clinic can surreptitiously snap a pic of you, upload it, and use it to identify you, where you live and work, your social media accounts, and more? 


Kashmir Hill has been writing about the intersection of #privacy and #technology for well over a decade; her book about Clearview AI’s rise and practices was published last fall. She speaks with the #EFF about how face recognition technology’s rapid evolution may have outpaced ethics and regulations, and where we might go from here. 

#FacialRecognition