NoGoolag
Live free!

📡 @NoGoolag

FAQ:
http://t.me/NoGoolag/169

★Group:
https://t.me/joinchat/nMOOE4YJPDFhZjZk

📡 @Libreware

📚 @SaveAlexandria

📡 @BallMemes

FORWARDS ARE NOT ENDORSEMENTS

💯 % satire OSINT
How Google Interferes With Its Search Algorithms and Changes Your Results

The internet giant uses blacklists, algorithm tweaks and an army of contractors to shape what you see

👀 More than 100 interviews and the Journal’s own testing of Google’s search results reveal:

‼️ Google made algorithmic changes to its search results that favor big businesses over smaller ones, and in at least one case made changes on behalf of a major advertiser, eBay Inc., contrary to its public position that it never takes that type of action. The company also boosts some major websites, such as Amazon.com Inc. and Facebook Inc., according to people familiar with the matter.

‼️ Google engineers regularly make behind-the-scenes adjustments to other information the company is increasingly layering on top of its basic search results. These features include auto-complete suggestions, boxes called “knowledge panels” and “featured snippets,” and news results, which aren’t subject to the same company policies limiting what engineers can remove or change.

‼️ Despite publicly denying doing so, Google keeps blacklists to remove certain sites or prevent others from surfacing in certain types of results. These moves are separate from those that block sites as required by U.S. or foreign law, such as those featuring child abuse or copyright infringement, and from changes designed to demote spam sites, which attempt to game the system to appear higher in results.

‼️ In auto-complete, the feature that predicts search terms as the user types a query, Google’s engineers have created algorithms and blacklists to weed out more-incendiary suggestions for controversial subjects, such as abortion or immigration, in effect filtering out inflammatory results on high-profile topics (a hypothetical sketch of this kind of filtering follows this list).

‼️ Google employees and executives, including co-founders Larry Page and Sergey Brin, have disagreed over whether and how far to intervene in search results. Employees can push for revisions in specific search results, including on topics such as vaccinations and autism.

‼️ To evaluate its search results, Google employs thousands of low-paid contractors whose stated purpose is to assess the quality of the algorithms’ rankings. Even so, contractors interviewed by the Journal said Google gave them feedback conveying what it considered to be the correct ranking of results, and that they revised their assessments accordingly. The contractors’ collective evaluations are then used to adjust algorithms.
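👉🏼 To make the auto-complete point above concrete, here is a minimal, purely hypothetical sketch (in Python) of what blacklist-based suggestion filtering can look like. It is not Google’s code; the topic list, blacklist entries and function name are all invented for illustration.

```python
# Purely illustrative sketch -- NOT Google's actual code or data. It only shows
# the general shape of the mechanism described above: suggestions for queries
# on sensitive topics are checked against a blacklist before being shown.
# The topic list, blacklist entries and function name are invented.

SENSITIVE_TOPICS = {"abortion", "immigration"}        # hypothetical
BLACKLISTED_PHRASES = {"is murder", "are criminals"}  # hypothetical


def filter_suggestions(query: str, suggestions: list[str]) -> list[str]:
    """Drop blacklisted suggestions when the query touches a sensitive topic."""
    q = query.lower()
    if not any(topic in q for topic in SENSITIVE_TOPICS):
        return suggestions  # non-sensitive queries pass through unchanged
    return [
        s for s in suggestions
        if not any(phrase in s.lower() for phrase in BLACKLISTED_PHRASES)
    ]


if __name__ == "__main__":
    # Only the neutral suggestion survives for a sensitive query.
    print(filter_suggestions("abortion", ["abortion laws by state", "abortion is murder"]))
    # -> ['abortion laws by state']
```

The point of the toy example is only that such filtering happens before suggestions reach the user, and that what counts as “incendiary” is decided by whoever maintains the blacklist.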

👉🏼 Read more (paywall):
https://www.wsj.com/articles/how-google-interferes-with-its-search-algorithms-and-changes-your-results-11573823753

👉🏼 Read more (german/no paywall):
https://netzpolitik.org/2019/der-selbstgebaute-algorithmus/

#DeleteGoogle #manipulation #search #algorithms #why #thinkabout
📡@cRyPtHoN_INFOSEC_DE
📡@cRyPtHoN_INFOSEC_EN
📡@cRyPtHoN_INFOSEC_ES
The Cameras in Your Car May Be Harvesting Data as You Drive
Safety system sensors in modern cars are collecting data about the road on behalf of the company that makes them

If you drive a newer car, it’s likely to have at least one built-in camera or sensor that powers important safety systems such as automatic emergency braking (AEB) and blind spot warning (BSW), or that makes driving easier with assistance features such as adaptive cruise control and lane centering. Most of the software and algorithms that control those systems were developed by Mobileye.

https://www.consumerreports.org/automotive-technology/the-cameras-in-your-car-may-be-harvesting-data-as-you-drive/

#data #harvesting #cars #cameras #algorithms #surveillance #thinkabout #Mobileye
📡@cRyPtHoN_INFOSEC_DE
📡@cRyPtHoN_INFOSEC_EN
📡@BlackBox_Archiv
Police Across Canada Are Using Predictive Policing Algorithms, Report Finds

Police across Canada are increasingly adopting algorithmic technology to predict crime. The authors of a new report say human rights are threatened by the practice.

Police across Canada are increasingly using controversial algorithms to predict where crimes could occur, who might go missing, and to help them determine where they should patrol, despite fundamental human rights concerns, a new report has found.

To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada is the result of a joint investigation by the University of Toronto’s International Human Rights Program (IHRP) and Citizen Lab. It details how “law enforcement agencies across Canada have started to use, procure, develop, or test a variety of algorithmic policing methods,” which the authors warn could have dire consequences for civil liberties, privacy and other Charter rights.

https://www.vice.com/en_us/article/k7q55x/police-across-canada-are-using-predictive-policing-algorithms-report-finds

#Canada #police #predictive #algorithms #privacy #surveillance
Are #Algorithms Dividing Everyone into Separate Realities Online?
Truthstream Media
Is the world you are being presented online crafted to be vastly different from the one shown to the person sitting right next to you in physical reality? Do people even realize this is what algorithms are doing to us all? Many have rightfully pointed out how we appear more divided than ever, but the first step to taking that wizardry down a notch is acknowledging what is actually happening out there for what it is.