First, the technology itself can be racially biased.... Buolamwini and Gebru’s 2018 research concluded that some facial analysis algorithms misclassified Black women nearly 35 percent of the time, while nearly always getting it right for white men.... These error-prone, racially biased algorithms can have devastating impacts for people of color....
Second, police in many jurisdictions in the U.S. use mugshot databases to identify people with face recognition algorithms.... Across the U.S., Black people face arrest for a variety of crimes at far higher rates than white people. Take cannabis arrests, for just one example...
Third... the entire system is racist.... Surveillance of Black people in the U.S. has a pernicious and largely unaddressed history, beginning during the antebellum era.... [There is] spying that targets political speech, too often conflated with “terrorism,” and spying that targets people suspected of drug or gang involvement. In recent years, we learned of an FBI surveillance program targeting so-called “Black Identity Extremists,” which appears to be the bureau’s way of justifying domestic terrorism investigations of Black Lives Matter activists.... Racial disparities in the government’s war on drugs are well documented.
I was reading that because of this vote in my city last night: "Madison City Council bans city agencies from using facial recognition technology" (Wisconsin State Journal).
On a 17-2 vote, the council approved a new ordinance that prohibits city agencies, departments and divisions from using facial recognition technology or "information derived from a face surveillance system" with a handful of exceptions. Following a national reckoning this year on over-policing in communities of color, Madison and other governments have scrutinized and limited the use of face surveillance systems by law enforcement.
"The technology has proven to be unreliable and faulty," Ald. Rebecca Kemble, 18th District, said of facial recognition, describing the ban more as a moratorium. "We also don't want this technology to be used to further worsen the racial disparities that there already are in our criminal justice system."
67 comments:
It just IS... experts said so!
City council was probably right to not adopt the software as, based on this excerpt, it’s not ready yet.
But there’s another issue here, reflecting the general intellectual decline of our society and, especially, the ACLU. The technical hurdle that it’s more difficult to identify black faces than white ones (unsurprising if you think about it) is now evidence of racism.
Obviously the city council is run by Repubs, who else would be so anti-science.
I am an ex-software developer from the old-school days, and this is another topic that makes me yell at the screen.
Idjits. These people are idjits.
-XC
First, the technology itself can be racially biased
That makes as much sense as saying laser tattoo removal is "racially biased" because it's less effective on darker skin. One of the classifiers they tested was Face++. It misidentified dark-skinned black women as male about 35% of the time. However, it only misidentified dark-skinned men 0.7% of the time, light-skinned men 0.8% of the time, and light-skinned women 6% of the time. So it actually performed best at classifying dark-skinned men.
Black people face arrest for a variety of crimes at far higher rates than white people.
Because they commit more crime.
"We also don't want this technology to be used to further worsen the racial disparities that there already are in our criminal justice system."
Some other creative solutions to this problem: allow blacks to opt out of being arrested for outstanding warrants, make it optional for blacks to comply with law enforcement orders, allow blacks to drive cars without valid registration, tags, or drivers license, require retail establishments to open one hour before normal business hours for black customers to loot without having to risk glass or strain injuries.
If these solutions don't work in eliminating the scourge of black guys getting arrested for committing crime, we can always just stop arresting black people.
How else can white cops identify one black person from another?
Well shit, if Gen1 of some technology isn't perfect, we should just ban all of it before finding out how Gens 2 - n work out.
Lol.
Hey, it's racist to try to tell black people apart because they all look the same!
I have an idea, let's not allow the government to track, trace, and surveil us by video, or any other means.
Joke’s on y’all since Imma wear a mask, sunglasses and a FACE SHIELD forever?
p.s. anyone have a theory as to why those things say FACE SHIELD on them, as though you’d mistake them for something else if they weren’t helpfully labeled?
Question answered at the ACLU website.
No, it wasn't answered, it was bloviated about.
I enjoyed the opening statement from the ACLU's scribbler, wherein the author makes clear his biases and baseless fantasies:
"If police are authorized to deploy invasive face surveillance technologies against our communities, these technologies will unquestionably be used to target Black and Brown people merely for existing."
misidentified dark-skinned black women as male about 35% of the time.
Here's a crowing denouncement of Kanazawa's thought-crime paper, which drew a similar conclusion for some mysterious reason:
"Why Are Black Women Less Physically Attractive Than Other Women?"
Here's that censored paper
"Recall that women on average are more physically attractive than men."
Oh teh horror!
If we don't surveil them so much, black people will commit fewer crimes, and we'll have the numbers to prove it.
Call it what it is: surrender.
When the general public allowed the race hustlers to turn the issue of disproportionately high crime rates in the black community into one of disparate arrest and sentencing outcomes, society surrendered. We will just have to accept that property laws represent an infringement on black people's lifestyles. Can't have that.
Once we have the vaccine no need to wear masks when looting and burning on State Street. Cool.
Buolamwini and Gebru’s 2018 research concluded that some facial analysis algorithms misclassified Black women nearly 35 percent of the time, while nearly always getting it right for white men
They would say that, wouldn't they?
But it sounds like something that could be fixed.
If not, it could open up a lot of opportunities in crime for Black women.
White guys don't get a break nowadays ...
But an Alexa/Siri/Echo in your house listening to everything you say is perfectly fine.
Probably mandatory, eventually
John Henry
Let me be the first to relate this bad news to the Madison City Council.
This face recognition software will be used by somebody to monitor crime in Madison.
It just won’t be you.
As the crime levels increase because of the council’s malfeasance, somebody will sell this info to those willing to pay.
Which means those least able to pay will take it up the ass!
They could just call it the Looter And Rioter Protection Ordinance (LARPO).
Sometimes knee-jerk liberalism can work in our favor. This is one case. Another case would be religious freedom ensured by the demands of Muslims. What Progressives/totalitarians are up against is individual and minority rights. As we Deplorables may very well find ourselves oppressed because of our political views, it's wise to remember that.
tim maguire: The technical hurdle that it’s more difficult to identify black faces than white ones (unsurprising if you think about it) is now evidence of racism.
Science baby! It is annoying when it goes against the latest SJW mantras.
Now do facial recognition and Asians! Of course that won't be racist, because not blacks.
The real problem was when Chicago Mayor Lori Lightfoot was identified not as a Black woman, but as "Space Alien." Lightfoot responded angrily, "It's a trap!"
I’m still a software developer and still an expert in probabilistic machine learning, FWIW. Here are my thoughts, also FWIW conditioned on the above:
The first item is a genuine problem, but is solvable partially by applying known principles of human physiology to construct the prior probabilities going into the system (yes, I know the systems are probably based on the “deep learning” fad rather than Bayes; that’s part of the mistake) and partially by the second point.
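To make that first point concrete, here is a minimal sketch (with purely illustrative numbers) of why an honestly constructed prior matters: in a large database, the base rate of "this is the person sought" is tiny, and a sound system lets that prior swamp a merely decent likelihood.

```python
# Minimal sketch: Bayes' rule with an explicit, honestly constructed
# prior rather than one inherited from a skewed training pool.
# All numbers are illustrative assumptions.
def posterior(likelihoods: dict, priors: dict) -> dict:
    """likelihoods: P(features | class); priors: P(class).
    Returns P(class | features) via Bayes' rule."""
    unnorm = {c: likelihoods[c] * priors[c] for c in likelihoods}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}

# The raw likelihood favors "match", but the honest prior (most faces
# in a database are NOT the person sought) swamps it:
print(posterior({"match": 0.8, "no_match": 0.6},
                {"match": 0.001, "no_match": 0.999}))
# -> {'match': ~0.0013, 'no_match': ~0.9987}
```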
The second point is not racist because it’s not racist to observe that recidivism exists. It would be irresponsible to develop a system that failed to take the phenomenon into account. It’s very important prior information in probabilistic reasoning.
The third point is also not racist: the FBI has the legal authority to seek permission, and sometimes be granted that permission, to surveil suspected domestic terrorist organizations. Some such domestic terrorist organizations are race-based. Black Lives Matter is one such race-based domestic terrorist organization by the definition used by the Department of Justice.
Now, as a software engineer with expertise in probabilistic machine learning and a registered big-L Libertarian, I share the concern several here have expressed about government surveillance, to say nothing of surveillance capitalism generally. My guy in the 2016 primaries was Rand Paul, who stands for sound money and against military adventurism, and literally filibustered drone surveillance of American citizens. But the argument that face recognition software is “racist” offered here only reveals one technical issue, and even that can’t be called “racist” given the possibility that black female faces are a priori harder to identify than others. Given the data sets in question, the first thing I would try would be a Maximum Entropy classification, precisely to see what categories are suggested by the data themselves. Then I’d ask how the data were generated, e.g. were the images taken with high-definition cameras with high dynamic range? Do we have just a single image per face, or several with which to do 3D reconstruction? Do we have several in a time sequence so we can interpolate the effects of motion? Etc., etc.
Finally, it would be wise to demand that even given the most legitimate, above-board application of the technology, the software be required to only offer zero knowledge proof of identity of a match, and use a shared secret among the relevant officials to allow multiple must-certify decisions to reveal the actual match.
We have the technology to make objective decisions automatically, prove knowledge of facts without revealing those facts, and split sensitive data in such a way that N parties must agree to decrypt it. We should demand our government use such technology and allow it to be audited by anyone who wants to.
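The "N parties must agree to decrypt" piece, in particular, is old and well understood: it is classically implemented with Shamir's k-of-n secret sharing (set k = N for unanimous agreement). A minimal sketch with toy parameters follows; a real deployment would use a vetted library, not hand-rolled field arithmetic.

```python
# Minimal sketch of Shamir's k-of-n secret sharing: any k shares
# reconstruct the secret; fewer reveal nothing. Toy code, not crypto.
import random

P = 2**127 - 1  # a Mersenne prime serving as the field modulus

def make_shares(secret: int, k: int, n: int):
    # Random polynomial of degree k-1 with the secret as constant term;
    # each share is a point (x, f(x)) on that polynomial.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares) -> int:
    # Lagrange interpolation evaluated at x = 0 yields the constant term.
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, P - 2, P)) % P
    return total

shares = make_shares(secret=424242, k=3, n=5)
assert recover(shares[:3]) == 424242  # any 3 of the 5 officials suffice
```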
So face recognition software is bad? Okay.
But signature matching is darned near infallible? Okay?
Amazing how the left radically shifts its opinions on what are essentially two highly similar pieces of math-based software.
They could just call it the Looter And Rioter Protection Ordinance (LARPO).
So the protestors in Portland were only "larping"?
Zero-knowledge proof of identity, darn it.
These are all old studies these idiots keep referencing. Facial recognition uses the same general framework as most image recognition. You take a bunch of labeled pictures and you train your algorithm with them. You have the full pool and you create test batches out of it to further optimize the accuracy.
The first algorithms were trained with picture pools that had fewer black women in them. They were also spectacularly bad with Asians and other groups. Since then they have added way more pictures of all sorts of people.
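The skew is trivial to surface before you ever train, simply by counting the pool; a minimal sketch with invented group labels and counts:

```python
# Minimal sketch: audit a labeled training pool for subgroup balance
# before training. Group labels and counts are invented.
from collections import Counter

training_pool = (["light_male"] * 800 + ["light_female"] * 600 +
                 ["dark_male"] * 300 + ["dark_female"] * 90)

counts = Counter(training_pool)
total = sum(counts.values())
for group, n in counts.most_common():
    print(f"{group:13s} {n:5d}  ({n / total:.1%})")
# A group at ~5% of the pool contributes ~5% of the training signal;
# error rates on that group usually reflect it.
```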
These idiots don't need to worry about facial recognition though. Stride-matching algorithms are almost as accurate as a fingerprint, don't really care which direction you are facing or how far away you are, and aren't nearly as sensitive to picture resolution. It would be way easier to fuck with facial recognition too.
"These error-prone, racially biased algorithms"
There's error and there's bias. Not the same.
"Across the U.S., Black people face arrest for a variety of crimes at far higher rates than white people."
Ah, one wonders why.
"Third... the entire system is racist"
Well, OK then.
"Surveillance of Black people in the U.S. has a pernicious and largely unaddressed history"
Unaddressed? How about blacks address it by not committing any crimes for, oh, the next decade? Just to shame whitey.
How is face surveillance an anti-Black technology?
Because everything is?
protip
if the Madison City Council is opposed to something... the thing is probably pretty good
if the Madison City Council is FOR something.... the thing is probably fascist
Paul Snively said...
We have the technology to make objective decisions automatically, prove knowledge of facts without revealing those facts, and split sensitive data in such a way that N parties must agree to decrypt it. We should demand our government use such technology and allow it to be audited by anyone who wants to.
You know who and what is driving all this. You know who the movers and shakers are in ML.
You know our government is run by idiots compared to the true evil we face.
Google, Facebook, Microsoft and Amazon. And now apparently SalesForce is back in the running with their purchase of Slack.
They must all be broken up.
Now do fingerprints.
How can facial recognition technology work if They All Look Alike?
I'm sure someone like Mr Snively could easily modify the program so it spits out as many wrong answers for white men as it does for black women. Sure, raising the accuracy for black women would be better, but lowering it for white men is easy. You could have it produce random errors when looking up white men, or just output the top N possibilities for white men, and only the top 1 for black women. Fiddle with the numbers by race and sex, and you can probably make the accuracy equal for everyone.
It would make more work for the police when looking for a white guy. They'd have to eliminate some possibilities another way. Still, it might be worth it.
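A minimal sketch of that handicapping idea: make the candidate-list length a per-group policy knob, so the better-served group gets a deliberately vaguer answer. The group names, scores, and knobs are invented.

```python
# Minimal sketch: equalize groups by degrading the best-served one,
# e.g. return top-N candidates for one group and top-1 for another.
def identify(scores: dict, group: str, top_n: dict) -> list:
    """scores: candidate_id -> match score. Returns a candidate list
    whose length is set by a per-group policy."""
    n = top_n.get(group, 1)
    return sorted(scores, key=scores.get, reverse=True)[:n]

policy = {"white_male": 3, "black_female": 1}  # hypothetical knobs
matches = {"id_a": 0.91, "id_b": 0.88, "id_c": 0.40}
print(identify(matches, "white_male", policy))    # ['id_a', 'id_b', 'id_c']
print(identify(matches, "black_female", policy))  # ['id_a']
```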
"Surveillance of Black people in the U.S. has a pernicious and largely unaddressed history,..."
I was chuckling at this one. This has been well expanded to include ALL Americans now. So we've managed to get equal under the rules of surveillance. We are all under surveillance during the course of our day, out and about or at home online. Or just watching TV, calling your mom, going to the grocery, getting gas. It's not systemically going after any one group anymore. They are watching everyone. And by 'they' I mean our own governments (Federal & State), and foreign governments, and our Tech Overlords. Hell, it's hard to tell the good guys from the bad guys any longer.
Except for one thing: while we're all being surveilled, only the conservatives are being censored.
Let me know when “you white folks can’t tell us apart” is no longer part of jury deliberations.
I don't think facial recognition technology should be used.
However, to cite a 2018 study -- likely using work done in 2016/2017 -- on a technology that is evolving rapidly to make any kind of point is stupid.
It would seem that the real question is, "How effective do we want the police to be?". And the Left wants the answer to be "Very effective against our enemies, completely ineffective against us.".
Lack of facial recognition technology means that Althea Bernstein's "frat boy" attackers are still on the loose! Well that, and the fact they were imaginary all along ...
China, who's way ahead on surveillance, also uses gait analysis to track people - so, just to be on the safe side, walk silly to throw them off.
In Biden's America it won't matter because we will all be bemasked 24/7.
Even in our homes so Alexa can't rat us out either.
As a privacy first kind of guy, I'd appreciate being in any group where the error rate is so high.
Much harder to convict me of a crime.
I'm still pissed about the bridge-cam snapping my license plate, so, no, I don't want to be surveilled 24/7.
If the police have a photo of an unidentified suspect, say a fellow who shoots a clerk in a convenience store and is caught on the security camera, under this ordinance, are the police permitted to visually compare the photo to photos contained in a physical book of mugshots? If so, and if the photos being compared are not too clear, are they permitted to use an enhanced seeing device, like a magnifying glass, to compare the photos? If so, and if they still cannot make a match, are they allowed under this ordinance to shine a really bright light on both photos to aid in the comparison?
If they can use all these types of fairly primitive matching technology, all of which must be pretty error-prone, why are they barred from using better technology that is likely less error-prone and more effective? Or are we just not all that concerned with finding a match to the shooter of the convenience store worker?
Also, if the match is inaccurate, and the police question the person who they identify by the inaccurate match, wouldn't the fact that the match is inaccurate become readily apparent pretty quickly, i.e. the person questioned does not, in person, look like the guy whose image was caught on the security camera, or the person questioned has an alibi, or any other of myriad reasons? Because an investigative tool sends the police running down the wrong lead occasionally, do we throw out the tool?
"Much harder to convict me of a crime."
Well, that does raise a point, doesn't it. Fingerprints and DNA are considered positive ID by the courts. They are slowly backing off of hair and bite marks. The way these things work, some Professor tells some court that he can ID people by their [pseudo-science jargon goes here], and the court believes him, and that becomes a precedent, and thereafter all courts allow it. Until some hard-charging attorney gets his own Professors, and gets some judge to allow them to testify to the contrary.
So, can a facial identification be accepted as evidence in court?
"Following a national reckoning this year on over-policing in communities of color"
God forbid that we actually protect "people of color" from criminals!
What a bunch of hateful racist scum the Leftists are
BothSidesNow said...
If they can use all these types of fairly primitive matching technology, all of which must be pretty error-prone, why are they barred from using better technology that is likely less error-prone and more effective? Or are we just not all that concerned with finding a match to the shooter of the convenience store worker?
Because the computerized facial recognition is faster and takes less effort, which leads to more criminals getting caught.
Which means fewer "people of color" being victimized by criminals. Apparently the Left hates the idea of "people of color" not being victims.
“But it sounds like something that could be fixed.”
“If not, it could open up a lot of opportunities in crime for Black women.”
I really don’t get the logic. It is supposedly somehow bad that black women are hard to identify so that they can be arrested for their crimes. Of course, that is racist - in favor of black women who don’t want to be arrested for their crimes.
The problem being faced is that we are rapidly becoming an image-saturated society. We live in a brand new subdivision. The houses all came with doorbell cameras and are wired for a couple more cameras. I think that it is a loss leader for the electric company the builder used, which has recently expanded into security systems. I finally got most of it hooked up yesterday. (The trash truck was just picked up by both the doorbell camera and the house-front camera, which alerted me as I type this.) Much better resolution than the cameras I had installed three years ago at our last house. This all means that the PHX PD, AZ DPS, or the feds could probably track anyone as they walked or drove through the neighborhood. Being moderately paranoid about this sort of privacy intrusion, I did opt out of authorizing the police to use my cameras, unless my security system has sent them an alarm. Are they honest? Could they override it, for some sort of national emergency (like failing to wear a mask to supposedly prevent the spread of COVID-19)? Probably legally no... Still, I have all their stuff on a separate, daisy-chained server that I can isolate from the Internet with a flip of an A/B switch.
My partner is a bit ambivalent about all this. She likes that miscreants can be easily tracked as they come through the neighborhood, but refuses to have any cameras inside the house. Outside door sensors are fine, but getting inside motion detectors was like pulling teeth. She does feel a lot more secure (despite my having had six cameras at the old house, and more motion detectors etc), and for a woman over 60, that is important to her.
I am still putting my office together in the new house, and I kinda went on a computer equipment buying spree. Bought a board that can drive up to six monitors (required a 500 watt power supply, which had to come first). Replaced my four old monitors with brand new ones, thanks to Black Friday. And got a new web cam. Installed it in the middle of the quad rack of monitors last night. And interestingly, one of its key features is a privacy shield, which can be slipped over the camera. Why would that be an issue? Because it is apparently possible for other parties (presumably including the police) to sometimes seize control of your cameras. If you think about it, scary times, esp if you worry about Big Brother.
On the flip side, I have Apple’s facial recognition system on two of my iOS devices. Normally, it doesn’t work that well identifying me, and after a couple failures, I usually have to put in the 6-digit code. They have one person to identify, and probably experience a 75% failure rate doing it (100% if I forgot to remove my stupid mask). Their fingerprint system was better - maybe only a 50% failure rate. You see crime dramas set in NYC, where they are scanning thousands of cameras at once throughout the city and are able to match and follow targets in very short order. Are we there yet? I doubt that the Feds could make it work that much better than Apple can, over an extremely heterogeneous set of cameras. And I doubt it even more of the NYPD (or worse, the PHX PD here).
Still, as I noted, the router used by my security system, and all the cameras, is on an A/B switch, and the web cam on my computer has an on/off switch on the USB hub it is attached through - so I really don’t need the privacy shield. But then the voting machines were supposed to have been air gapped from the Internet, and apparently weren’t.
NCMoss said...China, who's way ahead on surveillance, also uses gait analysis to track people - so, just to be on the safe side, walk silly to throw them off.
or...Invest in a badly fitting pair of orthopedic shoes with mismatched lifts... that force you to walk in an unnatural way.
I may start committing my crimes while identifying as a dark-skinned Black woman--lesbian, with a thing for young white hotties. But I digress.
Temujin puts it well. The question isn't IF we are surveilled but by whom and for what purposes, and what if anything can the average dog do about it?
Narr
Your answer here
I once had to take passport photos as part of a job I had. It was extremely difficult to get a clear face shot, as required for passports, when the skin was dark, because shadows that went unnoticed on light-skinned features, even making them clearer, blurred dark-skinned features. I'm sure that correct lighting eliminates this problem, but taking good pictures in cloud and sunshine, of faces held at various angles and at different heights relative to the camera, shaded by hat brims and colored by lipstick, eyeshadow and different expressions, and moving - and then of a different light balance to start with ... well, perhaps that problem isn't solved.
But it's also true that I've been told that the police in Madison are picking out the criminals who assaulted police officers or threw fire bombs during Mostly Peacefuls this summer by using camera footage and face recognition. So naturally...
They even use 'gait analysis,' or how people walk...scary stuff.
If you've ever watched a police show set in modern-day England you will know that the country is blanketed with security cameras almost everywhere. Very sad.
Question: What do the police do in Beijing?
Cop: 'What did the robber look like?'
Witness: 'He was about 5'6" tall. Straight black hair. Dark eyes. Slim. He spoke Mandarin if that helps.'
Cop: 'Perfect, we'll nail the bastard in no time.'
Dust Bunny Queen said...
NCMoss said...China, who's way ahead on surveillance, also uses gait analysis to track people - so, just to be on the safe side, walk silly to throw them off.
or...Invest in a badly fitting pair of orthopedic shoes with mismatched lifts... that force you to walk in an unnatural way.
That isn't how it works.
If you walk with an unnatural pattern you will be identified more easily. Trying to put it into layman's terms here.
The software takes a series of thermal images.
It takes each of the pixels of the image as a parameter. Each pixel is considered a dimension in a vector space.
The algorithm takes each dimension of the vector and measures the changes in the pixels across different images as you move.
Bigger differences and weird stuff are easy to pick out. Just as differentiating between a human and a dog is easier than differentiating one human from another, a person with a strange walking pattern is going to be the easiest to ID and will take the shortest time.
At this point things devolve into a black box. You will need most of a year of linear algebra, a bunch of probability and a bunch of calculus and two semesters of ML courses to get to the point where you can describe it badly like I am.
Your only hope to fool these algorithms is to walk without pattern. In the end I think you would only delay recognition this way.
You will also need to shade your thermal output. I think creams that chill or heat patches of skin will be developed.
But these sorts of things will attract attention much like switching a SIM card to a new phone attracts attention.
Really there is no way on an individual level to defeat this technology. Everyone in the society would have to be sensitized and turned against it.
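For the curious, a minimal sketch of the pixels-as-dimensions idea described above: flatten each thermal frame into a vector, take frame-to-frame differences as the motion signal, and compare signatures by cosine similarity. The shapes and data here are synthetic, and real systems are far more elaborate.

```python
# Minimal sketch of a gait signature: frames become vectors, motion
# becomes differences between vectors, matching becomes a dot product.
import numpy as np

def gait_signature(frames: np.ndarray) -> np.ndarray:
    """frames: (T, H, W) stack of thermal images of one walker."""
    vecs = frames.reshape(frames.shape[0], -1)  # T points in R^(H*W)
    deltas = np.diff(vecs, axis=0)              # motion between frames
    sig = deltas.mean(axis=0)                   # crude average motion
    return sig / (np.linalg.norm(sig) + 1e-9)   # unit-length signature

rng = np.random.default_rng(0)
a = gait_signature(rng.random((30, 16, 16)))
b = gait_signature(rng.random((30, 16, 16)))
print("cosine similarity:", float(a @ b))  # near 0 for unrelated walkers
```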
"Really there is no way on an individual level to defeat this technology."
Doesn't matter.
In Biden's America we'll never be allowed to go outside again.
Hail Biden!
SteveBrooklineMA: I'm sure someone like Mr Snively could easily modify the program so it spits out as many wrong answers for white men as it does for black women.
Funny you mention that. That's differential privacy, and there are companies that offer it as a platform on which to build other services.
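For what it's worth, the textbook core of differential privacy is calibrated noise. A minimal sketch of the Laplace mechanism on a count query follows; epsilon and the query are illustrative, and whether deliberately injected recognition errors would meet a formal DP guarantee is a separate question.

```python
# Minimal sketch of the Laplace mechanism: release a count plus noise
# scaled to the query's sensitivity (one person changes a count by 1).
import random

def dp_count(true_count: int, epsilon: float) -> float:
    sensitivity = 1.0
    # A Laplace(b) draw is an exponential of mean b with a random sign.
    noise = random.expovariate(epsilon / sensitivity)
    if random.random() < 0.5:
        noise = -noise
    return true_count + noise  # an epsilon-DP release of the count

print(dp_count(128, epsilon=0.5))
```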
Bruce Hayden said...I really don’t get the logic. It is supposedly somehow bad that black women are hard to identify so that they can be arrested for their crimes. Of course, that is racist - in favor of black women who don’t want to be arrested for their crimes.
You beat me to it, Bruce.
Logic, like math, seems to be really difficult for leftists to understand.
"I really don’t get the logic. It is supposedly somehow bad that black women are hard to identify so that they can be arrested for their crimes."
The logic would be the risk of false positives leading to charges against innocents, based on weak tech.
When I read the headline, I immediately thought of several instances of face recognition apps being particularly bad at identifying the faces of black people. This sounds like a legitimate concern to me. Even if there is ultimately no charge, if a person is flagged as suspicious and stopped and questioned by police for something they had nothing to do with, this is an infringement on their liberty. There is harm done. I don't think it is fair to call it racist, because there is no motivation to be racist there. And the whole point of using a disparate-impact analysis was to use it as evidence of hidden racist motivation, i.e. where some seemingly arbitrary standard is devised that just so happens to affect lots of black people but not lots of white people.
That being said, the reasoning as a whole seems off to me. However, I still like the outcome. I don't like the Big Brother idea of government monitoring people with cameras everywhere they go outside. And some people might say, hey, you don't have any expectation of privacy while you are outside in public, but I don't know if a binary analysis is appropriate for that. I don't think it is unfair for most people to expect the government not to surveil their every move while walking around outside.
NCMoss suggests: walk silly to throw them off
Time to retrieve those old Monty Python videos.
"Following a national reckoning this year on over-policing in communities of color, Madison and other governments have scrutinized and limited the use of face surveillance systems by law enforcement."
Over-policing is the problem? Someone should tell the good citizens of Chicago, Minneapolis, St Louis, Baltimore, Philadelphia, Portland, DC and Seattle, to pick a few places with substantial "communities of color". They all seem to be experiencing the opposite problem, and the soaring crime rates for murder, assault, and all the rest that goes with it. No doubt, just part of the summer/fall/winter of Love, Love, Love that the wokest and leftiest of Dem mayors consistently deliver for their cities.
@Bruce Hayden:
I really don’t get the logic. It is supposedly somehow bad that black women are hard to identify so that they can be arrested for their crimes. Of course, that is racist - in favor of black women who don’t want to be arrested for their crimes.
The issue is not with being unable to identify a black woman who committed a crime but with misidentifying her and thus implicating an innocent person.
The issue is really with how the developers market their platforms. High accuracy and low rates of error are part of the sales pitch. The numbers are accurate but obfuscatory because of the benchmarks used to determine them and the manner in which they are reported. Reporting accuracy data in aggregate obscures the fact that there is wide variance between subgroups.
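That aggregation effect is easy to demonstrate; the per-group counts below are invented for illustration.

```python
# Minimal sketch: a headline accuracy figure hiding subgroup variance.
groups = {  # group: (correct, total) -- invented numbers
    "light_male":   (4950, 5000),
    "light_female": (2820, 3000),
    "dark_male":    (993, 1000),
    "dark_female":  (130, 200),
}

correct = sum(c for c, _ in groups.values())
total = sum(t for _, t in groups.values())
print(f"headline accuracy: {correct / total:.1%}")  # 96.7%, sounds great
for g, (c, t) in groups.items():
    print(f"  {g:13s} {c / t:.1%}")                 # dark_female: 65.0%
```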
No need to surveil the black part of Madison. Laws are for white people.
Blacks are caught far more often than whites AS A PERCENTAGE OF POPULATION. Blacks are only 13.4% of the population, and while the number of whites arrested (5,626,140) is greater than the number of blacks arrested (2,221,697), as a percentage of population blacks get arrested more often.
https://ucr.fbi.gov/crime-in-the-u.s/2017/crime-in-the-u.s.-2017/tables/table-43
But whites commit fewer murders: 4,188 vs. 5,025. And that is huge considering blacks make up 13.4%! In fact, go look at the FBI stats... 13.4 percent of the population (blacks) commits most of the crime, again as a percentage of population.
Now is that statement racist? If you look at the arrest stats in the UK:
"There were 671,126 arrests between April 2018 and March 2019, almost 5,000 fewer arrests than the previous year."
"Black people were over 3 times as likely to be arrested as White people – there were 32 arrests for every 1,000 Black people, and 10 arrests for every 1,000 White people."
Yes, in the UK blacks are arrested more too!
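Putting rough per-capita numbers on the figures quoted above (the US population total, and mapping the quoted white arrests onto the rest of the population, are illustrative assumptions, not from the quoted sources):

```python
# Rough per-capita arrest-rate comparison from the quoted figures.
US_POP = 325_000_000                       # approx. 2017, assumed
black_rate = 2_221_697 / (0.134 * US_POP)  # arrests per black resident
white_rate = 5_626_140 / (0.866 * US_POP)  # assumes whites = the rest
print(f"US ratio: {black_rate / white_rate:.1f}x")  # ~2.6x per capita

# The UK figures are already per-capita: 32 vs. 10 arrests per 1,000.
print(f"UK ratio: {32 / 10:.1f}x")  # the "over 3 times as likely" above
```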
Anything to increase black crime and therefore keep their communities more under the thumb of Democratic Party government.
It is puzzling to me how banning facial recognition software helps any group. Fingering suspects correctly in line-ups and from mugshots is rife with group bias and the limitations of human memory. Using technology is almost certainly faster and at least as reliable as a person going through book after book of mugshots.
If the algorithms are flawed, then fix them. Why ban technology that has the potential to remove bias from the system?