Who Has Your Face? The Fight Against U.S. Government Agencies' Use of Face Recognition
3:50PM Jul 31, 2020
Greetings from Philadelphia. This is Bernie S, and this is my backyard, home to some 50,000 honeybees. [unintelligible remarks about the bees]
The fight against government use of facial recognition technology is an important one, being fought by civil liberties and other activist groups. Unfortunately, the technology is already out there in use, endangering people's privacy. In our next talk, speakers from the Electronic Frontier Foundation will explain issues with facial recognition tech, where we stand legally in relation to its use, and their new interactive site, which can reveal which agencies have been using your image. We present Dr. Matthew Guariglia and Jason Kelley with "Who Has Your Face: The Fight Against US Government Agencies' Use of Face Recognition."
[Clip plays:] "This is a restricted area."
Hi, I'm Jason Kelley, the Associate Director of Research at the Electronic Frontier Foundation, and I'm going to talk today, along with my colleague Matthew Guariglia, about the dangers of face recognition, especially in the hands of government and law enforcement. The reason the talk is called "Who Has Your Face" is that I want you to think about that question as you watch the presentation, and hopefully at the end you'll have some answers about who, in fact, does have your image for facial recognition purposes. Now, we just saw a clip from the 1987 movie RoboCop that is a little far-fetched in many ways, but actually demonstrates some of the dangers of police use of face recognition. You saw that Murphy couldn't identify an assailant, so he went to the police station to use the face recognition tool to determine the identity of the person he had earlier killed, and he was able to do that despite the fact that he didn't technically have access to that tool, by pressuring people with a giant spike, in this case. But that's exactly the kind of problem that real-life facial recognition use by law enforcement has: in many cases, there are police who have access to the tool for one purpose, or shouldn't have access to the tool at all, and use it for a different purpose or find a way to access it for illegitimate purposes. For those of you who don't know, EFF is a member-supported civil liberties organization; we work to ensure that when you go online and when you're in digital spaces, your use of technology is protected and your rights are protected as well. Actually, this year we're celebrating our 30th anniversary. EFF was founded by its co-founders in 1990 and turned 30 years old just this month. It was built before the World Wide Web.
If you're not familiar with EFF, we've done some really fun things for our 30th anniversary, including a seven-hour livestream that aired a few weeks ago, on our actual anniversary, with DJs and a bunch of other cool stuff. I hope you can check that out if you weren't able to watch it when it aired; it's available at the Internet Archive and on social media platforms. If you'd like to learn more about what EFF does, you can go to eff.org/30. The fights against misuse of technology, to protect your privacy, and to fight back against surveillance have not slowed down in those 30 years, so EFF's work is more important than ever, and we'd appreciate it if you'd take a look at what we work on and potentially consider becoming a member if you're not already. I want to start by positioning EFF's fight against facial recognition technology, and its misuse by government and law enforcement, within the broader work that EFF does, which you can categorize in basically three ways: impact litigation, technology projects, and activism. A few examples of issues EFF has worked on over the years: in our impact litigation legal work, we've sued the NSA over mass spying, as well as AT&T, and we've sued the Department of Homeland Security for illegally searching devices at the border. We also have a Coders' Rights team that you may have heard of, which helps defend programmers and developers who are doing security research. And with our technology projects team, we build privacy-protective software. You might be familiar with the difference between an HTTP website and an HTTPS website; in fact, you're probably more familiar than I am. The team behind Let's Encrypt and Certbot helps people install those SSL certificates, I hope I'm getting that right, to ensure that websites are secure and encrypted.
The Privacy Badger team builds a tool called Privacy Badger that protects you from third-party trackers, the cookies that follow you around the web, to make your experience of the web more private. Then we have an activism team; that's the team that I'm on and the team that Matthew is on. We focus on digital campaigns around legislation, and also on broad education. You might have noticed that a few months ago the .org domain was potentially going to be taken over by a private equity firm, and the activism team, along with legal and tech, worked on that. We also do education on the activism team. What that means, essentially, is that we build websites and tools that help educate people about technology and its use.
One of the educational tools that we've created is called Street-Level Surveillance. Street-Level Surveillance is essentially an educational website that tells people about the sorts of surveillance that are out there being used by law enforcement. Face recognition is one example, but additional examples might be body-worn cameras, cell-site simulators, social media surveillance, and drones. So why does this matter? Why does EFF fight back against unrestrained surveillance? To put it bluntly, surveillance like this makes it harder for us to live our lives. If you know that you're being watched by police because they're using automated license plate readers to follow your car wherever it goes, are you more or less likely to do things, even if those things aren't dangerous? Just knowing that you're being surveilled can kill your ability to freely associate and express yourself. This can also infringe on your First Amendment rights. For example, after the Freddie Gray murder in Baltimore a few years ago, police used social media surveillance to determine who had been at the protests and arrest those individuals. If you feel a restriction on posting things on social media because police are going to surveil it, you're less likely to use it and post those things. These different pieces of surveillance all come together to create a broad, dangerous surveillance network run by both government and law enforcement that, overall, can make it less likely for you to live the life that you would like to live. With EFF's work on digital rights, we tend to focus on the surveillance tools themselves, but the way these tools work can restrict everything that you do, from online life to offline life. So how is EFF fighting back against misuse of facial recognition technology within the context of these three areas? Primarily through activism.
We know that right now there's time to ban facial recognition technology's use by government and law enforcement, but people need to know how it's used, what it does, and that they can actually ban this tool. Usually EFF's work involves a lot of blog posts, a lot of white papers, and a lot of testimony: basically, a lot of text. But what we've worked on over the last few years is expanding into more interactive tools that can give people a clearer sense of how surveillance affects them. Rather than just saying, this is what a surveillance camera looks like, the better option would be for us to walk through a city with people, give them a tour, and say, that's a surveillance camera and here's how it works, so that you know where surveillance cameras exist within your own community. We're working on that. To start, in 2018 EFF launched Spot the Surveillance, which is a VR experience that teaches people how to identify spying technologies that police might use in their own communities. It's not specific to any area; it's just a single location. But the idea is to educate people on how surveillance works and what it looks like. Just this month, we launched the Atlas of Surveillance. The Atlas is a map, as well as a searchable database, that lets us zoom around the US, looking at which police departments use which surveillance tech, from drones to cell-site simulators, body-worn cameras, and facial recognition technology. You can see in this version of the map that I'm selecting which Bay Area police departments use facial recognition technology, and then I'm zooming out to show you that there are 360 overall across the United States. That's just the beginning; that's certainly not all of them.
You'll notice a cluster in Washington, DC, as well as in Florida. In DC, that's because there are a lot of federal agencies that have used facial recognition technology. In Florida, there's been a push for a long time now to use facial recognition, and Florida actually has the oldest facial recognition database in the country. The Atlas was put together using publicly available records, like budgetary records and news stories about law enforcement surveillance tech. So if there was a budget item from a police department for a facial recognition tool, it might be in the Atlas; or if there was a news story about someone being captured, apprehended, or identified using facial recognition, that might also be in the Atlas. But there are thousands and thousands of departments we simply don't have the research for yet, because the Atlas was just launched, and it may never be comprehensive. Additionally, most states and cities don't actually require police to hold a public hearing about the surveillance tech that they buy, so it might never become clear what they have access to. So we have the Spot the Surveillance tool, and we have the Atlas of Surveillance tool. In the case of face recognition, what we'd really like is to build a customized list telling you who has your face. That's what we tried to build with the Who Has Your Face tool: by answering a few questions, you can determine which agencies are likely to have access to your photo for facial recognition. I'll get into that in a bit. So, how does law enforcement hide its use of surveillance tech? I showed you what the Atlas can tell us, but there are lots of departments that we don't know use facial recognition technology. Here's how they do it: they use private companies, like Amazon or Clearview, to give them access to tools they wouldn't otherwise have. So let's go through a few examples of the kinds of facial recognition tools that police often use.
We'll start with Clearview AI. Clearview is a tool that came to light in January. Back then, no one really knew what it meant when a law enforcement department listed that it had a contract with Clearview. Now we know that Clearview AI is a facial recognition company that scraped three billion images from the web, from places like Facebook, YouTube, Twitter, and Venmo. The company claims that more than 1,000 law enforcement agencies have contracts with it. This includes the Justice Department, Immigration and Customs Enforcement (ICE), and even organizations like the New Jersey Attorney General's Office, who were surprised to learn, when the New York Times story broke, that police departments within New Jersey were not only using the tool, but that Clearview was using an image of the Attorney General to sell its services to other agencies.
Until just a month ago, Amazon's Rekognition software was a cheap and easily available facial recognition tool for law enforcement. You can see in this table, if you're familiar with the AWS (Amazon Web Services) payment structure, that Rekognition makes it easy for people to pay a small amount of money to submit thousands and thousands of images, one at a time or in bulk. Just this month, some companies have started to at least publicly recognize the danger this technology poses when it's used by government and law enforcement. Amazon has paused law enforcement use of the software until 2021; its claim is that by then there will be safeguards or regulations in place. IBM, meanwhile, has completely sworn off making the software. But there will always be some company willing to make and sell law enforcement this sort of technology. The bottom line is that we need to ban its use by government agencies. Perhaps a bigger problem than these private companies selling law enforcement agencies access to facial recognition software is the use of government databases that already exist for facial recognition. You might have seen a report that the Georgetown Law Center on Privacy and Technology put out a few years ago called The Perpetual Line-Up. One of the important points that report and website make is that over half of Americans are already in facial recognition databases accessible by police. That's because over the years, government agencies like DMVs across the country, as well as state and federal agencies like the Department of State, have implemented facial recognition technology on the databases that they already have holding images for your photo IDs.
You'll see in this picture a passport, a driver's license, and a visa. These databases all have facial recognition technology, and some of that technology, like with state DMVs, was initially used to defend against fraud, so that when you go to the DMV and take a picture, the DMV knows whether there's already a license issued to a person who looks just like you. But over the years, these state DMVs and federal agencies have started expanding the use of facial recognition technology beyond those fraud-protection purposes. Many federal agencies have been sneakily signing memorandums of understanding with state DMVs, allowing them to request access to the DMV facial recognition databases, which hold millions of headshots from government IDs. To give you a quick understanding of how facial recognition through DMVs is accessed by law enforcement and other government agencies, I'm going to play a quick video from Clare Garvie in the New York Times. Clare Garvie is a privacy expert and a facial recognition technology researcher at Georgetown, and this video appeared just a few months ago. You might know Clare from her work on The Perpetual Line-Up.
We're all getting comfortable with face recognition: unlocking our phones, skipping airport lines, and even unlocking front doors. But the convenience is blinding us to how risky this technology actually is, and how it is being used without us realizing. I'm Clare Garvie. My job is to research the use of facial recognition technology by law enforcement, and then make recommendations around the use of the technology. Right now, most Americans are in a perpetual police lineup because they got a driver's license. After that DMV agent snaps your picture, your face is turned into a face print, a unique series of numbers that a face recognition system can read and compare to other faces. Now any police officer can run searches against your face for any reason. Who robbed that corner store? Who was jaywalking at 3 a.m.? Who was at this protest? It's the digital equivalent of police walking through a crowd and yanking each of our IDs out of our pockets. You could be picked out, investigated, possibly arrested, and sent to jail, because you got a driver's license in one of these 32 states. That's a violation of your privacy and your Fourth Amendment protection against unreasonable search, and that's just the tip of the iceberg.
I think government-issued identification is basically a necessity at this point. Through those DMVs, law enforcement and government agencies now have access to more than 641 million photos for facial recognition purposes, photos of more than half of Americans. The total number of DMVs you'll see listed here on the screen with facial recognition is at least 43, with only four of those limiting data sharing entirely. That means that 39 DMVs share access to some of their data with other law enforcement and state or federal agencies. Now, this is an ongoing battle. Just this month, Missouri, which had banned facial recognition use by the DMV, passed a bill to implement it, during a closed session, within a huge omnibus package containing a lot of other bills. There was no room for citizens to comment, and the bill passed within a short period of time. These battles are ongoing, and there is time to fight back. In our research for the Who Has Your Face project, released a few months ago, we counted at least 27 states, which you can see on the screen, where the FBI can search or request data from driver's license and ID databases. In June of last year, the Government Accountability Office reported only 21. That means six new states began offering data from their facial recognition databases to the FBI in the last year. This is an ongoing and changing battleground. Here are the DMVs that definitely have agreements to share facial recognition database access in some way with local or state law enforcement. This means that even if agencies in these states are, for example, barred from using Clearview or Rekognition, they can often simply send a photo, whether a mug shot, a photo taken from a mobile device, or a photo pulled from social media, directly to the DMV, which can run the image through its database.
This isn't a comprehensive list, though; it's nearly impossible to know which agencies are sharing these photos and with whom. Each agency across the US, from state DMVs to the State Department, shares access to its photos differently, depending on agreements with local police, other states, and federal agencies. For the Who Has Your Face project, we submitted dozens of public records requests, but we were lucky enough to also use the data that Georgetown had found through its public records requests for its Perpetual Line-Up project. Unfortunately, even with public records requests, it can take years to compile the results, and you may not even get them. We were continuously thwarted in our research by non-responsive government agencies, by conflicting information between agencies, by agreements that had been superseded by new agreements, and by the generally covert and frankly opaque nature of these policies. This is a huge problem. It should be easy to learn who has your image for facial recognition purposes, but in fact it's not. Some DMVs give the public the precise number of requests that have been made for facial recognition. As an example, here's Wisconsin, which received 238 requests in 2016. There's a table on the screen of the kinds of crimes for which requests were submitted. You'll see a cybercrime column here with 13 total cases: 13 photos submitted and four matched. Now, where did they get the photos for cybercrime to submit to Wisconsin's DMV database? Who knows.
Another thing that you might see when you do a public records request for this kind of data is who made the requests and how many they made. A list of agency requests to Utah's DMV included Immigration and Customs Enforcement (ICE), the Department of Homeland Security, various state fusion centers, the Secret Service, and the United States Office of National Drug Control Policy. This list is 62 pages long; it starts in 2015 and goes to 2017. So there are hundreds and hundreds of agencies around the country that requested that Utah's DMV do a facial recognition search for them. Even when some agencies release data like this, it's still impossible to know precisely which agencies are sharing which photos and with whom. Now, which federal agencies have facial recognition? Unfortunately, many of them. A lot of them have their own systems, as well as tapping into the states' databases. The FBI, the Department of Homeland Security, TSA, CBP, ICE, and the State Department all use facial recognition themselves. That means that when you get a passport, a visa, or TSA PreCheck, the photos you submit are going into a database that can be queried not just for identification purposes, but for facial recognition, to determine whether you match, according to an algorithm, photos submitted by the department or by someone else. We shouldn't be forced to submit to criminal face recognition searches merely because we want to drive a car, fly on an airplane, vote, or buy alcohol, but right now we are. Now, here's a snippet of an ad from Clearview AI showing the difference between the number of photos that Clearview has, three billion, and the number of photos that the FBI has access to, which is now 600 million or so and was 411 million when this ad ran.
You might ask: is it at least better if the states have access to these photos, which we submitted supposedly voluntarily, rather than private companies, which scraped the information without our consent? Rather than think about which is better, we need to look closer at how face recognition works and why all law enforcement and government agency use of it should be banned. So let's just start with: what is face recognition? Face recognition systems use computer algorithms to pick out distinctive details of a face and turn them into a numerical template called a face print. That face print is essentially like a thumbprint: it gets matched against photos that get submitted to the system. You might have heard that face recognition is flawed, that the tool doesn't always match the correct image with the correct person. That's true, but that's not the only problem with face recognition. Even if face recognition matched photos 100 percent of the time, there would still be plenty of issues and plenty of dangers when the tool is in the hands of law enforcement. I'll get into that more, but first let me introduce my colleague Matthew Guariglia. He is an expert in police tech and police surveillance.
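[Editor's illustration.] To make the "face print" idea concrete, here's a minimal sketch in Python. This is not any vendor's actual algorithm: real systems derive the numbers with a neural network trained on millions of faces, and the vectors, distance metric, and threshold below are all invented for illustration.

```python
import math

def face_print(features):
    # Toy stand-in for a real embedding network: a face print is just
    # a fixed-length list of numbers derived from an image.
    return [float(x) for x in features]

def distance(a, b):
    # Euclidean distance between two face prints; smaller means more similar.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(probe, enrolled, threshold=0.6):
    # A system compares the probe print against an enrolled print and
    # reports anything under a tuned distance threshold as a "match."
    return distance(probe, enrolled) < threshold

enrolled = face_print([0.11, 0.52, 0.33, 0.90])    # e.g. from a DMV photo
probe_same = face_print([0.10, 0.50, 0.35, 0.88])  # new photo, same person
probe_other = face_print([0.80, 0.10, 0.95, 0.20]) # different person

print(is_match(probe_same, enrolled))   # True
print(is_match(probe_other, enrolled))  # False
```

The key point is the threshold: the system never says "yes" or "no" with certainty; it only reports whether two templates are closer than some tuned cutoff, which is where false matches come from.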
The fact of the matter is that in the hands of an unaccountable criminal justice system, this tech is actually very dangerous. I find there are two main ways that face surveillance does harm to whoever encounters it. First are the very physical, legal, and financial harms that come with being misidentified. In January 2020, Robert Williams was arrested by the Detroit Police Department in his driveway in Farmington Hills, Michigan. Why was he arrested? He was arrested because, after scanning a blurry security camera still, the DPD's algorithm spit out Mr. Williams' name as a possible match. In Detroit, this seems to be a regular occurrence: less than a year earlier, Michael Oliver was arrested after an algorithm identified him as an investigative lead and an eyewitness chose him out of a lineup, despite the fact that there was almost no other corroborating evidence. Racially biased machines, like racially biased eyes, can misidentify people. And I say racially biased machines because the training data set is itself skewed toward identifying, and therefore misidentifying, Black and brown suspects. If the data being fed into the software has been generated by a criminal justice system that overpolices and haphazardly arrests and charges racial minorities, then the results will also skew in that direction, because that is what has been collected.
But the system itself is broken, too. Right now, police don't have to submit evidence that they've used facial recognition to identify suspects, most of the time. What that means is that suspects can be identified, supposedly, by an algorithm that they can't interrogate, and they don't know how many other people might have been matched through the algorithm. That makes it impossible for the accused to defend themselves and to have due process. And this isn't new. As Matthew explains, the rogues' gallery has had this effect since its creation.
Now I want to take us back over a century, because I think this point really illustrates the fundamental problems of this technology. On the night of May 10, 1899, gambler Doc Owens was arrested on suspicion after he was accused of cheating at cards. At the police station, he was photographed, and his measurements were taken and retained by the New York City Police Department. Owens was let go, because there was no warrant for his arrest and no real reason to keep him, but his photograph remained with the police. As one columnist in Chattanooga, Tennessee who covered the story wrote, quote: "Once a photograph is hung in the gallery there is no removing it. A policy of that kind is utterly indefensible. It is a cruel injustice and a constant bar to reform. Youthful indiscretion is put on the same plane as murder. It provides no loophole through which the victim may escape the dire results of criminal publicity. Nobody is going to employ a young man or young woman whose picture adorns the rogues' gallery," end quote. Simply put, once people were found guilty, or even deemed suspicious, the retention of data would ensure that they remained that way forever. Their face would be searched, as it is now, over and over again, and compared to every subject. Every time someone else commits a crime, hundreds or thousands, or now even millions, of faces are searched over and over again for guilt, and that forever erodes the presumption of innocence for anyone who has gotten a driver's license or has been arrested.
The way police tend to use face recognition, just literally the way they use the tool, is also often dangerous. You might have heard of a story where a suspect, who you can see on the left here in the image, was captured on a surveillance camera. The camera's image wasn't very clear; it was blurry, and it didn't return any results with significant confidence from the facial recognition tool. But police noticed that, in their minds at least, this potential suspect looked a little bit like Woody Harrelson, so they put a photo of Woody Harrelson through the facial recognition tool to see what would come back. This comes from a study by the Georgetown Law Center on Privacy and Technology called Garbage In, Garbage Out. It's not the only example of police submitting questionable images to facial recognition tech; they've also submitted police artist sketches to see who they can get back, just as an example. There are more issues with facial recognition technology than that, but that's a little outline of some of the problems that we have with it. Now, you might be wondering whether EFF thinks that face recognition is inherently bad. The answer is no. This is a short demo from Apple's announcement of Face ID, the tool that Apple iPhones use to unlock your phone: after taking a face print of your image, every time you turn the phone towards yourself, it determines whether the face it's seeing matches that face print.
Here's how it works. Every time you glance at your iPhone X, it detects your face, and the flood illuminator lights up your face, even in the dark. The IR camera takes an IR image. The dot projector projects out over 30,000 invisible IR dots. We take the IR image and the dot pattern and push them through neural networks to create a mathematical model of your face, and then we check that mathematical model against the one that we've stored, the one you set up earlier, to see if it's a match and unlock your phone. This all happens in real time, and it's all invisible: you don't see any of these things going on.
So what makes face recognition good or bad? What makes it face recognition versus face surveillance? We've got a couple of simple guidelines. Can you opt out of using it? With Apple's Face ID, you don't have to use it. Was your image collected surreptitiously, or was it collected with your consent? Again, with the Face ID example, your image is used with your consent, unless someone takes your phone and unlocks it by holding it in front of your face. And is it used to track you? Now, in most cases, it's going to be very difficult for law enforcement or government agencies to have you opt into the tool, to use your image with your consent, and not to track you, so if the government is using this technology, which again is just a technology, just a tool, it's most likely going to fail one of those tests. If its primary function is to invade people's privacy, that's face surveillance, and that should be banned for government and law enforcement use. But it's not always simple. In this example, you'll see an ad for a tool by MorphoTrust, or IDEMIA, a company that sells its facial recognition technology to governments and law enforcement. So watch this and tell me if it's clear from the commercial whether you can or can't opt out.
Now I'm going to go through a couple of other fun examples of face recognition technology. This is a tool by Microsoft that can supposedly recognize emotion, that can perceive varieties of emotion. This is different from facial recognition in that it's not matching one image to another; it's using an algorithm to tell someone the perceived emotional state of an individual in a photo. And one of my favorite things about this example is that it very specifically says that facial expressions alone do not represent the internal states of people. Now, I picked this image of a protester at a protest in Portland. This is the kind of image that you might see in RoboCop used to determine that someone is a threat. On the right-hand side you'll see an example list of emotions, and the emotion that scores highest according to the tool's confidence level is anger. That may or may not be correct, but it's an interesting example of Microsoft's tool. Now I'm going to pick on Microsoft a little bit more, because their tool is one that's publicly available. Here is a facial recognition tool that Microsoft lets you drop two images into, and it will tell you its confidence level in whether or not they show the same person. And that's often how facial recognition tools work: they don't give you exact matches, they give you, let's say, 10 images or 100 images, and each of those has a confidence level that the system believes is or is not accurate. So here I'm submitting two new images to this system, one of John Lewis and one of Elijah Cummings. Mixing these two up was a mistake a senator made recently, when John Lewis passed away. Now, the confidence level that these are not the same people is 19 percent: it's not sure. Let's try another example. This is a photo of Eminem on the left and, on the right, a photo of David Foster Wallace. The system is only 10 percent sure that these are not the same person. Interesting, right?
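[Editor's illustration.] As a rough sketch of how that confidence-ranked output works, here's a toy Python example. It has nothing to do with Microsoft's actual service: the face prints, the cosine-similarity score, and the names in the gallery are all invented. The point is the shape of the result: a ranked list of candidates, each with a confidence score, returned no matter what.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face prints, in [-1, 1]; 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(probe, gallery, top_k=3):
    # Return the top_k gallery entries ranked by a confidence score (as a
    # percentage). The system always returns its best guesses, even when
    # none of them is actually the person in the probe photo.
    scored = [(name, round(100 * cosine_similarity(probe, template), 1))
              for name, template in gallery.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

gallery = {  # hypothetical enrolled face prints keyed by ID-holder name
    "person_a": [0.9, 0.1, 0.4],
    "person_b": [0.2, 0.8, 0.5],
    "person_c": [0.6, 0.6, 0.1],
}
probe = [0.85, 0.15, 0.45]  # face print from the query photo

for name, confidence in search(probe, gallery):
    print(name, confidence)
```

Notice that `search` never answers "no one matches." A real investigative tool behaves the same way: even if the true person isn't enrolled at all, someone still comes back at the top of the ranked list with a score attached.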
That's just a quick demo of this tool and the direction a lot of these tools can go, beyond face recognition itself. And I just want to reiterate the title here: just because you can build facial recognition tools for less than $100 doesn't mean you should. In this experiment, The New York Times recorded video from three cameras and matched those videos to people they knew worked in the area, so that they could track whether or not people were in the area, and it worked. They spent less than $100 doing it; all they needed was access to the cameras. So one of the things that I'd like you to take away from this talk is that building facial recognition tools isn't that hard at this point. What's hard, and what's worth doing, is building tools that hack these systems or fight back against them. Building software and creating solutions that give people more privacy, not less, is what EFF is interested in, and hopefully what you're interested in as well. Here are some fun examples you might find interesting, even if you're not an experienced coder. This is from Adversarial Fashion: a Fourth Amendment ALPR-blocking hoodie, whose goal is to fill automated license plate reader databases with gibberish, confuse the tools used by police, and also, of course, raise awareness of automated license plate readers. Here's another example: a face mask that says "I do not consent to the search."
Now I'm going to quickly go through the Who Has Your Face website, so you can get a sense of what we did to make it possible for you to see who has your face in a government database. If you go to the website, you'll see a "Take the quiz" button. It's essentially a five-step quiz you can use to learn which US government agencies may have access to your photo for facial recognition purposes, and there's also a longer resources page that describes in detail the photo sharing we've discovered. So I'm just going to show you what happens as you move through the site. It asks whether you have a driver's license or a state photo ID, where you live, and whether you live in a city of more than 100,000 people. I'm choosing California, and yes, because that's true of me. I do have a passport or US visa. I have signed up for TSA PreCheck. I have never applied for a US government job that required a photo, but I clicked yes there just because it will give you some more information. And at the end you'll get a page like this that tells you whether your DMV shares its data with other departments, and whether other departments also have access to your image for facial recognition because you have a passport or a visa. So what's next? How do we stop these privacy invasions? We've actually been pretty successful at banning facial recognition in cities around the country: ten US cities have voted to ban it, including Boston and San Francisco. But we need to ban it federally, because as long as there are loopholes where individual agencies or departments across the country can submit photos, there will be privacy invasions using facial recognition. So right now you can go to act.eff.org to tell your senators to vote yes and co-sponsor the Facial Recognition and Biometric Technology Moratorium Act of 2020. We're closer than ever to banning this tool, and I think we can do it. Hopefully you'll be able to help.
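The quiz flow just described can be approximated as a simple rule table: each yes answer adds the agencies that gain access to your photo through that channel. This is a hypothetical reconstruction for illustration only, not EFF's actual code, and the agency lists are placeholders rather than documented sharing agreements.

```python
def agencies_with_your_face(has_drivers_license, has_passport_or_visa,
                            has_tsa_precheck, applied_federal_job):
    """Illustrative version of the Who Has Your Face quiz logic.

    The real site maps your answers, including your state of residence,
    to documented photo-sharing arrangements; the sets below are
    invented examples of what such a mapping might contain.
    """
    agencies = set()
    if has_drivers_license:
        agencies.update({"State DMV", "FBI (via DMV sharing agreements)"})
    if has_passport_or_visa:
        agencies.update({"State Department", "Customs and Border Protection"})
    if has_tsa_precheck:
        agencies.add("TSA")
    if applied_federal_job:
        agencies.add("Office of Personnel Management")
    return sorted(agencies)

# Answers matching the walkthrough: license, passport, PreCheck, no federal job.
print(agencies_with_your_face(True, True, True, False))
```

The point of the design is the same as the site's: each individually reasonable-sounding document you hold quietly enrolls you in another database.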
So please go to act.eff.org, and please also continue thinking of other ways to make good technology, cool technology, that protects privacy rather than invading it.
Welcome back. This is Who Has Your Face: The Fight Against US Government Agencies' Use of Facial Recognition, and we welcome Dr. Matthew Guariglia and Jason Kelley. Thank you so much for being here with us. We have some questions that have come in in the Matrix chat, and if you have questions and would like to hear the answers, please drop them in the livestream Q&A. So, a question from the audience: in China there are already established systems to recognize faces that have masks on. Is there a push here for that now, due to COVID and the protests?
Yes, yeah, absolutely, there is. I mean, I think already, as protests ramp up and as the first studies come out showing that COVID masks make facial recognition incredibly inaccurate, there is absolutely going to be a push, probably within DHS, if it isn't already happening, to develop or maybe purchase face recognition technology that is not foiled by masks.
Another question from the audience: are there updates on the efforts of groups like Oakland Privacy?
Yeah, so Oakland Privacy is a local group in the Oakland area that's focused on passing various legislation at the local level. They worked to get bans on face recognition and other community-control-of-police-surveillance measures passed in San Francisco, Berkeley, and Oakland. They're working in Emeryville now on a measure similar to the ones passed in Berkeley. Matthew, do you know of any others? I forget the other cities where measures are being pushed right now by the EFA, the Electronic Frontier Alliance, EFF's network of grassroots groups that work on things like this.
I mean, I think one of the huge developments in California was just in Santa Cruz, where both predictive policing and facial recognition were banned in one single ordinance.
So yeah, there's a lot going on locally; that's actually, I think, where a lot of the movement in the country is.
Another audience question: do you think the US could achieve something like a US GDPR with respect to facial recognition technology?
Yes, we actually have... so there are kind of two questions there. This talk is mostly about federal and government use of face surveillance, and there is a bill at the federal level, which Matthew knows quite a bit more about than I do, to ban government use of facial recognition, which would be a big start. But GDPR is mostly focused on commercial businesses, I believe, and there is a state-level law in Illinois called the Biometric Information Privacy Act, or BIPA, that bans commercial use of face recognition without consent, among a variety of other things. That law has actually been used to sue Facebook. A year or two ago, you know, if you logged into Facebook it would ask, is this you, or does this person look like this person, and using that on people in Illinois was illegal under BIPA. So we do have state-level laws, and we have this federal bill. Matthew, do you want to talk a little bit about the federal bill?
Sure. The federal bill, introduced by members of Congress like Jayapal, Markey, Merkley, and Tlaib, specifically targets the use of facial recognition in the federal government. What it would do is put a moratorium on the use of face surveillance at the federal level, be it the FBI, Customs and Border Protection, or DHS. The other thing it would do is interrupt some federal funding streams that help pay for face recognition at the local and state level. It has been introduced in the Senate and the House and is moving forward, and EFF has an action that makes it very easy for you to contact your elected officials and ask them to pass or co-sponsor that bill. The other big thing to consider, because I would love to see a push for a federal BIPA in a kind of GDPR style, is the California privacy law, which allows you, for instance, if you're a California resident, to request to see all the images that Clearview, a private company, has on you. I was able to do that, and then to request that they delete the ones that were uncovered. So there are some state provisions, like Illinois's and California's, that are slowly rising up and apply to private corporate entities. But it's not enough to wait for them to come one at a time, state by state; it seems like we really have to push for this type of legislation at the federal level as well.
Fantastic, we have more questions. Does face recognition data maintain a provenance of how the image was acquired, like date, time, and GPS?
Yeah, absolutely, at least where they know how the image was acquired. For example with Clearview, as Matt was mentioning, they scrape the data from places like Facebook, and the way they know who a person is is that they pull the name from that person's profile; if they didn't have that metadata, they actually couldn't identify the person. But yeah, in general I think that's a big part of it: provenance is recorded so that they hold exactly as much information as they possibly can. And especially in federal databases, I'm pretty sure that's generally required, you know, keeping that information so they can retrieve it and delete it if necessary.
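The provenance point can be made concrete with a small record type. This is a sketch under assumed field names, not the schema of any real system: the idea is just that without fields like a source URL, a scraper of the kind described could not tie an image back to an identity.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional, Tuple

@dataclass
class FaceRecord:
    """Hypothetical provenance record attached to a collected face image."""
    image_hash: str                       # digest identifying the image
    source_url: str                       # where the image was obtained
    acquired_at: datetime                 # when it was collected
    gps: Optional[Tuple[float, float]] = None  # lat/lon if present in EXIF
    subject_name: Optional[str] = None    # e.g. pulled from a source profile

record = FaceRecord(
    image_hash="sha256-placeholder",      # illustrative value only
    source_url="https://example.com/profile/123",
    acquired_at=datetime(2020, 7, 31, tzinfo=timezone.utc),
)
# With no profile metadata attached, the face stays unidentified.
print(record.subject_name is None)
```

The same structure is what makes deletion requests like the California one mentioned above workable at all: you can only retrieve and delete what you recorded the origin of.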
Okay, next question: is the argument that facial recognition is inaccurate kind of a tenuous one? Because if it doesn't work accurately it's dangerous, but if it does work accurately, it's still dangerous.
I absolutely think it is tenuous, and I think that, you know, for people committed to banning face recognition because the technology is innately violent and innately violates civil liberties, as the technology progresses we might want to consider pivoting our arguments away from just talking about inaccuracy. Inaccuracy, as I argued in the talk, is very important and has dire consequences for people in terms of being misidentified, but that's not the only reason the technology should be banned. Even if we woke up tomorrow and the technology were perfect, it would still be a danger to civil liberties. So yeah, I definitely encourage people to make both arguments, but also not to rest an entire political or legislative strategy only on the fact that it's inaccurate.
Thank you. Okay, another question: could I put chaff on my car in the form of custom bumper stickers that just happen to be the dimensions of license plates from other states, and put them around my real one? A human would still know which is the real one, but wouldn't an ALPR be confused?
That's a very good question. I'll start by saying I don't know; that might not be legal.
Yeah, that sounds like it might be illegal.
But that's a good question, and I don't know how fooled they would be. I mean, there are those Adversarial Fashion sweatshirts that have license plates on them, which theoretically can confuse ALPRs, but I don't know their effectiveness. So that's a good question; I don't know.
It looks like that might be the last question we have time for. So thanks, everyone, for tuning in. We really appreciate it, and big ups for the good questions. Obviously, if you don't follow EFF, please do. Please go to eff.org and sign up for EFFector, our newsletter. You can also follow us on Twitter at @EFF, and we have an Instagram and Facebook and so on, so please stay tuned to those channels. Also, there's a talk coming up right after this about Clearview, so you might want to stick around for that.
Fantastic. Thank you so much, Dr. Matthew Guariglia and Jason Kelley. We appreciate you sticking around for more HOPE 2020. Back to you, ground control.