Author: Beryl Lipton

  • Axon Tests Face Recognition on Body-Worn Cameras


    Axon Enterprise Inc. is working with a Canadian police department to test the addition of face recognition technology (FRT) to its body-worn cameras (BWCs). This is an alarming development in government surveillance that should put communities everywhere on alert. 

As many as 50 officers from the Edmonton Police Department (EPD) will begin using these FRT-enabled BWCs today as part of a proof-of-concept experiment. EPD is the first police department in the world to use these Axon devices, according to a report from the Edmonton Journal.

This kind of technology could give officers instant identification of any person who crosses their path. During the current trial period, the Edmonton officers will not be notified in the field of an individual’s identity but will review identifications generated by the BWCs later on.

“This Proof of Concept will test the technology’s ability to work with our database to make officers aware of individuals with safety flags and cautions from previous interactions,” as well as “individuals who have outstanding warrants for serious crime,” the Edmonton Police said in a press release, suggesting that individuals will be placed on a watchlist of sorts.

    FRT brings a rash of problems. It relies on extensive surveillance and collecting images on individuals, law-abiding or otherwise. Misidentifications can cause horrendous consequences for individuals, including prolonged and difficult fights for innocence and unfair incarceration for crimes never committed. In a world where police are using real-time face recognition, law-abiding individuals or those participating in legal, protected activity that police may find objectionable — like protest — could be quickly identified. 

    With the increasing connections being made between disparate data sources about nearly every person, BWCs enabled with FRT can easily connect a person minding their own business, who happens to come within view of a police officer, with a whole slew of other personal information. 

    Axon had previously claimed it would pause the addition of face recognition to its tools due to concerns raised in 2019 by the company’s AI and Policing Technology Ethics Board. However, since then, the company has continued to research and consider the addition of FRT to its products. 

This BWC-FRT integration signals that other FRT integrations may follow. Axon is building an entire arsenal of cameras and surveillance devices for law enforcement, and the company grows the reach of its police surveillance apparatus, in part, by leveraging relationships with its thousands of customers, including those using its flagship product, the Taser. This so-called “ecosystem” of surveillance technology includes the Fusus system, a platform for connecting surveillance cameras to facilitate real-time viewing of video footage. It also involves expanding the use of surveillance tools like BWCs and the flying cameras of “drone as first responder” (DFR) programs.

    Face recognition undermines individual privacy, and it is too dangerous when deployed by police. Communities everywhere must move to protect themselves and safeguard their civil liberties, insisting on transparency, clear policies, public accountability, and audit mechanisms. Ideally, communities should ban police use of the technology altogether. At a minimum, police must not add FRT to BWCs.


  • Washington Court Rules That Data Captured on Flock Safety Cameras Are Public Records


    A Washington state trial court has shot down local municipalities’ effort to keep automated license plate reader (ALPR) data secret.

    The Skagit County Superior Court in Washington rejected the attempt to block the public’s right to access data gathered by Flock Safety cameras, protecting access to information under the Washington Public Records Act (PRA). Importantly, the ruling from the court makes it clear that this access is protected even when a Washington city uses Flock Safety, a third-party vendor, to conduct surveillance and store personal data on behalf of a government agency.

    “The Flock images generated by the Flock cameras…are public records,” the court wrote in its ruling. “Flock camera images are created and used to further a governmental purpose. The Flock images created by the cameras located in Stanwood and Sedro-Woolley were paid for by Stanwood and Sedro Wooley [sic] and were generated for the benefit of Stanwood and Sedro-Woolley.”

    The cities’ move to exempt the records from disclosure was a dangerous attempt to deny transparency and reflects another problem with the massive amount of data that police departments collect through Flock cameras and store on Flock servers: the wiggle room cities seek when public data is hosted on a private company’s server.

    Flock Safety’s main product is ALPRs, camera systems installed throughout communities to track all drivers all the time. Privacy activists and journalists across the country recently have used public records requests to obtain data from the system, revealing a variety of controversial uses. This has included agencies accessing data for immigration enforcement and to investigate an abortion, the latter of which may have violated Washington law. A recent report from the University of Washington found that some cities in the state are also sharing the ALPR data from their Flock Safety systems with federal immigration agents. 

In this case, a member of the public in April filed a records request with a Flock customer, the City of Stanwood, for all footage recorded during a one-hour period in March. Shortly afterward, Stanwood and another Flock user, the City of Sedro-Woolley, requested that the local court rule that this data is not a public record, asserting that “data generated by Flock [automated license plate reader cameras (ALPRs)] and stored in the Flock cloud system are not public records unless and until a public agency extracts and downloads that data.”

    If a government agency is conducting mass surveillance, EFF supports individuals’ access to data collected specifically on them, at the very least. And to address legitimate privacy concerns, governments can and should redact personal information in these records while still disclosing information about how the systems work and the data that they capture. 

    This isn’t what these Washington cities offered, though. They tried a few different arguments against releasing any information at all. 

The contract between the City of Sedro-Woolley and Flock Safety clearly states that “As between Flock and Customer, all right, title and interest in the Customer Data, belong to and are retained solely by Customer,” and “Customer Data” is defined as “the data, media, and content provided by Customer through the Services. For the avoidance of doubt, the Customer Data will include the Footage.” Other Flock-using police departments across the country have also relied on similar contract language to insist that footage captured by Flock cameras belongs to the jurisdiction in question.

    The contract language notwithstanding, officials in Washington attempted to restrict public access by claiming that video footage stored on Flock’s servers and requests for that information would constitute the generation of a new record. This part of the argument claimed that any information that was gathered but not otherwise accessed by law enforcement, including thousands of images taken every day by the agency’s 14 Flock ALPR cameras, had nothing to do with government business, would generate a new record, and should not be subject to records requests. The cities shut off their Flock cameras while the litigation was ongoing.

    If the court had ruled in favor of the cities’ claim, police could move to store all their data — from their surveillance equipment and otherwise — on private company servers and claim that it’s no longer accessible to the public. 

The cities threw another reason for withholding information at the wall to see if it would stick, claiming that even if the court found that data collected on Flock cameras are in fact public records, the cities should still be able to block the release of the requested one hour of footage either because all of the images captured by Flock cameras are sensitive investigation material or because they should be treated the same way as automated traffic safety cameras.

    EFF is particularly opposed to this line of reasoning. In 2017, the California Supreme Court sided with EFF and ACLU in a case arguing that “the license plate data of millions of law-abiding drivers, collected indiscriminately by police across the state, are not ‘investigative records’ that law enforcement can keep secret.” 

    Notably, when Stanwood Police Chief Jason Toner made his pitch to the City Council to procure the Flock cameras in April 2024, he was adamant that the ALPRs would not be the same as traffic cameras. “Flock Safety Cameras are not ‘red light’ traffic cameras nor are they facial recognition cameras,” Chief Toner wrote at the time, adding that the system would be a “force multiplier” for the department. 

If the court had gone along with this part of the argument, cities could have claimed that the mass surveillance conducted using ALPRs is part of undefined mass investigations, shielding from the public the huge amounts of information being gathered without warrants or reason.

    The cities seemed to be setting up contradictory arguments. Maybe the footage captured by the cities’ Flock cameras belongs to the city — or maybe it doesn’t until the city accesses it. Maybe the data collected by the cities’ taxpayer-funded cameras are unrelated to government business and should be inaccessible to the public — or maybe it’s all related to government business and, specifically, to sensitive investigations, presumably of every single vehicle that goes by the cameras. 

    The requester, Jose Rodriguez, still won’t be getting his records, despite the court’s positive ruling. 

    “The cities both allowed the records to be automatically deleted after I submitted my records requests and while they decided to have their legal council review my request. So they no longer have the records and can not provide them to me even though they were declared to be public records,” Rodriguez told 404 Media — another possible violation of that state’s public records laws. 

    Flock Safety and its ALPR system have come under increased scrutiny in the last few months, as the public has become aware of illegal and widespread sharing of information. 

    The system was used by the Johnson County Sheriff’s Office to track someone across the country who’d self-administered an abortion in Texas. Flock repeatedly claimed that this was inaccurate reporting, but materials recently obtained by EFF have affirmed that Johnson County was investigating that individual as part of a fetal death investigation, conducted at the request of her former abusive partner. They were not looking for her as part of a missing person search, as Flock said. 

    In Illinois, the Secretary of State conducted an audit of Flock use within the state and found that the Flock Safety system was facilitating Customs and Border Protection access, in violation of state law. And in California, the Attorney General recently sued the City of El Cajon for using Flock to illegally share information across state lines.

    Police departments are increasingly relying on third-party vendors for surveillance equipment and storage for the terabytes of information they’re gathering. Refusing the public access to this information undermines public records laws and the assurances the public has received when police departments set these powerful spying tools loose in their streets. While it’s great that these records remain public in Washington, communities around the country must be swift to reject similar attempts at blocking public access.


  • That Drone in the Sky Could Be Tracking Your Car


    Police are using their drones as flying automated license plate readers (ALPRs), airborne police cameras that make it easier than ever for law enforcement to follow you. 

“The Flock Safety drone, specifically, are flying LPR cameras as well,” Rahul Sidhu, Vice President of Aviation at Flock Safety, recently told a group of potential law enforcement customers interested in drone-as-first-responder (DFR) programs.

    The integration of Flock Safety’s flagship ALPR technology with its Aerodome drone equipment is a police surveillance combo poised to elevate the privacy threats to civilians caused by both of these invasive technologies as drone adoption expands. 

    A slide from a Flock Safety presentation to Rutherford County Sheriff’s Office in North Carolina, obtained via public records, featuring Flock Safety products, including the Aerodome drone and the Wing product, which helps convert surveillance cameras into ALPR systems

    The use of DFR programs has grown exponentially. The biggest police technology companies, like Axon, Flock Safety, and Motorola Solutions, are broadening their drone offerings, anticipating that drones could become an important piece of their revenue stream. 

Communities must demand restrictions on how local police use drones and ALPRs individually, to say nothing of a dangerous hybrid of the two. Otherwise, we can soon expect that a drone will fly to any call for service and capture sensitive location information about every car in its flight path, adding more ALPR data to the already too large databases of our movements.

ALPR systems typically rely on cameras that have been fixed along roadways or attached to police vehicles. These cameras capture the image of a vehicle, then use artificial intelligence technology to log the license plate, make, model, color, and other unique identifying information, like dents and bumper stickers. This information is usually stored on the manufacturer’s servers and often made available on nationwide sharing networks to police departments from other states and federal agencies, including Immigration and Customs Enforcement. ALPRs are already used by most of the largest police departments in the country, and Flock Safety now also offers the ability for an agency to turn almost any internet-enabled camera into an ALPR camera.

ALPRs present a host of problems. ALPR systems vacuum up data—like the make, model, color, and location of vehicles—on people who will never be involved in a crime, and police use them to grid areas, systematically recording when and where vehicles have been. ALPRs routinely make mistakes, causing police to stop the wrong car and terrorize the driver. Officers have abused law enforcement databases in hundreds of cases. Police have used them to track people across state lines as they seek legal health procedures. Even when there are laws against sharing data from these tools with other departments, some policing agencies still do.

Drones, meanwhile, give police a view of roofs, backyards, and other fenced areas where cops can’t casually patrol, and their adoption is becoming more common. Companies that sell drones have been helping law enforcement agencies get certifications from the Federal Aviation Administration (FAA), and recently implemented changes to the restrictions on flying drones beyond the visual line of sight will make it even easier for police to add this equipment. According to the FAA, since a new DFR waiver process was implemented in May 2025, the agency has granted more than 410 such waivers, already accounting for almost a third of the approximately 1,400 DFR waivers that have been granted since such programs began in 2018.

Local officials should, of course, be informed that the drones they’re buying are equipped to do such granular surveillance from the sky, but it is not clear that this is happening. While the ALPR feature is available as part of Flock drone acquisitions, some government customers may not realize that approving a drone from Flock Safety may also mean approving a flying ALPR. And though not every Flock Safety drone is currently running the ALPR feature, some departments, like the Redondo Beach Police Department, have plans to activate it in the near future.

    ALPRs aren’t the only so-called payloads that can be added to a drone. In addition to the high resolution and thermal cameras with which drones can already be equipped, drone manufacturers and police departments have discussed adding cell-site simulators, weapons, microphones, and other equipment. Communities must mobilize now to keep this runaway surveillance technology under tight control.

    When EFF posed questions to Flock Safety about the integration of ALPR and its drones, the company declined to comment.

Mapping, storing, and tracking as much personal information as possible, all without warrants, is where automated police surveillance is heading right now. Flock has previously described its desire to connect ALPR scans to additional information on the person who owns the car, meaning that we don’t live far from a time when police may see your vehicle drive by and quickly learn that it’s your car, along with a host of other details about you.

    EFF has compiled a list of known drone-using police departments. Find out about your town’s surveillance tools at the Atlas of Surveillance. Know something we don’t? Reach out at [email protected].


  • Beware the Bundle: Companies Are Banking on Becoming Your Police Department’s Favorite “Public Safety Technology” Vendor


    When your local police department buys one piece of surveillance equipment, you can easily expect that the company that sold it will try to upsell them on additional tools and upgrades. 

    Axon has been adding AI to its repertoire, and it now features a whole “AI Era” bundle plan. One recent offering is Draft One, which connects to Axon’s body-worn cameras (BWCs) and uses AI to generate police reports based on the audio captured in the BWC footage. While use of the tool may start off as a free trial, Axon sees Draft One as another key product for capturing new customers, despite widespread skepticism of the accuracy of the reports, the inability to determine which reports have been drafted using the system, and the liability they could bring to prosecutions.

    In 2024, Axon acquired a company called Fusus, a platform that combines the growing stores of data that police departments collect—notifications from gunshot detection and automated license plate reader (ALPR) systems; footage from BWCs, drones, public cameras, and sometimes private cameras; and dispatch information—to create “real-time crime centers.” The company now claims that Fusus is being used by more than 250 different policing agencies.

    Fusus claims to bring the power of the real-time crime center to police departments of all sizes, which includes the ability to help police access and use live footage from both public and private cameras through an add-on service that requires a recurring subscription. It also claims to integrate nicely with surveillance tools from other providers. Recently, it has been cutting ties, most notably with Flock Safety, as it starts to envelop some of the options its frenemies had offered.

    In the middle of April, Axon announced that it would begin offering fixed ALPR, a key feature of the Flock Safety catalogue, and an AI Assistant, which has been a core offering of Truleo, another Axon competitor.

    Flock Safety’s Bundles and FlockOS

    Flock Safety is another major police technology company that has expanded its focus from one primary technology to a whole package of equipment and software services. 

    Flock Safety started with ALPRs. These tools use a camera to read vehicle license plates, collecting the make, model, location, and other details which can be used for what Flock calls “Vehicle Fingerprinting.” The details are stored in a database that sometimes finds a match among a “hot list” provided by police officers, but otherwise just stores and shares data on how, where, and when everyone is driving and parking their vehicles. 

    Founded in 2017, Flock Safety has been working to expand its camera-based offerings, and it now claims to have a presence in more than 5,000 jurisdictions around the country, including through law enforcement and neighborhood association customers. 

    Among its tools are now the drone-as-first-responder system, gunshot detection, and a software platform meant to combine all of them. Flock also sells an option for businesses to use ALPRs to “optimize” marketing efforts and for analyzing traffic patterns to segment their patrons. Flock Safety offers the ability to integrate private camera systems as well.

    Much of what Flock Safety does now comes together in their FlockOS system, which claims to bring together various surveillance feeds and facilitate real-time “situational awareness.”

    Flock is optimistic about its future, recently opening a massive new manufacturing facility in Georgia.

    Motorola Solutions’ “Ecosystem”

    When you think of Motorola, you may think of phones—but there’s a good chance that you missed the moment in 2011 when the phone side of the company, Motorola Mobility, split off from Motorola Solutions, which is now a big player in police surveillance.

    On its website, Motorola Solutions claims that departments are better off using a whole list of equipment from the same ecosystem, boasting the tagline, “Technology that’s exponentially more powerful, together.” Motorola describes this as an “ecosystem of safety and security technologies” in its securities filings. In 2024, the company also reported $2 billion in sales, but unlike Axon, its customer base is not exclusively law enforcement and includes private entities like sports stadiums, schools, and hospitals.

    Motorola’s technology includes 911 services, radio, BWCs, in-car cameras, ALPRs, drones, face recognition, crime mapping, and software that supposedly unifies it all. Notably, video can also come with artificial intelligence analysis, in some cases allowing law enforcement to search video and track individuals across cameras.

    In January 2019, Motorola Solutions acquired Vigilant Solutions, one of the big players in the ALPR market, as part of its takeover of Vaas International Holdings. Now the company (under the subsidiary DRN Data) claims to have billions of scans saved from police departments and private ALPR cameras around the country. Marketing language for its Vehicle Manager system highlights that “data is overwhelming,” because the amount of data being collected is “a lot.” It’s a similar claim made by other companies: Now that you’ve bought so many surveillance tools to collect so much data, you’re finding that it is too much data, so you now need more surveillance tools to organize and make sense of it.

    SoundThinking’s ‘SafetySmart Platform’

    SoundThinking began as ShotSpotter, a so-called gunshot detection tool that uses microphones placed around a city to identify and locate sounds of gunshots. As news reports of the tool’s inaccuracy and criticisms have grown, the company has rebranded as SoundThinking, adding to its offerings ALPRs, case management, and weapons detection. The company is now marketing its SafetySmart platform, which claims to integrate different stores of data and apply AI analytics.

    In 2024, SoundThinking laid out its whole scheme in its annual report, referring to it as the “cross-sell” component of their sales strategy. 

    The “cross-sell” component of our strategy is designed to leverage our established relationships and understanding of the customer environs by introducing other capabilities on the SafetySmart platform that can solve other customer challenges. We are in the early stages of the upsell/cross-sell strategy, but it is promising – particularly around bundled sales such as ShotSpotter + ResourceRouter and CaseBuilder +CrimeTracer. Newport News, VA, Rocky Mount, NC, Reno, NV and others have embraced this strategy and recognized the value of utilizing multiple SafetySmart products to manage the entire life cycle of gun crime…. We will seek to drive more of this sales activity as it not only enhances our system’s effectiveness but also deepens our penetration within existing customer relationships and is a proof point that our solutions are essential for creating comprehensive public safety outcomes. Importantly, this strategy also increases the average revenue per customer and makes our customer relationships even stickier.

    Many of SoundThinking’s new tools rely on a push toward “data integration” and artificial intelligence. ALPRs can be integrated with ShotSpotter. ShotSpotter can be integrated with the CaseBuilder records management system, and CaseBuilder can be integrated with CrimeTracer. CrimeTracer, once known as COPLINK X, is a platform that SoundThinking describes as a “powerful law enforcement search engine and information platform that enables law enforcement to search data from agencies across the U.S.” EFF tracks this type of tool in the Atlas of Surveillance as a third-party investigative platform: software tools that combine open-source intelligence data, police records, and other data sources, including even those found on the dark web, to generate leads or conduct analyses. 

SoundThinking, like a lot of surveillance, can be costly for departments, but the company seems to see the value in fostering its existing police department relationships even if they’re not getting paid right now. In Baton Rouge, budget cuts recently resulted in the elimination of the $400,000 annual contract for ShotSpotter, but the city continues to use it.

    “They have agreed to continue that service without accepting any money from us for now, while we look for possible other funding sources. It was a decision that it’s extremely expensive and kind of cost-prohibitive to move the sensors to other parts of the city,” Baton Rouge Police Department Chief Thomas Morse told a local news outlet, WBRZ.

    Beware the Bundle

    Government surveillance is big business. The companies that provide surveillance and police data tools know that it’s lucrative to cultivate police departments as loyal customers. They’re jockeying for monopolization of the state surveillance market that they’re helping to build. While they may be marketing public safety in their pitches for products, from ALPRs to records management to investigatory analysis to AI everything, these companies are mostly beholden to their shareholders and bottom lines. 

    The next time you come across BWCs or another piece of tech on your city council’s agenda or police department’s budget, take a closer look to see what other strings and surveillance tools might be attached. You are not just looking at one line item on the sheet—it’s probably an ongoing subscription to a whole package of equipment designed to challenge your privacy, and no sort of discount makes that a price worth paying.

To learn more about what surveillance tools your local agencies are using, take a look at EFF’s Atlas of Surveillance and our Street-Level Surveillance Hub.


  • “Guardrails” Won’t Protect Nashville Residents From AI-Enabled Camera Networks


    Nashville’s Metropolitan Council is one vote away from passing an ordinance that’s being branded as “guardrails” against the privacy problems that come with giving the police a connected camera system like Axon’s Fusus. But Nashville locals are right to be skeptical of just how much protection from mass surveillance products they can expect.  

    “I am against these guardrails,” council member Ginny Welsch told the Tennessean recently. “I think they’re kind of a farce. I don’t think there can be any guardrail when we are giving up our privacy and putting in a surveillance system.” 

    Likewise, Electronic Frontier Alliance member Lucy Parsons Labs has inveighed against Fusus and the supposed guardrails as a fix to legislators’ and residents’ concerns in a letter to the Metropolitan Council. 

    While the ordinance doesn’t name the company specifically, it was introduced in response to privacy concerns over the city’s possible contract for Fusus, an Axon system that facilitates access to live camera footage for police and helps funnel such feeds into real-time crime centers. In particular, local opponents are concerned about data-sharing—a critical part of Fusus—that could impede the city’s ability to uphold its values against the criminalization of some residents, like undocumented immigrants and people seeking reproductive or gender-affirming care.

    This technology product, which was acquired by the police surveillance giant Axon in 2024, facilitates two major functions for police:

• With the click of a button—or the tap of an icon on a map—officers can get access to live camera footage from public and private cameras, including the police’s Axon body-worn cameras, that have been integrated into the Fusus network.
    • Data feeds from a variety of surveillance tools—like body-worn cameras, drones, gunshot detection, and the connected camera network—can be aggregated into a system that makes those streams quickly accessible and susceptible to further analysis by features marketed as “artificial intelligence.”

    From 2022 through 2023, Metropolitan Nashville Police Department (MNPD) had, unbeknownst to the public, already been using Fusus. When the contract came back under consideration, a public outcry and unanswered questions about the system led to its suspension, and the issue was deferred multiple times before the contract renewal was voted down late last year. Nashville council members determined that the Fusus system posed too great a threat to vulnerable groups that the council has sought to protect with city policies and resolutions, including pregnant residents, immigrants, and residents seeking gender-affirming care, among others. The state has criminalized some of the populations that the city of Nashville has passed ordinances to protect. 

Unfortunately, the fight against the sprawling surveillance of Fusus continues. The city council is now making its final consideration of the aforementioned ordinance, which some of its members say will protect city residents in the event that the mayor and other Fusus fans are able to get a contract signed after all.

    These so-called guardrails include:

    • restricting the MNPD from accessing private cameras or installing public safety cameras in locations “where there is a reasonable expectation of privacy”; 
    • prohibiting using face recognition to identify individuals in the connected camera system’s footage; 
    • policies addressing authorized access to and use of the connected camera system, including how officers will be trained, and how they will be disciplined for any violations of the policy;
    • quarterly audits of access to the connected camera system; 
    • mandatory inclusion of a clause in procurement contracts allowing for immediate termination should violations of the ordinance be identified; 
    • mandatory reporting to the mayor and the council about any violations of the ordinance, the policies, or other abuse of access to the camera network within seven days of the discovery. 

    Here’s the thing: even if these limited “guardrails” are in place, the only true protection from the improper use of the AI-enabled Fusus system is to not use it at all. 

    We’ve seen that when law enforcement has access to cameras, they will use them, even when clear regulations prohibit those uses.

    Firms such as Fusus and its parent company Axon are pushing AI-driven features, and databases with interjurisdictional access. Surveillance technology is bending toward a future where all of our data are being captured, including our movements by street cameras (like those that would be added to Fusus), our driving patterns by ALPR, our living habits by apps, and our actions online by web trackers, and then being combined, sold, and shared.

    When Nashville first started its relationship with Fusus in 2022, the company featured only a few products, primarily focused on standardizing video feeds from different camera providers. 

    Now, Fusus is aggressively leaning into artificial intelligence, claiming that its “AI on the Edge” feature is built into the initial capture phase and processes video as soon as it is taken. Even if the city bans use of face recognition for the connected camera system, Fusus boasts that its system can detect humans and objects, combine other characteristics to identify individuals, detect movements, and set notifications based on certain characteristics and behaviors. Marketing material claims that the system comes “pre-loaded with dozens of search and analysis variables and profiles that are ready for action,” including a “robust & growing AI library.” It’s unclear how these AI recognition options are generated or how they are vetted, if at all, or whether they can even be removed as the ordinance would require.

    A page from Fusus marketing materials, released through a public records request, featuring information on the artificial intelligence capabilities of its system

    The proposed “guardrails” in Nashville are insufficient to address the danger posed by mass surveillance systems, and the city of Nashville shouldn’t think it has protected its residents, tourists, and other visitors by passing them. Nashville residents and other advocacy groups have already raised concerns.

    The only true way to protect Nashville’s residents against dragnet surveillance and overcriminalization is to block access to these invasive technologies altogether. Though this ordinance has passed its second reading, Nashville should not adopt Fusus or any other connected camera system, regardless of whether the ordinance is ultimately adopted. If Councilors care about protecting their constituents, they should hold the line against Fusus. 

    [ad_2]

    Source link

  • California Law Enforcement Misused State Databases More Than 7,000 Times in 2023

    California Law Enforcement Misused State Databases More Than 7,000 Times in 2023

    [ad_1]

    The Los Angeles County Sheriff’s Department (LACSD) committed wholesale abuse of sensitive criminal justice databases in 2023, violating a specific rule against searching the data to run background checks for concealed carry firearm permits.

    The sheriff’s department’s 6,789 abuses made up a majority of the record 7,275 violations across California that were reported to the state Department of Justice (CADOJ) in 2023 regarding the California Law Enforcement Telecommunications System (CLETS). 

    Records obtained by EFF also included numerous cases of other forms of database abuse in 2023, such as police allegedly using data for personal vendettas. While many violations resulted only in officers or other staff being retrained in appropriate use of the database, departments across the state reported that violations in 2023 led to 24 officers being suspended, six officers resigning, and nine being fired.

    CLETS contains a lot of sensitive information and is meant to provide officers in California with access to a variety of databases, including records from the Department of Motor Vehicles, the National Law Enforcement Telecommunications System, Criminal Justice Information Services, and the National Crime Information Center. Law enforcement agencies with access to CLETS are required to inform the state Justice Department of any investigations and discipline related to misuse of the system. This mandatory reporting helps to provide oversight and transparency around how local agencies are using and abusing their access to the array of databases. 

    A slide from a Long Beach Police Department training for new recruits.

    Misuse can take many forms, ranging from sharing passwords to using the system to look up romantic partners or celebrities. In 2019, CADOJ declared that using CLETS data for “immigration enforcement” is considered misuse under the California Values Act.  

    EFF periodically files California Public Records Act requests for the data and records generated by these CLETS misuse disclosures. To help improve access to this data, EFF’s investigations team has compiled and compressed that information from the years 2019–2023 for public download. Researchers and journalists can look up each agency’s data year by year. 

    Download the 2019-2023 data here. Data from previous years is available here: 2010-2014, 2015, 2016, 2017, 2018.  
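    For researchers and journalists working with the compiled disclosures, the typical task is to filter the per-agency rows by year and tally violations. Here is a minimal sketch of that workflow, using hypothetical column names (`agency`, `year`, `violations`) and illustrative sample rows; the actual headers and layout of EFF’s released files may differ:

    ```python
    import csv
    import io

    # Illustrative sample mimicking the shape of the compiled CLETS
    # misuse data; column names and rows here are hypothetical.
    sample = """agency,year,violations
    Los Angeles County Sheriff's Department,2023,6789
    Pomona Police Department,2023,300
    Los Angeles County Sheriff's Department,2022,12
    """

    def violations_by_agency(csv_text, year):
        """Sum reported violations per agency for a given year."""
        totals = {}
        for row in csv.DictReader(io.StringIO(csv_text)):
            if int(row["year"]) == year:
                agency = row["agency"].strip()
                totals[agency] = totals.get(agency, 0) + int(row["violations"])
        return totals

    print(violations_by_agency(sample, 2023))
    ```

    The same filter-and-tally approach works year-to-year across the 2019–2023 files, making it easy to spot agencies whose violation counts spike in a given reporting period.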

    California agencies are required to report misuse of CLETS to CADOJ by February 1 of the following year, which means numbers for 2024 are due to the state agency at the end of this month. However, it often takes the state several more months to follow up with agencies that do not respond and to enter information from the individual forms into a database. 

    Across California between 2019 and 2023, there have been:

    • 761 investigations of CLETS misuse, resulting in findings of at least 7,635 individual violations of the system’s rules
    • 55 officer suspensions, 50 resignations, and 42 firings related to CLETS misuse
    • six misdemeanor convictions and one felony conviction related to CLETS misuse

    As we reviewed the data made public since 2019, a few standout situations were worth additional reporting. For example, LACSD in 2023 conducted one investigation into CLETS misuse that substantiated thousands of misuse claims. The Riverside County Sheriff’s Office and Pomona Police Department also found hundreds of CLETS access violations the same year. 

    Some of the highest profile cases include: 

    • LACSD’s use of criminal justice data for concealed carry permit research, which is specifically forbidden by CLETS rules. According to meeting notes of the CLETS oversight body, LACSD retrained all staff and implemented new processes. However, state Justice Department officials acknowledged that this problem was not unique, and they had documented other agencies abusing the data in the same way.
    • A Redding Police Department officer in 2021 was charged with six misdemeanors after being accused of accessing CLETS to set up a traffic stop for his fiancée’s ex-husband, resulting in the man’s car being towed and impounded, the local outlet A News Cafe reported. Court records show the officer was fired, but he was ultimately acquitted by a jury in the criminal case. He now works for a different police department 30 miles away.
    • The Folsom Police Department in 2021 fired an officer who was accused of sending racist texts and engaging in sexual misconduct, as well as abusing CLETS. However, the Sacramento County District Attorney told a local TV station it declined to file charges, citing insufficient evidence.
    • A Madera Police Officer in 2021 resigned and pleaded guilty to accessing CLETS and providing that information to an unauthorized person. He received a one-year suspended sentence and 100 hours of community service, according to court records. In a statement, the police department said the individual’s “behavior was absolutely inappropriate” and “his actions tarnish the nobility of our profession.”
    • A California Highway Patrol officer was charged with improperly accessing CLETS to investigate vehicles his friend was interested in purchasing as part of his automotive business. 

    The San Francisco Police Department, which failed to report its CLETS misuse numbers for 2023, may be reporting at least one violation from the past year, according to a May 2024 report of sustained complaints, which lists one substantiated violation involving “Computer/CAD/CLETS Misuse.” 

    CLETS is only one of many massive databases available to law enforcement, but it is one of the very few with a mandatory reporting requirement for abuse; violations of other systems likely never get reported to a state oversight body, or at all. The sheer amount of misuse should serve as a warning that other systems police use, such as automated license plate reader and face recognition databases, are likely also being abused at a high rate, or even higher, since they are not subject to the same scrutiny as CLETS.

    [ad_2]

    Source link

  • The Atlas of Surveillance Expands Its Data on Police Surveillance Technology: 2024 Year in Review

    The Atlas of Surveillance Expands Its Data on Police Surveillance Technology: 2024 Year in Review

    [ad_1]

    EFF’s Atlas of Surveillance is one of the most useful resources for those who want to understand the use of police surveillance by local law enforcement agencies across the United States. This year, as the police surveillance industry has shifted, expanded, and doubled down on its efforts to win new cop customers, our team has been busily adding new spyware and equipment to this database. We also saw many great uses of the Atlas from journalists, students, and researchers, as well as a growing number of contributors. The Atlas of Surveillance currently captures more than 11,700 deployments of surveillance tech and remains the most comprehensive database of its kind. To learn more about each of the technologies, please check out our Street-Level Surveillance Hub, an updated and expanded version of which was released at the beginning of 2024.

    Removing Amazon Ring

    We started off with a big change: the removal of our set of Amazon Ring relationships with local police. In January, Amazon announced that it would no longer facilitate warrantless requests for doorbell camera footage through the company’s Neighbors app, a move EFF and other organizations had been calling for for years. Though police can still get access to Ring camera footage by getting a warrant or through other legal means, we decided that tracking Ring relationships in the Atlas no longer served its purpose, so we removed that set of information. People should keep in mind that law enforcement can still connect to individual Ring cameras directly through access facilitated by Fusus and other platforms. 

    Adding third-party platforms

    In 2024, we added an important growing category of police technology: the third-party investigative platform (TPIP). This is a designation we created for the growing group of software platforms that pull data from other sources and share it with law enforcement, facilitating analysis of police and other data via artificial intelligence and other tools. Common examples include LexisNexis Accurint and Thomson Reuters Clear. 

    New Fusus data

    404 Media released a report last January on the use of Fusus, an Axon system that facilitates access to live camera footage for police and helps funnel such feeds into real-time crime centers. Their investigation revealed that more than 200,000 cameras across the country are part of the Fusus system, and we were able to add dozens of new entries into the Atlas.

    New and updated ALPR data 

    EFF has been investigating the use of automated license plate readers (ALPRs) across California for years, and we’ve filed hundreds of California Public Records Act requests with departments around the state as part of our Data Driven project. This year, we were able to update all of our entries in California related to ALPR data. 

    In addition, we were able to add more than 300 new law enforcement agencies nationwide using Flock Safety ALPRs, thanks to a data journalism scraping project from the Raleigh News & Observer. 

    Redoing drone data

    This year, we reviewed and cleaned up a lot of the data we had on the police use of drones (also known as unmanned aerial vehicles, or UAVs). A chunk of our data on drones was based on research done by the Center for the Study of the Drone at Bard College, which became inactive in 2020, so we reviewed and updated any entries that depended on that resource. 

    We also added new drone data from Illinois, Minnesota, and Texas. 

    We’ve been watching Drone as First Responder programs since their inception in Chula Vista, CA, and this year we saw vendors like Axon, Skydio, and Brinc make a big push for more police departments to adopt these programs. We updated the Atlas to contain cities where we know such programs have been deployed. 

    Other cool uses of the Atlas

    The Atlas of Surveillance is designed for use by journalists, academics, activists, and policymakers, and this was another year where people made great use of the data. 

    The Atlas of Surveillance is regularly featured in news outlets throughout the country, including in the MIT Technology Review reporting on drones, and news from the Auburn Reporter about ALPR use in Washington. It also became the focus of podcasts and is featured in the book “Resisting Data Colonialism – A Practical Intervention.”

    Educators and students around the world cited the Atlas of Surveillance as an important source in their research. One of our favorite projects was from a senior at Northwestern University, who used the data to make a compelling visualization of the surveillance technologies in use. At a January 2024 conference at the IT University of Copenhagen, Bjarke Friborg of the project Critical Understanding of Predictive Policing (CUPP) featured the Atlas of Surveillance in his presentation, “Engaging Civil Society.” The Atlas was also cited in multiple academic papers, including the Annual Review of Criminology, and in a forthcoming paper from Professor Andrew Guthrie Ferguson at American University Washington College of Law titled “Video Analytics and Fourth Amendment Vision.” 


    Thanks to our volunteers

    The Atlas of Surveillance would not be possible without our partners at the University of Nevada, Reno’s Reynolds School of Journalism, where hundreds of students each semester collect data that we add to the Atlas. This year we also worked with students at California State University Channel Islands and Harvard University.

    The Atlas of Surveillance will continue to track the growth of surveillance technologies. We’re looking forward to working with even more people who want to bring transparency and community oversight to police use of technology. If you’re interested in joining us, get in touch. 

    This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

    [ad_2]

    Source link

  • AI in Criminal Justice Is the Trend Attorneys Need to Know About

    AI in Criminal Justice Is the Trend Attorneys Need to Know About

    [ad_1]

    The integration of artificial intelligence (AI) into our criminal justice system is one of the most worrying developments across policing and the courts, and EFF has been tracking it for years. EFF recently contributed a chapter on AI’s use by law enforcement to the American Bar Association’s annual publication, The State of Criminal Justice 2024.

    The chapter describes some of the AI-enabled technologies being used by law enforcement, including some of the tools we feature in our Street-Level Surveillance hub, and discusses the threats AI poses to due process, privacy, and other civil liberties.

    Face recognition, license plate readers, and gunshot detection systems all operate using forms of AI, all enabling broad, privacy-deteriorating surveillance that have led to wrongful arrests and jail time through false positives. Data streams from these tools—combined with public records, geolocation tracking, and other data from mobile phones—are being shared between policing agencies and used to build increasingly detailed law enforcement profiles of people, whether or not they’re under investigation. AI software is being used to make black box inferences and connections between them. A growing number of police departments have been eager to add AI to their arsenals, largely encouraged by extensive marketing by the companies developing and selling this equipment and software. 

    “As AI facilitates mass privacy invasion and risks routinizing—or even legitimizing—inequalities and abuses, its influence on law enforcement responsibilities has important implications for the application of the law, the protection of civil liberties and privacy rights, and the integrity of our criminal justice system,” EFF Investigative Researcher Beryl Lipton wrote.

    The ABA’s 2024 State of Criminal Justice publication is available from the ABA in book or PDF format.

    [ad_2]

    Source link

  • Cop Companies Want All Your Data and Other Takeaways from This Year’s IACP Conference

    Cop Companies Want All Your Data and Other Takeaways from This Year’s IACP Conference

    [ad_1]

    Artificial intelligence dominated the technology talk on panels, among sponsors, and across the trade floor at this year’s annual conference of the International Association of Chiefs of Police (IACP).

    IACP, held Oct. 19 – 22 in Boston, brings together thousands of police employees with the businesses who want to sell them guns, gadgets, and gear. Across the four-day schedule were presentations on issues like election security and conversations with top brass like Secretary of Homeland Security Alejandro Mayorkas. But the central attraction was clearly the trade show floor. 

    Hundreds of vendors of police technology spent their days trying to attract new police customers and sell existing ones on their newest projects. Event sponsors included big names in consumer services, like Amazon Web Services (AWS) and Verizon, and police technology giants, like Axon. There was a private ZZ Top concert at TD Garden for the 15,000+ attendees. Giveaways — stuffed animals, espresso, beer, challenge coins, and baked goods — appeared alongside Cybertrucks, massage stations, and tables of police supplies: vehicles, cameras, VR training systems, and screens displaying software for recordkeeping and data crunching.

    And vendors were selling more ways than ever for police to surveil the public and collect as much personal data as possible. EFF will continue to follow up on what we’ve seen in our research and at IACP.

    A partial view of the vendor booths at IACP 2024

    Doughnuts provided by police tech vendor Peregrine

    “All in On AI” Demands Accountability

    Police are pushing forward full speed ahead on AI. 

    EFF’s Atlas of Surveillance tracks use of AI-powered equipment like face recognition, automated license plate readers, drones, predictive policing, and gunshot detection. We’ve seen a trend toward the integration of these various data streams, along with private cameras, AI video analysis, and information bought from data brokers. We’ve been following the adoption of real-time crime centers. Recently, we started tracking the rise of what we call Third Party Investigative Platforms, which are AI-powered systems that claim to sort or provide huge swaths of data, personal and public, for investigative use. 

    The IACP conference featured companies selling all of these kinds of surveillance. Each day also included multiple panels on how AI could be integrated into local police work, with featured speakers like Axon founder Rick Smith, Chula Vista Police Chief Roxana Kennedy, and Fort Collins Police Chief Jeff Swoboda, whose agency was among the first to use Axon’s DraftOne, software that uses generative AI to create police reports. Drone as First Responder (DFR) programs were prominently featured by Skydio, Flock Safety, and Brinc. Clearview AI marketed its face recognition software. Axon offered a whole suite of tools, centering its presentation around AxonAI and a computer-driven future. 

    The booth for police drone provider Brinc

    The policing “solution” du jour is AI, but in reality it demands oversight, skepticism, and, in some cases, total elimination. AI in policing carries a dire list of risks, including extreme privacy violations, bias, false accusations, and the sabotage of our civil liberties. Adoption of such tools at minimum requires community control of whether to acquire them, and if adopted, transparency and clear guardrails. 

    The Corporate/Law Enforcement Data Surveillance Venn Diagram Is Basically A Circle

    AI cannot exist without data: data to train the algorithms, to analyze even more data, to trawl for trends and generate assumptions. Police have been accruing their own data for years through cases, investigations, and surveillance. Corporations have also been gathering information from us: our behavior online, our purchases, how long we look at an image, what we click on. 

    As one vendor employee said to us, “Yeah, it’s scary.” 

    Corporate harvesting and monetizing of our data is wildly unregulated. Data brokers have been busily vacuuming up whatever information they can. A whole industry provides law enforcement access to as much information about as many people as possible, packaging police data to “provide insights” and visualizations. At IACP, companies like LexisNexis, Peregrine, DataMinr, and others showed off how their platforms can give police access to ever more data from tens of thousands of sources. 

    Some Cops Care What the Public Thinks

    Cops will move ahead with AI, but they would much rather do it without friction from their constituents. Some law enforcement officials remain shaken up by the global 2020 protests following the police murder of George Floyd. Officers at IACP regularly referred to the “public” or the “activists” who might oppose their use of drones and other equipment. One featured presentation, “Managing the Media’s 24-Hour News Cycle and Finding a Reporter You Can Trust,” focused on how police can try to set the narrative that the media tells and the public generally believes. In another talk, Chula Vista showed off professionally-produced videos designed to win public favor. 

    This underlines something important: Community engagement, questions, and advocacy are well worth the effort. While many police officers think privacy is dead, it isn’t. We should have faith that when we push back and exert enough pressure, we can stop law enforcement’s full-scale invasion of our private lives.

    Cop Tech is Coming To Every Department

    The companies that sell police spy tech, and many departments that use it, would like other departments to use it, too, expanding the sources of data feeding into these networks. In panels like “Revolutionizing Small and Mid-Sized Agency Practices with Artificial Intelligence,” and “Futureproof: Strategies for Implementing New Technology for Public Safety,” police officials and vendors encouraged agencies of all sizes to use AI in their communities. Representatives from state and federal agencies talked about regional information-sharing initiatives and ways smaller departments could be connecting and sharing information even as they work out funding for more advanced technology.

    A Cybertruck at the booth for Skyfire AI

    “Interoperability” and “collaboration” and “data sharing” are all the buzz. AI tools and surveillance equipment are available to police departments of all sizes, and that’s how companies, state agencies, and the federal government want it. It doesn’t matter if you think your Little Local Police Department doesn’t need or can’t afford this technology. Almost every company wants them as a customer, so they can start vacuuming their data into the company system and then share that data with everyone else. 

    We Need Federal Data Privacy Legislation

    There isn’t a comprehensive federal data privacy law, and it shows. Police officials and their vendors know that there are no guardrails from Congress preventing use of these new tools, and they’re typically able to navigate around piecemeal state legislation. 

    We need real laws against this mass harvesting and marketing of our sensitive personal information — a real line in the sand that limits these data companies from helping police surveil us lest we cede even more of our rapidly dwindling privacy. We need new laws to protect ourselves from complete strangers trying to buy and search data on our lives, so we can explore and create and grow without fear of indefinite retention of every character we type, every icon we click. 

    Having a computer, using the internet, or buying a cell phone shouldn’t mean signing away your life and its activities to any random person or company that wants to make a dollar off of it.

    [ad_2]

    Source link