
Fargo Police Let an Algorithm Arrest a Grandmother. It Took Her Lawyer One Bank Statement to Prove They Were Wrong.

THE FACTS

In April and May 2025, someone walked into several Fargo-area banks with a forged U.S. Army military ID and withdrew tens of thousands of dollars. Surveillance cameras caught the woman on video. Fargo police needed a name. So they ran the footage through facial recognition software.

The software returned Angela Lipps.

Lipps is 50 years old, a mother of three, a grandmother of five, and a lifelong resident of north-central Tennessee. The farthest she’d ever traveled was to neighboring states. She had never been on an airplane. She had never set foot in North Dakota. She did not know anyone in North Dakota.

Here is what the Fargo Police Department did next: A detective looked at Lipps’s social media accounts and Tennessee driver’s license photo. He wrote in a charging document that she “appeared to be the suspect based on facial features, body type and hairstyle and color.” He filed four counts of unauthorized use of personal identifying information and four counts of theft. A judge signed a warrant.

Here is what the Fargo Police Department did not do: call her. No detective picked up the phone. No one asked Angela Lipps a single question before sending armed federal agents to her door.

On July 14, 2025, U.S. Marshals arrived at Lipps’s home in Tennessee. She was babysitting four young children. They arrested her at gunpoint and booked her as a fugitive from justice from North Dakota. Because she was classified as a fugitive, she was held without bail.

Lipps sat in a Tennessee jail cell for 108 days. North Dakota officers did not come to get her until October 30. She made her first court appearance in Fargo on October 31—Halloween, for those keeping score—and that was the first time anyone from law enforcement spoke to her about the charges.

THE BLAME

This was not a technology failure. Facial recognition software does what it is built to do: it returns a probabilistic match. It is, by design, a lead—not a conviction. Every major law enforcement body in the country acknowledges this. The FBI’s own guidelines say facial recognition results should be treated as investigative leads that require independent verification.

The failure here is entirely human. A detective in Fargo received a software match and treated it as case closed. He filed charges on the basis of a machine’s guess and his own eyeballing of Facebook photos. He did not request bank records. He did not check travel records. He did not call the suspect. He did not do the work that separates policing from pointing.

And then an entire system—prosecutors, a judge, U.S. Marshals—carried that failure forward without question. A prosecutor signed the charging documents. A judge found probable cause. Federal marshals flew across the country to execute the arrest. At no point did any human in this chain ask the most basic investigative question: was this woman actually in North Dakota when the crimes happened?

Her lawyer, Jay Greenwood, answered that question in one move. He pulled Lipps’s bank records. They showed her depositing Social Security checks, buying cigarettes at a gas station, ordering pizza, paying for Uber Eats—all in Tennessee, all at the exact times Fargo police said she was committing fraud 1,200 miles away. Greenwood presented the records to investigators on December 19, more than five months after her arrest. It was the first time police had ever interviewed her.

Lipps was released on Christmas Eve. Fargo police did not pay for her trip home. They did not give her a coat for North Dakota in December. Local defense attorneys covered a hotel room and food for Christmas Eve and Christmas Day. A nonprofit called the F5 Project drove her to Chicago so she could make her way back to Tennessee.

She arrived home to find she had lost everything. Her home—gone, bills unpaid for months. Her car—gone. Her dog—gone. No one from the Fargo Police Department has apologized.

THE PATTERN

The ACLU has identified this as the twelfth known case in the United States of someone being wrongfully arrested due to facial recognition errors. Every previous case followed the same script: software returns a match, police skip the verification, an innocent person loses weeks or months of their life, and the department blames the technology while quietly continuing to use it.

What makes Lipps’s case especially galling is that a neighboring department showed exactly how easy it is to get this right. West Fargo police ran the same facial recognition software on a similar fraud case in their jurisdiction. It also flagged Lipps. But West Fargo police determined that a facial recognition hit alone was not sufficient evidence to charge someone—and held off on filing. Same software, same suspect, same metro area. One department did the work. One did not.

When WDAY News reporter Matt Henson tried for more than a week to get Fargo Police Chief David Zibolski on camera to discuss the case, the chief declined. When Henson raised the question at Zibolski’s retirement press conference on March 11—asking why no one from Fargo Police ever spoke to Lipps during her five months in jail—the chief replied: “Thank you, Matt, for that question, but we are not here to talk about that today.”

Zibolski’s retirement was announced the day before the story broke. Fargo Mayor Tim Mahoney was asked whether the retirement was connected to the case. He declined to answer.

THE VERDICT

Angela Lipps is home. The charges were dismissed without prejudice—meaning they could technically be refiled, a detail the mayor’s office was careful to mention in its statement. Lipps is now working with two attorneys, including one based in Minneapolis, on a civil lawsuit against the Fargo Police Department.

The facial recognition software did what software does: it produced an output. The detective who treated that output as proof did what lazy investigators do: he stopped looking. The department that left a grandmother in a cell for five months without a single interview did what institutions do when they trust machines more than they trust their own obligation to verify: it broke a person’s life and walked away.

The algorithm didn’t arrest Angela Lipps. A detective in Fargo did. The algorithm didn’t skip the phone call, the bank records check, the basic due diligence that would have cleared her in minutes. Humans in Fargo skipped all of that. The algorithm didn’t leave her stranded in North Dakota on Christmas Eve with no coat and no ride home. The Fargo Police Department did.

Accountability status: pending. Lipps is pursuing civil litigation. Fargo police say the investigation is “ongoing.” The police chief who oversaw the case has retired. No one has apologized.
