Louise Seamster
Debt, development, infrastructure, the economics of racial inequality, and the myth of racial progress. I express my own views.
- I am trying and failing to find the article from 2 months ago that adds one more terrifying layer: the successful alteration of a facial recognition database so that the photos no longer “read” as someone’s face to a computer, although they look unchanged to a person. (A sketch of how that kind of alteration works is at the end of this thread.)
- In which @tressiemcphd.bsky.social scares the shit out of me. “ICE knows that it cannot shoot us all. But the Department of Homeland Security is close to being able to track us all.” www.nytimes.com/2026/02/03/o...
- I knew someone here would be able to find it! It was hard to look up given the volume of content on “AI facial recognition.”
- @nanaslugdiva.blacksky.app came through! bsky.app/profile/nana...
- Yes, this one! At least this article is on the same study. Thank you!!! It’s framed here as helping everyday people, but tools can be used in various ways.
- (and funded by the government…to help “ordinary users” protect their privacy??)
- The false positives + overly generic categorizations like “domestic terrorism” already have significant precedent in phenomena like the post-9/11 travel bans and “gang member” databases, both of which included babies. Communities of color have long experience with these practices and *warned everyone…*
- Even if you think you “have nothing to hide,” you have a lot to lose from these practices. They work just as effectively through their INaccuracy; people would be remiss to think they’re just about sorting risk or about knowledge. They are about producing categories of people, like “enemy of the state.”
- Black surveillance scholars including @hypervisible.blacksky.app and @wewatchwatchers.bsky.social have long warned of the dangers of simultaneous hypervisibility AND invisibility, a condition long known to the Black community: being always seen and watched, but as a “type,” not recognized as a person.
- There is huge risk to our privacy in terms of the data we *did* produce, the use of our faces, tracking, etc. AND we also need to draw on these important insights to understand how the risks include the dangers of *misrecognition* weaponized against people.
- I wrote this a year ago to be ready for when it eventually falls apart!
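- For readers wondering how a photo can stop “reading” as someone’s face to a computer while looking unchanged to a person: the description upthread matches published “image cloaking” research (e.g., the Fawkes tool, Shan et al., USENIX Security 2020), plausibly the study behind the article found above. Below is a minimal sketch of the idea, assuming PyTorch; `embed_model` is a hypothetical stand-in for a pretrained face-embedding network, not any specific tool’s API. The sketch optimizes a small, bounded perturbation that pushes the photo’s face embedding away from the original, so recognition systems fail to match it, while each pixel changes too little for a human to notice.

```python
import torch

def cloak(image: torch.Tensor, embed_model: torch.nn.Module,
          eps: float = 0.03, steps: int = 100, lr: float = 0.01) -> torch.Tensor:
    """Return a copy of `image` (values in [0, 1]) whose face embedding is
    pushed away from the original's, with per-pixel change capped at `eps`
    so the result still looks unchanged to a person."""
    embed_model.eval()
    with torch.no_grad():
        original = embed_model(image.unsqueeze(0))  # embedding of the clean photo
    delta = torch.zeros_like(image, requires_grad=True)  # perturbation we optimize
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed = (image + delta).clamp(0.0, 1.0)
        emb = embed_model(perturbed.unsqueeze(0))
        # Maximize distance from the original embedding (minimize its negative)
        loss = -torch.nn.functional.mse_loss(emb, original)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the change visually imperceptible
    return (image + delta).clamp(0.0, 1.0).detach()
```

The `eps` bound is what keeps the photo looking unchanged to a person; the embedding distance is what makes it stop matching as the same face to a computer. Real cloaking tools are more sophisticated (e.g., steering toward a different identity’s feature space rather than just away from the original), but the tension is the one discussed upthread: the same mechanics can protect a user’s photos or quietly corrupt a database.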