Earlier this year, I wrote about a friend who was attacked and thrown to the ground on a September afternoon, while she was walking toward Borough Hall in Brooklyn to go to the post office. She was on the phone with her mother when a man, whose erratic behavior she had noticed in the distance, pushed her into the street, leaving her with bruises, a chipped tooth and fear new to her after decades of living in New York.
At the time, it did not seem as though the assault might be part of a dark emerging trend. But the incident would presage many others — instances in which women in New York were randomly punched on the street in the middle of the day.
At the end of April, Joseph Kenny, the chief of detectives for the city’s Police Department, reported that 50 women had suffered this sort of unprovoked brutality since the beginning of the year in the lower half of Manhattan alone. A few days before the announcement, a 9-year-old girl standing with her mother was punched in the face by a man at Grand Central Station.
The incidents came to public attention largely via TikTok videos that some of the victims posted right after they were attacked — vivid accounts from young women with bruised faces and black eyes, which led to questions about how seriously the police were pursuing these cases as acts of violent misogyny. Then, just this week, the Manhattan district attorney’s office indicted a man — a 40-year-old fringe political candidate named Skiboky Stora — on hate crime charges “for assaulting, stalking and harassing strangers in a series of anti-female, anti-white and antisemitic incidents.” He had been arrested in March two days after he was accused of punching a 23-year-old woman, a social media influencer whose video describing what had happened so unexpectedly one morning in Chelsea has been viewed tens of millions of times.
Ever since smartphones with cameras became ubiquitous, technology has played an increasingly crucial role in modern law enforcement, both as a means of fomenting national outrage over otherwise obscured injustices and as a tool for the more mundane work of solving crimes. In the case of my friend, Laura, the attack had been recorded on a security camera, and she had also managed to take a very clear photograph of her assailant — head to toe — with her phone. What intrigued me was what her situation revealed about the legal system’s latent and quasi-romantic faith in the power of human perception over the more conclusive determinations supplied by digital imaging.
A precinct lineup today is nothing like the familiar scene in police procedurals. Five weeks after her attack, Laura, a successful artist in her 50s (who asked that I not use her full name because her case is ongoing), was given a set of photographs selected by an algorithm. Faced with a “lineup” of headshots, she was unable to identify her assailant with any confidence. This bothered her, she told me, precisely because she was a painter in the realist tradition who prided herself on an adept eye for physical detail. Any other characteristic she might have registered about her attacker, like his carriage or the way he held his hands, became meaningless.