When I was in college, my friend and I decided to call out some peers of ours who were making overtly sexist jokes on a public Facebook thread. These were guys who held leadership positions at the college.
I remember being told, "We don't actually think these things." "They're just jokes, there are bigger issues." "We're not actually sexist." All their friends thought we were making too big a deal of it. The same people today are decrying the lack of public acknowledgement from the government.
Like, lol, do you think these things exist as absolutes? That you can cherry-pick what kind of sexism is okay? Nothing changes until the culture changes, and I don't understand how we consistently lose sight of that.
Until you completely dismantle the society that normalizes viewing women (yes, men are assaulted too, and that is equally important) as weaker objects that serve male purposes, nothing changes.
EDIT: After this thread, one of the guys actually did reach out to me to talk about what happened and how he has grown. It was an important reminder that people do evolve, and you have to give them the space to do the work if they are willing.