Do you think something will actually change?

Everywhere I look I see women posting about sex strikes and protests, and some even advocating violence against men and the government, but what impact do you think any of this will actually have?

Once the outrage over the abortion ban and whatever follows has died down, what do you think women will actually have managed to change?

American society and its lawmakers are too religious or too rich to care about the rights of middle- and lower-class women, and I'm really doubtful that all this rhetoric will lead to anything substantial.

Call me a pessimist, but I really don't see a chance for change. If anything, this will end in a catastrophe for American society: unhappiness, chaos, mayhem, and uncontrollable violence, with women's rights reduced to a slogan serving capitalist interests.

So, do you think anything will actually change?