A comment on my recent post about working women got my mind running on something that’s gone through my head before. The “war on women” seems to be driven in large part by Christian zealots. Does it ever occur to them that they might be driving women away from the church?
Think about it for a minute. Not all Christians are bad by any means, but it's clear beyond doubt that Christian extremists are giving their whole faith a bad name and that they're behind the war on gays and women, at least in the US. Men are trying to destroy women in every possible way, claiming it's the will of their god and the bible. In doing so, they're simultaneously making women hate men, the government, and organized religion.
Have they ever considered the possibility that women will leave the church in droves? Not wanting all their rights stripped away, not wanting to be mindless little breeding machines, they will turn away from the source: the Christian church. I can tell you right now, if my faith group were responsible for such horrors and hate, I'd drop it like a hot potato, or at least ditch the organized portion. I might maintain some faith, but it would be forever altered, and I'd never set foot in a church again.
I think that might end up being the case. Women will leave organized religion and follow true faith. I can't imagine their god actually wants them to be slaves, tortured and berated by men every day. That's not life or love; it's punishment for being female. No god worthy of worship would want that. I think it may dawn on women that what god wants and what the people running the church and government want are not the same thing.
The male members of organized religion are destroying the lives of women and gays everywhere. I don't understand it. I don't see an endgame here, other than ruining the world for the sake of power. Somehow I can't imagine their god actually approves. I think god has simply become an excuse for the atrocities that men want to commit.