A lot can be said about the topic. But here's one thought.
A substantial portion of the political left has long wanted Christianity - and religion in general - to wither, eager to push through changes that aren't possible while the religion is heavily integrated into the social fabric. Just think of what we'll get if we don't have those stuffy (nay, patriarchal!) rules about sex and gender holding us back!
It's true, you really do widen the range of possibilities once Christianity is beaten back. What people may just be starting to realize is that the range is wider than 'more textbook liberals'. If Trump wins in November, it's going to happen while a lot of liberals - professing atheists, even - look at many of his voters, fists clenched, screaming 'BUT YOU'RE SUPPOSED TO BE CHRISTIAN! CHRISTIANS DON'T VOTE FOR MEN LIKE THIS!'
Maybe not, once upon a time. But who's been spending all that effort trying to unweave the Christian threads in the social fabric? Did they really think the only thing religion was restraining was more right-wing sentiment's opposite - that is, that it only ever held back the left?
Christian culture wasn't just restraining people from fucking, gents.