
America’s Stunning Embrace Of Paganism Signals The End Of This Country As We Know It

The following essay is adapted from the author’s new book, Pagan America: The Decline of Christianity and the Dark Age to Come.

It’s hard to survey the state of our country and not conclude that something is very wrong in America. I don’t just mean with our economy or the border or rampant crime in our cities, but with our basic grasp on reality itself.

Our cultural and political elite now insist that men can become women, and vice versa, and that even children can consent to what they euphemistically call “gender-affirming care.” In a perfect inversion of reason and common sense, some Democratic lawmakers now want laws on the books forcing parents to affirm their child’s “gender identity,” on pain of having the child taken from them by the state on grounds of abuse.

Abortion, which was once reluctantly defended only on the basis that it should be “safe, legal, and rare,” is now championed as a positive good, even at later stages of pregnancy. Abortion advocates now insist the only difference between an unborn child with rights and one without them is the mother’s desire, or not, to carry the pregnancy to term.

But even matters that should be beyond dispute, like mass rape, are now up for grabs. After Hamas terrorists filmed themselves raping and murdering Israeli women on Oct. 7, boasting about their savagery to a watching world, vast swaths of the American left still cannot bring themselves to condemn Hamas. The same progressive college students who insist that the


