Oh, yes it is. It is and it isn't. America was never one country. There was the South and the pro-Southern lands settled in the West, and then the rest of us. We assumed that what we thought of as American values were just that, and enjoyed some level of national acceptance.
Wrong! The South, and especially the evangelical Christians residing there, saw things differently. As long as they were left alone to practice their apartheid, we could maintain the illusion of a national consensus, at least on the critical things. Desegregation ended that for good.
And the South rose again. The stench of racism and hate is everywhere. This was us, and it wasn't. Through what's called the Southern Strategy, the South has taken over the country, at first being used by the old Republican elite, but now demanding its due.
Which is everything we now see that makes the country unrecognizable to us.