I was just over on CNN, and the hatred in our public arena stuns me.
The idea that we can paint ‘other’ as evil and hateful just because we can form those words is astonishing.
When did it become ‘other’ to be ‘white’ in America? When did where we are born mean we are evil? Why is South Carolina evil, but the Bronx is not?
Why is it right to be gay, but wrong to be a traditional American?
One third of our women are raped, as are 15% of our men.
Isn’t it time the progressive agenda said to the rest of us, “OK, this is going badly; maybe we can help you change things for the better for a while …”
Will you hold your breath and wait for them?