What is going on in America?
In Texas, liberals attack politicians for holding traditional beliefs.
But one photo is worth a thousand words, as a liberal terrorist throws a bomb:
So, why did America become liberal?
What is your theory?
The Nazis realized they were losing World War II, so they brought the war into America to destroy us from within.
That is my theory. And that is the premise of my fiction series.
Do you have a better idea?