Do you remember the old days?
When Rock and Roll was king?
Sex and drugs sold everything?
It seems much dirtier now than it was described back then. Doesn’t it?
Don’t the rich, elite Hollywood types dwell more on the dirty needles that got everyone sick than on how glamorous the men were and how sexy the women were?
Why is it that the Hollywood elite wanted us to think all of that was good back then, yet now that they’ve gotten rich off that lifestyle, they want us to think it’s dirty?
That’s just one side of the 1970s. And I realize there are many other viewpoints.
What is your point of view?
Do you still rock and roll?