All of the Hollywood types took different roads to fame. For some it was a short road; for others, a long one. Sure, some were privileged from their youth, but not all; most came from working-class families in small-town America.
Why, in America, are wealthy evangelical preachers able to get away with talking about politics and what’s morally right or wrong? If celebrities do it, they are written off as elite and out of touch. I believe most celebrities who talk about politics are genuine in what they stand for and believe. Sure, the vast majority of Hollywood is liberal, but just like any other special interest group, they have their own causes. They stand together, much like a union organizes to stand for particular causes.
Hollywood isn’t right on everything, but they don’t discriminate against the less fortunate, and they are inclusive of all people (race, sex, class, sexual orientation, etc.). Isn’t that what America is… or should be… all about? Maybe Hollywood Boulevard isn’t so far off from the folks on Main Street.
Laurent Gilbert, Jr.