It just seems like it used to be adversarial in jest: media/Hollywood made fun of politics and politicians, and whoever got poked, poked back, all in good fun. Now the joking, or whatever it was, has turned serious and dangerous in tone. How did that happen? The only precedent I can think of is McCarthyism.


Hollywood is composed of upper class Californians. It would be surprising if they weren’t lefty.
It’s like being surprised rednecks from Mississippi aren’t righty. In both cases: of course they are.
Art in all its forms, everywhere, throughout history, has leaned left.
Funny, since rednecks have everything to gain from left-wing working-class solidarity, and elites have everything to gain from right-wing trickle-down economics.