Hollywood, I am told, is all about money; it has no agenda at all other than to make lots and lots of money. Hollywood doesn't care about politics or political correctness.
Flapdoodle, hogwash, and poppycock. The same people who tell me that are also the first people to complain about Fox News' rightward tilt.
Before Bill Clinton was President, Hollywood always portrayed the President as evil, crooked, crazy, craven, or some combination thereof; but after 1992, suddenly we had SUPER PRESIDENT (e.g., Bill Pullman in Independence Day, Harrison Ford in Air Force One). Playing to the fears of the time? Ronald Reagan was (and still is!) one of the most popular Presidents of the 20th Century, yet I struggle to recall so much as one movie made during the Reagan years wherein the President was half as nifty as Harrison Ford.
Hollywood did its best to keep Mel Gibson's The Passion from reaching theaters. In fact, devout Christians are usually portrayed as psychotic or worse in most movies; at best, they are treated as harmless eccentrics. Is that playing to the country's tastes, considering how many people in the USA self-identify as "Christian"?
The Fox News Network is reviled as being "hard right," yet it's the most popular news channel out there. If the media in the US were all about making money, wouldn't they sit up and take notice of Fox's success, and attempt to imitate it rather than denigrate it? If money were primary and politics a distant second, wouldn't other channels vie to out-fox Fox?