Hollywood does a horrible job of representing the truth, much less Americans and their actual values and positions. As an American citizen who has proudly served my country, I can say they do NOT speak for me on anything of importance.
So just a reminder: Hollywood != typical American, and Americans are not the only ones who too often believe everything they see on film or TV.
It used to be entertainment; now almost everything is a delivery system for propaganda of some type.