I remember at some point, when I was still quite young, being surprised to find out that WWII didn't start when the Japanese attacked Pearl Harbor.
My father always said that he'd never forget the day he arrived in the town where he settled down as a young man (and where I was born many years later). The first thing he did when he got off the train was buy a newspaper. The headline said "HITLER INVADES POLAND." Not the kind of thing you're likely to forget!
Of course I knew who Hitler was and about the "war in Europe," and I knew what "D-Day" was. But I was taken aback when I realized that the newspaper headline he was talking about was from September 1939. It just never occurred to me that there had been much of a war going on before the USA was in it, because the emphasis in popular media was always on the part of the war the USA fought, and nothing else. This was before I was far enough along in school to have been taught about such things in history class, but honestly I don't think they did much better there. At least it got mentioned, but sadly we never seemed to have enough time to go as in-depth on anything as I would have liked. That I always just did on my own.