As an American student, the world history I've learned is, sorry to say, one-sided. I've always wondered what students in other countries learn about events where we're taught that the Americans are "the good guys" (to Americans, at least). For instance, what do students in England learn about the American Revolution? What do German students learn about the Holocaust and World War II? What do Japanese students learn about Pearl Harbor? These are just some of the questions I've had, so come on, people from other countries... talk away.
This thread is not meant to insult or make any country look bad.