As far as I know, after speaking with a woman who moved to my town from Germany, they're taught the facts of the situation and realize that some of what their country did was absolutely horrible. However, they're taught both that they personally should not take blame for it (as nearly everyone involved is long dead) and that they should not feel pride in what Germans did in events like the Holocaust. That's all according to her, of course. From the viewpoint of an outsider, it's a valid point that no German born after the fact can be held responsible, and many German citizens back then did not agree with what was happening. It goes without saying that they shouldn't be proud of what happened. I do personally believe her when she says that they learn the facts, just the same as many US kids. However, all history is skewed; everyone has a stance, and everyone gives weight to the details that flatter their own side when teaching it, unfortunately. I almost feel as though German students might get a more complete picture of the war, in some cases, than those in my own country.
I couldn't agree more. Denying the facts of history is BS, but no German born after WW1/WW2 should feel the blame or anything like it.
But Germany did not start WWI. It was way more complicated.
We did start WWII, but I don't feel guilty about it. Why should I? I wasn't even born when that happened.
But I don't really care about the past... Anyway, let's not forget that AH came from Austria.
Guys, are they taught about what happened after World War II?
About the forced labour of Germans and how some of them were treated almost like slaves?
I had the pleasure of staying five days in Berlin this January, and it's such a beautiful city. In fact, I originally didn't want to go because I didn't feel any interest in it, but a friend convinced me, and I don't regret it, because it was such a unique experience. I think Berlin is a city where you can learn about its mistakes, and people there don't feel ashamed that you come just to learn their history, because they want you to know that that s**t should never happen again.
Germany did not start WW1. At least I was taught they didn't. A Serbian nationalist killed Archduke Franz Ferdinand, the heir of Austria-Hungary. Germany and Austria-Hungary had signed an alliance treaty earlier. Russia supported Serbia; they wanted war cuz they saw a chance at access to the Mediterranean Sea. France was still pressed about the Franco-Prussian War they had lost, so they joined Serbia and Russia. They won in the end, and Germany was blamed cuz it was the most powerful country, though the entire conflict was instigated by Serbia and Austria-Hungary.
!!!
Germany didn't START the war; they only took the blame as part of the Treaty of Versailles. Germany can only arguably be blamed for getting the US involved, with the sinking of the Lusitania and the Zimmermann telegram.