To our German buddies: during childhood, were you taught about the rise of Hitler, or does society try to forget about it and not bring it up? I am NOT trying to cause a shit storm, I'm actually curious. I cannot recall learning much about it during my own early studies. But of course, the US school system is a worldwide joke.