Let's talk about why everything you know about the rise of Hitler and the transformation of Nazi Germany is basically yet another lie designed to protect rich, Western fascists:
Elite industrialists and aristocrats just took more money and said "the French did this to you"
So that doesn't hold much water either.
You see, as a Johnny-come-lately in Europe, the Germans weren't a big part of the colonial project, and that burned their ass something fierce. Eventually they FORCED Bismarck to (ct)
which by this they mean eastern Europe.
In traditional Western racism, the "lesser races" are all non-white, but the Germans wanted Poland as a colony of their own, so now that dehumanization, scientific racism, and colonial project had to shift to... white people.
So, what's next? Ok, Bismarck is fired, Germany enters WW1, loses, loses all of its colonies, and Germany is left to sit and spin; but those ideas of racial superiority, master and servant, conqueror and colony didn't LEAVE
Nazism was PURPOSELY introduced, slowly, over a 40-year period, into German political discourse.
Hitler, Goebbels and the boys just came along and hijacked it at a moment of weakness in the German government.
Simple: social Darwinism was also extremely popular in the West, most of our elites ARE basically fascists, and the TRUTH of Hitler (ct)
this is completely unacceptable to the Western power structure, so they make up a little story: "the Germans went mad"
as a US client state.
It's extremely important to understand how the history of Germany helped introduce, reinforce, and popularize these social Darwinist ideas, fierce nationalism, and a sense of superiority and rightful claim to rule Europe.
So do you see? You already, at the time of Bismarck or soon after his rise to power, have DEEP nationalist and "stabbed in the back" myths (ct)
This ALREADY exists BEFORE you introduce social Darwinism, which itself reignites latent, over-the-top German antisemitism.
And all of this is happening well before anyone knows who Hitler is.