Nazi Germany has fallen. After the Allied forces defeated Germany in World War II, Europe became a dangerous place for anyone associated with the Nazi regime, and officers, party members, and supporters of Hitler began to flee the country.