Juanhu, when you say that the U.S. didn't show anything to Europe or Japan, do you mean to imply that West Germany, with a political culture that had tended to tolerate authoritarian regimes, would have magically transformed itself into a democratic land with the rule of law after World War II? Do you mean to say that, after a fascist regime, West Germany would have adopted on its own a constitution in which power is dispersed both "vertically" (between the federal government and the state governments) and "horizontally" (among the institutions of the federal government)? If it were not for U.S. influence, do you think ordinary Germans would have developed their loathing of sending German troops outside the boundaries of Germany?

You could argue that you don't like the influence the U.S. has had on Germany, but clearly Germany adopted a system that was introduced to the Germans by the U.S.