Some historians argue that the New Deal was the tsunami-like event that radically transformed American society, culture, and economics. Other historians contend that it was World War II—not the New Deal—that dramatically changed American society, politics, and economics. Which event do you think transformed America more?
Responses (2)
I don't know which one had more of an impact, but I'm pretty certain of this: the New Deal has often been falsely credited by historians with having helped bring the United States out of the Great Depression. In reality, it created entitlement programs, greater dependence on the government, and a deeper depression. It was World War II that really got America out of the Depression, because manufacturing in preparation for war increased and the war effort created incentives for people to work.
I hope this helps! ;)
With regard to the New Deal, it was at best damage control. First, it implemented a whole array of fiscal policy reforms. Its merit lies in the fact that it kept many people from dying for lack of basic needs such as nutrition and housing. One could argue that it gave individuals a sense of purpose in a broken economy. The New Deal and its facets are debated to this day, primarily over the simple question of what responsibility the government has to the poverty-stricken (whatever the cause of that poverty may be).
WWII not only brought the U.S. out of its depression, it sent it soaring into the greatest economy the world had ever seen.
New Deal = Damage Control
WWII = Perfect timing. The best thing that could have happened to the U.S.