Hot Student Stories

How did the New Deal change things for American Workers?

Craig Stewart

in Business

1 answer
7 views


William Cain on December 16, 2018

The New Deal changed the role of government completely. Before the New Deal, the government had essentially no role in directing the economy or in caring for the people. After the New Deal, the government came to play a very important role in both.

Before the New Deal, the government was expected to be more or less laissez-faire. It was supposed to stay out of the way and let the economy rise or fall in its "natural" way. If people were too old to work, they had to rely on their families. If a bank failed, its depositors were out of luck. The New Deal changed all of that.

