
Which did settlers bring to American Indian lands in the West?
- trade restrictions
- seeds for new forests
- new diseases
- different crops

Rodney Fox

in History

1 answer

Annie Barnes on February 3, 2019

The answer is that settlers brought new diseases to American Indian lands in the West.
