When did Americans as a whole start eating more vegetables?

Fez F

New member
I'm thinking that Americans didn't eat many vegetables in the 1800s. So I'm guessing that in the 1900s people learned about the positive effects of vegetables and started informing the general public, which is why now everybody knows they're good for you and that you should eat a lot of them. My question is: around what year did this vegetable movement begin?
 