I mean, I know some parents who always pressure their kids to eat their veggies and make it a really big deal, as if veggies are something they "must" eat, saying they're healthy instead of saying they're tasty. And then the kids make a big deal of it too and don't want to eat vegetables. I think presenting vegetables and fruits as something natural is better. My niece was never told she "must" eat anything, and she takes food naturally; she eats anything. In fact, she prefers carrot juice to Coke because she's been drinking it since she was a toddler. Do you agree this is a better approach?