We recently watched and read Food, Inc., The Omnivore's Dilemma, and Fast Food Nation (and a few others I'm forgetting right now), which has gotten us thinking about our food: where it comes from and what it goes through. I know not everyone loves Whole Foods, and they aren't the utopian company some people make them out to be, but I do appreciate what they're trying to do. Of course, it pains me to pay so much for some of the groceries there, but if it makes me and my family healthier, then maybe it's worth the upfront investment. If it saves me from doctor's visits or other health problems down the road, then paying a bit more at the grocery store is worth it to me. Looking at our doctor's bills from last year (really, they're only my husband's), we had over $1,000 to pay to various doctors and hospitals AFTER INSURANCE. The scary thing is: WE HAVE INSURANCE, and my husband is not sick. He has no real illness, just some general ailments. So if some of those doctor visits can be avoided by eating a little healthier, a little fresher, and a little greener, then that's what we'll do.
Of course, I haven't read a whole lot of research beyond the aforementioned books and movies, but I do think there has to be something to it. There is a noticeable increase in food allergies, obesity, and early-onset puberty in kids... something is happening to cause this, and one thing that comes to mind is our diet, specifically our food sources. I know that ideally we would eat local and seasonal, but that isn't possible a lot of the time. In the summer months we buy our produce almost exclusively at farmers' markets; it may not be organic, but it is local, and I want to support the smaller farms.
But what do you think? Is Whole Foods a scam? Is organic just a bunch of hype? Are consumers to blame for driving the demand for cheap food?