Organic Food Is Not Better for You

So claims a study from Stanford University


Eating organic food will not make you healthier, according to researchers at Stanford University, although it could cut your exposure to pesticides.

The researchers reviewed more than 200 studies comparing the nutrient content and associated health outcomes of organic and non-organic foods.