“It’s surprising to me how few people know that organic means without pesticides, antibiotics or hormones,” he said. “In stores or restaurants around the country, I would ask, ‘Do you have anything organic?’ Half the time they would say, ‘Do you mean vegetarian?’ ”
I have a feeling I know why: people like to assume that organic and vegan/vegetarian are synonymous. I understand that both are largely associated with the same progressive movement, but you should know the definitions of these terms by now.
I may sound like I'm overreacting a bit, and maybe I am, but when people ask you before Thanksgiving if you eat turkey, even though they know you're vegan and you've clearly explained what that means, it's a little sad. Do people just not listen? Or is anything different too foreign to understand?
People need to start educating themselves.