1) If you're not from Southern California, what impressions did you have about the culture of Southern California, and particularly LA, before you got here? Basically, what image came into your head when you thought about Southern California? What shaped this image (media, TV, movies)? Is California what you expected?
2) If you are from Southern California, do you feel that Southern California is being portrayed, or has been portrayed in the past, inaccurately? Do you feel that the portrayal of Southern California in things like movies and TV actually affects the culture of Southern California?
3) How do you think the image of California affects the way Californians think about health or food? How can the "image" of California (and perhaps the resulting insecurities of Californians) be turned into profit by the health and food industries? Do you have any examples?
4) Do you think all of this promotes the actual health of Californians, or just the image of health?