Another interest of mine is research that helps develop strategies people can use to improve their ability to reach long-term goals in the face of temptations to satisfy short-term goals instead: studying and working instead of playing, exercising instead of watching TV, eating an apple instead of a doughnut, etc. Since I'm pretty good about getting work done, exercising daily, and saving money, but have difficulty not eating too much, especially of sweets, I have a particular interest in how people can better meet their healthy eating goals.
This is the topic of another article in the December issue of the Journal of Consumer Research. Specifically, the authors address two rather contradictory theories / streams of research on how food temptations influence later eating opportunities.
The first, I think, we're all familiar with: exposure to food cues (such as the sight and smell of food) activates our desire to eat, making it difficult for us to control ourselves. Research on self-regulation as a limited resource (which I've talked about before) suggests that when you have to control yourself in one situation, you deplete your self-regulation ability, setting yourself up for self-control failure later. These ideas would indicate that the best way to overcome temptation is to avoid it: keeping yummy snacks (e.g., Oat Crack cereal) out of the house, not exposing yourself to ads for food, etc.
The second one is new to me: the "critical level model" states that only once a problem reaches some threshold level of seriousness or difficulty do people engage their problem-solving abilities. The theory is that this occurs because people expect intense states to last longer than milder states, and thus find the serious problem worth trying hard to solve. In the context of achieving long-term eating goals, I can see how this might play out. We frequently put effort into thinking of strategies for dealing with major Diet Threat situations like holidays, parties, vacations, and all-you-can-eat buffets, while remaining rather blasé about the small-time temptations we encounter on a daily basis. This model would suggest that encountering a large enough problem / threat to our eating goals in the form of a food temptation might trigger self-control strategies that would benefit us in a later self-control situation.
In this paper, they look at two kinds of temptations: Actionable temptations, in which the food is physically present and available for you to "act" on the temptation by eating, and Nonactionable temptations, like advertisements that make you think about food but provide no immediate opportunity to act on your desire to eat. They hypothesized that Actionable temptations (in their experiment, real candy that participants examined for a survey but were not allowed to eat) would lead to greater self-control in a later eating situation than Nonactionable temptations (participants seeing drawings of candy).
This research is difficult to describe because of the complexity of the experiments, so I will cut to the overall results:
* Previous exposure to the real candy appeared to block thoughts about eating when participants later encountered a chance to eat.
* When given the chance to eat some M&M's, the candy-drawings group and the control group ate a lot more when the experiment included eating "cues" (the smell of chocolate and the availability of the chocolate in easy-to-grab trays) than when they were simply presented with M&M's with no smell and in less convenient containers. Those in the real candy group did not increase their intake of M&M's when those cues were present. Again, the idea is that something about their previous exposure to the real candy protected them from cues that typically make people eat like crazy.
* They rejected the idea that previous exposure to real candy (the Actionable temptation) made people more likely to think about dieting than those who had only seen drawings (the Nonactionable temptation); both groups had elevated thoughts about dieting compared to the control group. Something made the real candy group think less about eating, and actually eat less, than both the drawings group and the control group, but it wasn't increased thinking about dieting. The question remains: what is that "something"?
It's clearly unwise to interpret this single study as a reason to go to Sam's Club and buy the biggest damn bag of candy you can find. For one, the "future" eating opportunities they provided were not very distant in time from the original exposure to the candy at the beginning of the experiment. It's unknown how long this "something" that protects you against future eating temptations lasts. Your own big bag of candy may function as both exposure to temptation and later eating opportunity in the same afternoon.
Also, critically, the sorts of social inhibitions that allowed people to be tempted by the original real candy but not actually eat any of it in the lab are unlikely to exist in your own kitchen. And if you've just eaten a pound of chocolate, you're probably not going to feel very tempted by candy for a while anyway, and you've already hosed your diet.
In any event, it'll be interesting to see where future research on this topic goes.
Source: Geyskens, K., Dewitte, S., Pandelaere, M., & Warlop, L. (2008). Tempt me just a little bit more: The effect of prior food temptation actionability on goal activation and consumption. Journal of Consumer Research, 35, 600-610.
5 comments:
Very fun!
The obvious hypothesis is that sitting next to a bowl of candy that you're not allowed to eat trains you to resist, and/or inures you to, the smell and easy availability of candy. For a while. At least that would be my guess, though I don't find the results of this research obvious at all.
A categorization guess - The 'in real life' candy had a quality of 'foodness' that the pictures of food didn't - people know that food in pictures isn't real food. This becomes critical if (pop evo-psych on) people who are exposed to 'more' food prior to the opportunity to eat respond in some way to that by perceiving a time of 'plenty', and so feel less urge to 'stuff themselves' than the people who weren't exposed to food before they had the opportunity to eat.
Good thoughts.
I seem to recall other evo-psych theorizing that the reason people respond so strongly to food cues is that the feast/famine cycle dictates that people eat as much as they can when food is available, so as to be able to survive when food is scarce. Not sure how this would intersect with rvman's evo-psych thinking. Of course, one of the things we love about evo-psych is its fundamental untestability. :)
At this point, I find it a bit difficult to even conjecture what the hell's going on, so I applaud your efforts.
Well, I guess one way to consider the "train to resist" vs. "inure" hypotheses would be to see whether exposure to one food protects you against wanting a very different food in the later eating opportunity. Does exposure to chocolate candy have the same effect when the food later encountered is lasagna instead of M&M's? I would expect the "train to resist" effect to be more general and the "inure" effect to be more specific.
To rvman's idea - perhaps the experimenter could manipulate the sheer quantity of food to which the participants are originally exposed. It sounds to me like the "perceiving a time of plenty" hypothesis would be stronger when you see a huge amount of food vs. just some.
I wonder if the real candy situation is just a "taste" of the weird phenomenon of losing your appetite when exposed to the smells and dealing with food for a long time before you eat it. I remember this happening one Thanksgiving when Jennifer and I spent 5 hours preparing the meal, but neither of us was hungry when it was time to eat. However, it didn't last long. The leftovers tasted really good the next day.