Wednesday, October 08, 2014

The Perils of Social Commitment



Many behavioral scientists and persuasive designers recommend social commitment as a motivational tool.  Just this week, I read this in the Social Media Examiner and this in Beeminder.  Social commitment is the idea that if I make a public commitment to do something (e.g., quit smoking), then I am more likely to follow through.  Because not only do I fail myself, I also look bad in front of all of my friends.  That pressure is supposed to be the extra incentive I need to keep at it.

It can work some of the time and in some contexts.  But what bugs me is that many of the advocates seem to think it works for all people, all the time.  There is ample evidence that these strategies can backfire.  There is an effect (that I haven’t heard a great term for, so feel free to suggest one – I have used “entitlement indulgence” among others) in which the act of announcing your intention makes you feel like you have made progress toward your goal.  So you feel entitled to ease up and/or reward yourself with something you don’t deserve, because you haven’t really done anything yet.

This is a huge challenge in gamification, which is why Markus and I are dedicating an entire chapter in our book to leveling up, which we are defining as creating a process in between finishing one major step of an activity and starting the next one.  You have to get your user into that next step smoothly or you run the risk of entitlement indulgence. 

Another problem with social commitment is the risk of negative social feedback.  If you make a public commitment to quit smoking and then experience a setback, it is possible that your social network will try to buck you up.  But it is also possible that you will get criticism.  This negative feedback can be demotivating and make it harder to get back on track.  The tighter your social network is, the more powerful that social feedback is – whether it is positive or negative.  So everything we like about social feedback is also everything we dislike about it.

So let’s look at this as an opportunity rather than a problem.  If we rely on the social network for the commitment contract, then we are at the mercy of the social network and how they frame their comments, feedback, and opinions.  On the other hand, we know that people anthropomorphize AI much more than we care to admit.  The movie Her is really not that far off from reality.  How many times have you yelled at the voice on your GPS?  So what if we set up the GX so that the public commitment is made to the user’s teammate avatar, whose responses we fully control?  We can make sure that all feedback is positively framed, encouraging, and designed to get the user back on track to their target behavior (e.g., quitting smoking).
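To make the idea concrete, here is a minimal sketch of what a commit-to-avatar feedback module might look like.  The class name, event labels, and message templates below are all illustrative assumptions on my part, not part of any existing system; the point is simply that when we own the avatar, we can guarantee every response is supportive rather than critical.

```python
import random

# Hypothetical sketch of a teammate-avatar feedback module.
# The class name, event labels, and message templates are
# illustrative assumptions, not an existing API.
class TeammateAvatar:
    """Receives the user's commitment and always responds with
    positively framed, encouraging feedback."""

    SETBACK_RESPONSES = [
        "One slip doesn't erase your progress on '{goal}'. Let's pick it back up tomorrow.",
        "Setbacks are part of the process. You're still closer to '{goal}' than when you started.",
    ]
    PROGRESS_RESPONSES = [
        "Nice work! Another step toward '{goal}'.",
        "You're building momentum on '{goal}'. Keep going.",
    ]

    def __init__(self, goal: str):
        # The commitment is made to the avatar, not to the social network.
        self.goal = goal

    def respond(self, event: str) -> str:
        # Every branch returns encouragement; criticism is never generated.
        templates = (
            self.SETBACK_RESPONSES if event == "setback" else self.PROGRESS_RESPONSES
        )
        return random.choice(templates).format(goal=self.goal)


avatar = TeammateAvatar(goal="quit smoking")
print(avatar.respond("setback"))   # always supportive, never critical
print(avatar.respond("progress"))
```

Because we control the avatar end to end, the demotivating criticism that a real social network might deliver after a setback simply never reaches the user.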