Dynamic Privacy Choices
Imagine a consumer using a social media service. They use the application to read news or posts from friends. At the same time, the social media app learns about the consumer by analyzing their profile and monitoring their browsing habits. The more actively the consumer uses the service, the more data the app can collect, which often means a better experience for the consumer and higher revenue for the service.
The consumer faces a trade-off: on the one hand, they enjoy the benefits of using social media, such as socializing with friends. On the other hand, the consumer also values their privacy and is concerned about the risk of data leakage, identity theft, or other potential abuses of personal information. If the consumer perceives the privacy cost of using the app as high, they will reduce their activity on the platform.
This paper develops a dynamic game-theoretic model that analyzes the interaction between a consumer’s incentive to protect privacy and a platform’s incentive to collect data. The platform can encourage the consumer to use the service more actively through its choice of privacy policy. The key idea is that when the consumer has little privacy left on a platform, the marginal cost to the consumer of giving up still more privacy is low. Thus, data collection reduces the consumer’s welfare but increases their incentive to keep using the service.
The paper shows that the platform can collect a large amount of data even if the consumer is sensitive to privacy. To do so, the platform initially offers strong privacy protection to encourage the consumer to use the platform; over time, however, it gradually degrades that protection. In the long run, the consumer loses privacy yet maintains a high activity level, and the platform typically offers negligible privacy protection.
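The mechanism can be illustrated with a small numerical sketch. All functional forms below are my own illustrative assumptions, not the paper’s model: the consumer’s per-period payoff is quadratic in activity, the marginal privacy cost falls as the accumulated data stock grows (capturing the idea that the marginal cost of giving up more privacy is low once privacy is already low), and the platform’s protection level declines linearly from high to negligible.

```python
import numpy as np

# Toy dynamics (illustrative assumptions only, not the paper's model).
# Per-period consumer payoff: u(a) = v*a - a^2/2 - (1 - s)*mc(D)*a,
# where a  = activity level chosen by the consumer,
#       s  = privacy protection offered by the platform,
#       D  = data stock already collected,
#       mc(D) = kappa / (1 + D), a marginal privacy cost that falls
#               as more data has already been given up.

v, kappa = 1.0, 2.0      # assumed benefit and privacy-sensitivity parameters
T = 30                   # number of periods
s = np.linspace(0.9, 0.0, T)   # protection degrades from high to negligible

D = 0.0
activity = []
for t in range(T):
    mc = kappa / (1.0 + D)                 # marginal cost of losing more privacy
    a = max(v - (1.0 - s[t]) * mc, 0.0)    # interior optimum of u(a)
    D += (1.0 - s[t]) * a                  # platform accumulates data
    activity.append(a)

print(f"activity: first {activity[0]:.2f}, last {activity[-1]:.2f}; data stock {D:.1f}")
```

In this sketch, early high protection keeps the effective privacy cost small, so the consumer is active and data accumulates; as the data stock grows, the marginal privacy cost shrinks, so activity stays high even after protection falls to zero, echoing the paper’s long-run prediction.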