Doctoral Thesis: Algorithmic Interactions With Strategic Users: Incentives, Interplay, and Impact

Tuesday, June 20
10:30 am - 12:00 pm

32-G882 (Hewlett Room)

Alireza Fallah

Abstract:

The societal challenges posed by machine learning algorithms are becoming increasingly important, and to effectively study them, it is crucial to incorporate the incentives and preferences of users into the design of algorithms. Privacy concerns serve as a notable example, as the success of machine learning algorithms heavily relies on the collection and utilization of vast amounts of user data. However, users are growing increasingly concerned about the extent of data collected by businesses and the potential misuse of their personal information. 

This talk presents frameworks for studying the interactions between a platform and strategic users in the data acquisition problem. The central objective of the platform is to estimate a parameter of interest by collecting users’ data. However, users demand privacy guarantees or monetary compensation in exchange for sharing their information.

In the first part of the talk, we formulate this question as a Bayesian-optimal mechanism design problem, in which an individual can share her data in exchange for a monetary reward but, at the same time, has a private, heterogeneous privacy cost, which we quantify using differential privacy. We consider two popular data market architectures: central and local. In both settings, we establish minimax lower bounds for the estimation error and derive (near-)optimal estimators for any given profile of users’ heterogeneous privacy loss levels. Next, we pose the mechanism design problem as the optimal selection of an estimator and payments that elicit truthful reporting of users’ privacy sensitivities. We further develop efficient algorithmic mechanisms to solve this problem in both privacy settings.
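To make the central/local distinction concrete, here is a minimal illustrative sketch (not the thesis's actual estimators) of mean estimation under the two architectures with heterogeneous privacy levels. In the central model the platform sees raw data and adds calibrated noise once; in the local model each user perturbs her own report before sending it. All data ranges, epsilon values, and the conservative choice of calibrating central noise to the strictest user are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def central_mean(data, eps_levels):
    """Central model: the platform sees raw data in [0, 1] and adds
    Laplace noise once. The mean's sensitivity to any one user is 1/n;
    as a conservative illustration, noise is calibrated to the
    smallest epsilon (the strictest privacy demand)."""
    n = len(data)
    eps = min(eps_levels)
    sensitivity = 1.0 / n  # one user can move the mean by at most 1/n
    return float(np.mean(data) + rng.laplace(scale=sensitivity / eps))

def local_mean(data, eps_levels):
    """Local model: each user perturbs her own value (sensitivity 1)
    with Laplace noise scaled to her personal epsilon, then the
    platform averages the noisy reports."""
    noisy = [x + rng.laplace(scale=1.0 / e) for x, e in zip(data, eps_levels)]
    return float(np.mean(noisy))

# Hypothetical population: values in [0, 1], heterogeneous privacy demands.
data = rng.uniform(size=1000)
eps_levels = rng.uniform(0.5, 2.0, size=1000)

print("true mean   :", np.mean(data))
print("central est.:", central_mean(data, eps_levels))
print("local est.  :", local_mean(data, eps_levels))
```

Running this shows the qualitative gap the minimax bounds formalize: the central estimate sits very close to the true mean, while the local estimate carries substantially more noise for the same privacy levels.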

In the second part, we focus on the impact of data market architecture on user privacy. In particular, we first compare the central and local architectures from both the users’ and the platform’s points of view. Moreover, we establish that, when the platform makes no payments, and over the space of all privacy-preserving data mechanisms, a shuffling-based mechanism, in which the platform observes users’ data only after they have been shuffled, is optimal from the users’ point of view. Finally, we briefly discuss the impact of adding such a shuffler to the data market architecture on the users’ utility.
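A shuffling-based mechanism can be sketched in a few lines (again as an illustration under assumed parameters, not the construction analyzed in the thesis): each user applies a local randomizer to her bit, a trusted shuffler uniformly permutes the reports so the platform cannot link values to identities, and the platform debiases the shuffled reports to estimate the population mean.

```python
import numpy as np

rng = np.random.default_rng(1)

def randomized_response(bit, eps):
    """Standard local randomizer: report the true bit with probability
    e^eps / (1 + e^eps), otherwise flip it."""
    keep = rng.random() < np.exp(eps) / (1.0 + np.exp(eps))
    return bit if keep else 1 - bit

def shuffled_reports(bits, eps):
    """The shuffler uniformly permutes the randomized reports before
    the platform sees them, severing the user-to-value link."""
    reports = np.array([randomized_response(b, eps) for b in bits])
    return rng.permutation(reports)

def estimate_mean(reports, eps):
    """Debias: E[report] = mean*(2p-1) + (1-p) with p = e^eps/(1+e^eps),
    so invert that affine map to recover the population mean."""
    p = np.exp(eps) / (1.0 + np.exp(eps))
    return float((np.mean(reports) - (1.0 - p)) / (2.0 * p - 1.0))

bits = rng.binomial(1, 0.3, size=5000)  # hypothetical private bits
reports = shuffled_reports(bits, eps=1.0)
print("estimate:", estimate_mean(reports, eps=1.0))
```

The platform only ever sees the permuted array, which is the sense in which the shuffler improves users' privacy without changing the information available for estimation.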


Thesis Supervisor(s): Prof. Asu Ozdaglar

Zoom link available upon request.