Private Recommender Systems: How Can Users Build Their Own Fair Recommender Systems without Log Data?

26 May 2021  ·  Ryoma Sato ·

Fairness is a crucial property of recommender systems. Although some online services have recently adopted fairness-aware systems, many others have not. In this work, we propose methods that enable users to build their own fair recommender systems. Our methods can generate fair recommendations even when the service does not (or cannot) provide a fair recommender system. The key challenge is that a user has access neither to the log data of other users nor to the latent representations of items; this restriction rules out existing methods designed for service providers. The main idea is that a user does have access to the unfair recommendations shown by the service provider. Our methods leverage the outputs of an unfair recommender system to construct a new, fair recommender system. We empirically validate that our proposed method improves fairness substantially without substantially degrading the performance of the original unfair system.
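To make the main idea concrete, here is a minimal sketch of one way a user-side method could post-process a provider's (possibly unfair) ranked list: bucket the recommended items by a sensitive item group and interleave the groups round-robin, equalizing top-k exposure while preserving the provider's within-group order. This is an illustrative assumption, not the paper's exact algorithm; `recommendations`, `item_group`, and `fair_rerank` are hypothetical names.

```python
from collections import deque


def fair_rerank(recommendations, item_group, k):
    """Re-rank a provider's ranked item list so top-k exposure is
    balanced across item groups (illustrative sketch, not the
    paper's method).

    recommendations: items in the provider's ranked order.
    item_group: dict mapping each item to its sensitive group.
    k: length of the fair list to return.
    """
    # Bucket items by group, keeping the provider's order within each group.
    buckets = {}
    for item in recommendations:
        buckets.setdefault(item_group[item], deque()).append(item)

    groups = list(buckets)
    result = []
    turn = 0
    # Cycle over groups round-robin; skip groups that run out of items.
    while len(result) < k and any(buckets.values()):
        g = groups[turn % len(groups)]
        if buckets[g]:
            result.append(buckets[g].popleft())
        turn += 1
    return result
```

For example, if items 1-3 belong to group `a` and items 4-6 to group `b`, the top-4 fair list alternates the two groups instead of reproducing the provider's ordering.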
