Congratulations to Prof. Yongfeng Zhang for receiving the Facebook Faculty Research Award for his project titled "Towards a Sustainable Social Platform based on Explainable and Fairness-aware Recommendation" with an award amount of $75,000.
Social platforms are three-sided marketplaces, consisting of the users, the content producers, and the platform. The content producers create various types of items to be broadcast in the system, including but not limited to text messages, articles, images, video clips, and advertisements. Users interact with these items and with other users for information seeking, entertainment, and productivity. Meanwhile, the platform serves as the essential tool to connect the users with the items they are interested in.
One of the most fundamental approaches to connecting users with the appropriate items is personalization and recommendation, which aims to learn the personalized preferences of the users through advanced machine learning techniques so as to understand their information needs and connect them with the best items. However, most existing methods for personalized recommendation are designed with short-term optimization in mind and do not take the platform's long-term sustainable development into consideration. For example, matching-based or sequence-based recommendation models are optimized over users' previously clicked items so as to predict their future actions for recommendation. This purely click-driven optimization design makes the algorithms vulnerable to several important risks, such as unfair resource allocation, feedback loops and echo chambers, and unexplainable algorithmic decisions.
These problems weaken the platform's sustainable development with respect to each of the three stakeholders: 1) unfair chances of exposure result in the loss of content producers, 2) feedback loops and echo chambers result in narrowed user interests, and 3) lack of explainability and trustworthiness results in sacrificed conversion rates and profit for the platform. This project explores explainable and fairness-aware techniques to address the above problems, including long-term fairness for sustainable recommendation, causal debiasing for mitigating feedback loops and echo chambers, and counterfactual explainable recommendation. Algorithms and insights developed in this project will be integrated into the Facebook recommendation system and thus benefit billions of users.