Poison-Tolerant Collaborative Filtering Against Poisoning Attacks on Recommender Systems

Thar Baker, Tong Li, Jingyu Jia, Baolei Zhang, Chang Tan, Albert Y. Zomaya

Research output: Contribution to journal › Article › peer-review


Personalized recommendation is now ubiquitous: it underpins many online services, such as e-commerce, advertising, and social media applications. Learning unknown user preferences from user-provided data lies at the core of modern collaborative filtering recommender systems. However, malicious attackers have an incentive to manipulate the learned preferences by injecting poisoned data, which can affect business decision making. While previous works have proposed a number of defense methods that succeed in other machine learning (ML) tasks, few are effective for collaborative filtering (CF) under such poisoning attacks. We therefore present a new defense scheme called poison-tolerant collaborative filtering (PTCF), which is highly robust against poisoning attacks on collaborative filtering. Unlike defenses that remove outliers or search for a minimum-loss subset, the PTCF scheme enables collaborative filtering on an attacked training dataset while guaranteeing the system's availability and integrity. We extensively evaluate the PTCF scheme on a public dataset (Jester) and two real-world datasets (Movie and E-Shopping), and demonstrate that it is highly effective in providing robustness.
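The threat model in the abstract can be illustrated with a minimal sketch (not the paper's PTCF algorithm): a toy matrix-factorization collaborative filter is fit on a small rating matrix, then refit after an attacker injects fake users who co-rate a target item highly, shifting a genuine user's predicted rating for that item. All data, function names, and hyperparameters below are hypothetical and chosen only for illustration.

```python
# Illustrative sketch of a poisoning attack on matrix-factorization CF.
# Not the paper's PTCF defense; purely a toy demonstration of the threat.
import numpy as np

def factorize(R, k=2, epochs=200, lr=0.05, reg=0.02, seed=0):
    """Fit R ~ U @ V.T by SGD over the observed (non-NaN) entries."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    U = 0.1 * rng.standard_normal((n_users, k))
    V = 0.1 * rng.standard_normal((n_items, k))
    obs = [(u, i) for u in range(n_users) for i in range(n_items)
           if not np.isnan(R[u, i])]
    for _ in range(epochs):
        for u, i in obs:
            err = R[u, i] - U[u] @ V[i]
            U[u] += lr * (err * V[i] - reg * U[u])
            V[i] += lr * (err * U[u] - reg * V[i])
    return U, V

# Clean data: 4 users rate 3 items (NaN = unobserved); item 2 is disliked.
nan = np.nan
R = np.array([[5, 4, 1],
              [4, 5, 1],
              [5, 5, nan],
              [4, 4, 1]], dtype=float)

U, V = factorize(R)
clean_pred = U[2] @ V[2]      # user 2's predicted rating for item 2

# Poisoning: inject fake users who rate the target item 2 highly
# alongside the popular items, pulling its latent vector upward.
fakes = np.array([[5, 5, 5]] * 3, dtype=float)
Rp = np.vstack([R, fakes])
Up, Vp = factorize(Rp)
poisoned_pred = Up[2] @ Vp[2]

print(clean_pred, poisoned_pred)  # the poisoned prediction is inflated
```

The defenses the abstract contrasts with would either drop the fake rows as outliers or search for a low-loss data subset; the PTCF scheme instead aims to train robustly on the attacked dataset directly.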
Original language: English
Pages (from-to): 1-13
Number of pages: 13
Journal: IEEE Transactions on Dependable and Secure Computing
Publication status: Published - 16 Jan 2024


  • Collaborative filtering
  • Data models
  • Optimization
  • Recommender systems
  • Sparse matrices
  • Task analysis
  • Training
  • poisoning attacks
  • recommender system
  • supervised learning


