ML Engineer · 2024

Safe-Feed MMoE Recommender

Multi-task learning with Mixture-of-Experts routing

Multi-Task Learning · PyTorch · Recommender Systems

The Problem

Content recommender systems often need to optimize for multiple competing objectives simultaneously — engagement, safety, and relevance. Standard shared-bottom architectures suffer from negative transfer, where optimizing one task degrades another.

Approach

I first trained a shared-bottom network as a baseline to quantify the degree of negative transfer between tasks. I then built a Multi-gate Mixture-of-Experts (MMoE) model, which uses task-specific gating networks to route inputs across a shared pool of expert sub-networks, allowing each task to learn its own weighted combination of shared representations.
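A minimal sketch of the MMoE routing described above, in PyTorch. The dimensions, number of experts, and single-logit towers are illustrative assumptions, not the production configuration:

```python
import torch
import torch.nn as nn

class MMoE(nn.Module):
    """Multi-gate Mixture-of-Experts: every task owns a softmax gate
    over a shared pool of expert sub-networks."""
    def __init__(self, input_dim, expert_dim, num_experts, num_tasks):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(input_dim, expert_dim), nn.ReLU())
            for _ in range(num_experts)
        )
        # One gating network per task (the "multi-gate" part).
        self.gates = nn.ModuleList(
            nn.Linear(input_dim, num_experts) for _ in range(num_tasks)
        )
        # Task-specific towers; single logit per task for illustration.
        self.towers = nn.ModuleList(
            nn.Linear(expert_dim, 1) for _ in range(num_tasks)
        )

    def forward(self, x):
        # Stack expert outputs: (batch, num_experts, expert_dim)
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)
        outputs = []
        for gate, tower in zip(self.gates, self.towers):
            weights = torch.softmax(gate(x), dim=-1)  # (batch, num_experts)
            mixed = (weights.unsqueeze(-1) * expert_out).sum(dim=1)
            outputs.append(tower(mixed))
        return outputs  # one logit tensor per task

# Toy usage with assumed sizes
model = MMoE(input_dim=16, expert_dim=8, num_experts=4, num_tasks=2)
preds = model(torch.randn(32, 16))
```

Because each gate produces its own mixture weights, a task can down-weight experts whose gradients conflict with its objective, which is what mitigates negative transfer relative to a single shared bottom.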

Results

  • The shared-bottom baseline learned good features when tasks were complementary
  • The multi-task loss effectively balanced competing gradient signals
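To make the second point concrete, here is a sketch of a weighted multi-task loss over two hypothetical tasks (engagement and safety). The fixed weights and binary labels are assumptions for illustration; in practice the weights might be tuned on validation metrics or learned:

```python
import torch
import torch.nn as nn

# Hypothetical per-task logits and binary labels for a batch of 8.
# requires_grad stands in for logits produced by shared model parameters.
logits_eng = torch.randn(8, 1, requires_grad=True)
logits_safe = torch.randn(8, 1, requires_grad=True)
labels_eng = torch.randint(0, 2, (8, 1)).float()
labels_safe = torch.randint(0, 2, (8, 1)).float()

bce = nn.BCEWithLogitsLoss()

# Fixed weights are an assumption; a safety-critical system might
# instead learn them (e.g. uncertainty weighting) or tune per launch.
w_eng, w_safe = 1.0, 0.5
loss = w_eng * bce(logits_eng, labels_eng) + w_safe * bce(logits_safe, labels_safe)

# Gradients from both objectives flow back through shared parameters.
loss.backward()
```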