Language constraints limited recommendation quality
Limited NLP tooling made personalization difficult in low-resource languages.
We adapted transformer-based recommenders
A transformer-based recommender architecture was adapted and tuned to work within these language constraints.
Transformer-based news recommendation model
An NRMS-style architecture was implemented, using multi-head self-attention to build article representations from words and user representations from browsed articles.
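To make the mechanism concrete, here is a minimal NumPy sketch of the two attention stages an NRMS-style news encoder uses: self-attention to contextualize word embeddings, then additive attention pooling to collapse them into one article vector. All names, weights, and dimensions are illustrative placeholders, not the production implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention over a token sequence X: (seq_len, d).
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

def additive_pool(H, q):
    # Attention pooling: score each token against a learned query q,
    # then take the weighted sum to get a single vector.
    w = softmax(H @ q)
    return w @ H

rng = np.random.default_rng(0)
d = 16
tokens = rng.normal(size=(10, d))            # word embeddings of one article
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
H = self_attention(tokens, Wq, Wk, Wv)       # contextualized word vectors
article_vec = additive_pool(H, rng.normal(size=d))
print(article_vec.shape)                      # (16,)
```

In the full model, the same pattern is applied a second time at the user level: each browsed article's vector plays the role of a "token", and attention pooling over them yields the user representation.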
~30% gains in coverage and serendipity
Recommendation coverage and serendipity improved by roughly 30%, while ranking performance remained stable.