Check out all the details and more findings here:
arxiv.org/abs/2503.22508
I will also present this work at the #IR4Good session at #ECIR2025 next week, alongside many other contributions from
@irglasgow.bsky.social. Looking forward to continuing these discussions at #SIGIR2025.
Posts by Andreas Chari
Then, we try to zero-shot these fine-tuned models on other language pairs. Some are related to French-Catalan, such as Occitan, and some are entirely unrelated, such as Mandarin.
We see that the gains do transfer to Occitan-French pairs and to Cantonese-Mandarin pairs (more details in the paper)
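Not from the paper, but a toy sketch of the intuition behind this kind of transfer: closely related languages share far more surface forms than unrelated ones, so robustness learned on one can carry over to its relatives. The example sentences and the character n-gram Jaccard measure below are my own illustrative assumptions, not the paper's method.

```python
# Toy illustration: lexical overlap between a related and an unrelated
# language pair, measured as Jaccard similarity over character trigrams.

def char_ngrams(text, n=3):
    """Return the set of character n-grams of a lowercased string."""
    text = text.lower()
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def overlap(a, b, n=3):
    """Jaccard similarity between the n-gram sets of two strings."""
    ga, gb = char_ngrams(a, n), char_ngrams(b, n)
    return len(ga & gb) / len(ga | gb) if ga | gb else 0.0

# Hypothetical example sentences (roughly "information retrieval in
# minoritised/low-resource languages"):
catalan = "la recuperacio d'informacio en llengues minoritzades"
occitan = "la recuperacion d'informacion en lengas minorizadas"
mandarin = "低资源语言的信息检索"

sim_related = overlap(catalan, occitan)      # substantial trigram overlap
sim_unrelated = overlap(catalan, mandarin)   # essentially no overlap
```

The related pair shares many trigrams while the unrelated pair shares almost none, which is one surface-level reason why zero-shot transfer works between relatives like Catalan and Occitan but not between Catalan and Mandarin.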
We fine-tuned them on Catalan queries and French documents and found that we can regularise the models to be more robust on Catalan (with some gains on French too!)
What will happen if you fine-tune neural rankers such as BGE-M3 and ColBERT-XM on low-resource queries and high-resource documents of two different (albeit related) languages?
Will it help regularise the rankers to exploit these similarities?
We translated five collections of mMARCO into similar languages and evaluated retrieval methods based on how well they would perform if the queries were expressed in a similar low-resource language.
It turns out they do not perform very well (an understatement).
🚨 New Pre-Print!🚨 with @macavaney.bsky.social &
@iadhounis.bsky.social. Stop using "translate-train" for all your multilingual needs. We explore zero-shot transfer for low-resource languages... 🧵
🇮🇹🇮🇹🇮🇹
I am happy to share that our work with @macavaney.bsky.social and @iadhounis.bsky.social , "Improving Low-Resource Retrieval Effectiveness using Zero-Shot Linguistic Similarity Transfer", has been accepted to the #ecir2025 "IR for Good" track.
Demo paper entitled “FinPersona: An LLM-Driven Conversational Agent for Personalized Financial Advising” has been accepted to #ecir2025 - output of a collaboration with University of Tokyo with T Takayanagi, M Suzuki, K Izumi, R McCreadie, @javiersanzcruza.bsky.social and @iadhounis.bsky.social
#IR4Good paper entitled “Fair Exposure Allocation Using Generative Query Expansion” has been accepted to #ecir2025 - work by @tjaenich.bsky.social @grahammcdonald.bsky.social and @iadhounis.bsky.social
Happy to share that our paper "Improving novelty and diversity of nearest-neighbors recommendation by exploiting dissimilarities" with @psperez.bsky.social and @abellogin.bsky.social has been accepted to #ECIR2025 IR4Good track!
Full length paper entitled “One size doesn’t fit all: Predicting the Number of Examples for In-Context Learning” has been accepted at #ecir2025 - work by Manish Chandra, @gdebasis.bsky.social and @iadhounis.bsky.social
Full length paper entitled “A Multi-modal Recipe for Improved Multi-domain Recommendation” has been accepted at #ecir2025 - work by @zixuanyi.bsky.social and @iadhounis.bsky.social
#IR4Good paper entitled “Improving Low-Resource Retrieval Effectiveness using Zero-Shot Linguistic Similarity Transfer” has been accepted to #ecir2025 - work by @andreaschari.bsky.social @macavaney.bsky.social and @iadhounis.bsky.social
Tomorrow, in our weekly reading group series, we will be discussing the recent Nature paper entitled “Detecting hallucinations in large language models using semantic entropy” by Farquhar et al. #IRGlasgowReadingGroup
www.nature.com/articles/s41...
Today, Aixin Sun from the Nanyang Technological University is giving an #IRTalk entitled "Understanding and Evaluating Recommender Systems from a User Perspective". Details: samoa.dcs.gla.ac.uk/events/viewt...
@uofgcompsci.bsky.social
@irglasgow.bsky.social
At 15:00 on 25th November, Jianling Wang from Google DeepMind will give an #IRTalk entitled "When LLMs Meet Recommendations: Scalable Hybrid Approaches to Enhance User Experiences". Details: samoa.dcs.gla.ac.uk/events/viewt...
@uofgcompsci.bsky.social
@irglasgow.bsky.social
Hello everyone! Happy to be here! If you are interested in the research that we do at @irglasgow.bsky.social (IR, RecSys, NLP), I have created a starter pack for you!
Access it here: go.bsky.app/BM6iHbU
I will keep updating it over time as more people in our team join BlueSky!