Goran Glavaš (@gg42554)'s Twitter Profile
Goran Glavaš

@gg42554

Professor for #NLProc @Uni_WUE.

ID: 382302464

Link: https://sites.google.com/view/goranglavas
Joined: 29-09-2011 20:48:47

381 Tweets

994 Followers

261 Following

Goran Glavaš (@gg42554)'s Twitter Profile Photo

Great work by Andreea Iana, providing the first truly multilingual news recommendation dataset to date and enabling research on multilingual and cross-lingual news recommendation!

Andreea Iana (@iana_andreea)'s Twitter Profile Photo

Happy for the opportunity to talk about xMIND, our new #multilingual #news #dataset, joint work w/ Goran Glavaš & Heiko Paulheim, at the Academic Speed Dating event organized by the Mannheim Center for Data Science!

Check out xMIND at: github.com/andreeaiana/xM…

Gregor Geigle (@GregorGeigle)'s Twitter Profile Photo

Pretty neat, but it would be great if they compared with (and cited) mBLIP, an open multilingual LVLM predating this by half a year: arxiv.org/abs/2307.06930.
Also, maybe evaluate on established benchmarks like IGLUE, MAXM, or XM3600 to compare with prior work?

CambridgeLTL (@CambridgeLTL)'s Twitter Profile Photo

📢New Preprint
🤔️Balancing the tradeoff between in-task performance and model calibration: thrilled to share our findings on self-ensembled in-context learning, which enhances⬆️ both aspects! 🎉

📄Paper: arxiv.org/abs/2312.13772
📎Code: github.com/cambridgeltl/e…
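To make the idea concrete, here is a minimal sketch of what self-ensembling in-context predictions could look like (my own illustration, not the paper's code; the per-prompt probabilities below are made up):

```python
# Hedged sketch, not the paper's implementation: average the label
# distributions an LLM assigns under several different in-context prompts.
# Averaging tends to smooth over-confident single-prompt predictions,
# which is one route to better calibration.
import numpy as np

def self_ensemble(prob_vectors):
    """prob_vectors: list of per-prompt probability vectors over the label set."""
    return np.mean(np.stack(prob_vectors), axis=0)

# Toy example with three prompts for a binary task (numbers are made up):
print(self_ensemble([np.array([0.90, 0.10]),
                     np.array([0.60, 0.40]),
                     np.array([0.70, 0.30])]))  # -> [0.7333... 0.2666...]
```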

Goran Glavaš (@gg42554)'s Twitter Profile Photo

With LLMs, it's easier than ever to do solid NLP for standard languages, BUT can your LLM *reason* in micro- and nano-dialects? Check out our DIALECT-COPA shared task for more details. w/ Nikola Ljubešić and Ivan Vulić
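For readers unfamiliar with the task format, here is an illustrative COPA-style item (an English stand-in written for illustration, not taken from the shared task data):

```python
# Illustrative COPA-style item (an English placeholder I wrote for illustration;
# DIALECT-COPA poses such items in dialectal varieties): given a premise, the
# model must pick the more plausible cause or effect of the two alternatives.
item = {
    "premise": "The man lost his balance on the ladder.",
    "question": "effect",
    "choice1": "He fell to the ground.",
    "choice2": "He climbed higher.",
    "label": 0,  # choice1 is the more plausible effect
}
```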

Heiko Paulheim (@heikopaulheim)'s Twitter Profile Photo

For research work in news recommendation, we are looking for people with skills in the following languages: Guarani, Haitian Creole, Quechua, Somali, Thai, Indonesian, Swahili, Tamil, Georgian, Vietnamese, Japanese. Get in touch with Andreea Iana. Please RT!

Gregor Geigle (@GregorGeigle)'s Twitter Profile Photo

LLaVA 1.5 struggles with other languages. That's why we need more multilingual multimodal models like our mBLIP (arxiv.org/abs/2307.06930).

Also, why F1 and chrF++ and not the established accuracy and CIDEr? It just makes comparison harder.

Goran Glavaš (@gg42554)'s Twitter Profile Photo

Great effort by Andreea Iana! An amazing resource & codebase for anyone working on news recommendation. Easy to train SotA models and comparatively evaluate and ablate them (true apples-to-apples comparisons)!

Goran Glavaš (@gg42554)'s Twitter Profile Photo

We (WüNLP, Universität Würzburg #UniWürzburg) are hiring! Looking for a postdoc to work on LLM alignment.

More info at tinyurl.com/5n7xy9c9 and tinyurl.com/2murbute

Come join our young and energetic team in Würzburg, one of the prettiest and most livable cities in Germany!

Goran Glavaš (@gg42554)'s Twitter Profile Photo

Writing the EMNLP rebuttals. I'm now convinced (also after having served for a year as EiC for ACL Rolling Review) that nothing short of publicly releasing reviews *with reviewer identities* will substantially improve the (currently appalling) average review quality.

Goran Glavaš (@gg42554)'s Twitter Profile Photo

Great work by Gregor Geigle! SoTA massively multilingual Vision-and-Language models obtained with a few days of training on consumer-grade GPUs.

Chia-Chien Hung (@cc_hung_)'s Twitter Profile Photo

Happy to share that I presented our work on Demographic Adaptation at the Insights Workshop today at EACL2023.
🙌Thanks to my co-authors:
Anne Lauscher (she/her), Dirk Hovy, Goran Glavaš, @spponzius
📖Check here: aclanthology.org/2023.findings-…
(1/2)

Fabian David Schmidt (@fdschmidt)'s Twitter Profile Photo

Gladly sharing that our work with Goran Glavaš and Ivan Vulić has been accepted to #acl2023. Takeaways: (1) averaging training checkpoints outperforms conventional model selection for cross-lingual transfer; (2) averaging checkpoints from multiple runs brings further gains if heads are aligned.
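To illustrate takeaway (1), here is a minimal checkpoint-averaging sketch in PyTorch; the file names are hypothetical and this is not the authors' codebase.

```python
# Minimal sketch (not the authors' code): average the parameters of several
# fine-tuning checkpoints instead of selecting a single "best" checkpoint.
import torch

def average_checkpoints(paths):
    """Average the tensors of the state dicts stored at `paths` (hypothetical files)."""
    avg = None
    for path in paths:
        state = torch.load(path, map_location="cpu")
        if avg is None:
            avg = {k: v.clone().float() for k, v in state.items()}
        else:
            for k, v in state.items():
                avg[k] += v.float()
    return {k: v / len(paths) for k, v in avg.items()}

# Usage (hypothetical checkpoint files saved during fine-tuning):
# model.load_state_dict(average_checkpoints(["ckpt_1.pt", "ckpt_2.pt", "ckpt_3.pt"]))
```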

Heiko Paulheim (@heikopaulheim)'s Twitter Profile Photo

Still training with explicit user models? Check our SIGIR 2024 paper 'Simplifying Content-Based Neural News Recommendation: On User Modeling and Training Objectives' to learn about alternatives! arxiv.org/abs/2304.03112 Andreea Iana Goran Glavaš Data and Web Science Group
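As a hedged illustration of what dropping the explicit user encoder can look like, the sketch below scores a candidate article directly against the user's clicked-news embeddings (my own example, not the paper's code).

```python
# Hedged sketch (my illustration, not the paper's implementation): score a
# candidate news embedding against the user's clicked-news embeddings via a
# simple mean of dot products ("late fusion"), instead of a parametric user encoder.
import torch

def late_fusion_score(candidate_emb: torch.Tensor, clicked_embs: torch.Tensor) -> torch.Tensor:
    """candidate_emb: (d,); clicked_embs: (n_clicks, d)."""
    return (clicked_embs @ candidate_emb).mean()

# Toy usage with random embeddings (dimension 128 is arbitrary):
candidate = torch.randn(128)
clicks = torch.randn(10, 128)
print(late_fusion_score(candidate, clicks).item())
```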
