Nando de Freitas 🏳️‍🌈(@NandoDF) 's Twitter Profile
Nando de Freitas 🏳️‍🌈

@NandoDF

I research intelligence to understand what we are, and to harness it wisely. I lead a wonderfully creative AI team at @GoogleDeepMind who inspire me everyday.

ID:29843511

Link: https://scholar.google.com/citations?user=nzEluBwAAAAJ&hl=en · Joined: 08-04-2009 22:41:09

10.1K Tweets

96.6K Followers

656 Following

Martín Alcalá Rubí(@draix) 's Twitter Profile Photo

“Uruguayan AI startup was chosen by Google from among thousands of companies to join its U.S. accelerator.”

Thanks, Antonio Larronda, for the article! /cc BrainLogic AI Zapia

elpais.com.uy/el-empresario/….

Nando de Freitas 🏳️‍🌈(@NandoDF) 's Twitter Profile Photo

Throughout history, changes in human expression have often been met with severe criticism, e.g. fauvism, photography, rock and roll, and so on.

I can’t wait to see what the *artists of the future* will say and do with generative AI. Whether we like it or not, this is only the…

Aran Komatsuzaki(@arankomatsuzaki) 's Twitter Profile Photo

ControlNet++: Improving Conditional Controls
with Efficient Consistency Feedback

Proposes an approach that improves controllable generation by explicitly optimizing pixel-level cycle consistency

proj: liming-ai.github.io/ControlNet_Plu…
abs: arxiv.org/abs/2404.07987

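The pixel-level cycle consistency idea can be sketched in a few lines: feed the generated image to a discriminative model (e.g. a segmenter), and penalize any mismatch between the condition it extracts and the condition the generation was asked to follow. A minimal, hypothetical illustration (the function and tensor names are mine, not the paper's; random tensors stand in for real model outputs):

```python
import torch
import torch.nn.functional as F

def pixel_cycle_consistency_loss(generated_seg_logits, input_condition):
    # generated_seg_logits: (B, C, H, W) logits from a discriminative model
    # (e.g. a segmenter) applied to the *generated* image.
    # input_condition: (B, H, W) integer map the generation was conditioned on.
    # Low loss means the generation is faithful to its conditioning signal.
    return F.cross_entropy(generated_seg_logits, input_condition)

# Toy example: 2 images, 5 segmentation classes, 8x8 resolution.
logits = torch.randn(2, 5, 8, 8)             # segmenter output on generations
condition = torch.randint(0, 5, (2, 8, 8))   # conditioning segmentation maps
loss = pixel_cycle_consistency_loss(logits, condition)
```

In training this scalar would be backpropagated into the controllable generator alongside the usual diffusion objective.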
khipu.ai(@Khipu_AI) 's Twitter Profile Photo

A whole year has passed since our last KHIPU in Montevideo🇺🇾. Check out our best memories in this video.

And stay tuned for more exciting news about 🥁🥁 KHIPU 2025 🥁🥁

youtube.com/watch?v=AgkARA…

AK(@_akhaliq) 's Twitter Profile Photo

Google presents RecurrentGemma

Moving Past Transformers for Efficient Open Language Models

We introduce RecurrentGemma, an open language model which uses Google's novel Griffin architecture. Griffin combines linear recurrences with local attention to achieve excellent

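The "linear recurrences with local attention" combination can be sketched in toy form. This is an illustrative simplification of the two ingredients, not the actual Griffin block (which uses a more elaborate gated recurrence and multi-query attention):

```python
import torch

def linear_recurrence(x, a):
    # x: (T, D) inputs; a: (T, D) per-step decay gates in [0, 1).
    # h_t = a_t * h_{t-1} + (1 - a_t) * x_t  -- a simplified gated linear
    # recurrence; cost is O(T) in sequence length, with O(1) state.
    h = torch.zeros(x.shape[1])
    out = []
    for t in range(x.shape[0]):
        h = a[t] * h + (1 - a[t]) * x[t]
        out.append(h)
    return torch.stack(out)

def local_attention_mask(T, window):
    # Each position attends only to itself and the previous `window - 1`
    # positions, so attention cost stays linear in sequence length.
    i = torch.arange(T).unsqueeze(1)
    j = torch.arange(T).unsqueeze(0)
    return (j <= i) & (j > i - window)
```

With all gates at zero the recurrence passes inputs through unchanged; larger gates blend in more history.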
AI at Meta(@AIatMeta) 's Twitter Profile Photo

Today we’re releasing OpenEQA — the Open-Vocabulary Embodied Question Answering Benchmark. It measures an AI agent’s understanding of physical environments by probing it with open vocabulary questions like “Where did I leave my badge?”

More details ➡️ go.fb.me/7vq6hm…

Anna(@AnnaPanArt) 's Twitter Profile Photo

People worry about AI stealing jobs, but that's not the point here. Don't you see? The world as we know it is gone.
All resources (including art) are decentralized and will be abundant.
What you should really worry about is finding new meaning in life, once you have everything.

Science Magazine(@ScienceMagazine) 's Twitter Profile Photo

The evolution of the nervous system may have followed multiple paths and arisen independently in two early lineages of animals, according to a Science study in comb jellies.

Check out that research from last year: scim.ag/6yy #ScienceMagArchives
Aran Komatsuzaki(@arankomatsuzaki) 's Twitter Profile Photo

RULER: What's the Real Context Size of Your Long-Context Language Models?

- new task categories, multi-hop tracing and aggregation, to test behaviors beyond searching from context
- all models exhibit large performance drops as the context length increases

arxiv.org/abs/2404.06654

Google DeepMind(@GoogleDeepMind) 's Twitter Profile Photo

With @GoogleCloud, we’re releasing CodeGemma & RecurrentGemma: two new additions to the Gemma family of lightweight, state-of-the-art open models.

These focus on empowering developers and researchers, helping them to build responsibly. → dpmd.ai/4apiG4j #GoogleCloudNext
Aran Komatsuzaki(@arankomatsuzaki) 's Twitter Profile Photo

Scaling Laws for Data Filtering -- Data Curation cannot be Compute Agnostic

Argues that data curation cannot be agnostic of the total compute that a model will be trained for

repo: github.com/locuslab/scali…
abs: arxiv.org/abs/2404.07177

Min Choi(@minchoi) 's Twitter Profile Photo

This is wild.

Udio just dropped and it's like Sora for music.

The music is insane quality, 100% AI. 🤯

1. 'Dune the Broadway Musical'

Aran Komatsuzaki(@arankomatsuzaki) 's Twitter Profile Photo

Google presents Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention

1B model that was fine-tuned on up to 5K sequence length passkey instances solves the 1M length problem

arxiv.org/abs/2404.07143

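The core trick behind "infinite" context here is a fixed-size compressive memory updated as segments stream past, so old context is never discarded outright. Below is a minimal sketch of one common reading of such a memory — a linear-attention-style associative matrix — with illustrative names of my own, not the paper's exact formulation:

```python
import torch
import torch.nn.functional as F

def memory_update(M, z, k, v):
    # Bind new key-value pairs into a fixed-size associative memory.
    # M: (d_k, d_v) binding matrix; z: (d_k,) normalizer;
    # k: (n, d_k) keys; v: (n, d_v) values for the current segment.
    phi_k = F.elu(k) + 1          # positive feature map
    M = M + phi_k.T @ v           # memory size is constant in sequence length
    z = z + phi_k.sum(dim=0)
    return M, z

def memory_read(M, z, q):
    # Retrieve values for queries (m, d_k) from the compressed memory.
    phi_q = F.elu(q) + 1
    return (phi_q @ M) / (phi_q @ z).unsqueeze(-1)

d_k, d_v = 8, 16
M, z = torch.zeros(d_k, d_v), torch.zeros(d_k)
M, z = memory_update(M, z, torch.randn(32, d_k), torch.randn(32, d_v))
out = memory_read(M, z, torch.randn(4, d_k))
```

Local attention within the current segment would then be combined with these memory reads, which is what lets a model fine-tuned on short sequences generalize to much longer ones.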
Matt Bornstein(@BornsteinMatt) 's Twitter Profile Photo

[Announcement] We're leading the seed round for udio, a new AI music app launching today in public beta.

Go to udio.com and try it. You will be blown away by the music you can create - melodic, coherent, creative & high fidelity.

Thoughts & samples below 👇

Alan Karthikesalingam(@alan_karthi) 's Twitter Profile Photo

Our new work in Nature Medicine

Using generative models in pathology, radiology and dermatology - we can create synthetic training data to ⬆️ AI fairness & robustness, including for underrepresented groups. Work w/ a great team at Google DeepMind, Google AI and Google Health

nature.com/articles/s4159…

udio(@udiomusic) 's Twitter Profile Photo

Introducing Udio, an app for music creation and sharing that allows you to generate amazing music in your favorite styles with intuitive and powerful text-prompting.

1/11

Nicolas Papernot(@NicolasPapernot) 's Twitter Profile Photo

One of the key arguments Somesh made in his keynote relied on the work by Boaz Barak 's group on the impossibility of watermarking in generative ML: arxiv.org/abs/2311.04378

Pier Giuseppe Sessa(@piergsessa) 's Twitter Profile Photo

Excited about the fantastic collaboration with the Griffin team on post-training! The released RecurrentGemma 2B model is a competitive but higher-throughput version of Gemma 2B.
