AI Summary
This article discusses using contextual embeddings from large language models to capture the linguistic information that carries thoughts from one person's brain to another during natural, real-time conversations. This shared, model-based linguistic space has implications for understanding how language is processed and shared across individuals during communication.
Zada et al. use contextual embeddings from large language models to capture linguistic information transmitted from the speaker’s brain to the listener’s brain in real-time, dyadic conversations.
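As a rough illustration of the kind of contextual embeddings the summary refers to, the sketch below extracts per-token hidden states from a pretrained language model for a transcribed speaker turn. This is an assumption-laden example using the Hugging Face `transformers` library with GPT-2; the specific model, layer, and preprocessing used by Zada et al. may differ.

```python
# Minimal sketch: contextual embeddings for a transcribed conversation turn.
# Assumes GPT-2 via Hugging Face `transformers`; not the authors' exact pipeline.
import torch
from transformers import GPT2Tokenizer, GPT2Model

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2", output_hidden_states=True)
model.eval()

def contextual_embeddings(utterance: str, layer: int = 8) -> torch.Tensor:
    """Return one contextual embedding per token from a chosen hidden layer."""
    inputs = tokenizer(utterance, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # hidden_states is a tuple of (num_layers + 1) tensors of shape (1, seq_len, 768)
    return outputs.hidden_states[layer].squeeze(0)

# Hypothetical speaker turn from a dyadic conversation transcript.
embeddings = contextual_embeddings("I think we should meet on Friday afternoon.")
print(embeddings.shape)  # (num_tokens, 768)
```

In analyses like the one summarized here, such embeddings are typically aligned in time with each word of the conversation and related to the speaker's and listener's brain activity; that alignment and modeling step is not shown in this sketch.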