Two people meet and discover they both love "transformers," but one is thinking of robots in disguise while the other is contemplating mathematical functions that map vector spaces. The equations shown (Q_n, K_h, V_h, and a_nm) come from the attention mechanism in machine learning transformers, the architecture powering modern AI systems like ChatGPT. Meanwhile, Optimus Prime is busy saving the world from Decepticons. Dating tip for mathematicians: specify which transformers you're excited about. It saves awkward moments when you start explaining self-attention matrices and they were expecting a discussion about Bumblebee.
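For the mathematically inclined half of that conversation, here is a minimal sketch of what those symbols refer to: single-head scaled dot-product self-attention, where queries Q, keys K, and values V are projections of the input and a_nm is the weight with which position n attends to position m. The matrix names, dimensions, and random projections below are illustrative assumptions, not taken from any particular model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention (a sketch).

    X: (seq_len, d_model) input embeddings.
    W_q, W_k, W_v: (d_model, d_k) projection matrices (random here, learned in practice).
    Returns the attended output and the attention weights a[n, m].
    """
    Q = X @ W_q                      # queries
    K = X @ W_k                      # keys
    V = X @ W_v                      # values
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query with each key
    a = softmax(scores, axis=-1)     # attention weights a_nm; each row sums to 1
    return a @ V, a

# Toy usage: 4 tokens, 8-dimensional embeddings, one 8-dimensional head.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out, a = self_attention(X, W_q, W_k, W_v)
print(a.round(2))  # row n shows how much token n attends to every other token
```

No Decepticons were harmed in the computation of those weights.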