📄 Paper
The closest linear-time approximation to softmax attention reported to date, built on the Yat kernel.
Jose Miguel Luna, Taha Bouhsine, Krzysztof Choromanski
arXiv:2602.04915
🎤 Talk
Mathematical constraints preventing AI superintelligence, and solutions from
first principles.
Taha Bouhsine
DevFest Brooklyn/Queens
🎤 Talk
Physics-grounded AI: letting information dictate model behavior.
Taha Bouhsine
DevFest Silicon Valley
💬 Panel
AI 2.0 is emerging while humanity operates on outdated ethics.
Taha Bouhsine
DevFest Modesto