Ask HN: Why are LLM UIs so slow?
I love Claude Sonnet, but the user interface is super slow. After chatting for a while, it becomes so sluggish that it's even hard to scroll.
To get around this, I tried OpenRouter's chat interface, but that is painfully slow too. I'm now trying Gemini 2.5 in Google AI Studio, and it is also slow.
What is the underlying reason for this? I understand the backend takes a lot of computation, but why the frontend?
by lukejkwarren on 4/2/2025, 1:03:22 PM
It's a big bummer with reasoning models, although they are improving a lot. I experimented with various reasoning models for my AI security scanner product but found the performance to be far too slow.