You asked, we listened: we've reduced the response time in the AI Chatbot widget!
With the newly implemented live response streaming, you'll see the assistant's reply appear in parts as it's generated, instead of waiting for the entire message to load.
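For anyone curious what this looks like under the hood, here is a minimal sketch of how a chat widget might consume a streamed reply and render it piece by piece. This is an illustration only, not the widget's actual code: the /api/chat/stream endpoint, the request shape, and the assistant-message element are all assumptions.

```ts
// Minimal sketch of live response streaming (hypothetical endpoint and element ids).
async function streamReply(prompt: string, onChunk: (text: string) => void): Promise<void> {
  const response = await fetch("/api/chat/stream", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });

  if (!response.ok || !response.body) {
    throw new Error(`Streaming request failed: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  // Read chunks as they arrive and hand each decoded piece to the renderer,
  // so the user sees the reply grow instead of waiting for the full message.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}

// Usage: append each chunk to the chat bubble as it arrives.
streamReply("What are your opening hours?", (chunk) => {
  document.getElementById("assistant-message")!.textContent += chunk;
});
```

The key point is that each chunk is rendered as soon as it arrives, so the reader sees text appear right away rather than waiting for the whole message to finish generating.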
I was surprised by this change while testing earlier. Is there a way to change it back? I'm really into design, and this makes the response look choppy; before the change it was smoother, in my opinion. I understand it's probably better objectively, since customers aren't kept waiting and it keeps the reply engaging to read, but I'd just like to have the option open.
Exactly, we were aiming to reduce the response time, since a long reply could take quite a while to appear in full. Live response streaming increases the chances of the customer seeing the response start before they lose patience.
However, feel welcome to share your suggestion in our Wishlist; we're always happy to hear from you there!