This is a case study of three projects with a seemingly trivial task in common: how do we show users meaningful suggestions of what to buy in a web store without knowing much about them?
The first project is about our struggle to design meaningful suggestions using conventional patterns. In the second, we shifted to a conversational UI and saw its mechanics change the dynamics of the interaction. In the third, we had an epiphany: designing conversational UIs is deeply tied to understanding language, speech acts and the psychology of human behaviour.
We will look at how conversational UIs differ from traditional UIs, like steppers or forms, and what makes them better - or worse. We'll also look at how conversations can be prototyped and how speech act theory can help us design better interfaces.
A good part of the session will be spent with Bobby, the chatbot we created to help users find the perfect gift, and with the psychology behind the bot. Apart from the chatbot itself, we also built a prototyping tool to help us design the conversations.