Can computers learn to understand their users in a way we intuitively understand other people? The short answer is: Yes, they can. Teaching computers to adapt to the human mind is exactly what Brain-Computer Interfacing is about.
Based on EEG – measuring the electrical potentials that brain activity generates at the scalp – Brain-Computer Interfaces (BCIs) can automatically detect and react to changes in the state of their users’ minds. This ‘thought reading’ can be used to directly control a computer – as if we were using a computer mouse, while not using a single muscle in our body. In addition to such explicit, intentional control, BCIs can be used for implicit communication from the user to the computer. The computer can observe how our brain responds to changes in the environment and build – step by step – a model of its user’s mind. It can then continuously adapt to our intentions, ideas and concepts, and support the ongoing Human-Computer Interaction.
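The closed loop just described – detect a brain response, update a model of the user, let the system adapt – can be sketched in a few lines. This is purely illustrative: `classify_epoch` is a hypothetical stand-in for a trained EEG classifier, and the threshold, the `UserModel` class, and the adaptation rule are all invented for the sketch, not part of any real BCI pipeline.

```python
def classify_epoch(epoch):
    """Toy 'state detector': a real BCI would run a trained classifier
    over EEG features; here a simple threshold on mean amplitude stands
    in for detecting a positive brain response."""
    return 1 if sum(epoch) / len(epoch) > 0.0 else 0


class UserModel:
    """Minimal running model of the user, built up response by response."""

    def __init__(self):
        self.positive = 0
        self.total = 0

    def update(self, label):
        self.positive += label
        self.total += 1

    def preference(self):
        # Fraction of epochs that produced a positive response;
        # 0.5 when nothing has been observed yet.
        return self.positive / self.total if self.total else 0.5


def neuroadaptive_loop(epochs):
    """One pass of implicit interaction: classify each EEG epoch,
    update the user model, then adapt the system's behaviour."""
    model = UserModel()
    for epoch in epochs:
        model.update(classify_epoch(epoch))
    # Adaptation step: the computer changes course based on its model
    # of the user, without any explicit command having been issued.
    return "adapt" if model.preference() > 0.5 else "keep"
```

In a real system the loop would run continuously, with each new brain response refining the model – which is exactly the step-by-step convergence the text describes.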
I call this approach Neuroadaptivity and see it as a path to a convergence of human and machine intelligence. In that way, a neuroadaptive computer can indeed learn to gain an understanding of its user.