Our daily life is full of activities: working, cooking, cleaning, and driving are just some of them. While performing these activities, our ability to communicate is limited because most of our senses are occupied by the main activity. We cannot, for example, interact with a computer while our hands and eyes are needed for driving or our voice for talking to other people. In this paper we investigate how to enable and design human-computer interaction in situations where common modes of communication fail. To this end, we allow an interface to be controlled solely by head movements and propose a model-based interface design that supports the design and implementation of such interactions. As a proof of concept, we present a natural interface for musicians that we have implemented: it allows sheet music pages to be turned using only head movements.