For robots to become more capable interaction partners, they will need to adapt to the needs and requirements of their human companions. One way the human could aid this adaptation is by teaching the robot new ways of doing things: physically demonstrating different behaviours and tasks so that the robot acquires new skills by imitating the demonstrated behaviours in appropriate contexts. In human–human teaching, the concept of scaffolding describes the process whereby the teacher guides the pupil to new competence levels by exploiting and extending existing competencies. In addition, the idea of event structuring can be used to describe how the teacher highlights important moments in an overall interaction episode. Scaffolding and event structuring robot skills in this way may be an attractive route to achieving robot adaptation; however, there are many ways in which a particular behaviour might be scaffolded or structured, and the interaction process itself may affect the robot's resulting performance. Our overall research goal is to understand how to design an appropriate human–robot interaction paradigm in which the robot is able to intervene and elicit knowledge from the human teacher in order to better understand the taught behaviour. In this article we examine some of these issues in two exploratory human–robot teaching scenarios. The first considers task structuring from the robot's viewpoint by varying the way in which a robot is taught. The experimental results illustrate that the way in which teaching is carried out, and in particular how the teaching steps are decomposed, has a critical effect on the efficiency of human teaching and the effectiveness of robot learning. The second experiment addresses the problem from the human's viewpoint, studying the human teacher's spontaneous levels of event segmentation when analysing their own demonstrations of a routine home task to a robot. The results suggest individual differences in the level of granularity spontaneously chosen for task segmentation and in which moments of the interaction are viewed as most important.