This section begins with the observed non-verbal communication strategies employed by robot players to initiate assistance, supplemented with insights into the reactions of pedestrian players in scenarios replicating encounters with urban robots. Following that, we shift our focus to the bystander perspective by reporting on the diverse factors that influence their decision to offer assistance or not. Subsequently, we synthesise these perspectives in reporting word sorting results, revealing the desired characteristics of robot help-seeking behaviour. It is worth noting that the findings discussed in this section may not exhaustively encompass the spatio-temporal and political complexities of urban settings, which are difficult to fully replicate in bodystorming design activities.
4.1 Strategies of robot players to elicit help
4.1.1 Addressing bystanders.
One of the challenges for robot players was to capture pedestrian players’ attention and initiate interaction with them. During the bodystorming activity, robot players employed various strategies to address bystanders and solicit assistance. Most frequently, they rapidly moved the pole to generate noticeable noise, attracting attention before initiating further communication with pedestrians (n=7). Five participants corroborated this approach, identifying the robots’ noise and shaking poles as the primary factors that compelled them to stop and observe further.
When they did not receive assistance, some robot players directly addressed nearby pedestrians using various methods (n=5). They oriented themselves to face the pedestrian players directly, or moved slightly towards them, signalling to those individuals that they were being called upon for assistance. As P15 illustrated when referring to the moment the robot player turned its front towards him and approached: ‘It started walking directly towards me, and that’s when I realised it needed assistance from me.’
After some pedestrians remained indifferent to the robots’ request, a few robot players took their actions a step further. They proactively chased departing pedestrians, obstructed their path, or even engaged in physical contact with them. For instance, robot player P9 tilted his box costume towards the pedestrian player P11 and rubbed against him (as shown in Fig. 3, left). These pursuing behaviours generally elicited a sense of discomfort among participants, causing them to maintain a considerable distance from the robot player (P3, P9), or even escape from the situation entirely (P7, P11). During the subsequent interview, P11 described the interaction as ‘needy’ and ‘creepy,’ prompting a strong desire to flee.
4.1.2 Cueing intentions.
Upon capturing the attention of pedestrian players, robot players used their body’s orientation or the pointer’s directionality to further convey their intentions. They either oriented themselves accordingly or used the pointer to indicate their intended directions or the objects they needed assistance with. Such non-verbal cues, while informative, did not always ensure clear communication. This was underscored during bodystorming when three pedestrian participants sought additional confirmation from the robot player. For example, when robot player P4 oriented himself towards the traffic light button and paused, P1 approached, pointed at the button and looked at P4, asking, ‘Is it?’ (see Fig. 3, middle).
To enhance clarity, the robot player responded by adding motions in the desired direction or repeating the same pointing gesture. Robot player P4 responded to P1 by stepping back and moving forward towards the traffic light button again as confirmation. Recognising this, P1 then assisted by pressing the button. In this manner, both parties communicated through an intriguing blend of verbal and non-verbal exchanges.
4.1.3 Displaying emotions.
Even though participants were not given any external communication modalities apart from the robot body – consisting of a box and a pole – six of the participants attempted to convey emotions through movements or sound. A common emotion exhibited by participants (n=5) was anxiety when pedestrians did not offer help during the role-play. This anxiety was manifested through frequent and intense shaking of the pole or twisting of the robot’s body. The displayed emotion elicited empathy among participants. P17, for instance, reflected on an instance where the robot player was impeded by an obstacle and energetically waved its pole towards her: ‘[…] it (the robot) seemed very anxious. Then I quickly realised that it was this thing that was blocking it.’ As a result, three out of the five participants who initially didn’t assist the robot began to pay heightened attention and acknowledged the situation. This ultimately prompted them to step in and provide assistance.
Four robot players tried to express gratitude after receiving help, conveying joyful emotions through movements such as hopping up and down (P6) or spinning around (P15), as well as through sound, like vocalising an uplifting tune (P7, P1). Displaying these joyful emotions prompted responses such as nodding (P9) and waving (P17, as shown in Fig. 3, right) by pedestrian players. P17 drew a connection between waving to the robot and her experience of encountering small animals, explaining, ‘Because I tend to greet small animals, or things that I find cute or have emotions.’ Furthermore, P7 conveyed a sense of satisfaction when seeing the robot spinning around, stating, ‘I felt very satisfied because I believe it has feelings. […] I help it and it is happy, which also makes me happy.’ P9, commenting on the joyful tune produced by the robot player P11 following his assistance, noted, ‘It made me feel like, alright, I did the right thing.’
4.1.4 Demonstrating repetitive patterns.
Having discussed the three primary components of communication — addressing bystanders, cueing intentions, and displaying emotions — another notable observation emerged. Robot players frequently assembled these components into discernible repetitive patterns, resembling the predictable and programmed behaviours generally associated with robots (n=7).
P7, for instance, developed a unique routine to signify a pathway obstructed by an obstacle: she advanced towards obstacles while emitting two flat-tone beeps, moved back with an up-tone beep, and paused briefly before repeating this cycle multiple times. Her rhythmic auditory cues were synchronised with her physical movements. She later explained that this use of repetitive movements and audio cues was reflective of her ‘imagination of the robot having some program behind the system.’ Similarly, P2 adopted a pattern that combined cueing intention and addressing bystanders by repeatedly turning towards the pedestrian player, returning to the original position, and then turning towards the obstacles blocking its path.
Eight participants indicated that the recognition of programmed machine-like behaviour augmented their understanding of a robot’s intent to communicate. In contrast, movements lacking a recognisable pattern were sometimes perceived as ‘erratic’ or ‘malfunctioning’. The repetitive movement patterns reminded four participants of situations where domestic cleaning robots get stuck and repeatedly attempt to move back and forth. The familiar motion patterns helped participants form associations and understand the robot’s need for help.
In addition to enhancing understanding, recognising repetitive patterns in the robot’s behaviour also potentially improved participants’ confidence in the robot’s abilities. The robot’s consistent adherence to certain rules communicated a sense of control over its actions, as P9 noted, ‘I think the robot knew what it wanted.’ P7 indicated that the repetitive patterns in the robot’s movements signify predictability, allowing ‘the pedestrian (to) anticipate what’s gonna happen.’
4.2 Factors shaping bystanders’ decisions to offer or decline assistance
4.2.1 Preconceptions of agent autonomy.
Our study underscores the prevailing perception among participants that service robots should operate with complete self-sufficiency and efficiency (n=10). This forms a major reason for participants’ reluctance to assist robots. For instance, P9 shared his presumption about robots’ capabilities to manage all tasks autonomously, stating, ‘I thought the robot was able to do everything itself.’ Such misconceptions can foster misunderstandings about the actual capabilities and needs of these robots, an aspect further highlighted when P9 continued, ‘[…] so I didn’t realise the robot was asking me to help.’ This expectation subsequently instigated scepticism among six participants regarding the functional utility of service robots that require human intervention. This sentiment was articulated through comments like, ‘If you have to work for them, then what’s the point to have a robot’ (P6). P7 further noted a decline in trust due to the robot’s need for help, contrasting it with her expectation of a service robot’s role as a functioning entity, stating ‘As a working robot (i.e., service robot), they kind of made me feel like untrust(worthy). […] so they (have to) work perfectly.’
Interestingly, when debriefed about the actual scenario, there was a notable shift in the attitudes of four participants. They came to understand that the robots’ challenges arose from external factors beyond their control (e.g., obstacles purposely placed by humans) rather than from any inherent malfunction. P4 underscored this realisation, remarking, ‘then it’s the human’s fault (for placing the obstacle)’. The reassignment of responsibility for the robot’s immobilisation not only improved participants’ inclination to assist but also enhanced their empathy towards the robot’s predicament. P9 summarised this change of mind, noting that in this case the robot is ‘in need of help rather than being needy’.
4.2.2 Absence of responsibility.
The notion of being a mere bystander or pedestrian, devoid of any responsibility towards the enacted robots, emerged as a primary factor influencing the decision not to assist among ten participants. This sense of detachment made them reluctant to invest their time and effort in helping ‘something’ they didn’t feel accountable for. P14 particularly highlighted resistance to being perceived as ‘free labour’ for commercial enterprises, posing the question: ‘why should I spend my time helping something that is making a profit?’ However, they later nuanced this statement by adding: ‘If it (the robot) is for a non-profit purpose, then I might be inclined to help, even if it means me being a bit delayed.’
The concerns of getting entangled in potential troubles further discouraged five participants from offering help. P1 expressed this concern, stating ‘I am afraid of touching it and breaking things. […] It could cause trouble if we touch it.’
4.2.3 Unfamiliarity with robotic technology.
Seven participants expressed hesitation to assist the robot due to a perceived lack of expertise. They felt ill-equipped as ‘random pedestrians’ (P7) to provide assistance to the robot, a task which two participants indicated was best left to professionals.
The unfamiliarity with robotic technology sparked safety concerns among six participants, hindering them from offering help. This was further corroborated by our observations of five participants who actively distanced themselves or avoided the robot when it approached them for assistance. This evasion stemmed from the uncertainty about potential risks linked to the robot’s predicament, as P7 stated, ‘I don’t know if it’s a tiny little issue or if it’s going to explode or something.’
4.2.4 Intrinsic motivation: empathy and emotional responses.
Our interviews indicate that intrinsic motivation plays a compelling role in prompting bystanders to assist the robot, with empathy being the primary motivator (n=8). Participants described the robot using terms like ‘depressing’, ‘frustrated’, and ‘helpless’, signifying their ability to infer the robot’s emotional states by observing its movements within the given contexts. For example, P9 noted that observing the robot’s body swaying in an appeal for help prompted an association with vulnerable individuals, stating ‘[…], so (it’s) like a child needs help or an old person needs help’.
In addition to empathy, six participants reported experiencing a sense of emotional reward, capturing feelings of ‘fulfilment’, ‘satisfaction’, and ‘delight’, following their actions to assist the robots. P2, for example, articulated this sentiment as, ‘You helped it and witnessed it moving forward, which brings you a sense of satisfaction.’ Moreover, the gratitude exhibited by the robot reportedly amplified these emotional rewards (n=4).
However, it was also evident that some participants demonstrated a reduced level of empathy towards robots. Specifically, P1 drew comparisons with other entities, affirming his readiness to ‘stop for a dog or a cat, but not for a robot.’ He justified this attitude with his belief that ‘you can’t expect to treat a robot as a human or as a living animal.’ In one session, P15, who assumed the role of a vehicle driver, further underscored this perspective by simulating a horn by knocking on the board prop and voicing his impatience by yelling at the robot. He justified his behaviour by asserting that the robot ‘cannot be viewed as human.’
4.2.5 Extrinsic motivation: entertaining value and material reward.
Apart from intrinsic motivation, our interviews highlighted the role of extrinsic motivation, stemming from both tangible and intangible incentives, in fostering helping behaviours towards robots.
Six participants anticipated entertainment value from helping robots, viewing this intangible benefit as an additional incentive to offer assistance. As articulated by P9, the incorporation of a ‘surprise element’ into the interaction could further ‘spark joy.’ P12 also mentioned the potential of ‘transforming it (assisting robots) into a more game-like experience’.
Furthermore, two participants felt that their prosocial actions should also yield tangible advantages for them. They suggested that the introduction of material rewards, such as vouchers or discounts from the company deploying the robot, could further stimulate their willingness to assist. This perspective was rooted in their belief that, as bystanders, their prosocial behaviour towards the robot was not directly ‘benefiting’ them.
4.3 Desired robot characteristics
Based on the results of the word sorting activity and insights derived from the participants’ discussions, we identified three characteristics that help-seeking robots could implement, which we present in this section.
4.3.1 Vibrantly mechanical.
The physical sensations reported by robot players were evenly distributed across all words, with ‘clumsy’ chosen slightly more often (n=4). This choice primarily stemmed from the physical limitations of moving within the robot costume.
There was a noticeable overlap between the perceived physical qualities of the robot player and the desired attributes. In particular, ‘mechanical’ was selected as both a perceived quality (n=9) and a desired quality (n=10). The same pattern was observed for ‘playful’ (8 for perceived quality, 14 for desired quality) and ‘cute’ (11 for perceived quality, 8 for desired quality). This correlation implies that the robot players’ behavioural strategies, to some extent, met participants’ expectations for how a robot should act when seeking assistance, especially concerning these attributes. The movement of the robot players, encapsulated in the minimalist abstract box-shaped costume, inherently conveyed an impression of cuteness without the need for additional embellishments. As P12 expressed, ‘I feel its existence is cute enough already. Just imagine you’re on the road, helping a little robot, and it goes "dibbly-dobbly" as it moves forward. It’s already incredibly cute, and there’s no need to add additional design language.’ While some participants also selected ‘lively’ and ‘organic’, they viewed these as elements that could be added to the robot’s overall mechanical nature to enhance its expressiveness. P13 emphasised the importance of using these elements thoughtfully to ‘avoid creating the uncanny valley effect.’
4.3.2 Cheerfully confident.
Regarding the psychological feelings of robot players, no clear tendency was identified in the word sorting, which may imply that these feelings were highly subjective. When it comes to other participants’ perceptions of the robot player, ‘brave’ (n=9) and ‘confident’ (n=8) emerged as the two dominant qualities. Participants highly commended the robot players’ efforts to find solutions in challenging situations. P16 pointed out the robot’s vulnerability and the hazards and difficulties of its environment. She articulated that ‘(the robot) dared to cross the road on his own and was thinking about how to do it’ when it was ‘dangerous because there were no traffic lights for (it)’. P2 reflected on the perceived braveness, projecting onto the robot the psychological state that humans experience when seeking help from strangers. She expressed that the robot ‘needs help (and asks for it), as it needs to use courage to convey the help it needs.’
In terms of the desired psychological attributes, participants generally favoured positive emotions. The term ‘confidence’ (n=12) emerged as the predominant expectation that people had for the robot’s demeanour. P7 expressed this perspective by describing a service robot as ‘some kind of professional stuff’, implying the need for the robot to exhibit characteristics that align with expected proficiency. In addition, participants generally expected the robot to display ‘happy’ (n=6) emotions after receiving help.
Descriptors of negative emotional valence (e.g., ‘sad’) were not considered as desired qualities for help-seeking robots. P2 offered an enlightening comment that a casually-encountered robot exhibiting sadness while seeking assistance was reminiscent of street beggars, leading to feelings of what they described as ‘emotional blackmail.’
4.3.3 Responsively outgoing.
The self-assessed feeling of ‘needy’ emerged as the most frequently chosen term among robot players (n=7), indicating that, as robots in the given situation, they experienced a sense of helplessness and a perceived need for human aid. To elicit help, most of the robot players opted to project a more approachable character, frequently choosing descriptors like ‘extrovert’ (n=4), ‘collaborative’ (n=4) and ‘inviting’ (n=3). In contrast, only two robot players resonated with terms like ‘shy’ and ‘introvert’.
Other participants’ perceptions of the robot’s social qualities aligned with the robot players’ self-assessments, with ‘needy’ (n=15) and ‘extroverted’ (n=8) being the most frequently chosen terms. In terms of the desired social qualities, although ‘extroverted’ was still among the preferred terms, participants placed a greater emphasis on communication qualities such as being ‘polite’, ‘responsive’ and ‘collaborative’. P2 highlighted the importance of politeness, even when the robot is in urgent need of help, saying, ‘At the same time, when you try to attract everyone’s attention as much as possible, you also need to be gentle towards others.’ The desired ‘responsive’ quality reflects participants’ expectation of receiving feedback after offering help. For example, P17 expressed feeling disappointed if she helped the robot without receiving any response from it.