A total of 1,127 European citizens completed the survey, taking an average of 23.6 minutes to do so. Twenty-seven participants were excluded for failing two or more attention checks, and the responses of eight further participants were lost because they were not recorded in the database. The final sample consisted of 1,092 participants.
Of the final sample, 529 participants (48.4%) self-identified as female, 541 (49.5%) as male, and 22 (2%) reported other gender identities. Mean age was 27.9 years (SD = 8.37). By age group, 489 participants (44.8%) were 18-24 years old, 411 (37.6%) were 25-34, 133 (12.2%) were 35-44, 42 (3.8%) were 45-54, 16 (1.5%) were 55-65, and one was 74 years old.
Participants resided in 20 different EC countries and reported 26 different EC nationalities; two additional participants resided in the EC but reported other nationalities (see Tables 1A and 2A - SI). Of them, 641 lived in cities, 301 in towns, and 150 in villages or rural areas.
Regarding education, 368 participants had a bachelor's degree or equivalent, 247 a master's degree or equivalent, 236 had attended some college without obtaining a degree, 190 had a high school degree, 24 a PhD, 14 a professional degree, and 13 less than a high school diploma. Regarding occupation, 413 participants were students, 406 were employed, 100 were not working, 74 were both students and employed, 64 were self-employed, 6 were retired, and the rest reported other situations. Concerning work in the healthcare sector, most participants (959) had never worked in healthcare, 76 had worked in healthcare in the past, and 57 were currently working in the sector. Concerning technical background, 392 participants had a background in Computer Science, Engineering, or Robotics, while most (700) did not. Concerning programming skills, 410 had none, 462 basic, 157 medium, and 63 advanced. As to their level of interest in scientific discoveries and technological developments, 576 were moderately interested, 468 very interested, and 48 not interested.
With respect to marital status, 571 were single, 372 were in a domestic partnership or living with a partner, 109 were married, 15 were divorced or separated, 2 were widowed, and 23 reported other situations. As for household composition, 482 were living with their parent(s), 220 with their partner, 157 by themselves, 115 with housemates, 89 with their partner and child(ren), and 13 with their child(ren), while 16 reported other situations. This is in line with the age distribution of the respondents.
Of the participants, 558 saw themselves and their household as belonging to the middle class of society, 379 to the lower middle class, 103 to the upper middle class, 48 to the lower class, and 4 to the upper class. To the question "During the last twelve months, would you say you had difficulties paying your bills at the end of the month?", 534 participants answered never, 276 almost never, 233 from time to time, and 49 most of the time.
With respect to familiarity with robots in general, 536 participants had previously used robots at home (e.g., a robotic vacuum cleaner), 208 had previously used robots at work (e.g., industrial robots), and 491 had previously used robots in other places (e.g., airports or malls); in each case, the rest had not. Specifically concerning social robots, 711 had never interacted with social robots, 361 had interacted with them occasionally, and 20 had ample experience interacting with them.
Finally, with respect to religious beliefs, 727 were not religious, 251 slightly religious, 89 moderately religious, and 25 very religious. Regarding the religious family they belonged to or identified most closely with, 529 answered Christian, 493 none, 24 Muslim, 8 Buddhist, 5 Jewish, 4 Hindu, and 29 other.
2.1. Attitudes towards robots and fear of robots
The first block of questions examined general attitudes towards robots and robots for healthcare as well as fear of robots.
The majority of the participants (72.7%, n = 813) reported having an overall fairly positive view of robots.
The questionnaire that evaluated general attitudes towards robots contained five statements. The majority of the sample somewhat agreed (63.1%, n = 706) that "robots are a good thing for society because they help people". Opinion was split between somewhat agree (37.7%, n = 421) and somewhat disagree (41.7%, n = 466) on whether robots steal people's jobs. The majority of the sample totally agreed (56.5%, n = 632) that robots are necessary as they can do jobs that are too hard or too dangerous for people, and likewise totally agreed (59.8%, n = 669) that robots are a form of technology that requires careful management. Opinion was again split between somewhat agree (41.2%, n = 461) and somewhat disagree (41%, n = 458) on whether the widespread use of robots can boost job opportunities.
The questionnaire that evaluated robots for healthcare contained four statements. The majority of the sample somewhat agreed (58.2%, n = 651) that robots for healthcare should be promoted, and disagreed (54.9%, n = 614) that robots for healthcare should be banned. The majority of the sample also somewhat agreed that robots for healthcare can be beneficial for the economy (62.9%, n = 703) and beneficial for citizens (57.7%, n = 645).
Fear of robots was evaluated on a 12-item, 5-point scale (1 = not at all, 5 = very strongly; Cronbach's α = .89). On average, participants reported a moderate fear of robots (M = 2.22, SD = .79).
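As an illustrative sketch (not the study's actual analysis pipeline), the internal consistency of such a multi-item scale can be computed from a participants × items response matrix; the function and data below are hypothetical:

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for a (participants x items) response matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    k = responses.shape[1]                          # number of items
    item_vars = responses.var(axis=0, ddof=1)       # variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings of 6 participants on a 4-item, 5-point fear scale
ratings = np.array([
    [2, 3, 2, 2],
    [1, 1, 2, 1],
    [4, 4, 3, 4],
    [2, 2, 2, 3],
    [5, 4, 5, 4],
    [1, 2, 1, 1],
])
alpha = cronbach_alpha(ratings)
```

For perfectly correlated items the formula yields exactly 1; values around .89, as reported here, indicate high but not redundant inter-item consistency.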
2.2. Willingness to promote the different functions carried out by SARs for healthcare
Participants were asked to rate on a 4-point scale (1 = totally disagree, 2 = somewhat disagree, 3 = somewhat agree, 4 = totally agree) the following statement for each of the functions: "In the ideal European society that you envision for the near future, European policymakers support and promote the development and deployment of social robots that perform the following functions..."
The functions rated most negatively (M < 3) were measuring vital signs so that the robots themselves can provide pre-diagnoses or perform triage (i.e., prioritize patients according to the degree of urgency) based on the symptoms, and banning entrance to a building to people who present a threat to public health (e.g., have a fever) or do not comply with the rules (e.g., a mandatory face mask). Also rated negatively (M < 3) were offering motivational conversation to hospital patients; expressing and interpreting emotions to communicate; providing touch interaction for emotional support; providing personalized information, such as informing hospital patients about the characteristics of a particular medical condition, its symptoms, and potential treatments; and collecting personal information on the medical symptoms presented by patients on arrival at the hospital or medical center. The remaining functions were all rated M = 3 or higher. Table 3A - SI shows the mean and SD for each function.
2.3. Threats to trustworthy social robots
We explored whether participants considered a series of functions for which social robots are being deployed to pose a potential threat to trustworthy social robots. For each of the functions in the previous section, participants rated on a 5-point scale (1 = very low, 5 = very high) the perceived threat to each of the six dimensions of trustworthy social robots: human autonomy; privacy; safety; fairness, diversity, and non-discrimination; societal well-being; and accountability.
The perceived threats were explored using the median as a reference. Ratings for all items ranged between very low (Mdn = 1) and medium (Mdn = 3); none of the functions were rated above 3 on any of the dimensions.
The functions perceived as most threatening were triage (i.e., social robots that measure vital signs so that the robots themselves can provide pre-diagnoses or perform triage, that is, prioritize patients according to the degree of urgency, based on the symptoms) and banning entrance (i.e., social robots that ban entrance to a building to people who present a threat to public health (e.g., have a fever) or do not comply with the rules (e.g., a mandatory face mask)). In particular, for triage, threats to human autonomy, privacy, safety, and accountability were rated as medium (Mdn = 3). For banning entrance, threats to human autonomy, safety, fairness, diversity, and non-discrimination, and accountability were rated as medium (Mdn = 3). Providing personalized information (i.e., social robots that provide personalized information, such as informing hospital patients about the characteristics of a particular medical condition, its symptoms, and potential treatments) and monitoring (i.e., social robots that monitor patients in hospitals and send alerts to the medical staff if an unusual situation is detected) followed next: for both, threats to privacy, safety, and accountability were rated as medium (Mdn = 3). Finally, for patient registration, data collection, patrolling, and home assistance, the threat to privacy was also rated as medium (Mdn = 3).
By dimension, a potential threat to privacy raised the most concern: privacy was rated with a median of 3 on eight occasions. Safety and accountability were each rated with a median of 3 on four occasions, human autonomy on two occasions, and fairness, diversity, and non-discrimination on one occasion. Societal well-being was never rated with a median of 3 or above. Table 4A - SI shows the medians for each threat and function.
2.4. Care recipients
Participants evaluated on a 4-point scale (1 = no, definitely not, 4 = yes, definitely) whether European policymakers should support and promote the development and deployment of social robots to assist different groups of care recipients in the ideal European society that they envisioned for the near future. Robot assistance for the different groups was rated very similarly, with all groups yielding a median of 3 (yes, to some extent). Table 1 shows the mean and SD for each group of care recipients rated by the participants.
2.5. Vulnerability
We additionally examined whether perceived vulnerability was related to the willingness to promote social robots for the different care recipients, such that the more vulnerable a group of recipients was perceived to be, the lower the acceptance of robots caring for that group would be. To that end, we asked participants to rate how vulnerable they perceived a series of recipients to be: conscious adult patients, unconscious adult patients, child patients, older adults with cognitive impairment, children with autism, and healthy older adults. For healthy older adults, we measured perceived vulnerability separately for two groups, those in elderly care and those at home by themselves, to assess whether being alone (vs. surrounded by caregivers) could also influence perceived vulnerability and the willingness to promote robot deployments under this circumstance. Table 2 shows the results.
We explored the correlations between willingness to promote social robots for each group of care recipients and the perceived vulnerability of that group, and found an inverse correlation in most cases. Willingness was inversely correlated with perceived vulnerability for conscious adult patients in a hospital (r = -.142, p < .01), unconscious adult patients in a hospital (r = -.106, p < .01), child patients in a hospital (r = -.172, p < .01), and older adult patients in a hospital (r = -.143, p < .01). Similarly, inverse correlations were found for healthy older adults in an elderly care center (r = -.097, p < .01), older adults with dementia in an elderly care center (r = -.073, p < .01), older adults with dementia in an occupational therapy center (r = -.083, p < .01), children with autism in an occupational therapy center (r = -.180, p < .01), and children with autism in their homes (r = -.141, p < .01). However, the correlation was not significant for healthy older adults living at home by themselves (r = -.004, p = .88).
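As a hedged sketch of this kind of analysis (with simulated data, not the study's responses), such a correlation could be computed with `scipy.stats.pearsonr`:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-participant ratings: perceived vulnerability of a group
# (1-5) and willingness to promote robots for that group (1-4)
rng = np.random.default_rng(0)
vulnerability = rng.integers(1, 6, size=200).astype(float)

# Simulate a moderate inverse relationship plus noise, clipped to the scale
willingness = np.clip(4 - 0.4 * vulnerability + rng.normal(0, 1, size=200), 1, 4)

# Pearson correlation coefficient and two-sided p-value
r, p = pearsonr(vulnerability, willingness)
```

With the inverse relationship built into the simulation, `r` comes out negative and significant, mirroring the pattern reported above.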
2.6. Roles
We analyzed whether the role performed by the robot, as well as the degree of responsibility assumed in that role (assisting a human who performs the role vs. performing the role by itself), influenced the willingness to promote social robots for healthcare. To that end, participants answered whether, in the ideal European society that they envisioned for the near future, European policymakers should support and promote the development and deployment of social robots performing a series of roles in the healthcare sector (henceforth, acceptance). Performing as a doctor and performing as a psychotherapist were the roles that elicited the highest rejection, whereas assisting a receptionist and assisting a nurse received the highest acceptance. The degree of responsibility was also a determining factor of acceptance: assisting a human who performed the role of nurse, doctor, receptionist, caregiver, or psychotherapist was consistently rated higher in acceptance than performing the role by itself.
In particular, acceptance for the assisting version of each role was significantly higher than for the robot performing the role by itself: nurse in a hospital (t(1091) = 45.938, p < .001), doctor in a hospital (t(1091) = 50.507, p < .001), receptionist in a healthcare center (t(1091) = 30.805, p < .001), caregiver in an elderly care center (t(1091) = 42.749, p < .001), caregiver in a private home (t(1091) = 38.351, p < .001), and psychotherapist in an occupational therapy center (t(1091) = 41.534, p < .001). Table 3 shows the means and SDs for all roles.
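Because each participant rated both the assisting and the autonomous version of a role, these comparisons are within-subject. A minimal sketch with hypothetical ratings, using `scipy.stats.ttest_rel`:

```python
from scipy.stats import ttest_rel

# Hypothetical paired ratings (same 8 participants, 4-point scale):
# acceptance of a robot assisting a nurse vs. performing as a nurse itself
assisting = [4, 3, 4, 3, 2, 4, 3, 4]
performing = [2, 2, 3, 1, 1, 3, 2, 2]

# Paired t-test: tests whether the mean within-person difference is zero
stat, p = ttest_rel(assisting, performing)
```

A positive t-statistic indicates the first condition (assisting) was rated higher, matching the direction of the reported comparisons.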
2.7. Attitudes towards robots by gender and age
We examined potential gender differences in attitudes towards robots, as well as the relationship between attitudes towards robots and age. For attitudes towards robots, we created a scale from the following four items that evaluated general attitudes towards robots: "robots are a good thing for society, because they help people", "robots steal people's jobs" (reversed), "robots are necessary as they can do jobs that are too hard or too dangerous for people", and "widespread use of robots can boost job opportunities" (Cronbach's α = .65, M = 2.96, SD = .48). The item "robots are a form of technology that requires careful management", which was part of the initial questionnaire based on [9], was eliminated because it did not contribute to a simple factor structure when the factor analysis was performed (extraction based on a fixed number of factors, factors to extract = 1; KMO = .65; Bartlett's test of sphericity χ2(10) = 664.823, p < .001). We also used the item that assessed the general view of robots ("Generally speaking, do you have a view of robots..."; 1 = very negative, 4 = very positive). We found significant differences by gender both on the general attitudes towards robots scale and on the general view of robots, such that females (M = 2.92, SD = .47 for the attitudes scale; M = 2.96, SD = .49 for the view of robots item) had less positive attitudes towards robots than males (M = 3, SD = .49 for the attitudes scale; M = 3.05, SD = .56 for the view of robots item).
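A hedged sketch of how such a scale score could be derived, reverse-coding the negatively worded item on a 4-point response format (the function and data are illustrative, not the study's variable names):

```python
import numpy as np

def scale_score(responses: np.ndarray, reversed_items: list[int],
                scale_max: int = 4) -> np.ndarray:
    """Mean scale score per participant, reverse-coding the listed item columns.

    On a 1..scale_max response format, reverse-coding maps x -> (scale_max + 1) - x,
    so 1 becomes 4 and 4 becomes 1 on a 4-point scale.
    """
    r = responses.astype(float).copy()
    r[:, reversed_items] = (scale_max + 1) - r[:, reversed_items]
    return r.mean(axis=1)

# Hypothetical answers of 3 participants to the four attitude items;
# column 1 ("robots steal people's jobs") is the reverse-coded item
answers = np.array([
    [3, 2, 4, 3],
    [4, 1, 4, 4],
    [2, 4, 2, 2],
])
scores = scale_score(answers, reversed_items=[1])
# scores -> [3.25, 4.0, 1.75]
```

Reverse-coding ensures that higher scores consistently mean more positive attitudes before items are averaged.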
We did not find a significant correlation between age and general attitudes towards robots (r = -.027, p = .37), or between age and general view of robots (r = -.052, p = .08).
2.8. Attitudes towards robots and religious beliefs
We additionally examined the relationship between attitudes towards robots and religious beliefs, as well as between fear of robots and religious beliefs. For religious beliefs, we used the question "to what level do you consider yourself to be religious?" (1 = not religious, 4 = very religious). For attitudes towards robots, we again used the general attitudes towards robots scale and the item that assessed the general view of robots, as in the previous section. For fear of robots, we used the 12-item fear of robots scale.
We found a positive correlation between religious beliefs and fear of robots, such that the more religious participants were, the more fear they reported (r = .177, p < .01). We also found inverse correlations between religious beliefs and general view of robots (r = -.062, p < .01) and between religious beliefs and general attitudes towards robots (r = -.131, p < .01), such that the more religious participants were, the less positive their views of and attitudes towards robots.
2.9. Attitudes towards robots and previous experience with robots
Lastly, we explored the relationship between previous experience interacting with social robots and attitudes towards robots. For previous experience, we used the item "indicate your experience interacting with social robots" (1 = never, 3 = ample experience). For attitudes towards robots, we again used the general attitudes towards robots scale and the item that assessed the general view of robots, as in the previous sections.
We found that previous experience interacting with social robots was positively correlated with general view of robots, such that the more experience participants had, the more positive their views (r = .121, p < .01). The positive association between previous experience and general attitudes towards robots, however, did not reach significance (r = .084, p = .05).