## **2. Literature review**

### **2.1 Technology adoption and acceptability**

Social informatics studies the relationships between people, digital technologies, and their contexts of use [12]. This approach focuses on the relationship between technology and society from a perspective that privileges neither [2], examining instead, as the authors of [2] put it, the hyphen in the "socio-technical" expression. Adoption studies can be a practical application of social informatics because studying and promoting adoption requires an understanding of the possibilities afforded by the materiality of the technology, as well as of the value the technology brings into people's lives. This approach contrasts with the *a priori* promotion of technologies that only occasionally work well for people or prove valuable, are sometimes abandoned or unusable, and thus incur predictable waste and inspire misplaced hopes [13].

Adoption is a process "starting with the user becoming aware of the technology, and ending with the user embracing the technology and making full use of it" [14]. Awareness has been seen as the key to developing new ICT infrastructures [15] and as a key determinant of consumers' adoption behavior [16]. Lack of awareness was identified as an obstacle to mobile phone adoption [17], and in the IoT landscape our survey showed that the less aware people are of the expression "Internet of Things," the higher the odds (1.3 times) that they will not want to use the technology in the future [3]. Furthermore, security and privacy can influence the adoption of smart home technologies. For example, in their investigation of trust in the cybersecurity-preserving capabilities of smart home devices, Cannizzaro et al. [3] revealed that anxiety about the likelihood of a security incident in home IoT emerged as a statistically significant factor influencing the adoption of smart home technology. Lipford et al. [18] outlined how IoT technologies introduce challenging privacy issues that may frustrate their widespread adoption, whereas Guhr et al. [19] emphasized how privacy concerns directly and indirectly influence intended smart home usage.

Adoption studies are typically carried out by what Rogers [20] calls "change agencies," whose short-term goal is to facilitate the adoption of innovations and who often follow a segmentation strategy of least resistance to innovations. This logic of pursuing economic gain while sidelining wider societal interests also appears in recent key IoT adoption studies (e.g., [21–24]), which justify adoption purely through economic arguments and do not mention the societal risks that the technology may raise. The underlying economic model of the new wave of digital innovations has been dubbed "surveillance capitalism" [24], defined by the harvesting of data and its analysis for the commodification of human activity. In response, Helbing [25] states that we must ensure the ethical use of new digital technologies. Acceptability is thus a way to mitigate this one-sided approach to adoption and can help in understanding the impact of unintended consequences, for example, the erosion of trust in technology and privacy [12], or the rate of acceptance of the smart home among older adults [6, 26].

Technology acceptability is "the degree of primary users' predisposition to carry out daily activities using the intended device" [27]. Philosophically, technology acceptability is a judgment that prescribes the way in which the technology examined ought to be desirable [28]. Acceptability is a popular perspective in health and assistive technology-related IoT services and products, where, for example, Shahrestani [29] defines acceptability as "guidelines to evaluate how a particular approach or technology is working for the elderly or people with disability," thus relating acceptability to the general process of *evaluation*. In regard to the IoT, Taylor et al. [30] define acceptability in conjunction with "attitudes," for example, "Policymakers need to investigate the *attitudes* of the public if *acceptability* of IoT is to be understood" ([30], emphasis added). Hence, "acceptability" feeds on evaluations, predispositions, and attitudes toward a given technology, foregrounding the user in the user-technology relation. As such, acceptability has the potential to give consumers a voice and thus rebalance the business-consumer relationship. The socio-technical approach intrinsic in acceptability can encourage a discovery process that helps designers effectively understand the relevant life worlds and work worlds of the people who will use their systems [2].

Outside academia, acceptability-related studies are quite popular and are often carried out by interest groups [31, 32] or organizations defending consumers' rights (e.g., [5]). Trust is fundamental to consumer technologies that involve the transmission of personal and sensitive information.

Van de Poel and Verbeek note how science and technology scholars have shied away from explicit normative or ethical discussions [33], but with the advent of the IoT, and the smart home being marketed to the wider population, ignoring ethical-acceptability concerns about technology and disregarding consumer trust is no longer possible.

Trust in privacy and security is a key factor affecting the acceptability of the smart home [3, 34]. To date, there have been few nontechnical studies of the security and privacy concerns of smart home device users [35].
