Section 1 Theories of Trust

### **Chapter 1**

## Trust Management: A Cooperative Approach Using Game Theory

*Ujwala Ravale, Anita Patil and Gautam M. Borkar*

### **Abstract**

Trust is defined as the willingness to accept risk and vulnerability based on positive expectations of the intentions or behaviours of another. The qualities or behaviours of one person that create positive expectations in another are referred to as trustworthiness. Because of its perceived link to cooperative behaviour, many social scientists regard trust as the backbone of effective social structures. With advances in technology, people can explore a wide range of products, services and facilities through online social media. Because the end users who communicate through these networks are usually physically unknown to each other, evaluating their trustworthiness is essential. Trust is not easily captured by mathematical methods and computational procedures, and it can be influenced by psychological and sociological factors. End users are therefore exposed to a variety of risks. The need to define trust is growing as businesses try to build effective marketing strategies around their social media activities and must therefore earn consumer trust. Game theory is a theoretical framework for analysing strategic interactions between two or more individuals, called players in game-theoretic terminology. A conceptual framework for trust evaluation can thus be designed using a game theory approach that indicates the conditions under which trustworthy behaviour can be expected.

**Keywords:** trust, cooperative behaviour, game theory, sociological factors, vulnerable

### **1. Introduction**

Trust is a subjective, multi-faceted, and abstract notion. Beyond computer technology, researchers have studied trust in a variety of fields, including business, philosophy, and social science. Analysts from diverse domains agree on the basic definition of trust: it is a measurement of the trustworthiness of a person or other entity. Trust is regularly inferred from input appraisals through trust aggregation.

In a large number of studies, trust has been treated as a black box, an undifferentiated variable, and has rarely been investigated in depth. Even if it appears in predictable ways, trust is not a one-dimensional or homogeneous idea; it is a multi-faceted notion that can be interpreted differently depending on the context. Beyond computer technology, trust has been studied in a variety of fields, including economics, psychology, and social studies. Researchers from these fields agree on its basic definition: trust characterises an individual's level of anticipation and trustworthiness and reflects cooperative relations between inter-organisational entities. Trust is derived from specific feedback evaluations and mechanisms. It has been found that trust reduces disagreement and uncertainty by fostering goodwill, which strengthens relationships while increasing satisfaction and partners' willingness to trade.

Trust management encompasses the identification of trustworthy elements and the establishment of communication among them, together with techniques for the computation, transmission, consolidation, and storage of trust information, trust consumption models, and trust enhancement in service provisioning. Certain trust functionality can be implemented and supported using distributed computing. Decentralised trust management refers to the administration of trust in fully decentralised computer systems as well as in hybrid centralised-decentralised computing systems.

Trust management has infiltrated a wide range of collaborative networked computing systems, including peer-to-peer and eCommerce systems, social networks and online communities, cloud and edge computing, mobile ad hoc networks and wireless sensor networks, community sourcing, multi-agent systems, and the Internet of Things [1].

### **1.1 Different trust management models**


hole, DoS, data modification/insertion attacks, sinkhole, contradictory behaviour attacks, and so on are examples of potential assaults.


### **2. Related study**

All of our social interactions are built on the foundation of trust. Trust is a complex human behaviour that has evolved over time. It has many interpretations and, as a result, many alternative representations and management principles, depending on the circumstances and applications. It has been a research topic in many domains, including psychology, sociology, and IT systems. For example, trust is used in trading systems such as eBay and in automatic peer-to-peer systems for file and resource sharing, where trust is built by algorithms based on prior events that provide positive or negative evidence or feedback.

In online systems, there are two sorts of trust: direct trust, based on a person's direct connection with others, and recommendation trust, based on the experiences of other individuals in a social network, which grows through the propagative nature of trust. Different trust management models are discussed in the section below.

Wang et al. [11] developed a game theory-based trust evaluation model for social networks. When modelling a trust relationship, various factors must be taken into account; the trust value is calculated by considering three factors: feedback efficacy, service reliability, and suggestion credibility. In social networks, service transactions are based on node-to-node trust links. Building a trust relationship, however, is a long process shaped by previous contacts, trust recommendations, and trust management, among other things.
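The three-factor aggregation described above can be sketched as a weighted combination. Note that the linear form and the weight values below are illustrative assumptions for this chapter, not the exact formulation used by Wang et al.:

```python
def trust_degree(feedback, reliability, credibility, weights=(0.4, 0.3, 0.3)):
    """Combine three factor scores (each in [0, 1]) into a single trust degree.

    The linear weighting is an illustrative assumption, not the exact
    aggregation of the cited model.
    """
    w_f, w_r, w_c = weights
    assert abs(w_f + w_r + w_c - 1.0) < 1e-9, "weights should sum to 1"
    return w_f * feedback + w_r * reliability + w_c * credibility

# A node with strong feedback but weak recommendations:
print(round(trust_degree(0.9, 0.8, 0.4), 2))  # 0.72
```

Raising the weight on suggestion credibility would penalise the same node more heavily, which is the practical lever such models expose.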

Wang et al. [12] proposed a trust model for online social networks using evidence theory techniques. Evidence theory is mostly used for target identification, decision making, and the analysis of online social networks. The proposed model contains three main steps: achieving individual trust evaluation, determining the relevance of features with respect to each user, and using those relevances for decision making.

The trust evidence approach shows the probability of trust and distrust among stakeholders. On the Epinions dataset, this approach achieves a minimal error rate and the highest accuracy.

Chen et al. [13] provided a trust evaluation model using machine learning that takes into account a wide range of trust-related user attributes and criteria to enhance human decision-making. Based on empirical analysis, user features are classified into four categories: link-based, profile-based, feedback-based, and behaviour-based features. A lightweight attribute selection technique then uses users' online information, in the form of records, to analyse the efficiency of each feature and identify the ideal combination of features. Experiments on a real-world dataset show that the overall performance is better than that of other traditional approaches.

In the current era, online social networks play an essential role in practically every aspect of daily life. Metaheuristic search algorithms are used in social networks because of the dynamic nature these networks exhibit.

Peng et al. [14] proposed a feature fusion technique combined with an artificial bee colony (ABC) algorithm for the community identification task, to improve the accuracy of trust-based community detection (TCDABCF). This strategy takes into account not only an individual's social qualities but also the trust relationships that exist between users in a community. As a result, the proposed technique can find more appropriate clusters of similar users, each with significant individuals at the centre. The technique uses an artificial bee colony to accurately identify influential persons and their supporters. In simulations on the Facebook dataset, the proposed method obtained a normalised mutual information (NMI) of 0.9662 and an accuracy of 0.9533.

Reputation and trust prediction are "soft security" solutions that allow a user to evaluate another user without knowing their identity. The trustworthiness of users in social networks is calculated using the reputation levels of other users, and a new probabilistic reputation feature is more efficient than raw reputation features. Liu et al. [15] used various machine learning algorithms with 10-fold cross validation for simulation. The trust values of witness trustor users are used to determine the trustee's reputation qualities, and the two types of characteristics, raw and probabilistic reputation features, are compared. Three datasets, Wiki, Epinions, and Slashdot, are used for simulation, and the SMOTEBoost algorithm is used to balance the datasets and improve prediction performance. In online social networks, this trust prediction algorithm can be used to strengthen social relationships and identify trustworthy users.

The approach proposed by Mohammadi et al. [16] takes users' attitudes toward one another on a social network as the basis of their trust. The mostly textual content shared on social networks is analysed to determine how people feel about one another. In this SenseTrust model, the hidden sentiments in the texts exchanged between two social network users are first analysed; a Hidden Markov Model (HMM) is then used to evaluate trust between users. Both the RNTN and the HMM are trained with emails extracted from the Enron corpus through crowdsourcing and labelling.

A trust framework by Mayadunna et al. [17] introduced a methodology to determine node trust values for social network users using reinforcement


*Trust Management: A Cooperative Approach Using Game Theory DOI: http://dx.doi.org/10.5772/intechopen.102982*

### **Table 1.**

*Comparison of trust management methods.*

learning. On social media, trust between two nodes is evaluated based on features such as the number of neighbour nodes, the relationship among the nodes, and the number of common neighbour nodes. After feature selection, if there is an edge between two nodes the trust value is denoted as 1, otherwise 0. Second, node trust is determined using a trained model. A recommendation algorithm is then used to determine the results. Finally, simulation is used to analyse the effectiveness of the suggested strategy; for simulation purposes, data from an adaptable social network is used.
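The pairwise features described above can be extracted from a simple adjacency list. The exact feature set of the cited model is not reproduced here; this is a hypothetical sketch of the degree, common-neighbour, and edge-presence features:

```python
# Illustrative feature extraction for a pair of social-network nodes.
# adj maps each node to the set of its neighbours (undirected graph assumed).
def pair_features(adj, u, v):
    neighbours_u = adj.get(u, set())
    neighbours_v = adj.get(v, set())
    return {
        "degree_u": len(neighbours_u),
        "degree_v": len(neighbours_v),
        "common_neighbours": len(neighbours_u & neighbours_v),
        "edge_trust": 1 if v in neighbours_u else 0,  # 1 if an edge exists, else 0
    }

adj = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c"}}
print(pair_features(adj, "a", "b"))
```

A trained model would then map such feature dictionaries to node trust values.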

To address the trust evaluation problem in trust social networks (TSNs), Liu et al. [18] presented NeuralWalk, a machine learning-based approach. Unlike traditional methods, NeuralWalk models single-hop trust propagation and trust combination using a neural network architecture called WalkNet, which is trained when the NeuralWalk method is run. The Advogato dataset is used to evaluate the accuracy of the algorithm. The NeuralWalk algorithm, in collaboration with WalkNet, performs a breadth-first, multi-hop trust assessment across TSNs (**Table 1**).

### **3. Trust-building mechanisms**



### **Table 2.**

*Trust mechanisms in online social networks.*


An online platform's trust mechanism is a method for overcoming knowledge gaps between market players and facilitating transactions. Many different types of trust mechanisms exist, as listed below (**Table 2**):

Developing trust among users in a social network is critical. It is important to study in depth all possible ties between users in the social network and to appropriately evaluate those relations, in order to determine who trusts whom and to integrate that knowledge into the social recommender.

Some models estimate trust from behavioural patterns of user interaction. A few parameters considered when calculating trust are as follows:


### **4. Techniques for trust evaluation**

Different trust evaluation techniques are classified as statistical and machine learning approaches, heuristics-based techniques, and behaviour-based techniques. Statistical and machine learning techniques aim to provide a mathematically sound model for trust management.

The goal of heuristic-based strategies is to define a feasible model for constructing reliable trust systems, while behaviour-based models focus on user behaviour in the community.

### **5. Trust evaluation methods**

See **Figure 1**.

### **5.1 Analysis of trust evaluation methods**

The practice of assessing trust using attributes that influence trust is known as trust evaluation. It faces a number of serious challenges, including a shortage of critical assessment data, a requirement for data processing, and the need to turn straightforward participant statements into decisions. Analysis of trust is achieved using the following methods:

### *5.1.1 Fuzzy logic approach*

A trust evaluation model using fuzzy logic in various IoT applications considers parameters such as device physical security, device security level, and device ownership trust [19]. Cloud computing plays a very important role on the internet in providing useful services; in cloud environments, the trustworthiness of nodes is determined by performance in terms of response time and workload.

**Figure 1.** *Different trust evaluation methods.*

Another parameter used is elasticity, in terms of scalability, security, usability, and availability [20]. In a fuzzy-based trust prediction model for wireless sensor networks, trust is calculated at the intra-cluster and inter-cluster levels. Trust computation is performed using direct and indirect trust interactions among the nodes [21].
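The combination of direct and indirect trust can be fuzzified into linguistic labels. The triangular membership functions and the 0.6/0.4 weighting below are illustrative assumptions, not the exact rules of the cited models:

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_trust(direct, indirect):
    """Fuzzify a weighted trust score into low/medium/high memberships."""
    score = 0.6 * direct + 0.4 * indirect  # weighting is an assumption
    return {
        "low": tri(score, -0.01, 0.0, 0.5),
        "medium": tri(score, 0.0, 0.5, 1.0),
        "high": tri(score, 0.5, 1.0, 1.01),
    }

m = fuzzy_trust(direct=0.8, indirect=0.6)
print(max(m, key=m.get))  # the dominant linguistic label
```

A defuzzification step (e.g. centroid) would turn these memberships back into a crisp trust value for routing decisions.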

### *5.1.2 Game theory approach*

In online social networks, the trust degree is calculated using three parameters: feedback effectiveness, service reliability, and recommendation credibility. In wireless sensor networks (WSNs), a game theory approach is used to mitigate security attacks; it mainly calculates parameters such as cooperation, reputation, and security level from information collected from the network. In a cloud computing environment, trust is evaluated for both users and service providers.

### *5.1.3 Bayesian network*

Users in a virtual world, such as an e-commerce marketplace, are unable to physically inspect the quality of trade products before purchasing them, nor can they secure their personal data, resulting in uncertainty and mistrust among network actors. In wireless sensor networks, direct trust values are calculated using Bayesian theory, and when there is uncertainty in direct trust, indirect trust values are calculated using the entropy concept.
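A common Bayesian formulation of direct trust treats good/bad interaction counts as updates to a Beta distribution. This is a minimal sketch of that standard Beta-reputation idea, with a uniform prior assumed:

```python
# Beta-reputation sketch: direct trust as the posterior mean of a
# Beta(alpha, beta) distribution over the probability of good behaviour.
def beta_trust(good, bad):
    # Beta(1, 1) uniform prior; each observation adds a pseudo-count.
    alpha, beta = 1 + good, 1 + bad
    return alpha / (alpha + beta)

print(beta_trust(good=8, bad=2))  # 9/12 = 0.75
```

With no observations the score is 0.5, the neutral prior, which is exactly the cold-start behaviour such models exhibit.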

### *5.1.4 Feedback approach*

Trustworthiness is established through participants' behaviour and feedback. In the network, many quality-of-service parameters are considered when evaluating the behavioural trust value. In cloud computing, service level agreement parameters are used to maintain feedback and compute the feedback trust value of the cloud service provider [22]. Feedback proves the genuineness of participants.
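Feedback trust is typically an aggregate of past ratings. The recency-weighted average below is one plausible aggregation; the exponential decay factor is an assumption, not a parameter from the cited work:

```python
# Feedback-trust sketch: aggregate ratings with exponential recency weighting,
# so newer feedback counts more (the decay factor is an assumption).
def feedback_trust(ratings, decay=0.9):
    """ratings: oldest-to-newest scores in [0, 1]."""
    weights = [decay ** (len(ratings) - 1 - i) for i in range(len(ratings))]
    return sum(w * r for w, r in zip(weights, ratings)) / sum(weights)

# An old bad rating followed by two recent good ones:
print(round(feedback_trust([0.2, 0.9, 0.9]), 3))
```

Because the old 0.2 rating is discounted, the result sits well above the unweighted mean of 0.667, rewarding recently improved behaviour.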

### *5.1.5 Agent-based approach*

In wireless sensor networks, mobile nodes act as routers to transfer packets, and communication is established between nodes, so every node or agent is required to trust the others [23]. If a malicious node enters the communication channel, the network will be disturbed. A trust model therefore provides proper security and supports decision making.

### **5.2 Bio-inspired trust and reputation model**

A trust and reputation model mainly consists of components such as information collection, ranking, entity selection, transaction, and finally reward points. Selection of the most trustworthy node is based on a bio-inspired ant colony algorithm: the average pheromone level is compared with a predefined threshold, and if it is larger, the node is trustworthy.
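The threshold comparison described above can be sketched directly. The pheromone readings and the threshold value are illustrative assumptions:

```python
# Bio-inspired selection sketch: a node is trustworthy when its average
# pheromone level exceeds a predefined threshold (values are illustrative).
def trustworthy_nodes(pheromone, threshold=0.5):
    """pheromone: node -> list of pheromone readings on its edges."""
    return {n for n, levels in pheromone.items()
            if sum(levels) / len(levels) > threshold}

pheromone = {"n1": [0.9, 0.7], "n2": [0.2, 0.4], "n3": [0.6, 0.5]}
print(sorted(trustworthy_nodes(pheromone)))  # ['n1', 'n3']
```

In a full ant colony algorithm the pheromone levels would themselves be updated by successful transactions, reinforcing trustworthy paths over time.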

Machine learning-based trust evaluation model: A trust evaluation model based on machine learning can overcome problems such as cold start and zero knowledge, which are disadvantages of traditional trust evaluation models. Machine learning algorithms such as logistic regression, k-means, DBSCAN, SVM, artificial neural networks, and decision trees are used to determine the direct trust value based on trust-related attributes.
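As one example of the logistic-regression route, the snippet below scores direct trust from a few attributes. The feature names and weights are hypothetical, chosen only to illustrate the inference step (in practice the weights would be learned from labelled interactions):

```python
import math

# Minimal sketch: scoring direct trust from trust-related attributes with a
# logistic model. The weights here are illustrative, not learned values.
def direct_trust(features, weights, bias=0.0):
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid maps the score into (0, 1)

# Assumed attributes: [interaction success rate, profile completeness, peer rating]
score = direct_trust([0.9, 0.7, 0.8], weights=[2.0, 1.0, 1.5])
print(score > 0.5)  # classified as trustworthy under a 0.5 cut-off
```

The sigmoid output can be used directly as a trust degree or thresholded into a trustworthy/untrustworthy label.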

Ant colony optimisation for trust evaluation: Ant colony optimisation (ACO) is a metaheuristic approach used to overcome problems of existing models. In wireless sensor networks, ACO finds the shortest path for packet transmission and the trust value is updated accordingly. In online social networks, the trust value is calculated from activities performed between users.

Human immune system: The artificial immune system is inspired by the human immune system and provides solutions against security attacks in IoT and wireless sensor networks. It builds a secure environment within the sensor network and evaluates trust between nodes. Different security algorithms and techniques, such as intrusion detection systems (IDSs), are built on the immune-system analogy.

### **5.3 Socio-inspired method**

The socio-inspired class of methods draws its inspiration from human psychology as displayed in historical and social relationships. Mankind has natural and inherent competitive inclinations, as well as the ability to collaborate, work together, and interact socially and culturally; this natural behaviour is used to build trust. All of these behaviours help an individual learn from and imitate the actions of other humans, allowing them to adapt and enhance their own behaviour over time [24]. Individuals tend to adapt and evolve faster through interactions in their social setting than through biological evolution based on inheritance, which gives rise to this family of trust evaluation methods.

Social network: Social networks have grown in popularity as a means of sharing information and connecting people with similar interests. Enterprises and governments stand to benefit greatly from the public accessibility of such networks, as well as from the capacity to share opinions, thoughts, information, and experience [5]. Social trust is defined by three elements: trusted information gathering, evaluation of trust values, and trust dissemination. In social networks, trust evaluation models are categorised as sociological trust, such as emotions and behavioural activities of users, and computational trust, evaluated from sociological trust values.

Socio-psychological: Because the media has such a large influence on public consciousness in today's environment, the question of trust is important. People form firm opinions on many issues based on what they have heard in the news or read on the Internet [25]. As a result, a person is exposed to several media channels, such as television, newspapers, and broadcast media, at the same time. Most people believe that the information they receive is the only correct version, which leads to the establishment of false beliefs that have nothing to do with the truth.

### **5.4 Computational methods**

Trust is an important entity for successful finance and social networks. If the trust factor is removed, the entire system will collapse, so mathematical models are built to define trust values in such applications [26]. Computational trust is measured using game theory, cognitive, and neurological approaches.

### **6. Game theory approach for social media**

The game theory approach is used for decision making in different fields, such as cloud computing and mobile ad hoc networks. In cloud computing, Nash equilibrium (NE) enhances trust evaluation at the boot-load level for the service provider and the end user or participant [27]; it also discourages the service provider and customer from breaching the service level agreement. The mathematical study of cooperation and conflict is known as game theory. It offers a unique and interdisciplinary approach to the study of human behaviour that may be applied to any circumstance in which each player's choice affects the utility of other participants, and in which players take this mutual influence into account when making decisions. This type of strategic interaction is often studied in human-centred fields such as economics, sociology, politics, and anthropology. Game theory is a powerful conceptual and procedural tool for studying social interaction, including game rules, the informational structure of interactions, and the payoffs associated with particular user decisions, and it can be applied to all behavioural fields in a unified way.

In the framework of game theory, a game is defined as a conflict between two agents: G, a trusting agent that receives data, and U, an agent that transmits data. Two strategies are available to each player. Agent G can either trust agent U or not trust agent U; agent U can either send correct data (the first strategy) or send false data (the second). Payoffs when players win or lose can be designed so as to consider the game in normal form and express it through a payoff matrix.

Because agent G cannot check or dispute the data at the time of receipt, the danger of losing reliable data must be considered. This requires introducing the concept of data value. Consider $\text{INFO}_i$ to be a piece of information pre-existing in the system, so that $\exists\, v(\text{INFO}_i): v(\text{INFO}_i) \neq v(\text{INFO}_j),\ i \neq j$, meaning each transmitted piece of information $i$ has its own value. Assume that the value of data or information decreases with time. If the information is received at time $t$, with $\exists\, t_f: 0 < t_f \leq t$ the time at which it was produced, then $\exists\, v(\text{INFO}_i, t_f, t): v(\text{INFO}_i, t_f, t) \leq v(\text{INFO}_i)$ is the value of the data $i$ at time $t$. It can be calculated by the equation:

$$v(\text{INFO}_i, t_f, t) = v(\text{INFO}_i) \times k_{\text{INFO}_i}(t_f, t), \tag{1}$$

where $k_{\text{INFO}_i}(t_f, t)$ is the relevance function of information $i$ at time $t$. We take $k(t_f, t) \neq 0$ as long as the agent cannot disprove the information, and let it be an exponential function with base $Ex$, as in the following equation, to calculate the actual value of data on the social network:

$$k_{\text{INFO}_i}(t_f, t) = \left(Ex_{\text{INFO}_i}\right)^{\,t - t_f} \tag{2}$$
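As a minimal illustration of Eqs. (1) and (2), the snippet below computes the decayed value of a piece of information; the numeric values of $v(\text{INFO}_i)$ and of the base $Ex$ are arbitrary assumptions chosen only for the example:

```python
# Sketch of Eqs. (1)-(2): the value of information decays exponentially
# between the time it was produced (tf) and the time it is received (t).
# A base ex in (0, 1) makes the value shrink as t - tf grows.
def info_value(v0, ex, tf, t):
    k = ex ** (t - tf)   # Eq. (2): relevance k_INFOi(tf, t)
    return v0 * k        # Eq. (1): v(INFOi, tf, t) = v(INFOi) * k

print(info_value(v0=10.0, ex=0.5, tf=0, t=2))  # 10 * 0.5**2 = 2.5
```

At $t = t_f$ the factor $k$ equals 1 and the full value is preserved, matching the constraint $v(\text{INFO}_i, t_f, t) \le v(\text{INFO}_i)$.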

The payoff function $f_G$ of agent G can be described by the equation:

$$f_G(x, y) = \begin{cases} v(\text{INFO}_i), & x = 1, y = 1 \\ 0, & x = 1, y = 2 \\ v(\text{INFO}_i, t_f, t), & x = 2, y = 1 \\ v(\text{INFO}_i, t_f, t), & x = 2, y = 2 \end{cases} \tag{3}$$

where $x$ and $y$ index the strategies of agents G and U. For agent U, the biggest gain, $\text{Truth}(\text{INFO}_i) = 1$, occurs when agent G trusts while agent U has lied, and the minimal gain occurs when agent G trusts U and U provides correct data. To denote the winnings of agent U, we introduce the payoff function presented in the following equation.

$$f_U(x, y) = \begin{cases} -1, & x = 1, y = 1 \\ 1, & x = 1, y = 2 \\ 0, & x = 2, y = 1 \\ 0, & x = 2, y = 2 \end{cases} \tag{4}$$


where $x$ and $y$ index the strategies of agents G and U, respectively.
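To make the normal-form game concrete, the sketch below encodes the payoff matrices of Eqs. (3) and (4) and computes U's best response; the numeric values for $v(\text{INFO}_i)$ and its decayed counterpart are illustrative assumptions:

```python
# Strategies: for G, 1 = "trust U", 2 = "do not trust U";
# for U, 1 = "send true data", 2 = "send false data".
def payoffs(v, v_t):
    """Payoff matrices of Eqs. (3)-(4); v = v(INFOi), v_t = v(INFOi, tf, t)."""
    f_G = {(1, 1): v, (1, 2): 0.0, (2, 1): v_t, (2, 2): v_t}
    f_U = {(1, 1): -1.0, (1, 2): 1.0, (2, 1): 0.0, (2, 2): 0.0}
    return f_G, f_U

def best_response_U(f_U, x):
    """U's best reply to G's strategy x."""
    return max((1, 2), key=lambda y: f_U[(x, y)])

f_G, f_U = payoffs(v=10.0, v_t=2.5)
# When G trusts (x = 1), U maximises its payoff by lying (y = 2):
print(best_response_U(f_U, 1))  # 2
```

This reproduces the point made above: U's payoff is maximal when G trusts and U lies, which is exactly the incentive structure a trust mechanism must counteract.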

User behaviour in social networks is a type of dynamic interaction that evolves continuously throughout the development process. The main characteristics of social networks are reflected in user engagement behaviours: identifying node attributes, investigating secret nodes, identifying viral marketing influencers, and investigating node centrality. Exploring secret nodes is crucial in complicated social networks because it can help detect terrorists sooner, recommend products to potential buyers, and uncover origins of misinformation.

### **7. Conclusion**

This work surveys the existing psychology of trust mechanisms. It describes trust and trustworthiness with respect to various domains, such as social networks, computerised systems, and economics, and reviews trust management techniques in cloud computing, cryptography, and machine learning. It also discusses trust evaluation methods categorised as bio-inspired, socio-inspired, computational, and analysis-based. In particular, this study organises existing trust evaluation methods into sub-categories based on different trust-level calculation techniques, such as the game theory approach and machine learning. The evaluation criteria focus on the advantages and disadvantages of the different trust evaluation techniques. The article also highlights issues and challenges in trust management in various fields to guide further research.

### **Author details**

Ujwala Ravale<sup>1</sup> \*, Anita Patil<sup>2</sup> and Gautam M. Borkar<sup>2</sup>

1 Department of Computer Engineering, SIES Graduate School of Technology, Navi Mumbai, India

2 Department of Information Technology, Ramrao Adik Institute of Technology, D Y Patil Deemed to be University, Navi Mumbai, India

\*Address all correspondence to: ujwala.ravale@siesgst.ac.in

© 2022 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/ by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

### **References**

[1] Fan X, Liu L, Zhang R, Jing Q, Jing Ping B. Decentralized trust management: Risk analysis and trust aggregation. ACM Computing Surveys. 2019;**53**(1):1-33. Article ID: 2

[2] Balaji PG, Srinivasan D. An introduction to multi-agent systems. In: Innovations in Multi-Agent Systems and Applications. SCI, Springer; 2010; **310**:1–27

[3] Granatyr J, Botelho V, Lessing O, Scalabrin EE, Barthes J-P. Trust and reputation models for multi-agent systems. ACM Computing Surveys. 2015;(2):1-42

[4] Pinyol I, Sabater-Mir J. Computational trust and reputation models for open multi-agent systems: A review. Artificial Intelligence Review. 2013;**40**:1-25

[5] Sherchan W. A survey of trust in social networks. ACM Computing Surveys. August 2013;**45**(4):1-33. Article ID: 47

[6] Jiang W, Wang G, Bhuiyan ZA, Jie W. Understanding graph-based trust evaluation in online social networks: Methodologies and challenges. ACM Computing Surveys. March 2017;**49**(1):1-35. Article ID: 10

[7] Keum DH, Lim J, Ko Y-B. Trust based multipath QoS routing protocol for mission-critical data transmission in tactical ad-hoc networks. In: Security and Privacy in Wireless Sensor Network (Basel). June 2020;**20**

[8] Ali A, Ahmed M, Khan A, Ilyas M, Razzaq MS. A trust management system model for cloud. In: International Symposium on Networks, Computers and Communications (ISNCC). 2017

[9] Kerrache CA, Calafate CT, Cano J-C, Lagraa N, Manzoni P. Trust management for vehicular networks: An adversary-oriented overview. IEEE Access. 2016

[10] Cho J-H, Swami A, Chen I-R. A survey on trust management for mobile Ad Hoc networks. IEEE Communications Surveys & Tutorials. 2011

[11] Wang Y, Cai Z, Yin G, Gao Y, Tong X, Han Q. A game theory-based trust measurement model for social networks. Computational Social Networks. 2016

[12] Wang J, Qiao K, Zhang Z. Trust evaluation based on evidence theory in online social networks. International Journal of Distributed Sensor Networks. 2018;**14**(10)

[13] Chen X, Yuan Y, Lu L, Yang J. A multidimensional trust evaluation framework for online social networks based on machine learning. IEEE Access. 2019;**7**

[14] Peng Z, Rastgari M, DorostkarNavaei Y, Daraei R, Oskouei RJ, Pirozmand P, et al. TCDABCF: A trust-based community detection using artificial bee colony by feature fusion. Mathematical Problems in Engineering. 2021;**2021**

[15] Liu S, Zhang L, Yan Z. Predict pairwise trust based on machine learning in online social networks: A survey. IEEE Access. 2018;**4**

[16] Mohammadi A, Golpayegani SAH. SenseTrust: A sentiment-based trust model in social network. Journal of Theoretical and Applied Electronic Commerce Research. 2021

[17] Mayadunna H, Rupasinghe L. A trust evaluation model for online social networks. In: National Information Technology Conference (NITC); IEEE. 2018

[18] Liu G, Li C, Yang Q. NeuralWalk: Trust assessment in online social networks with neural networks. In: IEEE INFOCOM—IEEE Conference on Computer Communications. 2019

[19] Khalil A, Mbarek N, Togni O. Fuzzy Logic based security trust evaluation for IoT environment. IEEE/ACS 16th International Conference on Computer Systems and Applications (AICCSA). 2019

[20] Kesarwani A, Khilar PM. Development of trust-based access control model using fuzzy logic in cloud computing. The Journal of King Saud University: Computer and Information Sciences. Part A. 2019;**34**(8):4956-4967

[21] Anita X, Bhagyaveni MA, Manickam JML. Fuzzy-based trust prediction model for routing in WSNs. The Scientific World Journal. 2014;**2014**:11 pages. Article ID: 480202

[22] Mujawar TN, Bhajantri LB. Behaviour and feedback-based trust computation in cloud environment. Journal of King Saud University: Computer and Information Sciences. September 2020;**34**(8):4956-4967

[23] Boukerche A, Xu L. An agent-based trust and reputation management scheme for wireless sensor networks. In: IEEE Global Telecommunications Conference (GLOBECOM). 2005

[24] Kumar M, Kulkarni AJ. Socioinspired optimization metaheuristics: A review. Socio-cultural Inspired Metaheuristics. Studies in Computational Intelligence. Springer. 2019;**828**:241–265

[25] Shpak M, Kichuk A, Sytnyk O, Ishchuk N, Filonenko D, Hrozna O. Socio-psychological factors of user trust in information in electronic mass communication. Special Issue: Innovation in the Economy and Society of the Digital Age. 2021;**39**(5)

[26] Trcek D. Computational trust management, QAD, and its applications. Informatica. March 2014;**25**(1):139-154

[27] Gokulnath K, Uthariaraj R. Game theory based trust model for cloud environment. The Scientific World Journal. 2015;**2015**:10 pages. Article ID: 709827

### **Chapter 2**

## Interpersonal Trust within Social Media Applications: A Conceptual Literature Review

*Kevin Koidl and Kristina Kapanova*

### **Abstract**

Interpersonal trust within social media applications is a highly discussed topic. The debate ranges from trusting the application, related to security and privacy, to trusting content and the underlying content delivery algorithms. Several trust-related phenomena have surfaced in recent years, known as filter bubbles, echo chambers, and fake news. Addressing these phenomena is often pushed either to the regulator or directly to the provider of the social media application. Interpersonal trust within social media applications is a more complex topic, not limited to the application or the content: it has to include the behaviour of the user. To broaden the debate beyond the prevalent focus on the application and content, this paper presents a conceptual literature review studying interpersonal trust within social media, with the goal of deepening the understanding of the complex interplay between user behaviour and interpersonal trust. Based on this review, modalities of interpersonal trust are identified and presented. To extend these findings, an information-dense, word-embedding-based analysis using unsupervised machine learning techniques is presented.

**Keywords:** social media, trust, truth, literature review, machine learning

### **1. Introduction**

Social media is an important part of interpersonal communication and essential for building and maintaining lasting and meaningful relationships. Recently, social media has been criticised by policymakers for promoting and spreading content that is not truthful, often referred to as fake news, which has led to a *crisis of trust* [1]. In addition, the global pandemic has moved several physical interactions online, with developments such as remote work and online learning being conducted online with a significant impact on trust-based interpersonal interactions.

At the core of this crisis lies the question of responsibility. Technology providers tend to push responsibility to the users by claiming that the application only facilitates the transaction and cannot be responsible for the nature or purpose of the content. This, however, is rejected by policymakers, who tend to argue that personal information is misused and sold for content targeting. On the one hand, it can be argued that social media providers should protect the interests of their users and ensure that their personal information is not used to target them with potentially false or harmful information. On the other hand, it can be argued that users should become more aware of such information and not 'trust' everything they see. This includes making their own background checks and spending the time to investigate the source and intention of the message. This trust-related debate therefore raises the question of what responsibility the user holds, when trusting information spread on social media applications, in validating its trustworthiness. Beyond this reduced, transaction-based point of view between the user and the social media application, interpersonal trust has to be investigated. The recent debate related to echo chambers and filter bubbles points to the fact that users tend to trust content from trusted peers more than from unknown users. Furthermore, users tend to focus more on the engagement their own content garners and less on the content they engage with themselves [2]. Content engagement is used to gauge trust, for example via likes, shares, comments and reactions in the form of emoticons. The main assumption is that content with high engagement is most likely content that can be trusted [3]. However, it is easy to conclude that content engagement is not suited to assess whether content is true, false or misleading, mainly because any reaction can be fabricated (e.g. by false accounts). To compound this challenge, the underlying content distribution algorithms of social media applications react strongly to content that receives increased engagement, assuming that content with high engagement is interesting to more users; hence the content is spread wider and faster. This can affect information diffusion and the role of the users.
Indeed, social media algorithms give more visibility to content with higher engagement by hiding content with less engagement (e.g. posts in Facebook groups). Trust, therefore, cannot be assessed by assuming content is trustworthy due to its level of engagement. Based on this assessment, two possible viewpoints can be introduced: the first points to the social media application screening content, the second to the user needing to trust their own ability to judge content. Furthermore, a good trust model needs to take into account several aspects, including the level of trustworthiness a user has towards both the content and the user sharing it. Moreover, it can be argued that if users do not trust their own ability to assess information, they might prefer a regulator to decide. This, however, points to the challenges of censorship and how political bias within the screening teams should be handled.

In addition to social media-related interpersonal trust, the recent global pandemic has led to increased online usage, with several physical social interactions moving online, specifically online working and online learning. However, there is to date no indication that the global pandemic and its impact on online activity has had a significant impact on social media-based interpersonal interactions, or that it has changed anything in relation to trust dynamics in social networks. Pandemic-related online technologies are mostly focused on live and real-time video conferencing without the need for a social network or any related social technologies that create a social activity overlay. There are instances of trust-related aspects, such as companies trusting their workers less due to the lack of control and insight. This argument has led to the increased development of surveillance technology for online workers, which, however, is not directly related to the interpersonal trust covered in this article.
It remains, however, an interesting and ongoing topic to reflect on how the pandemic, once it is over, will have changed the dynamics of online and social media-related interpersonal trust.

This article is organised as follows. First, a discussion of interpersonal trust within social media applications is provided. This is followed by a conceptual literature review resulting in the identification of modalities of trust in social media.

*Interpersonal Trust within Social Media Applications: A Conceptual Literature Review. DOI: http://dx.doi.org/10.5772/intechopen.103931*

Finally, a rudimentary and brief information-dense word-embedding analysis is provided to illustrate how the terminology around interpersonal trust is used within the state of the art of social media research. This concluding study is based on unsupervised machine learning techniques. The article closes with a discussion.

### **2. Interpersonal trust in social media applications**

Trust is a complex construct and often defined from different perspectives. This makes it difficult to define and to categorize conceptually. In this section, we seek to provide an overview of definitions and categories of trust with the goal to frame the conceptual literature review discussed in Section 3.

### **2.1 Trust and trustworthiness**

A common opinion is that connected people trust one another's content. However, trust is a far more complex concept which takes several aspects of the human dynamics into account. Different definitions of trust can be related to the real-world relationships of people and based on this the trust aspects of the relationship take various aspects and definitions into account.

From a general perspective, it can be argued that modern societies are becoming increasingly complex due to technologies that provide instant access to a large amount of information and services. Based on this, it can be argued that trust is a key concept that enables all members of a complex society to deal with a high level of complexity. Only by trusting the technology to do the job right is it possible to 'give away' control to technology. The same argument holds for non-technical processes, such as financial and regulatory processes in which trust is placed in a central bank and/or government. Trust is hence an essential fabric of our society, without which its complexities would not be manageable and existing within a complex society would not be possible.

On a more specific level, trust can be defined as a derivation of reciprocity, learned when people cooperate with others, as in associations and other forms of voluntary organisation [1]. In addition to this definition, trust is composed of personal values (e.g. personal happiness), but also of political and economic values.

The argument of complexity reduction, central to trust, is aligned with the concept of trusted agents. A trust agent has the purpose of completing tasks on behalf of a person; it can be a person, a governmental agency or a technology. However, trusting an agent is not easy. The main reason for this is the role of risk in trust [4, 5]: the more an individual trusts, the more risk the individual is willing to take. In this context O'Hara [6] discusses that, for technology to increase the quality of life, it is necessary that technology assists in increasing trustworthiness throughout society. However, it can be argued that social media applications, as some of the most used technologies for social interactions within societies, specifically those relying on social interactions around content, are not designed to increase trustworthiness. They typically revert to simplified low-risk substitutes of trust, such as ratings, recommendations and engagement. The risk argument is essential, however: without risk there is no trust. On the flipside, this implies that any action that holds no risk requires no trust. The result of risk minimisation within social interactions in social media applications is a significantly decreased impact of such applications on the quality of social interactions, and with that on the quality of life throughout societies [6]. As mentioned above, trust as a concept is complex, since it is based on a person's beliefs and attitudes. Therefore, it is challenging to understand what properties within social interactions increase trustworthiness, and specifically how these properties can be utilised in a mostly automated social media application.

From an economic point of view, an essential viewpoint especially when discussing publicly traded social media companies, trust is partially a product of people's capacity to assess the trustworthiness of their potential partners. People, as homo economicus, often calculate the costs and projected outcomes of their decisions to trust. From a rational perspective, trusting involves expectations about interaction partners based on calculations which weigh the costs and benefits of certain courses of action to either the trustors or the trustees [7]. In this context Weber et al. [8] note that in some cases people display a willingness to trust people they do not know and will never meet or see. A more technology-related view is taken by Friedman et al. [9], in which an end-user must first trust the atmosphere (technology and human community combined), and only then are the interacting partners positioned to trust any particular online interaction with other people. In addition, trust can depend on non-rational factors, such as love or altruism, and may involve a loose confluence of diverging interests. In extreme cases, trust is even necessary when people are in desperate situations from which they cannot extricate themselves [10], for example when two parties have an asymmetrical dependency in a trusting relation: one is dependent on the other, but not the other way around [10]. Lewis and Weigert [11] argue that trust, from a sociological perspective, should be viewed as a property of collective units (such as groups and collectives), and not of isolated individuals. As a collective attribute, trust is applicable to the relations among individuals rather than to their psychological states taken individually [11].

It is not clear, however, what role social media applications play in increasing or decreasing interpersonal trust, and what implications this has for an overall society which can only function if trust exists. Several research studies have shown the wide effect that social media have on the creation of trust. However, researching trust within online interactions is a complex task; replicating physical interaction, with its set of interpersonal cues, in the context of online exchange may be a feasible method to promote online trust. We postulate that the infusion of social presence in websites for online transactions may increase users' trust in online organisations, which is in line with Beldad et al. [10]. Thus, the problem of establishing trust online is how to do so in light of uncertainty about both the magnitude and the frequency of risk and potential harm [9]. A further complication is the inclination to view trustors and trustees symmetrically, under the premise that each party interprets the other's actions similarly [8].

In the context of online trust, the functions of ongoing image and reputation management are important to discuss. Potential partners carry the burden not only of creating trust but also of maintaining it, and this process involves the duty of presenting themselves as trustworthy persons [12]. This corresponds to Goffman's presentation of the self, which proposes that people are constantly engaged in managing and controlling the impressions they make on others to attain their goals [13]. Specifically, in interhuman relationships, trust can be viewed as a product of people's capacity to assess the trustworthiness of their potential partners. More specifically, trust can therefore be considered as the reflected trustworthiness of the trustees, subjectively entertained in the judgment of the trustors [14]. A further view on trust is offered by Zand [15], who views trust as a concept that increases vulnerability to others whose behaviour one cannot control. Essentially, online trust is defined as an attitude of confident expectation, in an online situation of risk, that one's vulnerabilities will not be exploited, which can be viewed as an argument that trust in offline settings is applicable to trust in an online environment [16]. Following the argument of increased vulnerability, Zand [15] argues that trust can be viewed as the willingness of people to be vulnerable to the actions of others, based on the expectation that the latter will perform a particular action important to the former, irrespective of the ability to monitor and control the latter. The idea of being vulnerable when trusting points to the realisation that, while uncertainties and ambiguities abound in all forms of exchanges and transactions, risks creep underneath. Doney et al. [17] extend this argument by noting that the sources of risk are related to vulnerability and/or uncertainty about an outcome. Therefore, trust can be regarded as people's behavioural reliance on others under a condition of risk [18].

The connection between risk and trust has been highlighted by Rousseau et al. [19]. If trust is considered one of the major concepts of an online/offline human social relationship, two specific aspects characterise a trust relationship between humans: risk and interdependence. Risk concerns the intention of the other party, which is not certain beforehand, while interdependence concerns the related interests of the two parties. These two conditions are needed to consider a human relationship a trust relationship, and changes in these two factors may change the level of trust [19].

In relation to risk, Koller [20] and Lewis and Weigert [11] ask: do we trust because there are risks, or do we take risks because we trust? The first question emphasises that risks determine trust, while the second supposes that trust is an antecedent of risk-taking behaviour in any relationship, whatever the form of risk-taking, according to Mayer et al. [21]. This, however, depends on the situation and/or context. People's level of trust in their interaction partners is positively related to the perceived risks present in the situation, meaning that an increase in risk perception could result in an augmentation of people's degree of trust [20, 21].

There are several more specific viewpoints on what makes a person trustworthy, and with that different perspectives on how this can be achieved online in comparison to offline. Sztompka [14], for example, employs three criteria in estimating the trustworthiness of a person: reputation, performance and appearance. Mayer et al. [21] describe that trustworthiness occurs when the transactional partners (1) have the required skills, competencies and characteristics that enable them to exert influence within a specific domain (competence criterion), (2) are believed to do good to trustors (no egocentric motive), and (3) are perceived to adhere to a set of principles that trustors consider acceptable, a definition of integrity [10].

In digital media studies, authenticity has often been discussed in relation to online identity and self-presentation [22]. Social media especially has changed the possibilities for self-presentation. The main reason for this is that users present themselves in a flattened spatial and temporal context [23]. Social media changed the nature of the interpersonal relationship in two ways: space and time. Time, because the Internet is able to reduce the barriers of time thanks to asynchronous communications; space, because social information can spread to a very wide set of interested users [24].

This relates to Goffman's impression management framework, addressing challenges for the separation of backstage and frontstage identity performances [10, 25], which relates to users' online and offline personas. Social media allows users to perform strategic authenticity by revealing personal information, displaying symbolic connections, and responding to their audiences immediately and regularly. This controlled selection, along with monitored self-disclosure [23] and constant redaction of profiles [26], helps users to perform authenticity for multiple audiences by presenting themselves in different ways based on different strategic personas. Based on [4], authenticity stems from the construction of identity. Giddens explains that an authentic person is one who knows herself and is able to reveal that knowledge to the other. Based on this, it can be argued that social media applications influence how individuals build and express the overarching biographical narrative upon which their authenticity claims rest [27].

A trust dimension related to authenticity is reputation, which is often represented online as reviews and ratings. It can be argued that the ever-increasing popularity of review websites that feature user-generated opinions (e.g., TripAdvisor and Yelp) is increasingly gamed to increase monetary value through opinion spam (e.g. fraudulent reviews) [28]. Ratings and reviews, therefore, are a weak indicator of trust, simply because it is impossible to gauge whether the person who produced the rating or review behaved with goodwill; furthermore, there is no incentive for them to do so. Trust is further reduced by decreasing anonymity, increasing violations of privacy and the undermining of personal autonomy [9]. Further reasons for reduced trust in online reputation are emotional bias and deceptive opinion spam, both of which are highly subjective and mostly motivated by reasons other than validating reputation. Moreover, there is a high risk of reviews and ratings being purchased and therefore false. Should a person be identifiable and therefore related to a real person, the history of comments, reviews and ratings can create a collection of records indicating the user's performance in prior transactions, which can increase trustworthiness [29].

In relation to other online applications that revolve around use cases including social interactions, video-calling applications can be mentioned. In relation to trust, these applications overlap slightly with social media apps, specifically if the social media app is focused on short-form video presentation. Within such apps, several technology enhancements, such as facial filters and background filters, can lead to a distortion of the person's actual look and of where the person is. There are two sides to the pandemic-related surge in online social applications: short-form (recorded) video posted on a social media platform, and live transmission for meetings and learning, which takes place either on a social media platform via a live feature or directly via one of the many freely available video conferencing tools.

Trust aspects of real-time video are understudied; however, due to the added visual element, they are similar to those of posted content. Content can thus be viewed widely not only as posts containing images and text but also as videos, which, as argued above, underly commenting, sharing, rating, etc., all of which are prone to being used to validate trust. In relation to real-time video, however, interactions change drastically, with trust being assessed based on multimodal content experiences, such as image, voice and speech in real time [REF]. A more worrying development in relation to trust and video-based content, live or recorded, has become known as the deep fake [REF]. This AI-empowered technology allows not only the changing of the look and feel of an individual within a video, including voice and speech, but also the fake replication of a person as if it were that person [REF]. Deep fake technologies will further evolve and become easier and cheaper to produce. To validate the authenticity of such videos and interactions, which can still be categorised under the umbrella of fake news, technologies will need to evolve [REF].

The following sub-section takes a closer look at methods to measure and model interpersonal trust related to social interaction within social media applications.


### **2.2 Measuring and modelling interpersonal trust in social media applications**

Trust has several properties, which are usually used to define models of trust or to measure it. As concerns social media applications [30], the most interesting ones are: dynamic, propagative, non-transitive, asymmetric and composable. Dynamic means that trust can change: it can increase, decrease or decay with time. There are two approaches to updating the trust value: event-driven, where all trust data are updated when an event happens, and time-driven, where trust is periodically updated. Trust is propagative, which means that if a generic user A trusts B, and B trusts C, A can derive some trust in C; this is the basic concept of a recommendation system. However, trust is not transitive: A trusting B and B trusting C does not imply that A fully trusts C. The composable property means that propagation of trust (and distrust) can follow long social chains, which allows a member A to create a trust connection with a member D that was not directly connected to A. When several social chains recommend different values of trust for member D, A needs to compose the trust information. Finally, trust is asymmetric, which means that the level of trust between two members is not necessarily the same in both directions: a member A may have a certain level of trust in a member B, but the level of trust from B to A can be higher or lower. We can, furthermore, note a strong correlation between trust and similarity: users with trust relations are likely to be similar, and this similarity is called homophily. Based on these properties, several models have been proposed, usually classified by the propagative characteristic. The techniques used include statistical and machine learning techniques, heuristics-based techniques and behaviour-based techniques.
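The asymmetric, propagative and composable properties above can be illustrated with a small sketch. This is a hypothetical model, not code from the chapter: the class name `TrustGraph`, the decay factor and the averaging rule in `compose` are illustrative assumptions.

```python
# Hypothetical sketch: trust as a directed, weighted graph between users.
from typing import Dict, List, Tuple

class TrustGraph:
    def __init__(self):
        # Asymmetric by construction: trust[(a, b)] need not equal trust[(b, a)].
        self.trust: Dict[Tuple[str, str], float] = {}

    def set_trust(self, src: str, dst: str, value: float) -> None:
        self.trust[(src, dst)] = value

    def direct(self, src: str, dst: str) -> float:
        return self.trust.get((src, dst), 0.0)

    def propagate(self, chain: List[str], decay: float = 0.9) -> float:
        """Propagative property: trust weakens along a social chain A -> B -> D."""
        value = 1.0
        for a, b in zip(chain, chain[1:]):
            value *= self.direct(a, b) * decay
        return value

    def compose(self, chains: List[List[str]]) -> float:
        """Composable property: average the trust recommended by several chains."""
        values = [self.propagate(c) for c in chains]
        return sum(values) / len(values) if values else 0.0

g = TrustGraph()
g.set_trust("A", "B", 0.8)
g.set_trust("B", "D", 0.5)
g.set_trust("A", "C", 0.6)
g.set_trust("C", "D", 0.9)
# Two social chains recommend trust in D; A composes them into one value.
composed = g.compose([["A", "B", "D"], ["A", "C", "D"]])
```

Note that `g.direct("B", "A")` stays 0.0 even though A trusts B, reflecting asymmetry, and the decay factor models the non-transitive weakening of trust along longer chains.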

### **3. Conceptualization of interpersonal trust in social media**

Trust in technology differs significantly from interpersonal trust. Specifically, the technical environment is one of the building stones helping to build trust between people and can, therefore, be referred to as a socio-technical research challenge. Trustworthy environments are intertwined with social aspects, and together they build the trust that results in the usage of an online application. The following overview serves as a high-level conceptualisation of how trust is modelled and realised within social media applications (**Figure 1**). This overview has been developed based on the literature review in the table below the illustration. Its main categories are defined as modalities of trust in interpersonal interactions within social media applications: perceived risk, perceived reputation, perceived authenticity and perceived complexity reduction. Each modality is mapped to concepts derived from the literature review.

Socio-technical research extending this literature in relation to interpersonal trust in social media applications can be found in the following table.

- Dwyer et al. [31], *Trust and privacy concern within social networking sites: a comparison of Facebook and MySpace*: a study describing the impact of trust and Internet privacy concern on the use of social networking sites for social interactions; comparison of Facebook and MySpace.
- Fogel and Nehmad [32], *Internet social network communities: risk-taking, trust, and privacy concerns*: risk-taking, trust and privacy attitudes on social networks (MySpace, Facebook) among 205 college students, using scales and ANOVA.

Perceived risk (verification, privacy)

### **Figure 1.**

*Modalities of trust in social media.*




### Perceived reputation (aspects of reputation)



### Perceived authenticity (personas, anonymity, information quality)




### Perceived complexity reduction




> on the thematic complexity, the employed research methods, the expertise, or the motivations of the researchers. The results reveal that prior attitudes determine judgments about the user comments, the attacked claims, and the source of the claim. After controlling for attitude, people agree most with thematic complexity comments, but the comments differ in their effect on perceived claim credibility only when the comments are made by experts.

To extend the categorisation and introduction of modalities of trust in social media applications, we introduce a machine learning-based analysis of terminology usage in relation to the trust modalities introduced above. This analysis is conducted using word embeddings within social media-related publications. The main objective of the following section is to assess the popularity of trust-related aspects in social media publications.

### **4. Analysis of modalities via information-dense word embeddings with unsupervised machine learning techniques**

In order to assess the popularity of the discussed topic, the following section discusses trust and social media as a topic that is becoming increasingly important, especially with recent trust breach cases such as Cambridge Analytica. In addition, the emerging field of autonomous networks and the requirement for trust evaluation (e.g. in ad-hoc networks) indicates the need for increased scientific production in the field of interpersonal trust in social media applications. To gain a more comprehensive overview of the debate, an in-depth review of trust-related keywords is presented below. The extraction of facts, knowledge and relationships from this increasing body of literature requires a more generalised approach, such as machine learning-based text mining over a collection of abstracts. To achieve this, we relied on natural language processing techniques, such as doc2vec, for word embeddings performed on abstracts from scientific papers containing the keywords 'trust' and 'social networks'. We focus on the abstracts since they represent a compressed view of the informational content, according to Atanassova et al. [57]. The decision to analyse abstracts only was also supported from a processing point of view: abstracts are typically short (usually about 300 words) and available as part of the metadata, so access to them is relatively easy. The processing was conducted with publications up to 2020. The main rationale for this was to exclude discussions around the impact of the pandemic, which in the view of the authors would distort the topic of this paper. Further publications, once the pandemic is over, can apply this same approach to compare the impact of the pandemic on the interpersonal trust-related debate in social media.

### **4.1 Data collection**

From Google Scholar we collected 560 unique articles in English, which had 'trust' and 'social network' in their keywords. This selection was focused on articles

**Figure 2.** *The figure shows the publication year of the collected and analysed articles.*

within the research field of computer science and related fields such as computational sociology. The article dates ranged from 1980 to 2020; the analysis can therefore be defined as pre-pandemic. From each article we extracted and processed the article name, the publication year and the article's abstract. **Figure 2** shows a histogram of the number of collected articles per year.

### **4.2 Preprocessing**

In order to prepare the abstract texts for natural language processing, we tokenised each document, processing the abstract of each article as a separate document. The result of this pre-processing was a bag-of-words consisting, for each token (a non-stop word, hence any term that holds meaning), of a (token-id, token-count) 2-tuple; together these created the text corpus for further research. In a second step all tokens were normalised. This resulted in 4623 unique words representing the overall size of the processed corpus (the size of the vocabulary), with a vector size of 300 (which was defined manually). In the following subsection we discuss the overall findings.
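The preprocessing step described above can be sketched roughly as follows. The stop-word list, function names and toy abstracts are illustrative assumptions, not the authors' actual pipeline.

```python
# Illustrative sketch: tokenise each abstract, drop stop words, and build a
# bag-of-words of (token-id, count) tuples per document.
import re
from collections import Counter

# A small stand-in stop-word list; real pipelines use a fuller list.
STOP_WORDS = {"the", "a", "of", "in", "and", "to", "is", "on", "for"}

def tokenize(text: str) -> list:
    """Lowercase, keep alphabetic tokens, drop stop words."""
    return [t for t in re.findall(r"[a-z]+", text.lower()) if t not in STOP_WORDS]

def build_corpus(abstracts):
    vocab = {}                 # token -> token-id
    corpus = []                # one bag-of-words per document
    for doc in abstracts:
        counts = Counter(tokenize(doc))
        bow = []
        for token, count in counts.items():
            token_id = vocab.setdefault(token, len(vocab))
            bow.append((token_id, count))   # (token-id, token-count) 2-tuple
        corpus.append(bow)
    return vocab, corpus

vocab, corpus = build_corpus([
    "Trust in social networks is a measure of trust between users.",
    "A trust model for social media applications.",
])
```

Each abstract becomes a list of (token-id, count) pairs, matching the 2-tuple representation described in the text; the vocabulary grows incrementally as new tokens are seen.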

### **4.3 Findings**

After the data processing stage we performed a frequency analysis of the words that were collected from the abstracts. The plot below (**Figure 3**) shows the 10 most frequent words, with 'trust', 'social', 'network', and 'user' being the most frequent.

It has to be noted that word frequency analysis disregards important relations between the words. To mitigate this effect, we selected the 20 most common bi-grams and tri-grams in the data set.

**Figure 3.** *Most frequent words.*

**Figure 4.** *Top 20 bigrams.*

In the bi-gram case (see **Figure 4**) we can see that 'trust' co-occurs with 'social trust', 'trust model', 'trust network', 'based trust', 'trust management', 'trust social', 'trust reputation', 'trust relationships', 'trust-based', 'trust distrust', and 'trust evaluation'. In the case of tri-grams (see **Figure 5**), 'trust' interestingly appears in relation to 'trust news media', 'context-aware trust', 'trust reputation systems', 'trust social commerce', 'trust social media' and 'trust social networks'. In the case of 'context-aware trust', it is interesting to note that the notion of trust is related to the specific context users find themselves in; therefore trust values differ depending on the context.
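A minimal sketch of how such n-gram counts can be produced with Python's standard library; the token list is toy data, not the paper's corpus.

```python
# Count bi-grams and tri-grams over a token sequence with collections.Counter.
from collections import Counter

def ngrams(tokens, n):
    """All length-n sliding windows over the token sequence."""
    return list(zip(*(tokens[i:] for i in range(n))))

tokens = ("trust social network trust model social network "
          "trust model trust social").split()
bigrams = Counter(ngrams(tokens, 2))
trigrams = Counter(ngrams(tokens, 3))
# The most common n-grams are then taken with .most_common(20).
top_20_bigrams = bigrams.most_common(20)
```

`zip(*(tokens[i:] for i in range(n)))` is the standard idiom for sliding windows; `Counter.most_common(20)` then yields the top-20 list that a plot such as Figure 4 would be drawn from.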

**Figure 6.** *Word vector representation based on semantically close words to "trust" from the doc2vec model.*

To further analyse the relationships between words, the corpus was used for a word-embedding analysis in which semantically similar words are mapped to proximate points in geometric space. As shown in **Figure 6** below, the words semantically most similar to 'trust' are 'application', 'context', 'prediction', 'recommend', and 'collaborative'. For 'trustworthy' the most similar words were 'applicable', 'approach', 'exist' and 'relation'.
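'Semantically similar' here means close under cosine similarity in the embedding space. The sketch below shows such a proximity ranking on toy 3-dimensional vectors; the study itself used 300-dimensional doc2vec embeddings, and the words and values below are invented.

```python
import math

# Invented toy embeddings; the study's doc2vec vectors had 300 dimensions.
embeddings = {
    "trust":       [0.9, 0.1, 0.2],
    "application": [0.8, 0.2, 0.3],
    "banana":      [0.0, 1.0, 0.0],
}

def cosine(u, v):
    """Cosine similarity, the usual proximity measure in embedding space."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def most_similar(word, k=1):
    """Rank every other vocabulary word by cosine similarity to `word`."""
    ranked = sorted(((other, cosine(embeddings[word], embeddings[other]))
                     for other in embeddings if other != word),
                    key=lambda pair: pair[1], reverse=True)
    return ranked[:k]
```

In this toy space 'application' ranks closest to 'trust', mirroring the kind of neighbourhood shown in **Figure 6**.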

To gain a deeper understanding of the topics covered in the publications, we applied topic modelling to the data set. For this the latent semantic analysis (LSA) technique was used, resulting in the topic clusters indicated in **Figure 7**. The largest number of articles was clustered around the topic 'trust social network'.
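At its core, LSA takes a singular value decomposition of a term-document matrix; the leading singular vector weights the terms that define the dominant latent topic. The sketch below extracts that vector by power iteration on a toy corpus (the real analysis used the full vocabulary and standard tooling).

```python
import math

# Toy tokenized abstracts; 'trust' recurs across all of them.
docs = [
    ["trust", "social", "network"],
    ["trust", "social", "network", "model"],
    ["trust", "online", "reputation"],
]
terms = sorted({t for d in docs for t in d})
A = [[d.count(t) for t in terms] for d in docs]  # document x term counts

def top_topic(A, iters=100):
    """Leading right singular vector of A via power iteration on A^T A.
    Its largest components mark the terms defining the strongest LSA topic."""
    m = len(A[0])
    v = [1.0] * m
    for _ in range(iters):
        Av = [sum(row[j] * v[j] for j in range(m)) for row in A]
        w = [sum(A[i][j] * Av[i] for i in range(len(A))) for j in range(m)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

v = top_topic(A)
dominant = terms[max(range(len(terms)), key=lambda j: abs(v[j]))]
```

On this toy corpus the dominant topic loads most heavily on 'trust', mirroring the 'trust social network' cluster reported above.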

Finally, as illustrated in **Figure 8**, topic cluster analysis was applied after normalisation, with 'trust' and 'online' emerging as the most used terms.


**Figure 7.** *t-SNE visualization of the word clusters from the scientific abstracts. The trained model was reduced to a vector of 50, with a concatenation of context vectors and a max vocabulary size of 1000.*

### **5. Conclusions and future work**

Studying the implications of trust online is challenging due to the complexity of the topic. In relation to the literature review specifically, it has proven very difficult to identify publications that focus on interpersonal trust online, particularly in social media. Most publications at the intersection of trust and online interaction relate to security topics. However, the advent of and growing discussion around fake news has proven to be a good reference point for identifying relevant papers. A further difficulty in the categorisation has been the global pandemic, which has shifted the dynamics of online application usage towards real-time interaction and away from posted content based on recordings or text. At the inception of this paper the pandemic was still ongoing; a concluding pandemic-related investigation into trust in online interpersonal interactions therefore remains valuable future work. Early work on pandemic-related trust implications can be found in Dwyer et al. [58]. Moreover, the advent of DeepFakes, synthetically generated media that can fabricate content dynamically, possibly also in live interactions, is an important topic for future work. Overall, it can be concluded that this work, both the literature review introducing modalities of trust and the extended review of the abstracts of research papers, provides a solid foundation for further, more focused investigations into the implications of trust in online interactions, specifically in relation to social media and related online technologies. Moreover, this paper has laid the foundation for a deeper and more comprehensive literature and state-of-the-art review of interpersonal trust as a foundational dimension, in addition to the current debate that focuses mostly on the application or on user behaviour. To this end, the paper introduced a comprehensive overview and identified key aspects related to interpersonal trust and truthfulness.

### **Acknowledgements**

This work is supported by the ADAPT Centre for Digital Content Technology, which is funded under the Science Foundation Ireland Research Centres Programme (Grant 13/RC/2106) and is co-funded under the European Regional Development Fund.


### **Author details**

Kevin Koidl\* and Kristina Kapanova School of Computer Science and Statistics, Trinity College Dublin, Dublin, Ireland

\*Address all correspondence to: kevin.koidl@tcd.ie

© 2022 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/ by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

### **References**

[1] Fisher C. What is meant by 'trust' in news media? In: Otto K, Köhler A, editors. Trust in Media and Journalism. Wiesbaden: Springer VS; 2018

[2] Koidl K, Conlan O, Reijers W, Farrell M, Hoover M. The BigFoot initiative: An investigation of digital footprint awareness in social media. In: Proceedings of the 9th International Conference on Social Media and Society (SMSociety '18). New York, NY, USA: ACM; 2018. pp. 120-127. DOI: 10.1145/3217804.3217904

[3] Fletcher R, Park S. The impact of trust in the news media on online news consumption and participation. Digital Journalism. 2017;**5**(10):1281-1299. DOI: 10.1080/21670811.2017.1279979

[4] Giddens A. Modernity and Self-Identity: Self and Society in the Late Modern Age. Cambridge, UK: Polity Press; 1991

[5] Luhmann N. Trust and Power. Chichester: John Wiley; 1979

[6] O'Hara K. A General Definition of Trust. Southampton, GB: University of Southampton; 2012. p. 19

[7] Lane C. Introduction: Theories and issues in the study of trust. In: Lane C, Bachmann R, editors. Trust within and between Organizations. Oxford: Oxford University Press; 1998. pp. 31-63

[8] Weber JM, Malhotra D, Murnighan JK. Normal acts of irrational trust: Motivated attributions and the trust development process. Research in Organizational Behavior. 2005;**26**:75-101

[9] Friedman B, Kahn PH Jr, Howe DC. Trust online. Communications of the ACM. 2000;**43**(12):34-40

[10] Beldad A, de Jong M, Steehouder M. How shall I trust the faceless and the intangible? A literature review on the antecedents of online trust. Computers in Human Behavior. 2010;**26**(5):857-869

[11] Lewis JD, Weigert A. Trust as a social reality. Social Forces. 1985;**63**(4):967-985

[12] Haas DF, Deseran FA. Trust and symbolic exchange. Social Psychology Quarterly. 1981;**44**(1):3-13

[13] Goffman E. The Presentation of Self in Everyday Life. New York: Anchor Books/Doubleday; 1959

[14] Sztompka P. Trust: A Sociological Theory. Cambridge: Cambridge University Press; 1999

[15] Zand DE. Trust and managerial problem solving. Administrative Science Quarterly. 1972;**17**(2):229-239

[16] Corritore CL, Kracher B, Wiedenbeck S. Online trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies. 2003;**58**:737-758

[17] Doney PM, Cannon JP, Mullen MR. Understanding the influence of national culture on the development of trust. Academy of Management Review. 1998;**23**(3):601-620

[18] Currall SC, Judge TA. Measuring trust between organizational boundary role persons. Organizational Behavior and Human Decision Processes. 1995;**64**(2):151-170

[19] Rousseau DM, Sitkin SB, Burt RS, Camerer C. Not so different after all: A cross-discipline view of trust. Academy of Management Review. 1998;**23**(3):393-404

[20] Koller M. Risk as a determinant of trust. Basic and Applied Social Psychology. 1988;**9**(4):265-276


[21] Mayer RC, Davis JH, Schoorman FD. An integrative model of organization trust. Academy of Management Review. 1995;**20**(3):709-734

[22] Marwick AE. Online identity. In: Hartley J, Burgess J, Bruns A, editors. A Companion to New Media Dynamics. Malden, MA: Blackwell; 2013. pp. 355-364

[23] Boyd D. Social network sites as networked publics: Affordances, dynamics, and implications. In: Papacharissi Z, editor. A Networked Self: Identity, Community, and Culture on Social Network Sites. New York: Routledge; 2011. pp. 39-58

[24] Sutcliffe AG, Gonzalez V, Binder J, Nevarez G. Social mediating technologies: Social affordances and functionalities. International Journal of Human Computer Interaction. 2011;**27**(11):1037-1065

[25] Hogan B. The presentation of self in the age of social media: Distinguishing performances and exhibitions online. Bulletin of Science Technology Society. 2010;**30**(6):377-386. DOI: 10.1177/0270467610385893

[26] Papacharissi Z. Without you, I'm nothing: Performances of the self on Twitter. International Journal of Communication. 2012;**6**:1989-2006. DOI: 1932-8036/20120005

[27] Duguay S. "He has a way gayer Facebook than I do": Investigating sexual identity disclosure and context collapse on a social networking site. New Media and Society. 2014. DOI: 10.1177/1461444814549930 [online first]

[28] Ott M, Cardie C, Hancock JT. Estimating the prevalence of deception in online review communities. In: Proceedings of International World Wide Web Conference 2012 (IW3C2). 2012

[29] Houser D, Wooders J. Reputation in auctions: Theory, and evidence from eBay. Journal of Economics and Management Strategy. 2006;**15**:353-369. DOI: 10.1111/j.1530-9134.2006.00103.x

[30] Sherchan W, Nepal S, Paris C. A survey of trust in social networks. ACM Computing Surveys (CSUR). 2013; **45**(4):47

[31] Dwyer C, Hiltz S, Passerini K. Trust and Privacy Concern within Social Networking Sites: A Comparison of Facebook and MySpace. 2007

[32] Fogel J, Nehmad E. Internet social network communities: Risk-taking, trust, and privacy concerns. Computers in Human Behavior. 2009;**25**(1):153-160

[33] Dhami A, Agarwal A, Chakraborty TK, Singh BP, Minj J. Impact of trust, security and privacy concerns in social networking: An exploratory study to understand the pattern of information revelation in Facebook. In: 2013 3rd IEEE International Advance Computing Conference (IACC). Ghaziabad; 2013. pp. 465-469. DOI: 10.1109/IAdCC.2013.6514270

[34] Paramarta V, Jihad M, Dharma A, Hapsari I, Sandhyaduhita P, Hidayanto A. Impact of user awareness, trust, and privacy concerns on sharing personal information on social media: Facebook, Twitter, and Instagram. In: 2018 International Conference on Advanced Computer Science and Information Systems (ICACSIS). 2018

[35] Sharif A, Soroya SH, Ahmad S, Mahmood K. Antecedents of self-disclosure on social networking sites (SNSs): A study of Facebook users. Sustainability. 2021;**13**(3):1220. DOI: 10.3390/su13031220

[36] Abdul-Rahman A, Hailes S. Supporting trust in virtual communities. In: Proceedings of the 33rd Hawaii International Conference on System Sciences (HICSS '00). Vol. 6. Washington, DC, USA: IEEE Computer Society; 2000. p. 6007

[37] Matsuo Y, Yamamoto H. Community gravity: Measuring bidirectional effects by trust and rating on online social networks. In: WWW'09—Proceedings of the 18th International World Wide Web Conference. 2009. pp. 751-760. DOI: 10.1145/1526709.1526810

[38] Zacharia G, Maes P. Trust management through reputation mechanisms. Applied Artificial Intelligence. 2000;**14**(9):881-907

[39] Chen W, Fong S. Social network collaborative filtering framework and online trust factors: A case study on Facebook. In: 2010 Fifth International Conference on Digital Information Management (ICDIM). 2010

[40] Rosen D, Lafontaine P, Hendrickson B. CouchSurfing: Belonging and trust in a globally cooperative online social network. New Media & Society. 2011;**13**(6):981-998

[41] Jiang W, Wang G. SWTrust: Generating trusted graph for trust evaluation in online social networks. In: IEEE 10th International Conference on Trust, Security and Privacy in Computing and Communications. Changsha; 2011. pp. 320-327. DOI: 10.1109/TrustCom.2011.251

[42] Li Y, Cao H, Zhang Y. Static and dynamic structure characteristics of a trust network and formation of user trust in an online society. Social Networking. 2018;**7**:197-219. DOI: 10.4236/sn.2018.74016

[43] Henderson S, Gilding M. 'I've never clicked this much with anyone in my life': Trust and hyperpersonal communication in online friendships. New Media & Society. 2004;**6**(4):487-506

[44] Duguay S. Dressing up Tinderella: Interrogating authenticity claims on the mobile dating app Tinder. Information, Communication and Society. 2016;**20**(3):351-367

[45] McGloin R, Denes A. Too hot to trust: Examining the relationship between attractiveness, trustworthiness, and desire to date in online dating. New Media & Society. 2016;**20**(3):919-936

[46] Djafarova E, Rushworth C. Exploring the credibility of online celebrities' Instagram profiles in influencing the purchase decisions of young female users. Computers in Human Behavior. 2017;**68**:1-7

[47] Amin F, Khan MF. Online reputation and stress: Discovering the dark side of social media. FIIB Business Review. 2021;**10**(2):181-192

[48] Ryu EA, Han E. Social media influencer's reputation: Developing and validating a multidimensional scale. Sustainability. 2021;**13**(2):631

[49] Sheldon P. "I'll poke you. You'll poke me!" self-disclosure, social attraction, predictability and trust as important predictors of Facebook relationships. Cyberpsychology: Journal of Psychosocial Research on Cyberspace. 2009;**3**(2):Article 1. Retrieved from: https://cyberpsychology.eu/article/view/4225/3267

[50] Adali S, Escriva R, Goldberg M, Hayvanovych M, Magdon-Ismail M, Szymanski B, Wallace W, Williams G. Measuring behavioral trust in social networks. In: 2010 IEEE International Conference on Intelligence and Security Informatics. 2010

[51] Lankton NK, McKnight DH. Do people trust Facebook as a technology or as a "person"? Distinguishing technology trust from interpersonal trust. In: AMCIS 2008 Proceedings. 2008. p. 375


[52] Habibi M, Laroche M, Richard M. The roles of brand community and community engagement in building brand trust on social media. Computers in Human Behavior. 2014;**37**:152-161

[53] Anderson E, Simester D. Reviews without a purchase: Low ratings, loyal customers, and deception. Journal of Marketing Research. 2014;**51**(3): 249-269

[54] Huber B, Barnidge M, Gil de Zúñiga H, Liu J. Fostering public trust in science: The role of social media. Public Understanding of Science. 2019;**28**(7):759-777

[55] Mourey JA, Waldman AE. Journal of the Association for Consumer Research. 2020;**5**(2):162-180

[56] Gierth L, Bromme R. Attacking science on social media: How user comments affect perceived trustworthiness and credibility. Public Understanding of Science. 2020;**29**(2):230-247

[57] Atanassova I, Bertin M, Larivière V. On the composition of scientific abstracts. Journal of Documentation. 2016;**72**(4):636-647

[58] Dwyer C, et al. What Factors Influenced Online Social Interaction During the COVID-19 Pandemic? 2021

Section 2
