7.1 Introduction
The widespread use of the internet and social networks has given social media users a platform to create social bonds irrespective of where they are physically located. It has also become a breeding ground for the propagation of misinformation online. Research shows that social media users play a central role in spreading misinformation by falsely believing it (Vosoughi et al., Reference Vosoughi, Roy and Aral2018). During the early stages of COVID-19, social distancing and other restrictions led many social media users to rely on social networking to obtain information. A lack of knowledge about the new virus also allowed related misinformation to spread over the internet, especially while scientific studies were still limited. Misinformation caused harm to the general public during the pandemic. For example, Forati and Ghose (Reference Forati and Ghose2021) revealed that locations where high volumes of misinformation were posted later experienced a rise in the number of cases. Studies have also found that those who believed conspiracies about the virus were more likely to reject expert, authoritative information (Uscinski et al., Reference Uscinski, Enders, Klofstad, Seelig, Funchion, Everett, Wuchty, Premaratne and Murthi2020) and showed greater hesitancy toward the vaccine (Freeman et al., Reference Freeman, Waite, Rosebrock, Petit, Causier, East, Jenner, Teale, Carr, Mulhall, Bold and Lambe2022). Allington et al. (Reference Allington, Duffy, Wessely, Dhavan and Rubin2021) also revealed a negative relationship between COVID-19 conspiracy beliefs and health-protective behaviours, and noted that these patterns may be linked to using social media as a source of information. Therefore, understanding human behaviour around health-related misinformation is crucial to combating misinformation online.
A substantial body of computational research has concentrated on misinformation identification and mitigation to help control the spread of misinformation online. Despite evidence that social media users play an important role in spreading misinformation (Vosoughi et al., Reference Vosoughi, Roy and Aral2018), few studies have examined how users interact with misinformation socially and what factors drive these social behaviours. By expressing opinions, social media users find other like-minded individuals online and create social bonds in a virtual environment. Building on the appraisal framework (Martin & White, Reference Martin and White2003), studies such as Inwood and Zappavigna (Reference Inwood and Zappavigna2021) have modeled social bonding behaviour in virtual settings using the ambient affiliation social semiotic approach, which leverages fine-grained linguistic patterns in writing. Understanding social bonding behaviour is crucial for identifying the root cause of the spread of misinformation, since users who falsely believe misinformation are the key players in its spread. Zhou et al. (Reference Zhou, Shu, Phoha, Liu and Zafarani2022) investigated misinformation-spreading behaviour from the perspective of users' intent and found that users may share misinformation on social networking sites unintentionally. Incorporating such signals into their model design improved the performance of their misinformation identification model, indicating that human information interaction signals from a social perspective can enhance model design and performance. Therefore, leveraging social bonding behaviour may aid future research in developing more effective systems to combat the spread of misinformation. This chapter investigates two research questions:
(1) How do social media users bond around misinformation?
(2) What is the relationship between emotions and social bonding behaviour?
We adopt the Stimulus–Organism–Response (SOR) framework (Mehrabian & Russell, Reference Mehrabian and Russell1974) to explore online social bonding behaviour surrounding misinformation through the lens of emotions. We view information exposure as the stimulus, social media users' emotions as the organism, and social bonding behaviour as the response. Using a public COVID-19 dataset – CoAID (Cui & Lee, Reference Cui and Lee2020) from X, formerly known as Twitter – we explicitly explored online social bonding behaviour around COVID-19 misinformation. We analyzed conversations from X to characterize social connections and attitudes toward misinformation. We randomly sampled X conversations from CoAID (Cui & Lee, Reference Cui and Lee2020) and manually annotated the social bonding behaviour expressed in the reply posts, yielding 330 annotated conversations. Deepmoji (Felbo et al., Reference Felbo, Mislove, Søgaard, Rahwan and Lehmann2017) and the Multidimensional Lexicon of Emojis (MLE) (Godard & Holtzman, Reference Godard and Holtzman2022) were applied to extract emotion intensities from the reply posts; details of the extraction are covered in Section 7.3.4. Once emotion intensities were computed, we investigated how they differed across levels of information veracity. We then quantified how these emotions contribute to users' attitudes toward the information mentioned in the source post. An individual's attitude intensity determines how the social bond is formed and leads to different online social bonding behaviours.
Echoing Vosoughi et al. (Reference Vosoughi, Roy and Aral2018), who found that emotions and their intensities differ across levels of information veracity, we found that social bonding behaviours differ between true information and misinformation. Further analysis showed that, for misinformation, the emotion of surprise contributes significantly to supporting the misinformation, whereas anticipation contributes significantly to rejecting it. In contrast, when users are exposed to true information, the emotions of anger and sadness contribute significantly to supporting it, unlike fear, which contributes significantly to rejecting true information.
7.2 Literature Review
7.2.1 Misinformation and Human–Information Interaction
Studies such as Soroya et al. (Reference Soroya, Farooq, Mahmood, Isoaho and Zara2021) and Ke et al. (Reference Ke, Du and Ji2021) examined how social media users interacted with information during the COVID-19 event, tracing how information influences user behaviour from information seeking to information avoidance. Humans are prone to seeking knowledge when faced with uncertainty; however, seeking can leave individuals feeling overwhelmed by information – information overload – which in turn makes them avoid it. Significant effort has also gone into studying human–information interaction around misinformation. Vosoughi et al. (Reference Vosoughi, Roy and Aral2018) examined the spread of true and false information online from various perspectives and discovered that false information spreads faster, deeper, and more widely on online social networks. They also found that misinformation is more novel than the truth, which could explain why it receives more attention from users and is shared more often. Weeks (Reference Weeks2015) evaluated misperceptions of misinformation and how various factors affect users' judgments of it, finding that emotion, partisanship, and correction play important roles in judgment accuracy. In that study, however, the emotions examined were limited to anger and anxiety, because previous studies had found that these negative emotions contribute to human misperceptions.
7.2.2 Emotions and User Behaviours
Human behaviours are affected by emotions, consciously or unconsciously, and numerous studies have examined the effects of emotions on various user behaviours. For example, Dunn and Schweitzer (Reference Dunn and Schweitzer2005) examined how different emotional states affect an individual's trust. They found that positive emotions such as happiness increase the likelihood of an individual trusting an event, whereas negative emotions such as anger have the opposite effect. Moving to an online setting, Wang et al. (Reference Wang, Seng-cho and Chang2009) found that positive emotions have a greater impact than negative emotions on perceived belief in internet applications.
It is widely acknowledged that including emotion as a feature in computational models of user behaviour is beneficial. Many studies have found that incorporating emotion information into model design improves accuracy (Calvo & D'Mello, Reference Calvo and D'Mello2010; Pak & Paroubek, Reference Pak and Paroubek2010), and the resulting gains demonstrate that emotion is a powerful predictor of user behaviour. For instance, Chen et al. (Reference Chen, Liu and Zou2017) showed that their model benefits from incorporating an emotion signal when modeling users' reposting behaviour. User emotions have also been adopted in false information modeling to improve performance (Li et al., Reference Li, Feng and Zhang2016).
7.2.3 User Behaviours based on the SOR Framework
The Stimulus–Organism–Response (SOR) framework (Mehrabian & Russell, Reference Mehrabian and Russell1974) in psychology aids in understanding human behaviour and its underlying causes. It is therefore frequently used when the goal of a study is to identify the main factors influencing a particular behaviour, and it has been used heavily in studying customer behaviour (Arora et al., Reference Arora, Parida and Sahney2020; Gatautis et al., Reference Gatautis, Vitkauskaite, Gadeikiene and Piligrimiene2016). For example, using the SOR framework, Sherman et al. (Reference Sherman, Mathur and Smith1997) explained how the store environment affects customers' emotions and how those emotions lead to different purchasing behaviours, whereas Slama and Tashchian (Reference Slama and Tashchian1987) conducted a case study on how consumer involvement affects shampoo purchasing behaviour as the response.
Although traditionally used to study consumer behaviour, in recent years the framework has also been applied in information studies to unpack the information behaviour of social media users. Soroya et al. (Reference Soroya, Farooq, Mahmood, Isoaho and Zara2021) studied how users move from information seeking to information avoidance online. Using the SOR framework, they found that information seeking, as the stimulus, drives users into information overload; information anxiety forms the users' internal state as the organism, and information avoidance emerges as the outcome. Xiao and Su (Reference Xiao and Su2022) revealed a relationship in which users incidentally exposed to misinformation (the stimulus) developed different misperceptions (the organism) and ended up sharing the misinformation as a result of those misperceptions (the response).
The SOR framework is also widely used to investigate COVID-19-related health misinformation. For instance, Li et al. (Reference Li, Chen and Rao2022) used the framework to explain how COVID-19 risk, as a stimulus, affects individuals' emotions as the organism and leads to different responses in terms of sharing COVID-19-related misinformation online. Wu (Reference Wu2022) used it to demonstrate how an individual's social and information dependency produces both positive and negative effects, finding that the positive effect has the greater influence on users' sharing of misinformation on social media platforms as the outcome. Similarly, Sampat and Raj (Reference Sampat and Raj2022) demonstrated, based on the COVID-19 event, how various personal characteristics influence information-sharing behaviour on social media.
7.3 Methodology and Data
7.3.1 Dialogic Social Affiliation Text
Limited attention has been paid to how values are communicated and how bonds are formed on social media from a social behaviour perspective. Introduced by Knight (Reference Knight2008; Reference Knight, Bednarek and Martin2010; Reference Knight2013) and summarized in the ambient affiliation framework (Inwood & Zappavigna, Reference Inwood and Zappavigna2021), the dialogic affiliation system is designed to evaluate how the recipient negotiates the bond introduced in the initial communication. The system first identifies the bond from a linguistic standpoint using ideation and attitude pairs, and then observes how the bonds are communicated in social networks. To identify the attitude, the system leverages the appraisal framework (Martin & White, Reference Martin and White2003), a social semiotic framework that helps uncover an individual's attitude toward and appreciation of others through written messages. The appraisal framework contains three dimensions of evaluation: attitude, graduation, and engagement. Since our focus is social bonding behaviour rather than the bond itself, upon identifying bonds in the source post we annotate only the attitude dimension of the appraisal framework. The attitude dimension includes affect (emotions felt, e.g., happy, sad, interested), judgment (assessing an individual's behaviour, e.g., right, wrong, irresponsible), and appreciation (valuing an object, e.g., yummy, nice, fantastic). Applying this framework to conversations on social media platforms helps identify the underlying bonds in the posts.
Using the dialogic affiliation system, which targets social bond negotiation at the level of an online conversation, we can also understand how an individual bonds with the information. Figure 7.1 provides a graphic representation of how the dialogic affiliation system functions. At the first level, a user either manages or ignores the bond proposed in the source post; at the second level, managing is further divided into supporting or rejecting the proposed bond. Finally, the managing behaviours can be classified into four groups – warrant, defer, oppose, and dismiss – based on how users negotiate social bonds in an online setting. We apply this framework in our study to analyze social media users' behaviour and how bonds are formed around misinformation.

Figure 7.1 Dialogic affiliation system
Figure 7.1 long description: Managing the bond divides into support and reject. Support divides into warrant (strongly agree) and defer (somewhat agree plus alternative ideations). Reject divides into oppose (somewhat disagree plus alternative ideations) and dismiss (strongly disagree). Ignore corresponds to neutral, with or without alternative ideations.
According to social bond theory (Pratt et al., Reference Pratt, Gau and Franklin2011), one of the factors determining how strongly one bonds with the environment or other individuals is belief – that is, an individual's values and attitude. In an online environment, this refers to whether the information aligns with a user's values and whether the user agrees or disagrees with it; on that basis, users decide whether to socially bond with the information. In other words, how an individual bonds can be determined by how much the information aligns with their values, as reflected by the attitude expressed in their writing, which also indicates whether they agree with the information presented. In our study, we use a direct mapping between social bonding behaviour and attitude intensity, as shown in Table 7.1; it reflects, through their social behaviour, whether users agree with the provided information.

Table 7.1 Mapping between social bonding behaviour and attitude intensity

Social bonding behaviour | Attitude intensity | Score
---|---|---
Support warrant | Strongly agree | 2
Support defer | Somewhat agree | 1
Ignore | Neutral | 0
Reject oppose | Somewhat disagree | −1
Reject dismiss | Strongly disagree | −2
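To make this mapping concrete, the minimal sketch below encodes Table 7.1 as a lookup; the label strings are our own illustrative identifiers, not names from the dataset.

```python
# Mapping from annotated social bonding behaviour to attitude intensity,
# following Table 7.1. Label strings are illustrative, not from CoAID.
ATTITUDE_INTENSITY = {
    "support_warrant": 2,   # strongly agree
    "support_defer": 1,     # somewhat agree
    "ignore": 0,            # neutral
    "reject_oppose": -1,    # somewhat disagree
    "reject_dismiss": -2,   # strongly disagree
}

def attitude_score(bonding_label: str) -> int:
    """Return the attitude intensity for an annotated bonding label."""
    return ATTITUDE_INTENSITY[bonding_label]
```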
7.3.2 SOR Model for Information Veracity and Social Bonding Behaviours
The Stimulus–Organism–Response (SOR) framework is frequently used to understand and explain human behaviours in response to different cues in the environment and how they are triggered by different external factors (Arora et al., Reference Arora, Parida and Sahney2020; Gatautis et al., Reference Gatautis, Vitkauskaite, Gadeikiene and Piligrimiene2016; Sherman et al., Reference Sherman, Mathur and Smith1997; Slama & Tashchian, Reference Slama and Tashchian1987). It is also widely used to explain how individuals react to false information and how they used information during the recent COVID-19 pandemic (Li et al., Reference Li, Chen and Rao2022; Sampat & Raj, Reference Sampat and Raj2022; Soroya et al., Reference Soroya, Farooq, Mahmood, Isoaho and Zara2021; Wu, Reference Wu2022; Xiao & Su, Reference Xiao and Su2022). The framework has proven useful for describing outcomes caused by individuals' internal states, and it has supported and directed numerous studies on human behaviour around information and health-related misinformation. We therefore apply it in our study to understand what social bonding behaviours are presented and their underlying causes.
Misinformation, according to earlier research, significantly affects users' emotions (Vosoughi et al., Reference Vosoughi, Roy and Aral2018; Wang et al., Reference Wang, McKee, Torbica and Stuckler2019). Information veracity therefore acts as the stimulus in our study: it affects users' internal states by triggering emotions of different intensities. Applying the model, the captured emotion intensities are the organism that users reveal when facing information of different veracity. Existing literature has shown that emotions influence the behaviours associated with them (Dunn & Schweitzer, Reference Dunn and Schweitzer2005; Wang et al., Reference Wang, Seng-cho and Chang2009), so users' online social bonding behaviours should be affected by their emotions. In summary, applying the SOR framework to our study (Figure 7.2), the external stimulus is the veracity of the source information, and the emotions experienced by users form the organism, as they mirror internal human states. We used the chi-square test of independence to determine whether there is a relationship between information veracity and how users negotiate social bonds online. The null hypothesis is that there is no relationship between information veracity and how users develop social bonds online; the alternative hypothesis is that the two are not independent, meaning that users negotiate bonds differently online depending on the veracity of the information. This test answers whether different stimuli produce different responses and lays the groundwork for further investigation.

Figure 7.2 The conceptual SOR model
Figure 7.2 long description: The stimulus (information veracity) consists of source posts with true information and source posts with false information. It leads to the organism (emotions): anger, anticipation, joy, trust, fear, surprise, sadness, and disgust. The emotions lead to the response (online social bonding types), where attitude intensity maps to support warrant, support defer, ignore, reject oppose, and reject dismiss.
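As a minimal sketch of this test, the snippet below runs a chi-square test of independence on a veracity-by-bonding-type contingency table using scipy; the counts are illustrative placeholders approximated from Figure 7.5, not the study's exact annotated counts.

```python
# Chi-square test of independence between information veracity and
# online social bonding type. Counts are rough approximations read
# off Figure 7.5, used only to illustrate the procedure.
import numpy as np
from scipy.stats import chi2_contingency

# Columns: support warrant, support defer, ignore, reject oppose, reject dismiss.
contingency = np.array([
    [42, 40, 35, 45, 40],   # true information (approximate)
    [40, 15, 15, 20, 75],   # misinformation (approximate)
])

chi2, p_value, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}, dof = {dof}")
# A p-value below 0.01 (the chapter reports p = 0.003) rejects the null
# hypothesis that veracity and bonding behaviour are independent.
```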
7.3.3 Data
After reading the truth or misinformation, users may develop different affective reactions and emotions toward the presented information. Users then act on those feelings when posting their opinions on various topics online, and this behaviour can be understood as a user's attempt to negotiate social bonds in an online environment.
We used the COVID-19-related misinformation dataset CoAID (Cui & Lee, Reference Cui and Lee2020) to study how users negotiate social bonds around COVID-19 misinformation. The dataset contains COVID-19 healthcare misinformation together with users' social engagement with it. COVID-19 has been one of the major global events of recent years, and a relevant dataset (Cui & Lee, Reference Cui and Lee2020) with social engagement spanning a significant number of online conversations allows us to investigate how users communicate bonds around misinformation.
The dataset contains over 8,000 source–reply conversation pairs on X related to misinformation and over 127,000 pairs related to true information. It provides the source and reply post IDs, from which the conversations can be retrieved. It also contains labels for whether the information mentioned in the source post is true or false, together with labels for whether the original post mentions a claim or a news item.
Source posts containing facts mostly come from official accounts posting announcements or informational URLs. Misinformation source posts come from two major types of accounts: private accounts spreading the misinformation, and official accounts or users from the fact-checking community combating it by clarifying that the underlying message is false. The reply posts are generally from private accounts expressing opinions on the topic discussed in the source post. Examples of true and misinformation source posts and their replies can be found in Figure 7.3. The source posts and replies were obtained through the official X API using the provided post IDs. We then filtered out non-English conversations and randomly selected 175 conversations related to true information and 175 related to misinformation. Lastly, we annotated the selected subset of X conversations using the dialogic affiliation framework shown in Figure 7.1.

Figure 7.3 Example source and reply posts
Figure 7.3 long description: Part A. The post reads: "The N95 respirator masks that healthcare workers need to protect themselves while treating coronavirus patients are in dangerously short supply. But now, Duke University researchers have developed a method to clean them so they can be safely re-worn." A photo of a mask appears below. The post, published at 9:31 PM on March 28, 2020, has 1,861 reposts, 162 quote posts, and 5,090 likes. A reply reads: "Better than the nothing Trump is supporting." Part B. The post reads: "Belgium health minister bans non-essential sexual activities of persons 3 or greater in indoor areas," followed by a link. A reply reads: "While Boris just politely requests that Brits don't. When is he going to ban it?" followed by three Face with Tears of Joy emojis.
The annotations were completed by three individuals: one is the author of this chapter, and the other two are independent researchers. Guidelines were provided to assist the annotators. They were instructed first to identify bonds in the source post using the ideation and attitude pairs, leveraging the appraisal framework (Martin & White, Reference Martin and White2003) as in Inwood and Zappavigna (Reference Inwood and Zappavigna2021). After the bonds were identified, they read the replies and used the dialogic affiliation framework to label how users negotiated the bonds. Examples of annotations are provided in Table 7.2.
Table 7.2 Examples of annotated source posts, replies, and social bonding labels

Source Post | Reply | Social Bonding
---|---|---
Misinformation | ||
More funny math from @USER: (1) All non-US countries have done about 25 million tests; (2) US: about 5.9M (through Apr. 28); (3) Trump: “We’ve tested more than every country COMBINED” | More lies from our Lying King. | Support warrant |
Royal Palace confirms Queen Elizabeth tests positive for coronavirus – UCR World News – Will oil prices fall again next week? The price $20 would be seen? I opened buy at $23.14 and didn’t stop loss in time yesterday thanks to busy works [Face with Tears of Joy] | Let’s see what news comes till tomorrow [Grinning Face with Smiling Eyes] | Support defer |
CDC recommends men shave their beards to protect against coronavirus | Don’t worry about Coronavirus, put a lime in it | Ignore |
CDC recommends men shave their beards to protect against coronavirus | No way. I would suit up instead | Reject oppose |
First volunteer in UK coronavirus vaccine trial has died | Fake news | Reject dismiss |
Truthful information | ||
The UK is supporting Somali government to set up hand washing stations to prevent the spread of Coronavirus. The first went operational in Lower Shabelle not far from Mogadishu yesterday with more to follow | Job well done | Support warrant |
If the Government wants to help kids with their education during this crisis, instead of pushing to send them back to unsafe schools, he could start by ensuring all children have the resources – like books and computers – that they need to learn at home | Why doesn’t the @USER donate their educational equipment lol [Face with Tears of Joy][Rolling on the Floor Laughing] | Support defer |
We know what to do but clearly aren’t all doing it: wash hands, social distance, mask when you can’t, stay home if you are sick | @USER The test are rigged for positive results. Investigate | Ignore |
Experts call on UK to not use contact tracing app for surveillance – Business Insider @URL | @USER This would be madness don’t take it out on the public … | Reject oppose |
“Worst nightmare”: Fauci warns that coronavirus pandemic ‘isn’t over yet’ | @USER This guy changes his mind more than the weather! | Reject dismiss |
Notes: The ideation and attitude pairs are in bold. @USER and @URL indicate a user mention and an external link, and square brackets enclose an emoji name.
Once the annotators finished, a majority vote was taken across the three sets of annotations, and any sample without a majority vote was discarded. Such disagreement may stem from conflicting interpretations of the post's message (Clark, Reference Clark1985; Day & Gentner, Reference Day and Gentner2007), which makes agreeing on or comprehending its contents challenging. After removing instances without a majority vote, 330 annotated conversations remained: 162 related to true information and 168 related to misinformation.
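A minimal sketch of this aggregation step is shown below, assuming each conversation's three annotator labels are collected in a list (the label strings are our own illustrative identifiers):

```python
# Majority vote over three annotators; conversations without a strict
# majority are discarded (returned as None), as described above.
from collections import Counter
from typing import Optional

def majority_vote(labels: list[str]) -> Optional[str]:
    """Return the majority label, or None when no label has a strict majority."""
    label, count = Counter(labels).most_common(1)[0]
    return label if count > len(labels) / 2 else None

print(majority_vote(["reject_dismiss", "reject_dismiss", "ignore"]))  # reject_dismiss
print(majority_vote(["ignore", "support_defer", "reject_oppose"]))    # None -> discarded
```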
7.3.4 Emotion Intensity
To understand how users feel after reading the information, we can evaluate the emotions extracted from their written text. By analyzing the emotion scores across various online social bonding types when exposed to truth and misinformation, we can learn what emotions might potentially affect a user’s reaction to bonds.
Prior studies have measured emotions in text by counting words associated with Plutchik's eight basic emotions (Plutchik, Reference Plutchik, Plutchik and Kellerman1980) using the NRC Word-Emotion Association Lexicon (Mohammad & Turney, Reference Mohammad and Turney2013). One disadvantage of this method is that it is bound by vocabulary and relies on exact word matching to extract meaning; it also fails to account for negation, which can invert the expressed emotion. Numerous prior studies have found that misinformation in general induces various negative emotions in users (Vosoughi et al., Reference Vosoughi, Roy and Aral2018), and false information about COVID-19 has likewise been shown to cause negative emotions (Leng et al., Reference Leng, Zhai, Sun, Wu, Selzer, Strover, Zhang, Chen and Ding2021). However, when we applied the NRC lexicon (Mohammad & Turney, Reference Mohammad and Turney2013) to our dataset, we found that it did not capture emotion intensity well.
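The toy sketch below illustrates the exact-matching drawback just described; the two-word lexicon is a hypothetical stand-in for the NRC Word-Emotion Association Lexicon, not its actual contents.

```python
# Toy word-count emotion scoring with exact matching. The lexicon here is
# a hypothetical two-word stand-in for the NRC lexicon; note that negation
# ("not happy") and out-of-vocabulary words are silently mishandled.
TOY_LEXICON = {"happy": ("joy",), "afraid": ("fear",)}

def count_emotions(text: str) -> dict[str, int]:
    counts: dict[str, int] = {}
    for token in text.lower().split():
        for emotion in TOY_LEXICON.get(token, ()):
            counts[emotion] = counts.get(emotion, 0) + 1
    return counts

print(count_emotions("I am not happy about this"))  # {'joy': 1}, despite the negation
```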
The ability of emojis to convey emotions in written text, on the other hand, is well recognized. According to Gülşen (Reference Gülşen, Ogata and Akimoto2016), emojis can be used to represent emotions. In contrast to words, emojis – symbols that often depict faces – can indicate one or many emotions at once (Jaeger & Ares, Reference Jaeger and Ares2017), and Ai et al. (Reference Ai, Lu, Liu, Wang, Huang and Mei2017) showed that they carry greater semantic meaning than regular text. Emojis have also been used in psychometric scales to gauge emotions and personalities in several psychology studies, such as Marengo et al. (Reference Marengo, Giannotta and Settanni2017) and Phan et al. (Reference Phan, Amrhein, Rounds and Lewis2019), with promising results.
Therefore, to overcome the drawbacks of exact word matching, we first project each text message into an emoji space by applying Deepmoji (Felbo et al., Reference Felbo, Mislove, Søgaard, Rahwan and Lehmann2017), a neural model that predicts the five emojis most representative of a text, capturing the underlying emotions more comprehensively. To look up the emotional intensity of each predicted emoji, we use the Multidimensional Lexicon of Emojis (MLE) (Godard & Holtzman, Reference Godard and Holtzman2022). MLE provides emotion intensity scores for 359 emojis, calculated from over 3 million inputs, with X posts and emotion ratings provided by 2,230 individual raters; this allows a direct mapping from emoji content to the intensities of Plutchik's eight emotions (Plutchik, Reference Plutchik, Plutchik and Kellerman1980). After projecting a message into the emoji space, we map each emoji to its corresponding emotion scores with MLE and, for each of the eight emotions, sum the scores across the five predicted emojis. The full process of mapping the emotions and obtaining the final emotion scores is presented in Figure 7.4.

Figure 7.4 The overall process of obtaining emotion intensity scores
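A sketch of this pipeline is given below, under stated assumptions: `predict_top_emojis` stands in for a wrapper around the Deepmoji model that returns its five most probable emojis for a text (a hypothetical interface, not the library's actual API), and `mle` is the MLE lexicon loaded as a dictionary mapping each emoji to its eight Plutchik scores.

```python
# Emotion-intensity extraction following Figure 7.4, as a sketch.
# `predict_top_emojis` is a hypothetical Deepmoji wrapper; `mle` maps
# each emoji to its MLE scores for Plutchik's eight emotions.
from typing import Callable, Dict

PLUTCHIK = ["anger", "anticipation", "disgust", "fear",
            "joy", "sadness", "surprise", "trust"]

def emotion_intensity(text: str,
                      predict_top_emojis: Callable[[str, int], list],
                      mle: Dict[str, Dict[str, float]]) -> Dict[str, float]:
    """Sum MLE emotion scores over the five Deepmoji-predicted emojis."""
    scores = {emotion: 0.0 for emotion in PLUTCHIK}
    for emoji in predict_top_emojis(text, 5):
        entry = mle.get(emoji)
        if entry is None:          # emoji not among the 359 covered by MLE
            continue
        for emotion in PLUTCHIK:
            scores[emotion] += entry[emotion]
    return scores
```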
The Kolmogorov-Smirnov (K-S) test is used to compare the distributions of emotion intensities across levels of information veracity. Combined with the mean and standard deviation of the emotion scores, this confirms whether the intensities of the eight emotions differ when information veracity varies. The K-S test result thus answers whether the stimulus causes variance in the organism, that is, whether emotion intensities differ under different information veracity.
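A minimal sketch of this comparison for a single emotion, using scipy's two-sample K-S test; the score arrays are illustrative placeholders, not the study's data.

```python
# Two-sample K-S test comparing one emotion's intensity distribution
# between replies to true information and replies to misinformation.
from scipy.stats import ks_2samp

surprise_true = [0.25, 0.22, 0.28, 0.24, 0.27]   # placeholder intensities
surprise_false = [0.21, 0.19, 0.26, 0.23, 0.20]  # placeholder intensities

d_statistic, p_value = ks_2samp(surprise_true, surprise_false)
print(f"D = {d_statistic:.4f}, p = {p_value:.4f}")
# Repeating this per emotion yields D statistics like those in Table 7.3.
```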
To determine whether emotions have different impacts on social bonding types as information veracity varies, ordinary least squares regression is used to model the relationship. Using a linear combination of the emotion intensity scores, we can estimate users' likelihood of believing and creating social bonds with the provided information, and quantify the potential effect each emotion has on these outcomes. The model output indicates which emotions contribute most to how users bond, depending on whether they believe the information.
As mentioned in Section 7.3.2, based on social bond theory (Pratt et al., Reference Pratt, Gau and Franklin2011), in which alignment with personal values determines how strongly a user is willing to bond, we mapped each online social bonding type to a numerical value indicating the user's attitude, as shown in Table 7.1. This makes a user's attitude, and hence willingness to bond, a quantifiable, measurable variable. Once attitude intensity is quantifiable, we can use it to model how the emotion intensity scores contribute to a user's willingness to bond. This answers the last question of our conceptual SOR model: the relationship between the organism (emotions) and the response (user bonding types).
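A minimal sketch of this regression step with statsmodels is given below; the DataFrame `df`, its column names, and the `veracity` field are our own assumptions about how the annotated data might be organized, not the study's actual code.

```python
# OLS regression of attitude intensity (Table 7.1) on the eight emotion
# intensity scores, fitted separately per information veracity. The
# DataFrame layout and column names are assumptions for illustration.
import pandas as pd
import statsmodels.api as sm

EMOTIONS = ["anger", "anticipation", "disgust", "fear",
            "joy", "sadness", "surprise", "trust"]

def fit_attitude_model(df: pd.DataFrame):
    X = sm.add_constant(df[EMOTIONS])   # emotion intensities plus intercept
    y = df["attitude_intensity"]        # integer in [-2, 2], per Table 7.1
    return sm.OLS(y, X).fit()

# true_model = fit_attitude_model(df[df["veracity"] == "true"])
# print(true_model.summary())  # coefficients and standard errors as in Table 7.4
```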
7.4 Results
7.4.1 Social Bonding Type Distribution
From our annotated data, Figure 7.5 shows that, when true information is presented, users tend to support the proposed bond by warranting it or by deferring to an alternative bond. In contrast, when misinformation is presented, users tend to reject the bond by dismissing it or by opposing it and suggesting an alternative bond; dismissal is the most common type of online social bonding around misinformation. More users ignore the proposed bond when true information is presented. From this distribution, we believe that the veracity of the information affects how users negotiate bonds in an online environment; the analyses in Sections 7.4.2–7.4.4 validate this observation.

Figure 7.5 Distribution of online social bonding behaviours by information veracity
Figure 7.5 long description: The bar graph plots counts for false and true information by online social bonding type; the vertical axis ranges from 0 to 100. Approximate values (false, true): support warrant (40, 42); support defer (15, 40); reject dismiss (75, 40); reject oppose (20, 45); ignore (15, 35).
7.4.2 Association between Information Veracity and Negotiation of Social Bonds
As previously mentioned, we use the chi-square test of independence to determine whether the information's veracity and the type of online social bonding are independent of each other, laying the foundation for further investigation of the connection between them. The null hypothesis assumes that online social bonding behaviours are independent of the veracity of information; we then check whether the chi-square statistic exceeds the critical value. From the test result (p-value = 0.003), at the 1 percent significance level we reject the null hypothesis and conclude that the veracity of the information and the way users negotiate bonds are not independent. This is coherent with our observation in Section 7.4.1. Given that these two variables are not independent, we can further investigate, using the SOR framework (Mehrabian & Russell, Reference Mehrabian and Russell1974), the relationships between information veracity and how users choose to negotiate the bond.
7.4.3 Emotion Distribution between Different Information Veracity
To determine whether and how the veracity of the information affects users' emotions, we used all the data collected through the provided post IDs, for both true information and misinformation, and performed the Kolmogorov-Smirnov (K-S) test on the emotion intensity scores derived from the replies; the results are shown in Table 7.3. The test assesses whether the emotion intensity distributions of the two types are the same: when a distribution differs, we can infer that the intensity of that emotion differs between the types, driven by the veracity of the information. To better understand which type has the higher intensity score for each emotion, the mean and standard deviation of the intensity scores were also calculated. The intensity scores for all emotions are significantly different (p-value < 0.01). With true information, anticipation, joy, sadness, surprise, and trust are higher; with misinformation, anger and disgust are higher. These results are consistent with a prior study (Vosoughi et al., Reference Vosoughi, Roy and Aral2018), with a subtle difference. Broadly, true information triggers more positive emotions, while misinformation triggers more negative emotions: users' emotions are impacted by the veracity of information.

Table 7.3 Emotion intensity scores by information veracity: mean, standard deviation, and K-S test D statistic

Emotion | Mean (true) | Mean (false) | SD (true) | SD (false) | K-S D statistic
---|---|---|---|---|---
Anger | 0.2773 | 0.2799 | 0.0940 | 0.0853 | 0.1027
Anticipation | 0.4970 | 0.4679 | 0.0792 | 0.0716 | 0.1535
Disgust | 0.2061 | 0.2109 | 0.0782 | 0.0712 | 0.1222
Fear | 0.2954 | 0.2938 | 0.0846 | 0.0773 | 0.0752
Joy | 0.4641 | 0.4235 | 0.1214 | 0.0997 | 0.1489
Sadness | 0.2731 | 0.2721 | 0.0795 | 0.0727 | 0.0836
Surprise | 0.2454 | 0.2297 | 0.0360 | 0.0335 | 0.1624
Trust | 0.5547 | 0.5229 | 0.0799 | 0.0788 | 0.1760
7.4.4 Ordinary Least Squares Regression Model on Emotion and How Users Negotiate Bonds Using Social Media
The results of the ordinary least squares regression predicting attitude intensity, that is, online social bonding type, are reported in Table 7.4. The model was fitted separately for each type of information to analyze which emotions affect social bonding behaviours and how. Using the regression model, we represented how users prefer to connect with the information as their attitude intensity toward it, and quantified which emotions contribute, and to what extent, to how users bond on social media platforms under different information veracity.
Table 7.4 OLS regression coefficients of emotion intensities predicting attitude intensity, by information veracity

Emotion | True information | False information
---|---|---
Anger | 87.15 (43.26)* | 39.06 (50.48) |
Anticipation | −40.69 (24.71) | −86.78 (28.14)** |
Disgust | −69.51 (38.80) | −86.98 (51.38) |
Fear | −113.39 (47.21)* | −6.55 (50.235) |
Joy | 13.19 (12.40) | 16.96 (26.41) |
Sadness | 72.65 (29.50)* | 33.49 (33.61) |
Surprise | −0.81 (35.64) | 105.14 (42.21)* |
Trust | 33.49 (21.615) | 16.56 (24.46) |
Notes: Unstandardized regression coefficients are reported, with standard errors in parentheses. p-values are two-tailed. * p < 0.05. ** p < 0.01.
From the results, we found that, when users are exposed to true information, anger, fear, and sadness contribute significantly to how they connect and bond with the information: anger and sadness contribute significantly to a positive attitude intensity and to bonding with the true information, whereas fear contributes significantly to a negative attitude intensity and to rejecting it. In contrast, when users are exposed to misinformation, anticipation and surprise are the significant factors determining how users bond with it: surprise contributes significantly to a more positive attitude intensity and to supporting the misinformation, whereas anticipation contributes significantly to rejecting it. Overall, the emotions that matter for how users bond differ with information veracity.
The chi-square test result indicates that, for different information veracity, the social bonding behaviours initiated by users differ, laying the groundwork for investigating the factors that contribute to this outcome. It is consistent with what we saw in Figure 7.5: users tend to support or defer to the bond when the information is true and to dismiss the bond when it is false.
7.5 Discussion
We found that, when exposed to truth versus misinformation, users' emotion intensities, as their internal states, differ. Our findings are largely consistent with those of Vosoughi et al. (Reference Vosoughi, Roy and Aral2018): with true information, users show higher levels of anticipation, joy, sadness, and trust, whereas disgust is more intense when misinformation is presented. However, we observed a subtle difference for surprise: its intensity is higher for true information in our analysis, whereas Vosoughi et al. (Reference Vosoughi, Roy and Aral2018) reported the opposite. We suspect this is a case-specific observation, since the dataset we used was collected during the early stages of COVID-19, when the general public still had limited knowledge of the topic; a higher intensity of surprise in response to true information is therefore reasonable. This is also supported by the novelty hypothesis of Vosoughi et al. (Reference Vosoughi, Roy and Aral2018), whereby more novel information receives greater attention and surprise. Similarly for fear: since COVID-19 was an unknown event, it is reasonable for users to feel fear even when true information is presented.
We found an association between emotions, users' attitude intensity, and social bonding behaviour under different information veracity. When users are exposed to misinformation, anticipation and surprise are the key factors contributing significantly to social bonding behaviour. Anticipation contributes significantly to bond rejection, which is unsurprising: the more anticipatory a user is, the less likely he or she is to believe misinformation. Surprise, however, contributes significantly to supporting the misinformation. We suspect two reasons for this. First, surprise intensifies the feelings of other emotions (Mellers et al., Reference Mellers, Fincher, Drummond and Bigony2013); second, more emotional individuals are more likely to behave irrationally (Pham, Reference Pham2007), in our case by falsely believing misinformation. In contrast, when users are exposed to true information, anger, fear, and sadness contribute significantly to how they bond socially. Anger and sadness play an important role in social bonding around the truth: encountering true information, especially distressing pandemic news, can also trigger anger and sadness, which aligns with our observations and those reported by Vosoughi et al. (Reference Vosoughi, Roy and Aral2018), so a greater contribution of these emotions to attitude intensity is understandable. It is likewise understandable that fear can lead individuals to refuse or reject bonding with the truth: in the case of COVID-19 especially, true information could convey the seriousness of the virus, and fear of the unknown could cause individuals to withdraw or disassociate from the information. We also suspect that the distribution of bonding types contributes to the subtle differences between the contributions of emotion intensity and bonding behaviour: as shown in Sections 7.4.1 and 7.4.2, the bonding distributions differ across veracity.
In an earlier study, Weeks (Reference Weeks2015) examined information behaviour in a political context. That study examined only two emotions, together with other factors that may influence individuals' judgments of politics-related misinformation. The author found that anger motivates the evaluation of uncorrected misinformation; this can help explain the positive relationship between anger and belief level in our study. Weeks (Reference Weeks2015) also found that anxiety, the second emotion investigated, promotes initial belief, making individuals more easily convinced by the information presented. In our study, when users are exposed to misinformation with a higher intensity of surprise, they are likewise more easily convinced by, and believe, the misinformation. These are two distinct emotions with similar outcomes regarding misinformation; a common factor between anxiety and surprise is uncertainty about the future (Grupe & Nitschke, Reference Grupe and Nitschke2013), which we suspect is why the two emotions lead to the same conclusion. The other factors discussed in that study, such as political standpoint, are case-dependent and do not apply to our study. Lastly, it is no surprise that, when users experience negative emotions such as fear, they prefer to withdraw rather than bond with the information.
We believe that our study, which examines individuals' social bonding behaviours from a social perspective, can help in designing mitigation mechanisms to counteract the spread of misinformation. The emotional signals around misinformation can be leveraged to design automatic misinformation mitigation systems. For true information, anger, sadness, and fear are the key contributors to social bonding: because anger and sadness encourage bonding with true information while fear encourages rejecting it, messages could include more anger- and sadness-related words to promote the propagation of the truth, while fear-related words should be minimized since they push users away from creating social bonds with the truth. For misinformation, surprise should be minimized, since it leads users to falsely believe in and bond with the misinformation, whereas anticipation-related words could be promoted to encourage users to disconnect from it.
7.6 Conclusion
In this chapter, we applied the Stimulus–Organism–Response (SOR) framework to examine the relationships among information veracity, emotions, and social bonding behaviour in an online environment. We found that different information veracity triggers different emotion intensities: exposure to misinformation triggered more anger and disgust, whereas exposure to true information triggered higher intensities of anticipation, fear, joy, sadness, surprise, and trust. Furthermore, we found that the key factors contributing to social behaviour differ with information veracity. Anger and sadness trigger social bonding reactions that support true information, while fear promotes rejection of true information. When users are exposed to misinformation, anticipation promotes rejection, and surprise promotes behaviour that supports the misinformation. We believe these findings can help future research incorporate such signals to improve the design and performance of computational models for identifying and mitigating misinformation online. By examining individuals' social bonding behaviours from a social perspective, our study may help future misinformation identification and mitigation work design models that capture misinformation from a more human-understandable perspective.