Addressing Inaccurate and Misleading Information About Biological Threats Through Scientific Collaboration and Communication in Southeast Asia
Mis- and disinformation about outbreaks, epidemics, and pandemics are age-old problems that have been exacerbated in modern times by the rapid production and flow of information through various channels. False claims based on inaccurate and misleading information during infectious disease events present challenges to effective outbreak control, including potential distrust among affected populations in the public health response and questions among security experts about the true origins of outbreaks. Some false claims may be disproven through sound scientific analysis, suggesting a role for scientists in providing evidence-based, scientifically defensible information that may discredit or refute such claims. The National Academies conducted a study to engage scientists in Southeast Asia in working across scientific disciplines and sectors to identify and address inaccuracies that may fuel mis- and disinformation. Although no evidence exists demonstrating that Southeast Asia is more likely to generate or receive misinformation than any other geographic region, the region has witnessed the emergence of several infectious diseases that have spilled over into the human population, as well as growing regional efforts to promote responsibility in the life sciences. Although this study has a regional focus, the outcomes may be useful for scientists in many other parts of the world given the ubiquitous nature of mis- and disinformation.
The main purpose of the study is to build capacity within Southeast Asia to counter, in a collaborative and effective manner, scientific inaccuracies about biological threats that lead to or perpetuate mis- or disinformation. At the request of the sponsor, this report puts a particular emphasis on (1) creating working definitions of mis- and disinformation to provide a deeper understanding of the generation and spread of such claims; and (2) understanding how to distinguish among consensus views of the scientific community, supported scientific information that is not necessarily the consensus view, and unresolved or evolving scientific information.
This study focuses on misinformation about emerging infectious diseases (e.g., influenza strains and Ebola virus); targeted disinformation about biological events and materials; inaccurate claims about the purpose of pathogen research and facilities; and poor-quality scientific information such as studies that have little statistical power, methods that do not match the conclusions drawn, or poor reporting of results. The study is not specifically focused on the COVID-19
pandemic, nor was the goal of the study to address any particular claim, including those about the origin of SARS-CoV-2. The study also does not focus on issues of vaccine hesitancy or resistance, primarily owing to the significant complexity of the underlying reasons for these views. However, the outcomes of the study may be broadly helpful to scientists seeking to counter misinformation about vaccines.
The primary audience for the study and its outputs, specifically the engagement strategy report and how-to guide, is scientists and scientific institutions from Southeast Asia. Because mis- and disinformation are global problems, scientists from other regions also may benefit from the study outcomes. For this study, the term “scientist” includes laboratory and field life scientists; clinicians (human and veterinary); public health scientists; social scientists; and scientists from a variety of other natural, physical, and computer sciences and organizations (academia, industry, nongovernmental organizations, government laboratories, and community or unconventional laboratories). Potential broader audiences of corrective messages are policymakers, journalists, and lay and religious leaders. Although the primary focus of the study is to develop the trusted network of scientists rather than to refute claims in broader public discourse or in designed informational or behavioral change campaigns, the outcomes provide informative knowledge to regional scientists who may choose to engage policymakers and other non-scientific stakeholders in refuting misinformation.
The regional focus of the study, which is specified in the Statement of Task, allowed the committee to include structural and cultural considerations and challenges of the region, and develop actionable recommendations based on existing regional resources and scientific networks. Although the focus and framing of the report and recommendations are within the lens of Southeast Asia, the scholarship of network design, misinformation, and science communication may inform efforts by scientists in other parts of the world to counter inaccurate and misleading information given the international nature of mis- and disinformation. In addition, international networks may offer certain advantages to engaging scientists with different expertise and accessing resources from international organizations. However, opportunities for involving scientists from Southeast Asia and for building understanding and trust among network members may be greater at the regional level where the cultures and societal structures may be more similar. Therefore, this strategy report and accompanying how-to guide are tailored to provide regionally appropriate and actionable advice, rather than high-level options that are not specific to any region or country in the world.
To address the Statement of Task, the committee produced the following consensus products: (1) this report, which describes the committee’s recommended strategy for developing a trusted network of qualified scientists who address inaccurate and misleading information that might lead to mis- or disinformation; (2) a how-to guide for scientists to determine whether and how to address a claim and, subsequently, how to correct the scientific information fueling the claim; and (3) a relational mapping of selected online collaboration platforms.1 The strategy report and guide do not provide recommendations that request limitless capacity
building, workforce training, and funding for a number of reasons. These types of recommendations are less actionable, do not empower partners, do not help scientists navigate potentially challenging policy and practice, and put the focus on long-term funding as opposed to providing the tools and knowledge needed to implement the engagement strategy and guide. The strategy report and guide highlight actions that could have high impact, focus on local solutions, and build on local resources so they can be implemented feasibly by scientists in Southeast Asia. Given the complexity of mis- and disinformation, many stakeholders and efforts are necessary to prevent and otherwise counter these claims. This study involves one set of stakeholders—scientists—in helping to address mis- and disinformation that are fueled by inaccurate and misleading information. Furthermore, the landscape and risks associated with mis- and disinformation, though not new, are evolving rapidly because of the volume of information available, the availability of new online platforms (e.g., social media) and tools (e.g., artificial intelligence [AI]), and changing pandemic and broader societal context. Therefore, critically analyzing all available (often new) resources for addressing mis- and disinformation for their effectiveness is not feasible. The outcomes of this study are intended to present current scholarship and practical knowledge about how to engage scientists in working collaboratively to address inaccurate and misleading information, which could be provided to broader audiences (e.g., policymakers, journalists, lay and religious leaders, and other members of the public) with evidence-supported, robust scientific information that effectively can counter mis- and disinformation.
This report starts with a detailed description of the problem of compounding threats involving inaccurate and misleading information that leads to or perpetuates mis- or disinformation, and their associated consequences to public health and national security. This discussion is followed by a description of cases where life, social, and/or computer scientists have helped to address these problems and challenges. Subsequently, the strategy for a trusted network of scientists from Southeast Asia is described. The strategy builds on scholarship about scientific networks and network influence, countering misinformation, and understanding uncertainty, which are detailed in the last several sections of the report. Drawing on the scholarship presented in this report, the how-to guide2 provides clear instructions for scientists to determine whether and how to address particular claims.
Misinformation is defined as the unintentional spread of false or misleading information that is shared by mistake or under the presumption of truth (Sedova 2021). Disinformation is defined as false, misleading, distorted, or isolated factual information that is spread deliberately with the intention to cause harm or damage.
The lines between accidental and deliberate spread of inaccurate information may be blurry. For example, false claims that may have been introduced deliberately may be spread by actors who believe them to be true and do not intend to spread false information. Similarly, actors who intend to cause harm may spread misinformation deliberately, including inaccurate scientific information that fits their purpose. Further distinguishing among misinformation, disinformation, and other related concepts, such as conspiratorial thinking or a state of being uninformed (Scheufele and Krause 2019), is beyond the scope of this report. Furthermore, the actors and approaches involved in addressing targeted disinformation often are different from those involved in correcting inaccurate and misleading information that contributes to misinformation. Scholarship on countering targeted disinformation is extremely limited, but increasingly social, computer, political, and natural scientists are studying the creation, spread, and correction of misinformation. Therefore, this report and associated guide draw on the available scholarship on misinformation to inform the recommended approach for countering inaccurate and misleading information, regardless of the intention behind its spread.
Consequences of the spread of inaccurate, misleading, or even hyped scientific information include loss of trust in the public health system and/or ineffective public health responses during epidemics, international conflict about the source of and responsibility for epidemics, or the targeting of individual scientists and research institutions associated with particular claims. This section provides examples of inaccurate and misleading claims about biological threats.
Mis- and Disinformation Involving Infectious Diseases
During the height of the Cold War, the Soviet Union deliberately spread false claims about the United States conducting biological activities with non-peaceful intent, seeking falsely to portray the United States as a violator of the Biological and Toxin Weapons Convention, which bans the development, production, stockpiling, acquisition, and retention of microbial or other biological agents or toxins of types and in quantities that have no justification for prophylactic, protective, or other peaceful purposes (Gamberini and Moodie 2020). Such disinformation did not disappear at the end of the Cold War. During the past 15 years, Russia has continued to spread disinformation about U.S. biological investments in Central Asia, and specifically about the activities of the Richard Lugar Center for Public Health in Tbilisi, Georgia, which Russia claims is a secret biological weapons facility run by the United States that could be responsible for the release of viruses (Fidler 2019; Gamberini and Moodie 2020; Isachenkov 2018; Lentzos 2018). During the COVID-19 pandemic, as in previous epidemics, Russia and China have spread disinformation to discredit the United States and several European countries, adversely affecting public health systems and trust in government (Dubow et al. 2021). These claims are spread deliberately, are motivated politically, and have real consequences in international diplomacy and/or response to public health emergencies.
Inaccurate and misleading claims associated with biological events also can undermine response activities in affected areas and discredit the current
international leaders in biosecurity. For example, during the 2014–2016 West Africa Ebola Virus Disease (EVD) outbreak, distrust of health care and international aid workers by West Africans affected by the virus adversely altered disease dynamics and response activities. The consequences of this distrust and skepticism presented significant challenges to infection control, particularly in identifying, isolating, and treating exposed or newly infected individuals.
Although this study did not focus on vaccine hesitancy, the ongoing coronavirus pandemic (2019–present) has revealed significant vulnerabilities from misinformation about the safety and effectiveness of vaccines and natural products in Southeast Asia. In Thailand, public health officials initially provided incorrect information about the two vaccines (Sinovac-CoronaVac and the AstraZeneca COVID-19 vaccine) that the country had acquired, stating their efficacies as significantly higher than the experimental evidence supported (Chulalongkorn University 2021; KIN Rehabilitation & Homecare n.d.; Mallapaty 2021). Once the incorrect information was discovered, the Ministry of Health corrected its materials and communications, but the inaccurate information remains accessible online. In Indonesia, numerous claims have been made by non-scientists about the efficacy of natural products in protecting against SARS-CoV-2 infection and about the sources of the pandemic (Fitrianingrum 2021; Rahmawati et al. 2021). No scientific evidence supports these claims (Adelayanti 2020), and some have been publicly corrected (Indonesia COVID-19 Task Force n.d.; Majelis Ulama Indonesia 2021). In Malaysia and Indonesia, false claims that vaccines are not halal (Ministry of Health for Malaysia 2021), specifically that they are made in aborted fetal cells or include cells from pigs (Kassim 2020; Ruzki and Ismail 2021), have adversely affected vaccine acceptance. In addition, errors introduced when information is translated into local languages have contributed to the spread of inaccurate information. In other parts of the world, claims spread about SARS-CoV-2 vaccines include the following: (1) that they are a means for covert implantation of a microchip, (2) that they cause sterility in women, and (3) that they are a scientific hoax (Goodman and Carmichael 2020; Henley and McIntyre 2020; Imhoff and Lamberty 2020; Mayo Clinic Health System 2021).
Another significant focus of disinformation claims during the SARS-CoV-2 pandemic is related to its origin, which has yet to be determined. Numerous articles have been published attributing the introduction of SARS-CoV-2 into humans to deliberate release as a biological weapon by China, the United States, Israel, or the United Kingdom (Gertz 2020; MEMRI 2020; Pamuk and Brunnstrom 2020; Rogin 2020; Stevenson 2020). These claims can affect diplomatic efforts among countries adversely, expose researchers to various personal and security harms (e.g., death threats), and/or limit efforts to enhance safety and security of field and laboratory-based research of emerging infectious diseases.
The volume and variability of scientific information communicated have led the World Health Organization (WHO) to coin the term “infodemic,” which it defines as “a tsunami of information—some accurate, some not—that spreads alongside an epidemic” (WHO 2020). Little to no evidence exists to suggest that the problem of infodemics has gotten worse over time (Simon and Camargo 2021). Limited evidence also exists suggesting that access to more
accurate information is connected to more informed choices by citizens and policymakers. For example, a recent U.S. National Academies’ report stated that science alone rarely changes attitudes and behaviors, regardless of whether the audience understands the correct information or believes that new information would change their minds (NASEM 2020). Nevertheless, WHO has maintained that unique challenges exist in addressing the COVID-19 pandemic because of the volume and the variable quality and accuracy of available scientific information (WHO 2020, n.d.). According to WHO, these challenges are exacerbated during emergency situations when little information is available, public health and/or security questions are unanswered, and decisions for response are made quickly.
Scientific Information Overload: A New Threat Landscape
Digital communication technologies, in particular the internet and social media, profoundly have changed the way scientific information is produced, consumed, and disseminated and have contributed to access to and spread of mis- and disinformation (Okeleke and Robinson 2021). In the 1990s, scientists were among the earliest to adopt internet technologies primarily to communicate with their peers. Today, scientists can choose to disseminate their work to many different audiences and at many different stages of the scientific process. In addition, scientists now can share data, crowdsource peer review, conduct analyses, request funding (i.e., crowdsourced funding), and establish and maintain collaborations online.
These developments have been complicated by changing norms within the scientific community related to promoting one’s own work and sharing data and results prior to peer review. The choice about whether to use online channels for communication may be influenced by the stage of a scientist’s research and the intended audience of communications about that research. Prior to writing a new scientific article, email, listservs, and various online platforms can be used to connect with colleagues to exchange comments, ideas, and data. After completion of a manuscript but prior to its peer review, preprint servers can be used to disseminate early manuscript drafts to scientists outside of one’s direct collaboration circles. Two primary preprint servers exist for biological research, bioRxiv and medRxiv, both of which add disclaimers noting whether individual papers have been peer reviewed and published. Both preprint servers have internal processes for checking for plagiarism, completeness, non-scientific content, and biosecurity risks. This review process involves trying to identify papers that could cause harm, such as those countering established scientific knowledge (e.g., claims such as “smoking does not cause cancer”). During the past few years, the number of submissions to and downloads from bioRxiv has increased significantly, and in 2017 only two-thirds of the papers submitted to bioRxiv were published in peer-reviewed journals (Abdill and Blekhman 2019). As demonstrated during the COVID-19 pandemic, the significant number of papers available to scientists and non-scientists alike, many of which have not been peer reviewed, provides ample opportunities for any inaccurate or misleading
information to be identified and used or shared by interested individuals. Blogs and social media can be used to advertise a newly published manuscript to the broader scientific community or even the public (Yeo et al. 2017).
With so many channels and opportunities for dissemination, risks may exist from an overabundance of information that cannot be assessed effectively for quality and accuracy before broad dissemination. Information overload leads to new threats to the integrity of science and the scientific workflow. Sometimes, errors may occur from misuse of a channel. For example, during the SARS-CoV-2 pandemic, individuals, including scientists, discussed the virus, illnesses, and other health- and science-related issues on social media platforms, which included sharing unconfirmed and misleading information (Cinelli et al. 2020). As another example, inaccurate information posted on preprint servers has been amplified by the press or by social media influencers before it could be vetted as part of the regular peer-review process. Other times, errors may occur even as part of the “regular” process of scientific dissemination—for example, when correct, peer-reviewed information is distorted or exaggerated in institutional press releases (Li et al. 2017). As social media increasingly becomes a “hype machine,” developing strategies to prevent dissemination of biased and distorted claims will become crucial (West and Bergstrom 2021).
Types of Scientific Inaccuracies and Misinformation
Producing accurate scientific information in crisis situations is more difficult than in non-crisis situations. Often in crises, a limited amount of information is available, data are sparse, and the situation is evolving rapidly. At the same time, crises demand urgent policy actions, which rely on emerging science, often conducted under immense public scrutiny. Still, scientists have been able to lend their expertise during crises in positive ways. However, the SARS-CoV-2 pandemic has presented a cautionary tale about generating and/or sharing sound and evidence-supported information, particularly when the outbreak-causing virus is not well understood. The volume of information produced and communicated through peer-reviewed journals, non-peer-reviewed preprint servers, predatory journals, online platforms and websites, and social media creates significant challenges for scientists and other consumers of scientific information in evaluating the accuracy and quality of publicly shared scientific information. Studies that are poorly conducted, involve small sample sizes, and/or demonstrate other significant weaknesses have fueled misinterpretation and misinformation by the media or other groups with particular agendas. For example, the results of a 2020 Danish study on the effectiveness of masks for protecting against SARS-CoV-2 infection were interpreted by the media as “questioning the efficacy of masks” and used by anti-masking campaigns to promote their agenda (O’Grady 2021). Because evidence-informed decision-making involves acquiring and aggregating available information, inaccuracies resulting from weak study design, low sample size, and poor data analysis also risk producing ineffective and misinformed policy and public health actions.
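To see why small sample sizes leave so much room for misinterpretation, consider a minimal Python sketch (the numbers are made up for illustration and are not taken from any study cited in this report) showing how the same observed rate carries very different uncertainty at different sample sizes:

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half_width), min(1.0, p + half_width)

# Hypothetical illustration: the same observed rate (60%) measured in a
# small study (n=25) versus a much larger one (n=2500).
small = proportion_ci(15, 25)
large = proportion_ci(1500, 2500)

print(f"n=25:   {small[0]:.2f}-{small[1]:.2f}")   # wide interval
print(f"n=2500: {large[0]:.2f}-{large[1]:.2f}")   # narrow interval
```

With 25 participants, the interval spans everything from a weak to a strong effect, which is precisely the ambiguity that agenda-driven readings of underpowered studies can exploit.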
In addition, scientific communications that extensively promote a particular product or information (referred to as hype), exaggerate the results and outcomes of a study (referred to as hyperbole), or distort the data or results can lead to and/or perpetuate misinformation, which could have significant downstream consequences to democratic decision-making and public health, both of which are critical during infectious disease epidemics (West and Bergstrom 2021). Other practices, such as not publishing negative results, referred to as publication bias, and selective referencing of articles, referred to as citation bias (e.g., self-citation, citation of authoritative authors, and citation by journal impact factor), present a partial picture of the scientific results and could perpetuate inaccurate and misleading information (Joober et al. 2012; Urlings et al. 2021). Scientists are well positioned to address these challenges to promote accuracy and robustness of scientific data and results (West and Bergstrom 2021).
Malign actors also influence crises when ambiguity exists and information changes rapidly. First impressions of information can be challenging to counter, so anticipating and preemptively addressing problematic narratives can help fill information gaps with correct information before misleading claims have an opportunity to spread. Partnerships with social and mainstream media are critical to reduce the risk of amplification of inaccurate or misleading information. But if problematic information is not identified before it enters mainstream venues, other means are necessary to counter those claims within those venues.
Scientists can play several roles in addressing inaccurate and misleading claims, including the following: (1) countering misinformation that has long-term consequences for public health and for research progress and translation among scientists and other key stakeholders; (2) building trust and strengthening communication among scientific and non-scientific communities (e.g., policymakers and journalists); (3) developing defensible, high-quality, accurate scientific information, which results in high confidence even if consensus does not exist; (4) engaging with members of the broader public to convey how scientific progress is made and how to evaluate scientific information critically; (5) contributing to the emerging field of developing, monitoring, and countering misinformation, which already includes social, political, computer, and life scientists; and (6) acting as objective, independent messengers of evidence-supported scientific information.
Scientific inquiry has been crucial for assessing emerging outbreaks and epidemics, and developing effective strategies for countering emerging, reemerging, and persistent infectious disease and other biological threats. Often life and public health scientists from various disciplines, including virology, computational biology, and epidemiology, play numerous roles in preventing, detecting, and responding to biological threats. These roles include detection and characterization of new infectious diseases and/or strains of existing pathogens, analysis of the rate of illnesses and deaths, identification of the source of new
infections and effective control measures, and development of new vaccines and medicines, among many other activities. For example, the 2011 Escherichia coli O104:H4 outbreak initially was characterized using bioinformatics analyses of bacterial genomic sequences following an unusual infection pattern—24 infections in France and more than 3,816 cases in Germany (Guy et al. 2012; IOM 2012). The sequence analysis enabled food safety experts to use epidemiological methods to track the source, which was contaminated sprout seeds (Grad et al. 2012), but not until after the outbreak had been correlated inaccurately with lettuce, cucumbers, and tomatoes (Foley et al. 2013). These analyses, which were conducted relatively quickly after initial infections were documented, enabled public health officials to prevent further spread of the bacteria and addressed concerns, prompted by the unusual infection patterns, that the outbreak might have been deliberate. Scientists from other disciplines, including various social science disciplines and computer science, have enabled effective control of biological incidents. The 2014–2016 EVD outbreak in West Africa illustrates the important role of social and computer scientists. Cultural anthropologists shed light on why white protective suits and removal of ill or deceased individuals caused distrust, leading health agencies to change the color of the suits (because white signifies death within the local culture) and develop safe practices for traditional burials (Nyhan 2014). These changes allowed health workers to gain the trust they needed among the local population to control the spread of EVD. Similarly, advances in big data analysis and modeling of infections informed international awareness about and response to the outbreak (Nieddu et al. 2017; Wein 2014).
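The kind of genomic comparison used to characterize outbreak strains can be sketched in a few lines; the sequences below are toy, pre-aligned fragments invented for illustration, not real E. coli O104:H4 data:

```python
def pairwise_identity(a: str, b: str) -> float:
    """Fraction of identical positions between two pre-aligned sequences."""
    if len(a) != len(b):
        raise ValueError("sequences must be pre-aligned to equal length")
    matches = sum(x == y for x, y in zip(a, b))
    return matches / len(a)

# Toy pre-aligned fragments (hypothetical, for illustration only):
outbreak_a = "ATGGCGTACGTTAGC"
outbreak_b = "ATGGCGTACGTTAGC"   # identical -> likely the same strain
reference  = "ATGACGTTCGTAAGC"   # a more distant background strain

print(pairwise_identity(outbreak_a, outbreak_b))  # prints 1.0
print(pairwise_identity(outbreak_a, reference))   # prints 0.8
```

In practice, such comparisons run over full genomes with dedicated alignment tools, but the underlying idea is the same: near-identical sequences among cases point to a common source, which epidemiologists can then trace.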
During the SARS-CoV-2 pandemic, computer scientists also have brought to infectious disease outbreaks their knowledge of information transfer via social media and other online platforms, as well as advanced data analytics, such as AI, for quickly identifying effective existing therapies and for near-real-time surveillance during outbreaks (Basu 2021; Begley 2021; Bridgman et al. 2021; Budd et al. 2020; Himelein-Wachowiak et al. 2021; Kostkova et al. 2021; Mohsin et al. 2020; Zhou et al. 2020).
Conclusion 1: Life, social, and computer scientists play critical roles in ensuring their science and the science produced and shared by others within their communities is accurate and supported by well-designed and implemented studies. They can work collaboratively to leverage appropriate expertise to produce and/or disseminate accurate scientific information, peer review and correct inaccurate scientific information, and ensure that scientific information is communicated in an unbiased, objective, and culturally informed manner.
Conclusion 2: Scientists are one of several stakeholders in addressing misinformation. Other stakeholders include policymakers, journalists, and members of the public. Engaging these other stakeholders positively is critical to building trust, communicating corrective information clearly and effectively, and addressing misinformation in a timely manner if and when necessary.
Several online networks of scientists seeking to share information and promote interaction have emerged during the past 10 years, and the number of networks has exploded since the emergence of SARS-CoV-2. During the past 5 months alone, platforms such as nextstrain.org and virological.org have enabled scientists to use publicly deposited viral genomic sequences to examine the similarity/divergence and evolutionary rates of strains from infected individuals, and to estimate dates of viral emergence in the human population (Nextstrain n.d.; Virological n.d.). Other platforms, such as the COVID-19 Global Science Portal, provide opportunities for scientists to share information about the pandemic response, inform members about scientific priorities and investments, and enable discourse about emerging scientific issues associated with the pandemic (ISC n.d.b). Still other platforms, such as kaggle.com, provide opportunities for data scientists to contribute to answering scientific questions about the pandemic (Allen Institute for AI 2021). In addition to these platforms, bioRxiv and medRxiv have provided readers opportunities to comment on deposited articles and to submit articles themselves, enabling a form of third-party review. Though these reviews are moderated, their accuracy may not be verifiable. Some preprint servers are integrated with publishers, which enables transition of articles from non-peer-reviewed to peer-reviewed status. In addition, publishers of peer-reviewed journals have provided free access to published research about pathogens causing ongoing outbreaks, epidemics, or pandemics, including to WHO and other public health entities (STM 2021). Some publishers also provide training on research integrity and countering misinformation to their scientist-authors (Elsevier n.d.; Nature Masterclasses n.d.).
Finally, virtual platforms, such as ResearchGate, Academia.edu, and Mendeley, provide opportunities for scientists to share their articles with others. Together, online platforms, pre-publication databases, and individual groups of scientists enable broader participation in addressing global scientific questions associated with ongoing events. Although few individual groups of scientists and online platforms explicitly address potential biosecurity risks resulting from disinformation campaigns, these platforms suggest that the concept of a virtual community of scientists conducting community-based analyses and peer review to produce accurate, evidence-supported science is possible.
Finding 1: During the past few years, numerous virtual platforms, including and independent of social media platforms, were created to foster communication and collaboration among scientists, share data, and crowdsource analysis. These platforms created opportunities for scientists throughout the world to interact with each other.
As in many other fields, AI has been viewed as a helpful tool for identifying misinformation claims so that they can be addressed by various individuals or entities, including several of the policymakers whom the committee consulted. For example, organizations, such as Meta (formerly Facebook), and marketplaces, such as Amazon Mechanical Turk, use or can use AI to detect misinformation
(Horowitz 2021; Meta AI 2020). However, these tools have faced challenges in identifying misinformation related to the COVID-19 pandemic (Siriwardana n.d.). Therefore, understanding the capabilities and limitations of AI, especially in helping to identify false claims, is critical if national and/or nongovernmental entities plan to use AI as part of their fact-checking efforts and if publishers and scientists plan to use AI to identify relevant articles for review and analysis. AI refers to the ability of computers to perform tasks that normally require human intelligence; it has been studied for decades, with early work focusing on routine software that codifies the knowledge of human experts in specifically programmed rules (i.e., “if given a specific input, then provide a specific output”) (Sedova 2021). Recent interest in AI focuses mostly on machine learning, a type of AI that is driven by data abundance, innovation in algorithms, and computing power. Machine learning systems learn from data rather than from human expertise and involve the development of a model based on a training dataset and a defined algorithm architecture that can compensate for bias in the data. Four types of machine learning exist: (1) supervised learning, which uses example data that have been curated and labeled by humans; (2) unsupervised learning, which uses unlabeled data and instead identifies patterns in the data, clusters similar groups of data, and can detect new behaviors from previously unidentified patterns; (3) semi-supervised learning, which uses both labeled and unlabeled data; and (4) reinforcement learning, in which an autonomous AI agent gathers its own data as it performs goal-oriented tasks to maximize rewards in a learning environment and improves through trial and error in fictitious or real-world situations.
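A minimal sketch of the first of these types, supervised learning, as it might apply to claim classification: a naive Bayes model learns word statistics from human-labeled examples and then labels a new claim. The training examples and labels below are invented for illustration; operational systems are trained on far larger curated datasets.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(examples):
    """Build per-label word counts and label frequencies from labeled examples."""
    counts = {}          # label -> Counter of word occurrences
    totals = Counter()   # label -> number of training examples
    for text, label in examples:
        counts.setdefault(label, Counter()).update(tokenize(text))
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Naive Bayes with add-one smoothing; returns the most probable label."""
    vocab = {w for c in counts.values() for w in c}
    n = sum(totals.values())
    best_label, best_score = None, float("-inf")
    for label, words in counts.items():
        score = math.log(totals[label] / n)              # log prior
        denom = sum(words.values()) + len(vocab)         # smoothing denominator
        for w in tokenize(text):
            score += math.log((words[w] + 1) / denom)    # smoothed likelihood
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical labeled training data -- the human-curated labels are what
# make this "supervised" learning:
training = [
    ("vaccine contains tracking microchips", "misleading"),
    ("virus engineered as a weapon claim lacks evidence", "misleading"),
    ("peer reviewed study reports vaccine efficacy data", "supported"),
    ("genomic sequences show natural viral evolution", "supported"),
]
counts, totals = train(training)
print(classify("claim that the vaccine contains microchips", counts, totals))
# prints "misleading"
```

The sketch also illustrates a limitation noted above: the model inherits whatever biases and gaps exist in its small labeled dataset, which is one reason such tools struggled with novel pandemic-era claims.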
Conclusion 3: Technical solutions alone are not sufficient to identify and address inaccurate and misleading claims. Scientists from a diversity of disciplines are necessary to address inaccurate and misleading claims about biological threats. What is needed is a structure for human interactions, favorable policies for data sharing and analysis, and a process for collaboration to correct inaccurate and misleading information before it can fuel misinformation.
A strategy for engaging scientists in addressing inaccurate and misleading information builds on the role that scientists play in curbing the spread of misinformation about biological threats, benefits from leveraging local and international networks in a dynamic and distributed network, and enables scientists to interact on an ongoing basis to improve scientific accuracy and excellence. A distributed network is a network of networks that enables the active involvement of scientists from many disciplines and institutions in correcting inaccurate and misleading information, drawing on existing discipline- or sector-specific networks of scientists. The underlying scholarship, the regional considerations for the recommended strategy, and considerations for scientists involved in countering misinformation follow the description of the recommended strategy.
Recommendation 1: Leaders of established scientific networks in Southeast Asia should jointly create a distributed network of individuals and organizations (i.e., a network of networks) that draws on the diversity of scientific disciplines and sectors needed to correct inaccurate and misleading scientific information about infectious diseases and other biological threats. The network should be regional and have a leadership structure that includes scientists from countries in the regional network. The network itself should be virtual only, leveraging recently developed online collaboration tools, but should be based in a host nation within Southeast Asia to support key operations (e.g., website, email addresses, and resource repositories) and to gain credibility with regional and national authorities.
The following sub-recommendations define the scope and boundaries of the network.
Recommendation 1.1: The network should establish (1) a governing board, composed of knowledgeable scientists who are responsible for making strategic decisions for the network; and (2) an executive team, composed of scientists who have a proven track record of developing and sustaining scientific networks. The executive team should manage, motivate, and expand the membership; foster diversity among the membership (i.e., gender, age, experience level, country, ethnicity, scientific discipline, type of institution); ensure that continuous services are provided to members; and activate the network to address inaccurate and misleading information about biological threats when such claims arise. The executive team should work with individual members and external stakeholders to identify claims and determine which claims to address before activating relevant expert members of the network.
Recommendation 1.2: The network should include scientists from all relevant disciplines—specifically a variety of life and other natural sciences, computer and information science, social sciences including science communication experts, and political science including researchers of misinformation—to ensure that the most relevant expertise is leveraged in addressing inaccurate and misleading information and can be communicated effectively to the appropriate stakeholders, whether within or outside the network. Furthermore, the network should include scientists and country-level networks from academia, industry, and other nongovernmental and government organizations.
Recommendation 1.3: The network should be hosted or sponsored by an organization with authority in Southeast Asia (e.g., the Association of Southeast Asian Nations) to enhance its credibility among scientists, policymakers, and other stakeholders, and to support its work by connecting it with other networks of experts, sharing information, providing meeting facilities, and linking the network’s efforts with regional public health and scientific priorities.
Recommendation 1.4: The primary audience and membership of the network should be scientists. However, the network governing body should develop trusted relationships with policymakers, journalists, and lay and religious leaders to enable broader communication and interaction with these stakeholders.
Recommendation 1.5: The network should engage (1) international scientific networks to improve access to needed expertise; and (2) non-technical organizations, such as established, credible fact-checking organizations, to assist with identification of inaccurate and misleading information, determination about whether to address the inaccurate information, and communication of corrective information.
Recommendation 1.6: The network should focus initially on addressing inaccurate and misleading information relating to infectious disease and other biological threats. However, as the network grows and gains credibility and members, it may expand to countering inaccurate and misleading information in other fields.
Developing a trusted, regional network of qualified scientists who can work together to counter misinformation based on scientific inaccuracies involves the development of a high-level strategy and structure for the network. A regional network of scientists offers advantages over national initiatives, including access to experts from a diversity of disciplines, minimization of political or social influence over the information being corrected or discussed, and access to resources such as training programs and guides. A network of life, social (including experts in science communication), and computer scientists may collaborate and communicate at the technical level without raising concerns or barriers at the political level, especially if network leadership and members understand and comply with national policies (see Recommendation 7). Figure 1 illustrates the committee’s vision of the overall strategy of the network. It highlights the need to develop short-, mid-, and long-term goals to achieve full development of the distributed network; the strategy and tactics will need to be iterated over time to achieve the vision. Developing the vision of the network at its outset ensures that the strategy and tactics work toward realizing the vision and embeds considerations about the long-term sustainability of the network from the start. This strategy will need
to (1) empower talent; (2) motivate scientists; (3) engage international, regional, and national efforts; and (4) leverage existing initiatives in responsible science.
Empowering Local and Regional Talent
A critical element for the trusted network is having a diversity of scientific experts who work collaboratively to address inaccurate and misleading information. Many countries in Southeast Asia have invested in developing scientific talent and scientific experts in various biological sciences disciplines. However, current measures of scientific output (e.g., number of graduates in specific disciplines and number of papers published) do not adequately characterize local scientific capacity and may complicate efforts to generate or recruit future talent in various scientific fields. Many countries in Southeast Asia have science, technology, and innovation talent development roadmaps, but many do not include similar initiatives to support the development of social scientists because the social sciences are perceived as not being associated with economic gain from innovation (Scott-Kemmis et al. 2021). The pandemic has revealed that Southeast Asia lacks a robust community of social science experts capable of supporting the laboratory- or field-based natural science community in addressing a variety of societal problems, including effectively addressing inaccurate and misleading claims.
In addition, equipping life and computer scientists from Southeast Asia with science communication and stakeholder engagement skills can empower them to address inaccurate and misleading information. Furthermore, providing skills to foster inclusive leadership and promote trust and sustainability within scientific networks can enable cross-disciplinary, cross-sector, regional collaboration to address these claims. Identifying the talent gaps in specific scientific disciplines in Southeast Asia is critical to ensure local relevance and sufficiency. Furthermore, an inclusive network of scientists can enable all groups to work toward a common goal of addressing misinformation about emerging infectious diseases and biological threats by leveraging each other’s expertise and diverse perspectives.
Finding 2: Training scientists on science communication, public engagement, and science policy is critical to countering misinformation.
Finding 3: In Southeast Asia, research and education in these and other social science disciplines are limited, yet they are necessary for ensuring that cross-disciplinary scientific teams have all relevant skills and expertise available when addressing inaccurate information that could lead to or perpetuate misinformation.
Recommendation 2: National competent authorities, whether the Ministries of Science or other Ministries, in Southeast Asian countries should develop undergraduate, graduate, or postgraduate training programs on various social science disciplines, including science communication and public engagement. In addition, national competent authorities should develop informal training programs in science communication and public engagement for life, computer, and other natural scientists and engineers to enhance their communication skills and their ability to build public trust. Countries that already have established education or training programs should share their curricula, best practices, and lessons learned.
Motivating Service for the Purpose of the Network
Another critical element of the trusted network is motivating local scientists to contribute actively to promoting scientific excellence and integrity, building the defensible and accurate scientific knowledge base needed to correct inaccurate and misleading information, and identifying and addressing misinformation. Examples of ways that scientists may be motivated to contribute to the network include the following: (1) appealing to concepts of scientific responsibility, associating the practice of countering inaccurate information with scientific integrity and excellence; (2) highlighting the potential consequences of misinformation
such as the erosion of trust in science and scientists by members of the public and negative effects on the research and public health ecosystem; (3) recognizing scientists for their contributions to countering inaccuracies; and (4) providing financial support for time spent addressing inaccurate information.
Intrinsic motivation-building strategies include the promotion of inclusivity and an understanding of the local culture, trust, and openness. Teams involving Southeast Asian scientists need to be created in a manner that allows for the inclusion and productive contribution of scientists who are early in their careers and who come from a variety of disciplines and sectors. Promoting trust through a culture of transparency and openness is critical to building a trusted network, both for internal interactions among members and for external interactions with various stakeholders (e.g., scientists, members of the public, policymakers, and journalists). Such transparency is critical to effectively addressing the growing mistrust, skepticism, and other challenges associated with inaccurate and misleading information that fuels misinformation: it supports greater accountability, quality, and safety and promotes ethical attitudes and practices, all of which contribute to scientific excellence. The value of transparency in fostering trust is highly dependent on the external and internal communication processes, which are mediated by several relational and contextual factors.
Motivating the scientific community to engage in efforts such as addressing misinformation involves social responsibility and integrity. For example, the Young Scientists Network-Academy of Sciences Malaysia and the Association of Southeast Asian Nations-Young Scientists Network (ASEAN-YSN) promote Responsible Conduct of Research to encourage greater awareness among scientists in upholding the principles of research integrity and addressing contemporary needs of various external stakeholders in Southeast Asia. These existing networks promote multi-sector engagement among academia, industry, government, and broader society and embrace responsible team science among interdisciplinary teams of scientists and others within the scientific ecosystem. Through these efforts, researchers in the region are encouraged to reflect on their own actions to ensure they align with ethical principles associated with correcting scientific errors and addressing misinformation.
Finally, when misinformation involves scientific issues, scientists may be concerned about the presumed harm that these claims have on the reputation of other scientists and the public, which in turn can activate scientists’ support for education and legislation for correcting misinformation. These considerations can become key enablers in encouraging domain experts, such as life, computer, and social scientists, to work together to safeguard the interests of the scientific community and the broader society.
As scientists become motivated to contribute to the network, enhance scientific excellence, and address misinformation, their continuous engagement and involvement in the network can be promoted. Continued engagement can be fostered by providing scientists with ongoing professional development opportunities to improve their skills in addressing inaccurate and misleading claims and by enhancing existing and innovative capacities for addressing inaccurate information at the institutional and national levels.
Actively motivating members, providing evolving professional development offerings, and regularly co-designing and implementing efforts to promote a shared vision and mission among network members are critical for ensuring the sustainability and relevance of the network. Members of the networks with diverse interests could be encouraged to pursue their interests as long as they are framed within the overarching goals of the network. Finally, by valuing inclusivity and representation among members, the network provides opportunities for scientists from various disciplines, genders, career stages, accessibility, and cultures to interact. Diversity strengthens scientific excellence, which plays a critical role in the network. Inclusion of diverse views, expertise, and experiences enables the network to incorporate new and innovative approaches; promote scientific responsibility; enhance accuracy and accessibility of science among various audiences; and be nimble, flexible, responsive, and proactive. Box 2 highlights the values of the trusted network.
Engaging International, Regional, and National Efforts
A third critical element of the trusted network is creating an entity that is valued and trusted by its members and various stakeholders (or audiences) and that works in coordination with existing networks, many of which have different but overlapping missions, members, and audiences. Several international, regional, and country-level networks exist to facilitate scientific interaction on general and discipline-specific topics. Examples of international networks of scientists include the Global Young Academy, The InterAcademy Partnership (IAP), and The World Academy of Sciences. Examples of regional networks are ASEAN-YSN and the Association of Academies and Societies of Sciences in Asia. Regional networks such as these can be quick to implement
new initiatives and programs, often have a greater cultural understanding of issues and regional context, and may receive financial and political support from regional (e.g., ASEAN) or national authorities. Examples of country-level societies are national academies and national young academies. These networks may be connected to regional and international networks. Networks of scientists also may be formed on a disciplinary or field-specific basis, as with the American Society for Microbiology, the International Society for Infectious Diseases’ ProMED, the American Biological Safety Association International, the International Life Science Institute, the International Union of Biological Sciences, the International Union of Biochemistry and Molecular Biology, the International Network for Government Science Advice, and the Southeast Asia One Health University Network and national one health university networks in Southeast Asian countries.
Trust is built on scientific excellence, clear and concise communication, and active and respectful dialogue with various stakeholders. Although the need to address misinformation about biological threats has been recognized by several international organizations (e.g., WHO, the United Nations Interregional Crime and Justice Research Institute), prioritizing regional and national efforts that involve scientists as part of the solution to inaccurate and misleading information that leads to or otherwise enhances misinformation would provide high-level support for the trusted network and other related activities. For example, if ASEAN prioritizes initiatives for addressing misinformation, members of the trusted network could supply accurate, authoritative, and defensible scientific information to debunk such claims, and that prioritization itself would lend high-level support to the network. In The State of Southeast Asia: 2021 Survey Report, published by the ASEAN Studies Centre at the Institute of Southeast Asian Studies-Yusof Ishak Institute, respondents from ASEAN (drawn from academia, think-tanks and research institutions, government entities, civil society organizations, nongovernmental organizations, business and finance institutions, and regional and international organizations), in particular from Singapore and Brunei, urged scientists and medical doctors to engage in public policy discussions and scientific advice to better address the SARS-CoV-2 pandemic (Seah et al. 2021). Similarly, partnerships between this trusted network and respected international organizations such as the International Science Council, IAP, and national academies (e.g., the U.S. National Academies) would further enhance the network’s ability to draw on scientific leaders in a variety of disciplines, including those not yet well represented in Southeast Asia.
Recommendation 3: The distributed network (see Recommendation 1) should leverage existing scientific networks in Southeast Asia, catalyzing collaboration among their members, especially those with needed expertise, to address specific claims. Regional scientists involved in developing this distributed network should develop open and trusted lines of communication and partnership with relevant Ministries to gain high-level support for the network and its outputs.
Leveraging Scientific Responsibility and Ethical Principles
Responsible science is a core part of life science research and development throughout the world and is grounded in international principles of scientific responsibility, which include integrity, respect, fairness, trustworthiness, transparency, recognition of benefits and possible harms, accountability, stewardship, and objectivity in the conduct and communication of science (IAP 2016; ISC n.d.a; NASEM 2017a). Several of these principles align with the 1979 Belmont Report, which created an ethical framework for research involving human subjects that subsequently was applied more broadly to biomedical ethics in 2001 and to the social sciences (Beauchamp and Childress 2001; National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research 1979). These core principles have been applied to many other types of research and scientific activities, including ethical considerations associated with research that raises security concerns (Selgelid 2016).
Building on this foundation, the U.S. National Academies and Academy of Sciences Malaysia engaged scientists throughout Southeast Asia in 2013 on responsible science (Chau 2020; Clements et al. 2017). These activities focused on linking responsible conduct of research and other responsible science concepts with scientists’ role in addressing or otherwise minimizing security risks of their research, specifically the potential for accidental or deliberately malicious exploitation of peaceful life sciences research (NAS 2017; NRC 2004; WHO 2010). Following this conference, the Malaysian Educational Module on Responsible Conduct of Research was published in 2018, and the ASEAN-YSN and Southeast Asia regional office of the International Science Council launched a project in 2019 to train instructors on responsible science throughout the region (ASEAN-YSN n.d.; MOSP n.d.). Internationally, the European Union initiated an effort on responsible research and innovation as part of its Horizon 2020 program, and WHO is updating its 2010 report Responsible Life Sciences Research for Global Health Security: A Guidance Document (EC n.d.; WHO 2010).
Finding 4: International and regional efforts toward responsible science provide a strong foundation and framework for guiding the activities of scientists who are interested in correcting inaccurate scientific information that leads to or propagates misinformation in the public sphere.
Developing an implementation plan for the creation and maintenance of the distributed network is a critical next step in engaging scientists from a variety of disciplines and sectors to assist in countering misinformation based on inaccurate and misleading information. The recommended strategy lays the foundation, including key considerations for its successful implementation, for the development of this distributed network. The implementation plan would document clear tactics for building strategic partnerships to realize the distributed network and
gain support among the scientific and policy communities in Southeast Asia, adding value to members on an ongoing basis (e.g., through training in new skills), developing and executing the process for adjudicating claims that need to be addressed by the network and for enabling cross-disciplinary and cross-sector collaboration to address those claims, and identifying and implementing improvements of the distributed network. Furthermore, the implementation plan can specify the means and frequency of interaction among network members; the means of interaction with other networks and regional leaders; the process for developing common messages about corrected information that are accessible in multiple languages; and a unified platform through which members and broader stakeholders can communicate and access authoritative, evidence-informed, and defensible scientific information. Finally, the implementation plan can include the development of key operational actions, including (1) a process for maintaining connectivity and interaction among members, stakeholders, and other networks; (2) an action plan for sustaining the network beyond initial funding (i.e., through the development of a fundraising or business plan that identifies potential funders, articulates the value proposition to potential funders, and documents measures of success of the network); (3) a clear definition of the membership types, levels, and costs, if relevant; (4) incentives for member and stakeholder engagement; and (5) consideration of the nation that hosts the network, even if implemented as virtual-only, and associated policies for operating the recommended network.
Recommendation 4: The governing body and executive team of the distributed network (Recommendation 1.1) should establish an implementation plan for the creation of the distributed network prior to its formation. This plan should specify important aspects of the network’s operation such as (1) sustainability of the network once established; (2) interaction with scientific and non-technical stakeholders; (3) ongoing benefits to members (e.g., training, mentorship and coaching opportunities, professional networking, production of publications); and (4) processes for identifying, adjudicating, and, if deemed necessary, addressing inaccurate information relating to biological threats. The plan should build on the key considerations of the strategy, lessons learned about effective correction of inaccurate information that are described in this report (section on Interventions Against Inaccurate and Misleading Information), and the associated how-to guide for identifying and potentially addressing inaccurate information that leads to or perpetuates misinformation. Furthermore, the plan should define the role that scientists play in addressing inaccurate and misleading information and scientific responsibility. This role should be considered within the broader context of other societal and international actors who may be addressing mis- and disinformation.
Recommendation 5: The implementation plan should specify how the network will interact with stakeholders in government, media, and the broader
public to establish itself as a “go-to” regional focal point for addressing inaccurate and misleading information with evidence-supported and clearly communicated scientific information. The implementation plan also should describe the relationships among the network’s leadership, members, external scientists, and other external stakeholders to provide a clear understanding of who makes inquiries about addressing inaccurate statements and how these statements will be evaluated and addressed.
Recommendation 6: The implementation plan should draw on the guide, which is a companion to this strategy report, for addressing inaccurate and misleading statements. Network leaders and members should work with experts in the life sciences, science communication, team science, computer science, and research studying the flow, spread, and correction of misinformation to train network members in using the guide effectively.
Different approaches to building a network to combat misinformation about biological threats were considered. One approach is a network that involves formal or informal associations of individuals as members. Another approach is a consortium that involves formal association of institutions as members. In this section, both models, practicalities of their creation and maintenance, and limitations and capabilities of each are described along with common considerations that apply to any scientific network model.
Selecting which model is appropriate involves analysis of several key considerations including sustainability, which is a significant concern associated with the longevity and durability of any network, and the ability to be dynamic. Continuous engagement of members in the network’s primary activities and emphasis on the benefits of membership to the network promote its sustainability. Dynamic networks require more active participation of members and specialized domain expertise, especially for events that occur periodically. For example, although misinformation is a constant concern, misinformation about infectious disease research and epidemics may occur periodically and only when specific concerns or events emerge. Therefore, designing the network such that it can activate different members based on changing needs is important for addressing inaccurate information about emerging infectious diseases and other biothreats. Both models described here incorporate these considerations in network design.
Trust is a critical component of any type of network. Creating a stable and trusted network involves providing opportunities for affiliation and social exchange, development and mentoring, and support for strategic collective action. Collective learning and collaboration can enable a community to construct moral authority. Through this collective intelligence, networks of both individuals and institutions can foster innovation more effectively to solve highly complex challenges and can quickly marshal resources and knowledge (Riedl et al. 2021; Suran et al. 2020).
Network of Individuals
The “network of individuals” model is similar to professional associations or Facebook groups that draw together a set of individuals who identify with a particular community or cause. For example, most scientific societies have individual members; provide newsletters, awards, meetings, and publications; and appoint fellows. They offer an array of membership types, including for those working in specific fields or disciplines, at various experience levels and career stages (e.g., students), and with varied citizenship.
The benefit of a network of individuals is that it can be targeted to the individuals who are most invested in and most knowledgeable about the science. Scientists can leverage their domain expertise to identify and address misinformation in ways that reflect problem-solving and critical thinking in their own fields. The network can be crafted with a balance between different domain experts including science communicators and various life scientists, and ensure diversity in membership (e.g., fields, career stage, expertise, geographical and regional representation, sector). Furthermore, these organizations offer opportunities for sub-groups to be established that address particular issues, such as science policy. This sub-group structure could provide opportunities for individual scientists from existing, reputable networks to address misinformation. This initial nucleus of trusted individuals may attract interest from other members who also are able to lend their expertise to common goals, and enable the development and use of common rules or effective procedures for working together.
This model of networks of individuals also has several limitations. A significant limitation is the challenge of sustaining such a network. If the network is activated sporadically, members may not be engaged sufficiently, and membership might experience significant attrition and turnover during dormant periods. As a newly established network, trust and credibility also would need to be established. Another limitation is that a structure would need to be established to run the network, including a leadership structure, a way to identify who would be included as members, and a process for vetting those members. Additionally, a network of individuals may not be able to draw on funds of the kind that societies collect through dues to support their own activities. Finally, a network of individuals may provide less protection for its members against attacks (e.g., on one’s reputation, oneself, or one’s family) than a consortium-based model.
Finding 5: If a network of individuals is created to address inaccurate and misleading information, it would need to have a structure that provides credibility and stability, and be agile and fluid in times of crisis.
Network as a Consortium (of Science Societies and Institutions)
Different from the collective knowledge of individuals, the connectional knowledge of a consortium harnesses the value of relationships and networks, thus connecting organizations or networks (Dhawan and Joni 2015). The value of consortia is that the membership consists of organizations (e.g., universities) or associations,
which already have stable membership of individuals and trust within their communities. A particularly effective example of a formal network of organizations is the Societies Consortium on Sexual Harassment in STEMM, based on recommendations by the U.S. National Academies (NASEM 2018). The mission of this Societies Consortium is to “support academic and professional disciplinary societies in fulfilling their mission-driven roles as standard bearers and standard setters for excellence in science, technology, engineering, mathematics, and medical (STEMM) fields, addressing sexual harassment in all its forms and intersectionalities.” As an organization, the consortium sought to create a strong and influential, collective voice, “backed by action, of a diverse and large group of committed societies (of many sizes and areas of focus) to set standards of excellence in STEMM fields.” In short, the Societies Consortium was created to bring together scientific networks to address a challenge in science and society, and to help create and provide resources and tools for researchers to increase ethical behavior in science, specifically on sexual harassment (Societies Consortium on Sexual Harassment in STEMM n.d.).
The benefit of a consortium of societies (or institutions) is the strength of a collective and unified voice that can more easily and rapidly unite thousands of scientists. Member societies can get resources and information out rapidly if needed to address immediate, external issues. The focus of the consortium can center on creating useful resources and leadership guidance for addressing issues, setting standards for ethics in science, or providing other common resources, rather than evolving more slowly as a society of individual members in specific scientific disciplines does. Another benefit of a consortium is the potential sustainability of the network. By creating resources and tools for societies that can then be passed down to individual, smaller network members, a consortium creates value for all through shared costs and the non-duplication of efforts. Finally, consortia may provide institutional protection to members who contribute to addressing inaccurate information, helping to protect individuals’ privacy and security on particularly sensitive topics.
The limitation of a consortium is that individuals can be lost in the structure. Although the internal personnel may be structured similarly to an association or institution, with an executive board and leadership group, the individual voices of members might be more difficult to hear because membership in the consortium is held through a smaller number of representatives of other networks and/or institutions. This structure lends itself to a more singular leadership role or source of tools, rather than a society made up of individual members. Although the benefits of a consortium trickle down to individual society members, the resources and tools developed are vetted by the consortium leadership and developed to address the specific challenge.
Hybrid of Consortium and Individual Models
The committee also considered a hybrid model in which a network of societies (e.g., consortium) also invites individual membership. Such a model would have many of the same benefits as a consortium. An additional benefit
is that individuals not represented by a society or institution are able to participate. However, the institutional verification and validation of members would be diminished in the hybrid model. Although connectivity is of central importance to connective knowledge, connectivity also can come from individual champions or influencers. Rarely will a scientist belong to neither an association nor a research institution, so the valuable resources of a consortium would be available for their use. Some scientists and fields are affected more by mis- and disinformation than others and could use the support and resources directly from the source.
Another advantage of a hybrid model is that it could take advantage of existing social ties among scientists participating in the same societies or with shared institutional affiliation. Membership in and influence of such a hybrid network would thus spread over the existing “social network” of institutional affiliations, taking full advantage of indirect and direct network effects of this underlying network, in a way reminiscent of how social networks like Facebook took advantage of existing ties among college students in its early growth (Sundararajan 2006).
Existing regional networks in Southeast Asia already may have resources to support the translation of scientific information into multiple local languages, communication and interaction among scientists from different countries, access to information and expertise in challenging environments, and planning around important cultural traditions and events. Numerous local languages are used by scientists throughout Southeast Asia. For example, Malaysia is a multiracial country with many languages and dialects (e.g., Mandarin, Malay, Tamil, Bajau, Iban, Cantonese, and Hakka). Building trust and open lines of communication with scientists and local communities requires the use of local languages for communicating science in crises.
As with many regions around the world, each Southeast Asian country has different laws governing scientific activities and data sharing and access, which are helpful to know when determining whether and how to counter inaccurate information. At the heart of some of these policies are different perspectives on trust, respect, and reciprocal benefit among regional countries, regional scientists, and international scientists (Merson et al. 2015). In 2019, Indonesia strengthened its laws governing access to its biodiversity samples by international scientists, nearly banning export of physical or digital information about samples without a material transfer agreement and increasing the legal penalties for violating these requirements (Rochmyaningsih 2019). Simultaneously, Indonesia began supporting open access to public health articles, an action that accelerated during the COVID-19 pandemic (IAKMI n.d.; Universitas Airlangga n.d.). In Malaysia, legislation related to conservation of biological diversity is mainly sector-based and often is implemented by federal or state level entities (sometimes, implementation is shared by federal and state entities) (MyBIS 2015). Malaysia supports open access to scientific data and information, primarily to reap the benefits of research (MOSP n.d.). Since 1992, Thailand has implemented relevant policies and measures to comply with the Convention on Biological Diversity (CBD)
(Anti-Fake News Center Thailand n.d.; Jalli 2020; Office of Environmental Policy and Planning 2000; Schuldt 2021; Tanasugarn et al. 2000). Vietnam has implemented its Law on Biodiversity and strengthened its legal structures for the management of access to genetic resources and sharing of benefits arising from their use to meet its obligations under the CBD and associated Nagoya Protocol on Access to Genetic Resources and Fair and Equitable Sharing of the Benefits Arising from their Utilization (Socialist Republic of Vietnam 2008, 2017, 2019, 2020a,b). Several countries, including Vietnam, have developed open or limited access to digital sequence information (DSI) to promote scientific pursuits, while the countries party to the CBD are exploring whether DSI should be included in the Nagoya Protocol (Strategic Framework for Capacity-building 2017).
Several Southeast Asian countries also have established laws and partnerships to address misinformation that emerged during the COVID-19 pandemic (Jalli 2020). The Indonesian government works with fact-checking organizations to address misinformation, and the Indonesian Anti-Defamation Society (Masyarakat Anti Fitnah Indonesia) was created before the pandemic to counter hoaxes and other false information (MAFINDO n.d.). The Malaysian government created a platform to cross-check information spread through social media, and Malaysia’s Ministry of Health (2021) launched a new website, COVIDNOW, for more transparent COVID-19 data sharing with the public to increase public trust. Thailand created an Anti-Fake News Center and Singapore created Factually, both of which are designed to correct false information spread through the media (Anti-Fake News Center Thailand 2021; Schuldt 2021).
Finding 6: Many countries in Southeast Asia have established programs to address false claims, but few academic articles have been published to characterize the scale, scope, and enablers of mis- and disinformation in the region. In addition, studies evaluating effective measures to counter false claims in Southeast Asia have not been published.
Recommendation 7: The executive team of the network should develop resources for scientists to understand country-specific laws governing data sharing, scientific research, and combating misinformation. Furthermore, information describing similarities and differences in trust among scientists and with other stakeholders, respect among scientists, and mutual benefit of scientific activities should be provided to network members to enable productive and positive collaboration.
Recommendation 8: The governance body and executive team of the proposed network should begin building trusted connections with country-specific programs aimed at combating misinformation and in local languages to assist in identifying and communicating corrective information.
Understanding the flow of information is critical to achieving the vision and mission of the proposed network. With the emergence and widespread use of social media platforms, knowledge and understanding about how information flows and gains acceptance within networks have increased. The idea of opinion leaders (also called “influentials”) has been around since the 1950s (Katz and Lazarsfeld 1955; Keller and Berry 2003; Lazarsfeld et al. 1948). The assumption behind the idea is that information rarely flows directly from a source to a receiver; instead, it is passed along by opinion leaders who are connected closely to an information source in the network and then pass along that information to a wider group of audiences in their respective networks (Shah and Scheufele 2006). With the emergence of online social networks, the concept has shifted both the focus and terminology toward the idea of the “influencer.” The concept of the “influencer” is grounded in the social influence network theory, which is a “process of interpersonal influence that occurs in groups, affects persons’ attitudes and opinions on issues, and produces interpersonal agreement, including group consensus, from an initial state of disagreement” (NRC 2003).
In 2003, this theory was evaluated by modeling change in consensus opinion for a particular policy question, and the results may be relevant to understanding how influencing systems may be manipulated to correct inaccurate information. These results include the following: (1) the structure of the influence network can be strategically designed to build consensus about a particular issue efficiently; (2) groupthink may occur if consensus occurs too quickly and a diversity of views and data are not considered; (3) direct communication by the authoritative source, communication by a close representative of the authoritative source, or parallel influence networks communicating the same message by their authoritative sources can help to sway consensus toward the views of the source; (4) groups may have more robust influence on outcomes than individuals, and interpersonal interactions among the group can further enhance the robustness of the influence outcomes because they are less sensitive to changes in personal opinion; and (5) involvement in the influencer group can result in modifications of the influence structure and outcomes (NRC 2003).
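Social influence network theory has a standard mathematical formulation, the Friedkin–Johnsen opinion dynamics model, that can illustrate the third result above: an authoritative source that anchors to its own opinion pulls the group toward its view. The sketch below is only illustrative; the weights, susceptibilities, and initial opinions are invented for the example and are not the specific model evaluated in NRC (2003).

```python
import numpy as np

def simulate_consensus(W, susceptibility, x0, steps=100):
    """Friedkin-Johnsen opinion dynamics (illustrative sketch).

    Each step: x(t+1) = A @ W @ x(t) + (I - A) @ x0, where W is a
    row-stochastic influence network and A is a diagonal matrix of
    each actor's susceptibility to interpersonal influence.
    """
    W = np.asarray(W, dtype=float)
    x0 = np.asarray(x0, dtype=float)
    A = np.diag(susceptibility)
    I = np.eye(len(x0))
    x = x0.copy()
    for _ in range(steps):
        x = A @ W @ x + (I - A) @ x0
    return x

# Hypothetical 3-actor network: actor 0 is an authoritative source that
# weighs only its own opinion and is not susceptible to others.
W = [[1.0, 0.0, 0.0],
     [0.5, 0.3, 0.2],
     [0.5, 0.2, 0.3]]
susceptibility = [0.0, 0.9, 0.9]  # actor 0 anchors to its initial opinion
x0 = [1.0, 0.0, 0.0]              # actors 1 and 2 initially disagree with actor 0

print(simulate_consensus(W, susceptibility, x0))
```

With these (invented) parameters, actor 0 never moves from its initial opinion, while actors 1 and 2, who start in full disagreement, converge most of the way toward the source's view, illustrating how an anchored authoritative source can sway the consensus outcome.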
Finding 7: Although studies analyzing social media platforms for patterns of influence and ways to minimize bias are being conducted, similar research does not exist for understanding the flow of scientific information generated and communicated through different sources and platforms, and about the influence of different types of information among scientists, members of the public, journalists, and policymakers. With the current volume and flow of information, these studies would provide the scientific foundation for strategic design of influence systems pushing corrective information (i.e., accurate, authoritative, and evidence-supported science) to various audiences.
Society’s access to abundant sources of information, emerging technology, and diverging opinions has, in turn, incentivized misinformation and mistrust among various publics and across national boundaries (Scheufele et al. 2021b).
The creation and spread of misinformation are motivated by financial and ideological gains (Tandoc et al. 2018). On the one hand, these motivations can convert clicks to profits. On the other hand, deliberate effort to create and circulate these claims can promote certain beliefs, viewpoints, or stances, resulting in enhanced power or reputation. Misinformation can originate from many different sources such as media, vested interests, politicians, rumors, and fiction (Lewandowsky et al. 2012). A study tracking 126,000 rumors diffused by more than 3 million Twitter users found that misinformation claims were disseminated significantly farther, faster, deeper, and more broadly than the truth (Vosoughi et al. 2018). Scholars have examined factors that contribute to the spread of misinformation including through social media, advances in technologies, and limited bandwidth for information. Media platforms, especially social media platforms, allow misinformation to thrive (Mourão and Robertson 2019). Similarly, technological advancement has made manipulation of the truth with misinformation easier (Tandoc 2019). Furthermore, information overload leaves individuals with insufficient knowledge and motivation to carefully process information (Pennycook and Rand 2019; Tandoc et al. 2018).
The willingness to accept misinformation also is driven by several factors. People weigh information that reflects and reinforces their prior beliefs more heavily than information that contradicts those beliefs, which is referred to as motivated reasoning (Kunda 1990). This confirmation bias may explain why an audience for misinformation exists (Lazer et al. 2018). Social factors also have created an environment conducive to the emergence of misinformation. For example, social tumult and divisions caused by partisanship, populism, and distrust make people more likely to fall for information that confirms their enmity toward another group (Vargo et al. 2018).
Connections Between Verifiable Scientific Information and Policy
Misinformation often connects verifiable information with democratically determined policy stances (ALLEA 2021; Scheufele and Krause 2019). Statements that are not factual information but rather a debatable policy stance cannot be corrected using scientific information. However, statements that are inaccurate can be addressed through scientific analysis or experimentation or corrected using publicly available, defensible scientific information.
Selective and Tailored Access to Information
A fundamental transformation is occurring in communication via various media platforms that drives information acquisition toward personal preference. Media platforms exploit psychological and sociological vulnerabilities among users and their communication networks to algorithmically deliver content that maximizes user engagement in ways that are often agnostic to truth or consistency with the best available science (Brossard and Scheufele 2022; Cacciatore et al. 2016). Scientific papers relating to the COVID-19 pandemic have been brought into the public eye in an unprecedented way through social media, which raises the bar for scientists to communicate their work clearly, concisely, accurately, and accessibly to any audience, including lay audiences.
In addition, both legacy and new media continue to play very influential roles in translating highly complex science into 300 words or even a 30-second segment for the general public. This practice can lead to a loss of information through distillation, and the information presented may not be accurate. Scientists with the appropriate domain expertise and/or with scientific or media communication training may not be on these channels to correct any inaccurate or misleading information presented in these clips or texts.
Figure 2 illustrates critical questions that scientists and other stakeholders can consider when exploring ways of communicating, building trust, and engaging with each other, policymakers, journalists, lay or religious leaders, or other members of the broader public.
The U.S. National Academies have published several reports and guides on science communication in specific fields and on the science of science communication (see Box 3). A brief description of models for science communication and developing counter-messages is provided in this section.
Science Communication Models
Although several science communication models exist, one model, the “deficit model,” remains popular for communicating technical concepts despite having been demonstrated as ineffective by communications scholars. The knowledge deficit model of science communication assumes that lack of societal support for science and technology-related issues is based on ignorance (Simis et al. 2016). This model has several limitations: (1) science is evolving, so scientific facts can be interpreted in more than one way by different audiences; (2) scientific information often is not directly communicated by scientists but rather through a mediator such as news media; (3) presentation of only science does not provide all of the relevant information to inform what actions need to be taken and when to take them; and (4) the needs of different audiences are not taken into account (NASEM 2017a).
Scientists and science communicators need to stop using the knowledge deficit model in favor of other science communication models (NASEM 2017b). Few, if any, studies have been conducted associating communication models with specific scientific controversies. Despite this challenge, broader principles of science communication apply. Most critically, information most likely will resonate with specific audiences if it is described or framed in a way that meaningfully connects the content with audiences’ existing cognitive schemas (Price and Tewksbury 1997). People generally are prone to absorbing information that supports their established personal beliefs, values, religious views, or ideologies, an effect known as “biased assimilation” that often leads to “motivated reasoning” and may even result in self-deception (Liang et al. 2015; Scheufele 2014). One application of meaningfully connecting with people’s cognitive schemas is the Narrative Policy Framework, which draws on the knowledge that individuals’ decisions are influenced by a variety of factors
beyond scientific information alone including past experiences, values, beliefs, and other associations (von Winterfeldt 2013). All models will have their merits and limitations, so evaluating the circumstance in which scientific information is being communicated and the audiences to which information is communicated is critical for determining which models are going to be most effective.
Recommendation 9: When communicating complex, technical information, scientists and science communicators should use communication models that meaningfully connect new information to audiences’ existing cognitive schemas or mental models.
Creating Counter-Misinformation Messages
- Expertise in a variety of disciplines is needed to address misinformation.
- Relatable language and context can be used to communicate and connect with diverse audience(s).
- Developing these connections, demonstrating empathy, actively listening, demonstrating integrity, and avoiding hype can help build trust with various audiences, which is critical for countering misinformation where scientific opinion is shared prolifically and not necessarily consistent.
- Scientists need to listen and be attentive to their audience’s responses and underlying views when developing corrective messages. In addition, learning about audience perspectives, context, and values also helps to develop messages and use language and images that resonate with and are tailored to specific audiences.
- When correcting inaccuracies, scientists need to provide the corrective messages clearly and with audience-appropriate context, information, and language at the start of any communication.
During emerging infectious disease outbreaks or other incidents involving biological threats, the scientific foundation may be inconclusive or in flux because of a lack of available data, limited availability of relevant and verifiable scientific literature applicable to the particular events, the newness of the events, and/or new information being produced as events unfold over time. These uncertainties present significant challenges to individual and national-level policymaking. Policy decisions may require scientific input, regardless of how settled a particular area of science might be at the time (Scheufele et al. 2020), and scientists might be faced with the dilemma of wanting to combat claims that they believe to be incorrect with science that itself might turn out to be incorrect or evolving (Krause et al. 2022). Uncertainty is described in greater detail later in this report.
Media and Public Engagement Training
Basic training in media and public engagement may include the following: (1) approaches for implementing public engagement activities, (2) potential pitfalls in conducting public engagement, (3) approaches for resolving these issues, (4) public speaking, (5) science communication writing (e.g., editorials and newspaper articles), (6) use of new media platforms, and (7) media landscape and approaches for collaborating with media practitioners. This training may include guidance on navigating the political landscape while correcting misinformation, which can help to address any concerns or fears of political repercussions when engaging various audiences. Scientists may avoid public engagement if they are concerned about potential detrimental effects on their reputations and/or careers (Ho et al. 2020a,b). To limit these consequences, institutions can develop and provide clear rules and guidelines to help scientists navigate the intricacies and sensitivities in their cultural and political contexts. Furthermore, institutions can support and provide recognition to scientists who seek to address inaccurate and misleading claims (Ho et al. 2020c).
Several questions about whether and how scientists should counter mis- and disinformation are explored below.
Should Scientists Pre- or Debunk Incorrect Claims?
The available evidence base for answering this question is weak as prebunking (i.e., anticipating and proactively addressing misinformation) and debunking (i.e., reactively addressing misinformation) inaccurate claims is a fairly new field of research and relies mostly on experimental findings with limited validity. Furthermore, the potential long-term benefits of pre- and debunking inaccurate claims with audiences beyond the captive experimental audiences are poorly understood (Hovland 1959). This limited scholarship does not diminish the value and importance of collective efforts such as fact-checking. Scaling up efforts to “fact-check” individual claims found on online platforms is extremely challenging, but several groups have begun to research and develop means of achieving this (NASEM 2021). With the inclusion of domain experts, “fact-checkers” may help to identify credible sources of information that provide reliable, accurate scientific information. Taking advantage of algorithmically curated timelines and delivery mechanisms requires collaboration among fact-checkers, scientists, and social media platforms to “offer a unique opportunity to share high-quality health information with a broad audience” (Kington et al. 2021).
Correcting inaccurate or misleading information before it has a chance to spread via online platforms, social media, or mainstream media can be an effective strategy for preventing misinformation (Sedova 2021). However, scientists need to be aware of the role of people’s worldview on the persistence of misinformation. Providing correct information a little at a time and offering alternative narratives to the misinformation that affirm the views of the audience could help to address these concerns (Brann n.d.; Chinn and Hart 2021a,b; Cook et al. 2015).
How Does Information Connect to Values?
Information is not consumed in a vacuum. The exchange of credible scientific information will be particularly effective if it is presented in ways that meaningfully connect to values and belief systems that are important to different audiences. This means that messages should be framed strategically and informed by the audience in question. Framing of messages refers to the idea that the same information will evoke very different attitudes or cognitive responses within different audiences depending on whether it is presented in ways that resonate with particular preexisting cognitive schemas (Scheufele 2014).
One problem associated with framing is the co-optation or selective use of scientific information and expertise by partisan groups for political purposes. During the 2009 H1N1 pandemic and current COVID-19 pandemic, policy
decisions, such as closure of schools and/or businesses (or lockdowns) and other practices (e.g., mask and quarantine mandates), divided support among the broader public, often along partisan lines (Cauchemez et al. 2014; CIDRAP 2009a,b; Davis et al. 2015; Harvard T.H. Chan School of Public Health 2009; Krupa 2009; McNeil 2009; Pallarito 2009; Toffelmire n.d.). Science derives its cultural authority as society’s creator and curator of knowledge from the perception that it operates systematically, objectively, and free from partisan politics (Brossard and Nisbet 2007). Therefore, individual scientists, who may be domain experts in relevant fields, may choose to work with science policy experts to translate existing and objective scientific knowledge to inform policy and practice. However, endorsing particular policies without all available information in hand may create new, unanticipated problems.
Should and How Should Scientists Engage the Public?
Engaging various public and policy audiences is a two-way street, with scientists understanding the key questions these audiences have before providing them defensible, accurate, and objective information, and with these audiences supporting the generation of new knowledge or updating existing scientific knowledge. To do this effectively, collaboration is needed among domain experts, including life, computer, and social scientists with expertise in public engagement and science communication, to monitor public opinion(s) and media content and to conduct sentiment analysis within various communities that are invested in or are working to address the issue at hand.
Another consideration when engaging members of the public is that individuals who trust science may be more likely to believe and share inaccurate claims containing scientific content than individuals who do not trust science (O’Brien et al. 2021). Individuals who trust science implicitly believe claims that are made by scientists or that include scientific information. Similarly, individuals who are primed to critically evaluate information are less likely to believe a particular claim, regardless of whether scientific information is referenced, than individuals who are primed to trust the science. These findings suggest that public engagement strategies focusing on encouraging critical evaluation of information may be more effective at reducing belief in misinformation than strategies encouraging a general trust in science.
How Can Uncertainties Be Captured and Communicated Transparently?
Uncertainty is a particularly challenging topic to address, especially when very little information is available or situations are still unfolding. In these situations, the “best available science at the moment” may need to be considered in order to avoid long-term damage to science as a trusted institution (Scheufele et al. 2020, 2021a). The COVID-19 pandemic illustrated the difficulty in navigating uncertainty within the constantly changing global and national policy, public health, and pandemic-relevant scientific landscape (Marcus and Oransky 2020).
Within this context, poor communication of what actually is known scientifically and what still remains unknown creates significant challenges, from scientists overstating preexisting knowledge of response measures used for other infectious diseases (e.g., effectiveness of mask use, particularly masks made of different materials, and distance of airborne spread of viral particles) to journalists using results from single studies, including studies that were disseminated through preprint servers (Bok et al. 2021; Escandón et al. 2021; Hornik et al. 2021; West and Bergstrom 2021).
Mistakes or miscalculations in the published scientific literature can enhance these complexities further. For example, the Chief of Thai Traditional and Alternative Medicine self-retracted a study titled “Efficacy and safety of Andrographis paniculata extract in patients with mild COVID-19: A randomized controlled trial,” which was uploaded to the online preprint server medRxiv in July 2021, because of a statistical miscalculation, which led the authors to conclude the study’s hypothesis was more valid than it actually was (Wanaratna et al. 2021).
Scientific information and conclusions rarely are absolutely certain at any given time, and scientific knowledge always is open to revision. Expressing the degree of certainty around scientific knowledge, conclusions, and scientifically informed recommendations is necessary but complex for several reasons.
Degrees of certainty exist around scientific conclusions and degrees of certainty exist around scientific recommendations for practical decisions, and these types of certainty are different (Fischhoff and Davis 2014; The Royal Society 2022). Royall (2000) offers a tripartite distinction of certainty about what is known, certainty about what to believe, and certainty about what to do. With respect to scientific conclusion-making, standards of evidence do (or should) not logically change as a function of the needs of individuals, organizations, or humanity at large or the difficulty of doing studies (Travis 2006; Vernooij et al. 2021). In contrast, the degree of certainty needed for prudent decision-making and action-taking that are informed by science may indeed change as a function of the circumstances, and the degree of evidence needed in such situations is itself an issue that cannot be decided by science alone (Richardson et al. 2017; The Royal Society 2022).
Describing the uncertainty around scientific information, conclusions, or scientifically informed decisions is difficult because uncertainty is not a simple dichotomy of “yes, things are certain” or “no, things are not certain”; uncertainty exists in degrees (Weiss and LaPorte 2018). Only in a small minority of cases can the degree of scientific certainty or uncertainty be quantified objectively. In most cases, only qualitative statements can be made about the degree of certainty and the factors influencing it (NRC 1996). Previous work has examined efforts to quantify certainty and the inherent subjectivity such efforts involve (Fischhoff and Davis 2014; van der Bles et al. 2019).
Quantifying or describing uncertainty is also challenging because many different types of uncertainty exist (Phillips and LaPole 2003). One type that can be well quantified, at least under some circumstances, is stochastic uncertainty arising from sampling variance, measurement error, and related sources. These sources usually are quantified with statistics from the “frequentist” tradition, including confidence intervals, standard errors, prediction intervals, and p-values (Lele 2020). Collectively, such statistics convey the precision of estimates of quantities such as associations among variables, causal effects of one variable on another, or simple descriptive features of a population, such as the average height of its adults. By contrast, the extent to which the results of a study may generalize to other circumstances, other populations, or other periods of time cannot be known a priori from theory; it can only be estimated empirically once the new situation is observed, or projected beforehand from other reference cases (Amrhein et al. 2019; Kukull and Ganguli 2012). In some cases, reasonably sound evidence for such generalization will be available.
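To make the frequentist measures above concrete, a minimal sketch follows, using entirely hypothetical survey data: it computes a sample mean, its standard error, and an approximate 95% confidence interval, the kind of quantified stochastic uncertainty the paragraph describes.

```python
import math
import statistics

# Hypothetical sample: measured adult heights (cm) from a small survey.
heights = [162.0, 171.5, 168.2, 175.0, 159.8, 166.4, 170.1, 173.3]

n = len(heights)
mean = statistics.mean(heights)

# Standard error of the mean: sample standard deviation / sqrt(n).
sem = statistics.stdev(heights) / math.sqrt(n)

# Approximate 95% confidence interval using the t critical value
# for n - 1 = 7 degrees of freedom (t ≈ 2.365).
t_crit = 2.365
ci_low, ci_high = mean - t_crit * sem, mean + t_crit * sem

print(f"mean = {mean:.1f} cm, SE = {sem:.2f}")
print(f"95% CI: ({ci_low:.1f}, {ci_high:.1f}) cm")
```

Note that the interval quantifies only sampling variability; it says nothing about whether these eight respondents generalize to any other population, which is exactly the distinction drawn above.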
Inaccurate and misleading information during infectious disease outbreaks has provided opportunities for the creation and perpetuation of mis- and disinformation. Mis- and disinformation campaigns during infectious disease outbreaks have presented, and continue to present, challenges to effective outbreak control, including distrust among affected populations in response activities and questions among security experts about the true origins of outbreaks. Effectively addressing inaccurate information about infectious diseases and other biological threats involves a multidisciplinary, multi-sector network of scientists. The committee recommended a distributed network of reputable, high-quality scientific organizations and individual scientists drawn from a diversity of life, social, and computer sciences who can be activated to address particularly problematic claims involving biological threats. Member expertise, availability of resources, the ability to protect members, sustainability, credibility and authoritative knowledge, inclusivity, and collaborative, cross-disciplinary approaches were identified as critical characteristics of a successful, effective network aimed at enhancing scientific knowledge about emerging infectious diseases and other biological threats. A visualization of the online platforms through which scientists continue to engage, collaborate, share data and information, crowdsource scientific analysis, and peer review scientific information is available online.4
Effectively addressing inaccurate and misleading information involves a careful examination of several factors: (1) the potential that the information, if spread, would harm societal systems such as public health and national security; (2) the likelihood that the claims can be addressed through science; (3) the existence of data and/or scientific evidence to address the claim in an authoritative and defensible manner; and (4) the potential for amplification of the claim rather than its correction. Claims that may result in significant harms (e.g., presenting significant barriers to public health response during an epidemic, or accusations of deliberate malicious action) and that are caused or perpetuated by inaccurate or misleading information could be addressed through the development and/or communication of accurate, evidence-supported science. If insufficient data exist or the knowledge base is weak, scientists may choose either not to address the inaccurate information or to undertake new, peer-reviewed analysis to produce the knowledge needed to correct the inaccuracies. If scientists choose to address the claims, considerations about the spread of the information, the uncertainty associated with the corrective message, the methods and primary audience for communication, and public engagement will determine their overall effectiveness at countering inaccuracies. This collective information has been translated into a how-to guide5 for scientists to determine whether and how to address particular inaccurate information and broader misinformation claims, and to whom and how to communicate corrective actions.
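The four factors amount to a triage checklist for scientists deciding whether to respond. A minimal sketch of one way such a screen might be encoded is shown below; the `Claim` record, the `triage` function, and the coarse yes/no treatment of each factor are all hypothetical simplifications, not part of the report's guide.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    """Hypothetical record of a circulating claim about a biological threat."""
    harm_potential: bool       # (1) could spreading it harm public health/security?
    science_addressable: bool  # (2) can the claim be tested scientifically?
    evidence_exists: bool      # (3) is defensible evidence already available?
    amplification_risk: bool   # (4) would responding likely amplify the claim?

def triage(claim: Claim) -> str:
    """Suggest a course of action loosely following the four factors above."""
    if not claim.harm_potential or not claim.science_addressable:
        return "monitor"  # low harm, or not a question science can settle
    if claim.amplification_risk:
        return "monitor"  # a correction might spread the claim further
    if claim.evidence_exists:
        return "communicate correction"  # respond with existing evidence
    return "generate new evidence"  # undertake new peer-reviewed analysis
```

For example, a harmful, scientifically testable claim for which no sound evidence yet exists, `triage(Claim(True, True, False, False))`, falls into the "generate new evidence" branch, mirroring the report's option of undertaking new analysis before responding.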
Abdill, R.J., and R. Blekhman. 2019. Tracking the Popularity and Outcomes of All bioRxiv Preprints. eLife 8. https://doi.org/10.7554/eLife.45133.
Adelayanti, N. 2020. Eucalyptus Has Not Yet Proven to Kill Coronavirus. https://www.ugm.ac.id/en/news/19703-eucalyptus-has-not-yet-proven-to-kill-coronavirus.
ALLEA (All European Academies). 2021. Fact or Fake?: Tackling Science Disinformation. https://allea.org/portfolio-item/fact-or-fake.
Allen Institute for AI. 2021. COVID-19 Open Research Dataset Challenge (CORD-19). Original Edition. Edited by Kaggle.
Amrhein, V., D. Trafimow, and S. Greenland. 2019. Inferential Statistics as Descriptive Statistics: There Is No Replication Crisis If We Don’t Expect Replication. The American Statistician 73(Suppl 1):262–270. https://doi.org/10.1080/00031305.2018.1543137.
Anti-Fake News Center Thailand. n.d. Anti-Fake News Center Thailand. https://www.antifakenewscenter.com.
ASEAN-YSN (Association of Southeast Asian Nations-Young Scientists Network). n.d. ASEAN Responsible Conduct of Research. ASEAN Young Scientists Network. https://www.aseanrcr.com.
Basu, R. 2021. Using Statistics to Aid in the Fight Against Misinformation. https://www.american.edu/media/news/20211202-statistics-and-misinformation.cfm.
Beauchamp, T.L., and J.F. Childress. 2001. Principles of Biomedical Ethics, 5th Edition. Vol. 28. New York: Oxford University Press.
Begley, A. 2021. AI Method Reveals 17 Existing Drugs That May Fight Against COVID-19. Drug Target Review, August 24, 2021. https://www.drugtargetreview.com/news/96015/ai-method-reveals-17-existing-drugs-that-may-fight-against-covid-19.
Bok, S., D.E. Martin, E. Acosta, M. Lee, and J. Shum. 2021. Validation of the COVID-19 Transmission Misinformation Scale and Conditional Indirect Negative Effects on Wearing a Mask in Public. International Journal of Environmental Research and Public Health 18(21):11319. https://doi.org/10.3390/ijerph182111319.
Brann, M. n.d. Review: How to Correct Misinformation, According to Science. Indiana Clinical and Translational Sciences Institute. https://indianactsi.org/review-how-to-correct-misinformation-according-to-science.
Bridgman, A., E. Merkley, O. Zhilin, P.J. Loewen, T. Owen, and D. Ruths. 2021. Infodemic Pathways: Evaluating the Role That Traditional and Social Media Play in Cross-National Information Transfer. Frontiers in Political Science 3(20). https://doi.org/10.3389/fpos.2021.648646.
Brossard, D., and M.C. Nisbet. 2007. Deference to Scientific Authority Among a Low Information Public: Understanding U.S. Opinion on Agricultural Biotechnology. International Journal of Public Opinion Research 19(1):24–52. https://doi.org/10.1093/ijpor/edl003.
Brossard, D., and D.A. Scheufele. 2022. The Chronic Growing Pains of Communicating Science Online. Science 375(6581):613–614. https://doi.org/10.1126/science.abo0668.
Budd, J., B.S. Miller, E.M. Manning, V. Lampos, M. Zhuang, M. Edelstein, G. Rees, V.C. Emery, M.M. Stevens, N. Keegan, M.J. Short, D. Pillay, E. Manley, I.J. Cox, D. Heymann, A.M. Johnson, and R.A. McKendry. 2020. Digital Technologies in the Public-Health Response to COVID-19. Nature Medicine 26(8):1183–1192. https://doi.org/10.1038/s41591-020-1011-4.
Cacciatore, M.A., D.A. Scheufele, and S. Iyengar. 2016. The End of Framing as We Know It … and the Future of Media Effects. Mass Communication and Society 19(1):7–23. https://doi.org/10.1080/15205436.2015.1068811.
Cauchemez, S., M.D.V. Kerkhove, B.N. Archer, M. Cetron, B.J. Cowling, P. Grove, D. Hunt, M. Kojouharova, P. Kon, K. Ungchusak, H. Oshitani, A. Pugliese, C. Rizzo, G. Saour, T. Sunagawa, A. Uzicanin, C. Wachtel, I. Weisfuse, H. Yu, and A. Nicoll. 2014. School Closures During the 2009 Influenza Pandemic: National and Local Experiences. BMC Infectious Diseases 14(1):207. https://doi.org/10.1186/1471-2334-14-207.
Chau, D-M. 2020. Young Scientists in Malaysia Have Made Integrity Training Fun and Relevant. Nature. https://www.nature.com/articles/d41586-020-03082-x.
Chinn, S., and P. Sol Hart. 2021a. Climate Change Consensus Messages Cause Reactance. Environmental Communication 1–9. https://doi.org/10.1080/17524032.2021.1910530.
Chinn, S., and P. Sol Hart. 2021b. Effects of Consensus Messages and Political Ideology on Climate Change Attitudes: Inconsistent Findings and the Effect of a Pretest. Climatic Change 167(3):47. https://doi.org/10.1007/s10584-021-03200-2.
Chulalongkorn University. 2021. https://www.mhesi.go.th/index.php/en/content_page/item/3627-155642.html.
CIDRAP (Center for Infectious Disease Research and Policy). 2009a. H1N1 Flu Breaking News: Business Flu Concerns, Costly School Closings, Vaccine Shipments, Virus Mutation, Testing for Flu. https://www.cidrap.umn.edu/news-perspective/2009/09/h1n1-flu-breaking-news-business-flu-concerns-costly-school-closings-vaccine.
CIDRAP. 2009b. H1N1 Flu Breaking News: Global Tally Tops 50K, Novel H1N1 Preponderance, Quarantine Risk in China, Young as Vaccine Priority, Grant to Develop Flu Drug. University of Minnesota. https://www.cidrap.umn.edu/news-perspective/2009/06/h1n1-flu-breaking-news-global-tally-tops-50k-novel-h1n1-preponderance.
Cinelli, M., W. Quattrociocchi, A. Galeazzi, C.M. Valensise, E. Brugnoli, A.L. Schmidt, P. Zola, F. Zollo, and A. Scala. 2020. The COVID-19 Social Media Infodemic. Scientific Reports 10(1):16598. https://doi.org/10.1038/s41598-020-73510-5.
Clements, J.D., N.D. Connell, C. Dirks, M. El-Faham, A. Hay, E. Heitman, J.H. Stith, E.C. Bond, R.R. Colwell, L. Anestidou, J.L. Husbands, and J.B. Labov. 2017. Engaging Actively with Issues in the Responsible Conduct of Science: Lessons from International Efforts Are Relevant for Undergraduate Education in the United States. CBE—Life Sciences Education 12(4):596–603. https://doi.org/10.1187/cbe.13-09-0184.
Cook, J., U. Ecker, and S. Lewandowsky. 2015. Misinformation and How to Correct It. In Emerging Trends in the Social and Behavioral Sciences, pp. 1–17. Hoboken, NJ: John Wiley & Sons, Inc.
Dhawan, E., and S-N. Joni. 2015. Get Big Things Done: The Power of Connectional Intelligence. New York: St. Martin’s Press.
Davis, B.M., H. Markel, A. Navarro, E. Wells, A.S. Monto, and A.E. Aiello. 2015. The Effect of Reactive School Closure on Community Influenza-Like Illness Counts in the State of Michigan During the 2009 H1N1 Pandemic. Clinical Infectious Diseases 60(12):e90–e97. https://doi.org/10.1093/cid/civ182.
Dubow, B., E. Lucas, and J. Morris. 2021. Jabbed in the Back: Mapping Russian and Chinese Information Operations During COVID-19. Center for European Policy Analysis. https://cepa.org/jabbed-in-the-back-mapping-russian-and-chinese-information-operations-during-covid-19.
EC (European Commission). n.d. Horizon 2020: Responsible Research & Innovation. European Union. https://ec.europa.eu/programmes/horizon2020/en/h2020-section/responsible-research-innovation.
Elsevier. n.d. India COVID-19 Healthcare Hub. Elsevier Healthcare Hub. https://elsevier.health/en-US/india/home.
Escandón, K., A.L. Rasmussen, I.I. Bogoch, E.J. Murray, K. Escandón, S.V. Popescu, and J. Kindrachuk. 2021. COVID-19 False Dichotomies and a Comprehensive Review of the Evidence Regarding Public Health, COVID-19 Symptomatology, SARS-CoV-2 Transmission, Mask Wearing, and Reinfection. BMC Infectious Diseases 21(1):710.
Fidler, D.P. 2019. Disinformation and Disease: Social Media and the Ebola Epidemic in the Democratic Republic of the Congo. Council on Foreign Relations. https://www.cfr.org/blog/disinformation-and-disease-social-media-and-ebola-epidemic-democratic-republic-congo.
Fischhoff, B., and A.L. Davis. 2014. Communicating Scientific Uncertainty. Proceedings of the National Academy of Sciences 111(4):13664–13671. https://doi.org/10.1073/pnas.1317504111.
Fitrianingrum, N. 2021. Dual Pandemics: COVID-19 and Disinformation in Indonesia—A Conversation with GGF 2035 Fellow Nurma Fitrianingrum. Global Policy Opinion (blog), Durham University. https://www.globalpolicyjournal.com/blog/15/06/2021/dual-pandemics-covid-19-and-disinformation-indonesia-conversation-ggf-2035-fellow.
Foley, C., E. Harvey, S.A. Bidol, T. Henderson, R. Njord, T. DeSalvo, T. Haupt, A. Mba-Jones, C. Bailey, C. Bopp, S.A. Bosch, P. Gerner-Smidt, R.K. Mody, T-A. Nguyen, N. Strockbine, and R.V. Tauxe. 2013. Outbreak of Escherichia coli O104:H4 Infections Associated with Sprout Consumption—Europe and North America, May–July 2011. Centers for Disease Control and Prevention. https://www.cdc.gov/mmwr/preview/mmwrhtml/mm6250a3.htm.
Gamberini, S.J., and A. Moodie. 2020. The Virus of Disinformation: Echoes of Past Bioweapons Accusations in Today’s COVID-19 Conspiracy Theories. War on the Rocks. https://warontherocks.com/2020/04/the-virus-of-disinformation-echoes-of-past-bioweapons-accusations-in-todays-covid-19-conspiracy-theories.
Gertz, B. 2020. Coronavirus Link to China Biowarfare Program Possible, Analyst Says. The Washington Times. https://www.washingtontimes.com/news/2020/jan/26/coronavirus-link-to-china-biowarfare-program-possi.
Goldstein, C.M., E.J. Murray, J. Beard, A.M. Schnoes, and M.L. Wang. 2021. Science Communication in the Age of Misinformation. Annals of Behavioral Medicine 54(12):985–990. https://doi.org/10.1093/abm/kaaa088.
Goodman, J., and F. Carmichael. 2020. Coronavirus: Bill Gates “Microchip” Conspiracy Theory and Other Vaccine Claims Fact-checked. BBC, May 30, 2020. https://www.bbc.com/news/52847648.
Grad, Y.H., M. Lipsitch, M. Feldgarden, H.M. Arachchi, G.C. Cerqueira, M. FitzGerald, P. Godfrey, B.J. Haas, C.I. Murphy, C. Russ, S. Sykes, B.J. Walker, J.R. Wortman, S. Young, Q. Zeng, A. Abouelleil, J. Bochicchio, S. Chauvin, T. DeSmet, S. Gujja, C. McCowan, A. Montmayeur, S. Steelman, J. Frimodt-Møller, A.M. Petersen, C. Struve, K.A. Krogfelt, E. Bingen, F-X. Weill, E.S. Lander, C. Nusbaum, B.W. Birren, D.T. Hung, and W.P. Hanage. 2012. Genomic Epidemiology of the Escherichia coli O104:H4 Outbreaks in Europe, 2011. Proceedings of the National Academy of Sciences 109(8):3065–3070. https://doi.org/10.1073/pnas.1121491109.
Guy, L., C. Jernberg, S. Ivarsson, I. Hedenström, L. Engstrand, and S.G.E. Andersson. 2012. Genomic Diversity of the 2011 European Outbreaks of Escherichia coli O104:H4. Proceedings of the National Academy of Sciences 109(52):E3627–E3628. https://doi.org/10.1073/pnas.1206246110.
Harvard T.H. Chan School of Public Health. 2009. Four-fifths of Businesses Foresee Severe Problems Maintaining Operations If Significant H1N1 Flu Outbreak. https://www.hsph.harvard.edu/news/press-releases/businesses-problems-maintaining-operations-significant-h1n1-flu-outbreak.
Henley, J., and N. McIntyre. 2020. Survey Uncovers Widespread Belief in “Dangerous” COVID Conspiracy Theories. The Guardian. https://www.theguardian.com/world/2020/oct/26/survey-uncovers-widespread-belief-dangerous-covid-conspiracy-theories.
Himelein-Wachowiak, M., S. Giorgi, A. Devoto, M. Rahman, L. Ungar, H.A. Schwartz, D.H. Epstein, L. Leggio, and B. Curtis. 2021. Bots and Misinformation Spread on Social Media: Implications for COVID-19. Journal of Medical Internet Research 23(5):e26933. https://doi.org/10.2196/26933.
Ho, S.S., J. Looi, and T.J. Goh. 2020a. Scientists as Public Communicators: Individual- and Institutional-Level Motivations and Barriers for Public Communication in Singapore. Asian Journal of Communication 30(2):155–178. https://doi.org/10.1080/01292986.2020.1748072.
Ho, S.S., J. Looi, Y.W. Leung, and T.J. Goh. 2020b. Public Engagement by STEM and Non-STEM Researchers in Singapore: A Qualitative Comparison of Macro- and Meso-Level Concerns. Public Understanding of Science 29(2):211–229. https://doi.org/10.1177/0963662519888761.
Ho, S.S., T.J. Goh, and Y.W. Leung. 2020c. Let’s Nab Fake Science News: Predicting Scientists’ Support for Interventions Using the Influence of Presumed Media Influence Model. Journalism 1464884920937488. https://doi.org/10.1177/1464884920937488.
Hornik, R., A. Kikut, E. Jesch, C. Woko, L. Siegel, and K. Kim. 2021. Association of COVID-19 Misinformation with Face Mask Wearing and Social Distancing in a Nationally Representative US Sample. Health Communication 36(1):6–14. https://doi.org/10.1080/10410236.2020.1847437.
Horowitz, B.T. 2021. Can AI Stop People from Believing Fake News? IEEE Spectrum. https://spectrum.ieee.org/ai-misinformation-fake-news.
Hovland, C.I. 1959. Reconciling Conflicting Results Derived from Experimental and Survey Studies of Attitude Change. American Psychologist 14(1):8–17. https://doi.org/10.1037/h0042210.
IAKMI (Indonesian Public Health Association). n.d. Public Health of Indonesia. https://stikbar.org/ycabpublisher/index.php/PHI/index.
IAP (The InterAcademy Partnership). 2016. Doing Global Science: A Guide to Responsible Conduct in the Global Research Enterprise. Princeton, NJ: Princeton University Press. https://www.interacademies.org/publication/doing-global-science-guide-responsible-conduct-global-research-enterprise.
Imhoff, R., and P. Lamberty. 2020. A Bioweapon or a Hoax?: The Link Between Distinct Conspiracy Beliefs About the Coronavirus Disease (COVID-19) Outbreak and Pandemic Behavior. Social Psychological and Personality Science 11(8):1110–1118. https://doi.org/10.1177/1948550620934692.
Indonesia COVID-19 Task Force. n.d. Home Page. https://covid19.go.id.
IOM (Institute of Medicine). 2012. Improving Food Safety Through a One Health Approach: Workshop Summary. Washington, DC: The National Academies Press.
IOM. 2015. Communicating to Advance the Public’s Health: Workshop Summary. Washington, DC: The National Academies Press.
Isachenkov, V. 2018. Russia Claims US Running Secret Bio Weapons Lab in Georgia. AP News. https://apnews.com/article/public-health-north-america-health-ap-top-news-in-state-wire-0cf158200e674f41bd3026133e5e043d.
ISC (International Science Council). n.d.a. Freedoms and Responsibilities in Science. https://council.science/what-we-do/freedoms-and-responsibilities-of-scientists.
ISC. n.d.b. COVID-19 Global Science Portal. International Science Council. https://council.science/covid19.
Jalli, N. 2020. Combating Medical Misinformation and Disinformation Amid Coronavirus Outbreak in Southeast Asia. The Conversation. https://theconversation.com/combating-medicalmisinformation-and-disinformation-amid-coronavirus-outbreak-in-southeast-asia-131046.
Joober, R., N. Schmitz, L. Annable, and P. Boksa. 2012. Publication Bias: What Are the Challenges and Can They Be Overcome? Journal of Psychiatry & Neuroscience 37(3):149–152. https://doi.org/10.1503/jpn.120065.
Kassim, S.S.A. 2020. Fitnah! Pakar bidas dakwaan vaksin imunisasi bayi diternak guna sel babi—“Cuba letak ikan air tawar dalam laut, boleh hidup tak?” MStar. https://www.mstar.com.my/lokal/viral/2020/07/12/fitnah-pakar-bidas-dakwaan-vaksin-imunisasi-bayi-diternak-guna-sel-babi---cuba-letak-ikan-air-tawar-dalam-laut-boleh-hidup-tak.
Katz, E., and P.F. Lazarsfeld. 1955. Personal Influence: The Part Played by People in the Flow of Mass Communications. New York: Free Press.
Keller, E., and J. Berry. 2003. The Influentials. New York: Free Press.
KIN Rehabilitation & Homecare. n.d. KIN Rehabilitation & Homecare. https://kinrehab.com/news/view/346/#pid=2.
Kington, R.S., S. Arnesen, W-Y. Sylvia Chou, S.J. Curry, D. Lazer, and A.M. Villaruel. 2021. Identifying Credible Sources of Health Information in Social Media: Principles and Attributes. NAM Perspectives. https://doi.org/10.31478/202107a.
Kostkova, P., F. Saigí-Rubió, H. Eguia, D. Borbolla, M. Verschuuren, C. Hamilton, N. Azzopardi-Muscat, and D. Novillo-Ortiz. 2021. Data and Digital Solutions to Support Surveillance Strategies in the Context of the COVID-19 Pandemic. Frontiers in Digital Health 3:707902. https://doi.org/10.3389/fdgth.2021.707902.
Krause, N.M., I. Freiling, and D.A. Scheufele. 2022. The Infodemic “Infodemic”: Toward a More Nuanced Understanding of Truth-Claims and the Need for (Not) Combatting Misinformation. The ANNALS of the American Academy of Political and Social Science 700(1):112–123.
Krupa, C. 2009. Swine Flu Closes More Than 600 Schools in U.S. Associated Press. https://www.nbcnews.com/id/wbna33520744.
Kukull, W.A., and M. Ganguli. 2012. Generalizability: The Trees, the Forest, and the Low-Hanging Fruit. Neurology 78(23):1886–1891. https://doi.org/10.1212/WNL.0b013e318258f812.
Kunda, Z. 1990. The Case for Motivated Reasoning. Psychological Bulletin 108(3):480–498. https://doi.org/10.1037/0033-2909.108.3.480.
Lazarsfeld, P.F., B. Berelson, and H. Gaudet. 1948. The People’s Choice: How the Voter Makes Up His Mind in a Presidential Campaign, 2nd Edition. New York: Columbia University Press.
Lazer, D., M.A. Baum, Y. Benkler, A.J. Berinsky, K.M. Greenhill, F. Menczer, M.J. Metzger, B. Nyhan, G. Pennycook, D. Rothschild, M. Schudson, S.A. Sloman, C.R. Sunstein, E.A. Thorson, D.J. Watts, and J.L. Zittrain. 2018. The Science of Fake News. Science 359(6380):1094–1096. https://doi.org/10.1126/science.aao2998.
Lele, S.R. 2020. How Should We Quantify Uncertainty in Statistical Inference? Frontiers in Ecology and Evolution 8(35). https://doi.org/10.3389/fevo.2020.00035.
Lentzos, F. 2018. The Russian Disinformation Attack That Poses a Biological Danger. Bulletin of the Atomic Scientists. https://thebulletin.org/2018/11/the-russian-disinformation-attack-that-poses-a-biological-danger.
Lewandowsky, S., U.K.H. Ecker, C.M. Seifert, N. Schwarz, and J. Cook. 2012. Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest 13(3):106–131. https://doi.org/10.1177%2F1529100612451018.
Li, Y., J. Zhang, and B. Yu. 2017. Proceedings of the 2017 EMNLP Workshop: Natural Language Processing Meets Journalism. Association for Computational Linguistics, Copenhagen, Denmark, September 7, 2017.
Liang, X., S.S. Ho, D. Brossard, M.A. Xenos, D.A. Scheufele, A.A. Anderson, X. Hao, and X. He. 2015. Value Predispositions as Perceptual Filters: Comparing of Public Attitudes Toward Nanotechnology in the United States and Singapore. Public Understanding of Science 24(5):582–600. https://doi.org/10.1177/0963662513510858.
MAFINDO (Masyarakat Anti Fitnah Indonesia). n.d. COVID. https://www.mafindo.or.id/?s=COVID.
Majelis Ulama Indonesia. 2021. Fatwa MUI No. 02 Tahun 2021 tentang Produk Vaksin COVID-19 Sinovac Life Sciences Co. Ltd. China Dan Pt. Bio Farma (Persero). Jakarta, Indonesia: Majelis Ulama Indonesia.
Mallapaty, S. 2021. China’s COVID Vaccines Have Been Crucial—Now Immunity Is Waning. Nature. https://www.nature.com/articles/d41586-021-02796-w.
Marcus, A., and I. Oransky. 2020. The Science of This Pandemic Is Moving at Dangerous Speeds. Wired. https://www.wired.com/story/the-science-of-this-pandemic-is-moving-at-dangerous-speeds.
Mayo Clinic Health System. 2021. COVID-19 Vaccine Myths Debunked. Featured Topic. Mayo Clinic Health System. https://www.mayoclinichealthsystem.org/hometown-health/featured-topic/covid-19-vaccine-myths-debunked.
McNeil, Jr., D.G. 2009. U.S. Declares Public Health Emergency Over Swine Flu. The New York Times. https://www.nytimes.com/2009/04/27/world/27flu.html.
MEMRI (The Middle East Media Research Institute). 2020. Palestinian Writers: The Coronavirus Is a Biological Weapon Employed by U.S., Israel Against Their Enemies. The Middle East Media Research Institute. Special Dispatch No. 8654. https://www.memri.org/reports/palestinian-writers-coronavirus-biological-weapon-employed-us-israel-against-their-enemies.
Merson, L., T.V. Phong, L.N.T. Nhan, N.T. Dung, T.T.D. Ngan, N.V. Kinh, M. Parker, and S. Bull. 2015. Trust, Respect, and Reciprocity: Informing Culturally Appropriate Data-Sharing Practice in Vietnam. Journal of Empirical Research on Human Research Ethics 10(3):251–263. https://doi.org/10.1177/1556264615592387.
Meta AI. 2020. Here’s How We’re Using AI to Help Detect Misinformation. ML Applications (blog), Meta. https://ai.facebook.com/blog/heres-how-were-using-ai-to-help-detect-misinformation.
Ministry of Health for Malaysia. 2021. Vaksin COVID Menurut Perspektif Islam. Ministry of Health for Malaysia. https://covid-19.moh.gov.my/semasa-kkm/2021/01/vaksin-covid19-menurut-perspektif-islam.
Mohsin, J., F.H. Saleh, and A.M. Ali Al-muqarm. 2020. Real-Time Surveillance System to Detect and Analyzers the Suspects of COVID-19 Patients by Using IoT Under Edge Computing Techniques (RS-SYS). 2nd Al-Noor International Conference for Science and Technology, August 28–30, 2020.
MOSP (Malaysia Open Science Platform). n.d. Open Science. https://www.akademisains.gov.my/mosp.
Mourão, R.R., and C.T. Robertson. 2019. Fake News as Discursive Integration: An Analysis of Sites That Publish False, Misleading, Hyperpartisan and Sensational Information. Journalism Studies 20(14):2077–2095. https://doi.org/10.1080/1461670X.2019.1566871.
MyBIS (Malaysia Biodiversity Information System). 2015. Biological Diversity Management. https://www.mybis.gov.my/art/16.
NAE (National Academy of Engineering). 2008. Changing the Conversation: Messages for Improving Public Understanding of Engineering. Washington, DC: The National Academies Press.
NAE. 2013. Messaging for Engineering: From Research to Action. Washington, DC: The National Academies Press.
NAS (National Academy of Sciences). 2014. The Science of Science Communication II: Summary of a Colloquium. Washington, DC: The National Academies Press.
NAS. 2017. Initiatives on Responsible Science. https://www.nap.edu/resource/18356/responsible_science.
NAS. 2018. The Science of Science Communication III: Inspiring Novel Collaborations and Building Capacity: Proceedings of a Colloquium. Washington, DC: The National Academies Press.
NASEM (National Academies of Sciences, Engineering, and Medicine). 2015. Trust and Confidence at the Interfaces of the Life Sciences and Society: Does the Public Trust Science?: A Workshop Summary. Washington, DC: The National Academies Press.
NASEM. 2016a. Communicating Chemistry: A Framework for Sharing Science: A Practical Evidence-Based Guide. Washington, DC: The National Academies Press.
NASEM. 2016b. Effective Chemistry Communication in Informal Environments. Washington, DC: The National Academies Press.
NASEM. 2016c. Science Literacy: Concepts, Contexts, and Consequences. Washington, DC: The National Academies Press.
NASEM. 2017a. Fostering Integrity in Research. Washington, DC: The National Academies Press.
NASEM. 2017b. Communicating Science Effectively: A Research Agenda. Washington, DC: The National Academies Press.
NASEM. 2018. Sexual Harassment of Women: Climate, Culture, and Consequences in Academic Sciences, Engineering, and Medicine. Washington, DC: The National Academies Press.
NASEM. 2020. Airborne Transmission of SARS-CoV-2: A Virtual Workshop. https://www.nationalacademies.org/our-work/airborne-transmission-of-sars-cov-2-a-virtual-workshop.
NASEM. 2021. Based on Science: Answers to Everyday Science and Health Questions from the National Academies. https://www.nationalacademies.org/based-on-science.
National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. 1979. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. Edited by the Department of Health, Education, and Welfare. Washington, DC: U.S. Department of Health and Human Services.
Nature Masterclasses. n.d. A NatureResearch Service. Springer Nature. https://masterclasses.nature.com.
Nextstrain. n.d. SARS-CoV-2/COVID-19: Recent Outbreak of a Novel Coronavirus. https://nextstrain.org/help/coronavirus/SARS-CoV-2.
Nieddu, G.T., L. Billings, J.H. Kaufman, E. Forgoston, and S. Bianco. 2017. Extinction Pathways and Outbreak Vulnerability in a Stochastic Ebola Model. Journal of The Royal Society Interface 14(127). https://doi.org/10.1098/rsif.2016.0847.
NRC (National Research Council). 1996. National Science Education Standards. Washington, DC: National Academy Press.
NRC. 2003. Dynamic Social Network Modeling and Analysis: Workshop Summary and Papers. Social Influence Network Theory: Toward a Science of Strategic Modification of Interpersonal Influence Systems by N.E. Friedkin. Washington, DC: The National Academies Press. https://doi.org/10.17226/10735.
NRC. 2004. Biotechnology Research in an Age of Terrorism. Washington, DC: The National Academies Press.
NRC. 2014. Sustainable Infrastructures for Life Science Communication: Workshop Summary. Washington, DC: The National Academies Press.
Nyhan, B. 2014. Fighting Ebola, and the Conspiracy Theories. The New York Times. https://www.nytimes.com/2014/08/26/upshot/fighting-ebola-and-the-conspiracy-theories.html.
O’Brien, T.C., R. Palmer, and D. Albarracin. 2021. Misplaced Trust: When Trust in Science Fosters Belief in Pseudoscience and the Benefits of Critical Evaluation. Journal of Experimental Social Psychology 96:104184. https://doi.org/10.1016/j.jesp.2021.104184.
Office of Environmental Policy and Planning. 2000. Biodiversity Conservation in Thailand: A National Report. Bangkok, Thailand: Ministry of Science, Technology, and Environment.
O’Grady, C. 2021. “It’s Misinformation at Worst.” Weak Health Studies Can Do More Harm Than Good, Scientists Say. Science. https://www.science.org/content/article/it-s-misinformation-worst-weak-health-studies-can-do-more-harm-good-scientists-say.
Okeleke, K., and J. Robinson. 2021. Exploring Online Misinformation and Disinformation in Asia Pacific. London, UK: GSM Association. https://www.gsma.com/asia-pacific/resources/mis-disinfo-report-apac.
Pallarito, K. 2009. Respirator or Face Mask?: Best H1N1 Protection Still Debated. CNN. https://www.cnn.com/2009/HEALTH/11/06/face.mask.swine.flu/index.html.
Pamuk, H., and D. Brunnstrom. 2020. U.S. Summons Chinese Envoy Over Beijing’s Coronavirus Comments. Reuters. https://www.reuters.com/article/us-health-coronavirus-china-diplomacy-idUSKBN2102XW.
Pennycook, G., and D.G. Rand. 2019. Lazy, Not Biased: Susceptibility to Partisan Fake News Is Better Explained by Lack of Reasoning Than by Motivated Reasoning. Cognition 118:39–50. https://doi.org/10.1016/j.cognition.2018.06.011.
Phillips, C.V., and L.M. LaPole. 2003. Quantifying Errors Without Random Sampling. BMC Medical Research Methodology 3:9. https://doi.org/10.1186/1471-2288-3-9.
Price, V., and D. Tewksbury. 1997. News Values and Public Opinion: A Theoretical Account of Media Priming and Framing. In Progress in the Communication Sciences, edited by G.A. Barnett and F.J. Boster, pp. 173–212. New York: Ablex.
Rahmawati, D., D. Mulyana, G. Lumakto, M. Viendyasari, and W. Anindhita. 2021. Mapping Disinformation During the COVID-19 in Indonesia: Qualitative Content Analysis. Jurnal Aspikom 6(2):13. https://doi.org/10.24329/aspikom.v6i2.907.
Richardson, M.B., M.S. Williams, K.R. Fontaine, and D.B. Allison. 2017. The Development of Scientific Evidence for Health Policies for Obesity: Why and How? International Journal of Obesity 41(6):840–848. https://doi.org/10.1038/ijo.2017.71.
Riedl, C., Y.J. Kim, P. Gupta, T.W. Malone, and A.W. Woolley. 2021. Quantifying Collective Intelligence in Human Groups. Proceedings of the National Academy of Sciences 118(21):e2005737118. https://doi.org/10.1073/pnas.2005737118.
Rochmyaningsih, D. 2019. Indonesia’s Strict New Biopiracy Rules Could Stifle International Research. Science. https://www.science.org/content/article/indonesia-s-strict-new-biopiracy-rules-could-stifle-international-research.
Rogin, J. 2020. Opinion: State Department Cables Warned of Safety Issues at Wuhan Lab Studying Bat Coronaviruses. The Washington Post. https://www.washingtonpost.com/opinions/2020/04/14/state-department-cables-warned-safety-issues-wuhan-lab-studying-bat-coronaviruses.
Royall, R. 2000. On the Probability of Observing Misleading Statistical Evidence. Journal of the American Statistical Association 95(451):760–768. https://doi.org/10.2307/2669456.
Ruzki, O.R.M., and I.S. Ismail. 2021. COVID-19: Vaksin Pfizer, AstraZeneca tidak mengandungi bahan daripada babi [COVID-19: Pfizer, AstraZeneca Vaccines Do Not Contain Pork-Derived Ingredients]. Berita Harian. https://www.bharian.com.my/berita/nasional/2021/05/820693/covid-19-vaksin-pfizer-astrazeneca-tidak-mengandungi-bahan-daripada.
Scheufele, D.A. 2014. Science Communication as Political Communication. Proceedings of the National Academy of Sciences 111(Suppl 4):13585–13592. https://doi.org/10.1073/pnas.1317516111.
Scheufele, D.A., and N.M. Krause. 2019. Science Audiences, Misinformation, and Fake News. Proceedings of the National Academy of Sciences 116(16):7662–7669. https://doi.org/10.1073/pnas.1805871115.
Scheufele, D.A., N.M. Krause, I. Freiling, and D. Brossard. 2020. How Not to Lose the COVID-19 Communication War. Issues in Science and Technology. https://issues.org/covid-19-communication-war.
Scheufele, D.A., A.J. Hoffman, L. Neeley, and C.M. Reid. 2021a. Misinformation About Science in the Public Sphere. Proceedings of the National Academy of Sciences 118(15):e2104068118. https://doi.org/10.1073/pnas.2104068118.
Scheufele, D.A., N.M. Krause, and I. Freiling. 2021b. Misinformed About the “Infodemic”? Science’s Ongoing Struggle with Misinformation. Journal of Applied Research in Memory and Cognition 10(4):522–526. https://doi.org/10.1016/j.jarmac.2021.10.009.
Schuldt, L. 2021. Official Truths in a War on Fake News: Governmental Fact-Checking in Malaysia, Singapore, and Thailand. Journal of Current Southeast Asian Affairs 40(2):340–371. https://doi.org/10.1177/18681034211008908.
Scott-Kemmis, D., P. Intarakumnerd, R. Rasiah, and R. Amaradasa. 2021. Southeast Asia and Oceania. In UNESCO Science Report 2021. Paris, France: UNESCO.
Seah, S., H.T. Ha, M. Martinus, and P.T.P. Thao. 2021. The State of Southeast Asia: 2021. Singapore: ISEAS-Yusof Ishak Institute. https://www.iseas.edu.sg/articles-commentaries/state-of-southeast-asia-survey/the-state-of-southeast-asia-2021-survey-report.
Sedova, K. 2021. Open Science, Mis/Disinformation, and AI: Capabilities and Limitations. Presentation at National Academies’ Committee on Addressing Inaccurate and Misleading Information about Biological Threats through Scientific Collaboration and Communication: Fifth Public Meeting, virtual, September 27, 2021.
Selgelid, M.J. 2016. Gain-of-Function Research: Ethical Analysis. Science and Engineering Ethics 22(4):923–964. https://doi.org/10.1007/s11948-016-9810-1.
Shah, D.V., and D.A. Scheufele. 2006. Explicating Opinion Leadership: Nonpolitical Dispositions, Information Consumption, and Civic Participation. Political Communication 23(1):1–22. https://doi.org/10.1080/10584600500476932.
Simis, M.J., H. Madden, M.A. Cacciatore, and S.K. Yeo. 2016. The Lure of Rationality: Why Does the Deficit Model Persist in Science Communication? Public Understanding of Science 25(4):400–414. https://doi.org/10.1177/0963662516629749.
Simon, F.M., and C.Q. Camargo. 2021. Autopsy of a Metaphor: The Origins, Use and Blind Spots of the “Infodemic.” New Media & Society. https://doi.org/10.1177/14614448211031908.
Siriwardana, M. n.d. Artificial Intelligence Struggles to Moderate COVID Misinformation. https://scholar.harvard.edu/cvt/artificial-intelligence-struggles-moderate-covid-misinformation.
Socialist Republic of Vietnam. 2008. Law No. 20/2008/QH12 issued by the National Assembly dated November 13, 2008, on Biodiversity. Hanoi, Vietnam.
Socialist Republic of Vietnam. 2017. Decree No. 59/2017/ND-CP issued by the Government dated May 12, 2017, on the Management of Access to Genetic Resources and the Sharing of Benefits Arising from Their Utilization. Hanoi, Vietnam.
Socialist Republic of Vietnam. 2019. Circular No. 15/2019/TT-BTNMT issued by the Ministry of Natural Resources and Environment dated September 11, 2019, on Organization and Operation of the Appraisal Committee to Appraise the Application Dossier for a License to Access Genetic Resources for Commercial Research or Commercial Product Development Purposes. Hanoi, Vietnam.
Socialist Republic of Vietnam. 2020a. Circular No. 07/2020/TT-BNNPTNT issued by the Ministry of Agriculture and Rural Development dated May 22, 2020, on Organization and Operation of the Appraisal Council for Licenses to Access Genetic Resources for Commercial Research or Commercial Product Development Purposes. Hanoi, Vietnam.
Socialist Republic of Vietnam. 2020b. Circular No. 10/2020/TT-BTNMT issued by the Ministry of Natural Resources and Environment dated September 29, 2020, on Reporting on Access to Genetic Resources and Sharing of Benefits from the Use of Genetic Resources. Hanoi, Vietnam.
Societies Consortium on Sexual Harassment in STEMM. n.d. Societies Consortium on Sexual Harassment in STEMM. https://societiesconsortium.com.
Stevenson, A. 2020. Senator Tom Cotton Repeats Fringe Theory of Coronavirus Origins. The New York Times. https://www.nytimes.com/2020/02/17/business/media/coronavirus-tom-cotton-china.html.
STM. 2021. Publisher Support for Combating COVID-19. https://www.stm-assoc.org/about-stm/coronavirus-2019-ncov.
Strategic Framework for Capacity-building. 2017. Convention on Biological Diversity. https://www.cbd.int/abs/capacitybuilding-framework.shtml.
Sundararajan, A. 2006. Local Network Effects and Complex Network Structure. https://doi.org/10.2139/ssrn.650501.
Suran, S., V. Pattanaik, and D. Draheim. 2020. Frameworks for Collective Intelligence: A Systematic Literature Review. ACM Computing Surveys 53(1):Article 14. https://doi.org/10.1145/3368986.
Tanasugarn, L., S. Dhabanandana, S. Sriwatanapongse, J. Kuanpotn, T. Changtavorn, S. Wannakriroj, C. Ratanachaicharn, J. Kruavan, V. Keeratinijakal, C. Kokkeadtikul, B. Watanayakorn, J. Donavanik, R. Teeragawinsakul, K. Kaewthai, N. Udomtavee, and K. Bhusawan. 2000. UBMTA for Thailand. In A Uniform Biological Material Transfer Agreement for Thailand Has Been Drafted. Bangkok, Thailand: Central Intellectual Property and International Trade Court.
Tandoc, Jr., E.C. 2019. The Facts of Fake News: A Research Review. Sociology Compass 13(9):e12724. https://doi.org/10.1111/soc4.12724.
Tandoc, Jr., E.C., Z. Wei Lim, and R. Ling. 2018. Defining “Fake News.” Digital Journalism 6(2):137–153. https://doi.org/10.1080/21670811.2017.1360143.
The Lancet Infectious Diseases. 2020. The COVID-19 Infodemic. The Lancet Infectious Diseases 20(8):875. https://doi.org/10.1016/s1473-3099(20)30565-x.
The Royal Society. 2022. The Online Information Environment. https://royalsociety.org/-/media/policy/projects/online-information-environment/the-online-information-environment.pdf.
Toffelmire, A. n.d. H1N1 (Human Swine Flu): Should I Wear a Mask? MedBroadcast. https://www.medbroadcast.com/channel/cold-and-flu/h1n1-swine-flu/h1n1-human-swine-flu-should-i-wear-a-mask.
Travis, J. 2006. Is It What We Know or Who We Know?: Choice of Organism and Robustness of Inference in Ecology and Evolutionary Biology. (American Society of Naturalists Presidential Address). The American Naturalist 167(3):303–314. https://doi.org/10.1086/501507.
Universitas Airlangga. n.d. The Indonesian Journal of Public Health. Universitas Airlangga. https://e-journal.unair.ac.id/IJPH.
Urlings, M.J.E., B. Duyx, G.M.H. Swaen, L.M. Bouter, and M.P. Zeegers. 2021. Citation Bias and Other Determinants of Citation in Biomedical Research: Findings from Six Citation Networks. Journal of Clinical Epidemiology 132:71–78. https://doi.org/10.1016/j.jclinepi.2020.11.019.
van der Bles, A.M., S. van der Linden, A.L.J. Freeman, J. Mitchell, A.B. Galvao, L. Zaval, and D.J. Spiegelhalter. 2019. Communicating Uncertainty About Facts, Numbers and Science. Royal Society Open Science 6(5):181870. https://doi.org/10.1098/rsos.181870.
Vargo, C.J., L. Guo, and M.A. Amazeen. 2018. The Agenda-Setting Power of Fake News: A Big Data Analysis of the Online Media Landscape from 2014 to 2016. New Media & Society 20(5):2028–2049. https://doi.org/10.1177/1461444817712086.
Vernooij, R.W.M., G.H. Guyatt, D. Zeraatkar, M.A. Han, C. Valli, R. El Dib, P. Alonso-Coello, M.M. Bala, and B.C. Johnston. 2021. Reconciling Contrasting Guideline Recommendations on Red and Processed Meat for Health Outcomes. Journal of Clinical Epidemiology 138:215–218. https://doi.org/10.1016/j.jclinepi.2021.07.008.
Virological. n.d. SARS-CoV-2 Coronavirus. Virological. https://virological.org/c/novel-2019-coronavirus/33.
von Winterfeldt, D. 2013. Bridging the Gap Between Science and Decision Making. Proceedings of the National Academy of Sciences 110(Suppl 3):14055. https://doi.org/10.1073/pnas.1213532110.
Vosoughi, S., D. Roy, and S. Aral. 2018. The Spread of True and False News Online. Science 359(6380):1146–1151. https://doi.org/10.1126/science.aap9559.
Wanaratna, K., P. Leethong, N. Inchai, W. Chueawiang, P. Sriraksa, A. Tabmee, and S. Sirinavin. 2021. Efficacy and Safety of Andrographis paniculata Extract in Patients with Mild COVID-19: A Randomized Controlled Trial. medRxiv 2021.07.08.21259912. https://doi.org/10.1101/2021.07.08.21259912.
Wein, H. 2014. Computer Models Can Help Guide Ebola Response. Bethesda, MD: Office of Communications and Public Liaison in the National Institutes of Health Office of the Director.
Weiss, D., and G. LaPorte. 2018. Uncertainty Ahead: A Shift in How Federal Scientific Experts Can Testify. National Institute of Justice Journal 279. https://nij.ojp.gov/topics/articles/uncertainty-ahead-shift-how-federal-scientific-experts-can-testify.
West, J.D., and C.T. Bergstrom. 2021. Misinformation in and About Science. Proceedings of the National Academy of Sciences 118(15):e1912444117. https://doi.org/10.1073/pnas.1912444117.
WHO (World Health Organization). 2010. Responsible Life Sciences Research for Global Health Security: A Guidance Document. Geneva, Switzerland: World Health Organization.
WHO. 2020. Call for Action: Managing the Infodemic. https://www.who.int/news/item/11-12-2020-call-for-action-managing-the-infodemic.
WHO. n.d. Infodemic. https://www.who.int/health-topics/infodemic#tab=tab_1.
Yeo, S.K., X. Liang, D. Brossard, K.M. Rose, K. Korzekwa, D.A. Scheufele, and M.A. Xenos. 2017. The Case of #arseniclife: Blogs and Twitter in Informal Peer Review. Public Understanding of Science 26(8):937–952. https://doi.org/10.1177/0963662516649806.
Zhou, Y., F. Wang, J. Tang, R. Nussinov, and F. Cheng. 2020. Artificial Intelligence in COVID-19 Drug Repurposing. The Lancet Digital Health 2(12):e667–e676. https://doi.org/10.1016/S2589-7500(20)30192-8.