Disinformation Echo-Chambers on Facebook: Funding and References by @escholar



Too Long; Didn't Read

Recent events have brought to light the negative effects of social media platforms, which can create echo chambers where users are exposed only to content that aligns with their existing beliefs.


This paper is available on arXiv under the CC BY-SA 4.0 DEED license.

Authors:

(1) Mathias-Felipe de-Lima-Santos, Faculty of Humanities, University of Amsterdam, and Institute of Science and Technology, Federal University of São Paulo;

(2) Wilson Ceron, Institute of Science and Technology, Federal University of São Paulo.

Funding

This project was partially funded by the University of Amsterdam’s RPA Human(e) AI and by the European Union’s Horizon 2020 research and innovation program under grant agreement No. 951911 (AI4Media).

References

(1) Sunstein, C. R. #Republic: Divided Democracy in the Age of Social Media; Princeton University Press: Princeton, New Jersey, 2017.


(2) Garimella, K.; De Francisci Morales, G.; Gionis, A.; Mathioudakis, M. Political Discourse on Social Media. In Proceedings of the 2018 World Wide Web Conference on World Wide Web - WWW ’18; ACM Press: New York, New York, USA, 2018; Vol. 2, pp 913–922. https://doi.org/10.1145/3178876.3186139.


(3) Del Vicario, M.; Bessi, A.; Zollo, F.; Petroni, F.; Scala, A.; Caldarelli, G.; Stanley, H. E.; Quattrociocchi, W. The Spreading of Misinformation Online. Proc. Natl. Acad. Sci. 2016, 113 (3), 554–559. https://doi.org/10.1073/pnas.1517441113.


(4) Garrett, R. K. Echo Chambers Online?: Politically Motivated Selective Exposure among Internet News Users. J. Comput. Commun. 2009, 14 (2), 265–285. https://doi.org/10.1111/j.1083-6101.2009.01440.x.


(5) Nickerson, R. S. Confirmation Bias: A Ubiquitous Phenomenon in Many Guises. Rev. Gen. Psychol. 1998, 2 (2), 175–220. https://doi.org/10.1037/1089-2680.2.2.175.


(6) Biehl, J. A Political Economy of Pharmaceuticals. Will to Live 2021, 18 (1), 10–13. https://doi.org/10.2307/j.ctv1wmz48h.5.


(7) Allcott, H.; Gentzkow, M. Social Media and Fake News in the 2016 Election. J. Econ. Perspect. 2017, 31 (2), 211–236. https://doi.org/10.1257/jep.31.2.211.


(8) Bennett, W. L.; Livingston, S. The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions. Eur. J. Commun. 2018, 33 (2), 122–139. https://doi.org/10.1177/0267323118760317.


(9) Giglietto, F.; Righetti, N.; Rossi, L.; Marino, G. It Takes a Village to Manipulate the Media: Coordinated Link Sharing Behavior during 2018 and 2019 Italian Elections. Inf. Commun. Soc. 2020, 23 (6), 867–891. https://doi.org/10.1080/1369118X.2020.1739732.


(10) Farkas, J.; Schou, J. Fake News as a Floating Signifier: Hegemony, Antagonism and the Politics of Falsehood. Javnost 2018, 25 (3), 298–314. https://doi.org/10.1080/13183222.2018.1463047.


(11) Wardle, C. The Need for Smarter Definitions and Practical, Timely Empirical Research on Information Disorder. Digit. Journal. 2018, 6 (8), 951–963. https://doi.org/10.1080/21670811.2018.1502047.


(12) Abramowitz, S.; McKune, S. L.; Fallah, M.; Monger, J.; Tehoungue, K.; Omidian, P. A. The Opposite of Denial: Social Learning at the Onset of the Ebola Emergency in Liberia. J. Health Commun. 2017, 22 (sup1), 59–65. https://doi.org/10.1080/10810730.2016.1209599.


(13) Guidry, J. P. D.; Jin, Y.; Orr, C. A.; Messner, M.; Meganck, S. Ebola on Instagram and Twitter: How Health Organizations Address the Health Crisis in Their Social Media Engagement. Public Relat. Rev. 2017, 43 (3), 477–486. https://doi.org/10.1016/j.pubrev.2017.04.009.


(14) Guidry, J. P. D.; Meganck, S. L.; Perrin, P. B.; Messner, M.; Lovari, A.; Carlyle, K. E. #Ebola: Tweeting and Pinning an Epidemic. Atl. J. Commun. 2021, 29 (2), 79–92. https://doi.org/10.1080/15456870.2019.1707202.


(15) Smallman, S. Whom Do You Trust? Doubt and Conspiracy Theories in the 2009 Influenza Pandemic. J. Int. Glob. Stud. 2015, 6 (2), 1–24.


(16) Wagner-Egger, P.; Bangerter, A.; Gilles, I.; Green, E.; Rigaud, D.; Krings, F.; Staerklé, C.; Clémence, A. Lay Perceptions of Collectives at the Outbreak of the H1n1 Epidemic: Heroes, Villains and Victims. Public Underst. Sci. 2011, 20 (4), 461–476. https://doi.org/10.1177/0963662510393605.


(17) Arif, N.; Al-Jefri, M.; Bizzi, I. H.; Perano, G. B.; Goldman, M.; Haq, I.; Chua, K. L.; Mengozzi, M.; Neunez, M.; Smith, H.; Ghezzi, P. Fake News or Weak Science? Visibility and Characterization of Antivaccine Webpages Returned by Google in Different Languages and Countries. Front. Immunol. 2018, 9 (JUN), 1215. https://doi.org/10.3389/fimmu.2018.01215.


(18) Dubé, E.; Vivion, M.; MacDonald, N. E. Vaccine Hesitancy, Vaccine Refusal and the Anti-Vaccine Movement: Influence, Impact and Implications. Expert Rev. Vaccines 2014, 14 (1), 99–117. https://doi.org/10.1586/14760584.2015.964212.


(19) West, J. D.; Bergstrom, C. T. Misinformation in and about Science. Proc. Natl. Acad. Sci. U. S. A. 2021, 118 (15), e1912444117. https://doi.org/10.1073/pnas.1912444117.


(20) Hornsey, M. J.; Harris, E. A.; Fielding, K. S. The Psychological Roots of Anti-Vaccination Attitudes: A 24-Nation Investigation. Health Psychol. 2018, 37 (4), 307–315. https://doi.org/10.1037/hea0000586.


(21) Ceron, W.; De-Lima-Santos, M.-F.; Quiles, M. G. Fake News Agenda in the Era of COVID-19: Identifying Trends through Fact-Checking Content. Online Soc. Networks Media 2021, 21, 100116. https://doi.org/10.1016/j.osnem.2020.100116.


(22) Ceron, W.; Sanseverino, G. G.; De-Lima-Santos, M.-F.; Quiles, M. G. COVID-19 Fake News Diffusion across Latin America. Soc. Netw. Anal. Min. 2021, 11 (1), 47. https://doi.org/10.1007/s13278-021-00753-z.


(23) Chadwick, A.; Kaiser, J.; Vaccari, C.; Freeman, D.; Lambe, S.; Loe, B. S.; Vanderslott, S.; Lewandowsky, S.; Conroy, M.; Ross, A. R. N.; Innocenti, S.; Pollard, A. J.; Waite, F.; Larkin, M.; Rosebrock, L.; Jenner, L.; McShane, H.; Giubilini, A.; Petit, A.; Yu, L. M. Online Social Endorsement and Covid-19 Vaccine Hesitancy in the United Kingdom. Soc. Media Soc. 2021, 7 (2), 205630512110088. https://doi.org/10.1177/20563051211008817.


(24) Monari, A. C. P.; Sacramento, I. A “Vacina Chinesa de João Doria”: A Influência da Disputa Política-Ideológica na Desinformação sobre a Vacinação contra a Covid-19 [The “Chinese Vaccine of João Doria”: The Influence of the Political-Ideological Dispute on Disinformation about COVID-19 Vaccination]. Rev. Mídia e Cotid. 2021, 15 (3), 125–143.


(25) Bode, L.; Vraga, E. The Swiss Cheese Model for Mitigating Online Misinformation. Bull. At. Sci. 2021, 77 (3), 129–133. https://doi.org/10.1080/00963402.2021.1912170.


(26) Gera, S.; Sinha, A. C-ANN: A Deep Learning Model for Detecting Black-Marketed Colluders in Twitter Social Network. Neural Comput. Appl. 2022, 34 (18), 15113–15127. https://doi.org/10.1007/s00521-021-06756-3.


(27) Gleicher, N. Coordinated inauthentic behavior explained. https://about.fb.com/news/2018/12/inside-feed-coordinated-inauthentic-behavior/ (accessed Oct 11, 2021).


(28) Broniatowski, D. A. Towards Statistical Foundations for Detecting Coordinated Inauthentic Behavior on Facebook; Washington, DC, 2021.


(29) Freelon, D.; Wells, C. Disinformation as Political Communication. Polit. Commun. 2020, 37 (2), 145–156. https://doi.org/10.1080/10584609.2020.1723755.


(30) Keller, F. B.; Schoch, D.; Stier, S.; Yang, J. H. Political Astroturfing on Twitter: How to Coordinate a Disinformation Campaign. Polit. Commun. 2020, 37 (2), 256–280. https://doi.org/10.1080/10584609.2019.1661888.


(31) Yang, K. C.; Pierri, F.; Hui, P. M.; Axelrod, D.; Torres-Lugo, C.; Bryden, J.; Menczer, F. The COVID-19 Infodemic: Twitter versus Facebook. Big Data Soc. 2021, 8 (1). https://doi.org/10.1177/20539517211013861.


(32) Kim, Y.; Song, D.; Lee, Y. J. #Antivaccination on Instagram: A Computational Analysis of Hashtag Activism through Photos and Public Responses. Int. J. Environ. Res. Public Health 2020, 17 (20), 1–20. https://doi.org/10.3390/ijerph17207550.

(33) Möller, J. Filter Bubbles and Digital Echo Chambers. In The Routledge Companion to Media Disinformation and Populism; Tumber, H., Waisbord, S., Eds.; Routledge: London, 2021; pp 92–100. https://doi.org/10.4324/9781003004431-10.

(34) Dahlgren, P. The Internet, Public Spheres, and Political Communication: Dispersion and Deliberation. Polit. Commun. 2005, 22 (2), 147–162. https://doi.org/10.1080/10584600590933160.

(35) Belair-Gagnon, V.; Nelson, J. L.; Lewis, S. C. Audience Engagement, Reciprocity, and the Pursuit of Community Connectedness in Public Media Journalism. Journal. Pract. 2019, 13 (5), 558–575. https://doi.org/10.1080/17512786.2018.1542975.

(36) Benkler, Y. The Wealth of Networks; Yale University Press: New York, 2006.

(37) McPherson, M.; Smith-Lovin, L.; Cook, J. M. Birds of a Feather: Homophily in Social Networks. Annu. Rev. Sociol. 2001, 27 (1), 415–444. https://doi.org/10.1146/annurev.soc.27.1.415.

(38) Terren, L.; Borge, R. Echo Chambers on Social Media: A Systematic Review of the Literature. Rev. Commun. Res. 2021, 9 (May), 1–39. https://doi.org/10.12840/ISSN.2255-4165.028.

(39) Walker, P. G. T.; Whittaker, C.; Watson, O.; Baguelin, M.; Ainslie, K. E. C.; Bhatia, S.; Bhatt, S.; Boonyasiri, A.; Boyd, O.; Cattarino, L.; Cucunubá, Z.; Cuomo-Dannenburg, G.; Dighe, A.; Donnelly, C. A.; Dorigatti, I.; Van Elsland, S.; Fitzjohn, R.; Flaxman, S.; Fu, H.; Gaythorpe, K.; Geidelberg, L.; Grassly, N.; Green, W.; Hamlet, A.; Hauck, K.; Haw, D.; Hayes, S.; Hinsley, W.; Imai, N.; Jorgensen, D.; Knock, E.; Laydon, D.; Mishra, S.; Nedjati-Gilani, G.; Okell, L. C.; Riley, S.; Thompson, H.; Unwin, J.; Verity, R.; Vollmer, M.; Walters, C.; Wang, W.; Wang, Y.; Winskill, P.; Xi, X.; Ferguson, N. M.; Ghani, A. C. The Global Impact of COVID-19 and Strategies for Mitigation and Suppression; 2020.

(40) Bruns, A. Echo Chambers? Filter Bubbles? The Misleading Metaphors That Obscure the Real Problem. In Hate Speech and Polarization in Participatory Society; Pérez-Escolar, M., Noguera-Vivo, J. M., Eds.; Routledge: London, 2021; pp 33–48. https://doi.org/10.4324/9781003109891-4.

(41) Flaxman, S.; Goel, S.; Rao, J. M. Filter Bubbles, Echo Chambers, and Online News Consumption. Public Opin. Q. 2016, 80 (S1), 298–320. https://doi.org/10.1093/poq/nfw006.

(42) Bakshy, E.; Messing, S.; Adamic, L. A. Exposure to Ideologically Diverse News and Opinion on Facebook. Science 2015, 348 (6239), 1130–1132. https://doi.org/10.1126/science.aaa1160.

(43) Bright, J. Explaining the Emergence of Political Fragmentation on Social Media: The Role of Ideology and Extremism. J. Comput. Commun. 2018, 23 (1), 17–33. https://doi.org/10.1093/JCMC/ZMX002.

(44) Eady, G.; Nagler, J.; Guess, A.; Zilinsky, J.; Tucker, J. A. How Many People Live in Political Bubbles on Social Media? Evidence From Linked Survey and Twitter Data. SAGE Open 2019, 9 (1). https://doi.org/10.1177/2158244019832705.

(45) Garimella, K.; Morales, G. D. F.; Gionis, A.; Mathioudakis, M. Quantifying Controversy on Social Media. ACM Trans. Soc. Comput. 2018, 1 (1), 1–27. https://doi.org/10.1145/3140565.

(46) Cinelli, M.; De Francisci Morales, G.; Galeazzi, A.; Quattrociocchi, W.; Starnini, M. The Echo Chamber Effect on Social Media. Proc. Natl. Acad. Sci. 2021, 118 (9), e2023301118. https://doi.org/10.1073/pnas.2023301118.

(47) Arguedas, A. R.; Robertson, C. T.; Fletcher, R.; Nielsen, R. K. Echo Chambers, Filter Bubbles, and Polarisation: A Literature Review; Oxford, 2022.

(48) Bucher, T.; Helmond, A. The Affordances of Social Media Platforms. In The SAGE Handbook of Social Media; Burgess, J., Marwick, A., Poell, T., Eds.; SAGE Publications: London, 2018; pp 233–253. https://doi.org/10.4135/9781473984066.n14.

(49) Barberá, P. How Social Media Reduces Mass Political Polarization. Evidence from Germany, Spain, and the U.S.; 2015.


(50) van der Linden, S. Misinformation: Susceptibility, Spread, and Interventions to Immunize the Public. Nat. Med. 2022, 28 (3), 460–467. https://doi.org/10.1038/s41591-022-01713-6.


(51) Righetti, N. Four Years of Fake News: A Quantitative Analysis of the Scientific Literature. First Monday 2021. https://doi.org/10.5210/fm.v26i7.11645.


(52) Tandoc, E. C.; Lim, Z. W.; Ling, R. Defining “Fake News”: A Typology of Scholarly Definitions. Digit. Journal. 2018, 6 (2), 137–153. https://doi.org/10.1080/21670811.2017.1360143.


(53) Chang, K.-C.; Menczer, F. How many bots are on Twitter? The question is tough to answer — and misses the point. https://www.niemanlab.org/2022/05/how-many-bots-are-on-twitter-the-question-is-tough-to-answer-and-misses-the-point/ (accessed Sep 1, 2022).


(54) Giglietto, F.; Righetti, N.; Marino, G. Understanding Coordinated and Inauthentic Link Sharing Behavior on Facebook in the Run-up to 2018 General Election and 2019 European Election in Italy; SocArXiv, 2019. https://doi.org/10.31235/osf.io/3jteh.


(55) Giglietto, F.; Iannelli, L.; Rossi, L.; Valeriani, A.; Righetti, N.; Carabini, F.; Marino, G.; Usai, S.; Zurovac, E. Mapping Italian News Media Political Coverage in the Lead-Up of 2018 General Election. SSRN Electron. J. 2018. https://doi.org/10.2139/ssrn.3179930.


(56) Giglietto, F.; Righetti, N.; Rossi, L.; Marino, G. Coordinated Link Sharing Behavior as a Signal to Surface Sources of Problematic Information on Facebook. ACM Int. Conf. Proceeding Ser. 2020, 20, 85–91. https://doi.org/10.1145/3400806.3400817.


(57) Varol, O.; Ferrara, E.; Davis, C. A.; Menczer, F.; Flammini, A. Online Human-Bot Interactions: Detection, Estimation, and Characterization. In Proc. 11th Int. Conf. Web Soc. Media (ICWSM 2017); 2017; pp 280–289.


(58) Khaund, T.; Kirdemir, B.; Agarwal, N.; Liu, H.; Morstatter, F. Social Bots and Their Coordination During Online Campaigns: A Survey. IEEE Trans. Comput. Soc. Syst. 2022, 9 (2), 530–545. https://doi.org/10.1109/TCSS.2021.3103515.


(59) Dang, S.; Paul, K.; Chmielewski, D. Do spam bots really comprise under 5% of Twitter users? Elon Musk wants to know | Reuters. https://www.reuters.com/technology/do-spam-bots-really-comprise-under-5-twitter-users-elon-musk-wants-know-2022-05-13/ (accessed Sep 1, 2022).


(60) Luceri, L.; Deb, A.; Giordano, S.; Ferrara, E. Evolution of Bot and Human Behavior during Elections. First Monday 2019, 24 (9). https://doi.org/10.5210/fm.v24i9.10213.


(61) Torusdag, M. B.; Kutlu, M.; Selcuk, A. A. Are We Secure from Bots? Investigating Vulnerabilities of Botometer. In 5th Int. Conf. Comput. Sci. Eng. (UBMK 2020); 2020; pp 343–348. https://doi.org/10.1109/UBMK50275.2020.9219433.


(62) Balestrucci, A. How Many Bots Are You Following? In CEUR Workshop Proceedings; Ancona, Italy, 2020; pp 47–59.


(63) Bello, B. S.; Heckel, R.; Minku, L. Reverse Engineering the Behaviour of Twitter Bots. In 2018 5th Int. Conf. Soc. Networks Anal. Manag. Secur. (SNAMS); 2018; pp 27–34. https://doi.org/10.1109/SNAMS.2018.8554675.


(64) Akyon, F. C.; Esat Kalfaoglu, M. Instagram Fake and Automated Account Detection. In Proceedings - 2019 Innovations in Intelligent Systems and Applications Conference, ASYU 2019; IEEE, 2019; pp 1–7. https://doi.org/10.1109/ASYU48272.2019.8946437.


(65) Cresci, S. A Decade of Social Bot Detection. Commun. ACM 2020, 63 (10), 72–83. https://doi.org/10.1145/3409116.


(66) Al-Khateeb, S.; Agarwal, N. Examining Botnet Behaviors for Propaganda Dissemination: A Case Study of ISIL’s Beheading Videos-Based Propaganda. In Proceedings - 15th IEEE International Conference on Data Mining Workshop, ICDMW 2015; IEEE, 2016; pp 51–57. https://doi.org/10.1109/ICDMW.2015.41.


(67) Islam, A. K. M. N.; Laato, S.; Talukder, S.; Sutinen, E. Misinformation Sharing and Social Media Fatigue during COVID-19: An Affordance and Cognitive Load Perspective. Technol. Forecast. Soc. Change 2020, 159, 120201. https://doi.org/10.1016/j.techfore.2020.120201.


(68) Törnberg, P. Echo Chambers and Viral Misinformation: Modeling Fake News as Complex Contagion. PLoS One 2018, 13 (9), 1–21. https://doi.org/10.1371/journal.pone.0203958.


(69) IFCN. International Fact-Checking Network – Poynter https://www.poynter.org/ifcn/ (accessed Jul 27, 2020).


(70) Bruns, A.; Harrington, S.; Hurcombe, E. ‘Corona? 5G? Or Both?’: The Dynamics of COVID-19/5G Conspiracy Theories on Facebook. Media Int. Aust. 2020, 177 (1), 12–29. https://doi.org/10.1177/1329878X20946113.


(71) de-Lima-Santos, M.-F.; Kooli, A. Instagrammable Data: Using Visuals to Showcase More Than Numbers on AJ Labs Instagram Page. Int. J. Commun. 2022, 16, 2821–2842.


(72) Reuters. Facebook removes dozens of vaccine misinformation “superspreaders”. https://www.reuters.com/technology/facebook-removes-dozens-vaccine-misinformation-superspreaders-2021-08-18/ (accessed Oct 11, 2021).


(73) Bastian, M.; Heymann, S.; Jacomy, M. Gephi: An Open Source Software for Exploring and Manipulating Networks. In Proceedings of the International AAAI Conference on Weblogs and Social Media (ICWSM); 2009.


(74) Blondel, V. D.; Guillaume, J. L.; Lambiotte, R.; Lefebvre, E. Fast Unfolding of Communities in Large Networks. J. Stat. Mech. Theory Exp. 2008, 2008 (10), P10008. https://doi.org/10.1088/1742-5468/2008/10/P10008.


(75) Fortunato, S. Community Detection in Graphs. Phys. Rep. 2010, 486 (3–5), 75–174. https://doi.org/10.1016/J.PHYSREP.2009.11.002.


(76) Howard, P. N.; Woolley, S.; Calo, R. Algorithms, Bots, and Political Communication in the US 2016 Election: The Challenge of Automated Political Communication for Election Law and Administration. J. Inf. Technol. Polit. 2018, 15 (2), 81–93. https://doi.org/10.1080/19331681.2018.1448735.


(77) Theocharis, Y.; Jungherr, A. Computational Social Science and the Study of Political Communication. Polit. Commun. 2021, 38 (1–2), 1–22. https://doi.org/10.1080/10584609.2020.1833121.


(78) Girvan, M.; Newman, M. E. J. Community Structure in Social and Biological Networks. Proc. Natl. Acad. Sci. U. S. A. 2002, 99 (12), 7821–7826. https://doi.org/10.1073/pnas.122653799.


(79) Watts, D. J.; Strogatz, S. H. Collective Dynamics of ‘Small-World’ Networks. Nature 1998, 393 (6684), 440–442. https://doi.org/10.1038/30918.


(80) Walter, N.; Salovich, N. A. Unchecked vs. Uncheckable: How Opinion-Based Claims Can Impede Corrections of Misinformation. Mass Commun. Soc. 2021. https://doi.org/10.1080/15205436.2020.1864406.


(81) Mozilla Foundation. Inside the Shadowy World of Disinformation for Hire in Kenya; 2021.


(82) Culliford, E. Facebook cracks down on German anti-COVID restrictions group over “social harm”. https://www.reuters.com/technology/facebook-shuts-down-network-linked-german-anti-covid-group-launches-rules-social-2021-09-16/ (accessed Oct 10, 2021).


(83) Angus, D.; Bruns, A.; Hurcombe, E.; Harrington, S.; Glazunova, S.; Montaña-Niño, S. X.; Obeid, A.; Coulibaly, S.; Copland, S.; Graham, T.; Wright, S.; Dehghan, E. ‘Fake News’ and Other Problematic Information: Studying Dissemination and Discourse Patterns. AoIR Selected Papers of Internet Research. 2021. https://doi.org/10.5210/spir.v2021i0.12089


(84) Dehghan, E.; Glazunova, S. ‘Fake News’ Discourses. J. Lang. Polit. 2021, 20 (5), 741–760. https://doi.org/10.1075/jlp.21032.deh.


(85) de-Lima-Santos, M.-F.; Ceron, W. Coordinated Amplification, Coordinated Inauthentic Behavior, Orchestrated Campaigns? A Systematic Literature Review of Coordinated Inauthentic Content on Online Social Networks. In Mapping Lies in the Global Media Sphere, 1st ed.; Filibeli, T., Özbek, M. Ö., Eds.; Taylor & Francis, 2024.


(86) Porreca, A.; Scozzari, F.; Di Nicola, M. Using Text Mining and Sentiment Analysis to Analyse YouTube Italian Videos Concerning Vaccination. BMC Public Health 2020, 20 (1), 1–9. https://doi.org/10.1186/s12889-020-8342-4.


(87) Wardle, C. Misunderstanding Misinformation. Issues Sci. Technol. 2023, 29 (3), 38–40. https://doi.org/10.58875/zaud1691.