
October 7, 2025

Is Knowledge Dangerous?

By Eva Smith

In this essay I will argue that in an age of mass knowledge sharing, the related dangers are equally large in scale. In the context of mass data collection, the danger of knowledge lies in sources that claim to be representative but are not; sources that aim to be complete but are riddled with serious gaps; and sources that say they are secure but are in fact highly vulnerable. I will first examine the risks of the mass collection of civilian data, especially personal information taken without consent. Secondly, I will explore the risks of technological advances and the implications of data security breaches. Finally, I will discuss the potential for discrimination within healthcare, which may become more prevalent as a result of biased data, showing how easily danger creeps in when data favours one group and inadvertently puts another at risk.


A key danger of the mass collection and storage of civilian data is that companies use data for hidden purposes, breaking the implicit trust between consumer and company. So worried are members of the public about this issue that in an Ipsos poll (Lloyd & Jackson, 2022), 78% of respondents said companies should obtain their consent before accessing and using their data. This was echoed in a survey I conducted of students and staff at my school (Source 1, Smith, 2023), in which only 20% of respondents said they felt “very safe” online. A sample of 46 is small, but nonetheless indicative of deep mistrust.


While trust in these companies does exist in some quarters (research from Mast (2020) shows that smartwatch data may have advanced our understanding of the pandemic’s impact), this trust can easily be lost. Health apps such as Ovia record personal information that is not required for the app to function but is instead collected for sale to third parties (Taylor, 2023).


Further problems arise when organisations have their information accessed by unauthorised third parties. I believe real danger lies in the risk the digital revolution poses to political stability, both nationally and globally.


This revolution is embedding data ever more deeply in our daily lives. It would take an estimated 181.3 million years to download all of the internet’s data (IBM, as cited by Rayaprolu, 2023), showing that the information we create online is increasing exponentially, and hacking technology is advancing alongside it, making major companies the targets of cyberattacks. For example, a Russian-linked hacking group recently targeted the MOVEit software used by the state of Louisiana (Vargas, 2023). They accessed the private information of every licensed Louisiana driver, revealing the terrifying amount of power that can be wielded against organisations across the world (Vargas, 2023). Hackers could potentially expose sensitive government information, causing widespread global outrage.

Humanity’s use of knowledge can seem positive because of the myriad benefits the internet can bring to sectors like education and health. For example, using the internet rather than expensive data centres helps the NHS reduce “overhead costs” (NHS, 2020). Yet, if data is not held securely, what then? Mast (2020) stresses that there are serious consequences for users when companies fail to keep their data safe: Adobe, Zynga and Canva all experienced large-scale customer data breaches in 2019. Misplaced trust in major companies can cause important information to be leaked, while constantly improving hacking technology only increases the frequency of these cyberattacks, putting everyone’s safety at risk.

Finally, while I can see that the mass collection of data can be advantageous, we need to consider how data bias harms us when we rely too heavily on it for our health; it can damage healthcare outcomes for some. For example, women are disadvantaged by the huge disparity between the medical data collected about men and about women (Burns et al., 2023). Burns et al. (2023) argue that “Globally, the quality and quantity of women’s health data collection is uneven.” This can make it hard to find reliable information, as data is likely to be based on men: “women are largely under-represented in medical research”, according to Merone et al. (2022). This, say Burns et al. (2023), means “women wait longer than men for (...) a diagnosis (...) and are more likely to be misdiagnosed”. Although some argue that change is coming through the expanding femtech industry, the wider benefits are less certain. By 2025, the femtech sector could be worth “$50 billion” and “could revolutionize female healthcare” (Frost & Sullivan, n.d.).


However, experts believe that the femtech industry is inadvertently isolating and excluding other groups. “The use of the prefix ‘fem’ can alienate individuals who are non-binary, transgender, or intersex,” says Horsting (2019), leading to a “lack of inclusivity” within the term femtech. This could mean lower levels of non-binary, transgender and intersex “participation” in femtech research, so that people from these groups “receive fewer of the novel [femtech] innovations to potentially support their medical needs” (Horsting, 2019). Even within femtech, where companies and researchers aim to fill the data gaps for women, we could see a vicious cycle of exclusion that continuously alienates and marginalises other groups, worsening their health outcomes. This bias introduces real danger into the care some groups are offered, because the medical information behind it is insufficient.


In conclusion, I believe knowledge that is stored insecurely, collected without consent and not checked for bias is dangerous. Insecure data is more likely to be stolen through hacking, causing widespread mistrust, while bias means data is not representative, leading to exclusion and poor medical advice from the professionals we are supposed to trust. I believe, alongside Amnesty International (2023), that we must “rewire… resist… and rewrite” our relationship with technology. We need to “rewire” our approach to technology to ensure that human safety is at the forefront of advancements. Secondly, we must “resist” the unlawful and surreptitious collection of private information through apps such as Ovia. Finally, we must “rewrite” the way we collect data to be mindful of bias and exclusion (Amnesty International, 2023).




Reference List


Amnesty International. (2023, 13 February). Amnesty International. https://www.amnesty.org/en/tech/ Retrieved 13 July 2023.


Burns, D., Grabowsky, T., Kemble, E., & Pérez, L. (2023). Closing the data gaps in women’s health. McKinsey & Company. Retrieved 5 July 2023.


Cambridge University Press. (n.d.). Knowledge. https://dictionary.cambridge.org/dictionary/english/knowledge Retrieved 11 July 2023.

Frost & Sullivan. (n.d.). Femtech Market - Digitizing Women's Health. Retrieved 5 July 2023.

Horsting, T. (2019). The Pros and Cons of the Rise of Femtech. Patient Worthy. https://patientworthy.com/2019/07/22/increased-focus-womens-healthcare-needs-pros-cons-multiple-sclerosis-ms/ Retrieved 5 July 2023.


IBM, as cited in Rayaprolu, A., Ivanov, I., & Shahnazari, K. (2023). 25+ Impressive Big Data Statistics for 2023. Techjury. https://techjury.net/blog/big-data-statistics/ Retrieved 14 July 2023.


Lloyd, N., & Jackson, C. (2022). Most Americans say it is increasingly difficult to control who can access their online data. Ipsos. Retrieved 11 July 2023.


Mast, S. (2020). Data Collection: The Good, The Bad And The Ugly. Forbes. Retrieved 11 July 2023.


Merone, L., Tsey, K., Russell, D., & Nagle, C. (2022). Sex Inequalities in Medical Research: A Systematic Scoping Review of the Literature. Women’s Health Reports. https://doi.org/10.1089/whr.2021.0083 Retrieved 14 July 2023.

NHS. (2020). The case for Internet First. https://digital.nhs.uk/services/internet-first/internet-first-guidance/the-case-for-internet-first Retrieved 15 July 2023.


Taylor, J. (2023). Fertility apps collect unnecessary personal data and could sell it to third parties – study. The Guardian. Retrieved 12 July 2023.


Vargas, R. (2023). Every Louisiana driver’s license holder exposed in colossal cyber-attack. The Guardian. Retrieved 13 July 2023.

