Technology, Power & Ethics (DP IB Theory of Knowledge): Revision Note
Technology, power & ethics
Technology can shift power by changing who can access information, influence decisions, and control how knowledge is produced and shared
Ethical questions arise because technological choices affect people’s rights and well-being, including privacy, fairness, and who benefits or is harmed
Creators of technology need to consider their responsibility to users and to wider society
E.g. exploring the complexities of responsible decision-making in autonomous vehicles
Surveillance and privacy
Surveillance is the monitoring, recording or tracking of people’s behaviour or communications using technology
Privacy is about an individual's control over personal information: how it is collected, how it is used and who can access it
Surveillance can increase security and accountability, but it can also reduce privacy by collecting data people did not knowingly agree to share
E.g. CCTV in public spaces can deter crime, but it also records people who are not suspected of wrongdoing
Power can shift towards those who control surveillance tools and data, because they can observe others without being observed themselves
Ownership of information
Ownership of information refers to who has the right to access, control, sell or delete data
Ownership can be unclear when information is co-produced
E.g. a fitness app generates health data from your body, but the company stores it and may analyse it for commercial purposes
Those who own or control data can gain power by deciding who gets access and what it can be used for
Ethical issues arise when people do not understand what they are agreeing to, or cannot realistically refuse, meaning that consent for data sharing may be weak
Algorithmic manipulation and control
Algorithms can influence our use of technology by shaping what we see and what is rewarded or discouraged
E.g. a social media feed prioritises emotionally engaging content, increasing time on the platform
The power of the technology platform increases when algorithms are opaque, because people cannot easily challenge decisions or understand how they are being influenced
Ethical concerns include:
reduced autonomy
exploitation of attention
unequal impacts on different groups
Counterbalances include transparency about how recommendations work, user controls, and independent oversight of high-stakes systems
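The feed-ranking mechanism described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the post fields, scores and weights are invented, not taken from any real platform.

```python
# Hypothetical sketch: how an engagement-ranked feed might order posts.
# All field names, scores and weights below are invented for illustration.

posts = [
    {"title": "Calm local news update", "emotional_score": 0.2, "relevance": 0.9},
    {"title": "Outrage-bait headline", "emotional_score": 0.9, "relevance": 0.3},
    {"title": "Friend's holiday photos", "emotional_score": 0.5, "relevance": 0.7},
]

def engagement_rank(post):
    # Weighting emotional engagement far more heavily than relevance
    # keeps users scrolling, even when the content matters less to them.
    return 0.8 * post["emotional_score"] + 0.2 * post["relevance"]

feed = sorted(posts, key=engagement_rank, reverse=True)
print([p["title"] for p in feed])
# The emotionally charged post is ranked first, the most relevant one last.
```

Because the weights are hidden inside the ranking function, a user cannot see why one post appears above another, which is the opacity problem the note describes.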

Social and ethical implications of technological knowledge
| Social implications | Ethical implications |
| --- | --- |
| Can change how people work, learn and communicate, reshaping who has influence | Technological knowledge can be over-trusted because it looks precise or data-driven, even when it depends on incomplete data or contestable assumptions |
| Can increase reliance on technological outputs in everyday decisions | Can create unfair outcomes if systems reflect bias, or if impacts fall more heavily on some groups |
| Can widen gaps between groups if access to tools, data or digital skills is unequal | Requires accountability: who is responsible when tool-based decisions cause harm? |
| Can spread information quickly at a large scale, which can improve coordination but also increase the impact of misleading content | Requires responsible use: checking reliability and fairness, and being clear about limitations and uncertainty |