{"id":1965,"date":"2025-05-04T12:23:33","date_gmt":"2025-05-04T16:23:33","guid":{"rendered":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/?p=1965"},"modified":"2025-05-04T12:25:17","modified_gmt":"2025-05-04T16:25:17","slug":"computer-vision-ethics","status":"publish","type":"post","link":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/science\/computer-vision-ethics\/","title":{"rendered":"Computer Vision Ethics"},"content":{"rendered":"<p><span style=\"font-weight: 400\">Computer vision (CV) is a field of computer science that allows computers to \u201csee\u201d or, in more technical terms, recognize, analyze, and respond to visual data, such as videos and images. CV is widely used in our daily lives, from something as simple as recognizing handwritten text to something as complex as analyzing and interpreting MRI scans. With the advent of AI in the last few years, CV has also been improving rapidly. However, just like any subfield of AI nowadays, CV has its own set of ethical, social, and political implications, especially when used to analyze people&#8217;s visual data.<\/span><\/p>\n<p><span style=\"font-weight: 400\">Although CV has been around for some time, there is limited work on its ethical implications in the general AI field. In the existing literature, authors have categorized six ethical themes: espionage, identity theft, malicious attacks, copyright infringement, discrimination, and misinformation [1]. As seen in Figure 1, one of the main CV applications is face recognition, which can also raise issues of error, function creep (the expansion of a technology beyond its original purpose), and privacy
[2].<\/span><\/p>\n<figure id=\"attachment_1967\" aria-describedby=\"caption-attachment-1967\" style=\"width: 2559px\" class=\"wp-caption alignnone\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-1967 size-full\" src=\"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-content\/uploads\/sites\/35\/2025\/05\/Screenshot-2025-05-04-at-12.10.39\u202fPM.png\" alt=\"Computer Vision technologies related to Identity Theft\" width=\"2559\" height=\"1359\" srcset=\"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-content\/uploads\/sites\/35\/2025\/05\/Screenshot-2025-05-04-at-12.10.39\u202fPM.png 2559w, https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-content\/uploads\/sites\/35\/2025\/05\/Screenshot-2025-05-04-at-12.10.39\u202fPM-300x159.png 300w, https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-content\/uploads\/sites\/35\/2025\/05\/Screenshot-2025-05-04-at-12.10.39\u202fPM-1024x544.png 1024w, https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-content\/uploads\/sites\/35\/2025\/05\/Screenshot-2025-05-04-at-12.10.39\u202fPM-768x408.png 768w, https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-content\/uploads\/sites\/35\/2025\/05\/Screenshot-2025-05-04-at-12.10.39\u202fPM-1536x816.png 1536w, https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-content\/uploads\/sites\/35\/2025\/05\/Screenshot-2025-05-04-at-12.10.39\u202fPM-2048x1088.png 2048w\" sizes=\"auto, (max-width: 2559px) 100vw, 2559px\" \/><figcaption id=\"caption-attachment-1967\" class=\"wp-caption-text\">Figure 1: Specific applications of CV that could be used for Identity Theft.<\/figcaption><\/figure>\n<p><span style=\"font-weight: 400\">To discuss CV\u2019s ethics, the authors of the article take a critical approach to evaluating the implications through the framework of power dynamics. 
The three types of power analyzed are dispositional, episodic, and systemic power [3].\u00a0<\/span><\/p>\n<p><b><i>Dispositional Power<\/i><\/b><\/p>\n<p><span style=\"font-weight: 400\">Dispositional power is defined as the ability to bring about a significant outcome [4]. When people gain that power, they feel empowered to explore new opportunities, and their scope of agency increases (they become more independent in their actions) [5]. However, CV can threaten this dispositional power in several ways, ultimately reducing people\u2019s autonomy.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">One way CV disempowers people is by limiting their control over their information. Since CV works with both pre-existing and real-time camera footage, people are often unaware that they are being recorded and frequently cannot avoid it. The technology thus makes it hard for people to control the data gathered about them, and protecting their personal information may require measures as extreme as hiding their faces.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">Beyond having little control over what data is gathered about them, the average person also finds it extremely difficult to know what specific information can be retrieved from their visual data. 
CV can also disempower people from following their own judgment: by communicating who they are for them (automatically inferring people\u2019s race, gender, and mood), by creating a forced moral environment (where people act out of fear of being watched rather than from their own intentions), and by potentially fostering over-dependence on computers (e.g., relying on face recognition to interpret emotions).\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">In all these and other ways, CV undermines the foundation of dispositional power by limiting people\u2019s ability to control their information, make independent decisions, express themselves, and act freely.<\/span><\/p>\n<p><b><i>Episodic Power<\/i><\/b><\/p>\n<p><span style=\"font-weight: 400\">Episodic power, often referred to as power-over, is the direct exercise of power by one individual or group over another. CV can both create new forms of this power and make existing ones more efficient [6]. While this isn&#8217;t always a bad thing (for example, parents watching over children), problems arise when CV makes that control too invasive or one-sided, especially in ways that limit people&#8217;s freedom to act independently.<\/span><\/p>\n<p><span style=\"font-weight: 400\">With CV taking security cameras to the next level, opportunities such as baby-room monitoring and fall detection for elderly people open up to us. However, CV also automates surveillance, which can lead to over-enforcement at every scale, from private individuals to larger organizations (workplaces, insurance companies, etc.). 
Other shifts in power dynamics also need to be considered: smart doorbells, for example, capture far more than the person at the door and can violate neighbors\u2019 privacy by creating peer-to-peer surveillance.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">These examples show that while CV may offer convenience or safety, it can also tip power balances in ways that reduce personal freedom and undermine one\u2019s autonomy.<\/span><\/p>\n<p><b><i>Systemic Power<\/i><\/b><\/p>\n<p><span style=\"font-weight: 400\">Systemic power is not viewed as an individual exercise of power, but rather as a set of societal norms and practices that affect people\u2019s autonomy by determining what opportunities people have, what values they hold, and what choices they make. CV can strengthen systemic power by making law enforcement more efficient through smart cameras and by increasing businesses\u2019 profits through business intelligence tools.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">However, CV can also reinforce pre-existing systemic societal injustices. One example is flawed facial recognition, where algorithms are more accurate at recognizing White and male faces [7], which has led to a number of false arrests. Such biases can deny people equal opportunities (when biased systems are used in hiring processes) or harm their self-worth (when they are falsely identified as criminals).\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">Another matter of systemic power is the environmental cost of CV. AI systems rely on vast amounts of data, which require intensive energy to process and store. As societies become increasingly dependent on AI technologies like CV, those trying to protect the environment have little ability to resist or reshape these damaging practices. The power lies with tech companies and industries, leaving citizens without the means to challenge the system. 
It is when the system becomes harder to challenge or change that ethical concerns regarding CV become most pressing.<\/span><\/p>\n<p><b><i>Conclusion<\/i><\/b><\/p>\n<p><span style=\"font-weight: 400\">Computer vision is a powerful tool that keeps evolving each year. We already see numerous applications of it in our daily lives, from self-checkouts in stores and smart doorbells to autonomous vehicles and tumor detection. Alongside the potential CV holds to improve our lives and make them safer, there are a number of ethical limitations that should be considered. We need to critically examine how CV affects people\u2019s autonomy, creates one-sided power dynamics, and reinforces societal prejudices. As we rapidly transition into an AI-driven world, there is more to come in the field of computer vision. However, in the pursuit of innovation, we should ensure that progress does not come at the cost of our ethical values.<\/span><\/p>\n<p><strong>References:<\/strong><\/p>\n<p><span style=\"font-weight: 400\">[1] Lauronen, M.: Ethical issues in topical computer vision applications. Master\u2019s thesis, Information Systems, University of Jyv\u00e4skyl\u00e4 (2017). <\/span><a href=\"https:\/\/jyx.jyu.fi\/bitstream\/handle\/123456789\/55806\/URN%3aNBN%3afi%3ajyu-201711084167.pdf?sequence=1&amp;isAllowed=y\"><span style=\"font-weight: 400\">https:\/\/jyx.jyu.fi\/bitstream\/handle\/123456789\/55806\/URN%3aNBN%3afi%3ajyu-201711084167.pdf?sequence=1&amp;isAllowed=y<\/span><\/a><\/p>\n<p><span style=\"font-weight: 400\">[2] Brey, P.: Ethical aspects of facial recognition systems in public places. J. Inf. Commun. Ethics Soc. 2(2), 97\u2013109 (2004). <\/span><a href=\"http:\/\/doi.org\/10.1108\/14779960480000246\"><span style=\"font-weight: 400\">https:\/\/doi.org\/10.1108\/14779960480000246<\/span><\/a><\/p>\n<p><span style=\"font-weight: 400\">[3] Haugaard, M.: Power: a \u201cfamily resemblance concept.\u201d Eur. J. Cult. Stud. 
13(4), 419\u2013438 (2010)<\/span><\/p>\n<p><span style=\"font-weight: 400\">[4] Morriss, P.: Power: a philosophical analysis. Manchester University Press, Manchester, New York (2002)<\/span><\/p>\n<p><span style=\"font-weight: 400\">[5] Morriss, P.: Power: a philosophical analysis. Manchester University Press, Manchester, New York (2002)<\/span><\/p>\n<p><span style=\"font-weight: 400\">[6] Brey, P.: Ethical aspects of facial recognition systems in public places. J. Inf. Commun. Ethics Soc. 2(2), 97\u2013109 (2004). https:\/\/doi.org\/10.1108\/14779960480000246<\/span><\/p>\n<p><span style=\"font-weight: 400\">[7] Buolamwini, J., Gebru, T.: Gender shades: intersectional accuracy disparities in commercial gender classification. Conference on Fairness, Accountability, and Transparency, pp. 77\u201391 (2018)<\/span><\/p>\n<p><span style=\"font-weight: 400\">[8] Coeckelbergh, M.: AI ethics. MIT Press (2020)<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Computer vision (CV) is a field of computer science that allows computers to \u201csee\u201d or, in more technical terms, recognize, analyze, and respond to visual data, such as videos and images. 
CV is widely used in our daily lives, from something as simple as recognizing handwritten text to something as complex as analyzing and interpreting [&hellip;]<\/p>\n","protected":false},"author":736,"featured_media":1983,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_genesis_hide_title":false,"_genesis_hide_breadcrumbs":false,"_genesis_hide_singular_image":false,"_genesis_hide_footer_widgets":false,"_genesis_custom_body_class":"","_genesis_custom_post_class":"","_genesis_layout":"","footnotes":""},"categories":[65,1],"tags":[85,87,96,60,243,108,227],"class_list":{"0":"post-1965","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-csci-tech","8":"category-science","9":"tag-ai","10":"tag-ai-ethics","11":"tag-artificial-intelligence","12":"tag-csci-tech","13":"tag-computer-vision","14":"tag-ethics","15":"tag-technology","16":"entry"},"featured_image_src":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-content\/uploads\/sites\/35\/2025\/05\/1-600x400.jpeg","featured_image_src_square":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-content\/uploads\/sites\/35\/2025\/05\/1-600x600.jpeg","author_info":{"display_name":"Madina Sotvoldieva 
'28","author_link":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/author\/msotvoldieva\/"},"_links":{"self":[{"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/posts\/1965","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/users\/736"}],"replies":[{"embeddable":true,"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/comments?post=1965"}],"version-history":[{"count":0,"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/posts\/1965\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/media\/1983"}],"wp:attachment":[{"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/media?parent=1965"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/categories?post=1965"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/students.bowdoin.edu\/bowdoin-science-journal\/wp-json\/wp\/v2\/tags?post=1965"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}