Weapons and Artificial Intelligence: Two Options for Denying the Value-Neutrality Thesis | Tomsk State University Journal of Philosophy, Sociology and Political Science. 2020. № 58. DOI: 10.17223/1998863X/58/8

Weapons and Artificial Intelligence: Two Options for Denying the Value-Neutrality Thesis

The article shows that the value-neutrality thesis contains contradictions and cannot be used in the practice of the humanitarian assessment of technology. The theory of value embedding is analyzed, and the conclusion is drawn that this approach neither explains how technical objects can be used to implement values that their developers did not anticipate nor discloses the process by which values are "embedded". It is shown that the concept of value embedding therefore cannot serve as the basis for a humanitarian assessment of technologies. The attribution of values to technical objects themselves, which can be found in various concepts of ethical artificial intelligence, is not fully justified and can lead to serious negative social and humanitarian consequences. At the same time, owing to their physical organization and functional orientation, technical objects structure many types of human activity and make some of them possible. They can therefore indirectly influence the realization of certain values, and this must be taken into account when regulating the use of these objects. Approaches to the humanitarian assessment of technologies and to forecasting the possible directions in which values may be realized were developed in the twentieth century in international law, in the assessment of various types of weapons. The analysis shows that, under current conditions, it is advisable to apply such approaches to artificial intelligence systems, which can cause serious humanitarian and social transformations. Meanwhile, in the field of intelligent systems, a practice of lenient legal regulation based on the theory of value embedding is being actively promoted: the functions of axiological and ethical assessment are in fact transferred from experts to the developers of these technologies. In these conditions, it is important that specialists in the philosophy of technology, axiology, and ethics speak out actively in the public space on these issues.


Keywords

value, artifact, axiology of technology, value-neutrality thesis, artificial intelligence, theory of value embedding

Authors

Name: Yastreb Natalia A.
Organization: Vologda State University
E-mail: nayastreb@mail.ru

References

Mitcham C., Waelbers K. Technology and Ethics: Overview // A Companion to the Philosophy of Technology / eds. J.B. Olsen, S. Pedersen, V. Hendricks. West Sussex : Wiley Blackwell, 2009. P. 367-383.
Ellul J. The Technological Society / transl. J. Wilkinson ; introd. R. Merton. New York : Vintage Books, 1964. 449 p.
Beijing AI Principles. URL: https://www.baai.ac.cn/news/beijing-ai-principles-en.html (accessed: 06.06.2020).
The Moral Status of Technical Artefacts / eds. P. Kroes, P.-P. Verbeek. Dordrecht : Springer Netherlands, 2014. Vol. 17. 248 p.
Klenk M. How Do Technological Artefacts Embody Moral Values? // Philosophy & Technology. 2020. URL: https://link.springer.com/article/10.1007/s13347-020-00401-y (accessed: 06.06.2020).
Verbeek P.P. Moralizing Technology: Understanding and Designing the Morality of Things. London : University of Chicago Press, 2011. 200 p.
Poel van de I., Kroes P. Can technology embody values? // The Moral Status of Technical Artefacts. 2014. P. 103-124.
Hoven van den J. Design for values and values for design // Information Age. 2005. № 4. P. 4-7.
Declaration Renouncing the Use of Explosive and Incendiary Bullets. URL: http://old.memo.ru/prawo/hum/spb-1868.htm (accessed: 04.06.2020).
The 1980 UN Convention on Certain Conventional Weapons. URL: https://www.icrc.org/ru/document/konvenciya-oon-po-konkretnym-vidam-obychnogo-oruzhiya-1980-goda (accessed: 01.06.2020).
Convention on Cluster Munitions. URL: https://www.un.org/ru/documents/decl_conv/conventions/pdf/cluster_munitions.pdf (accessed: 01.06.2020).
AI Ethics Principles / Australian Government, Department of Industry, Science. URL: https://www.industry.gov.au/data-and-publications/building-australias-artificial-intelligence-capability/ai-ethics-framework/ai-ethics-principles
On Artificial Intelligence: A European approach to excellence and trust. URL: https://ec.europa.eu/info/sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf
Top Ten Principles for Ethical AI. URL: http://www.thefutureworldofwork.org/media/35420/uni_ethical_ai.pdf (accessed: 01.06.2020).
Ochigame R. The invention of “ethical AI”: how big tech manipulates academia to avoid regulation // The Intercept. 2019. URL: https://theintercept.com/2019/12/20/mit-ethical-ai-artificial-intelligence/ (accessed: 01.06.2020).