PhD Position in Moral or Theoretical Philosophy at TU Delft, The Netherlands (2022)

A PhD position on "Ethics of artificial intelligence in the defence domain" is available for candidates holding a Master's degree in moral or theoretical philosophy, with a demonstrable interest in the design of (AI) technologies, at the Faculty of Technology, Policy and Management, TU Delft (Delft University of Technology), the Netherlands.


General Info

Position: PhD
No. of Positions: 1
Research Field:
Deadline to Apply: Expired
Joining Date: ASAP
Contract Period: 4 years
Salary: € 2,443.00 - € 3,122.00/month

Faculty of Technology, Policy & Management

Delft University of Technology (TU Delft), the Netherlands



Qualification Details

  • You will have completed, prior to appointment, a Master's degree or equivalent in moral or theoretical philosophy, with a demonstrable interest in the design of (AI) technologies. Other Master's degrees will also be considered, for instance in science and technology studies, engineering, or computer science, if you have demonstrable expertise in conceptual and philosophical problems and approaches to embedding values in technological design.
  • You have strong conceptual and analytic skills.
  • You have proven capacity to work across disciplinary boundaries.
  • Relative to experience, you have excellent research skills and excellent academic writing and presentation skills.
  • You can work both independently and as part of a team.
  • You have a high level of proficiency in Dutch.

Doing a PhD at TU Delft requires English proficiency at a certain level to ensure that the candidate is able to communicate and interact well, participate in English-taught Doctoral Education courses, and write scientific articles and a final thesis. For more details please check the Graduate Schools Admission Requirements.

Responsibilities/Job Description

Military AI-enabled systems pose serious ethical, legal and societal challenges. Given the unique nature of work in the defence domain, the stakes are immense. Several scientists and activist groups have warned against the potential emergence of "killer robots", i.e., autonomous weapon systems that select and attack targets without meaningful human control. Other applications of AI in defence are also subject to ethical and legal objections, such as AI systems that play a role in providing situational awareness or collecting intelligence. These applications can have serious ethical and legal ramifications, as they may, for instance, bias decision-making processes on morally sensitive issues. The use of AI technology in defence means handing over some degree of autonomy and responsibility to machines, which may impact human agency, human dignity and human rights in warfare.

Without sufficient consideration of ethical, legal and societal aspects (ELSA) of the use of AI in the defence domain, risks like losing control, biased decision-making, violation of rights, and decreasing humanity in warfare may result in losing public support. These risks and consequences should be avoided.

Currently, it is unclear which AI-enabled systems are acceptable from ethical, legal and societal perspectives, which are not, and under what conditions or circumstances. This leads to both "over-use" (e.g., deploying too many AI systems in too many situations, without due consideration of the consequences) and "under-use" (e.g., not using AI at all, owing to lack of knowledge or fear of consequences). Both over-use and under-use of AI in defence may put the protection of the freedom, safety and security of society at risk.

This PhD project will deliver a methodology for the safe and sound use of AI in the defence domain. The methodology will have to ensure ethical, legal and societal alignment in all stages of the design, acquisition, and operationalisation of autonomous systems and military human-machine teams. The project will also identify co-design methods for human-machine teams that can be used to achieve ethical, legal, and societal compliance. It will help identify the algorithms needed to ensure this compliance, and will support efforts to incorporate ethical, legal and societal aspects in a system-of-AI-systems.

This PhD position will be part of the ELSA (ethical, legal and societal aspects) Lab Defence, granted under the NWA call "Human-centred AI for an inclusive society – towards an ecosystem of trust".

The successful candidate will work under the supervision of Filippo Santoni de Sio, Jeroen van den Hoven, Mark Neerincx, and Jurrian van Diggelen. Filippo Santoni de Sio and Jeroen van den Hoven are, respectively, associate and full professor in ethics and philosophy of technology at TU Delft. They have worked, among other things, on design for values, the ethics of digital technologies, and meaningful human control over autonomous systems. Mark Neerincx is full professor in Human-Centered Computing at Delft University of Technology, and Principal Scientist at the TNO Department of Human-Machine Teaming. Jurrian van Diggelen is Senior Researcher at TNO Defence, Safety and Security, and coordinator of the ELSA Lab of which this PhD is part.


How to Apply?

Application Method: Online Application
Ref. No.: TUD02187

Application Procedure

Please submit the following items:

  • A one-page letter of motivation
  • A curriculum vitae of maximum 3 pages
  • Names and contact information of two referees
  • A one-page research note setting out your ideas about how you propose to carry out the project described above
  • A writing sample of maximum 20 pages
  • Academic grade transcripts of your highest degree programme(s)

Shortlisted candidates will be invited to an interview, presumably in Delft, in the weeks of June 6 or 13.

TU Delft stands for diversity and inclusion. We welcome employees with a wide variety of backgrounds and perspectives.

A pre-employment screening can be part of the application procedure.

About the Department/Section/Group

Faculty Technology, Policy and Management

With its excellent education and research at the intersection of technology, society and policy, the Faculty of TPM makes an important contribution to solving complex technical-social issues, such as energy transition, mobility, digitalisation, water management and (cyber) security. We combine insights from the engineering sciences, the social sciences and the humanities. We develop robust models and designs, are internationally oriented and have an extensive network of knowledge institutions, companies, social organisations and governments.


Other Details

The position is based at the Ethics and Philosophy of Technology Section of the Faculty of Technology, Policy and Management at TU Delft, which provides a stimulating and internationally oriented research environment. The candidate will participate in the activities of the TU Delft Graduate School, the Faculty of Technology, Policy and Management, and the 4TU.Centre for Ethics and Technology.

Doctoral candidates will be offered a 4-year period of employment, in principle in the form of two employment contracts: an initial 1.5-year contract with an official go/no-go progress assessment within 15 months, followed by a contract for the remaining 2.5 years, assuming satisfactory progress and that performance requirements are met.

Salary and benefits are in accordance with the Collective Labour Agreement for Dutch Universities, increasing from € 2,443 per month in the first year to € 3,122 in the fourth year. As a PhD candidate, you will be enrolled in the TU Delft Graduate School. The TU Delft Graduate School provides an inspiring research environment with an excellent team of supervisors, academic staff and a mentor. The Doctoral Education Programme is aimed at developing your transferable, discipline-related and research skills.

The TU Delft offers a customisable compensation package, discounts on health insurance and sports memberships, and a monthly work costs contribution. Flexible work schedules can be arranged. For international applicants, we offer the Coming to Delft Service and Partner Career Advice to assist you with your relocation.


Partners in the ELSA Lab consortium, combined, have ample experience in (military) AI systems, with a focus on legal (Leiden University, Asser), ethical (TU Delft), societal (HHS) and technical (TNO, NLDA) aspects. The consortium will democratise its research process by including civil organisations. By including stakeholders that represent civil society, the project will collaborate according to the Quadruple Helix model. The consortium will innovate through interactive processes that allow all stakeholders to contribute knowledge and perspectives.

The successful candidate is expected to play an active role in the ELSA Lab project and to participate actively in the (stakeholder) workshops, public events, and other project activities.

Because of the nature of the proposed work, a security screening may be required to fulfill this position.


Contact details

For information about the application procedure, please contact Filippo Santoni de Sio at [email protected]

