
PhD on Ethical AI for Health & Medicine

Research / Academic
Eindhoven

Generative artificial intelligence is disrupting many domains, including communication, education, and academic research. Recently, some companies and research initiatives have proposed that LLMs could, and perhaps should, be used in healthcare, both for discovering new medical technologies and for patient treatment. In terms of research, LLMs can find trends and patterns in vast quantities of medical literature, synthesising data and enabling new medical discoveries. In terms of improving medical care, LLMs have the potential to be integrated into the doctor-patient relationship: they can summarise consultations, offer ongoing medical advice, and independently provide mental health services.

Using LLMs and other AI systems in healthcare comes with serious ethical issues. The most challenging of these is privacy, but other ethical issues include: the potential of these technologies to provide misleading or unreliable information to healthcare practitioners, the emergence of 'responsibility gaps' in the healthcare system, and issues of social and epistemic justice among different patients and users. Some of these ethical challenges are similar to risks in other domains, but applying LLMs in healthcare potentially comes with distinct harms. Unreliable healthcare information can have devastating consequences, which may be magnified if the design of LLMs does not take into account the privacy of the highly sensitive data they process, as well as issues of justice and responsibility.

This PhD will begin by examining the key ethical challenges of privacy, reliability, justice, and responsibility, showing how these ethical risks can be minimised or mitigated. This will prepare the candidate to develop a new ethical framework that outlines how generative AI can be ethically deployed in various kinds of healthcare. The candidate can choose to focus either on how LLMs can be used in medical research or on how these technologies can provide new tools for doctors and patients. The candidate will also be responsible for sharing ethical insights with the other partners in the MedGPT consortium (see below for details). Additionally, an interest in analysing the use of generative AI in healthcare from the perspective of philosophy of science or intercultural philosophy would be a plus, although it is not a requirement.

Funding & Institutional Embedding

This PhD position is part of the Medical GPT: Revolutionising Healthcare with Ethical AI project (MedGPT). The candidate will be supervised by Matthew J. Dennis (TU/e), Vlasta Sikimić (TU/e), and Filippo Santoni de Sio (TU/e), and will be hosted by Eindhoven University of Technology (TU/e) in the Philosophy & Ethics Group. The candidate will be responsible for working with other stakeholders in the MedGPT project and will be encouraged to collaborate with other researchers in the consortium.

Philosophy & Ethics Group

TU/e's Philosophy and Ethics (P&E) group connects philosophy and ethics to emerging technologies and innovation. Researchers in the P&E group primarily study innovative technologies and technology-related problems in detail to enable empirically informed analyses that are meaningful to philosophers, researchers across disciplines, and other societal stakeholders. To do this, the group has established close interdisciplinary collaborations with researchers from groups in the TU/e School of Innovation Sciences, as well as with mechanical engineers, climate scientists, and archaeologists, among others. The group's expertise covers a variety of philosophical sub-disciplines, including applied ethics, normative ethics, meta-ethics, philosophy of science and technology, and epistemology. More information about P&E can be found here: https://www.tue.nl/en/research/research-groups/innovation-sciences/philosophy-ethics

Requirements:

  • A master's degree (or an equivalent university degree) in philosophy or a related discipline.
  • A research-oriented and impact-oriented attitude.
  • The ability to work as part of a team of researchers who work on similar topics but come from different disciplines.
  • An interest in collaborating with governmental, policy, and industry stakeholders.
  • Fluency in spoken and written English.
  • We particularly welcome applications from candidates belonging to groups that have been traditionally underrepresented in academia, including, but not limited to, women and ethnic minorities.
  • The desired starting date is August 2025.

Salary Benefits:

A meaningful job in a dynamic and ambitious university, in an interdisciplinary setting and within an international network. You will work on a beautiful, green campus within walking distance of the central train station. In addition, we offer you:

  • Full-time employment for four years, with an intermediate evaluation (go/no-go) after nine months. You will spend 10% of your employment on teaching tasks.
  • Salary and benefits (such as a pension scheme, paid pregnancy and maternity leave, and partially paid parental leave) in accordance with the Collective Labour Agreement for Dutch Universities, scale P (min. €2,901, max. €3,707).
  • A year-end bonus of 8.3% and annual vacation pay of 8%.
  • High-quality training programs and other support to grow into a self-aware, autonomous scientific researcher. At TU/e we challenge you to take charge of your own learning process.
  • An excellent technical infrastructure, on-campus children's day care and sports facilities.
  • An allowance for commuting, working from home and internet costs.
  • Staff Immigration Team and a tax compensation scheme (the 30% facility) for international candidates.

Work Hours:

38 hours per week

Address:

De Rondom 70