Doctoral Consortium

Research in responsible computing and AI can involve interdisciplinary, multidisciplinary, and transdisciplinary work, which adds further challenges to an already challenging doctoral career trajectory.

The Doctoral Consortium will take place on October 27th, 2025. Doctoral students in areas related to responsible computing, responsible AI, human-AI interaction, and policy who are admitted to the Doctoral Consortium will receive 1:1 and small-group mentorship from established responsible computing and AI faculty, as well as opportunities for networking and establishing themselves as future voices in the area. We also plan career skill development sessions, such as proposal writing. Consortium participants will also be expected to present at a poster session on October 28th and to participate in interactive visioning sessions during the main Summit event.

Doctoral students in responsible computing, responsible AI, human-centered AI, human-AI interaction, policy, and related areas are encouraged to apply using the Application Form. Your advisor will need to submit a 1-page reference letter to Mark Riedl and Kartik Goyal.

Applications submitted before September 15, 2025 will receive full consideration.

We especially encourage doctoral students who are shortly before or after their thesis proposal defense, as this is the period when additional mentorship can have substantial impact.

We also especially encourage doctoral students who are working in labs that operate from a different epistemic background than that of the student. For example, a student with traditional CS training working in a policy lab (or vice versa), or a student with traditional HCI training working in an AI lab (or vice versa).

Other strong candidates are welcome and will be fully considered.

Applicants selected for participation in the Consortium will have their flights, lodging, and other travel expenses covered (US travel only; sorry, our funds have restrictions).