University of Oxford

Open Data about the University of Oxford

data.ox.ac.uk


Research Associate on “Interpretable and Explainable Deep Learning for Natural Language Understanding and Commonsense Reasoning”

Applications for this vacancy closed on 2 January 2020 at 12:00PM
THIS POST IS FOR INTERNAL APPLICANTS ONLY



The Artificial Intelligence and Machine Learning group at the Department of
Computer Science has a vacancy for a Research Associate on “Interpretable and
Explainable Deep Learning for Natural Language Understanding and Commonsense
Reasoning”, funded by the Alan Turing Institute.



Reporting to Professor Thomas Lukasiewicz, you will be responsible for
carrying out research towards new approaches to interpretable and explainable
deep learning for natural language understanding and commonsense reasoning.
You will explore, generalise, and integrate deep learning approaches to
structured data extraction and to large-scale logic-based reasoning, towards
an interpretable and explainable deep-learning approach to human-like
understanding and commonsense reasoning in natural language processing, and
you will investigate its applications in other disciplines, such as
healthcare, engineering, law, and finance. You will also collaborate with
Professor Lukasiewicz and members of his research group, providing guidance
to junior members of the research group, including PhD students, MSc
students, and/or project volunteers.



The primary selection criteria are a PhD/DPhil (or close to completion) in
Computer Science, Mathematics, Statistics, Engineering, Computational
Linguistics, or a related discipline, together with relevant experience. In
particular, you should have a good theoretical and programming background in
machine learning and in knowledge representation and reasoning (desirably in
deep learning and neural networks, deep-learning-based representations,
knowledge bases and graphs, ontology languages, natural language processing,
and explainable and interpretable artificial intelligence), as well as good
software engineering skills (especially in system implementation and
experimental evaluation). Experience in healthcare, engineering, law, or
finance applications is also desirable.



Whilst the role is a Grade 7 position, we would be willing to consider
candidates with potential but less experience who are seeking a development
opportunity, for which an initial appointment would be at Grade 6 (£29,176 -
£34,804 p.a.) with the responsibilities adjusted accordingly. This would be
discussed with applicants at interview/appointment where appropriate.



The closing date for applications is 12.00 noon on Thursday 2 January 2020.



Our staff and students come from all over the world and we proudly promote a
friendly and inclusive culture. Diversity is positively encouraged, through
diversity groups and champions, for example
http://www.cs.ox.ac.uk/aboutus/women-cs-oxford/index.html, as well as a number
of family-friendly policies, such as the right to apply for flexible working
and support for staff returning from periods of extended absence, for example
maternity leave.

dc:spatial
Department of Computer Science, Parks Road, Oxford.
Subject
oo:contact
oo:formalOrganization
oo:organizationPart
vacancy:applicationClosingDate
2020-01-02 12:00:00+00:00
vacancy:applicationOpeningDate
2019-12-16 09:00:00+00:00
vacancy:furtherParticulars
vacancy:internalApplicationsOnly
True
vacancy:salary
type
comment

THIS POST IS FOR INTERNAL APPLICANTS ONLY


The Artificial Intelligence and Machine Learning group at the Department of Computer Science has a vacancy for a Research Associate on “Interpretable and Explainable Deep Learning for Natural Language Understanding and Commonsense Reasoning”, funded by the Alan Turing Institute.


Reporting to Professor Thomas Lukasiewicz, you will be responsible for carrying out research towards new approaches to interpretable and explainable deep learning for natural language understanding and commonsense reasoning. You will explore, generalise, and integrate deep learning approaches to structured data extraction and to large-scale logic-based reasoning, towards an interpretable and explainable deep-learning approach ...

label
Research Associate on “Interpretable and Explainable Deep Learning for Natural Language Understanding and Commonsense Reasoning”
notation
144560
based near
page