dc.description.abstract |
Recurrent Neural Networks (RNNs) trained on neuroscience tasks are promising models
of the population dynamics of their biological counterparts [1, 2, 3, 4]. In this approach, RNNs
are constrained only by the task definition and can potentially find many solutions to the same
task. This is because training RNNs (or any highly parameterized function) on input-output
examples can suffer from underspecification: different algorithms or mechanisms can reproduce
the same input-output examples. In light of this, how do we ensure that our RNN models
are faithful to the biological computation being modelled?
To approach this question systematically, we need a ground-truth model. We therefore
use a student-teacher framework in which both the teacher and the student are RNNs. In
particular, a teacher RNN is trained to solve a task – much like an animal in the laboratory –
and the student is constrained to match either the behavior or the neural data of the teacher.
We then compare mechanistic similarity by invoking the concept of stress tests – inputs drawn
from outside the task-related training distribution. We recognise that both behavioral and neural
constraints have weaknesses in this regard, and that to obtain any guarantee of mechanistic
similarity, the student needs information about the teacher's response to input perturbations.
Motivated by the abundance of behavioral data, we propose a novel method of
training RNNs that we call the 'Jacobian constraint', wherein we constrain not only the RNN's
input-output behavior but also its sensitivity – the behavioral response to infinitesimal stress
tests. We find that RNNs obtained by this method replicate stress-test behavior better than
those obtained by constraining RNNs to neural data [5, 6]. |
en_US |