Question: Differentiability w.r.t. what exactly? #58
Dear ADAM developers,
I just stumbled across your package and it looks amazing! However, there's a question I couldn't find an answer to in your examples or the README: if I am not mistaken, a JAX/PyTorch implementation of the algorithms included in ADAM should make it possible to compute gradients not only with respect to the joint configuration (as far as I understand, that's what your examples show), but also to leverage autograd to differentiate w.r.t. any model parameter (e.g. joint offsets, i.e. link lengths, or inertial parameters).

Is that something ADAM supports, and if so, would you be so kind as to give me a pointer on where to start with something like that?
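To make concrete what I have in mind, here is a minimal sketch in plain JAX on a toy planar 2R arm. It does not use ADAM's actual API (the function and parametrization are my own illustration); the point is only that any quantity entering the computation as a traced array can be differentiated, whether it is a joint angle or a kinematic parameter:

```python
import jax
import jax.numpy as jnp

def end_effector_position(q, link_lengths):
    """Planar 2R forward kinematics; both arguments are differentiable."""
    l1, l2 = link_lengths
    x = l1 * jnp.cos(q[0]) + l2 * jnp.cos(q[0] + q[1])
    y = l1 * jnp.sin(q[0]) + l2 * jnp.sin(q[0] + q[1])
    return jnp.array([x, y])

q = jnp.array([0.3, -0.5])        # joint configuration
lengths = jnp.array([1.0, 0.8])   # kinematic model parameters

# Jacobian w.r.t. the joint configuration (what the examples demonstrate) ...
J_q = jax.jacobian(end_effector_position, argnums=0)(q, lengths)
# ... and, with no extra machinery, w.r.t. the link lengths.
J_l = jax.jacobian(end_effector_position, argnums=1)(q, lengths)
print(J_q, J_l)
```

My question is essentially whether the same trick carries over to ADAM, i.e. whether the model parameters can be exposed as inputs to the differentiable computation rather than baked in at model-loading time.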