🚀 Feature request

Hi! I would like to use torch functions straight from the yaml file. However, I have not managed to get this to work, as the typing check fails. The way around it is to wrap them in callable classes, but it would be great to support them directly, like the dot imports for the torch optimizers, so that I don't have to duplicate them or create a wrapper class.

I think this would not be possible. If you run inspect.signature(torch.argmax), it just fails, and this wouldn't even be possible to fix on the PyTorch side. The problem is that torch.argmax has multiple signatures (see help(torch.argmax)), which is something that native Python functions don't support.

I will keep this in mind in case a better idea comes up. For now I think a wrapper class is the best option, possibly a single class which receives the torch function name, so that there is no need for one class for each function.

> Possibly a single class which gets the torch function name so that there is no need for one class for each function.

This is also what I did, but I also came up with a different, somewhat hacky idea: pass the function as a dict and parse it into a partial function later.
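The maintainer's point about multiple signatures can be reproduced without torch: the stdlib builtin max also has several call forms, and inspect.signature fails on it the same way it fails on torch.argmax, while a single-signature builtin like len introspects fine. A minimal check (helper name is mine, for illustration):

```python
import inspect

def has_introspectable_signature(fn) -> bool:
    """Return True if inspect.signature can produce a Signature for fn."""
    try:
        inspect.signature(fn)
        return True
    except (TypeError, ValueError):
        return False

# len exposes a single signature, so introspection succeeds.
print(has_introspectable_signature(len))   # → True
# max has multiple call forms (like torch.argmax), so signature() raises.
print(has_introspectable_signature(max))   # → False
```

This is exactly what breaks a typing check that relies on signature introspection: there is no single Signature object to validate arguments against.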
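A minimal sketch of the single-wrapper-class idea suggested in the thread — one callable class that takes a dotted function name plus keyword arguments, instead of one wrapper class per torch function. The class name and argument names here are hypothetical, not from jsonargparse or torch; the demo uses a stdlib function so it runs without torch installed:

```python
import importlib
from typing import Any

class FunctionWrapper:
    """Callable wrapper around a function given by its dotted import path.

    Because this is a plain class with an explicit __init__, signature-based
    tooling can introspect the wrapper even when the wrapped function itself
    (e.g. torch.argmax) cannot be introspected. Hypothetical sketch only.
    """

    def __init__(self, function: str, **kwargs: Any):
        module_name, _, attr = function.rpartition(".")
        self._fn = getattr(importlib.import_module(module_name), attr)
        self._kwargs = kwargs

    def __call__(self, *args: Any) -> Any:
        return self._fn(*args, **self._kwargs)

# Demonstrated with a stdlib builtin standing in for torch.argmax:
argmax_like = FunctionWrapper("builtins.max", default=0)
print(argmax_like([3, 1, 2]))  # → 3
print(argmax_like([]))         # → 0 (falls back to the stored default kwarg)
```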
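The dict-and-partial idea mentioned in the thread could look like the following sketch: a config value carries the function's dotted path and its keyword arguments as a plain dict, which is parsed into a functools.partial after loading. The key names ("function", "kwargs") are illustrative assumptions, not an API of any particular library:

```python
import importlib
from functools import partial

def parse_function_spec(spec: dict) -> partial:
    """Turn {'function': 'pkg.mod.fn', 'kwargs': {...}} into a partial.

    Key names are hypothetical; adapt to whatever shape the YAML uses.
    """
    module_name, _, attr = spec["function"].rpartition(".")
    fn = getattr(importlib.import_module(module_name), attr)
    return partial(fn, **spec.get("kwargs", {}))

# e.g. a YAML value loaded as {'function': 'builtins.sorted',
#                              'kwargs': {'reverse': True}}
sort_desc = parse_function_spec(
    {"function": "builtins.sorted", "kwargs": {"reverse": True}}
)
print(sort_desc([1, 3, 2]))  # → [3, 2, 1]
```

Since the dict is an ordinary typed value, the typing check passes at parse time, and the resulting partial is called like the original function.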