
Fix for monodle failure #629

Open

meenakshiramanathan1 wants to merge 1 commit into main from mramanathan/convtranspose2d

Conversation

meenakshiramanathan1
Contributor

@meenakshiramanathan1 meenakshiramanathan1 commented Nov 6, 2024

Fix #513

  1. Moved the pybuda fix for the monodle model to forge.
  2. Removed padded activations from the convtranspose2d eval, as they were causing improper output shape calculation.
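
For reference, a minimal sketch (plain PyTorch, not the forge eval code) of the standard ConvTranspose2d output-size formula the eval path needs to reproduce; pre-padding the activations changes H_in/W_in and therefore this result:

import torch
import torch.nn as nn

def convtranspose2d_out_dim(in_dim, kernel, stride, padding, dilation=1, output_padding=0):
    # H_out = (H_in - 1) * stride - 2 * padding + dilation * (kernel - 1) + output_padding + 1
    return (in_dim - 1) * stride - 2 * padding + dilation * (kernel - 1) + output_padding + 1

x = torch.randn(20, 16, 50, 100)
m = nn.ConvTranspose2d(16, 33, kernel_size=3, stride=2, padding=0)
out = m(x)
assert out.shape[2] == convtranspose2d_out_dim(50, 3, 2, 0)   # 101
assert out.shape[3] == convtranspose2d_out_dim(100, 3, 2, 0)  # 201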

@meenakshiramanathan1 meenakshiramanathan1 changed the title Mramanathan/convtranspose2d Fix for monodle failure Nov 6, 2024
@meenakshiramanathan1 meenakshiramanathan1 force-pushed the mramanathan/convtranspose2d branch 3 times, most recently from 0aeabea to f9c284f Compare November 6, 2024 06:50
input_nid = input_[0]
input_node = graph["nodes"][input_nid]
if input_node["op"] == "parameter" and input_node["name"].endswith("weight"):
    in_channel = input_node["attrs"]["shape"][0][0][0]
Contributor

Will this handle different shape ranks?

Contributor Author

@meenakshiramanathan1 meenakshiramanathan1 Nov 7, 2024

Yes, inputs to convtranspose2d will be either 3D or 4D, which is handled here. I have added inputs with different shape ranks to the sanity tests now.
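
For illustration, a hypothetical pytest parametrization (names are not the actual sanity-test code) covering both ranks:

import pytest
import torch
import torch.nn as nn

# 3D (unbatched) and 4D (batched) activations, matching the ranks handled above
@pytest.mark.parametrize("input_shape", [(16, 50, 100), (20, 16, 50, 100)])
def test_convtranspose2d_ranks(input_shape):
    m = nn.ConvTranspose2d(16, 33, kernel_size=(3, 3), stride=2)
    out = m(torch.randn(*input_shape))
    assert out.dim() == len(input_shape)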

input_nid = input_[0]
input_node = graph["nodes"][input_nid]
if input_node["op"] == "parameter" and input_node["name"].endswith("weight"):
    in_channel = input_node["attrs"]["shape"][0][0][0]
Contributor

How does it work for channel first and channel last cases?

Contributor Author

input_node["attrs"]["shape"][0][0][0] corresponds to the in-channels from the param being sent
In case of below example, it will be 16 and it would be the same for channelfirst and channel last cases

"in_channels, out_channels, kernel_size, stride, padding, groups, bias, dilation, padding_mode, input_shape",
    [ 16, 33, (3, 3), 2, 0, 1, True, 1, "zeros", (20, 16, 50, 100)) ]
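
As a quick check in plain PyTorch (illustrative, not the forge graph code): ConvTranspose2d stores its weight as (in_channels, out_channels / groups, kH, kW), so the leading dimension of the weight parameter is in_channels regardless of activation layout.

import torch.nn as nn

m = nn.ConvTranspose2d(in_channels=16, out_channels=33, kernel_size=(3, 3), stride=2)
print(m.weight.shape)  # torch.Size([16, 33, 3, 3]) -> index 0 is in_channels == 16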

self.padding_bottom,
]

assert self.padding_top == self.padding_bottom, "Padding values for top and bottom must be the same."
Contributor

As this is a TTNN limitation, do we need to assert this here?

Let's track this in their issues, and model it further if we see that this is by design :))

Contributor Author

@meenakshiramanathan1 meenakshiramanathan1 Nov 8, 2024

@nvukobratTT, the TTNN limitation applies to a different model (mobilenetv2), not this one. These changes were added to modify the padding handling in the eval function of the convtranspose2d op as a fix for the monodle model.

Contributor

Is this an implementation limitation on our side regarding conv2d transpose? More precisely, my question is: can we support this on the FFE with different padding for each side?

If yes, let's update the assert message to be a bit clearer about why this is a limitation :))

Contributor Author

@meenakshiramanathan1 meenakshiramanathan1 Nov 8, 2024

Sure Nikola, I will check that and make required changes.
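
For reference, a rough sketch (plain PyTorch with hypothetical values, not the forge FFE implementation) of one way asymmetric padding could be emulated, since transposed-convolution padding only trims rows/columns from the full output:

import torch
import torch.nn.functional as F

x = torch.randn(1, 16, 50, 100)
w = torch.randn(16, 33, 3, 3)  # (in_channels, out_channels / groups, kH, kW)
pad_top, pad_bottom, pad_left, pad_right = 1, 2, 0, 1  # hypothetical asymmetric padding

full = F.conv_transpose2d(x, w, stride=2, padding=0)
out = full[..., pad_top:full.shape[-2] - pad_bottom, pad_left:full.shape[-1] - pad_right]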

Development

Successfully merging this pull request may close these issues.

AssertionError: Only supports group of 1 or in_channel