aten.leakyrelu.default in unary_ops#7975
Conversation
Helpful Links: See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/7975
Note: Links to docs will display an error until the docs builds have been completed.
❌ 1 New Failure as of commit ac77b1c with merge base 7bc06d1; the following job has failed:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D68688186
Summary: Pull Request resolved: pytorch#7975
Differential Revision: D68688186
Force-pushed from d92f078 to 0c89c5f.
Force-pushed from 0c89c5f to ac77b1c.
      "hardshrink"); \
  }

#define DEFINE_LEAKY_RELU_FN(op_name) \
Why do we need a macro that is only used once?
I couldn't use the other macros (DEFINE_ACTIVATION_FN, for instance) since the function signature is different. Also, other activations (like PReLU) could reuse the same macro in the future.
@hossein1387 has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.