
NXP Backend: Add imxrt700cm backend which combines the Neutron and CortexM backends#18488

Open
MartinPavella wants to merge 5 commits into pytorch:main from
nxp-upstream:nxg01483/EIEX-762-create-the-aot-part-of-imxrt700cm-backend-combining-neutron-and-cortex-m

Conversation

MartinPavella (Collaborator) commented Mar 25, 2026

Summary

Add the imxrt700cm backend, which combines the Neutron and CortexM backends into one. The backend delegates to Neutron wherever possible, and the leftover nodes are handled by Cortex-M.
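The "Neutron first, Cortex-M for the leftovers" scheme can be sketched as a simple priority assignment. This is a toy illustration with hypothetical names and op sets, not the actual ExecuTorch partitioner API:

```python
# Hypothetical op sets for illustration only; the real backends declare
# support per-node through their partitioners.
NEUTRON_OPS = {"conv2d", "linear", "relu"}
CORTEX_M_OPS = {"softmax", "pad", "add"}

def partition(graph_ops):
    """Assign each op to the first backend that supports it, Neutron first."""
    assignment = {}
    for op in graph_ops:
        if op in NEUTRON_OPS:
            assignment[op] = "neutron"
        elif op in CORTEX_M_OPS:
            assignment[op] = "cortex_m"
        else:
            assignment[op] = "portable"  # fall back to portable kernels
    return assignment
```

Ordering matters here: because Neutron is consulted first, an op supported by both backends always lands on the NPU.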

Test plan

Unit tests are provided.

cc @robert-kalmar @JakeStevens @digantdesai

…ors.

The operator `dim_order_ops._clone_dim_order.default` uses the `kwargs` to determine the output dim order. Since the `kwargs` were always empty, this operator produced an incorrect result in the pass, which broke the rest of the model.
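The failure mode described above can be illustrated with a toy model of the operator (not the real `dim_order_ops` implementation): the target dim order travels in `kwargs`, so dropping the `kwargs` silently falls back to the default contiguous order instead of, say, channels-last.

```python
def clone_dim_order(shape, **kwargs):
    """Toy stand-in: return the dim order a clone of `shape` would use."""
    default = tuple(range(len(shape)))  # contiguous, e.g. (0, 1, 2, 3)
    return tuple(kwargs.get("dim_order", default))

nchw = (1, 3, 8, 8)
# kwargs lost -> silently contiguous, even if channels-last was intended:
assert clone_dim_order(nchw) == (0, 1, 2, 3)
# kwargs preserved -> the intended channels-last order:
assert clone_dim_order(nchw, dim_order=[0, 2, 3, 1]) == (0, 2, 3, 1)
```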
@MartinPavella MartinPavella self-assigned this Mar 25, 2026
@MartinPavella MartinPavella added module: nxp Issues related to NXP Neutron NPU delegation and code under backends/nxp/ release notes: nxp Changes to the NXP Neutron backend delegate labels Mar 25, 2026
pytorch-bot bot commented Mar 25, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18488

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

✅ You can merge normally! (2 Unrelated Failures)

As of commit b45f3f7 with merge base 5e77594:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Mar 25, 2026
MartinPavella force-pushed the nxg01483/EIEX-762-create-the-aot-part-of-imxrt700cm-backend-combining-neutron-and-cortex-m branch 3 times, most recently from b474c94 to dda9ddd on March 25, 2026 at 13:06
MartinPavella force-pushed the nxg01483/EIEX-762-create-the-aot-part-of-imxrt700cm-backend-combining-neutron-and-cortex-m branch from dda9ddd to b45f3f7 on March 25, 2026 at 14:12
training, the weights will be stored in the file.
:param train: Boolean indicating whether to train the model.
:param num_epochs: Number of epochs to use during training.
:param cortex_m_safe: There is a bug in the Cortex-M backend related to the `pad` operator. If this parameter is
Contributor commented:
Let's fix this if it is something quick as opposed to introducing a new bypass logic? WDYT - CC @rascani since you were discussing this earlier this week in the context of NHWC.

from torchao.quantization.pt2e.quantizer.quantizer import Q_ANNOTATION_KEY


class IMXRT700CMQuantizer(Quantizer):
Contributor commented:

Quantizers are meant to be composable. Recipe is the right user-facing abstraction to target an SoC with multiple different backends. Take a look at https://github.com/pytorch/executorch/blob/main/export/tests/test_target_recipes.py especially something like get_android_recipe to understand how two or more quantizers / partitioners are encapsulated and made to work together.

In your case, I imagine a target recipe for rt700 with neutron and cortex-m.
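The composition pattern the reviewer suggests can be sketched in miniature: instead of one monolithic IMXRT700CMQuantizer, several quantizers run in order and each annotates only the nodes it owns. The class and method names below are illustrative, not the torchao pt2e API:

```python
class OpSetQuantizer:
    """Toy quantizer that claims a fixed set of op names."""
    def __init__(self, name, ops):
        self.name = name
        self.ops = set(ops)

    def handles(self, node):
        return node in self.ops

class ComposedQuantizer:
    """Run child quantizers in order; the first to claim a node wins."""
    def __init__(self, quantizers):
        self.quantizers = quantizers

    def annotate(self, nodes):
        annotations = {}
        for q in self.quantizers:
            for node in nodes:
                if node not in annotations and q.handles(node):
                    annotations[node] = q.name
        return annotations
```

With Neutron's quantizer listed first, a node both backends could quantize gets Neutron's annotation, mirroring the partitioning priority.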
