Code snippet for running with r2.7 release #215


Draft · wants to merge 2 commits into main
Conversation

@zpcore (Collaborator) commented Apr 24, 2025

This is a code snippet to make torchprime runnable with the PyTorch/XLA r2.7 release. There is no plan to merge it; it is just for reference.

@zpcore (Collaborator, Author) commented Apr 24, 2025

Maybe in the future we can release torchprime in step with the PyTorch/XLA release schedule, to make sure we always have a version that works with PT/XLA.

@tengyifei (Collaborator) commented Apr 24, 2025

This is a tricky problem. For example, I would like to upstream the host offload graph transform to PyTorch/XLA this or next week. And I think @zpcore you were also thinking of replacing the splash attention implementation in torchprime with the upstream one. If torchprime picks up a requirement to work with the latest stable release, then we will have to wait three months.

Conversely, if there is, e.g., a great feature that applies to all models, we won't be able to use it for three months.

We could avoid waiting three months by splitting the code paths with version checks. My worry is that this would quickly become complicated, given the sheer number of new things needed in PyTorch/XLA.
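To make the version-check idea concrete, here is a minimal sketch; the helper names and the assumption that the upstream feature lands in 2.8.0 are hypothetical, not actual torchprime code:

```python
# Hypothetical sketch of a version-gated code path; the 2.8.0 cutoff and the
# function names are illustrative assumptions, not real torchprime APIs.

def version_tuple(v: str) -> tuple:
    """Parse '2.7.1' or a date-suffixed '2.7.1.0501' into a comparable tuple."""
    return tuple(int(p) for p in v.split("+")[0].split(".") if p.isdigit())

def has_upstream_splash_attention(xla_version: str) -> bool:
    # Assumed cutoff: the upstream implementation is available from 2.8.0 on.
    return version_tuple(xla_version) >= (2, 8, 0)

def pick_attention_impl(xla_version: str) -> str:
    if has_upstream_splash_attention(xla_version):
        return "upstream"   # use the PyTorch/XLA implementation
    return "in-tree"        # fall back to torchprime's own implementation
```

Every such fork point doubles part of the test matrix, which is exactly the complication described above.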

This is a much less severe problem in JAX because JAX releases bi-weekly. As a result, MaxText can easily depend on the latest stable version of JAX.

cc @yaoshiang

@tengyifei (Collaborator) commented

IMO a simpler solution would be to provide an installation script that automatically installs the right nightly wheels, matching the version used by the E2E tests.
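A sketch of what such a script might look like; the pinned date and the wheel version strings are assumptions for illustration, and a real script would also point pip at the matching torch_xla wheel index:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: derive both pins from one date so torch and torch_xla
# always match the nightly that the E2E tests were last green against.
set -euo pipefail

NIGHTLY_DATE="20250424"                 # bumped whenever E2E tests go green
TORCH_SPEC="torch==2.8.0.dev${NIGHTLY_DATE}"
XLA_SPEC="torch_xla==2.8.0.dev${NIGHTLY_DATE}"

# Print the command rather than running it, so the pin can be inspected first.
echo "pip install --pre ${TORCH_SPEC} ${XLA_SPEC}"
```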

@yaoshiang (Collaborator) commented

I actually think we should release PT/XLA frequently: release a 2.7.1 in [2/4] weeks, etc. Combined with my "pinned dep on torch" proposal, users wouldn't have to think about which version of torch to match up with; installing torch_xla would just install the right version of torch.

I don't think there is consensus on that, however.

In the meantime, as we add more scenarios to torchprime, I think each will need its own installation instructions with pins to specific nightlies that we know work. The generic latest "nightly" would arguably be less optimal than a specific nightly, since nightlies are by definition not stable.

@tengyifei (Collaborator) commented

> so we should release a 2.7.1 in [2/4] weeks, etc. Along with my "pinned dep on torch" proposal

That's a good idea. In fact, I think we don't even have to be restricted to minor versions. E.g. if PyTorch releases 2.7.1, we could cut from HEAD periodically:

  • 2.7.1.0501
  • 2.7.1.0515
  • etc.

And with a guarantee that they all work (ABI-compatible and tested) with PyTorch 2.7 and also with PyTorch HEAD.
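One nice property of such a scheme, sketched below with illustrative version strings: the date suffix is just another numeric release component, so the cuts sort naturally and picking the newest build gives the latest cut.

```python
# Illustrative only: date-suffixed cuts compare as plain numeric tuples.
def parse(v: str) -> tuple:
    return tuple(int(p) for p in v.split("."))

cuts = ["2.7.1.0501", "2.7.1.0515", "2.7.1.0601"]
latest = max(cuts, key=parse)  # the newest periodic cut from HEAD

# A plain "2.7.1" is a prefix of every dated cut, so it compares lower,
# i.e. any dated cut supersedes the base patch release.
assert parse("2.7.1") < parse("2.7.1.0501")
```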

3 participants