Async Execution Functions #3


Draft · guill wants to merge 2 commits into master
Conversation

guill (Owner) commented Apr 19, 2025

This commit adds support for node execution functions defined as `async`. When
a node's execution function is defined as `async`, we can continue
executing other nodes while it is processing.

Standard uses of `await` should "just work", but people will still have
to be careful if they spawn actual threads. Because torch doesn't really
have async/await versions of its functions, this won't particularly help
with most locally-executing nodes, but it does work for e.g. web
requests to other machines.
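
For illustration, a custom node along these lines should be able to `await` a network call without blocking the rest of the graph. This is a minimal sketch, not code from this PR; the node class, the example URL, and the use of `aiohttp` are assumptions.

```python
# Minimal sketch of a custom node whose execution function is async.
# The node name, URL, and use of aiohttp are illustrative assumptions.
import aiohttp


class RemoteTextNode:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "url": ("STRING", {"default": "http://example.com/api"}),
            }
        }

    RETURN_TYPES = ("STRING",)
    FUNCTION = "execute"
    CATEGORY = "example"

    # Declared with `async def`, so the executor can keep running other
    # nodes while this request is in flight.
    async def execute(self, url):
        async with aiohttp.ClientSession() as session:
            async with session.get(url) as response:
                text = await response.text()
        return (text,)


NODE_CLASS_MAPPINGS = {"RemoteTextNode": RemoteTextNode}
```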

Remaining work:

1. The UI doesn't properly display multiple concurrent node executions.
2. I probably need some work to handle the case where one node hits an
   out-of-memory error (or other exception) while another node is
   concurrently executing.
3. If people are doing node expansion within an async function and using
   the `GraphBuilder`, they'll need to provide a manual prefix (see the
   sketch after this list). The default one won't necessarily work
   properly in this case. This looks easy to fix in Python 3.12+ with
   contextvars, but I have to figure out how to do it in 3.11 and earlier.
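
Regarding item 3, the workaround would look roughly like the sketch below: derive an explicit prefix (here from the node's own `unique_id`) instead of relying on the `GraphBuilder` default. This is a sketch under assumptions: it presumes `GraphBuilder` accepts a `prefix` argument and that the usual node-expansion return shape (`result` plus `expand`) applies; the node class and the expanded subgraph are hypothetical.

```python
# Sketch of providing a manual prefix to GraphBuilder from inside an async
# execution function, so expansions from concurrently running nodes don't
# collide. Assumes GraphBuilder(prefix=...) is accepted; the node class and
# the "EmptyLatentImage" expansion are illustrative.
from comfy_execution.graph_utils import GraphBuilder


class AsyncExpandingNode:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {},
            "hidden": {"unique_id": "UNIQUE_ID"},
        }

    RETURN_TYPES = ("LATENT",)
    FUNCTION = "execute"
    CATEGORY = "example"

    async def execute(self, unique_id):
        # Derive the prefix from this node's own id rather than relying on
        # the default, which may not be unique across concurrent async tasks.
        graph = GraphBuilder(prefix=f"{unique_id}.expansion")
        latent = graph.node("EmptyLatentImage", width=512, height=512, batch_size=1)
        return {
            "result": (latent.out(0),),
            "expand": graph.finalize(),
        }
```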

guill added 2 commits April 19, 2025 01:12