Fix memory leak #269

Merged: 9 commits into redbadger:master from alex/memory-leak, Oct 8, 2024

Conversation

@adwhit (Contributor) commented on Sep 30, 2024

I made this primarily to show how it can be done, rather than to offer it as the 'right' fix. Basically, instead of passing Arc<Task> everywhere, we keep the Task inside the QueuingExecutor and just send a unique id (a u32) to notify when it should be polled again. Very 'async executor 101'.

Fixes #268

TODO: Update runtime docs
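
To make the shape concrete, here is a minimal, hypothetical sketch of the approach described above. The names (`BoxFuture`, `Spawner`, `TaskWaker`), the channel layout, and the use of the `futures` crate's `ArcWake` are illustrative assumptions, not the actual crux_core code; the merged version also moves away from the HashMap, as the review below suggests. The executor owns the futures, keyed by a `u32` id, and wakers carry only that id:

```rust
use std::collections::HashMap;
use std::future::Future;
use std::pin::Pin;
use std::sync::mpsc::{channel, Receiver, Sender};
use std::sync::{Arc, Mutex};

use futures::task::ArcWake;

type BoxFuture = Pin<Box<dyn Future<Output = ()> + Send>>;

/// Handed out to capabilities; spawning just sends the boxed future
/// over to the executor.
#[derive(Clone)]
struct Spawner {
    spawn_sender: Sender<BoxFuture>,
}

impl Spawner {
    fn spawn(&self, future: impl Future<Output = ()> + Send + 'static) {
        let _ = self.spawn_sender.send(Box::pin(future));
    }
}

/// The waker no longer owns the task; it only knows the task's id and
/// how to put that id back on the ready queue.
struct TaskWaker {
    id: u32,
    // Mutex purely to make the waker Sync, as ArcWake requires.
    ready_sender: Mutex<Sender<u32>>,
}

impl ArcWake for TaskWaker {
    fn wake_by_ref(arc_self: &Arc<Self>) {
        // If the executor has been dropped, the send fails and we don't care.
        if let Ok(sender) = arc_self.ready_sender.lock() {
            let _ = sender.send(arc_self.id);
        }
    }
}

struct QueuingExecutor {
    // The executor owns every future, so dropping the executor drops them
    // all: there is no Arc<Task> cycle left to keep them alive.
    tasks: HashMap<u32, BoxFuture>,
    next_id: u32,
    spawn_queue: Receiver<BoxFuture>,
    ready_queue: Receiver<u32>,
    ready_sender: Sender<u32>,
}

fn executor_and_spawner() -> (QueuingExecutor, Spawner) {
    let (spawn_sender, spawn_queue) = channel();
    let (ready_sender, ready_queue) = channel();
    let executor = QueuingExecutor {
        tasks: HashMap::new(),
        next_id: 0,
        spawn_queue,
        ready_queue,
        ready_sender,
    };
    (executor, Spawner { spawn_sender })
}
```

Since the executor is the sole owner of each future, removing a finished entry (or dropping the executor itself) frees the task, which is exactly the property the Arc<Task> cycle broke.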

@charypar (Member) left a comment

What a bug! Well found and replicated. 👏🏻

I think the redesign is sensible and pretty minimal, but would like to explore a version without UUIDs and HashMap. Made some comments to suggest how that might work, but may well have missed something.

Comment on lines 110 to 114
// we read off both queues and execute the tasks we receive.
// Since either queue can generate work for the other queue,
// we read from them in a loop until we are sure both queues
// are exhausted
let mut did_some_work = true;

Member:

Would it be simpler to automatically consider received tasks ready and enqueue them as they get an ID? Here we would then go through all new tasks first, then run the ready queue to exhaustion?

Unless I'm missing something, which is not unlikely :)

Member:

Oh, what I'm probably missing is that ready tasks may spawn further tasks, which we need to take in. So the result is the same tick-tock loop...

adwhit (author):

Yeah it's an ugly loop but necessary, I think
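
Continuing the hypothetical sketch from the PR description (same caveats: illustrative names and structure, not the merged crux_core code), the drain loop under discussion might look roughly like this:

```rust
use std::task::Context;

use futures::task::waker;

impl QueuingExecutor {
    /// Either queue can generate work for the other: polling a task can
    /// spawn new tasks, and new tasks start out ready. So we alternate
    /// between the queues until neither yields anything.
    fn run_all(&mut self) {
        let mut did_some_work = true;
        while did_some_work {
            did_some_work = false;

            // Take in newly spawned tasks; each one is immediately ready.
            while let Ok(task) = self.spawn_queue.try_recv() {
                let id = self.next_id;
                self.next_id += 1;
                self.tasks.insert(id, task);
                let _ = self.ready_sender.send(id);
                did_some_work = true;
            }

            // Poll every task that has been woken (or newly spawned).
            while let Ok(id) = self.ready_queue.try_recv() {
                if let Some(task) = self.tasks.get_mut(&id) {
                    // A real implementation would cache the waker rather
                    // than allocate a new one on every poll.
                    let task_waker = waker(Arc::new(TaskWaker {
                        id,
                        ready_sender: Mutex::new(self.ready_sender.clone()),
                    }));
                    let mut cx = Context::from_waker(&task_waker);
                    if task.as_mut().poll(&mut cx).is_ready() {
                        // Dropping the finished future here is what
                        // actually releases its memory.
                        self.tasks.remove(&id);
                    }
                    did_some_work = true;
                }
            }
        }
    }
}
```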

adwhit (author):

There is a chance this might alter observable behaviour, since the running order of the tasks has potentially changed. Of course, we shouldn't be relying on the order in which effects arrive, but in practice, at least in tests, that is hard to avoid.

That could be avoided by having everything arrive on one queue (using an enum), though it's not very pretty.
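
As a rough illustration of that single-queue alternative (a hypothetical sketch reusing `BoxFuture` from the earlier sketch, not something that was merged):

```rust
// Both kinds of work arrive on one channel, preserving the order in
// which they were produced.
enum WorkItem {
    Spawn(BoxFuture),
    Wake(u32),
}
```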

Member:

I think an effectively random order of execution is a good thing: nothing should be relying on the ordering (beyond the causal one).

Six further review comments on crux_core/src/capability/executor.rs (outdated, resolved).
@adwhit changed the title from "[For discussion] Fix memory leak" to "Fix memory leak" on Oct 1, 2024

@charypar (Member) left a comment

This latest version looks great! Happy to merge this unless you want to have a go at that test in this PR

@charypar (Member) left a comment

Bloody hell, there's a blog post behind that test 😳 cool!

@charypar requested a review from obmarg on October 2, 2024, 13:00

@StuartHarris (Member) left a comment

'tis a thing of beauty

@obmarg (Collaborator) left a comment

Nice

@charypar merged commit 9e249e5 into redbadger:master on Oct 8, 2024
9 checks passed
@adwhit deleted the alex/memory-leak branch on October 8, 2024, 12:45
Development

Successfully merging this pull request may close these issues.

Memory leak due to circular reference in Arc<Task>
4 participants