Out of memory on a big query #82
I'm trying to compile a query returning 37 fields, using 14 joins, a single where clause, a group by on 33 fields, and an order by on 4 fields. Sadly, I get "unable to commit 745537536 bytes of memory" when GHC is trying to compile the module containing the query. (I cannot post the query for IP reasons, sorry.)

Do you have any idea of what I could do to help the compiler with this?

Comments
Oh no :-( that's not good. I tried Googling the error message, but nothing useful came up. You could try putting the query alone in its own module or somehow giving GHC more memory (swap space?) to work with. Squeal uses type-level lists, which are quite inefficient when calculating Join and Has and the rest. At runtime all that inefficiency should completely go away, but compile time is a different story. If I had an equivalent example I could investigate more thoroughly.
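(For illustration only: a toy analogue of that kind of type-level lookup, not Squeal's actual definitions, showing why the compile-time work grows with the size of the schema and the number of joined tables.)

```haskell
{-# LANGUAGE DataKinds, PolyKinds, TypeFamilies, TypeOperators, UndecidableInstances #-}

module ToyLookup where

import GHC.TypeLits (Symbol)

-- Walks the list one element per reduction step, like a type-level
-- association-list lookup.
type family Lookup (key :: Symbol) (alist :: [(Symbol, k)]) :: k where
  Lookup key ('(key, val) ': rest)   = val
  Lookup key ('(other, val) ': rest) = Lookup key rest

-- A stand-in "schema"; a real schema behind a 14-way join has far more
-- entries, and every table and column reference triggers lookups like this.
type ExampleSchema =
  '[ '("users", Bool)
   , '("orders", Int)
   ]

-- GHC must unfold Lookup through ExampleSchema to reduce this to Int.
type OrdersType = Lookup "orders" ExampleSchema
```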
Putting the query alone in its own module doesn't seem to make much of a difference. I still have to check whether splitting it into several functions helps (though it probably shouldn't!).
It still consumes 3.5 GB, but that's already way more manageable.
Some more information about this, thanks to the remarkable investigative work done by @haitlahcen.
The current workaround, rather than the […]

For current Squeal users with problematic compilation time and memory usage, […]
Wow! Thanks so much @adfretlink and @haitlahcen! This is great. Sorry Squeal stresses GHC out so much.
Hey! I've opened an issue for stack as well.
Manually unrolling recursive type families should radically improve compile time and memory usage. Might open a PR today.
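(A rough sketch of what "unrolling" means here, using toy type families rather than the Squeal code such a PR would touch: the unrolled version consumes several list elements per reduction step, so long type-level lists need far fewer reductions.)

```haskell
{-# LANGUAGE DataKinds, PolyKinds, TypeFamilies, TypeOperators, UndecidableInstances #-}

module ToyUnroll where

import GHC.TypeLits

-- Recursive: one list element per reduction step.
type family Length (xs :: [k]) :: Nat where
  Length '[]       = 0
  Length (x ': xs) = 1 + Length xs

-- Manually unrolled: base cases spelled out, then four elements per step.
type family Length4 (xs :: [k]) :: Nat where
  Length4 '[]                        = 0
  Length4 '[a]                       = 1
  Length4 '[a, b]                    = 2
  Length4 '[a, b, c]                 = 3
  Length4 (a ': b ': c ': d ': rest) = 4 + Length4 rest
```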
Small update on this topic: we've just squashed our migrations, redefining our […]
Pretty interesting. I wonder what would happen with aggressive use of partial type signatures. If all intermediate schemas are wildcarded […]
How would we do this? Something like:
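(A minimal sketch of the idea, using hypothetical stand-in types rather than Squeal's actual Migration API; the module, type, and table names are illustrative only.)

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures, PartialTypeSignatures, TypeOperators #-}
{-# OPTIONS_GHC -Wno-partial-type-signatures #-}

module ToyMigrations where

import Data.Proxy (Proxy (..))
import GHC.TypeLits (Symbol)

-- Toy stand-in for a schema-indexed migration: it "adds" one table.
data Migration (from :: [Symbol]) (to :: [Symbol]) where
  AddTable :: Proxy table -> Migration from (table ': from)

type SchemaV0 = ('[] :: [Symbol])

-- Only the starting schema is spelled out; the resulting schema is a
-- wildcard that GHC fills in from the body.
addUsers :: Migration SchemaV0 _
addUsers = AddTable (Proxy :: Proxy "users")

-- Both the source and target schemas are wildcards here; GHC infers
-- them instead of us writing the intermediate schema out in full.
addOrders :: Migration _ _
addOrders = AddTable (Proxy :: Proxy "orders")
```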
But then how would GHC ultimately be able to work out the migration order properly?
The way I do it in my projects is I have a directory structure like […], where each […] and […] and re-exports […].

Now, we shouldn't need to define any of the intermediate […]
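(An illustrative sketch of such a layout; the directory and module names here are hypothetical, not taken from this thread.)

```haskell
-- Hypothetical layout (names are illustrative only):
--
--   src/Schema/V0.hs   -- initial schema and the migration that creates it
--   src/Schema/V1.hs   -- schema after the next migration, defined against V0
--   src/Schema.hs      -- re-exports whichever version is current
--
-- src/Schema.hs
module Schema
  ( module Schema.V1  -- downstream code only ever imports Schema
  ) where

import Schema.V1
```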
Attempt to work around OOM. See: morphismtech/squeal#82
@adfretlink Thank you for documenting the workaround using […].

In case anyone needs a repro, here's a PR on my open source project that exhibits this problem: […]

@echatav Thanks for documenting how you organize your schema migrations. I ended up doing something similar on my own, but it's nice to see it being validated: https://github.com/zoomhub/zoomhub/tree/69f420ee9f2d6b88392cfa2657948e1c2c74db30/src/ZoomHub/Storage/PostgreSQL/Schema