<ocharles>
Holy moly. Running haskell.nix against our project causes it to use _50GB_
<ocharles>
Make that close to 100GB o.o
<ocharles>
I think this is all at evaluation time though, so not sure how to debug. Maybe I can trace some stuff
<ocharles>
Oh, weird. Every package in my project copies the entire `src` directory. I guess using `src = ./.` is a bad idea
<michaelpj>
__monty__: you should be able to set `materialized` to a path and then run the `updateMaterialized` script and it will copy the files in
<michaelpj>
it should definitely work if the files aren't right!
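For reference, a rough sketch of how those two pieces might fit together in a project expression (the compiler name, hash, and materialized path below are placeholders, not taken from this discussion; exactly how the `updateMaterialized` script is exposed depends on your haskell.nix version):

```nix
# Hypothetical sketch of a materialized haskell.nix project.
pkgs.haskell-nix.cabalProject {
  src = ./.;
  compiler-nix-name = "ghc8107";       # placeholder
  # Optionally pin the plan so a stale materialization is detected:
  # plan-sha256 = "...";
  # Directory of checked-in plan files; regenerate its contents by running
  # the project's updateMaterialized script after dependency changes.
  materialized = ./nix/materialized;   # placeholder path
}
```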
<michaelpj>
ocharles: yeah, you definitely want to filter the source!
<michaelpj>
and that is... a lot of memory
<michaelpj>
is your project really big?
<ocharles>
Sorry, it's disk space, not memory
<ocharles>
I don't think it's that big. 100k LOC, 2.2GB git checkout
<michaelpj>
yeah, my bet would be lots of copies of the source
<ocharles>
Why don't they all just share the same source directory?
<michaelpj>
I think we do some additional filtering per-package, so if you start with something that has lots of crap in it, you may get it repeated many times
<ocharles>
Ah. Yea, I'm seeing every Haskell project has our vendored node_modules and stuff
<michaelpj>
well, if package A depends on package B, but they're in the same source tree and they both used the same source, then a change to the source of package A would force a rebuild of package B, even though B doesn't depend on A
<ocharles>
True
<ocharles>
Ok, let's see what `cleanGit` gives
<ocharles>
Yea, this looks much better! Just a few GB used. I might just override each `packages.*.src` to be more specific and see if that's any better. I'll read the haskell.nix source, cause I'm kind of surprised it's not just using what the plan JSON states for the `src` attribute
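As a hedged sketch of the two approaches mentioned here, assuming `haskellLib.cleanGit` and `haskellLib.cleanSourceWith` (with a `subDir` argument) are available and that `packages.<name>.src` can be set from a module; every name below is a placeholder:

```nix
# Hypothetical sketch: git-clean the whole tree, then narrow one package's
# source down to its own subdirectory via a module override.
pkgs.haskell-nix.cabalProject {
  src = pkgs.haskell-nix.haskellLib.cleanGit {
    name = "my-project";               # placeholder
    src = ./.;
  };
  compiler-nix-name = "ghc8107";       # placeholder
  modules = [{
    packages.my-package.src = pkgs.haskell-nix.haskellLib.cleanSourceWith {
      name = "my-package-src";
      src = ./.;
      subDir = "my-package";           # placeholder subdirectory
    };
  }];
}
```

Narrowing each package's source this way should also keep a change under one subdirectory from invalidating every other package's source derivation.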
<michaelpj>
ocharles: the source filtering is actually pretty complicated! I don't actually know the details, but hamishmack might have more to say
<ocharles>
this might actually be enough as is, but will do some reading and digging
<ocharles>
Thanks, as always, for the help!
<ocharles>
I've at least replaced the build of all our Haskell-backed tools with haskell.nix now, so the last remaining bit is our actual code
<ocharles>
Can't wait to have tests run in parallel with downstream stuff building. So much time wasted for what is almost always a green light :(
<ocharles>
And also can't wait to stop being a manual solver whenever we have to upgrade a Hackage package -__-
<michaelpj>
ocharles: we don't necessarily save you from solver issues! after all, we're just taking the plan from cabal, so if your plan doesn't work, you're still in the same place
<ocharles>
sure, but usually there is a plan. My current situation is: bump this package, build, bump another package, build. Repeat for the best part of a morning
<ocharles>
We currently have 176 cabal2nix calls
<ocharles>
And this just relies on the strategy of "just use the latest version and pray this'll all work"
<michaelpj>
Y I K E S
<michaelpj>
176??
<michaelpj>
wow
<angerman>
hamishmack is on leave until ~mid next week.
<ocharles>
Yea, now you can see why I want to throw this current approach in the bin!
<ocharles>
Is `optimization: False` in `cabal.project` meant to do anything? I seem to still be compiling with `-O`
<michaelpj>
ocharles: I think `optimization: False` only applies to your project packages? So your dependencies will still be compiled with optimization. I think we should do the same thing as cabal, but it's possible it doesn't work due to the ghc-options issue
<ocharles>
That's actually what I want, but my local packages are still being compiled with optimizations (and crashing GHC, eep)
<ocharles>
In fact, even `package *\n optimization: False` is compiling with `-O`. Hmm
<ocharles>
I'll double check the plan.json, I guess that's the source of truth
<angerman>
ocharles: it's a bit sad that not all flags end up in plan.json; in general, cabal is open to just accepting pull requests that add flags.
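If `optimization: False` really isn't making it into the plan, one possible workaround from the Nix side is to disable optimization per component through the module system. A minimal sketch, assuming the per-component `configureFlags` option exists in your haskell.nix version (the package name is a placeholder):

```nix
# Hypothetical sketch: turn optimization off for one project package,
# regardless of what ends up in plan.json.
pkgs.haskell-nix.cabalProject {
  src = ./.;
  compiler-nix-name = "ghc8107";       # placeholder
  modules = [{
    packages.my-package.components.library.configureFlags =
      [ "--disable-optimization" ];    # passed to Setup configure
  }];
}
```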
<ocharles>
Hmm, `shellFor {}` gives me a shell with all dependencies of all local packages _except_ source-repository-package entries. Is that expected?
<ocharles>
(Trying to add it to `additional` now)
<ocharles>
I have a feeling they end up counting as local packages, which is the default test
<ocharles>
Ok, yea - `(import ./nix/pkgs {}).circuithub-haskell-nix.hsPkgs.opaleye.isLocal` => true. But `isProject` => false, so I'll just use that.
<ocharles>
Though I guess it doesn't entirely matter, because if something is a source-repository-package, it seems like Cabal doesn't care that it's in the global ghc-pkg database anyway
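For reference, a minimal sketch of the shell being described, assuming a hypothetical project package called `my-app`; `additional` is the shellFor argument used above to pull in the source-repository-package dependency:

```nix
# Hypothetical sketch: dev shell for the project packages, with the
# source-repository-package entry (opaleye) added explicitly.
(import ./nix/pkgs {}).circuithub-haskell-nix.shellFor {
  packages = ps: with ps; [ my-app ];  # placeholder project package
  additional = ps: [ ps.opaleye ];
}
```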