I wish this talk existed ten years ago. Thank you so much, Joachim, for being our guest!
That cyclic vs infinite list is a very neat trick, I had no idea!
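For anyone who hasn't seen that part yet, here is a minimal sketch of the distinction (my own example, not the code from the talk): a cyclic list is a single cons cell whose tail points back at itself, while an infinite list keeps allocating fresh cells as it is forced.

```haskell
-- Cyclic: one cons cell whose tail is the list itself, so traversing
-- it never allocates anything beyond that single cell.
ones :: [Int]
ones = 1 : ones

-- Infinite but not cyclic: every element that gets forced is a freshly
-- allocated cell, so the heap grows if something holds on to the head.
naturals :: [Integer]
naturals = [1 ..]

main :: IO ()
main = print (take 5 ones, take 5 naturals)
```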
I can now clearly visualize and understand why Haskell can consume a large chunk of memory. Haskell's laziness is a blessing and a curse at the same time. The language surely isn't suitable for systems programming, where memory is limited, but correctness is the critical prize it achieves at all costs.
Thanks, this is great! Though I wish you had also looked at what space leaks look like on the heap and how they happen.
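Since the comments above ask about laziness eating memory and about space leaks, here is a hedged, minimal sketch (my own example, not one from the talk) of the classic thunk-buildup leak and its strict fix:

```haskell
import Data.List (foldl')

-- The lazy left fold builds a million nested (+) thunks on the heap
-- before anything is evaluated: the textbook laziness-induced leak.
leaky :: Integer
leaky = foldl (+) 0 [1 .. 1000000]

-- foldl' forces the accumulator at each step, so the same sum runs in
-- constant space.
frugal :: Integer
frugal = foldl' (+) 0 [1 .. 1000000]

main :: IO ()
main = print (leaky, frugal)
```

Running a build with the RTS heap profiler (e.g. +RTS -hT) makes the difference between the two visible on the heap.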
19:10 A nice example of why the indexing operator is dangerous: "l !! 4" is the fifth element.
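To spell out the off-by-one hazard (a hedged sketch, not the snippet from the talk): (!!) is zero-based and partial, so the mistake compiles fine and only shows up at runtime.

```haskell
main :: IO ()
main = do
  let l = [10, 20, 30, 40, 50]
  print (l !! 4)      -- 50: indexing is zero-based, so this is the fifth element
  -- print (l !! 5)   -- would throw an "index too large" exception at runtime
```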
I LOVE HIS SHIRT!
Thank you!
This memoisation is really cool, but I remember using Haskell arrays for dynamic programming back in uni, which should still be faster than walking over the index tree with its O(log n) access. So why would (or should) you use trees rather than arrays (O(1) access) in this way, or am I not understanding something correctly?
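A hedged sketch of the trade-off being asked about (the function and bounds are my own, not from the talk): when the subproblems form a dense integer range known up front, an immutable array gives O(1) lookups and the usual lazy knot-tying still memoises each cell exactly once; tree- or map-shaped memoisation mainly earns its O(log n) when the key space is sparse, unbounded, or not integer-shaped.

```haskell
import Data.Array (Array, listArray, (!))

-- Memoised Fibonacci via a lazy array: each cell is a thunk that reads
-- other cells of the same array, so every subproblem is computed at
-- most once and every lookup is O(1).
fibMemo :: Int -> Integer
fibMemo n = table ! n
  where
    table :: Array Int Integer
    table = listArray (0, n) (map fib [0 .. n])
    fib 0 = 0
    fib 1 = 1
    fib k = table ! (k - 1) + table ! (k - 2)

main :: IO ()
main = print (fibMemo 80)
```

The catch is that the array needs its index range fixed in advance, which is presumably why the talk reaches for structures that can grow on demand.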
Good stuff
But this is done in interactive mode. I had thought that at runtime Haskell doesn't retain all the metadata, including constructors, and that only the compiler knows about it and deals with it at compile time. Does anybody know which is right?
In Haskell, type erasure happens during code generation, in contrast to e.g. Java, where objects have metadata associated with them that is available at runtime. You can still access type info with type classes like Data.Typeable and Data.Data, but you must be explicit about it. I think what we see here is debug information added by GHCi (I don't know the specifics). Executables built with profiling preserve type info that can be collected at runtime in the .prof files. Not sure about regular builds.
Disclaimer: I am not an expert on GHC, so take this with a grain of salt.
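To make the "you must be explicit about it" part concrete, here is a small sketch (my own example) using Data.Typeable: the Typeable constraint is what threads the type representation through to runtime.

```haskell
import Data.Typeable (Typeable, typeOf)

-- typeOf recovers a runtime representation of a value's type, but only
-- because the Typeable constraint carries that information explicitly;
-- without the constraint the type really is erased.
describe :: Typeable a => a -> String
describe x = "a value of type " ++ show (typeOf x)

main :: IO ()
main = do
  putStrLn (describe (Just (3 :: Int)))  -- a value of type Maybe Int
  putStrLn (describe "hello")            -- a value of type [Char]
```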
Wow, GHC deduplicates and reuses small objects (essentially the flyweight pattern). Is it possible to turn that feature off? What is the impact in a highly concurrent environment with lots of threads? The JVM has a similar feature, string deduplication, but it is usually deactivated because of races.
Why is the Nothing pointer not 0 but some dangling-looking address?!
To distinguish it from the [] pointer?
Uninitialized memory, I guess
If deciding that it's correct to ignore the svgcairo problem is so simple that it can be done without justification, why does cabal need me to tell it to do that? And if it's not so simple to decide, why would I go ahead and do it?
The justification is that it builds 😃
But yes, you are right that some of this is unnecessarily hard. I hope things will improve over time here.
@nomeata the evidence to date is that it doesn't improve
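For readers wondering what "telling cabal to do that" looks like in practice, here is a hedged sketch (some-package and some-dependency are placeholders, not the actual names from the situation in the talk): the allow-newer field in cabal.project is the explicit, recorded way to let the solver ignore an upper bound.

```
-- cabal.project (sketch; some-package and some-dependency are
-- placeholders): permit the solver to ignore the upper bound that
-- some-package declares on some-dependency.
packages: .
allow-newer: some-package:some-dependency
```

The quicker but blunter alternative is the command-line flag cabal build --allow-newer, which relaxes every upper bound for that invocation without leaving a record in the project file.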
AAAAAAAAmazing!