There is one more reason to formulate patterns like this, which is in my opinion even more important than any you mentioned. Programming is all about patterns, and the better a language is at capturing and formalising those patterns, the better a language it is overall. Why are patterns important? Because we are extremely good at recognising them, and that helps us a lot when learning new stuff or finding our way around complex systems. For instance, once upon a time I was learning the programming language OCaml, which has a wonderful concurrency library called Lwt. I needed to learn that library for a project I was building. So I started reading the documentation, and it didn't really explain that much until I found out there were bind and return functions. At that moment I said to myself: aha, so it's just a monad! And voila, from that moment on I was capable of basic usage of the library. Those two definitions alone were sufficient to get me started. It was an amazing experience, and all just because I recognised a familiar pattern in the system, which helped me navigate my learning process and use the system productively very quickly.
Finally someone who understands learning: without the why, the content is pointless. Thanks for this content; this is the most engaging and simple explanation of the starting point of category theory I've found after watching and reading for many hours.
It would also be great if you could mention your references somewhere in the video. Did you really learn all this only from Hackage and Wikipedia?
Not really. For engineers knowing what it does is more important than the 'why'.
To the point and easy to follow. Thank you!
Another awesome video in this awesome series. I was afraid when you gave the MATH WARNING, but I think you explained it really well and I was able to follow along. My use case for Haskell is only Xmonad for the moment, but the more I learn about it, the more I want to try it in other contexts. Having a type system whose design is based on algebraic proofs is such a cool idea, and I certainly think it gives the programmer much more confidence about program correctness than the Object Oriented Patterns, which at best are glorified common sense and at worst are over-engineered garbage. I wish other languages would take notes from Haskell's type system.
stimes actually illustrates the practical usefulness of the Semigroup typeclass very well. Because of associativity, repeated application of the same element can be grouped by the binary digits of the repetition count, which is the default implementation and is very fast. For example, one can calculate large Fibonacci numbers by taking powers of the 2x2 matrix (0 1)(1 1). One just needs to code up the multiplication and make it the diamond operator of the Semigroup instance; stimes takes care of the rest.
import Data.Semigroup (stimes)

data Mat2by2 = M Integer Integer Integer Integer deriving (Show)

instance Semigroup Mat2by2 where
  M a b c d <> M e f g h = M (a*e+b*g) (a*f+b*h) (c*e+d*g) (c*f+d*h)

fib :: Integer -> Integer
fib n = m11 $ stimes (n + 1) (M 0 1 1 1)
  where m11 (M x _ _ _) = x
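A smaller, self-contained sketch of the same stimes trick on a standard wrapper type (the name `power` is just for illustration):

```haskell
import Data.Semigroup (stimes, Product(..))

-- stimes uses repeated squaring under the hood, so combining an element
-- n times costs on the order of log n applications of (<>), not n - 1.
power :: Integer -> Int -> Integer
power base n = getProduct (stimes n (Product base))
```

For example, `power 2 64` computes 2^64 with only a handful of multiplications, the same way the matrix version computes large Fibonacci numbers quickly.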
Hell yeah, I am actually studying group theory right now
Best Haskell tutorial videos on the Internet! :)
It finally clicked. Thank you
Your example of distributed computation made me go back and watch the middle that I had skipped; I'm not a mathematician so I don't care about proofs & perfect rigor, but the (eventual) capability to trivially distribute a bunch of computation without having to thread & such would be awesome. Especially as we move away from single core performance and get more & more cores into our computers. The future will eventually be held by ease of parallelization, and formally describing properties like associativity will help enable that.
Yep, this is the reduce part of map-reduce. sconcat is an associative but not commutative reduction (Futhark reduce_comm uses both). Languages like OpenMP and Chapel have specific support for reductions also. The directional variant is folds, which are also common to most array languages.
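A minimal sketch of that chunked-reduction idea in Haskell (the chunk size and the names `chunksOf` and `chunkedSum` are arbitrary choices for this sketch; associativity is what makes the split valid):

```haskell
import Data.Monoid (Sum(..))

-- Split the input, reduce each chunk independently (in a real map-reduce
-- setting each chunk would go to a separate worker), then combine the
-- partial results. Associativity of (<>) guarantees the final answer
-- does not depend on how the input was chunked.
chunksOf :: Int -> [a] -> [[a]]
chunksOf _ [] = []
chunksOf n xs = take n xs : chunksOf n (drop n xs)

chunkedSum :: [Int] -> Int
chunkedSum = getSum . mconcat . map (foldMap Sum) . chunksOf 4
```

Because no commutativity is assumed, the chunks must be recombined in their original order, which is exactly the constraint an associative-but-not-commutative reduction imposes.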
This is unironically one of the most practical things in computer science. You call it "proofs", but this is what good programming looks like.
A general constraint for a hash function should not be forgotten: if two (possibly different) objects are equal according to a certain equivalence relation, then the hash function must yield the same value for both of them.
Of course you can first map every object to a base value that represents the equivalence class it belongs to. In Haskell these base values have no identity, so the purity of the hash function ensures the above constraint.
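A sketch of that "map to a base value first" idea, using a hypothetical fraction type (the names `Frac`, `canonical`, and `hashFrac` are made up for illustration):

```haskell
-- Two fractions are equal when they represent the same rational number,
-- even if their numerator/denominator pairs differ.
data Frac = Frac Integer Integer

instance Eq Frac where
  Frac a b == Frac c d = a * d == c * b

-- Map each fraction to the canonical representative of its equivalence
-- class (lowest terms, positive denominator) ...
canonical :: Frac -> (Integer, Integer)
canonical (Frac a b) = let g = gcd a b * signum b in (a `div` g, b `div` g)

-- ... and hash only that representative, so equal fractions are
-- guaranteed to hash equally.
hashFrac :: Frac -> Integer
hashFrac f = let (n, d) = canonical f in 31 * n + d
```

Hashing the raw numerator/denominator pair would violate the constraint, since `Frac 1 2` and `Frac 2 4` are equal but would hash differently.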
The use cases are impressive, great work!
In mathematics, semigroup and monoid are the same thing. Monoid is Bourbaki's word.
Actually, the domain for "school arithmetic", as you call it, is not the real numbers for division… it's ℝ \ {0}.
It's the field of real numbers.
The syntax in the last slide:
prop x y = ((x <> y) <> mempty) === (x <> (y <> mempty))
  where types = (x :: [Int], y :: [Int])
is not standard Haskell.
Where is it defined, and which extension can be used to translate this into valid Haskell?
You are right. Watch video #18 - QuickCheck
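For reference, the same law can also be written as dependency-free standard Haskell by giving the property an explicit type signature instead of the `where types = ...` idiom (the name `prop_assoc` is illustrative, and QuickCheck's `===` combinator is replaced by plain `==` here):

```haskell
-- The monoid law under test: combining with mempty on either side of an
-- associated pair must give equal results for lists of Ints.
prop_assoc :: [Int] -> [Int] -> Bool
prop_assoc x y = ((x <> y) <> mempty) == (x <> (y <> mempty))
```

A `Bool`-returning property like this can be passed straight to QuickCheck's `quickCheck` to test it on random inputs.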
Excellent! Thank you!
Thank you, amazing work!
Can you provide us with further sources (books) about the topics we have seen here?
Thanks ;)
Thank you very much for the content
Such a great channel!!
Why is the domain of the "nothing" algebra the empty set, but its operation a magic "none" rather than the empty set too?
The type of the hash function should be more general, i.e. f :: A -> B. And it's not a binary operation.
Awesome.. thank you