That guy laughing is having the time of his life
best moment of his life
This is a seriously funny talk
Yes, he is a FUNctional programmer.
@A. B. Stone Haha, I think anamorphisms got him high (pun intended).
I found him laughing funny 🤣
This is by far the best functional programming talk I have ever seen.
Brilliant work.
Best talk on FP I've seen. And I've seen a lot. Thank you Sir.
I agree! If you are new to the field you'll want to stop and think about stuff here and there, but he really does do a very nice gradual introduction of FP and get to some advanced topics.
you're wrong
Юрий Яковенко
What would you recommend for an introductory lesson on FP?
One of the best intros do FP, hands down. Every concept is explained in a clear, pragmatic way and on top of that Scott has a great sense of humor!
Scott's explanation of mapping, functors, and monads is gold. I've read quite a few tutorials and videos in an effort to better understand Haskell, but couldn't quite understand the overall picture until watching this.
I'm currently learning Haskell and after watching many presentations, many books, many lectures, I find this to be the best introduction to functional programming. This talk generalizes, provides the big picture of functional programming. Many light bulbs popped up on my head when I was listening to this.
While I have quite a long way to go until I can properly re-program my brain to think more functionally, this talk really helped spark a couple light bulb moments. Some of the ideas I actually ran back to my company's OOP Enterprise code and implemented, funny enough. Great video, hopefully a couple more of these and I'll have my head wrapped around this crazy functional world :)
Best talk I have seen in a long long time
Basically, many presenters I've seen so far started their introduction to FP the same way as that last statement: "A monad is just a monoid..." together with a bunch of lines of FP stuff, repeating the same words over and over. Monoids, monads, functors, endomorphisms, without explaining a single word. This guy made this stuff much clearer. Very good presentation.
Clearest conceptual presentation on FP I have seen so far
Probably the best talk I've watched so far on FP! Explained everything clearly!
It's just amazing that i can find this full length lecture online! Thank you for the upload, the internet is incredible.
Finally I started to understand FP. Thanks for this presentation. Much better than a lot of those yapping and preaching ones on YouTube.
Wow, in an hour you explained concepts that I was struggling to grasp for almost one week! Best talk on FP
Best talk ever about functional programming! Thank you so much
Enjoyed the talk very much, especially the humor. I came here as a professional Scala developer after an interview where I failed to enumerate functional programming patterns I use. I leave kinda disappointed because I use most of these patterns daily anyway. I guess when I'm asked to enumerate functional programming patterns I use again, I'm just going to use the fancy names like continuations, monadic bind and functors.
Thanks for this talk, FP is finally starting to make sense for me.
Best talk on FP I've seen. Thank you.
I'm only seven minutes in, but I have to say the speaker is hilarious. He has a great sense of humor.
Beautiful talk. Probably the best introductions I've seen. And I think the comments under-rate the humour. I thought it was funny as hell.
One NB (which the speaker is probably well aware of): 56:37 all monoids are semigroups, whether or not they are also groups. A group is a monoid with inverses. For example, the integers under addition, (Z, +), form a group, where the inverses are the negative numbers: 5 + (-5) = 0. However, if I take only the non-negative integers with addition ( { z in Z | z >= 0 }, + ), there are no inverses, so I get a monoid that is not a group; it is still a semigroup, even though it doesn't satisfy all the group axioms. But notice, it still has an identity element (namely, zero).
So it isn't wrong, but it isn't entirely correct, to say that a monoid with no identity is called a semigroup, since monoids WITH an identity element are also semigroups.
Very well Indeed. I must agree with this amalgamation of information in integration of this mysterious calculation. Indeed this is a superb observation indeed.
This is great to get it clear for critical FP concepts from this video
Really great talk. Easy way of explaining very advanced concepts. I wish I've seen this video earlier
That's very much a first-semester CS talk. And give this man some water!
FP presentation in a pragmatic way! awesome!
Hey this is obviously the best explanation of Monoids (in human language).
Amazing! Thanks for the upload this was extremely useful!
Alright I'm sort of confused. At 23:19 he talks about how he could rewrite the interface in F# using one function... If the interface is already only one method, and that method only accepts int and returns int. Could he not have just done the same thing in the original language by just ... scrapping the interface?
Great talk. Thanks for the time and effort put into this.
i was numb through this, he really got me with that 1 + 0 = 0 + 1
Cool talk!
Still a great presentation even today. Thanks for posting.
Straight to the point - I loved it! :)
There is a problem with the technique used at 57:40.
The Order total is wrong when the code is refactored.
Original: 2 * 19.98 + 1 * 1.99 + 3 * 3.99 = 53.92
Refactored: 6 * 25.96 = 155.76
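The commenter's arithmetic checks out; a quick script (values taken from the comment above) makes the discrepancy explicit:

```python
# Line items from the slide: (quantity, unit_price)
items = [(2, 19.98), (1, 1.99), (3, 3.99)]

# Original code: total each line, then sum the line totals.
original = sum(qty * price for qty, price in items)

# The refactoring as the commenter reads it: total the quantities and
# the prices separately, then multiply -- which gives the wrong answer.
refactored = sum(q for q, _ in items) * sum(p for _, p in items)

print(round(original, 2))    # 53.92
print(round(refactored, 2))  # 155.76
```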
This was exceptionally good.
Very well done and wonderfully clear.
Brilliant talk and very funny.
Feel motivated to learn FP now. Any project ideas that excel with FP?
Thank you for this great talk!
Simple words to demonstrate hard things!
At minute 16:00 he talks about a NonZeroInteger that fails at COMPILE TIME. How can we do this in F#? I cannot find any way to do that.
I posted a question on Stack Overflow (stackoverflow.com/questions/45626196/defining-a-non-zero-integer-type-in-f/45635075#45635075) about this matter, and all the responses revolve around creating a NonZeroInteger type that throws an exception when a zero is passed to the constructor. But this can be achieved in any OO programming language, so why does the video's author claim this as an F# or functional programming goodness? Honestly, I feel tricked.
F* has refinement types that can do exactly what you're thinking of (among many other static typing features), and can be reduced to F#. www.fstar-lang.org/
But I think you're missing the forest for the trees. Even that example you linked shows how sum types like Option can be used to handle errors _without_ exceptions. If you watch the rest of the talk (or more relevantly, the follow-up talk on error handling vimeo.com/97344498 ) there are many examples of why that property is useful.
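For readers who don't know F#, here is a rough Python sketch of the Option idea the reply mentions, using Python's `Optional`/`None` as a stand-in for F#'s Option type (function names here are invented for illustration, not taken from the talk):

```python
from typing import Callable, Optional, TypeVar

T = TypeVar("T")
U = TypeVar("U")

def try_divide(top: float, bottom: float) -> Optional[float]:
    """Return None instead of raising on division by zero."""
    return None if bottom == 0 else top / bottom

def map_option(f: Callable[[T], U], opt: Optional[T]) -> Optional[U]:
    """Apply f only when a value is present; otherwise propagate None."""
    return None if opt is None else f(opt)

print(map_option(lambda x: x + 1, try_divide(10, 2)))  # 6.0
print(map_option(lambda x: x + 1, try_divide(10, 0)))  # None
```

The point is that the missing-value case is part of the return type, so callers cannot forget it the way they can forget a try/catch.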
Excellent talk!
Excellent talk
Wow, great! I think that FP patterns combine nicely with a DB in 6th normal form, since you call functions that include other functions -> you get the benefit of a precise definition of what to call. And in some super-fast DB where only key-value pairs are possible, you could get super-fast code. There is a convergence, I think.
Nice catch. You may try to implement a sample project, since practice is the best judge of ideas. Try looking at www.anchormodeling.com/about/ for some inspiration.
@@ivanplyusnin3292
I thought about using Key Value pair DB like REDIS or MemcacheDB. What do you think?
Tools you provided link to might be way to MODEL DATA -> USE Key value DB in 6NF WITH Functional programming .
Only thing is how to store precipitant data.
ruclips.net/video/W2Z7fbCLSTw/видео.html
I'm more interested into Haskell than F#. This talk is generic enough to be understood. Thank you, very well done.
Great talk, thanks
Well..... THANK YOU!
Man, that one dude thinks this was really, really, funny.
+blahbl4hblahtoo he makes the jokes much funnier than they actually are.
+blahbl4hblahtoo he's probably stoned
He might have been a hired cachinator.
one of those head hunter infiltrators.
Laughing dude looks up to the guy and is saying "I get the jokes which makes me smart too". When it's just a paradigm
The captions at 30:55 lmao
I took a lot from this but def started to lose me around 42:00 (although that does look a lot like error handling in node)
That guy man, I guess he is a best friend of the speaker.
1) 43:05 I didn't get the error-handling part. He showed the code before and after error handling, and it was the same. OK, great, so where was the error handling? Where did all the different error messages go? They were different for each error, so surely you had to put them somewhere... Highly misleading: he suggested 200% extra lines just for error handling normally, versus 0 extra lines for error handling in F#. That's what I'm challenging: the code went into the monads, so it's not like it disappeared ;)
2) 42:26 He got promises wrong: promises ACTUALLY SOLVE the pyramid of doom the exact same way bind does :-D He presented it as if they had just rewritten it in different words.
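On point 1, the commenter is right that the error branches don't vanish — they move into each step, and `bind` threads them through. A minimal Python sketch of a Result type (names invented for illustration; the talk uses F#):

```python
from dataclasses import dataclass
from typing import Callable, Union

@dataclass
class Ok:
    value: object

@dataclass
class Err:
    message: str

Result = Union[Ok, Err]

def bind(f: Callable[[object], Result], r: Result) -> Result:
    """Run f on success; short-circuit on the first Err."""
    return f(r.value) if isinstance(r, Ok) else r

# Hypothetical validation steps, each carrying its own error message.
def validate_name(req: dict) -> Result:
    return Ok(req) if req.get("name") else Err("name required")

def validate_email(req: dict) -> Result:
    return Ok(req) if "@" in req.get("email", "") else Err("bad email")

# The messages haven't disappeared -- they live inside each step, so the
# pipeline itself stays one flat chain with no per-call error plumbing.
good = bind(validate_email, bind(validate_name, Ok({"name": "A", "email": "a@b.c"})))
bad = bind(validate_email, bind(validate_name, Ok({"name": "", "email": "x"})))
print(good)  # Ok(value={'name': 'A', 'email': 'a@b.c'})
print(bad)   # Err(message='name required')
```

This is also exactly the sense in which promise chaining flattens the pyramid of doom: `.then` plays the role of `bind`.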
Stay tuned for ES8: Optional Static Typing has been proposed.
The only reason mathematicians got there first is that they were around way before we had computers.
Is there no straightforward way to define NonZeroInteger type?
You can't really have something like NonZeroInteger in most languages. This is sort of the limit of even the most powerful static type systems. It is the domain of dependent typing, but then you have to give up Turing completeness...
No, dependently typed languages like F* and Idris are still Turing complete! Diverging (non-terminating) functions are allowed, as long as they're marked as such.
really good
Does List.fold use a loop, or is it a recursive higher-order function? He says "loop" in the video, but I'm guessing that's just a hiccup and he means recursion; I don't know F#. From my understanding, a big part of functional programming is treating data as immutable, so no loops. Is that right?
There is nothing wrong with loops in functional programming. Recursion is usually just a loop anyway. Look at the source for List.fold for an answer to your question. github.com/dotnet/fsharp/blob/main/src/fsharp/FSharp.Core/list.fs#L216-216
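Both readings are possible: a fold can be written as a loop or as recursion, and neither mutates its input. A small Python sketch of the two styles (a summing fold used as the example):

```python
def fold_loop(f, acc, xs):
    """Iterative fold: a loop internally, but the input list is never mutated."""
    for x in xs:
        acc = f(acc, x)
    return acc

def fold_rec(f, acc, xs):
    """Recursive fold: the textbook FP definition, same observable result."""
    return acc if not xs else fold_rec(f, f(acc, xs[0]), xs[1:])

nums = [1, 2, 3, 4]
print(fold_loop(lambda a, x: a + x, 0, nums))  # 10
print(fold_rec(lambda a, x: a + x, 0, nums))   # 10
print(nums)  # [1, 2, 3, 4] -- untouched either way
```

Immutability is about not mutating the data; a private loop variable inside the fold doesn't violate that, which is why F#'s standard library is free to implement List.fold with a loop.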
this is what happens when you invite a standup comedian to a technical conference
Top floor is for Martin-Löf, Idris, Agda, Coq and the like.
I came for the FP, I stayed for the LOLs.
awesome..
it is a good video and it gives me some incentive to continue working on JavaScript.
brilliant
That coughing is killing me. Much louder than the talking volume.
Can't hear him talk if I reduce the volume, but I get attacked by coughs if I turn it up.
If it kills you, you will end up buried in a coughin.
The guy laughing like a hyena at 4:20... I mean, 4 20. Joke makes itself.
is a monoid actually just an algebraic ring?
+eNSWE the multiplicative operation only, yes. There is a forgetful functor from Ring to Monoid, which forgets the abelian additive group of a ring.
+eNSWE In addition to what Magnetohydrodynamics said, you can think of a ring as a set R with operations * and + denoted as (R, +, *) where:
1. R under addition is an abelian group
2. R under multiplication is a monoid
3. Multiplication is distributive over addition
These 3 characteristics are much easier for remembering what a ring is, given that you know what the underlying structures are. In fact, these structures (abstract algebraic structures) arose largely out of linear algebra from relaxing the axioms of fields and vector spaces.
I started learning abstract algebra before number theory so this is actually how I remember what a ring is, instead of the 8+ axioms normally given in an introduction to rings.
Anyway, it is pretty easy to remember all the axioms just from knowing the group axioms and a couple of generalizations:
A group is a set G with an operation where:
1. G is closed under that operation
2. The operation is associative
3. There is an identity element for the operation in G (if we take the operation to be + and identity to be 0, then for every x in G, x + 0 = 0 + x = x)
4. Every element in G has an inverse for the operation (if we take the operation to be + and the identity to be 0, then for every x in G, there exists a y so that x + y = y + x = 0)
If you take 1 and 2, you get a semigroup. If you take 1,2,3 you get a monoid. All four gives you a group. If the operation is commutative, we say we have a commutative or abelian group.
You can also add commutativity to the operation for a strict monoid (something that is a monoid, but not a group, like multiplication in integers) to get a commutative monoid. If you do this for the multiplication operation of a ring, you get a commutative ring. If you also add inverses to the multiplication (except for 0), it becomes an abelian group (for its non-zero elements) and we can then say that (R,+,*) is a field.
How you remember this stuff will largely depend on the order you learn it in, but these concepts are all very much algebraically and historically related!
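The axioms above can be spot-checked in a few lines of Python (this samples a handful of values only, so it can refute a law but never prove one):

```python
def is_monoid_on(op, identity, samples):
    """Spot-check associativity and the identity laws on sample values."""
    assoc = all(op(op(a, b), c) == op(a, op(b, c))
                for a in samples for b in samples for c in samples)
    ident = all(op(a, identity) == a and op(identity, a) == a for a in samples)
    return assoc and ident

ints = [-2, 0, 1, 3]
print(is_monoid_on(lambda a, b: a + b, 0, ints))  # True: (Z, +) is a monoid (and a group)
print(is_monoid_on(lambda a, b: a * b, 1, ints))  # True: (Z, *) is a monoid, but not a group
print(is_monoid_on(lambda a, b: a - b, 0, ints))  # False: subtraction isn't associative
```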
plus1 and subtract42 are not endomorphisms.
what are they then? endomorphism means same input and output type, which they have.
Well, in the category of sets they are endomorphisms. But they don't preserve the neutral element, in the sense that:
`1 = plus1(0+0) ≠ plus1(0) + plus1(0) = 2`, which means you can't use any kind of map-reduce approach. So I don't quite get the point he is trying to make at 1:03:04, as this definitely is not something that can be done "in parallel". (In the end he more explicitly talks about "endofunctors" instead of "endomorphisms", which would be a better word for describing "plus1" and "subtract42".)
Are you sure? I would think that your example is not one of preservation of order. The correct example would be if this is valid:
plus1(0 + plus1(0)) + plus1(0) which is equal to plus1(0) + plus1(0 + plus1(0))
I'm not sure, but that's how I understood it.
I'm actually not so sure anymore. I know endomorphisms from an algebraic background: mathworld.wolfram.com/Endomorphism.html
It might be that there is a bit of a discrepancy between what endomorphism means in the context of functional programming compared to a group/module/ring/vector space-endomorphism. At least in these contexts it wouldn't be considered an endomorphism.
If by endomorphism it is meant "endomorphism in the category of sets" then I guess that they are endomorphisms then. But I still don't see what that buys us. That's a pretty weak requirement and I don't get how this provides us any benefit for parallelization.
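The distinction this thread is circling can be shown in a few lines of Python: `plus1` is an endomorphism in the sense the talk uses (same input and output type, so such functions compose freely), but it is not a monoid homomorphism, so the commenter is right that it buys nothing for map-reduce-style parallelization:

```python
plus1 = lambda x: x + 1        # int -> int
subtract42 = lambda x: x - 42  # int -> int

# The composition benefit: because input and output types match,
# any chain of these functions typechecks and is itself int -> int.
pipeline = lambda x: subtract42(plus1(plus1(x)))
print(pipeline(50))  # 10

# The homomorphism law fails: plus1 does not preserve + and 0.
print(plus1(0 + 0))         # 1
print(plus1(0) + plus1(0))  # 2
```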
Who is the guy with the wide laughter? Since he gets every point, he must be another FP expert.
I'm laughing not because it was that funny, but because that guy really thought it was...
For years everything is input-> process-> output
what? it DOES work in javascript:
$ node
> 1+0
1
> 0+1
1
I think some person from the audience mentioned that the second one might not resolve to true (0+1=1). I do not know if he is right though.
I think he was probably referring to the 1 + 2 = 3 on the slide, and confusing that with the fact that in Javascript, 0.1 + 0.2 === 0.30000000000000004 due to floating point precision ¯\_(ツ)_/¯
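For what it's worth, the floating-point surprise isn't JavaScript-specific; any language using IEEE 754 doubles behaves the same way, Python included:

```python
# IEEE 754 double-precision behavior, identical across JS, Python, C#, etc.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False
print(1 + 0 == 0 + 1)    # True -- integer-valued addition here is exact
```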
I do not follow why some people are so thrilled by FP. At some points it looks simple, but mostly I see limitations.
There is no silver bullet, unfortunately; every programming paradigm has its own pros and cons. In the end it depends on the project, administrative politics, area of use, your team, your level, the tasks, etc. But it is better to be aware of all of them in order to choose the most appropriate one in your current situation. P.S. Yeah, it seems like functional programming has its own benefits compared to OO style, for example, and vice versa.
The equivalent of Objects in Functional programming are actually not functions, it's actors.
what's with that hohohahaha?
🤔
Functional patterns allow you to accumulate a lot of calories in your body.
Seems like you're just kicking the can down the road with exceptions in a lot of ways. Also, don't you have to know how the function is implemented if you are passing, for example, "Divide(top, bottom, ifZero, ifSuccess)"? You're basically saying you know that zero is a special case, so why not just check for it before you pass it?
pwnDonkey he's just demonstrating the principle. Sure maybe it was a bad example just don't get lost in the application
the argument could be called ifFailure and would not lose any semantics. there's no need to know how it's implemented, and the type signature _forces_ you to have a contingency plan, rather than throwing exceptions that have no guarantee of being caught, or returning invalid or "magic number" answers that have no guarantee of being interpreted correctly.
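A Python sketch of the continuation-passing `Divide` being debated (parameter names follow the comment above; this is illustrative, not the talk's actual F# code):

```python
def divide(top, bottom, if_zero, if_success):
    """The caller supplies both continuations up front, so the signature
    itself forces a contingency plan -- neither outcome can be ignored."""
    return if_zero() if bottom == 0 else if_success(top / bottom)

ok = divide(10, 2, lambda: "cannot divide by zero", lambda q: f"result is {q}")
err = divide(10, 0, lambda: "cannot divide by zero", lambda q: f"result is {q}")
print(ok)   # result is 5.0
print(err)  # cannot divide by zero
```

Note the caller never inspects the implementation; it only needs to know, from the signature, that a zero branch exists.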
I made it to the ivory tower before the audience laughter became too annoying.
Is it just me, or does anyone else think exceptions are kind of nice? Why do the extra work? Everyone knows what a divide-by-zero exception means. Handling exceptions with try/catch and doing some custom logic is hardly a new concept.
That's really not the question. The question is whether or not the compiler enforces that side effects are handled.
Doesn't the program exit if you want it to?
I have to say I don't understand why you'd go functional when you can just do OOP with functions.
Partially applied functions are like objects of a class with one method: you have state, and calling the same function twice might not produce the same results.
I really can't see the advantage of using functional programming if you go this way.
Calling the same function with the same inputs twice yields the same result. In C# for example, static methods also yield the same output for given inputs. Where you get into trouble with OOP is in instance methods where the state of the object does impact the output.
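The contrast the reply describes can be shown directly in Python: a pure function's result depends only on its arguments, while an instance method's result can depend on hidden state:

```python
def add(a, b):
    """Pure: the output depends only on the arguments."""
    return a + b

class Counter:
    """Impure: the output depends on accumulated hidden state."""
    def __init__(self):
        self.total = 0

    def add(self, n):
        self.total += n
        return self.total

c = Counter()
print(add(2, 3), add(2, 3))  # 5 5 -- identical calls, identical results
print(c.add(2), c.add(2))    # 2 4 -- identical calls, different results
```

(A partially applied function, by contrast, captures its argument as an immutable value, so unlike `Counter` it stays referentially transparent.)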
This was a very good presentation.
But the more i watch/learn about FP the more holes i discover.
For very small benefits we add so many unnecessary things: complexity, dependencies, ugliness of code, non-uniformity, potential for errors...
Except it's the exact opposite of most of those things. Keep watching/learning.
Joker ??? 😂
You talk about how object-oriented is bad in the example where methods that take a string, expecting an e-mail address, could be passed some other string that's not an e-mail address, such as a last name, to present the benefit of value objects.
But then you talk about how interfaces aren't needed in F# because you can treat functions as compatible based on structure.
It's a little bit misleading, because in object-oriented programming, interfaces are the way you give data stronger type-safety guarantees.
The interface from OOP, which you seem to downplay, *is* the primary mechanism for providing a type.
The laughing dude and the constant mic'ed throat clearing make this otherwise great looking video difficult to watch.
Functional programming is all BS, summed up as f(BS)! None of these talks illustrate clearly how to use functional programming to solve the types of problems developers need to solve. They always cherry-pick some mathematical problem like Fibonacci sequences. They also emphasize brevity of the code without discussing metrics we really care about, such as performance. It's all intellectual masturbation.
Next time take some cough medicine before you give a talk 😐