Looking to upgrade my audio setup with a Blue Yeti USB microphone! If you'd like to support the channel, you can buy me a coffee here: ko-fi.com/thecodinggopher.
Instructions unclear, accidentally installed NixOS.
😂
Wow, how this video tiptoes around the actual answer in its own title and then goes on to explain what functional programming actually is, is amazing.
So to answer the question: to minimize state. State is the enemy of programmers; it causes bugs and it makes programs hard to understand. Object-oriented programming went down the path of organizing that state better to make it manageable. Functional programming goes down the path of getting rid of state as much as possible instead.
Thanks for the kind words!
And spot on :)
@@TheCodingGopher That was actually my attempt at sarcasm, but well.
I like your videos, but this one doesn't deliver what its title promises. Maybe the format is too short to convey such a complex topic.
Nice video. Short and to the point. I'll point out that in Python, it's usually better to use a generator expression or list comprehension than map or filter. Reduce is rarely used.
Thanks for watching, and good call-out
Thanks for the insight! Could you elaborate on why generator expressions or list comprehensions are generally preferred over map or filter in Python? Are there specific use cases where map or filter might still be a better choice?
@@debbie8062 Hey. Generator expressions and list comprehensions are slightly clearer, tend to be faster, and don't require a bulky lambda if you don't already have a function defined for what you need to do. I'm unaware of a time when map or filter are better.
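To make that concrete, a small Python sketch (toy data, purely for illustration):

nums = [1, 2, 3, 4, 5]

# map/filter need a lambda unless a suitable function already exists
squares = list(map(lambda x: x * x, filter(lambda x: x % 2 == 0, nums)))

# the list comprehension says the same thing more directly
squares = [x * x for x in nums if x % 2 == 0]

# a generator expression does it lazily, without materializing a list
total = sum(x * x for x in nums if x % 2 == 0)

# one case where map still reads fine: an existing named function, no lambda
stripped = list(map(str.strip, ["  a  ", "  b  "]))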
I too like it when my program functions
Nice video! Tbh FP in Python is disgusting, so I have to use OOP for some advanced stuff, which is also disgusting
I really enjoyed this video! Your vids really make my day :)
I heard that functional programming is really helpful when you are doing a functional programming class in college. I also heard that functional programming people are being mercilessly teased by all the other programmers who are keen to get fully working programs released into production. That so unfair. It's perfectly normal that theoretical types can always tell a better story, even if practical types are the ones that have live systems. Reminds me of that old adage, that most application software is garbage and in production, and most fantastic software is in the trash can covered in blood, sweat, and tears.
Love the adage! It’s true that FP can feel like the "philosophy major" of the coding world (i.e. deep, theoretical, and sometimes hard to explain). But there’s definitely a place for both schools of thought; FP when it comes to writing clean, predictable, and maintainable code - even if it sometimes feels like solving puzzles no one else asked for...
That said, I get why practical folks like to focus on "shipping it." The reality is, the best ideas often come from balancing both (i.e. solid theory and real-world pragmatism). Maybe one day, we’ll see more of that great software making it into prod (minus the blood, sweat, and tears) :).
I still wonder how to use this kind of paradigm in the real world with real projects 🤔 and who does it
Even the people who use FP every day ask the same question 😅
Haskell is kinda underrated c:
Haskell's a really interesting language, but sadly I think it's being held back by the poor error messages from its main implementation, GHC. Also, the language itself is a little hard to get a performance-intuition with, since it's so lazy - but I think that's a smaller problem than the error messages.
@danielstromberg I actually never had any problems debugging Haskell code...
And well, performance is kinda great as long as your algorithms don't hit exponential complexity
Can everything that can be done with OOP be rewritten and done the functional programming way? I would love to learn functional programming, but OOP is often just so intuitive that it's difficult for me to switch my mindset.
It can partly be done, but that's not the end goal; FP is a different paradigm. For example, dependency injection can be implemented with partially applied functions. If you want a language that does both, you can look into Scala
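A minimal sketch of that idea in Python, with made-up names (functools.partial playing the role of the injector):

from functools import partial

def send_welcome(mailer, user):
    # 'mailer' is the dependency; it's passed in rather than constructed here
    mailer(f"Welcome, {user}!")

# "inject" a concrete mailer by partially applying it
notify = partial(send_welcome, print)  # print stands in for a real mailer
notify("debbie")  # -> Welcome, debbie!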
object_X.method_Y(argument_Z)
is just
method_Y(object_X, argument_Z)
So yes, it’s quite easy to rewrite
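In Python, at least, that correspondence is literal - a quick sketch:

class Greeter:
    def greet(self, name):
        return f"hi {name}"

g = Greeter()
# the same call, spelled both ways
assert g.greet("sam") == Greeter.greet(g, "sam")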
@@fullfungo That's a misleading oversimplification, because obviously methods and functions can be mapped 1:1, but the main difference is the lack of objects to hold your application state. You can't just create an object, set some properties, and use it anywhere the way you would in OOP. Architecturally, your app needs to be designed around how to keep the relevant state, and that's very different between paradigms.
Of course it can be done, but it's not a 1:1 "just map methods to functions" kind of situation
Thanks for another great video! I came in with some information about functional programming already, which is easy to understand tbh, but I was never able to actually tell the difference between imperative and declarative. I've read so many articles and code examples, and watched demos... Nothing got me saying "ahaaa". I think it's a me problem, is my best guess (not that it matters, but I always like to know more)
Thanks for watching!
Added imperative vs. declarative programming to my list :) Would be a nice segue from this video
IMO, stop going from one purity extreme to another, and just learn Common Lisp (minus the godawful loop macro extensions -- they aren't Lispy, and that is why they suck). Not every problem is purely functional or inheritance-based OO or prototype-based OO or imperative or declarative. Not everything needs to be immutable, nor does everything need to be recursive. Common Lisp is multi-paradigm for the very sensible reason that the people who came together to make the specification were solving GIGANTIC real-world problems, not writing classroom exercises.
Interesting perspective! Could you share specific examples of how Common Lisp's multi-paradigm approach helped solve complex real-world problems effectively compared to sticking with a single paradigm?
Ah, yes, higher-order functions. The thing that makes callbacks.
Functional programming is not immune to change.
(Especially if you don't know what you are doing.)
Because data or variables will change. The question is what is allowed and not allowed to change.
While “higher order functions” and “callbacks” may technically be the same, they imply very different things - callbacks in most languages are functions which are called once when something’s finished (I’m sure all the old school Node devs remember the horror). HOFs imply that making new functions by combining other functions is just how programs are written, they’re not a special case, taking in a function as an argument is just as natural as taking in a string or an int.
@Axman6 Technically, you don't need to call the callback at the end or when something is finished. A function can call it even at the start.
While what you said is true, most of the time that function-as-argument is going to be called as a callback.
Because that's technically one of the purposes of HOFs
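A tiny Python sketch of the distinction being discussed (names invented for illustration):

def fetch(url, on_done):
    # callback style: on_done is invoked once, when the work finishes
    data = f"contents of {url}"  # stand-in for real I/O
    on_done(data)

def compose(f, g):
    # HOF style: build a brand-new function out of two others
    return lambda x: f(g(x))

shout = compose(str.upper, str.strip)
fetch("example.com", lambda d: print(shout(d)))  # -> CONTENTS OF EXAMPLE.COM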
Why does functional programming use recursion instead of loops? Doesn't it make the program run out of memory if there are thousands of steps?
The Haskell (a functional programming language) compiler, for example, is optimized for stuff like that. You're able to use recursion on infinite datasets and still won't run out of memory. - It's possible because Haskell is also lazily evaluated.
And why does Haskell use recursion?
1. Because everything is immutable, so there can't be changing state controlled by a variable like in an old-school for-loop.
2. Because functional programming is declarative. That means you don't say 'how' something is done (imperative), you say 'what' something is, and therefore loops don't make sense.
A small example: You want all even numbers from 2 to n.
imperative:
int nums = []
for (int i = 2; i <= n; i += 2) nums.push(i)
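(The declarative half of the example above seems to have been cut off. A sketch of the intended contrast, in Python since the original snippet didn't survive:)

n = 20

# imperative: say 'how', step by step, mutating nums as you go
nums = []
for i in range(2, n + 1):
    if i % 2 == 0:
        nums.append(i)

# declarative: say 'what' the result is
nums = [i for i in range(2, n + 1) if i % 2 == 0]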
Perceptive observation! Some languages can run into stack overflow from deep recursive calls, because each call adds a new stack frame. However, some languages, notably functional languages (the topic of this video), like Haskell and Scala, perform what's called tail call optimization, where each successive recursive call reuses the same stack frame, therefore optimizing memory usage and preventing stack overflow.
Because you can't create a useful loop without mutating data.
Functional language compilers usually optimise it. For example, let's rewrite Haskell to C, thinking like a compiler:
f 0 = 1
f x = x * f (x - 1)
yeah, this creates an infinite loop if x < 0, but that doesn't matter right now
First, just rewrite it:
int F(int x){
if (x == 0) return 1;
return x * F(x - 1);
}
But basically, we can represent a function call as just a goto to the start of the function, and at the end, a goto back to the call site. Our recursive call is already at the end, so just:
int F (int x) {
int ret_val = 1;
start:
if (x == 0) return 1;
ret_val *= x;
x -= 1;
goto start;
}
Yes, it does. Like object-oriented programming, the paradigm itself doesn't take limitations of the machine into consideration. The OOP paradigm was around for decades before it became actually usable on modern computers.
The functional programming paradigm is only that: a paradigm. Programmers can follow it if it's useful, but if a functional implementation doesn't fit the task, they don't have to. (Unless the language forces them to, but then it's time to ask whether that's the right language for the job.)
For example, it's also slow to allocate new space every time you pass data through a function. It's faster to modify it in place, of course. So if speed is key, it's a good idea to let that part of FP slip.
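A small Python sketch of that trade-off (illustrative only; actual costs depend on the runtime):

xs = list(range(1_000_000))

# functional style: every pass allocates a brand-new list
ys = [x + 1 for x in xs]

# in-place style: reuses the existing storage
for i in range(len(xs)):
    xs[i] += 1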
@@yatsuk_vitalii That last function does not return the desired output
I think of pure functional programming as more of an academic curiosity than anything meant to write actual software, but the practice of using higher order functions on iterators to perform data transformations instead of looping over indices and manually writing out all the transformations is something everyone should be thinking about.
I demonstrated this to a friend recently by showing the matrix multiplication algorithm in pure imperative C++ and then in Rust. In C++ there were three nested for loops because you need one for the row, one for the column, and then one to sum up the products of the paired elements. It's quite ugly and long. In Rust I had specially defined iterators for rows and columns, so I just looped over the row and column iterators, and calculating each element is as simple as zipping the iterators, mapping the pairs of elements to their product, and then summing it. It's more concise, understandable, and beautiful.
Ngl this sounds way more complicated with your wording than just simple loops
@@meflea3675 I probably shouldn't have used plain English to describe code. The code to calculate one element is just
"zip(row, col).map(|(x, y)| x * y).sum()"
I find this much nicer and more readable than a for loop with an outside sum variable that manually calculates an offset to index into arrays at each step.
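For anyone who wants to play with the same idea, a rough Python equivalent of that element calculation, wrapped in a full (hypothetical) matrix multiply:

def matmul(a, b):
    cols = list(zip(*b))  # transpose b so columns are directly iterable
    # each element: zip a row with a column, multiply the pairs, sum them
    return [[sum(x * y for x, y in zip(row, col)) for col in cols]
            for row in a]

assert matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]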
@@biocta Love this example.
It's just a different style of 'how'.
Baking is imperative.
FP is more like: "I want a cake," and letting the process figure itself out :)
@@TheCodingGopher No, it's not. That's just an API
Because of the immutability, functional programming has a huge production of objects during runtime. Nobody talks about whether and how compilers can make those things efficient.
Without that, you'd be parallelizing a problem you created yourself.
(Yes, I know what streams and lambdas in Java are. I use them in a more fine-tuned way.)
Elixir & Gleam: "Hello there"
Haskell has quite an efficient garbage collector optimised for many short-lived objects. Allocation for most objects is simply incrementing an offset into the current generation until it's full, at which point that generation is garbage collected. Haskell also has the massive advantage of being pure with immutable data, which allows the compiler to completely eliminate allocations at compile time - `\n -> sum . map (^2) . filter isPrime $ [1..n]` will never allocate a list at all, and the whole computation will end up happening in registers.
@@Axman6 The JVM is top in garbage collection too.
Compilers don’t need to create new objects, they can just mutate existing ones. This way the number of objects does not explode.
Nice video, but functional programming seems like too much work and mental drama for doing a simple thing. All this for the sake of 'doing functional programming'. My goal is to make life simple for me, not to adhere to an ideology, if that ideology does not suit me or I don't really see the practical benefits from it.
This is the way. A pragmatic way of thinking is always the best option.
But it does have a practical benefit. Maintainability
@@fullfungo I don't know functional programming - I have never worked with it. But I don't see how a recursive function can be easy to maintain.
Everyone is a functional programmer these days. Every piece of imperative code is being compiled through a pure functional intermediate representation - SSA. All the modern compilers do it.
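For anyone unfamiliar with SSA, a rough sketch of the renaming idea (written as Python-flavoured pseudocode, not any compiler's actual IR):

# imperative source: one mutable name, reassigned over and over
x = 1
x = x + 2
x = x * 3

# SSA form: every name is assigned exactly once, like pure bindings
x0 = 1
x1 = x0 + 2
x2 = x1 * 3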
Intermediate representations that use SSA are NOT functional. They contain many imperative constructs like looping control flow and procedures that produce side effects. Moreover, just because SSA values are only assigned once does not necessarily mean they are immutable. Most IRs have store instructions that can mutate stack allocations, dereference/write to pointers, etc., which by definition removes pure immutability. Compilers don't use SSA for some arbitrary pseudo-philosophical reason like those uptight functional whitepaper-writing academics do. It has real, practical use cases and allows a compiler to more easily optimize code and reason about data dependencies.
@Masq_RRade for the vast majority of code SSA is perfectly functional. GEPs are only used when you call other functions, and all loops are represented in a purely functional form in SSA (it is strictly equivalent to a subset of CPS). Even random array access is functional, see ArraySSA for details.