Speeding Up Python Code With Caching
- Published: 30 Jun 2024
- Today we learn how to speed up Python code using caching.
◾◾◾◾◾◾◾◾◾◾◾◾◾◾◾◾◾
📚 Programming Books & Merch 📚
💻 The Algorithm Bible Book: www.neuralnine.com/books/
🐍 The Python Bible Book: www.neuralnine.com/books/
👕 Programming Merch: www.neuralnine.com/shop
🌐 Social Media & Contact 🌐
📱 Website: www.neuralnine.com/
📷 Instagram: / neuralnine
🐦 Twitter: / neuralnine
🤵 LinkedIn: / neuralnine
📁 GitHub: github.com/NeuralNine
🎵 Outro Music From: www.bensound.com/
I just found out that the cache decorator was added in Python 3.9, so those on an earlier version have to use lru_cache instead.
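A minimal sketch of that equivalence: on Python 3.8 and earlier, `lru_cache(maxsize=None)` gives you the same unbounded memoization that `functools.cache` later wrapped. The Fibonacci function is the classic demo from the video:

```python
from functools import lru_cache

# On Python 3.8 and earlier there is no functools.cache;
# lru_cache(maxsize=None) behaves identically (an unbounded cache).
@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(100))  # returns instantly thanks to memoization
```

On 3.9+, `@cache` is literally defined as `lru_cache(maxsize=None)`, so the two are interchangeable.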
ahh... lemme cache my hello world program now
😂
You can increase the recursion limit using the sys module:
import sys
sys.setrecursionlimit(10**12)
oh very good to know. Didn't know that ^^
Nice to know, but it looks like a dangerous tool to use!
Be careful setting it too high. Too much recursion will cause a Segmentation Fault.
OverflowError: Python int too large to convert to C int
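The OverflowError above comes from the value itself, not from recursing: CPython stores the limit in a C int, so `10**12` cannot even be set. A small sketch (the `10_000` limit is an illustrative value, and behavior is CPython-specific):

```python
import sys

# A value like 10**12 does not fit in a C int on most builds,
# which is why sys.setrecursionlimit(10**12) raises OverflowError.
try:
    sys.setrecursionlimit(10**12)
except OverflowError as e:
    print(e)

# A modest bump is usually enough -- but note that raising the limit
# only defers the problem: deep enough recursion can still segfault.
sys.setrecursionlimit(10_000)
print(sys.getrecursionlimit())  # 10000
```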
This is awesome! Learned much about it!
Actually learned so much from this. Decorators finally make sense and I found a way to speed up my python programs
amazing video and content. Straight to the point.
Me: "How do I do caching in python... I bet there's a module for this."
NeuralNine: "yeah 'from functools import cache' "
Perfect video, it's rare that you get the exact answer you are looking for with a walk through right away. I love the future!
amazing video , just love it.
Awesome as always 👍😀
thanks ^^
When recommendation is faster than notification
When caching is used in calculations like this the technique is often called 'memoization'. Also quite handy when writing a prime factor sieve. Love the channel. Long live human 2.0. Go SpaceX.
Thank you, and is there any way you could make a video on Cython, Numba, or any other library that helps with speedups?
Awesome tutorial, thanks.
But can we see how the cache function works under the hood, and can we write a normal function that acts like the built-in cache function?
Thank you so much.
It was very helpful for me.
More of this content please !!
Actual, real-life useful things.
Thank you for the hard work.
thanks for your kind words :)
that's nice!!
Amazing thanks
Cool video!
Loved your intro music!
The video as well. You're awesome and unique as always :)
That is awesome
I had already tried to import cache from functools in Google Colab but it got an error, do you know why this happens? So instead I import Cache from cache-decorator.
This cache is similar to the idea of dynamic programming: save part of the result.
Where can you learn about this caching from first principles?
When I saw it I just hoped that it wasn’t an april fools
That would not be the most catchy title for a joke xD
Thanks
Nice video
My Python 3.9 functools doesn't have @cache I think; I get an error, whereas @lru_cache works.
Cool.....
helpful
YES
For me it isn't working. With cache and without cache it always shows the same time, I don't know why.
great
explanation is too clear !!!
Why is the time.time() function useless for results under one second? After all, code execution happens in microseconds and milliseconds.
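As an aside on that question: `time.time()` is a wall-clock reading whose resolution varies by platform, while `time.perf_counter()` is the clock the standard library recommends for benchmarking. A small sketch (the `work` function is just a stand-in workload):

```python
import time

def work():
    # stand-in workload: sum of squares below 100,000
    return sum(i * i for i in range(100_000))

start = time.perf_counter()   # high-resolution clock meant for timing
work()
elapsed = time.perf_counter() - start
print(f"{elapsed:.6f} s")     # sub-millisecond durations remain visible
```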
Caching data basically means you're decreasing the amount of calculations you need to do by optimizing the process through which the calculations are done.
But here's what I'm confused about.
Why does caching make it THAT much faster? Here you're decreasing the number of calculations you need to do by half (or is it more than half), so why is it so significantly faster?
Found my answer. The decrease in calculations is significantly more than half.
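That answer can be made concrete by counting calls. Without the cache, the recursive Fibonacci call tree grows exponentially (roughly doubling per step); with it, each value is computed exactly once. A sketch using a call counter:

```python
from functools import lru_cache

calls = 0
def fib_naive(n):
    global calls
    calls += 1
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

cached_calls = 0
@lru_cache(maxsize=None)
def fib_cached(n):
    global cached_calls
    cached_calls += 1
    if n < 2:
        return n
    return fib_cached(n - 1) + fib_cached(n - 2)

fib_naive(25)
fib_cached(25)
print(calls)         # 242785 calls without the cache
print(cached_calls)  # 26 calls with it: each value computed once
```

So the saving isn't "half" — it's the difference between exponential and linear work.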
The Fibonacci sequence is super unoptimized
Do a video about cython
Hello bro can you explain pwntools python library
Ok, so if I have a pyinstaller .exe onefile, will cache speed that up too?
Yes,
Pyinstaller exe is basically a self extracting archive.
I want to know this too
Yeah, all pyinstaller does is bundle the whole Python runtime with the code; it won't make the code itself any faster on its own. So yes, caching will still increase execution speed inside the exe. One note: if you are using Linux or Cygwin you can install Cython to compile Python to C, which would result in super fast execution times.
@@boxtalks7994 Oh great. Thanks!
🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥
And now should i Cache every function?
I think you would only need to do it on functions that get called repeatedly with the same arguments. If you want to learn more about this sort of "caching" data to speed up programs, freeCodeCamp has a dynamic programming tutorial which explains how to do caching and speed up this function with actual code, not just a decorator. It's nice to see how things work :)
Pycharm? Vscode pls!
When I am working with classes and @property, can I use @cache too?
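For that combination the standard library has a dedicated tool: `functools.cached_property` (Python 3.8+) merges `@property` with per-instance caching, and avoids the pitfall of a plain `@cache` on a method keeping instances alive through the cache. A sketch (the `Dataset` class is made up for illustration):

```python
from functools import cached_property

class Dataset:
    def __init__(self, values):
        self.values = values

    # cached_property computes on first access, then stores the result
    # on the instance; later accesses skip the computation entirely.
    @cached_property
    def total(self):
        print("computing...")
        return sum(self.values)

d = Dataset([1, 2, 3])
print(d.total)  # prints "computing..." then 6
print(d.total)  # cached: just 6
```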
print([fib(n) for n in range(0, 10000)])
Something like that, with the cache enabled, lets Python bypass the maximum recursion depth.
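Why that works: calling `fib` for n = 0, 1, 2, ... warms the cache as you go, so each new call only recurses one step past values that are already stored, and the stack never gets deep. A sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# fib(10_000) called cold would blow past the default recursion
# limit (~1000), but filling the cache incrementally keeps every
# individual call shallow.
for n in range(10_000):
    fib(n)
print(fib(9_999))  # a huge number, computed without deep recursion
```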
Does caching have a negative side?
You store values in memory; the only downside I can think of is that the bigger the cache, the more memory it will consume.
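That memory cost is exactly what `lru_cache`'s `maxsize` bound is for: it evicts the least recently used entries, capping memory at the price of occasional recomputation. A sketch (the sizes are illustrative):

```python
from functools import lru_cache

# maxsize=128 keeps at most 128 results; older entries get evicted.
@lru_cache(maxsize=128)
def square(n):
    return n * n

for n in range(200):
    square(n)     # 200 distinct calls: all misses, cache fills to 128
square(199)       # recent entry: served from the cache (a hit)

info = square.cache_info()
print(info)           # hits/misses, plus currsize capped at 128
square.cache_clear()  # frees all cached entries when memory matters
```

`cache_info()` and `cache_clear()` make it easy to observe and control the trade-off in a running program.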
@@benoitgael2969 Thank you
april first lol
But Pycharm is heavy
Wieso wieder PyCharm?
Might've been useful to show how to implement caching like that instead of just importing
2nd
👍
1st
^^
Where have I seen this video before...? You copied that from someone right?
Ah, the video from mCoding!
Hi bro, thanks for the tutorial, but I have a problem with this:
from functools import cache
When I run my code it gives me this error:
ImportError: cannot import name 'cache' from 'functools' (/usr/lib/python3.8/functools.py)
`cache` was added to python 3.9. You will have to use `@functools.lru_cache(maxsize=None)` for the same effect as `@functools.cache`
So just import `lru_cache` from functools
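If the code has to run on both old and new interpreters, a small compatibility shim covers both cases. A sketch of that pattern:

```python
# Use functools.cache on 3.9+, fall back to an unbounded
# lru_cache on older versions such as the 3.8 in the traceback.
try:
    from functools import cache
except ImportError:
    from functools import lru_cache
    cache = lru_cache(maxsize=None)

@cache
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(35))  # 9227465
```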
can you help me 🤦♀🚨🚨
TypeError: unhashable type: 'numpy.ndarray'
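That TypeError happens because `@cache` keys results by hashing the arguments, and numpy arrays (like lists) are mutable and unhashable. A common workaround is to convert the argument to a tuple before the cached call; a sketch using a list to show the same failure mode with only the standard library:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def total(values):
    # values must be hashable (e.g. a tuple), since it becomes a cache key
    return sum(values)

data = [1, 2, 3]
# total(data) would raise TypeError: unhashable type: 'list'
print(total(tuple(data)))  # 6
```

For a numpy array, the analogous conversion is `tuple(arr.tolist())` at the call site (rebuilding the array inside the function if needed) — at the cost of the conversion overhead on every call.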