Useful? Let me know in the comments! 👍👍
Short and sweet. The algorithm threw this on my main page. Please do more C#/.NET things. It's more concise than reading the docs, and I don't have time for 2h Pluralsight babble. Thank you!
I appreciate your digging into and pointing out WHY the Server GC works this way. This discovery has been made many times over the years, and many a developer has "solved memory problems" by telling their IT organization to switch, forgetting that on a server, if that memory isn't "needed" for something else, you are simply putting away the deck chairs that you'll need to take right back out in most instances. It is the IT organization that needs to learn the difference between using resources and putting pressure on those resources. If my goal is to carry 16 oz of water, then a 16 oz cup does the job; I do not need a five-gallon jug. It is absolutely a fine line, and often what is really the case is that the app is not all that efficient anyway and is allocating like crazy, so there are significant Gen 2 collections.
Absolutely correct. It is hard to explain to IT departments that the GC is greedy by default and will release memory back to the OS when under memory pressure. They just see "it's using all the memory" and panic.
To be fair, in reality you wouldn't fill a cup of 16 oz water to 16 oz if you are to carry it in your hands.
MSDN GC note: these settings can also be changed dynamically by the app as it's running, so any configuration options you set may be overridden. The CLR will fine-tune them eventually.
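For reference, the GC mode discussed in the video is typically set in `runtimeconfig.json` (or via the equivalent MSBuild properties). A minimal sketch, assuming the usual `System.GC.*` config keys; the exact file is generated from your project settings:

```json
{
  "runtimeOptions": {
    "configProperties": {
      "System.GC.Server": false,
      "System.GC.Concurrent": true
    }
  }
}
```

Setting `System.GC.Server` to `true` here is equivalent to enabling Server GC in the project file.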
So many secrets , that was really useful . Always wondered why the memory never goes down . Good to know .
Glad it was useful, don't forget to like and subscribe ;-)
Well that changes everything. I can think of a lot of environments where this would be a very very helpful tip
You probably shouldn't use the desktop (Workstation) setting in a container environment; set a memory limit on the container instead. The runtime respects that boundary out of the box.
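A sketch of the container-limit approach, assuming Docker; the image name is illustrative. The .NET runtime is container-aware and sizes its heap from the cgroup limit rather than from the host's total memory:

```shell
# Cap the container's memory instead of switching GC modes.
# .NET reads this limit and budgets its heap accordingly.
docker run --memory=512m --cpus=1 my-dotnet-api:latest
```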
Excellent advice
I kind of had to enable 'ServerGarbageCollection', only because of my long-running service. I'm going to leave it running for a few days and see what happens. All I'm doing in ExecuteAsync is firing off a method to collect all AD accounts and, if they already exist, just updating the properties. Maybe I need to rethink my approach and do it with an event handler.
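For anyone wanting to do the same, a minimal sketch of enabling Server GC via the `.csproj` (the standard `ServerGarbageCollection` MSBuild property):

```xml
<!-- Enables Server GC for this project. The same switch can be set
     in runtimeconfig.json (System.GC.Server) or via the
     DOTNET_gcServer=1 environment variable. -->
<PropertyGroup>
  <ServerGarbageCollection>true</ServerGarbageCollection>
</PropertyGroup>
```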
Interesting stuff. Thanks for sharing
No probs!
Very useful!! By the way, what font did you use in the editor?
Thanks! It's Consolas I think.
@0:16 it is not one billion, it is 10 million, considering the range and the zeros in the code!! Can you check again?
Well spotted 👍
In the case of limited memory, it may be a good choice to use workstation gc when you need to repeatedly load many different images in the disk...? Should I use it?
Hello, I have an ApiService to launch to production with .NET 6 and MySQL running in Docker on an Ubuntu GCP e2-small instance (2 GB memory and 1 processor). Would it be good to apply this configuration for 150 users maximum?
Thank you for your opinion.
It is my first application and I have had this doubt for a long time.
If the instance has one processor it should be using Workstation GC already according to the docs. Could be worth setting the config value and seeing if there is a difference anyway.
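One way to see which mode the runtime actually picked (rather than what you configured) is `GCSettings`. A small sketch using the real `System.Runtime.GCSettings` API; useful on a single-core box, where the runtime falls back to Workstation GC even if Server GC was requested:

```csharp
using System;
using System.Runtime; // GCSettings

// Report the GC flavor the runtime chose at startup.
Console.WriteLine($"Server GC:    {GCSettings.IsServerGC}");
Console.WriteLine($"Latency mode: {GCSettings.LatencyMode}");
```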
Question: You mentioned a constrained environment, such as a single processor - would this therefore be useful when dealing with Blazor Webassembly applications - where you get a single thread from the browser?
This is a very good question. The answer is I have no idea - I'm guessing that a Workstation-GC-like mode is used for Blazor WASM.
@@edandersen Well, I added it to a WASM app I have that is not in use - and, err, no idea if it made any difference, if I am honest...
@@TheDuerden Yeah I'm not sure this applies to browser WASM. More of a server side thing.
isn't there an addon for that yet?
Wow, so useful
Really? Cool!
Should help out if some sysadmin is saying "your dotnet app is using too much memory", which happened to me and led to this discovery.
Excellent
question!
this example, of generating million/billion/whatever semi-random strings
what is it an example of?
please note: Mb is mega/mebiBITS. MB is mega/mebiBYTES.
Correct!
@@edandersen Yes, but, I mean, I understand very well what you are demonstrating (that there are _two_ ways memory can be freed: one, it becomes available to the managed process; two, it becomes available to the _OS_ running the process). But it's like this:
if I demonstrate that I made a sorting algorithm that sorts one million ints very fast,
that's nice,
but what will it benefit me when in my program there are always exactly (say) 1024 ints and they are _known_ to be almost sorted?
I see this very, very frequently:
"this sorting/whatever algorithm is 5% faster!"
while there is never, ever, a problem where sorting a completely random, very big sequence of numbers will actually help.
Why are we doing this?
The video is just to demonstrate the behavior of the garbage collection modes. Creating millions of random Barry strings is a way to demonstrate. Nothing more
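For context, the demo being discussed is roughly of this shape - a sketch only, not the video's exact code (the "Barry" prefix and the 10 million count are taken from the comments above; everything else is illustrative):

```csharp
using System;
using System.Collections.Generic;

// Allocation-heavy demo: build millions of short-lived strings and
// watch the managed heap grow, then compare how Workstation vs
// Server GC return that memory to the OS afterwards.
var random = new Random();
var strings = new List<string>();
for (var i = 0; i < 10_000_000; i++)
{
    strings.Add($"Barry{random.Next()}");
}
Console.WriteLine(
    $"~{GC.GetTotalMemory(forceFullCollection: false) / 1_000_000} MB on the managed heap");
strings.Clear();
```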
@@edandersen okay, I can accept that perfectly
@@GeorgeTsiros Good question in this world of data abundance.
Yeah, why are you using a computer? I think, in the best-case scenario, you'd like to solve various problems, utilizing various techniques to render answers that satisfy your questions?
Since garbage collection interferes with this process, and is a memory hog while doing so, this vid is a nice example showing the difference in utilization. So if you aim for an app with maximum throughput, the overhead should always be examined.
Damn... I was swapping my MacBook for no reason at all
It's so nice to see small languages keeping up. If they can reach Java 8 level today already, imagine what tomorrow will bring: they will be actually usable in prod lol
🤣 Quality comment