Open .blend files faster
- Published: 1 Jun 2024
- 🔽 The full playlist: Master Optimization in Blender 🔽
• Master Optimization in...
▶️ CHAPTERS ◀️
00:00 - Intro
00:18 - Why files open slow
01:17 - The exception
01:43 - Optimize viewport
And large files can be slow if you have a slow hard drive, because the data itself takes time to read. It’s all a bit of push and pull.
The newest Blender versions have a "Bake" node in Geometry Nodes, so you don't have to recalculate everything every time.
So counterintuitive at first, and then so easy to understand just a few seconds later!
This is not new information to me. I've gotten quite deep into blender already. But you are one of the best if not the best when it comes to tutorials and information! Keep up the good work!
This is fantastic information to learn!
Oh, the promise of proxies in the next video intrigues me. I know it's a native feature in programs like 3DSMax, and the closest Blender had was the LODify addon, made some time ago, which isn't compatible with modern versions... Instead, I've just been working on things in parts and linking them into the main file as they get done.
Don't get too excited. The workflow I show isn't quite the same as proxies in Max. It procedurally calculates the LOD when you open the file. That makes it super simple to use, and very handy. But it has to calculate the modifier.
There are other workarounds. You could have two versions of your object, with one hidden in the viewport and the other hidden in the render. That collection could be instanced to scene or linked to another file. But I don't think you'll get around the need to actually load the entire hi-poly when you open the file.
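The two-version workaround above can be sketched in Blender's Python API. A minimal sketch, assuming both objects already exist in the scene; the object names (`Tree_HI`, `Tree_LO`) are hypothetical placeholders:

```python
import bpy

# Hypothetical object names -- replace with your own hi/lo-poly pair.
hi = bpy.data.objects["Tree_HI"]
lo = bpy.data.objects["Tree_LO"]

# Hi-poly: hidden while you work, but visible to the render engine.
hi.hide_viewport = True   # the monitor icon in the Outliner
hi.hide_render = False    # the camera icon

# Lo-poly: visible in the viewport, excluded from the final render.
lo.hide_viewport = False
lo.hide_render = True
```

Note that `hide_viewport` is the monitor-icon "Disable in Viewports" toggle, not the eye icon (`hide_set()`), which only hides the object temporarily.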
I like this series 👏 🙌 👌
Hi Robin, could you please do a quick tutorial on how to do a clean reinstall/install of Blender and choose which add-ons to reinstall (the paid ones)? ... Is copying the folders into the new Blender OK?
And when it comes to rendering as well, it's both memory ballooning and time-consuming (re)calculations on every render being performed.
So the balancing act is to identify the most resource-heavy causes and apply those, while still benefiting from procedurality...
(It's a bit off topic from this video, but I thought I'd share anyway :)
Great point. I remember reading in the context of 3DS Max rendering that each modifier forces the program to load the model all over again, effectively doubling VRAM use for each modifier. Do you know if that is true for Blender as well?
@@robinsquares That's interesting. I believe that's what's described as a "memory leak", but no, in my experience memory usage remains relatively stable once calculated. It's the processing of those procedurals per render that really takes a hit on render times...
However, the render time can be comparable to "applied data" if "persistent data" is enabled, albeit quite memory-heavy in combination with procedurals.
Oh, and maybe my use of the term "memory ballooning" was a poor choice of words. What I intended to convey was a comparison with applied data (which typically requires far less memory :).
@@GinnyGlider Awesome. I'd love for you to add your two cents to my video for tomorrow. It's about rendering, and it sounds like you know a thing or two about it.
By the way, do you know of any way of turning instances to points in Geometry Nodes while preserving the instance rotation, so it can be reinjected later?
I have a question: do you know, when we render an animation with procedural shading, whether Blender has to recalculate the procedural stuff for each frame?
Good question. As I understand it (and I am not a render engine engineer, so take it with a grain of salt), it has to recalculate everything for each frame if you use Eevee. With Cycles, I believe it's more like it recalculates for every pixel on every single sample; but at that point it doesn't compile the entire material, it just calculates what that pixel should look like.
That's for shaders. Modifiers definitely have to recalculate each frame. But it may be that turning on "persistent data" in the render settings could reduce some of that redundancy.
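The "Persistent Data" toggle mentioned above lives in the render settings and can also be flipped from Python. A minimal sketch:

```python
import bpy

scene = bpy.context.scene

# Keep evaluated scene data in memory between frames of an animation
# instead of rebuilding it from scratch for every frame. Faster, but
# uses more RAM.
scene.render.use_persistent_data = True
```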
@@robinsquares thank you for taking the time to answer. I love this series. It’s very interesting.
Hello Robin! Thanks for your videos. I'm new to your channel. I would like to ask a question. Could you make a video about scene optimization? Namely, large scenes that require a lot of video memory. For example, I need to render a scene that takes up 14 GB of video memory. I have 8 GB video memory. How can I divide this scene into layers so that I can easily render it? This would be a great topic for a video, especially for guys who do exteriors.
For the time being, I find that the "Bro 3D" video named "How I Render Large Scenes Very Fast and Easily in Blender" is quite understandable... There's one tip that I don't see mentioned much, and that is to disable "viewport display"
(NOT the eye icon) before rendering, as that also takes up an unnecessary amount of memory when rendering.
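That monitor-icon toggle is `hide_viewport` in the Python API, and for a big scene you can flip it for a whole collection at once before rendering. A sketch, with a hypothetical collection name:

```python
import bpy

# Hypothetical collection name -- adjust to your scene.
for obj in bpy.data.collections["Heavy_Props"].objects:
    # "Disable in Viewports" (monitor icon). This is NOT obj.hide_set(True),
    # which is the eye icon and doesn't free viewport memory the same way.
    obj.hide_viewport = True
```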
Lucky you! I have made that video, and it will be available in two days.
Great!! But now it will be possible to bake geometry nodes. I haven't tested it yet, so I don't know how it behaves when I close and reopen the file.
a few questions homies…
how to bake procedural materials?
how to make proxies?
and when is it appropriate to use the Decimate modifier instead of the Remesh modifier? (I usually use Remesh.)
You got it:
- Look up a baking tutorial. It's easy. One button.
- Blender doesn't have a real proxy system anymore, but you can use the geometry nodes setup I show.
- Decimate collapses neighboring vertices; remesh makes a new lattice and conforms it to the mesh. Different use cases.
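If you want to compare the two side by side, both modifiers can be added from Python. A minimal sketch, assuming the active object is a mesh:

```python
import bpy

obj = bpy.context.active_object

# Decimate: collapses neighboring vertices while keeping the original
# topology's overall shape. ratio=0.5 targets roughly half the face count.
dec = obj.modifiers.new(name="Decimate", type='DECIMATE')
dec.ratio = 0.5

# Remesh: builds a brand-new voxel lattice and conforms it to the surface,
# discarding the original topology entirely.
rem = obj.modifiers.new(name="Remesh", type='REMESH')
rem.mode = 'VOXEL'
rem.voxel_size = 0.05
```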