Now write a whole OS
thank you so much
Yea, Would love to see that
Yes please!
And make it Android, GNU/Linux, MacOS, Windows, FreeBSD, IOS apps compatible
This comment gives hard ‘Do a flip!’ vibes.
You attained such speed because your "cache" is just reading a file of paths; the real, realistic time is the ~50 sec you showed, since there is no cache invalidation. But hey, enough to fool most people apparently
this is TikTok content stretched out to 4 minutes for no reason, every video this guy has made is either a superficial project or something he gives up on halfway through
gpt "programmers" are easily impressed
In practice this is more useful if you're looking for multiple files though. I wish windows would keep a cache.
Sure it's a cheat, but it doesn't invalidate it being useful. It just changes the acceptable use cases
@@CatMeowMeow as I'm modifying and removing files too often it's useless for me, luckily visual studio code has a pretty powerful search system
@@gonzalolog wouldn't you be able then to use windows file events to update the cache?
rust programmers be like: "so I found this library that does everything for me" and posts it as a video
okay mr assembly, you don't have to implement everything from the system level
@@maerto where's the fun then, also c and asm > rust
A file explorer is a pretty good project imo because it's easy enough on a basic level but it also can be made significantly more complicated if you want to torture yourself. Optimizing stuff like this is always very satisfying as it's what us - programmers - are paid for. I would love to see this project with a better UI, i think it's worth it.
And finally,
"wOw" - conaticus, 2023
I started playing with similar project several years ago. Now it is almost done, approaching 1.0 release... Just google for HiFile file manager... And I can confirm that it is really a lot of work if you want to have your file manager feature rich, crossplatform and supporting archiving! :)
@@vladimirkraus1438 I just searched for it and didn't find the repo. Link it, please?
@@vladimirkraus1438 but is it fast?
@@vladimirkraus1438 That file manager looks really good!! Congratulations!
yeah, the next level for search is to implement it in a way that it also knows about updates to the disk and you can have multiple file explorers open at the same time
afterwards you want to add support for finding stuff in the file contents
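A naive content search is easy to sketch with just the standard library, though it gets slow quickly; the helper name and the choice to skip non-UTF-8 files below are my own, purely for illustration:

```rust
use std::fs;
use std::path::{Path, PathBuf};

// Recursively collect files whose text contents contain `needle`.
// Hypothetical helper for illustration; binary files are skipped by
// simply ignoring anything that isn't valid UTF-8.
fn search_contents(dir: &Path, needle: &str, hits: &mut Vec<PathBuf>) {
    let Ok(entries) = fs::read_dir(dir) else { return };
    for entry in entries.flatten() {
        let path = entry.path();
        if path.is_dir() {
            search_contents(&path, needle, hits);
        } else if let Ok(text) = fs::read_to_string(&path) {
            // Reads each file fully into memory, which is why this is only a sketch.
            if text.contains(needle) {
                hits.push(path);
            }
        }
    }
}

fn main() {
    let mut hits = Vec::new();
    search_contents(Path::new("."), "TODO", &mut hits);
    for p in hits {
        println!("{}", p.display());
    }
}
```

A real content search would want a persistent inverted index rather than re-reading files on every query.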
The speed is honestly insane compared to the usual search! Great job man!
@@RadinTheFroggta 6?
Does it also beat Everything? I am asking because, if I'm right, that application does raw NTFS searches, without the indirection of the Windows API in between, making it practically unbeatable, unless the application in this video is doing the same thing.
@@jongeduard Everything caches the data too so the result should be ~the same.
@@jongeduard it's just a cache and there is no invalidation
find / | grep "your search" is faster lol
Yeah... But what if your file isn't cached? Does it just not appear?
You should do an analysis of the Windows search, indexing and all. Then see where it leaps ahead.
If the file isn't found in the cache, it must be in a modified or new folder, so all you need to do is not check where you know it isn't, based on the modified folder time and the cache update time, and whether it was in the cache or not before the search.
Or something like that. Comparing this to Windows search isn't really fair, as it searches file contents as well, like txt files, etc., idk.
@@johnmarston2474 this is exactly why I'm saying you should research the system you're saying you're improving. It's not fair to say you made a fast Explorer if you implemented it and missed a massive part of how it works.
Windows search might be slower but it checks file contents and thus they didn't actually make the same functionality but fast.
They could do a fast search and then do a file content search, a sort of two stage search. That would make more sense for this title.
@@johnmarston2474 oh so it's like that meme explaining how the missile algorithm works.
@@StiekemeHenk linux file explorers are still faster, and they also index content
The explorer search does more than look at file names. It also does some content reading.
I'm not sure that's completely true, but it doesn't really matter since the search barely works.
@@IshCaudron It's true, look at the search settings: there's an option to enable content searches, which explicitly warns about performance degradation.
@@IshCaudron it's slow indeed, because the indexing feature is not applied to the entire drive. In your implementation you're basically caching the entire tree. This can also be done for the windows search, but you have to manually set it that way, because indexing comes at a cost. If you're on Windows 10 or 11, you can look at the indexing settings in the start menu.
@@minciNashu not slow, I'm saying it doesn't work. It doesn't find things on the drive even if I have an explorer window showing them. It's just broken.
@@IshCaudron file explorer search is based on indexing. If your file doesn't get indexed, it won't be found, regardless of whether the file exists or not.
If you enable windows search indexing (enhanced mode), searching via explorer or start menu works amazingly fast.
Wanted to say the same. Like querying your database without an index, just going from beginning of storage to the end of storage looking for whatever you're searching for lmao.
Yo super cool project. BTW there's something made just for this called "Everything" which also searches your entire file system faster than windows could ever imagine.
i use this every day also power toys has this feature too
Yeah, he could have just written an explorer with Everything integration. But his results were pretty impressive as well while using a similar approach.
But I would say it's not possible to search faster than Everything, as it literally updates the results per keystroke.
I actually installed it on computers of all my friends when I just wanted to search for 1 file, as it was faster to install Everything than to find it any other way. Those who understood its power continue to use it for everything 😉
SSD users can just use dir.
@@TariqSajid what? did you mean powertoys run? or is there something like everything inside powertoys
@@BarraIhsan yes i was talking about powertoys run
What a lovely thumbnail, the group that designed it has to be quite talented :O
100%
It's such a shame that they'll never see our appreciation
Linux Users (like me): "Lol, we just use the terminal! And our GUI explorers are fast anyway! XD"
love dolphin
@@xvnexus8814 Dolphin is good.
Haha, yes :)
find locates files very quickly, walks my entire system drive and my games drive, and finds the files I need. I've also heard about fd, which is multithreaded and works even faster ⚡ Honestly, even though there is a GUI, it's still easier for me to search for files by opening a terminal and using find; it lets you specify much more flexibly what you need to find
Rust developers in a nutshell: I can't figure out how to turn on Windows indexing, so that's a reason to write it myself from "scratch" with a bunch of libraries...
Windows indexing is shit. It takes loads of performance in the background and can't handle large amounts of file changes. If you try indexing an entire disc you can literally trash the pc.
Use "Everything" there is literally no discussion on what the best file search is. It's Everything period.
@@redcrafterlppa303 Yep. Windows indexing is a lot worse in terms of metadata size per file, and it indexes content by default as far as I know. But as an out-of-the-box solution it's enough, especially on "modern" hardware
@@redcrafterlppa303most pc's today can handle a bit of indexing
@@CorrosiveCitrus not if u have 20-30 tabs open with vscode and notion running alongside it.
The indexing is bad specifically because my pc just can't take it with all the other things going on.
This is awesome! I've thought about doing this for a while but never got around to it haha
"I Made"
> Proceeds to have chatgpt and random libraries do all the work while he just assembles the kid puzzle.
Make yours
are you using jet brains? I wasn’t aware that actual people used jetbrains
what happens if a file is created after the cache? does it know to recache?
also i think windows search is understandably slower than this since it searches for non exact matches (like if i search for eclipse it will show me eclipse-workspace too), and it doesnt have a cache since it needs to find recent files.
so basically yours was like a bit faster but then you added caching, which in my mind is kind of pointless since it constantly changes, but i guess it helps.
As was shown, substrings are also searchable. The cache doesn't change much. I believe the hybrid approach is the best in this case: first load the cache and ask the filesystem if the file still exists. If it does, congrats, you found a match. If it doesn't, delete it from the cache.
And in parallel the slow search should happen which will re-evaluate the cache
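A rough sketch of that hybrid lookup, assuming the cache is simply a map from lowercase file name to known paths (all names here are illustrative, not taken from the video):

```rust
use std::collections::HashMap;
use std::path::PathBuf;

// Illustrative cache type: lowercase file name -> known paths.
type Cache = HashMap<String, Vec<PathBuf>>;

// Return cached paths that still exist, pruning stale entries as we go.
// A full implementation would also kick off a slow filesystem walk in the
// background to pick up files the cache has never seen.
fn verified_hits(cache: &mut Cache, query: &str) -> Vec<PathBuf> {
    let query = query.to_lowercase();
    let mut hits = Vec::new();
    for (name, paths) in cache.iter_mut() {
        if name.contains(query.as_str()) {
            // Drop paths that no longer exist on disk (one cheap metadata check per hit).
            paths.retain(|p| p.exists());
            hits.extend(paths.iter().cloned());
        }
    }
    // Remove names whose every path turned out to be stale.
    cache.retain(|_, paths| !paths.is_empty());
    hits
}

fn main() {
    let mut cache: Cache = HashMap::new();
    cache
        .entry("example.txt".to_string())
        .or_default()
        .push(PathBuf::from("C:\\Users\\me\\example.txt")); // made-up path for the demo
    for hit in verified_hits(&mut cache, "example") {
        println!("{}", hit.display());
    }
}
```

The nice property is that stale entries disappear lazily, paid for by one existence check per match.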
@@phoenix-tt true i guess the hybrid solution is best
sorry i missed the substrings being searchable
this video is pretty cool though its inspired me to make my own in rust
@@lythd Yeah, it's a cool task, good luck mate
@@phoenix-tt thank you marcel
Great video, and I found it really interesting :)) Also, you became a victim of a premature optimization with your BK-tree :P
Nice work! There is also a program called Search Everything that runs in the background. Its whole purpose is searching (you can use wildcards and that stuff) and it is also super fast. I am mentioning this as a piece of information, I don't mean to insult or anything.
Yep, I was going to mention this. It's a fantastic program. Even the initial run only takes a minute to cache all the filenames, and after that the searching is instant.
Yess! You can also sort, add exceptions, hotkeys, everything (see what I did there) you want! I like to leave the sort on "date modified"; with this I can see in real time what's happening in Windows. It's useful for me because I sometimes just save a quick Word or Paint file fast and forget where it is. In that case I just pull up Search Everything and see what was last edited. Game changer
Thank you for making this file explorer! Now I wouldn't have to wait 7 minutes just to find one file anymore.
Everything is a... thing, and it does searches by reading the NTFS data directly.
Much faster than this.
@@v0xl just heard about it
Try enabling full disk caching for windows search, they have it disabled by default even if you have a powerful pc
This search is not fast because of the Rust programming language. Rather, it is fast because of the cached data structure you are using and the algorithm you are using to access it. Searching for strings inside a single json file will be way faster than running file search queries across a file system on a hardware disk for sure. Windows should be able to achieve similar or better results. I have no idea if the search is using the windows index or not, as it can be turned off
He just cached everything and searched there, but there could be a case when cached data is old and not up-to-date (some files/folders could be already deleted or renamed, new files could be already created at that time). So there must be some syncing that runs every n minutes/seconds. Am I right?
@@ulanaitbay afaik there are "watchers" that allow your program to know when a file/folder is modified, one could use that to add/remove cache entries dynamically
@@ulanaitbay You are indeed correct.
@@m.projects watching every single file on a drive is very inefficient.
Thanks!)
you don't even know how happy I was when I saw that you're using CLion, not a lot of creators use it, but it's such an amazing IDE
Amazing 🤩🤩🤩
You're absolutely inspiring and smart
Pls do more videos like this
that's pretty awesome lol, makes me wanna do something similar
Good job lad, you got a subscriber. Hope this goes viral.
That's why i always use "Everything" for searching, it's aeons ahead of the default file explorer searcher
Literally the first thing I install in windows 😂
Same :)
It's pretty much unbeatable as it literally searches multiple disks at the speed of a keystroke.
It's literally faster to install it and initialize it than waiting for a windows search result.
For those that want a native solution, you can also enable full disk indexing within windows search to achieve the same thing
I would like you to note that the windows file explorer's search function is a little more complicated than that, as it searches through the contents of the file as well. It looks for keywords in the files, or uses certain search terms to match with the file name. It's not just a simple .includes/.contains match.
Such great content. This is so cool and interesting. I need to come up with smaller project ideas that can be tackled in a few days like this one.
that moment you use your whole cpu/gpu and ram to just find that 1 file on your harddrive.... i love it
and if you delete a file? does the cache update? ...
As far as I know, Windows' search is significantly slower as it also searches file metadata for supported file types. There are apps that implement pretty fast searching using indexing as well though, like voidtools' "Everything" tool is blazing fast if you give it the time to index all your drives, and it's pretty good at picking up changes in your FS almost immediately and indexing those changes.
yes!!! more rep for the best search tool
and the reason Everything can pick up changes so fast is because it reads the changes journal of the filesystem directly, which you can see by doing Ctrl+` (backtick) (closing the debug window will close Everything, do the keybind on the search window again to hide the debug window)
I should work on a file explorer, seems like a fun project to do. Great job on how fast it was
This is awesome. Great stuff!
This really inspires me to try Rust ! Thanks for sharing and good trail and error!
Great work!
This is actually so sick. I need this so much in my life. Please keep updating this
Look up Voidtools Everything :)
It's not really a file explorer but it definitely finds files quickly
What about cache invalidation? What if we edit, delete or create a file from another application like Visual Studio, will the data be updated in the cache?
no
> I made a FAST file explorer
> uses react
Okay so what happens if a new file comes in or gets removed?
One improvement you could do for NTFS file system is use NTFS USN Journal. Everything already uses that and it's insanely fast!
Other than that, great project, good work!
I think putting "Everything by Voidtools" would help provide context here
Next: I made whole GNOME Desktop Environment but faster (in Rust)
you also could just use "Everything" and "Explorer" together (Everything is like a super fast search engine for your files)
I was about to say this lol
And tomorrow, Spacedrive, a file manager that also uses Tauri, should finally be released. Great video and nice timing xD
This looks like a really fun project to put together, but as others have said there are some comparisons to be made. Windows explorer works without extra disk space, works on even the slowest of drives, and takes no extra boot time even after thousands of file changes. For this program to work at full speed, you'd have to have an insanely fast and empty disk to begin a cache and be ok with it loading/flushing that cache every time. Most people who use the search feature don't have such hardware, which is why Windows had to sacrifice performance for full compatibility with all hardware.
this is only practical if you are planning on searching stuff in bulk...
its undeniable that the windows search is slow af but comparing it to your caching approach is like comparing apples to oranges.
windows search is used after an unknown amount of changes were made to the file system, meanwhile in your approach you need to re-cache everything whenever you make any change, so the boot time of the program is similar to a normal windows search (please correct me if I'm wrong, I have no clue what your caching times are since you didn't really show them). so yes your code is faster if you need to make 2+ searches, but for what people usually use it for it's pretty useless
Windows search is still bad, Everything (by voidtools) caches things and updates immediately and can update the results as fast as you type
Probably worse than windows search, as that is designed to index the files in the background when you're not active so as to be as unnoticeable as possible - enabling indexing for your entire drive is probably a much better solution
@@CorrosiveCitrus Even with search indexing on, explorer takes at least a minute to find a file even when I give it the exact name and the start menu takes 10-30 seconds, and much more if the file is in multiple sub folders.
I don't know how the explorer search manages to be so bad, Everything can index my entire computer faster than explorer can find a file.
@@daviddube9215 yeah, explorer somehow manages to be slower than just brute force searching it for every new term
give the kid a break. He is learning and this is great initiative. He is prob gonna be a better problem solver than 99% of engineers.
Glad to know that the file search is not slow only on my computer. Honestly, how come Microsoft never fixed that? Windows after windows and it's all the same, you have to install third party software if you want to find files in your computer.
Because there's nothing to fix. It's a raw-dog search, not an indexed search. Nobody fails to understand that indexed searching is faster than unindexed searching, but indexing operations are expensive and you don't want to be running them on your system nonstop for ages.
If your files are constantly changing, indexing can easily be out of date, and you then lose the benefit of indexing because you have to re-index before every search, which, surprise, is basically what windows search does.
There is a "fix", just go to your start menu and search for "indexing options" and turn on indexing for whatever folders that you want faster search in.
And as @@Sammysapphira mentioned, indexing is expensive, so it will take a while for windows to build a full index in the background. But when it is done, searching in that folder is usually under a second.
the fix is to install "everything" and search using that instead
@@Sammysapphira Yeah if not for the literal fact that Windows already has indexing (at least partial) for other features and a boatload of "maintenance" or otherwise labeled services that spin all your drives at random, even when not needed at all for the current operation.
Enabling full disk indexing will be the answer
It's not enabled by default because it's taxing to the system and is supposed to only run while you're not active, so it needs to complete in a short time, but with a decent enough pc, or one you leave on, it should work fine
How do you detect when the cache becomes outdated without scanning the whole filesystem whenever the program opens?
I see the Tauri logo! Actual walking W
Honestly a clean FM using the Everything SDK for searching in the background would be my preference.
wait till he finds "everything"
"so I just spent the entire afternoon looking at fancy searching libraries just to find that I can just do this with a hashmap" hashmaps are in fact so overkill sometimes it isn't even fun to use them
This is cool, but can it be set to just update the cache every now and then? Perhaps on the creation or movement of a file? If a file gets deleted while the explorer is running, will it notice the file no longer exists when you run a new search in the same instance of the explorer?
Finally a project which makes use of tauri on YouTube. We need more tauri fr
Nice that you used Tauri! How was the DX for you?
can't wait for the official release from you, then I only need to run the installer
"There are only two hard things in Computer Science: cache invalidation and naming things."
Unlike most of the usages of this quote, this video is not about the second one
ur my fav youtuber now.
Nice work!
How do you have that A: drive? I thought A and B are assigned only to floppy disks and now are just a part of the windows history
Quick question: how do you handle cache invalidation? What happens when the filesystem changes?
Good question! I was trying to handle this with the notify crate. Sadly doing that asynchronously went past my Rust competency and I didn't have time to implement it 😅 It would look something like watching for a new file change, updating the cache in memory and then updating the cache on disk every so often
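For reference, the synchronous shape of that with the notify crate is roughly the following (a sketch against the notify 5/6 API; the tricky part mentioned above, wiring it into an async backend, is skipped here):

```rust
use std::path::Path;
use std::sync::mpsc::channel;

use notify::{recommended_watcher, Event, RecursiveMode, Watcher};

fn main() -> notify::Result<()> {
    let (tx, rx) = channel::<notify::Result<Event>>();

    // The watcher sends an event for every create/modify/remove under the watched path.
    let mut watcher = recommended_watcher(tx)?;
    // Watch the current directory for the demo; the real thing would watch the drive root.
    watcher.watch(Path::new("."), RecursiveMode::Recursive)?;

    for res in rx {
        match res {
            Ok(event) => {
                // Here you'd add/remove the affected paths in the in-memory cache
                // and periodically flush the cache back to disk.
                println!("fs change: {:?} {:?}", event.kind, event.paths);
            }
            Err(e) => eprintln!("watch error: {e:?}"),
        }
    }
    Ok(())
}
```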
I love how you even gave the windows explorer a head start and yours still won lol
I will show this video every person which will say that rust is over hyped. Great work.
Wise Jetsearch is an amazing tool to search multiple drives at the same time under 10 seconds
There's a really good program called "everything" that I use to search for files normally. It's been my go-to over default windows search for years now.
The problem with this implementation (as far as i know) is that if you want to keep all files findable, it is impossible to rely on the file cache alone. If you download a file and haven't cached the filesystem after that, you won't be able to find it, so you will have to cache very, very often, which would slow down the system.
Another option would be to use the cache to show those results first (which would work most of the time) and then keep looking for files that match the search that are not cached, so at first it would quickly show you the files that are already cached, and after about 50 seconds show the rest (it would be an improvement compared to Windows Explorer, but in the base, in certain cases it would be the same).
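A sketch of that two-stage idea, plain std only: print cached hits immediately, then stream late hits from a background walk over a channel (the placeholder cache and all names are assumptions, not the video's code):

```rust
use std::collections::HashSet;
use std::fs;
use std::path::{Path, PathBuf};
use std::sync::mpsc::{channel, Sender};
use std::thread;

// Stage 2: slow recursive walk that reports matches the cache didn't know about.
fn walk_and_report(dir: PathBuf, query: String, seen: HashSet<PathBuf>, tx: Sender<PathBuf>) {
    fn walk(dir: &Path, query: &str, seen: &HashSet<PathBuf>, tx: &Sender<PathBuf>) {
        let Ok(entries) = fs::read_dir(dir) else { return };
        for entry in entries.flatten() {
            let path = entry.path();
            let name = entry.file_name().to_string_lossy().to_lowercase();
            if name.contains(query) && !seen.contains(&path) {
                let _ = tx.send(path.clone()); // late hit not present in the cache
            }
            if path.is_dir() {
                walk(&path, query, seen, tx);
            }
        }
    }
    walk(&dir, &query, &seen, &tx);
}

fn main() {
    let query = "report".to_string();

    // Stage 1: instant results from the cache (loaded elsewhere; placeholder here).
    let cached_hits: Vec<PathBuf> = Vec::new();
    for hit in &cached_hits {
        println!("cached: {}", hit.display());
    }

    let seen: HashSet<PathBuf> = cached_hits.into_iter().collect();
    let (tx, rx) = channel();
    thread::spawn(move || walk_and_report(PathBuf::from("."), query, seen, tx));

    // Late hits arrive as the walk progresses; the channel closes when it finishes.
    for path in rx {
        println!("late: {}", path.display());
    }
}
```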
You don't understand, Windows cares so much about user experience that they know the average user is gonna want to go to the kitchen and grab a cup of coffee while they are using file search so they make it slow on purpose.
You've taken all the fun out of searching for a file.
Something you could do to account for changes in the file tree is have it redo the caching when it finds nothing. Only thing you should keep in mind is to show the file that's found during the caching immediately, not after the caching is done.
This is really good but you need a background thread that’s constantly updating the cache
I'm pretty sure Windows doesn't cache most directories? You can tell it which additional directories (in addition to Documents etc.) to index, I think
But windows caching is quite a performance killer. I disabled Windows indexing Service and literally gained fps. "Everything" is literally the answer to all file search related questions. It's as fast as possible as it changes the search result as you type in the search bar without any delay.
@@redcrafterlppa303 Oh yeah I had my fair share of weirdness with it and was speaking more theoretically
If I need it, I use "Everything" as well
great project
thanks for uploading videos
Kind of funny that you basically recreated the unix utility "locate" (more easily googleable by its indexing command "updatedb")
Alright underrated channel I just found!
I have 12TB, does it mean it will cache into a big, very big file and memory usage? I'm using voidtools' Everything tool and it consumes 700MB for the caching, of course, ignoring node_modules folders, .git, windows, program files, etc. and I guess Everything uses a special way to list files in the NTFS (file system) USN Journal, which speeds up the caching process.
There are a lot of “programmers” in the comment section who have absolutely no idea what they’re talking about.
My guy, what happens when your file structure changes and now your cache is invalid?
The cache generation took longer than the explorer search. And what if a new file appears, do you have to wait 260 more seconds? It's not faster, it's just slower at the start and faster in the end.
I require this searching speed
These videos are pure QUALITY!
How are you like "connecting" the NodeJs and Rust?
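In case it helps: the usual Tauri bridge is a #[tauri::command] function on the Rust side that the frontend calls over IPC; a minimal Tauri 1.x-style sketch (not necessarily how this project wires it up):

```rust
// src-tauri/src/main.rs (sketch)

// Exposed to the frontend as the "search" command.
#[tauri::command]
fn search(query: String) -> Vec<String> {
    // Real code would hit the cached index here; this is just a placeholder.
    vec![format!("you searched for {query}")]
}

fn main() {
    tauri::Builder::default()
        .invoke_handler(tauri::generate_handler![search])
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}
```

On the JS side you'd call it with invoke('search', { query: '...' }) from the @tauri-apps/api package.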
Forgot to add that the Rust Project did not review and approve your work and does not support it.
lmfao he better get rid of the logo on the thumbnail xD
A good example that windows can be optimized WAY more than it currently is, but M$ being M$ hardly gives two shits about it
But how do you update the cache to catch up with filesystem updates?
Configure your indexing options on windows. My laptop has the minimum requirements for windows 10 and it searches less than a second.
Fun fact: explorer doesn't search only files. It searches network locations, file contents and extensions to it
So you made spotlight search
Both use a cache
4:08, If it took 303ms to search, imagine how much faster it would be when compiled with optimisations enabled
but that's rust, what compile optimizations does it have? is it like c++ where we pass flags?
@@ko-Daegu rustc is llvm based, so yes, it's somewhat like c++ with clang. Yes you can pass flags to rustc through cargo
@@ko-Daegu rust by default compiles with no optimizations in debug mode. In release mode it does use things like -O3 on LLVM
@@sohn7767 ah thankx that's what i was actually looking for
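For anyone wondering what that looks like in practice: building with `cargo build --release` already uses opt-level 3, and the release profile can be pushed a little further in Cargo.toml (standard Cargo options, both optional):

```toml
# Cargo.toml — optional extras on top of the release profile's default opt-level = 3
[profile.release]
lto = true          # whole-program link-time optimization
codegen-units = 1   # better optimization at the cost of compile time
```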
also why the fuck are all the reddits down, can't get into the r/rust communities?
@@ko-Daegu pretty big news, there's a blackout from protesting subreddits. Read about it, it's long to post here.
That's actually slow as crap compared to Everything from Voidtools which gives you search results simultaneously
Nice, but what if you create a file once all files have been cached already?
they do? it’s that json file
if the file is deleted xD tho and ur cache still has it?
But how fast would it be to hash the table, since it's not really dynamic?
Bro you sound exactly like how i remember wilbur soot
"I made a file explorer in rust" repo being 60% typescript.
to be fair, windows explorer gets the metadata for every single file it scans; if you skip that step it would be much faster, which is what you did
it is easy to curse at windows, but if your "remake" is not a 1:1 recreation, then it can't be used as a benchmark.
I think windows search also searches for attributes like artist name etc
also, how long does your "cache" last til it expires?
and what would happen if I made a new file after the cache was made?
@Watcher he used a hashmap at the end.
Use Everything it's the undebatable champion of file search engines.
Everyone on YouTube: Create full stack applications with this.
Me when I actually tried to do so: Oh f**k bro, it is such error-prone and outdated kind of coding, and a lot harder than it looks, it frustrated me too much ☺️
that sounds fun, I wanna do something similar
That’s a good idea for a project, the gui was done using Tauri right ?
Next time: Building a blazingly fast Windows in Rust
It feels like I'm watching a sibling of the YouTuber Wilbur Soot
who made it better?
multimillionaire company with thousands of employees ❌
a guy ✅
*a guy with glasses
*a multitrillion dollar company
This is absolutely beautiful
When is the cache invalidated? Because after changes on the drive it would need to be updated.
Hello, first of all I wanted to say that I respect your project.
I've got a question tho. Wouldn't you rather call this a faster file finder? Whenever I would copy/open/delete a file I still need to be in the determined filepath.