Hi everyone!
First off, I want to thank you all so much for the insane amounts of support / feedback and all around great discussions and suggestions. It's always a pleasure to read so many different opinions and ideas, especially as I try to tackle projects with many unknown factors to me, so thanks for teaching me stuff as well!
Now, I noticed a lot of you suggested adding a couple of things, and I'll go in order:
1. A requirements.txt file - I mean yeah, duhh, idk how I forgot about that one, consider it done.
2. Add support for subtitles - Also a great suggestion, so I went and did just that. You can now specify whether you want subtitles and in which language using some new flags: --subtitles, -s | --subtitles-lang, -l. If none are found, it'll try to get auto-generated ones. I hope this is a pretty satisfying solution, and maybe it softens the blow of point 4.
3. Add support for RGB - Again, awesome idea (I mean they all are tbf), but for this one I had to sneak in some trade-offs, because, even in this day and age, there are enough terminals that don't support colors by default, never mind a large palette of them. So I decided to use the basic ANSI-8 color palette when rendering (let's just embrace the vintage style ig), and I made it disabled by default in case your terminal doesn't support it. I should also mention that, since the project runs on the fly with everything rendered in real time, adding colors is quite an intensive task, so if you're set on using them, consider playing around with the frame size. You can enable colors with the --colors, -c flag.
Ah, also, if the basic palette is too narrow for your tastes, I made it so you can expand the ANSIConstants.COLORS dictionary - just make sure to follow the established structure, and beware that the more colors there are to choose from, the slower the conversion (there's a rough sketch of what the color matching boils down to right after this comment).
4. And now for the final and most debated point - audio. As you might surmise from that nod in point 2, I decided against adding support for audio. I fiddled with the idea and tried some implementations, but I realized the added complexity would slow down the app, and raise a lot (and I mean a lot) of syncing problems, because, compared to other video-to-ASCII converters, this one does everything in real time, so merging the audio and video at the end into a file is out of the question.
* Also I expanded the README file so there is more info about how to use the damn thing.
** Also also: fixed the color scheme (it was inverted whenever the terminal background was darker than the characters, which is pretty much the default case, so... thanks for pointing it out), added the option to specify whether the scheme should be inverted, and (the 3rd one) made it possible to render using Unicode characters instead of the default ASCII ones - just make sure your terminal font supports Unicode.
All of these are committed to the same repo: github.com/icitry/RUclipsCLI
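(For anyone who wants to grow the palette from point 3: the color step is essentially a nearest-color lookup against a small table. A rough Python sketch of that idea follows - the dictionary below only mimics a plausible shape for ANSIConstants.COLORS, it is not copied from the repo.)

# Hypothetical palette: ANSI SGR code -> representative RGB value.
# The real ANSIConstants.COLORS structure/values may differ - check the repo.
COLORS = {
    30: (0, 0, 0),        # black
    31: (205, 49, 49),    # red
    32: (13, 188, 121),   # green
    33: (229, 229, 16),   # yellow
    34: (36, 114, 200),   # blue
    35: (188, 63, 188),   # magenta
    36: (17, 168, 205),   # cyan
    37: (229, 229, 229),  # white
}

def nearest_ansi(r, g, b):
    """Return the SGR code whose RGB value is closest to (r, g, b).
    Every extra palette entry means one more comparison per rendered
    character, which is why a bigger palette slows the conversion."""
    def dist(code):
        pr, pg, pb = COLORS[code]
        return (pr - r) ** 2 + (pg - g) ** 2 + (pb - b) ** 2
    return min(COLORS, key=dist)

print(f"\033[{nearest_ansi(200, 30, 30)}mred-ish text\033[0m")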
Who read
👇
Python has a getch() alt called input("Text: ")
Is it possible that you have your ASCII char selection reversed?
The way it is right now, black gets assigned to "$" while white gets " "
I think it should be the opposite way
When i reversed the string, things look much more accurate
@@CheatingChicken Indeed that was the case 😅 I've seen multiple people pointing it out, and addressed it in the latest commit. Can't believe I overlooked that - guess I was too focused on it actually working
i dont understand how to download and use it
great! now add full RGB support and make it 8k HDR
but 8k is so last year, 16-32k would be more in line with the standards of today tbf (also RGB is possible, just that you'd probably need a bigger terminal for it to actually be viable - just get the dominant color for each pixel group and use it for the corresponding character)
@@icitry you can print 2 blank spaces (to match the height to width) and change the background color.
@@roaringcow8163 that's exactly what I did when making the game of life and a script that shows the desktop environment display in the terminal.
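(A quick numpy sketch of the "dominant color per pixel group" idea from earlier in this thread - approximated here with a plain per-block mean, which is the cheap version; the frame shape and block size are made up for the example.)

import numpy as np

def block_colors(frame, block_h=8, block_w=4):
    """Average the RGB values inside each block of pixels, so every output
    cell maps to one character on screen. `frame` is assumed to be an
    (H, W, 3) uint8 array, e.g. one decoded video frame."""
    h, w, _ = frame.shape
    h, w = h - h % block_h, w - w % block_w            # drop ragged edges
    blocks = frame[:h, :w].reshape(h // block_h, block_h,
                                   w // block_w, block_w, 3)
    return blocks.mean(axis=(1, 3)).astype(np.uint8)   # (rows, cols, 3)

frame = np.random.randint(0, 256, (360, 640, 3), dtype=np.uint8)
print(block_colors(frame).shape)  # (45, 160, 3)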
you could make it colored using terminal colors. If you place this: ESC[38;2;⟨r⟩;⟨g⟩;⟨b⟩m in front of the char it sets the foreground color, and this sets the background: ESC[48;2;⟨r⟩;⟨g⟩;⟨b⟩m. The colors are not full 0-255 per channel in the Windows console. (ANSI_escape_codes on Wikipedia)
@@icitry btw, you can do a lot with the half-width block character by coloring the foreground and background with different colors
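(Small demo of those 24-bit sequences, for reference - this is standard ANSI, nothing specific to this project; on older Windows consoles you may need to enable VT processing first, e.g. via colorama's init().)

def truecolor(ch, fg, bg=None):
    """Wrap a character in 24-bit ANSI color codes: ESC[38;2;r;g;bm sets
    the foreground, ESC[48;2;r;g;bm the background, ESC[0m resets."""
    out = f"\033[38;2;{fg[0]};{fg[1]};{fg[2]}m"
    if bg is not None:
        out += f"\033[48;2;{bg[0]};{bg[1]};{bg[2]}m"
    return out + ch + "\033[0m"   # reset so the color doesn't leak onward

print(truecolor("@", (255, 120, 0), (20, 20, 20)))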
Linux users removing bloat:
Fr lol
Wrong. It's written in p*thon
as a void linux user, i can confirm
As an arch user, I admit it 😂
As a kali Linux fanboy, I kinda agree(removing unnecessary tools)
just watched this video again using the tool. text was a bit hard to read, which is funny, because its a literal terminal made for text lmao
Well that's Inception-worthy enough. And hey, if you manage to tweak/line up the text pixel perfectly, you may actually get something readable (I, for one, prefer to keep my sanity levels in check though)
@@icitry who needs sanity when you can have neck beard. (I use arch BTW)
@@nabibunbillah1839 based
@@nabibunbillah1839😂
I wanted to create the inception D:
Literally this is a good option for dopamine detoxers
Oh, oh yeah I definitely thought about that when creating it pff (that's actually a really great idea - I'll use it when pitching it to the thousands of investors that'll come flying in any second now /s)
Wait a sec
how 💀
@@icitry i offer -50 dollars for a 20% stake in this product
@@HassanIQ777 coding
@@verizonextron
Dude! This is far out stuff! This reminds me of back in 1990, when we were making "video" with ANSI characters (in color), frame by frame. I wish I had a tape drive to read these old tapes - lots of crazy-ass code we did back then (and had 2 BBSes running continually). Great times!
Ohh that sounds so cool! You should try to post them on youtube or the like, those are some real time capsules
Reminds me of a decade ago when I was a kid and my only computer was a raspberry pi. It couldn't run youtube in a browser, so I installed a CLI interface for it that would let you search videos with the keyboard and then write the video directly to the framebuffer device.
Ohh that's really neat, nice one!
TTYouTube
Average classmate in comp sci class
Cool project, now play Bad Apple on it!
simple task lol
Oh it's been done already 😭
3:41, this guy is fully dedicated, he even bothered fixing the responsive issues, lmao. I totally lost it.
Dude's been rickrolling us for 15 minutes. And we're enjoying it.
Ikr
Now i can finally watch youtube from my Ubuntu Server
At first i thought this was just going to be some youtube-dl frontend for selecting videos to open in mpv... Holy shit this is even better :D
And spare myself from some good ol' prolonged suffering? Couldn't be me
linux users would rather make a custom JS engine than just use a web browser
Windows powershell spotted🤢
imagine having bloatware on a paid operating system
Ladybird browser: am I a joke to you?
Seriously, I'd rather make my own .sh file than use VS Code and set up a project, so it is not a joke at all.
No, the actual usable youtube in text based environment is tty + framebuffer (fbdev) + mpv with framebuffer support + mps-youtube
Isn't that technically a graphical environment, except instead of using a window server you're using a framebuffer, mate?
I use ytfzf and fbterm, way better
I never thought I‘d get rickrolled by a terminal. Geez
Does this support ads? If not, you better expect them coming for you over their TOS /s
This is the final boss for YouTube in their war against ad-block lmao
Really great work btw. I'm blown away how well you got it to work.
Thank you! And guess I'll go start digging for my new anti-yt-tos bunker
Icitry, TOS, TempleOS YouTube, is it even possible?
What about... sound?
Just tap into your synaesthetic powers... But really, I thought about whether I should also include audio, as for consistent sound quality you'd most likely want to download the audio sources - not the most ideal thing if you're actually planning to rapidly navigate through many videos, and also against my idea of it being as lightweight and portable as possible.
@@icitry --audio
Maybe the CC or transcripts can be used?
@@icitry audio is bloat
@@icitry use the pc speaker
you can get double resolution and full RGB by displaying a half block with a custom background color and foreground color.
as a side effect this also removes the stretching
and ANSI characters
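(Sketch of that half-block trick: each "▀" (U+2580, upper half block) cell carries two vertical pixels - top pixel as foreground, bottom pixel as background - which doubles the vertical resolution and roughly cancels the tall character aspect ratio. Purely illustrative, not code from the repo.)

def half_block_row(top_row, bottom_row):
    """Render two rows of RGB pixels as one line of upper-half-block
    characters: foreground = top pixel, background = bottom pixel."""
    cells = []
    for (tr, tg, tb), (br, bg, bb) in zip(top_row, bottom_row):
        cells.append(f"\033[38;2;{tr};{tg};{tb}m"
                     f"\033[48;2;{br};{bg};{bb}m\u2580")
    return "".join(cells) + "\033[0m"

# Example: a warm gradient rendered over a cool one
top = [(255, i, 0) for i in range(0, 256, 16)]
bottom = [(0, i, 255) for i in range(0, 256, 16)]
print(half_block_row(top, bottom))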
Man, I wish I could find the old code we had for creating ANSI "videos" in color (1990) to share on our BBS systems (and the Internet - back when you had to be a student or professor OR pay a friggin' premium)
I remember once compiling a player with libaa and libbb support, pretty cool
Advanced AA libs can use the shape of characters to better represent multiple pixels than just average them, which can lead to much better results.
Oh definitely, this is very much a crude approach
Icitry Python is bloat, C is the way!
@@unucellply4221C is bloat. Assembly is the way.
It is unironically better than actual YouTube:
1. You can watch it without ads
2. You don't have shorts all over your screen
3. Search should be functional instead of showing random videos that "you might also like"
You really have to implement sound through the speaker and the \a symbol. I know it wouldn't really work(
ALSA on an ARM SBC gets weird pitch, but mpv is able to adjust this
Tf are you yapping about be fr bro
I feel like a simple fix that would've made the videos look so much better would've been to simply... invert the character list. Since the characters are white and the background is blue, it makes sense to use the densest character with the lighter pixels, not the darker ones.
Yep, you perfectly pointed it out, thanks for that - I fixed it in the latest commit (still can't believe I missed that)
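(What the fix boils down to, in sketch form: with a dark terminal background the densest glyph should stand for the brightest pixel, so the luminance ramp just gets read from the other end. The ramp string below is a common one, not necessarily the exact one in the repo.)

# A commonly used density ramp, darkest glyph (least ink) first.
RAMP = " .:-=+*#%@"

def pixel_to_char(luma, invert=False):
    """Map a 0-255 luminance value to a character. With a dark terminal
    background (light glyphs on dark), bright pixels should get the dense
    glyphs - the default here. invert=True flips the ramp for terminals
    with a light background."""
    idx = int(luma / 255 * (len(RAMP) - 1))
    return RAMP[len(RAMP) - 1 - idx] if invert else RAMP[idx]

print(repr(pixel_to_char(255)), repr(pixel_to_char(0)))  # '@' and ' '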
Awesome video! Dude you put in effort and the best part is, its not like it was extreme effort but it was so much more than I would've put in for a youtube CLI! You totally made one better!!! Im gunna subscribe for more fun coding with different clis and apis ❤️
Yeah, i'm just overwhelmed by the video quality; I've been watching every .000000000000004K video out there.
Awesome! Nice youtube CLI gui!! I finally can watch youtube videos at work!
They're probably less worried about their algorithm and more about apps that can replace the YT apps (and, as a result, their premium features) and that can remove or replace their ads. The algorithm changes, maybe even daily, and is probably different for different users anyway, but they've been fighting YT Vanced, for example, for years now. They don't want to block all functionality, but they do want to make the competition as bad as they can.
What about... sound? Not possible?
That too, for sure - the biggest giveaway was that deprecation notice from last August, which *coincidentally* overlapped with them trying to crack down on 3rd party apps. But I'd wager they also don't want outsiders to reverse engineer their algos, as it's not like they change them in major ways nowadays, they more so tweak them regularly. Also yep, sound is possible, it's just that I was kinda on the fence about adding it, but I'll probably add support for it as well
@@icitry Yeah, they are protective about the algorithm, true. But I feel like the data from the home page is not enough to reverse engineer it, and you can scrape it anyway with bots. If you need the home page data on demand though, like the apps do, it is much more important to have an API. You probably could scrape it anyway, but I have no idea if it's feasible in a consumer product.
I think that "the algorithm" is a neural net / some sort of AI anyway, so it would be like trying to use chatGPT answers to reverse engineer it.
@@syriuszb8611 Oh yep, fair points across the board. Although I'd still argue it would be easier than with GPT, as the "data sets" (video and user data) are much more accessible through the other endpoints in the case of YT, so it'd be easier to try to make your model output similar stuff to what a 'homepage' endpoint would generate, speeding up the process (I probably misspoke about it being reverse engineering, more so emulating the behavior). But really, great points all around!
Yet they fail so horribly to get rid of Vanced (I'm using it). There will NEVER not be an adblocker detector bypass for YouTube mobile and YouTube for PC.
0:59 IS THAT JOHNNY SINS 💀💀💀💀
This is how youtube was back in 1995 on DOS. Reminds me of the good old days watching Felix the Cat YTPs and lets plays of Minecraft on Virtual Boy. Good times.
Sure grampa, let's get you back to bed now
YES! I CAN FINALLY DAILY DRIVE UBUNTU SERVER! Who needs a GUI? Everyone knows terminal is op.
You are an excellent software engineer. Great video.
He just rickrolled everyone watching the video
Funnily enough one bit of this, specifically the constant loop looking for inputs using getch, is actually something I’ve been trying to do in a project I’m doing but (mainly due to inexperience) I don’t know how to get it working. If you would be able to help in any way it would be much appreciated
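(Not the repo's code, but a minimal sketch of the kind of polling input loop being described - Windows-only here via msvcrt; on Linux you'd reach for termios/tty or curses instead.)

import time
import msvcrt  # Windows-only; use termios/tty or curses on Linux

def main_loop():
    """Keep rendering frames while polling the keyboard without blocking."""
    playing = True
    while True:
        if msvcrt.kbhit():             # a keypress is waiting in the buffer
            key = msvcrt.getch()       # read it without blocking
            if key == b"q":
                break                  # quit
            if key == b" ":
                playing = not playing  # toggle pause
        if playing:
            pass                       # render_next_frame() would go here
        time.sleep(1 / 30)             # ~30 fps tick

if __name__ == "__main__":
    main_loop()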
This was a great video! I now wanna try it in PHP as well 😬
Amazing project! You're the salt of the earth, mate! Finally a worthy way to use yt!
😆 Thank you! I'm glad to hear people see the vision
Duh, im watching this video in my computer's terminal, easy stuff bro.
too cool -- i'm even happy to subscribe, even though it was python and powershell :-D can't wait for more, thank you!
Welcome aboard, and thank you as well! :D
Amazing, Love it
Thank you so much!
bro the mr beast edit lmaooooo
insane idea, executed brilliantly.
You should add support for the kitty graphical protocol, it would allow playing the video as an actual video in a terminal
watching this on terminal is fun
Terminology can do video in terminal, but API is hot spud.
for the love of god, the only thing this needs is rgb. that is. nice job.
you should look into the kitty graphics protocol if you want to keep working on this
not portable
This is awesome, you should try using a color library or something. Though this would likely complicate it a lot it would be worth a shot.
lets play the h1t1 video for the weirdest ways to play his video!
"...or does it?"
*vsause music starts playing*
im dying 🤣
Awesome video, your code is so clean, I appreciate the architectural design haha
Is there a repo for this? Id really like to fork it and try to make it colour with some healthy threading abuse.
Might even pull out the c++
Oh, thank you! Yep, there's a link to the repo in the description. Go crazy, I'd actually be curious to see alternative implementations.
As of 15 May, there are no "C++" forks. If you start this project, would you take some help?
Cool project idea! I think you could implement colors though, not really sure about how to do it in windows, but under linux I think you could play around with the terminal emulator's colorscheme
icitry: Builds a CLI YT player to have less bloat
Also icitry: Processes raw frames in Python...
How stallman uses the internet:
It would be awesome to see this with Acerola's ascii renderer
Not even gonna lie, mix it with an audio player and it would be a rad af retro terminal display
dang the intro made me fall asleep so good
This is really really cool but I wish that you would do it with the graphical protocol in kitty. Since you are still going to be using the full bandwidth when downloading the video, it would be nice to show it for real.
1:18 I loved that part
And ofc I've watched this on the terminal :>
Just the way I intended
No sixels? Not even colors? No audio? Windows is truly an abomination
Oh you tell me, I added some of those or alternatives since then, but you can guess who the bottleneck was - again (aside from my brain ofc)
@@icitry will you post an update?
I can surely tell you you will gaze upon vsauce the whole day
(Edit: just read your pinned comment) audio would be nice.. hopefully someone figures out a simple & efficient way to implement it some day.
I'm honestly not trolling here, I enjoyed this and it's been something on my todo list!
And I also know I've seen other command line / terminal video players do it in the past. Although it's been years
We now need the audio
I mean, for example for Linux at least, there is the Kitty terminal which supports images?
Yes, you can use the terminal for browsing and you have the thumbnails. Once you select what you want to watch, the video launches with MPV.
You "Almost" mentioned my first 8th-grade teacher's name (before the switch).
We can definitely survive the apocalypse now
i had the idea yesterday to do that and was going to code it today lol
I say go for it! You've got many areas to expand upon/improve compared to this one
Uh, no you haven't.
The amount of times I got Rick Rolled in this video is insane.
This would be funny to do to avoid ads 😂
y'know it'd be legit great if, when the video is of text in a code editor, I actually get the text rendered as text on my device, rather than as an image
can you add OCR ;)
no search, iterate through all videos and compare strings
Great video, subscribed!
BUT now make it use colors, and probably make it use dithering :P
This is insaaaane 🔥🔥🔥
you could maybe use some type of sound library, so that you can have sound (ex. winsound)
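(For reference, winsound is Windows-only and only plays WAV files or beeps, so at best it could fire off a pre-downloaded, pre-converted track next to the frames - a hedged sketch, with the file path obviously made up.)

import winsound  # Windows-only standard library module

# Play a local WAV file asynchronously so frame rendering isn't blocked.
# "audio.wav" is a placeholder: you'd have to download and convert the
# stream's audio yourself first, which is exactly the syncing headache
# the pinned comment mentions.
winsound.PlaySound("audio.wav", winsound.SND_FILENAME | winsound.SND_ASYNC)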
bro, I beg your mercy. please forgive me for breathing the same air as you are.
This is perfect.
Instead of doing ascii black and white you could've done Unicode Upper Half Block foreground and background color keycodes in the terminal
Really cool idea, props for it, but based on a user's installed fonts, the terminal might not be able to render unicode characters, so I'm kinda on the fence about this
when i clicked this i thought "lol i wonder how wild itd be to make it ascii based" cuz i expected it to just load frames at a time... ur actually doin it. absolute madlad. im stupid so this may sound stupid but if you got it running in neovim so i could watch primeagen in neovim while he talks about neovim, i think we could achieve world peace.
Thank you, I always strive to reach the best (worst) possible solution, so ofc I had to go all the way with ascii. Also, are you sure a single text editor should be able to wield so much power? World peace is one option, but what if neovim becomes too powerful for its opposition..
image to ascii is a solved problem, not sure why it's reinvented here.
@@avi7278 for fun and improvement. building things like this gives transferrable skills. otherwise ive seen stupid comments before ... so why are you here
@@BrentMalice have fun then?
I WILL SUBSCRIBE, but please make more terminal apps, like for spotify or whatever else, i would love to have this run on a remote stream, these days there aren't many people who use the terminal like a genius
just use cmus or something lmfao
Ditching decades of advancements in user experience technology in favor of using the terminal like a true Linux user
Wait....no.....it's Windows....that's impossible....no...
Never let them know your next move
your terminal ASCII palette seems to be inverted: dark parts are light and light parts are dark
This is art.
you have a Tyrion Lannister voice
Well now that's truly something I can brag about - thank you!
This is awesome
please add dithering. i tried ascii stuff like this once and dithering is literally the best thing to happen to that project
Ohh, really cool idea, my only concern would be about the processing costs, I'm pretty hesitant to add any more stuff on frame renders
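(For context, a generic Floyd-Steinberg sketch - nothing project-specific. The nested Python loops make the cost concern obvious: this touches every pixel of every frame, so you'd want it vectorized or pushed to native code before using it in real time.)

import numpy as np

def floyd_steinberg(gray, levels=8):
    """Quantize a 0-255 grayscale frame to `levels` values, diffusing the
    quantization error to neighboring pixels (classic 7/16, 3/16, 5/16,
    1/16 weights)."""
    img = gray.astype(np.float64).copy()
    step = 255.0 / (levels - 1)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = round(old / step) * step
            img[y, x] = new
            err = old - new
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return np.clip(img, 0, 255).astype(np.uint8)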
you can find everything on youtube for real.
1:08 "Running [RUclips] exactly how its creators envisioned it"? I don't know what you've been smoking, but I want no part of it.
Can you make a Windows 7 Installer Joke (just harmless) in python using Tkinter
If only the ascii characters actually matched the contours of the image. Like more traditional, hand-made ascii art.
Oh yeah, that would definitely help make it more legible, and further refining the ASCII characters string would improve it, but a solution that works for all videos is, to put it lightly, quite the tall order 😅
Watching YouTube videos in Matrix computers be like 😂
And to think I can’t even get the YouTube app working on Android 4.0
We got ASCII YouTube before gta6
You got something wrong:
Hipster cli tools made so that you dont have to use an official UI client and can use the terminal are written in rust, not python. Also use Linux, rice that terminal and add colors.
Just use MPV with youtube-dl and use mpv’s ascii video out mode. Seems the obvious choice.
Now let it play bad apple
1:02 yes, finally someone said it
👍 now add RGB and Sound Support
(I would sadly use this if it had that…)
me installing this in the company's main server
Ok now just add audio ...
I know right
I just wanna say one big WOW!
I wonder if you use colorama for printing coloured ascii to terminal. Heh that'd be interesting
15:03 line 14 column 53
UwU
00:39 that's wild
you should make it colorful