@@anon1963 I just do programming in general. As for what I was saying, you can take the "sounding like trash" part seriously, but everything other than that was mostly just a joke. While I do believe that more software functions should be placed into hardware for a significant speed boost, regardless of whether they're graphics related or not, line drawing is a tad bit excessive to throw into its own piece of hardware.
@@flameofthephoenix8395 so if you aren't a graphics engineer, and speak as if you were, what does that make you? don't tell me you are a front end dev writing crude apps, that would actually be ironic.
@@anon1963 Everything, at various random times. I haven't written an app and haven't made a whole lot of profit; I do mathematics and programming out of passion. I spoke from a standpoint different than graphics engineering, but a related one. I'm not exactly a graphics engineer, although I've done plenty of work related to graphics. I do think that triangles should be rendered in hardware instead of software, largely because of what I have observed to be true, and also because of the people I know who actually work or worked on hardware. If you look back at when the NES released, there's something quite notable: the NES, unlike other systems of its time, ran significantly faster in a lot of ways, namely graphics. If you look into these older systems you might notice an interesting pattern. The NES had a lot of specialized hardware for doing certain tasks, and it was very fast at the tasks those different bits of hardware were made for. This logically makes sense: a more streamlined program is naturally going to run faster. There's a more recent example of this too: you've probably heard of NVidia's RTX graphics, which are also in hardware and run many times faster than most other ray tracers.
This is so much more of a video than I would ever expect from a new channel. Idk why necessarily, but I mean even Mr. Beast took 10s of videos to make genuinely good content.
cool! i've been a game programmer for a while and the graphics functions are easy enough to use but they've always seemed like black magic to me under the hood. interesting to know that it's not really that complicated.
2:17 then can't you just double all the numbers so that the fraction turns into an integer, therefore making there be no floating point arithmetic? or can't you just use fixed point numbers instead of floating point?
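Both ideas in the question above work; the fixed-point one can be sketched in a few lines of Python (the function name and the 16-bit fractional scale are my own choices, not from the video):

```python
# Sketch of the fixed-point idea: store the slope scaled by 2**16 as an
# integer, step with integer additions only, and recover each y with a
# right shift. Handles the shallow case (|dy| <= dx, x0 < x1) only.

FRAC_BITS = 16  # 16 fractional bits of fixed-point precision

def line_fixed_point(x0, y0, x1, y1):
    dx, dy = x1 - x0, y1 - y0
    slope_fp = (dy << FRAC_BITS) // dx                  # slope as fixed-point int
    y_fp = (y0 << FRAC_BITS) + (1 << (FRAC_BITS - 1))   # add 0.5 so the shift rounds
    pixels = []
    for x in range(x0, x1 + 1):
        pixels.append((x, y_fp >> FRAC_BITS))           # integer part = rounded y
        y_fp += slope_fp
    return pixels
```

For example, `line_fixed_point(0, 0, 4, 2)` produces `[(0, 0), (1, 1), (2, 1), (3, 2), (4, 2)]` without touching a single float.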
@@progect3548 Yeah man, we all did, it's just I didn't tell people my age. Trust me, there are some creeps here. I'm ok with underaged people using the website, just not them commenting their age for everyone to see.
Your graphics style is super cool and slick! My one criticism is that the explanations went by so fast i couldn't keep up with the information. Otherwise great video!
I completely expected this video to have hundreds of thousands of views. You have a great editing style, keep it up!
Appreciate it!
I absolutely agree! This was a great video
Yes!
Omg true I didn't even notice
It will soon indeed have hundreds of thousands of (well deserved) views.
No-nonsense, clear, concise, informative, and visually appealing! Great job!
The reason I love RUclips is these science channels barely known yet doing great stuff
This was serious stuff in the 1990s. This video is practically out of one of my university textbooks - Hearn, D. & Baker, M.P. (1994) Computer Graphics. I think people take the low-level stuff for granted nowadays, so I appreciate a video diving into something most people take for granted.
I remember in our first graphics lecture, the professor walked in and said he was going to spend the first half of the 2.5 hour lecture telling us how to draw a line. We were all confused, as we figured it'd be trivial to write some algorithm where you calculate the slope then plug coordinates into y=mx+b. Then he said we weren't allowed to use floating point numbers and things got interesting.
So, you were allowed to use fixed-point numbers then?
That’s how it’s done. You can just use fractions instead of decimal points. A fraction can be stored as 2 integer values.
@@jamskinner Are you talking to me?
@@flameofthephoenix8395 no he isn't
@@CharlesShorts Alright then, sounds good to me. And thanks for telling.
been seeing a lot of cool channels like this explaining computer science (and just stem in general) and you guys are really impressive!
if you stay consistent, i guarantee you'll be successful with your channels.
Using lines to understand how they were made! Fantastic 😊
This is good.
Anti-aliasing is commonly used to create better results (it's used in most current 3D engines).
Outside of the typical interpolation, we instead calculate a weight for each pixel (intuitively, how much the line partially overlaps that pixel), and map that value to greyscale.
I think it's always made more sense to draw lines (perhaps for rendering wireframes) by using vectors - and interpolating along the length of the vector - that way we don't have to factor in a slope (dividing by 0).
Most of the graphics I'm used to seeing render polygonal faces (not the wireframe), which use different techniques altogether (they happen to be very similar however - especially for "triangle shaders")
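The coverage-weight idea described above is roughly what Xiaolin Wu's line algorithm does. Here's a minimal Python sketch for a shallow line (assumptions: x0 < x1 and |slope| <= 1; this is an illustration, not any particular engine's implementation):

```python
import math

def aa_line(x0, y0, x1, y1):
    """Anti-aliased shallow line, Wu-style: each column splits its
    brightness between the two pixels nearest the true line."""
    gradient = (y1 - y0) / (x1 - x0)
    pixels = {}
    y = float(y0)
    for x in range(x0, x1 + 1):
        base = math.floor(y)
        frac = y - base                  # how far the line sits into the lower pixel
        pixels[(x, base)] = 1.0 - frac   # coverage weight -> greyscale intensity
        pixels[(x, base + 1)] = frac
        y += gradient
    return pixels
```

For a line from (0, 0) to (2, 1) the middle column lands exactly between two rows, so both get intensity 0.5 — that split is what smooths the staircase.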
Interpolating between two points requires a continuous range of coefficients for each point, which means you not only have to multiply by fractional values, you also need to select the resolution of said fractional values in order for the line to actually be unbroken.
Yeah, I don't understand why they don't just do computations as if the screen was bigger than it actually is, then you can just average some pixel colors together to get your new ones for the screen you actually have.
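The interpolation approach from this thread can be sketched as follows, with the "resolution" question answered by taking max(|dx|, |dy|) steps so neighboring samples are never more than one pixel apart — the classic DDA idea (a sketch, not production code):

```python
def dda_line(x0, y0, x1, y1):
    """Line by interpolation: take enough steps that consecutive points
    differ by at most one pixel per axis, so the line is unbroken."""
    steps = max(abs(x1 - x0), abs(y1 - y0))
    if steps == 0:
        return [(x0, y0)]
    pixels = []
    for i in range(steps + 1):
        t = i / steps                       # interpolation coefficient in [0, 1]
        pixels.append((round(x0 + t * (x1 - x0)),
                       round(y0 + t * (y1 - y0))))
    return pixels
```

It works in any direction, but note it uses floating-point math per pixel, which is exactly what Bresenham's integer-only formulation avoids.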
Excellent explanation, hope this channel blows up
Concise explanation that actually does the work to derive the algorithm + animations for clarity. This is probably the best video on Bresenham I have seen so far. 👍
This was great: short, to the point, and didn't miss a thing.
It's like talking 4 minutes about how to make a wheel and then another 12 seconds: "Now connect wheels with axles, put some mechanical engineering on top and congratulations! You have a car!"
but the video is not about making a car. it is about making a wheel and then mentioning that you can build a car out of them
@@bartekoo2197: It's not; it's about how your computer draws lines, and then mentioning how you can expand from there.
To be fair, mqnc, the title is just "How Your Computer Draws Lines;" not "How Computer Graphics Are Made" or "How GPUs Work," etc., so I guess we shouldn't expect too much past the basic line phase here. But it still seemed to end a little too abruptly.
Short, complex, but fully effective!
You need at least 100k subs with this quality of content
3:20 another thing to note is that in computer science, dividing by 2 and multiplying by 2 are considered very trivial/cheap, since you do not need to do any actual multiplication or division: in base 2, multiplying by 2 is just a left bit shift and dividing by 2 is a right bit shift. (In other words, just add a 0 bit on the right for x2 and remove the rightmost bit for /2.) (Technically, removing the rightmost bit of some number n gives floor(n/2), i.e. it's rounded.) So when you multiplied the equation by 2 to remove the fraction, it was more like telling the computer to change the direction of the bit shift and change which number to apply the bit shift to. This adds some more precision (no rounding) but it comes at the cost of space (adding an extra bit to the number), which ironically, in some ways, makes removing that fraction more expensive than keeping it. Or worse, it could introduce a chance of overflow, where the leftmost bit is lost, which can severely impact accuracy. But you'd need really big numbers for either of those to be a problem.
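A quick Python illustration of the shift tricks described above, including the overflow caveat (the 8-bit register width here is just an example for demonstration):

```python
n = 13                   # 0b1101
assert n << 1 == 26      # x2: append a 0 on the right -> 0b11010
assert n >> 1 == 6       # floor(13/2): drop the rightmost bit -> 0b110
assert n & 1 == 1        # the dropped bit: 13 is odd

# The space cost mentioned above: in a fixed-width 8-bit register,
# doubling can push the top bit off the end and silently lose it.
assert (200 << 1) & 0xFF == 144   # true answer is 400, which needs 9 bits
```

Python integers are arbitrary-width so the loss only appears when you mask to a register size, but on real fixed-width hardware that masking happens automatically.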
You're riding the yt algorithm, keep it up.
Wow, only the second video and already such a high quality. Keep on going :)
Good shit. Helpful for my own programming projects actually!
I forgot that such a mundane thing like the line needed its own algorithm. Thanks for the video!
one of the best videos I’ve seen so far
This is the craziest vid I’ve seen, thank you
Wow the limits of precision and resolution. So amazing and totally not obvious.
One of my first projects when learning to code was a graphics program in c++ and it is still the most I ever enjoyed coding. This was a great video, hope to see more in the future
If you haven't read it already, you might really enjoy the free book Ray Tracing in One Weekend.
From the production quality of this video I was fully expecting a forty minute essay when it came on
this is way too underrated
wow this was a great video. keep up the great work!
Thank you. Nice video.
I remember how I was some 30 years ago writing x86 assembler code for drawing lines. I had no clue, that there are standard algorithms, so I invented the wheel, like so many other students at the time.
I was basically adding the smaller delta (actually subtracting, because the CPU will automatically tell you when you reach or cross zero), and when it crossed the bigger one, I shifted one pixel in the shorter direction; otherwise I would simply continue straight in the longer direction. To improve symmetry, I started with the count value at half of the longer delta (one really fast bit-shift-right instruction). I had to check the parameters at the beginning and pre-calculate some things, like the direction of the line (prepared values to add to go one pixel in the longer direction and one pixel diagonally, which allowed going in all required directions), and also line boundary checks. I was even consulting the CPU manuals for how long each instruction takes to execute, so it was as fast as possible.
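A Python sketch of the counting scheme this comment describes, for the shallow, axis-positive case only (the variable names are mine; the original was x86 assembly). It is, in effect, a rediscovered Bresenham:

```python
def count_down_line(x0, y0, x1, y1):
    """Count down from half the longer delta; every time the counter
    crosses zero, step one pixel in the shorter direction and reload."""
    dx, dy = x1 - x0, y1 - y0   # assumes dx >= dy >= 0
    count = dx >> 1             # start at half the longer delta (fast shift)
    x, y = x0, y0
    pixels = [(x, y)]
    for _ in range(dx):
        x += 1                  # always step along the longer axis
        count -= dy             # subtract the smaller delta
        if count < 0:           # crossed zero: step the shorter axis too
            y += 1
            count += dx         # reload the counter with the longer delta
        pixels.append((x, y))
    return pixels
```

Starting the counter at dx/2 is exactly the symmetry trick mentioned: it centers the error so the diagonal steps land where rounding would put them.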
Wow, so concise! I love it!
Very informative and helpful graphics.
Great video, well explained, short and informative. Keep going on
I’m happy I discovered this channel very early on.
The Bresenham line algorithm is also used in 3D printing to perform the coordinated axis movements, saving lots of computation and allowing slower (16MHz) processors to easily handle the typical movement speeds of 3D printers. Bresenham allows the multiple linear motions to be perfectly coordinated to start and end at the same time, so it’s ideal for any multi-dimensional system that needs proportional linear elements.
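A rough sketch of how that coordination might look, with the longer axis acting as the step "clock" (illustrative only; real firmware such as Marlin is timer-driven and layers acceleration planning on top):

```python
def coordinated_steps(dx, dy):
    """Interleave step pulses for two axes so both finish together,
    using Bresenham with the longer axis as the clock. Returns a list
    of (step_x, step_y) pulse pairs. Assumes dx >= dy >= 0."""
    pulses = []
    err = dx // 2
    for _ in range(dx):
        err -= dy
        if err < 0:
            err += dx
            pulses.append((1, 1))   # pulse both motors this tick
        else:
            pulses.append((1, 0))   # pulse only the long axis
    return pulses
```

Each tick issues at most one step per motor, and the dy steps come out evenly spread across the dx ticks — which is why the axes start and end together with only integer additions.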
Comment for the algorithm!!! Great video!
Great video, I hope you have success!
Awesome vid, you’re super underrated. Congrats on your new subscriber
I just wrote a doom remake in plain c last week, and after setting up the window with the win32 api i obviously first made the background, then a square, then a circle, and then wanted to calculate a line, and it took me a good day to make up an algorithm and optimize the shit out of it. I'm now just at the start of the video but super hyped to watch how the lines are made, since I wondered how people with far greater knowledge would do it.
i hated the video, now i feel stupid asf because i used floats 😭😭. At least i used the same technique as them, treating lines going in the negative x direction just like lines going to the right, by making my second vector the first one. (good video in actuality)
The first and second graphics routines I ever wrote were "plot a point" and "draw a line".
When I got my first CGA graphics card, I was aghast at how slow lines were, so I rewrote them in Intel x86 ASM (embedded in a C structure.)
Was 200 times faster than the CGA call.
So did the graphics card actually have a "draw line" command in hardware? I'd have thought that type of thing would be left in software then.
@@tbird-z1r in the CGA days, there was "software" code in the card bios. You filled registers with needed info (x1, y1, x2, y2, colour) and called the routine address.
Or, used a language like C that has a graphics library.
Hand coding was way better than C, or the built in CGA routine.
I calculated the memory address/bit for the pixel and set that memory location (ASM routine and call).
My initial Line routine just called that Point Plot function after each new pixel calculation. (I was about to integrate the calls into the line algorithm. But by then I'd upgraded to VGA cards. Wasn't worth it to keep playing with my older system after that.)
@@hrayz Was the logic run on the graphics card? I guess it was just to simplify things for business programming etc? Still, you'd have thought they would tightly optimise any built in applications like that. Especially something as ubiquitous as a line.
If someone nowadays decided to rewrite the triangle drawing for a GPU in software, you'd tell them they were wasting their time.
@@tbird-z1r there was very little optimization back then, just function and compatibility.
So, when you needed speed, you did it yourself from scratch.
That's where the best game engines got their origins.
Check out the old ASM Competitions. Unbelievable code, fast and small. Like an entire 3D game (logic, maps, textures, sounds) in 2k bytes.
@@tbird-z1r Most of the overhead would have come from firing a software interrupt for every line or point plotted. It wouldn't have mattered how well optimized the algorithm was, because the CPU would have spent the majority of its time just switching contexts.
The first "GPU" to exist in the PC world was IBM's PGA (Professional Graphics Adapter), which included its own 8088 CPU that would execute graphics commands in parallel with the main CPU.
I know first-hand the difficulty of creating lines on a computer, from my days in the past of making RS2 quest guides with screenshots and MS paint 😂
Well explained and visualized
This is amazing... such a nice video with only 8500 viewers? 😮😮😮
Great video king
extremely underrated
Keep it up ! Great work.
I look forward to more of your videos
A problem I found using this in my simplistic mapping applications way back is that the line you get depends on which direction you draw. Drawing a line from left to right (x increasing), then erasing the line from right to left (x decreasing) doesn't always work because the corresponding values of y don't always match.
For an algorithm I personally devised for drawing aliased lines on tiles I had to check 6 different angles.
The advantage of my line over the one in the video is that it is perfectly symmetrical, with no artifacting at the edges. It's more computationally demanding, but it's a one-time computation and the line data is then stored separately, so it being unoptimized doesn't hurt the application.
I hope one day to see if I can make my algorithm more performant since its results are better looking than the traditional one.
I remember writing a function to draw lines and hating the fact that you had to have a double nested if else condition for positive, steep positive, negative and steep negative lines. 4 code blocks all doing the same thing with just some values flipped around.
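The four (really eight) near-identical cases described above can be folded into one loop with sign variables, as in the commonly used integer-only formulation sketched here (the standard generalization, not the video's exact code):

```python
def bresenham(x0, y0, x1, y1):
    """One loop for all octants: sx/sy handle direction, err handles slope."""
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy                    # combined error term
    pixels = []
    while True:
        pixels.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:                 # error says: step in x
            err += dy
            x0 += sx
        if e2 <= dx:                 # error says: step in y
            err += dx
            y0 += sy
    return pixels
```

One subtlety (it also explains an erase-in-reverse issue mentioned elsewhere in these comments): this formulation is not guaranteed to pick the same pixels when you swap the endpoints, since ties in the error term break in the direction of travel.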
Very interesting, thanks!
Great video, subscribed
2nd video and 43k views, you deserve it. the video looked great.
I think of the math here as saying, the function Ax+By+C (integer coefficients!) is positive on one side of the line and negative on the other. Keep a running tab of whether you're on the positive side or negative side to know whether you should increment x or y. Based on that, add A or B to your tab.
The corresponding algorithm for circles (again based on integer addition -- and works for general conics) is yet cooler.
Now instead of adding constants A,B to the running tab, you should add linear functions.
But computing those linear functions is _itself_ done with a running tab for each one.
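A Python sketch of that running-tab idea applied to circles — the usual midpoint circle algorithm, where the decision value for x² + y² − r² is updated with linear increments only (constants follow the standard derivation):

```python
def midpoint_circle(cx, cy, r):
    """Integer-only circle: the running tab d tracks whether the next
    midpoint lies inside or outside the circle x^2 + y^2 = r^2."""
    x, y = 0, r
    d = 1 - r                       # decision value at the first midpoint
    pixels = set()
    while x <= y:
        # mirror the computed octant into all eight
        for px, py in ((x, y), (y, x), (-x, y), (-y, x),
                       (x, -y), (y, -x), (-x, -y), (-y, -x)):
            pixels.add((cx + px, cy + py))
        if d < 0:
            d += 2 * x + 3          # midpoint inside: go straight
        else:
            d += 2 * (x - y) + 5    # midpoint outside: step diagonally
            y -= 1
        x += 1
    return pixels
```

Note the updates `2x + 3` and `2(x − y) + 5` are themselves linear in x and y — exactly the "the increments are linear functions, computed by their own running tabs" point made above.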
Hey great video! Hope it gets more views!
Another gem of youtube algorithm. Keep it up, great video
Underrated
i like your funny words, magic man.
Excellent.
Great video!!!
so cool
What a succinct video
I really like the retro amber display style here, and the animations are great. However, it's worth noting:
Initially, graphics were rendered using dedicated analog vector hardware, like in SAGE display terminals from the 1950s. Here, a CRT's electron beam drew directly on a phosphor screen, a stark contrast to subsequent raster graphics.
Concerning modern vector graphics: GPUs generally transform lines into triangles for rendering and then employ sophisticated techniques for smoothing. CPUs typically use scanline approaches, coloring pixels based on line or polygon coverage, rather than tracing a line as seen in Bresenham's algorithm.
Amazing video! Repping ITK!
Well it's quite simple. Your computer stores a piece of paper for every drawing it could be asked to make in the future and just shows you those. If you ask nicely it'll show you the stash.
I had a Radio Shack TRS-80 computer in the late 70s. It had similarly simplistic graphics. You could turn (large) pixels on or off but that's it. No LINE function.
So I worked out how to do it manually with a FOR/NEXT loop. Fortunately I was also learning algebra at the same time.
I even discovered how to vary a sine wave's amplitude and frequency before learning it in school (once I upgraded to enhanced BASIC that included the SIN(x) function!)
I didn't even realize the video was over.
I learned about this when I was doing Assembly Language.
I wanted to expand on it and make a triangle painting algorithm. I never really got it there, at least not in a way that matches up with a line also drawn around the triangle as a border.
But I'm sure others have solved that.
Very cool
thanks for the video
What a masterpiece of a video!
wee just what i was looking for
I remember looking up this algorithm when I wanted to draw lines on an HTML page before the invention of the HTML5 canvas. I've also heard of it being used for targeting in roguelikes.
This is a great video! Can you share the source code for this video and tell us what you used for the animations?
I found the tools by following the link in the comments. He listed: Python Manim, DaVinci Resolve, Inkscape, and Audacity. I didn't find a link to source code though.
Maybe the image at 0:13 is incorrect. The points should be placed in the center of the square, not where the grid lines intersect.
Why?
Hey, nice illustration. Thank you!
Can you also do a video about Bézier curves? They are also interesting in computer graphics, but the equations there are a bit more complex
Hi, if you want to learn more about them now, here's an incredible video on this topic that already exists: ruclips.net/video/jvPPXbo87ds/видео.htmlsi=w48XOUVPqqOEeocJ
Not to discourage Andre from making one of his own by the way, I'd love to see another take on Bézier curves myself :)
@@adrianbik3366 Interesting, I watched that just after I saw your video :) I am actually interested in making some string art for portraits, so this is the basic math for that. Thank you!
Underrated!
Why did you wait for me to finish my DDA and Bresenham studies in graduation to post this? It was 14 days ago.
Anyway, nice video.
and today this method is still used for software rendering
nice video
Cool video
I was thinking about this earlier, just for image/font dilations instead
Draw a vector between two points, fill in every pixel in contact with the vector line, and if there are more than 2 pixels in a row filled in, only fill the one with the most contact with your vector line. Don't forget to account for whether or not the line is horizontally or vertically oriented
How'd I do?
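One loose Python interpretation of the recipe above (the sampling resolution is an assumption, and it keeps one pixel per sample rather than doing the "most contact" pruning): walk along the vector in small steps and keep the pixel each sample lands in.

```python
# Loose sketch of the vector-walking idea: interpolate along the segment,
# round each sample to its containing pixel, and drop consecutive repeats.
def vector_line(x0, y0, x1, y1, samples=100):
    pts = []
    for i in range(samples + 1):
        t = i / samples  # parameter along the vector, 0..1
        p = (round(x0 + t * (x1 - x0)), round(y0 + t * (y1 - y0)))
        if not pts or pts[-1] != p:
            pts.append(p)
    return pts

print(vector_line(0, 0, 5, 2))
```

This sidesteps the divide-by-slope problem the comments above mention, since vertical lines are no special case.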
This is where I draw the line!
➡️ -------------
Minor nitpick: floating point operations in computers only used to be slow.
Do you have an idea how it is done in the opposite direction, from raster to SVG? E.g. using a library like potrace.
I use plotz for minecraft ellipsoids, but I noticed that sometimes it has blocks that you can remove and still have a sealed object.
That’s because every 2d slice of the shape is also an ellipse, which doesn’t guarantee a lack of overlap. This is a lot easier to see and understand if you try freehanding a sphere in mc.
Good video; I'd just change the multiplication by two to a binary shift (given that inefficient division by 2 was already used as an argument, while with a shift, plus an AND to test for oddness, it can be quite fast)
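For anyone unfamiliar, the shift and AND tricks mentioned above look like this (assuming non-negative integers; many compilers apply these rewrites automatically):

```python
# Bit tricks for powers of two on non-negative integers:
n = 21
print(n << 1)  # shift left by 1 = multiply by 2 -> 42
print(n >> 1)  # shift right by 1 = floor-divide by 2 -> 10
print(n & 1)   # lowest bit: 1 if odd, 0 if even -> 1
```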
These days lines are drawn by the GPU as thin rectangles with floating point endpoints and thickness and it's still faster than Bresenham's on the CPU.
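A sketch of how the "thin rectangle" setup described above can work (the function name and coordinates are made up for illustration): offset both endpoints by half the thickness along the line's unit normal, giving four corners for the GPU to rasterize as two triangles.

```python
import math

# Hypothetical helper: build the four corners of a thickness-wide
# rectangle around the segment (x0,y0)-(x1,y1).
def line_quad(x0, y0, x1, y1, thickness):
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)            # assumes the segment is non-degenerate
    nx, ny = -dy / length, dx / length     # unit normal to the line
    h = thickness / 2
    return [(x0 + nx*h, y0 + ny*h), (x1 + nx*h, y1 + ny*h),
            (x1 - nx*h, y1 - ny*h), (x0 - nx*h, y0 - ny*h)]

print(line_quad(0, 0, 10, 0, 2))  # horizontal line: corners offset by ±1 in y
```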
Sounds about like trash. Also, it should be on the LPU, a piece of hardware with the singular goal of drawing a line.
@@flameofthephoenix8395 are you a graphics engineer?
@@anon1963 I just do programming in general. As for what I was saying, you can take the sounding-like-trash part seriously, but everything other than that was mostly just a joke. While I do believe that more software functions should be placed into hardware for a significant speed boost, whether they're graphics related or not, line drawing is a tad bit excessive to throw into its own piece of hardware.
@@flameofthephoenix8395 so if you aren't a graphics engineer, and speak as if you were, what does that make you? don't tell me you are a front end dev writing crude apps, that would be actually ironic.
@@anon1963 Everything at various times. I haven't written an app and haven't made a whole lot of profit; I do mathematics and programming out of passion. I spoke from a standpoint different from graphics engineering, but a related one. I'm not exactly a graphics engineer, although I've done plenty of graphics-related work, and I do think triangles should be rendered in hardware instead of software, largely because of what I've observed and because of the people I know who actually work or have worked on hardware. When the NES released, there was something quite notable: unlike other systems of its time, it ran significantly faster in a lot of ways, graphics especially. If you look into these older systems you might notice an interesting pattern: the NES had a lot of specialized hardware for specific tasks, and it was very fast at exactly the tasks those pieces of hardware were made for. That logically makes sense; a more streamlined program is naturally going to run faster. There's a more recent example of this too: you've probably heard of Nvidia's RTX graphics, which are also in hardware and run many times faster than most software ray tracers.
Can you do one for circles?
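There is a well-known integer-only counterpart for circles, the midpoint circle algorithm (this is a sketch of that classic algorithm, not anything from the video): step through one octant and mirror each pixel eight ways.

```python
# Midpoint (Bresenham-style) circle: integer-only decision variable,
# one octant computed, eight-way symmetry for the rest.
def circle_points(cx, cy, r):
    pts = set()
    x, y, d = 0, r, 1 - r
    while x <= y:
        for sx, sy in ((x, y), (y, x)):  # mirror across the diagonal
            pts |= {(cx+sx, cy+sy), (cx-sx, cy+sy),
                    (cx+sx, cy-sy), (cx-sx, cy-sy)}
        if d < 0:                # midpoint inside the circle: go straight
            d += 2*x + 3
        else:                    # midpoint outside: step diagonally
            d += 2*(x - y) + 5
            y -= 1
        x += 1
    return pts

print(sorted(circle_points(0, 0, 3)))
```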
actual good vid, seems like commenting helps hehe
simply ... it draws lines the same way i do in minecraft when building road curves
I'm pretty sure modern GPUs don't use this algorithm, since it is sequential and GPUs are heavily optimized for parallel algorithms.
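To illustrate the point about parallelism (this is just a sketch of the idea, not how any particular GPU actually rasterizes lines): instead of stepping along the line, every pixel can independently test its distance to the segment, so no pixel's result depends on another's.

```python
import math

# Parallel-friendly line test: each (x, y) could run on its own thread,
# since it only reads the endpoints and its own coordinates.
def parallel_line(x0, y0, x1, y1, width, height):
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    lit = []
    for y in range(height):          # on a GPU these iterations run concurrently
        for x in range(width):
            # perpendicular distance from the pixel centre to the infinite line
            dist = abs(dy*(x - x0) - dx*(y - y0)) / length
            # projection parameter onto the segment, to clip at the endpoints
            t = (dx*(x - x0) + dy*(y - y0)) / (length * length)
            if dist <= 0.5 and 0 <= t <= 1:
                lit.append((x, y))
    return lit

print(parallel_line(0, 0, 6, 3, 8, 8))
```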
This is so much more of a video than I would ever expect from a new channel. Idk why necessarily, but I mean even Mr. Beast took 10s of videos to make genuinely good content.
0:40 If you only get the starting letters of jack elton bresenham's name, it's jeb. You know it.
cool! i've been a game programmer for a while and the graphics functions are easy enough to use but they've always seemed like black magic to me under the hood. interesting to know that it's not really that complicated.
2:17 then can't you just double all the numbers so that the fraction turns into an integer, therefore making there be no floating point arithmetic? or can't you just use fixed point numbers instead of floating point?
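That doubling trick is essentially how the integer form of Bresenham's algorithm works. A sketch for the shallow case (0 ≤ slope ≤ 1; the other octants are handled by symmetry):

```python
# Integer-only Bresenham: every term is doubled so the half-pixel error
# threshold becomes a whole number and no floats are needed.
def bresenham(x0, y0, x1, y1):
    """Line for shallow slopes, 0 <= dy <= dx."""
    dx, dy = x1 - x0, y1 - y0
    err = 2*dy - dx          # decision variable, kept doubled to stay integral
    y = y0
    pts = []
    for x in range(x0, x1 + 1):
        pts.append((x, y))
        if err > 0:          # accumulated error crossed the pixel midpoint
            y += 1
            err -= 2*dx
        err += 2*dy
    return pts

print(bresenham(0, 0, 6, 3))
# -> [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2), (5, 2), (6, 3)]
```

Fixed point, as the comment suggests, is the other standard route; DDA-style line drawers often use it.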
As a 4th grader I understand everything in this video 😂
As a 4th grader you aren't legally old enough to use youtube 😂
@@beefsmeller27 you're saying that like you didn't use RUclips underage either
don’t lie to me, we all did it.
@@progect3548 Yeah man, we all did, it's just I didn't tell people my age. Trust me, there are some creeps here. I'm ok with underaged people using the website, just not them commenting their age for everyone to see.
@@beefsmeller27 fair
4th is crazy
very pog vid
banger video
Your graphics style is super cool and slick! My one criticism otherwise was that the explanations went by so fast i couldn't keep up with the information. Otherwise great video!
what do you use for animations?
Manim in python
Nice video, can you upload it without the background music?
Liked the video, but the explanation of the algorithm could've been slower and a bit more accessible to the viewer
That's Bresenham's line drawing algorithm