Subscribe and Kafka will say thank you :)
ok, it's done Sir
What software do you use to create this awesome motion graphics?
May I know what tool you guys use to make these animated videos? Just curious..!!
I just discovered this video in my feed. _Sometimes_ the YouTube algorithm actually works! 🤠 Great video! I just subscribed to your channel!
I wish you were my professor in college.
The absence of any background music makes this video great.
This comment. Yes.
fully agree!
amen!
i agree
Exactly
What an amazing tutorial: Just the necessities, no annoying background music, no annoying calls to "subscribe and like".
If all youtube channels were like that, we could heal the world.
Also, I checked your channel page and was shocked to find that this was only your 3rd video.
Keep being awesome!
This tutorial is insanely "Zen", but he said "please subscribe" right at the end :P
I 100% believe you should make a whole series on Kafka, your way of simplifying the subject is legendary.
How can one keep things so deep and yet stunningly simple. Hats off!
Mentioned a lot in the comments, but I have to say as well: what a great explanation, straight to the point, no bs and gives enough info without overwhelming with details. Thank you!
These videos are amazingly simple and clear. The animations are spot on!! Too good xD I wish this channel never stops uploading new content
I have used kafka before but never had to think about why it is actually fast. This was very informative. I like the format of the video as well
this guy is so sweet, man! i was struggling with system design, and all his books and posts are so easy to follow and helped me become more confident
This is not the same Kafka I was expecting, but happy to learn. thanks for sharing!
You made me realize the importance of expressing thought in a clear and concise way. Thank you
Having re/viewed a ton of these, you're the best in the business bar none
No frills and thrills, just pure nuggets of value. Exactly what I needed. Thank you. You earned my sub.
Man this is gold. Saying thank you does not feel enough. Please keep it up.
Short, high quality, clean and extremely precise content...Many Thanks!
Short, concise and concrete. Very easy to understand. Thanks a lot
Stunning. It's not even about computer science or tech: if anyone taught me anything this way, I would skip everything else and learn. Thank you for changing people's lives.
Wow, this one is super cool. No background music, cool minimalistic diagrams, calm voice!
In 5 minutes I learned a lot! Amazing video!
You are a good teacher!
Thank you and I hope to see more videos from you!
So glad the algorithm found this channel for me, the content is so clear and digestible. Thank you, and please keep up the fantastic work
wow!! this channel is a goldmine for backend engineer
Seriously, thanks a lot Alex for all the stuff you convey through your LinkedIn network and YouTube videos. Just love the way you distil the topics and make them beautifully understandable.
Simple and very insightful, I like the lack of music and the use of motion graphics, helps me focus.
When knowledge calms your nerves!! Hats off to your delivery mechanism and apt data accumulation✌🏻
Greatest video series, with fluent + clear + intuitive illustration (master quality!), can not thank you enough!
Succinct.
Precise.
Educative.
Excellent animation.
Simply the best 💯
Wow. Never heard about Kafka, but after this brilliant video now I know why it is so fast. Still no idea what it is, though. And so many totally not astroturfed comments. Sweet.
your video is very clear and on-point Sir, thanks a lot 👍👍
Really great presentation! I was scared when I saw Kafka but you explained it really well.
You have a extremely clear and nice way to talk and explain! Please make more videos like that. Awesome work!
Clear and straightforward explanation. Thanks.
No doubt this will be trending as a top YouTube channel for system design worldwide. Great start.
Essential collection of videos in this channel for a software developer
Great technical explanation. I just want to add that Kafka can be used for much more than just data ingestion sending data from a data source to a data sink. The Apache Kafka open source project also includes Kafka Connect for data integration and Kafka Streams for data processing. Therefore, you can leverage the characteristics explained in this video to build a modern data flow with a single (scalable and reliable) real-time infrastructure instead of combining several different components (like Apache Kafka for ingestion, Apache Camel for data integration, and another stream processing framework like Apache Flink for real-time analytics).
Kafka's reliability has yet to be proven. Every so often it does not meet core data integration requirements on reliability, especially in the area of disruption and recovery, where it quickly says goodbye to "at-most-once" semantics. Don't get me wrong, Kafka is really great for what it is designed for: efficient streaming in a big data architecture. But that architecture will tolerate a certain fuzziness of data, which a pure data integration architecture would not allow.
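The retry-induced loss of at-most-once semantics mentioned above can be illustrated with a toy simulation (all names hypothetical, no real Kafka API involved): when an acknowledgement is lost, the producer retries and the broker log ends up with a duplicate.

```python
# A toy model of retry-after-lost-ack. Nothing here is Kafka's real
# protocol; it just shows why retries break at-most-once delivery.
log = []

def broker_append(record, ack_lost=False):
    """Append to the log; returning False simulates a dropped ack."""
    log.append(record)
    return not ack_lost

def produce_with_retry(record, ack_lost_once):
    acked = broker_append(record, ack_lost=ack_lost_once)
    if not acked:              # producer never saw the ack...
        broker_append(record)  # ...so it retries, duplicating the write

produce_with_retry("payment-42", ack_lost_once=True)
print(log)  # -> ['payment-42', 'payment-42']
```

The record was durably written the first time, but the producer cannot know that, which is why real systems reach for idempotence or deduplication rather than pure at-most-once guarantees.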
Amazing details about frequently used software. Lucky to bump into this page. Thanks
Great video, explains Kafka design so clearly. Thanks very much
The video explains two characteristics that let Apache Kafka sustain high-throughput delivery of large volumes of records:
1. Sequential I/O
In C, when fopen() opens a file in append mode, the file pointer sits at the end of the file, ready to add new data by appending, which is faster than moving the pointer to a specific position before every write. It is even easier to understand if you compare a hard disk's sequential reads/writes with its random reads/writes.
File-based databases such as dBASE, COBOL + ISAM, and Paradox likewise write new records directly at the end of the file; you can open the file in PC-Tools and inspect the hex code to confirm it. The risk is that if the EOL is not written in time and the file is not closed cleanly, the file is corrupted and data is lost.
Deleting a record also just marks it as deleted rather than actually removing it; the record is only truly gone after a compact-database run. That is why, when I design for genuine deletion of customers' personal data, I overwrite it with a meaningless string: a direct delete is only a marker and the data is still there.
2. [Zero Copy](en.wikipedia.org/wiki/Zero-copy): avoid copying the same data into another memory block and moving it again, shortening the transfer path. For example, with DMA available, system calls can push data that has already been read into a memory buffer directly into the NIC buffer for transmission, skipping the socket buffer.
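The append-mode point above can be sketched in Python (the path below is hypothetical, chosen just for illustration): opening with "ab" positions every write at end-of-file, much like fopen(path, "a") in C, so the writer never seeks before writing.

```python
import os
import tempfile

# Hypothetical log path for the sketch.
path = os.path.join(tempfile.mkdtemp(), "records.log")

# Append mode: each write lands at end-of-file, no explicit seek needed,
# even across separate open() calls.
for i in range(3):
    with open(path, "ab") as log:
        log.write(f"record-{i}\n".encode())

with open(path, "rb") as log:
    lines = log.read().decode().splitlines()
print(lines)  # -> ['record-0', 'record-1', 'record-2']
```

Records come back in exactly the order they were appended, which is the access pattern a commit log relies on.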
A truly educational and concise video.
Thank you.
Short, crisp and to the point content. Great work!!
The USP of this channel is "no bla bla story... precise and to the point on topic" ❤
First time I actually WANT to subscribe to a newsletter.
So simple yet so powerful explanation, thanks
I love all the System-design Content posted by you!
Thanks for sharing your knowledge! 🙏
wow. No BS, only content! Thank you!
Amazing! Love the quality and getting straight to the point. Not a second wasted.
Thanks, brilliant tutorial. My company is currently gearing up to adopt a data mesh architecture, and it's gonna be fun moving from batch to this CDC stream methodology.
After going through the video and your explanation, I decided to take a paid subscription to ByteByteGo! Your explanations are to the point and succinct, making it easy to understand a topic! Thank you for the video.
Thank you for the wonderful explanation of Kafka's abilities.
Crisp yet complete info. Good content. Thank You.
Thank you for putting up this tutorial! Studying videos like this and then practicing mock interviews at Meetapro will help you land multiple offers.
You guys are doing amazing work here. I love the aesthetics, pace, explanations, topics, and cadence of it all. Kudos!
My head exploded with the DMA. I had no idea! Great learning! :)
concise and crisp clear... Thanks for making such amazing and valuable videos.
those minimalistic graphics make complicated topics easy to ingest. Subscribed!
i wanted to comment that i appreciate the level of detail in the explanations in the video.
looking forward to more useful content!
Very clear explanation. Thank You!
content is simple and crisp... thanks for bringing this to us...
thanks to you I finally understood why DMA is so important.
Very simple with good animation to explain things clearly. Keep publishing these kinds of useful videos.
Thank you! Such a great delivery and explanation. Particularly, great choice of aspects to share.
Bridging the dearth of senior developer content on youtube. I'm here for it.
wow the comments are right. simple and clear... subscribed
This helps to explain why the sequential read speed of HDDs is on the AWS Cloud Solutions Architect study guides.
While sequential access can be efficient for certain tasks, it also has several downsides:
Slow Access for Individual Records: If you need to access a specific record in the middle or at the end of a sequentially accessed file or data structure, you would have to traverse through all preceding records. This can be very inefficient and time-consuming, particularly for large datasets.
Inefficient Updates and Deletions: If a record in a sequentially accessed file needs to be updated or deleted, you often have to rewrite the entire file, or at least all the data following that record, which can be very slow and inefficient.
Inefficient for Concurrent Access: In situations where multiple users or processes need to access data concurrently, sequential access can be very inefficient and may even lead to data corruption if not handled correctly.
Lack of Flexibility: Sequential access doesn't allow for as much flexibility in terms of data access patterns. You are essentially restricted to accessing data in the order it was written.
Space Inefficiency: Sequential files can become space inefficient over time. If records are deleted, the space they occupied often cannot be reused, leading to wasted space.
Data Structure Overhead: In certain data structures optimized for sequential access, such as linked lists, there can be significant overhead in terms of additional pointers or other structural information that needs to be stored along with the actual data.
Sequential access is particularly useful and efficient in certain scenarios, including:
Data Streaming: When data is being streamed from one point to another, such as in audio or video streaming services, sequential access is ideal. Data is read in the order it arrives, and there's usually no need to skip forward or backward.
Log Files: Log files are typically written and read in a sequential manner. The most recent events are appended to the end of the log, and when reviewing the logs, it's often most useful to read events in the order they occurred.
Backup and Restore Operations: When performing backup operations or restoring data from backups, the data can be processed sequentially. The backup process involves reading all data from a source and writing it to a backup medium, while restore operations read the data from the backup medium and write it back to the source or a new location.
Batch Processing: In scenarios where large volumes of data need to be processed in one go, such as overnight processing of transactions, sequential access can be used efficiently.
Data Warehousing and Data Mining: In data warehousing and mining operations where huge volumes of data are processed, sequential access is often used.
Sequential Read/Write Media: For certain types of media, such as magnetic tapes, sequential access is the only viable method. You read from or write to the tape in a linear fashion, from one end to the other.
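The "slow access for individual records" downside listed above can be sketched with a toy length-prefixed record format (all names hypothetical): with variable-length records and no index, reading record N means scanning past records 0..N-1.

```python
import io

# Build a sequential file of 1000 variable-length records,
# each prefixed by a 2-byte big-endian length.
buf = io.BytesIO()
for i in range(1000):
    rec = f"event-{i}".encode()
    buf.write(len(rec).to_bytes(2, "big"))
    buf.write(rec)

def read_record(stream, n):
    """Scan past n records sequentially, then return the next one."""
    stream.seek(0)
    for _ in range(n):
        size = int.from_bytes(stream.read(2), "big")
        stream.seek(size, io.SEEK_CUR)  # skip the record body
    size = int.from_bytes(stream.read(2), "big")
    return stream.read(size)

print(read_record(buf, 500).decode())  # -> event-500
```

Fetching record 500 costs 500 skips; an index mapping record number to byte offset would make it a single seek, which is essentially the trade-off between pure sequential files and indexed storage.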
Zero copy is a technique that reduces CPU usage and increases data processing speed by eliminating unnecessary data copying between user space and kernel space during network communication or file I/O operations. The data to be sent over the network is sent directly from the disk buffer cache to the network buffer without being copied.
Pros:
Increased Efficiency: Zero-copy can significantly speed up data transfer rates because it removes the overhead of copying data between user and kernel space.
Reduced CPU Usage: As there's no need to copy data, zero-copy methods can reduce CPU usage, freeing up resources for other tasks.
Reduced Memory Usage: Zero-copy techniques can lead to less memory usage because they avoid creating extra copies of data in memory.
Lower Latency: By avoiding the overhead of data copying, zero-copy can lead to lower latency in network communication or file I/O operations.
Cons:
Complexity: Implementing zero-copy can be complex and may require a deep understanding of the operating system and network interfaces. This can increase development time and potentially introduce more bugs.
Data Security: With zero-copy, the data stays in the kernel buffer and is directly accessible to user space. This could potentially lead to security vulnerabilities if not managed correctly.
Buffer Availability: Zero-copy can lead to buffers being locked for longer periods, as the same buffer is used for reading data from the disk and sending it over the network. This could potentially impact other tasks that need to use these buffers.
Non-Contiguous Memory Issues: If data is stored non-contiguously in memory, zero-copy can be challenging to implement effectively.
The decision to use zero-copy would largely depend on the specific needs of the system and whether the benefits of increased data transfer speed, reduced CPU usage, and lower memory footprint outweigh the increased complexity and potential risks.
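As a hedged sketch of the zero-copy idea described above (paths hypothetical; regular-file destinations for sendfile require Linux): os.sendfile asks the kernel to move bytes straight from one file descriptor to another, skipping the read-into-user-buffer / write-back round trip.

```python
import os
import tempfile

# Create a 64 KiB source file for the demonstration.
src_path = os.path.join(tempfile.mkdtemp(), "payload.bin")
with open(src_path, "wb") as f:
    f.write(b"x" * 65536)

dst_path = src_path + ".out"
with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
    total, size = 0, os.path.getsize(src_path)
    while total < size:
        # The kernel copies directly between descriptors; the bytes
        # never pass through a user-space buffer in this process.
        sent = os.sendfile(dst.fileno(), src.fileno(), total, size - total)
        if sent == 0:
            break
        total += sent

assert os.path.getsize(dst_path) == 65536
```

In a network server the destination descriptor would be a socket rather than a file, which is the shape of transfer Kafka leans on when shipping log segments to consumers.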
Nice, I definitely learned something new about the Kafka internals today!
I recently found your channel and honestly think this is one of the best tech bagels on YouTube undoubtedly. Awesome work in such a short amount of time!
love a good tech bagel.
@@0031400 lmao, I didn't even notice that. I use swipe typing so mistakes like these do occur from time to time. Honestly, wouldn't mind a tech bagel though 👀😂
Awesome explanation, Kafka is amazing... Thank you, Alex
Exactly my kind of content. Interesting, insightful and to the point.
5 minutes of high quality content, thanks!
This was a clear and concise presentation. Thank you so much 👍
Absolutely fantastic video - went over a lot of concepts like minimizing disk io, engineering constraints of kafka, different memory access patterns, with very good diagrams! Thank you :)
Loved the animation and explanation. Keep enlightening us all!
Very cool channel you keep the most important stuff compact, not everyone can do that.
really, these are high quality videos and lovely animations... thanks a lot for simplifying why Kafka is fast
Thank you. I have not _tried_ Kafka, but now that I am nominally out of _the penal colony_ I am trying to _metamorphose_ back into a geek with _a hunger artist's_ budget. YouTube has been invaluable.
This is explained so well. I'd love to hear you speak more about Kafka.
EDIT: 100% adding that newsletter to my RSS.
That's how you make a great learning video, without background music & advertisements.
1. Have solid content.
2. Keep it concise.
3. Use visuals.
Amazing explanation. Thank you sir.
Amazing video. This channel is so underrated.
Thanks, YouTube algorithm, for suggesting this channel to me.
We need so much more of this.
Your explanation is lucid and to the point. Thanks for the video. Keep up the good work! Wish you the best of luck.
Very insightful. The diagrams made me understand the concepts
Thank u so much!!! I had this question in my mind and you explained it in a very easy way!!!
Such a good content in just 5 minutes!
Thanks for the useful instruction!
Thanks a lot, finally a clear answer to why Kafka is everywhere now
I love the format of these videos. Looking forward to more and to the newsletters too!
Very deep insight! Looking forward to your next videos, please keep going
The "2" on cue was amazing
Never heard of Kafka. Thank you RUclips algorithm.
What are you gonna do with this knowledge now?
I have been following your channel for a long time and I truly enjoy your videos. The animations and visual effects you create are absolutely stunning!
I am a programmer by profession, definitely not in your field, but I am very interested in learning how to create such beautiful animations. Specifically, I am looking to transform flowcharts into engaging videos.
I would be extremely grateful if you could share some tips, the names of the software you use, or any methods you recommend for achieving this.
Thank you so much for your time and for the fantastic content you produce.
Very beautifully explained 👌
This is so amazing! Straight to the point!
How can you be so good at explaining things :)
I really appreciate your work. Excellent video. Superbly Articulated. Easy to grab the concepts. Great work. 😍
Amazing! Thank you for making this video, appreciate it esp diagram explaining zero copy principle
Short & sweet. Thank you.
Very simple and clear! Thank you!
Very informative video, thank you so much!!
Brilliantly explained!! 👏