How a UART works and how to make one in software - Part 11 Microcontroller Basics (PIC10F200)

  • Published: 19 Jun 2024
  • UART, or Universal Asynchronous Receiver/Transmitter, is a very popular embedded systems communication protocol because it's simple and universal. While it isn't capable of transferring large amounts of data or communicating directly over USB, it is popular enough that there are devices that do USB conversion, and even pass-through wireless communication modules (XBee). Either way, we drill down into UARTs, how they work, and how we can create our own in software, specifically in Assembly, using our ever-present PIC10F200. We really enjoyed this tutorial and hope that you enjoy watching it as much as we did making it.
    These tutorials are starting to get complicated enough that the videos are staying pretty high-level. If you want code that you can copy and paste, or need some more time going through everything line by line, check out the written tutorial that this video is based on: www.circuitbread.com/tutorial...
    If you find this interesting, subscribe to the CircuitBread channel for more videos on microcontrollers and other beginner and intermediate electronics!
    Table of Contents
    0:38 Our UART setup with the PIC10F200
    1:49 What a UART actually is
    3:54 Our hardware setup to experiment with the communication.
    5:35 Starting the assembly review of our UART program
    12:58 Demonstrating the program in action.
    For electronics tools, tutorials, equations and more check out our site: www.circuitbread.com
    And check out our Friends of CircuitBread, who offer special discounts, product samples, resources and more to our users: www.circuitbread.com/friends
    CircuitBread is joining the fight to help people more easily learn about and use electronics. With an ever-growing array of equations, tools, and tutorials, we're striving for the best ways to make electronics and electrical engineering topics more accessible to everyone.
    Connect with CircuitBread:
    Instagram ➤ / circuitbread
    Facebook ➤ / circuitbread
    Twitter ➤ / circuitbread
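To make the framing the video describes concrete: below is a minimal C sketch (my own illustration, not the video's PIC assembly) of how a software UART serializes one byte into a standard 8N1 frame — one start bit (low), eight data bits LSB-first, one stop bit (high). The function name and array-based interface are assumptions for simulation; on real hardware each element would be driven onto the TX pin for one bit period.

```c
#include <stdint.h>

/* Serialize one byte into an 8N1 UART frame:
   bits[0]   = start bit (0)
   bits[1-8] = data bits, LSB first
   bits[9]   = stop bit (1)
   Returns the number of bits in the frame. */
int uart_frame_8n1(uint8_t byte, uint8_t bits[10])
{
    bits[0] = 0;                        /* start bit pulls the line low */
    for (int i = 0; i < 8; i++)
        bits[1 + i] = (byte >> i) & 1;  /* data bits, least significant first */
    bits[9] = 1;                        /* stop bit returns the line high */
    return 10;
}
```

A bit-banged transmitter would loop over this array, writing each level to the GPIO pin and then delaying one bit period.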
  • НаукаНаука

Comments • 37

  • @tedbastwock3810
    @tedbastwock3810 2 months ago +1

    I noticed in these videos there is no error handling or argument checking in the subroutines. This is something that we typically do in other languages. Is this common practice for programming in an assembly language, or is it just left out for simplicity here, or maybe it's not needed in these subroutines?
    This series is absolutely fantastic! Thank you so much for making and sharing these videos!!!

    • @CircuitBread
      @CircuitBread  2 months ago +1

      I'm glad you're enjoying these, Sergey is very proud of this series and is excited about some other series he's working on.
      For the error handling, it's a combination of things but, honestly, mostly for simplicity's sake to make the structure of the program as simple and easy to follow as possible. If you were to do a mission-critical application that needed to make sure everything worked every time (at least as much as possible), then the code would probably spend those delay cycles to double-check that the data it's receiving is actually good instead of literally just wasting its time.
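One cheap robustness trick the reply alludes to — spending otherwise-idle delay cycles on verification — is to sample each incoming bit more than once and majority-vote, plus sanity-check the framing bits. A hedged C sketch (the helper names are mine, not from the tutorial):

```c
/* Majority vote over three samples taken near the middle of a bit period:
   returns 1 if at least two samples read high. */
static int majority3(int a, int b, int c)
{
    return (a + b + c) >= 2;
}

/* Basic framing check for a received 10-element 8N1 frame:
   the start bit must be 0 and the stop bit must be 1.
   Returns 1 if the frame looks valid. */
static int frame_ok(const int bits[10])
{
    return bits[0] == 0 && bits[9] == 1;
}
```

A receiver that fails `frame_ok` would typically discard the byte rather than pass garbage up the line.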

    • @tedbastwock3810
      @tedbastwock3810 2 months ago

      @@CircuitBread Got it, thanks very much. I hope I get to see Sergey's next series!!

  • @Dykadda
    @Dykadda 4 years ago +2

    Just found this gem of a channel today. :D knew nothing about microcontrollers this morning but now I feel like I need to program a microcontroller :/

    • @CircuitBread
      @CircuitBread  4 years ago +1

      Awesome! That's a great feeling. Glad we could help with your understanding AND motivation! 👍

  • @juansauceda9656
    @juansauceda9656 3 years ago

    Like always, excellent video! My ideas are starting to take light! Thanks!

  • @fuzzs8970
    @fuzzs8970 2 years ago

    Thank you, very nice video. A lot of the time while using UART there's more than the two RX/TX lines: Reset, MD0, MD1, +5V, and ground.

    • @CircuitBread
      @CircuitBread  2 years ago +1

      Yep, there are more portions of UART but from a software point of view, RX and TX are the minimum connections to make it work (if you want two-way communication, that is).

  • @chainhonglim1281
    @chainhonglim1281 3 years ago

    Hi, may I know how to wire up an MPLAB ICD4 to an ATmega328P? I'm just trying to blink an LED in embedded C. Any advice would be appreciated. Thank you.

    • @CircuitBread
      @CircuitBread  3 years ago +1

      If you're looking at the wiring, you'll need to pull up the ICD4 data sheet (ww1.microchip.com/downloads/en/DeviceDoc/50002596E.pdf - page 13) to find the pinout of the programmer and the 328p datasheet (ww1.microchip.com/downloads/en/DeviceDoc/ATmega48A-PA-88A-PA-168A-PA-328-P-DS-DS40002061B.pdf - page 12) and match them together, depending on your device package. Sergey was originally thinking of doing some Atmega based tutorials but decided to do FPGAs next instead. I hope this is sufficient to help you figure things out!

  • @olafmeyrad7918
    @olafmeyrad7918 1 year ago

    Great, instructive Video! Have you by any chance the Code in C?

    • @CircuitBread
      @CircuitBread  1 year ago +1

      Unfortunately, not really... We've started a C-based tutorial series; there are quite a few written tutorials already, and about 6 videos have been shot and are just queued up for editing.
      www.circuitbread.com/tutorials/series/embedded-c-programming-with-the-pic18f14k50 Again unfortunately (maybe?), since this uses a more powerful PIC microcontroller, there's no need to bit-bang a UART, so we don't have a specific tutorial on how to create a UART in C.

  • @ronen124
    @ronen124 4 years ago

    this small uC has RS232 hardware support...nice
    I guess it offers only one speed option because I did not notice any fuses that dealt with this

    • @CircuitBread
      @CircuitBread  4 years ago +1

      Hey Ronen! It actually doesn't have hardware support, this is all software / bit banging. Because of that, you can actually change the speed for faster or slower communication as necessary. We just set it at 9600 baud for this example because that's pretty typical for small embedded applications.
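The delay behind that 9600 baud figure falls out of the instruction clock: the PIC10F200 executes at Fosc/4 (1 MIPS at 4 MHz), and at 9600 baud each bit lasts about 104 µs, so roughly 104 instruction cycles per bit. A small C helper for the arithmetic (my own sketch, not the tutorial's code):

```c
/* Instruction cycles per UART bit, rounded to nearest:
   cycles = instruction_clock_hz / baud_rate.
   E.g. a 4 MHz PIC10F200 executes at 4 MHz / 4 = 1,000,000
   instructions per second. */
unsigned cycles_per_bit(unsigned long instr_hz, unsigned long baud)
{
    return (unsigned)((instr_hz + baud / 2) / baud);
}
```

Doubling the baud rate halves the available cycles, which is why very high baud rates get hard to bit-bang on a 1 MIPS part.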

    • @ronen124
      @ronen124 4 years ago +1

      @@CircuitBread Beautiful work, sir. After surfing your website looking for the code and coming back here, I noticed at the 10:58 mark the calculated delay to transmit at 9600 baud.
      This tutorial can be very helpful for those who want to code in assembly language.
      cheers ✌👍😊

  • @user-st6fc9dw9g
    @user-st6fc9dw9g 1 year ago

    Great, but I have to use UART in my project with two components (receive data from an RFID card reader and handle it with the microcontroller, then send it to a PC). How can I handle this, or are there other ways to attach multiple components that use UART to the same microcontroller?

    • @CircuitBread
      @CircuitBread  1 year ago

      One of the weaknesses of UART is its inherent lack of addressing, so I think you'd either need another MCU with more pins (if you still want to bit-bang it), or one with both more pins and more features (if you want hardware-based UART modules), or you could switch to I2C, which it sounds like may not be an option at all.

  • @themillionairereview1469
    @themillionairereview1469 4 years ago +2

    I found this channel today. I'm going through a tough time. I'm in grade 12, I am extremely passionate about electrical engineering, and college admissions are coming up. But looking beyond my degree, I'm very disappointed at how little electrical engineering grads are getting paid compared to comp sci grads. Do you think it will improve in the future? Please address my concerns.

    • @CircuitBread
      @CircuitBread  4 years ago +1

      While we will not claim to be experts in current and future industry trends, my personal opinion is that it won't change. There is, and will continue to be, a huge demand for electrical engineers to continue to push hardware technology further and further. But with the increasing reliance on AI, IoT, and Industrial IoT, computer science is also in high demand and will probably grow more. However, at the moment, there is considerable overlap in the pay scales of EEs and CompSci grads, so in terms of money, it really still depends on what you're doing in either of the fields. And, of course, would you rather make $70K and love your job or $75K and hate your job? Though it may be more extreme - $50K versus $150K. Then it's a lot tougher to decide.

    • @themillionairereview1469
      @themillionairereview1469 4 years ago +1

      @@CircuitBread Yes sir, that is the problem. Many software developers in Silicon Valley are making around $300K+. Electrical engineering demands hard work and labour but the pay is less. Thanks for replying, though. I just have a query. Can a top-tier electronics/electrical engineer from MIT, Harvard or Princeton with a perfect GPA and projects REALLY ever bridge the gap with tier-one software developers? I mean, how high can the maximum salaries go? I haven't heard of any electrical engineer making $200K+ recently, but for software developers, the sky is the limit. I'm hella passionate about physics and maths and I have an olympiad medal in physics. So, in case I do manage to get into a top school in the US, how much salary can I really expect at the end of the tunnel - late career, that is? It is hard for me, honestly, because I witness lots of engineering grads doing computer courses and switching to tech. Also, as a disclaimer, I know money is not everything, but being smart is also choosing a field which will pay well for my hard efforts. Personally, my brother is also an electronics engineer at Qualcomm, and from his own experience, he is a hard worker but is not the most satisfied.

    • @CircuitBread
      @CircuitBread  4 years ago +1

      Oh, that is tough because, again, industry trends are really not my forte. But from what I've seen and looking around at different jobs, if you stay as a salaried electrical engineer (not switch to a managerial/leadership role or become an entrepreneur) you're probably going to top out at $200K-$250K. Of course, at the end of your career, after inflation and other stuff, the number will likely be much higher, but in 2020 money, that would be my guess.
      And while everyone is different, I will always push the "passion over money" line as long as you're making enough to satisfy your needs, which nearly all EE jobs can do. Unfortunately, sometimes the job environment can suck even if the work itself is satisfying. If only we had a functional crystal ball, this would all be so much easier.

    • @themillionairereview1469
      @themillionairereview1469 4 years ago

      @@CircuitBread Ok sir, one last thing. What will my take-home salary be out of $250K? From what I know, $250K isn't that great in the San Francisco Bay Area or San Jose, as those areas are pretty expensive. So if you could maybe give me an estimate of the take-home after taxes, it can be compared. Also, to take a managerial role, is it enough to perform decently at my workplace, or would I also need an engineering management/MBA degree?

    • @CircuitBread
      @CircuitBread  4 years ago

      Take-home salary is *highly* dependent on where you live, both in terms of taxes and what you see in your paycheck, and also in terms of where the majority of that money in the paycheck goes. For example, we're based out of Boise, and $250K/year (especially before housing prices skyrocketed in the last 6 years) would buy you a massive house with plenty of money left over for frequent new cars and putting money into savings, whereas you would probably still have to live in an apartment if you lived in San Francisco. My brother worked for Apple for a while and made (to me) gobs of money, but he had a two-hour commute each way every day and even then, his house was crazy expensive. He lasted a little over a year before quitting. Whereas if you were in the military as an engineer, your base pay looks terrible but you get untaxed housing, some untaxed/some taxed bonuses, full health coverage, and other benefits. It's very apples to oranges to compare.
      As for management, an MBA would be helpful but usually not necessary. More you need to show a good ability to communicate and lead while still being technically proficient.

  • @Hacker-at-Large
    @Hacker-at-Large 4 years ago

    I think the video would have been better if the difference between DCE and DTE was briefly explained along with the usage of a “null modem” cable.

    • @CircuitBread
      @CircuitBread  4 years ago +2

      Hi Stephen! Thanks for the feedback! We always struggle to figure out how to be concise yet still cover everything we feel is applicable for the topic. For this one, the hardware protocols lost out...

  • @james77011
    @james77011 1 year ago +1

    the ASCII code for capital 'H' is 01001000
    so, 01001000 01100101 01101100 01101100 01101111 is Hello 🤓🧐🤦 kind of weird that I still remember that
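Those groups do decode to "Hello" - 01001000 is 72 ('H'), 01100101 is 101 ('e'), and so on. A quick C check (the helper name is illustrative):

```c
#include <stdlib.h>

/* Decode n space-free 8-bit binary strings into an ASCII string.
   out must have room for n characters plus the terminator. */
void bits_to_ascii(const char *groups[], int n, char *out)
{
    for (int i = 0; i < n; i++)
        out[i] = (char)strtol(groups[i], NULL, 2); /* parse base 2 */
    out[n] = '\0';
}
```

This is exactly what the receiving end of a UART link effectively does: it reassembles each byte from the bit stream and interprets it as an ASCII character.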

    • @CircuitBread
      @CircuitBread  1 year ago

      Wow, I have done various projects with ASCII but I've never explicitly memorized the codes. When I'm doing them, I remember a couple just due to exposure but then I forget those few as soon as I'm done with the project.

    • @james77011
      @james77011 1 year ago +1

      @@CircuitBread when I was in school for basic electronics, my teacher had us working on the ASCII code for about three months.. that's probably why I remembered much of those codes 🤦🏿‍♂️

    • @davidconner-shover51
      @davidconner-shover51 1 year ago +1

      @@james77011 Lol, in my business, I've occasionally found the need to do some data entry in ASCII as decimal; with practice, it almost becomes muscle memory, almost like reading.
      Though I still prefer hex numbers for this sort of thing.

    • @james77011
      @james77011 1 year ago

      @@davidconner-shover51 I think I will get back into studying ASCII code and see how I can really do 🤔🤦🏿‍♂️

  • @ngonidzashemwanjira208
    @ngonidzashemwanjira208 1 year ago

    😂😂😂Did he say, “Maybe I’m crazy, it’s entirely possible”, ?

  • @amarobarbosa8483
    @amarobarbosa8483 2 years ago

    Too bad it's in assembler, I only understand C

    • @CircuitBread
      @CircuitBread  2 years ago +1

      Yeah, we'd like to go through these tutorials at different levels, including C, but the idea of these in particular was more to help everyone understand how microcontrollers work at a lower level. The concepts behind it would help anyone bit bang a UART with any language, though! Of course, I don't go into that level of detail in the video, but Sergey explains a lot in the written tutorial, if you want to check that out.

    • @davidconner-shover51
      @davidconner-shover51 1 year ago

      One thing to note: he is showing the bare-bones basics of these sorts of systems.
      So far in this series, he is not showing any resource on this chip other than GPIO - not the timer, nor full use of the rather short stack on this particular processor - introducing a few native instructions at a time. He hasn't really gotten into the wealth of bit-banging coding, including being able to use libraries of code (floating-point math, UART handling, etc.) and macros; one can create an entire higher-level language of one's own, in some ways richer than C. Much of the goal in bit banging is speed and compactness in code, i.e. just how much you can make that little bugger go.
      An example: about 20-odd years ago, I made a nice little toy out of a 12C509 controller - a short step up, same core processor, though with 1K instructions, 5 I/Os and 1 input, a 2-level stack, 42 bytes of RAM and no interrupts. This ran 20 LEDs arranged in a 4x5 grid display, 4 buttons, and an input-only UART (1200 baud). This little baby could display scrolling messages with a bitmap, do 4-bit monochromatic shading, and even play Pong, all in 1K - back when the clock was limited to 4 MHz (1 MIPS); it's 32 MHz (8 MIPS) now. Oh, and no extra silicon of any sort. Granted, Microchip had already published something similar, but I took it a couple of steps further, and didn't even look at their code.
      You would be surprised at how much bit banging gets incorporated into higher-level languages, especially when one discovers new efficiencies at the bit-bang level. Someone had to write the base code for that compiler.