I would not be worried. I tried ChatGPT and Google Bard and asked for a timecode decoder, and all I got was an underwhelming solution. While it understood the format of the timecode in BCD, it could not understand that it might be in the middle of the timecode at power-up, causing a failure to sync properly. I tried multiple iterations of the prompt, only for it to keep missing the correct answer. It's like it does not understand how data flows in time, and it definitely does not understand that input states will not be perfect at start-up. I also asked both to give me a 30-day plan with sources for doing image processing in FPGAs. While the resulting plan looked impressive, most of the sources it cited did not even exist. Suffice it to say, these are more of a curiosity. They are certainly not an augmentation of my skills or job, and they certainly won't replace my job yet.
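The sync failure described above can be sketched in plain Python. This is a hypothetical frame format (a SYNC token followed by four BCD digits, minutes then seconds), not any real timecode standard; the point is only that a decoder powering up mid-frame must discard partial data until it sees a frame boundary.

```python
# Sketch of the power-up sync problem: the decoder may start listening in the
# middle of a frame, so it must ignore symbols until it finds a frame marker.
SYNC = "SYNC"

def decode_stream(symbols):
    """Yield (minutes, seconds) for each complete frame after a SYNC marker."""
    synced = False
    frame = []
    for sym in symbols:
        if sym == SYNC:
            synced = True        # frame boundary found; start collecting
            frame = []
        elif synced:
            frame.append(sym)
            if len(frame) == 4:  # four BCD digits: M M S S
                minutes = frame[0] * 10 + frame[1]
                seconds = frame[2] * 10 + frame[3]
                yield (minutes, seconds)
                synced = False   # wait for the next SYNC

# Power-up mid-frame: the leading 2, 5 are leftovers of a truncated frame
# and must be ignored, not decoded as time digits.
stream = [2, 5, SYNC, 1, 2, 3, 4, SYNC, 1, 2, 3, 5]
print(list(decode_stream(stream)))  # [(12, 34), (12, 35)]
```

A naive decoder that starts filling its digit registers immediately would fold the stale symbols into the first frame, which is exactly the bug the chatbots kept producing.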
It's worth keeping in mind that ChatGPT is a massively underpowered model compared to more recent models based on GPT-4. While these models currently struggle with the concept of time, I think they will improve over the next 10 years, to the point where I don't think any job will be safe from their influence.
Nope. GPT-4 is what's used in the paid version, and it's very underwhelming. The possibility is there, but it won't happen with transformer-based LLMs, due to the limitations of how they work.
Influence is different from "going to take my job". I'm not sure it will ever get to the point where it replaces anyone; more likely it will be used for the simple or tedious sections. But I kind of stink at telling the future, so... maybe?
Designing hardware is a declarative process. You do not implement anything yourself; you tell the synthesis tool what to implement, using a description language. Teaching a machine this process naturally has the same complexity as doing it yourself, so there is no benefit to it besides the fact that someone who doesn't speak an HDL could train the machine. That again is nonsense, because virtually every human being experienced enough in hardware design knows such a language. LOL
The FIFO pointer was 4 bits wide, so it will not work with a buffer of 8 bytes; it will not roll over at the right point. Then again, maybe it will work because the highest bit gets dropped.
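The FIFO in question isn't shown here, but the pointer-width point can be sketched in Python (assuming an 8-entry, power-of-two buffer and a free-running 4-bit pointer): the index is just the low 3 bits of the pointer, so a wider pointer still addresses the buffer correctly as long as it is masked. The extra bit is conventionally used for full/empty detection, not indexing.

```python
# Sketch: a 4-bit pointer indexing an 8-entry buffer via its low 3 bits.
DEPTH = 8          # buffer entries (must be a power of two for masking)
MASK = DEPTH - 1   # low bits of the pointer used as the index

def index(ptr):
    """Buffer index for a free-running pointer value."""
    return ptr & MASK

# A 4-bit pointer counts 0..15; the masked index wraps every 8 counts,
# so addressing works even though the pointer itself is "too wide".
indices = [index(p) for p in range(16)]
print(indices)  # [0, 1, 2, 3, 4, 5, 6, 7, 0, 1, 2, 3, 4, 5, 6, 7]
```

So the commenter's second guess is right for the indexing itself: losing the highest bit is harmless there. Whether the design is correct overall depends on how that MSB is (or isn't) used to distinguish full from empty.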
THIS WAS ONE YEAR AGO ... WHAT ABOUT NOW ?
I should try again and see!
We are in the early days, it will progress quickly.
I tried Prolog, but it didn't give me working code either.
You should ask for AXI-enabled peripherals 😊
I should!
@@FPGAsforBeginners please make videos about that ☺️
@@asidesigner8542 Will add it to my list!
Is it right that ChatGPT doesn't even compile the code it suggests?
AI still needs to get through the intricacies that hardware engineering (FPGA, ASIC, chip design) involves; it cannot learn them by itself.