This isn't the original Fortran; today's Fortran is a really comfy experience compared to whatever it was in 1958.
When I started on a mainframe using COBOL, there were a few Fortran subroutines in use. I recall that, like COBOL, everything was uppercase.
@@reggiedixon2 I use uppercase for built-ins 😆 but I just got into Fortran so maybe I'll change my mind...
Do you still write COBOL? (I'm assuming you're one of the remaining pillars of the banking system.) How did it evolve?
@LiamInviteMelonTeee No, I last actively wrote COBOL in 1992, when we migrated our mainframe systems to Unix running C and an RDBMS. I tittered at the description of C as an old language; Linus Torvalds would not agree.
I do remember the name changed from FORTRAN to Fortran when they dropped the old 80-column punch-card structure, in the Fortran 90 revision.
@@reggiedixon2 UPPERCASE comes from the 6-bit encoding era (max 64 chars, including control and whitespace), before the concept of bytes. Machines worked with "words" of, say, 18, 24, or 36 bits, normally used for math and logic, and packed multiple text characters into each word. Keep in mind memory was on the order of $1 per bit at that time, so in modern terms that's about $1,000,000 for 128 KiB.
That's also why IBM's later proposal to standardize on 8 bits per byte was very controversial even into the late 1960s: that's a lot of wasted dollars, though prices were declining rapidly. IBM made heavy use of 4-bit binary-coded decimal (BCD) math in its business/banking machines (it avoids the rounding problems that come from converting between binary and decimal), so 8-bit bytes fit well. Many simple pocket calculators use similar math: each digit is represented internally by a 4-bit integer and processed much the way humans do pencil-and-paper arithmetic, digit by digit with carries, as opposed to the pure binary methods used for monolithic numbers [e.g. a single 32-bit float].
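That digit-by-digit carry scheme is easy to sketch in modern Fortran. A toy version, purely illustrative (the numbers are made up, and real hardware packs each digit into a 4-bit nibble rather than a full integer):

```fortran
! bcd_add.f90 -- illustrative digit-by-digit decimal addition,
! the same carry scheme BCD calculators use (one digit per nibble).
program bcd_add
  implicit none
  integer, parameter :: n = 8
  integer :: a(n), b(n), s(n), carry, i

  ! Least-significant digit first: a = 1999, b = 1
  a = [9, 9, 9, 1, 0, 0, 0, 0]
  b = [1, 0, 0, 0, 0, 0, 0, 0]

  carry = 0
  do i = 1, n                   ! add one decimal digit at a time
    s(i)  = a(i) + b(i) + carry
    carry = s(i) / 10           ! carry into the next digit, just like on paper
    s(i)  = mod(s(i), 10)
  end do

  print '(8i1)', s(n:1:-1)      ! most-significant digit first: 00002000
end program bcd_add
```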
Please continue doing videos on Fortran!! It has a very distinctive syntax, and there is even a manual for Fortran 90 that teaches the basics. Fortran is commonly used for heavily mathematical calculations.
If you would like to, you could do a video on subroutines too, because I don't quite understand the syntax for those!
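In case it helps in the meantime, here's a minimal modern-Fortran subroutine; the names (swap, x, y) are invented just for illustration:

```fortran
! swap_demo.f90 -- minimal subroutine syntax in modern Fortran.
program swap_demo
  implicit none
  real :: x = 1.0, y = 2.0

  call swap(x, y)          ! subroutines are invoked with CALL
  print *, x, y            ! prints 2.0  1.0

contains

  subroutine swap(a, b)
    real, intent(inout) :: a, b   ! intent documents how arguments are used
    real :: tmp
    tmp = a
    a   = b
    b   = tmp
  end subroutine swap

end program swap_demo
```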
Fortran was created as a scientific programming language. One of my friends in college, many decades ago, had a project to use Fortran to write checks in business style, with the amount printed both as numerals and as words. Note, this was in the 1970s, so the language didn't have many of the conveniences of F90: you had to print one character at a time, so printing was a loop through the string.
oh my god
Really, all languages print text one char at a time by looping over a string; it just gets hidden in a standard function like printf(). (Aside from raw rendering of individual pixels in full graphical frames.)
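And modern Fortran can still do it explicitly with non-advancing I/O. A quick sketch, message text invented, of the kind of loop the 1970s anecdote above describes:

```fortran
! charloop.f90 -- printing a string one character at a time.
program charloop
  implicit none
  character(len=*), parameter :: msg = 'PAY TO THE ORDER OF'
  integer :: i

  do i = 1, len(msg)
    write (*, '(a)', advance='no') msg(i:i)   ! one char, no newline
  end do
  write (*, *)                                ! final newline
end program charloop
```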
Where is APL, A Programming Language, on the timeline? I used APL on a teletype-style typewriter machine. The remote computer was about 200-300 miles away. At the time I was both thrilled and in awe...
At university I wrote in Fortran 77, and the most annoying thing about it was that you needed to start writing in the seventh column. If you forgot, it wouldn't compile, and the compiler would give you no hint as to why.
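For anyone who never suffered it, here's a rough illustrative fragment of the fixed-form layout (labels in columns 1-5, continuation marker in 6, statements in 7-72, card sequence numbers in 73-80):

```fortran
C     FIXED-FORM FORTRAN 77 LAYOUT (ILLUSTRATIVE FRAGMENT):
C     COL 1      C MARKS A COMMENT LINE
C     COLS 1-5   STATEMENT LABELS
C     COL 6      ANY CHARACTER HERE CONTINUES THE PREVIOUS LINE
C     COLS 7-72  THE STATEMENT ITSELF
C     COLS 73-80 CARD SEQUENCE NUMBERS (IGNORED BY THE COMPILER)
      PROGRAM DEMO
      INTEGER I
      DO 10 I = 1, 3
         PRINT *, I
   10 CONTINUE
      END
```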
Great video, would love to see more of this type of content!
Indents are for wimps. True programmers only use FORTRAN 66! Yes, that's ALL CAPS, because lower case hadn't been invented yet. LOL. Seriously, my first brush with Fortran was Fortran 77 in college. Editing was with IBM 96-column punch cards, and compiling meant putting your stack of cards on the table outside the data center with the humming IBM System/3 and very loud line printer, then coming back several hours later to find your card stack and a printout (after the operator loaded your stack into the card reader hopper and ran your job). That's when you found out you had a typo on line 26. Arrrrrrgh!!!
Worse than finding the error on card/line 26 is dropping your whole box of cards while walking to the computer center to drop them off so the printout is ready for class tomorrow... That sucked.
@@johnprouty6583 I feel your pain. Sequence numbers to the rescue!
*That `end` statement surprised me* 😍
Fortran is my fav statically typed language
If you look up Programming the Manchester Baby on YouTube, there's a very interesting couple of videos. And no, nothing to do with me. The Manchester Baby was the first stored-program computer.
Far out! Thanks!
A Fortran program without GOTO? Should be disallowed by the compiler.
@@PerKraulis Thank you for the good laugh, kind sir.
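For the youngsters, an illustrative taste of the classic style, a label-and-GO TO loop that still compiles in free form today (modern code would of course use a structured DO):

```fortran
! gotoloop.f90 -- the classic label-and-GO TO style, for flavor only.
program gotoloop
  implicit none
  integer :: i

  i = 1
10 if (i > 3) go to 20   ! loop "header": bail out when done
  print *, i
  i = i + 1
  go to 10               ! back to the top
20 continue
end program gotoloop
```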
Do Lisp, bro. Second oldest, with a really unique syntax, and the pioneer of everything cool.
Thank you
Make it with asm 😂
auto dubbing 👎
I was expecting to see something like Plankalkül.
en.wikipedia.org/wiki/Plankalk%C3%BCl
Now I feel even older; Fortran IV was my first programming language.
This isn't the oldest coding language either.
Ada Lovelace -> en.wikipedia.org/wiki/Ada_Lovelace is credited with the first algorithmic encoding ever achieved, and is thus widely regarded as the inventor of programming (in the 1840s). Also, direct machine language (coding straight to the metal 💓) was used long before Fortran came on the scene. Fortran is a high(er)-level language and had to be written in ... something. I got my first (and thankfully last) exposure to Fortran on a DEC PDP-11 in H.S.
☮💓n🎸,
🖖😎👍
Real languages are all bootstrapped from a tiny bit of machine instructions (possibly with the aid of a mnemonic, a.k.a. assembly, code): you compile just a bare-minimum piece of the language into a better compiler, which is then used to compile more of the language, eventually implementing more advanced features like performance optimization after several iterations of the compile-the-compiler loop.
FORTRAN was one of those. Machine instructions are not exactly a language; they are just switches to activate and deactivate circuits.
@@mytech6779 In general, machine language is usually context-free and Turing complete (depends on the machine, to be sure). I'd say it qualifies as a language. In the old days, we didn't HAVE assemblers and compilers; we wrote our own. (Even basic I/O was still a major consideration in the early µP days: 8080, Z80, etc. Not many peeps have to think these days about how to scan a keyboard matrix. Edit: trust me, you don't want to run your interrupt handlers in Fortran 😉)
Even 'bootstrapped' languages still started at some level with some hand ML coding.
@ Machine instructions are not a language in the sense of abstraction; they most certainly depend on state (context), and not all instruction sets are Turing complete (but most are).
An instruction is just a physical activation of hardware circuitry: the 5th conductor from the right has a voltage above some threshold, which causes a particular cascade through the attached circuits, dependent on the initial state (e.g. register contents, interrupt signals) and ending in some final state.
That isn't a language statement in any meaningful sense. To call it a language is like saying the positions of the stick shift in your truck constitute a language.