1963 Timesharing: A Solution to Computer Bottlenecks

  • Published: Nov 24, 2024

Comments • 350

  • @joojoojeejee6058
    @joojoojeejee6058 5 years ago +90

    Corbato died on July 12, 2019; rest in peace. I'm glad he got to see computing evolve so much from the early 1960s, and of course he played an essential part in it.

    • @MrEnsiferum77
      @MrEnsiferum77 3 years ago

      Nothing has evolved. The complexity is still there.

    • @jeffwads
      @jeffwads 3 years ago

      @@MrEnsiferum77 You must be new. Educate yourself.

    • @MrEnsiferum77
      @MrEnsiferum77 3 years ago +3

      @@jeffwads I educate myself every day (web, DevOps, design patterns, data structures, etc.) to keep doing the same thing. Period. But I'm not jumping on tech buzzwords ("wow, this is our new future, ding ding")... 40+ frameworks and tools written by users around the world, where no one knows what is written or how, leaking security issues, for the same kind of apps written in '63 in assembly...

    • @GodEmperorSuperStar
      @GodEmperorSuperStar 2 years ago +2

      Corby looked so young in 1963. He was about 37 years old.

    • @inlovewithi
      @inlovewithi 8 months ago +1

      I searched for this video, because I was curious to see if he was still alive. I was hoping he still was.

  • @hyperthreaded
    @hyperthreaded 9 years ago +226

    Great video. I was actually waiting for him to talk about a concept -- any concept -- that would be outdated and no longer in widespread use in computers today. He didn't. RAM, stored programs, CPUs, mass storage, operating systems, round-robin and priority scheduling, run queues, it's all there in every modern general-purpose computer (including smartphones etc.) in use today. The CPUs have gotten faster and faster by a factor of millions, and a lot of new, additional stuff has been invented since then, especially in the areas of input/output processing and human-computer interaction obviously, but none of those things have actually *replaced* any of the concepts and foundations laid out by these people back then. Quite astonishing if you think about it.

    • @drumphil00
      @drumphil00 8 years ago +13

      Yep. All that has changed is the computers are faster and have more storage, and the input/output devices have got fancier.

    • @joojoojeejee6058
      @joojoojeejee6058 6 years ago +13

      We even have magnetic discs still in widespread use 55 years later... One would've assumed that they would be outdated by now, but not really. Solid state drives still can't beat them in capacity (for a sane price).

    • @awuma
      @awuma 6 years ago +8

      This was before microelectronics were widely used. The IBM 7090 series were powerful "second-generation" computers, using discrete solid-state components instead of vacuum tubes. The IBM 360 was the breakthrough "third-generation" family, using microcircuits, which were a direct technology spin-off from the Minuteman ICBM program and the space program. These microcircuits were still a far cry from microprocessors, which first appeared about ten years after this film.

    • @alberoDiSpazio
      @alberoDiSpazio 6 years ago

      Olaf Klischat Economically, wouldn't computers be deflationary?

    • @jugganuat6440
      @jugganuat6440 6 years ago

      Certainly not enough technology to travel almost 500000 miles successfully

  • @rushfari
    @rushfari 11 years ago +24

    I have no idea why all these 50-year-old documentaries and reports are so utterly fascinating to me. Well, the fact that I'm a 56-year-old may have something to do with it, but man, this is cool stuff.

    • @klaasbernd
      @klaasbernd 4 years ago +3

      Nope. I'm 34 and interested. I think it's because these films were aimed at people who were genuinely interested, and they were made by experts. They're slow and go deep. I like them because they explain the tidbits.

  • @drumphil00
    @drumphil00 8 years ago +63

    Should be mandatory viewing for anyone studying software development or computer science. Gives you a much more useful perspective on computer architecture than is usually gained from high level application development.

    • @mehmetgokturk5596
      @mehmetgokturk5596 1 year ago

      Mandatory in my graduate-level operating systems course. :) Great indeed.

  • @tachikomakusanagi3744
    @tachikomakusanagi3744 3 years ago +14

    This guy in the interview is an actual genius. He invented the concept of the operating system, and then went ahead and created the first one. All of modern computing surges forward from this point.

  • @morphykg1503
    @morphykg1503 4 years ago +4

    Dr. Corbato, the professor giving the lucid lecture here, invented Multics, an operating system that directly influenced Unix. In this video, he's likely describing Multics or a prototype of it. Born in 1926, he served in WW2, and then spent a career as an MIT professor. He passed away last year. Rest in peace, Dr. Corbato. Your impact on computing is a tremendous one.

    • @GH-oi2jf
      @GH-oi2jf 4 years ago

      Satrio H - Multics influenced every subsequent multiuser operating system.

    • @dutchcanuck7550
      @dutchcanuck7550 1 year ago +1

      He's describing the prototype of what would become CTSS, the Compatible Time Sharing System. Ancestor of all multi-user, multi-tasking OS's.
      en.wikipedia.org/wiki/Compatible_Time-Sharing_System
      Multics is still a few years off.

  • @uludodo
    @uludodo 10 years ago +57

    The birth of operating systems. They don't even call it an OS; they call it the Supervisor :)

    • @dimbulb23
      @dimbulb23 7 years ago +11

      Actually, the 'supervisor' was only part of an OS, typically the part that controls when tasks get control of the processor and for how long. Other parts of the OS might handle disk operations, other I/O operations, error recovery, communications, etc. Much of the 'supervisor' activity in more modern devices, and even ones in the '70s, was handled in firmware or microcode and is no longer a part of the OS at all. But it all started then.

    • @GH-oi2jf
      @GH-oi2jf 5 years ago +1

      Doğan Kurt - When I started using computers in 1965, we called it an “operating system.”

  • @SirTusharGupta
    @SirTusharGupta 1 year ago +1

    I am a computer science student, and in my operating systems textbook I study batch processing and time sharing.
    This video was recorded at the very time these concepts were being developed.
    Awesome 😮

  • @truantbuick
    @truantbuick 10 years ago +36

    Almost everything he said is still relevant today to process scheduling and memory management. (For most intents and purposes, process scheduling = timesharing. Timesharing resolves the problem of multiple humans wanting to run their programs. Process scheduling resolves the problem of multiple applications wanting to run their programs.) Process scheduling is still an active area of research, and lots of smart people continue to try to hammer out the particulars of the best way to do it, and yet it was accurately summarized in this video from 51 years ago.
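Both forms of scheduling described in the comment above come down to picking the next job from a run queue. Here is a minimal sketch of priority dispatch, one of the policies mentioned in this thread; the process names and priority numbers are invented for illustration:

```python
import heapq

# Toy priority run queue: the dispatcher always runs the ready
# process with the lowest priority number first.
# Process names and priorities below are made up for illustration.
run_queue = []
for priority, name in [(2, "editor"), (0, "interrupt-handler"), (1, "compiler")]:
    heapq.heappush(run_queue, (priority, name))

# Pop jobs in priority order until the queue is empty.
dispatch_order = [heapq.heappop(run_queue)[1] for _ in range(len(run_queue))]
print(dispatch_order)  # → ['interrupt-handler', 'compiler', 'editor']
```

Real schedulers re-queue jobs after each time slice and adjust priorities dynamically; this only shows the dispatch-order idea.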

  • @charles-y2z6c
    @charles-y2z6c 3 years ago +3

    John Fitch, the Science Reporter host, passed away in November 2020 at age 94. He said he had achieved all his goals in life; the only one he missed was that he wanted to live to 100.
    RIP

  • @JimGardner
    @JimGardner 6 years ago +4

    No matter how absolutely mind-blowing it is to be watching this film via the commonplace medium we call YouTube, nothing is more incredible to me than the fact that this was cutting-edge technology just 10 years before I was born, and by the time I was 11 I owned a computer of my own.

    • @arsnakehert
      @arsnakehert 6 years ago

      It all advanced just so rapidly. I'm glad it did, tbh.

  • @iPondR
    @iPondR 5 years ago +5

    I like how Dr. Corbato uses the word "we" and not "I"... thinking as a team, not as a 'founder'. This film is gold.

  • @XTL_prime
    @XTL_prime 7 years ago +9

    "Like a group of people catching a bus." What a great analogy for a batch.

  • @vm2463
    @vm2463 3 years ago +5

    The $600 per hour, in today's money (based on a gold price comparison), is $31,389.45. That's thirty-one thousand dollars per hour as the cost of operation. A single second of computer time comes to $8.72 in today's money. The program they ran took 18 seconds, which means it cost them $156.95 to run the program that took square roots and found a hypotenuse. Incredibly expensive computing.
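The arithmetic in the comment above checks out; a quick sketch (the $31,389.45/hour figure is the commenter's own gold-price conversion of the film's $600/hour, taken here as given):

```python
# Sanity check of the commenter's cost arithmetic.
# The hourly figure is the commenter's gold-price conversion of the
# film's $600/hour, taken as given.
hourly_cost = 31_389.45          # dollars per hour, today's money
per_second = hourly_cost / 3600  # dollars per second of computer time
run_cost = per_second * 18       # the demo program ran for 18 seconds

print(f"${per_second:.2f} per second")  # → $8.72 per second
print(f"${run_cost:.2f} for the run")   # → $156.95 for the run
```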

  • @davidgrisez
    @davidgrisez 2 years ago +3

    I am old enough to remember the days of computer time sharing at the university I went to. The university had two CDC 3170 computers. There was a room with a number of teletype time-share terminals, with paper tape readers and punches, where students could enter and run their programs. Punch-card programs could also be dropped off at the Computer Center for when the computers were running batch jobs. Today I have a powerful iMac with internet service at home, which is what a lot of people have today. It is a big difference.

  • @17R3W
    @17R3W 15 years ago +10

    Around the 13-minute mark, where he's talking about doing things in bursts: that's how computers worked up until just a few years ago (in fact, to a large extent, even today).
    Until the advent of hyper-threading and multi-core, a computer could not do two things at once.
    When things were "multitasked" they were actually processed in short bursts. This all happened so fast that you couldn't tell, but that's how it worked.

    • @awuma
      @awuma 6 years ago +3

      Things are still done that way. Typically, many more processes are running than there are cores or "threads".

    • @relaxtosoothe5017
      @relaxtosoothe5017 2 years ago +2

      Basically the CPUs now are time-sharing the tasks. Computers can still only do one task at a time per core; the time has just been split into nano/pico-second slices, lol.
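The "short bursts" this thread describes are what a round-robin scheduler produces: each job runs for a fixed quantum, then rejoins the back of the queue. A toy sketch; the job names and burst lengths are invented for illustration:

```python
from collections import deque

def round_robin(jobs, quantum):
    """Run each job one quantum at a time until all work is done;
    returns the order in which jobs got the CPU."""
    queue = deque(jobs.items())  # (name, remaining_time) pairs
    order = []
    while queue:
        name, remaining = queue.popleft()
        order.append(name)           # job gets the CPU for one quantum
        remaining -= quantum
        if remaining > 0:
            queue.append((name, remaining))  # not finished: back of the line
    return order

# Three jobs needing 3, 1, and 2 time units, with a quantum of 1.
print(round_robin({"A": 3, "B": 1, "C": 2}, quantum=1))
# → ['A', 'B', 'C', 'A', 'C', 'A']
```

The interleaved output is exactly the "bursts" effect: no job ever runs to completion in one stretch, but every job keeps making progress.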

  • @UmarSear
    @UmarSear 9 years ago +47

    Isn't it remarkable how much things have changed on the surface, and how little in other respects? Computers still operate on the same principles, and the current "all the rage" cloud computing is just another version of yesterday's time sharing!

    • @nickfarmer2452
      @nickfarmer2452 3 years ago +2

      Neal Stephenson wrote an essay, "In the Beginning Was the Command Line". In it he makes the point that all the fanciness in and around the CPU that has accumulated over the years is there to make the outside world look to the CPU like what you see at 5:15.

  • @null1023
    @null1023 7 years ago +8

    Very easy to understand explanations (and relevant, even decades later!) in this video.

  • @shahinarya
    @shahinarya 1 year ago +1

    Wonderful people, great inventions. I feel fortunate to have been educated in the field in the '70s and to have been a part of various developments in the field since! Thank you for posting!

  • @strife1012
    @strife1012 3 years ago +2

    This is Project MAC (Mathematics and Computation), a collaboration built around a computer, or a network of computers, that sought to create a functional time-sharing system. It was led by scientist Robert M. Fano, with computer scientist Fernando José Corbató as a founding member.

  • @Roysac
    @Roysac 15 years ago +9

    A nice gem from history. It is hard for people today to imagine the thought processes of the past, in the era of the early computers. This stuff is what got us where we are today, and it's now taken as a given... which it was not.

  • @adud6764
    @adud6764 3 years ago +2

    Fernando José Corbató (the guy explaining) only died in 2019. Really nice that he still got to experience all the technological development.

  • @auronoxe
    @auronoxe 5 years ago +3

    Great interaction between the reporter and the professor. Just a board, no fancy animations needed to explain it so well.

  • @johneygd
    @johneygd 8 years ago +4

    I like those noises from the typewriters and printers; it's amazing how those people were already thinking ahead back then.

  • @DrFiero
    @DrFiero 6 years ago +7

    Ahhhh... "that smell".
    Sort of a combination of air conditioned air, and the warm oil/ink coming off of the teletype console.
    Brings back memories!

    • @DaveYostCom
      @DaveYostCom 3 years ago +1

      Visit the 1401 room at CHM!

  • @rodcorkum
    @rodcorkum 11 years ago +8

    Your post brought back memories. I was doing the same thing around the same time in Bridgewater, Nova Scotia, Canada. A time-share computer company had put a terminal in the local high school for demonstration purposes, and a teacher was using it at night to give free BASIC classes to the public.

  • @FaustoM7432
    @FaustoM7432 3 years ago +1

    Computer languages, user-machine interaction, multi-user and multitasking, telecommunications, magnetic storage: this man already predicted (or perhaps contributed to solving) all the major improvements we have on our PCs, tablets, and smartphones today.

  • @laser31415
    @laser31415 12 years ago +5

    I'm glad these old films survived.

  • @MrHmg55
    @MrHmg55 12 years ago +7

    I remember sitting in the "computer lab" at my high school in Massachusetts in 1972-73, except there was no computer there, just two teletype machines. The computer was at SETS (Systems for Educational Time Sharing) in Waltham, Mass., about 20 miles away. Fun times sharing BASIC programs with students at other schools and even "texting" on occasion.

    • @allanrichardson1468
      @allanrichardson1468 6 years ago +2

      MrHmg55 Once time sharing was a somewhat mature technology, companies were formed to sell time sharing to the general public (mostly businesses because of the cost), with large computers set up to accept dialup calls from terminals (Teletypes at first). Their computers did nothing but service users and compute the bills (based on connect time, CPU time, and amount of data stored on the disk drives). These companies eventually morphed into internet providers like AOL and the late great Compuserve.

    • @dave-yj9mc
      @dave-yj9mc 5 years ago +1

      @@allanrichardson1468 Woo-hoo, all bless CompuServe and pay your connection bill of $3.25 per hour... which was $300-plus a month!!!!!

  • @chuppa1chups
    @chuppa1chups 7 years ago +5

    It's also noteworthy that portable clip-on microphones have evolved according to Moore's Law as well. (I'm actually impressed by seeing one in this video.)

  • @ComputerHistoryArchivesProject
    @ComputerHistoryArchivesProject 8 years ago +6

    Excellent vintage film. Brings back lots of memories of early math and computer classes. : )

  • @AventurasconAlberto
    @AventurasconAlberto 3 years ago +2

    This video was absolutely fantastic, well done for uploading this jewel!

  • @awuma
    @awuma 6 years ago +2

    Excellent video, still fully relevant, very advanced for its time. Captures the nature of computing in those times. Little changed overall until the mid-70's, when smaller multi-user systems with many CRT display terminals became widespread. I recall how in 1977, the campus IBM mainframe at Caltech was still a cards-over-the-counter, printout-in-mailboxes operation, the only two user CRT terminals being reserved for JPL Accounting, since the standard operating system software was so inefficient in handling terminals that the overall performance would suffer unacceptably. Fortunately, pretty soon the DEC VAX-11/780 came on the market and was eagerly purchased by departments and research groups. Despite being equivalent to a later Intel 80386/387 of less than 20 MHz clock speed, the VAX could handle a dozen users with a respectable 0.5 Megabyte memory and 70 Mbytes of disk space. Of course, all of this is far, far less than low-end cellphones of today. In physics and astronomy, the VAX reigned supreme until about 1985, when Sun (and more expensive manufacturers) brought out UNIX and microprocessor based workstations and servers, and then in the '90's, Linux and Mac became ubiquitous as PC's became cheaper and more powerful.
    The best central campus mainframe setup I encountered was at UBC in Vancouver, which used IBM or Amdahl hardware with the Michigan Terminal System. In the years 1971-1977, I found it very powerful, easy to use, with video and card terminals, and printers, in several locations on campus. I found it even more convenient than the later, excellent DEC VAX. Of course, UNIX surpassed them all, though it has a steeper learning curve.
    The Intel i7-4790K on which I write this is less than 2 cm x 1 cm of silicon, yet has incomparably more power than the Cray supercomputer I used 30 years ago... I suspect even my cellphone is more powerful than that Cray.

  • @slapkickinmule
    @slapkickinmule 10 years ago +6

    I actually saw this guy talking about the Apollo Guidance Computer. Great host.

  • @evanfitz
    @evanfitz 10 years ago +8

    This is a lot different from the usual heavily scripted reports that come from this time period.
    Great video, thanks for uploading!

  • @peternielsen8362
    @peternielsen8362 1 year ago

    I first started on a terminal using time sharing.
    Made so much sense with many users on one computer.

  • @brianarbenz7206
    @brianarbenz7206 7 years ago +3

    "Seminal 90 High Speed General Purpose Digital Computer." Just the kind of jawbreaker moniker that inspired Steve Wozniak to name his "The Apple."

  • @bob4analog
    @bob4analog 8 years ago +5

    This is so cool! It explains how computers got where they are today.

  • @Bernievidtime
    @Bernievidtime 1 year ago

    Wow, love these perspectives that you get seeing old time computer documentaries. The guys are down to earth as well.

  • @Scousar
    @Scousar 14 years ago +3

    This is superb. Corbató is actually telling us about Multics, a very advanced OS developed at MIT. It never enjoyed great commercial success, but it was the first OS to be written in a high-level language (PL/1, also called PL/I).
    Unix and later Linux owe a great deal to Multics, which was better engineered than Unix but was never as freely available. Great video, thanks.

  • @Demache92
    @Demache92 11 years ago +11

    Color films existed. But it was limited to big budget films, as color processing on film was horribly expensive back then. This is just what low budget educational films were like at the time.

  • @jeffm4284
    @jeffm4284 3 years ago +1

    I am SO GLAD that this information was captured and made available! The tech of the time thankfully included this type of film, which lasted until it could be digitized.
    This is a very, VERY early pioneering demonstration of the computer concept of "time sharing". According to my research, John McCarthy at MIT was the first to implement time-sharing of user programs, in 1959. He was involved in two other successful efforts (not to mention, he coined the term Artificial Intelligence and laid significant groundwork in the field).
    There are obviously others who participated, but McCarthy by all accounts did the most to create this capability (versus just writing a paper about it), which in my book is the most important aspect (literally in my book: I'm currently neck deep in writing two books on cloud computing).

  • @Peter_Scheen
    @Peter_Scheen 5 years ago +1

    18 seconds to do two square roots and one hypotenuse. We have come a long way. I liked it when he said they are experimenting with visual output.

    • @superdepronic
      @superdepronic 5 years ago

      Quite fast ! Today a lot of people spend a whole lifetime without calculating two square roots and one hypotenuse with the aid of modern computers !

  • @theloniousMac
    @theloniousMac 4 years ago +1

    And now my laptop that is incomprehensibly more powerful, is twiddling its thumbs while I scratch my balls.

  • @mrjohnson0asdf
    @mrjohnson0asdf 15 years ago +7

    Actually, it's kinda freaky how similar a modern computer still is.. How far we've yet to come.

  • @sujalgarewal2685
    @sujalgarewal2685 3 months ago +2

    "A computer has 65000 words of memory. That's the scale we are talking about"

  • @17R3W
    @17R3W 15 years ago +5

    The interesting thing is, this is still a problem when working with expensive computers.
    When working at a small TV station, I made a habit of making sure the computer was working while I was on a break, or after I left work for the day.
    I always wanted to have a computer rendering or compressing as many hours a day as possible.

  • @needlove1982
    @needlove1982 11 years ago +4

    These guys were very foresighted about the problems of their day and what lay ahead in the future. Wow.

  • @wel97459
    @wel97459 15 years ago +3

    All OSes are basically time-sharing the programs running on your computer; the OS does what the supervisor did for the consoles that were tied to it.
    And in some ways it's more like a computer terminal server, like SSH.

  • @saskiavanhoutert3190
    @saskiavanhoutert3190 6 years ago

    Great explanation of 'old' computer usage. The new computer engineering couldn't have gone so far without this past.

  • @GH-oi2jf
    @GH-oi2jf 5 years ago +1

    Corbato is talking about MULTICS, known to every serious student of operating systems as the foundation of time-sharing.

  • @rdvqc
    @rdvqc 5 years ago +1

    My first experience with timesharing was on the McGill RAX system in the late 1960s. At the time it was running on an IBM 360/50. Most access was via TTY33 and TTY35 dial-up terminals. Those with more budget could use an IBM 2741 terminal, which was built around a Selectric typewriter. It was about 50% faster.

    • @rdvqc
      @rdvqc 4 years ago

      @referral madness The earliest uses of "dial up" communications were for applications like Telex and Teletype, where the terminals would be connected to each other. You can get a general idea by looking up "Teletype Model 33" in Wikipedia. These same devices were also used as consoles for some systems. Many of the TTY variants had built-in modems and dials. They typically used async ASCII as the protocol. Some terminals even had acoustic couplers, where you could dial a regular phone and put the receiver into the modem.
      Your memory of hardwired terminals is probably of the IBM 3270 class of terminals, which were typically hardwired even if extended over a WAN.
      This is waayyyy too big a topic for a YouTube comment.

  • @mikeklaene4359
    @mikeklaene4359 10 years ago +4

    What is old is new again. I first heard of timesharing systems in the late 60's when a younger brother was a physics major at Xavier University in Cincinnati. I had just landed my first programming job at a DOS-360 shop.

  • @needlove1982
    @needlove1982 11 years ago +6

    This has to be before public education in the United States fell completely apart. The comments on here about these people being so much smarter are 100% correct.

    • @GH-oi2jf
      @GH-oi2jf 5 years ago

      needlove1982 - Before the web, most people could spell all the simple words.

  • @lucasbachmann
    @lucasbachmann 11 years ago +5

    It would be awesome to get a followup interview from Fernando Corbato commenting on this video.

  • @fabriziodutto7508
    @fabriziodutto7508 2 years ago

    Love you guys at C.H.M. ! What a beautiful trip into history of mankind! As a technician, I really appreciate your work, hoping the museum economics will survive this pandemic situation, to grow and bring more machines to a second life! Thank you so much. Respectfully. F.D.

  • @mauryginsberg7720
    @mauryginsberg7720 7 years ago +4

    This seems very advanced for 1963

  • @leberkassemmel
    @leberkassemmel 6 years ago +5

    This is, simplified, how modern OSes do multitasking.

  • @ForbinKid
    @ForbinKid 13 years ago +1

    We had to upgrade a Honeywell computer in the mid-'70s and decided to evaluate the competition too. The Honeywell salesmen told our boss that no one would run a business on a time-sharing computer.
    We finally went with a DEC time-sharing machine rather than a Honeywell punch-card/tape machine. I made sure of that, having enjoyed the DEC machine so much in school :)

  • @martinhertog5357
    @martinhertog5357 3 years ago

    Round-robin is still used for serving a list of concurrent tasks. Nice vintage documentary.

  • @radarmus
    @radarmus 10 years ago +2

    Wow, what a time: the magnetic disk had only been around for a year! (This was the pioneer era of the computer.) In many ways they actually used the resources much better at that time than we do now.

    • @awuma
      @awuma 6 years ago

      There also were drums, and all those early devices were huge. I recall one experimental disk drive at the University of Hawaii Computer Center which was a meter or so in diameter and, IIRC, about a quarter to half an inch thick! The standard 2 MByte disk we used from the mid-sixties to about 1980 was in a plastic cartridge about 18 inches in diameter. Things changed in the mid-'70s when IBM developed the sealed "Winchester" drive. Only around 1980 did disk drives shrink to the 5.25-inch format accommodated by PC cases.

  • @samiam5557
    @samiam5557 8 years ago +3

    This is a demo of my high school's tech level at the student computer lab in 1978: just a phone-cradle modem and a terminal typewriter/printer! LOL

    • @Pimp-Master
      @Pimp-Master 5 years ago +1

      Me too. Whenever I think of computing, it's "you go to the university for some time sharing." The output is continuous rolled green paper.
      There's not a GUI in sight, plus there's a lot of noise and warmth in the room.

  • @James_Bowie
    @James_Bowie 3 years ago +1

    What a great bit of history preserved there. Thanks for posting. 👍
    (I pity the poor camera guys with those behemoths of the day.)

  • @afro-haitian2131
    @afro-haitian2131 11 years ago +1

    Wonderful post.
    It is amazing to see how the computer has developed.

  • @alberoDiSpazio
    @alberoDiSpazio 6 years ago +1

    Attaching a bunch of consoles to a computer? Is that like attaching a bunch of chromebooks to a cloud?

  • @zoltanberkes8559
    @zoltanberkes8559 9 months ago

    This is basically the birth of the command line, the file system, and the operating system.
    Oh, and the remote console and the computer network.

  • @lowercherty
    @lowercherty 5 years ago

    I remember playing the original text based Star Trek game on a paper based terminal at college in the mid 70's. Miles of paper, but at least you could go back and look at the previous long and short range scans. How things have changed.

  • @joyange1
    @joyange1 11 years ago +5

    And to think that most of us just leave our PCs running all the time doing nothing most of the time and here they are trying to utilize every precious second of CPU time.

    • @oldtwinsna8347
      @oldtwinsna8347 6 years ago +4

      There are distributed number-crunching apps you can download to help cure cancer, find aliens, etc. They were quite popular at the turn of the century, but most folks removed them because they max out your CPU, create a racket, and drive up your electric bill.

    • @allanrichardson1468
      @allanrichardson1468 6 years ago

      oldtwins na - They're still around. I run SETI@home as a screensaver.

    • @awuma
      @awuma 6 years ago +1

      The power we have now seems absurd, almost inconceivable only thirty years ago, but the limitation is software, which rarely uses the full potential of the hardware. Furthermore, most personal computers are not built to run full blast; they overheat and automatically slow down. To do serious number crunching or even gaming, you need to install much better cooling for the CPU and GPU, in many cases having to take off the CPU lid and apply a better thermally conducting substance between the CPU die and the lid. There is a pretty big subculture of people who push their computers to the limit, usually for gaming but also for scientific number crunching and cryptocurrency mining.

  • @TikluGanguly
    @TikluGanguly 12 years ago +4

    Wow... Unix in the making... his work inspired Multics, which in turn inspired Unix and Linux :). Man, he is the granddad of modern OSes.

    • @awuma
      @awuma 6 years ago +2

      Actually, a much more practical system called Michigan Terminal System (MTS) was developed at University of Michigan and was used from 1967 to 1999 at a number of universities around the world. I used it for most of my thesis work. Of course, the people at MIT later developed Multics which then intellectually led to Unix, but Multics itself ran only on GE/Honeywell computers, whereas MTS ran on the much more common IBM 360/370 family. When I arrived at UBC in 1971, MTS was an absolute revelation to me, since I had used several systems already, including DEC PDP of some sort, IBM OS and PS, and CDC Scope. MTS was a pioneer of a number of features, such as dual processor operation and virtual memory, as well as having a user friendly command language. It's sad to see the successful MTS forgotten today, while Multics basks in the reflected glory of being the (anti-)precursor of Unix.
      For six years, my stuff stayed stably on that MTS system at UBC, despite complete hardware replacement. Only with Linux have I experienced similar stability (twenty year old binary applications still running).

  • @SalvatoreDAgostino_0
    @SalvatoreDAgostino_0 10 years ago +3

    Ran across this in some research on cloud computing...

  • @fitfogey
    @fitfogey 4 years ago

    I never had an amazing teacher like this.

  • @octoberskye1049
    @octoberskye1049 5 years ago +2

    Astonishing. The more things change, the more they remain the same. Travel well. Safe return. ❤🐯

  • @badATchaos
    @badATchaos 10 years ago +30

    Amazing how little computers have changed

    • @hyperthreaded
      @hyperthreaded 9 years ago +15

      Y'know what's funny? I can't decide whether the other person who liked your comment did so because he thought you were being funny, or because he understood you were right. :-D

    • @awuma
      @awuma 6 years ago

      The big ones are now made up of very many "little ones" :-)

  • @radarmus
    @radarmus 10 years ago +8

    en.m.wikipedia.org/wiki/Fernando_J._Corbató

    • @tomp2008
      @tomp2008 10 years ago

      wow does he ever look old!

    • @RCAvhstape
      @RCAvhstape 9 years ago +3

      +Mads Thorup Awesome! He's old enough to be a WWII vet and he's still a computer scientist. And he's the guy I can blame for having to remember so many passwords lol! Would be cool to take a class with him, imagine the knowledge in that brain.

  • @toddojala8867
    @toddojala8867 10 years ago +20

    I've only skimmed the comments, but am amazed by the chronocentrism and arrogance of "modern" commentators. Most of them couldn't think their way out of a Walmart parking lot. Considering the resources they had, these early computer pioneers were giants compared to the ants of today.

    • @awuma
      @awuma 6 years ago +2

      And those giants could use resources very efficiently. These early operating systems were written in Assembler, as close to the hardware as is humanly possible. The operations described in this film could be carried out in a few KBytes of code.

    • @Bodragon
      @Bodragon 6 years ago +2

      Assembler (or assembly language) isn't as close to the hardware as is humanly possible. That would be machine code. And yes, programs were actually written in machine code before assembler was developed; assembler was the first major jump toward language-oriented programming (in the form of mnemonics).

    • @Pimp-Master
      @Pimp-Master 5 years ago

      I’m much more interested in the subject now that I have seen what computers can do.
      I’m only 12 minutes in, and I couldn’t stay interested in it unless I was a math, logistics, or engineering student of ‘63.

    • @GH-oi2jf
      @GH-oi2jf 5 years ago

      Todd Ojala - I’m inclined to agree, since I started programming in 1965.

    • @JanBruunAndersen
      @JanBruunAndersen 2 years ago +1

      @@Bodragon - I disagree. Except for two-pass assemblers, which freed you from having to calculate addresses ahead of time, there really is no difference between coding in assembly language and coding "by numbers". There is a 1-to-1 correspondence between the assembly code you write and the code that the computer executes.
      But you are right - there is something even closer to the hardware than machine code. That level is called microcode. At that level you are not telling the computer WHAT to do. You are telling it HOW to do it by controlling which chips, gates, and transistors are turned off and on, when, and for how long.
      The low-level machine code (which is really just a set of bit patterns) is basically used to look up a particular sequence of microcode instructions (somewhere between 15 and 45) for each machine instruction, and then feed those microcode instructions = signals to the hardware.
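The 1-to-1 correspondence described in the comment above can be sketched with a toy assembler for a made-up machine. The mnemonics, opcodes, and instruction format here are all invented for illustration; real instruction sets are far richer, but the principle is the same: each mnemonic is just a human-readable name for one opcode, so the translation is reversible.

```python
# Toy assembler for a made-up machine: each mnemonic maps to exactly one
# opcode, so assembly source and machine code carry the same information.
OPCODES = {"LDA": 0x01, "ADD": 0x02, "STA": 0x03, "HLT": 0xFF}
MNEMONICS = {code: name for name, code in OPCODES.items()}

def assemble(lines):
    """Translate 'MNEMONIC OPERAND' lines into (opcode, operand) byte pairs."""
    program = []
    for line in lines:
        mnemonic, operand = line.split()
        program.extend([OPCODES[mnemonic], int(operand)])
    return program

def disassemble(program):
    """Recover the assembly listing from the machine code: nothing is lost."""
    return [f"{MNEMONICS[program[i]]} {program[i + 1]}"
            for i in range(0, len(program), 2)]

source = ["LDA 10", "ADD 32", "STA 11", "HLT 0"]
machine_code = assemble(source)
print(machine_code)                         # [1, 10, 2, 32, 3, 11, 255, 0]
assert disassemble(machine_code) == source  # perfect round trip
```

The round trip is what "coding by numbers" means: early programmers simply wrote the right-hand column directly, and a two-pass assembler's main extra service was resolving addresses of labels automatically.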

  • @GeoNeilUK
    @GeoNeilUK 12 years ago +3

    Possibly, though today's computers work in much the same way as they did in this film: disk caching, multitasking; about the only things missing are threading and multicore.

  • @moopet8036
    @moopet8036 5 years ago +1

    OMG the Twilight Zoney music at the end is creepy as hell.

  • @dragonheadthing
    @dragonheadthing 15 years ago +1

    Thank you for posting this!

  • @Kg277
    @Kg277 12 years ago +2

    My left ear got pulled back in time.

  • @gonzobrains
    @gonzobrains 13 years ago +4

    this is a great find!

  • @anonydun82fgoog35
    @anonydun82fgoog35 2 years ago +1

    And then someone said hey, you know we could sell CPU time to researchers and people would pay for it - we will make so much money! And then researchers after a few years said "Wait a minute, how come we have to pay so much for use of a computer and wait in line for weeks or months? Wouldn't it be nice if everyone had their own small computer?" And thus, the personal computer was invented. This worked fine for a while, and then someone else said "Hey, but how about we stick a bunch of computers together, call it "The Cloud", and sell CPU time and people will pay for it... we will make so much money!"

  • @fjashfcsahds
    @fjashfcsahds 2 months ago

    Soo cool. The beginning of the terminal!

  • @thisismyname007
    @thisismyname007 11 years ago +3

    "The opponents are the people at the consoles who need quite a bit of time to figure out what the computer has done to them." LOL

    • @ShazzPotz
      @ShazzPotz 4 years ago

      I found the word opponents jarring. I think he should have said "players" instead.

  • @maxanderson9187
    @maxanderson9187 4 years ago +1

    Back when reporters had the capability for evaluation and rational thought beyond "reuters told me"

  • @quintopia
    @quintopia 13 years ago +2

    John Finch seems to be pretty smart. I want to see the episode about the Trieste now. That was the first bathyscaphe to reach the bottom of the Mariana Trench.

  • @pyak6761
    @pyak6761 6 years ago

    So crazy how this actually still makes sense to me today as a grad student ...

  • @joojoojeejee6058
    @joojoojeejee6058 6 years ago +4

    Supervisor = Operating System.

  • @needlove1982
    @needlove1982 11 years ago +3

    There is a book that describes this in detail; it's called "Bit by Bit: An Illustrated History of Computers."

  • @donmoore7785
    @donmoore7785 5 years ago

    I believe timesharing and BASIC were developed at the same time. An easy-to-use language was needed for the masses now able to easily access computers.

    • @jeffhartman7000
      @jeffhartman7000 5 years ago +1

      Don Moore - BASIC was developed at Dartmouth by John Kemeny's and Tom Kurtz's group in 1964. They developed a practical way to make timesharing work on a GE computer.

  • @tommylopez8338
    @tommylopez8338 5 years ago

    So nice to see someone writing in cursive.

  • @JacGoudsmit
    @JacGoudsmit 11 years ago +3

    Too bad they don't really go into any detail about things like interrupts, storing the program state, reading input and sending it to the right program, etc.
    Really, the most important benefit to users back then is demonstrated when he types "sure" instead of yes or no. In a world of batch processing, being in a situation where making a simple mistake like that is not going to cost you hours of time must have been a glorious prospect.

    • @awuma
      @awuma 6 years ago

      Yes, I noticed this right away, too, but I guess that's a whole extra level lower on the abstraction ladder. Still, the effects were correctly described. An aside: the original IBM PC BIOS was incredibly bad; its serial ports did not work on interrupts. Fortunately, a couple of good handlers could be found on the mid-1980s Internet, or programs bought. This allowed me to timeshare my PC with an assistant.
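The mechanism the film glosses over, and which the comment above asks about (interrupt the running program, save its state, resume another user, repeat), can be sketched as a round-robin supervisor over simulated user programs. The `Program`, `supervisor`, and `quantum` names below are illustrative, not taken from any real system; real state saving means registers and memory maps, here it is reduced to a counter.

```python
from collections import deque

class Program:
    """A simulated user program: its whole 'saved state' is one counter."""
    def __init__(self, name, steps):
        self.name = name
        self.remaining = steps   # work left to do
        self.counter = 0         # state preserved between time slices

    def run(self, quantum):
        """Run for at most `quantum` steps, then yield back to the supervisor."""
        steps = min(quantum, self.remaining)
        self.counter += steps
        self.remaining -= steps
        return self.remaining == 0  # True when the program has finished

def supervisor(programs, quantum=3):
    """Round-robin scheduling: each program gets a short slice, then goes to
    the back of the queue, so every user sees steady progress and none can
    monopolize the processor."""
    queue = deque(programs)
    order = []                       # which program ran in each slice
    while queue:
        prog = queue.popleft()       # "restore" the next program's state
        finished = prog.run(quantum)
        order.append(prog.name)
        if not finished:
            queue.append(prog)       # "save" it and put it back in line
    return order

jobs = [Program("alice", 5), Program("bob", 2), Program("carol", 7)]
print(supervisor(jobs))  # ['alice', 'bob', 'carol', 'alice', 'carol', 'carol']
```

The interleaved order is the whole trick: at human typing speeds, each console user perceives the machine as entirely their own, exactly as the film demonstrates.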

  • @jaworskij
    @jaworskij 5 years ago +1

    The IBM 7090 was one of the first solid-state (fully transistorized) computers.

  • @gfixler
    @gfixler 9 years ago +3

    21:50 - 22:30 Discovery of the goodness of REPLs.
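What that segment of the film shows, typing an expression and getting the answer back immediately instead of resubmitting a batch job, is the read-eval-print loop every modern language shell still uses. A minimal sketch, restricted to arithmetic on the parse tree for safety (the `evaluate` and `repl` helpers are hypothetical names, not from any real shell):

```python
import ast
import operator

# Binary operators the toy evaluator understands.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def evaluate(expr):
    """Evaluate a simple arithmetic expression via its parse tree."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval"))

def repl():
    """Read-eval-print loop: the interactivity that timesharing made possible."""
    while True:
        line = input("> ")              # read
        if line.strip() == "quit":
            break
        try:
            print(evaluate(line))       # eval + print, then loop
        except (ValueError, SyntaxError) as exc:
            print("error:", exc)        # a typo costs seconds, not a lost batch run

print(evaluate("2 + 3 * 4"))  # 14
```

The error branch is the point the film's demo makes with "sure": an interactive system can reject bad input and ask again on the spot, instead of returning a failed deck hours later.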

  • @inquisitive871
    @inquisitive871 12 years ago +4

    Fernando Corbato is still alive.

  • @LucasStoller
    @LucasStoller 7 years ago +1

    Great video! Every programmer must see this!

  • @ARichardP
    @ARichardP 2 years ago

    9:28 “Eventually we’d like to see graphical display on this kind of a console. There are technical problems still, but … .”

  • @paddle_shift
    @paddle_shift 4 years ago

    IBM used the term TSO for Time Sharing Option. It was part of ISPF mainframe o/s. ISPF, which was and still is shipped with their operating systems, stood for Interactive System Productivity Facility.

    • @rdjnova
      @rdjnova 2 years ago +1

      No, ISPF was not the operating system, it was basically a text editor with some features approximating IDEs (Integrated Development Environments) to come, such as Eclipse, though much simpler than those. TSO was a subsystem of the OS/360 operating system which was ordinarily batch-oriented.
      In addition to TSO, IBM developed CICS, the Customer Information Control System, which behaved like a time sharing system but only for end-user interactions with specific application programs; and VM/370 (née CP67) with CMS, the Conversational (née Cambridge) Monitor System, a single-user monitor that depended on the underlying virtual machines of the Control Program to manage time sharing and other shared resources.

    • @paddle_shift
      @paddle_shift 2 years ago

      @@rdjnova Sorry, you are correct. I should have read my response better and added proper punctuation. I meant to say that TSO was really part of ISPF that was used to communicate with the mainframe o/s, such as Unix or System/360

  • @raf.nogueira
    @raf.nogueira 6 years ago +1

    I wish I could have been in that place at that time... Sadly I'm not that good with maths, but I love programming.

  • @rediculousman
    @rediculousman 4 years ago

    Need to make the sound mono.
    It's currently stereo, with the soundtrack assigned to the LH speaker.

  • @Skweez
    @Skweez 5 years ago

    Incredible historical video, thanks!