Additional Processors - Computerphile

  • Published: 11 Jan 2025

Comments • 148

  • @AcornElectron
    @AcornElectron 6 years ago +41

    Yay, in between Christmas and new year computerphile!

    • @Paltse
      @Paltse 6 years ago

      Are you talking about the commercial Christmas, or the one that ends in January?

  • @lawrencedoliveiro9104
    @lawrencedoliveiro9104 6 years ago +6

    8:56 One of those “couple of other things” in the Amiga was the Copper. This implemented display lists, which allowed it to do fancy animation effects without actually needing to blit large arrays of pixels around. Now *that* was impressive.

  • @dipi71
    @dipi71 6 years ago +2

    My MegaST4 has a Blitter chip. Mad props for mentioning it. Cheers!

  • @Ojisan642
    @Ojisan642 6 years ago +16

    Back when generating fractals on the computer was a new thing to do in the 80s, there was a DOS program called FRACTINT where the “INT” part was because it could do fractal calculations relatively quickly without a coprocessor, using just integer maths. Having an FPU was a luxury.

    • @superdau
      @superdau 6 years ago +2

      Ooh, yeah I was randomly zooming into fractals without any special reason with that program. But I think I either ran it on an Amiga (don't know if it was available for that, can't remember) or a Pentium 60 otherwise. ALL THAT COMPUTING POWER!

    • @gordonrichardson2972
      @gordonrichardson2972 6 years ago +1

      Yep, flashback from the 1980s!

    • @TheTurnipKing
      @TheTurnipKing 6 years ago +2

      @@superdau Fractal generation was an entire genre of software. While I'm not sure that Fractint itself was available, there were certainly equivalents: Vista and Mandelbrot, for example.

    • @MrGustaphe
      @MrGustaphe 6 years ago +3

      Took a course in chaos theory at Uni, and when my father heard about it he proudly handed me a book on fractals written very much in the 80's. "Obviously, graphics like these couldn't be produced on a home computer, but require large amounts of time on a university computer cluster" *(128x128 pixel koch snowflake)*

    • @Ojisan642
      @Ojisan642 6 years ago

      David Gustavsson FRACTINT didn’t come out until the very late 80s, so that would have been absolutely true at the time. Any chance the book you’re referring to was the James Gleick one?
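
The integer-only trick the FRACTINT thread above describes can be sketched in a few lines: iterate the Mandelbrot map using fixed-point integers instead of floating point, so no FPU is needed. This is a simplified illustration, not FRACTINT's actual code; the 16.16 format and the function name are assumptions made for the example.

```python
# Mandelbrot escape-time iteration in 16.16 fixed point:
# every value is an integer holding (real value * 2**16).
SHIFT = 16
ONE = 1 << SHIFT

def mandel_iters(cx, cy, max_iter=100):
    """cx, cy are fixed-point coordinates; returns the escape iteration count."""
    x = y = 0
    for i in range(max_iter):
        x2 = (x * x) >> SHIFT          # rescale after each multiply
        y2 = (y * y) >> SHIFT
        if x2 + y2 > 4 * ONE:          # |z|^2 > 4 means the point escaped
            return i
        x, y = x2 - y2 + cx, ((2 * x * y) >> SHIFT) + cy
    return max_iter

# (0, 0) never escapes; (2, 2) escapes almost immediately.
```

On a real 16-bit machine the multiplies would also have to be kept within register widths, which is part of why FRACTINT's inner loops were hand-tuned assembly.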

  • @gordonrichardson2972
    @gordonrichardson2972 6 years ago +21

    I clearly remember using the Intel 8087 co-processor in the 1980s, and it made a huge difference to the speed of floating point arithmetic. It was quite an expensive add on...

    • @TheTurnipKing
      @TheTurnipKing 6 years ago +3

      It's interesting, because it doesn't really act like a co-processor so much as an extension to the actual CPU.

    • @gordonrichardson2972
      @gordonrichardson2972 6 years ago +6

      The Intel 486 was the first to have a built-in co-processor. Not all chips had this feature: in the SX version it was either absent or disabled after failing manufacturing tests. All subsequent Intel chips (Pentium onwards) have a built-in FPU.

    • @lawrencedoliveiro9104
      @lawrencedoliveiro9104 6 years ago +13

      @@gordonrichardson2972 The 486SX was a completely cynical marketing ploy. It lacked the (functioning) floating-point hardware, but you could buy a 487SX chip to add this. However, the 487SX was more than just a floating-point coprocessor: it was actually a complete functioning CPU. Plugging it in caused the 486SX to turn itself off and hand over all functions to the 487SX!

    • @gordonrichardson2972
      @gordonrichardson2972 6 years ago

      Long before the Spectre security vulnerability, there was this terrible fear that the Pentium FDIV bug could produce a wrong answer in your calculations (maybe a one in a million chance if you were a serious number cruncher). The shock and horror caused Intel a pre-tax charge of $475 million!? Edit: Only very early versions of the Pentium; the P4 is fine. It's kinda confusing that we had the 486, then Pentium P4...

    • @AndersJackson
      @AndersJackson 6 years ago

      It was expensive, but leaving the FPU out also reduced the number of transistors on the chip, so the yield was much better and customers got a better price. As transistor counts kept increasing, the cost of adding the FPU to the chip eventually became relatively small.

  • @hrnekbezucha
    @hrnekbezucha 6 years ago +13

    You know what would be cool? A high-level look at some CPU architectures. Like, even today if you want to license an ARM core, you still get asked whether you need floating-point operations to be done in one clock cycle or whether you can get away with them taking longer.

    • @UpcycleElectronics
      @UpcycleElectronics 6 years ago +4

      In the context of this channel they should do an abstract overview of all the different MPU/MCU/CPU/PLD stuff used in hardware design courses over the years. That would be really interesting for the plebs that haven't attended, and could be nostalgic for those that have.

    • @hrnekbezucha
      @hrnekbezucha 6 years ago

      Assuming there are, of course, any hardware-focused teachers. I don't know how Nottingham Uni is structured; it may be all history/software for all I know. Also, MCUs aren't exactly what we imagine a computer to be, but they're extremely interesting little devices. And firmware engineers are few and sought after. There are so many great topics to cover.

    • @UpcycleElectronics
      @UpcycleElectronics 6 years ago

      @@hrnekbezucha
      Yeah, I know I'm mixing interests. I am currently goofing around with vintage MPUs for the first time, but my primary interests are MCUs and PLDs :)

  • @AnimilesYT
    @AnimilesYT 6 years ago +3

    3:00 I love the way you say "pretty much" :D

  • @FriedEgg101
    @FriedEgg101 6 years ago +4

    I remember my dad fitting a maths co-processor to our 486 SX25. I think I remember him saying that this made it a DX2 50. I was 10 at the time so didn't really understand.

  • @borisdorofeev5602
    @borisdorofeev5602 6 years ago +6

    That high society "prrretty much" at 3:00

  • @lawrencedoliveiro9104
    @lawrencedoliveiro9104 6 years ago +2

    6:09 Actually that was done using fixed-point (scaled-integer) calculations. Which were quite a pain to work with, but saved on floating-point hardware.
    As the hardware cost dropped, eventually it got to the point where the saving from leaving floating point out was negligible compared to the cost of programming without it. This was about the early-to-mid-1990s.

    • @gordonrichardson2972
      @gordonrichardson2972 6 years ago +1

      The first Intel 8087 co-processor had a total of 45,000 transistors, whereas today's CPUs (Pentium onwards) have millions, which shifts the cost ratio considerably.

    • @rasz
      @rasz 6 years ago

      It was never about "savings". Intel simply liked charging double. You have to remember their games with first 486SX being locked DXes, or 487SX. Competition forced them to end it.

    • @lawrencedoliveiro9104
      @lawrencedoliveiro9104 6 years ago

      @@rasz It was never just about Intel. Unix workstations used quite a wide variety of chips then -- MIPS, PowerPC, Alpha, SPARC, HP-PA etc.
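
The scaled-integer ("fixed point") approach described in this thread can be made concrete with a tiny sketch. The scale factor of 1000 and the helper names are assumptions for illustration; real code would pick a power of two so rescaling becomes a cheap shift.

```python
# Scaled-integer arithmetic with a scale of 1000:
# the value 1.234 is stored as the integer 1234.
SCALE = 1000

def fx(v):
    return round(v * SCALE)        # encode a real number

def fx_mul(a, b):
    return a * b // SCALE          # product is double-scaled, so rescale down

def fx_div(a, b):
    return a * SCALE // b          # pre-scale the dividend before dividing

a, b = fx(1.5), fx(0.25)
assert fx_mul(a, b) == fx(0.375)
assert fx_div(a, b) == fx(6.0)
```

The "pain" the comment mentions is visible even here: every multiply and divide needs a manual rescale, and production code must also watch for overflow and rounding.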

  • @bgoggin88
    @bgoggin88 6 years ago

    Sean is such a great videographer. I hope Brady is proud.

  • @RealCadde
    @RealCadde 6 years ago +13

    "I am using my keyboards CPU to gain a 5% boost in bitcoin mining speed"
    - Some dood from the 90's, if bitcoin had been a thing back then.

  • @asgerms
    @asgerms 6 years ago +1

    Anybody remember the Weitek processors for PCs? The early versions of 3D Studio had support for them. Always wondered what sort of performance boost they could deliver (relative to the x87).

  • @lawrencedoliveiro9104
    @lawrencedoliveiro9104 6 years ago +1

    8:24 The trouble with the hardware blitters in the Amiga and Atari ST is that they were never as powerful or flexible as software. The Macintosh had its QuickDraw graphics routines, which implemented sophisticated drawing modes and nonrectangular clipping. When you tried to build similar functionality on top of a more limited hardware blitter, the performance would often end up worse than if it was all done in software.

  • @totlyepic
    @totlyepic 6 years ago +15

    A little bummed that he didn't even mention any modern co-processors/accelerators.

  • @mtbrain1
    @mtbrain1 6 years ago +1

    Aww, I was hoping for an episode on co-processors and accelerators

  • @FireDragonAndromeda
    @FireDragonAndromeda 6 years ago +3

    With some Amiga models (not all of them) it was possible to add in an expansion board with a new processor that effectively took over as the main CPU... so you could get an accelerator board for the Amiga 1200, for instance, with a 68040 or 68060 processor that took over from the main 68EC020 chip.

    • @NicolaiSyvertsen
      @NicolaiSyvertsen 6 years ago

      Only the PC rivaled the Amiga in expandability. People didn't really go all out on expanding Amigas until after 1992.

    • @TheTurnipKing
      @TheTurnipKing 6 years ago

      There were ways developed to hardcore upgrade most Amigas, even the 600, a feat which involved fitting a PLCC socket upside down to the surface mount 68000 and basically bolting it to prevent it coming loose.

  • @Zadster
    @Zadster 6 years ago +2

    Aw, I was hoping it was going to look at the "sideways" CPU expansions for the BBC Micro too. 6502, Z80, 68000, 32016, ARM...

  • @SeanClarke
    @SeanClarke 6 years ago +2

    Amiga! The best computers I've used.

  • @samueldevulder
    @samueldevulder 6 years ago +7

    These are (were) called co-processors. Amiga has lots of them! Amiga rules (well, used to rule :-/ )

  • @wp5355
    @wp5355 5 years ago

    Excellent description!!

  • @SimGunther
    @SimGunther 6 years ago +9

    Any chance that we'll ever get a video on management engines and how we can safely use our computers without a management engine? Seems like all the major CPUs have them, but there's zero point in having them.

    • @modolief
      @modolief 6 years ago +2

      This is a _very_ serious topic and one I'd love to see Computerphile tackle.
      The Intel Management Engine (Intel ME) is a very insidious piece of hardware and very difficult to extricate from your (Intel) computer -- and AMD has an equivalent beast, so no easy out there. Maybe computers built around ARM chips are an alternative? There is ME Cleaner for the Intel situation -- maybe you guys could do a demo of that.

    • @SimGunther
      @SimGunther 6 years ago

      @@modolief ARM has a similar engine called TrustZone, but that's for their chips with speculative execution. Here's to future success of RISC-V and the Mill processor.

    • @modolief
      @modolief 6 years ago +1

      @@SimGunther Wow, the Mill processor! I thought nobody knew about that.

  • @moccaloto
    @moccaloto 4 years ago

    Would there be any benefits to moving some of the more bloaty/specialized CPU instructions to a separate accelerator chip/card? For instance, I'm guessing that normal consumer PCs do not necessarily need HW-accelerated hashing. Maybe they don't even really need AVX. Would removing such functions free up chip area? And could that chip area be used to somehow speed up the more commonly used instructions?

  • @heidirichter
    @heidirichter 6 years ago +1

    I was hoping for mention of machines such as the Commodore 128 (6502/Z80), or machines such as the BBC Micro or Amiga/Macintosh with x86 hardware add-ons (so 680x0/80x86), where there was more than one general purpose CPU in the system that could operate simultaneously, and thus the issues that could come up in such a system, such as bus contention, and how this problem was dealt with in hardware or software on those systems. Any chance of a video about this in the future please?

    • @rasz
      @rasz 6 years ago

      The C128 was a trash fire; you couldn't run both CPUs simultaneously. The C128D was even worse: you paid almost full Atari ST price for a system with 3 CPUs, but only 1 ever working at any given moment.

    • @heidirichter
      @heidirichter 6 years ago

      @@rasz Ahh, but even if you can't use more than one CPU at a time in the 128, it's still an interesting case to discuss, because of the difference in opcodes and method of switching control, to name just two. In the case of the Amiga with the Janus library, that offered some very interesting possibilities that I don't think were ever really explored as much as they could have been. And I understand some Apple machines were sold with a full 80486 - the Power Macintosh 6100 DOS Compatible is one of them, and those are interesting machines for such a discussion as well.

  • @marsgal42
    @marsgal42 6 years ago

    I remember being in awe of the NeXT workstations with a 68030 CPU and a 56001 DSP.
    I now maintain legacy systems that combine a 68060 derivative with an ADSP-2181 for I/O. Seriously idiosyncratic assembly language on those things...

  • @9a3eedi
    @9a3eedi 6 years ago

    A modern example I have played around with for a while is the Nintendo DS. It had both an ARM7 and an ARM9, each having different speeds and capabilities. You mainly would program on the ARM9, but the ARM7 acts like a coprocessor for things like wifi, etc.

    • @TheTurnipKing
      @TheTurnipKing 6 years ago

      IIRC it was actually there for GBA compatibility. So, in design it's actually something like the Z80 in the Megadrive, which was there partially for backwards compatibility with the Master System.

    • @9a3eedi
      @9a3eedi 6 years ago

      @@TheTurnipKing pretty much. Smart design if you ask me.

  • @suleimanmustafa1473
    @suleimanmustafa1473 6 years ago +9

    Are GPUs considered as additional processors?

    • @bbugl
      @bbugl 6 years ago +10

      absolutely

    • @sammbo250
      @sammbo250 6 years ago

      Are they like FPUs, except it's matrix arithmetic instead? Effectively an MAU, a Matrix Arithmetic Unit.
      I don't know whether graphics cards do more than matrices (what a torturous life that must be).

    • @TheTurnipKing
      @TheTurnipKing 6 years ago +3

      Honestly, they're more like complete, dedicated sub-computers.

    • @FrankHarwald
      @FrankHarwald 6 years ago

      GPUs mostly do both floating point & integer matrix-vector arithmetics of small fixed arities plus a couple of specialized perspective rasterization pipelines.

    • @Keldor314
      @Keldor314 6 years ago +2

      Modern GPUs are, for all practical purposes, CPUs. The difference between a CPU and a GPU today is that CPUs are optimized for running code on a small number of threads (10's), while GPUs are optimized for running code on a very large number of threads (100's of thousands).
      The reason they're separate is that these uses require very different microarchitecture under the hood to run fast. A modern CPU throws huge amounts of hardware resources at making a single thread run 10% faster. This is extremely wasteful if you're concerned with running code with many, many threads.
      GPUs tend to also have more "exotic" specialized hardware units internally than CPUs have. Things like triangle rasterizers, texture units, ROPs, and very recently even specialized BVH traversal units for ray tracing. These would have been fast enough for shader software, but developers keep throwing more and more work at them as graphics move forward.
      CPUs tend to have much larger caches than GPUs. This is largely because cache misses hurt CPUs much more than GPUs, since an access all the way to RAM incurs 100's of clock cycles of latency before the results come back. A GPU will just switch to another thread while waiting. A CPU often doesn't have this luxury because there might not even BE another thread to switch to.

  • @brianmiller1077
    @brianmiller1077 6 years ago

    The Nextstation had a Digital Signal Processor (DSP) for handling multimedia functions like voicemail attachments in an e-mail (If you had too many ahhs and umms in your voicemail, you could edit it before you sent it!). That and 32 bit color (8 bits for Red, Blue and Green + 8 bits for transparency) made it really hard to go back to DOS and CGA graphics.
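
The 32-bit color layout mentioned above (8 bits each for red, green, and blue, plus 8 bits for transparency) can be illustrated by packing the four channels into a single integer. The ARGB byte order used here is an assumption for illustration; the actual channel ordering on the NeXT hardware may differ.

```python
# Pack four 8-bit channels (alpha, red, green, blue) into one 32-bit word.
def pack_rgba(r, g, b, a=255):
    return (a << 24) | (r << 16) | (g << 8) | b

# Recover the channels by shifting and masking.
def unpack_rgba(p):
    return ((p >> 16) & 0xFF, (p >> 8) & 0xFF, p & 0xFF, (p >> 24) & 0xFF)

assert unpack_rgba(pack_rgba(10, 20, 30, 40)) == (10, 20, 30, 40)
```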

  • @charlesmayberry2825
    @charlesmayberry2825 6 years ago

    Fun fact: if you're doing CG rendering and you have an AMD Threadripper while your graphics card is "average" (an AMD RX 560 - average low-end), using the CPU to render rather than the GPU actually yields a faster result. The CPU in this case is astoundingly fast, and any decent rendering or graphics-baking software can use every thread handed to it. It illustrates the point made in the video: handing work to dedicated hardware becomes redundant when the CPU can do it faster. The advantage of handing it to the GPU is that the machine can do other tasks while rendering, albeit a little slower as it communicates with the GPU periodically; if you set the CPU to process it, it will get done faster, but the system will feel very, very sluggish as it dedicates most of its cycles to that software. I just thought it was an interesting point that can still be illustrated in the modern era if your system hardware isn't balanced and one chip is way faster than the other. (This machine's primary job is crunching numbers and audio processing, so the GPU is almost an afterthought.)

  • @Eric_D_6
    @Eric_D_6 6 years ago +1

    But how many flops did that early flop unit flop if it could flop in flops?

    • @gordonrichardson2972
      @gordonrichardson2972 6 years ago

      Early co-processors like the 8087 worked in milli-flops (joke) compared to today's CPUs.

    • @TruthNerds
      @TruthNerds 5 years ago

      A couple hundred kiloflops or low megaflops. With software emulation, only a few kiloflops.

  • @nO_d3N1AL
    @nO_d3N1AL 6 years ago +1

    Next, GPGPU videos please!

  • @jhomenchak
    @jhomenchak 6 years ago

    I would love to see Computerphile do an episode about computer/internet operating speeds and the relationship between hardware and software. And I've never quite understood what firmware is. Why do store-bought PCs slow down so soon after buying?

    • @bradburyrobinson
      @bradburyrobinson 6 years ago

      When you buy a PC new, it's going to be running as pure and untouched as possible, aside from any bloat that the manufacturer has added. You then get it home and add all of your peripheral drivers and software, install the dozens of software packages that you use, download a couple of browsers and then a handful of games. All of these things take up the resources (CPU/RAM) that were previously free when you initially bought the machine. Modern computers are abundant in resources and timescales may only be milliseconds, but each one of these things takes time and effort to 'do something', which is most likely the slowdown that you see.

  • @naami2004
    @naami2004 6 years ago +57

    3:00 lol what was that !

    • @YLKRubiks
      @YLKRubiks 6 years ago +1

      Yay A cuber

    • @andrybak
      @andrybak 6 years ago +4

      And similar thing for "blitter chip" at 07:05, even though we hear the same word at 07:03 just fine.

    • @chris_tzikas
      @chris_tzikas 6 years ago +2

      Prrrety

  • @RealRobotZer0
    @RealRobotZer0 6 years ago +7

    Only Dave from EEVblog has an old laptop with a POPULATED socket for an FPU. (Of all the videos I have seen.)

    • @rasz
      @rasz 6 years ago

      All of the collectors put one in, just because.
      In reality only specialized software used the FPU pre-~1994. The first non-scientific/business "program" I encountered that used one was Magic Carpet (a game), and then Quake dropped in 1996.

  • @Furiends
    @Furiends 6 years ago

    Uniform memory access architecture should be seen as an ideal along with the generalization of multi-processing parallel computing. Ideally general purpose programming could send tasks to both CPU and GPU as resources are used. One issue with this is the need for Just In Time compiling for all programs running on the system. This means that games developed with Shader programs simply send their programs to run on the GPU while the OS can recompile and move a program to the GPU that is either better fit or to offload the CPU. If then any part of the GPU still needs dedicated memory that part of the GPU should be treated as a sub-processor with highly specific functions and infrequent reprogramming. The rest is unified and as such general purpose programs can be parallelized between processors since they share memory.
    Some processors are interdependent. Like the keyboard CPU is the only one that can get interrupts for the physical keys nor would unified memory make a lot of sense for a peripheral. In this sense the CPU on the device is really a sub-processor which might be general purpose for development purposes since microcontrollers are so accessible. But its functionality is slave to the system. It doesn't implement general purpose processing or programability. This is also true of much older GPUs or the display controller hardware on dedicated cards or as part of GPU chipsets.

  • @lawrencedoliveiro9104
    @lawrencedoliveiro9104 6 years ago +1

    7:35 Presenter points finger at one of an array of a dozen nondescript-looking chips. Audience nods sagely.

  • @wcarlin
    @wcarlin 6 years ago +1

    I was hoping you would get into something more interesting like computers that actually had 2 CPUs, like the Fujitsu FM-7 with 2 MC6809s or the Sega Genesis/MegaDrive with its MC68000 and Z80. Perhaps in another video.

    • @rasz
      @rasz 6 years ago

      or C64 + Commodore 1541 combo used as a dual CPU system.

  • @Monothefox
    @Monothefox 6 years ago

    The Intel 286/386/486 used to have a coprocessor called -87. What was that about?

    • @TruthNerds
      @TruthNerds 5 years ago

      This *was* the FPU (floating-point coprocessor). The 80486 was the first generation IIRC which had the FPU integrated on a single die, but only for the "DX" version. There was also an "SX" version without built-in floating-point support, and a 80487SX coprocessor.
      With the Pentium, Intel finally chose to make floating-point support a full, on-die feature of every new processor.

    • @TruthNerds
      @TruthNerds 3 years ago

      @Jeff Jibson Interesting, thanks… I never even heard of Weitek before. No wonder that company went under if the sales were that bad.

  • @Dust599
    @Dust599 6 years ago

    Math co-processors, I remember them well... installed several CAD labs with them... those were the days.

  • @skilletpan5674
    @skilletpan5674 6 years ago +1

    Why not just call them what most people called them back then? Co-Processors ? (I might have the spelling and hyphen wrong.)

  • @hman2875
    @hman2875 6 years ago

    Didn't expect an upload, but that's okay

  • @robertlinke2666
    @robertlinke2666 6 years ago

    Could you make a converter for the FPU slot to accept an 8088?
    That'd be cool

    • @TheTurnipKing
      @TheTurnipKing 6 years ago

      Not really. It's designed specifically to accept the matching FPU. It's best to think of the 8088 and FPU combo as one processor cut into two parts.
      If you wanted to add a second processor to an XT-class machine, you'd probably want to design a card to fit into the CPU socket... or on the PC bus itself.

    • @robertlinke2666
      @robertlinke2666 6 years ago

      @@TheTurnipKing well, unfortunately I'm not smart enough for PCB design XD.

  • @garethdean6382
    @garethdean6382 6 years ago +2

    YOU MUST CONSTRUCT ADDITIONAL -PYLONS- PROCESSORS.

  • @NotMarkKnopfler
    @NotMarkKnopfler 6 years ago +3

    How about a video on fixed-point arithmetic as a poor-man's alternative to floating-point?

  • @World_of_OSes
    @World_of_OSes 6 years ago +4

    Why do Computerphile videos start off with an out of context clip from the middle of the video?

  • @sblop
    @sblop 6 years ago +1

    Thanks for great videos.

  • @MatthewSuffidy
    @MatthewSuffidy 6 years ago

    I thought this was going to be about the mixed performance of various tasks on multi core computers, not a history of sort of multi processing.

  • @phoenix2464
    @phoenix2464 6 years ago

    Dr Steve Bagley takes us on a trip into an FPU, next video please ?

  • @xSCOOTERx2
    @xSCOOTERx2 6 years ago

    You must construct additional processors!

  • @Chuek-WaiTai
    @Chuek-WaiTai 6 years ago

    What about i860?

    • @gordonrichardson2972
      @gordonrichardson2972 6 years ago +1

      The i860 was not very successful by most accounts, except for very specialised applications, such as signal processing and aerospace.

  • @100radsbar
    @100radsbar 6 years ago

    You should invest in a stand for the camera; the wobbling and swaying of the camera is annoying.

  • @luisaisat
    @luisaisat 6 years ago +16

    put subtitles please

    • @Computerphile
      @Computerphile 6 years ago +5

      YouTube community subtitles are switched on to allow the community to help
      subtitle the films. Sadly this means the automatic subs don't show.
      Perhaps go into community subs and look there? >Sean
      Once someone starts work on this the subtitles will be here:
      ruclips.net/user/timedtext_video?v=CDpL9wOQcus&ref=share

  • @Petertronic
    @Petertronic 6 years ago

    Dr Steve needs a bigger office!

  • @DrRChandra
    @DrRChandra 6 years ago +1

    Blitter always seemed to me like a rudimentary GPU

    • @TheTurnipKing
      @TheTurnipKing 6 years ago

      Most home computers employed a rudimentary GPU to free up the processor from the quite onerous task of generating a video signal. Some of the very early, cheaper systems did not, such as the ZX80.
      The problem is that generating a video signal is not a task that can be interrupted, so ideally you need something working in parallel with the CPU so as to leave your CPU free to work.
      The fun thing is that this, not the CPU speed, is ultimately responsible for most of the distinctive video limitations of a system.

  • @Nagria2112
    @Nagria2112 6 years ago

    the camera angle is awfully low

  • @confidencial7964
    @confidencial7964 6 years ago +2

    Happy 7E3 to you all, or should I say happy 11111100011?

    • @JR-mk6ow
      @JR-mk6ow 6 years ago +1

      FE3? Did you switch the first bit?

  • @rapscallion3506
    @rapscallion3506 6 years ago

    What would he do without his two hands to help him speak?

  • @josephpeters5681
    @josephpeters5681 6 years ago

    Sounds like techies always knew the workload would always increase. Cool

  • @roblinea6
    @roblinea6 6 years ago +1

    is this channel for english-native-speakers only ?

    • @yearswriter
      @yearswriter 6 years ago

      Russian here, have no problems

    • @jhomenchak
      @jhomenchak 6 years ago +1

      It's for any person who loves computers. I believe it is filmed at a university in Great Britain. Not sure where, though.

    • @frazer26
      @frazer26 6 years ago +1

      Nottingham

    • @roblinea6
      @roblinea6 6 years ago

      I love this channel, but sometimes I have problems with the very fast speaking and with no subtitles. The start of the video is really too fast for an average non-native English speaker.

  • @Locut0s
    @Locut0s 6 years ago +2

    Christmas Pudding Mug Mix

  • @_DarkEmperor
    @_DarkEmperor 6 years ago +1

    When i was kid, i learned how to program ANTIC, which is graphic processor of Atari 65XE.

  • @Architector_4
    @Architector_4 6 years ago +1

    His shirt looks like Minecraft bedrock.

  • @TechXSoftware
    @TechXSoftware 6 years ago

    These days you can do those calculations on ya watch :)

  • @YouPlague
    @YouPlague 6 years ago

    A CPU of that subsystem... yeah, no. It's not a CPU then.

  • @arongooch
    @arongooch 6 years ago

    Amiga!!

  • @ice_wallow_come1813
    @ice_wallow_come1813 6 years ago

    try making a game with only 40 kilobytes, like an NES game

  • @clangerbasher
    @clangerbasher 6 years ago

    Built like a tank or built properly........... ;)

  • @potatok123
    @potatok123 6 years ago +2

    2:24
    *_ThAt's a LoT Of DAmAGe_*

  • @ThePositiev3x
    @ThePositiev3x 6 years ago

    Is this English?

    • @late7245
      @late7245 6 years ago

      Kubilay Yazoğlu he's English anyway

  • @rsnilssen
    @rsnilssen 6 years ago

    I hope all of these videos were done on the same day; either that or Steve has a very boring wardrobe. Nice shirt though.

  • @KosmoGamesKill
    @KosmoGamesKill 6 years ago

    You guys picked a bad upload time, between hundreds of CCC videos

  • @U014B
    @U014B 6 years ago

    Is Steve diabetic, or does he just genuinely like Diet Coke? (I'm both, just for reference.)

    • @lawrencedoliveiro9104
      @lawrencedoliveiro9104 6 years ago +1

      Phosphoric acid dissolves your teeth!

    • @mandolinic
      @mandolinic 6 years ago

      You don't have to be a diabetic to drink diet drinks. A few milligrams of Aspartame is a far better option than a few teaspoons of sugar, even for a non-diabetic.

    • @TruthNerds
      @TruthNerds 5 years ago

      @@mandolinic You could also drink tea… without sugar. :-)

    • @mandolinic
      @mandolinic 5 years ago

      @@TruthNerds
      Oh sure. I don't take sugar or any sweeteners in my tea or coffee, and I probably drink more of those than soft drinks. But as a non (alcohol) drinker, if I go in a bar or pub, then tea or coffee isn't always on offer, or if it is then it's some gunk from a machine, so I need to know there's a soft drink option for me.

  • @kenichimori8533
    @kenichimori8533 6 years ago

    Additive arithmetic unit (Alphabet)

  • @smiauu
    @smiauu 6 years ago

    Swag

  • @black_platypus
    @black_platypus 6 years ago +1

    What's MAFFS? Is that an acronym?
    ...Oh... You mean math :P

  • @veresur6519
    @veresur6519 6 years ago +4

    Your speech is really hard to understand :ccc

    • @simatbirch
      @simatbirch 6 years ago +2

      chains00 no it ain’t

    • @Scratchci
      @Scratchci 6 years ago +4

      @@simatbirch It really is for a non-native English speaker (at least for me). It would be amazing if they enabled automatically created subtitles.

    • @veresur6519
      @veresur6519 6 years ago

      Of course it's not if your native language is English. Mine is Hungarian, and I have a B2 English language certificate. Another reason is the special vocabulary he uses: although I read a lot on the topic of computer science, there are always quite a lot of unknown words for me in his videos.

    • 6 years ago +2

      chains00 try slowing it down just a little. It will help.

  • @sblop
      @sblop 6 years ago

    First 😁

    • @potatok123
      @potatok123 6 years ago

      Benjamin Buter Petersen
      Actually Gareth Ellis is...