The Transputer: A parallel future

  • Published: 26 Dec 2024

Comments • 797

  • @IvanToshkov
    @IvanToshkov 1 year ago +1055

    There was a joke at the time that when Intel created their new processor, they tried to follow their naming scheme and added 100 to 486, but got 585.99382. That's why they named it Pentium :D

    • @mk72v2oq
      @mk72v2oq 1 year ago +253

      - How many Intel engineers does it take to change a lightbulb?
      - 1.99382, that's enough precision for the answer.

    • @Joshinken
      @Joshinken 1 year ago +133

      If you're curious, the actual reason Intel named it the Pentium was that AMD and Cyrix copied their names, since you can't trademark a number, so Intel gave their processors an actual name

    • @surters
      @surters 1 year ago +108

      @Joshinken Don't ruin a good joke by naming facts!

    • @Clancydaenlightened
      @Clancydaenlightened 1 year ago +71

      Why did the Intel engineer end up in jail?
      For running illegal instructions and possession of illegal opcodes
      Lulz

    • @Carstuff111
      @Carstuff111 1 year ago +16

      I mean, as far as government contracts would be concerned, that is close enough for military grade..... in fact, it's too precise, and Intel could have charged more per chip........ :P

  • @supercompooper
    @supercompooper 1 year ago +358

    I wrote a real-time image processing system on these in grade 12. I used the SCSI bus on a Sun workstation and 8 transputer chips to analyze video from 7 cameras in real time, capturing infrared images of metal cooling at steel mills to detect defects.

    • @wizard-pirate
      @wizard-pirate 1 year ago +49

      In grade 12? Goodness. What are you doing now?

    • @supercompooper
      @supercompooper 1 year ago +50

      @wizard-pirate I've been CTO of a few businesses. Now I own a few and invest in a few. I started coding when I was maybe 11 years old, still code, and am very passionate about computer language development. Went to uni for computing/computer electronics and have pretty good skills there too.

    • @CD3WD-Project
      @CD3WD-Project 1 year ago +13

      @supercompooper can I have a job lol?

    • @oidpolar6302
      @oidpolar6302 1 year ago +6

      What was the camera resolution? Was an ADC involved for the analog camera input, or were they SCSI/IEEE1394-connected cameras?

    • @supercompooper
      @supercompooper 1 year ago

      @oidpolar6302 Each camera was a SCSI target device and another team member handled cameraSCSI. I implemented the SCSI read buffer into the Inmos, where we had dual-ported RAM as the SCSI data-in buffer. I don't recall the exact resolution; I think it was 1024 across. It was more complicated than that, as the cameraSCSI did a convolution of several scan lines and some compression. We were not transferring full frames, more like a convolution of some blocks, and the cameras were treated more like a fax machine or flatbed scanner. The cameras were IR. As metal cooled, if it had an imperfection it would cool at a different rate. The system was set up to detect imperfections and characterize them based on their size and cooling rate, from time sampling across multiple cameras.
      Also, the cameras would write to a pseudo device and timestamp their writes, so I could asynchronously read from my last ring buffer pointer and reassemble the frames of several cameras that way. Was a fun project!

  • @MarkWebbJohnson
    @MarkWebbJohnson 1 year ago +97

    In the '80s I worked with a friend to develop a native 80x86 implementation of occam (called pc-occam). It is a brilliantly simple but powerful language, and so easy to use to write bug-free code. The idea was to make Occam available to students and others who couldn't afford / get access to a transputer system.
    The three things that impressed me most were:
    1) Parallel tasks could run on one transputer or another - but the means of inter-task communication (channels) was identical.
    2) Multi-tasking was built into the transputer hardware based on task switching on the jump instruction. Simple and efficient.
    3) SEQ / PAR. So simple and easy to use.
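
A minimal sketch of those channel and SEQ/PAR ideas in Go, whose channels descend from the same CSP model that Occam was built on. This is not Occam or Inmos code; the producer/consumer names are our own, and an unbuffered Go channel stands in for an Occam channel's rendezvous:

```go
package main

import "fmt"

func producer(out chan<- int) {
	for i := 1; i <= 3; i++ {
		out <- i * i // blocking send: a CSP rendezvous, like an Occam channel output
	}
	close(out)
}

func consumer(in <-chan int, done chan<- int) {
	sum := 0
	for v := range in { // channel input, blocking until the producer is ready
		sum += v
	}
	done <- sum
}

func main() {
	ch := make(chan int) // unbuffered: both sides must meet, as on a transputer link
	done := make(chan int)
	go producer(ch) // PAR: both processes run concurrently
	go consumer(ch, done)
	fmt.Println(<-done) // SEQ: wait for the result, then print 14 (1+4+9)
}
```

As in point 1 above, neither process cares where the other runs; the channel interface is the same whether the goroutines share a core or not.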

    • @cal2127
      @cal2127 1 year ago +4

      Wonder how well it'd run on a modern Ryzen system

    • @paulie-g
      @paulie-g 1 year ago +13

      @cal2127 Not well. Remember that this was a time when memory access latency was much more in line with CPU performance. To get decent performance you'd have to limit memory use to L1 cache only (and there's no way to do so explicitly on x86/x86-64) and then you'd swamp IPI/IPC (which you also can't use explicitly). Modern architectures are simply not designed for Occam's execution model, which isn't particularly efficient to begin with (it 'won', to the extent that it was used, by being a) tractable and b) giving you a lot of compute by the standards of the day with the transputer). OTOH, you could write an emulator, run the whole thing on a single core and get oodles more performance than it had back in the day (such a thing exists as 'the transterpreter' - a backend for an occam compiler, both now sadly abandoned). Incidentally, there are a few similar projects alive today - a wafer-scale supercomputing chip with tightly coupled processor+memory tiles and a switching fabric, and an ARM-based 'neural computing' set-up in Manchester IIRC that is also processor+small memory on a single board interconnected with some sort of ARM bus (not aiming at high performance, just an experiment in mimicking human brain style computation).

    • @drlloyd11
      @drlloyd11 1 year ago

      I used Occam commercially in the 90s. A total dead end of a language.

    • @paulie-g
      @paulie-g 1 year ago +1

      @drlloyd11 In what sense? Popularity in industry? Sure. Syntax seems... inelegant, particularly to people acculturated to Algol-style syntax (Wirth-inspired syntax languages generally suffer from this due to not having a single outstandingly successful language, cf Lua and FreePascal). Conceptually and technically? Very interesting. Unfortunately, the type of problem it is most suited for also typically demands performance and scalability, and Occam as-is doesn't deliver on that front, so even if it were updated to have modern tooling, people would likely use more performant languages and kludge CSP constructs on top if they really wanted them. One can see microservices in that way if one squints. Golang is the unobvious and indirect successor.

    • @drlloyd11
      @drlloyd11 1 year ago +1

      @paulie-g In the sense that its performance was mediocre compared with the other hardware available at the same time.
      It wasn't useful.

  • @PCPSolutions
    @PCPSolutions 1 year ago +71

    This is a really well-timed video. I recently came into contact with a Transputer at the Large Scale Systems Museum in Pittsburgh, PA. They acquired one over the last summer and have been getting it going. What an amazing technology for the time. This was a very informative and well-done video; your layout and production quality are top notch!

    • @EdNutting
      @EdNutting 1 year ago +1

      They’ve got one going again? Nice! Is there a webpage about it or a news article? I’d love to forward it to David May - he’d be so pleased to hear a museum is keeping this stuff alive. He wanted to set up a computing museum in Bristol (UK) but hasn’t quite got there yet.

    • @ralphhyre4973
      @ralphhyre4973 1 year ago +1

      It would be cool to see if there’s a wire-wrapped Warp or CMU Nectar system there.

    • @Anatoliys_Adventure
      @Anatoliys_Adventure 10 months ago +3

      I didn't know Pittsburgh had a museum like that! I'll have to check it out!

    • @PCPSolutions
      @PCPSolutions 10 months ago +2

      @EdNutting Sorry, I had no idea about this comment! No news article, but you can check out the Large Scale Systems Museum in New Kensington, PA; we have one and Dave has it working!

    • @PCPSolutions
      @PCPSolutions 10 months ago

      @Anatoliys_Adventure Check it out! Large Scale Systems Museum in New Kensington. A wide range of really cool old hardware.

  • @valdisxp1
    @valdisxp1 1 year ago +105

    This thing has had some spiritual successors: basically multicore CPUs with on-chip networking and message passing. Tilera made up to 72-core CPUs; Tilera then got bought by EZchip, then EZchip -> Mellanox -> Nvidia. And there is also Parallella with their 16-core Epiphany chips. Tried writing something for the Parallella board back in the ~2010s, but I was a pretty green computer science student. Could not have imagined that there was stuff like it already in the 80s. Cool video!

    • @AnonyDave
      @AnonyDave 1 year ago +5

      There was also XMOS. Haven't heard much from them for a while. For a short while they had a cheap dev board for their chips, but then suddenly they pivoted their designs to weird audiophoolery stuff, and I'm not even sure you can get the software to develop for them anymore 🙄

    • @semicuriosity257
      @semicuriosity257 1 year ago +5

      Sun's GX quad-polygon 3D system influenced NVIDIA's NV1.

    • @garyclouse4164
      @garyclouse4164 1 year ago +9

      There's a US company called Parallax that developed an 8-core 32-bit RISC microcontroller - the P8X32A Propeller - with each of the 8 cores having 2K RAM and shared access to common memory and I/O through a multiplexor.
      Parallax co-developed the chip along with a language called Spin.

    • @0LoneTech
      @0LoneTech 1 year ago

      @garyclouse4164 The Propeller is a rather different beast, though. Transputers were designed to run Occam as directly as possible, while Spin is just yet another interpreted language - with the bytecode interpreter residing in ROM on the Propeller. Parallax were known for the BASIC Stamp, another interpreter on a microcontroller. The Propeller also lacks the off-chip communication links common to the Transputer, XMOS and Epiphany. And while those chips optimized for handling events asynchronously with sleeping threads and event dispatch, Parallax instead went "interrupts sound complicated, let's not - people can poll now they have 8 processors".

    • @12Mantis
      @12Mantis 1 year ago +1

      @AnonyDave I think I have one of those boards. Saw an article on Hack-A-Day about them giving away boards to people with interesting project ideas. I didn't have a project in mind for it, but it sounded interesting, so I sent an email saying as much, and some time later I got one in the mail.

  • @RayBellis
    @RayBellis 1 year ago +18

    Tony Hoare was head of the Oxford University Programming Research Group (in effect the CS department) when I did my Engineering and CS degree there in the late '80s. Part of the CS course was based on his "Communicating Sequential Processes" (CSP) formal language, upon which Occam was built. In later undergrad projects I got to play with quite a few transputers, including ones with audio DACs and 24-bit frame buffers attached. CSP and Occam's communication channels also heavily influenced Google's Go language.

  • @lrochfort
    @lrochfort 1 year ago +174

    In the early 2000s I worked for a division of De Beers designing and building diamond sorting machines.
    One of the projects was to replace a 1980s optical sorting machine that ran on Transputers. I got to program the Transputers as part of a life-extension program whilst we developed the new system. The replacement Intel solution had more horsepower, but the Transputer solution was much more flexible and malleable.
    More recently, I repaired an ICG photographic drum scanner. The custom electronics in that used a bunch of Transputers on an ISA card for processing the image signals.
    I sold some spares to someone in Australia who had a whole cluster running.

    • @stalinvlad
      @stalinvlad 1 year ago +2

      USSR Diamonds USSR Oil Wave flag Putin....keep waving flag

  • @Error6503
    @Error6503 1 year ago +38

    I worked at the Inmos Newport fab during the Thorn EMI years; I wrote test routines, so if you have a duff chip it might be my fault 🙂 The transputer had very few opcodes (they were fundamentally 4 bits long) but it wasn't a RISC chip to anyone except marketing; its instruction set natively implemented multi-threading, which was very complex, in order to allow single-chip parallel algorithms. In my time there we went from successfully introducing 20MHz T800s to failing to introduce 30MHz T800s, with some cool new stuff cancelled along the way. Meanwhile Intel had gone from 12MHz 386s to 50MHz 486s in the same period, which really showed what we were up against.

    • @12Mantis
      @12Mantis 1 year ago +2

      So what sort of cool stuff got cancelled?

    • @Error6503
      @Error6503 1 year ago +14

      @12Mantis The most interesting was a vector processor in place of the FPU on the T800; it could have been a GPU before the world had ever thought of a GPU. There were also plans for other dedicated interface processors alongside the M212 (which had a disk interface on chip). And the project which eventually produced the T9000 was restarted from scratch at least twice.

    • @JamesOversteer
      @JamesOversteer 1 year ago +1

      @Error6503 I grew up near that spot in Newport and spent a lot of time there completely unaware of it. Interesting to hear some inside knowledge, thank you for sharing.

    • @markbooth3066
      @markbooth3066 1 year ago +1

      I always thought it was a shame that T9000 link packets weren't just a little bit longer. If they had been, the technology would have been a shoo-in to implement ATM. The C004 switch chips were ahead of their time, and the C104 was the logical extension of that, but without being able to tap into that burgeoning telecoms market, DS-Links never got the economies of scale to become a world-beater.

    • @abarratt8869
      @abarratt8869 9 months ago +2

      Thinking they could stay in the fab-ownership game was what finished them. Intel were pouring $billions into silicon process development, Inmos had £0.50... It was never going to work.

  • @goranaxelsson1409
    @goranaxelsson1409 1 year ago +72

    There actually was another OS for transputers, Trillium.
    I once had the source code for it, but a disk crash 30 years ago made me lose it.
    From the beginning of an article...
    "Trillium is an operating system designed for a
    loosely coupled, message passing ensemble architecture,
    such as the hypercube class of machines. It has
    been implemented for the FPS T-Series[1], a transputer
    [2] based machine. Trillium presents a uniform
    programming environment that extends out of
    the parallel processors to encompass the front end
    computer and personal workstations accessible
    over a network. C and Fortran compilers are available
    along with a suite of backend development tools
    including a transputer board simulator."
    A couple of friends and I actually had it running on our homebuilt transputer hardware back in 1991.

    • @majorpentatonic2310
      @majorpentatonic2310 1 year ago +5

      There was also Minix. I had it running on my transputer system a few years ago.

    • @llothar68
      @llothar68 1 year ago +3

      A home-built transputer: maybe the only thing I was more interested in in 1987 than losing my virginity.
      But unfortunately, all my attempts with my own first designed-and-built Z80 system showed me I would never master the fine-tuned soldering skills needed for this.

  • @philh9421
    @philh9421 1 year ago +15

    My masters degree included quite an in-depth study of the Transputer, and Occam, and it was delivered by Andy Burns at Bradford University, who was on the Occam language panel. A fantastic design concept that, in the usual tradition, was flogged to the Americans for a song.

  • @rtechlab6254
    @rtechlab6254 1 year ago +24

    Used a crapton of these in my first job: crash test systems, predictive failure analysis and signal processing. We went through thousands of them. Development used two Meiko dev systems and one very much like, if not the same type of unit you have. We used the ISA cards and TRAM modules as well as our own cards. You'd have a system with a couple of T440s called PMMs, then boards with high-end Analog Devices ADCs and filters and a single T400 with a chunk of local RAM. The comms and booting was handled on the power station units by a 486DLC, PC/104-based machine. Most power stations in the UK ran these.

    • @rtechlab6254
      @rtechlab6254 1 year ago +3

      You might get some luck reaching out to Prosig in Fareham UK

    • @stephenhookings1985
      @stephenhookings1985 10 months ago +1

      I did something similar (designed hardware to memory-map into two transputers on the Meiko image bus with some A110 (I think) digital convolution chips and a ton of Intel i860s, whilst waiting for the T9000, which mysteriously turned up in a Terminator film!).
      I shrank my system down to run on an M10 in the back of an Armoured Personnel Carrier to observe IR images and identify potential military vehicles. I really liked what Meiko did - I am sure I worked with David May (he used to call me Derek, of Derek and Clive fame) and Neil Coulson - both exceptional Meiko engineers. Meiko is now a canteen automation system - look at the end of some belt-driven tray conveyor belts. I did get my hands on an early Atari system, as well as the TDS boards for the 80286s. But the Meiko with its programmable MRUN took things to the next level. A few years later one DEC Alpha (ruggedised military spec) could do the same job as 6 i860s with 2x transputers for comms, the image capture cards and the Sparc 2 on the VME acting as display.
      Progress

  • @kenfuller9907
    @kenfuller9907 1 year ago +27

    Very nostalgic for me. I did my masters degree in Parallel Computing Systems in Bristol at what was once called The Bristol Transputer Unit - the last year the course was run. So I've experienced the flashing cursor indicating deadlock first hand..... a lot!

    • @RetroBytesUK
      @RetroBytesUK 1 year ago +2

      Was David May one of your lecturers?

    • @kenfuller9907
      @kenfuller9907 1 year ago +4

      @RetroBytesUK Not that I recall, but the transputer was pretty much dead by that time and there was more emphasis on parallel paradigms like neural networks rather than hardware.

    • @richard_loosemore
      @richard_loosemore 1 year ago +2

      Was that at Bristol University, or the Polytechnic? I worked in the Bristol Transputer Centre, located in the Poly, during the peak period of 1987-1990. We had tons of transputers, but couldn't use most of them on account of a lack of dev tools.

    • @richard_loosemore
      @richard_loosemore 1 year ago +3

      I also dated Ian Barron’s daughter Clare for a while. Very sweet girl, but I was kinda useless as a boyfriend. 😀

    • @kenfuller9907
      @kenfuller9907 1 year ago +1

      @richard_loosemore It was called the University of the West of England when I went there, but yes, it was the Poly.

  • @cohort6159
    @cohort6159 1 year ago +20

    I built a 64 processor reconfigurable transputer system in grad school. I went to the Transputing 91 conference where the T9000 was announced. I never fooled with them after I graduated.

  • @andrewstones2921
    @andrewstones2921 1 year ago +14

    Wow, this brings back some memories. In 1989 I worked for a company in Portsmouth that had just moved from Bristol and was manufacturing and testing these transputer boards; we were told that they were used in high-end simulators. I didn't last long. I worked as a test engineer, and shortly after joining the company I made some mistake on a batch; they shipped with a problem, and I was rightly blamed and fired.

    • @stephenhookings1985
      @stephenhookings1985 10 months ago

      FPS America filed for Chapter 11 after their Array Processor mk 2 failed - they blamed it on the Transputer. So don't feel so bad.

  • @ebrombaugh
    @ebrombaugh 1 year ago +24

    Back in the 1990s I worked for a small R&D company that was developing a radio analysis system based on transputers connected to a huge rack full of Xilinx 3000-series FPGAs. We had a couple of PhDs writing code for the system in various flavors of parallel languages (Fortran & C, IIRC) to do signal analysis. Not sure if it ever went anywhere, but it sure generated a lot of interesting hardware.

    • @Aim54Delta
      @Aim54Delta 1 year ago +3

      I know some people who were working on some signal capture stuff that went toward the MALD and similar programs.
      While probably not the same company, I am sure some of the same people, or students of those people, were working on similar stuff.
      I think FPGAs generally outshone the transputer in time - although the simplicity of the transputer would make it scale into high frequencies rather easily, and it would not be difficult to commission a custom series on new die lithography technologies.
      A bit much for a DoD contract, but depending on the gusto of the groups involved and whose signatures they could get - not outside the realm of possibility.
      Generally, though, I would say that modern FPGAs are the successor to the transputer in many ways - if not literally, then functionally.

    • @ebaystars
      @ebaystars 1 year ago

      satellite downlink interception/microwave point to point/ 🙂

    • @piotrd.4850
      @piotrd.4850 1 year ago +1

      @Aim54Delta I think that every decade or so, some projects and concepts should be revisited with updated tech, manufacturing and other advancements.

  • @TawaSkies
    @TawaSkies 1 year ago +18

    The Atari Transputer was one of my dream machines in the day. I remember reading an article about one of the UK's universities building a machine based on these with no RAM. A bit fuzzy on the details - time always warps memories lol - it might have been Cambridge. Anyway, loved your story on these; nice bit of research and history. Looking forward to you being able to get your machine running.

  • @caryj57
    @caryj57 1 year ago +30

    The strangest thing about working with transputers was the memory address range. It went from -maxint .. +maxint, so 0 is a valid memory address right in the middle! You had to redefine what NULL was, because dereferencing *0 was perfectly valid.
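
A small Go sketch of the signed address space described above: mostNeg and mostPos mirror the transputer's MostNeg/MostPos constants (on-chip RAM started at MostNeg), while nullAddr is purely a hypothetical illustration of moving "null" away from 0:

```go
package main

import (
	"fmt"
	"math"
)

const (
	mostNeg  int32 = math.MinInt32 // bottom of memory (0x80000000)
	mostPos  int32 = math.MaxInt32 // top of memory
	nullAddr int32 = mostNeg       // illustrative non-zero "null" sentinel
)

func main() {
	// Addresses are signed, so 0 is an ordinary address mid-range.
	fmt.Println(mostNeg < 0 && 0 < mostPos) // true
	// With "null" relocated, dereferencing address 0 stays legal.
	fmt.Println(nullAddr != 0) // true
}
```

One practical consequence of signed addresses is that address comparisons and pointer arithmetic use ordinary signed integer operations, which kept the ALU design uniform.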

    • @renakunisaki
      @renakunisaki 1 year ago +12

      Nice, an addressing system even weirder than x86 segmented mode.

    • @rosomak8244
      @rosomak8244 10 months ago

      @renakunisaki Look up the Zilog Z8000. You will be pleased. Another gem in this regard is the Power architecture. However, there you can safely ignore the oddities.

    • @jfbeam
      @jfbeam 1 month ago

      Good point, but zero is a valid memory (bus) address on PCs, too. OSes protect it - as part of page zero - so NULL references can be detected. I've used "zero" on loads of simple devices. (And some not so simple... Moto 68k.)

  • @ardentdrops
    @ardentdrops 1 year ago +42

    I can't believe the TIS-100 actually exists! I thought it was an original idea 🤯

  • @GodmanchesterGoblin
    @GodmanchesterGoblin 1 year ago +11

    This is a really well set out and comprehensive introduction to the Transputer. Well done! Massive bonus points for the BCPL mention. That was my first encounter with a serious programming language back in 1981, with the compiler and linker generating code for a TMS9900 based system. It's a relatively simple language but well suited to older processors with their more limited memory and instruction sets. As for the Inmos founders, the only name that I recognised was Ian Barron. The Transputer was way ahead of its time. Too bad that I never did any hardware design with it.
    Also, interesting footnote... Inmos was also the company that created the IMS-G171, the colour palette chip used in the original IBM VGA implementation and its many clones. There were other widely used variants too such as the G176.

  • @GeekBoi
    @GeekBoi 1 year ago +16

    I always wondered what the deal with the transputer cards was. Always saw them in the Amiga magazines, but never really saw any articles or documentation on what they were used for. The 286/386 PC cards always made sense to me, but these were just a concept I could not follow based on the advertisements alone. Thank you for the interesting history. Having worked on parallel computing clusters in my professional life, I now have a much better appreciation of what these cards could do all the way back then. Also loved the razor joke 😂

  • @5urg3x
    @5urg3x 1 year ago +8

    This kind of stuff should really be preserved and enter the public domain after a certain amount of time; the development boards and software and all that, the source for everything as well. I think the same should apply to older video game consoles and other computing devices, etc.

  • @petergoodall6258
    @petergoodall6258 1 year ago +6

    In the mid-to-late '80s I worked for Integrated Arts (Tiny Grated Rats) in Australia. We were building a broadcast video animation system which had hundreds of T800s. Very difficult to write something like an operating system for the transputer: no exception handling. If you wanted to kill a process, you had to write 'stop' all over its address space. I wrote a parallel memory management system for the other folks to use. So much fun for so little result :-) Occam - a British intellectual programming language.

  • @Pest789
    @Pest789 1 year ago +24

    I was heavily into Atari 16 bit machines at the time and I followed the Atari transputer development as closely as possible. From my perspective back then, it seemed a lot more like they never had a product to sell, rather than a product to sell that nobody wanted.

    • @alexhajnal107
      @alexhajnal107 1 year ago +6

      My recollection is that they were always busy developing new things, not so much on releasing them. What little did make it out the door was mostly minor upgrades on existing kit so they ended up falling further and further behind.

  • @Distinctly.Average
    @Distinctly.Average 1 year ago +18

    I was at Hatfield Polytechnic back in the day. It was astonishing at the time to see a computer generating live Julia sets. This was on their small transputer setup. It was a shame it died off as many thought at the time it was the future.

    • @RetroBytesUK
      @RetroBytesUK 1 year ago +15

      If it had not been starved of funds at a key time in its development they probably would have been right, and it would still be a major player.

    • @6581punk
      @6581punk 1 year ago +4

      Parallel processing was a reaction to the fact that 32-bit processors weren't showing the performance gains expected of them. But that soon ceased to be the case, taking the wind out of the sails of parallel processors. Doing software development across multiple execution cores is vastly more difficult and still is even to this day. So nobody wanted to go down the parallel processing route until they absolutely had to. Real life applications are a lot more complex than generating fractal images.

    • @Distinctly.Average
      @Distinctly.Average 1 year ago

      @6581punk That depends on the application. Breaking a task down into sub-tasks and doing them separately is something we have been doing for a very long time. Having a well-designed OS that can deal with that is the key. I've worked in the mainframe arena for many years now, so I have worked on applications that do just that.

    • @Distinctly.Average
      @Distinctly.Average 1 year ago +5

      @RetroBytesUK Sadly, a lot of things that could have been great failed not because of technical issues, but because of finances or poor management. I recently listened to a talk by some of the original Amiga team, and they honestly felt some of their plans were way ahead of their time and would have come to fruition had Commodore not failed. I blame John Carmack for creating a game that the Amiga couldn't play lol

    • @andywilkes3333
      @andywilkes3333 1 year ago +2

      The company sent a couple of the team to the college in Norwich at the time the Transputer was big news in the electronics mags. I remember, at a time when a single fractal took minutes to be drawn on the computers available to me, that they showed a 'flying view' over a fractal landscape.

  • @keirthomas-bryant6116
    @keirthomas-bryant6116 1 year ago +15

    Another amazing video, thanks! I was in the Atari orbit across the 1990s because I had an Atari ST. (Well, several Atari STs... They weren't very reliable.) And so when you mentioned the Atari Transputer, a load of lightbulbs went off in my head. It was featured quite a lot in the Atari ST magazines. There was such a push in the world of Atari for serious computing tasks that belies the memories of the Atari ST as being a weaker version of the Amiga. Lots of us felt the ST was very Mac-like, certainly much more so than the Amiga. Atari already had the music scene sewn up, and there was quite the push into DTP, too, if I recall. Atari platforms by this point were just more capable, and this affected the culture, too. The Amiga was very good at some things, but this caused its community to be very inward-looking. The Atari ST, on the other hand, transformed into more of a platform, in the way that the IBM PC would be. This was because of those MIDI ports, or the high-res monochrome graphics mode and monitor that facilitated pin-sharp DTP. I kinda wish Atari had continued with this philosophy, but the history of all the big home computing names in the 1990s was essentially self-destruction, as we all know.

    • @medes5597
      @medes5597 1 year ago +8

      As a graphic designer, I was always impressed that Atari users tended to know their computers absolutely inside out. They knew how to make art programs do anything you wanted. I remember seeing a floppy disk that recreated Andy Warhol's Amiga art experiments perfectly on the Atari. If you were a designer on a budget, an Atari ST was a must-have tool. I honestly don't understand why Atari didn't make a bigger deal of its art capabilities. The Amiga and Mac absolutely cleaned up with artists, and it was down to the users to show that an Atari ST could do more than a Mac and as much as an Amiga for less than half the price.
      I've always felt like the knowledge of the Atari userbase has been hugely understated. Atari being kind of useless meant their users took ownership of their machines.
      The ST was like an amazing little creative toolbox and was cheaper than any of its competitors. And yet I've never seen a single advertisement that targeted that audience.
      The ST was honestly one of the best computers I've ever owned for art.
      I know it wasn't quite what you were talking about, but the amount of missed opportunities around the ST just always makes me wonder what could have been.
      Even the head of Atari UK said "by the time word had caught on that the ST was an amazing machine for music, being used by Fatboy Slim and White Town to get number one singles, we'd stopped making the bloody things. Only the surplus stores and second-hand market got the benefit. Atari had trouble telling people anything about it."

    • @another3997
      @another3997 1 year ago +2

      I'm a fan of both companies' products, but your assertions about the ST being "more capable" and the Amiga "inward looking" are simply not true. The fact that GEM looked like the Mac desktop was why Digital Research were (unfairly) sued by Apple, and it was changed accordingly, except on the ST, because Apple didn't see Atari as a credible threat. And they were correct. Like the early Macs, the ST OS was fairly primitive, single tasking, quite limited, and had no CLI until Atari adopted MiNT. The Atari hardware suffered in comparison to the Amiga's graphics and, outside of MIDI, its built-in sound. Even the later inclusion of a blitter and DSP on some models didn't help, as virtually nothing took advantage of them. And whilst the Amiga got the 68040 CPU, Atari never went beyond the slower 68030. Then the Amiga had the Newtek Video Toaster series with Lightwave 3D, which catapulted it into TV and video production. They could be linked into render farms, and a high-end MIPS R4400-based parallel render system was developed by Desktop Technologies just for rendering Amiga Lightwave animations. Let's not pretend one system was ultimately superior... they both had their blaze of glory, and both died painful deaths.

    • @keirthomas-bryant6116
      @keirthomas-bryant6116 1 year ago +3

      @@another3997 I think you missed my point but maybe I didn't explain properly. When I say the Amiga system is inward looking, what I mean is that its built-in chips for sound and graphics meant that people had all they needed. More than they needed, in fact! Because of this, most people who bought an Amiga in its heyday were in the game for what it could do on its own, and by itself. But because of design decisions with the ST - and I mention its MIDI ports and the Mac-like high-res monochrome SM124 monitor in my comment above - a culture grew up where the ST was more the centrepiece of an external setup. I think this is shown really well by comparing the music scene that arose around the Amiga and the ST. With the Amiga, it was mostly about mod music, because the Amiga had a DSP. An Amiga was all you needed! With the ST, it was about loading up Steinberg 24 and powering your synthesisers. The ST was just one component of a wider system. In other words, the Amiga was still very much a home computer in that it was all about what it alone was capable of. But the ST was indicating the direction of travel for the coming decades, where a computer would drive peripherals - where a computer wasn't inward-looking but outward-looking. This is my point. And a few other design decisions of the ST underlined this approach and culture, too, such as MS-DOS floppy compatibility (the Amiga formatted disks in the Amiga way because why would you ever need anything other than an Amiga?!). Plus there was the GEM desktop, which was much more Mac-like than Workbench. Workbench carved its own path in ways that you never found in any other GUI like Windows and, again, this indicates the inward-looking direction of the Amiga. If you used GEM then you could switch to MacOS and Windows pretty easily. If you used Workbench you'd be wondering why everything in Windows or MacOS was so weird. 
I realise that, as you say, video capture and animation opened up the Amiga to a certain kind of low-budget TV and movie production workflow. So it did catch-up to the "outward-looking" culture of the ST. But the whole video scene was some years after the launch of the Amiga. Summary: People bought an ST for doing the things they were interested in. People bought an Amiga for the things you could do with an Amiga. That's a subtle but important difference.

    • @alexhajnal107
      @alexhajnal107 1 year ago +1

      @@another3997 The result of the Apple/DRI lawsuit didn't apply to Atari because they (Atari) had done the bulk of the m68k porting and were thus given full development rights without having to license the changes back to DRI. For legal reasons I won't even pretend to understand, this meant that they were not bound by the suit's result. Not sure why Apple didn't sue Atari directly though.

    • @paul_boddie
      @paul_boddie 1 year ago

      @@another3997 "and a high end MIPS R4400 based parallel render system was developed by Desktop Technologies"
      That would be DeskStation Technology who had already released a number of MIPS-based workstations, initially aiming to ride the Advanced Computing Environment bandwagon, eventually having to settle for making Windows NT workstations based on MIPS and later Alpha. Interestingly, NewTek and DeskStation were both Kansas-based companies.

  • @chriswareham
    @chriswareham 1 year ago +19

    Wow, I knew some of the history of the transputer but not the details. I love the foresight of designing the programming language and implementing a compiler for it first. There's some great British contributions to computing that are in the shadows, and you mentioned one of them in passing - BCPL. This was the inspiration for the B programming language, which evolved into C. I still have the definitive (only?) book about it, written by Richards and Whitby-Strevens.
    As for the fate of Inmos, another example of Thatcher's blind belief that the private sector always runs things better than the public sector. Same blindness that ruined so much of Britain's economy, such as the rail industry (not just the railways themselves but the supporting industries like train manufacturing).

    • @RetroBytesUK
      @RetroBytesUK 1 year ago +12

      The approach of the Thatcher government did do a lot of longer-term economic damage, and snuffed out areas of development that could have generated whole sectors of the economy. There just seemed to be an absence of medium- and long-term thinking.

    • @Cybergorf
      @Cybergorf 1 year ago +1

      BCPL did earn itself a very bad reputation within the Amiga crowd.
      BCPL was a nice academic language, and TripOS, the operating system built upon it, saved the early Amiga by providing the missing disk operations and a shell.
      But BCPL was stuck on 32-bit word pointers that did not take into account that the Motorola 68000 (like all other common processors) addresses memory in bytes (8 bits).
      So to get to your real address in memory you had to multiply by 4 - or to get your "BCPL pointer", divide by 4. This was a huge burden for programs written in C or assembler and a constant source of bugs.
      Commodore later spent quite a lot of resources to replace the BCPL code with rewritten C code.

    • @chriswareham
      @chriswareham 1 year ago

      I didn't know that BCPL was used for the early Amiga operating system - that's fascinating! I do recall having to deal with data alignment issues with some C code that was ported from a 16 bit system to a 32 bit one, but thankfully the compilers got better and started to handle the alignment in structures itself.

    • @Cybergorf
      @Cybergorf 1 year ago +1

      @@chriswareham Yes: the original Amiga crew (when it was still a startup before Commodore acquired it) tried to build up their own operating system. They got a preemptive multitasking kernel (Exec), developed by Carl Sassenrath, and the GUI (Intuition) by RJ Mical, but they were lacking proper device handlers and a command line interface. So Commodore contracted British Metacomco to "transplant" BCPL based TripOS onto the Exec kernel.
      So the system was from the start a weird mixture of C structures and BCPL structures, which made it overcomplicated for developers ...

  • @neogen23
    @neogen23 1 year ago +4

    Only a year ago I listened to discussions among Forth groups on a parallel chip design that uses the exact same grid style for layout and communication between cores. Who knew it had already been done by others (and failed). Also, reading recently on Haskell parallelization, par and seq and relevant functions are front and center, along with functions that serve exactly like alt. So, thanks for the history lesson. It has been eye opening. Never knew these systems even existed. Keep them coming!

    • @paulie-g
      @paulie-g 1 year ago +1

      The Forth chips you're talking about, whether they're the GreenArrays (iirc) or the Propeller, are a different lineage with a different purpose - they're essentially programmable realtime I/O and run fairly independently of one another, with not much processing power available or required. Seems similar, but isn't.

    • @interlocking-joinery
      @interlocking-joinery 11 months ago

      I did a lot of work with the Novix chip in the 1980's, which had Forth as its native instruction set (assembler). Is this the chip you are referring to?

  • @NotMarkKnopfler
    @NotMarkKnopfler 1 year ago +9

    There is a chip that is still in production today that is similar. The GreenArrays GA144 has 144 cores _on the same chip_ and they all run their code in parallel. The cores on the edges have access to the chip's pins. The native assembly language is Forth. They are stack-based cores. 👍

  • @dipaoloboat
    @dipaoloboat 1 year ago +2

    Thank you for enlightening me as to what a transputer is; I had heard the term back in the 80's but never knew exactly what it was. Around the same time, maybe a bit earlier, about 1984, I was an engineer working for Loral Instrumentation of San Diego, California on the development of the Loral Dataflow computer, one of the first parallel processing computers, which was based on the concept of a "data freeway" called the "Tag/Data bus".

    The Dataflow computer contained up to 512 processing nodes, which like transputers each had their own local memory and ran their own code. It was intended for processing an incoming stream of realtime telemetry data. Each node was programmed to grab from the data bus one or more data elements having predefined "tag" values, do its processing on that data, then put the result back out on the bus with its own predefined tag value, to be picked up by either an output device or one or more other processing nodes for further processing, and so on. Often a processing node needed more than one specific "tag" of data and captured the most recent of its assigned "tagged" data elements, processing a single "result" when all the values its algorithm needed had been captured from the data bus.

    So unlike transputers, which were matrixed together, the Dataflow nodes were all attached to one large high-speed parallel data bus and a parallel tag bus (taking the place of the address bus in a conventional Von Neumann architecture). Its intended uses were realtime processing of telemetry streams from spacecraft, aircraft, and signals intelligence systems. Those were exciting times; we felt back then like we were inventing the future and having a lot of fun doing it. :)

  • @MoreFunMakingIt
    @MoreFunMakingIt 1 year ago +11

    Such a great video. I learn about things I could only have dreamed existed in my wildest imaginings! Superb.

  • @kazzle101
    @kazzle101 1 year ago +10

    24:07 I now see why I got to play on an Atari Transputer Workstation for a week at Sheffield TriTec as a technology demonstration event sort of thing... Not that I knew what I was doing or anything about it really, like the basics of how it worked. Very impressive computer though, a big tower box in that Atari Grey colour and a massive Monochrome high resolution CRT monitor.

  • @m-u-l8r747
    @m-u-l8r747 1 year ago +2

    Back in the 80s there was a front cover of a gaming magazine (I can't remember which one) which featured screen shots of a game for the transputer. It was a multiplayer 3d shooter where each player had a different military vehicle, a plane, a tank, etc. and they had to take each other out. I remember seeing it and being amazed at the graphics (very rudimentary 3d shaded) at the time, as well as the concept of multiplayer 3d shooters with players on different machines - it was a forerunner of online gaming. I had no idea what a transputer was at the time and this is the first time I've heard about it since. Thanks for the great video.

  • @mikehibbett3301
    @mikehibbett3301 1 year ago +4

    What a lovely project to work on! I have a transputer somewhere in my component collection, which I was given while working in Cambridge in the 1990s. I hope i find it again :)

  • @assifmirza130
    @assifmirza130 1 year ago +1

    Great video, really well researched and presented. My run-in with the Transputer was at UMIST in 1988 - we had a lab with 20 Atari 520STs as workstations, and access to a number of transputer systems. I can't remember how many or how they were hosted, because all we undergrads ever got was a bunch of lectures about them.
    Thanks for reigniting those lost memories!

  • @NeilFuller-r3v
    @NeilFuller-r3v 1 year ago +7

    The transputer was definitely ahead of its time. Intel were waging a MHz war for much of the 90s/00s, while the transputer designers foresaw the need to go parallel. Looking at modern CPUs, inmos were right.
    Programming languages weren't designed for massively parallel computer systems (many still aren't), and occam introduced new paradigms that would be necessary, based on Communicating Sequential Processes (CSP), with channels and guards. AIUI, CSP was developed after Hoare abandoned "monitors" as unworkable. Monitors found their way into the Java language as a primitive; quite easy to shoot yourself in the foot with them.
    Echoes of this work live on in languages like Go, which has support for channels.

    • @0LoneTech
      @0LoneTech 1 year ago

      Chapel is another programming language built for distributed computing.

    • @johnjakson444
      @johnjakson444 1 year ago +1

      For anyone who is both a hardware and software engineer, CSP/Occam and hardware description languages like Verilog are on a continuum; anything can be modelled in either, but Verilog is more suited to describing real hardware, while Occam is more suitable for modelling the parallelism structure - sort of two sides of the same coin. See also Handel-C.

  • @Dad_In_The_Box
    @Dad_In_The_Box 1 year ago +1

    I remember programming one of these during my degree course back in ‘89. I loved Occam and the folding editor.

  • @hessex1899
    @hessex1899 1 year ago +1

    When I was in school I used to pore over the specs of the transputer and daydream about getting access to one. Thank you so much for doing this video. :)

  • @KryyssTV
    @KryyssTV 1 year ago +8

    It's interesting how computing ended up needing to go back to a transputer-like architecture to overcome the limitations of the CPU by implementing multiple cores and multi-threading. The guys who designed the transputer really were looking at the big picture.

  • @0LoneTech
    @0LoneTech 1 year ago +6

    Three modern day cousins to the Transputers are XMOS, Epiphany and GreenArrays. Mailbox registers, as used in e.g. the RP2040 for interprocessor messages, are also related.

  • @iforth64
    @iforth64 1 year ago +7

    With some friends I wrote a Forth interpreter/compiler ("tForth") for the T800 (Forth is quite close to OCCAM, philosophy-wise). I did incorporate all the special OCCAM constructs and the link interface. It ran under Windows on an Intel 286, using a "C server" to do all the I/O and graphics. Unfortunately, current computers don't have the ISA interface anymore. Without ISA the link chip can't be accessed, and without that link chip you can't boot the transputer. The T801 card I still have also needs an ISA slot. The link concept and PAR, SEQ are retained in my current Forth compilers, using OS sockets. With today's multi-core CPUs, shared memory works much better than links. For more than 256 cores, multiple boxes with links may become interesting again.

    • @xiaodown
      @xiaodown 1 year ago

      Wow! Literally the only thing I know about Forth is that it's the programming language that the magic system in the "Wiz" books is based on!

    • @TheUAoB
      @TheUAoB 1 year ago +5

      You can still get "industrial" boards with ISA slots. Also, many modern PCs still have an ISA bus in the form of LPC, which is exposed through the TPM connector. There's a project on Vogons to add an ISA slot called "dISAppointment".

    • @alexisread5325
      @alexisread5325 1 year ago

      It should be possible to interface to a modern pc using an arduino, see trochilidae.blogspot.com/2021/07/stack-based-with-os-in-hardware.html?m=1

    • @GrahamMNewton
      @GrahamMNewton 1 year ago +1

      I worked for The Royal Greenwich Observatory and we used a Forth system to control an image acquisition/autoguider system. Whilst we didn't run Forth on the Transputers which did the image processing, we had a really flexible system where the main Forth control system running on a 68030 had a Transputer assembler which allowed us to write/modify Transputer programs which were then uploaded to the Transputers. The neat thing about the assembler was that each Transputer op-code was a Forth word. So you wrote the Transputer program and then ran it as a Forth program to produce the assembly code. The inbuilt 2D block move instruction was really powerful, as we could use it to cut sub-images from the main image all in 1 opcode. As we had a chain of transputers processing the image, we could do some processing on the fly as it was read out from the CCD.

    • @markbooth3066
      @markbooth3066 1 year ago

      I love the idea of a Transputer Forth with Occam like primitives.

  • @zh84
    @zh84 1 year ago +5

    I remember advertisements in Byte for Microway transputer ISA boards. They had the Monoputer, with a single transputer on it, the Quadputer, with four, and the Number Smasher, which had a transputer doing I/O for an Intel i860 - a combination of two powerful and now obscure technologies!

  • @Vanders456
    @Vanders456 1 year ago +8

    I very very very briefly worked for Quadrics, the company that was spun out from Meiko to develop the cross-bar switching fabric into a supercomputer interconnect. There was a Meiko CS kicking around the office. I wish I had known more about it at the time: I'm fairly sure I could have got my hands on it if I had asked.

    • @howardjones543
      @howardjones543 1 year ago

      I always thought the name Computing Surface sounded super bad-ass.

  • @bobfish7699
    @bobfish7699 1 year ago +22

    I remember seeing the transputer / occam being announced on Tomorrow's World. They made a huge deal about it, but the demo they put together was so underwhelming I remember being unimpressed. But I was only in my early teens, so I didn't know anything.

    • @RobBCactive
      @RobBCactive 1 year ago +7

      Tomorrow's World really suffered when everything went micro; before that, the mechanical inventions were inherently better TV - a "magic black box" with no moving parts to film is inherently dull.
      One idea I do remember was a light-bulb-saving switch that would sync power-on to 0 volts, so the filament heated more gently. I think they could show it working with oscilloscopes.

    • @RobBCactive
      @RobBCactive 1 year ago +4

      And I was mad keen on computing, went off to do Comp. Sci. degree, so if I found it dull the average viewer must have been catatonic.

    • @BobBeatski71
      @BobBeatski71 1 year ago +2

      Me too. I was just a kid when I saw it on TW and, just like you, wasn't really that impressed with what they showed. Fast forward a couple of years and I got to see an Atari workstation at a computer fair. It was loading and displaying images from a drive, or so I thought. Turned out it was doing ray-tracing. A ray-traced image in the same time it took my Amiga to load from disk! Now THAT was impressive.

    • @BobKatzenberg
      @BobKatzenberg 1 year ago +2

      Hello fellow Bobs.

    • @deang5622
      @deang5622 1 year ago +1

      I remember the demo on Tomorrow's World, it was quite good and they were demonstrating two things, firstly the fault tolerant nature of a system containing multiple processors so long as the code was written to support it and detect when a transputer had failed and not push a workload to it, and second they were demonstrating the performance.

  • @julianskidmore293
    @julianskidmore293 1 year ago +3

    I used the UEA's 9 Transputer rack in the late 1980s to do my BSc dissertation: "The Implementation of Neural Nets on a Transputer Rack" which implemented a fully-interconnected, parallel neural net on 20MHz (ie 5MHz x 4) T424s. Amazing system!

  • @paulfrindle7144
    @paulfrindle7144 1 year ago +6

    Back in the late 1980s I was a co-director of a company charged to design a large scale digital audio mixing system for a multinational organisation. At the same time close friends of mine were working for another company who were trying to do the same thing. They were wedded to the early transputer system as its core processing engine - they had been sold the concept of it.
    However it was pretty clear that politics in the UK would eventually clobber the transputer initiative - the UK was supposed to become a 'trading nation' and it was therefore bound to be sold off and eventually fail etc :-(
    So instead, we produced a large scale array of our own home-grown custom processors, specifically designed for the tasks we needed, with several added custom features, which allowed our advanced development system to function, which would never have been viable with the transputer or any other processor at that time. Needless to say we succeeded, and sadly the other team failed.
    But the moral of the story is that the volatile political system in the UK from the 1980's no longer allowed any government backed home grown technical development over more than one parliamentary period - simply because no politician during a 4 year period would ever understand its significance - and the mantra of 'trading nation' is now burned into our politics of all flavours. This is what would kill off the transputer and we had predicted it correctly. It's such a pity :-(

    • @hellbreakfast
      @hellbreakfast 10 months ago

      The thing about the politicians is absolutely wild to me, because developing computers was such a good play towards success. So much great tech we use today, including the internet, is simply because one government or another backed and developed it. I actually only recently found out how on the cutting edge the UK used to be in regards to computer tech- I'm not from there, and of an entirely different generation, so I never would have guessed until taking an interest in vintage computers. Blows my mind.
      The closest thing I can think of right now, as an American, is how we've gutted NASA to near non-existence, and handed off space travel to private interests. It's gone... Well, exactly as one may expect, to be honest.

  • @0Zed0
    @0Zed0 1 year ago +3

    Just started this video and it brings back memories. At the time of the transputer my Dad was working for a smallish print firm that printed some marketing materials for Inmos and knowing I was into computers he brought home some for me. I had a booklet on the ITEM400 and ITEM360 (Inmos Transputer Evaluation Module) 400 or 360 MIPS. Now back to the Video.

  • @andygardiner6526
    @andygardiner6526 1 year ago +12

    I remember seeing a machine at university - a student's father worked at Inmos and got a development system running in his room for a couple of hours! The classic ray-traced Newton's Cradle demo at the time just blew our comp-sci nerdy brains. :-) For a term we studied Occam and I still have a copy of the Occam programming guide, which was expensive, very thin, and yet managed to have many pages with "this page is intentionally left blank" printed in the middle, which made it the worst value book I have ever purchased!

    • @RetroBytesUK
      @RetroBytesUK 1 year ago +7

      Always annoying to feel you got ripped off by buying the text book they told you to buy.

  • @mikemines2931
    @mikemines2931 1 year ago +1

    Thanks for the trip down memory lane I was expecting Raymond Baxter at any second.

  • @jumpstar9000
    @jumpstar9000 1 year ago

    I remember messing with Occam and the transputer at Uni. I wish I had paid more attention. Apparently I did though because I've been building parallel processing solutions ever since! I still love the Transputer. It is very inspirational. Thanks a lot for the great video and Happy 2024!

  • @johnfranchina84
    @johnfranchina84 1 year ago +1

    Great video! I graduated in Electronics Engineering in 1980 (yes long time ago) and while my early career was in Motorola 6802, 6809 and 68000, I was fascinated by the Transputer chip sets and thought these were definitely the future. Their demise due to political and corporate stuff ups parallels many similar observations during my long career. Thank you for putting this together.

    • @pirobot668beta
      @pirobot668beta 1 year ago

      I loved the 6809!
      Radio Shack Color Computer used it.

    • @johnfranchina84
      @johnfranchina84 1 year ago

      I used to program the 6802 and 6809 directly in machine code, burn PROMs, then EEPROMs.. fun times 🤔😀

  • @KurtisRader
    @KurtisRader 1 year ago +1

    Interesting video and history. I went to work for Sequent Computer Systems as a support engineer around 1990 when they were a leader in SMP (symmetric multi processing) computers. At that time they were transitioning from the NS32032 to the i386. Before watching this video I don't think I had heard of the British "Transputer". Much thanks for expanding my knowledge of the state of the art when I was much younger.

  • @andyc9921
    @andyc9921 1 year ago +2

    Great video, thanks! A company in Louth, UK used transputers on an offshore gas platform control system called CalCam (Computer Assisted Logical Control And Monitoring). All gone now, but I saw the system being used many times. I have some screenshots kicking around somewhere.

  • @M0UAW_IO83
    @M0UAW_IO83 1 year ago +5

    Long time ago part of my job on an industrial estate in South Manchester was looking after and training interns, usually uni students on industry experience.
    One of my interns was incredibly knowledgeable about computing theory (not so much on the practical bits of PC hardware but that's another story).
    He started banging on about this thing called the Transputer and, it turned out, was writing a series of articles about them for one of the UK electronics magazines.
    Lost contact with him when his internship ended but it'd be cool to catch up even if just to say hi.
    Clint

  • @emq667
    @emq667 1 year ago +3

    Very cool! I hope you get it running because I’d love to see it in action. It would be awesome to see how it stacks up against modern commodity hardware.

  • @karlosh9286
    @karlosh9286 1 year ago +1

    That was fascinating. I remember when the Transputer came out, and seeing demos. I never got my hands directly on one. It is a shame Inmos never really got the backing they needed in the second half of the 1980s.

  • @luke_fabis
    @luke_fabis 1 year ago +6

    Man, I'd love to see a modernized derivative of the Transputer. x86 and ARM are still chugging along. I'd bet a lot of these design features could still be relevant today.

  • @supralapsarian
    @supralapsarian 1 year ago +1

    Another fantastic video. Great stuff, man. Cheers!

  • @kahnzo
    @kahnzo 1 year ago +3

    I love the compiler first approach. My hope is that moving forward it will be application first - hardware last.

  • @projectartichoke
    @projectartichoke 1 year ago +2

    So cool! What a stunning piece of computing history. I hope you're able to find the software needed to get it running, that sure would make an interesting follow-up to this most excellent video.

  • @davidhalliday7776
    @davidhalliday7776 1 year ago +4

    Programmed Occam on the ATW (Atari Transputer Workstation) in the 80s at the University of Newcastle upon Tyne.

  • @clivepacker
    @clivepacker 10 months ago

    Worked at RSRE Malvern on a European ESPRIT project - P1085 - to develop a configurable networked transputer machine and applications, 1987 to 89. The project developed SuperNode, a fabric of n transputers in which the 4 links from all were linked to a configurable switch, enabling any topology to be built and optimized for particular parallel computing applications. I worked on dynamic programming algorithms - optimal route finding - and compared the MIMD transputer approach with a SIMD machine, the Distributed Array Processor, which was optimized for matrix operations. Wrote a bunch of code in Occam. We were way ahead of our time.

  • @spacedock873
    @spacedock873 1 year ago +2

    My university had a transputer development system (in the mid-late 80's). It was in the same system lab that I used when doing my dissertation. I was quite interested in it but in all the time that I spent in the lab I never saw it switched on!

  • @enitalp
    @enitalp 1 year ago +4

    I was an official Atari dev, and I have a good collection of Atari hardware. One day I saw two ATW800 Atari Transputer Workstations, and they were for sale cheap, but at the time I passed on the offer. I still regret it. There was a picture at one time, at Kodak, with a wall of ATW800s doing image manipulation.

  • @williambrasky3891
    @williambrasky3891 1 year ago

    My favorite computer of all time. Thanks!

  • @helicocrasher
    @helicocrasher 1 year ago +3

    Cool stuff. I did program T800s in a custom biomedical film scanner as part of a research project around 1986.
    Good memories

  • @davidwhatever9041
    @davidwhatever9041 11 months ago

    This takes me back. I was a PhD theoretical chemist working on laser crystals; one of my systems was a PC with a pile of these processors that I would program with a specialised version of FORTRAN 77.

  • @rseichter
    @rseichter 1 year ago +2

    Thank you for this video. It presented many new and interesting aspects of the Transputer ecosystem which I wasn’t aware of back in the days when I tried my hand at programming in Occam. 👍

  • @orangejjay
    @orangejjay 1 year ago

    I don't know what happened but RUclips stopped recommending your channel to me. Sadly I didn't notice until now ... and I'm so pleased to be going through your videos I've missed. ❤❤
    Now, to start checking the hundreds of others I've subbed to and likely missed.

  • @NeilFuller-r3v
    @NeilFuller-r3v 1 year ago +8

    I believe the Transputer showed up in embedded systems like avionics (e.g. the RAF Jaguar plane), various other military applications and maybe even the Boeing 777. Not sure on that.
    Prof Peter Welch was still extolling the virtues of Occam and the transputer (and its CSP underpinnings) until at least 1999 for degree courses at the University of Kent. I was a big fan.
    The Kent retargetable occam compiler (useful when transputer hardware became scarce) looks like it carried on until 2006.

    • @markbooth3066
      @markbooth3066 1 year ago

      Back in the 80's and 90's the U.S. were far too hung up on Ada and determinism. Transputers were rejected from many U.S. military applications because D.o.D guidelines required software suppliers to be able to say which line of code was running at any given time, and you couldn't do that with a Transputer due to its microcoded scheduler. U.K. military projects were more pragmatic though, and were happy with the guaranteed latency that Transputers could provide. In fact, one of the big advantages of transputers was the exceptionally low time needed to switch from a co-operatively scheduled low priority task to a high priority task, which made it ideal for real time systems, as long as you didn't try to do too much in the PRI PAR. A huge part of this was independent high and low priority register stacks, so you never needed to store or restore context when task switching between low and high priority tasks.

  • @dingo596
    @dingo596 1 year ago +9

    It very much does seem like the UK was a centre of innovation and engineering prowess that the government was hell bent on smothering. So many UK engineering and technology stories start with a team creating something innovative and world class with a surprisingly small budget then the government decides to sell it off or cuts funding and the technology dies or is sold off internationally.

    • @ebaystars
      @ebaystars 1 year ago

      Usually the MoD gets it off the government, then smothers it so their close friends can make money out of it. In the USA, small innovative companies get first bid at some defence-related work by law, to give them a first-rung chance. The UK? Forget it - the people running innovation funding, all collected together, haven't got a brain cell between them, let alone any strong imagination.

    • @paul_boddie
      @paul_boddie 1 year ago

      Indeed. The mantra is always that "we can buy it cheaper from someone else". So it all gets sold off, but then the country ends up buying it in not so cheaply from the people it sold it to. There must be some kind of corrupt political and commercial subculture in the UK where those responsible end up getting some kind of commission from the beneficiaries of these decisions. But for the nation and its economy, such decisions are short-sighted, imprudent, and harmful.

    • @kippie80
      @kippie80 Год назад

      That is because the UK has been a vassal state since WWII. Same with Canada, which fully lost its balls in 1960 with the RedHawk incident. I think of Nortel. I think of Ontario Hydro. Both destroyed by Yankee imports to destroy our competitiveness. We all gotta wise up and think about our neighbours more … kind of like a group of countries that rhyme with SICKS

  • @LatitudeSky
    @LatitudeSky Год назад +1

    Currently work with a big industrial machine the size of a house that at least carries on the idea of a transputer. The machine has a bunch of moving parts that all have to work at precise times and with zero delays. Rather than having a PC talk to hundreds of sensors and boards, the machine has a ton of identical transputer-like boards, each one attached to local boards that actually do whatever the functions are. The supervisor boards give out orders and contribute a status to a data bus. The machine's main controller PC just watches the bus. It works incredibly well unless there's a broken wire somewhere. If a board fails, no problem. They are all identical. Swap in a new one. This setup has greatly reduced the number of parts needed to run the machine, which in turn improves reliability. We've had this thing run continuously for almost 24 hours at a time, as long as the humans can keep up. The distributed logic side keeps the mechanical side humming. By far the most productive such machine any of us have ever worked with. A joy to operate, as much as work can be a joy.

  • @Drmcclung
    @Drmcclung Год назад +10

    Ah the 80's tech scene.. Port gender benders, sockets that identified as Zero Insertion Force slots, plug-n-play, Transputers, you can't say we weren't having fun with language back then

  • @robertpearson8546
    @robertpearson8546 Год назад +1

    Another example of how NOT to design a CPU is the 8086. They looked at lots of 8085 code and noticed that there were an awful lot of software interrupt instructions used. They looked at the frequency of instructions but made no attempt to understand the programs' semantics. The 8085 only had 64K bytes of memory. A Call instruction took 3 bytes and an interrupt took only 2. So coders used software interrupts for subroutine calls. The resulting 8086 had a very extensive interrupt system that had to be run in "real mode". It took years for programmers to learn how to set up all the tables needed for the "protected mode". Plus they had to add a CMOS chip to save context during a reset, because the real mode/protected mode could not be turned off by software.

  • @interlocking-joinery
    @interlocking-joinery 11 месяцев назад

    In 1992 I was working for a company called Computer Solutions when we got a contract from Mercedes Benz to write an implementation of polyFORTH for the T800. I was the sole engineer assigned to the task, and really enjoyed working with the Transputer.
    I wrote a full development environment including a text editor, optimizing compiler, assembler, and single step debugger. Keyboard and screen were provided by an IBM PC. I wrote the host software using the 8086 version of polyFORTH.
    Bootstrapping the project was interesting, because I had to first write a cross-assembler on 8086 to write a kernel that ran on the Transputer, and allowed me to develop the rest of the development tools on the Transputer itself.
    I still have a physical copy of the manuals if there are any technical details that would interest you.

  • @MocSomething
    @MocSomething 6 месяцев назад

    The 'transputer' seems like it was way ahead of its time! 32 bit chips in 1985, using parallel processing and a language that was optimised for it. Parts of it sounded very modern.

  • @davidjohnston4240
    @davidjohnston4240 Год назад +2

    My first job out of college, 1991, was at Inmos. Designing the trams and associated goodies. Fun times.

    • @RetroBytesUK
      @RetroBytesUK  Год назад

      That's a good place to start a career, I can imagine it must have felt quite exciting to be working on something so leading edge.

    • @davidjohnston4240
      @davidjohnston4240 Год назад

      ​@@RetroBytesUK Yes. I came with a CS degree and not an EE degree, but the job was for a microelectronics engineer. So I had to hit the books pretty hard to catch up on the electronics, but my CS skills along with the electronics skills provided a good basis for my career. I am now a principal engineer at Intel and I've done work in an F1 team, design consulting, standards development and cryptography work. Lots of fun over the years.

  • @ssokolow
    @ssokolow Год назад +11

    Huh. Occam is one of those languages I'd heard of but never looked into but your description of its approach to parallelism sounds like it combines Go's channels and the Rayon parallel iterators library from Rust... nice to see those ideas are getting a second chance.

    • @johnjakson444
      @johnjakson444 Год назад +1

      Yes, the Go team put Occam's ideas into the language. So you say Rust has the same thing? I'll have to look into that.

    • @ssokolow
      @ssokolow Год назад

      ​@@johnjakson444The Rayon library extends Rust's built-in support for iterators so you can just replace .iter() with .par_iter() for most uses of iterators and have them spread across your cores. Nightly versions of the Rust compiler recently started offering to use it for parallelizing the last not-yet-parallel part of compilation and they hope to make that default some time in 2024. For more Go-esque (i.e. imperative with support for green threading) parallelism, check out the Flume channel library.

    • @davidboreham
      @davidboreham Год назад +4

      Go's channels and goroutines were inspired by Occam.

    • @ssokolow
      @ssokolow Год назад

      @@davidboreham Yeah. I guessed as much. That's part of why I said "nice to see those ideas are getting a second chance".
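    For anyone who hasn't seen the Occam lineage in practice, here is a minimal, illustrative Go sketch (my own example, not from the video): unbuffered channels give you CSP-style communication, and fanning work out across goroutines plays roughly the role that a PAR block played in Occam, or that Rayon's `par_iter` plays in Rust.

    ```go
    package main

    import (
    	"fmt"
    	"sync"
    )

    // square sends n*n down the results channel. Each goroutine is
    // loosely analogous to an Occam process with an output channel.
    func square(n int, results chan<- int, wg *sync.WaitGroup) {
    	defer wg.Done()
    	results <- n * n
    }

    func main() {
    	nums := []int{1, 2, 3, 4}
    	results := make(chan int, len(nums))

    	var wg sync.WaitGroup
    	for _, n := range nums {
    		wg.Add(1)
    		go square(n, results, &wg) // fan out, like an Occam PAR
    	}
    	wg.Wait()
    	close(results)

    	sum := 0
    	for r := range results {
    		sum += r
    	}
    	fmt.Println(sum) // prints 30
    }
    ```

    The processes never share variables; all data moves over the channel, which is exactly the discipline Occam enforced.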

  • @ewookiis
    @ewookiis Год назад

    Since I was born in the 80's, this was something I had no idea of - ever! Amazing and interesting!

  • @flagger2020
    @flagger2020 Год назад

    Love it. Fast RAM usage was fun: take the routine you want to run, put a blank routine behind it, take the diff of the two *fp's - that's the size you memcpy to fast RAM starting at the first *fp - then run the routine from the fast RAM ptr... all in CSTools of course. Reading Uni CS had a CS1 paid for by the EU ESPRIT project, and as the DAP support officer for the CompCtr I had the two machines side by side using Sun systems as hosts. Then for my PhD I wrote a compiler that treated them as a single machine, with a few Sparc workstations thrown in for good measure. Happy times. Never got to use the EPCC system but did get time on the ECS Southampton CS2. After porting pvm3.3 to the CS1 I ended up working with ORNL on pvm3.4 and eventually finished on openMPI. That T800 machine was the start of an amazing journey. Thanks for the memory. Don't forget the innovative folding occam editor!

  • @Texacate
    @Texacate Год назад

    I was an intern, and later a design engineer at inmos Colorado Springs. Great experience!

  • @ATomRileyA
    @ATomRileyA Год назад +2

    Really enjoyed this video, great overview on transputer. Always sad to see these ideas ruined by mismanagement and lack of interest which seems to be the UK's curse sometimes :)

  • @JohnVance
    @JohnVance Год назад

    This is really cool and was totally off my radar! Subscribed!

  • @cdl0
    @cdl0 Год назад +2

    Great video! Meiko Scientific did have some modest success with their Transputer-based _Computing Surface_ systems, IIRC. Also, the original ARM processor was designed with support for co-processors in some form, and its instruction set has a fairly close relationship with C-like languages.

  • @eternaldoorman5228
    @eternaldoorman5228 Год назад +1

    The quality of comments on this video is outstanding! What a community!❤

  • @jimhawkins2
    @jimhawkins2 Год назад +1

    Very good. Incidentally, Metacomco did NOT produce the Amiga operating system. The Amiga team did not have time to finish their disk operating subsystem, and licensed one from Metacomco. Tim King was at the first UK developers' conference in 1985 to talk about AmigaDos, which was written in, and could only be used through, BCPL. This was a major pain in the arse for early Amiga developers like me.
    My company at that time - Sophos Software - provided Amigas and support for two electronic engineering students at Hull University, where we were based on the Science Park, to do a final year project to produce a Transputer board for the Amiga 500. It worked, but didn't come to market for various contractual reasons.

  • @mmille10
    @mmille10 11 месяцев назад

    It was icing on the cake that you talked about the ATW. That's how I found out about transputers. I was following the Atari news at the time. I heard a little about it, and found a little video on it, as Atari was demonstrating it at Atarifest in Germany. It had graphics features that we wouldn't see in commercial PCs and Macs until the late '90s, or early 2000s. Though, the Atarifest video (which was short) was the last I saw of it for many years.
    What I've read about it more recently was that most of the ATWs were sold to Kodak. A few were sold to universities. I happened to find a rare, archived video that featured an ATW, some years ago, from the Univ. of Hamburg, as I recall.
    Atari also briefly flirted with making a Unix workstation out of the TT030, with a 32 Mhz 68030, 19" monochrome monitor, and X/Windows, called the TT/X. They sold it, but only for a few months in the early '90s.
    It was too bad, since I liked the fact that Atari was making high-end systems. I never heard why these efforts were so short-lived. It just seemed to be the usual story, that Atari couldn't get its shit together. Another obvious answer was that Atari had an image problem, "game company." Also, the fact that Jack Tramiel was at the helm, I hear, was a factor. He'd burned a lot of bridges prior to buying Atari.

  • @limsolo
    @limsolo Год назад +1

    Well done, and thank you. I've been waiting for somebody to do a good account of Inmos and the Transputer. The first time I heard about this I was a kid: they had two Beebs hooked up on Tomorrow's World, attached to the Transputers, and a demo showing a butterfly moving from one screen/Beeb to the other. I remember being blown away by that. Must've been mid 80's, maybe slightly earlier. I think it was Judith Hann presenting.

  • @Parker8752
    @Parker8752 7 месяцев назад

    The idea of channels has of course been brought into asynchronous programming in modern languages, because it's still a really good idea to make sure that separate threads don't get to share variables.

  • @alaningham1398
    @alaningham1398 Год назад

    Wow - this is a blast from the past! David May was my tutor at Bristol!

  • @edrose5045
    @edrose5045 10 месяцев назад

    David May was my lecturer for Computer Architecture at Bristol. I remember presenting a very sub-par coursework assessment to him in a viva, and it was pretty intimidating

    • @RetroBytesUK
      @RetroBytesUK  10 месяцев назад

      That's not going to have been the easiest hour of your life.

    • @edrose5045
      @edrose5045 10 месяцев назад

      @@RetroBytesUK It was worse because I had a lot of respect for both David and the other lecturer in the viva. I knew the coursework was not very good so I knew it was going to be uncomfortable.
      David is very intellectually intimidating too - you can guarantee that he'll be able to question every detail of your answer to test your understanding, so there is no point trying to blag it.
      Despite my bad coursework on one of his units (which was my fault), I still feel very lucky to have been taught by him

  • @herebejamz
    @herebejamz Год назад

    As a '92 baby, I really appreciate the time and effort people go to in discussing pre-net and early net technologies when the future was so vastly open to possibility. Now it feels like a lot of things if looked into enough are starting to crystallize and feel "solved". We're at the point to where we're basically waiting for theoretical super-materials to be created before any real paradigm shift. No wonder sci-fi was so much fun back in the day. Back when people could've at least imagined a future where some guy in his garage could scrounge up enough tech to build something almost as good as what they had at Harvard at the time.
    Now it's all proprietary silicon made with delicate multi-million dollar machinery... :(

  • @brettemurphy
    @brettemurphy Год назад +1

    In 1987 I did my fourth year Digital Systems Engineering degree thesis using Transputers and Parallel Fortran to implement a medical professor's algorithm that removed scattering from x-ray images, they were beautifully suited for this sort of thing.

  • @masonthedunce3711
    @masonthedunce3711 Год назад +1

    Always excited to see an upload :)

  • @anonamouse5917
    @anonamouse5917 Год назад

    The only time I heard anything about a Transputer was an ad in Byte magazine.
    Now I know what it is !

  • @IvanToshkov
    @IvanToshkov Год назад +4

    11:50 - this reminded me of the game TIS-100. I guess it might have been inspired by the transputer design.

  • @MarkSeve
    @MarkSeve Год назад

    Very cool. I remember hearing about the transputer, but never saw one. thank you for sharing.

  • @OverDriveOnline7921
    @OverDriveOnline7921 Год назад +3

    Ah, the good old ATW, as I remember, Kodak were the main (only?) purchaser of that system buying around 100 machines for image processing.
    Fun fact, my first awareness of the Transputer was in 1987, through a review of an add on box for the Atari ST which contained a single transputer for, well I guess anything you wanted to do with it. I remember a picture of the inside of that box showing the transputer chip in the centre, with 4 empty sockets surrounding it, and the review mentioning that more powerful boxes could be bought with more transputers in them to “fully exploit the power of parallel computing” and that transputers were “The future of computer technology”. For an Atari ST peripheral, it was about £800, so 15 year old me wasn’t going to save up for one anytime soon, also not sure if any of the upgraded boxes ever shipped as they were never mentioned again!
    Bonus fun fact, the people who developed the ATW also shrunk the ST into the Atari Stacy, which was done alongside adapting Inmos reference designs to work alongside the Mega ST so that the ATW had a bootstrap system to wake the transputer side up and do some work.

  • @PrinceWesterburg
    @PrinceWesterburg Год назад +1

    Transputer - Gets its name from the fact the grid of cores transfer data between them in a programmable manner, so every task has the best architecture needed.

  • @stephenbooth1629
    @stephenbooth1629 Год назад

    Well that takes me back. I was part of the Edinburgh research group you showed (actually I'm still there :-) ) One lesser known fact is that the transputer had hardware thread scheduling. Features that are usually part of the software kernel were actually directly implemented in the hardware. This was particularly important when it came to the implementation of channels. When a thread waited on channel communication, its program counter was actually stored in the channel itself. When the other end of the channel became active, the thread was automatically re-added to the thread run-queue by the communication instruction. The transputer (and much of the software that went with it) was brilliant in many ways, but the whole effort did suffer a little because the designers did not care about supporting any other software. Occam did not need an MMU because its memory use could be mathematically proved to be correct. Therefore anyone who needed to run code written in C had a real hard time.

    • @RetroBytesUK
      @RetroBytesUK  Год назад

      That's great to hear some of the team from then is still there. Are there many of you from the transputer days?
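      The rendezvous behaviour described above - a thread parked in the channel itself until its partner arrives - is the same synchronous semantics that unbuffered Go channels model today, albeit scheduled by the Go runtime rather than by microcode. A small illustrative sketch of my own (Go rather than Occam):

      ```go
      package main

      import "fmt"

      func main() {
      	ch := make(chan string) // unbuffered: send and receive must rendezvous

      	go func() {
      		// This send blocks until main is ready to receive.
      		// Conceptually the goroutine is "parked in the channel",
      		// much as the transputer stored the waiting thread's
      		// program counter in the channel word itself.
      		ch <- "hello from the other process"
      	}()

      	msg := <-ch // rendezvous completes; the sender is rescheduled
      	fmt.Println(msg)
      }
      ```

      Either side can arrive first; whichever does simply waits, and the runtime requeues it when the other end shows up, just as the transputer's communication instruction re-added the waiting thread to the run-queue.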

  • @loudscotsbloke
    @loudscotsbloke Год назад

    Cheers John, a brilliant video mate!!