These are the people that deserve our respect and admiration, not the politicians and celebrities people are rambling about.
People working at IBM, Bell Labs, people who designed according to MIT principles of design. It was Engineering with a capital E.
Not to mention that they made current tech possible.
They designed things that didn't break, and they thought their designs through. As opposed to what we have today, which is a shame.
Windows 10, Apple, planned obsolescence. They don't waterproof their laptops - technology that was designed 10 years ago -
and they don't release schematics so engineers can repair equipment. Back in the day you could get a schematic if you asked nicely, even if you weren't a business. Now companies go out of their way to make your equipment break as quickly as possible, so you have to buy new.
I swear, I am a millennial, maybe I am romanticizing, but it seems to me that back in the day people had ideas.
Today people's approach to living is pathetic. Everything is for sale, and people don't care about anything, besides showing off
that new iPhone, or a computer that they can't even utilize.
Matthew Nielsen they should be our government
There is truth to that and not. Read Thomas Pynchon; a lot of his stuff focuses on how commercialization causes the misdirection of science, and he mostly wrote in the 70s. Capitalism is innately tied to consumerism. I'm pretty sure the first scientists to actively create incorrect information as an exercise in brand management were working for the lead consortium, and that was 100 years ago. But don't quote me on that. Anyway, I wanted to add that some Apple heatsinks literally seal themselves to the keyboards. I was repairing cell phones for a while (terrible gig, don't do it unless you gotta), and that's where you see the planned obsolescence the most. It's pathetic. Older-model Samsung phones have a much better and easier-to-repair construction than newer ones. And don't even get me started with Apple; they've purposely turned their chassis into shit. It is a sad state of affairs, indeed. Then again, through the internet, and hopefully into the future, with the decentralization of information and the growing necessity for competence in a digital world, I see people being thrust toward technology if only to put food in their mouths. Either way, I figure the earlier you start trying to become more intimate with lower-level processes, the better. Open source - that's how I see the future. And, based on their most recent actions, so does Microsoft. As genuinely surprising as that is, lol. Or not surprising. I'm sure you know what I mean.
Planned obsolescence is not the major issue for me; the major issue is that the products you mentioned AREN'T fully operational by the end of their life cycle. That's what is really wrong with today's tech.
Agreed
Most of those made more money per month than most people got in years, so I wouldn't feel so sorry for their lack of appreciation.
I have great appreciation for the early pioneers such as Dr. Widrow. Not many years after the era of the Whirlwind, as a young 18-year-old recruit in the USAF, I learned what constituted the basics of a computer in February 1964. Starting that day and working in all my spare time, I managed to build a simple, fully programmable digital home computer by August 1967. Entirely built using discrete components (no ICs), its magnetic drum memory stored 128 10-bit words. Its hardware could directly add, subtract, and multiply as well as perform conditional and unconditional branches. Input was done using a 9-button keypad and output was via a modified typewriter. Upon completion, that project launched me into a 6-week job as an IBM 360 operator, immediately followed by a 43-year career as a software engineer, retiring at the end of 2010. I have seen and used many of these magnetic core memories during my lifetime, as well as most of the memory technologies that have followed.
Joseph Watson cool story bro
Very interesting story Sir!
''immediately followed by a 43-year career as a software engineer, retiring at the end of 2010.''
Elementary...dear Watson!
Where I worked at one time, there was a manager who had who knows how many degrees. He was a professor at a college, so he really knew computers. Oh, and in the 1980s he was in his 60s or so. He liked the older technologies, and we had a huge - and I mean huge - teletypewriter up against the wall that no one touched, but he used it for something CICS-related. Unfortunately, whatever the man had going for him in the brainiac department, he was missing completely in the interpersonal department. Personnel told him to take a leave of absence and take a vacation, as they were afraid he was going to go postal.
Are there any more videos narrated by Dr. Widrow? The way he talks makes this topic so fascinating and his voice is clear and warm.
A real pioneer of electronics. It is sad to see modern electronics firms fighting over patents that concern finger swipes, or rectangular cases with curves on them - kinda pathetic when compared to something like this. Apple and Samsung should be ashamed.
Mainly Apple, because they are the biggest patent trolls in history, whereas Samsung have actually had a good history of sharing intellectual property, and most of their patent cases now are in defense against Apple in a kind of Mutually Assured Destruction.
It's not Samsung's fault, it's Apple's. Apple is a patent troll, stealing already-made technology and changing its shape, then yelling "I invented that, you can't have it."
Samsung still contributes to open source, Apple doesn't.
Aah, the nostalgia! I got my first paid programming job on a DEC PDP11/40 back in 1973. It had IIRC 16k x 16 of ferrite cores, and each chunk (4k?) was three huge PCBs, the core plane, the drivers and the sense amplifiers. Remember the destructive read, and the Unibus DATIP cycle that paused a memory operation in the middle so that you could do a read-modify-write efficiently? Those were the days when programmers were real programmers.
Interesting: until about 3 years ago, the last ferrite memory computer was being used by Polish State Railways (PKP) to compute train routes and timetables!
+Richard S Could they really not afford to hire an intern to write that into an applet for a laptop to compute? The cost of running that behemoth must have been insane, would have been better served in a museum IMO...
+Teth47 they could have, but the IT staff were very happy with their antique computer. They were forced to upgrade about 2 years ago and immediately there were problems!
Richard S There're always problems when something changes; it's a system everyone's not used to yet. The point is that it's way cheaper and better once it's going smoothly.
+Teth47 I believe commercial computer manufacturers settled on hiring Asian companies who hired women to weave core planes. There were also wiring jigs with grooves for wires and slightly deeper "pits" into which the cores would settle when shaken, with the proper orientation. Cores would be poured from a container onto the jig like pouring salt, the jig would be shaken until the excess cores fell off into a bin for reuse, and each necessary core was sitting in its pit with the hole oriented to insert the wires. Every core had four wires going through it: one passing through each row, one through each column (one row and one column carrying a pulse of current just over HALF the amount needed to switch the core's magnetization), one diagonally through every core in the plane to sense, when a core was reset to the 0 direction, whether it had changed from a 1 (thus reading the data), and another wire going diagonally through the plane on the opposite diagonals to inhibit the writing of a new 1 in the cores which contained (or the computer needed to make them contain) a 1 before the first pair of pulses. After the cores were settled properly in the jig, the workers ran a separate wire for every row (in this case, 64 rows), a separate wire for every column (in this case, 64 columns times 64 rows make 4096 cores), and the sense and inhibit wires through every core, according to the diagram.
Core memory was destructive read-out; the memory would write a zero and amplify any pulse on the sense wire, storing the bit in a temporary holding register (one flip-flop circuit per bit). If the computer was writing new data, that would be replaced; if it was reading data, that holding register would be made available to the computer. In either case, the old or new data would be written back; the row and column would be selected to send pulses in the opposite direction, trying to write a 1, but the inhibit wire would have a pulse set up to oppose them if the bit was supposed to become, or remain, a 0. And this remained the most effective memory technology until the production of large enough semiconductor chips.
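If it's easier to follow that read-then-write-back cycle in code, here is a tiny Python sketch of coincident-current addressing with destructive read-out as described above; the class and method names are my own illustration, not any real controller's interface.

```python
class CorePlane:
    def __init__(self, rows=64, cols=64):
        self.cores = [[0] * cols for _ in range(rows)]  # one ferrite core per bit

    def _write_back(self, row, col, bit):
        # Row + column pulses (each just over half the switching current) try to
        # flip the selected core to 1; the inhibit wire opposes them when the
        # bit should stay 0. The core was already cleared by the read phase.
        if bit == 1:                  # no inhibit pulse
            self.cores[row][col] = 1

    def access(self, row, col, new_bit=None):
        # Read phase: drive the core toward 0. A core that flips from 1 to 0
        # induces a pulse on the sense wire, captured in a holding register.
        sensed = self.cores[row][col]
        self.cores[row][col] = 0      # destructive read: the bit is now erased
        # Write phase: restore the old value, or store new data if this was a write.
        self._write_back(row, col, sensed if new_bit is None else new_bit)
        return sensed

plane = CorePlane()
plane.access(3, 5, new_bit=1)   # write a 1
print(plane.access(3, 5))       # 1 -- read erased it, then the write-back restored it
```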
+Richard S
Punctuation is a thing.
I could listen to this guy talk all day about the old stuff and how much things have changed. Great video.
Hilda! She had knowledge of actually creating the first RAM! Think if we were starting all over again! How invaluable she and Dr. Bernard Widrow would be! What a team!
I'm in awe of what these engineers and scientists did back in the day. I only hope I can make 1/100th of the same contribution to humanity these guys did. I would die happy.
Thank you for your time and contributions.
I worked on some of the first 32 bit mainframes in the early 70's. They had 8192x32 bit core memories (in this case made by AMPEX), basically the same as what Dr. Widrow was holding only smaller and denser (each module was about the size of a medium sized hardback book).
Do you have a piece of those? It would be great to cut it in half and look at it for a couple of hours. xD
@@RREDesigns 32 bit or 36 bit?
@@rra022001 Either would be awesome. :D
Very interesting. I have the memory plane out of the last Univac II that was operational. It is only approx. 6" x 6". Another programmer and I were the last programmers to work on this computer, for Franklin Life Insurance Co. in Springfield, Illinois, in 1973.
Thanks for posting a great video. My dad didn't deal with the field directly, but he used to sell computer connectors and computer boards and cables, and he remembered core memory.
What a great effort to make this very personal view on computer history.
+roff poff
No. No it wasn't. A couple cameras, a script of questions, an editor and bang. Not actually an effort.
+Nuckelhedd Jones I have never done this before but here it goes - go and fuck yourself. By your logic going to the moon is just a few people, some fuel and a rocket. I bet you are the kind of person that diminishes the work of others while valuing your own above its true worth.
+Nuckelhedd Jones by stating that it's not a personal story you're trying to say that this video is full of lies. Why? Because all the information in this video comes from this great man who was a part of that research. He's speaking in first person.
@@nuckelheddjones6502 autism
Those 20 facilities were built in 1962 and were the Air Traffic Control Centers of the US, under the CAA, later to become the FAA... and yes, IBM made the computers; Raytheon made the displays.
Sounds like another example (like Edison/Tesla) where the big guy (IBM) bullied the little guy out of his innovation and turned a profit. I tip my hat to you Sir for your hard work and brilliance!
Very nice to watch this video. I just finished building my own magnetic core memory plane last week though it is only 16 bits. Works great and a great learning experience to see how it was done years ago.
This was a real treat. THANK YOU FOR SHARING!
so cool, I had a CCNA teacher that had the successor to that memory module sitting on his desk. love seeing where this all started
Thank you, Dr. Bernard Widrow, for your contribution to the future of humanity.
I have watched this video many times, but only now am I commenting: this man is awesome for his invention, and the intelligence that comes through in his storytelling is overwhelming.
4,096 bits... That's interesting... My Linux computer stores passwords with PGP encryption; the key is 4096 bits as well, so the key alone (no data) would fill this entire plane... Wow. It's neat, though, how you can see the actual wires and bits inside it. (And if it broke, you COULD actually fix it, as much of a pain as it would be.)
And files are stored in 4K chunks, interesting
Magnetic Core Memory was an amazing development for its time. I am old enough to remember when there were computers with Magnetic Core Memory. One of the things that made Magnetic Core Memory expensive was the fact that it had to be made by hand weaving. There was a time in the past when IBM employed large numbers of women to make this memory. With the modern Static RAM and Dynamic RAM this type of memory has been obsolete for a number of years. It was modern Static RAM and Dynamic RAM that helped get the price of computers so low that almost anyone can afford a computer.
Niiiiiiiiiiice... I love watching movies about computer history.
And I remember 'only' the ZX80 with 1 kB of RAM. :)
I spent hours programming this 'little baby' as an 11-year-old kid.
Later I got a 64 kB expansion card... and I was the KING of the WORLD!!! ...then... :-D
The bridge from analog to digital. This man and men like him will be venerated in the future.
But they should be praised now, while they're still with us.
You can see it? So like if I ran my program through that memory, I could see what love is?
Awesome video. Thanks!
So the memory stacks used to be real physical stacks back in the day.
"What's the hurry? Why do you have to go so fast?" Brilliant. Dr. Widrow's dad was the inventor of dad jokes.
I worked with a Data General computer in the early 70's that had 8K x 16 bits of core memory on 2 modules of 4K each. The input/output device was a model ASR-33 Teletype machine at 10 characters per second. The great thing about core memory was that once the program was loaded, it was immediately available whenever the computer was powered up (semi-ROM).
On today's memory you would need a scanning electron microscope, and you would never technically see them.
Here is a video showing off the Whirlwind computer mentioned in this video: 1951 -Modern Computers MIT Whirlwind I with Real Time Video
It amazes me that this worked.
This really wasn't that long ago. Imagine how far technology will have advanced in a couple of hundred years from now.
Actually, he is mistaken about one tidbit: The IBM 704 was the first to feature ferrite cores; the 701 used Williams tubes - which was its main handicap!
Core memory was probably offered as an upgrade to the 701, though...? After all, it originally came out in 1952, so it wouldn't have been something he saw "come out" _after_ they stole the MIT work (...something which was probably seen as fair game at the time, as they were both doing government work on the respective machines - the 701 being built first for the DoD - and anything invented at a university was free for the good of all mankind, right...?), but they might have introduced a revised model with upgraded memory as a cheaper alternative to the 704...
The processor doesn't know what kind of memory it has attached, after all... it submits a read request to the memory system and waits for the data to come back. Maybe it's faster, maybe it's slower, but the actual interfacing is dealt with by a different module.
EDIT: in fact, from an article on the 701...
"Memory could be expanded to a maximum of 4096 words of 36 bits by the addition of a second set of 72 Williams tubes or (later) by replacing the entire memory with magnetic-core memory" (using an IBM 737 magnetic core unit, which was the standard configuration for the 704).
The bigger advance the 704 made was the addition of index registers to the CPU, which (for reasons I don't fully understand myself) made it much more capable of performing complex calculations at high speed, so it still considerably outperformed a core-equipped 701.
Yes I'm sure that the guy who was there and developed it and saw firsthand what it was used in knows less about it than the guy who's looking up all of his information secondhand on the internet.
Nice historic video! Great time to be alive for the first 'Digital Guys'. And kudos to Hilda the core knitter - 4096 lattice points and she got 'em all aligned right. Though I'm in my 50's and have been using & teaching about computers, I hadn't run across 'Whirlwind' until just recently (2 days ago, 11-20-16) and now in this video. Good protection of trade secrets!
boboala1 Right now we just need quantum computing to be commercialized. Again, China launched a satellite this year with this, for a secure comms network. NASA (and I think Google) has one as well.
Pre-Google: what is 'quantum computing'? Does it have anything to do with nuclear wessels?
boboala1 no. Basically it uses "quantum physics" to perform calculations. Much more complex, and it simulates a biological brain, which is more powerful than a PC
Is your avatar/pic the guy off the animated 'Heavy Metal' movie? I love it!
It's 2024 and I think Dr. Widrow is still alive.
I imagine Hilda sat in a rocking chair with a blue rinse and knitting needles making that ram.
Damn, making those memory planes by hand must have required some serious patience.
Seriously, those men are legends.
I worked for DEC in the 80s
That's amazing, especially to think one bit per panel... now we have 4 gb on a thing like 1/6 of that size.
Amazing, thank you for this video.
Magical memories.
Hilda did a good job !
Thank you for very informative video!
I think they were called mainframes because of the frames that held the memory lattices.
And all the other actual processing hardware, interfaces, tape drives etc; the computer itself resided in the _main frame_ , and it was controlled by various remote video or teletype terminals (in the same room, elsewhere in the building, or working over a phone line) and accessed card readers, hard disc packs etc that were freestanding. But everything absolutely needed to run some kind of basic program was contained within the main 19 inch by ~7ft frames. A standard that's persisted into the modern day for server farms and the like because IBM designed them to fit very neatly through a standard size doorway with just enough wiggle room to allow for hallway carpets and the door itself, and so you could get your fingers around the edges to pull it into the computer room without skinning your knuckles _every_ time.
Minicomputers then originally meant anything smaller than that, particularly something that would fit into the size of a mainframe accessory like a card machine or terminal desk. Which is why we have the otherwise seeming misnomer of "micro" computer for those which fit into the space of a terminal or remote access modem. Perhaps thankfully the terminology moved to "laptop" and "pocket" for the further reduced forms instead of "nano" and "pico" etc :)
Wow, that's so awesome, I'd love such a thing, such technological history :o) Wow, nearly blew my ears off with the bass in the outro!! lol
Amazing. Thank you man.
awesome story
Actually it's 4kb/panel but yes, it's pretty amazing how far we have come.
4k *bit* ...
To have today's technology, a lot of things have been updated between then and now.
in memory of Hilda.. unsung dream weavers..
hey internet.. this ol'doc stores forgotten memory.. ruclips.net/video/HPT7Wtp3yoo/видео.html
Well, I wish I lived back then; now, thanks to Windows 10, I get memory errors a couple of times a month.
Pioneers. We rest on the shoulders of giants.
That was awesome. :)
small transcription error: parody checking should be parity-checking
+RyuDarragh I'll give you a thumbs up since I may be the only one who understands the panametric conundrite of your reference.
+Richard Smith PARITY refers to the oddness or evenness of the ones in a group of bits. Data was stored with one or two extra "parity" bits set to one or zero to make the whole group of bits have odd parity (in a few applications, even parity was the standard). Upon reading the bits back, if there was still an odd number of ones, most likely the data was correct (barring two simultaneous errors); if not, either the parity bit or one of the data bits had changed, so the data could not be trusted. More bits for error checking (redundancy) and a more complex formula allowed CORRECTING a single bit error, and detecting a double bit error.
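A minimal sketch of that odd-parity scheme, just to make it concrete (the function names are mine, purely for illustration):

```python
def odd_parity_bit(data_bits):
    # Choose the parity bit so the total count of ones (data + parity) is odd.
    return 0 if sum(data_bits) % 2 == 1 else 1

def passes_odd_parity(stored_bits):
    # The stored group is trusted only if it still has an odd number of ones.
    return sum(stored_bits) % 2 == 1

data = [1, 0, 1, 1, 0, 0, 1, 0]
stored = data + [odd_parity_bit(data)]
print(passes_odd_parity(stored))   # True -- no error detected

stored[2] ^= 1                     # a single flipped bit (e.g. a dropped core)
print(passes_odd_parity(stored))   # False -- error detected, but not correctable
```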
This is so cool !
As the beer commercial says.... real men of genius
I found a mistake in the cc at around 2:40. It's parity, not parody.
+web1bastler Thanks, we produce a lot of video products so it's nice to get input on small things that are easy to fix after the fact.
+web1bastler
You may want to re-evaluate your life.
+EdisonTechCenter
Except this isn't really an issue. Most of the kids watching this today, don't know the difference. They can barely spell their names. And as soon as common core takes hold, they won't even need to do that to pass.
+Pablo Pablo Let's hope so. Go out, open up some eyes, shock some people. Just teach everybody everything you know. They want us to be uneducated so we go with the flow. I have been realizing, while furthering my education, that it is hard to learn anything that is truly worthwhile unless it takes two years or more. I also realize that most of the work we do is for the benefit of someone else.
These were not really RAM chips, but magnetic core memory. I still have a 2K x 12 bit magnetic core memory which I salvaged from the trash when I worked at Digital Equipment Corporation. Very expensive to construct these by hand, but retained information without using any power, although that feature was rarely used. They needed relatively high-power electronics to drive the wires that read ("sensed") the current state and could change the state of the core at position x, y, and z. They were SO economical as compared with mercury drum, magnetic drum, or other kinds of memory.
+David Spector Yes they are. They are cards that contain memory that can be accessed at random. That's what RAM is.
It's "they're". Just kidding. I saw the nuckelhedd comment and couldn't help myself. On another note, people tend to forget that RAM isn't just the volatile stuff that forgets everything once you cut the power. RAM is any memory device where you can access any part at any time. All of the solid state memory ever created is one form of RAM or another. magnetic tapes, hard disks, and CDs are not RAM since you can only read one bit at a time.
Jason Queen Nope, hard drives are random access storage too. The number of bits that can be read (Not one in the case of most of those formats anyway, BTW) has nothing to do with whether or not a given storage type is RAM. Tapes are an example of sequential storage, in order to read data in the middle of the tape, you have to scroll through the whole tape, you can't skip a part because of how tape is read. Hard drives and CDs can jump tracks and rotate directly to any point of data at any time, without having to traverse the circumference of every track in between, that's what makes the distinction.
Teth47
You make a good point, but I always saw them as sequential since you still have to move a read/write head to a location and you can't read/write multiple locations at the same time. When I was a kid, I wondered if you could make hard drives and CD drives faster by having more than one read/write head per disk.
Jason Queen The differentiating point between sequential and random access memory is how the reader head is capable of moving, not whether or not there is one. A hard disk or CD's read head can move directly to any point of data on the disk without having to sequentially read every bit between those two points, like a tape would. If a hard drive were sequential storage, if you tried to copy two things that were far apart, it would take hours to move even small files, because you'd basically have to read the entire drive to find the spot you're writing to. Hard drives also generally have more than one R/W head per disk, but only one per surface. Adding a second channel to each disk doubles the amount and complexity of the supporting circuitry, so while it would theoretically increase performance, it's cheaper and more effective to simply use two hard drives, which is what RAID is.
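A toy cost model of that distinction, if it helps (the numbers and function names are purely illustrative, not a real drive model): on tape, reaching an address costs time proportional to the distance the head travels; on a random-access device it's roughly a constant.

```python
def tape_access_cost(current_pos, target_pos, cost_per_cell=1.0):
    # Sequential storage: every cell between the two positions passes the head.
    return abs(target_pos - current_pos) * cost_per_cell

def random_access_cost(current_pos, target_pos, fixed_seek_cost=1.0):
    # Random access: the head (or address decoder) jumps straight to the target.
    return fixed_seek_cost

print(tape_access_cost(0, 100_000))    # 100000.0 -- grows with distance
print(random_access_cost(0, 100_000))  # 1.0      -- independent of distance
```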
Wow, that was awesome
it's like a giant processor
Respect to Dr. Widrow and all who made it possible including the political system which spent the money and ...
Microsecond access time? So you could still (apart from physical dimensions and interfacing difficulties) successfully use it for some kind of mechanical machine control application. Today's simplest microcontrollers still work at these speeds.
+pvc988 It was used between 1944? and 1978? in several general-purpose computers, including the one I programmed, the LINC, which was developed by Lincoln Labs and manufactured by Digital Equipment Corporation.
+pvc988 According to one source, it was first developed for JUKE BOXES (coin operated machines which played music stored on phonograph discs in the machine) to allow multiple selections to be entered at once: the core for a particular song was set to a 1 when purchased, the machine would scan for the next core with a 1 and play that song, then reset its core to 0 while playing, then look for another core with a 1. From there to computers with many more cores!
+Allan Richardson Yeah. Today that would be a job for a microcontroller.
It's not even as slow compared to modern memory as made out in the video... DRAM has reached a bit of a plateau in terms of access speed that's equal to no more than about 200~266 MHz, so 4~5 ns (for pure random access; bulk burst reads are somewhat accelerated)... so "thousands of times faster" isn't really accurate. Though the on-die and SRAM-based cache of a modern CPU does run at some GHz and in terms of capacity can be many times more than the total storage a classic mainframe ever had online, so there's still some truth there.
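The back-of-the-envelope arithmetic behind that, using rough order-of-magnitude figures rather than measurements:

```python
core_access_s  = 1e-6   # ~1 microsecond per access, as quoted in the video
dram_access_s  = 5e-9   # ~4-5 ns for a pure random DRAM access, per the comment
cache_access_s = 1e-9   # rough figure for on-die SRAM cache

print(core_access_s / dram_access_s)   # ~200x  -- "hundreds", not "thousands"
print(core_access_s / cache_access_s)  # ~1000x -- cache narrows toward the claim
```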
i love technology. but not as much as you, you see. But i still love technology
This was great ❤
Just Awesome!
But how did they come up with the idea of where to start building it?
Whoever named this video clearly doesn't understand anything about computer or electronic hardware, as a chip is an integrated circuit, and this is about the furthest thing from an IC you can get. Otherwise it's a very interesting video. You could have called it a RAM module, or memory module, or something less specific.
Wow! So cool.
I ran across this old digital computer patent 3190554 that used compressed air instead of electricity to compute. Was this computer ever built and used for anything? Could one be built today using 3D printing? If Babbage had gone this route, could his Difference Engine have been built by the folks that built pipe organs? Seems like this computer would have been electromagnetic-pulse proof, and that would have been handy during WW3. Thanks for the post.
+ufoengines Fluidic devices cannot operate as fast as electronics, but for a low speed application fluidics would be feasible, although expensive (that may change with nanotech). Some manufacturers of the furniture-sized hard drives of the day experimented with using fluidics to do the power-up power-down and disc switching logic in order to avoid EM interference, but it turned out to be cheaper (to build AND maintain) to use electronics and just shield the data circuits from the EM pulses. There was also talk about using rocket exhaust in the guidance logic of the boost phase. As far as I know, none of this actually reached the market or the space hardware of any country's government.
Vacuum tubes are also EMP proof as far as being destroyed is concerned. They may respond one time to the pulse, but it won't destroy them.
I’m trying to imagine being smart enough to tell when the computer is giving me wrong data.
I think he’s describing computers that were made entirely of vacuum tubes. Solid-state computers wouldn’t require half the building for air-conditioning.
I believe in 1940, there was a magnetic core memory that used mercury & some chemicals.
Amazing
So that plate is 4k? damn
samljer 4 kbit, if I have been following it correctly. You'd need at least 8 of these memory planes for 4 kBytes, more if you wanted to add parity checking. Expensive to make.
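Spelled out (assuming one parity bit per 8-bit byte, which is my assumption, not something stated in the video):

```python
bits_per_plane = 64 * 64            # 4096 cores, one bit each
target_bytes   = 4 * 1024           # 4 kBytes
data_planes    = target_bytes * 8 // bits_per_plane
parity_planes  = target_bytes * 1 // bits_per_plane   # one parity bit per byte

print(data_planes)                  # 8 planes for the data alone
print(data_planes + parity_planes)  # 9 planes with single-bit parity added
```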
Most of my computer products become unusable before the guarantee expires, as in can't do the updates!
that's 1 k of RAM
Sounds so much like Jim Morrison it’s unreal.
He doesn't sound like a heroin addict to me?
AWESOME
Working on it, but life keeps throwing one bag of shit infront of me after another :o(
parody checking is important !!!!
There's still electricity... There, I said it ⚡🔌
Whoever wrote the subtitles, it's "two bits for PARITY", not PARODY.
Wasn't meant as an offense, I am constantly amazed at the bad grammar/spelling that gets through. But then I'm constantly having to fix Siri's mis-hearings, some of which are hilarious!
Happy to correct a minor mistake too, for the pleasure of sharing. If these magnetic memories were working at the microsecond time scale, I would say that the most recent computers as of 2016 access memory approximately 2000 times faster, instead of "hundreds of thousands".
I wonder also if the title "chip" applies to these magnetic memories. I was thinking "chip" means an integrated circuit, fairly small, encapsulated in a water/chemical-resistant ceramic or plastic.
So where's the chip ?
This is not a RAM chip. A chip is by definition manufactured directly onto the surface of a piece of semiconducting material. This is a plane of magnetic core memory.
Nor is it the first random access memory device. That accolade belongs to the Williams-Kilburn tube, which used dots on a specially modified CRT to store bits.
The core memory is a great achievement of course, but please make the video title accurate.
They can fix the title of the video by just adding successful before RAM.
Hi :) Does anyone know what defines a 50 MIL core memory bead?
Who was Hilda? Somebody track down Hilda. She did all the wiring on these devices. Presumably MIT can recall who Hilda was?
She's probably dead!
before anyone dares to criticize this video, one must remember that they MUST have ram in their computers to even watch this video in the first place.
+Sokami Mashibe Yes, I just wonder if it could be realized by sequential memory in theory? Like tape, which is not random access per definition, but it sounds difficult.
It's an idea that's been tried several times before as a cost-cutting measure, both in mainframes and some early micros, and whilst it did still work the performance was pretty terrible. Rather like drum memory just without the bulk and expense of all the big whirly physical parts and multiple read heads etc.
Though we can see some of the effect in modern systems both with many-times DDR memory (which has a much slower random access speed vs sequential burst speed) and virtual memory (even more so, as it's storing data on a hard drive, equal to either very fast drum or rather slow recirculating bit memory depending on whether it's platter or SSD), which are essentially the same idea given that they're many times cheaper for the same capacity than the multi-GHz SRAM that would be needed for absolute maximum performance but whose cost means it only gets used for small amounts of cache, and these days almost always built directly into the CPU...
M I T called. They want their memory plane back. 😪
Watch, he recalls the Cold War
insane
Is there any way I could download something like this to my computer?
youtubedownloader
2:30 ITYM "Parity checking" not "Parody checking"
Why does this video have dislikes?
People are idiots!
Cool
512 bytes of memory
Cool... I'm 27 and now I look back at FPM... yeah, not slow, lol. Or dense.
I have the exact same PC, can I run Crysis 3 at ultra 4K, or maybe GTA 5? Also I want to run FFX.
I thought this stuff was ROM?
How in the heck did we go from that, to today... crazy. I'm glad for great minds.