Interestingly, from the cooler side at least, the card looks fairly modern; not knowing about the project existing, I kind of assumed from the thumbnail that this would be an ARC prototype.
Xeon Phi was born because of the failures of Larrabee. Kirk Skaugen made the decision to resurrect it as an HPC co-processor which was a brilliant move by him at the time.
I really REALLY want Larrabee-style drivers for something like Knights Mill. Larrabee was amazing for being what is essentially a CPU running games through software. Imagine the 72 cores of Knights Mill doing both CPU and GPU tasks, and with the nature of a CPU this might have actually worked with an SLI-like software solution, at the very least if the drivers allowed each PCIe card to drive an independent monitor, which would probably work with GPUs equally well.
I plan on buying an intel card eventually since I can never really afford anything beyond mid tier. I'm actually very excited about intel entering the GPU market.
I enjoyed that and understood all your reasoning for buying it. I think that's why I bought a Radeon VII, I knew it was never going to be great??? As a nerd you just have to do these things. It would be nice to see it running with a proper driver.
In 2018 my son, then a teen, came up to me and asked for this card. Based on the color, I believe, is why he picked it. Back then it was only 3500ish USD and I swore there was a write-up about this. Maybe try the Wayback Machine, cpu-tech or hardforum with the keyword Larrabee.
I have a cheap business computer, a DELL 7070 SFF with a COLOR AMD Radeon™ RX 6400 desktop graphics card, a SAMSUNG SSD 980 PRO NVMe 500GB drive, an i5-8500 CPU, two 8GB KINGSTON FURY CL16 2666MHz memory sticks, a 1TB HDD, and a case fan that I added using a CPU fan cable splitter. I have a slow silent case fan hooked up to the CPU fan connector as the master fan, and the CPU fan set as the slave. That way the PWM goes higher to get the slow silent case fan up to the set CPU fan RPM, and in turn that increases the CPU fan to a higher level than normal. So everything stays nice and cool. But what is darn confusing to me is that at AMAZON you see the earlier model, the DELL 7060, with less performance, for up to $3,000. It is crazy.
I am confused. I hear all the time that the industry needs validated products (hardware), drivers, etc. and that is the reason why professional GPUs are insanely expensive to buy, despite costing the same to produce. This card being an engineering sample and used in a CT machine is a bit conflicting.
These cards were great in advanced body mapping... I've seen it in a hospital environment... Yes, they were not designed just with gaming in mind; they're for machine applications, like the VR headsets they use to operate on a person with those heavy machines attached to a heavy computer...
The driver is a basic standard for all GPUs to be able to display basic video output without the proper drivers being installed. Without it you would only have a black screen until you install the proper drivers, which would be a lot harder to do without some visual feedback. Since it's a standard for basic functionality, it doesn't need to be updated unless they find a security vulnerability that involves it, and that is why the basic driver is from 2006.
Driver installation package was probably on that PC connected to that CT scanner. Also, the driver date is for the Windows 11 "Microsoft Basic Display Adapter" driver and has nothing to do with the card. Furthermore, the date is wrong - it just shows Microsoft's failure to update information, such as dates, in their driver files.
@@cameramaker Ah, inside, the chip architecture is the same as Xeons from that era. Besides, the card is not from before 2000; that particular card is from the 2009 era, and drivers for it are on Intel's website.
I hope someone from Intel sees this and provides you with some kind of a working driver, Win 7 I dunno, but would love to see performance in, like, 3DMark 06/Vantage and some games of the time. Merry Xmas m8.
A CT-Scanner terminal/controller PC... That is a very interesting source! I presume it was used for some sort of processing, and not just 2D graphics output.
Larrabee was the last time Intel excited me - the GPU would have become featureless - rendering engines would have become timeless software rendered modules.
I remember waiting to get a Bulldozer/Larrabee/Radeon/GeForce/Hydra system; Jim Keller was gaslighting us on what Bulldozer would do. I think Nvidia bought and buried Lucid Hydra like they did other competing solutions over the years.
The Microsoft Basic Display Adapter was written well before the card existed. They would probably have wanted to test basic display out separately from the driver development (and to aid in it), and the card would have been manufactured to give basic display out to meet the Microsoft driver requirements. So the chicken came first, and the egg was made to support the chicken..... lol
What about that CT scanner PC? The card probably didn't run on the basic display driver there if it actually did something. Was it a Windows or Linux machine? Maybe it's possible to contact someone that worked with that PC... someone from tech support... Also, 8 GB of memory in 2010??? Did you find the correct chips? Around that time cards had 256 or 512 MB of video RAM, later 1-2 GB.
First cards with 2GB of GDDR5 (like the Sapphire HD 4870 2GB and HD 5870 Eyefinity) used 16 of the 1Gbit chips (2008-2009). 2Gb chips were widely available at the launch of the HD 6000 series (used for sure on the HD 6970), which was 12.2010. They were also used later on team green's GTX 580 3GB (12x 2Gb ICs). Ofc der8auer mistook Gb for GB, or the datasheet of the memory ICs was wrongly described. That's all ;)
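The Gb-vs-GB arithmetic in this thread is easy to sanity-check: memory IC densities are quoted in gigaBITS, so total VRAM is chip count times density divided by 8. A quick sketch (the chip configurations below are the ones named in the comments; the 8x 1Gbit reading for this card is a guess, not a verified spec):

```python
def total_vram_gb(chip_count: int, density_gbit: int) -> float:
    """Total VRAM in gigabytes from chip count and per-chip density in Gbit."""
    return chip_count * density_gbit / 8

# 8 chips of 1 Gbit each -> 1 GB total, the plausible reading for this card
print(total_vram_gb(8, 1))   # 1.0
# 16x 1 Gbit, as on the HD 4870 2GB / HD 5870 Eyefinity
print(total_vram_gb(16, 1))  # 2.0
# 12x 2 Gbit, as on the GTX 580 3GB
print(total_vram_gb(12, 2))  # 3.0
```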
I would think you had contacts at Intel... you should talk to them about who would be the RIGHT person to talk to at Intel, and that might be your best bet... No matter, this was SUPER INTERESTING!!! I had thought Intel DID release the Larrabee card for servers and such... but hearing it was in an MRI box... well... I just don't believe it TBH... you should look to the server space for the drivers, you should have better luck there.
Kinda sucks that Intel doesn't have the confidence to try for a few generations. They're about to do the same thing again. Because they can't replicate the competition on gen 1 they don't feel it's useful to keep going into gen 2 and gen 3.
As I Recall... Despite being co-designers of AGP, Intel leadership was initially very opposed to the idea of complex video cards that would help process graphics data. They wanted everything to be handled by the cpu and rather stubbornly stuck to that concept for years. I think Larrabee was still catering to it somewhat, by using cpu dies rather than a real gpu like ATi and NVidia.
Interesting video! But Larrabee was far from Intel's first attempt at video cards. As far as I remember they had AGP and PCI cards (3D GPU) back in 1995-1996 with the i740, which actually sold to the public for a while. But Intel failed to anticipate the competitive products from S3, ATI, Matrox and then Nvidia for games or productivity software. I remember installing and selling those cheap cards on PCs with Windows 95/98/NT 4.0...
Are you sure those are 1 Gigabyte VRAM chips and not 1 Gigabit, for a total of 8 Gigabits or 1 Gigabyte of VRAM? 1 Gigabyte of total VRAM seems way more likely given the card's date and what other cards of the time had. My top-of-the-line ATI HD 4870 X2 had 2GB of GDDR5 RAM, 1GB for each GPU, and it was released around that time as well.
Me too, I'm wondering if it indeed had that much VRAM. Even a more recent card, the R9 Fury of 2015, had 4GB of VRAM, and the infamous GTX 970 of the same era had 3.5+0.5GB.
Sounds like the first time ray tracing was to be used in consumer GPUs. A public demonstration of the Larrabee ray-tracing capabilities took place at the Intel Developer Forum in San Francisco on September 22, 2009. An experimental version of Enemy Territory: Quake Wars titled Quake Wars: Ray Traced was shown in real-time. The scene contained a ray-traced water surface that reflected the surrounding objects, like a ship and several flying vehicles, accurately.[25][26][27] A second demo was given at the SC09 conference in Portland on November 17, 2009 during a keynote by Intel CTO Justin Rattner. A Larrabee card was able to achieve 1006 GFLOPS in the SGEMM 4Kx4K calculation.
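As a back-of-the-envelope check on that SGEMM figure, assuming the usual 2*N^3 FLOP count for an NxN matrix multiply (N multiplies plus N adds per output element, N^2 outputs):

```python
# How big is one 4Kx4K SGEMM, and how long would it take at 1006 GFLOPS?
N = 4096
flops = 2 * N**3                    # ~1.37e11 floating-point operations
gflops_rate = 1006e9                # the rate demonstrated at SC09
seconds_per_multiply = flops / gflops_rate
print(f"{flops / 1e9:.1f} GFLOP per multiply, "
      f"{seconds_per_multiply * 1000:.0f} ms each")
# prints roughly: 137.4 GFLOP per multiply, 137 ms each
```

So the demo amounted to finishing one 4Kx4K single-precision matrix multiply in a bit over a tenth of a second, respectable for a 2009 part.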
Seeing this coprocessor card, I always wondered why don't we have them for like ARM for bare metal processing. Considering the PS1 dev kit was a card that can be placed into a PC for development.
i mean they do exist, but they are just FPGAs with ARM cores built-in and are usually pretty damn expensive. a RISC-V Variant would also be pretty cool
I'm really hoping that Intel can find and send you some drivers for this (as a stockholder, I command it!). This is a piece of history and should be properly tested with whatever they were working on at the time. That said, I know it's difficult and perhaps the engineers at Intel are not as paranoid as I am. . .100% have every bit of code, every technical document and all notes for everything I ever did thru my entire career (not that I expect someone to ever ask me for information on any product that I designed). I hope someone, somewhere has what you need to make this fully functional.
When our RnD group used this card, it was for a Windows XP or Windows 7 application, so unless Intel were also working with other use cases, I wouldn’t think drivers were ever built for non-Windows OS.
GPU-Z also doesn't recognize my VIA uD8 (5400ew x2) and uH4 (5400ew) Video Card that was an embedded video card for video walls. It was a real pain in the butt to acquire drivers for these cards. They have the performance of the Chrome 540 GTX, in fact I have put the device ID in the Chrome drivers and they work.
@@FixedFunction Yeah, it is disappointing to see no S3 Graphics cards on the Techpowerup site. I try to add as many cards as I can on the Passmark site. Although, it won't show my 530 GT in the card rankings.
I forgot that an IHS on graphics cards used to be a thing. Also interesting to note the speaker symbol on the silkscreen; I have never even seen an accelerator card with one of those. Also, that giant CPLA-4-50 4-phase inductor is a neat choice, but the choice to use rather expensive Delta regulator modules stands out - clearly a prototype never intended for mass production with that.
There are no drivers, we've been trying to find the interface software for Windows for over a decade. The problem is the driver needs to communicate with the integrated operating system that loads off the EEPROM. These cards don't have a "BIOS" they have a sequence of software layers that initialize the cores to do different things, and the upper interface layer is effectively a FreeBSD distribution.
@@777anarchist Nothing, because the software stacks for graphics and compute are completely different. They're using the core resources in different ways, and the graphics stack is activating portions of the die that handle texture filtering, a portion of the die that is present and entirely deactivated in Xeon Phi.
Driver date is a placeholder that Microsoft implemented way back in Windows Vista. All default Windows drivers from that point use the 21 Jun 2006 date, so any newer drivers from manufacturers override it, due to how Windows picks drivers from the available options. Since there was no specific Intel driver installed, it just shows the date of the Microsoft Basic Display Adapter driver.
Fun fact: Intel uses 07/18/1968 for their chipset drivers. This was the day Intel was founded. It is used as a priority-degrading technique, so the drivers won't overwrite any other chipset drivers.
I was going to say the same thing.
Also without installing drivers, the card is not idling correctly.
@@CoreyPL I love it when I see that. It is the exact date Intel was founded.
The idle was a bit loud; however, the paste looked old and the pads also looked old. Maybe a repaste would solve that issue. It looks like an LGA 775-sized die too!
I work in the medical industry, and Larrabee cards really were found on the control computers of our prototype CT scanners from 2008 and 2009. The Intel Larrabee dev group used to sit less than 20mi away from our mfg facility, so they were always on site working with our RnD group. The 2010-dated VBIOS might have been one of the latest attempts to get it working without re-engineering it, before ultimately pulling the plug on the project. I heard this story many moons ago.
Will ask around with the RnD folks if anyone managed to keep one of these cards after it was gone, and more importantly if they kept its drivers around. Likely will be for XP - maybe 7 - only, but it should give some idea what this card would have been capable of, had it passed.
Merry Xmas Roman!
Some XP driver might even run on Win7
@@rkan2 WDDM was introduced with Vista. XP or Win7 are irrelevant in this context. If it can run in Vista under WDDM then it can run in Windows 11. You are right though that some XP drivers can work in Windows 7. XDDM/XPDM was removed in Windows 8. But considering Vista was released in 2007, I think this should have a WDDM driver.
any luck?
2009 GPU with 8GB GDDR5 is heckin ahead of its time
It sounds a bit much... Samsung and Micron introduced 8Gb chips in 2015. In 2009 there was only 1Gb at 50nm, according to Wikipedia anyway.
@@rkan2 You are correct, that's why there are 8 memory chips. 8x 1GB. Definitely a super neat card.
@@chrisdib9269 8 x 1Gb not GB :P
8Gb is 1GB
I remember plans for this card. I even named my character in an MMO Larrabee :)
Yeah... I'm old...
It also has a Lawrence of Arabia Hero vibe.
I have been mad that LTT did so little about their sample of Larrabee, especially once the story about how they got it went public. Glad one sample found a way to you. 🙂
What's the story?
What's the story? I never heard anything about how they got it
@@SkynetCyb I can't find details from that time, so keep in mind that the following text is not proven fact, but just a faint memory of one side of the story:
Basically some guy won an internet auction for it (possibly eBay, since LTT mentions it as the source) and had the knowledge and plans to use it. But after the auction ended, LTT found it and contacted the seller. He then canceled the auction and sent the card to LTT, who made one video barely trying to start a computer with it, and it was not seen again. Meanwhile the original buyer made some posts about his side of the buying process, but conceded that he does not want to fight eBay and LTT over it, even though the auction had already closed when it happened.
@@pavelsovicka5292 Looks like it's from a hardforum thread, two users were bidding on it and one won. Then the seller claimed that he had been "contacted by intel and could not sell the card" only for it to show up in an LTT video a short while later. Bastards
"Microsoft Basic Display Adapter" just means that Windows is using the BIOS/EFI provided framebuffer or a generic VESA video mode. It's a vendor-agnostic, minimal, fallback "driver". Merry Christmas!
Yup, many of Microsoft's most basic drivers and core NT drivers were developed/released on 6/21/2006, and it has nothing to do with Larrabee being around then.
For those that don't know what Larrabee actually is and why it's so impressive: it's a software-defined GPU. It has a bunch of general-purpose processors (I think they are actually based on x86) that can be programmed within drivers to run any current or future API, and optimized over time for much greater performance. It also means that the API can be optimized per game, or even that a game or application can load its own proprietary API. It also means that in theory you could natively run APIs for consoles, like PSSL (used in PS5), and run console games without needing to emulate the GPU.
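The "software-defined GPU" idea above can be made concrete with a toy example: on a Larrabee-style design, even triangle rasterization is ordinary code running on the general-purpose cores rather than fixed-function hardware. A deliberately naive sketch of what that inner loop does (plain scalar code purely for illustration; a real implementation would be vectorized and tiled across many cores):

```python
# Toy software rasterizer: fills one triangle into a framebuffer using the
# classic half-space / edge-function test.

def edge(ax, ay, bx, by, px, py):
    """Signed area test: >= 0 when point P lies on the inner side of edge A->B."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(w, h, v0, v1, v2, color=1):
    fb = [[0] * w for _ in range(h)]  # framebuffer, 0 = background
    for y in range(h):
        for x in range(w):
            # a pixel is inside if it is on the inner side of all three edges
            w0 = edge(*v1, *v2, x, y)
            w1 = edge(*v2, *v0, x, y)
            w2 = edge(*v0, *v1, x, y)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                fb[y][x] = color
    return fb

fb = rasterize(8, 8, (1, 1), (6, 1), (1, 6))
for row in fb:
    print("".join("#" if p else "." for p in row))
```

Larrabee's pitch was that because this is all just code, the driver could change the algorithm per game, while the wide vector units tested many pixels per iteration instead of one.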
Merry Christmas to you too Roman! Thank you for a year of amazing content!
Mr. Noodle to you!
Now you got me hungry for Ramen. Ha ha. Thanks for helping me get fatter. Roman eating Ramen for X-mas.
@@EXG21 Ouch, didn't notice the typo, what an embarrassment. Dunno how I mistyped it, because I know how to spell the name (we have exactly the same name in Poland, which is also spelled exactly the same).
I am glad they decided to give it a go again. I have an A770 and have very few issues with it. I see great potential in what is already a nice GPU.
If I remember correctly, this GPU contains many (32?) Intel-Atom-like x86 or x64 CPUs.
Pentium 3
@@primus711 P55C, so Pentium MMX. These are not P6 cores.
In 1998 Intel made the i740 as their first discrete GPU.
everyone forgets about the poor little guy
That was a discrete graphics card not a GPU.
@@Krisztian5HUN???
@@Zeno- The term was popularized by Nvidia in 1999, who marketed the GeForce 256 as "the world's first GPU". It was presented as a "single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines".
@@Krisztian5HUN If it's not integrated, it's discrete. The definition encompasses all different designs so long as they are a standalone graphics processing unit.
Merry Christmas, Roman. Thanks for all the great content this year and I'm looking forward to more.
By the way, I picked up your O11 Air Mini case a few weeks ago for my 13700k build and this is easily the best case I've ever worked in, and the airflow is fantastic. Great collab with Lian Li!
There was a guy in the comments on Linus' video who seemed quite genuine, who claimed he worked on the project and offered to look for documentation he had.
Might be worth digging up 🤷
Debug connector is most likely Lauterbach JTAG/trace connection to do very hardware level software debug/development. With the right set of tools, you can do some interesting stuff from there.
Yep pretty much. On the development vehicles Intel supplied to partners that connector was used for an LED debug/POST board with reset and one unknown function button, hung off the front of the case.
That connector is for Intel's ITP debugger.
I remember when Intel announced Larrabee, the concept was hugely different from Nvidia/ATI GPU. If I remember correctly, in simple terms, they wanted to put many CPU cores inside a GPU.
It really isn't a traditional GPU at all... It's a many-core x86 CPU moonlighting (poorly) as a graphics processor. It's basically software graphics rendering on fucking CRACK! 🤣
And this is exactly why Intel quickly realized using what they had made to slowly & poorly render graphics was ridiculously fucking stupid, and pivoted the design entirely to HPC under the "Xeon Phi" rebranding, where such a "metric FUCK-TONS of simple x86 cores on a chip" design actually made some sense. 🤷♂️
It was an ambitious goal, but I think the decision to use software to emulate all rasterization operations, instead of adding a few pieces of dedicated hardware like ROPs and TMUs, was too much of an albatross for it to ever be successful.
@@defectivedegenerate4046 Yeah, I remember they "shrank" basic x86 cores to the minimum, and tried to make a GPU. Shader cores in modern GPUs are actually very similar but without x86 baggage.
And when I was at an Intel conference, they said you would be able to do real-time ray tracing on Larrabee. At that time, in the movie industry, for ONE frame per second you would need 100 Larrabees. It was hilarious, but if they had pulled something off, the game industry would be very different now.
Like with Sony's CELL. When they first started to design the PS3, it didn't have a dedicated GPU. Everything would be done on CELL. It had a normal CPU core, which would run the OS, and 8 SPEs (?). They realized this would not work and quickly added a dedicated GPU.
That's why PS3 was a beast, but as a developer, you had to know low level stuff to get CPU performance out of it.
@@alpha007org The PS3 CELL CPU has 7+1 cores. The 8th core is a dummy/broken core.
@@Cooe. Would these chips be more usable for raytracing and rendering then?
I remember Larrabee. It was supposed to have ray tracing. Technically making it the first raytracing card LONG before nVidia did it. If I recall it's also basically a multi-core pentium pro.
Multi-core P55C, not Pentium Pro (P6), with 4-way SMT.
P55C based without mmx ?
@@Carlos_Rodrigo LRB1/2/Knights Mill do have MMX, SSE, SSE2, SSE3, and AVX512.
@@FixedFunction Nice 👍 Thanx
GPU core reminds me of the old socket 7 CPUs with how it's designed. Cool review, it's neat to see the "predecessor" of the current Intel cards & how they've evolved.
It is possible to extract the driver from the Windows system that used it. Probably need to match the Windows version to install it afterwards. I'd maybe try Linux instead.
Intel probably very proud that absolutely nothing is available for this online. No leaks. But for us it's unfortunate.
That massive heat output could be caused by power management being non functional because there's no driver to control it. Definitely agree that it is a piece of history right there even if it's not a success...would have been interesting to see how it does things if there's a driver...
Great video! Love these almost historical videos!! :D
Lol, I had the luxury of being there when the card was acquired originally, before being placed into his hands. The card was super clean and heavy. I have a couple pictures from before it was shipped.
Nice work, Roman!! Merry Christmas.
the driver date of june 21 2006 is just the microsoft basic display adapter driver date
amen, der8auer clearly has never known the desperation of needing to troubleshoot a defective card he did not break himself.
This Larrabee may display, but whether it is truly functional is still up in the air.
Hey! Previous owner here! Glad to see it be used and further researched!! Thank you so much!
I think that the OEM that builds those CT scanners had some of these from Intel to test. After a while that CT scanner machine was getting old and they sold it, forgetting about the GPU.
Actually the first Intel GPU is the i740 from the 90s, and I think it used shared memory instead of VRAM.
Iirc they had versions that either used system RAM or had dedicated VRAM.
Intel itself tried to insist the AGP bus was fast enough to use shared memory. But partner board builders realized how stupid that was, so most added in some video RAM to save face, given how abysmal performance would be without it.
@@ZeroHourProductions407 That rings a bell :-) Long time ago, and it never was an interesting card anyway (for me).
Merry Christmas and a Happy New Year to you and your fur babies Roman!
Great video, man! You are a true enthusiast and professional. Intel should provide you at least some basic OpenGL drivers.
Happy holidays!
why is this guy so underrated? he has such cool content man
Nothing unusual about an $8000 graphics card. That's just the reference model price for the 5090.
I love the passion for such a collectable card. And then we dismantle it. Absolute nerd out. Doing God's work, Roman.
Merry Christmas to you as well Roman. Enjoy the holidays.
"Izmir Dikili"
"I have the drivers if you still want them. We received this card for test purposes at our Crytek headquarters in Germany."
This is one of the viewer messages from the Linus video about this Intel prototype. He claims he's one of the developers of Crysis. Maybe try to contact him for the driver.
11:39 AFAIK at least some Larrabee prototypes (the LTT one?) actually work with Xeon Phi drivers, but cannot boot the release kernel/rootfs (the release driver loads them onto the MIC device from the host PC).
Ruined by whoever had their fingers on this card before: it belongs in a museum of electronic scrap. Or you can help Intel catch those criminals...
...it's on you
Wouldn't it be possible to retrieve the driver from the CT system?
The GPU package looks like Intel just took the heat spreader from the Beckton LGA1567 Xeon family; it's oddly similar. The entire package looks to be a similar size too. It would be interesting to see how many pins are on the BGA.
That's way cool, but it's not Intel's first attempt at a dedicated graphics card. That was the i740 from 1998.
Of the couple of different samples shown, yours and LTT's, I have yet to see either of you try it in Linux. The person who pulled it from the machine: could they actually boot the machine and pull the drivers, maybe? It was working in that system, so why not grab the drivers? So cool!
Thanks so much for doing this video, I really loved your videos on Xeon Phi and water cooling them
This channel is highly underrated.
Merry Christmas Roman. All my best to you and yours. Much love.
The 2006 date is the WDDM basic display driver date. The WDDM basic driver (using OpenGL) launched with the Longhorn beta in 2006.
13:50 That's probably 1 _Gigabit_ per chip. I don't think anyone was making 8 Gigabit GDDR5 chips in 2009.
Interestingly, from the cooler side at least, the card looks fairly modern; not knowing about the project existing, I kind of assumed from the thumbnail that this would be an ARC prototype.
Xeon Phi was born because of the failures of Larrabee. Kirk Skaugen made the decision to resurrect it as an HPC co-processor which was a brilliant move by him at the time.
I really REALLY want Larrabee-style drivers for something like Knights Mill.
Larrabee was amazing for being, essentially, a CPU running games through software.
Imagine the 72 cores of Knights Mill doing both CPU and GPU tasks. Given the nature of a CPU, this might have actually worked with an SLI-like software solution, at the very least if the drivers allowed each PCIe card to drive an independent monitor, which would probably work with GPUs equally well.
great video have a merry christmas and happy new year
i'll never have the budget for cards like this. love to see videos about them in detail. so thx for the video and merry christmas
This is awesome never knew this existed
I plan on buying an intel card eventually since I can never really afford anything beyond mid tier. I'm actually very excited about intel entering the GPU market.
Cheers. Merry Christmas.
Merry Christmas, Roman!!
I enjoyed that and understood all your reasoning for buying it. I think that's why I bought a Radeon VII: I knew it was never going to be great. As a nerd you just have to do these things. It would be nice to see it running with a proper driver.
Back in 2018 my son, then a teen, came up to me and asked for this card. I believe he picked it based on the color. Back then it was only around $3,500, and I swear there was a write-up about it. Maybe try the Wayback Machine, cpu-tech, or hardforum with the keyword Larrabee.
I have a cheap business computer: a Dell 7070 SFF with an AMD Radeon RX 6400 desktop graphics card, a Samsung SSD 980 PRO NVMe 500GB, an i5-8500 CPU, two 8GB Kingston CL16 Fury 2666MHz memory sticks, a 1TB HDD, and a case fan that I added using a CPU fan cable splitter. I have a slow, silent case fan hooked up to the CPU fan connector as the master fan, and the CPU fan set as the slave. That way the PWM goes higher to get the slow, silent case fan up to the set CPU fan RPM, which in turn raises the CPU fan to a higher level than normal, so everything stays nice and cool. But what is darn confusing to me is that on Amazon you see the earlier model, the Dell 7060, with less performance, for up to $3,000. It is crazy.
I LOVE this guy's videos.
Must love cats 🙂 Merry Christmas to you and your cats, greetings from Norway.
So, oddly 8086 is the Intel vendor ID, but 2240 is not listed as an ID assigned to Intel.
I have now added it to the PCI ID database. There was already a 2241 entry for a different Larrabee, and it's right before the Xeon Phi ID numbers.
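For anyone curious how those numbers fit together: 0x8086 is Intel's PCI vendor ID (a nod to the 8086 CPU), and 0x2240 is the device ID GPU-Z reads from this sample. A minimal Python sketch of pulling both out of a Windows-style hardware ID string (the `parse_pci_hardware_id` helper is made up for illustration, not part of any real tool):

```python
# Decode a Windows-style PCI hardware ID string, as shown in
# Device Manager -> Details -> Hardware Ids.
def parse_pci_hardware_id(hwid: str) -> tuple[int, int]:
    """Extract (vendor_id, device_id) from a PCI\\VEN_xxxx&DEV_xxxx string."""
    # Split off the "PCI" prefix, then break the &-separated fields
    # into a {"VEN": "8086", "DEV": "2240", ...} mapping.
    fields = dict(part.split("_", 1) for part in hwid.split("\\")[1].split("&"))
    return int(fields["VEN"], 16), int(fields["DEV"], 16)

vendor, device = parse_pci_hardware_id("PCI\\VEN_8086&DEV_2240")
print(f"vendor=0x{vendor:04X}, device=0x{device:04X}")
# prints "vendor=0x8086, device=0x2240"
```

On Linux, `lspci -nn` shows the same vendor:device pairs, which is what gets matched against the pci.ids database mentioned above.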
I am confused. I hear all the time that the industry needs validated products (hardware), drivers, etc., and that this is the reason why professional GPUs are insanely expensive to buy despite costing the same to produce. This card being an engineering sample and still used in a CT machine is a bit contradictory.
Happy holidays Roman!
Anyone notice how the 8 pin is a different colour than the 6 pin, just like Intel Arc... Was that colour difference on Arc an Easter egg?
Maybe they had some warehouse full of them xD
@@rkan2 haha that would be funny
either someone on the team is a fan of the mismatch or intel is really unlucky at sourcing matching parts 😂
And now I have an Arc GPU, so great to finally see them properly release any GPU!
These cards were great for advanced body mapping; I've seen it in a hospital environment. They weren't designed just with gaming in mind. They're for machine applications, like the VR-style displays used to operate on a patient with a heavy machine attached to a heavy computer.
The driver is a basic standard for all GPUs, so they can display basic video output without the proper drivers installed. Without it you would only have a black screen until you installed the proper drivers, which would be a lot harder to do without any visual feedback.
Since it's a standard for basic functionality, it doesn't need to be updated unless a security vulnerability is found that involves it, and that is why the basic driver is from 2006.
Really cool to see a Larrabee PCB. Not surprised that Roman would end up being the one to show me.
Driver installation package was probably on that PC connected to that CT scanner.
Also, the driver date is from the Windows 11 "Microsoft Basic Display Adapter" driver and has nothing to do with the card. Furthermore, the date is wrong; it just shows Microsoft's failure to update information, such as dates, in their driver files.
The chip on the Larrabee is like 3x x86 Xeons; it has huge parallel compute. Try it in Linux and you might be able to get everything working.
That 3x Xeon thing is not this card; that's a modern Intel VCA or VCA2 product.
@@cameramaker Ah, internally the chip architecture is the same as the Xeons from that era. Besides, the card is not from before 2000; that particular card is from the 2009 era, and drivers for it are on the Intel website.
@@adriancoanda9227 So where on Intel's website?
I hope someone from Intel sees this and provides you with some kind of working driver, Win 7 or whatever. I would love to see performance in 3DMark 06/Vantage and some games of the time. Merry Xmas m8.
I doubt that would ever happen, but one can dream, no?
NDA, IP bla bla
@@nvignesh Pretty much...
A CT-Scanner terminal/controller PC... That is a very interesting source! I presume it was used for some sort of processing, and not just 2D graphics output.
Wonder if the drivers would be on the HDD of that CT-Scanner
@@ereksat Of course, unless they used Linux (which I doubt)
@@rkan2 inside the card itself
was supposedly using FreeBSD
Larrabee was the last time Intel excited me - the GPU would have become featureless - rendering engines would have become timeless software-rendered modules.
I remember waiting to get a Bulldozer/Larrabee/Radeon/GeForce/Hydra system; Jim Keller was gaslighting us about what Bulldozer would do.
I think Nvidia bought and buried Lucid Hydra like they did other competing solutions over the years.
It's just a standalone IRIS card, so no loss to any of us !
The Microsoft Basic Display Adapter driver was written well before the card existed. They probably wanted to test basic display output separately from driver development, and the card would have been manufactured to meet Microsoft's requirements for basic display output. So the chicken came first, and the egg was made to support the chicken..... lol
What about that CT scanner PC? Card probably didn't work on basic display driver there if it actually did something.
Was it windows or linux machine?
Maybe its possible to contact someone that worked with that pc.. someone from tech support..
Also, 8 GB of memory in 2010??? Did you find the correct chips?
Around that time cards had 256, 512 MB of video ram, later 1-2 GB.
Could be 1 Gbit ICs, that would result in 1GB total memory. More likely for that time period.
8GB shocked me too. The GTX 480 is from 2010 and only has 1.5GB of GDDR5. AMD launched the HD 6970 in December 2010, and that has 2GB of GDDR5.
The first cards with 2GB of GDDR5 (like the Sapphire HD 4870 2GB and HD 5870 Eyefinity) used 16x 1Gbit chips (2008-2009). 2Gbit chips were widely available at the launch of the HD 6000 series in December 2010 (used for sure on the HD 6970). They were also used later on team green's GTX 580 3GB (12x 2Gbit ICs). Of course, der8auer may have mistaken Gbit for GB, or the memory IC datasheet was wrongly labeled. That's all ;)
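The Gbit-vs-GB mixup in this thread comes down to one division by 8. A quick sanity check in Python, using the chip counts the commenters mention (the `total_vram_gb` helper is just for illustration):

```python
# Total VRAM from chip count and per-chip density in gigabits.
# Memory ICs are specified in gigabits (Gbit); capacity in bytes
# is the chip count times density divided by 8 bits per byte.
def total_vram_gb(num_chips: int, gbit_per_chip: int) -> float:
    """Total capacity in gigabytes (GB) from count x density (Gbit)."""
    return num_chips * gbit_per_chip / 8

print(total_vram_gb(8, 1))   # prints 1.0 -- 8x 1Gbit = 1 GB, plausible for 2009/2010
print(total_vram_gb(16, 1))  # prints 2.0 -- e.g. HD 4870 2GB with 16x 1Gbit chips
print(total_vram_gb(12, 2))  # prints 3.0 -- e.g. GTX 580 3GB with 12x 2Gbit chips
```

So 8 chips of 1 Gbit each would give 1 GB total, not 8 GB, which fits what other cards of the period shipped with.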
I would think you have contacts at Intel... you should talk to them about who would be the RIGHT person to talk to at Intel; that might be your best bet...
No matter, this was SUPER INTERESTING!!! I had thought Intel DID release the Larrabee card for servers and such... but hearing it was in a CT machine.... well... I just don't believe it, TBH. You should look to the server space for the drivers; you should have better luck there.
Merry Christmas.
Get this man a DRIVER! =D
Good stuff
Kinda sucks that Intel doesn't have the confidence to try for a few generations. They're about to do the same thing again: because they can't match the competition on gen 1, they don't feel it's worth continuing into gen 2 and gen 3.
Happy Holidays everyone
You could contact the last owner to get the drivers from that PC 😊
As I recall... despite being a co-designer of AGP, Intel leadership was initially very opposed to the idea of complex video cards that would help process graphics data. They wanted everything handled by the CPU and rather stubbornly stuck to that concept for years. I think Larrabee was still catering to it somewhat, by using CPU dies rather than a real GPU like ATI and Nvidia.
Interesting video! But Larrabee was far from Intel's first attempt at video cards.
As far as I remember they had AGP and PCI cards (3D GPUs) back in 1998 with the i740, which actually sold to the public for a while.
But Intel failed to anticipate the competitive products from S3, ATI, Matrox and then Nvidia for games and productivity software.
I remember installing and selling those cheap cards on PC with Windows 95/98/NT4.0...
Are you sure those are 1 Gigabyte VRAM chips and not 1 Gigabit, for a total of 8 Gigabits, i.e. 1 Gigabyte, of VRAM? 1 Gigabyte of total VRAM seems way more likely given the card's date and what other cards of the time had. My top-of-the-line ATI HD 4870 X2 had 2GB of GDDR5 (1GB for each GPU), and it was released around that time as well.
Me too; I'm wondering if it really had that much VRAM. Even a more recent card, the R9 Fury of 2015, had 4GB of VRAM, and the infamous GTX 970 of the same era had 3.5+0.5GB.
Yeah, sounds a bit much... Samsung and Micron introduced 8Gbit chips in 2015. In 2009 there were only 1Gbit 50nm chips, according to Wikipedia anyway.
The reasoning for the drivers reminds me of Morty trying to convince Rick to buy him the sex robot.
sounds like the first time ray tracing was to be used in consumer gpus
A public demonstration of Larrabee's ray-tracing capabilities took place at the Intel Developer Forum in San Francisco on September 22, 2009. An experimental version of Enemy Territory: Quake Wars, titled Quake Wars: Ray Traced, was shown in real time. The scene contained a ray-traced water surface that accurately reflected the surrounding objects, like a ship and several flying vehicles.
A second demo was given at the SC09 conference in Portland on November 17, 2009, during a keynote by Intel CTO Justin Rattner. A Larrabee card was able to achieve 1006 GFLOPS in the SGEMM 4Kx4K calculation.
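To put that SGEMM figure in perspective: an N x N single-precision matrix multiply is conventionally counted as 2*N^3 floating-point operations (one multiply and one add per term). A rough sketch of what 1006 GFLOPS means for a 4096 x 4096 multiply (the sustained-rate assumption is mine, not from the demo):

```python
# Conventional FLOP count for an N x N matrix multiply: 2*N^3
# (one multiply-add pair per inner-product term).
N = 4096
flops = 2 * N**3                 # ~1.37e11 operations per multiply
seconds = flops / 1006e9         # assuming 1006 GFLOPS sustained
print(f"{flops / 1e9:.1f} GFLOP per multiply, {seconds * 1000:.0f} ms each")
# prints "137.4 GFLOP per multiply, 137 ms each"
```

So the demo card was churning through roughly seven such multiplies per second, which was serious throughput for 2009.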
Too bad you can't get ahold of a disk image from the CT computer this was in. Then you could extract the driver.
Seeing this coprocessor card, I always wondered why we don't have them for something like ARM for bare-metal processing, considering the PS1 dev kit was a card that could be placed into a PC for development.
I mean, they do exist, but they're just FPGAs with ARM cores built in and are usually pretty damn expensive.
a RISC-V Variant would also be pretty cool
@@proxy1035 Yea, that's what I meant, usually meant for development. Rather than like encoding/capture/sound cards we are used to.
I'm really hoping that Intel can find and send you some drivers for this (as a stockholder, I command it!). This is a piece of history and should be properly tested with whatever they were working on at the time. That said, I know it's difficult and perhaps the engineers at Intel are not as paranoid as I am. . .100% have every bit of code, every technical document and all notes for everything I ever did thru my entire career (not that I expect someone to ever ask me for information on any product that I designed). I hope someone, somewhere has what you need to make this fully functional.
Aww, the protector of the Larrabee
Driver is from 2006.
That driver is the "generic driver for anything I don't actually know how to drive".
It looks remarkably modern, exactly like a workstation card you'd expect to see now.
I think the driver date shown in GPU-Z is the date of the Microsoft Basic Display Adapter driver.
Maybe you should investigate if Linux has drivers for it?
When our RnD group used this card, it was for a Windows XP or Windows 7 application, so unless Intel were also working with other use cases, I wouldn’t think drivers were ever built for non-Windows OS.
GPU-Z also doesn't recognize my VIA uD8 (5400ew x2) and uH4 (5400ew) Video Card that was an embedded video card for video walls. It was a real pain in the butt to acquire drivers for these cards. They have the performance of the Chrome 540 GTX, in fact I have put the device ID in the Chrome drivers and they work.
W1zzard refuses to implement S3 support in GPU-Z and in GPU Database. It could be done, but he's stubborn.
@@FixedFunction Yeah, it is disappointing to see no S3 Graphics cards on the Techpowerup site. I try to add as many cards as I can on the Passmark site. Although, it won't show my 530 GT in the card rankings.
I forgot that an IHS on graphics cards used to be a thing. Also interesting to note the speaker symbol on the silkscreen; I have never even seen an accelerator card with one of those.
Also, that giant CPLA-4-50 4-phase inductor is a neat choice.
But the choice to use rather expensive Delta regulator modules stands out; clearly a prototype never intended for mass production with those.
The first consumer Intel GPU card was the i740 back in 1998. It actually came out and was weak compared to the competition at the time.
Try searching for the driver by the VID and PID values. Maybe for Linux or another platform like SPARC.
There are no drivers, we've been trying to find the interface software for Windows for over a decade. The problem is the driver needs to communicate with the integrated operating system that loads off the EEPROM. These cards don't have a "BIOS" they have a sequence of software layers that initialize the cores to do different things, and the upper interface layer is effectively a FreeBSD distribution.
@@FixedFunction Any clues from the Xeon Phi drivers? There has to be at least some resemblance.
@@777anarchist Nothing, because the software stacks for graphics and compute are completely different. They're using the core resources in different ways, and the graphics stack is activating portions of the die that handle texture filtering, a portion of the die that is present and entirely deactivated in Xeon Phi.
Awesome insight into Intel's prototype GPUs
_" not for resale and resellers "_
... fish on the net