Entry-Level GPUs: Intel addresses the VRAM problem

  • Published: 16 Dec 2024

Comments • 199

  • @lvyathan
    @lvyathan 1 day ago +124

    I'm glad Intel is actually bringing some good competition; hopefully their GPUs do well in the low-to-mid-range market. They need to succeed.

    • @farhanrejwan
      @farhanrejwan 23 hours ago +6

      my guess is it will be short-lived, once intel establishes a loyal customer base in this market.

    • @antondovydaitis2261
      @antondovydaitis2261 22 hours ago +7

      If only they were actually for sale at anywhere close to $250....

    • @johnsmith-i5j7i
      @johnsmith-i5j7i 21 hours ago +3

      Still expensive, almost £300 for the Asrock.

    • @roklaca3138
      @roklaca3138 18 hours ago

      @@antondovydaitis2261 Now you will see which retailers are scalpers... most of them. If only I could get one directly from manufacturers to skip the retail mafia.

    • @HenryThe12
      @HenryThe12 17 hours ago

      People will still complain and many of you guys still won’t buy them. Same thing with AMD, many people only want them for competition, not because they are actually interested in buying their products.

  • @Lemard77
    @Lemard77 14 hours ago +9

    "They are not lying, they just have a different calculation." Nvidia's calculation was probably: if the original price gave us an 80% margin on the card, and adding 8 GB of extra VRAM (same PCB, same bus width) costs us $20 more, then we have to charge an extra $100 so it keeps the same 80% profit margin as the original variant.

  • @JayzBeerz
    @JayzBeerz 23 hours ago +61

    I just bought a few A770 16GB cards for $229.99; what a steal for content creation and gaming with 16 GB of VRAM.

    • @moto6981
      @moto6981 22 hours ago +3

      The overall design of the last-gen Intel GPUs is not that good. Battle Image is a lot better.

    • @ZahidHasan-tj2rl
      @ZahidHasan-tj2rl 22 hours ago

      @@moto6981 It's Battlemage, not some extra "image" word!

    • @johnsmith-i5j7i
      @johnsmith-i5j7i 21 hours ago +7

      why would you buy the first gen model???

    • @DaveTheeMan-wj2nk
      @DaveTheeMan-wj2nk 21 hours ago

      @@moto6981 For $230 it's still an amazing deal compared to the Nvidia offering at that price, which is like a 1660 lol.
      Most 3060s are around $250+.
      The A770 isn't always the fastest, but in many titles it still gets close to a 4060 Ti,
      and it can beat or compete with a 6700 XT often too.
      For 230 bucks you aren't getting a bad deal.

    • @mastroitek
      @mastroitek 21 hours ago +7

      I had one for a few months and it truly was great for VRAM-intensive workloads. Gaming was meh; I mean, it works, but it obviously underperforms. Still, there is no other way to get 16 GB anywhere close to that price.

  • @BangHJL
    @BangHJL 23 hours ago +88

    I'm currently playing Dying Light 2 at 1440p with max ray tracing on an RTX 3080 10G. For the first 10-15 hours it ran pretty well, but once I got to the big city I kept experiencing bursts of stutter. It was indeed a VRAM issue, and it's annoying. The 3080 is definitely a capable card, but Nvidia intentionally handicapped it. All of this is to force people to upgrade their GPU in just one generation; they know exactly what they're doing. This is a calculated decision made by a multi-trillion-dollar company to squeeze every last penny out of us consumers. I'm glad Intel is also fighting this unnecessary issue that Nvidia started. I hope Arc stays relevant in the market for a long time. Nvidia needs to be humbled.

    • @tanmayhembram9249
      @tanmayhembram9249 23 hours ago +8

      Same card here; that 10 GB is bugging me too. Such a capable card, but limited by VRAM. I wonder how long it will last at 1080p high in the years to come.

    • @emanuelefusco4466
      @emanuelefusco4466 22 hours ago +4

      I also have a 3080 10GB, and you are right: this GPU is too fast to have just 10 GB of VRAM, and in some cases it can be a limiting factor. I play at 1440p and usually only have issues when I turn RT on, because (ironically) it uses more VRAM; without it, it's perfect. That being said, it is possible to have a good experience with 10 GB of VRAM by lowering textures from ultra to high (or shadows), tweaking some RT-related settings, or just using DLSS, which looks quite good at 1440p. I think 10 GB is also more than enough at 1080p (but who plays at 1080p with a 3080?). So yeah, you can play well by lowering some settings, and most of the time you won't even notice the difference, but it's annoying that it has to be done on such a capable GPU. At least I bought it used for just $350; I would never have paid full price for it.

    • @1vaultdweller
      @1vaultdweller 22 hours ago +6

      The RTX 3080 10G, like many Nvidia GPUs, is a waste of sand. A good GPU should have proper VRAM; therefore, half the Nvidia lineup that fanboys claim to be good GPUs is trash.

    • @tilapiadave3234
      @tilapiadave3234 22 hours ago

      It's all a conspiracy to get YOU, yes, a multi-billion-dollar company planned for decades just to affect YOU.

    • @emanuelefusco4466
      @emanuelefusco4466 21 hours ago +1

      @@1vaultdweller I mean, most Nvidia GPUs don't have a lot of VRAM, that's true, and I wouldn't buy anything with only 8 GB of VRAM at this point, but maybe you are exaggerating just a little bit. Changing textures from ultra to high is not a big deal most of the time, and it's the only thing you need with a 3080 at 1440p (most of the time you can max out everything if you don't use RT). I think the 3080 should have 16 GB of VRAM, but that doesn't mean it's a trash GPU; in fact it can run any game without issues (on rare occasions you lower textures a bit). Nvidia has a VRAM issue in general, but AMD GPUs have some issues too: for example, FSR is currently the worst upscaler, and RT performance on all AMD GPUs is a bit disappointing. Yes, in most games you can still play without RT, but in the near future games are going to have RT-only (or PT) rendering, and if you buy a high-end AMD GPU (for example the 7900 XT) you'll most likely never use RT because it's too heavy at 1440p or 4K, and in some games it really makes a difference. Like it or not, upscaling and RT are going to be the future of gaming, and with FSR 4 being an AI upscaler, old AMD generations are most likely to be left behind, just like Nvidia did. I'm not saying AMD, Nvidia, or Intel are bad; I'm just saying every GPU has some flaws, and we should buy what fits best for us without hating. A big factor is also price: most of the time AMD GPUs are cheaper, but it's not always like that, especially on the used market. If you could buy, for example, a used 3080 for $350 or $300 (the price my 3080 had), would you refuse just because of the 10 GB of VRAM? Every GPU is good at the right price.
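
For anyone who wants to confirm that this kind of stutter lines up with VRAM running out, here is a minimal logging sketch; it assumes an NVIDIA card because it shells out to nvidia-smi (run it in a second terminal while playing and watch for memory.used pinning near memory.total):

```python
# Log used/total VRAM once per second so spikes can be matched against stutter.
import subprocess
import time

QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

while True:
    used, total = (int(v) for v in subprocess.check_output(QUERY, text=True).split(","))
    print(f"{time.strftime('%H:%M:%S')}  {used:>6} / {total} MiB")
    time.sleep(1)
```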

  • @henriquebrenzinger4406
    @henriquebrenzinger4406 20 hours ago +27

    Currently having endless stuttering with my 3070 due to the 8 GB of VRAM; I'm never buying Ngreedia ever again after this shit. The chip itself can run games even at 4K without breaking a sweat, but the small VRAM kills it.
    Planned obsolescence at its finest.

    • @habana7638
      @habana7638 19 hours ago +5

      haha No, next time you just buy Nvidia again and fall for the sales talk as always.

    • @Mcnooblet
      @Mcnooblet 17 hours ago +2

      So you have had about half a decade with it so far and are now mad that it is starting to have performance issues. Right. You made the decision on 8 GB; you made the decision not to go AMD, where you wouldn't have to pay for RT cores and tensor cores and would get VRAM instead. "Ngreedia" didn't strong-arm you into buying their specific product, you chose to, YOU DID. Hardware Unboxed has warned people about 8 GB for years, and you dummies do what you want and cry later like a child whose bottle wasn't filled to the absolute top. Try AMD next time, and we will see you back in another 5 years crying about that GPU no longer performing as well as it used to.

    • @Machistmo
      @Machistmo 15 hours ago

      Finally someone that gets Nvidia baseball bat chair'd em.

    • @tottorookokkoroo5318
      @tottorookokkoroo5318 15 hours ago

      What games are you experiencing stuttering with?

    • @henriquebrenzinger4406
      @henriquebrenzinger4406 15 hours ago

      @@tottorookokkoroo5318 farming simulator 25

  • @dampflokfreund
    @dampflokfreund 22 hours ago +17

    I predict the GPU department of Intel will be a lot more popular in a few years, possibly more than their CPU department. B580 is a really good GPU, I'm glad to see the competition. And if the 5060 truly has 8 GB again, that's the best thing that could happen for Intel, because every decent reviewer would recommend the B580 instead. Also I really love the communication with Tom Petersen here!

    • @stealthhunter6998
      @stealthhunter6998 21 hours ago

      The only issues it has are with drivers. Games like Spider-Man stutter like crazy for no reason in Digital Foundry's testing. The frametimes in general are worse too.

    • @j195sclko
      @j195sclko 20 hours ago +5

      @@stealthhunter6998 Drivers can be fixed, VRAM cannot. It's a fight for Intel to lose if they don't come around and make their drivers more robust.

    • @AnEagle
      @AnEagle 19 hours ago

      @@stealthhunter6998 I mean, it's improved so much that it's now down to pointing at a few games. That means, especially if people start buying Arc GPUs, that this could probably be fixed soon, because a large part of the problems may well be on the game developer side.

    • @Machistmo
      @Machistmo 15 hours ago

      I heard they shut it all down... "MLiD Tom loves himself more than anyone else ever could" said so... but I mean, what about drivers if they shutter it? Intel pushed their CEO out the back door, then backdated his departure. Who fuckin does that, man? I have never heard of that, ever. Word was always that Gelsinger was the only thing keeping this project alive. He is gone now.

  • @kerotomas1
    @kerotomas1 21 hours ago +14

    So any news on a B780 or some higher-end Battlemage card that's not just budget tier?

    • @ksolo614
      @ksolo614 18 hours ago +2

      Those were canceled

    • @kerotomas1
      @kerotomas1 17 hours ago +1

      @@ksolo614 Well that's not too reassuring for the future. The Intel guy was basically talking about how their Battlemage is essential for iGPU and APU development so it looks like they gonna ditch Arc eventually.

    • @nimrodery
      @nimrodery 17 hours ago +6

      @@ksolo614 When was that announced?

    • @Farsaar42
      @Farsaar42 12 hours ago

      ​@@nimrodery Never, he made it the fck up.

  • @Z3rgatul
    @Z3rgatul 20 hours ago +10

    Glad we got RTX ON in the glasses 🤣

  • @concinnus
    @concinnus 18 hours ago +4

    The fundamental issue is that the consoles aside from XSS have ~12GiB for graphics, so that's what games are designed around.

  • @KryssN1
    @KryssN1 20 hours ago +6

    Please, if Nvidia again releases a 5060 with 8 GB and 16 GB for 400-500€ and a 5070 with 12 GB for 600€, rip them a new one.
    Pretty please 😊❤

    • @roklaca3138
      @roklaca3138 17 hours ago +1

      @@KryssN1 Nope, people justify this... again and again.

    • @Machistmo
      @Machistmo 15 hours ago

      NEVER. GOING. TO. HAPPEN. Nvidia is below 132 most of this morning. If it closes below 132, I think it's got 15-30 points to drop before it soft-lands. I think Nvidia is in serious trouble, and with the new tariffs, the smart money is leaving. Billionaires all sold at record highs already. The smart money jumps in in the morning and exits in the afternoon. Let's see...

    • @Machistmo
      @Machistmo 15 hours ago

      Side note: man I miss being young and dumb and full of hope.

  • @constantinosschinas4503
    @constantinosschinas4503 12 hours ago +1

    The 3060 addressed the 12GB issue 5 years ago, but then Nvidia had a greed stroke.

  • @lauri9061
    @lauri9061 19 hours ago +7

    3:30 "they are not lying" lol

    • @HanSolo__
      @HanSolo__ 19 hours ago +1

      Corpo is a corpo regardless.

  • @lvc394
    @lvc394 22 hours ago +15

    Oh come on, the 3060 came with 12 GB and the cost was not a big factor. If all new cards had 16 GB they could keep the cost down due to volume buying of one kind of chip.

    • @dampflokfreund
      @dampflokfreund 22 hours ago +2

      You can't generalize this because the Ada architecture is more expensive than Ampere, especially the process node.

    • @Dhruv-qw7jf
      @Dhruv-qw7jf 20 hours ago +3

      ​@@dampflokfreund is the Battlemage architecture more expensive than Ampere though?

    • @defeqel6537
      @defeqel6537 18 hours ago

      1GB costs under $3 right now, though that's just the VRAM chip/module

    • @drewnewby
      @drewnewby 15 hours ago +3

      TAP can talk cost analysis all he wants: the memory, interconnect, bus, etc. genuinely cost next to nothing compared to the price. It's all about the profit obtained through market segmentation.

  • @trueNahobino
    @trueNahobino 10 hours ago

    Currently using an A770 and very happy with performance at 1440p. Glad to see Battlemage improve on it but I'll probably wait for Celestial to upgrade.

  • @cocobos
    @cocobos 23 hours ago +15

    I have an RX 7900 XTX, and seeing the B580 makes me wanna get one too 😅

    • @matgaw123
      @matgaw123 22 hours ago +1

      Yeah, it's a cool little GPU 😅

    • @newearth9027
      @newearth9027 22 hours ago +1

      Why

    • @cocobos
      @cocobos 21 hours ago

      ​@@newearth9027 Cute 😅

    • @newearth9027
      @newearth9027 21 hours ago

      @@cocobos cause the design is cute? Lol

    • @Lockwood360
      @Lockwood360 17 hours ago +1

      why? huge downgrade.

  • @geofrancis2001
    @geofrancis2001 22 hours ago +9

    With Nvidia and AMD avoiding the bottom end, there is definitely a market for Intel above APUs and below discrete Nvidia and AMD cards. The thing I don't get is why Intel doesn't have a laptop GPU, since that's where they still get most of their sales. They could have bundled it like they used to do with Centrino, when you had an Intel CPU + chipset + WiFi; they could have done the same but with CPU + GPU + WiFi.

    • @kurttis8512
      @kurttis8512 21 hours ago

      No... all Intel APUs suck; no handheld gaming developer wants any APU from Intel. And Nvidia never makes any CPU or APU, so AMD is still top-ranked for any handheld APU.

    • @geofrancis2001
      @geofrancis2001 21 hours ago +2

      @@kurttis8512 I never said anything about an Intel APU, I said a laptop GPU. And you're also wrong about gaming on an APU: I have the GPD Win Max with an Intel 1035G7, and its Iris graphics can play everything.

    • @m.mugadam7692
      @m.mugadam7692 21 hours ago

      right especially with this efficiency, that laptop will be cold as fuck

  • @purgedome2386
    @purgedome2386 22 hours ago +2

    Nice. Thanks Tom.

  • @letto18
    @letto18 20 hours ago +3

    Seeing how the B580 turned out, I'm curious how the 700-tier cards would turn out. My early prediction is that the B770 could land between the RTX 3070 and 3070 Ti for performance, with 24 GB of VRAM on a 192-bit bus. My understanding is that the B580 uses six 2 GB 32-bit chips for 12 GB over 192 bits (with one chip removed, the B570 gets five 2 GB chips for 10 GB over 160 bits). The next step up in chip size is 4 GB, so six 4 GB 32-bit chips would be 24 GB over 192 bits, and with one less chip, 20 GB over 160 bits; that could be a B750 somewhere between the RTX 2080 Ti and 3070.
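
The capacity arithmetic behind that prediction is easy to sanity-check. Note that the 4 GB (32 Gbit) GDDR6 module it relies on is the commenter's speculation, not a part that is known to ship:

```python
# One 32-bit chip per channel: bus width = chips * 32, capacity = chips * GB per chip.
def config(chips, gb_per_chip):
    return {"bus_bits": chips * 32, "capacity_gb": chips * gb_per_chip}

print(config(6, 2))  # B580: 192-bit, 12 GB
print(config(5, 2))  # B570: 160-bit, 10 GB
print(config(6, 4))  # hypothetical "B770": 192-bit, 24 GB (assumes 4 GB modules exist)
print(config(5, 4))  # hypothetical "B750": 160-bit, 20 GB
```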

  • @Nintenboy01
    @Nintenboy01 23 hours ago +4

    Hope they do well, we really need strong alternatives at a good price

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 18 hours ago

      True, I'm getting sick of seeing crappy new budget GPUs.

    • @Machistmo
      @Machistmo 15 hours ago

      @@mrbobgamingmemes9558 and this card is 400 now. Making it MEHx2

  • @zakarysiders2020
    @zakarysiders2020 18 hours ago +3

    They aren't lying, they're just not telling the truth.

    • @Machistmo
      @Machistmo 15 hours ago

      Truth is the new orange.

  • @MiKesh1986
    @MiKesh1986 20 hours ago +4

    Where supply?

  • @rcvillapando
    @rcvillapando 16 hours ago

    Thanks, Tom!

  • @johnmoreno5837
    @johnmoreno5837 19 hours ago +2

    Let me put an NVMe drive directly onto my GPU as VRAM.

  • @IceNinja2007
    @IceNinja2007 21 hours ago +6

    "I don't think they're lying to you..." You sure about that??

  • @AntonioCunningham
    @AntonioCunningham 15 hours ago

    I don't care about 1440p, I'm just glad we're getting more competition!

  • @Diegonando64
    @Diegonando64 14 hours ago

    We need GPUs with more VRAM for LLMs
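
As a rough illustration of why the local-LLM crowd keeps asking for this: model weights alone need roughly parameters × bytes per parameter, before the KV cache and runtime overhead are counted.

```python
# Back-of-envelope weight memory only; KV cache and framework overhead come on top.
def weights_gb(params_billions, bytes_per_param):
    return params_billions * bytes_per_param

print(weights_gb(7, 2.0))   # 7B model in fp16   -> ~14 GB, does not fit in 12 GB
print(weights_gb(7, 0.55))  # 7B model at ~4-bit -> ~3.9 GB, fits comfortably in 8 GB
print(weights_gb(13, 2.0))  # 13B model in fp16  -> ~26 GB, needs 24 GB+ or quantization
```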

  • @flywheelshyster
    @flywheelshyster 22 hours ago

    I got a new prebuilt earlier this year with a 4060 (coming from a 1650 Super), and man, I kinda want this one, as more than 8 GB of VRAM is really necessary now.

    • @GruelingFive8
      @GruelingFive8 18 hours ago +1

      If your 4060 plays all your games at the settings you like, you're probably good for the time being

    • @PronounsR4Pussys
      @PronounsR4Pussys 16 hours ago

      You didn't get a '4060', you got a _4050_

  • @rhythmmandal3377
    @rhythmmandal3377 20 hours ago +13

    I don't understand how an HD 7970 could have 2 gigs on a 384-bit bus, but nowadays it's impossible for 8-gig cards to have a 160-bit bus.

    • @Zbychoslaw666
      @Zbychoslaw666 20 hours ago +7

      The HD 7970 had 3 GB of VRAM.
      Basically, memory chips come with a fixed bus width: to populate a 128-bit memory bus you need 4x 32-bit memory chips. To populate a 160-bit bus you end up with 5 chips totalling 10 GB (the configuration the Intel Arc B570 has). To make a 160-bit 8 GB card you would need custom 40-bit memory chips, which no manufacturer would want to make because they would be incompatible with the industry standard.

    • @rhythmmandal3377
      @rhythmmandal3377 20 hours ago +1

      @Zbychoslaw666 Then explain to me how a card from more than a decade ago had a higher memory bus width than current-gen cards.

    • @cronos1675
      @cronos1675 19 hours ago +4

      @@rhythmmandal3377 Because the HD 7970 has 12 memory chips, combined for 3 GB of total memory. Each memory chip must be 256 MB and 32-bit.

    • @rhythmmandal3377
      @rhythmmandal3377 19 hours ago

      @cronos1675 OK, then how can the 4060 Ti have 16 gigs of memory without requiring a bus increase? Or, if you think it's the other way around, why doesn't the 8-gig version have a 64-bit bus?

    • @MarcABrown-tt1fp
      @MarcABrown-tt1fp 19 hours ago +3

      @rhythmmandal3377 That's because the memory controller can address more memory chips on the same channels, by adding chips to the same connections on the other side of the PCB. The only downside is that those extra chips provide no additional bandwidth.

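To tie the thread above together: chip count sets the bus width and therefore the bandwidth, while chip density (and clamshell mounting, i.e. two chips per 32-bit channel) sets the capacity. A rough peak-bandwidth sketch with commonly quoted per-pin speeds; treat the exact Gbps figures as approximate:

```python
# Peak bandwidth (GB/s) ~= bus width in bits * per-pin data rate in Gbps / 8.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(384, 5.5))  # HD 7970, 12 GDDR5 chips: ~264 GB/s, but only 3 GB total
print(bandwidth_gbs(192, 19))   # B580, 6 GDDR6 chips:     ~456 GB/s, 12 GB
print(bandwidth_gbs(128, 18))   # 4060 Ti, 8 GB or 16 GB:  ~288 GB/s either way, because the
                                # 16 GB clamshell variant shares the same 128-bit bus
```
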
  • @RobloxianX
    @RobloxianX 20 hours ago +1

    The beauty of faster iGPUs is that there is no VRAM limit there either. I don't think AMD is going to make another 8 GB card, but Nvidia probably will. We'll see iGPUs on laptops overtake the RTX 5050, and Nvidia will basically have to abandon that market entirely.

    • @defeqel6537
      @defeqel6537 18 hours ago

      all rumors point to Navi 44 being 128 bit, so likely we will get 8GB and 16 GB again

    • @Machistmo
      @Machistmo 15 hours ago +1

      @@defeqel6537 Yeah, Navi is a generational leap in ray tracing performance, rumors too. I don't know that I care about RAY TRACING. No, I know I don't care.

    • @defeqel6537
      @defeqel6537 14 hours ago

      @@Machistmo for perception reasons, RT performance is probably important, but I also don't really care about it

  • @icarus-ht8xv
    @icarus-ht8xv 15 hours ago

    It would be good if they could test it in programs such as Stable Diffusion, ComfyUI, and similar.
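
A minimal timing harness for that kind of test, as a sketch. It assumes the diffusers library is installed and that the Arc card is visible to PyTorch through the XPU backend (otherwise it falls back to CUDA or CPU); the model name, prompt, and step count are placeholders:

```python
import time
import torch
from diffusers import StableDiffusionPipeline  # pip install diffusers transformers

# Prefer Intel's XPU backend if present, otherwise CUDA, otherwise CPU.
if getattr(torch, "xpu", None) is not None and torch.xpu.is_available():
    device = "xpu"
elif torch.cuda.is_available():
    device = "cuda"
else:
    device = "cpu"

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5").to(device)

start = time.perf_counter()
image = pipe("a photo of a graphics card on a desk", num_inference_steps=20).images[0]
print(f"{device}: {time.perf_counter() - start:.1f} s for 20 steps")
image.save("arc_test.png")
```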

  • @tigerbalm666
    @tigerbalm666 18 hours ago

    12 GB of VRAM seems like the sweet spot. I have an RX 6800 and an RX 6800 XT 16GB in 2 PCs...

  • @Nanobits
    @Nanobits 16 hours ago

    These days VRAM should be 12 GB or above; less than that is in fact an issue unless you spend all your time playing small games, most of which I call retro-style 2D games.

  • @cocosloan3748
    @cocosloan3748 20 hours ago

    Here ... Some Tom love ❤

  • @Machistmo
    @Machistmo 15 hours ago

    Side note: I have seen this channel make the opposite argument about 8 GB not that long ago, right? Am I smoking too much of the local shrubbery? Well, the main channel; they have branched out to HUB Clips and whatever.

  • @kimmyksbro3116
    @kimmyksbro3116 20 hours ago +1

    Why not 16GB?

    • @defeqel6537
      @defeqel6537 18 hours ago

      It would require either a wider or a narrower bus; neither is a good option, the first because of cost and the latter because of performance loss.

    • @kimmyksbro3116
      @kimmyksbro3116 18 hours ago

      @@defeqel6537 The 4060 Ti has 16 GB of RAM on a 128-bit bus.
      Why not just put 16 GB on this card?
      Is the 192-bit bus the problem?

    • @defeqel6537
      @defeqel6537 18 hours ago

      @@kimmyksbro3116 Each memory controller, which connects to 1 or 2 memory modules/chips, is 32 bits, so you have 4 of them for a 128-bit bus. Each (GDDR6) memory module can be 1 or 2 GB, so you get 4 or 8 GB of capacity; if you want 16 GB, you need to connect 2 modules to each memory controller. With a 192-bit bus you have 6 memory controllers, so either 6 GB, 12 GB, or 24 GB (with 2x 2 GB modules per MC).
      In theory you could have some memory controllers connected to 2 memory modules and some to a single one, but that makes things much more difficult in terms of memory bandwidth and data access.
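
The same constraint in code form: with 32-bit controllers and uniform 1 GB or 2 GB GDDR6 modules (one or two per controller), 16 GB simply is not a reachable total on a 192-bit bus.

```python
def ways_to_reach(bus_bits, target_gb):
    """Uniform populations only: every controller gets the same module count and size."""
    controllers = bus_bits // 32
    return [(n, size) for n in (1, 2)       # modules per controller (2 = clamshell)
                      for size in (1, 2)    # GB per module
                      if controllers * n * size == target_gb]

print(ways_to_reach(192, 16))  # []               -> not reachable, hence 12 GB on the B580
print(ways_to_reach(192, 12))  # [(1, 2), (2, 1)] -> six 2 GB modules, or twelve 1 GB in clamshell
print(ways_to_reach(128, 16))  # [(2, 2)]         -> the 4060 Ti 16 GB clamshell layout
```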

  • @mrbobgamingmemes9558
    @mrbobgamingmemes9558 18 hours ago

    I hope Intel GPUs can be competitive this time.

  • @heroicsquirrel3195
    @heroicsquirrel3195 18 hours ago

    I want intel to release a card that competes with the upcoming nvidia 5080/70

  • @engahmednofal
    @engahmednofal 16 hours ago

    thanks

  • @Dhruv-qw7jf
    @Dhruv-qw7jf 20 hours ago

    It does cost more. About 30 dollars or so more, because that's what the 3060 cost compared to the 4060 if we compare MSRP pricing.

    • @defeqel6537
      @defeqel6537 19 hours ago +1

      1 GB / 8 Gb of VRAM costs under $3 atm (and has for quite some time), plus a bit of additional die space when talking about a wider bus width, and some traces on the PCB. It's really not very expensive.

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 18 hours ago

      @@defeqel6537 Yeah, I mean, 12 GB at $250 means that higher-capacity VRAM is not that expensive.

  • @i3l4ckskillzz79
    @i3l4ckskillzz79 18 hours ago +2

    Another paper launch but people keep milking the 250 price point

  • @davidfluty7213
    @davidfluty7213 17 hours ago

    The ability of AMD and Nvidia to ignore the insatiable need for more VRAM is not just confounding but also a huge missed opportunity, not just for sales but for developers, who could be freed to be more creative instead of being confined to 8 GB.

    • @Machistmo
      @Machistmo 15 hours ago

      AMD had that space to themselves. Now here comes Intel with a $400 card that no one realizes is $400 yet, with the performance of a 6600 XT. Come on, people. STOP BEING GOLDFISH.

    • @techonlystuff99-tw6cm
      @techonlystuff99-tw6cm 13 hours ago

      Nvidia does this on purpose so the only options are 3090/4090

    • @kevinj24535
      @kevinj24535 12 hours ago

      If the cards now had 24gb instead of 12gb, the game developers would not be any freer in their creativity. The majority of players have inexpensive cards that are somewhat older and have little VRAM. The game developers are guided by this. They want the largest possible buyer base. The market is moving so slowly that the steps with VRAM will move very slowly.

  • @aceingaming9656
    @aceingaming9656 21 hours ago +4

    Let's go Intel. We need you to thrive in the GPU industry. We're very dissatisfied with AMD and NVIDIA. We need a new competitor. Please make those cringe ads that show how you're beating AMD and NVIDIA. Make sure you deliver your advertised performance. 😊

    • @Machistmo
      @Machistmo 15 hours ago +1

      And what do we see happening right now? Google "BUY INTEL B580". How much are they again? Now compare this hunk of crap to a $400 card. The 4060 Ti is a no-brainer. Get an Nvidia MSI 4060 Ti card right now before those prices go up. Are you people goldfish?

  • @Bdogy
    @Bdogy 19 hours ago +2

    Keep shilling ray tracing and 12gb won't even be enough for the shit games they put out these days

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 18 hours ago

      Worst of all, there are a few games that force ray tracing, despite the fact that Steam's most popular GPU is 💩 for ray tracing unless you use ultra-performance upscaling.

  • @drinkwoter
    @drinkwoter 22 hours ago

    Now it would be interesting to see if UserBenchmark makes the B580 perform higher than a 5090 in some made-up measurements.

    • @newearth9027
      @newearth9027 22 hours ago

      I still don't know how that site is even around still lol

  • @porksandbean
    @porksandbean 10 hours ago

    Nvidia's lying to you. Tom just wants his options open when shit hits the fan :D

  • @MechAdv
    @MechAdv 14 hours ago

    The problem with this “entry level” GPU is that it uses as much silicon as a 4070 super to produce 4060 performance. Intel probably is selling these at a loss even BEFORE factoring in R&D. Hope they can stick around to actually make any money. The discount on TSMC wafers they lost because of their idiot CEO running his mouth probably ate the entire profit margin of Battlemage.
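
For context on the silicon claim, a quick comparison using commonly reported, approximate die sizes; these are ballpark figures from public reporting, not official specs:

```python
# Approximate die areas in mm^2 (publicly reported, treat as rough).
b580  = 272   # Arc B580 (BMG-G21), TSMC N5
ad107 = 159   # RTX 4060
ad104 = 294   # RTX 4070 / 4070 Super

print(f"B580 vs 4060 die area:       {b580 / ad107:.2f}x")  # ~1.7x the silicon for similar performance
print(f"B580 vs 4070-class die area: {b580 / ad104:.2f}x")  # ~0.9x, i.e. roughly 4070 Super-sized
```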

  • @Krushx0
    @Krushx0 21 hours ago +3

    It looks like he just avoided answering it, actually. We need more RAM, 8 GB is not enough, our calculation is correct, but Nvidia says otherwise, yet they are not wrong... Oh, FFS, come on, man, don't do that clown dance and tiptoe around. Be straight with us, have a spine, and we can respect you.

  • @cosmic_gate476
    @cosmic_gate476 17 hours ago +3

    YouTubers hyping this non-existent product endlessly 😆

    • @Machistmo
      @Machistmo 15 hours ago +1

      THIS. THIS RIGHT HERE. Its garbage anyway.

    • @GrainGrown
      @GrainGrown 13 hours ago

      ​@@Machistmo *It's

    • @Machistmo
      @Machistmo 8 hours ago

      @@GrainGrown thanks auto correct. By the way you’re late and fired

  • @bighousemusic628
    @bighousemusic628 14 hours ago

    Team blue, baby. I'm tired of Ngreedia scamming us gamers. I've seen the Arc B580 mop the floor with the RTX 4060, 4060 Ti, RTX 3070, 3060, and AMD RX 6700 XT and 7600 XT.

  • @christophermullins7163
    @christophermullins7163 19 hours ago +1

    The answer is no. They wanted this GPU to fall in line at about the 4070 level and compete nicely there for $400 with a standard amount of VRAM. Performance fell short, so now it is spun as "12 GB for the entry level". Not that it's a bad move or a bad look. Good for Intel.

    • @nimrodery
      @nimrodery 17 hours ago

      No, this unit only has 20 Xe cores (5 "render slices") compared to the previous gen with 32 (8 "render slices"), the increased efficiency over last gen means a higher tier GPU is possible, there's just no release date set and Intel hasn't said much. Intel's marketing also didn't compare the B580 to the top tier previous gen, even the naming suggests a cut-down version.

    • @Machistmo
      @Machistmo 15 hours ago

      Dude, 4070? What now? It's barely on par with, and it looks like shit compared to, a 4060. I hate Nvidia passionately, but I would get an Nvidia 4060 8GB before this card.

    • @christophermullins7163
      @christophermullins7163 14 hours ago

      @@nimrodery Can you imagine if Intel, with their infinite wisdom and knowledge of the 4060-like performance, decided to compare the GPU to a 4070? Yeah... the fact that they didn't do that means they were aiming for the low end from the start. Riiight. Not. The number of cores is completely meaningless as well; different architectures means they cannot be compared. Remember the leaks and rumors that suggest Intel is aiming for 4070 Ti to 4080 Super performance at the top end? Well, that is supposedly the 256-bit GPU. Just like Nvidia: if the 256-bit GPU is aiming for 4080 performance, it stands to reason that the 192-bit GPU aims for the 4070 (or the 4070 Ti, because the B580 is the full die like the 4070 Ti is). And the B580 is the full die and is not a "cut down" GPU like you say. The B580 has literally identical silicon costs to the 4070 Ti and it is only 60% of the performance of that GPU... and you really think that was Intel's plan?

    • @nimrodery
      @nimrodery 13 hours ago

      @@christophermullins7163 Basically everything you said is wrong. "Sources" have been saying for a long time you shouldn't expect a 4080 tier GPU, and if you speculate on the performance of a 32 Xe core part based on this one, the facts seem to concur. Maybe you were hoping for a faster B770 but there has been literally zero figures divulged by Intel, we never knew beyond speculation such a GPU even exists. As for the B580 it's clearly a cut-down GPU, Intel made improvements in every area including heat, it's just ridiculous to suggest this is somehow the best they could do. If we don't see a B770 it's because Pat Gelsinger opened his mouth, not because it's too hard to make. The hard part was getting a dGPU on the market in the first place, and they've released 7 not including collabs with other companies. I don't know about you but I expected 0.

    • @nimrodery
      @nimrodery 13 hours ago

      @@christophermullins7163 And you compare stuff based on what else is on the market at a similar price. I thought that was pretty basic.

  • @steven7297
    @steven7297 16 hours ago

    AMD is still better. The dude already spilled the beans and said they have already made the next 2 generations of GPUs, meaning after 2 years these cards get put on the back burner.

  • @cebuanostud
    @cebuanostud 19 hours ago

    Tom Petersen, you're the man.

  • @ktkace
    @ktkace 17 hours ago

    B5800 被能夠提斯奈特勒熱卡莫麼

  • @Integroabysal
    @Integroabysal 21 hours ago +2

    I see a lot of people hyping this up, like "ah, I want Intel to release a B770" and stuff like that. I want to ask you: how many of you are ready to buy a $450 GPU from Intel that's on par with a 4070/4070S instead of actually going with Nvidia? Not many. I see a lot of people still putting an i5-12400F in their rig despite the fact that the Ryzen 7500F/7600 platform is like $20-30 more expensive and has immense future-proofing; people stick to buying what they owned 10 years ago, and if 10 years ago they had an i5-6400 they will gladly buy a 12400F, maybe even a motherboard that supports BCLK overclocking, get it to 4.8-5 GHz, and be happy for the next 10 years. You are all acting here like the majority of people do generation-to-generation upgrades; people usually buy a PC for a ~5-year period and then upgrade, and the number of people doing generational upgrades is less than 10%. If Intel wants market share they should be even more aggressive at this price point: maybe a B590/B750 at around $300 with the same amount of memory but 20% more Xe cores would do even better than a B770, because at the $400 price point there is already a lot of competition from new-gen and past-gen GPUs.

    • @Rose.Of.Hizaki
      @Rose.Of.Hizaki 21 hours ago +2

      I would, without hesitation. Give me an Intel card that is fairly priced and does 165-200 fps at 1440p, and Intel will get my money. I'm still here rocking an old 6700 XT, waiting for the 8000 series next year.

    • @mythydamashii9978
      @mythydamashii9978 19 hours ago +2

      I would definitely get the B770. The only problems that I have are the taxed prices, stocks and a huge hole in my bank account after buying it (going into adulthood)

    • @cjmunnee3356
      @cjmunnee3356 19 hours ago

      I'm kinda new to the whole PC building situation, so maybe I just have a fresh perspective. But the build I'm planning involves a Ryzen 7600/7600X for the CPU and the Intel B580 GPU. Hopefully, I can scrounge together the money for it soon.

    • @Machistmo
      @Machistmo 15 hours ago

      @@Rose.Of.Hizaki That isn't this Intel card. The 8000 series might be.

    • @Machistmo
      @Machistmo 15 hours ago +1

      @@cjmunnee3356 Don't buy components one at a time for a system. Save the chunk and decide how to spend it when the time comes and you have enough for what you wanted. Never buy a computer in pieces. Unless your President is about to enact pointless tariffs that will do nothing but weaken your nation globally; then, whatever. Also, start with used parts you can source locally. Start with a used gamer's PC. Watch your local sellers, Nextdoor, FBMP and the like. I just sold a banger system for 760 in a brand-new Zalman case; the dude walked off with a HUGE deal. Sold it via the Nextdoor app my GF uses.

  • @CRF250R1521
    @CRF250R1521 16 hours ago

    12gigs is still pathetic lmaooo

    • @dominicshortbow1828
      @dominicshortbow1828 15 hours ago +1

      not in the $250 price range. it's actually pretty good

  • @noticing33
    @noticing33 16 hours ago

    For 1080p, 8 GB is plenty if games are properly optimized for low-end users; sadly, this will only encourage more laziness.

    • @Machistmo
      @Machistmo 15 hours ago

      This is just wrong. I wish you were right but you are DEAD wrong. Encourage more laziness? What in The FUCK are you on about?

  • @YashTheDon
    @YashTheDon 20 hours ago +2

    I am still gonna buy an RTX 4060.

    • @ComfyShortz
      @ComfyShortz 20 hours ago

      The B580 is amazing at its price point; you are making a mistake.

    • @YashTheDon
      @YashTheDon 20 hours ago +1

      @@ComfyShortz I know, but it's just that it is not available in India.

    • @Bargate
      @Bargate 18 hours ago +2

      @@ComfyShortz It's literally going to be out of stock until January in most countries. I couldn't blame someone if they can't be arsed to wait and just play some games. Would get 7600XT in that price range for new GPUs but hey.

    • @roklaca3138
      @roklaca3138 18 hours ago +1

      Lol... 😂😂😂😂😂

    • @YashTheDon
      @YashTheDon 17 hours ago

      @@Bargate But the RTX 4060 is good in 2D animation, faster than the RX 7600 XT; that's why I am getting it.