Made the transcript readable. Too bad I can't add it as a subtitle tho...
0:00 When Nvidia announced that its new Pascal consumer GPUs wouldn't be using
0:04 HBM video memory,
0:06 some folks were left feeling a little confused.
0:09 However, that isn't to say that they didn't change anything about their VRAM,
0:14 as these cards are using the also newly released GDDR5X. So what's special about
0:21 it, other than the fact that it has an X in its name, which might be trying to
0:26 remind people of words like exciting,
0:29 extreme or too expensive?
0:33 Well, that might depend on who you buy your components from. So instead let's
0:38 talk about what GDDR5X actually is.
0:41 It's a type of video ram which does things like hold textures, store images
0:46 in frame buffers and provide the GPU with lighting information. In short, it holds
0:51 the data that your GPU actually has to process so that your computer will
0:55 display your game properly. But the real question is: what's special about
1:00 GDDR5X in particular, and is it going to make your games look any better and run
1:05 faster?
1:06 Well, usually VRAM tends to matter more the higher the resolution your games are
1:10 running at, making it especially important for multi-monitor setups or if
1:14 you're trying to run things with aggressive anti-aliasing or high res
1:19 texture settings, as these things require more stuff to be held in memory. And with
1:25 games becoming more detailed than ever, and higher-resolution monitors becoming more
1:29 common and cheaper, quicker memory is also becoming more important.
1:34 Graphics memory has placed an emphasis on wide bandwidth for a long time, but
1:39 GDDR5X improves upon the existing GDDR5 standard by allowing twice as much data
1:47 to be accessed at one time.
1:49 64 bytes instead of 32. And while this may not sound like much, we should see
1:54 overall transfer rates of around 10 gigabits per second per pin, or even higher,
1:59 and nearly 450 gigabytes per second of overall bandwidth, while
2:04 drawing less power than its predecessor. Pretty sweet, but slow down just a minute:
2:09 what about that fancy new HBM that AMD
2:12 came out with? Why is GDDR5X a big deal then?
2:17 While it's true that HBM, which you can learn more about here, has higher
2:21 throughput and a smaller footprint than GDDR5X, which are both great things on your
2:25 graphics card, GDDR5X should be quite a bit easier, not
2:32 to mention cheaper, to implement than HBM, since it's still quite similar to the
2:37 ubiquitous GDDR5. And although both Nvidia and AMD are heavily rumored to be
2:43 moving towards HBM2 for some of their upcoming architectures, GDDR5X
2:49 might appear on more mid-range future graphics cards as well, as the greater
2:54 accessibility of high-res displays and the growing interest in virtual reality
2:58 headsets will make high-throughput VRAM increasingly important. Currently, GDDR5X
3:06 is only available on Nvidia's new flagship: the GeForce GTX 1080.
3:10 But don't be surprised to see it paired with more GPUs down the road. And whether
3:15 you're going for GDDR5X or HBM, pay attention to what kind of VRAM is in
3:20 your next card if you want to get into surround gaming, hook up an Oculus Rift or
3:24 HTC Vive, or just really, really like things with X's in them.
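The numbers in the transcript above can be sanity-checked with a quick back-of-the-envelope formula: peak memory bandwidth is roughly the per-pin data rate times the bus width, divided by 8 to convert bits to bytes. A minimal sketch (the 256-bit and 352-bit bus widths are assumptions based on the GTX 1080 and 1080 Ti, not figures stated in the video):

```python
# Rough GDDR5X bandwidth estimate: per-pin rate (Gb/s) x bus width (bits) / 8
def bandwidth_gbps(pin_rate_gbit: float, bus_width_bits: int) -> float:
    """Return peak memory bandwidth in GB/s."""
    return pin_rate_gbit * bus_width_bits / 8

# 10 Gb/s per pin on a 256-bit bus (assumed GTX 1080-style configuration)
print(bandwidth_gbps(10, 256))  # 320.0 GB/s
# A wider 352-bit bus at the same rate lands near the ~450 GB/s the video mentions
print(bandwidth_gbps(10, 352))  # 440.0 GB/s
```

This also shows why the "10 Gb/s per pin" figure matters more than total capacity: doubling the per-pin rate doubles bandwidth without widening the bus.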
ñ
thank you lol
why...just why would you do this
Luke's hand movements aren't smooth, LTT needs 60fps
But that's not cinematic
But for that we go to the movies.
+Lati Sullivan tell that to the console peasants
Zylixel I don't speak to those :P
+potato at least he can afford nice things. Nicer than a Clinton supporter like you...
You spelled extreme wrong. Any child from the 90's knows that it is spelled XTREME!
I'm a kid from the 90s and I approve this message
I'm a message from the approve and I 90s this kid
I hurt my potato and I approve this taco
Bryan A
I'm a 9gager and I approve your potato
I'm a child from the 90's and I approve this comment stream
Great, the RUclips sub box has broken again. I got to this video purely by luck.
Worked for me, so idk, RUclips is weird :D
Same, geez, I had to visit their channel. The newest video in my subscription feed is 21 hours old even though there should probably be like 30 new videos there.
+Sezze's Stuff same for me
TF2Maps?
Am I the only one who looks through their subscriptions to find vids because half of them don't show up?
03:50 Woah Luke t-shirt changing color
LinusShirtChangingTips
he also had a haircut
3:30*
How many times did he change shirts and haircut, lol?
HIS FUCKING NIPPLES.
What is this thing called "counter strike global offensive". Pssshhh must be some weird Nintendo game.
"How to learn Russian in 1 day"
I'm a virgin and this comment triggers me
It's a minigame for candy crush
+Tomshotz im a comment and this virgin triggers me
Its an amazing xbox game, it changed my life
Hey! That's some damn good editing and lighting in this video on top of Luke's ever increasingly improved articulation. You guys just keep getting better and better at this. Thumbs up!
yeah, compared to last year's Luke, who looked like he'd been stored in a freezer, stiff as chopsticks, now he looks more alive. didn't like him before because of the stiffness
I love the mentions of other LMG videos; the moment they come up I open the link in another tab for when I'm done with the current video. Beats playlists where I've seen some vids already, and I get into the zone of enjoying non-stop LMG content.
This episode looks much, much better than the old ones. The animation and video effects are awesome. Did Techquickie hire some new guys?
The GDDR5X SGRAM (synchronous graphics random access memory) standard is based on the GDDR5 technology introduced in 2007 and first used in 2008. The GDDR5X standard brings three key improvements to the well-established GDDR5: it increases data-rates by up to a factor of two, it improves energy efficiency of high-end memory, and it defines new capacities of memory chips to enable denser memory configurations of add-in graphics boards or other devices. What is very important for developers of chips and makers of graphics cards is that the GDDR5X should not require drastic changes to designs of graphics cards, and the general feature-set of GDDR5 remains unchanged (and hence why it is not being called GDDR6).
*AMD's HDR as fast as possible.*
+kowaletzki think he meant HBM
I think he really meant HDR which is part of the new Display Port interface, 1.4. AMD RX480, or example, is an upcoming GPU which supports this.
***** i did not mean HBM and did indeed refer to High Dynamic Range (HDR),
as I want to know if I need an HDR monitor or if all monitors with a good colour range benefit? (IPS)
***** "as far as i know" that's why i want a as fast as possible :)
+Redline You need a monitor with displayport 1.3+ or hdmi 2.0a/b+ and the monitor must support 10bit colour depth per channel (over 1 billion colours instead of today's 8bit depth - over 16 million colours). It increases the colour contrast and brightness, too.
That's it.
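The colour counts in the reply above check out: with three channels (red, green, blue), the number of representable colours is (2^bits-per-channel) cubed. A quick check:

```python
def total_colors(bits_per_channel: int, channels: int = 3) -> int:
    # Each channel has 2**bits levels; total colours are the product across channels
    return (2 ** bits_per_channel) ** channels

print(total_colors(8))   # 16777216  -> "over 16 million colours"
print(total_colors(10))  # 1073741824 -> "over 1 billion colours"
```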
Higher resolution monitors becoming cheaper?? lmao thats a good joke Luke, youre a funny guy!
When you get everything for free it's easy to be disconnected.
Yes it's getting cheaper, compare today's models with last year's
+bathrobeheroo I don't consider $350 for a monitor cheaper.
For a 4k monitor $350 ish, is very acceptable seeing as they're usually $500+
Cheap: "Costing little money"
Cheaper: "Costing less than usual or expected"
Great timing for this video to be released since my 6 year old graphics card finally died last night 😂
So from 6 year old to highest possible new end? Great choice! xD
Isaax Ikr xD
OK something tells me that the 6 year old CPU will bottleneck your GPU
Dustin M no shit. But it's my main rig right now and I need a dedicated GPU
+Ttomisabeast I figured that you would know already but I posted it just in case
i like how Luke turned green when a background was applied to the white surface behind him :)
i know its a greenscreen, still funny though.
Video editor on point today!
Just updated my amazon link to the linus support ^_^ keep up the good work guys!
Anyone else notice how baked he was in this video?
great video! I thought all the new GPU were coming with HBM and this GDDR5x was so unexpected! 8/8 Luke
it's awsome to see Luke here :D
So we're slowly transcending to luketechtips?
Tech quickly is so close to 1,000,000 subscribers ⚡️⚡️⚡️
GDDR6X in 2020 be like : Hello there
really nice editing
Damn, Luke is quick with changing his shirt!
That white shirt was such a great idea
I just noticed that Techquickie is a separate channel. I always thought that it was just a playlist in the LTT channel.
Luke has become a way better performer! Look how he has improved! Great performance mate! I am not even talking about Sebastian. You would think that Linus had the ability to be natural on camera from birth though he also developed that ability in the process.
3 SHIRTS IN 1 VIDEO? EXTRAORDINARY!
Wow, Luke actually said "architecture" instead of "architexture". Celebrate, folks!!
Would be interesting to see a techquickie about SSH
Those CS:GO Asiimov key caps look really cool
i guess next year's april fools video will be a techquickie of a extremely specific topic to spoof videos like this one
Would be nice to see a "could they do it" series. Where a time period is chosen and see what could be done with their technology and how it would be different from the modern implementations. Was thinking of back when calculators were the size of rooms what would they be capable of if the computer was the size of a large city and took top tier reactors to run? How large would the computer have to be and would it ever have been able to run something like 2016's Doom.
the size of the observable universe
And also, what do Cores do and what does it benefits to it. This could be a good explanation on Techquickie
Can't believe that the 10 series GTX was released more than 3 years ago!
I have done programming in the CUDA environment. The stuff which is GDDR... and all is considered to be slower. A GPU has multiple stages of memory:
1) First the Global Memory, which is what you see in the headlines: 2GB... 3GB and all
2) L2 memory
3) L1 memory
4) Register memory
The fastest is Register memory, which is nearly instantaneous, and the slowest is Global Memory. Now, there is another bottleneck in the overall GPU picture: the transfer from system RAM to GPU Global Memory. The GPU's architecture is not independent like the CPU's; it depends on the CPU, and it only runs once the CPU transfers data and instructions to it. This PCI Express link is the biggest bottleneck in GPU performance. But when I heard that hybrid memory is arriving, where the CPU and GPU share the same memory address space, it was great, as it addresses this problem, though I don't know how far that technology has evolved. I hope it does. Code needs to be rewritten to exploit it, but that can be managed in the libraries so that the end dev is not affected.
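The bottleneck this comment describes can be illustrated with rough arithmetic: PCIe 3.0 x16 tops out near 16 GB/s, while the GPU's own global memory is an order of magnitude faster, so host-to-device copies dominate unless data stays resident on the card. A hedged sketch (both bandwidth figures are ballpark assumptions, not measurements):

```python
# Compare the time to move 1 GB over PCIe vs reading it from GPU global memory
PCIE3_X16_GBPS = 16.0   # approx. peak PCIe 3.0 x16 bandwidth, GB/s (assumed)
GDDR5X_GBPS = 320.0     # approx. GTX 1080 global-memory bandwidth, GB/s (assumed)

def transfer_ms(gigabytes: float, bandwidth_gbps: float) -> float:
    """Time in milliseconds to move `gigabytes` at `bandwidth_gbps`."""
    return gigabytes / bandwidth_gbps * 1000

pcie = transfer_ms(1.0, PCIE3_X16_GBPS)  # ~62.5 ms over the bus
vram = transfer_ms(1.0, GDDR5X_GBPS)     # ~3.1 ms from on-card memory
print(f"PCIe copy is ~{pcie / vram:.0f}x slower than a VRAM read")
```

Under these assumed figures the copy is roughly 20x slower, which is why CUDA programs try to transfer once and then keep working on data already in device memory.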
this comment tho ! this is totally AFAP
It is 2019 and everyone abandoned GDDR5X for GDDR6
lol
Just thought about that XD
And now there's GDDR6X 😅
@@tylerdurden3722 wonder if GDDR6X will be abandoned for GDDR7 in one or two years since it is not JEDEC standard but made by Micron specifically for Nvidia (like GDDR5X back in 2016/2017)
@@tHeWasTeDYouTh eventually, yes.
GDDR5 is based on DDR3
GDDR6 is based on DDR4
Intel already has CPUs with DDR5 memory controllers in the pipeline.
So my guess would be, that GDDR7 is not far behind DDR5. It would most likely have independent read and write, just like DDR5.
I personally think that Samsung's HBM, that can work without an interposer, will eventually take root in mobile devices and then dominate everywhere else.
GDDR6X is actually quite special. It doesn't really function with just 1's and 0's. It's not binary.
It functions with 0, 1, 2 and 3. It has more states than just "On" or "Off". It uses a quaternary number system. Called PAM4.
Ethernet and SSD's are already using such higher number systems (that are not really binary). GDDR6X takes inspiration from those technologies. This allows more information to be stored/transferred, per clock. It would be a shame if this is abandoned.
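The PAM4 idea in the comment above can be sketched in a few lines: each symbol carries one of four levels (0-3), i.e. two bits, so a byte fits in four symbols instead of eight binary ones. A toy encoder/decoder, purely illustrative and not the actual GDDR6X signaling:

```python
def to_pam4(data: bytes) -> list[int]:
    """Split each byte into four 2-bit symbols (values 0..3), MSB first."""
    symbols = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            symbols.append((byte >> shift) & 0b11)
    return symbols

def from_pam4(symbols: list[int]) -> bytes:
    """Reassemble bytes from groups of four 2-bit symbols."""
    out = bytearray()
    for i in range(0, len(symbols), 4):
        byte = 0
        for s in symbols[i:i + 4]:
            byte = (byte << 2) | s
        out.append(byte)
    return bytes(out)

# b"X" is 0x58 = 01 01 10 00 in bit pairs -> symbols 1, 1, 2, 0
print(to_pam4(b"X"))                         # [1, 1, 2, 0]
print(from_pam4(to_pam4(b"GDDR6X")))         # b'GDDR6X'
```

Four symbols per byte instead of eight bits per byte is exactly the "more information per clock" the comment is pointing at.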
AMD's Gameworks As Fast as Possible.
Why do all of your videos shot in front of the green screen still have so much green in the picture? It really irritates me how green Luke's skin looks on the "edges" of his body... LMG... pls fix :D
Nearly 1 mil subs
you do not put NVIDIA and CHEAPER into the same sentence.........
You just did...
My left ear loved the vid
TL;DR: It's a stopgap.
If an old R9 290 was fitted with the same GDDR5X as a 1080, it'd have 640GB/s of memory bandwidth. As it is, a lowly R9 390 has to make do with only 384GB/s.
1080 uses the slowest and cheapest gddr5x there is.
It uses the ONLY gddr5x there is, so its technically the fastest and most expensive version there is fuck boy.
+Joe Bob please stop embarrassing yourself... There are multiple manufacturers of GDDR5X and ofc they have different characteristics. Nvidia uses Micron's version which is currently the slowest there is on the market (10Gb/s).
I think Techquickie would generally be a much better channel if there was some background music.
What's GDR5 and GDR5X that you mentioned a few times in the middle of the video? ;)
i am from the future and GDDR6 is available in mid-range PCs
The editing omg
You wanna know something ironic? The ad I got before the video featured Linus...
Hey your skin is reflecting the greenscreen!
Luke looks so seasoned.. He's like fuck my life
Here's a question and idea for a possible future video
Is it possible to limit the VRAM of a video card and/or the speed of the VRAM? The reason I ask is because I'm wondering how much faster the GPU itself of the 1080 Ti is vs the 1080. We already know the 1080 Ti uses GDDR5X VRAM and there's a bit more of it on the Ti vs the standard 1080. My goal is to see how much faster the actual GPU die is.
To be clear, I already own a 1080 Ti (Gigabyte Aorus Gaming, non-Extreme edition cuz money), but I'm just curious how it would stack up on a more level playing field vs its non-Ti counterpart
while you guys were talking about how GDDR5X is better than HBM, the pic shown in the background of a GPU has 2 PCIe sticks or whatever it is called.
what gpu is that which uses 2 PCIe slots??
um, I'll help you, the ports in the upper side are SLI ports, not PCIe. So the card has a single PCIe port.
OVR ohhh...okay....so thats where the sli bridge is commected?
Yes indeed.
OVR thanks a lot brother :)
lmao
This should have been called GDDR5Xplained.
How long did it take you to create this video, Luke?
On HBM: "Yeah, the R9 Fury Series has that on itself, but it is rumored to be on Pascal in the next generation and you know, High-End today is the Mainstream tomorrow!"
On GDDR5X "Look at the nVidia Graphics Cards featuring this all new and nice memory type!! We even give you a spec sheet of the GTX 1080, because you need that for GDDR5X to know what's it all about."
Talking about bias.
Please do an explanation on "Tree command" in the future episodes.
Graphics cards with GDDR5X or GDDR6 all work over PCI Express in the same motherboard. For example, an ASUS Crosshair V Formula-Z works with a GTX 1080, a GeForce RTX 2080 Ti, or a GeForce RTX 2060.
that knuckle crack 1:14
Nice t-shirt magic.
im thinking this should be on the forum, but why does increasing the memory clock speed give better results than increasing the core clock speed in certain games, (happened in payday 2)
3:26 Arnold voice FTW XD
Hey Luke, why'd you change your shirt for the ad?
people weren't "confused" that we didn't get HBM2. we were disappointed!
Next do what is tessellation in 3D video/image rendering and how FFT speeds up the process.
isn't hbm 2.0 available to amd only due to their partnership?
anyone realize luke's shirt is changed?
So, if you play a low graphically demanding game like Minecraft on GDDR5X, will it run at higher than 1080p60?
Why did Luke change his T-shirt mid-way, as fast as possible!!
I really liked that keyboard. Too bad i need to register to surf on massdrop. Not doing that.
Why didn't the 1070 get GDDR5X?
gddr5x is more expensive than gddr5. Given this is the first "generation" of GPU they have chosen to implement this on, they had to put it on the highest-end single-GPU variant. The next gen will probably have gddr5x on the xx70 and xx60 models while the xx80 and xx80 Ti will have HBM or HBM2.
on gtx 1070 with custom pcb and extra power phases to the vrm, you'll be able to overclock the gddr5 ram to near gddr5x ram speeds
and then you can overclock the GDDR5X to negate that and get the lead again.
+_ Skylake _ oc gddr5 to 11ghz?? dafuq no way. the 8ghz on the 1070 is already near max for gddr5.
+Omar Abdel Aziz nope +700mhz on these chips are doable. that puts it at 9400mhz and g5x is at 10000mhz so the gap isnt that big :)
I think the 1080ti will have GDDR5X but the Titan(?) will have HBM 2.0 to set it a part. I may be wrong but a man can speculate can't he...
whats headphones sparkle ?? please make a video about this topic
i think that those led lights you use makes him look more pinkish. like his nose , mouth and palms.
1:01 it will make your game look better :D
He explained everything except what actually is GDDR5X 😬
1:39
Could you do a tech quickie on Intel Itanium
Nice shirt changes
Did anyone just get a notification about this for no reason?
New workshop idea More ram vs More VRam. (In a budget setting)
NICE!
I like the Exciting -> Extreme -> Expensive (0:27) :v
GDDR5X or HBM? Which is best for higher res gaming?
same socket though right you wont have to change mb?
Could you do a tech quickie on aftermarket gpu solutions please, thanks
can i ask something,what is a smallest motherboard that can support intel 750 sry im newbie
You should do a video explaining graphics card instability.
I have a crappy Gigabyte GTX 960 and it can't be overclocked AT ALL (if I try, it immediately gets artifacts and crashes) and I've always wondered why.
Maybe this 960 is factory overclocked. My Zotac GTX 770 AMP! is heavily overclocked and on my specific card there is no fucking way to increase the core clock, but my memory can take an offset of 300MHz.
Did you try increasing your voltage level? In MSI Afterburner you have to enable this function ;)
hbm = hearty bowel movement.
You could have improved on the organization of your presentation; it may be confusing for people who aren't tech savvy and watch your videos. In the video, you kept asking the questions (What is GDDR5X, what's special about GDDR5X in particular, etc.) but then the next thing you talk about is a definition of regular VRAM and what regular VRAM does, respectively, before going into actually answering the question. The viewers expect the answer after the question, not after a whole bunch of info dump. The video would have been a lot better if you first went into describing VRAM and then go into talking about GDDR5X specifically.
can you make a video on Linus tech tips with nvidia surround with 1080 sli?
So it doesn NOT have High Bandwidth Memory, but it does have higher bandwidth memory... Seems legit.
3:30 and 4:29 Magic Shirt Change!
3:29 neat magic trick. shirt colour change in a second. lol
is there any major difference between installed games(DVDs) vs download games(torrents)?
Torrents would be pirating it, so you risk the crack not working or other errors. A DVD is physical media that after installation is only good for launching the installed game. Steam or other places that host games you can buy are just a way to purchase a game and own it digitally without physical media
+Scolio oki thank you!😃
+Scolio just because a game is downloaded via a torrent doesn't mean its always a pirated copy
Only real exceptions are already free games and games that are downloaded from a torrent site but use a legally purchased license
Luke´s Rudolph nose. New t-shirt though
So will GDDR5X work in a GDDR5 slot
Rudolph makes a good host. Also press 2 for butts.
So my future 1080, as a GDDR5X, is not compatible on a GDDR5-compatible motherboard?
(Planning to switch my 950 with a 1070, but...)
there's no such thing as a GDDR5-compatible board; your mobo has a PCIe x16 gen 3 slot and the 1080 just goes into that, just like your 950
+Top Hats and Champagne OOOHH! THANK YOU GOOD SIR!!
The way you speak = trolling Linus
make a video about " what is cpu bottleneck "
im sure they have made one, might not be called that but something like it
It's when your CPU is far weaker than your GPU, and thus "bottlenecking" it, which prevents your GPU from ever reaching its full potential.
You could fix that either by getting a new cpu, or just overclock the hell out of it :P
+I'ma Rob'Oat i think he knows that already
I'ma Rob'Oat thanks alot
+Adon Wullschleger then why would he ask?