+SHeeDeED I am so happy you're not instantly acting like a twit because I went against your opinion. I like you. Although desirable, it's not a cheap feature to implement, and you'd see the average price of monitors increase. A lot of consumers would complain, as they wouldn't understand or utilise the benefit of 144Hz sync with a 1ms response time and would see it as a gimmick. I work in retail sales and see these sorts of things frequently; people complain about everything all the time. It's painful, but I enjoy educating consumers to make an informed decision before making a purchase.
+SHeeDeED With all this feature battle between AMD and Nvidia, we're going to end up with a segregated platform just like a console. If both have the same features (like G-sync and FreeSync) they should make one a standard and be done with it, and Nvidia should stop charging a licence fee for its G-sync tech.
+bogdan m Actually, yeah! If I'm buying a GPU from either manufacturer, that should be the end of it! No worrying about stuttering or tearing, and no paying a premium of around $200 extra just so that my monitor and GPU work together! Wtf is that? It should be a standard, with ALL monitors working with ALL GPUs, since neither Nvidia nor AMD produces monitors!
Purchased my Asus PG278Q recently, and it's an absolutely gorgeous display. Not only the 2560x1440 resolution, but the 144Hz refresh rate and 1ms response time make for a fantastic gaming experience. Playing FPS games such as Battlefield 4/Hardline (and other FPS and graphically demanding games), if I turned very quickly I could see the screen lag trying to keep up. Problem solved with the new monitor and my old GTX 770 4GB. The 27" Samsung 1920x1080 @ 60Hz is a great secondary monitor, but I'm finally glad to have something that is actually responsive as a main display.
@@large_sailor I believe it! I can't stand tearing or stuttering in games; I'm just hyper-aware of it, and it gives me headaches too. So if I didn't have any tearing or input lag it'd be soooo worth it.
Swear this was announced ages ago, and it's only coming now! But I don't really see much difference. If the prices are the same as a normal monitor I'll get one, but I wouldn't go out of my way for one.
You shouldn't ever watch graphics comparisons or videos like this on YouTube: the only video streaming site that does NOT support true 1080p (the 4K option comes CLOSE) or 60fps... at all...
I've bought a G-SYNC monitor. After a day the G-SYNC stopped working because of a setting in the monitor. I really noticed the difference already. Had to reset my monitor to fix it. But god! I can't play games without G-SYNC anymore! Once you go G-sync, forget about the V-stink!
I don't doubt it makes a difference... but since everyone is getting it anyway I wouldn't go out of my way to get a G-sync monitor XD. Look up Adaptive Sync (VESA DisplayPort) and FreeSync (AMD)... there's no real advantage in caring what monitor you have as long as it has DisplayPort, which should be a requirement anyway.
This video seems pointless to me. The examples basically only show us the difference between V-sync on/off, not G-sync. AFAIK you just can't see the benefit of a G-sync monitor on your normal screen. It's like trying to show us how great 4K resolution is on our 1080p screens; it just doesn't work that way, and you have to see it in person.
Well, a perfect example: Grim Dawn has A LOT of noticeable screen tearing. With V-sync turned on, instead of screen tears there are very noticeable stutters, which actually do affect gameplay. Enter the new G-sync monitor I have, and now both the screen tearing *AND* the stutters are gone. And I have noticed the difference in other games whose tearing and stutters I previously ignored, mainly because I never had a choice but to pick between stutters and screen tearing; now I see them, and since I have the option to get rid of them, that is exactly what I do. Which is basically where you are right now. You have either screen tearing or stutters. Those are your choices, at least until you get a monitor that can handle the variation. So you either ignore the issue like it's not there and settle for it, or you do something about it. So which is it? Willful ignorance, or fork out the dough?
I like how this legend tested this in Arkham Origins. Right now my PC is broken from a RAM failure (ordering more soon). I have a G-sync monitor and an RTX 3060. Can't wait to see how it goes.
Cool technology, but it seems somewhat gimmicky to me. The only game that I've played recently that had screen tearing was Far Cry 4, and that was fixed once I turned V-Sync on in the game options. Didn't notice any stutter, either, so yeah... can't really justify the additional cost of 100s of dollars for a monitor that has G-Sync.
PC Nation No, I understand it. It's a variable refresh rate technology, so you won't notice tearing or stuttering if your framerate fluctuates. I get that. I just don't get the cost of it, when you can easily accomplish the same thing by turning V-Sync on, and dropping your settings a bit, to make sure that you stay at >60FPS.
Incessant Wake Not on my rig. I have yet to encounter any screen tear with V-Sync on. I don't doubt that G-Sync and Freesync are going to be included with pretty much everything in the future, but as of right now, I fail to see a reason to invest.
LogicOwlGaming V-sync needs at least 1-2 frames of buffering, in some cases even 3! A single frame at 60fps takes 16.67ms to render; that is the input lag without V-sync. With V-sync, you need to add 2 frames for the buffer on top of that, so you have about 50ms of input lag! That is INSANELY much, and there's no way around the problem when using V-sync. Even when you render 120fps and have a 120Hz screen, it will still take 25ms for a frame to display with V-sync. G-sync doesn't need this buffer, because the only thing it does is wait for the frame to be complete and tell the screen to display it; so you only have the input lag you would have without V-sync enabled, but the benefit of a completely stutter- and tearing-free image.
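That comment's latency arithmetic can be sanity-checked with a tiny back-of-the-envelope script. The fixed 2-frame buffer and constant frame times are simplifying assumptions (real render queues vary per game and driver), so treat the numbers as ballpark figures, not measurements:

```python
# Rough input-latency estimate for V-sync buffering vs. G-sync.
# Assumes one frame of render time plus a whole number of buffered
# frames in the swap chain; real pipelines are messier.

def frame_time_ms(refresh_hz: float) -> float:
    """Time one refresh cycle takes, in milliseconds."""
    return 1000.0 / refresh_hz

def vsync_latency_ms(refresh_hz: float, buffered_frames: int = 2) -> float:
    """One frame to render plus the frames queued behind it."""
    return frame_time_ms(refresh_hz) * (1 + buffered_frames)

def gsync_latency_ms(refresh_hz: float) -> float:
    """G-sync shows a frame as soon as it is complete: no queue."""
    return frame_time_ms(refresh_hz)

print(round(frame_time_ms(60), 2))     # 16.67 ms per frame at 60 Hz
print(round(vsync_latency_ms(60), 2))  # 50.0 ms with 2 buffered frames
print(round(vsync_latency_ms(120), 2)) # 25.0 ms even at 120 Hz
print(round(gsync_latency_ms(120), 2)) # 8.33 ms without the buffer
```

This reproduces the ~50ms and 25ms figures from the comment above.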
Nvidia will release a driver in a week or so so that some FreeSync monitors will support G-sync. The other FreeSync monitors will also have a manual option to enable G-sync.
People don't seem to notice that this is for more "technically advanced", and or people who have acclimated themselves to a higher standard for gaming. I, myself, see one of these in my room sooner or later, because V-sync just makes me cry tears of anguish when I have to turn it on due to screen tearing. It's so outdated and doesn't fit into any gaming scene other than casual because of the input lag it causes. *Yep, calling all y'all casuals out.*
Surms41 I hate V-sync so much. Screen tearing, FPS killing and input lag just ruin the experience... I remember playing Shadow of Mordor at 59 fps on my GTX 970, but now when I play it with G-sync I get a constant 80 to sometimes 90 fps.
AFAIK that helps only if the graphics card can't render 60FPS (given a 60Hz monitor), so it will get you the same performance at, for example, 45FPS without page tearing and without waiting a whole refresh for the next frame as V-sync does, so you get the best of both worlds. But if the graphics card can render that game at that resolution at 60FPS+, then it's the same as normal V-sync.
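The 45FPS-on-60Hz case that comment describes can be sketched numerically (idealized timings; a real swap chain is messier). With plain V-sync, every finished frame waits for the next fixed tick, so the gaps between displayed frames become uneven instead of a steady ~22ms. The 22ms render time below stands in for roughly 45 FPS:

```python
# Toy model: when frames appear on a fixed-rate panel with V-sync.
import math

def vsync_present_times(render_ms: float, panel_hz: float, n: int):
    """Times (ms) at which frames appear: each finished frame is
    held until the next refresh tick of the fixed-rate panel."""
    tick = 1000.0 / panel_hz
    times = []
    for i in range(1, n + 1):
        done = i * render_ms                         # frame finished
        times.append(math.ceil(done / tick) * tick)  # shown at next tick
    return times

# GPU produces a frame every 22 ms (about 45 FPS) on a 60 Hz panel.
times = vsync_present_times(render_ms=22.0, panel_hz=60, n=4)
deltas = [round(b - a, 1) for a, b in zip(times, times[1:])]
print(deltas)  # uneven gaps instead of a steady ~22 ms
```

With adaptive sync the panel would simply refresh every 22ms, which is exactly the stutter-free behavior the comment describes.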
I just got the ViewSonic Elite monitor. G-sync monitors cost more, but I can confirm that you are definitely getting something for that extra cost. I am very glad I paid extra for the more expensive G-sync monitor.
You will upgrade your GPU every 2 to 5 years, but if you have a top-of-the-line monitor it will last longer, while you also enjoy good image quality/resolution/size for movies and other stuff without needing any GPU/CPU upgrades.
Ivoruz Cz you are a special kind of trash idiot on this globe. Omar does have a point: just because of G-Sync, a monitor equipped with it costs at least 150 USD more. Why? Just because of the Nvidia licence. Funny that FreeSync doesn't add a single penny to the hardware price.
If you turn on V-Sync it will cap your frame rate at the monitor's 144Hz refresh rate, which will prevent screen tearing. G-Sync is there so that if the frame rate drops below 144fps it won't be as noticeable, and with G-Sync you're supposed to turn OFF V-Sync so that the frame rate can rise above 144.
+richyyy You are forgetting the dynamic refresh rate. Remember, your GPU generates frames as fast as it can, meaning that it will sometimes generate multiple frames before the screen even updates. The display will say 60 fps, but you wouldn't be seeing all 60 of those frames, because some were rendered before the screen updated. Haven't you ever wondered why 30 fps looks a lot better in movies than in games? Want to know why? Because movies actually show you all 30 frames, while real-time frame rates will often "skip" many of the 30 frames.
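The "skipped frames" idea above can be illustrated with a toy simulation: a fixed 60Hz panel shows whichever frame is newest at each refresh tick, so when the GPU runs faster than the panel, some rendered frames are never displayed. The 90FPS figure is just an illustrative number:

```python
# Toy model of frame skipping on a fixed-rate panel (no sync tech).

def displayed_frames(gpu_fps: int, panel_hz: int):
    """Indices of the GPU frames a fixed-rate panel shows in one second.
    Uses integer math: at refresh tick t the panel grabs the newest
    frame the GPU has completed by then."""
    shown = []
    for tick in range(panel_hz):
        latest = tick * gpu_fps // panel_hz  # newest complete GPU frame
        shown.append(latest)
    return shown

frames = displayed_frames(gpu_fps=90, panel_hz=60)
print(len(frames))       # 60 refreshes in one second
print(len(set(frames)))  # 60 distinct frames shown...
print(90 - len(set(frames)))  # ...so 30 of the 90 rendered frames are skipped
```

This is the mismatch variable refresh rate removes: with G-sync/FreeSync the panel refreshes once per completed frame, so nothing is skipped (within the panel's range).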
+Brendan Forish Hey Brendan, can you help me out... I have an Intel Core i7-6700K and a GTX 980 Ti, but I have a very old monitor... If I change my monitor to a 4K one, would I be able to play games at 4K ultra settings? And what monitor do you suggest?
The screen tearing in this video isn't terrible, but I guarantee it would seem much worse in person, and depending on the game, if V-sync is turned off you're going to get varying degrees of awful tearing. That's just how it is, sadly.
I want to ask: when using a G-sync monitor, can all games play at 60fps, or does it depend on your PC spec? I'm using a GTX 1070 8GB with an i3-4150 and 12GB RAM. Is that enough?
I've been gaming without G-sync/FreeSync all my life. The tearing can get annoying, but unless it's constant stuttering or tearing, forget about the sync. I'm happy with my VG248QE 144Hz and haven't noticed anything game-breaking so far. If the extra cost of the technology isn't going to break your budget then by all means go for it, but don't let it rack your brain.
I'm a noob, so pardon my ignorance. G-sync is something that just works with your monitor if your PC is compatible, right? Is there anything to set up?
V-sync causes stuttering though, and more importantly, it can also cause input lag. The dumb thing about this video, though, is that you can't actually show G-sync working unless you're looking directly at the G-sync monitor.
Terra Incognita Gaming I was just going from the few reviews and an explanation on how the technology works. Never heard anything like that from the reviews I saw. Sad to hear. :(
Yeah, but if I have a FreeSync monitor with a GeForce GPU, will I still get tearing? Won't the high refresh rate of a monitor (144Hz) eliminate all the tearing regardless of the sync technology used?
+jaberjbaar Well, if you quit gaming in less than a year then you're not hardcore enough to care about screen tearing. I myself can see screen tearing and stuttering badly, and I need a G-SYNC monitor.
I'm not sure if you've ever had smooth 60+ fps before consistently but i can guarantee you that once you start gaming 1080p 60+fps you will find it very difficult to go back to sub 30 frame rates.
It's bullshit, marketing crap: Nvidia makes its customers pay a huge amount of money for this stupid thing called G-sync. I have an Nvidia GTX 1080, and my monitor is a Samsung LC24RG50 curved 144Hz FreeSync, and I have absolutely no problems with tearing. Instead of £600 I paid £200, and problem solved.
Do you play with G-sync? I now have a monitor with the feature. In shooters I still have the feeling that it slows me down. It's smoother with G-sync, but it feels slower to me? Are there any shooter players here who have experience with it?
It's not, it's very hard to notice a difference from watching G-sync on a video. Try playing a game yourself and compare V-sync and G-sync. You'll never want to use a monitor without G-sync ever again.
Hi, I need help. I want to buy a GTX 1080 Ti, but I don't know what monitor to buy. I have 2 monitors to choose from: the Acer Gaming XR342CK (34'' ultrawide, 2K, 75Hz) or the Acer Predator XB271HUAbmiprz (27'', 2K, 165Hz, G-sync). Are the differences between G-sync ON/OFF that big? I will mainly play Dark Souls 3, League of Legends, Battlefield and GTA V.
Wow. I've only seen G-Sync a few times and I thought it doesn't bring you any advantage, but damn. The game is so smooth with G-Sync. I need a G-Sync monitor. That shows again that Nvidia has more and better technologies than AMD. NVIDIA > AMD
AMD has FreeSync in its upcoming new cards as well... And it was clear to me the G-Sync video was made with better hardware than the V-sync off/on one. You can even see graphical improvements in the G-Sync one xD I do own a GTX 780, but really this is just typical Nvidia bluff. Adaptive V-sync was supposed to be great as well, but it actually created more problems than it solved.
FanofScience It will be 2nd Maxwell generation. More cores, more performance. That means also more power draw and more heat. Goes pretty much how GTX 600 and 700 series. 600 series consumed less power, but had less performance than 700 series.
You're like a person who says that airplanes are useless because you don't fly. Simply a defective moron with no skill. Play on a 30Hz monitor if you like; you won't see the difference anyway.
Do I need a monitor with G-sync in it and also a graphics card that supports G-sync to have a butter-smooth experience, or do I only need the graphics card? My monitor is an Acer and runs at 60Hz.
I hope this G-Sync technology dies as it is right now and gives way to a much less expensive approach, in which you don't have to check whether you have an AMD or Nvidia card to run your monitor tear-free. AMD does a much better job with FreeSync, which can be seen in availability and price. Btw, I'm not an AMD fanboy, because I have a GTX 1070 in my system, but I do not want to pay so much more for a G-sync monitor.
+PlaystationCG I don't know exactly how it works, but it probably isn't even OS-dependent... At least I imagine there's some kind of chip in their compatible GPUs that outputs data to the chip inside the screen (I don't know if it even needs special support in the drivers)... Anyway, Windows is the number 1 gaming platform for PCs, so you are probably asking the wrong question :)
Sometimes, sure, you can barely even notice the screen tearing (without V-sync) or stuttering (with V-sync), but other times either one is so noticeable that it cannot be ignored, game-breaking even, depending on the game or play style. Grim Dawn, for example, is the game I play with the most screen tearing I have ever seen in any game, ever. It is so noticeable and so frequent that it is difficult to put up with, so I am like, "Screw this, I am turning on V-sync." But then the game stutters so badly that I actually lose about 2 seconds of gameplay every time there would normally be a screen tear. The stutter would hit, and by the time it cleared my character would have traveled several steps and cast several actions, and either my character would be dead or the enemies would be. Now, enter G-sync, and there is no screen tearing *AND* no stutter, just smooth gameplay with no hitches or annoyances, from start to finish. The Sims games have major issues with screen tearing as well; it is so bad that they don't even give you a V-sync option, because it would cause insane lag.
Yeah sure, great technology and all, but why the fuck is there only ONE monitor (an $800 one!) that has this technology built into it? ASUS is not very intelligent in this regard, since ROG has the only monitor out there, allowing ATI's tech to get closer to release. Oh, and ATI's FreeSync tech isn't proprietary. It works with any video card. So yeah, as much as I like ASUS (I even bought a GTX980 SC to get Gsync) their strategy here is pathetic. Make the technology affordable because very few people can afford an $800 monitor. They said the chip would only cost $100, but that seems to have been a lie. Besides, where are those chips?
Kevin Aous I was expecting a monitor that doesn't have the numerous issues this one does. A Google search reveals hundreds of duds and people who dropped 800 bucks getting lemons.
Good job. But yeah, I'm fortunate enough not to live in the UK, so shipping from UK vendors is expensive. Besides, BenQ and Acer suck for quality. I'll stick with Asus. Well, I will when they fix this monitor and make it worthy of the Asus name.
I could never really tell the small details like tearing, or the difference between 1080p and 4K... I'm either getting too old or I just don't care about the little details.
If you cannot see the difference between 1080p and 4K then you should go check your eyes. Even if you don't care about the (little) details, you could still see the difference.
The PPI of 1080p monitors is already really high, so 4K is only beneficial for things like TV screens, in my opinion. My eyes are fine, and I don't think it's justifiable to spend extra money on 4K when it comes to small computer monitors.
Kinda beating a dead horse now that Adaptive Sync and FreeSync are on the horizon. Your upcoming product that won't be on almost every monitor in existence really doesn't match up against FreeSync for everyone with a new-gen AMD card, and Adaptive Sync for any card that can support DP 1.2a.
Not all monitors will support FreeSync, actually. The 1.2a standard doesn't mean the industry has to follow it to the letter on all monitors, especially when FreeSync also requires extra hardware in the monitor, which AMD is telling the manufacturers to build themselves. What you will see when FreeSync comes out is that some monitors will support it and others will not, and those that do will also be at a premium; that is just how the market works. Nvidia to manufacturers: here is the module, all you do is put it in and build monitors that can support it; the rest is up to your engineers. AMD to manufacturers: you build the monitor that supports variable refresh rate, we'll provide the software to run it, and we'll also take credit for it by labeling it FreeSync. Result-wise, both are doing the same thing in the end. However, I can't help but wonder what exactly the point of FreeSync is if manufacturers are the ones doing most of the heavy lifting.
You do know Adaptive Sync was Nvidia tech too... Stupid AMD fanboy. But I suspect all these adaptive syncs, G-sync and FreeSync, are just gimmicks that solve some problems but actually create new issues while doing so... Not worth it imo.
SmartNexus ... seriously??? Do you even try to stay informed on the subjects you post on? I'm sure VESA will be happy to know that they are somehow owned by Nvidia now.
Neosaigo That's what the "almost" is there for. Every monitor that is even somewhat aimed at gamers/media will likely have Adaptive Sync in the near future (heck, one manufacturer is even making it available for free through a firmware update on a monitor already in circulation). As for FreeSync, it works on any monitor that supports Adaptive Sync, as far as I'm aware; basically, the graphics card is the thing with the extra requirement (which is why only the new 260X, 290, 290X and 285 support it in games). FreeSync is just an Adaptive Sync add-on. G-sync is a full alternative that Nvidia is using to milk out more cash for no particular benefit to the consumer. It would have been easy for them to just support Adaptive Sync, but they want to charge more and reap all the benefits, so they didn't.
***** Yeah, I never noticed it in other games because V-sync is almost always on by default. I just turned it off in Overwatch, and it surely does hurt my eyes. It's not so bad in this video though.
I've got like 140 games on my PC and haven't noticed tearing on any of them ever, I get artifacting on GTA 5 and my friend does too but that's just the game, that's a common problem
***** Well, G-sync is obviously a waste. Never in my life have I seen screen tearing; people with G-sync monitors are either being little babies about it or they have absolute god-like eyesight. My eyesight is perfect and I have never noticed it in any game on any PC in my life.
+Mark ‘nattaking’ Kamp Their example wasn't great. From my experience GTA V really suffers from this: it is absolutely terrible with V-sync off, to the point where you can't really see anything when turning the camera, and with V-sync on it will immediately cap the framerate at 30 as soon as it hits 59 fps, causing some awful stuttering (very noticeable when driving fast).
Bought 144hz monitor, no G-Sync. Works amazing. I'm playing at around 100fps and don't see image tearing. I used to see it on my previous monitor which was 60hz, but it seems it's not an issue anymore on the 144hz. I'm pretty happy I saved money and didn't get G-Sync.
If they want to. It's not necessary, but we have the option to do it. Console peasants are stuck with their shit hardware and configuration even if they want something better.
smokebomb.exe I have a 28" 4K screen from Samsung which cost me 400 dollars. Well worth it for me. It doesn't have G-Sync, but I don't need it anyway; you can get it if you want on PC. On consoles you have screen tearing and you cannot fix it even if you could afford to ;) You're obviously either a deluded moron who doesn't know the prices of technology or just a poor kid trying to find some reason why his low-end handicapped PC (console) was the better buy XDDD
Friend me and I'll show you my rig. And PS4, XBone (which I regret buying), WiiU, and PS3. And 64 inch Aquos 4k tv screen they all run on. facebook.com/rocketpunchgo
It looks like display manufacturers are going to be focusing on either Freesync or G-Sync in 2015. There will be products available from several different manufacturers for each technology. The G-Sync displays will be more expensive because of the hardware costs but since companies are avoiding internal competition there may not be a direct comparison available. Companies will probably try to pad in some extra profit from branding so even if the hardware is only $100 difference in cost the actual price for similar features could be much higher. G-Sync got an early lead by being first to market but the higher cost does not appear to be a good value compared to Freesync. Brand loyalty for GPUs and displays will probably have a big impact on sales but we have to wait and see if that loyalty will be enough to keep G-Sync in the lead.
I got a G-sync monitor and can't tell any difference. I turned it on through the Nvidia settings. Maybe there is something else I have to do? I went from a 144Hz to a 120Hz IPS ultrawide, and I get a solid 120fps. It looks no different than my 144 tear-wise :(
FreeSync, which is an AMD technology, uses the Adaptive Sync standard built into the DisplayPort 1.2a specification. Because it's part of the DisplayPort standard decided upon by the VESA consortium, any monitor with a DisplayPort 1.2a input is potentially compatible. That's not to say it's a free upgrade; specific scaler hardware is required for FreeSync to work, but the fact that multiple third-party scaler manufacturers have signed up to make FreeSync-compatible hardware (Realtek, Novatek and MStar) should mean that pricing is competitive. According to AMD, the following pre-existing GPUs will be able to use FreeSync for dynamic refresh rates in games (after a software update): Radeon R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260. Other cards/chipsets will support FreeSync, but only for "video playback and power-saving purposes"; these include the Radeon HD 7000, HD 8000, R7 and R9 series cards and APUs from the Kaveri, Kabini, Temash, Beema and Mullins lines.
When I play a game, I want it to always update in perfect sync. For example, I want to see an image on my screen exactly every 16ms. No more, no less. The video made it look like G-sync variably changes the refresh rate of the monitor. I thought that was odd. Is G-sync for me, with a 144Hz monitor?
That is what it does. It doesn't matter what graphics card you are running: in AAA games you won't always be over 144fps, and when it drops to, say, 120 the GPU and monitor will run in sync. That being said, I have a G-sync monitor and a 2080 Ti and can't tell any difference with it on or off. My cousin swears by it though. Maybe you need 165Hz like he has to tell? Idk. I went from 144 to a 120Hz IPS ultrawide and I get a solid 120fps in a lot of games. As far as tearing goes, I see no difference from 144Hz to 120Hz, G-sync on or off lol
If you truly want to make a great comparison, you should at least turn the in game camera at the same slow speed that's being shown on the G Sync Side.
It will help; it basically sets 144Hz as the highest refresh rate it will render at. It's weird and hard to explain, but I have a ROG Swift and there is a difference between 60 and 120, and even between 120 and 144Hz.
So you're blind and a noob. V-sync stutters and gives horrible input lag; G-sync won't stutter and won't lag your mouse. If you move your mouse like an elephant then it doesn't matter anyway, and if you're blind... you get what I'm saying.
I don't think so. From what I have seen, 144 to 240 isn't noticeable. But that being said, I can't see a difference from 120 to 144 lol. 60 to 120, though, is a world of difference. I used to be all about that fps but finally said fuck it and got a 120Hz because I wanted an ultrawide. G-sync makes no difference on my new monitor, nor on my old 144. My cousin runs a 165 and swears by G-sync, so in his experience 165Hz plus G-sync is worth it. I need to go visit him to confirm this. He told me he can move his mouse around the screen as fast as he can with no trailing. Sounds impressive, but I have to see it before thinking about going back to a flat screen.
I bought a G-sync monitor and still have tears. Tears of joy from smooth games ;_:
12michak12 lol
+xZ4DE v (ChrisPredatorHD) Did you read his comment carefully?
+12michak12 LOL I'm sure you will have more tears when the chip goes KABOOM.
I've really had BAD, BAD experiences with new technology; I'm sure I would choose screen quality over having G-sync, for the price.
With the luck I have, I'm almost sure the G-sync chip would burn out just after the warranty finishes and I'd have to go buy another screen. I mean, it's one more chip that can break, and the fact that it keeps changing its frequency probably means more wear, no idea.
I also bought the Nvidia Shield tablet a year ago and I'm on the 4th tablet already, and this one is also defective on the wireless...
+guily6669 maybe you should take care of your stuff more? Btw if you are already at your 4th tablet, why do you still keep getting them?
+HardstyleBeats_Mixr I always take good care of my stuff, no scratches, no hits, nothing, and they all die...
Anyway, about the Nvidia Shield tablet: it has always been replaced with a new one under warranty, but I would never buy it again.
Use 0.25 speed on the YouTube player to see the difference.
In other words, at normal speeds it is not that noticeable.
That's because YouTube plays this video at 30 fps.
It's 60 fps on YouTube...
I'm getting too old to notice the difference, but for those who have the money and really need a clear picture, I guess it's good technology, but expensive.
It's a piece of shiiit technology. You can quote me on that.
g-sync + 144hz is the only way to go.
That statement is only going to be true for a couple more weeks. In early 2017 a new ASUS G-sync native 240Hz monitor is coming.
Lol, nobody can achieve 240fps unless the game is like 7 years old or older, even on a Titan XP.
MrZodiac011 Are you out of your mind? With my 970 I am getting 300ish fps in Overwatch with competition settings. I'm upgrading to a Titan XP if they don't announce a 1080 Ti at CES in January 2017. Just so you know, the games that people play as esports are the real reason why you would get a 240Hz monitor. My 970 can easily run every top esports game at 300+, so I have no idea what you are talking about.
MrZodiac011 means that nobody can reach 240fps at 1440p or 4K with modern games. I highly doubt you're reaching 300fps with a 970 unless you're playing at 720p.
I am playing at 1080p with low graphics in Overwatch, CS:GO and LoL, and every one of those games gets ~300. If you looked at the actual monitor I was referring to (Asus PG258Q), you would see that it does not go higher than 1080p at 240Hz with a 1ms response time. I am talking about people who play video games competitively. Every monitor is made for a different reason. There is not one monitor that works best in every situation.
i'd rather have screen tearing than eye and wallet tearing
But you would rather have empty wallet than seeing stuttering in games :P
No
Yeah, I'll stick with wallet tearing as it's rather small on my end.
Asus tuf gaming vq259 280hz g-sync $320±
Ya, you are right dude, I can feel the difference.
I held back on getting a G-sync monitor for ages, V-Sync always just felt like it was enough... But G-sync really does feel smoother on top of eliminating tearing.
gsync and vsync need to work together? or only gsync? sorry for my English
@@aleks.yt1 They work solo. It's like this: V-Sync tries to match the computer's FPS to whatever the display is set to (e.g. 60 Hz = 60 FPS), and G-Sync on the other hand tries to match the monitor's Hz to the FPS the computer is outputting (e.g. 67 FPS = 67 Hz, dynamically and constantly changing the Hz to match the FPS). So if you have a G-Sync/FreeSync capable monitor I don't see a reason to ever use V-Sync.
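That split can be sketched as a toy model in Python (hypothetical numbers and function names, purely illustrative, not any real driver API):

```python
# Toy model of the difference described above (illustrative only):
# V-Sync forces the GPU's output down to the display's fixed rate,
# while G-Sync retimes the display to follow the GPU's output.

def vsync(display_hz, gpu_fps):
    # Display rate is fixed; frames shown can't exceed it
    return display_hz, min(display_hz, gpu_fps)

def gsync(panel_max_hz, gpu_fps):
    # Display rate follows the GPU, capped at the panel's maximum
    shown = min(panel_max_hz, gpu_fps)
    return shown, shown

print(vsync(60, 67))    # (60, 60): 60 Hz display, only 60 fps shown
print(gsync(144, 67))   # (67, 67): panel drops to 67 Hz to match 67 fps
```

Same inputs, opposite direction of adjustment, which is the whole point of the comment above.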
I have a G-Sync monitor but I'm using an AMD Radeon GPU now, and I don't know what to do to get smooth gameplay again.
G or F sync they should be free.....
+SHeeDeED Because both Nvidia and AMD market their own versions (G-Sync and FreeSync respectively), they simply cannot be free due to intellectual property (IP); using the technology means hefty licensing fees for the respective companies. Granted, G-Sync monitors here (Australia) are far more expensive than their FreeSync counterparts, but there would be no benefit to having the technology free, as the choice is purely based on your graphics card. Using an AMD card with a G-Sync monitor is not only moronic (it's basically an overpriced monitor that doesn't sync the refresh rate correctly), but also a major waste of money.
As I own a GTX 770 and an Asus PG278Q ($1249 RRP AUS), I'll be forced to stick with Nvidia (which I plan to do anyway) for years to come. Though if I rebuild with a GTX 980 Ti and also use my 50" 4K TV, gaming will be a dream on both: the TV for larger resolution and slower-paced games, and the PG278Q for fast-paced and graphically exquisite ones.
+Stephen Hurley All I'm saying is it should be a standard feature; whatever monitor I have, it should have this fixed (tearing, refresh rate and all).
+SHeeDeED I am so happy you're not instantly acting like a twit because I went against your opinion. I like you.
Although desirable, it's not a cheap feature to implement, and you'd see the average price of monitors increase. A lot of consumers would complain, as they wouldn't understand/utilise the benefit of 144Hz sync with 1ms response, and would see it as a gimmick. I work in retail sales and see these sorts of things frequently; people complain about everything all the time. It's painful, but I enjoy educating consumers to make an informed decision before making a purchase.
+SHeeDeED With all this feature battle between AMD and Nvidia we're going to end up with a segregated platform just like consoles. If both have the same features (like G-Sync and FreeSync) they should make one a standard and be done with it, and Nvidia should stop charging a licence fee for its G-Sync tech.
+bogdan m Actually, yeah! If I'm buying a GPU from either manufacturer, that should be the end of it! No worrying about stuttering and tearing, and no premium of around $200 extra just so my monitor and GPU work together! Wtf is that? It should be a standard, with ALL monitors working with ALL GPUs, since neither Nvidia nor AMD produces monitors!
Purchased my Asus PG278Q recently, and it's an absolutely gorgeous display. Not only with the 2560x1440 resolution, but the 144Hz refresh rate and 1ms response time makes for a fantastic gaming experience. Playing FPS games such as Battlefield 4/Hardline (and other FPS and high graphic games), if I turn very quickly you can see the screen lag trying to keep up. Problem solved with the new monitor and my old GTX 770 4GB. 27" Samsung 1920x1080 @ 60Hz is a great secondary monitor, but finally glad to have something that is actually responsive for a main display.
I just got mine: 144Hz, 1ms, G-Sync, 27", 1440p... omfg, it is all true, so gloriously smooth. It is worth it, guys.
Dont be sarcastic
You have G-Sync and you say you don't know if it is true, wpwpwp
@@large_sailor hes telling the truth
What monitor u got
@@large_sailor I believe it! I can't stand tearing or stuttering in games, I'm just hyper-aware of it, idk, it gives me headaches too. So if I didn't have any tearing or input lag it'd be soooo worth it.
I bought a G-Sync monitor. Now I’m homeless.
Oh that was good
Mark D'Alelio yeah but you would have gotten cancer if you brought a freesync home!😭😂
lol
@@da-mz4sl Hello, my Turkish brother!
@@da-mz4sl Not true.
There's a much bigger difference when you play. Just like how 30 fps looks pretty smooth in videos.
Does playing at lower fps feel better with g-sync mate?
I swear this was announced ages ago, and it's only coming out now! But I don't really see much difference; if the price is the same as a normal monitor I'll get one, but I wouldn't go out of my way for one.
^ this
Jari Pekkala just watched it on my mobile I can see now, but still hope the price is ok.
You shouldn't ever watch graphics comparisons or videos like this on YouTube: the only video streaming site that does NOT support true 1080p (the 4K option comes CLOSE) or 60fps... at all...
I've bought a G-Sync monitor. After a day the G-Sync stopped working because of a setting in the monitor, and I really noticed the difference already. Had to reset my monitor to fix it. But god! I can't play games without G-Sync anymore! Once you go G-Sync, forget about the V-stink!
all I know is I finally was able to get an actual gaming pc and I opted for a 144hz Gsync monitor and I couldn't be happier.
I just got my g sync diy kit today. If you are a serious gamer...then it is amazing. Believe the hype.
I don't doubt it makes a difference... but since everyone is getting it anyway I wouldn't go out of my way to get a G-Sync monitor XD.
Look up Adaptive-Sync DisplayPort (VESA) and FreeSync (AMD)... no real advantage in caring what monitor you have as long as it has DisplayPort, which should be a requirement anyway.
you are really that dunb aren't u?
ATLEAGLE579 dunb!
I just spent the most valuable 1:36 minutes of my life! Thank you very much :D
hi 3 YEARS LATER!
Just upgraded to 240hz 1440p G-Sync after 1080p 60hz then 1440p 60hz. What a world of difference.
Holy cow batman, that looks amazing.
1 minute explained better than 1 day
thank you
This video seems pointless to me. The examples basically only show us the difference between V-Sync on/off, not G-Sync. AFAIK you just can't see the benefit of a G-Sync monitor on your normal screen. It's like trying to show us how great 4K resolution is on our 1080p screens; it just doesn't work that way, and you have to see it in person.
If you can play a 4K video on your 1080p monitor, play it, because the quality gets better, trust me.
The monitor downsamples the 4K video/image to your resolution while keeping detail (quality). So you are wrong, friend.
Well, a perfect example, Grim Dawn, has A LOT of noticeable screen tearing. With V-Sync turned on, instead of screen tears there are very noticeable stutters, which actually do affect gameplay. Enter the new monitor with G-Sync I have, and now both the screen tearing *AND* the stutters are gone. And I have noticed the difference in other games whose tearing and stutters I previously ignored, mainly because I never had a choice but to pick between stutters and screen tearing; now I see them, and since I have the option to get rid of them, that is exactly what I do.
Which is basically where you are right now. You have either screen tearing or stutters. Those are your choices, at least until you get a monitor that can handle the variation. So you either ignore the issue like it is not there and settle for it, or you do something about it. So which is it? Willful ignorance, or fork out the dough?
I like how this legend tested this in arkham origins. Right now my pc is broken from a ram failure (ordering more soon), I have a gsync monitor and an RTX 3060. Can't wait to see how it goes
Cool technology, but it seems somewhat gimmicky to me. The only game that I've played recently that had screen tearing was Far Cry 4, and that was fixed once I turned V-Sync on in the game options. Didn't notice any stutter, either, so yeah... can't really justify the additional cost of 100s of dollars for a monitor that has G-Sync.
PC Nation No, I understand it. It's a variable refresh rate technology, so you won't notice tearing or stuttering if your framerate fluctuates. I get that. I just don't get the cost of it, when you can easily accomplish the same thing by turning V-Sync on, and dropping your settings a bit, to make sure that you stay at >60FPS.
Input lag is determined by GTG response times, not refresh rate.
LogicOwl Let's Plays dude, not only does V-Sync kill fps, but it can still tear while doing so. G-Sync is the future. Just wait till it gets cheaper.
Incessant Wake Not on my rig. I have yet to encounter any screen tearing with V-Sync on. I don't doubt that G-Sync and FreeSync are going to be included with pretty much everything in the future, but as of right now, I fail to see a reason to invest.
LogicOwlGaming V-Sync needs at least 1-2 frames of buffer, in some cases even 3!
A single frame at 60fps takes 16.66ms to render; that is the input lag without V-Sync.
With V-Sync, you need to add 2 frames for the buffer, so you have about 50ms of input lag! That is an insane amount, and there's no way around the problem when using V-Sync. Even when you render 120fps and have a 120Hz screen, it will still take 25ms for a frame to display with V-Sync.
G-Sync doesn't need this buffer, because the only thing it does is wait for the frame to be complete and tell the screen to display it. So you only have the input lag you would have without V-Sync enabled, but with the benefit of a completely stutter- and tearing-free image.
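The numbers in that comment work out like this (a worked sketch of the arithmetic only, not a measurement of any real pipeline):

```python
# Worked version of the latency arithmetic above.
def frame_time_ms(fps):
    # Time to render (and scan out) a single frame at a given frame rate
    return 1000.0 / fps

def vsync_lag_ms(fps, buffered_frames=2):
    # One frame to render, plus the frames V-Sync queues before display
    return frame_time_ms(fps) * (1 + buffered_frames)

print(round(frame_time_ms(60), 2))   # 16.67 ms per frame at 60 fps
print(round(vsync_lag_ms(60)))       # ~50 ms with a 2-frame buffer
print(round(vsync_lag_ms(120)))      # ~25 ms even at 120 fps on 120 Hz
```

The 50ms and 25ms figures from the comment fall straight out of the same per-frame time multiplied by the buffer depth.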
Nvidia will release a driver in a week or so so that some FreeSync monitors will support G-Sync.
The other FreeSync monitors will also have a manual option to enable G-Sync.
People don't seem to notice that this is for the more technically advanced, and/or people who have acclimated themselves to a higher standard of gaming.
I, myself, see one of these in my room sooner or later, because V-sync just makes me cry tears of anguish when I have to turn it on due to screen tearing.
It's so outdated and doesn't fit into any gaming scene other than casual because of the input lag it causes.
*Yep, calling all y'all casuals out.*
Surms41 I hate V-Sync so much. Screen tearing, FPS killing, and input lag just ruin the experience... I remember playing Shadow of Mordor at 59 fps on my GTX 970, but now when I play it with G-Sync I get a constant 80 to sometimes 90 fps.
Sure SLI all your 980 Ti too, I can see the morons Nvidia targets.
You need to get laid, son.
Thanks, I thought Gsync was just vsync but ad-hyped. This makes it clear.
AFAIK that helps only if the graphics card can't render 60FPS (given a 60Hz monitor), so it will give you the same smoothness at, for example, 45FPS without page tearing and without waiting a whole refresh for the next frame as V-Sync does; you get the best of both worlds. But if the graphics card can render the game at that resolution at 60FPS+, then it's the same as normal V-Sync.
I just got the ViewSonic Elite monitor. G-Sync monitors cost more, but I can confirm that you are definitely getting something for that extra cost. I am very glad I paid the extra for the more expensive G-Sync monitor.
I'd rather skip this whole technology and use the extra cash to upgrade my GPU...
You will upgrade your GPU every 2 to 5 years, but if you have a top of the line monitor it will last longer while you also enjoy good image quality/resolution/size for movies and other stuff without the need of any GPU/CPU Upgrades.
Ivoruz Cz you are a special kind of trash idiot on this globe.
Omar kind of has a point. Just because of G-Sync, a monitor equipped with it costs at least 150 USD more. Why? Just because of the Nvidia licence. Funny that FreeSync doesn't add a single penny to the hardware price.
salty ngreedia fanbois detected in reply comment section
Should I use G-Sync and V-Sync and limit my monitors FPS to 140 since I have a 144Hz monitor ?
what?
If you turn on V-Sync it will limit your framerate to 144, which will prevent screen tearing. G-Sync is there so that if it drops below 144 fps it won't be quite as noticeable, and with G-Sync you're supposed to turn OFF V-Sync so that the framerate can rise above 144.
I barely notice screen tears, G-Sync wouldn't be worth it for me.
+richyyy You are forgetting the dynamic refresh rate. Remember, your GPU generates frames as fast as it can, meaning it will sometimes generate multiple frames before the screen even updates. The display will say 60 fps, but you wouldn't be seeing all 60 of those frames, because some were rendered before the screen updated. Haven't you ever wondered why 30 fps looks a lot better in movies than in games? Want to know why? Because movies actually show you all 30 frames, while real-time rendering will often "skip" many of the 30 frames.
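The skipping described above can be shown with a toy count (hypothetical figures, one frame shown per refresh; real drivers are more subtle):

```python
# Toy illustration of the point above: when the GPU renders faster than a
# fixed-rate display refreshes, the extra frames are simply never shown.
def frames_per_second(gpu_fps, display_hz):
    rendered = gpu_fps
    shown = min(gpu_fps, display_hz)   # at most one frame per refresh
    skipped = rendered - shown
    return rendered, shown, skipped

print(frames_per_second(300, 60))   # (300, 60, 240): 240 frames never shown
```

A variable-refresh display collapses the gap by moving the refresh to the frame instead of discarding frames.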
+Brendan Forish Hey Brendan, can you help me out... I have an Intel Core i7-6700K and a GTX 980 Ti, but I have a very old monitor... If I change my monitor to a 4K one, would I be able to play games at 4K ultra settings? And what monitor do you suggest?
If no one told me, I would have never known about screen tearing. And I am a big gamer.
***** I know, my glasses have thick lenses and everything is blurry :(((
The screen tearing in this video isn't terrible, but I guarantee it would seem much worse in person, and depending on the game, if V-Sync is turned off you're going to get varying degrees of awful tearing. That's just how it is, sadly.
There was tearing in the G-sync ON, and also why was the cape not blowing, but instead stationary, in the G-sync ON footage?
Extremely good technology; I've used both, and G-Sync is far superior!
I can't really tell a difference in the video but I definitely could during games
Wonder why they didn't call it n sync, Too boy-band sounding?
G-Sync for GeForce Synchronization I guess
Touché : )
Jin Kazama And because N sync is just gay as fuck
+SwiftFX Watch your mouth, kid·
I want to ask: when using a G-Sync monitor, can all games play at 60fps, or does it depend on your PC specs? I'm using a GTX 1070 8GB with an i3-4150 and 12GB RAM. Is that enough?
I've been gaming without G-Sync/FreeSync all my life. The tearing can get annoying, but unless it's constant stuttering or tearing, forget about the sync. I'm happy with my VG248QE 144Hz and haven't noticed anything game-breaking so far. If the extra cost of the technology isn't going to break your budget then by all means go for it, but don't let it rack your brain.
Screen tearing only appears when the fps is unstable, or drops below the refresh rate, or on a lower refresh rate monitor.
I'm a noob, so pardon my ignorance. G-Sync is something that just works with your monitor if your PC is compatible, right? Is there anything to set up?
After 5 years of waiting for monitors to get cheap, I went with a budget curved HKC monitor. $100 price.
I like the video, but I just wish the examples were more exaggerated where I could learn exactly what each of these things are.
So basically, little to no difference. 144hz + Vsync is fine.
Vsync causes stuttering though, and more importantly, it can also cause input lag.
The dumb thing about this video, though, is that you can't actually show G-Sync working unless you're looking directly at the monitor with G-Sync.
OatmealTheCrazy V-Sync only causes stuttering if you're below the refresh rate in terms of framerate.
Terra Incognita Gaming Still, G-Sync deals with that completely, without that risk at all. Also, the input lag was the bigger concern.
OatmealTheCrazy If you google it, G-Sync has a lot of issues with stuttering.
Terra Incognita Gaming I was just going from the few reviews and an explanation of how the technology works. Never heard anything like that in the reviews I saw. Sad to hear. :(
How does G-Sync compare with the evergreen N-Sync?
I see screen tearing on both sides because my GPU is so shitty LOL:-)
I have a question: I have a good PC which can run games at 100-200 fps, but I have a 60Hz monitor. Should I turn on V-Sync?
Yes
Wow.. I bought G-sync and I'm still noob...
Some people buy Razer gaming mice, keyboards, and 144Hz 4K monitors just to be lame bitches in CS:GO :D
Zoltan Chivay True, and this is sad af XD
@@Marcin_z_bloku_obok Well, with my 144Hz monitor I went from Nova to Global in 4 months in CS:GO :)
@@ewengharib9485 And you think you wouldn't have done that without that monitor?
@@Marcin_z_bloku_obok Yeah, I think the monitor helped me.
Yeah, but if I have a FreeSync monitor with a GeForce GPU, will I still get tearing?
Won't the high refresh rate of a monitor (144Hz) eliminate all the tearing regardless of the sync technology used?
Like I care for this.
If my monitor has it, that's good
If not that's OK
I'm not willing to pay more for this.
+Archduke Crunch it's been a year or so.
+Archduke Crunch and?! I'm not into gaming now.
+jaberjbaar Well, if you quit gaming in less than a year then you're not hardcore enough to care about screen tearing. I myself can see screen tearing and stuttering badly, and I need a G-Sync monitor.
Well I can see them, too.
(You can ignore them)
I'm not sure if you've ever had a consistently smooth 60+ fps before, but I can guarantee that once you start gaming at 1080p 60+fps you will find it very difficult to go back to sub-30 frame rates.
First video that let me understand G-Sync! Thanks for that!
It's bullshit, it's marketing crap that Nvidia makes its customers pay that amount of money for this stupid thing called G-Sync.
I have an Nvidia GTX 1080 and my monitor is a Samsung LC24RG50, curved, 144Hz, FreeSync, and I have absolutely no problems with tearing.
Instead of £600 I paid £200, and problem solved.
Do you play with G-Sync?
I now have a monitor with the feature. In shooters I still have the feeling that it slows me down. It is smoother with G-Sync, but it feels slower to me?
Are there any shooter players here who have experience with it?
If you guys look carefully you can notice stuttering with G-Sync too..
He should have tried it with an SSD instead of an HDD..
If I watch this demo on a non-G-Sync monitor, should I be able to appreciate the smoother animation?
I think not.
im loving the future
Yes I am from the future
@@AlexandreNarbert I am from the further future
V-Sync on removes the screen tearing, but you get input lag. G-Sync is supposed to give you essentially V-Sync but without the input lag that comes with it.
cool but not worth the money
so trueeee, minimum $300 in my country for a G-Sync monitor
@@fluffytuber7040 ja-noobs, in Argentina the cheapest g-sync monitor is $700
Could do without the guitar backing track, but otherwise a pretty neat demo for those of us still stuck in the stone age without G-Sync. 😂
And people are still buying G-Sync... you can barely notice the difference... believe me, G-Sync is the most overrated tech crap in PC gaming to date.
Yes, but V-Sync limits fps and increases lag.
Thats why you use adaptive sync.
It's not; it's just very hard to notice a difference from watching G-Sync in a video. Try playing a game yourself and compare V-Sync and G-Sync. You'll never want to use a monitor without G-Sync again.
are you sure about this? I tried 3d marks heaven benchmark on a gsync monitor at frys(local electronic store) and i didn't notice a damn difference...
fthis1234567
did you actually try playing a game?
Hi, I need help. I want to buy a GTX 1080 Ti, but don't know what monitor to buy. I have 2 monitors to choose from: the Acer Gaming XR342CK (34" ultrawide, 2K, 75Hz) or the Acer Predator XB271HUAbmiprz (27", 2K, 165Hz, G-Sync).
Is the difference between G-Sync ON/OFF that big? I will mainly play Dark Souls 3, League of Legends, Battlefield, and GTA V.
Well i saw no difference
salt
Invocated Agitator lol😂
Thanks for the vid. Just looking for a basic example and it was there. Btw what's the background music song/artist name?
Wow. I've only seen G-Sync a few times and I thought it doesn't bring you any advantages, but damn. The game is so smooth with G-Sync. I need a G-Sync monitor. That shows again that Nvidia has more and better technologies than AMD. NVIDIA > AMD
AMD has FreeSync in its upcoming new cards as well... And it was clear to me the G-Sync footage was made with better hardware than the V-Sync off/on footage. You can even see graphical improvements in the G-Sync one xD I own a GTX 780, but really, this is just typical Nvidia bluff. Adaptive V-Sync was supposed to be great as well, but it actually created more problems than it solved.
wait, so you need a gsync monitor, and a gsync card?
StrawberryRainPop Exactly. GTX 780 Ti or GTX 900 series cards + monitor that supports G-sync.
SmartNexus What do you think the next NVIDIA GPU generation will be like? I mean power efficiency, heat, gaming performance, and stuff like that.
FanofScience It will be the 2nd Maxwell generation. More cores, more performance. That also means more power draw and more heat. It goes pretty much like the GTX 600 and 700 series: the 600 series consumed less power, but had less performance than the 700 series.
My FreeSync monitor supports G-Sync. I get the best outta both worlds: FreeSync price and G-Sync performance.
anyone else not notice the slightest fucking bit of stuttering in that video?
Ye, I can't see any difference while playing; it is not a big deal in my opinion.
Well you're a noob. you don't need much.
You are salty as fuck because you bought this useless extra xD
@Eric Voss Keep looking for excuses for your noobiness and blindness. I have an old & cheap 75Hz ASUS monitor so...
You're like a person who says that airplanes are useless because you don't fly. Simply a defective moron with no skill. Play on a 30Hz monitor if you like; you won't see the difference anyway.
What's up with the people saying G-Sync monitors are expensive? You can get a 1080p 144Hz G-Sync monitor for $180.
Nice technology, but the price...
I'd better wait some years till it's affordable.
Yes, it is quite affordable now, especially 144Hz G-Sync Compatible IPS monitors :))
@@AlexandreNarbert Well, of course it is now, it's been 6 years haha :D But thanks for the answer!
@@asadd2 ayee have a good day bro 👍👍 still happy to reply after 6 years
Do I need both a monitor with G-Sync in it and a graphics card with G-Sync to have a butter-smooth experience, or do I only need the graphics card? My monitor is an Acer and has 60Hz.
I hope this G-Sync technology dies as it is right now and moves to a much less expensive approach, where you don't have to check whether you have an AMD or Nvidia card to run your monitor tear-free. AMD does a much better job with FreeSync, which can be seen in availability and price. Btw, I'm not an AMD fanboy, because I have a GTX 1070 in my system, but I do not want to pay so much more for a G-Sync monitor.
WhiteSnowGentelman Then....don't
I would rather buy an Amd card before buying this.
G-Sync has a wider range of variable refresh rates compared to FreeSync
Prich038, this is true, but it does not justify such a ridiculous price.
if you have a 1070 then buying an AMD card is pointless as they are all garbage compared to the 1070
Hello, will Nvidia G-Sync work with Windows 10? If not now, are they planning to make it work?
It would be stupid if it didn't work.
+PlaystationCG I don't know exactly how it works, but it probably isn't even OS dependent... At least I imagine there's some kind of chip in their compatible GPUs that outputs data to the chip inside the screen (I don't know if it even needs special software in the drivers)...
Anyway, Windows is the number 1 PC gaming platform, so you are probably asking the wrong question :)
+PlaystationCG I don't see how there would be an issue with the OS; G-Sync works between the graphics card and the monitor.
Barely any difference
When you're blind*
salt
Sometimes, sure, you can barely even notice the screen tearing (without V-Sync) or stuttering (with V-Sync), but other times it is so noticeable either way that it cannot be ignored, game-breaking even, depending on the game or play style.
Grim Dawn, for example, is the game I play with the most screen tearing I have ever seen in any game, ever. It is so noticeable and so frequent that it is difficult to put up with, so I am like "Screw this, I am turning on V-Sync." But then the game stutters so badly that I actually lose about 2 seconds of gameplay every time there would normally be a screen tear. The stutter hits, and by the time it clears my character will have traveled several steps and cast several actions, and either my character is dead or the enemies are.
Now, enter G-Sync, and there is no screen tearing *AND* no stutter, just smooth gameplay with no hitches or annoyances, from start to finish.
The Sims games have major issues with screen tearing as well; it is so bad that they don't even give you a V-Sync option, because it would cause insane lag.
better with Vsync on
Yeah sure, great technology and all, but why the fuck is there only ONE monitor (an $800 one!) that has this technology built in? ASUS is not very intelligent in this regard; with ROG having the only monitor out there, it's letting ATI's tech get closer to release. Oh, and ATI's FreeSync tech isn't proprietary; it works with any video card. So yeah, as much as I like ASUS (I even bought a GTX 980 SC to get G-Sync), their strategy here is pathetic. Make the technology affordable, because very few people can afford an $800 monitor. They said the chip would only cost $100, but that seems to have been a lie. Besides, where are those chips?
It's a new technology; this is the first monitor to have it. What do you expect?
Kevin Aous I was expecting a monitor that doesn't have the numerous issues this one does. A Google search reveals hundreds of duds and people who dropped 800 bucks and got lemons.
Good job. But yeah, I'm fortunate enough not to live in the UK, so shipping from UK vendors is expensive. Besides, BenQ and Acer suck balls for quality. I'll stick with Asus. Well, I will when they fix this monitor and make it worthy of the Asus name.
TDR REVENGE I really want an Asus model though. Always had good luck with their stuff. If I change my mind on that maybe I'll contact Logan. Thanks.
ATI does not exist
Will G-Sync lock the fps like V-Sync? And is it possible if my monitor only has a 75Hz refresh rate?
I could never really tell the small details like tearing, or the difference between 1080p and 4K... I'm either getting too old or I just don't care about the little details.
If you cannot see the difference between 1080p and 4K then you should go get your eyes checked. Even if you don't care about the (little) details, you could still see the difference.
The PPI of 1080p monitors is already really high, so 4K is only beneficial for things like TV screens, in my opinion. My eyes are fine, and I don't think it's justifiable to spend extra money on 4K when it comes to small computer monitors.
You're either blind or a noob; anyway, if you really can't see the difference then I guess don't buy it.
Brandon Jones With 4K you can see little details that with 1080p you can't.
This would be better demonstrated in 60 fps somewhere, but yeah this tech is amazing.
Kinda beating a dead horse now that Adaptive-Sync and FreeSync are on the horizon. Your upcoming product that won't be on almost every monitor in existence really doesn't match up against FreeSync for everyone with a new-gen AMD card, and Adaptive-Sync for any card that can support DP 1.2a.
Not all monitors will support FreeSync, actually. The 1.2a standard doesn't mean the industry has to follow it to the letter on all monitors, especially when FreeSync also requires extra hardware in the monitor, which AMD is telling the manufacturers to handle themselves.
What you will see when FreeSync comes out is that some monitors will support it and others will not, and those that do will also be at a premium; that is just how the market works.
Nvidia to manufacturers: Here is the module, all you do is put it in and build monitors that can support it, the rest is up to your engineers on how you want to do it.
AMD to manufacturers: You build the monitor that support variable refresh rate, and we'll provide the software to run it, and we'll also take credit for it by labeling it Freesync.
Result wise, both are doing the same thing in the end.
However, I can't help but wonder what exactly the point of FreeSync is if manufacturers are the ones doing most of the heavy lifting.
You do know Adaptive-Sync was Nvidia tech too... Stupid AMD fanboy. But I suspect all these adaptive syncs, G-Sync and FreeSync, are just gimmicks that solve some problems but actually create new issues in the process... Not worth it imo.
SmartNexus ... Seriously??? Do you even try to stay informed on the subjects you post about? I'm sure VESA will be happy to know that they are somehow owned by Nvidia now.
Neosaigo That's what the "almost" is there for. Every monitor that is even somewhat aimed at gamers/media will likely have Adaptive-Sync in the near future (heck, one maker is even making it available for free through a firmware update on a monitor already in circulation). As for FreeSync, it works on any monitor that supports Adaptive-Sync as far as I'm aware; basically their graphics card is the thing with the extra requirement (which is why only the new 260X, 290, 290X and 285 support it in games). FreeSync is just an Adaptive-Sync add-on. G-Sync is a full alternative that Nvidia is using to milk out more cash for no particular benefit to the consumer; it would have been easy for them to just support Adaptive-Sync, but they want to charge more and reap all the benefits, so they didn't.
does g-sync or freesync work with other graphic cards?
I dunno what all the fuss is about. I myself barely notice it, and actually have never noticed it for myself. It's nothing game-breaking.
***** Well, personally I don't even notice it without pausing the screen maybe I have retarded eyes I dunno..
You do, ye
***** Yeah, I realized I never noticed it in other games because V-Sync is mostly on by default. I just turned it off in Overwatch and it surely does hurt my eyes. It's not so bad in this video though.
I've got like 140 games on my PC and haven't noticed tearing in any of them, ever. I get artifacting in GTA 5, and my friend does too, but that's just the game; it's a common problem.
***** Well, G-Sync is obviously a waste. Never in my life have I seen screen tearing; people with G-Sync monitors are either being little babies about it or they have absolute god-like eyesight. My eyesight is perfect and I have never noticed it in any game on any PC in my life.
I was thinking of upgrading my computer this would be so helpful
I don't see anything, I'm sorry XD
+Mark ‘nattaking’ Kamp Their example wasn't great.
From my experience GTA V really suffers from this; it is absolutely terrible with V-Sync off, to the point where you can't really see anything when turning the camera.
And with V-Sync on it will immediately cap the framerate at 30 as soon as it hits 59 fps, causing some awful stuttering (very noticeable when driving fast).
Then you're blind, because I see a difference...
where do you see it?
Damn this was 5 years ago?
rich = wants good graphics for playing experience and story
poor = wants good game story
That doesn't make any sense
Bought a 144Hz monitor, no G-Sync. Works amazingly. I'm playing at around 100fps and don't see image tearing. I used to see it on my previous 60Hz monitor, but it seems it's not an issue anymore at 144Hz. I'm pretty happy I saved the money and didn't get G-Sync.
Soo... another $400 for PC Elitists to spend. Excellent.
If they want to. It's not necessary, but we have the option. Console peasants are stuck with their shit hardware and configuration even if they wanted something better.
SmartNexus Something better... like a $400 20-inch screen.
smokebomb.exe I have a 28-inch 4K screen from Samsung which cost me 400 dollars. Well worth it for me. It doesn't have G-Sync, but I don't need it anyway. But you can get it if you want on PC. On consoles you have screen tearing and you cannot fix it even if you could afford to ;)
You're obviously either a deluded moron who doesn't know the prices of technology or just a poor kid trying to find some reason why his low-end handicapped PC (console) was the better buy XDDD
Friend me and I'll show you my rig. And PS4, XBone (which I regret buying), WiiU, and PS3. And 64 inch Aquos 4k tv screen they all run on.
facebook.com/rocketpunchgo
Raziel Milky Blasphemy! Thou shalt not speak vainly against the Dreamcast!
Is it just me or is the G-sync screen very distorted? Looks like you put a big lens or something in front of the screen.
V-Sync ON was a good enough savior for ages, until frame rates could go way beyond what traditional monitors handle; hence G-Sync.
It looks like display manufacturers are going to be focusing on either Freesync or G-Sync in 2015. There will be products available from several different manufacturers for each technology. The G-Sync displays will be more expensive because of the hardware costs but since companies are avoiding internal competition there may not be a direct comparison available. Companies will probably try to pad in some extra profit from branding so even if the hardware is only $100 difference in cost the actual price for similar features could be much higher. G-Sync got an early lead by being first to market but the higher cost does not appear to be a good value compared to Freesync. Brand loyalty for GPUs and displays will probably have a big impact on sales but we have to wait and see if that loyalty will be enough to keep G-Sync in the lead.
I got a G-Sync monitor and can't tell any difference. I turned it on through the Nvidia settings. Maybe there is something else I have to do? I went from a 144hz to a 120hz IPS ultrawide, and I get a solid 120fps. It looks no different than my 144 tear-wise :(
FreeSync, which is an AMD technology, uses the Adaptive-Sync standard built into the DisplayPort 1.2a specification. Because it's part of the DisplayPort standard decided upon by the VESA consortium, any monitor with a DisplayPort 1.2a input is potentially compatible. That's not to say it's a free upgrade; specific scaler hardware is required for FreeSync to work, but the fact that multiple third-party scaler manufacturers have signed up to make FreeSync-compatible hardware (Realtek, Novatek and MStar) should mean that pricing stays competitive. According to AMD, the following pre-existing GPUs will be able to use FreeSync for dynamic refresh rates in games (after a software update): Radeon R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260. Other cards/chipsets will support FreeSync, but only for "video playback and power-saving purposes"; these include Radeon HD 7000, HD 8000, R7 and R9 series cards and APUs from the Kaveri, Kabini, Temash, Beema and Mullins lines.
I googled it for you: a G-Sync monitor adds about $150-200 to the price.
When I play a game, I want it to always update in perfect sync. For example, I want to see an image on my screen exactly every 16ms. No more, no less. Exactly 16ms. The video made it look like G-Sync variably changes the refresh rate on the monitor. I thought that was odd. Is G-Sync for me, with a 144hz monitor?
That is what it does. It doesn't matter what graphics card you are running; in AAA games you won't always be over 144fps. When it drops to 120, the GPU and monitor will run in sync. That being said, I have a G-Sync monitor and a 2080 Ti and can't tell any difference with it on or off. My cousin swears by it though. Maybe you need 165hz like he has to tell? Idk. I went from 144hz to a 120hz IPS ultrawide and I can get a solid 120fps in a lot of games. As far as the tearing goes, I see no difference from 144hz to 120hz, G-Sync on or off lol
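The "exactly 16ms" figure a few comments up is just the per-frame interval at 60Hz; the point of variable refresh is that the monitor's interval follows the GPU's actual frame time instead of being fixed. A quick sketch of the numbers (nothing here is G-Sync-specific, just arithmetic):

```python
def frame_interval_ms(rate_hz):
    """Time between frames at a given fixed rate."""
    return 1000.0 / rate_hz

# Fixed-refresh intervals: the monitor redraws on this schedule
# whether or not a new frame is ready, which is where tearing
# (V-Sync off) or stutter (V-Sync on) comes from.
print(round(frame_interval_ms(60), 2))   # 16.67 ms per refresh at 60 Hz
print(round(frame_interval_ms(120), 2))  # 8.33 ms at 120 Hz
print(round(frame_interval_ms(144), 2))  # 6.94 ms at 144 Hz
```

With a variable-refresh display, a frame that takes, say, 8.9ms is simply shown for 8.9ms; the refresh interval stretches to match, so there's no fixed 16ms grid to miss.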
Should I upgrade my GTX 1060 6GB to a 1070, or get a G-Sync monitor?
If you truly want to make a great comparison, you should at least turn the in game camera at the same slow speed that's being shown on the G Sync Side.
How low can the variable refresh rate go on G-Sync displays?
If your frame rate is capped at 60 and your frame rate never dipped below 60 do you still get torn frames?
If your fps locks below the desired rate with V-Sync on, turn V-Sync off and use borderless fullscreen = poor man's G-Sync
If you have a 144hz monitor, does G Sync help in anyway? Is it necessary to have both?
It will help; it basically sets 144hz as the highest refresh rate it will render at. It's weird and hard to explain, but I have a ROG Swift and there is a difference between 60 and 120, and even between 120 and 144hz.
*So I have to have a gsync monitor for this to work?*
Literally couldn't tell the difference, haha. Maybe my eyes are just too slow to tell.
Why don't I notice the difference between V-Sync on and G-Sync? What is the difference?
Then you're blind and a noob. V-Sync stutters and gives horrible input lag; G-Sync won't stutter and won't lag your mouse. If you move your mouse like an elephant then it doesn't matter anyway, and if you're blind... you get what I'm saying.
Wow even more salt xD
Honestly this video showed me everything I needed to know lol. I don't think I ever noticed that, but I haven't played any new games recently to tell.
I have an XFA240 monitor and a GTX 970 and don't see the option to turn on G-Sync in the Nvidia settings. HELP!!!
Is 240hz + G-sync worth it?
I don't think so. From what I have seen, 144 to 240 isn't noticeable. But that being said, I can't see a difference from 120 to 144 lol. 60 to 120, though, is a world of difference. I used to be all about that fps but finally said fuck it and got a 120 because I wanted an ultrawide. G-Sync makes no difference on my new monitor or my old 144. My cousin runs a 165 and swears by G-Sync, so in his experience 165 plus G-Sync is worth it. I need to go visit him to confirm this. He told me he can move his mouse around the screen as fast as he can with no trailing. Sounds impressive, but I have to see it before thinking about going back to a flat screen.