I'll soon compare the RTX 5000 and RTX 2080 Ti, as I have those, but not the RTX 6000. So far I know that the RTX 5000 is much slower for GPU rendering in Octane and Redshift
@@synthdude7664 I have to check viewport performance to know if there is a difference there, but probably not. ECC RAM on the GPU, maybe, resulting in less crashing? 30-bit color output instead of 24-bit. Better components, quality control? At work we ordered 12 2080 Tis, and 3 were dead on arrival and one died within minutes, and big businesses don't want to deal with RMAs and be without a card for weeks until the replacement arrives.
People are dumb or what❗️ In this video he compares fps. Sure, the GTX is way better, it's a gaming card. But the focus here is speed; you need to test the accuracy of the image, not the speed, and also run some tests for random RAM errors and check the difference. Anyway, for me the 1060, 1070 and 1080 can be a good alternative, but for the 900-series cards, no and NO, don't take them if you want to do CAD or modeling, they are only for gaming
ah hello? this is not actually a real test comparison... when was the Quadro K5000 released? What you should be doing is comparing the Quadro M5000 vs the GTX 1080, if you can afford it
The K5000 is still being sold. Some people may want to consider it if their budget for a GPU is under $1000, and I wanted to show that it is behind the GTX in these programs. I will not compare the M5000 because we will not purchase it, especially since its FP64 performance is the same as the GTX's, so we would have no benefit compared to the GTX.
not a good comparison. Try a comparison with a CAD model that consists of 1000 components; that will crush even a GTX 1080 Ti. There's a reason why professionals are using Quadros in their workstations.
I have not tried to compare any solid modeling software with these. I and many others use solid modeling for just small things, for which even laptop cards are fine, but there is a much bigger user base that uses Maya, Max, and C4D for art, design, architecture, and games, so this is aimed at that audience. If I worked with huge CAD models professionally, I would not get a GTX card. This was also 2 years ago, and considering that the 1080 Ti now costs $1200 when it should be under $700, and that the P4000 is $800, I would have a harder time deciding.
Hello! I have a Quadro 4000. Is it better than a GTX 1070? (I am an After Effects and Premiere Pro user.) I am confused: is it better to stick with the Quadro 4000 or upgrade to a GTX 1070? Thanks!
The comparison was based on price - at the time both cards cost the same, and many 3D artists on a tight budget were torn between the two. This was 5+ years ago; I didn't know anyone still watched this video. At work we sent all the K5000s to recycling 2 years ago
Some of the big items on that list of why Quadro and FirePro should be used for CAD and similar work: the engineers who make them should get paid to bring you a piece of hardware and software that is more stable, more reliable, and made exactly for that use. The thing is, if you make money with it, you should support it more than some kid who barely uses it for gaming. However, I agree that Nvidia and AMD should make that line more obvious and make sure the price is justified in more ways than that one if it is 2-3-4-5-6 or more times higher. If the user supports that but the rest of the pipeline does not, what's the logic in that?... Plus, this 1080 vs K5000 comparison is misleading, as those are different architectures, two generations apart. The comparison should be 1080 vs P5000, and even then it should be made from many angles and for each piece of software separately. That's how a comparison is done and that's how you get to the right info, no matter what the question is about. Anyway, thank you for the video, Tech Guy. I still generally choose workstation-grade hardware.
The video is from 2016, when the GTX 1080 and the K5000 (used, from eBay) were the same price. Today I would have compared it with the P4000, since they are a similar price, but that one still has fewer CUDA cores. At work we completely moved away from Quadros, so I won't be making a video
I don't know anyone with a P4000, but if I get my hands on one I will test it. I am waiting at least a month as the GTX 1180 might be released, so I'll have to see what is worth investing in.
I want to buy a laptop because I want to run design engineering applications like CATIA and Siemens NX. Do I need to go for a Quadro, or will a 1050 Ti work?? Plzz answer fast.
Well yes, but you are comparing a last-gen GTX with a Quadro from 2012. And it wasn't even the top Quadro at the time. Anything will look slow in that test.
But even today it is the same price as the K5000. The M6000 has the FP64 acceleration removed, which was the biggest reason to get a Quadro rather than a GTX, so we won't be getting one
Well, this 1080 consumes 180 W while the Quadro K5000 consumes 122 W, so for the performance difference it is worth it. My home power bill in NY is about $35, so I don't think the computer is contributing much to the total bill
Dude, this was 7 years ago. The basis for comparison was the price; at that time they cost the same, so if you had money for one, you might have been deciding between one and the other for 3D work. Today Quadros are useless, especially if you are GPU rendering
Do you get an error? I have it working on all computers with a 1080. Make sure you get the official drivers from the nvidia website and switch to the Nitrous viewport mode in Max prefs
The GTX 1080 cannot do double precision calculations (hence forget the precision you need when working as a class-A modeler in applications like CATIA or Alias, for example) and also cannot output 10-bit colour, which means it cannot cover the colour range you need when working professionally with, say, Photoshop or any colour grading and colour correction application. Also, without that 10-bit output you cannot take advantage of those professional monitors that you definitely need.
Read the other comments, I think we already had this discussion. All cards can do double precision calculations, but Quadros used to have them accelerated - not in the Pascal architecture, though. In Pascal, both Quadro and GTX have the same acceleration for double precision. As for no 10-bit output on GTX, that is also a thing of the past. We have 10-bit LG monitors, and that GTX 1080 is set to output 10-bit color in the nVidia Control Panel.
While Geforce cards can do 10-bit, it's only for the DirectX render path. For the 10-bit OpenGL render path, Nvidia keeps that feature locked up in the Quadro drivers (both cards are capable of 10-bit, but 10-bit OpenGL and the extra VRAM are why professionals pay for Quadro). Photoshop and Premiere require a Quadro to even display 10-bit content. You cannot use Geforce cards to get 10-bit color in these specific professional apps. Setting 10-bit in the control panel on a Geforce only works for a few things -- it's not enough to work professionally in 10-bit without a Quadro. I do agree Geforce is faster and cheaper for 3D operations when you don't care about 10-bit.
This has been the case for many years and has not changed. nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce-gpus forums.adobe.com/thread/1701015 www.reddit.com/r/editors/comments/3dnwum/premiere_pro_cc_2015_lowend_quadro_for_color/ct7zxn9/ www.eizoglobal.com/support/compatibility/gpu/photoshopc_nvidia_amd/ forums.adobe.com/thread/2151899 If asked, Nvidia would likely say "the hardware is capable of 10-bit", which is true, but as DirectX doesn't run on Mac OS X, Adobe has to use OpenGL for cross-platform compatibility. It is the Geforce driver, however, that does not permit 10-bit color on the OpenGL rendering path. If Nvidia allowed this, it would open up deep color to a lot more developers and consumers. But those Quadro cards are a cash cow so long as the professional market is willing to bear their cost for the few features that have been held back from Geforce. Nvidia Titans are in the same boat as Geforce. Anyone with both a Quadro and a Geforce card can test 10-bit color display themselves fairly easily. Find a 10-bit image file with a smooth color gradient; you can often find them made specifically for display-testing 10-bit color. Open it in Photoshop on the Quadro card and it looks great with no banding/dithering; do the same on a Geforce/Titan and you will see banding/dithering.
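If you can't find a ready-made test file, a smooth ramp with more than 256 distinct levels is easy to generate yourself. A minimal sketch that writes a 16-bit grayscale PGM; the filename and dimensions are just examples, and most imaging tools (GIMP, ImageMagick, etc.) can read or convert the result:

```python
# Writes a 1024-step horizontal gradient as a 16-bit grayscale PGM.
# With more than 256 distinct levels, any 8-bit display path will
# show visible banding, while a true 10-bit path stays smooth.
import struct

width, height, levels = 1024, 256, 1024
maxval = 65535
row = [i * (maxval // (levels - 1)) for i in range(width)]  # 0..65472 in steps of 64

with open("gradient16.pgm", "wb") as f:  # example output name
    f.write(f"P5 {width} {height} {maxval}\n".encode("ascii"))
    for _ in range(height):
        f.write(struct.pack(f">{width}H", *row))  # PGM stores 16-bit samples big-endian
```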
The basis for the test was price, as the two were similarly priced at the time. Most of us decide with "what is the best I can buy for this task with X amount of money". This should help people with a certain budget decide whether they want one or the other
A comparison that makes no sense. Compare two cards of the same generation, like a Titan X vs a Quadro P6000 or P5000; those are on the same architecture. There are two generations between these two, lol, so of course the 1080 is better, lmao. But a 1080 or 1080 Ti vs a P5000, now there's a match!
I would argue that this was an unfair battle. It should have been the Nvidia Quadro K5100M vs the GTX 1080M. And why would you compare a Mini card with the full desktop card? That makes this all the more a non-genuine comparison. You gave me too much ammo to discredit you. The point is being objective, getting the better card for your buck, not fanboyism.
Neither card is "mini", and neither is an M (mobile) variant; both are exactly the same full-size cards. The price was also almost the same for a second-hand K5000 and a new GTX 1080, and that was the basis for the comparison, as someone with that money may be looking to get either the Quadro or the GTX.
Very subjective. The author decided that professionals are people who work in Maya and 3ds Max. This is not true. The real pros are architects and industrial designers. Their main development tools are ArchiCAD and AutoCAD, as well as Blender. Almost all of these professionals work on Linux and iOS. Programs under Windows are generally difficult to call professional. The above-mentioned programs use a lot of shader processing threads, triangles, and other vector math, and more precise geometry. That is exactly what the method of splitting general computation between the CPU and the GPU is for. When rendering very complex scenes in Blender, your Titan X or GTX 980 will take far too long. The only alternative to waiting ages for renders to finish is a Quadro card. Believe me - the time difference is severalfold.
I also use ArchiCAD and AutoCAD, and work with studios that use both, as well as other programs. I don't know anyone who uses Blender, though, even though I have been in the 3D industry for more than 13 years; it is a hobby program. The GPU has nothing to do with rendering: only a few rendering engines use the GPU, and neither production houses nor even small studios use GPU rendering yet.
I'm curious whether that is going to change with some of the rendering engines like Octane and V-Ray, which support GPU rendering. Considering the real-time rendering capability of some of this software, it seems like it would make sense to tackle it from this direction, depending on the aesthetic you are looking for.
The basis for comparison here is the similar price (thus "basis", and we see how the other parameters compare from one common point). There are people considering getting this K5000 instead of the GTX 1080 for 3D
Tech Guy, sorry for the late reply. Well, some people say that every Quadro gets a very precise check of its components to make sure it survives long and is very durable. GTX cards for consumers are produced by the thousands a day and nobody checks them; Nvidia has no money for that. But for businesses they made the Quadro quite expensive for its quality. That's what I've actually read somewhere, or saw in a video.
Quadros definitely have better quality control. GTX cards are made by Nvidia's partners, while Quadros are made by Nvidia itself. But there is no quality difference while working. Most smaller businesses would not benefit from Quadros. Quadros become performance-obsolete long before the hardware fails, and at that price point you could have replaced 2 or 3 GTX cards, always running the latest and best generation.
Vray 3.6 now uses hybrid CPU-GPU rendering, or either one alone. Octane uses the GPU, as do Arion, FurryBall RT, Redshift, Moskito, Cycles, and Indigo. It's helpful to have decent viewport playback for sure, but my focus is what I can do in a small render farm or stand-alone environment. AMD's Vega Frontier is worth a look as well.
EDIT: Please note that the video is not relevant anymore. It was created when a new 1080 cost as much as a second-hand K5000. But today GTX cards are too expensive because of cryptocurrency mining, and Quadros are much less expensive than several years ago
thanks for the update sir
my 1070 is pooing
i need Quadro
thanks for the research you put into this to help others
Thanks for your research and for your update!
and now GeForces are cheap haha
No one talks about electrical consumption.
A 50-watt difference over one year, 24/7, is 438,000 watt-hours -
438 kWh!
And electricity prices grow year by year
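For the record, the arithmetic behind that figure, with the yearly cost at an assumed example rate of $0.15/kWh (rates vary widely by region):

```latex
50\,\text{W} \times 24\,\tfrac{\text{h}}{\text{day}} \times 365\,\text{days}
  = 438{,}000\,\text{Wh} = 438\,\text{kWh},
\qquad
438\,\text{kWh} \times \$0.15/\text{kWh} \approx \$66\ \text{per year}
```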
@@OlivierRacaniere dude
Compare render times before you calculate the electricity cost
If you have a choice between a modern Geforce card and a Quadro, the Geforce will likely outperform it (higher clock), but the Quadro is one of the few devices (along with AMD FirePro) whose drivers enable the 10-bit OpenGL path, which Photoshop/Premiere and other professional applications can use to display deep color. To be clear -- a Geforce card will likely render scenes out faster than a Quadro, and the content may be 10-bit depending on where it came from, but the extra color depth cannot be displayed inside Photoshop/Premiere/Resolve for professional color grading (this also requires a true 10+ bit professional display).
The question isn't "what is faster?" but rather "do I need to display 10-bit content in a professional application, or do I require more VRAM than Geforce cards can provide?". Those last 2 points are why Quadro cards are still relevant to professionals. If I were uploading 10-bit video to YouTube/Netflix for HDR, I'd probably want to see the content on a 10-bit display and maybe do some color grading with deep color available. I could handle a similar situation with a Geforce card, but I wouldn't see 10 bits; it would dither down to 8-bit, there might be some banding, and while it may look "good enough", I would still want to see it in 10 bits before publishing if this was professional work.
What about the AMD FirePro W9100?
It's called early adoption. YouTube supports HDR right now; they even allocated extra bitrate AND do automatic conversions to SDR for everyone else. If you are making a high-quality master that you shot in 10-bit, why not upload the highest quality source, so that as YouTube supports future codecs (h.265 is coming!) they can just automatically re-encode from your high-quality master? I don't see a downside. I expect HDR display market saturation in 3-5 years, like we saw with 1080p/HD adoption. Any early buyers are going to be looking for content to view on their new televisions, and I see an opportunity to gain a couple of views from this tech.
support.google.com/youtube/answer/1722171?hl=en
support.google.com/youtube/answer/7126552?hl=en
If you have the capability to record in 10-bit, why wouldn't you adopt the HDR workflow? I just pre-ordered a Panasonic Lumix GH5 camera, which records 10-bit internally. I can only think of a few reasons not to: 1) because you hate color correction in post processing and want to upload straight from the camera to YouTube, and 2) because you lack the gumption to read about the different/evolving HDR standards.
As mentioned, FirePro does support a 10-bit OpenGL path, but as far as its 3D performance goes I would just defer to online benchmarks. I'll note that gamers scoop up Geforce cards for performance reasons, but I never hear about gamers with FirePro cards. I favor AMD graphics more from a bang-for-your-dollar perspective than as a top-dog performer. Also, while FirePro cards do support OpenCL, I find that some apps (Adobe products) either lack OpenCL support (favoring CUDA) or run better with their CUDA optimizations than on OpenCL (Blender falls into this category). I think it's unfortunate that the open standard, OpenCL, is not as popular.
technoendo But what about the workflow, like previewing effects and color grading in Adobe Premiere? That W9100 has 32GB of GPU memory for only $2.5k. That has to mean something. I'm looking for the best workflow.
I don't follow. Preview effects will still be accelerated on any proper GPU regardless of brand; is there some FirePro-specific feature here? 32GB of VRAM for $2.5k does mean something. Are you familiar with GPU usage in Premiere in general? Pugetsystems has been doing some great benchmarks over the years, and it looks like they updated the page below to include the newly released 1080 Ti; they also have VRAM recommendations for different resolutions of media (8K media may need 8-10GB of VRAM, less than a third of the VRAM you propose). Also, when you look at these benchies, do you see the minor gap between the 1070 and the 1080 Ti/Titan? Basically, Premiere may not fully utilize the GPU, which is why so many video editors aim for 1060s/1070s (usually it's gaming or 3D rendering that better utilizes these higher-end cards).
www.pugetsystems.com/recommended/Recommended-Systems-for-Adobe-Premiere-Pro-143/Hardware-Recommendations
If you can use the VRAM, I think you have a winner, but if you can't demonstrate that you'll use anywhere near 32GB, then I would just get a lower-end/older Quadro if a 10-bit display is necessary in professional applications. I'm a little hopeful that the HDR standard and the increased prevalence of 10-bit displays in the next 3-5 years may mean we'll see alternatives -- maybe we work on 10-bit media blind in Premiere, but can generate previews to view in some DirectX-based 10-bit video player on a monitor outside the primary application. I wish you luck, sir.
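For a rough sense of where such VRAM recommendations come from, here is a hypothetical back-of-envelope sketch; the 4-channel 32-bit-float working format and the buffered-frame count are my assumptions, not anything Adobe documents:

```python
# Back-of-envelope VRAM estimate for uncompressed video frames.
# Assumes 4 channels (RGBA) at 32-bit float per channel -- a common
# GPU working format; the real pipeline internals are not public.

def frame_bytes(width, height, channels=4, bytes_per_channel=4):
    return width * height * channels * bytes_per_channel

resolutions = {"1080p": (1920, 1080), "4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}
for name, (w, h) in resolutions.items():
    mb = frame_bytes(w, h) / 1024**2
    # assume ~10 frames in flight: sources, effect intermediates, output
    print(f"{name}: {mb:.0f} MB per frame, ~{10 * mb / 1024:.1f} GB for 10 buffered frames")
```

On those assumptions, 8K frames alone eat about 5 GB before the application, effects, and OS take their share -- roughly in line with the 8-10GB recommendation and nowhere near 32GB.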
With Quadro you're paying for stability: drivers that are compliant with the pro tools, and memory that is stable. I use Catia in my day job; licenses push 100k p.a., and then the hourly rate for design engineers is high. A few grand for a bit of hardware really doesn't come into the equation. Just buy what is sure to work.
That is what Quadros are made for! Although, on the driver side, nVidia has abandoned custom Quadro drivers for some 3D software. But honestly, as someone who also works on render-farm management, I will tell you that from this perspective it is better to give 3D artists the worst GPU possible, so they stay polygon-conscious and I don't end up with 1-hour frames and a useless render farm. "Sadly", today there are no bad graphics cards.
You really have to get into the mathematics of how the applications work. CAD tools have two levels of detail they work with in real time. You can vary the visual representation, even keep the tessellated/B-spline representation in completely separate files, but the actual geometry has to be exact. In Catia's case it's up to 15 decimal places, I think. Visualization programs don't work to the same accuracy, as it's not required, so yes, they can benefit much more from the GPUs that are made for gaming. For example, if you run some precise calcs on CUDA via the GPU, you will find CUDA is inaccurate, as it rounds.
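The single vs double precision gap is easy to demonstrate; a minimal sketch (NumPy on the CPU, but float32/float64 behave the same way in GPU math):

```python
import numpy as np

x = 1.0 + 1e-10              # a value needing ~11 significant digits
print(np.float32(x) - 1.0)   # 0.0    -> the tiny increment is lost in float32
print(np.float64(x) - 1.0)   # ~1e-10 -> preserved in float64

# Machine epsilon: the smallest relative step each format can resolve.
print(np.finfo(np.float32).eps)  # ~1.19e-07, about 7 decimal digits
print(np.finfo(np.float64).eps)  # ~2.22e-16, about 15-16 decimal digits
```

Those 15-16 digits of float64 are exactly the headroom a CAD kernel working to 15 decimal places needs, and FP64 is the operation Quadros historically accelerated.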
Brian Eastwood These past few days I have been reading about floating point precision in current cards, and according to those articles both GTX and Quadro cards can use FP64 precision, but the GTX did not have it accelerated like the Quadros did. The interesting thing now is that new Pascal Quadros such as the P5000 do not have that high-speed FP64 support, so the lines are getting really blurry there.
Yes, Geforce cards are better for content creation software. But for the broader uses of 3D, do a benchmark with SPECviewperf, the benchmark software for industrial 3D. Then we would see the Quadro completely destroy the Geforce.
I will compare again with SPECviewperf in a week or two... I am curious now
I'm really curious about the results! I need to buy a new workstation at work...
I'd love to see a showdown with Solidworks. It seems very sensitive to the OpenGL drivers in the cards.
So I did some research and talked to some people who are software developers for CAD software. Some senior developers even said that it's total bullshit to buy a Quadro card. My conclusion is that for my situation/work a Quadro is not the way to go. Considering the cost and the benefits, it's better for my situation/work to buy a GTX. So for now I think I'll go for an Alienware 15" with a GTX 1070 and full specs; if there are any tips for a better laptop, let me know!
Also, our workstation now has a GTX Titan 12GB with an i7 CPU overclocked at 4.6GHz, and it works just fine with the software I use. The software I use: Vectorworks, Cinema4D, AutoCAD, S-CAD, Solidworks, Keyshot, Adobe Suite.
There has to be a reason why those Quadro cards are way more expensive, like development costs and the things Pix Pix is mentioning.
Wow, interesting discussion so far. Yes, nowadays Titans really blur the line between Quadro and Geforce; Nvidia admits it. And the software you use seems like it will run just fine on a Geforce, especially a Titan. In terms of horsepower, Quadro and Geforce don't differ much. Not long ago I read Nvidia's documentation about Quadro; it says Quadro has better memory management and stability than Geforce. Maybe that's where the rest of the money went.
One of my friends works at a car-parts manufacturer, and the job involves a lot of 3D scanning. She was once using an Nvidia Geforce for the 3D scanning; not long after that, her company bought a Quadro workstation with a very similar spec. She said there was some improvement in the scanned-data processing speed after switching to the Quadro workstation.
To be fair, that quadro is from like 2 generations ago.
Correct! It would have to be compared with a P-series Quadro...
Comparing a Pascal vs a Kepler card, of course the way older card will lose
I agree with you when it's 3ds Max, because Nitrous is optimized for gaming cards,
but in Maya I think the Quadro is better with the OpenGL setting.
You can try it.
Sorry for my bad English.
You're absolutely right. If you test SolidWorks, NX, or Pro/E, the Quadro is much better than a GeForce.
Yeah, the only problem is that Maya, 3ds Max and Cinebench only check the DirectX capabilities of the cards. Let's try comparing the cards in an OpenGL and 3D-modelling (CAD) environment... ;)
Please note that the K5000 is based on the Kepler architecture and was released back in 2013. A fairer comparison would be the Quadro P6000 against the GTX 1080, both bearing the new Pascal architecture.
But essentially, yeah, Quadros are not worth the price in MOST cases for typical PC users and are made to delete your wallet
No need to go that far; a comparison between the Quadro M5000 and the GTX 1080 would be far more interesting.
I heard that the Quadro has 10-bit colour per channel for OpenGL, but the Geforce only has 10-bit in DirectX; is this correct? If you get a 10-bit monitor, that would mean you need a Quadro to fully utilize it.
Don't choose GTX. Choose Quadro
because the K5000 is older; its double precision is 130 GFLOPS, while the 1080's double precision is 277 GFLOPS
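Those numbers come out of each architecture's FP64:FP32 ratio; a quick sketch, where the FP32 peaks and ratios are my approximations from public spec sheets (published FP64 figures for the K5000 vary):

```python
# Rough FP64 throughput = FP32 peak x the architecture's FP64:FP32 ratio.
# Approximate spec-sheet values, not measurements.
cards = {
    "Quadro K5000 (Kepler GK104)": (2170, 1 / 24),  # (FP32 GFLOPS, FP64 ratio)
    "GTX 1080 (Pascal GP104)":     (8873, 1 / 32),
}
for name, (fp32, ratio) in cards.items():
    print(f"{name}: ~{fp32 * ratio:.0f} GFLOPS FP64")
# -> K5000 ~90, GTX 1080 ~277
```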
Thanks for the post; it helped a lot with quoting a 3ds Max user their new PC. Seeing the vid and the comments, I was convinced.
The customer is getting a new i7 rig (16GB RAM, PCIe SSD, 27" monitor), and I went with a GTX 1060 6GB. A champion computer I'll sell for US$1400.
Cheers Tech Guy :)
Can you compare Quadro vs Geforce cards in the Google SketchUp program?
I did a test with a GTX 580 and a GTX 780; both provided the same frame rate in a scene of more than 100MB.
I cannot find videos about this on YouTube.
In that scene the computer slows down
That GTX 580 was like the Fermi-gen Titan.
If you are working with DaVinci Resolve or other GPU-accelerated video editing tools, forget about the Quadro; get the GTX.
Thank you for the heads up. I am only a 3D Studio Max user; I wonder if you can extrapolate this to that software (3ds Max is never used in the benchmarks in reviews).
The benchmark at 0:14 is Max. I have been a Max user since it was 3D Studio v4 for DOS, and in my opinion it is completely ruined software. As you can see, there is not much difference between the cards in Max. Both are glitchy, as you can see (I intentionally turned off the lights, because it was client work). I have tried both DirectX and OpenGL modes with similar results, and both have very low frame rates compared to Maya (sometimes more than a 10x difference). I even tried the dedicated 3ds Max driver for the Quadro, but it was the same.
GPU NVIDIA Quadro K5000 (GK104). Its speed equivalent is the GeForce® GTX 680
Thank you so much. I want to ask a question. I can't find any information about the Quadro M2000M on the internet. I can't choose between the MSI GS60 and the MSI WS60. I am an architecture student. Which one is more useful for my modeling work and rendering, the GTX 1060M or the Quadro M2000M? I hope I could explain myself. Sorry for my English.
Yeah, it's only marketing; I have not seen a difference between Quadro and GeForce. I use Octane Render with dual GTX 980 Tis, and I found that GeForce is more worthwhile than Quadro. A few years ago I tried to test the viewport preview speed when modeling with heavy polygon counts on a Quadro K5000 versus a GTS 450, and I did not see a difference.
Hello there. I know that this video is a little old, but I was wondering if you could give me a moment. I am curious as to how your Quadro compares against the GTX 1080 in OpenGL mode in Maya (if that is not what you were already using). The reason I ask is that I have a FirePro W9100 and am able to get just under 190FPS in the Cinebench test, and I often work in OpenGL mode in Maya 2016 in order to see all of the features that are not supported in DX11 mode (in addition to previewing an extreme number of textures).
In the end I want to see if professional cards can still outperform a consumer card in OpenGL (Viewport 2.0) mode, as I am thinking about selling it as I move from CPU rendering (Arnold) to GPU rendering (Redshift). If the performance holds up, then I am planning on getting a Titan XP plus a trio of 1070s. I still have to do an extensive amount of texture painting and previewing, so I will indeed need the extra VRAM, but when it actually comes to rendering in Redshift, it is so RAM-efficient that 8 gigs is all I would need for the headless cards (I have already tested a friend's 1080).
P.S. I know that the much newer 1080 beat the Quadro in Cinebench, but I am wondering if that translates into real-world performance.
THANKS!
Hi, could you do a comparison between the Quadro M6000 and the GTX 1080?
I do not have an M6000, sorry. We decided not to use Quadros anymore. It should not have better performance except in raw CUDA calculations (the M6000 has 3K cores and the GTX 1080 has 2.5K), but 2x 1080s should beat the M6000 even in CUDA calculations by at least 80%, and still be less expensive.
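A crude way to sanity-check that claim is cores x boost clock, ignoring memory bandwidth and architectural differences; the clock figures below are my assumptions from spec sheets:

```python
# Very rough relative CUDA throughput: cores x boost clock (MHz).
# Ignores memory bandwidth and per-core gains, so treat the result
# as a sanity check, not a benchmark.
m6000   = 3072 * 1140   # Quadro M6000: ~3072 cores @ ~1140 MHz boost
gtx1080 = 2560 * 1733   # GTX 1080:     ~2560 cores @ ~1733 MHz boost

print(f"one 1080 vs M6000:  {gtx1080 / m6000:.2f}x")      # ~1.27x
print(f"two 1080s vs M6000: {2 * gtx1080 / m6000:.2f}x")  # ~2.53x
```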
So why is the Quadro extremely expensive?
Felix Martono Quadros used to have a lot more RAM than consumer cards, so big studios could work with heavy scenes. But since the 1080 now has 8GB of memory, I do not see the need for a Quadro. Also, Quadros used to have specific performance drivers for 3D programs, but I remember a few years ago someone found a setting that let a Geforce card be seen as a Quadro and run the same driver, with the same performance, so it looks to be a marketing gimmick. Since about the Quadro K4800 I have not seen custom drivers in 3ds Max, there is nothing in Maya, and I am not sure about other programs. I have also read that they do anti-aliasing in hardware so the CAD lines look "better", but I did not notice anything different about the lines between these cards.
Modern day quadros also have tons of RAM available and in ECC configurations.
Quadros are no longer suitable for use outside of clusters. Get a Titan.
The 1080 costs about a 10th of what the Quadro K5000 did, and performs better. Amazing.
For content-creation 3D, mostly yes; for industrial 3D, absolutely not.
please explain what the hell industrial 3D is supposed to be?
MaxtronZero more like 3d for industrial needs, such as laser 3d scanning, car parts design, tower design, oil rig design, plant design, etc.
So for Revit Architecture and for scans with a Faro (but only interiors), would you suggest a 1080 or a Quadro?
Revit seems to be geometry-heavy software rather than texture-heavy, and as far as I know, 3D scanning needs good GPU stability and memory management. Lastly, do you need absolute precision in your work? If so, I would recommend a Quadro.
What is the lowest graphics card I should buy to use CAD 3D programs decently (SolidWorks, AutoCAD, etc.)? Does a 1050 Ti work well? Or do I need a better one? Or can I buy a cheaper one (RX 550 4GB)?
I won't work with them professionally; I'll only use them for studying.
There are probably benchmarks of GPUs for Solidworks, so try to find them. Also, if you don't plan to make insanely complicated models, any GPU should be good enough. Open the most complicated scene you have on your current system and check the framerate. If the performance has to be doubled, seek out the card with double the 3D benchmark score.
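A hypothetical helper for that rule of thumb, assuming viewport performance scales roughly linearly with GPU benchmark score (a first approximation at best):

```python
# Given your current card's benchmark score and viewport fps,
# estimate the score a replacement card needs for a target fps.
def required_score(current_score: float, current_fps: float, target_fps: float) -> float:
    return current_score * (target_fps / current_fps)

# e.g. current card scores 4000, the scene runs at 25 fps,
# and you want a comfortable 60 fps:
print(required_score(4000, 25, 60))  # -> 9600.0
```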
Quadro P3000 or GTX 1080, which one is best?
Spinning heads and suchlike are all very well, but I've had both Quadro and GTX, and I'll say that the GTXs are good for ActiveShade rendering (V-Ray), while the Quadro is better for viewport *display*. Maybe. I'm still testing my new Geforce cards, but when working in 3ds Max, things like edges are hard to see and sometimes not visible at all - it's a glitch and will drive me nuts if I experience it permanently. It's early days, it might be driver issues, but I've got two 1080 Tis and was intending to fill up my motherboard with 4 of them. I think I'll just go with 3 and make the 4th a Quadro specifically for the viewports alone (you can do that with V-Ray).
Shouldn't this be 1 Quadro K5000 vs 10 GTX 1080s, or thereabouts?
No, the difference between consumer and professional cards is pretty much gone these days. There are a few features the equivalent Quadros have over 1080s, but they are for very high-end movie-making purposes, CGI and so on; for normal asset creation and gaming the performance is generally identical. It's a bit different with CPUs: Xeons outclass consumer CPUs a lot in multitasking, are great for some gaming on the side as well, and tend to last much longer under load than consumer variants, which tend to cheap out with shoddy heat-transfer material, so nowadays you have to delid i7s, i9s and others to get proper direct cooling. Of course, keep in mind that Intel asks far more for Xeons, precisely because they are made to last (i.e., so you don't come back to buy another): their performance is long-lasting and will serve a user for years or even over a decade, and so on - that means a hefty price. Also, the people buying them can often just pay more, because they work for big studios and get their gear funded.
Would this work for Solidworks just the same? I'm comparing a GTX 1060 with a Quadro M2200 for a mobile workstation.
Andrei Fasola Did you find an answer to your question? I have the same problem as you. I can't decide between the Quadro M2000M and the GTX 1060M. If you can help me, I would be very happy.
I told my dad to go with the M2200. I thought: instead of dealing with aliasing and all those pesky issues, he'd better have the proper drivers and whatnot. Yeah, it seems it's all about money - same shit in cinema, photography and whatnot. Cheers.
canadacomputers.com/product_info.php?item_id=117596&cPath=570_710_1199
MSI WE72 7RJ-1080CA Workstation Notebook | 17.3" FHD Intel i7-7700HQ (2.80GHz) 32GB Memory 512GB SSD + 1TB HDD | NVIDIA Quadro M2200 GDDR5 4GB BT 4.2 Win 10 Pro
Please compare the AMD Radeon Pro WX 7100 or WX 9100 with the GTX 1080 or 1080 Ti.
A GeForce is the same as a Quadro, except there are limitations placed on GeForce cards; and though those limitations can be removed, it wouldn't be worth it.
Hi, Tech Guy. Did you test these cards in other 3D software like Mari, Substance Painter or ZBrush, or in game engines like Unreal Engine or CryEngine?
No, but I have the software, so if you have a scene for each of those that I could use (and you have the rights to lend the scene to me and have it shown online), then send it to me and I will test it out.
I currently don't have any, but I will make some in the future and send them to you.
Can you please do the same with a Maxwell-based Quadro GPU vs the GTX 1080???
Kepler is gimped WAY WAY more in rendering than in games... I mean, an overclocked GTX 780 Ti easily matches a GTX 1060 in games, but a professional GPU like the Quadro K5000 losing to a common gaming card like the GTX 1080 in rendering... wow, sad...
That was the last Quadro we purchased, and I think we will not consider Quadros in the future anymore. When I made the video, both had a similar price. Today even the 1080 is kind of an old card; I hope there is something even better soon
Help, I'm choosing between a GTX 1080 and a Quadro M4000. I need it for After Effects and Premiere, and I want to learn 3D. Which one is the best choice?
Mamados Vlog, same here. Please, someone who knows about this, respond :S
I decided on the 1080; it's better in all ways, and Pascal is now 100% compatible with the Adobe suite
dude, only the M4000 supports 10-bit color depth
The 1080 for sure.
Hi Tipo Sospechoso, maybe this image helps you a little ;) www.4ktvcomparison.com/wp-content/uploads/2016/09/10-bit-vs-8-Bit-Depth-of-color-1.jpg
While 8-bit gives you only 256 tones per channel per pixel, 10-bit gives you 1024 tones per channel per pixel, which means 16.78 million possible tones for 8-bit compared to 1.07 billion for 10-bit (for RGB you have 3 channels, etc.). Makes sense?
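The arithmetic checks out; a quick computation to verify:

```python
# Tones per channel and total RGB colors for a given bit depth.
for bits in (8, 10):
    per_channel = 2 ** bits
    total = per_channel ** 3  # three channels: R, G, B
    print(f"{bits}-bit: {per_channel} tones/channel, {total / 1e6:,.2f} million colors")
# 8-bit:  256 tones/channel, 16.78 million colors
# 10-bit: 1024 tones/channel, 1,073.74 million colors (~1.07 billion)
```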
Don't forget that in some cases there's more to 3D performance than just viewport rendering. There's also CUDA-enhanced path tracing and GPU-enhanced real-time rendering nowadays, which all benefit greatly from the number of CUDA cores in your card and the clock speed of those cores. In many cases, GTX gaming cards like the 1080, or now the RTX line, will have substantially more cores than a similarly priced Quadro. This means faster rendering in path-tracing render engines and faster overall performance in GPU-enhanced software renderers, like the Mercury engine that Adobe uses.
Did you seriously just compare Kepler to Pascal?
In that case I thought the old tech performed admirably.
There is a place for Quadros, and if you are a certain kind of niche professional you will know whether your use case requires one.
Michael Bridges Thank you so much for the explanation. I want to ask a question. I can't find any information about the Quadro M2000M on the internet. I can't choose between the MSI GS63 and the MSI WS60. I am an architecture student. Which one is more useful for my modeling work and rendering, the GTX 1060M or the Quadro M2000M? I hope I could explain myself. Sorry for my English.
Try actually interacting with high poly objects in Maya on a consumer card compared to a workstation card.
Everybody has their own opinion about these two video cards. I noticed you mention using a dual-Xeon 56-core CPU workstation, but is a Xeon CPU really good for gaming? Has anybody seen any RUclips video showing a Xeon CPU vs an i7 CPU in gaming?
Xeons are not for gaming; it is like using a 12-wheeler truck for weekly grocery shopping
A Xeon E5 v3/v4 has no problem gaming with an RX 580 or GTX 1060 plugged in.
I need help. I bought a 1080 Ti for animation in Maya, but the viewport is suuuper slow. I don't know how to get performance like in this video. I have 32 GB of RAM, an M.2 SSD and a Threadripper. I tried everything: changing drivers, settings in the NVIDIA panel, but it does not work. Cheers.
Find out how to switch to Viewport 2.0 instead of Legacy. It should be the default; I did not change anything, and the head has 12M polys if I remember correctly. Check the card with synthetic benchmarks and see if the results match other users with the same card.
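If you'd rather script it than dig through menus, a minimal sketch using Maya's Python commands (run in the Script Editor; assumes a standard Maya session) could look like this:

```python
# Minimal sketch: switch every model panel to Viewport 2.0.
import maya.cmds as cmds

for panel in cmds.getPanel(type='modelPanel') or []:
    # 'vp2Renderer' is Maya's internal name for Viewport 2.0
    cmds.modelEditor(panel, edit=True, rendererName='vp2Renderer')
```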
I always use Viewport 2.0, and I have tested with many benchmarks (3DMark, Cinebench, V-Ray, Corona Render, and playing Battlefield 1) and I don't have a problem there. I don't know what to do. Cheers.
I would do a clean driver installation, try reinstalling Maya, and apply all Maya updates. If there is something I could just open and spin around, let me know. Try the forums as well. I am a Max user, so this Maya test was without any customization. There is one stupid thing to try: on Mac computers in our lab, if the monitor is connected with an HDMI cable, Maya has a 1 s delay for everything (and only on some Macs, even though all are exactly the same) and has other weird issues; connecting it with a different cable/adapter makes it work again.
Wow, I didn't know about the HDMI connector. I have 2 monitors on HDMI (on a PC, not a Mac). It could be that. Thanks man.
A Quadro K5000 is a GTX 770 (same architecture). You have to compare the GTX 1080 with a Quadro P5000. In CAD applications the Quadro P5K will destroy the 1080!
But the P5K costs 2-3x more? :D
I watched Linus talk about this, but it's not really what I want to know, because neither he nor any of his team used any kind of 3D software for the demonstration.
My conclusion: for mainstream media content creators, architecture 3D modelers, interior design, game design, etc., you don't really need a Quadro. If you want an extra buff, then SLI 2 or 3 1080s, if your software supports it.
My philosophy for hardware is to get the least expensive piece of equipment that makes your work comfortable. If the heaviest 3D project I have worked on has good viewport performance of 60 fps (which is the max my monitor supports anyway), then I probably do not need to upgrade. If I get a project where the performance drops to an uncomfortable level, then I will make sure to upgrade to specs that bring it back to a comfortable level. If I was only sculpting those heads and getting 240 fps on a 60 Hz monitor, that would just be a waste of money.
Do GTX 1080 cards not support 10-bit color?
That is not fair; you have to compare it with the P5000, from the same Pascal generation.
You need to select the OpenGL option in the Maya and 3ds Max viewports for a better test.
I have tried both. Maya performed the same, while 3ds Max performed a lot worse using OpenGL.
What happens if you compare let’s say a Quadro RTX 6000 to a rtx 2080ti?
I'll soon compare the RTX 5000 and the RTX 2080 Ti, as I have those, not the RTX 6000. So far I know that the RTX 5000 is much slower for GPU rendering in Octane and Redshift.
Tech Guy interesting!
What would be the purpose of the rtx 5000 if you get better performance from a 2080ti?
@@synthdude7664 I have to check viewport performance to know if there is a difference there, but probably not. ECC RAM on the GPU maybe, resulting in less crashing? 30-bit color output instead of 24-bit. Better components, quality control? At work we ordered 12 2080 Tis, and 3 were dead on arrival, and one died within minutes; big businesses don't want to deal with RMAs and be without a card for weeks until the replacement arrives.
Quadro = Professional Card
GeForce GTX : Gaming card
Get it?
Both share the same chip, manufactured by the same brand.
this is the exact kind of misinformation this video is trying to disprove.
Thank you for your video, it's very helpful for making a decision about which graphics card I will buy.
Are people dumb or what❗️
In this video he compares fps. Sure, the GTX is way better; it's a gaming card. What they focus on is speed. You need to test the accuracy of the image, not the speed, and you also need to run some tests for random RAM errors and check the difference.
Anyway, for me, I believe the 1060, 1070 and 1080 can be good alternatives, but for the 900-series cards, no and NO, don't take them if you want CAD or modeling; they are only for gaming.
so it's better to get a gaming card so you can both game on it and do designs!
Ah, hello? This is not actually a real test comparison... when was the Quadro K5000 released? What you should be doing is comparing the Quadro M5000 vs the GTX 1080, if you can afford it.
The K5000 is still being sold. Some people may want to consider it if their budget for a GPU is under $1000, and I want to show that it is behind the GTX in these programs. I will not compare the M5000 because we will not purchase it, especially since its FP64 performance is the same as the GTX's, so we would get no benefit compared to the GTX.
What about the AMD FirePro W9100, or any of the FirePro series? Can I get 10-bit from any of them?
Not a good comparison. Try doing a comparison with a CAD model that consists of 1000 components; that will crush even a GTX 1080 Ti. There's a reason why professionals use Quadros in their workstations.
I have not tried to compare any solid modeling software with these. Myself and many others use solid modeling for just small things, for which even laptop cards are fine, but there is a much bigger user base that uses Maya, Max, and C4D for art, design, architecture, and games, so this is aimed at that audience. If I worked with huge CAD models professionally, I would not get a GTX card. This was also 2 years ago, and considering that the 1080 Ti now costs $1200 when it should be under $700, and that the P4000 is $800, I would have a harder time deciding.
Which is best?
Kepler vs Pascal? Compare the P5000 vs the GTX 1080.
Hello! I have a Quadro 4000.
Is it better than a GTX 1070?
(I am an After Effects and Premiere Pro user.)
I'm confused: is it better to stick with the Quadro 4000 or upgrade to a GTX 1070? Thanks!
Try Siemens NX; a 1080 will get only 16 fps, while any Quadro will get at least 100.
Quadro cards are known for accurate simulations, not for higher fps, as far as I know.
GTX for gaming, photoshop, video editing.
Quadro for 3D modeling.
yeah
Comparing a PCIe 2.0 card to a PCIe 3.0 one? That's rich.
The comparison was based on price: at the time both cards cost the same, and many 3D artists on a tight budget were torn between those two. This was 5+ years ago; I didn't know anyone still watched this video. At work we sent all the K5000s off for recycling 2 years ago already.
Thanks for sharing, though I also use Blender a work so am still curious on that one.
Rendering the same scene, a GTX 1070 beats a Quadro P5000. Only just, but consistently.
Dude, please make more vids like this, if possible for SolidWorks.
Some of the big items on the list of why Quadro and FirePro should be used for CAD and similar work: the engineers who make them should get paid to bring you a piece of hardware and software that is more stable, more reliable, and made exactly for that use. The thing is, if you make money with it, you should support it more than some kid who barely uses it for gaming. However, I agree that NVIDIA and AMD should make that line more obvious and justify it in more ways than that one if the price is 2, 3, 4, 5, 6 or more times bigger. If the user supports that part of the pipeline so heavily but the rest does not, what's the logic in that? Plus, this 1080 vs K5000 comparison is misleading, as those are different graphics card architectures, two generations apart. The comparison should be 1080 vs P5000, and even then it should be made from many angles and situations for each piece of software separately. That's how a comparison is done and that's how you get to the right info, no matter what the question is. Anyway, thank you for the video, Tech Guy. I still generally choose workstation-grade hardware.
It is absolutely your right to waste perfectly good money
The K5000 is from the GTX 600-series era; you are comparing a 6th-gen Kepler with a 10th-gen card? I am selling a K5000 for 100 bucks, man...
The video is from 2016, when the GTX 1080 and the K5000 (used, eBay) were the same price. Today I would have compared it with the P4000, since they are a similar price, but that one still has fewer CUDA cores. At work we have completely moved away from Quadros, so I won't be making a video.
Tech Guy, thank you for the reply, very kind! Can you do the test again using the P4000?
I don't know anyone with a P4000, but if I get my hands on one I will test it. I am waiting at least a month, as the GTX 1180 might be released, so I'll have to see what is worth investing in.
So I am joining animation school for a bachelor's in animation; suggest me a GPU.
My CPU is a Ryzen 2600X.
I want to buy a laptop because I want to run design engineering applications like CATIA and Siemens NX. Do I need to go for a Quadro, or will a 1050 Ti work?? Please answer fast.
So... for rendering in Maya?
Which is better,
an NVIDIA GTX 1080 or a Quadro M4000?
Help!
Please
Rendering is a CPU task, and most of our render nodes don't even have graphics cards. See ruclips.net/video/SOH0hdYR36Y/видео.html for a CPU comparison.
A Quadro can render 24 hours a day, but a GTX can render only 97 hours and then the card will die.
Well yes, but you are comparing a last-gen GTX with a Quadro from 2012. And it wasn't even the top Quadro at the time. Anything will look slow in that test.
My Quadro K4200 hit 192 fps in Cinebench R15... do not compare with the old ones :)
The K5000 came out a long damn time ago; do an M6000 comparison and sell me on that.
But even today it is the same price as the K5000. The M6000 has the FP64 acceleration removed, which was the biggest reason to get a Quadro rather than a GTX, so we won't be getting one.
Thank you.
what about maya and animation
A Quadro takes 85 W and this GTX takes 200 W :), so those who work in CAD for many hours buy a Quadro :)...
Well, this 1080 consumes 180 W while the Quadro K5000 consumes 122 W, so for the performance difference it is worth it. My home power bill in NY is about $35, so I don't think the computer contributes much to the total bill.
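For anyone who wants to put rough numbers on the power argument, here is a back-of-the-envelope sketch; the daily load hours and the electricity price are assumptions, so plug in your own:

```python
# Back-of-the-envelope yearly cost of the power gap between the two cards.
watts_diff = 180 - 122      # GTX 1080 vs Quadro K5000 under load (W)
hours_per_day = 8           # assumed hours of full load per day
price_per_kwh = 0.20        # assumed electricity price in $/kWh

kwh_per_year = watts_diff * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * price_per_kwh:.0f}/year")
# ~169 kWh/year -> roughly $34/year under these assumptions
```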
We already passed the 60 fps mark, so I guess we just need to get past that 500 fps ego mark.
You are comparing cards two generations apart.
Dude, this was 7 years ago. The basis for comparison was the price: at that time they cost the same, so if you had money for one, you might have been considering one or the other for 3D work. Today Quadros are useless, especially if you are GPU rendering.
My GTX 1080 won't work in 3ds Max for some reason :/
Do you get an error? I have it working on all computers with a 1080. Make sure you get the official drivers from the NVIDIA website and switch to the Nitrous viewport mode in Max preferences.
NVM I got it working. I was trying to use Iray but Quicksilver renderer seemed to use my 1080 fine. Thanks for reply though
It's called a glitch...
With a Xeon, a Quadro and ECC memory we work efficiently... we can't waste time on this matter.
The GTX 1080 cannot do double-precision calculations (so forget the precision you need when working as a Class A modeler in applications like CATIA or Alias, for example) and also cannot output 10-bit colour, which means it cannot match the colour gamut you need when working professionally with, say, Photoshop or any colour grading and colour correction application.
Also, without that 10-bit output you cannot take advantage of those professional monitors that you definitely need.
Read the other comments, I think we already had this discussion. All cards can do double-precision calculations; Quadros used to have it accelerated, but not in the Pascal architecture. In Pascal, both Quadro and GTX have the same acceleration for double precision. As for no 10-bit output on GTX, that is also a thing of the past. We have 10-bit LG monitors, and that GTX 1080 is set to output 10-bit color in the NVIDIA Control Panel.
While GeForce cards can do 10-bit, it's only for the DirectX render path. For the 10-bit OpenGL render path, NVIDIA keeps that feature locked up in the Quadro drivers (both cards are capable of 10-bit, but 10-bit OpenGL and the extra VRAM are why professionals pay for Quadro). Photoshop and Premiere require a Quadro to even display 10-bit content. You cannot use GeForce cards to get 10-bit color in these specific professional apps. Setting 10-bit in the control panel on a GeForce only works for a few things; it's not enough to work professionally in 10-bit without a Quadro.
I do agree GeForce is faster and cheaper for 3D operations when you don't care about 10-bit.
Tech Guy
Not according to NVidia.
Unless I am missing something from their website and unless they are stupid.
This has been the case for many years and has not changed.
nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce-gpus
forums.adobe.com/thread/1701015
www.reddit.com/r/editors/comments/3dnwum/premiere_pro_cc_2015_lowend_quadro_for_color/ct7zxn9/
www.eizoglobal.com/support/compatibility/gpu/photoshopc_nvidia_amd/
forums.adobe.com/thread/2151899
Nvidia, if asked, would likely say "the hardware is capable of 10-bit", which is true, but as DirectX doesn't run on Mac OS X, Adobe has to use OpenGL for cross-platform compatibility. It is the GeForce driver, however, that does not permit 10-bit color on the OpenGL rendering path. If Nvidia allowed this, it would open up deep color to a lot more developers and consumers. But those Quadro cards are a cash cow as long as the professional market is willing to bear their cost for the few features that have been held back from GeForce. Nvidia Titans are in the same boat as GeForce.
Anyone with both a Quadro and a GeForce card can test 10-bit color display themselves fairly easily. Find a 10-bit image file with a smooth color gradient; you can often find them made specifically for the purpose of testing 10-bit color display. Open it in Photoshop on the Quadro card and it looks great with no banding/dithering; do the same on a GeForce/Titan and you will see banding/dithering.
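If you can't find a ready-made test file, a gradient like that is easy to generate; here is a hypothetical sketch (assumes numpy and opencv-python are installed; the file name and dimensions are arbitrary):

```python
# Write a 16-bit grayscale-ramp PNG. On a true 10-bit pipeline
# (Quadro + 10-bit display + Photoshop) it should look smooth;
# on an 8-bit path you will see visible banding steps.
import numpy as np
import cv2  # opencv-python writes uint16 PNGs natively

width, height = 1920, 256
ramp = np.linspace(0, 65535, width, dtype=np.uint16)  # full 16-bit range
gradient = np.tile(ramp, (height, 1))                 # repeat the ramp as rows
img = np.dstack([gradient] * 3)                       # gray RGB image
cv2.imwrite("gradient_16bit.png", img)
```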
What about the AMD FirePro W9100??
Where do you work?
1 000 000 000 + polygon fps test please )
Compare the GTX 1080 with a Pascal Quadro like the P5000... comparing a Kepler Quadro with a Pascal GTX is a nonsense test...
The basis for the test was price, as they had similar prices at the time. Most of us decide with "what is the best I can buy for this task with X amount of money?". This should help people with a certain budget decide if they want one or the other.
A comparison that makes no sense. Compare two cards of the same generation, like Titan X vs Quadro P6000 or P5000; those are on the same architecture. Here there are two generations between the two, lol, of course the 1080 is better, lmao. But now, a 1080 or 1080 Ti vs a P5000, that would be a match!
I would argue that this was an unfair battle. It should have been the Nvidia Quadro K5100M vs the GTX 1080M. And why would you compare a mini card with the full desktop card? That makes this all the more a non-genuine comparison. You gave me too much ammo to discredit you. The point is being objective and getting the better card for your buck, not fanboyism.
Neither card is "mini" and neither is an M variant; both are exactly the same full-size cards. The price was also almost the same for a second-hand K5000 and a new GTX 1080, and that was the basis for comparison, as someone with that money may be looking to get a Quadro or a GTX.
K4000 vs GTX 1060
Very subjective. The author decided that professionals are people who work in Maya and 3ds Max. This is not true. The pros are architects and industrial designers. Their main tools are ArchiCAD and AutoCAD, as well as Blender. Almost all of the professionals work in Linux and iOS. Programs under Windows are generally difficult to call professional.
The above-mentioned programs use a lot of shader processing threads, triangles, and other vector math, and handle geometry more precisely. Quadros are designed exactly for these processes and for the way general computing is split between the CPU and the GPU.
When rendering very complex scenes in Blender, your Titan X or GTX 980 will perform calculations for too long. The only alternative to long waits for the render to finish is a Quadro card. Believe me, the time difference is several-fold.
I also use ArchiCAD and AutoCAD, and work with studios that use both, as well as other programs. I don't know anyone who uses Blender, though, even after more than 13 years in the 3D industry; it is a hobby program. The GPU has nothing to do with rendering: only a few rendering engines use the GPU, and neither production houses nor even small studios use GPU rendering yet.
I'm curious if that is going to change with some of the rendering engines like Octane and Vray which support GPU rendering. Considering the real time rendering capability of some of this software, it seems like it would make sense to tackle it from this direction depending on the aesthetic you are looking for.
Really
This is not a fair comparison!!! The GTX 1080 should be compared with a Quadro P5000 or P4000, not a K5000. This is really stupid of the publisher.
The basis for comparison here is the similar price (hence "basis": we see how the other parameters compare given one common point). There are people considering getting this K5000 instead of the GTX 1080 for 3D.
Pro cards are more reliable and have 10-bit color; time is money, and reliability is time. This is useless.
It's about quality and not speed.
Quality of what exactly? Do you see difference between viewport "quality" in either of those?
Tech Guy, sorry for the late reply. Well, some people say that every Quadro gets a very precise check of its components to make sure it survives long and is very durable.
Gotta say that consumer GTX cards are produced by the thousands a day, and nobody gets to check them all; NVIDIA has no money for that. But for business they made the Quadro quite expensive for its quality. That's what I've actually read somewhere, or saw in a video.
Quadros definitely have better quality control. GTX cards are made by their partners, while quadros are made by nvidia. But there is no quality difference while working. Most of the smaller businesses would not benefit from quadros. Quadros become performance obsolete long before hardware fails, but at that price point you could have replaced 2 or 3 gtx cards, always running the best and latest generation.
TITAN X PASCAL
GTX 1080 winner???
Man, don't compare Kepler to Pascal; this test is irrelevant. Nobody cares about second-hand market prices, they are overrated.
Who cares about viewport display? It's the rendering time that counts.
Most 3D software uses only CPU rendering, so the GPU does not matter (our render farm does not even have graphics cards).
Vray 3.6 now uses hybrid CPU - GPU rendering, or either. Octane uses GPU as does Arion, FurryBall RT, Redshift, Moskito, Cycles, and Indigo. It's helpful to have decent viewport playback for sure, but my focus is what can do in a small render farm or stand-alone environment. AMD's Vega Frontier is worth a look as well.
GTX is better.
This is an awful comparison; those are two different architectures. With one being 4 years older than the other, the newer card will obviously outperform it...