What GPUs

Just wondering which GPUs people are using for photo editing, mainly things like de-noise.
I'm using an Nvidia GTX 1660 Super 6GB, which for most things is absolutely fine. De-noise is where I find it frustrating: it's taking about 50 seconds per image, which on the face of it doesn't sound too bad, but sitting there watching the screen, image after image, does become rather monotonous.
I'm thinking of upgrading to an RTX 3060 12GB, which on the face of it should be a good budget improvement. But I can't find any information other than gaming reviews and specs, so I thought I'd ask here what others are using. I don't need lightning fast, just a good improvement.
 
It might help to say which app you're using when denoising - a better chance of users of the same app chiming in.
Most commercial programs use AI for denoising, so perhaps looking at AI inference and data generation tests might be of some value.
That said, generic AI benchmarks might not translate particularly well to how a specific denoising engine performs.

There are also a few "content creation" reviews floating around, such as: https://www.pugetsystems.com/labs/articles/nvidia-geforce-rtx-5070-content-creation-review/

You can also check out product-specific forums, such as this for DxO:

I have recently built a new computer with an Nvidia RTX 5070, but haven't had a chance to transfer everything to it just yet, and my old computer was from 2014 (with an Nvidia GTX 850M), so that would be more of a leap than in your case (once I move the remaining things across).
It might be a while yet before I get to it, but on the off-chance I manage it before you get a satisfactory answer, I can try to post an update here.
 
It might help to say which app you're using when denoising - a better chance of users of the same app chiming in.
Most commercial programs use AI for denoising, so perhaps looking at AI inference and data generation tests might be of some value.
Yes, I should have said: I'm using Lightroom de-noise. My understanding is Lightroom uses the GPU for display, export and de-noise.
 
12GB Nvidia GeForce RTX 4070

Bought as part of a PC Specialist desktop nearly two years ago.

Anthony
 
RTX 3060 4GB

Denoising a 21MB image takes just under 9 seconds in Lr.
 
I have the same card, and Topaz Denoise AI takes around 17s for a full-size RAW file shot at ISO 12800. It takes close to 1 minute using the Intel CPU's integrated graphics.
 
Tested a few 50 and 24MPix images with LR de-noise on a Mac Studio M2 Max. 50MPix takes on average 17 seconds per image, 24MPix 8 seconds. LR is using almost all available GPU resources; in this case, 12 cores.
 
I just did a quick test on a D810 RAW, so a fairly large file. The Nvidia 5070 Founders Edition took 6 seconds. It hardly touched the GPU processor, but it did chew up about 80%+ of its 12GB VRAM, so worth taking into account what Long Lens has said about that.
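(If anyone wants to watch that themselves, here's a minimal sketch that polls nvidia-smi once a second while a denoise runs. It assumes an Nvidia card with nvidia-smi on the PATH; the query fields are standard nvidia-smi names.)

```python
import subprocess
import time

# Poll GPU memory and utilisation once a second while a denoise job runs.
# memory.used / memory.total / utilization.gpu are standard nvidia-smi
# query fields; this only works on Nvidia cards.
QUERY = "memory.used,memory.total,utilization.gpu"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "9850 MiB, 12288 MiB, 34 %"
    time.sleep(1)
```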
 
I just did a quick test on a D810 RAW, so a fairly large file. The Nvidia 5070 Founders Edition took 6 seconds. It hardly touched the GPU processor, but it did chew up about 80%+ of its 12GB VRAM, so worth taking into account what Long Lens has said about that.
It definitely perplexes me why the 5070 is only 12GB. The cheaper 5060 Ti is 16GB, and then the next and much, much more expensive step up is the 5070 Ti. Oh well, it just makes it easier to go with the cheaper one.
 
It definitely perplexes me why the 5070 is only 12GB. The cheaper 5060 Ti is 16GB, and then the next and much, much more expensive step up is the 5070 Ti. Oh well, it just makes it easier to go with the cheaper one.

Must be some method to their madness.

There is more than just size to consider, though. I mind from when I had the 3060 Ti 8GB compared to the regular 3060 with 12GB: the Ti's VRAM outperformed it due to the 256-bit memory bus width and the 448.0 GB/s memory bandwidth (compared to 192-bit and 360.0 GB/s).

Incidentally, it's similar with the 5070 12GB compared to the 5060 Ti 16GB, with the 5070 being 192-bit and 672.0 GB/s, compared to the 5060 Ti's 128-bit and 448.0 GB/s. I think the 5070 also has a larger L2 cache. How much all of this impacts real-world stuff will, I reckon, depend on the application.
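Those bandwidth figures follow directly from bus width times the per-pin data rate. A quick back-of-envelope (the Gbps figures are the published GDDR6/GDDR7 data rates for each card, so treat them as assumptions if a vendor clocks the memory differently):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
cards = {
    #                bus bits, Gbps per pin
    "RTX 3060 12GB": (192, 15),  # GDDR6
    "RTX 3060 Ti":   (256, 14),  # GDDR6
    "RTX 5060 Ti":   (128, 28),  # GDDR7
    "RTX 5070":      (192, 28),  # GDDR7
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bus // 8 * rate} GB/s")
# -> 360, 448, 448 and 672 GB/s, matching the numbers above.
```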
 
Must be some method to their madness.

There is more than just size to consider, though. I mind from when I had the 3060 Ti 8GB compared to the regular 3060 with 12GB: the Ti's VRAM outperformed it due to the 256-bit memory bus width and the 448.0 GB/s memory bandwidth (compared to 192-bit and 360.0 GB/s).

Incidentally, it's similar with the 5070 12GB compared to the 5060 Ti 16GB, with the 5070 being 192-bit and 672.0 GB/s, compared to the 5060 Ti's 128-bit and 448.0 GB/s. I think the 5070 also has a larger L2 cache. How much all of this impacts real-world stuff will, I reckon, depend on the application.
This is correct. Unfortunately, it doesn't help when software starts introducing hard limits at 16GB, like Lightroom Classic just did with a new use for the GPU: https://helpx.adobe.com/lightroom-c...html?trackingid=4JW793VX&mv=in-product&mv2=cc
My 3060 Ti does not appear to even try doing this, despite the system reporting another 16GB of shared system video memory. I presume any integrated graphics is out by default; the only question is what happens on Apple devices...

Another one that might become quite useful is the open-weight ChatGPT model, gpt-oss: https://openai.com/index/introducing-gpt-oss/ 16GB for the base 20B variant, and 80GB for the 120B model.

I don't see any clear and affordable path to 80GB of VRAM, but 16GB is easily doable with the 5060 Ti, despite the slower bus or whatever it is. AMD has some options, but some publications are saying they are not the best for editing, only really for gaming.
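Those gpt-oss VRAM figures line up with a simple back-of-envelope: weights take roughly params × bits-per-weight / 8. A sketch, assuming the ~4-bit MXFP4 quantisation OpenAI describes (the 4.25 bits/weight including scale factors is my assumption) and the announced ~21B / ~117B parameter counts:

```python
# Rough VRAM for the model weights alone: params * bits_per_weight / 8 bytes.
def weights_gb(params_billions, bits_per_weight=4.25):  # ~MXFP4 incl. scales
    return params_billions * bits_per_weight / 8  # result in GB

for name, params in [("gpt-oss-20b", 21), ("gpt-oss-120b", 117)]:
    print(f"{name}: ~{weights_gb(params):.0f} GB of weights")
# gpt-oss-20b: ~11 GB  -> fits a 16GB card with room for activations/KV cache
# gpt-oss-120b: ~62 GB -> hence the ~80GB recommendation
```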
 
This is correct. Unfortunately, it doesn't help when software starts introducing hard limits at 16GB, like Lightroom Classic just did with a new use for the GPU: https://helpx.adobe.com/lightroom-c...html?trackingid=4JW793VX&mv=in-product&mv2=cc
My 3060 Ti does not appear to even try doing this, despite the system reporting another 16GB of shared system video memory. I presume any integrated graphics is out by default; the only question is what happens on Apple devices...

Another one that might become quite useful is the open-weight ChatGPT model, gpt-oss: https://openai.com/index/introducing-gpt-oss/ 16GB for the base 20B variant, and 80GB for the 120B model.

I don't see any clear and affordable path to 80GB of VRAM, but 16GB is easily doable with the 5060 Ti, despite the slower bus or whatever it is. AMD has some options, but some publications are saying they are not the best for editing, only really for gaming.
I'm usually using C1. When I tested with LR on the M2 Mac Studio (64GB, 12-core CPU / 38-core GPU; I made a mistake earlier in the thread), it made good use of the GPU cores, over 90%.
 
This is correct. Unfortunately, it doesn't help when software starts introducing hard limits at 16GB, like Lightroom Classic just did with a new use for the GPU: https://helpx.adobe.com/lightroom-c...html?trackingid=4JW793VX&mv=in-product&mv2=cc
My 3060 Ti does not appear to even try doing this, despite the system reporting another 16GB of shared system video memory. I presume any integrated graphics is out by default; the only question is what happens on Apple devices...

Another one that might become quite useful is the open-weight ChatGPT model, gpt-oss: https://openai.com/index/introducing-gpt-oss/ 16GB for the base 20B variant, and 80GB for the 120B model.

I don't see any clear and affordable path to 80GB of VRAM, but 16GB is easily doable with the 5060 Ti, despite the slower bus or whatever it is. AMD has some options, but some publications are saying they are not the best for editing, only really for gaming.

Hard limits seem a bit crazy when larger bus widths etc. can compensate for less VRAM. I guess it's all part of tech advancement.

I've never been overly keen on AMD GPUs; probably just bad luck, but any time I've used one they've caused me problems.
 
Hard limits seem a bit crazy when larger bus widths etc. can compensate for less VRAM. I guess it's all part of tech advancement.

I've never been overly keen on AMD GPUs; probably just bad luck, but any time I've used one they've caused me problems.
By the way, do you notice much of an improvement with the 5070 vs the 3060 Ti?
 
By the way, do you notice much of an improvement with the 5070 vs the 3060 Ti?

With gaming and VR it's a huge difference; everything on maximum settings now, running both 2K and 4K depending on the game. I do undervolt. With LR/PS the 3060 Ti FE was no slouch, so I wouldn't say it's as noticeable, although a recent huge pano stitch that my laptop couldn't handle was done with ease on the big rig. It seemed one of the best value-for-performance cards (the FE version), certainly at the time.
 
I have a Gainward GeForce RTX 3060 Ghost LHR 12GB graphics card in my desktop PC, which works pretty well with LR Classic. When denoising, I edit in Topaz Photo AI, where it takes just a handful of seconds per image. Of course, you have to add on the time it takes for LRc to create a .tiff file, hand it off to Topaz, and for Topaz to return it to LRc, but if you do batch processing it's not so bad.
 
I mostly don't have any LRC issues with my 3060 Ti, except one which I suspect may be a bug: if I'm quickly going back and forth between two or three images to compare composition or level up the exposures, at the second or third go it no longer switches to the next image. Going back and forth again refreshes the view. It used to be OK a while ago. And clearly I won't benefit from the new preview generation function. 32GB of system RAM is also a bit of a bottleneck. It is DDR4, so I need to weigh upgrading that against replacing the whole CPU/motherboard combo (13700K) with the very latest, whatever that is.

The real big upgrade I suspect may be in DaVinci Resolve. I haven't used that much lately, but things are about to change with my plans to spend a bit of effort on developing a YouTube channel (or channels).
 
I mostly don't have any LRC issues with my 3060 Ti, except one which I suspect may be a bug: if I'm quickly going back and forth between two or three images to compare composition or level up the exposures, at the second or third go it no longer switches to the next image. Going back and forth again refreshes the view. It used to be OK a while ago. And clearly I won't benefit from the new preview generation function. 32GB of system RAM is also a bit of a bottleneck. It is DDR4, so I need to weigh upgrading that against replacing the whole CPU/motherboard combo (13700K) with the very latest, whatever that is.

The real big upgrade I suspect may be in DaVinci Resolve. I haven't used that much lately, but things are about to change with my plans to spend a bit of effort on developing a YouTube channel (or channels).
AM5 has just been confirmed as getting two more Ryzen generations, so that could be a good move.
 
Always worth checking the memory speeds that a CPU supports as well.
 
Recent rumors have it that there will be 5070 Super and 5070 Ti Super variants, increasing VRAM to 18GB and 24GB respectively, at the same price as the older variants. This is actually very, very interesting and worth waiting for if it turns out to be true. Nothing was mentioned of 5060 variants, so they might stay unchanged, or perhaps receive a minor price cut in line with competition updates. I am not in any way noticeably affected by my aging 3060 Ti, so I can easily afford to wait this long. Rather, it may be time to look back at the monitors instead.
 
Recent rumors have it that there will be 5070 Super and 5070 Ti Super variants, increasing VRAM to 18GB and 24GB respectively, at the same price as the older variants. This is actually very, very interesting and worth waiting for if it turns out to be true. Nothing was mentioned of 5060 variants, so they might stay unchanged, or perhaps receive a minor price cut in line with competition updates.

Gee, thanks... :D


I am not in any way noticeably affected by my aging 3060 Ti, so I can easily afford to wait this long. Rather, it may be time to look back at the monitors instead.

Yes, a nice ultra wide curved screen would look lovely on your table :p
 
Yes, a nice ultra wide curved screen would look lovely on your table :p
Yes, it would, and it would be entirely useless at the same time :p
 
First-hand experience from people is definitely best, but I've also found AI really useful for these kinds of questions, either to help filter down my choices or to see if there are additional things to consider. I typed the original query in and this is what it said:

---
This is a very common pain point right now. Modern "AI Denoise" tools (like those in Lightroom Classic, Adobe Camera Raw, and Topaz Photo AI) lean heavily on dedicated AI hardware; on Nvidia cards that means **Tensor Cores**, which are found only on RTX models.

Since you are currently on a GTX 1660 Super, you are missing these dedicated cores entirely, which is why your render times are so high. Upgrading to an RTX card is exactly what you need.

### The Verdict: RTX 3060 12GB vs. Alternatives
The **RTX 3060 12GB** is a solid choice, but there is a "trap" to avoid and a newer alternative to consider.

| GPU | Avg. Denoise Time (approx.) | VRAM | Why it matters |
| :--- | :--- | :--- | :--- |
| **GTX 1660 Super** | **50 - 60s** | 6GB | No AI cores; relies on raw compute. |
| **RTX 3060 12GB** | **10 - 15s** | 12GB | Huge VRAM pool; great for batch processing large files. |
| **RTX 4060 8GB** | **8 - 12s** | 8GB | Faster AI cores, but less memory for huge batches. |
| **RTX 4060 Ti 16GB** | **7 - 10s** | 16GB | The "gold standard" budget pro choice. |

---

### My Recommendation
**1. If you buy the RTX 3060:** Make sure it is the **12GB version**. There is an 8GB version of the 3060 that is significantly slower. The 12GB of VRAM is actually quite helpful in photo editing if you use a high-resolution monitor (4K) or if you tend to keep Photoshop, Lightroom, and a browser open at the same time. It prevents the "stuttering" you get when the GPU runs out of memory.

**2. The "Modern" Alternative (RTX 4060):** You can often find the **RTX 4060** for a similar price. While it only has 8GB of VRAM, its AI cores are a generation newer and more efficient. For a single image, it will be faster than the 3060. However, if you work with very high-megapixel files (45MP+), the 12GB on the 3060 is "safer" for long-term use.

### Why Denoise is so slow for you now
* **The "GTX" Limit:** Your 1660 is doing the math the "hard way." An RTX card uses its Tensor cores to "guess" the noise pattern using machine learning, which is exponentially faster.
* **VRAM Bottleneck:** 6GB is right on the edge for Adobe's AI tools. When you hit "Denoise," the app has to swap data in and out of your system RAM, which is much slower than GPU memory.

**Summary:** Moving from a 1660 Super to a 3060 12GB will feel like a massive upgrade. You will go from "waiting for a cup of coffee" to "waiting for a deep breath" per image.

**Would you like me to check if your current Power Supply (PSU) can handle the RTX 3060, or would you like to see a comparison for Topaz-specific benchmarks?**
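For what it's worth, you can see the Tensor-core effect yourself with a quick PyTorch test: FP16 matrix multiplies route through Tensor cores on RTX hardware, while a GTX card has no such path. A minimal sketch (assumes a CUDA build of PyTorch is installed; the matmul is a stand-in for denoise-style workloads, not Adobe's actual pipeline):

```python
import time
import torch  # requires a CUDA-enabled PyTorch install

def bench(dtype, n=4096, iters=20):
    # Time square matrix multiplies on the GPU at the given precision.
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()          # make sure setup is finished
    t0 = time.perf_counter()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()          # wait for the GPU to finish
    return (time.perf_counter() - t0) / iters

print(f"FP32: {bench(torch.float32) * 1e3:.1f} ms/matmul")
print(f"FP16: {bench(torch.float16) * 1e3:.1f} ms/matmul")  # Tensor cores on RTX
```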
 