# Need a GPU for Lightroom 5.7 photo editing on a 27 incher monitor and 4 year old rig



## Philmar (Jan 15, 2017)

Impulsively, late at night and in a sleepy mindset, I went online and pulled the trigger on a BenQ SW2700PT. 
BenQ SW2700PT 27 Inch Adobe RGB Color Management Monitor for Photographers  | BenQ Canada
When I awoke this morning I realized that I'd neglected to consider whether my PC can handle it. My expertise is in pixels, not bits and bytes.

I don't have a graphics card - I use the onboard Intel® Z77 chipset graphics: Intel HD 4000 on my Ivy Bridge i7 3770K 
My mobo is the ASUS P8Z77-V PREMIUM:
https://www.asus.com/ca-en/Motherboards/P8Z77V_PREMIUM/...
From what I've subsequently read, the new monitor's increased resolution, and the heavier calculations that come with it, will tax my CPU. I'm wondering (hoping) that a dedicated GPU will help.

I do NOT game or edit video. I only do light Netflix viewing; the biggest demand on my system is RAW photo editing. Moving pixels around and displaying them on a high-res monitor involves a lot of CPU calculations.
Will a GPU help my pc performance?
My understanding is that Lightroom 5.7 does not make use of GPU acceleration. The newer LR 6.0 does, and this may force an upgrade for me.
I was hoping for a complete new rebuild in 2 or 3 years. I hope it isn't required as a result of the new monitor purchase.
Any recommendations on the best value (bang for buck) GPU that could help me (and ideally be used in my next build)?
I run W10 64 bit, 32 GB RAM, 1 SSD and 2 spin HDs and Antec 620 W PSU. 
Sincerest thanks for any helpful suggestions!


----------



## table1349 (Jan 15, 2017)

ZOTAC GeForce GTX 1080 AMP Extreme Graphics Card ZT-P10800B-10P


----------



## Philmar (Jan 15, 2017)

Thanks.
_*That *_is a value choice?


----------



## table1349 (Jan 15, 2017)

Philmar said:


> Thanks.
> _*That *_is a value choice?


Amazon.com: EVGA GeForce GTX TITAN Z 12GB GAMING, Fastest NVIDIA GPU Graphics Card 12G-P4-3990-KR: Computers & Accessories

"Value choice" is a relative term.


----------



## Philmar (Jan 15, 2017)

Considering your avatar is that of a hobo, I guess I was taken aback by that expensive suggestion.


----------



## Philmar (Jan 15, 2017)

I realize the following site is a commercial site devoted to selling you hardware, but their _advice_ is interesting.
Recommended System: Recommended Systems for Adobe Lightroom


_*Video Card (GPU)*_

_In Lightroom CC 2015 and Lightroom 6, the software is able to utilize the power of your GPU to improve performance when editing images in the Develop module. At the moment, the performance gains are fairly modest, although Adobe has been investing heavily in GPU acceleration. While a high-end GPU is not required to get the benefits of GPU acceleration in Lightroom, it may be a good idea to get a slightly faster GPU than you think you need to help future proof your system._


_Lightroom is also very light on VRAM requirements, so even a card with just 2GB of VRAM should be more than enough. However, if you work with large images in Photoshop or use a 4K monitor it is a good idea to use a card that has at least 4GB of VRAM if possible. Workstation video cards are not required for Lightroom, although if you will be using a 30-bit monitor you will need a NVIDIA Quadro video card as GeForce cards currently do not support 30-bit display output._


_Although it is likely that Adobe will increase GPU acceleration support in Lightroom in the future, the current demand on the video card is actually relatively light. We recommend either a GeForce GTX 1060 or a GeForce GTX 1070._
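The quote's claim that 2GB of VRAM is plenty is easy to sanity-check with arithmetic. A rough sketch in Python - the resolutions and the 4 bytes-per-pixel figure are my assumptions for illustration, not from the article - shows that a single uncompressed frame is tiny next to even 2GB:

```python
# Rough size of one uncompressed framebuffer - illustrative arithmetic only.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    """One frame's size in MB; 4 B/px covers 8-bit RGBA (10-bit packs similarly)."""
    return width * height * bytes_per_pixel / 1024 / 1024

qhd = framebuffer_mb(2560, 1440)   # the SW2700PT's native resolution
uhd = framebuffer_mb(3840, 2160)   # a 4K display, for comparison

print(f"2560x1440: {qhd:.1f} MB per frame")  # ~14.1 MB
print(f"3840x2160: {uhd:.1f} MB per frame")  # ~31.6 MB
```

Lightroom's working buffers use more than a single frame, but even several copies of a 14 MB frame leave most of a 2GB card free.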

Is my BenQ SW2700PT a 14 bit monitor?


----------



## table1349 (Jan 15, 2017)

Philmar said:


> Considering your avatar is that of a hobo I guess i was taken aback by that expensive suggestion.


Hobo????    HOBO?????  Seriously junior, do you lack that much knowledge????
The Official Red Skelton Page
Red Skelton - Wikipedia
http://www.biography.com/people/red-skelton-9485657
Red Skelton - Biography - IMDb





Hobo, my butt!!!!


----------



## Philmar (Jan 16, 2017)

*Freddie the Freeloader: *
*Freddie the Freeloader was the hobo who lived in the city dump. Freddie would also be seen sleeping on a park bench, often being told by a passing police officer to move along or risk being arrested for vagrancy. *
Red Skelton Characters, Pigeon Forge, Clem Kadiddlehopper, Freddie  Freeloader


----------



## weepete (Jan 16, 2017)

An Nvidia GTX 950 with 2GB VRAM would be more than adequate, and it's not too expensive either. You will need to be a bit careful and check that your current power supply can run everything in your rig plus a new gfx card, though there are plenty of online calculators to help with this.

I'd wait until you get the monitor though; for 2D graphics you may well find that the onboard gfx is enough. Lightroom is heavily CPU dependent and barely uses the GFX card.
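The power-supply check can also be roughed out without an online calculator. A back-of-envelope sketch in Python - the i7-3770K's 77 W TDP and the GTX 950's ~90 W board power are published specs, while every other wattage below is a ballpark guess of mine:

```python
# Back-of-envelope PSU headroom estimate - assumed ballpark wattages, not measurements.

def psu_headroom(psu_watts, loads, margin=0.8):
    """Spare watts after loading the PSU to `margin` of its rating."""
    usable = psu_watts * margin  # keep ~20% in reserve for aging and efficiency
    return usable - sum(loads.values())

rig = {
    "cpu_i7_3770k": 77,   # published TDP
    "motherboard": 50,    # guess
    "ram_32gb": 15,       # guess
    "drives": 25,         # 1 SSD + 2 spinning disks, guess
    "fans_misc": 20,      # guess
    "gtx_950": 90,        # published board power
}

spare = psu_headroom(620, rig)  # the Antec 620 W PSU from the thread
print(f"Spare capacity: ~{spare:.0f} W")  # comfortably positive
```

With these assumptions the 620 W Antec has well over 100 W to spare, which matches weepete's "more than adequate" call, but an online calculator with your exact parts is still the safer check.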


----------



## astroNikon (Jan 16, 2017)

No one knows what your definition of a "value choice" is unless you provide an actual budget dollar amount.

For instance, a value-choice car could be a luxury car without all the options. To someone else it's a non-luxury car with all the options, or maybe a non-luxury car with no options.  Who knows unless you tell us ...

Some people here have $50,000 cameras, others $300.  So depending upon who you ask you'll get different answers on "value".


----------



## KmH (Jan 16, 2017)

Philmar said:


> Is my BenQ SW2700PT a 14 bit monitor?


 No.

The specs for that display say:
*Display Colors: 1.07 B*
You have a 10-bit display.

10 bits can code 1024 colors in each color channel.
There are 3 color channels - red, green, and blue.
1024 x 1024 x 1024 = 1.07 B

A 14-bit display would be *trillions* of colors, with 16,384 colors per color channel. (4.4 T)
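KmH's arithmetic can be verified directly. A quick Python sketch of levels-per-channel and total colors for a given bit depth:

```python
# Colors for an N-bit-per-channel RGB display.

def display_colors(bits_per_channel, channels=3):
    """Return (levels per channel, total displayable colors)."""
    levels = 2 ** bits_per_channel
    return levels, levels ** channels

for bits in (8, 10, 14):
    levels, total = display_colors(bits)
    print(f"{bits}-bit: {levels:,} levels/channel, {total:,} colors")
# 10-bit gives 1,024 levels and 1,073,741,824 (~1.07 B) colors;
# 14-bit gives 16,384 levels and ~4.4 trillion colors.
```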


----------



## table1349 (Jan 16, 2017)

Philmar said:


> *Freddie the Freeloader: *
> *Freddie the Freeloader was the hobo who lived in the city dump. Freddie would also be seen sleeping on a park bench, often being told by a passing police officer to move along or risk being arrested for vagrancy. *
> Red Skelton Characters, Pigeon Forge, Clem Kadiddlehopper, Freddie  Freeloader


No, Freddie the Freeloader was a character created by RED SKELTON.  Psssst, Santa Claus doesn't exist either.


----------



## Philmar (Jan 16, 2017)

Thanks Keith for your helpful post


----------



## Philmar (Jan 16, 2017)

gryphonslair99 said:


> Philmar said:
> 
> 
> > *Freddie the Freeloader: *
> ...


And the character is a hobo...


----------



## Philmar (Jan 16, 2017)

My mobo is the ASUS P8Z77-V PREMIUM, which has one DisplayPort and one Thunderbolt port. My BenQ has a DP port.
Curiously, the monitor does NOT ship with a DP-to-DP cable. However, it DOES ship with a Mini DP-to-DP cable. Someone told me I could run the monitor with the Mini DP end of the supplied cable plugged into the mobo's Thunderbolt port and the DP end plugged into the monitor. Is Mini DP compatible with Thunderbolt? Even if the supplied cable does work, would my monitor run better with a DP-to-DP cable?


----------



## Philmar (Jan 18, 2017)

Received the monitor today and setting up was a breeze. Base, arm and screen were easy to assemble.

Had absolutely NO problems running it on my 4 year old processor's HD4000 graphics - even have the NEC 20WMGX2 as a 2nd monitor with both running at their native resolutions.

The BenQ is at 2560x1440 - W10 drivers were installed, though not as easily as the manual said they would be.

Good news: it is connected using the supplied Mini DP-to-DP cable: the Mini DP plug in my Intel mobo's Thunderbolt port and the DP plug into the monitor. Never thought I'd ever use that port, but I'm so glad I didn't have to go out and waste an hour buying another cable. Still don't know why they included it and not a DP-to-DP cable.

I am happy to say that I see almost NO performance degradation with LR 5.7.

Small difference: scrolling in Grid view while in Library mode is a bit choppy.

Small difference: a bit more time is required to cycle between regular and zoom in Loupe view.

Performance differences are negligible - to the point I regret the considerable time I wasted researching it. Guess my i7 3770K and mobo were good choices 3 years ago...and happy I won't need a GPU!! And luckily I didn't buy that 'value' GPU.

Though it is early days thus far...I'll report any issues.


----------



## jcdeboever (Jan 18, 2017)

You have excellent hardware; I built many boxes with that board. You would notice a small improvement with a dedicated 128-bit Nvidia Quadro workstation graphics card. The drivers are fantastic and optimized for LR and other graphics applications. They are not cheap though; you are paying for stability and reliability. If you are satisfied with current performance, that's what is important. The Intel GPU exceeds the minimum requirements for LR.


----------



## Philmar (Feb 1, 2017)

Thanks jcdeboever,
Turns out I was wrong...my mobo is the Asus P8Z77-V PRO/THUNDERBOLT ATX LGA1155 Motherboard.

Did a system upgrade. The monitor is playing very well with LR now. When I first used it, it was slowing down when I used the crop tool - this forces LR to change the zoom to fit the screen. As a result, I added a second SSD, as the LR lrdata previews file (> 100 GB) had almost filled my SSD C boot drive. I also swapped out a spinning hard drive that was dying with a new one. The problems disappeared completely.

I'm not a techie, but I suspect it had to do with the Windows page file being on the 'soon to be toast' data disk drive, as well as performance being degraded by my boot drive being near capacity. I've moved the LR catalogue and all the lrdata/lrcat files onto a new secondary SSD.
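A near-capacity boot drive is easy to watch for before it hurts performance. A minimal sketch in Python - the path and the 10% threshold are placeholders I chose, not anything from the thread:

```python
import shutil

# Flag a drive whose free space has dropped below a fraction of capacity.
# The warn threshold and the path below are illustrative placeholders.

def nearly_full(free_bytes, total_bytes, warn_fraction=0.10):
    """True if free space is below warn_fraction of the drive's capacity."""
    return free_bytes / total_bytes < warn_fraction

usage = shutil.disk_usage("/")  # on Windows, e.g. shutil.disk_usage("C:\\")
if nearly_full(usage.free, usage.total):
    print("Drive low on space - consider moving the Lightroom previews off it")
```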

One or both of the actions above has resulted in a performance boost. My i7 3770K and Intel onboard HD 4000 graphics are now running LR on the BenQ with no perceptible difference compared to the smaller 20 inch monitor I used previously - and have hooked up as a second monitor. Also, I calibrated the BenQ with the BenQ software, which writes the profile into the monitor's LUT (just reporting what I read - can't say I understand what that means). Someone suggested that this could also take some of the burden off the CPU.

The only difference I see now is when I scroll quickly between files in Library mode. They take longer to come into sharp focus than they did previously.

I did all these changes at the same time, so I can't say with any certainty which of the 3 things contributed most to my performance boost.


----------

