Performance Testing - How does LR4 utilise multiple cores


Bifurcator
Registered: Oct 22, 2008
Total Posts: 9300
Country: Japan

Bifurcator wrote:
A couple of things here.

knower,
Adobe says that LR4 is indeed multi-core and not just multi-threaded. So where is your information coming from?
Nuke is a video and FX editor/compositor, not a photo editor.
OpenGL is a 3D display language, not really suited to 2D image processing at all. There's CUDA and the like, which are different and could be used for 2D image processing and rendering.
Older is usually better in the world of apps. It spells maturity. If the PS GUI and workflow don't suit you personally, that's a different issue. I'm an example of the opposite case.

Anyone,
It seems to me that LR is only a kludge of retrofitted PS code - if I can get away with using such ridiculous layman's terminology. It's a slow dog even compared to PS, which is actually much faster and of course orders of magnitude more capable and diverse. Between Bridge and ACR you have 100% of the librarian, management, and display functionality of LR4, at about 4 to 6 times the speed. If you include a few functions from PS itself then you have 100% of all functions in all areas - and 4 to 6 times faster. Few people know this because of the way the GUI is laid out, the location of the various menu items, and the metaphorical terminology used to convince us we're doing something different or have something more in LR. Great for Adobe sales, but kinda silly to anyone who has taken the time to actually look at all the functionality in Bridge and ACR. And speaking at a lower level it's identical as well. The same demosaicing routines are used, the same color models, and the same (identical) core routines are present in LR for every one existing in Bridge, ACR, or Photoshop - with a few exceptions, because PS gets the newer improved code before it hits LR 6 to 8 months later.

Here's a speed test for those not convinced of this. Load 200 RAW images into ACR, select all, make some edits you think will be CPU intensive, click Done. Now import those into LR and try browsing them. All the ACR edits render perfectly, but it takes 2 to 3 seconds per image to "load" and render, making everything feel very sluggish and making it nearly impossible, or at least VERY uncomfortable, to navigate and compare images. At 2 to 3 seconds each, that's 7 to 10 minutes of waiting just to flip through 200 images once. Now load up Bridge and point it to the folder containing those same 200 RAW images. Everything is nearly instant, and yet still all of the ACR edits render just like in ACR itself or in LR.

Why? Beats me! If I can suggest a reason, I think it's because Adobe actually employs chimpanzees instead of intelligent and accomplished application programmers. And I'm actually serious here - not kidding. I guess it's actually a management decision based on budget constraints and profit profiles. We went through some of that at NewTek for a while too, though that was taken care of some years ago now. Basically Adobe employed a method (read: hackish kludge) of bringing together existing components into a GUI which streamlines image processing workflow to match that of several competing products, appear unique and different from ACR/Br/PS, and "create" an additional product.

So why do people use it?
a) They were told it's "the best" by someone with pretty images posted on-line.
b) They're stuck in a rut - it's all they know and they don't wanna learn something different.
c) They didn't do any homework and believed the marketing hype - related to a).
d) They got it cheap or are using their dad's computer and can't afford something else.
e) They only process a few images a day anyway so who cares - they like the handholding GUI.

By now you're all thinking that I'm just bashing this app. But I'm really not. It's the actual state of affairs we find ourselves in with this product. Of the "popular" editors it's the very slowest there is. And exactly because of that slowness it's the most cumbersome and "unusable" to photographers who have any kind of schedule to keep with any sizable workload. So what are we doing here profiling LR? Trying to determine WHY it's slow? Unless Adobe wants to start cutting me a paycheck I don't really see any value in determining why. It just is, and there's nothing we can do about it.

MP/MC Advantage,
Not only are almost all other editors faster, but what happens in the future when/if Adobe gets it together too? Will you then go out and buy a machine with more processors to take advantage of the added speed? That seems like a waste of time and probably money too. It's also the opposite of how trained system engineers profile. You select the hardware for the kinds of tasks you expect to have to accomplish, and then you select the most suitable software for it - based on budget, company ecological and political standing, EULA, function, performance, and so forth.

Image rendering (not so much editing), and video editing and rendering (what most photogs are concerned with in these modern times), greatly benefit from adding more cores. It scales almost linearly with each added core. The cost per CPU cycle also scales favorably for the typical professional - or can be made to! So if the initial price can be justified, adding more cores to a system's spec is a great way to improve productivity - and perhaps even so for LR in the future too. To the casual photog shooting 1 to 20 images a day on average and almost no video, then for sure, Core i3 2-core boxes running LR are fine - they won't realize any of the advantages from higher-bandwidth systems or (probably) faster editors either. And this profiles across the board up to someone like Chase Jarvis, who shot twenty thousand plus images in a single weekend for a single sporting goods ad. Can you imagine just the sorting & selecting job alone with that many images?
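To put a rough number on "almost linearly", here's a quick back-of-the-envelope sketch in plain Python. The 95% parallel fraction is just an assumed figure for a batch render/export type job, not something I've measured:

    # Back-of-the-envelope Amdahl's law: speedup vs. core count for a job
    # that is mostly parallel (e.g. batch rendering/exporting images).
    # The parallel fraction is an assumption, not a measurement.

    PARALLEL_FRACTION = 0.95  # assumed share of the work that can use extra cores

    def speedup(cores, p=PARALLEL_FRACTION):
        """Amdahl's law: 1 / ((1 - p) + p / n)."""
        return 1.0 / ((1.0 - p) + p / cores)

    for cores in (1, 2, 4, 6, 8, 12):
        print(f"{cores:2d} cores -> {speedup(cores):4.1f}x")

As long as the serial part of the job stays small, scaling stays close to linear through the core counts most of us actually run.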

Here again, the system architecture is selected based on what you expect to be using it for and the most suitable software follows. For me LR doesn't fit into my profile anywhere at all - it's just too slow - and imo too restrictive as most of these kinds of apps are. It sounds to me from reading this thread that many of you have come to the same conclusion as well. So why fight it? Give it up and move to something that isn't currently a dog.




blob loblaw wrote:
I could not agree more!
I'm glad to see I'm not the only one who feels this way, because when you call out LR like that people don't seem to take you seriously - it sounds like you're just ranting and being unreasonable. Well, in my circles anyway.

Almost two years ago, I decided to try and find a replacement and tried as many trials as I could get on a PC: Bibble, Adobe Bridge+PS, DxO, CaptureOne, etc, etc.
I was just fed up with the way LR kept running. It took me a few weeks of frustration because of the learning curve, but I've been very happy with C1Pro since.

I'd been a user of LR since before the initial release of v1, back in the beta. With every new update, and every new version, I kept hoping they would put priority on performance. They would always mention 'speed improvements' but I could not see any. In fact it kept feeling slower.
It feels like their target demographic is non-professionals: the advanced amateur or non-technical user. I don't want to say an 'Apple/Mac' user, but that's the kind of user I envisioned. Someone who is willing to compromise performance for usability.
To me C1Pro definitely has a steeper learning curve. It requires knowledge of color theory and a host of other industry features, which makes it more of a craftsman's tool than an end-user tool, if that makes any sense, but the results are amazing.





I did essentially the same thing and came to the same conclusion. I downloaded the eval or free copies of 13 different editors and really concentrated on features, results, and speed. Heh, I even took notes and listed out any/all unique features. I also concluded that CaptureOne Pro was the superior tool. I've repeated the process a few times since, though with far less rigor, and my conclusions have remained the same as far as first place goes. If a person wants speed, accuracy, robust capability, and a good result in a process-guided GUI, then it's C1p. There was a free app which ranked extremely high the first time around too, BTW. Also, the system and service integration of some packages should be recognized for the convenience and time savings they provide. That can be a major decision point for some.

So I purchased and keep current with C1p, but I actually use PS far more often. The process-guided GUI type apps just aren't as good as open-environment apps like PS, IMO, where the speed and versatility of the OS can be employed to great advantage. To C1p's credit, at least it doesn't force its process on you as much as some others. DxO is probably the worst in that regard, though LR ranks fairly low in that department as well.

Anyway, to each their own (and most of it to me) - the a) through e) people have a place too. It's just not mine.



knower
Registered: Aug 13, 2012
Total Posts: 111
Country: Canada

First of all, Adobe also says that Photoshop is multi-core optimized, but that's not true. Only part of the code is. If you check the updates in CS6 you'll find that more filters are now, in fact, using multi-core, but still not all of them.
The same goes for Lightroom: not all of the code is multi-core.
Where do I get my info? Well, 13 years in movie VFX post-production and 10 years in the photography business should be enough. Some of the guys who write Photoshop, in fact, worked with us to optimize our internal custom builds of Photoshop, and through our beta-testing program we regularly suggest to Adobe what to add next.

Nuke is NOT a video editor. It is a compositing package, which does much the same job as Photoshop but is built to work on sequences of frames, is much more powerful, and is truly multi-core and multi-threaded. Photoshop is a toy compared to Nuke.
It doesn't have all the painting tools that Photoshop has, but it isn't supposed to. Anyway, it was an example of a much better software architecture; I'm not saying you have to use it for photo editing, even if that wouldn't be such a bad idea since the software has a completely linear workflow.

OpenGL, and now also OpenCL, are just platforms to use the GPU and make it work with the CPU. They can both be used for 2D AND 3D applications.
Photoshop uses OpenGL for the crappy 3D integration, but also for some of the viewport operations and pixel transformations. There are many more things that can be done to speed up the calculation of an image through OpenGL and OpenCL: pixel filtering, sampling, mip-mapping and a lot more.

OpenGL is not suitable for 2D applications at all? Really?
Maybe you should give GLSL (the OpenGL Shading Language) a read and check out how many things you can actually optimize through the GPU and OpenGL.
You can accelerate most of the common 2D operations with OpenGL, with the right libraries.
Even iOS and Android run with a binding on the OpenGL libraries to speed up GUI transformations.
Neat Image in its latest version uses GPU acceleration through custom OpenGL libraries plus multi-core, and you can see how much faster it is - a perfect example of an optimized architecture.
Compare its noise reduction speed with Photoshop or Lightroom and have a laugh.
I can't think of anything more 2D than resolving noise, and nothing faster than OpenGL to resolve it.
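If it helps to see what that per-pixel work actually looks like, here's a minimal 3x3 box-average sketch in plain Python/NumPy - the window size and image size are just illustrative, and no GPU libraries are assumed. A GLSL fragment shader or an OpenCL kernel does the same neighbourhood averaging, but runs it for every pixel in parallel on the GPU, which is where the speed comes from:

    import numpy as np

    # Minimal 3x3 box-average "noise reduction" on a grayscale image.
    # Each output pixel is the mean of its 3x3 neighbourhood - exactly the
    # kind of independent per-pixel work a GLSL/OpenCL kernel runs on the GPU.

    def box_blur_3x3(img):
        padded = np.pad(img, 1, mode="edge")      # replicate edge pixels
        out = np.zeros_like(img, dtype=np.float32)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out += padded[1 + dy : 1 + dy + img.shape[0],
                              1 + dx : 1 + dx + img.shape[1]]
        return out / 9.0

    noisy = np.random.rand(512, 512).astype(np.float32)  # stand-in test image
    smoothed = box_blur_3x3(noisy)

Real noise reduction is smarter than a plain box average, of course, but the parallelism is the same: every output pixel depends only on a small neighbourhood, which is exactly what GPUs are built for.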


CUDA doesn't have anything to do with the actual computation functions. It's a parallel computing platform, something different again, and its purpose is basically to access GPUs as if they were CPUs and make them work in parallel - treating multi-GPU as multi-processor.

In which computing world is older better? Certainly not the current one, where the amount of data going through the pipes and the complexity of the streams change so quickly.
Architectures change fast, and if they are not scalable enough you can't do much more to extend an application - hence you need a complete rewrite.
Photoshop is super old; it was born in the VFX industry when tools like Shake and Nuke were not available and you had to work on a single frame at a time.
I love to work with Photoshop and Lightroom, I also teach them, but that doesn't mean they are good examples of software architecture.
Obviously, if one has never used anything more advanced than them, like a node-based compositing package or a 3D package, they seem just fine.
Too bad they had to introduce crappy Smart Objects because they can't implement a nodal structure due to the old architecture.
The interface of both Lightroom and Photoshop is crap. It's basically whatever they designed, and there is no way to customize it the way one wants. But for what they are intended that's not a very big deal, unless you discover PyQt design.
Lightroom is slow, but it does the job. At least for me it can handle huge amounts of data, and I can develop quickly and have a fast workflow. Photoshop does the rest well enough not to bother with a rewrite for now, but a nodal Photoshop would kick ass.
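For anyone who hasn't used a node-based package, a "nodal" structure is basically a graph of operations you can re-wire and re-evaluate at any time. A toy sketch in plain Python - the node names and operations here are made up for illustration, nothing comes from Nuke's actual API:

    # Toy node graph: each node pulls from its inputs and applies an operation.
    # Re-wiring or changing a parameter just means re-evaluating the graph -
    # nothing upstream ever gets baked in, which is the point of a nodal workflow.

    class Node:
        def __init__(self, op, *inputs):
            self.op = op          # function applied to the evaluated inputs
            self.inputs = inputs  # upstream nodes (or plain constants)

        def evaluate(self):
            values = [i.evaluate() if isinstance(i, Node) else i for i in self.inputs]
            return self.op(*values)

    # A tiny "comp": load -> brighten -> blur, where every step stays editable.
    load     = Node(lambda: [0.2, 0.4, 0.6])                         # stand-in for reading an image
    brighten = Node(lambda px, gain: [v * gain for v in px], load, 1.5)
    blur     = Node(lambda px: [sum(px) / len(px)] * len(px), brighten)

    print(blur.evaluate())

Change the gain or swap a node and the whole chain re-evaluates from the originals, which is what layer-based editing with Smart Objects only approximates.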

Your point about using Photoshop, ACR and Bridge is valid. They really are faster.
But the whole thing fails because you have to save a PSD file and accept a destructive workflow in Photoshop, whereas with Lightroom you can potentially always go back and edit files without losing a bit. That's something you can't really do with Photoshop, especially if you consider switching color spaces. Also, working on RAW files right away is cheaper in terms of space than saving a TIFF or a PSD file.
Furthermore, Lightroom's interface is, imho, better for handling photography work: everything is in a single, seamless application, and these days you can do most of the work in Lightroom without even going to Photoshop for editing.

Do you work at NewTek? I used to demo LightWave 3D when I was 17, about 13 years ago! So much time has passed!

G.








15Bit
Registered: Jan 27, 2008
Total Posts: 3910
Country: Norway

Guys,

The thread is intended more as a technical discussion about how LR scales with processor count than as a place to slag it off for being slow, rant about the interface, etc. I think we all appreciate that it should be faster and that some folk don't like the interface, but it would be great if we could avoid drifting too far "off-topic". We can start another LR rant thread separately if you like, and I'll be as enthusiastic as anyone in my complaints.

Bifurcator, we have so far got testing on up to 6 cores here, and you have a 12-core Xeon? It would be great if you could test that out and report your experiences/CPU load plots.
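For anyone who wants to contribute numbers, something like this is enough to log per-core load while LR is chewing through an export or a batch of previews. It assumes Python with the third-party psutil package installed; the one-second interval and two-minute run are arbitrary:

    import csv, time
    import psutil  # third-party: pip install psutil

    # Log per-core CPU utilisation once a second while LR runs a job,
    # so the samples can be plotted afterwards.

    SAMPLES = 120       # ~2 minutes of logging; adjust to the length of the job
    INTERVAL = 1.0      # seconds between samples

    with open("lr_cpu_load.csv", "w", newline="") as f:
        writer = csv.writer(f)
        ncores = psutil.cpu_count(logical=True)
        writer.writerow(["time_s"] + [f"core{i}" for i in range(ncores)])
        start = time.time()
        for _ in range(SAMPLES):
            per_core = psutil.cpu_percent(interval=INTERVAL, percpu=True)
            writer.writerow([round(time.time() - start, 1)] + per_core)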



Bifurcator
Registered: Oct 22, 2008
Total Posts: 9300
Country: Japan

knower,
I guess we're not allowed to expand the conversation beyond a few minute points from the OP. Having a similar BG and career path to your own, plus developer experience, I'll say that I stand by what I said in all cases - semantics and marketing hype aside.


15Bit,
Why? I can't for the life of me understand why anyone would wanna know that. As an organizer, participant, or observer at the Olympics, I don't need to know how fast Forrest can run with his leg braces on. All I need to know is that it's too slow to consider. Until those braces get busted off there's just no point.



knower
Registered: Aug 13, 2012
Total Posts: 111
Country: Canada

15Bit, sorry - I didn't mean to drift the topic away; I agree with what you say.

Bifurcator, I am a developer, too, but I only write 3D and 2D plugins

G.



15Bit
Registered: Jan 27, 2008
Total Posts: 3910
Country: Norway

Bifurcator wrote:
knower,
I guess we're not allowed to expand the conversation beyond a few minute points from the OP. Having a similar BG and career path to your own, plus developer experience, I'll say that I stand by what I said in all cases - semantics and marketing hype aside.


15Bit,
Why? I can't for the life of me understand why anyone would wanna know that. As an organizer, participant, or observer at the Olympics, I don't need to know how fast Forrest can run with his leg braces on. All I need to know is that it's too slow to consider. Until those braces get busted off there's just no point.


Please feel free to expand the conversation (especially the threading discussion, which is very interesting), but we've plenty of LR bitching threads elsewhere - I was just (perhaps naively) hoping to do something more constructive here.

As for the performance, you are right - LR does seem to be trying to run with its hobbles on, and with the kind of processing power many of us have, it really shouldn't. I'm as curious as everyone else as to why, hence the testing. In most other respects LR is an excellent package with almost all the functionality many photographers want. And let's be honest, many of us are quite heavily tied to our LR catalogues from previous versions, which makes the switch to something like C1Pro difficult.



Bifurcator
Registered: Oct 22, 2008
Total Posts: 9300
Country: Japan

Bifurcator wrote:
Until those braces get busted off there's just no point.

15Bit wrote:
I was just (perhaps naively) hoping to do something more constructive here.


Like what?

15Bit wrote:
As for the performance, you are right...
I'm as curious as everyone else as to why...


It's the chimpanzees, I'm telling ya.
That may sound like only a snide remark, but it's not. I've worked in several very similar environments, and if the talent, management, and resources aren't in place it's difficult or impossible for ground to be gained. In LR's case I can only guess that management is withholding resources, thereby stifling the talent. It's like maybe they're putting most of their accomplished talent on PS or some of their other packages and leaving the chimps to assemble the LR kludge.

15Bit wrote:
And let's be honest, many of us are quite heavily tied to our LR catalogues from previous versions, which makes the switch to something like C1Pro difficult.

That is probably true, but it shouldn't be. LR doesn't lock a person into any exclusive behaviors or formats. Therefore it's only the reluctance of the end users and their resistance to change that is keeping them there. Picking up a package like C1 takes only a couple of hours of rigorous concentration, and far less than that if the individual is already accomplished on another similar package (like LR).

I'm not saying everyone should move to C1. Rather, I'm saying that for those who are unsatisfied with the software they're using, whatever it is, there are usually alternatives - and in this case good ones which solve the problems causing that dissatisfaction - chiefly performance.




knower wrote:
Bifurcator, I am a developer, too, but I only write 3D and 2D plugins


Cool, that can be fun aye!?! I guess you can appreciate application level design and coding then. Not nearly as much fun but very challenging!


