Lightroom 7.1 came out overnight with updates to all parts of the “Lightroom world” – to Lightroom itself (i.e. the so-called “Classic”) and also to the cloud-dependent iOS, Android and PC/Mac apps. It’s mainly about new camera support, bug fixes and performance tuning, but I recommend you check out the one genuinely new feature – a completely rewritten Auto button.

The old Auto was always a crude “lucky dip”, wrong at least as often as it was right, but the underlying calculation has been totally changed after a lot of empirical research:

Auto has been completely reworked to create better results, every time. Using an advanced neural network powered by Adobe Sensei, our artificial intelligence (AI) and machine learning platform, the new Auto Settings creates a better photo by analyzing your photo and comparing to tens of thousands of professionally edited photos to create a beautiful, pleasing image. The new Auto is available ecosystem wide, including in Lightroom CC, Lightroom CC for iOS, Lightroom CC for Android, Lightroom CC on the web, Lightroom Classic, and Adobe Camera Raw (ACR).

I certainly think Auto’s results are very much better. Perhaps the Whites and Blacks are set a bit too aggressively, and bright images such as snow scenes are rendered a little too dark for my taste. One surprise is that Auto now sets the Vibrance and Saturation sliders, and while I don’t entirely like this happening (I live in a drab country) I can live with it. If I don’t like how these sliders pump up the colour, I quickly double-click the Presence label to reset them to zero. In general, though, the new Auto almost always produces a substantially better starting point for editing.

Some other details:

New Auto is based on the cropped area, not the full image

It reads the WB settings when calculating slider values

You can set the Auto values for individual sliders by Shift + double click

The other highlight of the 7.1 release is in Lightroom Mobile, with the iOS app gaining a long-overdue watermarking feature. For what it’s worth, a couple more missing features have also been added to the new PC/Mac app, LRCC Desktop.

In this case the new Auto is much more active, producing a result closer to what I would have done myself. I’m not sure I like it changing Vibrance and Saturation, though here I would accept it.

More about the new Auto

[“Get me Closer Quickly”] is the aim of the design of the new AI based Auto. Also, it should be noted that I personally had a bit to do with training the new AI based Auto. I adjusted over 1K of my images from all sorts of situations and conditions including under/over exposures, high ISO, studio and artificial lighting as well as landscape day and nite shots. I was charged with making those adjustments I personally would do for my images (because, well, they were my images).

Yes, the new Auto is a bit conservative with extreme highlights with a tendency of texture and detail being important. Also, the shadows tend to be fairly open…

The goal here is to be consistent in “improving” an image’s global based setting and trying to improve the workflow for selection editing. Some images may need little or no further adjustments…most will need tweaking either globally or locally.

The old Auto was really an old, crude and primitive attempt to generally adjust black and white points with little other finesse to other settings. It sucked…it was just about as likely to screw an image up as make an improvement…but the new Auto has made great strides as a first-pass adjustment.

What are the criteria for “best photos” in Lightroom Web’s technology preview?

The info isn’t published anywhere and Adobe haven’t said much apart from talking about how their Sensei artificial intelligence tool analyses images uploaded to LrMobile (the FAQ says you can opt out). It’s also a “technology preview” and is probably changing quite rapidly.

If anyone does know the calculation or parameters, they won’t be at liberty to say – but there’s plenty we can work out for ourselves!

I think one can tell that it groups photos shot at roughly the same time, and then looks for the best of each group using a range of criteria. User-entered data seems to rank highly, so I have series of almost-identical photos where Sensei chose the one which I had starred or flagged. After that we’re just guessing. Maybe it measures qualities like sharpness, and the algorithm might give extra weight to any faces identified in a picture. I think I can also show that it rewards conformity to composition concepts like the rule of thirds.
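To make that last guess concrete, here is a purely hypothetical sketch of how a rule-of-thirds score might be computed – my illustration of the general idea, not anything from Adobe Sensei:

```python
# Purely hypothetical illustration of a rule-of-thirds score - my guess at
# the general idea, not Adobe Sensei's actual algorithm.
import math

def thirds_score(subject_xy, image_wh):
    """Lower is better: distance from the subject to the nearest
    rule-of-thirds intersection, normalised by the image diagonal."""
    x, y = subject_xy
    w, h = image_wh
    intersections = [(w * a, h * b) for a in (1/3, 2/3) for b in (1/3, 2/3)]
    nearest = min(math.hypot(x - ix, y - iy) for ix, iy in intersections)
    return nearest / math.hypot(w, h)

# A subject sitting on a thirds intersection scores (near) zero...
print(round(thirds_score((6000 / 3, 4000 / 3), (6000, 4000)), 3))   # 0.0
# ...while a dead-centre subject scores worse
print(round(thirds_score((3000, 2000), (6000, 4000)), 3))           # 0.167
```

A real system would combine many such signals (sharpness, faces, ratings), but a simple distance measure like this is enough to separate the two car pictures below.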

Interesting though it is to make educated guesses, maybe we’d be better off asking how well it works? Does it automatically identify your best photos, and would that save you time? I don’t yet know for sure, but it’s certainly interesting as a preview.

In this case Adobe’s Sensei artificial intelligence preferred the photo on the left. Apart from the car’s position everything else is the same in the two pictures and I hadn’t flagged or rated either one, so presumably Sensei chose the one closest to the rule of thirds or some other principle.

Although my starting point is LR, I always launch Silver Efex (SFX) this way – via Photoshop, as a Smart Object (SO). The flexibility is a big advantage.

Starting from LR, just open an image in Photoshop and convert the Background layer to a smart object (via the right click). If you want to edit the image in Photoshop before going to SFX, eg adding adjustment layers or cloning, you can select multiple layers and convert them into a single smart object.

Still in PS, launch SFX, do whatever you want, and save. Whereas normally you would expect SFX to add a pixel layer, running SFX on the smart object applies its edits as a “smart filter” – a sub item in the Layers panel that you can hide or show, even mask. To change the SFX edits, you just double click the smart filter and SFX launches again with all your control points and other settings available. Fundamentally, that’s it.

The method was slightly fancier in the paragraph you quoted. Instead of just opening in PS, from LR you use Edit > Open as Smart Object in Photoshop. The advantage here is that the SO remains raw and you can change its ACR adjustments by double clicking it. There’s no difference for the SFX part of the workflow.

The smart object/filter technique works with all filters, not just SFX. It’s also a neat way to copy SFX effects between images – you drag the smart filter from a SO in one image and drop it on a SO in the other.

And wasn’t this good news?

I was very pleased to see that Google are letting someone else develop the Nik programs, Silver Efex in particular. See DxO’s announcement:

“The Nik Collection gives photographers tools to create photos they absolutely love,” said Aravind Krishnaswamy, an Engineering Director with Google. “We’re thrilled to have DxO, a company dedicated to high-quality photography solutions, acquire and continue to develop it.”

“We are very excited to welcome the Nik Collection to the DxO family,” said Jérôme Ménière, CEO and founder of DxO. “DxO revolutionized the image processing market many times over the years with its innovative solutions, and we will continue to do so with Nik’s tools, which offer new creative opportunities to millions of photographers. The new version of our flagship software DxO OpticsPro, which is available as of now under its new name DxO PhotoLab, is the first embodiment of this thrilling acquisition with built-in U Point technology.”

If you only look at what Adobe have just done to Lightroom, you might miss that a bit of Lightroom has been added directly to Photoshop 2018. A bit like having your Lightroom Web account available directly inside Photoshop, the Welcome page now lets you directly access photos you’ve synced in Lightroom.

It’s very easy to use. On the left of the Welcome screen there’s a small link to Lr Photos. Clicking it makes Photoshop connect to Lightroom Web and display your collections. You can then go into a collection, or even search for images, select one or more, and open them directly in Photoshop.

So in this example Photoshop is accessing a number of collections that I have synced:

What happens next

If the photo is a raw original or a smart preview, it is opened in Adobe Camera Raw where you can tweak your adjustments.

When you’ve finished working on the photo, you can save the version back up to Adobe’s Lightroom server via the new Quick Share button.

Lightroom’s 7th incarnation – the so-called Lightroom Classic – introduces a new Embedded & Sidecar workflow which is designed to let you review, compare and cull photos much faster than before. The key points are:

It’s for anyone with too many photos, too little time

You must select Embedded & Sidecar in the Import dialog

Lightroom imports the photos and builds its previews from the embedded previews or….

If the embedded preview is less than 50% of the raw file’s resolution, Lightroom will try to use a sidecar JPEG

This lets you zoom in on photos or move from one to the next much quicker

What are “embedded previews”?

You may already know that every raw photo contains one or more JPEGs that the camera has written inside the raw file. These “embedded previews” are what you see on the camera’s LCD when you zoom in, and you briefly see them in Lightroom’s Import dialog too, until Adobe’s own raw conversion takes over.

But since Lightroom’s early days, many photographers have wanted Lightroom to display these embedded previews in Library. That’s because:

Embedded previews are usually good enough for you to decide if a picture is a keeper, or is destined for the bin.

Displaying the embedded preview is faster, much faster than converting and displaying the raw file itself.

The frustration has been that everyone knows “PhotoMechanic speed” should be possible in Lightroom. PhotoMechanic is an image browser that excels at letting the user review and choose pictures very quickly because it takes full advantage of the embedded previews. It does only a few jobs but does them very well, and this speed has earned it a loyal following among press photographers and others with lots of pictures and tight deadlines.

Finally Adobe have responded.

So how do you take advantage of this change?

Decide if you should switch to shooting Raw + JPEG

Import photos with the Embedded & Sidecar option

Try to avoid making adjustments until you’re done reviewing and culling photos

Should you shoot Raw + JPEG?

I strongly recommend you review your camera(s) to see if switching to shooting Raw + JPEG would let you make the most of this new Embedded/Sidecar Preview (ESP) workflow.

On my Fuji camera I have switched to using Raw + JPEG. On my Nikons I haven’t because their embedded previews are full resolution.

In the past I never liked shooting Raw + JPEG, but this has now changed – at least with one of my cameras. The key issue is the size of the embedded preview.

To explain, my newest camera produces raw files with a resolution of 6000×4000 pixels but with an embedded preview of only 1920×1280 pixels. The problem is if I want to zoom in to 1:1 in Lightroom, to check the focus or fine detail for example. Because the embedded preview isn’t full resolution, Lightroom then has to load the raw file and I lose the speed benefit of using the ESP.

Lightroom anticipates this possibility. If the ESP is less than 50% of the raw file’s full resolution, Import will look for a full resolution “sidecar” JPEG. So this camera – a one year old Fuji X-T2 – is now set up to shoot Raw + JPEG.
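My reading of that 50% rule can be sketched in a few lines of Python – the function name and the axis-by-axis comparison are my assumptions, since Adobe haven’t published the exact test:

```python
# A sketch of the 50% rule as I understand it (not Adobe's actual code):
# if the embedded preview is under half the raw file's resolution,
# a full-size sidecar JPEG is needed to keep ESP speed when zooming to 1:1.

def needs_sidecar_jpeg(raw_size, preview_size, threshold=0.5):
    """Return True if the embedded preview is too small under the 50% rule."""
    raw_w, raw_h = raw_size
    pv_w, pv_h = preview_size
    # Assumption: the comparison is made on linear pixel dimensions
    return pv_w < raw_w * threshold or pv_h < raw_h * threshold

# Fuji X-T2: 6000x4000 raw with a 1920x1280 embedded preview
print(needs_sidecar_jpeg((6000, 4000), (1920, 1280)))   # True - shoot Raw + JPEG
# A camera writing full-resolution embedded previews
print(needs_sidecar_jpeg((6000, 4000), (6000, 4000)))   # False - no sidecar needed
```

The X-T2’s 1920 pixels are only 32% of its 6000-pixel raw width, which is why the sidecar JPEG is worth the extra file.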

I still don’t like having two files for every photo, and it would be great if Fuji did a firmware upgrade to allow full size embedded previews, but it’s worthwhile for the faster ESP workflow.

What you need to find out is if your camera writes an embedded preview that is full resolution. In general:

If you use a recent DSLR from Nikon, Canon etc, there’s probably no point changing to Raw + JPEG. To the best of my knowledge, raw files from these cameras do contain full resolution embedded previews.

If you use a mirrorless camera such as Fuji or Olympus, the embedded previews are generally not full resolution and you may want to start shooting Raw + JPEG.

When you see a “Loading” message when zooming in, it could be because the embedded preview is less than full size.

When you import, choose the option Embedded & Sidecar

Once your camera is set up to take advantage of the ESP workflow, the key step is when you import new photos.

Make sure you go to the File Handling section at the top right of the Import dialog and choose Embedded & Sidecar from the Build Previews drop down box.

If you forget to choose Embedded & Sidecar, Lightroom will import the files as in the past. You just won’t get the ESP workflow unless you remove the photos from the catalogue and reimport them with the correct Build Previews choice.

To help ensure you always choose Embedded & Sidecar, one tip is to save your Import settings as a preset. This is done at the bottom of the Import dialog, where you can also save other standard settings like a copyright metadata template or renaming options.

What should you see?

You should see “Fetching Initial Previews”, not standard or 1:1

Initially you should see a message about Fetching Initial Previews (rather than standard previews). One important thing to know is that extraction of the first embedded previews should begin immediately, so you can zoom in and examine earlier pictures while the rest of the import is proceeding.

As thumbnails appear, look for a new badge which should be on every thumbnail. If it’s not there:

One possibility is that you forgot to set the Embedded & Sidecar option in the Import dialog

Another is a bug in what’s a new, fragile feature

If you really want your ESP workflow, removing and re-importing is your only option

I don’t like the Raw + JPEG workflow but have adopted it for this camera because the embedded previews are low resolution.

In Loupe view, you will also see a bezel “Embedded Preview”. If not, see the above suggestions.

If you see “Loading” messages

This usually means:

You didn’t set the Embedded & Sidecar option in the Import dialog

The embedded preview is lower resolution than the raw file, so you may want to shoot Raw + JPEG

You have adjusted the image….

What if you adjust the image?

I welcome this ESP feature and find it very helpful for reviewing and culling new photos, but it has an obvious weakness when you need to adjust an image before you can decide whether to keep or bin it.

The current ESP workflow is great for normally-exposed pictures (eg sports in daylight) but not when your work involves less forgiving light (eg stage performances, sunsets), when you just can’t judge an underexposed image properly because it’s too dark, or need to pull back the highlights or open the shadows to see if any worthwhile detail is present. Speed’s no use if you can’t see well enough.

Quick Develop was designed for this comparison and culling process, letting you bump a few key characteristics so you could choose between images or see if one may be a keeper. But using it with ESPs comes at the price of loading the raw data. We really need to be able to apply some QD adjustments to the ESP-derived thumbnail/preview so that the user can properly compare and cull images at speed. Just Exposure, Highlights and Shadows would be enough.

I hope Adobe will implement this idea, but I am not holding my breath….

Other tweaks

As part of the new version’s performance tweaks, there is a new item in Preferences – “Replace embedded previews with standard previews during idle time”.

The option does exactly what it says. When you aren’t doing anything in LR, it will quietly replace the embedded previews with standard previews – and only standard previews. It doesn’t build the 1:1 previews which are so useful when you zoom in.

So while this option may have its uses, it’s not too helpful for a hardcore ESP workflow. I recommend disabling this option.

Incidentally, I also recommend unchecking the option “Treat JPEG files next to raw files as separate photos”.

Conclusion

Lightroom 7 or “Classic” is all about performance improvements. The need for the mother of all bug fixes crept up on us, a bit like the tale of boiling a frog, where the water is heated so slowly that the frog doesn’t notice the danger until it’s too late to escape. Many of us didn’t experience the performance problems that caused others so much pain. In my case, I had no sense of slowdowns until finding that processing the Fuji X-T2’s 24mp raw files was much slower than my Nikon D800’s 36mp NEFs.

Performance fine-tuning is likely to be a continuing effort, and I see remarkable speed benefits generating standard previews – 4 times as fast as Lightroom 6. The Embedded & Sidecar Previews workflow is a big element of that effort, targeting performance at the time when you need it most. I hope you found this useful.

Make sure this is enabled – it can have a dramatic effect on preview generation speed.

When someone logs into a collection you’ve shared with them, they can now use Facebook or Google IDs.

I’ve not seen any announcement yet, but from today people can use a Google or Facebook account to log into Adobe.

So what? Well, for Lightroom CC users it opens up a great opportunity to squeeze more value out of your subscription, because this little change makes it so much easier to get feedback on your work via your page at https://lightroom.adobe.com, the web browser part of Lightroom Mobile which is also known as Lightroom on the Web (more here).

Since Mobile was introduced, you’ve been able to share sets of pictures with clients or friends. In theory at least, they could “like” photos they wanted and exchange comments with you. Just because there’s a social media angle doesn’t mean you’re limited to “OMG” or “LOL”. A client could annotate photos with “Please send me this full res” or “Crop on the right”, and recently I used it for a family history project, getting an older relative to enter comments like “I recognise this young man – he is definitely your great grandfather on the promenade at Ramsey on the Isle of Man”.

The trouble was, liking and commenting was limited to people with Adobe IDs.

That’s OK if they do have one, but most humans don’t – and just try getting someone to sign up for a free account they have never needed.

Much more of humanity has a Google or Facebook account, though. So from now on people can log in to view your pictures using a login they may already use.

Lightroom on the Web has always seemed a lost opportunity for Adobe to offer a light client proofing workflow to Lightroom users. It’s not quite there yet – I’d like to see watermarks and filtering likes and comments in LrD – but this is a big step forward.

I must admit, I hardly use the brush tool any longer – the radial filter is my go-to

Details (sharpening and noise) for iOS

Is a phone or iPad a great place to sharpen?

New iPad interface

It’s generally much more elegant

Unfortunately they have removed the elegant speed rating feature which let you swipe the image to apply ratings or flags

New interface for Android

Also notice that Adobe are looking closely at performance issues for a coming version. This is not a recent thing – most LR6 versions have included performance tweaks such as GPU acceleration or the option to use smart previews. Whether these have been successful is another matter, and external factors such as 4K/5K screens and larger raw files must play a role, but clearly Adobe have been aware of performance problems for some time. So one can hope for something exciting in the next version of Lightroom.

After removing pictures from a synced collection or after stopping a collection from syncing, the images remain on Lightroom Mobile and in “All Synced Photographs”. How do I find these photos so I can remove them from Adobe’s server?

When Lightroom Mobile began, photos were automatically cleared from Adobe’s server whenever you removed them from a synced collection or when you stopped syncing the collection. I liked that clarity. If a photo was in a synced collection, it was on Lightroom Mobile. If it wasn’t, it wasn’t.

Sadly, sometime last year Adobe changed things. I’m not sure why, but photos now remain on Adobe’s server unless you remove them from “All Synced Photographs”. In one sense this doesn’t matter. This web space is unlimited, and it doesn’t really make it any harder to find pictures on mobile devices or in Lightroom on the Web. Still, many of us don’t like to clutter up our online library with photos that we no longer want on Adobe’s server.

The way to remove them from Lightroom Mobile is to remove them from “All Synced Photographs”. Easy enough? Unfortunately it’s not so easy to identify which pictures are in synced collections, and which aren’t. So how can we find them?

The solution isn’t automatic, but it is surprisingly easy and logical:

My plugin X-LR, which automatically applies Fuji film simulations in Lightroom, went through a great 60 day preview period. A lot of people tested it, especially after it was featured on Fujilove, FujiRumors and the Lightroom Blog YouTube channel.

People found very few bugs – but I had overlooked the sepia film simulation

People asked about ratings – I added that feature

People asked about DR and other camera settings – I added Expert Mode

People wanted to run it without the dialog box – I added an option

People who use Olympus asked “what about us?” – I may have something brewing….

Lightroom 6.9 / 2015.9 came out yesterday. There’s no headline new feature and my interpretation is that it’s mainly to support a number of new cameras.

Fuji in particular have released a bunch of X bodies and a week ago launched their 50 megapixel GFX. After some mixed messages, it had slowly become clear that Lightroom’s main competitor, CaptureOne, was not willing to support the GFX because of its threat to PhaseOne’s core business (spot the conflict of interest!). So Adobe’s speedy support is welcome for owners of this camera, and no doubt for Fuji themselves.

Also included is belated support for PhaseOne’s IQ100 16 bit raw format. This won’t affect many people, but I’ve been keeping a close eye on it (and whispering in a few ears) because I happen to know a couple of IQ100 users who wanted to use Lightroom as an alternative to CaptureOne. It turns out that support was delayed for such an unusually long time because Adobe was waiting for certain documentation on the proprietary format from Phase One. Let’s not say any more – often the fault lies on both sides or somewhere in between. Anyway, Lightroom’s default colour rendering isn’t as good as one would hope, and it initially produced ugly results with a blown-out sunset photo, but a couple of tweaks in the Camera Calibration tab produced results comparable to CaptureOne. So the few users of this camera now have the option of continuing to manage IQ100 files in Lightroom’s much-superior DAM, and processing in either Lightroom or CaptureOne.

If you want to test it with more than 5 images at a time, email me. If you will definitely try it (as opposed to promising to do so) and provide feedback, good or bad, please contact me by PM or email and I will remove the preview restrictions.

Before you castigate Adobe, just think what’s happened over the last few months.

Let’s say you regarded the previous £8.57 as an acceptable price…. Until the Pound went down because of the idiotic Brexit vote (yes, I accept some thought about it seriously, but I stand by my description), this meant Adobe US earned $12.68 a month (£8.57 × 1.48 $/£). So that’s our starting point – both sides are happy with that deal.

Now though, your £8.57 only earns Adobe US $10.80. Restoring their previous $ income means a monthly UK price of £10.07 ($12.68 / 1.26) and they’re putting it up to £10. OK, they won’t use these exact dates and numbers, but this puts into perspective the kind of price rises we’re beginning to see. I’m tempted to add, don’t blame Adobe, blame David Cameron and those who voted for Brexit.
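The arithmetic can be checked in a couple of lines, using the prices and exchange rates quoted above:

```python
# Checking the exchange-rate arithmetic with the numbers quoted above
old_gbp = 8.57        # previous UK monthly price
old_rate = 1.48       # dollars per pound before the referendum
new_rate = 1.26       # dollars per pound afterwards

usd_income = old_gbp * old_rate            # what Adobe US used to earn
print(round(usd_income, 2))                # 12.68

# UK price needed to restore that dollar income at the new rate
print(round(usd_income / new_rate, 2))     # 10.07
```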

For CC subscribers the big new feature is the introduction of the Reference Photo view in Develop, and it’s a feature I like very much. It means that you can now split Develop’s central pane between the image you’re editing and another photo. While it looks quite similar to the existing Before/After view, which I use all the time, here the idea is to help you adjust the active photo by comparing it with aspects of an existing picture.

Until now you’ve had to know a rather obscure and cumbersome trick using Lightroom’s second screen with a locked photo. This new Reference Photo feature is much simpler – you just press Shift+R to split the screen, and drag the other photo into the left side.

This might be handy in a number of circumstances. For example, you might have a series of pictures and want a consistent look, or you may want to edit a picture so it can be used alongside an older one. I’ve found the feature helps when I correct the white balance of photos taken with extreme ND filters such as Lee’s Big and Super Stoppers. Since I’ll usually have other pictures taken at the same time without the filter, I can use one as the Reference Photo.

Here I’m editing a long exposure image which has a colour cast due to a 10 stop ND filter. An earlier picture was taken without the filter and is my Reference Photo.

There are a few ways to set the Reference Photo:

One idea is to maintain a collection of reference photos and add it to your Favorite Sources

When you want to use a photo you just worked on, it would still be in the filmstrip and you can often just drag and drop it into the Reference Photo side.

If the filmstrip contains lots of photos and you can’t see the one you want, just hit G and find it in Library. Then right click it, choose Set As Reference Photo, and then go back to Develop.

Sometimes you want to use a photo from a different folder or collection, so remember the filmstrip has the Recent Sources list. This could let you switch quickly back to a previous folder, or you could add folders or collections to Favorite Sources – such as a collection of sample or reference photos.

Another little feature is Has Snapshots, a new criterion added to Smart Collections and the Library Filter. It’ll be handy in the sense that there’s previously been no way to find pictures that have snapshots. But it’s been implemented in such a half-baked way that I’m not sure why they bothered – you can only find pictures that have snapshots, or those that don’t. What you can’t do is search by snapshot name, so you can’t find all the pictures with a certain paper profile in the snapshot name, or with a phrase like “final” or whatever. A lost opportunity, which I mention just in case you find a use for it.

Lightroom Mobile 2.6 for iOS has just been released. The main change is for the iPhone version which now has a slightly different UI. It’s probably better, but it doesn’t greatly excite me.

But the best thing is that the iPhone app now allows you to update photos’ titles and captions. Finally, finally….

Obviously the phone is a fiddly way to enter lots of metadata, but for odd pictures it’s going to be very convenient. You just go to a picture, change the top menu to Info and you can then edit the title, caption and copyright. I try to use the voice recognition to do the captioning – it’s not very efficient, but it’s fun.

Yesterday and again today I tried uploading a book to Blurb. Each time the progress bar never moved, and I was forced to cancel.

So what was wrong? There seemed nothing odd about the images, and I could generate the book as a PDF. The book + cover wasn’t huge – 12×12 inches and 50 pages amounting to roughly 180MB as a PDF – but in any case I am on a fast internet connection (200/20 Mbps) and Lightroom was certainly communicating with Blurb, as it was able to get the price info from them. It wasn’t clear where the problem lay.
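For scale, bandwidth clearly wasn’t the bottleneck – a rough estimate of the upload time on that connection:

```python
# Rough upload-time estimate for the book on the quoted connection
size_mb = 180        # the book as a PDF, roughly
upload_mbps = 20     # upstream side of the 200/20 connection
seconds = size_mb * 8 / upload_mbps   # megabytes -> megabits
print(seconds)       # 72.0 - barely over a minute at full speed
```

So a stalled progress bar after many minutes pointed at something other than the connection.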

As I know the Lightroom SDK, I eventually decided to edit the preferences file and zap the relevant preferences (mentioned here):

Restarting Lightroom, I tried to upload the book and this time I was asked to enter my Blurb user details and password (which have never changed). The upload then proceeded normally and sent Blurb the book.

Clearly it is unacceptable to leave the user wondering why upload isn’t happening. Equally, assuming the fault is in Lr, I can’t see Adobe putting much time into Book. So if you do apply the above method, make sure you take care – by backing up that preference file before you hack away!

Smart Preview Preference

This year it appears that Adobe has been making consistent efforts to improve performance or eradicate bad performance, however you want to see it. So in 2015.6.7 they’ve introduced an option which facilitates a workflow based on smart previews.

You can create smart previews, small proxy versions of your originals, either during import or afterwards in Library. They come in very handy when you go on the road, letting you take along your main catalogue and edit and output existing photos without access to the originals. But smart previews are also quick to adjust, so for a few years some people have been quite ingenious with them. What they’ve been doing is temporarily renaming the original folders in Explorer or Finder, forcing Lightroom to use the smart previews even when their originals were available.

So 2015.6.7 now adds a preference (Performance tab) which makes Lightroom use smart previews if they are available, even if originals are present.

When you zoom in to 1:1 for tasks like noise reduction and sharpening Lightroom will then load the raw data, but for other work it will try to use the smart previews.

I think Adobe should have provided a switch in Develop rather than burying it away in a Preference, but it’s still good that the renaming trick is no more. As this isn’t really a “new feature”, perpetual licence users get this too.

Just say No – or Yes – and check Don’t show again.

Sync

Another change affects how Lightroom Mobile Sync keeps photos on Adobe’s servers. In the past, you added a photo to Mobile by adding it to a collection, and equally when you removed a photo from a synced collection it was automatically removed from “All Synced Photographs” (ie from Adobe’s servers) if it wasn’t in any other synced collection. Simple enough, eh?

This has now changed and it means that pictures removed from all synced collections remain on Adobe’s servers. They aren’t automatically removed.

In itself, this is no big deal – who cares if Adobe are using more cloud storage for you than you actually require? The only downside is the annoying dialog boxes that Lightroom now displays whenever you remove photos from synced collections, delete a collection or unsync it.

To be frank, I think Adobe are overthinking this area. So I simply recommend you just choose No – or Yes, if you wish – and check Don’t show again.

Other

There’s a new Publish Service allowing you to submit your photos directly from Lightroom to Adobe Stock. I’m not sure how interesting this is – it’s royalty free – but you can read more about it here.

Adobe have tested Lightroom CC 2015.6.7 on the new macOS Sierra operating system, but 2015.6.7 on Mac now requires OS X 10.10 or later. I wouldn’t be surprised if this upsets a few people!

The user was specifically asking “I am trying to create a web gallery using the Grid Gallery but its Multiple page option only allows 5 or 10 or … items per page. Can I use 12 items per page for this specific gallery?”

How can I customise a built-in web gallery?

If you know your way around your computer’s user folders, and if you can hack a Lua file in a text editor like Notepad or TextEdit, you can certainly change the choices you see in Lightroom (for example see right). You can also change the HTML/CSS if you have those skills. It’s techy, which might deter some people, but it’s not too difficult….

The gallery templates are in C:\Program Files\Adobe\Adobe Lightroom\Shared\webengines\ (Windows) or an equivalent place on Mac – inside the app’s package. Copy the LR-Gallery-Standard.lrwebengine folder and temporarily put it on your desktop. Rename it LR-Gallery-Mine.lrwebengine.

The drop down box is defined at line 274. If you’ve got this far, you’ll understand how to add extra choices. It’s just cut and paste, but take care with the syntax. That should be all the editing.
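To give a feel for what you’re editing – this is not Adobe’s actual file, and the identifiers here are purely illustrative – the choices are defined as an ordinary Lua table of title/value pairs, and adding one is just a matter of duplicating an existing line:

```lua
-- Illustrative sketch only: the names are made up, not Adobe's.
-- The drop-down choices are a plain Lua table of title/value pairs.
numPerPage = {
	{ title = "5",  value = 5  },
	{ title = "10", value = 10 },
	{ title = "12", value = 12 },  -- new choice: copy a line and edit both fields
	{ title = "20", value = 20 },
}
```

Mind the commas between entries – a missing or stray comma is the most likely way to break the file.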

Now you need to move the folder into a subfolder called “Web Galleries” in Lightroom’s presets folder. Get there via Preferences > Presets or use these paths directly:

PC: C:\Users\USER\AppData\Roaming\Adobe\Lightroom\Web Galleries

Mac: USER/Library/Application Support/Adobe/Lightroom/Web Galleries

So Web Galleries is a subfolder at the same level as Metadata Presets, Print Templates etc (if Web Galleries doesn’t exist, create it at that level), and put LR-Gallery-Mine.lrwebengine inside it. Restart Lightroom and “Grid Gallery – Mine” should be listed. That’s it!

One fix deals with a memory bug that could reduce Lightroom to a crawl on more powerful computers. Version 6.6 introduced a mechanism in Develop to cache files in advance, which meant that loading photos could be dramatically faster. This is really wonderful when I load Nikon D800 raw files on my Mac Air with its puny 4GB of RAM, but unfortunately there was a memory leak affecting computers with much more memory. So on my main computer, a PC with 48GB of RAM, staying in Develop for a long time could lead to memory usage rising and rising, and performance dropping so much that restarting Lightroom was necessary. That problem seems fixed.

The other is a printing problem that only affected Macs and meant blues and to a lesser extent greens were not accurately printed by Lightroom 6.6 or Photoshop 2015.5. Adobe had been an early adopter of some new Apple APIs, and it’s hard to know who really bears the responsibility – or rather, the blame. Some people always point the finger at Adobe, of course, and their QA should certainly have detected a problem, but it’s not as if Apple ever admit to their faults! As I don’t print from my Mac laptop I’ve not tested the fix for myself, but I hear from usually-reliable sources that it works correctly.

Other interesting stuff:

Support for the Fuji X-T2 – already!

A new AppleTV app to show Lightroom Mobile photos. I can see the point of this, though I can’t see existing AppleTV owners upgrading to 4th Gen. After all, you’re already able to mirror from a laptop, iPad or iPhone to older AppleTVs. At least that’s what I like doing.

Directly importing raw files is undoubtedly a major development for LrMobile. I must admit though – it just doesn’t interest me. When I travel, I take my laptop and an external hard drive, and I don’t feel like spending money on Apple’s camera connection cable. But clearly mobile hardware is now making such a workflow more practical, and I acknowledge that many others have demanded raw import since day 1. Be careful what you wish for.

We’re sure it’s happened to you before: you’re out taking photos (in raw of course) and you capture a real stunner that you can’t wait to share with the world. Until now, you had to either transfer a JPEG version of the file over or you had to wait until you got back to your desktop or laptop. With the raw technology preview, you’ll be able to import raw photos immediately to either your iPhone or iPad, edit them, and then share them, anywhere you’ve got a connection. Our goal with Lightroom for mobile is to make it an indispensable part of your photography workflow, providing the tools that you’re familiar with and the quality you expect in a product that can be with you, no matter when inspiration strikes. With this technology preview, we want to push the boundaries of how photographers around the world work with their mobile devices.

You get all of the benefits of raw, such as the ability to change the white balance, being able to recover blown out highlights, access to the full range of color information, as well as editing an uncompressed file, all using the exact same technology that powers Lightroom on your desktop. An added benefit is that the raw file that you’ve imported into Lightroom for iOS will be synced with Lightroom on your other devices, such as Lightroom for desktop or Lightroom on the web, along with any of the edits, star ratings, or flags that you added.

While importing raw files does nothing for me, I’m much more interested in being able to add graduated and radial adjustments. These are added in a way that will be very familiar, if a bit awkward. First you enable Local Adjustments by tapping the button at the bottom of the screen, then you choose “Linear Selection” or “Radial Selection” on the screen’s left, then you tap a little + button at its top. But then the adjustments handle just like in Lightroom Desktop, and the adjustments “round trip” (I guess that’s a verb now) just like any other adjustments.

Incidentally, I think they’ve made a mistake using the technically-correct but ugly and unphotographic names “Linear Selection” and “Radial Selection”. I appreciate “filter” misleads those who use other mobile apps, but filter is what they’re called on the desktop and LrMobile isn’t just for mobile-only folk, you know.

Guided Upright may be what catches the eye, but for me the headline is what Adobe have been doing to boost the speed of accessing pictures in Develop. It’s faster when you initially take a picture into Develop, and Lightroom then loads into RAM the 2 images before and 2 images after the current image, so you should see loading time improvements when you navigate in either direction.
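The preloading logic amounts to something like this little sketch (my illustration, not Adobe’s code):

```lua
-- Illustrative sketch: given the current photo's position in the
-- filmstrip, work out which neighbours to preload into RAM.
local function prefetchIndices(current, total, radius)
	local indices = {}
	for i = current - radius, current + radius do
		if i >= 1 and i <= total and i ~= current then
			indices[#indices + 1] = i
		end
	end
	return indices
end

-- With a radius of 2, as described above, photo 3 of 100 would
-- trigger preloading of photos 1, 2, 4 and 5.
```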

On my 3 year old Mac Air, which I regard as underpowered for the D800 raw files I typically shoot, I would describe the results as dramatic. But the optimisation applies to all types of computer. And this improvement applies to all users, not just those with subscriptions.

As for the other change, Guided Upright, this is an obvious improvement to the existing Upright feature and is limited to CC subscribers. While Upright’s existing Auto setting often appears to work by magic, I often find the other sliders fiddly and unsatisfactory when Auto doesn’t get it right. So the new Guided Upright method lets you draw up to 4 lines along features that you want to be exactly horizontal or vertical. Upright then twists the image accordingly.

Apply lens corrections first

Draw lines to straighten

Maximum of four lines

Notices (errors, instructions, etc.) appear at the bottom of the Transform panel

With Guided Upright, you draw up to 4 lines to help Lightroom straighten the image.

Here are the other changes and some important bug fixes:

HDR and Panorama Merges

Possible when only Smart Previews are available

Wacom fixes especially on Windows

Pressure sensitivity now works

Can drag around in Develop when zoomed in

Keyword count for a given photo is now visible as a tool tip when hovering over the applied keywords box in the Keywording panel

In Preferences > Lightroom mobile, there is now a “Pending Sync Activity” information section showing uploading and downloading activity