The following report contains my notes from the First International Photo
Metadata Conference, held on Thursday, June 7, 2007, in Florence,
Italy. The notes are split into three sections; this first report
deals with photo metadata users: freelance photographers, small
and large picture agencies and libraries, and trade associations from
the photo business.

= The Picture Tide is Rising

Andreas Trampe (Stern)
Andreas Trampe of Germany’s Stern
magazine opened the First International Photo Metadata Conference to a
capacity crowd (over 125 people) by illustrating the challenges of
managing a flood of images at a busy picture desk.

On a normal weekend they may receive in excess of 25,000 images,
and on a normal weekday upwards of 12,000 pictures, arriving
continuously from 12 press agencies. However, they use only about 250
photos per issue, or about 0.3 percent of all the images they
receive. In addition, they have online access to around 300 image
databases, where they can search approximately 60 million additional
images at any hour of the day. Trampe stressed, “We have an excess
of images. The search process has become much too time consuming.”

The day before the conference was a bit above the norm due to the G8
summit. A search for “G8” in their database returned 8,691 images.
Limiting the search to only the last three days yielded 4,355 hits,
and narrowing further to those with the keyword “demonstration” still
left 796 images to sift through.

“Many photographers misuse the IPTC fields, unleashing an orgy of
keywords on us,” said Trampe. He demonstrated the problems
that arise when photographers and image editors enter erroneous
information simply to increase the chance of someone finding their
images in a search.

This keyword and caption spamming frustrates image buyers trying to
find an image on a specific topic. Some suppliers will even “jigger”
the IPTC Date Created field so that their images show up in more recent
searches. Part of this is a training problem, as most photographers are
self-taught or learn such techniques from their colleagues; part is a
lack of standards and guidelines.

He then demonstrated effective search filters on the Grazia Neri
website by searching for Florence, then limiting the results to the
“Travel/stock” category to show mostly images of the city. This
reduced the field from 3,000 to 600 images that would be useful for
most magazine editors.

Trampe concluded that it may be more worthwhile to spend effort on
getting better quality image metadata than on buying better database
software.

= Meta-Education needed

David Riecks
I used my 15 minutes to discuss a number of issues
that confront stock photographers dealing with metadata. In
some cases distributors may modify metadata without knowing what they
are doing; with other distributors, metadata may be lost simply because
their workflows haven’t been tested or reviewed.

Clients often change filenames when they download comps of images.
This makes it difficult for the client to identify the supplier, or
for the supplier to find the correct image. Always embedding the unique
filename in the Document Title field is one solution that would be a
great help to all parties in the imaging chain.

Many clients also believe that they can’t view embedded metadata
without opening the image in Photoshop. There are alternatives, but
education is needed.

I reported that the “Save for Web” feature in Photoshop still
discards all metadata. The default for this option should be
reversed so that metadata is preserved.

Photographers need ways to insert metadata as early as possible in
the process – preferably at the capture stage. They also need to
validate their own workflows so they know that their metadata is still
in the file before sending it on to a distributor or client. Many
distributors routinely discard metadata, so photographers should also
check whether the distributor is changing or retaining metadata, both
in the images sent to clients and in the preview images displayed
on websites.

Training and education for both photographers and others in the
imaging chain is the key to better photo metadata. For additional
details, download my presentation from the conference website.

= Identifying “Pain Points”

Peter Krogh
Peter tried to identify a number of “pain points”
based on his experience and others he has worked with (Peter is the
author of “The DAM book”).

He discussed the problems that occur when moving images between
applications that only support the older IPTC schemas and those that
use the newer XMP variety. He made a number of suggestions, some of
which will probably resonate with readers of these notes.

Adopt the Photoshop namespace “Copyright Status” tag as an IPTC
standard. It is not one currently, and many applications offer no
support for it, requiring photographers to re-enter this information.

We need to figure out a way to manage images from a collections
management standpoint. For example, the Document Title field could be
expanded to allow the easy recording of all sources of a file by making
it a “bag-type” field, similar in sense to the current Keywords field.
This would be useful for photographers doing montages, HDR (layering
multiple exposures to expand the dynamic range), or stitching images
into larger panoramic images.

Krogh felt that there needs to be a way to express “parent-child”
relationships among keywords (note: I would call this a “hierarchical”
arrangement of keywords). He showed how this could be accomplished
in Lightroom with the addition of some XMP coding:

Sample Keyword|Son of Sample

Pipe symbols separate the hierarchically arranged information, with
each set of hierarchical terms on its own line. Krogh mentioned that
he is not sure how to denote synonyms in such a schema, but he assumes
“it would not be too hard to design.”
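As a sketch of how such pipe-separated strings might be consumed, the following hypothetical parser (my own illustration, not Krogh’s or Adobe’s code) builds a nested keyword tree from root-to-leaf paths:

```python
# Sketch: parse pipe-separated hierarchical keyword strings, as in
# "Sample Keyword|Son of Sample". Each input line is one root-to-leaf path.
def parse_hierarchy(lines):
    """Build a nested dict from pipe-separated keyword paths."""
    tree = {}
    for line in lines:
        node = tree
        for term in line.split("|"):
            # Descend into the tree, creating nodes as needed.
            node = node.setdefault(term.strip(), {})
    return tree

# Hypothetical example paths (not from the talk):
paths = [
    "Animals|Birds|Owl",
    "Animals|Birds|Heron",
    "Animals|Mammals|Otter",
]
print(parse_hierarchy(paths))
```

Three paths collapse into one tree with shared parents, which is exactly the property a flat keyword list lacks.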

Photographers (and others) need a way to express rankings and
ratings of photographs within a set. To ensure that this information is
available across many applications, one of the few methods currently
used is to enter notes on rankings/ratings in the Keywords field.
There really needs to be a better place to store this kind of process
and handling information. Krogh suggested that a pipe-separated set of
terms within a “Collections” field could accomplish this and tag the
image in a durable way. That way, you could have ratings as indicated
by the photographer, as well as by an editor, distributor, etc.
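A hypothetical sketch of reading back such pipe-separated rating tags; the “role|rating” convention and the field name are my assumptions for illustration, not part of any IPTC standard:

```python
# Sketch: pipe-separated rating tags in a shared "Collections"-style field,
# so multiple parties (photographer, editor, distributor) can each record
# a rating durably. The "role|rating" convention here is an assumption.
def parse_ratings(values):
    """Map each rater role to its numeric rating from 'role|rating' tags."""
    ratings = {}
    for value in values:
        role, _, score = value.partition("|")
        if score.isdigit():  # ignore malformed entries
            ratings[role.strip()] = int(score)
    return ratings

collections_field = ["photographer|5", "editor|3", "distributor|4"]
print(parse_ratings(collections_field))
# → {'photographer': 5, 'editor': 3, 'distributor': 4}
```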

Expanding this concept further, Krogh suggested that being able to
store “Alternate Metadata sets” within a future IPTC schema would be
useful. This might include information from other users, allow the
storage of alternate color/tonal renderings for an image, deal with
color management issues, and even allow for what he termed
“crowdsourcing.” (note: from his description, I believe he was really
talking about Folksonomies, as this is how we would describe allowing a
wide number of users to tag images with keywords, etc.).

Span (Trinity Mirror)
Today, only 10-15 percent of the images they receive at Trinity
Mirror have EXIF info. Span stressed that they need to talk to the
photographers and agencies that submit images to make sure they
retain EXIF info, as well as encourage photographers to enter as much
information as possible shortly after the shoot using standard IPTC
metadata. At present, no single software package is useful for
dealing with this situation.

They use a typical Picture Desk workflow and deal with approximately
10,000 to 15,000 images per day. They use IPTC and EXIF, but in a
proprietary way. They end up having to alter the IPTC metadata as
submitted, in order to have consistency in their internal systems.

Today each image must go through several adjustments and
conversions, with each conversion causing a loss of quality. Newspaper
workflows require automatic conversion. Currently they convert
all images to ColorMatch RGB, but are considering a move to Adobe RGB.
Span mentioned that his goal is that they “should make as much use of
existing data as possible,” and with that in mind they are looking at
what it would take to require only a single color space conversion in
the workflow.

= Expanding the IPTC Standard for Stock

Jan Leidicke (BVPA / Keystone)
Leidicke reported that most stock
agencies still only use the older IIM standard for IPTC metadata. Very
few have systems that are set up to properly handle XMP based metadata
such as IPTC Core.

For historical images the exact date of creation is typically not
known. However, the current Date Created field requires you to specify
year, month, and day. If an editor enters a bogus month and day just
to record the year, it becomes impossible to know later whether the
day and month are accurate or just a guess. The way this field is
defined needs to change to accommodate less precise dates.
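One way a redefined field could accommodate this is with ISO 8601-style truncated dates (year only, year-month, or a full date). The sketch below is my own illustration of the idea, not part of any IPTC proposal:

```python
import re

# ISO 8601-style truncated dates: "YYYY", "YYYY-MM", or "YYYY-MM-DD".
# This checks only the shape of the value, not calendar validity.
PARTIAL_DATE = re.compile(r"^\d{4}(-\d{2}(-\d{2})?)?$")

def date_precision(value):
    """Return 'year', 'month', or 'day' for a valid partial date, else None."""
    if not PARTIAL_DATE.match(value):
        return None
    return {0: "year", 1: "month", 2: "day"}[value.count("-")]

print(date_precision("1923"))        # → year
print(date_precision("1923-05"))     # → month
print(date_precision("1923-05-17"))  # → day
print(date_precision("sometime"))    # → None
```

Recording the precision alongside the value removes the ambiguity of a guessed month and day.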

Leidicke also mentioned how the use of the “named people” fields
(one of the fields recommended by the Photo Metadata Working Group)
could make it easier to find images with that actual person in the
image.

The present IPTC standard lacks fields for properly expressing model
release information, rights and permissions granted, and more. Metadata
entered using controlled vocabularies can be easier to translate. It’s
also important to enforce standards and always enter information into
the proper field. Image buyers expect to find certain kinds of
information within specified fields, regardless of image source.

= Does “Meta Matter?”

Roger Bacon (Reuters)
While Reuters is a name in long standing
with the news community (about 150 years), they only began adding
photography coverage since the mid 1980’s. Bacon questioned “Does Meta
Matter?” and began his talk by mentioning that Reuters’ currently has
about 600 photographers and image editors involved in the photo area,
and they are receiving about 1,500 images per day. The photographers
are required to enter the following fields: headline, caption, category
code, urgency, supplemental category code, byline, credit, object name,
date created, city, state (USA only), country, and original
transmission reference.

The photographer is asked NOT to add keywords; this is something
that they do at the management level. As you can see by the field names
indicated above, Reuters is primarily using the older IPTC Information
Interchange Module (IIM) standard.

Bacon emphasized the importance of adding metadata early and making
sure that your internal systems don’t throw it away. He mentioned that
it would be great if the camera’s time and date could be updated
automatically (as a cell phone does) regardless of where the
photographer is in the world. It would also be helpful if photographers
could upload other information to the camera; for example, if you could
easily enter routing information before capture, the images could be
automatically distributed via selected channels.

Bacon did admit that, at present, Reuters strips EXIF metadata. This
is due to one major client reporting that their workflow was “broken”
when sent images containing this information.

Reuters adds rights information to the caption and to the special
instructions field. But rights management is one important area that
needs to be simplified. Bacon referred to what they enter as
“Polyhierarchical” data, using something he referred to as Paneikon, or
RRPE. He admitted that the information which is entered for news photos
doesn’t work well for stock use.

In addition, they have several outstanding issues, such as clients
asking for RAW files (their photographers supply only JPEGs). MSN has
also been asking them to provide square thumbnails so they can
automatically format them for their online news packages.

In closing, Bacon asked, “What are we discarding today that will be invaluable tomorrow?”

Michael Steidl (IPTC)
Michael Steidl, the managing director of
the IPTC, traced the history of the IPTC schemas and noted that the
original Information Interchange Module (IIM) standard was designed to
be used with audio, video, and other news elements, not just
photographs. The IIM schema was first used by Photoshop 4 starting in
1994. The latest incarnation, IPTC Core, was released in the spring of
2005. Today both standards are used by many applications, which means
that proper synchronization of data between the two versions is very
important.

Steidl stressed that interoperability between applications requires
that users are presented with consistent user interfaces. For
example, some interfaces refer to the Creator field, while others use
Author, and some even use Photographer.

He also emphasized that user interfaces should be agnostic to
cultural norms. For instance, it should not matter whether the user
is used to writing the date as June 7, 2007, or 6-7-07; the system
should accommodate both and use localization info to display the date
in the most prevalent format.

= Wouldn’t it be nice if…

Harald Löffler (IFRA)
Löffler reported that IFRA is the world’s
leading association for newspaper and media publishing, and that they
currently have 3000 members in 70 countries.

Löffler asked the crowd, “Wouldn’t it be nice:
… to have an automatic image workflow from the photographer to the publisher?
… to know at all times when, where, and by whom a photo was created?
… to get your revenue share?
… not to rekey what others have already entered?
… to create high quality colour images for different types of publications, without manual intervention?”

His overriding theme, echoing the principles put forth in SAA’s own
“Metadata Manifesto” is that we should not throw away metadata – we
need to preserve it. IFRA has done extensive work regarding automatic
image processing issues. They found that the preservation of EXIF data
(such as ICC profile related data) is needed for proper and efficient
image processing.

He left the audience with some conclusions and open issues: the
need for well-defined mappings between EXIF, IPTC IIM, IPTC Core,
PLUS, and other schemas; “write-once” metadata values; a means to
handle versioning of metadata values; and improved support for
controlled vocabularies at the user interface level. In addition,
imaging devices are no longer confined to cameras, and mobile phones
don’t provide EXIF info.
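As a sketch of what a well-defined mapping might look like, here is an illustrative (and deliberately incomplete) table of corresponding fields across schemas; the logical concept names and the table structure are my own framing, not from the talk:

```python
# Illustrative field-mapping table across metadata schemas. Only a few
# well-known correspondences are shown; real mappings cover many more
# fields and edge cases.
FIELD_MAP = {
    # concept:  (EXIF tag,   IPTC IIM field,     XMP/IPTC Core property)
    "creator":   ("Artist",    "By-line",          "dc:creator"),
    "copyright": ("Copyright", "Copyright Notice", "dc:rights"),
    "caption":   (None,        "Caption/Abstract", "dc:description"),
}

def xmp_property(concept):
    """Look up the XMP property name for a logical concept, if mapped."""
    entry = FIELD_MAP.get(concept)
    return entry[2] if entry else None

print(xmp_property("creator"))  # → dc:creator
```

A shared table like this is what lets software synchronize a value written under one schema into its counterparts, instead of silently dropping it.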

= Global Metadata Rights Standards

Jeff Sedlik (PLUS)
Sedlik stated that “If keywords are chaos,
then intellectual property (copyright) marking is mayhem.” He then
explained how the mission of PLUS – to simplify and facilitate image
licensing – can make a difference by providing “Global Metadata
Standards for the Communication of Image Rights.” The freely available
PLUS glossary contains over 1,500 license and intellectual property
related terms. Words and definitions from a large number of image
agencies the world over were collected, synthesized, and reviewed by
panels of volunteers to build a common glossary in American English,
which was approved by both licensors and licensees. The glossary
reflects the common practice of PLUS, which is to first create a
master in one language and then let local working groups translate it
into other languages.

The PLUS media matrix and license data format contain standardized
menus which can be used within e-commerce solutions. This allows the
buyer to use the same codes and/or definitions regardless of image
supplier. PLUS codes can be embedded in the image, and held in a
central registry.

To simplify the adoption of PLUS, they have built a set of standard
licenses for rights-managed material called PLUS Packs. The Stock
Artists Alliance (SAA) has released a free open-source price
calculator which uses the PLUS Packs and is designed so that
photographers and image distributors can populate it with their own
pricing information.

= “Use the label, make it stick”

Sarah Saunders (BAPLA)
Saunders has worked with John Moore
(manager at Conde Nast and Pic4Press) on digital workflow issues. She
discussed how the British Association of Picture Libraries and
Agencies (BAPLA) has worked to develop a single custom panel within
Photoshop for metadata entry.

She asked the crowd, “Why do we need metadata?” and then answered it for them.

These issues, in tandem with orphan works, explain the increased
importance of metadata. Without being able to identify the
photographer or know who holds the copyright, image users might be
able to use images without making any payment.

The sheer number of different fields creates confusion for those
entering metadata. The Pic4Press group identified the most important
fields that should always be entered, and then, along with BAPLA,
created their own custom Photoshop template. This template only makes
use of existing fields; unfortunately, it only works with Photoshop CS
at present. The main goal was to get people started entering at least
the most important information, as many agencies still tend to simply
put all metadata into the caption/description field.

Sarah emphasized that the Caption, Credit and Picture Number (Document Title) should be entered into ALL images.

Gunar Penikis (Adobe)
Penikis emphasized that media objects need
to become more intelligent, noting that metadata has become “a data
exchange technology.” Workflows are becoming completely digital.
Digital cameras capture and upload straight into the production
process. The same image file may be used for several outputs: print,
web, CD, handheld, wireless, eBook, RSS, video. In this workflow
metadata plays a central role. As Penikis declared, “If you can’t
describe it, you can’t control it.”

Penikis showed how XMP works, how it can hold multiple forms of
metadata (EXIF, IIM, IPTC Core) within a single container, and how it
can be extended – allowing other applications to edit XMP without
fully understanding the file format. XMP can support formats beyond
still images, such as AVI, WAV, MOV, MP3, and MPEG. In addition, the
new version of Bridge in CS3 allows non-destructive edits to JPEGs
and TIFFs.
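To illustrate why that container design matters, here is a minimal sketch (my own, not Adobe’s code) of how a reader can locate an XMP packet inside a host file by scanning for the well-known `x:xmpmeta` wrapper. The sample bytes and the flat `dc:creator` element are simplified for the demo (real XMP stores creators in an `rdf:Seq`):

```python
# Sketch: XMP is serialized as RDF/XML wrapped in an <x:xmpmeta> element,
# so a reader can pull it out of almost any host format without fully
# understanding that format.
import xml.etree.ElementTree as ET

def extract_xmp(data):
    """Return the first XMP packet found in a byte stream, or None."""
    start = data.find(b"<x:xmpmeta")
    end = data.find(b"</x:xmpmeta>")
    if start == -1 or end == -1:
        return None
    return data[start:end + len(b"</x:xmpmeta>")].decode("utf-8")

# Hypothetical file contents: arbitrary binary around a tiny XMP packet.
sample = (
    b"\xff\xd8...image data..."
    b'<x:xmpmeta xmlns:x="adobe:ns:meta/">'
    b'<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">'
    b'<rdf:Description xmlns:dc="http://purl.org/dc/elements/1.1/">'
    b"<dc:creator>Jane Doe</dc:creator>"
    b"</rdf:Description></rdf:RDF></x:xmpmeta>"
    b"...more image data..."
)

packet = extract_xmp(sample)
root = ET.fromstring(packet)
creator = root.find(".//{http://purl.org/dc/elements/1.1/}creator")
print(creator.text)  # → Jane Doe
```

Because the packet is plain XML, any tool can read or edit it and write it back, leaving the rest of the file untouched.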

Penikis showed how easy it is to build and install custom metadata
panels for Adobe Photoshop and Bridge. Bridge can now also be
extended with Flash panels, which can have a nice user interface and
via scripting be extended to send images through FTP or web services.

Joe Schorr (Apple)
Photographers love having metadata in their
images, stated Joe Schorr of Apple, but many fail to include essential
metadata with their images because the process is too complex and
cumbersome with most of today’s photo tools. He also remarked that,
as Apple’s Senior Product Manager, he was pleased to see lots of
Macs in use in the room.

Schorr then proceeded to show several ways that Apple Aperture can
be used to input metadata for images. He showed how you can add
metadata to a batch of images on ingest using metadata presets. He
demonstrated the Keywords “Heads up Display” (HUD), which is now
lockable (as of 7 months ago) to maintain a strict controlled
vocabulary. He showed how you can drag keywords to the images, or use
button sets to append frequently used keywords. He also talked about
how AppleScript can be used to automate various metadata functions.

With Aperture you can create custom metadata views of selected
fields, or use existing ones such as the BAPLA Pic4Press or Getty
templates. Underneath there is a real list view built on a heavy-duty
database (none of the metadata is stored in the images until you
export them). All the metadata fields they support (the older IIM
standard) are embedded at file export. If you export the metadata for
a RAW file, it is saved as an .XMP sidecar. Aperture also supports
plug-ins that let you enter custom metadata (such as one for Getty
Images). The metadata in the images is also tied into the operating
system’s search, called “Spotlight.”

= Advancing EXIF utility

Hiroshi Maeno (Canon)
It was great to see a number of the major
professional digital camera manufacturers represented at this
conference. Hiroshi Maeno, from Canon, Japan, gave a good overview of
the history of the EXIF metadata standard.

He then explained some of the interesting metadata applications they
have devised for their cameras. For example, they have an image
verification kit that works with their more advanced cameras (20D and
later) so that you can tell whether an image has been retouched. The
new EOS-1D Mark III allows for validation of the metadata (such as
GPS) as well as the image content (note: this feature must be
activated in camera prior to shooting, and its use may impact your
shooting speed). To run the validation you need an unmodified original
image file for comparison.

Canon also has a wireless transmitter that allows photographers to
send images directly to a picture desk at an event. The Canon EOS
WFT-E2 can handle wireless input from several cameras simultaneously.

When asked whether there is a means for photographers to add metadata
in-camera at the time of capture, they said they were considering this
but had no timeline at present. Representatives from Nikon were at the
conference as well (though none spoke), and they gave virtually the
same answer.

= Extensibility Throughout the Enterprise

Clemens Molinari (FotoWare)
Molinari gave a demo of the next
generation of FotoStation software (v5.3 beta), which has full support
for XMP through the FotoWare MDC (Metadata Configuration) used
throughout their entire suite of products. This allows the
photographer either to embed metadata or to have it stored in sidecar
files (and is fully backwards compatible with the IPTC IIM schema).

It is also fully extensible, allowing for multiple standards and
multiple namespaces. This makes FotoWare the first software developer
other than Adobe to offer this facility. Not only can you import
custom panels from Adobe Photoshop, you can also create your own
custom panels for entering or editing metadata. This information can
then be attached not only to still image files, but to RAW files,
video, audio, and other types of digital assets.

Molinari also showed that FotoStation v5.3 provides support for the
IPTC Subject Codes, which will be very good news for newspapers
wishing to use them. The beta also allows for the storage of multiple
languages (captions and keywords in both English and German) but has
no support for automatic translation. The new software offers Open
Database Connectivity (ODBC) support, allowing you to interface it
with other databases.

= Optimizing for Digital

Peter Stig (Hasselblad)
Stig spoke of how Hasselblad has worked
to optimize their entire system for digital, including changes in
lens design as well as software. He explained how Hasselblad handles
metadata in their FlexColor camera software (Hasselblad merged with
Imacon, the scanner/scanning-back company, in 2003), which looks
similar to the Lightroom or Aperture interfaces. Their software has
supported the IIM version of IPTC metadata since the first version;
in addition, they store a history log where you can see how the image
has been processed (data from this log can be embedded into the
exported file).

Hasselblad has returned to a proprietary format for their raw
capture. They offered support for the DNG format for a while but
discontinued that option, as DNG lacked support for some advanced
features used to correct lens distortion.

When asked if the documentation of the file format is publicly
available, we learned that it is not. However, they are working with
Apple and Adobe to allow their software to read the Hasselblad files.
Stig mentioned that while it may be possible to release this as an
open-source format in the future, the lens correction data itself
relies on proprietary information.

= “The Truth Is In The File”

Josh Weisberg (Microsoft)
Josh Weisberg of Microsoft explained
how Windows Vista delivers significant improvements in photo metadata
handling. Their motto is “The Truth Is in the File”: when metadata
properties change, the new values are always written back to the file.
This provides information portability – wherever the file goes, the
metadata goes.

Today metadata is used by many applications, but in different ways,
and previously every application developer had to write his own
metadata handler or rely on third-party libraries. As an alternative,
Microsoft has created the Windows Imaging Component (WIC), an
extensible, system-wide platform for image handling that will reduce
the need for sidecar files. It ships with Vista and is available for
Windows XP and server products.

The WIC policy handler has two functions: to deal with codecs and
to handle metadata. Weisberg explained that applications can use WIC
to read and write image formats, perform RAW conversions, and read
and write metadata.

This makes it possible for developers to support multiple metadata
standards in their applications. Out of the box, the WIC policy
handler supports EXIF, IPTC, and XMP. Weisberg explained that they
want to “bring RAW files to the masses,” and decided that camera
manufacturers will develop their own codecs for their cameras. This
should mean no delay waiting for the OS or applications to catch up
when new devices are released.

Microsoft bought the image asset manager iView Multimedia in July
2006, completed its incorporation, and released it under the name
Microsoft Expression Media. Weisberg also mentioned that Expression
Media has been integrated with the Vista OS so that hierarchical
keywords/metatags are even compatible with Lightroom, though he
didn’t go into details on how this is handled.

= Wrap-up Panel Discussion
There were opportunities for
those attending to ask a few questions of each of the presenters, but
the most interesting ones came in the wrap-up at the end. While no
representative from Nikon spoke at the conference – they responded too
late to the invitation – they did send several representatives to
observe. Both Canon and Nikon asked what fields photographers need
for entering metadata at the time of capture. This was a bit
unexpected, and some panelists mentioned that it was a tough question
to answer without knowing how the metadata would be entered into the
camera. The chair of the Photo Metadata Working Group said they would
explore this question and be in contact with them in the near future.

There was some discussion on whether a conference of this type
would be a regular event, but there was no conclusion at this time.

= Final Notes

Download the IPTC Photo Metadata White Paper and transcripts of many
of the presentations given at the Conference from http://phmdc.org/

This report presented June 23, 2007 by David Riecks with assistance from Betsy Reid.
—
David Riecks (that’s “i” before “e”, but the “e” is silent)
david@riecks.com
http://www.riecks.com/
Midwest/Chicago ASMP