Technology helps consumers get the look: Image recognition apps have moved into fashion retail, but not all can survive

(Guardian (UK)) When Cara Delevingne told Vogue that one of her favourite apps was the newly released Asap54 - which uses visual-recognition technology to identify clothes - it was a PR shot in the arm for the new player in an area where competition to become the definitive technology is rife.

Snap Fashion in the UK, Style-Eyes from Ireland and Slyce in Canada are just a few of the companies that are using elaborate software to allow shoppers to take a picture of clothing on their smartphone and then be linked to a retailer where they can buy that piece or something similar.

Image-recognition software, where algorithms are used to identify and match one image with another, has been used in security and marketing for a number of years, and the move into fashion is seen as one of the first steps in the widespread commercialisation of the technology.

Jenny Griffiths laid the groundwork for what would become Snap Fashion during her degree at the University of Bristol, from which she graduated in 2009. Launched during London fashion week in September 2012, it notched up 250,000 users in its first year.

When a picture is taken of a shoe or a piece of clothing, the software - on a desktop or mobile - analyses it by looking at the colour, pattern and shape, and tries to find a match on an existing database of products from 170 retailers, ranging from New Look to Harrods. Another app called ColourPop matches products solely by colour.
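The kind of feature matching described - reducing an image to colour, pattern and shape descriptors and finding the nearest entry in a product database - can be sketched in miniature. The catalogue, the single-channel colour histograms and the function names below are illustrative assumptions, not any vendor's actual system; real matchers also encode pattern and shape.

```python
from math import dist

# Hypothetical catalogue: each product reduced to a coarse, normalised
# colour histogram (4 brightness bins over one channel, for simplicity).
CATALOGUE = {
    "red court shoe":  [0.70, 0.10, 0.10, 0.10],
    "navy tote bag":   [0.05, 0.05, 0.20, 0.70],
    "floral dress":    [0.30, 0.40, 0.20, 0.10],
}

def colour_histogram(pixels, bins=4):
    """Bucket single-channel pixel values (0-255) into a normalised histogram."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels) or 1
    return [h / total for h in hist]

def best_match(query_hist, catalogue):
    """Return the catalogue item whose histogram is closest to the query."""
    return min(catalogue, key=lambda name: dist(query_hist, catalogue[name]))

# A query image that is mostly bright pixels lands nearest the navy tote's
# histogram, so that is the product suggested to the shopper.
query = colour_histogram([220, 230, 240, 250, 235, 210, 245, 200, 60, 15])
print(best_match(query, CATALOGUE))
```

The hard part, as the firms quoted below note, is that the same garment photographed from a different angle or on a seated wearer produces a very different histogram - which is why fashion is a harder target than static objects.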

"I think it is really fascinating but also someone's job to push technology into consumer spaces because we are wasting the tools [smartphones] we are carrying around every day," said Griffiths. Snap Fashion has created an app for Westfield shopping centres that lets shoppers upload a picture and then receive suggestions from the different ranges in stock.

Mark Hughes, who has worked in image recognition for 10 years, is a co-founder of Dublin-based Style-Eyes, which has attracted about 65,000 users since it launched last year. Users take a picture of what they want to find, draw an outline round it and the image is matched against a database of 1.5m pieces of clothing, shoes and handbags. Suggestions of where they can buy something similar are shown, drawn from about 600 shops.

"If someone matches a very expensive dress, the chances are they will not be able to afford it but what we intend to do is bring back similar ones that might be in the price range so you can filter all the results with what your intended range is - [for example] 'Find me something like that that is less than pounds 200'," said Hughes.
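The budget filter Hughes describes - "find me something like that for less than £200" - is a simple post-processing step on the matcher's results. The result format and field names here are assumptions for illustration, not Style-Eyes' actual API.

```python
def similar_in_budget(results, max_price):
    """Keep only matched items at or under the price ceiling, cheapest first."""
    return sorted((r for r in results if r["price"] <= max_price),
                  key=lambda r: r["price"])

# Hypothetical results for a matched designer dress.
results = [
    {"name": "designer original", "price": 1200},
    {"name": "high-street copy",  "price": 45},
    {"name": "mid-range similar", "price": 180},
]
affordable = similar_in_budget(results, 200)
```

Here the £1,200 original is dropped and the shopper sees only the £45 and £180 alternatives.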

Both firms make commission when items are bought via their sites - Snap Fashion makes 5-15% and Style-Eyes 5-12%. Style-Eyes says it receives up to 15p each time a user clicks through to a retailer. Both say just over one-third of searches result in users going through to a retailer's website.
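The revenue figures quoted reduce to simple arithmetic; the £60 basket and the 1,000 click-throughs below are illustrative numbers, not reported ones.

```python
def commission(basket_value, rate):
    """Commission earned on a completed purchase at a given rate."""
    return basket_value * rate

def click_revenue(clicks, fee_per_click=0.15):
    """Revenue from click-throughs at a flat per-click fee, in pounds."""
    return clicks * fee_per_click

# A £60 basket at Snap Fashion's top 15% rate earns £9 commission;
# 1,000 Style-Eyes click-throughs at 15p each earn £150.
print(commission(60, 0.15))
print(click_revenue(1000))
```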

Although there are numerous players in the field, one industry commentator from a fashion retailer said no single company has managed to reach the tipping point that would make it the standard technology for consumers and brands. Glitches occur when the programs cannot recognise clothing that lacks distinguishing features, and no one yet claims a 100% success rate.

"Fashion, from a technical perspective, is very difficult for image recognition mainly because clothes change because of what way someone is sitting or what direction you are taking it [the picture]," said Hughes.

"With standard image-recognition technology now, it works really well for static objects - the front of a box of cornflakes or the front of a big building - it never really changes, the structure remains the same. That is almost a solved problem but with fashion it is a lot more difficult."

Griffiths does not believe a 100% success rate is possible. "Sometimes you have that experience when you Google something and you say: 'Actually that is not what I am looking for' and you have to try a different search term. So just the nature of search is you are not going to get it right 100% of the time. It is all about getting the user to manage it when they get the wrong result, to getting them to the right one," she said.

Cortexica, a London-based firm, uses its FindSimilar software to build image-recognition facilities for retailers such as Zalando in Germany.

Iain McCready, chief executive of Cortexica, said the rapid development of technology would enable clothes to be identified from video within two generations of iPhones. Eight computer scientists in the staff of 20 at Cortexica are constantly updating the software, which was initially developed by Imperial College London.

So far the software has mainly been aimed at women's fashion - the largest part of the market - but it is expected that menswear will soon feature. Style-Eyes hopes to expand to the US in the summer, and Cortexica is already setting up there.

Griffiths said there was not a lot of difference between various visual search companies' products and that she expected one to emerge as the dominant technology in the future. "I like to think we are close to the tipping point as an industry.

"I think it will be one company that does [succeed] because I don't think consumers will be bothered downloading a load of visual search apps and testing them out for themselves."
(c) 2014 Guardian Newspapers Limited.