Visual search is one of those technologies we know is set to significantly impact the future of shopping; we're just not quite there yet. The ability for consumers to snap a shot of something they like on the street, then immediately find the same or a similar item for sale, is an appealing concept – who hasn't fancied the look of someone's coat on the subway, or shoes in a bar?
There are numerous apps out there promising this sort of service, but the results are mixed – surfacing ideal products on some occasions, and total misses on others. One of the companies betting it can get it right is Cortexica.
Its findSimilar™ software leverages sophisticated algorithms to mimic the way the human brain's visual cortex interprets the images we see every day. Cortexica white-labels this technology for retailers including Macy's, Zalando, Rent the Runway and more. The Macy's launch, announced just last week, sees the image recognition and visual search offering embedded in the retailer's iOS app in time for the holiday season. Users can upload their pictures, find equivalent products on Macys.com and make purchases immediately.
I spoke with Steve Semenzato, co-founder and VP of business development at Cortexica, about where visual search is headed, the role deep learning and data will play in its development, and why we're still five years out from true mainstream application.
Head over to Forbes.com for the full interview.