Billing itself as surfacing “everything that is happening around the world, as captured by the crowd,” Eyein invites users to search for photos and videos from sports and concerts, or indeed news events like demonstrations and natural disasters, in real time, as the content appears on Facebook, Instagram, or Twitter.
The event-based (and hence, in effect, location-based) search results contrast with the popularity-based results retrieved from engines like Google Image search. This also means users will not need to rely on hashtags, or on selections of videos and images curated by media outlets. Mobli also says Eyein filters out the white noise of irrelevant results, like selfies, even when they are associated with events.
Eyein’s approach is based on a mix of image recognition technology, natural language processing applied to titles, captions, and other metadata, and machine learning. It will be available as a website and a mobile app.
Eyein for Publishers leverages the new engine’s capabilities to provide a plug-in album of videos and images, updated in real time, to complement a publisher’s own content. Publishing partners already signed up include the Huffington Post.