Clarifai Raises $30M in Series B Funding

Clarifai, a NYC-based visual recognition AI company that uses machine learning to understand images and videos, raised $30m in Series B funding.

The round was led by Menlo Ventures, with participation from existing investors including Union Square Ventures, Lux Capital, Qualcomm Ventures, Osage University Partners, and others.

The company, which has raised $41.25m in combined funding to date, intends to use the funds to accelerate the expansion of its engineering and business teams, hire key research talent and continue artificial intelligence innovation with new product releases.

Founded in 2013 by CEO Matthew Zeiler, Clarifai provides image and video recognition technology – built on machine learning systems and made accessible via an API – that allows developers to build intelligent applications.
The company’s API recognizes more than 11,000 different concepts in photos and videos, and also provides domain-specific recognition tools, including the Not Safe For Work adult content moderation model, which can recognize potentially offensive nudity; the Travel model, which can recognize travel-related concepts like hot tub, kids area, and indoor swimming pool; and the Food model, which can recognize food images down to the ingredient level.

Clarifai recently released Custom Training, which allows users to teach the visual recognition API new concepts, and Visual Search, which allows any user to organize, access, or recommend images or products by visual similarity and/or keyword, helping developers and businesses better connect their users with what they're looking for.
The technology, previously only available to large tech companies, is now available to anyone through the Clarifai API.

The company's AI platform currently predicts more than 1.2 billion concepts in photos and videos every month from applications built by clients ranging from Fortune 500 companies to startups and small development teams, including Buzzfeed, Trivago, 500px, and StyleMePretty.


