PHASE 2 | Facebook AI Image Similarity Challenge: Matching Track

Advance the science of image similarity detection, with applications in areas including content tracing, copyright infringement, and misinformation. In the "Matching Track," participants develop models that determine whether a query image is derived from any image in a large reference corpus. #society
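
Embed-and-match is one common way to approach this task. The sketch below embeds every query and reference image with a pretrained backbone and scores pairs by cosine similarity; the model choice, the "queries/" and "references/" directory layout, and the threshold-free top-1 decision are illustrative assumptions, not the official baseline.

```python
# Illustrative sketch only: embed images with a pretrained backbone and score
# query/reference pairs by cosine similarity. The model choice, directory
# layout ("queries/", "references/"), and top-1 decision are assumptions.
from pathlib import Path

import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained backbone with the classifier head removed -> 2048-d descriptors.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval().to(device)

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def embed(paths):
    """Return an (N, 2048) array of L2-normalized image descriptors."""
    batch = torch.stack([preprocess(Image.open(p).convert("RGB")) for p in paths])
    feats = backbone(batch.to(device)).cpu().numpy()
    return feats / np.linalg.norm(feats, axis=1, keepdims=True)

query_paths = sorted(Path("queries").glob("*.jpg"))       # hypothetical layout
ref_paths = sorted(Path("references").glob("*.jpg"))      # hypothetical layout

q, r = embed(query_paths), embed(ref_paths)
scores = q @ r.T                 # cosine similarity: descriptors are unit-norm
best = scores.argmax(axis=1)     # nearest reference for each query

for i, j in enumerate(best):
    # A real system would also declare "no match" for queries whose best
    # score falls below some calibrated threshold.
    print(query_paths[i].name, ref_paths[j].name, f"{scores[i, j]:.3f}")
```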

$100,000 in prizes
Oct 2021
1,100 joined

Facebook AI

Facebook AI seeks to understand and develop systems with human-level intelligence by advancing the longer-term academic problems surrounding AI. Their research covers theory, algorithms, applications, software infrastructure, and hardware infrastructure across areas including computer vision, conversational AI, integrity, natural language processing, ranking and recommendations, systems research, speech & audio, human & machine intelligence, and more.

Image Similarity Challenge & Dataset

Misinformation is one of the most serious threats to trust online. While most users manipulate images in benign ways, there have been many instances where images manipulated with the intent to misinform or spread hate have led to harm both online and offline.

There are multiple cross-industry efforts to combat the threat of misinformation and abuse on social media. Data provenance is one area of focus, since it is a high-prevalence issue that applies across multiple domains, such as copyright infringement, integrity problems, and scams.

Facebook AI is engaging in this space by compiling a dataset to help build detection systems that better understand image similarity, and by running a challenge that provides global benchmarks for identifying manipulated images, with the goal of combating online abuses that involve image manipulation.

Getting scalable visual similarity detection right can reduce the exposure of online communities, businesses, and content moderators to harmful content from bad actors. The Image Similarity Dataset is an essential tool in the fight against malicious image manipulation.
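
Reference collections at this scale can run to millions of images, so exhaustively comparing every query against every reference quickly becomes the bottleneck. Below is a minimal sketch of how descriptor search is commonly made tractable, using the open-source FAISS library with random stand-in vectors; the indexing logic does not depend on how the descriptors were produced.

```python
# Illustrative sketch of the search side at scale, using the open-source FAISS
# library. Random vectors stand in for real image descriptors; the index
# parameters (nlist, nprobe) are assumptions to tune, not recommendations.
import faiss
import numpy as np

dim = 256
rng = np.random.default_rng(0)
refs = rng.standard_normal((100_000, dim)).astype("float32")
queries = rng.standard_normal((1_000, dim)).astype("float32")
faiss.normalize_L2(refs)      # unit-norm so inner product == cosine similarity
faiss.normalize_L2(queries)

# An inverted-file (IVF) index clusters the references and searches only the
# nprobe closest clusters per query, trading a little recall for a large
# speedup over exhaustive search.
quantizer = faiss.IndexFlatIP(dim)
index = faiss.IndexIVFFlat(quantizer, dim, 1024, faiss.METRIC_INNER_PRODUCT)
index.train(refs)
index.add(refs)
index.nprobe = 16

scores, ids = index.search(queries, 1)   # top-1 reference id per query
print(ids[:5].ravel(), scores[:5].ravel())
```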

You can learn more here: