Reliable Deepfake Detection: Evaluating model uncertainty using Bayesian Approximations

Paper 425: The AI-based Deepfake Detection System aims to detect fake or manipulated imagery from a wide range of generative models.

Published: 1 October 2025
  • Juil Sock, Senior Principal Data Scientist
  • Marc Górriz Blanch, Senior Data Scientist
  • Woody Bayliss, Senior Data Scientist
  • Danijela Horak, Head of Applied Research, AI

With generative AI advancing rapidly, deepfake media poses a growing threat to the credibility of information and visual content used by journalists and media organisations. The AI-based Deepfake Detection System (DFDS), developed by the Computer Vision Team at BBC R&D, aims to detect fake or manipulated imagery from a wide range of generative models, and reaches accuracies comparable to those of commercial deepfake detectors. However, the model often suffers from overconfident predictions, especially on ambiguous or out-of-distribution inputs, resulting in misclassifications and reduced trust in the detection model.

This paper expands upon that work and lays down a framework for quantifying model uncertainty by integrating epistemic and heteroscedastic aleatoric uncertainty into the DFDS pipeline. Using Monte Carlo dropout and predictive logit distributions, the system not only provides confidence estimates alongside its probabilistic predictions, but also increases the model's robustness on out-of-distribution datasets. Experiments on diverse datasets show that incorporating uncertainty leads to more interpretable, cautious and reliable predictions, which is particularly beneficial in high-risk scenarios. The results highlight the importance of uncertainty-aware detection in building trustworthy AI systems for critical media applications.
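To make the Monte Carlo dropout idea concrete, here is a minimal, illustrative PyTorch sketch of epistemic uncertainty estimation. The tiny classifier, its layer sizes and the dropout rate are placeholders, not the actual DFDS architecture (which is not described on this page); the technique itself — keeping dropout active at inference and averaging over stochastic forward passes — is the standard MC dropout recipe.

```python
import torch
import torch.nn as nn

# Stand-in for the real detector: a toy binary classifier with a dropout
# layer. Only the MC dropout mechanics below are the point of this sketch.
class TinyDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(16, 32),
            nn.ReLU(),
            nn.Dropout(p=0.5),   # kept active at inference for MC dropout
            nn.Linear(32, 2),    # real vs. fake logits
        )

    def forward(self, x):
        return self.net(x)

def mc_dropout_predict(model, x, n_samples=50):
    """Run n_samples stochastic forward passes with dropout enabled.

    Returns the mean softmax probability and the per-class standard
    deviation, a simple proxy for epistemic uncertainty.
    """
    model.train()  # train mode enables dropout; here it is the only stochasticity
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
        )
    return probs.mean(dim=0), probs.std(dim=0)

model = TinyDetector()
x = torch.randn(4, 16)            # a batch of 4 dummy feature vectors
mean_p, std_p = mc_dropout_predict(model, x)
print(mean_p.shape, std_p.shape)  # both are (4, 2)
```

A high standard deviation across the sampled predictions flags inputs the model is epistemically uncertain about, which is exactly the signal a cautious detector can use to abstain rather than misclassify.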

This paper is authored by BBC Research & Development's Nikita Balodhi, Woody Bayliss, Marc Gorriz-Blanch, Juil Sock and Danijela Horak.

White Paper copyright

© BBC. All rights reserved. Except as provided below, no part of a White Paper may be reproduced in any material form (including photocopying or storing it in any medium by electronic means) without the prior written permission of BBC Research except in accordance with the provisions of the (UK) Copyright, Designs and Patents Act 1988.

The BBC grants permission to individuals and organisations to make copies of any White Paper as a complete document (including the copyright notice) for their own internal use. No copies may be published, distributed or made available to third parties whether by paper, electronic or other means without the BBC's prior written permission.
