Deep Fake Challenge

Any identification has zero value if it can be deceived

Day: November 14, 2022

  • Security

Facial recognition algorithms must be protected from attacks

  • Posted on November 14, 2022 (updated December 29, 2022)
  • by DFC

Biometrics researchers say master face attacks pose “a severe security threat” to under-protected facial recognition algorithms.


© Copyright 2019-2023 – Deep Fake Challenge