Microsoft announced the launch of Microsoft Video Authenticator, a tool designed to spot when videos have been manipulated using deepfake technology.
Deepfakes are videos that have been altered using AI software, often to replace one person's face with another, or to change the movement of a person's mouth to make it look like they said something they didn't.
Microsoft acknowledged it's inevitable that deepfake technology will adapt to evade detection, but said the tool can still be useful in the meantime.
The company hopes the technology can help catch misinformation in the run-up to the 2020 US presidential election.