What to Do If AI Swaps Your Face? Take an “X-Ray” to Find Out

The saying “seeing is believing” has been handed down to this day, but it can no longer be counted as truth. From early Photoshop retouching and “beauty” filters onward, visual “faking” technology has steadily improved. In 2017, a developer open-sourced an AI face-swapping algorithm called DeepFake, and the Internet has seen a wave of face-swap creations ever since. Many video bloggers swap actors’ faces in movies to draw traffic to their accounts, and some developers have even built one-click face-swapping apps for users to download.

This technological carnival, however, quickly spun out of control. Some people swapped female celebrities’ faces into pornographic films, damaging the stars’ reputations; in the United States, spoof news videos with presidents’ faces swapped circulated online, disturbing political opinion for a time. If such incidents still feel remote, consider a more immediate and frightening scenario: you offend a hacker, who splices your face into a video of a crime, and that video is then submitted as judicial evidence.

“Research has found that about 30% of users cannot tell whether an image is authentic while viewing it, so we proposed to work on forgery-detection algorithms for faces,” Chen Dong, a senior researcher at Microsoft Research Asia, told China Business News. In November 2019, Chen Dong’s group proposed the face-swap detection algorithm “Face X-Ray.” In February this year, the paper describing the technique was accepted by the 2020 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2020).

In academia, research on detecting face forgeries has been very active, and most of it relies on training AI classifiers: researchers collect large numbers of real face images, along with images whose faces have been swapped by some algorithm, and use supervised learning to teach the AI which faces have been replaced. The limitation of this approach is that the trained AI can only detect the specific face-swapping algorithms it has seen. When a forged image comes from a known algorithm, detection accuracy can reach 99%; hand the AI an image made by an algorithm it has never learned, and accuracy drops below 70%. And for most face-swapped pictures, the underlying algorithm is impossible to judge with the naked eye, so detection by “classification” does not generalize.
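As an illustration of this “classification” route, here is a minimal training sketch, assuming PyTorch and torchvision with a hypothetical folder of labeled real and fake face crops; it shows the general recipe, not MSRA’s actual setup.

```python
# Minimal sketch of the "classification" approach described above,
# assuming PyTorch/torchvision; the dataset path and backbone choice
# are hypothetical placeholders, not MSRA's actual training setup.
import torch
import torch.nn as nn
from torchvision import datasets, transforms, models
from torch.utils.data import DataLoader

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Expects data/train/real/*.jpg and data/train/fake/*.jpg
train_set = datasets.ImageFolder("data/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=None)          # illustrative backbone
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: real vs. fake
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for images, labels in loader:                  # one epoch, for brevity
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
```

A detector trained this way inherits the blind spot the article describes: it learns the artifacts of the specific forgery algorithms present in `data/train/fake`.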

Face X-Ray, by contrast, can complete detection without knowing whether a photo is authentic or which face-swapping algorithm a forged image used. The training method: researchers blend real face images into synthetic fakes themselves, so that the blending boundary between the true and false regions is known in advance, and then use deep learning to teach the AI to find that boundary. The end result is that when the AI decides a face in an unfamiliar picture is fake, it responds with the forgery’s boundary, lighting up the blending boundary within an otherwise black image; if the picture shows a real face, the model gives no response and outputs a pure black image. The process is much like taking an “X-ray” of the picture.
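That self-blending idea can be sketched in a few lines. Below is a simplified illustration, assuming NumPy and OpenCV, with a centered elliptical mask standing in for the landmark-based masks used in practice; the CVPR paper defines the ground-truth “x-ray” for a soft mask M as 4·M·(1−M), which is zero away from the blending boundary and brightest exactly on it.

```python
# Simplified sketch of synthesizing a Face X-Ray training pair,
# assuming NumPy + OpenCV; the elliptical mask is a stand-in for the
# landmark-based masks used in practice.
import cv2
import numpy as np

def make_training_pair(background, foreground):
    """Blend `foreground`'s face region into `background`; return
    (blended image, ground-truth boundary map)."""
    h, w = background.shape[:2]
    mask = np.zeros((h, w), np.float32)
    # Hypothetical face region: an ellipse in the image center.
    cv2.ellipse(mask, (w // 2, h // 2), (w // 4, h // 3), 0, 0, 360, 1.0, -1)
    mask = cv2.GaussianBlur(mask, (31, 31), 0)      # soften the boundary
    m = mask[..., None]
    blended = (m * foreground + (1.0 - m) * background).astype(np.uint8)
    # Target "x-ray": 4 * M * (1 - M) is zero inside and outside the
    # swapped region and brightest on the blending boundary itself.
    xray = 4.0 * mask * (1.0 - mask)
    return blended, xray
```

Because the fakes are manufactured by the researchers themselves, the boundary label is free, and no examples from any particular face-swapping algorithm are needed.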

“The problem we want to solve is detecting all face-swapped images, including those made by algorithms that will appear in the future but do not yet exist,” Guo Baining, executive vice president of Microsoft Research Asia, told China Business News magazine.

Different pictures carry different “noise,” which may be one explanation of how Face X-Ray works.

For a long time, researchers in the Computer Vision Group at Microsoft Research Asia also took the “classification” route, once raising the accuracy of a classification-based detector from 98% to 99.6%. Later, as face-swapping techniques grew more complex and varied, the researchers conceived of a “universal detector,” but could not find a suitable algorithmic path. Then, in September 2019, they discovered by accident that Face X-Ray’s boundary-finding idea could suddenly raise detection accuracy on unknown face-swapping algorithms from 70% to more than 95%.

“A lot of research is like taking shots on goal. You may miss many times, but you must keep shooting to create the possibility of scoring,” Guo Baining said.

Besides generality, Face X-Ray’s other breakthrough is making AI detection of face swaps interpretable. Using “classification” to examine a face picture, the machine can only conclude whether the picture has been altered; it cannot point out where the replacement occurred. Face X-Ray can mark the boundary.

Over the past six months, the Face X-Ray team’s main work has been improving the accuracy of face-swap detection in video. The concrete method is to run detection on the video frame by frame to determine whether the face in a given clip is real or fake. From the forger’s side, swapping faces in video is harder than in still pictures, because it is difficult to guarantee the continuity of the fake images: the first few frames may show one person and the next few another. Another breakthrough Chen Dong is pursuing is that when a video’s face is swapped, the illumination of the image jitters; researchers may be able to detect this jitter to improve the detection algorithm’s accuracy. “We hope to develop this technology this year,” Chen Dong said.
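A rough sketch of such frame-by-frame screening, paired with a crude brightness-jitter statistic, might look like the following; here `predict_xray` is a hypothetical stand-in for a trained Face X-Ray model, and the jitter measure is only an illustrative proxy for the effect described above, not the method under development.

```python
# Sketch of frame-by-frame video screening plus a crude illumination-
# jitter measure, assuming OpenCV; `predict_xray` is a hypothetical
# callable returning a 0..1 boundary map for one frame.
import cv2
import numpy as np

def screen_video(path, predict_xray, xray_threshold=0.1):
    cap = cv2.VideoCapture(path)
    suspicious_frames, brightness = [], []
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        xray = predict_xray(frame)
        if xray.mean() > xray_threshold:   # bright boundary => likely fake
            suspicious_frames.append(idx)
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        brightness.append(gray.mean())
        idx += 1
    cap.release()
    # Frame-to-frame brightness variation as a rough proxy for the
    # illumination jitter the article describes.
    jitter = float(np.std(np.diff(brightness))) if len(brightness) > 1 else 0.0
    return suspicious_frames, jitter
```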

Even with these multiple breakthroughs, Face X-Ray is no panacea.

Face X-Ray can only mark where the blending boundary lies; it cannot give the deeper reasons behind its judgment. “For example, the real face and the fake face come from different source images, which gives the two parts different noise, and that is one reason the machine recognizes the result as fake. The noise difference is just one explanation for this algorithm; there may be other differences, discovered by the neural network itself,” Chen Dong said. In addition, image resolution and video compression affect the AI’s detection. If the face occupies only a dozen or so pixels in a picture, or the video’s compression ratio is high, Face X-Ray’s accuracy declines. Likewise, if a forger uses a computer to “create a fake face out of nothing,” rather than blending one image into another, detection accuracy also drops.
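The “different noise” intuition can be illustrated with a simple high-pass residual: subtracting a blurred copy of an image leaves mostly sensor and compression noise, whose statistics can differ between regions spliced from different sources. This is only an illustrative sketch of the intuition, assuming OpenCV, with a hypothetical file path and face region; it is not MSRA’s algorithm.

```python
# Illustrative sketch (not MSRA's method) of the "different noise"
# intuition: a high-pass residual exposes noise patterns that can
# differ between spliced regions from different source images.
import cv2
import numpy as np

def noise_residual(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    denoised = cv2.GaussianBlur(gray, (5, 5), 0)
    return gray - denoised          # what the blur removed: mostly noise

img = cv2.imread("suspect.jpg")     # hypothetical input path
residual = noise_residual(img)
# Compare residual statistics inside a suspected face region (here a
# hypothetical central crop) against the whole image; markedly
# different variance hints at content from different sources.
h, w = residual.shape
inside = residual[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
print(inside.var(), residual.var())
```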

When Face X-Ray decides that a face in an unfamiliar picture is fake, it lights up the blending boundary within an otherwise black image.

As new face-swapping techniques keep emerging, the scientists at Microsoft Research Asia must continually train Face X-Ray on images produced by newly appearing face-swap algorithms so that the detector keeps up with the “fake trend.” Today there are many databases of real and fake face videos to help researchers run experiments, such as the FaceForensics++ dataset, which contains 1,000 real videos and 4,000 manipulated ones, mostly news-related material.
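A video-level benchmark over such a dataset could be organized like the minimal sketch below; the directory layout and decision rule are hypothetical conveniences, not the actual structure of FaceForensics++.

```python
# Minimal sketch of benchmarking a detector on a dataset laid out as
# root/real/*.mp4 and root/fake/*.mp4 (hypothetical layout, not the
# actual FaceForensics++ structure); `video_is_fake` is any per-video
# classifier, e.g. built from the frame-level screening sketched above.
from pathlib import Path

def accuracy(root, video_is_fake):
    correct = total = 0
    for label, truth in (("real", False), ("fake", True)):
        for video in Path(root, label).glob("*.mp4"):
            correct += int(video_is_fake(video) == truth)
            total += 1
    return correct / total if total else 0.0
```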

But even when a picture is identified as face-swapped, it is hard for scientists to reverse-engineer the forgery technique from Face X-Ray’s output. For example, many face-swapping algorithms cannot make the lighting consistent between the real and fake faces, but the technique of Poisson blending can match the replaced face’s lighting to the original image; the method is to solve the Poisson equation for its unique solution. According to Chen Dong, there are currently three main families of face-swapping algorithms. The simplest pastes the face from one real picture into another real picture. The second is based on 3D face technology: three-dimensional face data is pasted into another picture and then rendered to match the pose and lighting of that picture’s original subject so that it blends in; this family is currently the mainstream of forgery. The third is based on deep adversarial networks, which use a deep learning network to generate faces outright, so the variety of fake faces is much greater.
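The simplest family, and the lighting match that Poisson blending provides, can both be sketched with OpenCV’s seamlessClone, which implements Poisson blending; the file paths and the elliptical face region below are hypothetical.

```python
# Sketch of the simplest forgery family (paste one face into another
# picture) using OpenCV's seamlessClone, an implementation of Poisson
# blending; file paths and the face region are hypothetical.
import cv2
import numpy as np

src = cv2.imread("donor_face.jpg")      # face to paste (hypothetical path)
dst = cv2.imread("target_photo.jpg")    # photo receiving the face

mask = np.zeros(src.shape[:2], np.uint8)
cv2.ellipse(mask, (src.shape[1] // 2, src.shape[0] // 2),
            (src.shape[1] // 4, src.shape[0] // 3), 0, 0, 360, 255, -1)
center = (dst.shape[1] // 2, dst.shape[0] // 2)

# Poisson blending solves for pixel values whose gradients match the
# donor while the boundary matches the target, so the lighting of the
# pasted face adapts to the destination photo.
result = cv2.seamlessClone(src, dst, mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("swapped.jpg", result)
```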

“Of course, there may also be countermeasures aimed specifically at Face X-Ray, adding particular perturbations to face-swapped pictures to avoid being detected by us. We cannot eliminate forgery; we can only raise its difficulty,” Chen Dong said. Detecting fake faces is, in effect, like filtering spam out of an inbox, or like the CAPTCHA technology that proves you are a real person rather than a machine. “Faking” and “counter-faking” are an escalating cat-and-mouse game: there is no perfect forgery technique and no perfect detection technique, and “counter-faking,” which plays defense, is passive most of the time.

At present, Face X-Ray’s detection accuracy averages above 95%, but the technology is still in research and development and lacks “field” experience. Guo Baining’s vision is that face-swap detection could one day be freely downloaded, like antivirus software, to filter fake face pictures and videos out of search engines. “However, technology is always upgrading and progressing, and it cannot completely solve a social problem. To combat face-swapping fraud, improving the technology alone is not enough; it also requires sound laws and the participation of the whole society,” he said.