Detecting Image Forgery Through the Integration of Lightweight Deep Learning Models
DOI: https://doi.org/10.62647/

Abstract
In the digital era, sectors such as judicial processes, social media platforms, and insurance fraud investigation increasingly rely on images and videos as persuasive forms of evidence. The flexibility of digital image editing tools gives reason to doubt an image's authenticity, especially when no obvious signs of manipulation are present. Researchers in image forensics are therefore tasked with developing techniques that can detect image forgery. To date, research has focused on three primary classes of cues for detecting modifications or forgeries: feature descriptors, inconsistent shadows, and double JPEG compression. Image manipulation detection remains a major challenge for many online information systems, social media platforms, and real-time applications. Conventional detection approaches are limited by their long-held reliance on hand-crafted characteristics, such as size and contrast, to identify signs of photo alteration. This study introduces a fusion-based decision procedure for the detection of image forgery. Three lightweight deep learning models (SqueezeNet, MobileNetV2, and ShuffleNet) form the basis of the decision fusion, which operates in two phases. First, the pretrained weights of the deep learning models are used to determine whether images are authentic. Then, the forgery detection results obtained with fine-tuned weights are compared against the pre-existing models. The experimental data show that the fusion-based decision strategy achieves higher accuracy than state-of-the-art methods.

Keywords: image fusion, support vector machines, detection, deep learning, lightweight models, light fusion
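The abstract does not specify the exact fusion rule, but a common way to combine the class probabilities of several classifiers is weighted soft voting. The sketch below illustrates that idea for three models' outputs; the function name, the use of validation-accuracy weights, and the label convention (0 = authentic, 1 = forged) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def fuse_predictions(prob_list, weights=None):
    """Weighted soft-voting fusion of per-model class probabilities.

    prob_list: list of (n_samples, n_classes) arrays, one per model.
    weights:   optional per-model weights (e.g., validation accuracies);
               defaults to equal weighting.
    Returns the fused class index per sample (assumed: 0 = authentic, 1 = forged).
    """
    probs = np.stack(prob_list)                 # (n_models, n_samples, n_classes)
    if weights is None:
        weights = np.ones(len(prob_list))
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                             # normalize weights to sum to 1
    fused = np.tensordot(w, probs, axes=1)      # weighted average of probabilities
    return fused.argmax(axis=1)

# Toy example: hypothetical softmax outputs of the three lightweight models
# for two images (rows) over two classes (columns).
squeezenet = np.array([[0.70, 0.30], [0.40, 0.60]])
mobilenet  = np.array([[0.60, 0.40], [0.30, 0.70]])
shufflenet = np.array([[0.80, 0.20], [0.45, 0.55]])

pred = fuse_predictions([squeezenet, mobilenet, shufflenet])
# First image is fused to class 0 (authentic), second to class 1 (forged).
```

Passing per-model weights derived from held-out accuracy would let stronger models dominate the vote, which is one plausible reading of the paper's "improved weights" phase.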
License
Copyright (c) 2023 Mrs. Ethakula Avyaktha, P. Sahithi Reddy (Author)

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.