Please use this identifier to cite or link to this item:
http://hdl.handle.net/123456789/412
Title: | A deep learning model with an inductive transfer learning for forgery image detection |
Authors: | Bevinamarad, Prabhu; Unki, Prakash H; Bhandage, Venkatesh |
Keywords: | Copy-move forgery; Deep learning; Image tampering; Image-splicing; Inductive transfer learning |
Issue Date: | 30-Sep-2024 |
Publisher: | Indonesian Journal of Electrical Engineering and Computer Science |
Series/Report no.: | 801-810; |
Abstract: | Due to the availability of affordable electronic devices and several advanced online and offline multimedia content editing applications, the frequency of image manipulation has increased. Manipulated images are presented as evidence in courtrooms, circulated on social media, and uploaded as authentic in order to deceive. This study implements a deep learning (DL) framework with inductive transfer learning (ITL), using a pre-trained network to benefit from its already-learned feature maps rather than training from scratch, and fine-tuning it to check and classify whether a suspected image is authentic or forged. To evaluate the proposed model, we used both the Columbia uncompressed image splicing detection (CUISD) dataset and the CoMoFoD dataset for training and testing. We measured the model's performance while varying the hyperparameters and confirmed the hyperparameter values that yield the best compromise. According to the evaluation results, our model classified new image instances with an average precision of 89.00%, recall of 86.43%, F1-score of 87.32%, and accuracy of 87.72%, and consistently performed better than other methods currently in use. |
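The abstract describes an inductive-transfer-learning setup: freeze a pre-trained backbone to reuse its learned feature maps, attach a new classification head, and fine-tune for a binary authentic-vs-forged decision. The sketch below illustrates that general pattern in Keras; the VGG16 backbone, input size, head layers, and optimizer settings are assumptions for illustration, not the paper's reported configuration.

```python
# Minimal sketch of an inductive-transfer-learning forgery classifier.
# Assumptions (not from the paper): TensorFlow/Keras, an ImageNet-pretrained
# VGG16 backbone, 224x224 RGB inputs, and illustrative head/optimizer choices.
import tensorflow as tf
from tensorflow.keras import layers, models


def build_forgery_classifier(input_shape=(224, 224, 3)):
    # Reuse feature maps discovered by the pre-trained network instead of
    # learning them from scratch (the inductive transfer learning step).
    base = tf.keras.applications.VGG16(
        weights="imagenet", include_top=False, input_shape=input_shape)
    base.trainable = False  # freeze the backbone; only the new head is trained

    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # authentic (0) vs. forged (1)
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),
        loss="binary_crossentropy",
        metrics=["accuracy",
                 tf.keras.metrics.Precision(name="precision"),
                 tf.keras.metrics.Recall(name="recall")])
    return model

# In use, model.fit(...) would be run on images drawn from the CUISD and
# CoMoFoD datasets (after VGG16-style preprocessing), split into training
# and test sets, with hyperparameters tuned as described in the abstract.
```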
URI: | http://hdl.handle.net/123456789/412 |
ISSN: | 2502-4752 |
Appears in Collections: | F P |
Files in This Item:
File | Description | Size | Format
---|---|---|---
Prabhu Bevinamarad.pdf | | 328.4 kB | Adobe PDF