(Reference retrieved automatically from the Web of Science based on the FAPESP grant number cited by the authors in the publication.)

Connecting the dots: Toward accountable machine-learning printer attribution methods

Author(s):
Navarro, Luiz C. [1] ; Navarro, Alexandre K. W. [2] ; Rocha, Anderson [1] ; Dahab, Ricardo [1]
Total Authors: 4
Affiliation:
[1] Univ Campinas UNICAMP, Inst Comp, Campinas, SP - Brazil
[2] Univ Cambridge, Engn Dept, Cambridge - England
Total Affiliations: 2
Document type: Journal article
Source: JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION; v. 53, p. 257-272, MAY 2018.
Web of Science Citations: 3
Abstract

Digital forensics is rapidly evolving as a direct consequence of the adoption of machine-learning methods combined with ever-growing amounts of data. Although these methods yield more consistent and accurate results, they may face adoption hindrances in practice if their results are not presented in a human-interpretable form. In this paper, we exemplify how human-interpretable (a.k.a. accountable) extensions can enhance existing algorithms to aid human experts, by introducing a new method for the source printer attribution problem. We leverage the recently proposed Convolutional Texture Gradient Filter (CTGF) algorithm's ability to capture local printing imperfections to introduce a new method that maps and highlights important attribution features directly onto the investigated printed document. Supported by Random Forest classifiers, we isolate and rank features that are pivotal for differentiating a printer from others, and back-project those features onto the investigated document, giving analysts further evidence about the attribution process. (AU)
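The pipeline summarized above can be illustrated with a minimal Python sketch: per-patch texture descriptors, a Random Forest that ranks discriminative feature dimensions, and back-projection of the top-ranked features onto the page as a coarse heatmap. This is not the authors' implementation; extract_features below is a placeholder (a simple gradient-magnitude histogram) standing in for the actual CTGF descriptor, and the patch size, top_k, and forest settings are hypothetical.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(patch):
    """Placeholder texture descriptor: histogram of gradient magnitudes.
    The real method uses the CTGF descriptor described in the paper."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    hist, _ = np.histogram(mag, bins=32, range=(0.0, 255.0))
    return hist / (hist.sum() + 1e-9)

def train_attributor(patches, printer_labels):
    """Fit a Random Forest on per-patch descriptors, one label per printer."""
    X = np.stack([extract_features(p) for p in patches])
    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    clf.fit(X, printer_labels)
    # Rank descriptor dimensions by impurity-based importance; the
    # top-ranked dimensions play the role of the features that are
    # "pivotal for differentiating a printer from others".
    ranked = np.argsort(clf.feature_importances_)[::-1]
    return clf, ranked

def back_project(document, clf, ranked, top_k=8, patch=64):
    """Score each patch by how strongly its top-ranked features fire,
    yielding a heatmap an analyst can overlay on the printed page."""
    h, w = document.shape
    heat = np.zeros((h // patch, w // patch))
    weights = clf.feature_importances_[ranked[:top_k]]
    for i in range(heat.shape[0]):
        for j in range(heat.shape[1]):
            block = document[i*patch:(i+1)*patch, j*patch:(j+1)*patch]
            f = extract_features(block)
            heat[i, j] = float(f[ranked[:top_k]] @ weights)
    return heat

Scikit-learn's impurity-based feature_importances_ is one plausible way to realize the "isolate and rank features" step; the paper may use a different ranking criterion, and the back-projection here is only a patch-level approximation of the highlighting the authors describe.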

FAPESP's process: 17/12646-3 - Déjà vu: feature-space-time coherence from heterogeneous data for media integrity analytics and interpretation of events
Grantee: Anderson de Rezende Rocha
Support Opportunities: Research Projects - Thematic Grants