Abstract
The mainstream AI community has seen a rise in large-scale open-source classifiers that are pre-trained on vast datasets and evaluated on standard benchmarks; however, users with diverse needs and limited, expensive test data may be overwhelmed by the available choices. Deep Neural Network (DNN) classifiers undergo training, validation, and testing phases using example datasets, and the testing phase is focused on measuring the classification accuracy on test examples without examining the inner workings of the classifier. In this work, we evaluate the training quality of a DNN classifier without any example dataset. We model a DNN as the composition of a feature extractor and a classifier, the latter being the final fully connected layer. The quality of the classifier is estimated from its weight vectors. The feature extractor is characterized by two metrics computed on the feature vectors it produces when synthetic data is fed as input; these synthetic inputs are generated by backpropagating desired classifier outputs to the input. Our empirical study of the proposed method on ResNet18, trained on the CIFAR10 and CIFAR100 datasets, confirms that dataless evaluation of DNN classifiers is indeed possible.
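
To make the setup concrete, the following is a minimal PyTorch sketch, not the authors' exact procedure: it splits a torchvision ResNet18 into a feature extractor and its final fully connected classifier, illustrates a weight-based classifier statistic (pairwise cosine similarity between class weight vectors, used here only as an example), and synthesizes an input by backpropagating a desired classifier output to the input. The input size, optimizer settings, and the specific metric are assumptions; the paper's two feature-extractor metrics are not reproduced here.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

# A CIFAR10-sized head; trained weights are assumed to be loaded in practice.
model = resnet18(num_classes=10)
model.eval()

# Feature extractor = all layers up to and including global average pooling;
# classifier = the final fully connected layer (model.fc).
feature_extractor = nn.Sequential(*list(model.children())[:-1], nn.Flatten())
classifier = model.fc

# Illustrative weight-based statistic: pairwise cosine similarity
# between the classifier's class weight vectors.
W = classifier.weight.detach()
cos_sim = nn.functional.cosine_similarity(W.unsqueeze(1), W.unsqueeze(0), dim=-1)

def synthesize_input(target_class: int, steps: int = 200, lr: float = 0.1) -> torch.Tensor:
    """Generate a synthetic input by optimizing it so the classifier
    produces the desired output (gradients backpropagated to the input)."""
    x = torch.randn(1, 3, 32, 32, requires_grad=True)  # CIFAR-sized input, assumed
    opt = torch.optim.Adam([x], lr=lr)
    target = torch.tensor([target_class])
    for _ in range(steps):
        opt.zero_grad()
        logits = classifier(feature_extractor(x))
        loss = nn.functional.cross_entropy(logits, target)
        loss.backward()
        opt.step()
    return x.detach()

# Feature vectors of synthetic inputs, on which feature-extractor
# metrics would then be computed.
features = feature_extractor(synthesize_input(0))
```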