[1] "Improving Data Entry Accuracy Using Distil-BERT: An Efficient Extension to BERT-Based NLP Models," IJITCE, vol. 13, no. 4, pp. 95–101, Nov. 2025, doi: 10.62647/.