Energy-bounded Learning for Robust Models of Code

by Nghi D. Q. Bui, et al.

Learning code representations has many applications in software engineering, including code classification, code search, comment generation, and bug prediction. Code has been represented as tokens, syntax trees, dependency graphs, code navigation paths, or combinations of these, yet existing vanilla learning techniques share a major limitation in robustness: subtle alterations to the input easily cause the models to make incorrect predictions. To improve robustness, prior approaches focus on recognizing adversarial samples rather than valid samples that fall outside a given distribution, which we refer to as out-of-distribution (OOD) samples. Recognizing such OOD samples is the novel problem investigated in this paper. To this end, we propose to first augment the in-distribution datasets with out-of-distribution samples so that, when trained together, they improve the model's robustness. To incorporate these OOD samples into the training of source code models, we propose an energy-bounded learning objective that assigns a higher score to in-distribution samples and a lower score to out-of-distribution samples. Our evaluation shows that existing source code models become more robust: they are more accurate at recognizing OOD data while simultaneously more resistant to adversarial attacks. Furthermore, the proposed energy-bounded score outperforms existing OOD detection scores, including the softmax confidence score, the Mahalanobis score, and ODIN, by a large margin.
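The energy-bounded objective summarized above builds on the energy-based OOD detection framework: an energy score is derived from the classifier's logits, and squared hinge penalties push in-distribution energies below one margin and OOD energies above another. The sketch below illustrates this idea only; the function names, temperature `T`, and margin values `m_in`/`m_out` are illustrative assumptions, not the paper's exact formulation or hyperparameters.

```python
import numpy as np

def energy_score(logits, T=1.0):
    """Energy E(x) = -T * logsumexp(f(x)/T), computed stably.

    Lower energy indicates a more in-distribution input; the
    corresponding OOD "score" is typically the negative energy.
    """
    z = logits / T
    m = z.max(axis=-1, keepdims=True)
    return -T * (m.squeeze(-1) + np.log(np.exp(z - m).sum(axis=-1)))

def energy_bound_penalty(logits_in, logits_ood, m_in=-7.0, m_out=-1.0, T=1.0):
    """Squared hinge penalties on the energy of each batch.

    In-distribution energies above m_in and OOD energies below m_out
    are penalized; this term is added to the usual cross-entropy loss
    with some weight lambda during training.
    """
    e_in = energy_score(logits_in, T)
    e_out = energy_score(logits_ood, T)
    pen_in = np.maximum(0.0, e_in - m_in) ** 2   # push ID energy down
    pen_out = np.maximum(0.0, m_out - e_out) ** 2  # push OOD energy up
    return pen_in.mean() + pen_out.mean()
```

For uniform logits `[0, 0]` the energy is `-log 2`, while a confidently classified input such as `[10, 0]` yields a much lower (more negative) energy, so thresholding the energy separates the two regimes.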



Disentangling Confidence Score Distribution for Out-of-Domain Intent Detection with Energy-Based Learning

WOOD: Wasserstein-based Out-of-Distribution Detection

Energy-based Out-of-distribution Detection

Diffusion Denoised Smoothing for Certified and Adversarial Robust Out-Of-Distribution Detection

A Controlled Experiment of Different Code Representations for Learning-Based Bug Repair

Distribution-restrained Softmax Loss for the Model Robustness

Unsupervised Energy-based Out-of-distribution Detection using Stiefel-Restricted Kernel Machine
