%0 Journal Article
%A CAI Xiao-dong
%A CHEN Yue-lin
%A WANG Chen-kui
%T Person Re-Identification Method Based on Image Style Transfer
%D 2021
%R 10.13190/j.jbupt.2020-147
%J Journal of Beijing University of Posts and Telecommunications
%P 67-72
%V 44
%N 3
%X The training sets of existing person re-identification models come from a limited number of fixed capture devices, so the sample styles lack diversity. A cycle-consistent generative adversarial network is used to transfer the styles of images captured by different cameras to one another, which increases the diversity of sample styles at low cost. To improve the generalization ability of the model, a new training mechanism that fuses positive and negative samples is designed: the style-transferred samples are treated as negative samples, the original samples before transfer are treated as positive samples, and both are fed into training simultaneously. To prevent overfitting and to account for the loss at false-label positions, label smoothing regularization is adopted; and to focus training on hard, error-prone samples and optimize the loss on negative samples, a focal loss function is adopted. Experiments show improvements of 1.51% and 2.07% on the Market-1501 and DukeMTMC-reID datasets, respectively.
%U https://journal.bupt.edu.cn/EN/10.13190/j.jbupt.2020-147
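
As a rough illustration of the loss design described in the abstract, the sketch below combines label smoothing regularization with a focal weighting term in a single classification loss. It is a minimal sketch only: the function name, the smoothing factor eps, and the focusing parameter gamma are illustrative assumptions, not values taken from the paper.

    import torch
    import torch.nn.functional as F

    def smoothed_focal_loss(logits, labels, num_classes, eps=0.1, gamma=2.0):
        # Cross-entropy with label smoothing, modulated focal-style.
        # eps and gamma are illustrative defaults, not values from the paper.
        log_probs = F.log_softmax(logits, dim=1)          # (batch, num_classes)
        probs = log_probs.exp()
        # Label-smoothed target distribution: (1 - eps) mass on the true class,
        # with eps spread uniformly over all classes (so non-true classes get
        # a small positive weight instead of zero).
        targets = torch.full_like(log_probs, eps / num_classes)
        targets.scatter_(1, labels.unsqueeze(1), 1.0 - eps + eps / num_classes)
        # Focal modulation: down-weight easy (high-confidence) predictions so
        # training concentrates on hard, error-prone samples.
        focal_weight = (1.0 - probs) ** gamma
        loss = -(focal_weight * targets * log_probs).sum(dim=1)
        return loss.mean()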