Work place: Software College, Shenyang Normal University, Shenyang 110034, China
E-mail: lihangsoft@163.com
Research Interests: Quantum Computing Theory, Network Security, Information Security, Hardware Security
Biography
Hang Li obtained his Ph.D. degree in Information Science and Engineering from Northeastern University. He is a full professor at the Software College of Shenyang Normal University and a master's supervisor. His research interests include wireless networks, mobile computing, cloud computing, social networks, network security, and quantum cryptography. Prof. Li has published more than 30 papers in international journals and conferences in these fields.
By Yuhao Zhao, Hang Li, Shoulin Yin
DOI: https://doi.org/10.5815/ijmsc.2022.01.03, Pub. Date: 8 Feb. 2022
Relation classification is an important semantic processing task in natural language processing. Deep learning methods that combine Convolutional Neural Networks and Recurrent Neural Networks with attention mechanisms have long been the mainstream, state-of-the-art approach. The LSTM model, built on the recurrent neural network, dynamically controls weights through gating, which better extracts contextual state information from time series and effectively alleviates the long-term dependency problem of recurrent neural networks. The pre-trained BERT model has also achieved excellent results on many natural language processing tasks. This paper proposes a multi-channel character relation classification model that combines BERT and LSTM through an attention mechanism: the semantic information of the two models is fused by attention to produce the final classification result. Applying this model to a text, we can extract and classify the relations between the characters it mentions, and finally obtain the character relations contained in the text. Experimental results show that the proposed method outperforms previous deep learning models on the SemEval-2010 Task 8 dataset and the COAE-2016-Task3 dataset.
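As a rough illustration of the fusion step described in the abstract, the following is a minimal PyTorch sketch of attention-based fusion of two channel representations (e.g., a BERT sentence vector and a BiLSTM sentence vector) into relation logits. All module names, dimensions, and the projection scheme are illustrative assumptions, not the paper's exact architecture; the 19 output classes correspond to the SemEval-2010 Task 8 label set (9 directed relations plus "Other").

```python
# Minimal sketch (PyTorch) of attention-based fusion of two channels for
# relation classification. Names and dimensions are illustrative
# assumptions, not the authors' exact model.
import torch
import torch.nn as nn

class AttentionFusionClassifier(nn.Module):
    def __init__(self, bert_dim=768, lstm_dim=256, num_relations=19):
        super().__init__()
        # Project both channels into a shared space so they can be compared.
        self.proj_bert = nn.Linear(bert_dim, 256)
        self.proj_lstm = nn.Linear(lstm_dim, 256)
        # Scores each channel; softmax over channels yields fusion weights.
        self.att = nn.Linear(256, 1)
        self.classifier = nn.Linear(256, num_relations)

    def forward(self, bert_vec, lstm_vec):
        # bert_vec: (batch, bert_dim), lstm_vec: (batch, lstm_dim)
        channels = torch.stack(
            [torch.tanh(self.proj_bert(bert_vec)),
             torch.tanh(self.proj_lstm(lstm_vec))], dim=1)  # (batch, 2, 256)
        weights = torch.softmax(self.att(channels), dim=1)   # (batch, 2, 1)
        fused = (weights * channels).sum(dim=1)              # (batch, 256)
        return self.classifier(fused)                        # relation logits

# Usage with dummy channel outputs:
model = AttentionFusionClassifier()
logits = model(torch.randn(4, 768), torch.randn(4, 256))
print(logits.shape)  # torch.Size([4, 19])
```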
By Dan Zheng, Hang Li, Shoulin Yin
DOI: https://doi.org/10.5815/ijmsc.2020.06.03, Pub. Date: 8 Dec. 2020
Human action recognition is an important research direction in computer vision. Its main goal is to emulate the human brain in analyzing and recognizing human actions in video, typically covering individual actions, interactions between people, and interactions with the external environment. A space-time dual-channel neural network can represent video features from both spatial and temporal perspectives, which gives it advantages over other neural network models for human action recognition. This paper proposes an action recognition method based on an improved space-time two-channel convolutional neural network. First, the video is divided into several equal-length, non-overlapping segments; from each segment, a frame image representing the video's static features and a stacked optical-flow image representing its motion features are sampled at random. These two kinds of images are then fed into the spatial-domain and temporal-domain convolutional neural networks, respectively, for feature extraction, and the segment features of each video are fused within each channel to obtain category prediction features for the spatial and temporal domains. Finally, the action recognition result is obtained by integrating the prediction features of the two channels. Through experiments, various data augmentation methods and transfer learning schemes are explored to address the over-fitting caused by insufficient training samples, and the effects of different segment numbers, pre-training networks, segment feature fusion schemes, and dual-channel integration strategies on recognition performance are analyzed. The experimental results show that the proposed model learns human action features in complex videos and recognizes actions more effectively.
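As a rough illustration of the segment-sampling and two-channel integration scheme described above, here is a minimal PyTorch sketch: per-segment predictions are averaged within each stream, and the two streams are combined by a weighted sum of prediction scores. The toy backbones, the flow-stack depth, the class count, and the fusion weight are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch (PyTorch) of segment-level two-stream fusion. Backbones,
# flow-stack depth, and fusion weight are illustrative assumptions.
import torch
import torch.nn as nn

class TwoStreamSegmentNet(nn.Module):
    def __init__(self, num_classes=101, num_segments=3, flow_stack=10):
        super().__init__()
        self.num_segments = num_segments
        # Toy backbones; in practice pretrained CNNs would be used instead.
        self.spatial = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, num_classes))
        self.temporal = nn.Sequential(
            nn.Conv2d(2 * flow_stack, 16, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, num_classes))

    def forward(self, rgb, flow, spatial_weight=0.4):
        # rgb:  (batch, segments, 3, H, W), one frame sampled per segment
        # flow: (batch, segments, 2*flow_stack, H, W), stacked optical flow
        b, s = rgb.shape[:2]
        # Fuse segment predictions within each stream by averaging.
        spat = self.spatial(rgb.flatten(0, 1)).view(b, s, -1).mean(dim=1)
        temp = self.temporal(flow.flatten(0, 1)).view(b, s, -1).mean(dim=1)
        # Weighted integration of the two channels' prediction scores.
        return spatial_weight * spat + (1 - spatial_weight) * temp

# Usage with dummy clips (2 videos, 3 segments each):
model = TwoStreamSegmentNet()
scores = model(torch.randn(2, 3, 3, 64, 64), torch.randn(2, 3, 20, 64, 64))
print(scores.shape)  # torch.Size([2, 101])
```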