Abstract
Human-centered artificial intelligence systems are only as good as their ability to consider their users' context when making decisions. Research on recognizing people's everyday activities has advanced rapidly, but little attention has been paid to recognizing both the activities themselves and the motions performed during those tasks. Automated monitoring, human-computer interaction, and sports analysis all benefit from Web 4.0. Every sport has its own moves, and not every move is familiar to every observer; in ice hockey, the referee cannot monitor every move. Here, a Convolutional Neural Network-based Real-Time Image Processing Framework (CNN-RTIPF) is introduced to classify every move in ice hockey. CNN-RTIPF reduces the difficulty of monitoring each player's moves individually. An image of each move is captured and compared with the training data in the CNN. These real-time captured images are processed by a human-centered artificial intelligence system, and each captured image is classified by comparing the class probabilities computed over the trained image set. Simulation analysis shows that the proposed CNN-RTIPF classifies real-time images with an improved classification ratio, higher sensitivity, and a lower error rate. The proposed CNN-RTIPF has been validated against the optimization parameters for reliability. Human-centered artificial intelligence-based Web 4.0 will continue to develop to improve the movement-identification algorithm and to train the system on many other everyday activities.
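
The probability-based classification step described above can be illustrated with a minimal sketch. The move labels and raw scores below are hypothetical (the abstract does not specify the class set or network architecture); the sketch only shows how a CNN's final-layer scores would be turned into class probabilities and a predicted move via a softmax.

```python
import math

# Hypothetical move labels for illustration; the paper's actual class set
# is not specified in the abstract.
MOVES = ["slapshot", "wrist_shot", "body_check", "poke_check"]

def softmax(logits):
    """Convert raw final-layer scores into class probabilities."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return the most probable move and its probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return MOVES[best], probs[best]

# Example scores as they might come from a CNN's final layer
# for one captured frame.
label, prob = classify([2.0, 0.5, 0.1, -1.0])
```

In a full pipeline, `logits` would be produced by the trained CNN from each real-time captured frame rather than supplied by hand.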