The control signals of an intelligent bionic prosthesis mainly include various physiological electrical signals and speech signals, such as electroencephalographic (EEG), myoelectric, and nerve signals. Electroencephalography (EEG) control technology places electrodes on the scalp to detect the electrical signals produced during conscious brain activity and uses them to drive the movement of the prosthetic hand; however, the complexity of wearing the electrodes, poor signal quality, and the small number of recognizable motion patterns keep this technology at the laboratory research stage. Neural control technology uses implanted electrodes to lead out nerve signals from the body and controls the movement of the prosthesis by processing and recognizing these signals. At present, nerve signals are acquired in two main ways: one detects them through electrodes implanted in the cerebral cortex, and the other detects them on the patient's residual limb. Neural control has strong theoretical support, but in practice it also remains at the laboratory research and exploration stage because of the high surgical risk and the body's rejection of the implanted electrodes. Speech-signal control technology obtains voice commands through a speech recognition device worn on the body and uses them to drive the movement of the prosthetic hand. There are currently two main approaches to voice control of a prosthetic hand: one collects surface EMG signals from electrodes placed on the larynx, and the other directly recognizes the user's speech. Although speech recognition technology is mature, control by voice alone does not satisfy the requirement of intelligent, mind-driven prosthesis control and is not an intuitive control method.
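As a minimal sketch of the voice-control pipeline described above (speech command in, prosthetic hand motion out), the following Python fragment maps a recognized command word to per-finger flexion targets. The command names, joint layout, and the `recognize_command()` stub are all hypothetical assumptions, not part of the original text; a real system would replace the stub with an actual speech-recognition front end and pass the targets to a low-level motor controller.

```python
from dataclasses import dataclass

@dataclass
class GripTarget:
    """Target flexion per finger (0.0 = fully open, 1.0 = fully closed)."""
    thumb: float
    index: float
    middle: float
    ring: float
    little: float

# Hypothetical mapping from recognized voice commands to grip postures.
COMMAND_TO_GRIP = {
    "open":  GripTarget(0.0, 0.0, 0.0, 0.0, 0.0),
    "fist":  GripTarget(1.0, 1.0, 1.0, 1.0, 1.0),
    "pinch": GripTarget(0.8, 0.8, 0.0, 0.0, 0.0),
    "point": GripTarget(1.0, 0.0, 1.0, 1.0, 1.0),
}

def recognize_command(audio_frame: bytes) -> str:
    """Placeholder for a speech recognizer; returns a command word."""
    raise NotImplementedError("plug in a real speech-recognition front end")

def dispatch(command: str) -> GripTarget | None:
    """Map a recognized command word to finger flexion targets."""
    return COMMAND_TO_GRIP.get(command)

if __name__ == "__main__":
    # Example: a recognized "pinch" command becomes finger targets that a
    # lower-level motor controller would then track.
    print(dispatch("pinch"))
```

This illustrates why voice-only control is considered non-intuitive: each motion must be issued as an explicit discrete command rather than arising naturally from the user's motor intent.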