Deep Learning: Yoshua Bengio

I am also very interested in AI for social good, in particular in healthcare and the environment (with a focus on climate change). I believe that all of the above are different aspects of a common goal: going beyond the limitations of current deep learning and towards human-level AI.

Notable Past Research

1989-1998: Convolutional and recurrent networks trained end-to-end with probabilistic alignment (HMMs) to model sequences, the main contribution of my PhD thesis (1991): NIPS 1988, NIPS 1989, Eurospeech 1991, PAMI 1991, and IEEE Trans. These architectures were first applied to speech recognition in my PhD (and rediscovered after 2010), and then, with Yann LeCun et al., to handwriting recognition and document analysis (the most cited paper being "Gradient-based learning applied to document recognition", 1998, with over 15,000 citations as of 2018), where we also introduced non-linear forms of conditional random fields (before they were a thing).

1991-1995: Learning-to-learn papers with Samy Bengio, starting with IJCNN 1991, "Learning a synaptic learning rule". The idea of learning to learn (particularly by back-propagating through the whole process) has now become very popular, but we lacked the necessary computing power in the early 90's.

1993-1995: Uncovering the fundamental difficulty of learning in recurrent nets and other machine learning models of temporal dependencies, associated with vanishing and exploding gradients: ICNN 1993, NIPS 1993, NIPS 1994, IEEE Transactions on Neural Nets 1994, and NIPS 1995. An important but subtle contribution of the IEEE Transactions 1994 paper is to show, using dynamical systems theory, that the condition required to store bits of information reliably over time also gives rise to vanishing gradients. These papers have had a major impact and motivated later work on architectures that help with learning long-term dependencies and with vanishing or exploding gradients.
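The vanishing-gradient effect described in the 1993-1995 entry is easy to reproduce numerically. Below is a minimal sketch (an illustration for this post, not code from the cited papers), assuming a linear recurrent update h_t = W h_{t-1}: the Jacobian of the state after T steps with respect to the initial state is the T-fold product of W with itself, so when the spectral radius of W is below 1 (the regime in which the network can hold a bit of state reliably) the back-propagated gradient shrinks geometrically, and when it is above 1 the gradient explodes. The dimension and spectral radius below are arbitrary illustrative choices.

```python
import numpy as np

# Minimal sketch of vanishing gradients in a linear RNN h_t = W @ h_{t-1}.
# The gradient of h_T w.r.t. h_0 is the T-fold matrix product of W; with
# spectral radius < 1 (stable state storage) its norm decays geometrically.
rng = np.random.default_rng(0)
n = 50                                      # hidden size (arbitrary choice)
W = rng.standard_normal((n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # rescale to spectral radius 0.9

grad = np.eye(n)                            # d h_T / d h_T = identity
for t in range(1, 101):
    grad = grad @ W                         # one step of the chain rule, going back in time
    if t % 20 == 0:
        print(f"steps back: {t:3d}   ||grad||_F = {np.linalg.norm(grad):.3e}")
```

Rescaling W to a spectral radius above 1 (e.g. 1.1) turns the same loop into a demonstration of the opposite failure mode, exploding gradients.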