## List of References
[1]. VASWANI A., SHAZEER N., PARMAR N., USZKOREIT J., JONES L., GOMEZ A.N., KAISER L., POLOSUKHIN I.: _Attention Is All You Need._ [online]. [cited 2017].
[2]. WANG Z., MA Y., LIU Z., TANG J.: _R-Transformer: Recurrent Neural Network Enhanced Transformer._ [online]. [cited 12-07-2019].
[3]. SRIVASTAVA S.: _Machine Translation (Encoder-Decoder Model)._ [online]. [cited 31-10-2019].
[4]. ALAMMAR J.: _The Illustrated Transformer._ [online]. [cited 27-06-2018].
[5]. _Sequence Modeling with Neural Networks (Part 2): Attention Models._ [online]. [cited 18-04-2016].
[6]. GIACAGLIA G.: _How Transformers Work._ [online]. [cited 11-03-2019].
[7]. _Understanding LSTM Networks._ [online]. [cited 27-08-2015].
[8]. _6 Types of Artificial Neural Networks Currently Being Used in Machine Translation._ [online]. [cited 15-01-201].