From 8c85bda12acba80b74dce3ab4aa813777eb4edf4 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Patrik=20Pavli=C5=A1in?=
Date: Sat, 23 Oct 2021 18:44:18 +0000
Subject: [PATCH] Update 'pages/students/2016/patrik_pavlisin/dp22/README.md'

---
 pages/students/2016/patrik_pavlisin/dp22/README.md | 7 +++++++
 1 file changed, 7 insertions(+)

diff --git a/pages/students/2016/patrik_pavlisin/dp22/README.md b/pages/students/2016/patrik_pavlisin/dp22/README.md
index b0f3df30..b0902e1b 100644
--- a/pages/students/2016/patrik_pavlisin/dp22/README.md
+++ b/pages/students/2016/patrik_pavlisin/dp22/README.md
@@ -108,10 +108,17 @@ Navrhovaný transformátor R sa skladá zo stohu rovnakých vrstiev. Každá vrs
 ## Zoznam použitej literatúry
 
 [1]. VASWANI A., SHAZEER N., PARMAR N., USZKOREIT J., JONES L., GOMEZ N.A., KAISER L., POLOSUKHIN I.: _Attention Is All You Need._ [online]. [citované 2017].
+
 [2]. WANG Z., MA Y., LIU Z., TANG J.: _R-Transformer: Recurrent Neural Network Enhanced Transformer._ [online]. [citované 12-07-2019].
+
 [3]. SRIVASTAVA S.: _Machine Translation (Encoder-Decoder Model)!_ [online]. [citované 31-10-2019].
+
 [4]. ALAMMAR J.: _The Illustrated Transformer._ [online]. [citované 27-06-2018].
+
 [5]. _Sequence Modeling with Neural Networks (Part 2): Attention Models._ [online]. [citované 18-04-2016].
+
 [6]. GIACAGLIA G.: _How Transformers Work._ [online]. [citované 11-03-2019].
+
 [7]. _Understanding LSTM Networks._ [online]. [citované 27-08-2015].
+
 [8]. _6 Types of Artificial Neural Networks Currently Being Used in Machine Translation._ [online]. [citované 15-01-201].
\ No newline at end of file