Commit Graph

51 Commits (788c8100ba65a54120defdbe168676b7fed705a8)

Author SHA1 Message Date
Eren Golge 0a92c6d5a7 Set attention norm method by config.json 2019-03-26 00:48:12 +01:00
Eren Golge 44c66c6e3e remove comments 2019-03-05 13:34:33 +01:00
Eren Golge 1c99be2ffd Change window size for attention 2019-02-18 13:06:26 +01:00
Eren Golge d28bbe09fb Revert attention bias setting to old 2019-02-06 16:23:01 +01:00
Eren Golge 66f8d0e260 Attention bias changed 2019-01-22 18:18:21 +01:00
Eren Golge 4431e04b48 use sigmoid for attention 2019-01-16 16:26:05 +01:00
Eren Golge 7e020d4084 Bug fixes 2019-01-16 16:23:04 +01:00
Eren Golge af22bed149 set bias 2019-01-16 15:53:24 +01:00
Eren Golge f4fa155cd3 Make attn windowing optional 2019-01-16 12:33:07 +01:00
Eren Golge ed1f648b83 Enable attention windowing and make it configurable at the model level. 2019-01-16 12:32:40 +01:00
Eren Golge 4826e7db9c remove intermediate tensor asap 2018-12-28 14:22:41 +01:00
Eren Golge dc3d09304e Cache attention annot vectors for the whole sequence. 2018-12-11 16:06:02 +01:00
Eren Golge 440f51b61d correct import statements 2018-11-03 23:19:23 +01:00
Eren Golge 0b6a9995fc change import statements 2018-11-03 19:15:06 +01:00
Eren Golge c8a552e627 Batch update after data-loss 2018-11-02 16:13:51 +01:00
Eren f2ef1ca36a Smoothed attention as in https://arxiv.org/pdf/1506.07503.pdf 2018-09-19 15:47:24 +02:00
Eren 67df385275 Explicit padding for unbalanced padding sizes 2018-09-19 14:20:02 +02:00
Eren fd830c6416 Attention convolution padding correction for TF "SAME" 2018-09-19 14:16:21 +02:00
Eren f377cd3cb8 Larger attention filter size and more filters 2018-09-06 15:27:15 +02:00
Eren a15b3ec9a1 PyTorch 0.4.1 update 2018-08-13 15:02:17 +02:00
Eren e0bce1d2d1 Pass mask instead of length to model 2018-08-10 17:45:17 +02:00
Eren G f5537dc48f pep8 format all 2018-08-02 16:34:17 +02:00
Eren G fce6bd27b8 Smaller attention filters 2018-07-19 17:36:31 +02:00
Eren G 4ef3ecf37f Location-sensitive attention fix 2018-07-17 17:43:51 +02:00
Eren G 4e6596a8e1 Location-sensitive attention 2018-07-17 17:01:40 +02:00
Eren G ddaf414434 Attention update 2018-07-17 16:24:39 +02:00
Eren G adbe603af1 Bug fixes 2018-07-13 15:24:50 +02:00
Eren G dac8fdffa9 Attn masking 2018-07-13 14:50:55 +02:00
Eren G 9f52833151 Merge branch 'loc-sens-attn' into loc-sens-attn-new and attention without attention-cum 2018-07-13 14:27:51 +02:00
Eren Golge 1b8d0f5b26 Master merge 2018-05-28 01:24:06 -07:00
Eren Golge a5f66b58e0 Remove empty lines 2018-05-25 03:24:45 -07:00
Eren Golge fe99baec5a Add a missing class variable to attention class 2018-05-23 06:20:04 -07:00
Eren Golge 0f933106ca Configurable alignment method 2018-05-23 06:17:01 -07:00
Eren Golge d8c460442a Commenting the attention code 2018-05-23 06:16:39 -07:00
Eren Golge 7acf4eab94 A major bug fix for location sensitive attention. 2018-05-23 06:04:28 -07:00
Eren Golge 243204bc3e Add location sens attention 2018-05-18 03:33:41 -07:00
Eren Golge 7b9fd63649 Correct comment 2018-05-18 03:32:17 -07:00
Eren Golge 3348d462d2 Add location sensitive attention 2018-05-17 08:03:16 -07:00
Eren Golge 2c1f66a0fc remove Variable from models/tacotron.py 2018-05-10 16:35:38 -07:00
Eren Golge a9eadd1b8a pep8 check 2018-04-03 03:24:57 -07:00
Eren Golge 3c084177c6 Data loader bug fix and Attention bug fix 2018-03-26 10:43:36 -07:00
Eren Golge 632c08a638 normal attention 2018-03-25 12:01:41 -07:00
Eren Golge b4032e8dff best model ever changes 2018-03-07 06:58:51 -08:00
Eren Golge 7d5bcd6ca4 Testing of layers and documentation 2018-02-08 10:10:11 -08:00
Eren Golge 3cafc6568c Update attention module Possible BUG FIX2 2018-02-05 08:22:30 -08:00
Eren Golge b6c5771a6f Update attention module Possible BUG FIX 2018-02-05 06:37:40 -08:00
Eren Golge 2fd37a5bad Better naming of variables 2018-02-05 06:27:02 -08:00
Eren Golge 4e0ab65bbf plot attention alignments 2018-02-02 05:37:09 -08:00
Eren Golge 088a105a43 Log spectrogram construction 2018-01-31 08:38:46 -08:00
Eren Golge 1320d5344a Bug solve on attention module and a new Notebook to experiment spectrogram reconstruction 2018-01-31 07:21:22 -08:00
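
A few notes on the attention mechanics the log above refers to. The normalization commits ("Set attention norm method by config.json" 0a92c6d5a7, "use sigmoid for attention" 4431e04b48, "Smoothed attention" f2ef1ca36a) switch how raw attention energies become weights. A minimal sketch of the two methods, using a hypothetical `normalize_alignment` helper; the actual module in the repository may differ:

```python
import torch

def normalize_alignment(energies, norm="softmax"):
    """Turn raw attention energies (batch, enc_len) into weights.

    "sigmoid" is the smoothed attention of arXiv:1506.07503,
        a_i = sigmoid(e_i) / sum_j sigmoid(e_j),
    which spreads weight over nearby encoder steps instead of
    committing sharply to one, as softmax tends to do.
    """
    if norm == "softmax":
        return torch.softmax(energies, dim=-1)
    if norm == "sigmoid":
        sig = torch.sigmoid(energies)
        return sig / sig.sum(dim=-1, keepdim=True)
    raise ValueError(f"unknown attention norm: {norm}")
```

The "by config.json" commit implies the `norm` string is read from the training config rather than hard-coded.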
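
The windowing commits (ed1f648b83, f4fa155cd3, and "Change window size for attention" 1c99be2ffd) describe constraining the alignment to a band around the previous attention peak so it cannot jump across the input. A sketch under assumed names; `win_back` and `win_front` are hypothetical parameters:

```python
import torch

def apply_attention_window(energies, prev_max_idx, win_back=3, win_front=3):
    """Mask energies outside [peak - win_back, peak + win_front] so the
    alignment can only move locally from the previous decoder step.

    energies:     (batch, enc_len) raw attention scores
    prev_max_idx: (batch,) index of the previous attention peak
    """
    enc_len = energies.size(1)
    steps = torch.arange(enc_len, device=energies.device).unsqueeze(0)  # (1, enc_len)
    lo = (prev_max_idx - win_back).unsqueeze(1)   # (batch, 1)
    hi = (prev_max_idx + win_front).unsqueeze(1)  # (batch, 1)
    outside = (steps < lo) | (steps > hi)
    return energies.masked_fill(outside, float("-inf"))
```

The peak would be refreshed each step with `energies.argmax(dim=1)`; "Make attn windowing optional" suggests this is a switchable flag, since a hard window can hurt training.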
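
"Attention convolution padding correction for TF "SAME"" (fd830c6416) and "Explicit padding for unbalanced padding sizes" (67df385275) point at a conv1d padding detail: with an even kernel size, TensorFlow's SAME padding is asymmetric (one extra element on the right), which PyTorch's symmetric `padding=` argument cannot express. A sketch of the explicit-`F.pad` workaround; the class name is made up:

```python
import torch.nn as nn
import torch.nn.functional as F

class SamePadConv1d(nn.Module):
    """Conv1d with TensorFlow-style SAME padding at stride 1: output
    length equals input length, and an odd padding total puts the
    extra element on the right."""
    def __init__(self, in_ch, out_ch, kernel_size):
        super().__init__()
        total = kernel_size - 1
        self.pad = (total // 2, total - total // 2)  # (left, right)
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size)

    def forward(self, x):  # x: (batch, channels, time)
        return self.conv(F.pad(x, self.pad))
```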
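
"Attn masking" (dac8fdffa9) and "Pass mask instead of length to model" (e0bce1d2d1) refer to zeroing attention over padded encoder steps in a batch. A sketch, assuming the usual lengths-to-mask construction:

```python
import torch

def sequence_mask(lengths, max_len=None):
    """Boolean mask, True on valid (non-padded) steps: (batch, max_len)."""
    max_len = max_len or int(lengths.max())
    steps = torch.arange(max_len, device=lengths.device)
    return steps.unsqueeze(0) < lengths.unsqueeze(1)

def mask_energies(energies, mask):
    """-inf on padded steps gives them exactly zero weight after softmax."""
    return energies.masked_fill(~mask, float("-inf"))
```

Passing the precomputed mask down instead of the raw lengths means it is built once per batch rather than at every decoder step.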
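
Finally, the location-sensitive attention commits (3348d462d2, 243204bc3e, 7acf4eab94) implement the mechanism of Chorowski et al. 2015 (the arXiv:1506.07503 paper cited above): the score for each encoder step combines the decoder query, the encoder annotation, and location features convolved from the previous alignment. A compact sketch with illustrative dimensions, not the repository's exact module:

```python
import torch
import torch.nn as nn

class LocationSensitiveAttention(nn.Module):
    """Sketch of location-sensitive attention (arXiv:1506.07503)."""
    def __init__(self, query_dim, annot_dim, attn_dim, filters=32, kernel_size=31):
        super().__init__()
        self.query_layer = nn.Linear(query_dim, attn_dim, bias=False)
        self.annot_layer = nn.Linear(annot_dim, attn_dim, bias=False)
        # Convolve the previous alignment into location features.
        self.loc_conv = nn.Conv1d(1, filters, kernel_size,
                                  padding=(kernel_size - 1) // 2, bias=False)
        self.loc_layer = nn.Linear(filters, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=True)

    def forward(self, query, annots, prev_align):
        # query: (B, query_dim); annots: (B, T, annot_dim); prev_align: (B, T)
        loc = self.loc_conv(prev_align.unsqueeze(1)).transpose(1, 2)  # (B, T, filters)
        energies = self.v(torch.tanh(
            self.query_layer(query).unsqueeze(1)   # (B, 1, attn_dim)
            + self.annot_layer(annots)             # (B, T, attn_dim)
            + self.loc_layer(loc)                  # (B, T, attn_dim)
        )).squeeze(-1)                             # (B, T)
        align = torch.softmax(energies, dim=-1)
        context = torch.bmm(align.unsqueeze(1), annots).squeeze(1)  # (B, annot_dim)
        return context, align
```

The "attention without attention-cum" wording in merge 9f52833151 may correspond to feeding only the previous alignment to the location convolution rather than the cumulative sum of all past alignments.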