Commit Graph

39 Commits (fbe5310be01321220ad219efb48ea68d38f30267)

Author SHA1 Message Date
root 0d17019d22 remove old graves 2020-02-19 18:27:02 +01:00
root bb1117ff32 stop dividing g_t with sig_t and commenting 2020-02-19 18:27:02 +01:00
root 72817438db graves v2 2020-02-19 18:27:02 +01:00
root cf7d968f57 graves attention as in melnet paper 2020-02-19 18:27:01 +01:00
root dc0e6c8019 simpler GMM attention implementation 2020-02-19 18:27:01 +01:00
root 0e8881114b efficient GMM attention with native broadcasting 2020-01-10 13:45:09 +01:00
root f2b6d00c45 graves attention config update 2020-01-07 18:47:02 +01:00
geneing 748cbbc403 Change to GMMv2b 2020-01-05 18:34:01 -08:00
geneing 34e0291ba7 Change to GMMv2b 2020-01-05 18:32:49 -08:00
geneing 20b4211af5 Change to GMMv2b 2020-01-05 18:32:35 -08:00
Eren Golge cd06a4c1e5 linter fix 2019-11-12 13:51:22 +01:00
Eren Golge df1b8b3ec7 linter and test updates for speaker_encoder, gmm_Attention 2019-11-12 12:42:42 +01:00
Eren Golge 1401a0db6b update GMM attention clamp max min 2019-11-12 11:20:53 +01:00
Eren Golge 6f3dd1b6ae change GMM activations 2019-11-12 11:20:53 +01:00
Eren Golge 2966e3f2d1 use ReLU for GMM 2019-11-12 11:20:53 +01:00
Eren Golge b904bc02d6 config update and initial bias for graves attention 2019-11-12 11:19:57 +01:00
Eren Golge 926a4d36ce change tanh layer size for graves attention 2019-11-12 11:19:16 +01:00
Eren Golge 695bf1a1f6 bug fix for illegal memory reach 2019-11-12 11:19:16 +01:00
Eren Golge b9e0faca98 config update and bug fixes 2019-11-12 11:19:16 +01:00
Eren Golge adf9ebd629 Graves attention and setting attn type by config.json 2019-11-12 11:18:57 +01:00
Eren Golge 84d81b6579 graves attention [WIP] 2019-11-12 11:17:35 +01:00
Eren Golge ec579d02a1 bug fix argparser 2019-10-31 15:13:39 +01:00
Eren Golge 72ad58d893 change the bitwise for masking and small fixes 2019-08-19 16:24:28 +02:00
Eren Golge b22c7d4a29 Merge branch 'dev-gradual-queue' into dev 2019-08-16 13:20:17 +02:00
Eren Golge 64f2b95c31 update regarding torch 1.2 2019-08-13 12:14:34 +02:00
Thomas Werkmeister ab42396fbf undo loc attn after fwd attn 2019-07-25 13:04:41 +02:00
Thomas Werkmeister f3dac0aa84 updating location attn after calculating fwd attention 2019-07-24 11:49:07 +02:00
Thomas Werkmeister 40f56f9b00 simplified code for fwd attn 2019-07-24 11:47:06 +02:00
Thomas Werkmeister 82db35530f unused var 2019-07-23 19:33:56 +02:00
Thomas Werkmeister 98edb7a4f8 renamed attention_rnn to query_rnn 2019-07-23 18:38:09 +02:00
Reuben Morais 11e7895329 Fix Pylint issues 2019-07-19 09:08:51 +02:00
Eren Golge 0f0ec679ec small refactoring 2019-07-16 21:15:24 +02:00
Eren Golge c72470bcfc update forward attention 2019-06-24 16:57:29 +02:00
Eren Golge d7e0f828cf remove print 2019-06-04 00:40:03 +02:00
Eren Golge 4678c66599 forward_attn_mask and config update 2019-06-04 00:39:29 +02:00
Eren Golge f774db0241 bug fix #207 2019-05-29 00:37:41 +02:00
Eren Golge 0b5a00d29e enforce monotonic attention in forward attention y for batches 2019-05-28 14:28:32 +02:00
Eren Golge 35b76556e4 Use Attention and Prenet from common file 2019-05-27 15:30:57 +02:00
Eren Golge ba492f43be Set tacotron model parameters to adapt to common_layers.py - Prenet and Attention 2019-05-27 14:40:28 +02:00