Commit Graph

56 Commits (215eb014ca67b6945eaffaa6f8c0c5cfc9453536)

Author SHA1 Message Date
Reuben Morais e3f841742d Address even more lint problems 2019-07-19 11:48:12 +02:00
Reuben Morais 057a50a889 Merge branch 'dev' into contribution-grease 2019-07-19 11:41:14 +02:00
Eren Golge 4336e1d338 fix unittests for the latest updates 2019-07-19 11:12:48 +02:00
Reuben Morais 11e7895329 Fix Pylint issues 2019-07-19 09:08:51 +02:00
Thomas Werkmeister 6390c3b2e6 num_speakers larger than 1 2019-07-02 14:46:41 +02:00
Thomas Werkmeister 81c5df71f6 removed in-place changes 2019-07-01 21:19:35 +02:00
Thomas Werkmeister ba8cc8054b disabling multispeaker with num_speakers=0 2019-07-01 14:01:34 +02:00
Thomas Werkmeister d172a3d3d5 multispeaker 2019-06-26 12:59:14 +02:00
Eren Golge 0f8936d744 GST inference 2019-06-12 12:12:01 +02:00
Eren Golge fef3aecc09 tacotrongst 2019-06-05 18:46:28 +02:00
Eren Golge 4678c66599 forward_attn_mask and config update 2019-06-04 00:39:29 +02:00
Eren Golge ba492f43be Set tacotron model parameters to adapt to common_layers.py - Prenet and Attention 2019-05-27 14:40:28 +02:00
Eren Golge e62659da94 update separate stopnet flow to make it faster. 2019-05-17 16:15:43 +02:00
Eren Golge 6331bccefc make dropout optional #2 2019-05-12 17:35:31 +02:00
Eren Golge e2439fde9a make location attention optional and keep all attention weights in attention class 2019-04-29 11:37:01 +02:00
Eren Golge 312a539a0e Enable optional forward attention with transition agent 2019-04-10 16:41:30 +02:00
Eren Golge 961af0f5cd setup_model externally based on model selection. Make forward attention and prenet type configurable in config.json 2019-04-05 17:49:18 +02:00
Eren Golge bc51b81aae parameter name fix 2019-03-26 00:52:47 +01:00
Eren Golge 0a92c6d5a7 Set attention norm method by config.json 2019-03-26 00:48:12 +01:00
Eren Golge b9b79fcf0f inference truncated NEED TO BE TESTED 2019-03-11 17:40:09 +01:00
Eren Golge a9ce1d4f19 bug fix for tacotron and tests update 2019-03-06 13:43:29 +01:00
Eren Golge b031a65677 compute sequence mask in model, add tacotron2 related files 2019-03-06 13:14:58 +01:00
Eren Golge 1e8fdec084 Modularize functions in Tacotron 2019-03-05 13:25:50 +01:00
Eren Golge bf5f18d11e Formatting changes and distributed training 2019-02-27 09:50:52 +01:00
Eren Golge 6ea31e47df Constant queue size for autoregression window 2019-02-16 03:18:49 +01:00
Eren Golge b011dafbab pass num_chars in train.py 2019-01-21 14:52:40 +01:00
Eren Golge b78fc96115 Change embedding layer init to old version 2019-01-16 12:33:24 +01:00
Eren Golge ed1f648b83 Enable attention windowing and make it configurable at the model level. 2019-01-16 12:32:40 +01:00
Eren Golge c2637f114e New initialization for embedding layer and use phoneme count instead of symbol count for embedding layer init 2019-01-16 12:18:54 +01:00
Eren Golge 440f51b61d correct import statements 2018-11-03 23:19:23 +01:00
Eren Golge 0b6a9995fc change import statements 2018-11-03 19:15:06 +01:00
Eren e7278437ee Please enter the commit message for your changes. Lines starting 2018-09-19 15:49:42 +02:00
Eren 3b2654203d fixing size mismatch 2018-08-10 18:48:43 +02:00
Eren e0bce1d2d1 Pass mask instead of length to model 2018-08-10 17:45:17 +02:00
Eren G d5febfb187 Setting up network size according to the reference paper 2018-08-08 12:34:44 +02:00
Eren G f5537dc48f pep8 format all 2018-08-02 16:34:17 +02:00
Eren G dac8fdffa9 Attn masking 2018-07-13 14:50:55 +02:00
Eren 5edfad1e09 fix import statements 2018-06-21 16:33:30 +02:00
Eren Golge 8be07ee3c5 stop token prediction update for tacotron model 2018-05-11 04:15:06 -07:00
Eren Golge 7c40455edd remove redundant files 2018-05-10 16:36:07 -07:00
Eren Golge 5442d8c734 Message fix 2018-04-10 09:35:49 -07:00
Eren Golge a9eadd1b8a pep8 check 2018-04-03 03:24:57 -07:00
Eren Golge f6f1b06b77 Remove useless config argument 2018-03-28 09:43:29 -07:00
Eren Golge 0f3b2ddd7b remove stop token prediction 2018-03-22 12:50:26 -07:00
Eren Golge 5750090fcd Stop token prediction - does not train yet 2018-03-22 12:34:16 -07:00
Eren Golge 9b4aa92667 Adding harmonized teacher-forcing 2018-03-19 09:27:19 -07:00
Eren Golge 3071e7f6f6 remove attention mask 2018-03-19 08:26:16 -07:00
Eren Golge b4032e8dff best model ever changes 2018-03-07 06:58:51 -08:00
Eren Golge b5f2181e04 teacher forcing with combining 2018-02-23 08:35:53 -08:00
Eren Golge a3d8059d06 More layer tests 2018-02-13 08:08:23 -08:00