I tried to train a Chinese-English mixed model on our own dataset. The training process ran, but it produced tensor size errors like the one below.
The expanded size of the tensor (32) must match the existing size (0) at non-singleton dimension 1. Target sizes: [192, 32]. Tensor sizes: [192, 0]
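For context, here is a minimal, hypothetical reproduction of this class of error (not taken from the training code): `expand()` can only broadcast singleton dimensions, so a tensor whose dimension 1 has size 0 cannot be expanded to size 32 there. I assume the zero size in our case comes from an empty slice or an empty batch element somewhere upstream.

```python
import torch

# Hypothetical minimal reproduction: dimension 1 has size 0, so it cannot
# be expanded to 32 (expand() only broadcasts size-1 dimensions).
x = torch.zeros(192, 0)
try:
    x.expand(192, 32)
except RuntimeError as e:
    print(e)  # "The expanded size of the tensor (32) must match the existing size (0) ..."
```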
I also noticed that there is a try/except block in the training loop:
```python
# ...
for epoch in range(epoch_str, hps.train.epochs + 1):
    try:
        if rank == 0:
            train_and_evaluate(
                rank,
                epoch,
                hps,
                [net_g, net_d, net_dur_disc],
                [optim_g, optim_d, optim_dur_disc],
                [scheduler_g, scheduler_d, scheduler_dur_disc],
                scaler,
                [train_loader, eval_loader],
                logger,
                [writer, writer_eval],
            )
        else:
            train_and_evaluate(
                rank,
                epoch,
                hps,
                [net_g, net_d, net_dur_disc],
                [optim_g, optim_d, optim_dur_disc],
                [scheduler_g, scheduler_d, scheduler_dur_disc],
                scaler,
                [train_loader, None],
                None,
                None,
            )
    except Exception as e:
        print(e)
        torch.cuda.empty_cache()
    scheduler_g.step()
    scheduler_d.step()
    if net_dur_disc is not None:
        scheduler_dur_disc.step()
```
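As far as I understand it, the effect of this pattern is that any exception raised inside `train_and_evaluate` is only printed, the CUDA cache is cleared, and the loop simply moves on to the next epoch. A toy sketch of that control flow (the function below is a made-up stand-in, not the real training step):

```python
# Toy stand-in for train_and_evaluate: one epoch fails, the rest succeed.
def fake_train_and_evaluate(epoch):
    if epoch == 2:
        # stand-in for the RuntimeError raised by the size mismatch
        raise RuntimeError("expanded size (32) must match existing size (0)")
    print(f"epoch {epoch}: finished normally")

for epoch in range(1, 5):
    try:
        fake_train_and_evaluate(epoch)
    except Exception as e:
        print(e)  # the error is only printed; training continues with the next epoch
    # scheduler steps would still run here, as in the snippet above
```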
Why is this try/except around the training step needed, and why might the tensor sizes fail to match during training?