
Error: RuntimeError: Adam does not support sparse gradients, please consider SparseAdam instead




After calling torch.optim.Adam(model.parameters(), lr=0.1), the following error is raised:

RuntimeError: Adam does not support sparse gradients, please consider SparseAdam instead

Analysis: the standard Adam optimizer cannot consume the sparse gradients produced when the embedding layer is created with nn.Embedding(vocab_size, embed_dim, sparse=True); when optimizing with Adam, sparse must be False.

Solution: create the layer as nn.Embedding(vocab_size, embed_dim, sparse=False), or keep sparse=True and switch to the SparseAdam optimizer that the error message suggests.
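The snippet below is a minimal sketch of the fix, assuming hypothetical values for vocab_size, embed_dim, and the dummy batch; only the sparse argument and the choice of optimizer matter here.

```python
import torch
import torch.nn as nn

vocab_size, embed_dim = 1000, 128  # hypothetical sizes for illustration

# sparse=False (the default) makes the embedding produce dense gradients,
# which torch.optim.Adam can handle without raising the RuntimeError
embedding = nn.Embedding(vocab_size, embed_dim, sparse=False)
optimizer = torch.optim.Adam(embedding.parameters(), lr=0.1)

tokens = torch.randint(0, vocab_size, (4, 16))  # dummy batch of token ids
loss = embedding(tokens).sum()                  # dummy loss for demonstration
loss.backward()
optimizer.step()  # succeeds, because the gradient is dense

# Alternative suggested by the error message itself: keep sparse=True
# and use the optimizer designed for sparse gradients.
# embedding = nn.Embedding(vocab_size, embed_dim, sparse=True)
# optimizer = torch.optim.SparseAdam(embedding.parameters(), lr=0.1)
```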
