
Commit c0bacc6

gchanan authored and facebook-github-bot committed
Guard test_lapack_empty with has_magma. (pytorch#9936)
Summary: CUDA LAPACK functions generally don't work unless has_magma is true.

Pull Request resolved: pytorch#9936
Differential Revision: D9028579
Pulled By: gchanan
fbshipit-source-id: 9b77e3b05253fd49bcabf604d0924ffa0e116055
1 parent bf32ea8

1 file changed

test/test_torch.py

Lines changed: 5 additions & 0 deletions
@@ -6394,6 +6394,11 @@ def test_lapack_empty(self):
         devices = ['cpu'] if not torch.cuda.is_available() else ['cpu', 'cuda']
         for device in devices:
+            # need to init cuda to check has_magma
+            empty = torch.randn((0, 0), device=device)
+            if device == 'cuda' and not torch.cuda.has_magma:
+                continue
+
             def fn(torchfn, *args):
                 return torchfn(*tuple(torch.randn(shape, device=device) if isinstance(shape, tuple) else shape
                                       for shape in args))
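
For context on the lazy-initialization detail the diff comment mentions: torch.cuda.has_magma is only meaningful once CUDA has been initialized, which is why the patch allocates a throwaway tensor on the device before checking it. Below is a minimal standalone sketch of the same guard pattern; it is illustrative only, not part of the commit, and the helper name lapack_test_devices is my own.

import torch

def lapack_test_devices():
    # Mirror of the guard added above: 'cpu' always qualifies; 'cuda'
    # qualifies only when MAGMA is available to back LAPACK functions.
    devices = ['cpu'] if not torch.cuda.is_available() else ['cpu', 'cuda']
    usable = []
    for device in devices:
        # Creating any tensor on the device triggers lazy CUDA init,
        # after which torch.cuda.has_magma is reliable.
        torch.randn((0, 0), device=device)
        if device == 'cuda' and not torch.cuda.has_magma:
            continue  # no MAGMA: CUDA LAPACK ops would fail, so skip
        usable.append(device)
    return usable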

0 commit comments
