comment a test case (test_get_max_memory) for musa
hanhaowen-mt committed Jan 9, 2024
1 parent 51c5cb1 commit 243d093
Showing 1 changed file with 2 additions and 2 deletions.
1 file changed: 2 additions & 2 deletions — tests/test_runner/test_log_processor.py

@@ -253,8 +253,8 @@ def test_collect_non_scalars(self):

     # TODO:[email protected]
     @unittest.skipIf(
-        is_musa_available(),
-        'musa backend do not support torch.cuda.reset_peak_memory_stats')
+        is_musa_available(),
+        'musa backend do not support torch.cuda.reset_peak_memory_stats')
     @patch('torch.cuda.max_memory_allocated', MagicMock())
     @patch('torch.cuda.reset_peak_memory_stats', MagicMock())
     def test_get_max_memory(self):
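The pattern in this diff — gating a test on backend availability with `unittest.skipIf` so it is skipped where an unsupported CUDA API would be called — can be sketched in isolation. This is a minimal, self-contained illustration, not the project's actual test: `is_musa_available` here is a hypothetical stand-in for the real helper, hard-coded to simulate a MUSA device being present.

```python
import unittest


def is_musa_available():
    """Hypothetical stand-in for the project's backend check."""
    return True  # pretend a MUSA device is present


class MemoryStatsTest(unittest.TestCase):

    @unittest.skipIf(
        is_musa_available(),
        'musa backend does not support torch.cuda.reset_peak_memory_stats')
    def test_get_max_memory(self):
        # On MUSA this body never runs, so the unsupported call is avoided.
        self.fail('should have been skipped on the musa backend')


# Run the test case programmatically and report how many tests were skipped.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(MemoryStatsTest))
print('skipped:', len(result.skipped))
```

With `is_musa_available()` returning `True`, the runner records the test as skipped rather than failed; flipping the stub to `False` would execute the body normally.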
