Commit: PR changes
Signed-off-by: Abhishek <[email protected]>
Abhishek-TAMU committed Oct 11, 2024
1 parent d2796f6 commit 39d2868
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/transformers/processing_utils.py
@@ -27,7 +27,6 @@
 from typing import Any, Dict, List, Optional, Tuple, TypedDict, Union

 import numpy as np
-import torch
 import typing_extensions

 from .dynamic_module_utils import custom_object_save
@@ -93,6 +92,7 @@ class FlashAttentionKwargs(TypedDict, total=False):
         max_length_k (`int`, *optional*):
             Maximum sequence length for key state.
     """
+    import torch

     cu_seq_lens_q: Optional[torch.LongTensor]
     cu_seq_lens_k: Optional[torch.LongTensor]
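The diff moves `import torch` out of the module's top level, presumably so that merely importing `processing_utils` no longer requires torch to be installed. Below is a minimal sketch of the general deferred-import pattern this kind of change relies on; the helper names (`is_available`, `as_array`) and the use of numpy as the "heavy" dependency are illustrative assumptions, not code from transformers:

```python
import importlib.util
from typing import TYPE_CHECKING, Optional

if TYPE_CHECKING:
    # Seen only by static type checkers; never imported at runtime.
    import numpy


def is_available(name: str) -> bool:
    """Report whether a module could be imported, without importing it."""
    return importlib.util.find_spec(name) is not None


def as_array(data) -> "Optional[numpy.ndarray]":
    """Convert to an ndarray, paying the numpy import cost only on first use."""
    if not is_available("numpy"):
        return None
    import numpy as np  # deferred import: executed inside the function body
    return np.asarray(data)
```

The trade-off is the same as in the commit: the import cost (and the hard dependency) is shifted from module load time to the point where the name is actually needed, at the price of a slightly less conventional import location.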
