
feat: add seeds to dpmodel and fix seeds in tf & pt #3880

Merged
merged 14 commits into deepmodeling:devel on Jun 19, 2024

Conversation

njzjz
Member

@njzjz njzjz commented Jun 14, 2024

Fix #3799.

Summary by CodeRabbit

  • New Features

    • Introduced flexibility in specifying seed values, allowing either an integer or a list of integers.
    • Enhanced seed parameter usage across various initialization methods and classes for more controlled randomization.
  • Improvements

    • Updated seed initialization logic to include additional computations and dynamic adjustments.
    • Enhanced documentation for parameters in multiple classes, providing clearer usage guidelines.

njzjz added 10 commits June 13, 2024 20:06
Signed-off-by: Jinzhe Zeng <[email protected]>
@njzjz njzjz requested review from iProzd and wanghan-iapcm June 14, 2024 21:07
Contributor

coderabbitai bot commented Jun 14, 2024

Walkthrough

The main changes introduce a seed parameter across numerous classes and functions in the deepmd module, supporting both single integers and lists of integers. These adjustments ensure that the seed is passed down consistently to the network layers, improving reproducibility and control over randomness (an illustrative sketch of the child-seed pattern follows the tables below).

Changes

Files/Modules Changes Summary
deepmd/dpmodel/descriptor/dpa1.py Added seed parameter to constructors.
deepmd/dpmodel/descriptor/dpa2.py Updated init_subclass_params to include seed parameter.
deepmd/dpmodel/descriptor/repformers.py Enhanced DescrptBlockRepformers class with various new parameters, including seed.
deepmd/dpmodel/descriptor/se_atten_v2.py Modified seed in DescrptDPA1 class constructor to accept lists.
deepmd/dpmodel/descriptor/se_e2_a.py Modified calculation and usage of seed in __init__ method.
deepmd/dpmodel/descriptor/se_r.py Added seed parameter with calculation in constructor.
deepmd/dpmodel/descriptor/se_t.py Enhanced seeding logic with child_seed function.
deepmd/dpmodel/fitting/dipole_fitting.py Updated seed value to accept both integer and list types.
deepmd/dpmodel/fitting/dos_fitting.py Updated seed value to accept both integer and list types.
deepmd/dpmodel/fitting/ener_fitting.py Updated seed value to accept both integer and list types.
deepmd/dpmodel/fitting/general_fitting.py Added seed parameter in GeneralFitting class.
deepmd/dpmodel/fitting/polarizability_fitting.py Updated seed value to accept both integer and list types.
deepmd/dpmodel/utils/network.py Introduced optional precision and seed parameters in various class constructors.
deepmd/dpmodel/utils/seed.py Added child_seed function for generating child seeds.
deepmd/dpmodel/utils/type_embed.py Added seed parameter to __init__ method of affected class.
deepmd/pt/model/descriptor/dpa1.py Enhanced seed logic to support integers and lists in class constructor.
deepmd/pt/model/descriptor/dpa2.py Modified init_subclass_params to handle seed with additional computations and child_seed function.
deepmd/pt/model/descriptor/repformer_layer.py Enhanced RepformerLayer class by modifying seed values to control random seed behavior effectively.
deepmd/pt/model/descriptor/se_atten.py Enhanced seeding logic with customized calculations for seed parameter based on existing attributes.
deepmd/pt/model/network/network.py Adjusted __init__ method to modify seed parameter behavior.
deepmd/pt/utils/utils.py Enhanced get_generator function to accept lists as seeds and hash them to generate a torch generator seed.

Sequence Diagram(s)

N/A

Assessment against linked issues

Objective (Issue #3799) Addressed Explanation
Ensure seed parameters are passed to network layers
Enhance seed parameter to accept lists of integers
Modify seeding logic using child_seed function
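To illustrate the child_seed pattern referenced in the tables above, here is a minimal sketch of a helper with the overloaded signature shown later in this conversation. It is only an illustration; the actual logic in deepmd/dpmodel/utils/seed.py may differ.

from typing import List, Optional, Union

def child_seed(seed: Union[int, List[int], None], idx: int) -> Optional[List[int]]:
    """Derive a deterministic child seed for sub-module idx."""
    if seed is None:
        return None
    # Normalize the parent seed to a list and append the child index,
    # so every sub-module gets its own reproducible seed.
    parent = [seed] if isinstance(seed, int) else list(seed)
    return [*parent, idx]

# Example: each network layer receives its own child seed.
layer_seeds = [child_seed(1, ii) for ii in range(3)]  # [[1, 0], [1, 1], [1, 2]]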

@njzjz njzjz linked an issue Jun 14, 2024 that may be closed by this pull request
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

Outside diff range and nitpick comments (78)
deepmd/dpmodel/descriptor/se_r.py (2)

Line range hint 105-105: Avoid using mutable default arguments as they can lead to unexpected behavior.

- neuron: List[int] = [24, 48, 96],
- exclude_types: List[List[int]] = [],
+ neuron: Optional[List[int]] = None,
+ exclude_types: Optional[List[List[int]]] = None,
+ if neuron is None:
+     neuron = [24, 48, 96]
+ if exclude_types is None:
+     exclude_types = []

Also applies to: 109-109


Line range hint 381-381: The variable env_mat is declared but not used, which could lead to unnecessary resource allocation.

- env_mat = data.pop("env_mat")
deepmd/dpmodel/descriptor/se_t.py (2)

Line range hint 93-93: It's important to avoid mutable default arguments to prevent unintended side effects.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,
+ if exclude_types is None:
+     exclude_types = []

Also applies to: 98-98, 253-253


Line range hint 382-382: The variable env_mat is declared but appears to be unused, which could lead to unnecessary resource allocation.

- env_mat = data.pop("env_mat")
deepmd/pt/model/descriptor/se_r.py (3)

Line range hint 64-64: Replace mutable default arguments with immutable ones.

-        neuron=[25, 50, 100],
+        neuron=None,

-        exclude_types: List[Tuple[int, int]] = [],
+        exclude_types: Optional[List[Tuple[int, int]]] = None,

And then inside the function:

if neuron is None:
    neuron = [25, 50, 100]
if exclude_types is None:
    exclude_types = []

Also applies to: 69-69, 297-297


Line range hint 257-261: Simplify conditional logic using a ternary operator.

-        if callable(merged):
-            # only get data for once
-            sampled = merged()
-        else:
-            sampled = merged
+        sampled = merged() if callable(merged) else merged

Line range hint 435-435: Remove unused variable env_mat.

-        env_mat = EnvMatStatSe(self)
deepmd/dpmodel/fitting/general_fitting.py (2)

Line range hint 94-94: Avoid using mutable default arguments in function definitions to prevent potential bugs due to the mutable object being shared across all calls to the function.

- neuron: List[int] = [120, 120, 120],
+ neuron: Optional[List[int]] = None,

- remove_vaccum_contribution: Optional[List[bool]] = None,
+ remove_vaccum_contribution: Optional[List[bool]] = None,

And then inside the function:

if neuron is None:
    neuron = [120, 120, 120]
if remove_vaccum_contribution is None:
    remove_vaccum_contribution = []

Also applies to: 108-108, 262-262


Line range hint 309-309: Use direct key checks in dictionaries instead of calling .keys() for better performance and readability.

- if key in ["bias_atom_e"].keys():
+ if key in ["bias_atom_e"]:
deepmd/dpmodel/descriptor/se_e2_a.py (2)

Line range hint 147-147: Avoid using mutable default arguments in function definitions to prevent potential bugs due to the mutable object being shared across all calls to the function.

- neuron: List[int] = [24, 48, 96],
+ neuron: Optional[List[int]] = None,

- exclude_types: List[List[int]] = [],
+ exclude_types: Optional[List[List[int]]] = None,

- sel: List[int],
+ sel: Optional[List[int]] = None,

And then inside the function:

if neuron is None:
    neuron = [24, 48, 96]
if exclude_types is None:
    exclude_types = []
if sel is None:
    sel = []

Also applies to: 152-152, 326-326


Line range hint 456-456: Remove the assignment to the unused variable env_mat to clean up the code and improve readability.

- env_mat = data.pop("env_mat")
deepmd/pt/model/task/fitting.py (4)

Line range hint 137-137: Avoid using mutable default arguments for function parameters.

- neuron: List[int] = [128, 128, 128],
+ neuron: Optional[List[int]] = None,

And then inside the function:

if neuron is None:
    neuron = [128, 128, 128]

Line range hint 147-147: Mutable default arguments should be replaced with None to avoid potential bugs.

- exclude_types: List[int] = [],
+ exclude_types: Optional[List[int]] = None,

And then inside the function:

if exclude_types is None:
    exclude_types = []

Line range hint 253-253: Using mutable default arguments can lead to unexpected behaviors.

- exclude_types: List[int] = [],
+ exclude_types: Optional[List[int]] = None,

And then inside the function:

if exclude_types is None:
    exclude_types = []

Line range hint 328-328: Optimize dictionary key access.

- if key in variables.keys():
+ if key in variables:
deepmd/pt/model/descriptor/repformers.py (4)

Line range hint 99-99: Mutable default arguments in function parameters should be avoided to prevent bugs.

- exclude_types: List[List[int]] = [],
+ exclude_types: Optional[List[List[int]]] = None,

And then inside the function:

if exclude_types is None:
    exclude_types = []

Line range hint 391-391: Replace mutable default arguments with None.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

And then inside the function:

if exclude_types is None:
    exclude_types = []

Line range hint 458-458: Use an underscore to denote unused loop variables.

- for idx, ll in enumerate(self.layers):
+ for _idx, ll in enumerate(self.layers):

Line range hint 530-534: Simplify conditional assignment using a ternary operator.

- if callable(merged):
-     sampled = merged()
- else:
-     sampled = merged
+ sampled = merged() if callable(merged) else merged
deepmd/pt/model/descriptor/se_a.py (9)

Line range hint 78-78: Replace mutable default argument with immutable type.

- neuron=[25, 50, 100],
+ neuron=None,

Initialize neuron within the function if it is None.


Line range hint 84-84: Replace mutable default argument with immutable type.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Initialize exclude_types within the function if it is None.


Line range hint 224-224: Replace mutable default argument with immutable type.

- neuron=[25, 50, 100],
+ neuron=None,

Initialize neuron within the function if it is None.


Line range hint 323-323: Remove assignment to unused variable env_mat.

- env_mat = EnvMatStatSe(self)

Line range hint 376-376: Replace mutable default argument with immutable type.

- neuron=[25, 50, 100],
+ neuron=None,

Initialize neuron within the function if it is None.


Line range hint 382-382: Replace mutable default argument with immutable type.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Initialize exclude_types within the function if it is None.


Line range hint 567-571: Use ternary operator for conditional assignment.

- if callable(merged):
-     sampled = merged()
- else:
-     sampled = merged
+ sampled = merged() if callable(merged) else merged

Line range hint 591-591: Replace mutable default argument with immutable type.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Initialize exclude_types within the function if it is None.


Line range hint 641-641: Remove unused loop control variable ii.

- for ii, transform in enumerate(self.filter_layers_old[1:]):
+ for transform in self.filter_layers_old[1:]:
deepmd/pt/model/descriptor/dpa1.py (2)

Line range hint 215-215: Replace mutable default arguments with safer alternatives.

- neuron: list = [25, 50, 100],
+ neuron: Optional[List[int]] = None,

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

And then initialize them within the function:

if neuron is None:
    neuron = [25, 50, 100]
if exclude_types is None:
    exclude_types = []

This change prevents potential bugs related to the mutable default argument.

Also applies to: 227-227


Line range hint 517-517: Remove unused variables to clean up the code.

- env_mat = DPEnvMat(obj.rcut, obj.rcut_smth).serialize(),
- nall = extended_coord.view(nframes, -1).shape[1] // 3

These variables are declared but not used, which could lead to confusion and clutter in the codebase.

Also applies to: 585-585

deepmd/pt/model/descriptor/se_t.py (8)

Line range hint 113-113: Replace mutable default argument with an immutable one.

- neuron: List[int] = [24, 48, 96],
+ neuron: Optional[List[int]] = None,

And initialize within the function:

if neuron is None:
    neuron = [24, 48, 96]

Line range hint 118-118: Replace mutable default argument with an immutable one.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

And initialize within the function:

if exclude_types is None:
    exclude_types = []

Line range hint 253-253: Avoid mutable default arguments to prevent potential bugs.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

And initialize within the function:

if exclude_types is None:
    exclude_types = []

Line range hint 348-348: Remove assignment to unused variable env_mat.

- env_mat = data.pop("env_mat")

Line range hint 401-401: Avoid mutable default arguments to prevent potential bugs.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

And initialize within the function:

if exclude_types is None:
    exclude_types = []

Line range hint 406-406: Avoid mutable default arguments to prevent potential bugs.

- networks: List[Union[NativeNet, dict]] = [],
+ networks: Optional[List[Union[NativeNet, dict]]] = None,

And initialize within the function:

if networks is None:
    networks = []

Line range hint 594-598: Use ternary operator to simplify the if-else block.

- if callable(merged):
-     sampled = merged()
- else:
-     sampled = merged
+ sampled = merged() if callable(merged) else merged

Line range hint 618-618: Avoid mutable default arguments to prevent potential bugs.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

And initialize within the function:

if exclude_types is None:
    exclude_types = []
deepmd/dpmodel/utils/network.py (3)

Line range hint 576-576: Replace mutable default argument with an immutable one.

- neuron: List[int] = [24, 48, 96],
+ neuron: Optional[List[int]] = None,

And initialize within the function:

if neuron is None:
    neuron = [24, 48, 96]

Line range hint 678-678: Replace mutable default argument with an immutable one.

- neuron: List[int] = [24, 48, 96],
+ neuron: Optional[List[int]] = None,

And initialize within the function:

if neuron is None:
    neuron = [24, 48, 96]

Line range hint 783-783: Replace mutable default argument with an immutable one.

- networks: List[Union[NativeNet, dict]] = [],
+ networks: Optional[List[Union[NativeNet, dict]]] = None,

And initialize within the function:

if networks is None:
    networks = []
deepmd/pt/model/descriptor/dpa2.py (4)

Line range hint 207-221: Ensure proper propagation of the seed parameter through constructors.

- seed=seed + len(self.repinit_args.neuron) * 2 if seed is not None else None,
+ seed=seed + len(self.repinit_args.neuron) * 2 + 1 + self.repformer_args.nlayers * 14 if seed is not None else None,

The calculation of the seed parameter in the constructor of DescrptBlockRepformers should include all the factors as per the AI-generated summary. Ensure that the seed calculation is consistent and covers all the necessary components.


Line range hint 84-84: Avoid using mutable default arguments in function definitions.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,
...
+ if exclude_types is None:
+     exclude_types = []

Using mutable default arguments can lead to unexpected behavior if the function is called multiple times. It's safer to use None as a default value and initialize the mutable data structure within the function body.


Line range hint 434-434: Rename unused loop control variable.

- for ii, descrpt in enumerate([self.repinit, self.repformers]):
+ for _ii, descrpt in enumerate([self.repinit, self.repformers]):

The loop control variable ii is not used within the loop body. It is a good practice to rename unused variables to _ or _ii to indicate that they are placeholders.


Line range hint 554-554: Remove assignment to unused variable.

- env_mat = repformers_variable.pop("env_mat")
+ repformers_variable.pop("env_mat")

The local variable env_mat is assigned but never used. It's good practice to remove such assignments to keep the code clean and efficient.

deepmd/dpmodel/descriptor/dpa2.py (3)

Line range hint 67-67: Avoid using mutable default arguments.

- neuron: List[int] = [25, 50, 100],
+ neuron: Optional[List[int]] = None,

In the constructor, add:

if neuron is None:
    neuron = [25, 50, 100]

Line range hint 325-325: Avoid using mutable default arguments.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

In the constructor, add:

if exclude_types is None:
    exclude_types = []

Line range hint 812-812: Remove unused variable assignment.

- env_mat = repinit_variable.pop("env_mat")

This variable is assigned but never used within the scope of this function.

deepmd/pt/model/descriptor/se_atten.py (5)

Line range hint 65-65: Replace mutable default argument with an immutable one.

- neuron: list = [25, 50, 100],
+ neuron: Optional[list] = None,

In the method body, initialize neuron if it is None:

if neuron is None:
    neuron = [25, 50, 100]

Line range hint 82-82: Replace mutable default argument with an immutable one.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

In the method body, initialize exclude_types if it is None:

if exclude_types is None:
    exclude_types = []

Line range hint 398-402: Use a ternary operator for cleaner and more concise code.

- if callable(merged):
-     sampled = merged()
- else:
-     sampled = merged
+ sampled = merged() if callable(merged) else merged

Line range hint 422-422: Replace mutable default argument with an immutable one.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

In the method body, initialize exclude_types if it is None:

if exclude_types is None:
    exclude_types = []

Line range hint 472-472: Remove unused local variable nall.

- nall = extended_coord.view(nb, -1, 3).shape[1]
deepmd/pt/model/descriptor/repformer_layer.py (3)

Line range hint 898-898: Remove assignment to unused variable ng2.

- ng2 = g2.shape[-1]

Line range hint 1061-1064: Simplify the conditional assignment of gg1.

- if cal_gg1:
-     gg1 = _make_nei_g1(g1_ext, nlist)
- else:
-     gg1 = None
+ gg1 = _make_nei_g1(g1_ext, nlist) if cal_gg1 else None

Line range hint 1173-1173: Remove assignment to unused variable nitem.

- nitem = len(update_list)
deepmd/dpmodel/descriptor/dpa1.py (7)

Line range hint 226-226: Replace mutable default argument with an immutable one.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Ensure to initialize exclude_types within the function if it is None.


Line range hint 237-237: Replace mutable default argument with an immutable one.

- neuron: List[int] = [25, 50, 100],
+ neuron: Optional[List[int]] = None,

Ensure to initialize neuron within the function if it is None.


Line range hint 467-467: Remove unused variable assignment.

- nall = coord_ext.reshape(nf, -1).shape[1] // 3

Line range hint 547-547: Remove unused variable assignment.

- env_mat = EnvMat(self.rcut, self.rcut_smth, protection=self.env_protection)

Line range hint 609-609: Replace mutable default argument with an immutable one.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Ensure to initialize exclude_types within the function if it is None.


Line range hint 619-619: Replace mutable default argument with an immutable one.

- neuron: List[int] = [25, 50, 100],
+ neuron: Optional[List[int]] = None,

Ensure to initialize neuron within the function if it is None.


Line range hint 813-813: Replace mutable default argument with an immutable one.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Ensure to initialize exclude_types within the function if it is None.

deepmd/tf/descriptor/se_a.py (4)

Line range hint 173-173: Avoid using mutable default arguments for function parameters.

- exclude_types: List[List[int]] = []
+ exclude_types: Optional[List[List[int]]] = None

Initialize exclude_types within the function if it is None.


Line range hint 179-179: Avoid using mutable default arguments for function parameters.

- neuron: List[int] = [24, 48, 96],
+ neuron: Optional[List[int]] = None

Initialize neuron within the function if it is None.


Line range hint 613-619: Remove unused local variables to clean up the code.

- t_rcut = tf.constant(
-     np.max([self.rcut_r, self.rcut_a]),
-     name="rcut",
-     dtype=GLOBAL_TF_FLOAT_PRECISION,
- )
- t_ntypes = tf.constant(self.ntypes, name="ntypes", dtype=tf.int32)
- t_ndescrpt = tf.constant(self.ndescrpt, name="ndescrpt", dtype=tf.int32)

These variables are declared but never used, which can lead to confusion and clutter.


Line range hint 1298-1304: Use contextlib.suppress for a cleaner approach to handling exceptions where the action is to pass.

- try:
-     self.original_sel = get_tensor_by_name_from_graph(
-         graph, f"descrpt_attr{suffix}/original_sel"
-     )
- except GraphWithoutTensorError:
-     # original_sel is not restored in old graphs, assume sel never changed before
-     pass
+ from contextlib import suppress
+
+ with suppress(GraphWithoutTensorError):
+     self.original_sel = get_tensor_by_name_from_graph(
+         graph, f"descrpt_attr{suffix}/original_sel"
+     )

This approach reduces the complexity of the try-except block and makes the intention clearer.

deepmd/dpmodel/descriptor/repformers.py (7)

Line range hint 332-332: Avoid using mutable default arguments like lists for function parameters due to the risk of unexpected behavior if the function modifies the object.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Line range hint 376-376: The loop control variable idx is not used within the loop body. It's a good practice to replace it with _ if not used.

- for idx in range(nlayers):
+ for _ in range(nlayers):

Line range hint 544-544: The variable ng is assigned but never used. It can be safely removed to clean up the code.

- ng = g.shape[-1]

Line range hint 1305-1305: The variable ng1 is assigned but never used. It can be safely removed to clean up the code.

- ng1 = gg1.shape[-1]

Line range hint 1386-1386: The variable nall is assigned but never used. It can be safely removed to clean up the code.

- nall = g1_ext.shape[1]

Line range hint 1396-1399: Use a ternary operator for more concise and readable code when a variable is conditionally assigned based on a single condition.

- if cal_gg1:
-     gg1 = _make_nei_g1(g1_ext, nlist)
- else:
-     gg1 = None
+ gg1 = _make_nei_g1(g1_ext, nlist) if cal_gg1 else None

Line range hint 1505-1505: The variable nitem is assigned but never used. It can be safely removed to clean up the code.

- nitem = len(update_list)
deepmd/tf/descriptor/se_atten.py (4)

Line range hint 180-180: Replace mutable default arguments with immutable ones.

- neuron: List[int] = [25, 50, 100],
+ neuron: Optional[List[int]] = None,
- exclude_types: List[List[int]] = [],
+ exclude_types: Optional[List[List[int]]] = None,

Add the following initialization inside the methods:

if neuron is None:
    neuron = [25, 50, 100]
if exclude_types is None:
    exclude_types = []

Also applies to: 187-187, 2095-2095, 2106-2106


Line range hint 569-569: Remove unused local variables.

- t_rcut = tf.constant(
-     np.max([self.rcut_r, self.rcut_a]),
-     name="rcut",
-     dtype=GLOBAL_TF_FLOAT_PRECISION,
- )
- t_ntypes = tf.constant(self.ntypes, name="ntypes", dtype=tf.int32)
- t_ndescrpt = tf.constant(self.ndescrpt, name="ndescrpt", dtype=tf.int32)

Also applies to: 574-574, 575-575


Line range hint 1308-1308: Remove unused local variable nframes.

- nframes = dd_all.shape[0]

Line range hint 392-392: Use enumerate() to simplify the loop that uses the sys_num variable.

- sys_num = 0
- for cc, bb, tt, nn, mm, r_n in zip(
+ for sys_num, (cc, bb, tt, nn, mm, r_n) in enumerate(zip(

Resolved review threads (outdated): deepmd/dpmodel/descriptor/dpa2.py, deepmd/dpmodel/descriptor/repformers.py, deepmd/pt/model/descriptor/repformer_layer.py

codecov bot commented Jun 14, 2024

Codecov Report

Attention: Patch coverage is 97.89474% with 2 lines in your changes missing coverage. Please review.

Project coverage is 82.74%. Comparing base (c644314) to head (8aacdea).
Report is 117 commits behind head on devel.

Files with missing lines Patch % Lines
deepmd/dpmodel/descriptor/repformers.py 83.33% 1 Missing ⚠️
deepmd/dpmodel/utils/seed.py 92.30% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##            devel    #3880      +/-   ##
==========================================
+ Coverage   82.70%   82.74%   +0.03%     
==========================================
  Files         517      518       +1     
  Lines       50141    50215      +74     
  Branches     2984     2984              
==========================================
+ Hits        41469    41548      +79     
+ Misses       7762     7757       -5     
  Partials      910      910              


Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

Outside diff range and nitpick comments (7)
deepmd/tf/descriptor/se_atten.py (7)

Line range hint 180-180: Avoid using mutable default arguments for function parameters.

- exclude_types: List[List[int]] = []
+ exclude_types: Optional[List[List[int]]] = None

Initialize exclude_types within the function if it is None.


Line range hint 187-187: Avoid using mutable default arguments for function parameters.

- neuron: List[int] = [25, 50, 100]
+ neuron: Optional[List[int]] = None

Initialize neuron within the function if it is None.


Line range hint 569-575: Remove unused local variables to clean up the code.

- t_rcut = tf.constant(
-     np.max([self.rcut_r, self.rcut_a]),
-     name="rcut",
-     dtype=GLOBAL_TF_FLOAT_PRECISION,
- )
- t_ntypes = tf.constant(self.ntypes, name="ntypes", dtype=tf.int32)
- t_ndescrpt = tf.constant(self.ndescrpt, name="ndescrpt", dtype=tf.int32)

Line range hint 1308-1308: Remove unused local variable to clean up the code.

- nframes = dd_all.shape[0]

Line range hint 2095-2095: Avoid using mutable default arguments for function parameters.

- exclude_types: List[List[int]] = []
+ exclude_types: Optional[List[List[int]]] = None

Initialize exclude_types within the function if it is None.


Line range hint 392-392: Use enumerate() to simplify the loop and directly obtain the index.

for sys_num, (cc, bb, tt, nn, mm, r_n) in enumerate(zip(data_coord, data_box, data_atype, natoms_vec, mesh, real_natoms_vec)):
    ...

Line range hint 218-218: Consider adding a stacklevel to the warnings.warn call to specify the origin of the warning.

warnings.warn(
    "Set 'set_davg_zero' False in descriptor 'se_atten' may cause unexpected incontinuity during model inference!",
    stacklevel=2
)

Collaborator

@wanghan-iapcm wanghan-iapcm left a comment


I find that one has to spend much effort maintaining the number of seeds used by each module. I would recommend the parallel random number generation feature, with which we would not need to manually record the number of consumed seeds; see e.g. https://numpy.org/doc/stable/reference/random/parallel.html
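For reference, a minimal usage sketch of the NumPy parallel random number generation feature being suggested: a single parent SeedSequence spawns independent child sequences, so no module has to track how many seeds it consumes. This is an illustration only, not code from this PR.

import numpy as np

# One parent sequence for the whole model.
parent = np.random.SeedSequence(12345)

# Spawn independent child sequences, e.g. one per network layer;
# there is no need to count how many seeds each module has consumed.
children = parent.spawn(3)
rngs = [np.random.default_rng(child) for child in children]
weights = [rng.standard_normal((4, 4)) for rng in rngs]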

Resolved review threads (outdated): deepmd/dpmodel/fitting/general_fitting.py, deepmd/dpmodel/descriptor/se_e2_a.py
@njzjz
Member Author

njzjz commented Jun 16, 2024

I find that one has to spend much effort maintaining the number of seeds used by each module. I would recommend the parallel random number generation feature, with which we would not need to manually record the number of consumed seeds; see e.g. https://numpy.org/doc/stable/reference/random/parallel.html

It's unclear how to share arguments with PT when the input is a numpy.random.SeedSequence, considering that PT also uses several functions in dpmodel.

@njzjz
Member Author

njzjz commented Jun 17, 2024

I find that one has to spend much effort maintaining the number of seeds used by each module. I would recommend the parallel random number generation feature, with which we would not need to manually record the number of consumed seeds; see e.g. https://numpy.org/doc/stable/reference/random/parallel.html

It's unclear how to share arguments with PT when the input is a numpy.random.SeedSequence, considering that PT also uses several functions in dpmodel.

I just realized one can do this:

>>> import numpy as np
>>> rng = np.random.default_rng([1,2,3,4,5])

So it will work if we change the type of seed from int | None to int | list[int] | None.
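A small sketch of how a list-valued seed could then serve both backends: NumPy consumes the list directly, while for PyTorch it has to be reduced to a single integer first. The reduction shown here is only an assumption for illustration; the actual hashing in deepmd/pt/utils/utils.py may differ.

import numpy as np
import torch

seed = [1, 2, 3, 4, 5]

# NumPy accepts a list of integers as entropy directly.
rng = np.random.default_rng(seed)

# torch.Generator takes a single integer, so a list seed must first be
# mixed down to one value (hypothetical reduction for illustration).
mixed = hash(tuple(seed)) & 0x7FFF_FFFF_FFFF_FFFF
gen = torch.Generator().manual_seed(mixed)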

@njzjz njzjz requested a review from wanghan-iapcm June 17, 2024 23:23
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 9

Outside diff range and nitpick comments (62)
deepmd/dpmodel/fitting/ener_fitting.py (1)

Line range hint 33-33: Replace mutable default arguments with None and initialize them within the function to avoid potential bugs.

- neuron: List[int] = [120, 120, 120],
+ neuron: Optional[List[int]] = None,

- trainable: Optional[List[bool]] = None,
+ trainable: Union[bool, List[bool]] = True,

In the __init__ method, initialize neuron and trainable if they are None.

Also applies to: 47-47

deepmd/dpmodel/fitting/dos_fitting.py (1)

Line range hint 36-36: Replace mutable default arguments with None and initialize them within the function to avoid potential bugs.

- neuron: List[int] = [120, 120, 120],
+ neuron: Optional[List[int]] = None,

- trainable: Union[bool, List[bool]] = True,
+ trainable: Optional[Union[bool, List[bool]]] = None,

In the __init__ method, initialize neuron and trainable if they are None.

Also applies to: 46-46

Tools
Ruff

46-46: Do not use mutable data structures for argument defaults (B006)

Replace with None; initialize within function

deepmd/pt/model/task/dos.py (1)

Line range hint 48-48: Replace mutable default arguments with None and initialize them within the function to avoid potential bugs.

- neuron: List[int] = [128, 128, 128],
+ neuron: Optional[List[int]] = None,

- trainable: Union[bool, List[bool]] = True,
+ trainable: Optional[Union[bool, List[bool]]] = None,

In the __init__ method, initialize neuron and trainable if they are None.

Also applies to: 58-58

Tools
Ruff

58-58: Do not use mutable data structures for argument defaults (B006)

Replace with None; initialize within function

deepmd/pt/model/network/layernorm.py (1)

Line range hint 140-140: The local variable prec is assigned but not used in the deserialize method. This might be a remnant of a previous version of the code where prec was used, or it might be a preparation for future use that hasn't been implemented yet.

-        prec = PRECISION_DICT[obj.precision]

If the variable is not intended for immediate use, it's best to remove it to clean up the code and avoid confusion.

deepmd/pt/model/task/invar_fitting.py (1)

Line range hint 90-90: Using mutable default arguments like lists can lead to unexpected behaviors if the function modifies these arguments. Each call to the function without explicitly passing these arguments would use the same list, leading to potential bugs.

-        neuron: List[int] = [128, 128, 128],
+        neuron: Optional[List[int]] = None,
+        if neuron is None:
+            neuron = [128, 128, 128]

-        exclude_types: List[int] = [],
+        exclude_types: Optional[List[int]] = None,
+        if exclude_types is None:
+            exclude_types = []

This change ensures that a new list is created every time the function is called, preventing accidental sharing of mutable objects across function calls.

Also applies to: 100-100

deepmd/dpmodel/descriptor/se_atten_v2.py (1)

Line range hint 43-43: The use of mutable default arguments for exclude_types and neuron can lead to unintended side effects if these arguments are modified within the method. This is a common Python pitfall.

-        neuron: List[int] = [25, 50, 100],
+        neuron: Optional[List[int]] = None,
+        if neuron is None:
+            neuron = [25, 50, 100]

-        exclude_types: List[Tuple[int, int]] = [],
+        exclude_types: Optional[List[Tuple[int, int]]] = None,
+        if exclude_types is None:
+            exclude_types = []

This modification ensures that the defaults are safely handled, avoiding potential bugs related to the mutable default argument issue.

Also applies to: 53-53

source/tests/tf/test_model_se_a_ebd_v2.py (1)

Line range hint 118-118: The variable atom_ener is assigned but never used, leading to unnecessary memory allocation and potential confusion.

- t_atom_ener = tf.placeholder(GLOBAL_TF_FLOAT_PRECISION, [None], name="t_atom_ener")

Remove or use the variable to clean up the code.

deepmd/pt/model/network/mlp.py (1)

Line range hint 266-266: Unused variable prec should be removed to clean up the code.

- prec = PRECISION_DICT[obj.precision]
deepmd/dpmodel/descriptor/se_r.py (3)

Line range hint 109-109: Avoid using mutable default arguments for function parameters.

- neuron: List[int] = [24, 48, 96],
+ neuron: Optional[List[int]] = None,

And then inside the function:

if neuron is None:
    neuron = [24, 48, 96]

Line range hint 113-113: Avoid using mutable default arguments for function parameters.

- exclude_types: List[List[int]] = [],
+ exclude_types: Optional[List[List[int]]] = None,

And then inside the function:

if exclude_types is None:
    exclude_types = []

Line range hint 385-385: Remove the unused local variable env_mat to clean up the code.

- env_mat = data.pop("env_mat")
deepmd/dpmodel/descriptor/se_t.py (2)

Line range hint 97-97: Avoid using mutable default arguments for function parameters.

- neuron: List[int] = [24, 48, 96],
+ neuron: Optional[List[int]] = None,

And then inside the function:

if neuron is None:
    neuron = [24, 48, 96]

Line range hint 386-386: Remove the unused local variable env_mat to clean up the code.

- env_mat = data.pop("env_mat")
deepmd/pt/model/descriptor/se_r.py (4)

Line range hint 67-67: Avoid using mutable default arguments for function parameters.

- neuron=[25, 50, 100],
+ neuron: Optional[List[int]] = None,

And then inside the function:

if neuron is None:
    neuron = [25, 50, 100]

Line range hint 72-72: Avoid using mutable default arguments for function parameters.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

And then inside the function:

if exclude_types is None:
    exclude_types = []

Line range hint 260-264: Optimize the conditional logic for better readability and efficiency.

- if callable(merged):
-     # only get data for once
-     sampled = merged()
- else:
-     sampled = merged
+ sampled = merged() if callable(merged) else merged

Line range hint 438-438: Remove the unused local variable env_mat to clean up the code.

- env_mat = data.pop("env_mat")
deepmd/dpmodel/fitting/general_fitting.py (2)

89-90: Ensure proper documentation for the seed parameter.

The documentation for the seed parameter should clearly explain the effect of using an integer vs. a list of integers. This can help users understand how the seed influences the network initialization.
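For instance, the parameter description could read along these lines (illustrative wording only, not necessarily the docstring adopted in this PR):

seed : int or list of int, optional
    Random seed used to initialize the network parameters. A single integer
    is used as-is; a list of integers is treated as an entropy sequence
    (e.g. produced by child_seed), so that every sub-network receives an
    independent, reproducible seed. If None, the initialization is unseeded.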


Line range hint 311-311: Use key in dict instead of key in dict.keys().

Using key in dict.keys() is less efficient than key in dict. The latter is more Pythonic and should be used.

- if key in variables.keys():
+ if key in variables:
deepmd/dpmodel/descriptor/se_e2_a.py (2)

Line range hint 151-151: Avoid using mutable default arguments like lists for function parameters. This can lead to unexpected behavior due to Python's handling of mutable defaults.

- neuron: List[int] = [24, 48, 96],
- exclude_types: List[List[int]] = [],
- remove_vaccum_contribution: Optional[List[bool]] = None,
+ neuron: Optional[List[int]] = None,
+ exclude_types: Optional[List[List[int]]] = None,
+ remove_vaccum_contribution: Optional[List[bool]] = None,

Also applies to: 156-156, 330-330


Line range hint 460-460: The local variable env_mat is declared but not used within its scope, which could be an oversight or leftover from refactoring.

- env_mat = data.pop("env_mat")
deepmd/pt/model/task/fitting.py (2)

Line range hint 140-140: Using mutable default arguments such as lists or dictionaries can lead to unexpected behavior if the function modifies these objects. It's safer to use immutable defaults.

- neuron: List[int] = [128, 128, 128],
- exclude_types: List[int] = [],
- remove_vaccum_contribution: Optional[List[bool]] = None,
+ neuron: Optional[List[int]] = None,
+ exclude_types: Optional[List[int]] = None,
+ remove_vaccum_contribution: Optional[List[bool]] = None,

Also applies to: 150-150, 256-256


Line range hint 331-331: When checking for the presence of a key in a dictionary, use the more efficient key in dict instead of key in dict.keys().

- if key in variables.keys():
+ if key in variables:
deepmd/pt/model/descriptor/repformers.py (2)

Line range hint 463-463: The loop control variable idx is not used within the loop body and should be replaced with _ to signify that it is unused.

- for idx, ll in enumerate(self.layers):
+ for _, ll in enumerate(self.layers):

Line range hint 535-539: Refactor the if-else block to a ternary operator for clarity and conciseness.

- if callable(merged):
-     sampled = merged()
- else:
-     sampled = merged
+ sampled = merged() if callable(merged) else merged
deepmd/pt/model/descriptor/dpa1.py (4)

Line range hint 218-218: Avoid using mutable default arguments like lists for function parameters. This can lead to unexpected behavior if the function modifies the default value.

- exclude_types: List[Tuple[int, int]] = []
+ exclude_types: Optional[List[Tuple[int, int]]] = None

Initialize exclude_types within the function if it's None.


Line range hint 230-230: Similar to the previous comment, avoid mutable defaults for lists to prevent unintended side effects.

- neuron: list = [25, 50, 100]
+ neuron: Optional[List[int]] = None

Initialize neuron within the function if it's None.


Line range hint 520-520: The variable env_mat is defined but not used in the code. Removing unused variables can clean up the code and reduce confusion.

- env_mat = DPEnvMat(obj.rcut, obj.rcut_smth).serialize()

Line range hint 588-588: The variable nall is assigned but never used. It should be removed unless it's planned for future use.

- nall = extended_coord.view(nframes, -1).shape[1] // 3
deepmd/pt/model/descriptor/se_t.py (3)

Line range hint 256-256: Avoid using mutable default arguments for function parameters to prevent potential bugs.

- neuron: List[int] = [24, 48, 96],
+ neuron: Optional[List[int]] = None,

And then initialize neuron within the function if it is None.

Also applies to: 404-404


Line range hint 351-351: Remove the unused variable env_mat to clean up the code.

- env_mat = EnvMatStatSe(self)

Line range hint 597-601: Use a ternary operator for cleaner and more concise code.

- if callable(merged):
-     sampled = merged()
- else:
-     sampled = merged
+ sampled = merged() if callable(merged) else merged
deepmd/dpmodel/utils/network.py (1)

Line range hint 579-579: Avoid using mutable default arguments for function parameters to prevent potential bugs.

- neuron: List[int] = [24, 48, 96],
+ neuron: Optional[List[int]] = None,

And then initialize neuron within the function if it is None.

Also applies to: 681-681, 786-786

deepmd/pt/model/descriptor/dpa2.py (2)

Line range hint 432-432: The loop control variable ii is not used within the loop body.

- for ii, descrpt in enumerate([self.repinit, self.repformers]):
+ for _ii, descrpt in enumerate([self.repinit, self.repformers]):

Renaming ii to _ii indicates that it is an unused variable, which improves code readability.


Line range hint 552-552: Remove assignment to unused variable env_mat.

- env_mat = repinit_variable.pop("env_mat")
+ repinit_variable.pop("env_mat")

The variable env_mat is assigned but never used, which could lead to confusion and clutter in the code.

source/tests/tf/test_pairwise_dprc.py (1)

Line range hint 45-48: Consider using a ternary operator for setting default_places.

- if GLOBAL_NP_FLOAT_PRECISION == np.float32:
-     default_places = 4
- else:
-     default_places = 10
+ default_places = 4 if GLOBAL_NP_FLOAT_PRECISION == np.float32 else 10
deepmd/dpmodel/descriptor/dpa2.py (2)

Line range hint 70-70: Avoid using mutable default arguments in function definitions.

- neuron: List[int] = [25, 50, 100],
+ neuron: Optional[List[int]] = None,

And then initialize it within the function:

if neuron is None:
    neuron = [25, 50, 100]

Line range hint 810-810: Local variable env_mat is assigned but never used, which can be removed to clean up the code.

- env_mat = repinit_variable.pop("env_mat")
deepmd/pt/model/descriptor/se_atten.py (3)

Line range hint 68-68: Replace mutable default arguments with immutable ones to prevent potential bugs.

- neuron: list = [25, 50, 100],
- exclude_types: List[Tuple[int, int]] = [],
+ neuron: Optional[List[int]] = None,
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Also applies to: 85-85, 423-423


Line range hint 399-403: Simplify the conditional assignment using a ternary operator.

- if callable(merged):
-     sampled = merged()
- else:
-     sampled = merged
+ sampled = merged() if callable(merged) else merged

Line range hint 473-473: Remove the unused variable nall to clean up the code.

- nall = extended_coord.view(nb, -1, 3).shape[1]
deepmd/pt/model/descriptor/repformer_layer.py (4)

Line range hint 846-846: Remove assignment to unused variable ng1.

-        ng1 = gg1.shape[-1]

Line range hint 905-905: Remove assignment to unused variable ng2.

-        ng2 = g2.shape[-1]

Line range hint 1068-1071: Replace if-else-block with ternary operator for more concise code.

-        if cal_gg1:
-            gg1 = _make_nei_g1(g1_ext, nlist)
-        else:
-            gg1 = None
+        gg1 = _make_nei_g1(g1_ext, nlist) if cal_gg1 else None

Line range hint 1180-1180: Remove assignment to unused variable nitem.

-        nitem = len(update_list)
deepmd/dpmodel/descriptor/dpa1.py (7)

Line range hint 229-229: Avoid using mutable default arguments for function parameters.

- neuron: List[int] = [25, 50, 100],
+ neuron: Optional[List[int]] = None,

Initialize neuron within the function if it is None.


Line range hint 240-240: Avoid using mutable default arguments for function parameters.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Initialize exclude_types within the function if it is None.


Line range hint 470-470: Remove assignment to unused variable nall.

- nall = coord_ext.reshape(nf, -1).shape[1] // 3

Line range hint 550-550: Remove assignment to unused variable env_mat.

- env_mat = self.env_mat

Line range hint 612-612: Avoid using mutable default arguments for function parameters.

- neuron: List[int] = [25, 50, 100],
+ neuron: Optional[List[int]] = None,

Initialize neuron within the function if it is None.


Line range hint 622-622: Avoid using mutable default arguments for function parameters.

- exclude_types: List[Tuple[int, int]] = [],
+ exclude_types: Optional[List[Tuple[int, int]]] = None,

Initialize exclude_types within the function if it is None.


Line range hint 816-816: Avoid using mutable default arguments for function parameters.

- neuron: List[int] = [25, 50, 100],
+ neuron: Optional[List[int]] = None,

Initialize neuron within the function if it is None.

deepmd/dpmodel/descriptor/repformers.py (8)

Line range hint 156-156: Avoid using mutable default arguments in function definitions.

- exclude_types: List[Tuple[int, int]] = []
+ exclude_types: Optional[List[Tuple[int, int]]] = None

Initialize exclude_types within the function if it is None.


Line range hint 337-337: Avoid using mutable default arguments in function definitions.

- exclude_types: List[Tuple[int, int]] = []
+ exclude_types: Optional[List[Tuple[int, int]]] = None

Initialize exclude_types within the function if it is None.


Line range hint 381-381: Rename unused loop variable.

- for idx, ll in enumerate(self.layers):
+ for _idx, ll in enumerate(self.layers):

Line range hint 549-549: Remove assignment to unused variable.

- ng = g.shape[-1]

Line range hint 1322-1322: Remove assignment to unused variable.

- ng1 = gg1.shape[-1]

Line range hint 1403-1403: Remove assignment to unused variable.

- nall = g1_ext.shape[1]

Line range hint 1413-1416: Simplify conditional assignment using a ternary operator.

- if cal_gg1:
-     gg1 = _make_nei_g1(g1_ext, nlist)
- else:
-     gg1 = None
+ gg1 = _make_nei_g1(g1_ext, nlist) if cal_gg1 else None

Line range hint 1522-1522: Remove assignment to unused variable.

- nitem = len(update_list)
deepmd/pt/model/network/network.py (2)

Line range hint 450-450: Refactor the conditional assignment for clarity.

- hidden = input_dim if not hidden else hidden
+ hidden = hidden if hidden else input_dim

This change simplifies the logic and improves readability.


Line range hint 974-974: Rename unused loop variables to _.

- for i in range(self.layer_num):
+ for _i in range(self.layer_num):

This change adheres to Python conventions for unused variables, improving code cleanliness.

Also applies to: 1758-1758

Resolved review threads: deepmd/dpmodel/fitting/general_fitting.py, deepmd/dpmodel/descriptor/dpa2.py, deepmd/pt/model/descriptor/se_t.py (2 threads), deepmd/pt/model/task/dipole.py, deepmd/dpmodel/descriptor/se_t.py, deepmd/pt/model/task/polarizability.py, deepmd/pt/model/descriptor/se_atten_v2.py, deepmd/pt/utils/utils.py (outdated)


@overload
def child_seed(seed: None, idx: int) -> None: ...

Code scanning / CodeQL notice: This statement has no effect.

@overload
def child_seed(seed: Union[int, List[int]], idx: int) -> List[int]: ...

Code scanning / CodeQL notice: This statement has no effect.
@njzjz njzjz marked this pull request as draft June 17, 2024 23:54
@njzjz njzjz marked this pull request as ready for review June 18, 2024 00:39
Collaborator

@wanghan-iapcm wanghan-iapcm left a comment


Could you please add a UT for mixed entropy to make sure the implementation is consistent with NumPy.

@njzjz
Member Author

njzjz commented Jun 18, 2024

Could you please add a UT for mixed entropy to make sure the implementation is consistent with NumPy.

No, it's not consistent, as PyTorch limits the seed to the range (-0xffff_ffff_ffff_ffff, 0xffff_ffff_ffff_ffff). It is only ensured that the seed generated by mix_entropy has high entropy, using an approach similar to NumPy's.
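As a rough sketch of the kind of entropy mixing being described (an assumption for illustration, not the actual mix_entropy implementation): hash the list of integers down to one high-entropy value that stays within the range PyTorch accepts.

import hashlib

import torch

def mix_to_torch_seed(seeds: list) -> int:
    """Hash a list of integers into one high-entropy seed for torch."""
    h = hashlib.sha256()
    for s in seeds:
        h.update(f"{s},".encode())  # fixed, order-dependent encoding
    # Keep 63 bits so the value fits comfortably within the range
    # (-0xffff_ffff_ffff_ffff, 0xffff_ffff_ffff_ffff) accepted by PyTorch.
    return int.from_bytes(h.digest()[:8], "little") & 0x7FFF_FFFF_FFFF_FFFF

gen = torch.Generator().manual_seed(mix_to_torch_seed([1, 2, 3, 4, 5]))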

@njzjz njzjz requested a review from wanghan-iapcm June 18, 2024 21:18
@wanghan-iapcm wanghan-iapcm added this pull request to the merge queue Jun 19, 2024
Merged via the queue into deepmodeling:devel with commit 0c472d1 Jun 19, 2024
60 checks passed
@njzjz njzjz added this to the v2.2.11 milestone Jul 3, 2024
njzjz added a commit to njzjz/deepmd-kit that referenced this pull request Jul 3, 2024
Fix deepmodeling#3799.


- **New Features**
- Introduced flexibility in specifying seed values, allowing either an
integer or a list of integers.
- Enhanced seed parameter usage across various initialization methods
and classes for more controlled randomization.

- **Improvements**
- Updated seed initialization logic to include additional computations
and dynamic adjustments.
- Enhanced documentation for parameters in multiple classes, providing
clearer usage guidelines.

---------

Signed-off-by: Jinzhe Zeng <[email protected]>
(cherry picked from commit 0c472d1)
Signed-off-by: Jinzhe Zeng <[email protected]>
wanghan-iapcm pushed a commit that referenced this pull request Jul 3, 2024
## Summary by CodeRabbit

- **Bug Fixes**
- Resolved inconsistencies in seed values by incrementing `self.seed`
conditionally in descriptor modules.
  
- **Tests**
- Updated test arrays `refe`, `reff`, and `refv` with new reference
values.
- Adjusted expected values in `test_model_ener` method for better
accuracy.

These changes ensure more reliable descriptor computations and improved
test accuracy.

---------


(cherry picked from commit 0c472d1)

Signed-off-by: Jinzhe Zeng <[email protected]>
mtaillefumier pushed a commit to mtaillefumier/deepmd-kit that referenced this pull request Sep 18, 2024
Fix deepmodeling#3799.

## Summary by CodeRabbit

- **New Features**
- Introduced flexibility in specifying seed values, allowing either an
integer or a list of integers.
- Enhanced seed parameter usage across various initialization methods
and classes for more controlled randomization.

- **Improvements**
- Updated seed initialization logic to include additional computations
and dynamic adjustments.
- Enhanced documentation for parameters in multiple classes, providing
clearer usage guidelines.

---------

Signed-off-by: Jinzhe Zeng <[email protected]>
Successfully merging this pull request may close these issues.

[BUG] Seeds of descriptor/fitting in dpmodel are not passed to network