feat(show): add serialization-tree via DeepEval.serialize#5316
njzjz-bot wants to merge 5 commits into deepmodeling:master
Conversation
Authored by OpenClaw (model: gpt-5.2)

Implements #5185 (first step): add powered by a backend-unified wrapper.

Authored by OpenClaw (model: gpt-5.2)

(FYI) Previous comment got mangled by shell backticks; this one is the corrected text.

Authored by OpenClaw (model: gpt-5.2)
for more information, see https://pre-commit.ci
📝 Walkthrough

Adds a "serialization-tree" show option and a DeepEval.serialize() API implemented across backends. The show entrypoint can call serialize(), validate the presence of a "model" key, deserialize into a Node tree, and log the serialization tree. DeepEval stores model_file for backend resolution.

Changes
Sequence Diagram

```mermaid
sequenceDiagram
    participant CLI as CLI Parser
    participant Show as Show Entrypoint
    participant DeepEval as DeepEval Interface
    participant Backend as Backend Implementation
    participant Node as Node (Serialization)
    CLI->>Show: invoke show with "serialization-tree"
    Show->>DeepEval: request serialize()
    DeepEval->>Backend: resolve backend using model_file and call serialize()
    Backend-->>DeepEval: return serialized dict (must include "model")
    Show->>Node: Node.deserialize(serialized["model"])
    Node-->>Show: serialization tree
    Show->>Show: log serialization tree
```
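The flow in the diagram can be sketched with a minimal, self-contained Python stand-in. `Node` and `show_serialization_tree` below are illustrative stubs that only mirror the behavior the walkthrough describes (validate the "model" key, then deserialize into a tree); they are not the real deepmd classes.

```python
# Hypothetical stand-ins for the PR's flow; not the real deepmd code.
class Node:
    """Minimal stand-in for the serialization Node tree."""

    def __init__(self, name, children):
        self.name = name
        self.children = children

    @classmethod
    def deserialize(cls, data):
        # For illustration, treat every nested dict as a child node.
        children = [cls.deserialize(v) for v in data.values() if isinstance(v, dict)]
        return cls(data.get("@class", "node"), children)


def show_serialization_tree(serialized):
    # The entrypoint validates that the backend payload contains "model"
    # before building the tree, per the walkthrough above.
    if "model" not in serialized:
        raise KeyError('serialized payload must contain a "model" key')
    return Node.deserialize(serialized["model"])


payload = {
    "backend": "pt",
    "model": {"@class": "Model", "descriptor": {"@class": "Descriptor"}},
}
tree = show_serialization_tree(payload)
print(tree.name, [c.name for c in tree.children])  # Model ['Descriptor']
```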
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes

Possibly related PRs
Suggested labels
Suggested reviewers
🚥 Pre-merge checks: ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 1
🧹 Nitpick comments (1)
deepmd/infer/deep_eval.py (1)
444-446: Cyclic import flagged by static analysis.

CodeQL reports a cyclic import starting from deepmd.pretrained.deep_eval. While placing the import inside the method body mitigates runtime issues (the import only occurs when the pretrained branch is taken), consider whether these utilities could be imported from a lower-level module to avoid the cycle entirely.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@deepmd/infer/deep_eval.py` around lines 444 - 446, The import of parse_pretrained_alias inside deepmd.infer.deep_eval creates a cyclic dependency with deepmd.pretrained.deep_eval; to fix it, extract parse_pretrained_alias (and any small helper utilities it needs) into a lower-level module (e.g., deepmd.pretrained.utils or deepmd.utils.pretrained) and update both deepmd.pretrained.deep_eval and deepmd.infer.deep_eval to import parse_pretrained_alias from that new module; ensure the extracted function has no imports back to deepmd.pretrained.deep_eval to break the cycle and run tests to confirm no runtime regressions.
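A minimal sketch of the suggested restructuring: host the helper in a low-level module that imports nothing from either deep_eval module, so both can depend on it without a cycle. The `pretrained:` alias format and the function body here are assumptions for illustration, not the real deepmd implementation.

```python
# deepmd/utils/pretrained.py (hypothetical low-level module): contains no
# imports back into deepmd.pretrained.deep_eval or deepmd.infer.deep_eval.
from typing import Optional


def parse_pretrained_alias(model_file: str) -> Optional[str]:
    """Return the alias name if model_file looks like a pretrained alias.

    The "pretrained:" prefix is an assumed format for this sketch.
    """
    prefix = "pretrained:"
    if model_file.startswith(prefix):
        return model_file[len(prefix):]
    return None


print(parse_pretrained_alias("pretrained:dpa3"))  # dpa3
print(parse_pretrained_alias("model.pth"))  # None
```

Both deep_eval modules would then import `parse_pretrained_alias` from this module instead of from each other, breaking the cycle.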
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@deepmd/infer/deep_eval.py`:
- Around line 443-454: When resolving a pretrained alias you re-detect the
backend and immediately call serialize_hook on backend_cls (via
Backend.detect_backend_by_model and backend_cls().serialize_hook) without
checking that the backend supports the IO feature; add the same IO capability
check used in the regular path before calling serialize_hook: after determining
backend_cls = Backend.detect_backend_by_model(resolved), verify
Backend.feature(backend_cls, Backend.Feature.IO) (or equivalent feature-check
method used elsewhere) and raise the same NotImplementedError with the same
message if IO is not supported, otherwise call
backend_cls().serialize_hook(resolved).
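The guard this comment asks for can be sketched as follows. The `Feature` flag enum, the stub backend, and the exact feature-check expression are stand-ins modeled on the description above, not the real deepmd Backend API.

```python
from enum import Flag, auto


class Feature(Flag):
    ENTRY_POINT = auto()
    IO = auto()


class NoIOBackend:
    """Stand-in backend that lacks serialization (IO) support."""

    features = Feature.ENTRY_POINT

    def serialize_hook(self, path):
        return {"model": {}}


def serialize_resolved(backend_cls, resolved_path):
    # Same IO-capability check as the regular (non-alias) path: refuse to
    # call serialize_hook when the backend does not support IO.
    if not (backend_cls.features & Feature.IO):
        raise NotImplementedError(
            f"{backend_cls.__name__} does not support serialization"
        )
    return backend_cls().serialize_hook(resolved_path)


try:
    serialize_resolved(NoIOBackend, "resolved_model.pb")
except NotImplementedError as err:
    print(err)  # NoIOBackend does not support serialization
```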
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: a8a6ae18-b1d4-465f-b5b3-2149b4fbd411
📒 Files selected for processing (3)
- deepmd/entrypoints/show.py
- deepmd/infer/deep_eval.py
- deepmd/main.py
Codecov Report

❌ Patch coverage is

Additional details and impacted files

```
@@            Coverage Diff             @@
##           master    #5316      +/-   ##
==========================================
- Coverage   82.29%   80.29%    -2.01%
==========================================
  Files         775      812       +37
  Lines       77627    83534     +5907
  Branches     3676     4050      +374
==========================================
+ Hits        63887    67077     +3190
- Misses      12566    15234     +2668
- Partials     1174     1223       +49
```

☔ View full report in Codecov by Sentry.
Switch DeepEval.serialize() to delegate to DeepEvalBackend.serialize(), and implement serialize() in each backend by calling the underlying model's serialize(). Also move Node import in dp show to module top-level. Authored by OpenClaw (model: gpt-5.2)
Refactor update (per feedback): switch from

Latest commit: 1694360

Authored by OpenClaw (model: gpt-5.2)
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@deepmd/pt_expt/infer/deep_eval.py`:
- Around line 668-673: serialize() currently returns the raw model_dict from
serialize_from_file(self.model_path) but callers (e.g., the CLI in show.py)
expect a wrapper with keys "backend", "model", "model_def_script", and
"@variables"; change serialize() in the class to call
serialize_from_file(self.model_path), then construct and return a dict like
{"backend": "<appropriate backend name or value from model_dict if available>",
"model": model_dict, "model_def_script": <script or None>, "@variables": <vars
dict or empty dict>} so the returned structure contains those keys; ensure you
pull any available values from model_dict to populate "backend",
"model_def_script", and "@variables" or set sensible defaults if missing.
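A sketch of that wrapper payload, with `serialize_from_file` stubbed out. The key set mirrors the comment above; the default values chosen here are assumptions, not the real pt_expt behavior.

```python
def serialize_from_file(model_path):
    # Stand-in for the real pt_expt file-based serializer.
    return {"@class": "Model", "type_map": ["O", "H"]}


class PtExptDeepEvalSketch:
    def __init__(self, model_path):
        self.model_path = model_path

    def serialize(self):
        model_dict = serialize_from_file(self.model_path)
        # Wrap the raw model dict in the backend-unified payload expected by
        # dp show serialization-tree, with sensible defaults when the model
        # dict carries no script or variables.
        return {
            "backend": "pt_expt",
            "model": model_dict,
            "model_def_script": model_dict.get("model_def_script"),
            "@variables": model_dict.get("@variables", {}),
        }


payload = PtExptDeepEvalSketch("model.pth").serialize()
print(sorted(payload))  # ['@variables', 'backend', 'model', 'model_def_script']
```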
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: f7533fe2-e54e-4e12-ae64-e1712fa5c00b
📒 Files selected for processing (9)
- deepmd/dpmodel/infer/deep_eval.py
- deepmd/entrypoints/show.py
- deepmd/infer/deep_eval.py
- deepmd/jax/infer/deep_eval.py
- deepmd/pd/infer/deep_eval.py
- deepmd/pretrained/deep_eval.py
- deepmd/pt/infer/deep_eval.py
- deepmd/pt_expt/infer/deep_eval.py
- deepmd/tf/infer/deep_eval.py
🚧 Files skipped from review as they are similar to previous changes (1)
- deepmd/entrypoints/show.py
Pull request overview
This PR adds a new serialization-tree option to the dp show command by introducing a backend-unified DeepEval.serialize() API and implementing it across supported inference backends.
Changes:
- Added `DeepEvalBackend.serialize()` (and wrapper `DeepEval.serialize()`) to return a unified serialized-model dictionary.
- Implemented backend-specific `serialize()` methods for TF / PyTorch / Paddle / DPModel / JAX / pt_expt (+ pretrained delegator).
- Extended `dp show` CLI attribute choices and entrypoint logic to print a model serialization tree using `Node.deserialize(...)`.
Reviewed changes
Copilot reviewed 10 out of 10 changed files in this pull request and generated 2 comments.
Show a summary per file
| File | Description |
|---|---|
| deepmd/infer/deep_eval.py | Adds serialize() to the low-level backend interface and exposes it on the high-level DeepEval wrapper. |
| deepmd/tf/infer/deep_eval.py | Implements TF serialization via graph_def loading + Model.serialize(), including optional min_nbor_dist. |
| deepmd/pt/infer/deep_eval.py | Implements PyTorch serialization via self.dp.model["Default"].serialize(). |
| deepmd/pd/infer/deep_eval.py | Implements Paddle serialization via self.dp.model["Default"].serialize(). |
| deepmd/dpmodel/infer/deep_eval.py | Implements DPModel serialization via self.dp.serialize(). |
| deepmd/jax/infer/deep_eval.py | Adds JAX serialization method and includes jax_version. |
| deepmd/pt_expt/infer/deep_eval.py | Implements serialization by delegating to serialize_from_file(...). |
| deepmd/pretrained/deep_eval.py | Delegates serialization to the resolved backend for pretrained aliases. |
| deepmd/entrypoints/show.py | Adds serialization-tree printing using Node.deserialize(data["model"]). |
| deepmd/main.py | Adds serialization-tree to the dp show CLI ATTRIBUTES choices list. |
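
The per-backend pattern listed for these files can be illustrated with a stub. The real backends delegate to e.g. `self.dp.model["Default"].serialize()` (PyTorch/Paddle) or `self.dp.serialize()` (DPModel); the version string and payload keys below are placeholders, not deepmd's actual output.

```python
class StubModel:
    def serialize(self):
        return {"@class": "Model"}


class PTDeepEvalSketch:
    """Illustrative PyTorch-style backend: delegate to the model, add metadata."""

    def __init__(self):
        self.model = StubModel()

    def serialize(self):
        return {
            "backend": "pytorch",
            "pt_version": "0.0.0",  # placeholder; real code would use torch.__version__
            "model": self.model.serialize(),
        }


print(PTDeepEvalSketch().serialize()["model"])  # {'@class': 'Model'}
```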
Route JAX DeepEval serialization through the existing file-based serializer so .hlo and .savedmodel models follow the supported path instead of calling unimplemented model-level serialize() methods. Also add the missing pt_version field to the PyTorch backend serializer and wrap pt_expt serialization in the backend-unified payload expected by dp show serialization-tree. Add a targeted pt_expt serialization contract test. Authored by OpenClaw (model: gpt-5.4)
for more information, see https://pre-commit.ci
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@deepmd/jax/infer/deep_eval.py`:
- Around line 190-195: Update the serialize method in deep_eval.DeepEval (the
serialize(self) -> dict[str, Any] function that currently calls
serialize_from_file(self.model_path)) to explicitly document and guard against
TensorFlow SavedModel inputs: add a docstring note explaining that JAX backend
only supports serializing .jax/.hlo directories and that .savedmodel
(TensorFlow-wrapped) models are not supported, and add a pre-check that inspects
self.model_path (and/or the model wrapper type, e.g., TFModelWrapper if
accessible) to raise a clear ValueError with a descriptive message like
"serialize() not supported for .savedmodel / TFModelWrapper: JAX backend only
supports converting .jax/.hlo directories" before calling serialize_from_file.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 5017e4a9-2d01-483b-a3ea-85d28ab8a405
📒 Files selected for processing (4)
- deepmd/jax/infer/deep_eval.py
- deepmd/pt/infer/deep_eval.py
- deepmd/pt_expt/infer/deep_eval.py
- source/tests/pt_expt/infer/test_deep_eval.py
🚧 Files skipped from review as they are similar to previous changes (1)
- deepmd/pt_expt/infer/deep_eval.py
```python
def serialize(self) -> dict[str, Any]:
    from deepmd.jax.utils.serialization import (
        serialize_from_file,
    )

    return serialize_from_file(self.model_path)
```
🧩 Analysis chain

Repository: deepmodeling/deepmd-kit

🏁 Script executed:
```shell
#!/bin/bash
# Check if serialize_from_file handles .savedmodel or only .jax/.hlo
ast-grep --pattern $'def serialize_from_file($$$) {
  $$$
}'
```
Length of output: 49

🏁 Script executed:
```shell
cat -n deepmd/jax/infer/deep_eval.py | head -120 | tail -40
```
Length of output: 1878

🏁 Script executed:
```shell
rg "def serialize_from_file" deepmd/jax/utils/serialization.py -A 50
```
Length of output: 1741

🏁 Script executed:
```shell
rg "\.savedmodel|\.hlo|\.jax" deepmd/jax/infer/deep_eval.py -B 2 -A 2
```
Length of output: 756

🏁 Script executed:
```shell
rg "def serialize_from_file" deepmd/jax/utils/serialization.py -A 100 | head -110
```
Length of output: 1966
Document .savedmodel serialization limitation in serialize().
Calling serialize() on a model loaded from a .savedmodel file will raise ValueError: JAX backend only supports converting .jax directory. This appears intentional—JAX-native models (.hlo, .jax) are serializable, but TensorFlow models wrapped via TFModelWrapper are not.
Consider adding a docstring note to serialize() or raising a more descriptive error in the method to clarify this limitation upfront.
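The suggested fail-fast guard could look like the sketch below, with `serialize_from_file` stubbed out; the suffix check and error message follow the review comment, not deepmd itself.

```python
def serialize_from_file(model_path):
    # Stand-in for deepmd.jax.utils.serialization.serialize_from_file.
    return {"model": {}}


def serialize(model_path):
    """Serialize a JAX model.

    Note: only .jax/.hlo inputs are supported; TensorFlow-wrapped
    .savedmodel models are rejected up front.
    """
    if str(model_path).endswith(".savedmodel"):
        raise ValueError(
            "serialize() not supported for .savedmodel: JAX backend only "
            "supports converting .jax/.hlo directories"
        )
    return serialize_from_file(model_path)


try:
    serialize("frozen.savedmodel")
except ValueError as err:
    print(err)
```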
Summary by CodeRabbit
New Features
Tests