
docs: address PR feedback on external storage pages #4368

Open
lennessyy wants to merge 6 commits into feat/large-payload-storage from feat/large-payload-storage-feedback

Conversation

Contributor

@lennessyy lennessyy commented Mar 31, 2026

Summary

Addresses feedback from PR #4333 review.

┆Attachments: EDU-6136 docs: address PM feedback on external storage pages

@lennessyy lennessyy requested a review from a team as a code owner March 31, 2026 01:58

vercel bot commented Mar 31, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project: temporal-documentation · Deployment: Ready · Actions: Preview, Comment · Updated (UTC): Apr 1, 2026 6:17pm


Contributor

github-actions bot commented Mar 31, 2026

@lennessyy lennessyy changed the title docs: address PM feedback on external storage pages docs: address PR feedback on external storage pages Mar 31, 2026
external_storage=ExternalStorage(
    drivers=[driver],
    payload_size_threshold=256 * 1024,  # 256 KiB (default)
    payload_size_threshold=0,
Contributor


This should be an invalid value. I checked the Python implementation and we aren't enforcing that. When that is fixed, the value to set to offload all payloads would be 1 (or in languages that support it, like Python, None) because all payloads have a size that is at least 1 byte.

Setting that aside for a moment, why do we need to set this here? It's optional and defaulted for a reason. We don't want customers copying this verbatim and always externalizing all payloads.
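The validation rule described in the comment above can be sketched as a small helper. This is a hypothetical illustration only, not the SDK's actual code; `MIN_PAYLOAD_SIZE`, `validate_threshold`, and the default value are assumptions made for the sketch:

```python
MIN_PAYLOAD_SIZE = 1  # every payload is at least 1 byte

def validate_threshold(threshold):
    """Hypothetical check mirroring the comment above: 0 is invalid,
    and a threshold of 1 (or None, in languages that support it)
    makes every payload eligible for offloading."""
    if threshold is None:
        return MIN_PAYLOAD_SIZE  # offload all payloads
    if threshold < MIN_PAYLOAD_SIZE:
        raise ValueError("payload_size_threshold must be >= 1")
    return threshold
```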

Contributor Author

@lennessyy lennessyy Apr 1, 2026


The way I see it, the question is "Is this a parameter that we expect most users will have to customize for their workload?" If it is, then we should include the parameter in this sample.

It's not my intention to make users copy the code sample and externalize all payloads by default. I just want to include whatever the default value is here, so users see it when they copy the code. If the default is 256 KiB, then we use 256 KiB here. I wasn't sure whether we were changing the default to 0 based on the discussion in the other PR: https://github.com/temporalio/documentation/pull/4333/changes#r3012633767

But if we think this is a default most people should not bother to change, then I will remove it.


<!--SNIPEND-->

By default, payloads larger than approximately 200 KiB are offloaded to external storage. You can adjust this with
Contributor


The value is exactly 256 KiB. If the payload size is equal to or greater than that value, it is eligible to be externally stored. The driver selector ultimately determines whether a payload is externally stored. In the specific example above, the payload will be stored externally if its size is equal to or greater than the threshold, since there is no explicitly defined selector.
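The eligibility rule described here can be sketched as a standalone check. The function and parameter names below are illustrative only, not the SDK's API:

```python
def is_offloaded(payload_size, threshold=256 * 1024, selector=None):
    """Sketch of the rule above: a payload at or above the threshold
    is eligible; an explicit driver selector, when present, makes the
    final call. Illustrative names, not real SDK code."""
    if payload_size < threshold:
        return False
    if selector is not None:
        return selector(payload_size)
    # No explicit selector: eligible payloads are stored externally.
    return True

print(is_offloaded(256 * 1024))      # exactly at the threshold -> True
print(is_offloaded(256 * 1024 - 1))  # one byte below -> False
```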

external_storage=ExternalStorage(
    drivers=[driver],
    payload_size_threshold=256 * 1024,  # 256 KiB (default)
    payload_size_threshold=0,
Contributor Author

@lennessyy lennessyy Apr 1, 2026


Suggested change:
- payload_size_threshold=0,
+ payload_size_threshold=256 * 1024,
