Conversation

@ericallam
Member

No description provided.

@changeset-bot

changeset-bot bot commented Dec 16, 2025

⚠️ No Changeset found

Latest commit: 95d6d8b

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types

Click here to learn what changesets are, and how to add one.

Click here if you're a maintainer who wants to add a changeset to this PR

@coderabbitai
Contributor

coderabbitai bot commented Dec 16, 2025

Walkthrough

This pull request introduces a new Batches API (v3) with streaming capabilities across documentation and OpenAPI specifications. Changes include:

  • Two new API endpoints (POST /api/v3/batches and POST /api/v3/batches/{batchId}/items) with complete OpenAPI schema definitions
  • New MDX documentation files for these endpoints
  • Batch operation limits raised from 500 to 1,000 items (SDK 4.3.1+)
  • New batch rate-limiting and concurrency sections in the limits documentation
  • Environment variable documentation for the new streaming batch configuration
  • Updated existing references to reflect the new v3 batch API capabilities

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

  • docs/openapi.yml: Requires careful verification of new endpoint schemas (CreateBatchRequest, CreateBatchResponse, BatchItemNDJSON, StreamBatchItemsResponse), request/response bindings, error response codes, and security definitions for both new batch endpoints.
  • docs/limits.mdx: Verify accuracy of new rate-limiting token bucket parameters, concurrency limits per tier, and that SDK version qualifications (4.3.1+) are consistently applied throughout.
  • docs/management/batches/create.mdx and stream-items.mdx: Confirm OpenAPI operation references match the paths defined in openapi.yml.
  • docs/self-hosting/env/webapp.mdx: Validate new environment variable names and descriptions align with implementation requirements and v2 vs. v3 API distinctions.
  • Navigation consistency: Verify that the new "Batches API" group in docs/docs.json properly routes to the created MDX documentation files.

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Description check (⚠️ Warning): The PR description is entirely missing. The author did not provide any description content, which is required by the repository template. Resolution: Add a complete PR description following the repository template, including Testing, Changelog, and other required sections.
✅ Passed checks (2 passed)
  • Title check (✅ Passed): The title clearly summarizes the main change: documentation updates for new batch trigger limits and functionality.
  • Docstring Coverage (✅ Passed): No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.


@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 2

♻️ Duplicate comments (1)
docs/management/batches/stream-items.mdx (1)

1-4: Same openapi reference format as create.mdx - verify consistency.

This file has the same openapi: "openapi POST /api/v3/batches/{batchId}/items" format as create.mdx. If the format verification for create.mdx shows this is incorrect, the same fix should be applied here.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 28a66ac and 95d6d8b.

📒 Files selected for processing (9)
  • docs/docs.json (2 hunks)
  • docs/limits.mdx (1 hunks)
  • docs/management/batches/create.mdx (1 hunks)
  • docs/management/batches/stream-items.mdx (1 hunks)
  • docs/openapi.yml (2 hunks)
  • docs/self-hosting/env/webapp.mdx (1 hunks)
  • docs/snippets/rate-limit-hit-use-batchtrigger.mdx (1 hunks)
  • docs/triggering.mdx (2 hunks)
  • docs/v3-openapi.yaml (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.{js,ts,jsx,tsx,json,md,css,scss}

📄 CodeRabbit inference engine (AGENTS.md)

Format code using Prettier

Files:

  • docs/docs.json
🧠 Learnings (16)
📓 Common learnings
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `batch.trigger()` to trigger multiple different tasks at once from backend code
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `tasks.batchTrigger()` to trigger multiple runs of a single task with different payloads
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `batch.triggerByTask()` to batch trigger tasks by passing task instances for static task sets
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `batch.triggerByTaskAndWait()` to batch trigger tasks by passing task instances and wait for results
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `yourTask.batchTrigger()` to trigger multiple runs of a task from inside another task
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `batch.triggerAndWait()` to batch trigger multiple different tasks and wait for results
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `yourTask.batchTriggerAndWait()` to batch trigger tasks and wait for all results from a parent task
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `batch.trigger()` to trigger multiple different tasks at once from backend code

Applied to files:

  • docs/snippets/rate-limit-hit-use-batchtrigger.mdx
  • docs/triggering.mdx
  • docs/v3-openapi.yaml
  • docs/openapi.yml
  • docs/limits.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `tasks.batchTrigger()` to trigger multiple runs of a single task with different payloads

Applied to files:

  • docs/snippets/rate-limit-hit-use-batchtrigger.mdx
  • docs/triggering.mdx
  • docs/v3-openapi.yaml
  • docs/openapi.yml
  • docs/limits.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `batch.triggerByTask()` to batch trigger tasks by passing task instances for static task sets

Applied to files:

  • docs/snippets/rate-limit-hit-use-batchtrigger.mdx
  • docs/triggering.mdx
  • docs/v3-openapi.yaml
  • docs/openapi.yml
  • docs/limits.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `yourTask.batchTrigger()` to trigger multiple runs of a task from inside another task

Applied to files:

  • docs/snippets/rate-limit-hit-use-batchtrigger.mdx
  • docs/triggering.mdx
  • docs/v3-openapi.yaml
  • docs/openapi.yml
  • docs/limits.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `batch.triggerByTaskAndWait()` to batch trigger tasks by passing task instances and wait for results

Applied to files:

  • docs/snippets/rate-limit-hit-use-batchtrigger.mdx
  • docs/triggering.mdx
  • docs/v3-openapi.yaml
  • docs/openapi.yml
  • docs/limits.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `yourTask.batchTriggerAndWait()` to batch trigger tasks and wait for all results from a parent task

Applied to files:

  • docs/snippets/rate-limit-hit-use-batchtrigger.mdx
  • docs/triggering.mdx
  • docs/v3-openapi.yaml
  • docs/openapi.yml
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `batch.triggerAndWait()` to batch trigger multiple different tasks and wait for results

Applied to files:

  • docs/snippets/rate-limit-hit-use-batchtrigger.mdx
  • docs/triggering.mdx
  • docs/v3-openapi.yaml
  • docs/openapi.yml
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use the `task()` function from `trigger.dev/sdk/v3` to define tasks with id and run properties

Applied to files:

  • docs/snippets/rate-limit-hit-use-batchtrigger.mdx
  • docs/triggering.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `runs.subscribeToBatch()` to subscribe to changes for all runs in a batch

Applied to files:

  • docs/triggering.mdx
  • docs/openapi.yml
  • docs/limits.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `.withStreams()` to subscribe to realtime streams from task metadata in addition to run changes

Applied to files:

  • docs/triggering.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `yourTask.trigger()` to trigger a task from inside another task with specified payload

Applied to files:

  • docs/triggering.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Use `yourTask.triggerAndWait()` to trigger a task and wait for its result from a parent task

Applied to files:

  • docs/triggering.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Generate example payloads for tasks when possible

Applied to files:

  • docs/triggering.mdx
📚 Learning: 2025-11-10T09:09:07.399Z
Learnt from: myftija
Repo: triggerdotdev/trigger.dev PR: 2663
File: apps/webapp/app/env.server.ts:1205-1206
Timestamp: 2025-11-10T09:09:07.399Z
Learning: In the trigger.dev webapp, S2_ACCESS_TOKEN and S2_DEPLOYMENT_LOGS_BASIN_NAME environment variables must remain optional until an OSS version of S2 is available, to avoid breaking environments that don't have S2 provisioned.

Applied to files:

  • docs/self-hosting/env/webapp.mdx
📚 Learning: 2025-11-27T16:27:35.304Z
Learnt from: CR
Repo: triggerdotdev/trigger.dev PR: 0
File: .cursor/rules/writing-tasks.mdc:0-0
Timestamp: 2025-11-27T16:27:35.304Z
Learning: Applies to **/trigger/**/*.{ts,tsx,js,jsx} : Limit task duration using the `maxDuration` property (in seconds)

Applied to files:

  • docs/limits.mdx
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Cursor Bugbot
  • GitHub Check: Analyze (javascript-typescript)
🔇 Additional comments (11)
docs/self-hosting/env/webapp.mdx (1)

104-106: LGTM! Clear documentation of new streaming batch environment variables.

The addition of streaming batch configuration is well documented. The legacy v2 notation helps users understand the migration path, and the SDK version requirement (4.3.1+) is clearly stated.

docs/openapi.yml (3)

46-123: Well-structured batch creation endpoint with comprehensive error handling.

The two-phase batch API design is clear. The inclusion of JWT headers in the 202 response and comprehensive rate-limiting headers in the 429 response demonstrates good API design.
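
To make the two-phase shape concrete, here is a rough client-side sketch. Only the two endpoint paths and the NDJSON content type are taken from this PR; the base URL, auth header, and field names (taskIdentifier, expectedCount, batchId) are placeholder assumptions that would need checking against the actual CreateBatchRequest/CreateBatchResponse schemas.

```ts
// Hypothetical end-to-end sketch of the two-phase flow described above.
// Endpoint paths come from this PR; everything else is illustrative.
async function createAndFillBatch(): Promise<string> {
  const baseUrl = "https://api.trigger.dev"; // assumed cloud API base URL
  const authHeader = { Authorization: `Bearer ${process.env.TRIGGER_SECRET_KEY}` };

  // Phase 1: create the batch shell and receive its id.
  const createRes = await fetch(`${baseUrl}/api/v3/batches`, {
    method: "POST",
    headers: { ...authHeader, "Content-Type": "application/json" },
    // Field names here are guesses at the CreateBatchRequest shape.
    body: JSON.stringify({ taskIdentifier: "my-task", expectedCount: 2 }),
  });
  const { batchId } = (await createRes.json()) as { batchId: string };

  // Phase 2: stream the items as NDJSON, one JSON object per line.
  const items = [{ payload: { userId: "1" } }, { payload: { userId: "2" } }];
  const ndjson = items.map((item) => JSON.stringify(item)).join("\n") + "\n";

  await fetch(`${baseUrl}/api/v3/batches/${batchId}/items`, {
    method: "POST",
    headers: { ...authHeader, "Content-Type": "application/x-ndjson" },
    body: ndjson,
  });

  return batchId;
}
```

Presumably the SDK's batch trigger methods wrap this flow internally; direct HTTP calls like the above would mainly matter for non-JavaScript clients.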


286-370: LGTM! Comprehensive schema definitions for the new batch API.

The schemas are well-defined with clear descriptions and appropriate validations. The conditional fields in StreamBatchItemsResponse (enqueuedCount and expectedCount only present when sealed=false) provide helpful retry context.
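
As a reading aid, the conditional fields called out here map naturally onto a discriminated union. The shape below is inferred only from the field names mentioned in this comment and is not the schema from openapi.yml.

```ts
// Approximate model of StreamBatchItemsResponse as described above:
// enqueuedCount and expectedCount only appear while the batch is not sealed,
// giving the caller retry context. Purely illustrative.
type StreamBatchItemsResponse =
  | { sealed: true }
  | {
      sealed: false;
      enqueuedCount: number; // items the server has accepted so far
      expectedCount: number; // items the batch was declared to contain
    };

// Example narrowing:
function describe(res: StreamBatchItemsResponse): string {
  return res.sealed
    ? "batch sealed"
    : `received ${res.enqueuedCount} of ${res.expectedCount} items`;
}
```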


124-198: The OpenAPI specification for this endpoint is correct. Both application/x-ndjson and application/ndjson are widely supported MIME types for NDJSON streams, with the latter being the standardized type and the former a common de facto standard. Backpressure is a legitimate implementation pattern for HTTP POST endpoints with streaming request bodies—the server can control the rate at which it reads incoming data to apply backpressure to the client. The specification accurately reflects this capability.
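
To illustrate the NDJSON framing and the backpressure point in isolation (no trigger.dev APIs involved), a producer can be written as an async generator that emits one JSON object per line; because the consumer pulls lines at its own pace, a server that reads the request body slowly throttles the producer automatically.

```ts
// Serialize an async iterable of items into NDJSON: one JSON object per line.
// The generator only produces a line when the consumer requests one, so a
// slow reader applies backpressure all the way back to the item source.
async function* toNDJSON<T>(items: AsyncIterable<T>): AsyncGenerator<string> {
  for await (const item of items) {
    yield JSON.stringify(item) + "\n";
  }
}

// Example usage with an in-memory source (any AsyncIterable works):
async function* exampleItems() {
  for (let i = 0; i < 3; i++) {
    yield { payload: { index: i } };
  }
}

async function main() {
  for await (const line of toNDJSON(exampleItems())) {
    process.stdout.write(line); // {"payload":{"index":0}} and so on, one per line
  }
}

main();
```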

docs/snippets/rate-limit-hit-use-batchtrigger.mdx (1)

1-1: LGTM! Clear upgrade guidance for batch trigger limits.

The snippet now accurately reflects the increased limit (1,000 items) while maintaining backward compatibility information for prior SDK versions.

docs/v3-openapi.yaml (1)

1304-1304: LGTM! Batch trigger limit accurately updated.

The description now correctly reflects the increased limit with appropriate SDK version requirements and backward compatibility notes.

docs/docs.json (1)

273-279: LGTM! Navigation structure properly updated for new Batches API.

The new "Batches API" group is logically placed and the pages correspond to the newly added MDX documentation files.

docs/limits.mdx (3)

78-88: Excellent clarification of batch payload and size limits.

The updated documentation clearly distinguishes between per-item limits (3MB each) and the previous combined limit (1MB total), with appropriate SDK version requirements. This prevents confusion and helps users understand the upgrade benefits.
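
Since the 3MB per-item limit is easy to hit from application code, a client-side pre-flight check can be worthwhile. The 3MB figure comes from the doc change above; the helper itself is an illustrative sketch, not part of the SDK.

```ts
// Rough pre-flight size check before adding an item to a batch. The
// serialized JSON length is used as an approximation of what the API will
// measure; the exact accounting on the server side is not specified here.
const MAX_ITEM_PAYLOAD_BYTES = 3 * 1024 * 1024; // 3MB per item (SDK 4.3.1+ docs)

function assertPayloadWithinLimit(payload: unknown): void {
  const bytes = new TextEncoder().encode(JSON.stringify(payload)).length;
  if (bytes > MAX_ITEM_PAYLOAD_BYTES) {
    throw new Error(
      `Batch item payload is ${bytes} bytes, exceeding the 3MB per-item limit`
    );
  }
}

// Usage: assertPayloadWithinLimit({ userId: "123", report: largeObject });
```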


92-103: Well-documented rate limiting with helpful example.

The token bucket algorithm is clearly explained with concrete numbers and a practical example. The per-tier breakdown helps users understand their specific limits.
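
For readers unfamiliar with the model, a token bucket in which each run in a batch consumes one token can be sketched as follows. The capacity and refill rate below are placeholders, not the per-tier values documented in docs/limits.mdx.

```ts
// Minimal token bucket: up to `capacity` tokens, refilled continuously at
// `refillPerSecond`. Each triggered run consumes one token, so a batch of
// N items consumes N tokens at once.
class TokenBucket {
  private tokens: number;
  private lastRefill = Date.now();

  constructor(
    private readonly capacity: number,
    private readonly refillPerSecond: number
  ) {
    this.tokens = capacity;
  }

  tryConsume(count: number): boolean {
    this.refill();
    if (count > this.tokens) return false; // rate limited; caller should back off and retry
    this.tokens -= count;
    return true;
  }

  private refill(): void {
    const elapsedSeconds = (Date.now() - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSeconds * this.refillPerSecond);
    this.lastRefill = Date.now();
  }
}

// Placeholder numbers: a 500-token bucket refilling at 10 tokens per second.
const bucket = new TokenBucket(500, 10);
console.log(bucket.tryConsume(100)); // true: a 100-item batch spends 100 tokens
```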


104-115: Clear batch processing concurrency limits.

The new section effectively communicates how many batches can be processed simultaneously per environment. The note about "items actively being processed into runs" clarifies what the limit applies to.

docs/triggering.mdx (1)

201-202: LGTM! Clear version-specific limit documentation.

The updated batch trigger limit is well-documented with SDK version requirements, helping users understand capability differences between SDK versions.

Comment on lines +1 to +4
---
title: "Create batch"
openapi: "openapi POST /api/v3/batches"
---

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

```bash
#!/bin/bash
# Check the openapi reference format used in other management API files
rg -n "^openapi:" docs/management/ -A 0 -C 0
```

Repository: triggerdotdev/trigger.dev

Length of output: 2504


Fix openapi reference format inconsistency.

The openapi reference in line 3 uses "openapi POST /api/v3/batches", but this differs from the standard format used across all other management API documentation files, which use "v3-openapi POST ...". Update this line to align with the codebase convention: "v3-openapi POST /api/v3/batches". Note that docs/management/batches/stream-items.mdx has the same inconsistency.
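
Applied, the fix would leave the frontmatter reading as follows (stream-items.mdx gets the same change with its own path):

```mdx
---
title: "Create batch"
openapi: "v3-openapi POST /api/v3/batches"
---
```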

🤖 Prompt for AI Agents
In docs/management/batches/create.mdx around lines 1 to 4, the frontmatter
openapi tag uses "openapi POST /api/v3/batches" which is inconsistent with the
project convention; replace that value with "v3-openapi POST /api/v3/batches" to
match other management API docs, and while here also check and update
docs/management/batches/stream-items.mdx if it contains the same "openapi ..."
line so both files use "v3-openapi POST /api/v3/batches".

Comment on lines +974 to +1010
## Streaming batch triggering

<Note>This feature is only available with SDK 4.3.1+</Note>

For large batches, you can pass an `AsyncIterable` or `ReadableStream` instead of an array. This allows you to generate items on-demand without loading them all into memory upfront.

```ts /trigger/my-task.ts
import { task } from "@trigger.dev/sdk";
import { myOtherTask } from "~/trigger/my-other-task";

export const myTask = task({
  id: "my-task",
  run: async (payload: { userIds: string[] }) => {
    // Use an async generator to stream items
    async function* generateItems() {
      for (const userId of payload.userIds) {
        yield { payload: { userId } };
      }
    }

    const batchHandle = await myOtherTask.batchTrigger(generateItems());

    return { batchId: batchHandle.batchId };
  },
});
```

This works with all batch trigger methods:

- `yourTask.batchTrigger()`
- `yourTask.batchTriggerAndWait()`
- `batch.triggerByTask()`
- `batch.triggerByTaskAndWait()`
- `tasks.batchTrigger()`

Streaming is especially useful when generating batches from database queries, API pagination, or
file processing where you don't want to load all items into memory at once.

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

```bash
# Check what limit is mentioned at lines 201-202 in docs/triggering.mdx
head -210 docs/triggering.mdx | tail -15
```

Repository: triggerdotdev/trigger.dev

Length of output: 562


🏁 Script executed:

```bash
# Search for streaming limits documentation in the docs
rg -n -i "stream.*limit|stream.*batch.*item" docs/triggering.mdx docs/limits.mdx

# Search for rate limiting information related to batches
rg -n -B3 -A3 "rate.*limit|batch.*rate" docs/limits.mdx

# Search for error handling patterns with streaming
rg -n -i "error.*stream|stream.*error|async.*generator.*error" docs/triggering.mdx
```

Repository: triggerdotdev/trigger.dev

Length of output: 816


🏁 Script executed:

```bash
# Get the full batch trigger rate limits section
sed -n '92,120p' docs/limits.mdx

# Search for any streaming-specific documentation in limits
rg -n -A10 "stream" docs/limits.mdx
```

Repository: triggerdotdev/trigger.dev

Length of output: 1287


🏁 Script executed:

```bash
# Search for error handling guidance with async generators
rg -n -B5 -A5 "async.*generator|generator.*error|throw|catch" docs/triggering.mdx | head -50

# Look for any error handling section in triggering documentation
rg -n -i "error.*handling|error.*batch|error.*stream" docs/triggering.mdx
```

Repository: triggerdotdev/trigger.dev

Length of output: 1878


🏁 Script executed:

```bash
# Look at the streaming batch section specifically (lines 974-1010)
sed -n '974,1010p' docs/triggering.mdx

# Search for any error handling guidance in streaming context
rg -n -B10 -A10 "async function\* generateItems" docs/triggering.mdx
```

Repository: triggerdotdev/trigger.dev

Length of output: 2046


🏁 Script executed:

```bash
# Check if there's any clarification about the 1000 vs 500 limit for SDK 4.3.1+
rg -n -B5 -A5 "SDK 4.3.1|4.3.1" docs/triggering.mdx | head -40

# Search for any changelog or migration notes about batch limits
rg -n "500.*1000|1000.*500|batch.*limit" docs/
```

Repository: triggerdotdev/trigger.dev

Length of output: 2620


🏁 Script executed:

```bash
# Check the exact limits.mdx content around the batch limits
sed -n '85,95p' docs/limits.mdx

# Search for streaming error handling or considerations
rg -n -B3 -A3 "streaming|stream.*fail|stream.*error" docs/limits.mdx docs/triggering.mdx
```

Repository: triggerdotdev/trigger.dev

Length of output: 391


Add references to batch limits and rate limiting in the streaming batch section.

The streaming batch triggering section introduces AsyncIterable/ReadableStream support but omits important operational constraints:

  1. Batch size limits: The streaming feature requires SDK 4.3.1+, which supports up to 1,000 items per batch (see docs/limits.mdx for details). This should be explicitly stated.

  2. Rate limiting: Batch triggering uses a token bucket algorithm where each run in a batch consumes one token. Streaming batches are subject to the same rate limits as regular batch triggers.

  3. Error handling: Consider documenting how errors in the async generator/stream are handled (e.g., does one failed item stop the stream or continue processing remaining items?).
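
On the third point, the SDK's actual mid-stream error semantics aren't documented in this PR and would need to be verified; one defensive pattern an application can use regardless is to contain per-item failures inside the generator itself, so a single bad source record doesn't abort the whole streamed batch. A minimal sketch, assuming a hypothetical buildPayload helper:

```ts
// Defensive wrapper: build each batch item inside try/catch so a failing
// source record is logged and skipped instead of throwing out of the
// generator. Whether the SDK would otherwise stop or continue on such an
// error is not specified here.
async function* safeItems(userIds: string[]) {
  for (const userId of userIds) {
    try {
      yield { payload: await buildPayload(userId) };
    } catch (error) {
      console.error(`Skipping user ${userId}:`, error);
    }
  }
}

// Hypothetical per-record helper that may fail for individual records.
async function buildPayload(userId: string): Promise<{ userId: string }> {
  if (!userId) throw new Error("missing userId");
  return { userId };
}
```

The wrapped generator can then be passed to a batch trigger method exactly as in the quoted example, e.g. myOtherTask.batchTrigger(safeItems(userIds)).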

🤖 Prompt for AI Agents
In docs/triggering.mdx around lines 974 to 1010, the Streaming batch triggering
section lacks operational constraints: add a brief note that SDK 4.3.1+
streaming batches abide by the documented batch size limits (up to 1,000 items —
reference docs/limits.mdx), document that streaming batches are subject to the
same token-bucket rate limiting as regular batch triggers (each run consumes one
token), and add a short paragraph describing error handling behavior for async
generators/streams (clarify whether a single item error stops the stream or how
failures are surfaced and recovered). Ensure these points are concise, reference
docs/limits.mdx for limits, and place them immediately after the explanation of
AsyncIterable/ReadableStream usage.
