# Common Errors
Pipedream sources and workflows can throw errors for a variety of reasons. In some cases, you'll encounter an error sending data to a third-party API; in other cases, Pipedream will raise an error related to platform limits you've exceeded, or other internal errors.
This doc reviews common errors you'll encounter, and how to troubleshoot them.
## Warnings
Pipedream displays warnings below steps in certain conditions. These warnings do not stop the execution of your workflow, but can signal an issue you should be aware of.
### This step was still trying to run code when the step ended. Make sure you await all Promises, or promisify callback functions.
See the reference on running asynchronous code on Pipedream.
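For example, here's a minimal sketch of the anti-pattern and the fix, written in Pipedream's Node.js code step format (the URL and file path are placeholders):

```javascript
import fs from "fs";
import { promisify } from "util";

export default defineComponent({
  async run({ steps, $ }) {
    // Anti-pattern: a fire-and-forget call. The step can end before
    // this Promise settles, which triggers the warning above.
    // fetch("https://example.com/api");

    // Fix 1: await every Promise so the step waits for it to resolve.
    const res = await fetch("https://example.com/api");

    // Fix 2: promisify callback-style APIs so they can be awaited.
    const readFile = promisify(fs.readFile);
    const contents = await readFile("/tmp/example.txt", "utf8").catch(() => null);

    return { status: res.status, contents };
  },
});
```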
## Pipedream Internal Errors
Pipedream sets limits on runtime, memory, and other execution-related properties. If you exceed these limits, you'll receive one of the errors below. See the limits doc for details on specific limits.
### Invocations Quota Exceeded
On the Developer (free) tier, Pipedream imposes a limit on the daily invocations across all workflows and sources. If you hit this limit, you'll see an Invocations Quota Exceeded error.
Paid plans, like the Professional Tier, have no invocations limit. Upgrade here.
### Runtime Quota Exceeded
On the Developer (free) tier, Pipedream imposes a limit on the daily compute time across all workflows and sources. If you hit this limit, you'll see a Runtime Quota Exceeded error.
Paid plans, like the Professional Tier, have no compute time limit. Upgrade here.
### Timeout
Event sources and workflows have a default time limit on a given execution. If your code exceeds that limit, you may encounter a Timeout error.
Currently, you can raise the execution limit of a workflow in your workflow's settings. If you need to change the execution limit for an event source, please reach out to our team.
### Out of Memory
Pipedream limits the default memory available to workflows and event sources. If you exceed this memory, you'll see an Out of Memory error.
This can happen for a variety of reasons. Most commonly, it occurs when you try to load a large file or object into a variable in memory. Where possible, consider streaming the file to or from disk instead of holding it all in memory, using a streaming technique like the sketch below.
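For example, here's a minimal sketch of streaming an HTTP response straight to the `/tmp` directory in a Node.js code step, assuming Node 18+ for the built-in `fetch` (the URL is a placeholder):

```javascript
import fs from "fs";
import { Readable } from "stream";
import { pipeline } from "stream/promises";

export default defineComponent({
  async run({ steps, $ }) {
    const res = await fetch("https://example.com/large-file.zip");
    // Pipe the response body to disk so the full file never has to
    // fit in memory at once.
    await pipeline(
      Readable.fromWeb(res.body),
      fs.createWriteStream("/tmp/large-file.zip")
    );
    return "/tmp/large-file.zip";
  },
});
```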
You can raise the memory of your workflow in your workflow's Settings.
### Rate Limit Exceeded
Pipedream limits the number of events that can be processed by a given interface (e.g. HTTP endpoints) during a given interval. This limit is most commonly reached for HTTP interfaces - see the QPS limits documentation for more information on that limit.
This limit can be raised for HTTP endpoints. Reach out to our team to request an increase.
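If your requests are being rejected at this limit, a common client-side mitigation is to back off and retry. Here's a hedged sketch in Node.js; the endpoint URL is a placeholder, and treating the rejection as an HTTP 429 status is an assumption for illustration:

```javascript
// Retry a POST with exponential backoff when the endpoint signals
// that the rate limit was exceeded.
async function sendWithRetry(url, body, maxRetries = 5) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(body),
    });
    if (res.status !== 429) return res; // assumed rate-limit status
    // Wait 1s, 2s, 4s, ... before retrying.
    await new Promise((r) => setTimeout(r, 1000 * 2 ** attempt));
  }
  throw new Error("Rate limit retries exhausted");
}
```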
### Request Entity Too Large
By default, Pipedream limits the size of incoming HTTP payloads. If you exceed this limit, you'll see a Request Entity Too Large error.
Pipedream supports two different ways to bypass this limit. Both of these interfaces support uploading data up to 5TB, though you may encounter other platform limits.
- You can send large HTTP payloads by passing the `pipedream_upload_body=1` query string or an `x-pd-upload-body: 1` HTTP header in your HTTP request (see the example below). Read more here.
- You can upload multiple large files, like images and videos, using the large file upload interface.
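For example, here's a hedged sketch of both options in Node.js; the endpoint URL and payload are placeholders:

```javascript
// Stand-in for a payload larger than the default limit.
const largePayload = JSON.stringify({ data: "..." });
const endpoint = "https://myendpoint.m.pipedream.net"; // placeholder URL

// Option 1: signal the upload via the query string.
await fetch(`${endpoint}?pipedream_upload_body=1`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: largePayload,
});

// Option 2: signal the upload via an HTTP header.
await fetch(endpoint, {
  method: "POST",
  headers: { "Content-Type": "application/json", "x-pd-upload-body": "1" },
  body: largePayload,
});
```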
### Function Payload Limit Exceeded
The total size of `console.log()` statements, step exports, and the original event data sent to workflows and sources cannot exceed a combined size of 8MB. If you produce logs or step exports larger than this - for example, passing around large API responses, CSVs, or other data - you may encounter a Function Payload Limit Exceeded error in your workflow.
Often, this occurs when you pass large data between steps using step exports. You can avoid this error by writing that data to the `/tmp` directory in one step and reading it into another step, which avoids the use of step exports and should keep you under the payload limit.
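For example, here's a minimal two-step sketch; the step name `write_data` is hypothetical:

```javascript
// Step 1 (named write_data): write the large data to /tmp and export
// only the small file path, not the data itself.
import fs from "fs";

export default defineComponent({
  async run({ steps, $ }) {
    const bigData = { /* large API response, CSV contents, etc. */ };
    fs.writeFileSync("/tmp/big-data.json", JSON.stringify(bigData));
    return "/tmp/big-data.json";
  },
});
```

```javascript
// Step 2: read the data back from /tmp instead of a step export.
import fs from "fs";

export default defineComponent({
  async run({ steps, $ }) {
    const path = steps.write_data.$return_value;
    const bigData = JSON.parse(fs.readFileSync(path, "utf8"));
    // ...process bigData here...
  },
});
```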
### JSON Nested Property Limit Exceeded
Working with JavaScript objects that contain more than 256 nested objects will trigger a JSON Nested Property Limit Exceeded error.
Objects this deeply nested usually result from a programming error that expands the object in an unexpected way. Please confirm that the code you're using to convert data into an object is parsing it correctly.
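As an illustration, here's a hypothetical loop that builds one extra level of nesting per iteration, along with a flat alternative:

```javascript
// Bug: each pass wraps the previous object, adding one nesting level
// per item. Enough iterations will exceed the nesting limit.
let acc = {};
for (const item of ["a", "b", "c"]) {
  acc = { value: item, rest: acc };
}

// Fix: accumulate into a flat array instead.
const accFixed = [];
for (const item of ["a", "b", "c"]) {
  accFixed.push({ value: item });
}
```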
### Event Queue Full
Workflows have a maximum event queue size when using concurrency and throttling controls. If the number of unprocessed events exceeds the maximum queue size, you may encounter an Event Queue Full error.
Paid plans can increase their queue size up to 10,000 for a given workflow.