Denial of service with array validations? #1872

Open
axel-habermaier opened this issue Jan 12, 2023 · 5 comments
Labels
enhancement (New feature or request)

Comments


axel-habermaier commented Jan 12, 2023

Based on Zod's README.md and the libraries using Zod that are linked within that document, I would assume that Zod is designed to be used in potentially highly security-critical situations as a first line of defense against malicious users sending invalid data, e.g., in form validation or when parsing/validating HTTP request bodies. If that is not the case, please (also) interpret this ticket as a request to clearly document the intended use cases of the library.

If my understanding is correct, however, I think there might be a security issue around the handling of arrays: given the Zod schema z.number().max(1).array(), parsing the value new Array(1000000).fill(5) yields 1000000 validation errors, and safeParse takes about 3 seconds to generate this completely useless list of errors. For an array length of 10000000 (one additional 0), the Node.js process crashes with an out-of-memory error on my machine.
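For reference, a minimal reproduction of the first scenario (timings are from my machine and will obviously vary):

```ts
import { z } from "zod";

// Element values are bounded, but the array length is not.
const schema = z.number().max(1).array();

// One million elements, every single one violating the element constraint.
const payload = new Array(1000000).fill(5);

console.time("safeParse");
const result = schema.safeParse(payload);
console.timeEnd("safeParse"); // ~3 seconds on my machine

if (!result.success) {
  console.log(result.error.issues.length); // 1000000 issues
}
```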

It is also possible to generate an array like const a = [1,2,3]; a[1000000000] = 4. In that case, after about 20 seconds (!), safeParse of the same example schema (see above) crashes with an error: TypeError: Cannot read properties of undefined (reading 'status') at ParseStatus.mergeArray (.../node_modules/zod/lib/helpers/parseUtil.js:62:19).... The huge computation time, during which the entire Node.js main thread is blocked, is obviously problematic. The thrown error seems unintentional, i.e., this might even be a Zod bug, and the 1000000000 error messages should presumably have been reported instead.
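The sparse-array variant, for completeness (this is the call that eventually throws the TypeError quoted above on 3.20.2):

```ts
import { z } from "zod";

const schema = z.number().max(1).array();

// A sparse array: length is 1000000001, but only four slots are populated.
const a = [1, 2, 3];
a[1000000000] = 4;

// Blocks the main thread for a long time, then throws
// "TypeError: Cannot read properties of undefined (reading 'status')".
schema.safeParse(a);
```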

By the way, I used the latest version of Zod, 3.20.2, for my tests.

Now, you could say that there should be other lines of defense against such attacks, such as a maximum request body size. You could also use the array schema's max()/length() functions to restrict the maximum allowed array length (if there is some meaningful upper bound for your particular use case). All of that is true, but it violates the secure-by-default/secure-by-design principle.
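For example, a bound on the array schema itself would look like this (a sketch, assuming an upper bound of 100 makes sense for the use case; note that, as far as I can tell, the per-element checks still run even after the length check has failed):

```ts
import { z } from "zod";

// .max() (or .length()) adds a size check to the array schema itself.
const schema = z.number().max(1).array().max(100);

const result = schema.safeParse(new Array(1000000).fill(5));
// result.success === false because of the length violation, but as of 3.20.2
// the element checks still appear to run, so this bounds the damage rather
// than short-circuiting the work entirely.
```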

As asked above, isn't the point of Zod that I can just pass it some data that I want to ensure has a certain shape/schema? So are my findings above the intended behavior of Zod? Other issues like #1403 seem to suggest that Zod's design explicitly aims to return all errors to the user, which is clearly undesirable from a security (in particular: availability) perspective. If the behavior is intentional, I don't see how Zod can be recommended for security-critical applications and use cases like the libraries mentioned in the README.md. Yes, it doesn't leak data, but it endangers the availability of a system since it facilitates denial-of-service attacks. This behavior might be surprising to users; it clearly was surprising to me.

The fact that this project doesn't even seem to have a security policy defined (at least none that I could easily find; I expected to find it here) also suggests that security is not a primary concern. I would have preferred not to report this (potential) issue publicly in a GitHub issue.


igalklebanov (Contributor) commented Jan 12, 2023

Maybe we should check for status === 'dirty' after the length checks and stop parsing if the array length was erroneous (rather than continuing to check each individual array item)?

... or provide a parse option that enables such behavior.

If this is reasonable, we could also suggest (for the concerns raised in this thread, and for general batch-processing use cases) always limiting the array length.
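Until something like that exists, a user-level approximation is to reject oversized inputs with a plain length check before handing the data to Zod at all. A minimal sketch (safeParseBounded and MAX_ITEMS are made-up names, not Zod API):

```ts
import { z } from "zod";

// Made-up bound; pick whatever is meaningful for the use case.
const MAX_ITEMS = 1000;

const schema = z.number().max(1).array().max(MAX_ITEMS);

// Made-up wrapper: an O(1) length check runs before Zod ever iterates
// the array, so oversized (or absurdly sparse) inputs are rejected cheaply.
function safeParseBounded(input: unknown) {
  if (Array.isArray(input) && input.length > MAX_ITEMS) {
    return { success: false, reason: "array too large" } as const;
  }
  return schema.safeParse(input);
}

safeParseBounded(new Array(1000000).fill(5)); // rejected without per-element work
```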

JacobWeisenburger added the enhancement label on Jan 14, 2023
@exneval

exneval commented Jan 19, 2023

Any updates on the discussion? This issue seems to lack attention.

@carlBgood

carlBgood commented Feb 10, 2023

I'd encourage this to be merged as well; it seems like a pretty simple and quick add-on. It would also save major headaches in chaining if we could prevent refine/superRefine from running (especially if we're querying databases) when a simpler chained check fails first.

RE: #1606 (comment)

This is a feature any validation library should have.
I see 2 possible options here:

- a global validation setting (like abortEarly: true)
- a single-field validation modifier (like z.string().bail()...). This one is preferred.

@JanKalkan

Hi, any updates on the issue?

@abdurahmanshiine

This seems to be a critical issue, yet it lacks attention.
