I love Nginx and have never had a problem with it. Until now.
I finally managed to reproduce the problem in cURL, and to my surprise, the requests were getting stopped by Nginx. All other requests were going through fine, and the error only happened when uploading a file of 10240 bytes or more.
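A minimal sketch of the kind of request that triggered it; the URL and the form-field name are placeholders for my setup, not the exact command:

```shell
# Generate a body right at the 10240-byte threshold, then POST it.
# http://localhost/upload and the "file" field are placeholder values.
head -c 10240 /dev/zero > payload.bin
wc -c payload.bin
# curl -v -F "file=@payload.bin" http://localhost/upload
```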
The first thing I thought was that Nginx v1.8.0 had a bug, but nobody on the internet seemed to have this problem. So I installed v1.9.4. Now the server returned a 500 error instead of a 404, and I still had no answer as to why.
I finally found it: playing with client_body_buffer_size seemed to change the threshold for which files would trigger the error and which wouldn't, but ultimately the error was still there. Then I read about how Nginx uses temporary files to store body data. I checked that folder (in my case /var/lib/nginx/client_body) and it was writeable by the Nginx worker user; however, the parent folder /var/lib/nginx was owned by root:root and was set to 0700, so the worker couldn't traverse into it. Once I set /var/lib/nginx to be readable/writable by that user, it all started working.
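For context, these are the two directives involved; the values below are illustrative, not necessarily the defaults on your distribution:

```nginx
# http or server context. Bodies larger than the buffer spill to temp files.
client_body_buffer_size 16k;
client_body_temp_path   /var/lib/nginx/client_body;
```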
Check your permissions
So, check your folder permissions. Nginx wasn't returning any useful errors: first a 404 (which I'm assuming was a bug fixed in a later version), then a 500. It's worth noting that after switching to v1.9.4, the Permission denied error did show up in the error log, but by that point I had already decided the logs were useless, since v1.8.0 had silently ignored the problem.
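If you hit something similar, a quick check is to print the permissions of the temp directory and every parent above it; each level needs the execute (traverse) bit for the worker user. A sketch, with the path from my setup:

```shell
# Walk up from Nginx's client-body temp dir and print the permissions of
# each directory on the way to /. Any parent without the execute bit for
# the worker user will cause "Permission denied" on the temp files.
check_tree() {
  d=$1
  while [ "$d" != "/" ]; do
    ls -ld "$d" 2>/dev/null || echo "missing: $d"
    d=$(dirname "$d")
  done
}
check_tree /var/lib/nginx/client_body
```

On my machine the fix was making /var/lib/nginx itself accessible to the worker user (e.g. by chowning it to that user).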
This is an edit! Shortly after I applied the above fix, I started getting another error. My backend was getting the requests, but the entire request was being buffered by Nginx before being proxied. This is annoying to me because the backend is async and is made to stream large uploads.
After some research, I found the fix (I put this in the backend proxy's configuration).
This tells Nginx to just stream the request to the backend (exactly what I want).
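Assuming the standard way to get this behavior, the relevant directive is proxy_request_buffering (available since Nginx 1.7.11). A sketch of what the block might look like, with placeholder upstream and path names:

```nginx
location /upload {
    # Stream the request body to the upstream instead of spooling it first.
    proxy_request_buffering off;
    proxy_http_version 1.1;     # lets chunked request bodies pass through
    proxy_pass http://backend;  # "backend" is a placeholder upstream
}
```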