I’ve got a Django project running which requires you to log in to access files.

That means I have to serve the files via Python, like this:

import os

from django.http import HttpResponse

def download(request, filename):
    # ... some code specific to my site that resolves filename_path,
    # original_filename and the upload's mimetype ...
    response = HttpResponse(open(filename_path, 'rb'),
                            content_type=postUpload.mimetype)
    response['Content-Disposition'] = 'attachment; filename="' + original_filename + '"'
    response['Content-Length'] = os.path.getsize(filename_path)
    return response
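One gotcha with building the Content-Disposition header by string concatenation: filenames containing spaces, quotes, or non-ASCII characters break the header. A minimal, stdlib-only sketch of a safer builder (the helper name and the dual filename/filename* form are my own, following RFC 6266, not code from this project):

```python
from urllib.parse import quote

def content_disposition(filename):
    # Hypothetical helper (not from the original view): emit a quoted
    # ASCII fallback plus an RFC 6266 filename* form so names with
    # spaces or non-ASCII characters survive the header.
    ascii_name = filename.encode("ascii", "ignore").decode()
    return "attachment; filename=\"{0}\"; filename*=UTF-8''{1}".format(
        ascii_name, quote(filename))
```

You would then assign its result to `response['Content-Disposition']` instead of concatenating the raw filename.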

The problem: if the download of a file took longer than 5 minutes (big files and/or low
bandwidth), the download was cancelled on the server side by a timeout. This Apache
configuration for mod_fcgid solved the problem (note that this directive has since been renamed to FcgidBusyTimeout):

<IfModule mod_fcgid.c>
    BusyTimeout 1200
</IfModule>

The cause was that the Apache module scans every minute for processes that have been
running for more than BusyTimeout seconds. Such processes are potentially in bad health
(stuck in an infinite loop and the like) and get killed. Not so with my processes (since
I know what I’m doing..). Raising the busy timeout to 1200 seconds now lets my
processes run for roughly 20 minutes instead of five.
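The numbers work out like this (a back-of-the-envelope sketch, assuming the one-minute busy-process scan just described):

```python
# A request is only killed at the first scan after it exceeds BusyTimeout,
# so the worst case is BusyTimeout plus one scan interval.
SCAN_INTERVAL = 60       # seconds between busy-process scans (assumption)
BUSY_TIMEOUT = 1200      # the BusyTimeout value set above

worst_case = BUSY_TIMEOUT + SCAN_INTERVAL
print(worst_case / 60)   # worst-case lifetime in minutes
```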

As this setting can’t be overridden in an .htaccess file by default, I had to bug
my web hosting provider with the request, which they handled within 24 hours, so
thanks for that one!

PS: If you know of another way to serve protected static files behind a single sign-on
(no HTTP basic auth), please let me know.