Kallithea issues archive

Issue #301: Cloning large git repository fails in Kallithea with MemoryError

Reported by: Geoff Bache
State: closed
Created on: 2017-12-06 13:35
Updated on: 2018-01-19 01:12

Description

I just converted a ~2GB repository to Git and uploaded it to Kallithea. It turns out it cannot be cloned via Kallithea. Cloning it outside Kallithea works fine. Every attempt writes the following to the Kallithea log:

2017-12-06 14:17:23.789 ERROR [waitress] Exception when serving /Documentation/doc/git-upload-pack
Traceback (most recent call last):
  File "c:\kallithea\env\lib\site-packages\waitress\channel.py", line 337, in service
    task.service()
  File "c:\kallithea\env\lib\site-packages\waitress\task.py", line 173, in service
    self.execute()
  File "c:\kallithea\env\lib\site-packages\waitress\task.py", line 392, in execute
    app_iter = self.channel.server.application(env, start_response)
  File "c:\python27\lib\site-packages\paste\gzipper.py", line 38, in __call__
    return response.write()
  File "c:\python27\lib\site-packages\paste\gzipper.py", line 68, in write
    out.seek(0)
  File "c:\python27\lib\StringIO.py", line 106, in seek
    self.buf += ''.join(self.buflist)
MemoryError

The machine isn't short of physical memory. We can see that the memory used by the Kallithea process increases by about 1GB before it throws this exception.

Attachments

Comments

Comment by Geoff Bache, on 2017-12-06 13:37

Comment by Ron Winacott, on 2017-12-06 14:06

I have not seen this type of problem since the pre-Kallithea fork days, but I did see it when /tmp or /temp (depending on the OS of the server running Kallithea) was full or not large enough for the large repository. Once the cloning was complete, or the error was thrown, the space in /tmp was freed again.

Comment by Geoff Bache, on 2017-12-06 14:14

Server is running Windows Server 2012. My temp (C: disk) has 8GB free and the repository is slightly less than 2GB.

Comment by Mads Kiilerich, on 2017-12-07 01:22

I guess that is caused by Kallithea entering a mode where it will zip all content before serving it ... and thus buffering it all up. That has been fixed on the development branch.

If using a "real" web server that can serve static content, try using

[app:main]
static_files = false

It seems like you are using waitress from paster serve. In that case, perhaps try editing kallithea/config/middleware.py and deleting the app = make_gzip_middleware(app, global_conf, compress_level=1) line.
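For reference, the surrounding block in kallithea/config/middleware.py looks roughly like the sketch below (paraphrased; exact imports and details may differ between Kallithea versions). Removing the last line is what disables the gzip middleware that buffers the complete response in memory:

from paste.cascade import Cascade
from paste.deploy.converters import asbool
from paste.gzipper import make_gzip_middleware
from paste.urlparser import StaticURLParser

if asbool(static_files):
    # Serve Kallithea's static files from within the WSGI app itself
    static_app = StaticURLParser(config['pylons.paths']['static_files'])
    app = Cascade([static_app, app])
    # paste.gzipper collects the whole response body (e.g. a ~2 GB
    # git-upload-pack answer) in a StringIO before compressing it,
    # which is what blows up in the traceback above -- delete or
    # comment out this line:
    app = make_gzip_middleware(app, global_conf, compress_level=1)

With that line gone (or with static_files = false, which skips the whole block), the git-upload-pack response should be streamed out through waitress instead of being buffered first.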

Comment by domruf, on 2017-12-11 18:19

I bet @gjb1002 is using a 32-bit Python and therefore the process cannot be larger than 2 GB.
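A quick way to confirm which interpreter the Kallithea process is using (a minimal sketch; run it with the same python.exe that serves Kallithea, e.g. the one in the c:\kallithea\env virtualenv seen in the traceback):

import struct
import sys

# A 32-bit CPython has 4-byte pointers and sys.maxsize == 2**31 - 1;
# on Windows such a process only gets about 2 GB of address space, so a
# buffered ~2 GB git-upload-pack response cannot fit.
print("%d-bit Python" % (struct.calcsize("P") * 8))
print("sys.maxsize = %d" % sys.maxsize)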

Comment by Mads Kiilerich, on 2017-12-12 00:55

@domruf, a very good point. Disabling static_files might make it work anyway.

Comment by Johan Andersson, on 2018-01-10 15:32

The solution for us was to update to a 64-bit Python. We did not see the error after that.

Comment by Geoff Bache, on 2018-01-10 15:49

No longer a problem (Johan Andersson above is my colleague).

Comment by Mads Kiilerich, on 2018-01-19 01:12

(It would still be interesting to get confirmation if it works with 32-bit on the "default" (development) branch ...)