Recovering from hitting the 2 gig database limit

There is a problem with keeping the whole database in a single Data.fs file: you eventually hit the 2 gig file size limit that exists in many operating systems and in Python itself. The usual preventive solutions are numerous: use mounted databases, use an external relational database to store the data, and so on. Some operating systems don't have this problem, Solaris, Windows NT/2000, AIX, and FreeBSD, to name a few. However, that does not mean Python supports files over 2 gigs on those platforms, as I found out most notably on Windows.

So here's the problem I was trying to solve recently: my database just hit 2 gigs, and Zope crashed. What do you do? You can't start Zope to pack the database, and scripts such as fsrecover.py won't run against the oversized file either; you're hooped.

My first idea was to move the database to an OS that supports large files. We tried 64-bit Linux, where Python 2.0 supports large files, but of course that means porting Zope to Python 2.0 and 64-bit Linux, an interesting problem to do someday. Then I looked at Solaris, but we didn't have the right version running, so we would have had to set up a box and so on. I rummaged around Zope.org and found nothing, which is why I'm creating this how-to.

In the end I came up with the idea of truncating the file, starting Zope, and praying. I later found out this works.
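For completeness, here is a minimal sketch of the truncation step; this code is not from the original procedure, and the file names and safety margin are my own assumptions. Always work with a backup, never your only copy, and note that the cut will almost certainly land in the middle of a transaction record, which is why the praying (or a later run of fsrecover.py, once Python can open the file again) is part of the process. If your Python build cannot even open the oversized file, an OS-level tool can do the same truncation.

  # truncate_datafs.py -- minimal sketch, not the original how-to's code.
  # Assumes the storage lives at ./Data.fs and that cutting it back to
  # roughly 10 MB under the 2 gig (2**31 byte) boundary is enough.
  import shutil

  LIMIT = 2**31 - 10 * 1024 * 1024   # target size: ~10 MB under 2 gigs

  shutil.copy2("Data.fs", "Data.fs.bak")   # keep an untouched backup first

  f = open("Data.fs", "r+b")   # open in place, binary mode
  f.truncate(LIMIT)            # chop the tail off the storage
  f.close()

After truncating, Zope should start again; pack the database immediately. Alternatively, once the file is back under the limit, fsrecover.py should be able to trim it to the last complete transaction, something like "python fsrecover.py Data.fs Data.fs.recovered" (check the script's usage message for the exact arguments in your ZODB version).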