#2016 closed defect (duplicate)
Unexpected end of ZLIB input stream
Reported by: | Nicklas Nordborg | Owned by: | everyone |
---|---|---|---|
Priority: | minor | Milestone: | BASE 3.9 |
Component: | core | Version: | |
Keywords: | Cc: |
Description
When using the "Packed file exporter" plug-in to create an archive, some files cause the plug-in to fail with the following stack trace:
```
java.io.EOFException: Unexpected end of ZLIB input stream
	at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:240)
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:158)
	at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:117)
	at net.sf.basedb.util.FileUtil.copy(FileUtil.java:113)
	at net.sf.basedb.util.FileUtil.copy(FileUtil.java:90)
	at net.sf.basedb.util.zip.ZipFilePacker.pack(ZipFilePacker.java:111)
	at net.sf.basedb.plugins.PackedFileExporter.performExport(PackedFileExporter.java:419)
	at net.sf.basedb.core.plugin.AbstractExporterPlugin.run(AbstractExporterPlugin.java:146)
	...
```
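As a hedged illustration (this is not BASE code, and the class name `TruncatedGzipDemo` is hypothetical), the same `EOFException` can be reproduced by decompressing a GZIP stream that has been cut short, which is what a short write during compression would produce:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.EOFException;
import java.util.Arrays;
import java.util.Random;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class TruncatedGzipDemo {

    // Compresses random data, drops the second half of the compressed
    // stream, and returns the message of the EOFException thrown while
    // decompressing the truncated stream.
    static String readTruncated() throws Exception {
        byte[] data = new byte[4096];
        // Random data compresses poorly, so the stream is long enough to cut.
        new Random(42).nextBytes(data);

        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        }
        byte[] compressed = bos.toByteArray();
        // Simulate an incompletely written archive entry.
        byte[] truncated = Arrays.copyOf(compressed, compressed.length / 2);

        try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(truncated))) {
            byte[] buf = new byte[1024];
            while (in.read(buf) != -1) {
                // Keep reading until the inflater runs out of input.
            }
            return "no exception";
        } catch (EOFException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(readTruncated());
    }
}
```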
It doesn't matter which type of output archive is selected.
The file is stored compressed in the BASE internal file storage.
Downloading the same file seems to work, but a similar stack trace appears in the Tomcat log file. It can be verified that the MD5 of the downloaded file matches the MD5 of the original file.
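Such a check can be done with the standard `java.security.MessageDigest` API; the class and helper names below (`Md5Check`, `md5Hex`) are just for illustration, not part of BASE:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Md5Check {

    // Streams the input through an MD5 digest and returns the
    // lowercase hex representation of the hash.
    static String md5Hex(InputStream in) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            md.update(buf, 0, n);
        }
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Identical byte streams must produce identical digests.
        String original = md5Hex(new ByteArrayInputStream("abc".getBytes("UTF-8")));
        String downloaded = md5Hex(new ByteArrayInputStream("abc".getBytes("UTF-8")));
        System.out.println(original.equals(downloaded)); // prints true
    }
}
```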
The only pattern found so far is that files larger than 2GB (uncompressed) trigger the error, while smaller files do not.
There are reports on the net about a similar error, but that bug should have been fixed already: http://bugs.java.com/bugdatabase/view_bug.do;jsessionid=53ede10dc8803210b03577eac43?bug_id=6519463
Change History (5)
comment:1 by , 8 years ago
comment:2 by , 8 years ago
Resolution: | → duplicate |
---|---|
Status: | new → closed |
Closed as duplicate. Should be fixed as part of #2006.
comment:3 by , 8 years ago
(In [7170]) References #2006 and #2016.
Removed the parallelgzip jar file and added the source files to the BASE core package instead. The intention is to fix the file size problem. The current code is the original code as downloaded from https://github.com/shevek/parallelgzip (version 1.0.1). The code does not compile because it uses a non-standard annotation from the "javax.annotation" package.
Hmmm... further investigation seems to indicate that the parallel gzip implementation added in #2006 may have a flaw. Reverting to the old built-in method makes the problem go away.
Checking their code repository on GitHub (https://github.com/shevek/parallelgzip/blob/master/src/main/java/org/anarres/parallelgzip/ParallelGZIPOutputStream.java#L244) reveals that the `close()` method may have a flaw. The `bytesWritten` variable is an `int`, and if the file is larger than 2GB it overflows to a negative number, causing the rest of the `close()` method to be skipped so that the final bytes are never written. The missing data is not important and should only contain some metadata about the file that was compressed.