Invalid stored block lengths: error when unpacking

I am scraping a website that contains many URLs from which I need to fetch data.
I used XPath to fetch all the hrefs (URLs) and saved them into a list. I looped over this list and yielded a request for each URL. Below is my spider code:

from scrapy.spider import BaseSpider
from scrapy.selector import HtmlXPathSelector
from scrapy.http import Request

class ExampledotcomSpider(BaseSpider):
    name = "exampledotcom"
    allowed_domains = ["www.example.com"]
    start_urls = ["http://www.example.com/movies/city.html"]

    def parse(self, response):
        hxs = HtmlXPathSelector(response)
        cinema_links = hxs.select('//div[@class="contentArea"]/div[@class="leftNav"]/div[@class="cinema"]/div[@class="rc"]/div[@class="il"]/span[@class="bt"]/a/@href').extract()
        for cinema_hall in cinema_links:
            yield Request(cinema_hall, callback=self.parse_cinema)

    def parse_cinema(self, response):
        hxs = HtmlXPathSelector(response)
        cinemahall_name = hxs.select('//div[@class="companyDetails"]/div[@itemscope=""]/span[@class="srchrslt"]/h1/span/text()').extract()
        ........

Here, for example, I had 60 URLs in the list, and about 37 of them were not downloaded; for these, an error message appeared:

2012-06-06 14:00:12+0530 [exampledotcom] ERROR: Error downloading <GET http://www.example.com/city/Cinema-Hall-70mm-%3Cnear%3E-place/040PXX40-XX40-000147377847-A6M3>: Error -3 while decompressing: invalid stored block lengths
2012-06-06 14:00:12+0530 [exampledotcom] ERROR: Error downloading <GET http://www.example.com/city/Cinema-Hall-35mm-%3Cnear%3E-place/040PXX40-XX40-000164969686-H9C5>: Error -3 while decompressing: invalid stored block lengths

Scrapy is downloading only some of the URLs; for the remainder, I do not understand what is happening or what is wrong with my code.

Can anyone suggest how to resolve these errors?

I am trying to extract a ZIP file from my current JAR using:

InputStream resource = getClass().getClassLoader().getResourceAsStream(name);

This gets the correct InputStream, but an error occurs when I try to unzip it using the following code (I'm storing each file into a HashMap<file contents, file name>):

public static HashMap<String, String> readZip(InputStream inputStream) throws IOException {
    byte[] buffer = new byte[1024];
    HashMap<String, String> list = new HashMap<>();
    ZipInputStream zipInputStream = new ZipInputStream(inputStream);
    ZipEntry entry = zipInputStream.getNextEntry();
    while (entry != null) {
        if (!entry.isDirectory()) {
            StringBuilder stringBuilder = new StringBuilder();
            int read;
            // Convert only the bytes actually read, not the whole 1024-byte buffer.
            while ((read = IOUtils.read(zipInputStream, buffer)) > 0) {
                stringBuilder.append(new String(buffer, 0, read, "UTF-8"));
            }
            list.put(stringBuilder.toString(), entry.getName());
        }
        zipInputStream.closeEntry();
        entry = zipInputStream.getNextEntry();
    }
    zipInputStream.close();
    return list;
}

However, when I try to do this, I get this exception (on IOUtils.read):

java.util.zip.ZipException: invalid stored block lengths
   at java.util.zip.InflaterInputStream.read(Unknown Source)
   at java.util.zip.ZipInputStream.read(Unknown Source)

Am I doing this wrong? I’ve done plenty of googling of the error, and I didn’t see anything related to my issue.

Unpacking (larger) zipped archives from GitHub repositories results in the error: invalid stored block lengths.

I was at least able to reproduce this issue with the packed releases (1) and (2). Any zip file of respectable size should produce the error. However, on smaller distributions (3) the error does not occur.

  1. https://github.com/TryGhost/Ghost/releases/download/0.3.3/Ghost-0.3.3.zip
  2. https://github.com/primus/primus/archive/1.4.4.zip
  3. https://github.com/Swaagie/smith/archive/v1.0.zip

var fs = require('fs')
  , zlib = require('zlib')
  , file = process.env.HOME + '/Downloads/dist.zip'; // any packed repo with: git archive HEAD -o dist.zip 

fs.createReadStream(file).pipe(zlib.createInflateRaw());

I know this does not lead to any useful output in itself, but it is a simplification of the unzip package process.

This seems to be related to the two different underlying packaging libraries.

  • git archive uses the libarchive library
  • node.js uses the zlib library

Repackaging the files with zip results in the same error; I couldn't find which underlying library zip uses, however. Will follow up on this tomorrow.

Repacking the files with tar and using createUnzip completes without errors.

Does anyone know of other languages or command-line tools that bind to the zlib library, so that node.js itself can be ruled in or out?

Edit: the same error is also present in node version 0.8.25, which makes me suspect the zlib library. Is there an update available for this library?

System specs:
64 bit, node 0.10.21, ubuntu 13.10


I'm not really sure what to do here, except that a zip file contains a header that you have to seek past before you can start your inflate stream:

var fs = require('fs');
var zlib = require('zlib');

var file = process.argv[2];

var zs = zlib.createInflateRaw();
var fstream = fs.createReadStream(file);

function doIt() {
  if (fstream.read(2))
    fstream.pipe(zs);
  else
    setImmediate(doIt);
}

setImmediate(doIt);

zs.on('error', function(err) {
  console.error(err.errno, err);
});

And it works fine. Closing as this is mostly beyond the scope of node, if you can find a more minimal test case that can show a problem with our usage of zlib please let us know.


I'm trying to send some compressed data over the network. I think the data is successfully compressed, since I can re-inflate it from the code that deflates it (written in C). However, when the data arrives at my server (node.js using express) and it tries to inflate the data, I receive:

{ Error: invalid stored block lengths
    at InflateRaw.zlibOnError (zlib.js:153:15) errno: -3, code: 'Z_DATA_ERROR' }

Any Help will be really appreciated, thanks in advance.

Szz.

P.D: I can provide code snipped if it is useful for anyone.

EDIT:

Thanks for the quick answer and sorry for the delay; I've been working on my code. I'm now sending the original data to the server along with the compressed data. When the original data arrives at the server I deflate it in order to compare the generated buffers. Surprisingly, the buffer of the compressed data sent over the network and the buffer compressed on the server are the same except for the first two bytes:

<Buffer 78 da 15 8c c9 11 00 41 08 02 ff c6 82 55 83 b7 f9 27 b6 ee 13 68 7a a6 b0 4c b8 a7 68 ac 81 61 84 5a 8b e6 82 6b 05 bd aa 44 d9 5e a0 d7 20 6c 2f a6 ... >

<Buffer 15 8c c9 11 00 41 08 02 ff c6 82 55 83 b7 f9 27 b6 ee 13 68 7a a6 b0 4c b8 a7 68 ac 81 61 84 5a 8b e6 82 6b 05 bd aa 44 d9 5e a0 d7 20 6c 2f a6 bf 9b ... >

So I think my question now is: can I assume that these first two bytes will always appear in the compressed data from the C program? Is there any option to avoid these first two bytes? Am I doing this right?

Here is the function that compresses the data on the C side; the node part is just formidable parsing the POST input and a little work with buffers now.

int compress_image (compressed_image raw, size_t size, compressed_image * out) {

    z_stream strm;
    int out_size = 0;

    *out = (compressed_image) malloc(size);

    strm.zalloc = Z_NULL;
    strm.zfree = Z_NULL;
    strm.opaque = Z_NULL;

    int ret = deflateInit(&strm, Z_BEST_COMPRESSION); /* Max compression level, but bad speed performance */

    if (ret != Z_OK)
        return ret;

    strm.avail_in = size;
    strm.next_in = (unsigned char *)raw;

    do {

        strm.avail_out = size;
        strm.next_out = *out;
        ret = deflate(&strm, Z_FINISH);    /* no bad return value */
        assert(ret != Z_STREAM_ERROR);  /* state not clobbered */
        out_size += (size - strm.avail_out);

    } while (strm.avail_out == 0);

    (void)deflateEnd(&strm);

    return out_size;
}

Thanks again!

EDIT 2:
I'm dumping my buffers now and I can see some trailing bytes that differ too.
Here is the buffer coming from the C program:

00000000: 78da 158c c911 0041 0802 ffc6 8255 83b7  xÚ..É..A..ÿÆ.U.·
00000010: f927 b6ee 1368 7aa6 b04c b8a7 68ac 8161  ù'¶î.hz¦°L¸§h¬.a
00000020: 845a 8be6 826b 05bd aa44 d95e a0d7 206c  .Z.æ.k.½ªDÙ^.× l
00000030: 2fa6 bf9b 680e f2ce 9c67 975f 0e58 2d91  /¦¿.h.òÎ.g._.X-.
00000040: 75dc f1ed b25c 687a 43dd 42bc 1f98 e7dd  uÜñí²hzCÝB¼..çÝ
00000050: 79a7 99f6 5fd3 bfcc 86f2 01b0 f51a cd    y§.ö_Ó¿Ì.ò.°õ.Í

And the buffer deflated on the server:

00000000: 158c c911 0041 0802 ffc6 8255 83b7 f927  ..É..A..ÿÆ.U.·ù'
00000010: b6ee 1368 7aa6 b04c b8a7 68ac 8161 845a  ¶î.hz¦°L¸§h¬.a.Z
00000020: 8be6 826b 05bd aa44 d95e a0d7 206c 2fa6  .æ.k.½ªDÙ^.× l/¦
00000030: bf9b 680e f2ce 9c67 975f 0e58 2d91 75dc  ¿.h.òÎ.g._.X-.uÜ
00000040: f1ed b25c 687a 43dd 42bc 1f98 e7dd 79a7  ñí²hzCÝB¼..çÝy§
00000050: 99f6 5fd3 bfcc 86f2 01   

However, by creating a new buffer without the first two bytes, I am able to see my data properly inflated on the server.

My node.js part:

const dump = require('buffer-hexdump');

router.post('/test', function(req, res, next) {

    let form = new formidable.IncomingForm();
    form.uploadDir = "./uploads"

    form.parse(req, function(err, fields, files) {

        let deflated = zlib.deflateRawSync(fs.readFileSync(files.image_raw.path));

        let image_buff = fs.readFileSync(files.image.path);
        let buff = Buffer.alloc(image_buff.length - 2);

        image_buff.copy(buff, 0, 2);

        console.log(buff.equals(deflated)); //false
        console.log(dump(deflated));
        console.log(dump(image_buff));

        zlib.inflateRaw(buff, (err, buffer) => {

            if (err)
                console.error(err);
            else
                console.log(buffer.toString()); //here I can see my data!!

        });

        res.status(200).send("POST received!");

    });
});

I don’t think it is solved however …

Thanks again!

Szz.

invalid stored block lengths

"Invalid stored block lengths." This error message usually appears during installation or upgrade of an Oracle database.

There are several possible causes of this error:
 1. There is a problem with the downloaded package
 2. There is a problem in the decompression process
 3. There is a problem during the transfer process

Solution:
 1. Re-download the DB installation package
 2. Re-extract it
 3. Transfer it to the server again, making sure binary transfer mode is used
 4. Restart the server

Reprinted from: https://blog.51cto.com/evils798/1420912



ImperialPearl (New Here), Sep 29, 2018:

I'm getting the error "IDAT: invalid stored block lengths" when trying to open a PNG image. The resolution of the image is really big: 12800 × 7200.

Anyone know why this is happening ?

Thank you,
Daniel


Mylenium (LEGEND), Sep 30, 2018:

Without any info about how the file was created in the first place, system info, exact version info for PS and so on, nobody can tell you much.

Mylenium


davescm (Community Expert), Sep 30, 2018:

It sounds like a corrupt file.

Can you open it with any other application? If so, can you resave it as a new copy from that application and then try opening the new copy in Photoshop?

Dave

