When submitting a bulk request that is too large, the importer persistently fails with "413 Request Entity Too Large"


    • Type: Bug Report
    • Resolution: Unresolved
    • Priority: L3 - Default
    • Affects Version/s: None
    • Component/s: None

      By default, Elasticsearch accepts bulk requests of up to 100 MB (the `http.max_content_length` limit). A bulk request larger than that fails with

      ElasticsearchStatusException[Unable to parse response body]; nested: ResponseException[method [POST], host [http://elasticsearch:9200], URI [/_bulk], status line [HTTP/1.1 413 Request Entity Too Large] ];
      

      Once it fails with such an exception, the importer simply retries the same bulk request forever.

      Steps to reproduce:

      Submit a bulk request greater than 100MB.

      Actual result:

      • The importer retries the bulk request forever.
      • The importer gets stuck at that point.

      Expected result:

      • One option would be to do bookkeeping: accumulate individual requests into the bulk request until it reaches the 100 MB limit, then submit it and, after completion, start a new bulk request.
      • Another option would be to submit the bulk request as before and, only upon receiving the exception, handle it gracefully, for example by splitting the bulk request into smaller chunks.
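      The first option could be sketched roughly as follows, assuming the bulk actions are already serialized as newline-delimited strings; the class and method names here are hypothetical, not part of the importer's actual code:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class BulkBatcher {

    // Split serialized bulk actions into batches whose total payload stays
    // under maxBytes (e.g. Elasticsearch's default 100 MB content limit).
    public static List<List<String>> batchBySize(List<String> actions, long maxBytes) {
        List<List<String>> batches = new ArrayList<>();
        List<String> current = new ArrayList<>();
        long currentBytes = 0;

        for (String action : actions) {
            // +1 accounts for the newline separator in the bulk body.
            long size = action.getBytes(StandardCharsets.UTF_8).length + 1;

            // Flush the current batch before it would exceed the limit.
            if (!current.isEmpty() && currentBytes + size > maxBytes) {
                batches.add(current);
                current = new ArrayList<>();
                currentBytes = 0;
            }
            current.add(action);
            currentBytes += size;
        }
        if (!current.isEmpty()) {
            batches.add(current);
        }
        return batches;
    }
}
```

      Each resulting batch would then be submitted as its own bulk request, so no single request can trip the 413 response. A single action larger than the limit would still need separate handling (e.g. rejecting it with a clear error instead of retrying).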

            Assignee:
            Joshua Windels
            Reporter:
            Roman Smirnov
            Votes:
            0
            Watchers:
            2
