Bug Report
Resolution: Duplicate
L2 - Critical
3.6.0
Brief summary:
If a variable instance contains a value that is too large (>32766 bytes), the import bulk containing this variable will fail and get retried indefinitely, blocking the progress of the affected variable import pipeline.
Steps to reproduce:
1. Deploy a Camunda Cloud Cluster using the QA Type
2. Check the log of the cluster
Actual result:
3. The variable import eventually gets stuck, producing logs like:
There were failures while performing bulk on Zeebe process instances. ..... [2]: index [optimize-process-instance-simpleprocess_v8], type [_doc], id [4503599627370771], message [ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=Document contains at least one immense term in field=\"variables.value\" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[49, 50, 51, 52, 53, 54, 55, 56, 57, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 48]...', original message: bytes can be at most 32766 in length; got 42000]]; nested: ElasticsearchException[Elasticsearch exception [type=max_bytes_length_exceeded_exception, reason=bytes can be at most 32766 in length; got 42000]]
Expected result:
We need to either find a way of storing these values, truncate them to fit the limit, or skip them entirely (still logging that this occurred for transparency); a possible truncation approach is sketched below.
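As a rough illustration of the truncation option, the following sketch cuts a value down to the 32766-byte UTF-8 term limit before indexing. The class and method names are hypothetical and not existing Optimize code:

import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.CharsetEncoder;
import java.nio.charset.StandardCharsets;

public final class VariableValueTruncator {
    // Lucene's hard limit for a single indexed term, in UTF-8 bytes.
    private static final int MAX_TERM_BYTES = 32766;

    // Truncates the value so its UTF-8 encoding fits MAX_TERM_BYTES without
    // splitting a multi-byte character; values within the limit pass through.
    public static String truncateToTermLimit(String value) {
        if (value == null || value.getBytes(StandardCharsets.UTF_8).length <= MAX_TERM_BYTES) {
            return value;
        }
        CharsetEncoder encoder = StandardCharsets.UTF_8.newEncoder();
        ByteBuffer out = ByteBuffer.allocate(MAX_TERM_BYTES);
        // encode() stops at the buffer boundary without emitting partial characters
        encoder.encode(CharBuffer.wrap(value), out, true);
        return new String(out.array(), 0, out.position(), StandardCharsets.UTF_8);
    }
}

The same byte-length check could also drive the skip-and-log option instead of truncating.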
Hint:
There is https://www.elastic.co/guide/en/elasticsearch/reference/current/ignore-above.html, which would allow the value to still be stored in the source document while telling Elasticsearch to ignore the content during indexing; see the sketch below.
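As a rough sketch of that option, ignore_above could be set on the field reported in the log above. The field path and index name are taken from the error message; the surrounding mapping structure is assumed (e.g. whether variables is a nested field) and may differ from Optimize's actual mapping. 8191 characters corresponds to the 32766-byte limit at the UTF-8 worst case of 4 bytes per character:

PUT optimize-process-instance-simpleprocess_v8/_mapping
{
  "properties": {
    "variables": {
      "properties": {
        "value": {
          "type": "keyword",
          "ignore_above": 8191
        }
      }
    }
  }
}

Values longer than the limit would then remain visible in _source but would not be indexed, so the bulk request should no longer fail on them.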
Duplicates: OPT-5809 Variable import fails on too long term (Done)