Camunda Optimize: OPT-3459

Further improve import efficiency in terms of same timestamp entities


    • Type: Task
    • Resolution: Won't Do
    • Priority: L3 - Default
    • Component: backend

      Context:
      Based on the feedback from this change https://github.com/camunda/camunda-optimize/pull/1828#discussion_r396445386 we could improve the import efficiency of same-timestamp entity handling even further.

      Given the following scenario:
      Entities: [A, B, C, D1, D2, D3, D4]
      Timestamps: [ta, tb, tc, td, td, td, td]

      PageSize: 3

      #0:
      count = 0
      t = 1970 → next: [A, B, C], last: []

      #1:
      count = 1
      t = tc → next: [D1, D2, D3], last: [C]

      #2:
      count = 3
      t = td → next: [], last: [D1, D2, D3, D4]

      #3:
      count = 4
      t = td → next: [], last: [D1, D2, D3, D4]
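      The rounds above can be reproduced with a small simulation (all class and method names here are illustrative, not Optimize's actual import code), assuming a "next" query that fetches up to pageSize entities strictly after the current timestamp and a "last" query that re-fetches all entities sharing exactly that timestamp:

      ```java
      import java.util.List;

      // Simulation of the timestamp-based import paging from the scenario above.
      // Entity names and timestamps mirror the example; 1970 is modeled as t = 0.
      public class ImportPagingSim {
          public record Entity(String id, long timestamp) {}

          public static final List<Entity> STORE = List.of(
              new Entity("A", 1), new Entity("B", 2), new Entity("C", 3),
              new Entity("D1", 4), new Entity("D2", 4),
              new Entity("D3", 4), new Entity("D4", 4));

          // "next" page: up to pageSize entities with timestamp strictly after t
          public static List<Entity> fetchNextPage(long t, int pageSize) {
              return STORE.stream().filter(e -> e.timestamp() > t).limit(pageSize).toList();
          }

          // "last" page: all entities with exactly timestamp t
          public static List<Entity> fetchSameTimestamp(long t) {
              return STORE.stream().filter(e -> e.timestamp() == t).toList();
          }

          public static void main(String[] args) {
              long t = 0; // epoch start, as in round #0
              for (int round = 0; round <= 3; round++) {
                  List<Entity> last = fetchSameTimestamp(t);
                  List<Entity> next = fetchNextPage(t, 3);
                  System.out.println("#" + round
                      + " next=" + next.stream().map(Entity::id).toList()
                      + " last=" + last.stream().map(Entity::id).toList());
                  if (!next.isEmpty()) {
                      t = next.get(next.size() - 1).timestamp();
                  }
              }
          }
      }
      ```

      Running this prints the four rounds from the scenario, including the repeated last: [D1, D2, D3, D4] pages in rounds #2 and #3.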

      Current behavior that could be improved:

      1. Round #2 imports D1, D2, and D3 again.
      Potential optimization: keep a Set of the entity IDs from the last execution and exclude already seen ones.
      2. Round #3 (and all subsequent rounds without new data) fetches last: [D1, D2, D3, D4] again.
      Potential optimization: if the entities of that same timestamp have already been fetched, there is no need to re-fetch them in consecutive executions.
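      A minimal sketch of both measures (hypothetical names, not Optimize's actual import classes): track the entity IDs already seen at the current maximum timestamp and filter them out, and skip the same-timestamp fetch entirely when the timestamp has not advanced and no new data arrived:

      ```java
      import java.util.ArrayList;
      import java.util.HashSet;
      import java.util.List;
      import java.util.Set;

      // Sketch of the two proposed optimizations for same-timestamp entities.
      public class SameTimestampDedup {
          private final Set<String> seenIdsAtTimestamp = new HashSet<>();
          private long currentTimestamp = Long.MIN_VALUE;

          // Optimization 2: if the timestamp did not advance and no new data
          // arrived, the same-timestamp page was already fetched completely,
          // so the re-fetch can be skipped altogether.
          public boolean canSkipFetch(long timestamp, boolean newDataArrived) {
              return !newDataArrived && timestamp == currentTimestamp;
          }

          // Optimization 1: drop entities already imported in a previous round
          // for the same timestamp, using a Set of seen IDs.
          public List<String> filterNew(List<String> sameTimestampIds, long timestamp) {
              if (timestamp != currentTimestamp) {
                  seenIdsAtTimestamp.clear(); // timestamp advanced: reset the set
                  currentTimestamp = timestamp;
              }
              List<String> fresh = new ArrayList<>();
              for (String id : sameTimestampIds) {
                  if (seenIdsAtTimestamp.add(id)) { // add() returns false for seen IDs
                      fresh.add(id);
                  }
              }
              return fresh;
          }
      }
      ```

      Replaying the scenario: once D1, D2, and D3 are recorded at td, round #2 would only import D4, and round #3 (no new data, timestamp still td) would skip the fetch entirely.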

      AT:

      • the two optimization cases described are addressed


              Assignee: Unassigned
              Reporter: Sebastian Bathke (sebastian.bathke)
