Hi,
When we have more than 5000 rows in Flexmonster and trigger a function to load a single image for each row, we get a 'dataset is too large' error. We can lower the resolution of the base64 strings until it starts working, but we have to lower it to the point where the images become too blurry to be useful. Are there any solutions for this issue? For example, can we set some parameters to keep loading images, or load 500 images at a time and trigger the next batch once all 500 have been scrolled through?
Nikita
Hi,
Thank you for contacting us.
We want to explain that expanding all rows is a heavy operation because it is performed entirely in the browser and relies on the user's machine resources. Since this process significantly impacts performance, Flexmonster has an internal time limit on the execution of the expandAll property or the expandAllData() API call. By default, it is set to 9000 ms, but you can increase the waiting time using the expandExecutionTimeout property of the Options object. For example:
report: {
  options: {
    expandExecutionTimeout: 60000 // set in milliseconds
  }
}
You can find more details in our documentation: https://www.flexmonster.com/api/options-object/#expandexecutiontimeout.
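For context, here is a minimal sketch of how the option above might look in a full component initialization. The container id, CDN path, and data file name are assumptions for illustration, not part of the original answer:

```javascript
// Hypothetical setup: assumes a <div id="pivot"></div> on the page and the
// Flexmonster script already loaded.
const pivot = new Flexmonster({
  container: "#pivot",
  componentFolder: "https://cdn.flexmonster.com/", // assumed CDN location
  report: {
    dataSource: {
      filename: "data.csv" // hypothetical data file
    },
    options: {
      // Allow up to 60 s for expandAll / expandAllData() before timing out
      expandExecutionTimeout: 60000
    }
  }
});
```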
Please let us know if it works for you. Looking forward to hearing from you.
Kind regards,
Nadia