Support crazy large dbs #68
Comments
@R0tenur have you taken an inventory of everything that counts toward this size? Is it possible to review whether there's whitespace, or long names that could be truncated with a "..." suffix in descriptions, etc.? Basically, optimizing down exactly what data is being sent to mermaid.
I think the main part of the load is calculating positions and building the SVG. But since the extension is in charge of both the limit and the whitespace, it might work for a couple more users if the whitespace is trimmed and the limit is increased even further.
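The whitespace-trimming idea above could be sketched roughly as follows. This is an illustrative sketch, not the extension's actual code: `compactMermaidText` is a hypothetical helper that strips indentation and blank lines from the generated diagram text before it is measured against mermaid's size limit.

```typescript
// Hypothetical helper: shrink generated mermaid text so more of the
// character budget goes to actual diagram content.
function compactMermaidText(diagram: string): string {
  return diagram
    .split("\n")
    .map((line) => line.trim())        // drop leading/trailing whitespace
    .filter((line) => line.length > 0) // drop blank lines
    .join("\n");
}

const raw = "erDiagram\n    USERS {\n        int id\n    }\n\n";
const compact = compactMermaidText(raw);
console.log(compact.length < raw.length); // → true
```

Mermaid only needs the structure, not the indentation, so for a large schema this can free up a meaningful fraction of the text budget.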
Or support exporting the mermaid diagram text, to be generated later with a higher maximum text size.
I'm seeing this with most of my databases. What's the resolution? I see the issue is marked as "Closed" with no resolution.
There is actually support for increasing the value as high as we would like. The issue is that Data Studio hangs above a certain level. I have ongoing work to get rid of mermaid entirely, since it's a markdown language and not that suitable for this project. In the meantime, I'm thinking of catching this error in particular and making it possible for the user to opt in to a potential hang of the application.
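The opt-in idea described above might look something like this sketch. `renderDiagram` and `askUserToOptIn` are hypothetical stand-ins for the extension's own render call and prompt; only the error message string comes from this thread.

```typescript
// Sketch: catch mermaid's size error and retry only if the user explicitly
// accepts the risk of hanging the application.
async function renderWithOptIn(
  renderDiagram: () => Promise<string>,
  askUserToOptIn: () => Promise<boolean>
): Promise<string | undefined> {
  try {
    return await renderDiagram();
  } catch (err) {
    const message = err instanceof Error ? err.message : String(err);
    if (message.includes("Maximum text size in diagram exceeded")) {
      // Surface the risk instead of failing silently.
      if (await askUserToOptIn()) {
        return await renderDiagram(); // retry, presumably with a raised limit
      }
      return undefined; // user declined; skip rendering
    }
    throw err; // unrelated errors propagate unchanged
  }
}
```

The key design point is that only this specific error triggers the prompt; any other failure still surfaces normally.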
@AltarBeastiful I have added support for exporting it in the most recent version, will have a real fix for the issue in a later release |
Mermaid throws "Maximum text size in diagram exceeded" to prevent completely blocking the UI when dbs are too big. The limit was already increased in #49, but could probably be increased even more if the load is put in a web worker.
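For reference, the error quoted above is raised when the diagram text exceeds mermaid's `maxTextSize` config option, so raising the limit is a one-line config change. This is a sketch, not the extension's code; the value shown is illustrative, and a very large limit trades mermaid's guard rail for a possible UI hang.

```typescript
import mermaid from "mermaid";

// Config fragment: raise mermaid's text-size ceiling well above its
// default (50000 characters). Illustrative value only.
mermaid.initialize({
  startOnLoad: false,
  maxTextSize: 200_000,
});
```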