Errors in ClickHouse shard about memory, whatever the allocated memory #66
Comments
These errors are still intensifying...
ClickHouse needs a lot of memory. I don't know what specifically is causing your issue, but some of the recommendations for running on low memory might help you out: https://clickhouse.com/docs/en/operations/tips#using-less-than-16gb-of-ram
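To complement that link: memory-related settings can also be lowered per query as a stopgap while the server config is tuned. Below is a minimal sketch using ClickHouse's HTTP interface, which accepts settings as URL parameters; the host name, port, and the specific values are placeholder assumptions, and it assumes no authentication is configured:

```python
import urllib.parse
import urllib.request

# Placeholder service address; adjust to your deployment.
CLICKHOUSE_URL = "http://clickhouse:8123/"

def run_query_low_memory(query: str) -> str:
    """Run a query with reduced per-query memory limits.

    The HTTP interface accepts settings as URL parameters;
    max_bytes_before_external_group_by makes GROUP BY spill to disk
    once it passes the threshold, instead of failing on the memory cap.
    """
    params = urllib.parse.urlencode({
        "query": query,
        "max_memory_usage": 2 * 1024**3,                # 2 GiB cap per query
        "max_bytes_before_external_group_by": 1024**3,  # spill past 1 GiB
        "max_threads": 2,                               # fewer threads, less memory
    })
    with urllib.request.urlopen(CLICKHOUSE_URL + "?" + params) as resp:
        return resp.read().decode()

print(run_query_low_memory("SELECT count() FROM system.parts"))
```

The durable fix is still server-side (a settings profile in users.xml, or an equivalent override if the chart exposes one), which the tips page above covers.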
Could we have this documented in the chart? By the way, the way my questions keep coming up (about the default ClickHouse system tables size, ingest pod persistence, Kafka replication issues, and now ClickHouse memory) makes me think Jitsu by default uses a fairly insane amount of memory and services for our usage, dispatching hundreds of thousands of events per day. And the new docs are quite light: https://docs.jitsu.com/. Anyway, I'd really like to see more documentation in jitsu-charts about all of this.
Managing these services, especially Jitsu's third-party dependencies, over time is not really within the scope of this chart. We recommend managing them separately for a production deployment. I'm not opposed to including some useful documentation on how to manage them, but the best source for this is likely going to be the official documentation for each service. We're operating a relatively small instance of Jitsu, so we don't necessarily have all the answers when it comes to scaling beyond that, but we're open to contributions from those who are processing a greater number of events.
Recently, I've been getting this error every few seconds:

So I tried increasing ClickHouse's memory in the config, but no matter how much I allocate, even 10Gi, I eventually get the same error (with
maximum: <WHATEVER_MAX_MEMORY_THAT_I_SET> GiB
in the error message). Any idea?
Thanks
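One avenue worth checking, as a hedged suggestion rather than a confirmed diagnosis: the `maximum: N GiB` figure in ClickHouse's memory-limit errors reflects whichever limit was actually hit, per-query (`max_memory_usage`) or server-wide (`max_server_memory_usage`, derived by default from `max_server_memory_usage_to_ram_ratio` times the RAM the process can see). Below is a minimal sketch for reading the effective values over the HTTP interface; the address is a placeholder, and `system.server_settings` only exists on reasonably recent ClickHouse releases:

```python
import urllib.parse
import urllib.request

# Placeholder address for the ClickHouse shard inside the cluster.
CLICKHOUSE_URL = "http://clickhouse:8123/"

# Per-query limits live in system.settings; server-wide limits live in
# system.server_settings (recent releases only; names vary across versions).
QUERY = """
SELECT 'query' AS scope, name, value
FROM system.settings
WHERE name = 'max_memory_usage'
UNION ALL
SELECT 'server' AS scope, name, value
FROM system.server_settings
WHERE name LIKE 'max_server_memory_usage%'
FORMAT TSV
"""

url = CLICKHOUSE_URL + "?" + urllib.parse.urlencode({"query": QUERY})
with urllib.request.urlopen(url) as resp:
    print(resp.read().decode())
```

If the server-wide limit is the one being hit, raising the pod's memory alone may not help unless ClickHouse actually detects the larger limit; how container memory limits are detected also varies by version.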