python process gets killed when trying to export a large buffer #11
Here is the problem: quasselgrep/quasselgrep/query.py, line 301 at commit 502c88b.
For non-context queries it's probably an easy fix.
This should be fixed now. Could you check that your use-case works?
Still happens. The query below returns more than 5 million lines, which I'm guessing is the amount that would be returned if I added the -i switch.
I have now finally managed to reproduce this! It somehow managed to kill my quasselcore at the same time as the Python process, which is annoying. I also think I know the cause now: it turns out that iterating over a Postgres database cursor actually slurps all the rows into memory up front by default. Shouldn't be too hard to fix!
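(For context, here is a minimal sketch of the behaviour described above and the usual fix, assuming quasselgrep talks to Postgres through psycopg2; the connection string, bufferid value, and SELECT statement are illustrative, not quasselgrep's actual code.)

```python
import psycopg2

conn = psycopg2.connect("dbname=quassel")  # illustrative connection string

# Default (client-side) cursor: execute() pulls the ENTIRE result set into
# client memory before iteration even begins -- with 5.5 million rows this
# is what invites the OOM killer.
# cur = conn.cursor()

# Named (server-side) cursor: rows are streamed from the server in batches
# of `itersize`, so memory stays bounded no matter how large the buffer is.
cur = conn.cursor(name="backlog_export")
cur.itersize = 2000  # rows fetched per round trip; 2000 is psycopg2's default

cur.execute("SELECT message FROM backlog WHERE bufferid = %s", (42,))
for (message,) in cur:  # holds at most ~itersize rows in memory at a time
    print(message)

cur.close()
conn.close()
```

The only API difference between the two modes is the `name` argument to `cursor()`; the rest of the cursor interface is unchanged, which is why this kind of fix tends to be small.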
OK @pitastrudl, it should be fixed in master. Can you test, hopefully for the last time?
@pitastrudl did you ever see this again?
Hi @fish-face, sadly not, but it's on my to-do list to test again. Since I posted this I've come to learn more about Postgres and Python, so maybe I'll be able to help more! Quassel also just put out a new release, so it will be more interesting to test. I'll try it out and let you know. Happy New Year!
The VM has around 500 MB of RAM and 3 GB of free disk space.
I went into the psql prompt to check how many lines the buffer has.
psql query:
select buffer.buffername, count(buffer.buffername) as counts
from buffer
inner join backlog on buffer.bufferid = backlog.bufferid
group by buffer.buffername
order by counts;
number of lines:
#ubuntu | 5505647
I execute:
./quasselgrep -N 'Freenode' -b '#ubuntu' > ubuntu.txt
which gets killed, and
dmesg -T | grep process
results in: