View Issue Details

ID: 0004183
Project: Kali Linux
Category: General Bug
View Status: public
Last Update: 2017-08-24 16:19
Reporter: grimwiz
Assigned To: rhertzog
Priority: normal
Severity: minor
Reproducibility: sometimes
Status: closed
Resolution: won't fix
Product Version: 2017.1
Summary: 0004183: openvas with postgresql causes large file tasks.db-wal
Description

This morning my Kali instance had its disk almost completely filled by a file called tasks.db-wal that had grown to 5 GB. This file appears to be a database write-ahead log, and I thought the default configuration should have truncated it automatically.
There was a rogue openvasmd process using 100% CPU when I noticed the problem.
Stopping openvas, redis and postgres and then restarting them solved the problem: the large file vanished when postgresql was restarted, although openvas would not start again until redis had also been restarted.

root@kalipc:/home/openvas/mgr# ls -l
total 5186692
-rw------- 1 root root 239075328 Aug 22 09:45 tasks.db
-rw------- 1 root root 9830400 Aug 24 09:17 tasks.db-shm
-rw------- 1 root root 5062256392 Aug 24 09:17 tasks.db-wal

top - 09:16:33 up 8 days, 16:40, 1 user, load average: 1.18, 1.14, 1.05
Tasks: 150 total, 2 running, 148 sleeping, 0 stopped, 0 zombie
%Cpu(s): 5.0 us, 20.1 sy, 0.0 ni, 74.9 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
KiB Mem : 8169784 total, 268464 free, 669240 used, 7232080 buff/cache
KiB Swap: 1048572 total, 1048572 free, 0 used. 7156604 avail Mem

PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
7706 root 20 0 258768 93380 7748 R 100.0 1.1 2818:07 openvasmd
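For background: a tasks.db file flanked by tasks.db-shm and tasks.db-wal is the on-disk signature of an SQLite database in write-ahead-log (WAL) journal mode, and the WAL is normally emptied by checkpoints. A minimal Python sketch using a throwaway database (not the real tasks.db) shows the mechanism:

```python
import os
import sqlite3
import tempfile

# Throwaway database standing in for tasks.db: WAL journal mode
# creates the companion -wal and -shm files seen in the listing above.
path = os.path.join(tempfile.mkdtemp(), "tasks.db")
con = sqlite3.connect(path)
con.execute("PRAGMA journal_mode=WAL")
con.execute("CREATE TABLE t (x)")
con.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(10000)])
con.commit()

size_before = os.path.getsize(path + "-wal")  # committed frames live in the WAL

# A checkpoint copies WAL frames back into the main database file;
# TRUNCATE mode additionally shrinks the WAL file to zero bytes.
con.execute("PRAGMA wal_checkpoint(TRUNCATE)")
size_after = os.path.getsize(path + "-wal")

print(size_before, size_after)
con.close()
```

This also explains why the 5 GB file "vanished" on restart: closing the last connection lets SQLite checkpoint and delete the WAL.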

Steps To Reproduce

I don't know how to reproduce this. I've been watching the tasks.db-wal file all day and its size has stayed constant despite my having triggered some scans. I have made the redis change mentioned in another ticket, setting "databases 256" in /etc/redis/redis.conf, and that may have improved matters.
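For reference, the redis change mentioned above is this one-line setting in /etc/redis/redis.conf (the value 256 comes from the report; whether it is related to the WAL growth is speculation):

```
databases 256
```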

root@kalipc:/home/openvas/mgr# ls -l
total 238276
-rw------- 1 root root 239075328 Aug 24 16:08 tasks.db
-rw------- 1 root root 32768 Aug 24 16:08 tasks.db-shm
-rw------- 1 root root 4882232 Aug 24 16:08 tasks.db-wal

Activities

rhertzog

2017-08-24 16:18

administrator   ~0007121

So the rogue openvasmd process probably kept writing data inside a never-ending transaction, and that is what created this huge file.
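The never-ending-transaction theory can be illustrated with a small Python sketch (a stand-in, not the OpenVAS code): while any connection holds a transaction open, checkpoints cannot recycle the WAL past that snapshot, so every commit keeps appending to the file; once the transaction ends, a TRUNCATE checkpoint empties it.

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "tasks.db")
writer = sqlite3.connect(path)
writer.execute("PRAGMA journal_mode=WAL")
writer.execute("CREATE TABLE t (x)")
writer.commit()

# A second connection stuck in an open transaction (standing in for the
# hypothesised rogue openvasmd) pins a snapshot of the database, so
# checkpoints cannot reset the WAL past that point.
reader = sqlite3.connect(path)
reader.execute("BEGIN")
reader.execute("SELECT count(*) FROM t").fetchone()

# Meanwhile every commit appends frames to tasks.db-wal, which only grows.
for _ in range(200):
    writer.execute("INSERT INTO t VALUES (randomblob(10000))")
    writer.commit()
grown = os.path.getsize(path + "-wal")

reader.close()  # end the long-lived transaction
writer.execute("PRAGMA wal_checkpoint(TRUNCATE)")
freed = os.path.getsize(path + "-wal")

print(grown, freed)
writer.close()
```

In the sketch `grown` is on the order of megabytes while `freed` is zero, mirroring the 5 GB file disappearing once the processes were restarted.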

There's no way that we (Kali developers) can work out why OpenVAS started misbehaving; you should engage with the upstream OpenVAS developers instead.

rhertzog

2017-08-24 16:19

administrator   ~0007122

Also have a look at https://kali.training/chapter-6/filing-a-good-bug-report/ before filing a bug with OpenVAS. If you can't reproduce the problem, the report will likely not be useful.

Issue History

Date Modified Username Field Change
2017-08-24 15:18 grimwiz New Issue
2017-08-24 16:18 rhertzog Assigned To => rhertzog
2017-08-24 16:18 rhertzog Status new => closed
2017-08-24 16:18 rhertzog Resolution open => won't fix
2017-08-24 16:18 rhertzog Note Added: 0007121
2017-08-24 16:19 rhertzog Note Added: 0007122