I have a rake task that gets killed when I run it from the console. The task operates on a table of roughly 40,000 rows, so I suspect the problem is running out of memory.
I also believed the query below was optimized for dealing with large tables:
MyModel.where(:processed => false).select(:attribute_for_analysis).find_each(:batch_size => 100) do |a|
  # deal with 40,000 rows, loading only `attribute_for_analysis`, 100 at a time
end
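For what it's worth, the difference between loading everything at once and processing in batches can be illustrated in plain Ruby, with no Rails involved. Here `each_slice` stands in for the SQL-level batching that `find_each` performs (a sketch, not the actual task):

```ruby
# `rows` stands in for the 40,000-row table. With real batching, each
# batch comes from its own query, so only `batch_size` rows are live in
# memory at once and earlier batches can be garbage collected.
rows = (1..40_000)   # a lazy Range, not a 40,000-element Array
batch_size = 100
batches = 0

rows.each_slice(batch_size) do |batch|
  batches += 1
  # process only `batch` here
end

batches  # 400 batches of 100 rows each
```

By contrast, `pluck` (or `to_a`) materialises the whole result set in one Ruby array before the loop even starts.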
This task will not be run on a regular basis in the future, so I want to avoid job-monitoring solutions like God etc., but I am considering background jobs, e.g. a Resque job.
I work with Ubuntu, Ruby 2.0 and Rails 3.2.14.
> My free memory is as follows:

                      total       used       free     shared    buffers     cached
Mem:                3891076    1901532    1989544          0       1240     368128
-/+ buffers/cache:  1532164    2358912
Swap:               4035580     507108    3528472
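A side note on reading that output: the plain "free" column under-reports what is actually available, because buffers and cache are reclaimable. The "-/+ buffers/cache" row is just arithmetic over the first row, which can be checked with the figures above:

```ruby
# Figures from `free -k` above, in kB.
total, used, free_kb = 3_891_076, 1_901_532, 1_989_544
buffers, cached      = 1_240, 368_128

# The "-/+ buffers/cache" row: cache counts as available, not used.
really_used = used - buffers - cached    # 1532164
available   = free_kb + buffers + cached # 2358912
```

So roughly 2.3 GB was available when this snapshot was taken, yet the killed processes below show ~3 GB of anonymous RSS, which is consistent with the OOM killer stepping in.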
QUESTIONS:
- How to investigate why the rake task is always killed (answered)
- How to make this rake task run to completion (not answered - it is still killed)
- What is the difference between total-vm, anon-rss and file-rss (not answered)
UPDATE 1
Can someone explain the difference between:
- total-vm
- anon-rss
- file-rss
$ grep "Killed process" /var/log/syslog 
Dec 25 13:31:14 Lenovo-G580 kernel: [15692.810010] Killed process 10017 (ruby) total-vm:5605064kB, anon-rss:3126296kB, file-rss:988kB
Dec 25 13:56:44 Lenovo-G580 kernel: [17221.484357] Killed process 10308 (ruby) total-vm:5832176kB, anon-rss:3190528kB, file-rss:1092kB
Dec 25 13:56:44 Lenovo-G580 kernel: [17221.498432] Killed process 10334 (ruby-timer-thr) total-vm:5832176kB, anon-rss:3190536kB, file-rss:1092kB
Dec 25 15:03:50 Lenovo-G580 kernel: [21243.138675] Killed process 11586 (ruby) total-vm:5547856kB, anon-rss:3085052kB, file-rss:1008kB
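For my own notes, the three figures in those kernel lines can be pulled out mechanically. As I understand it (this gloss is my assumption, not from the log itself): total-vm is the total virtual address space the process had mapped, anon-rss is resident anonymous memory (heap, stacks), and file-rss is resident file-backed memory (mapped libraries and files). A small parsing sketch over the first line above:

```ruby
# Extract the OOM killer's memory figures from a kernel log line.
line = "Dec 25 13:31:14 Lenovo-G580 kernel: [15692.810010] " \
       "Killed process 10017 (ruby) total-vm:5605064kB, " \
       "anon-rss:3126296kB, file-rss:988kB"

fields = line.scan(/(total-vm|anon-rss|file-rss):(\d+)kB/).to_h
fields["anon-rss"].to_i  # => 3126296 (kB, i.e. ~3 GB resident heap)
```

The tiny file-rss next to the huge anon-rss suggests the memory is Ruby heap data, not mapped files.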
UPDATE 2
I modified the query as follows and the rake task is still killed.
MyModel.where(:processed => false).find_in_batches do |group|
  p system("free -k")
  group.each do |row|
    # process
  end
end
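`free -k` reports system-wide memory, so it doesn't show whether this ruby process itself is the one growing. A sketch of how the process's own resident memory could be sampled between batches instead, by reading /proc/self/status (Linux only; the parser is pulled out so it can be shown on a sample string):

```ruby
# Return the VmRSS figure (kB) from the text of a /proc/<pid>/status file.
def vm_rss_kb(status_text)
  status_text[/^VmRSS:\s*(\d+)\s*kB/, 1].to_i
end

sample = "Name:\truby\nVmPeak:\t 5605064 kB\nVmRSS:\t 3126296 kB\n"
vm_rss_kb(sample)  # => 3126296

# Hypothetical placement inside the batch loop:
#   MyModel.where(:processed => false).find_in_batches do |group|
#     puts "RSS: #{vm_rss_kb(File.read('/proc/self/status'))} kB"
#     group.each { |row| ... }
#   end
```

If that figure climbs steadily from batch to batch, something inside the loop is retaining references (or the processed rows themselves are being accumulated) rather than the batching being at fault.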