My company has lots of very big files: text files, CSV tables, and so forth. I want to write a program that reads the contents of each file, line by line, and writes them into an SQLite database.
The program works exactly as it should, but it is too slow: it writes 50,000 entries in almost 2 minutes. The files it will have to deal with in production have over 10,000,000 entries each, some of them twice as many.
Which means I need to speed things up.
Here is the program:
    require 'sqlite3'
    require 'benchmark'

    system("clear")

    db = SQLite3::Database.open "./database/BenchmarkBase.db"
    counter = 0

    bench = Benchmark.measure {
      File.open("/root/Downloads/test/cellphoneNumbers.txt", 'r').each do |line|
        begin
          # one INSERT per line of the input file
          db.execute "INSERT INTO PhoneN(cellPhone) VALUES ('#{line.chomp}')"
          print "Entry: "
          print counter += 1
          print " .. has been written into database." + "\r"
        rescue SQLite3::Exception => e
          puts "Exception occurred"
          puts e
        end
      end
    }

    puts "time consumption: #{bench.real}"
    db.close if db
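One idea I have not benchmarked yet is batching the inserts into a single transaction with a prepared statement, so SQLite doesn't commit after every single row. A rough sketch of what I mean (same table and file as above; the batch size of 10,000 is just my guess):

    require 'sqlite3'

    db = SQLite3::Database.open "./database/BenchmarkBase.db"
    stmt = db.prepare "INSERT INTO PhoneN(cellPhone) VALUES (?)"

    # Commit once per 10,000 rows instead of once per INSERT,
    # and let the prepared statement handle quoting.
    File.foreach("/root/Downloads/test/cellphoneNumbers.txt").each_slice(10_000) do |batch|
      db.transaction do
        batch.each { |line| stmt.execute(line.chomp) }
      end
    end

    stmt.close
    db.close

Would something like this be the right direction, or is there a better way?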
Forking this process gave literally no performance boost (or maybe I did it wrong), so if anyone has a good idea how to make this faster, please don't hesitate to tell me.
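I have also read that SQLite pragmas can trade durability for speed during bulk imports, something like this before the inserts (untested on my data, and as far as I understand risky if the machine crashes mid-import):

    # don't wait for fsync after each commit, keep the rollback journal in RAM
    db.execute "PRAGMA synchronous = OFF"
    db.execute "PRAGMA journal_mode = MEMORY"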
Thank you in advance