
I am doing a lengthy SageMath calculation in Terminal, and I plan to save the results to disk using Shell > Export Text As… The resulting text file will be somewhere in the neighborhood of 10 GB.

Do I need to worry that 16 GB of RAM will not be enough for this task? In particular, I worry that both SageMath and the Terminal display are storing the results separately. SageMath's format would require less space, but by how much I don't know.

Another question: if I try to use “Shell > Export Text As…” while the calculation is running, will this cause it to stop? I'd prefer not to have to start over after three days just to find out that the answer to this question is "Yes, it will stop!"

I expect the calculation to take about two weeks total.


1 Answer


To answer your primary question: Terminal's scrollback buffer is limited only by the RAM on the machine (16 GB in your case). Without knowing what your process outputs, it is hard to say whether that will be an issue.

However, rather than depending on the integrity of the scrollback buffer, especially for a process whose runtime is measured in weeks, you could instead redirect the output to a file that can be inspected over time.

One way to do this might be to use the logging capabilities of SageMath.
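For instance, Sage's interactive shell is built on IPython, so IPython's %logstart magic should be available there; a minimal sketch, with an illustrative log file name:

# record both input and output to sage_session.log
%logstart -o sage_session.log
# ... run the long computation ...
# stop logging when finished
%logstop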

An alternative is to create a standalone script; anything it would have printed to the terminal (STDOUT) can then be redirected to a file:

sage your_sage_script.sage > sage.log

The output will then be written to the file sage.log, which can be inspected as it grows without affecting the execution of the script itself.
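For example, to watch new output as it is appended, open a second Terminal window or tab and run:

tail -f sage.log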

Either approach is a much better way of ensuring that your process's output is saved. For a script that will take weeks to complete, it is worth looking into these methods.
