17

I often pipe program output to less, e.g.

produce_output | less

This works great until produce_output produces a large amount of output. If I search for some text that is deep in the output, less reports

Calculating line numbers... (interrupt to abort)

If I interrupt with Control+C, it also kills produce_output, which stops it from producing further output. Is there any way to send the interrupt only to less, so that produce_output keeps running?

I know that I could run kill -INT on the less process, but I think there must be a better solution.
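
For reference, a minimal sketch of that workaround (it signals every process named less, so it assumes this is the only less instance running):

pkill -INT -x less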

Ed McMan
  • 520

4 Answers

21

Normally all processes in a pipeline run in the same process group, causing all of them to receive the signal. You can use setsid foo | less to run foo in a different process group (pgrp).
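
Applied to the command from the question, that is:

setsid produce_output | less

Control+C then reaches only the foreground process group containing less; produce_output, running in its own session, keeps going.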

grawity
  • 501,077
10

You can disable line numbers with the -n (or --line-numbers) option, which saves less from having to compute line numbers for very large input:

produce_output | less -n
Matteo
  • 8,097
2

You can also have less read the output through process substitution instead of a plain pipe:

less +F -f <(produce_output)

Here +F makes less keep reading as new output arrives (like tail -f), and -f lets it open the non-regular file the substitution provides without prompting.
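
To see what the shell is actually handing to less (bash/zsh process substitution), you can echo the substitution; seq 1000 below is just a stand-in producer, and the exact path varies:

echo <(seq 1000)

This prints something like /dev/fd/63, a pipe rather than a regular file, which is why the -f flag is used above.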
0

When working with large amounts of output, I've found it very helpful to send the output to a file and use tail -f or less +F to watch it, e.g.:

produce_output > out 2>&1 & less +F out

The 2>&1 redirection makes sure that both stdout and stderr go to out; remove it if you only want stdout going to the file. This way, you can inspect the output in various ways (even from a different machine) without having to mess with the program producing the output.

Note that 2>&1 is standard POSIX shell syntax, not something bash-specific. Be sure that you have sufficient disk space for the output file :-)
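
The tail -f variant mentioned above looks like this; it simply streams the growing file to the terminal instead of opening a pager:

produce_output > out 2>&1 & tail -f out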

jrennie
  • 186