
I'm trying to write a very simple Nushell script. I want to skip the first 46 lines and pipe the rest to an external command. In Bash:

zcat filename.sql.gz |
    tail -n +47 |
    mysql dbname

Since Nushell's open doesn't seem to support compressed formats, I extracted the .gz and then, in Nushell, tried:

> open --raw filename.sql | lines | skip 46 | describe
list<string>

...so I have a list, therefore I should be able to convert it to one string, but...

> open --raw filename.sql | lines | skip 46 | to text | mysql dbname

It appears as though that tries to load the whole file into memory, but it's much too large. Is there a way to avoid that?

Hut8


Important: As @Prem correctly mentions in the comments, my current answer below only works for a narrow use-case, and can cause issues in most SQL-based workloads like in the question. That it seems to have worked for the OP (based on current acceptance of the answer) is a happy coincidence.

See the bottom of this answer for the reasons why it is wrong, situations in which it might cause issues, and how we might (eventually) get it to work.


Short answer (addressing several of your issues): try

zcat filename.sql.gz |
    lines |
    skip 46 |
    each {
        mysql dbname
    }

Explanation:

As fdncred mentioned in your GitHub issue, the problem is the to text.

That's attempting to convert the entire list<string> to text. That, of course, can't happen until the entire file is read.

What you really seem to want is simply to process each line as it is read.

You can see this demonstrated without feeding it to MySQL with something like:

open filename.sql | 
    lines |
    skip 46 |
    each {
        bash -c "cat; echo"
        sleep 1sec
    }

^^^ Quick-and-dirty use of Bash to print what it receives on stdin (the cat) + a newline (the echo).
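You can try that quick-and-dirty printer on its own, outside Nushell, with a sample line standing in for real SQL:

```shell
# One line in, one line out: `cat` copies stdin through, and `echo`
# appends the trailing newline that the piped-in line lacks.
printf 'SELECT 1;' | bash -c "cat; echo"
# → SELECT 1;
```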

Note that there's some shorthand in the Nushell version, primarily around implicit output and the each block's handling of its input. A more conventional option, with explicit output and an explicit each variable:

open filename.sql | 
    lines |
    skip 46 |
    each {|sql_statement|
        echo $sql_statement | bash -c "cat; echo"
        sleep 1sec
    }

Also, as you probably noticed in the "short answer", this form lets you return to using zcat to process the compressed file directly.
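If it helps to see the same per-line idea outside Nushell, here's a plain-Bash sketch. Note the stand-ins: seq 1 50 takes the place of the decompressed dump, and cat takes the place of mysql dbname:

```shell
# Bash analogue of the per-line approach: one downstream process per line.
seq 1 50 |                 # stand-in for `zcat filename.sql.gz`
    tail -n +47 |          # skip the first 46 lines
    while IFS= read -r line; do
        # each iteration would spawn its own `mysql dbname`;
        # `cat` stands in for it here
        printf '%s\n' "$line" | cat
    done
# → 47 48 49 50 (one per line)
```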


The "Oops" section

As mentioned in the front-matter, this doesn't work in all (perhaps most) cases.

The each will spawn a new mysql process for each line in the original file. This often won't work, because SQL statements can span multiple lines. Attempting to pass a single line of a multi-line statement into a single mysql process will fail.
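To make the failure mode concrete, here's a Bash sketch of what per-line processing does to a statement that spans lines. An echo stands in for each mysql invocation; notice that no fragment is valid SQL on its own:

```shell
# One CREATE TABLE statement split across three lines; per-line
# processing hands each fragment to a separate process.
printf 'CREATE TABLE t (\n  id INT\n);\n' |
    while IFS= read -r line; do
        echo "mysql would receive only: $line"
    done
# → mysql would receive only: CREATE TABLE t (
# → mysql would receive only:   id INT
# → mysql would receive only: );
```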

I can think of a half-dozen different ways to make this work, but so far all of them would require changes to Nushell.

The best scenario, IMHO, would be for to text to stream as you originally expected, which is the subject of a recent Nushell feature request (#6178).

NotTheDr01ds