3

I'm using the latest version of DBeaver on Mac, querying a remote MSSQL database, and I've got a query that returns 24 million rows. I need to get this data into CSV format, but rather than one huge, unmanageable file with 24 million lines in it, I'd like to get a series of files, each containing a smaller number of lines.

I've been using the Export from Query feature to get the output into CSV format. How do I export the data such that after some number of rows the current CSV file is closed and a new one opened? I can't see how this is possible from the UI but I'm hoping I'm missing something basic.

I can run the query in a paginated manner, increasing the page number each time I run it, but it's taking forever and there's too much risk of human error :/
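
For reference, the manual pagination looks roughly like this (the table name, ordering column, and chunk size below are placeholders, not my real schema):

    -- Placeholder table/column names; chunk size chosen arbitrarily.
    -- I bump @PageNumber by hand for each run, then export that result set to its own CSV.
    DECLARE @PageNumber INT = 0;       -- incremented manually per export
    DECLARE @PageSize   INT = 1000000; -- rows per CSV file

    SELECT *
    FROM dbo.MyBigTable
    ORDER BY Id                        -- OFFSET/FETCH needs a stable ORDER BY
    OFFSET @PageNumber * @PageSize ROWS
    FETCH NEXT @PageSize ROWS ONLY;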

I also tried a WHILE loop, which gave me separate result sets on separate tabs (as hoped), but when run through Export from Query, everything still ends up in the same file.
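
In case it matters, the loop was roughly this (again with placeholder names); each iteration produces its own result set, but the export still writes a single CSV:

    -- Placeholder names; each SELECT comes back as a separate result set
    -- (separate tab in DBeaver), but Export from Query merges them into one file.
    DECLARE @PageSize INT = 1000000;
    DECLARE @Offset   INT = 0;
    DECLARE @Total    INT = (SELECT COUNT(*) FROM dbo.MyBigTable);

    WHILE @Offset < @Total
    BEGIN
        SELECT *
        FROM dbo.MyBigTable
        ORDER BY Id
        OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY;

        SET @Offset = @Offset + @PageSize;
    END;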

If this decidedly cannot be done in the DBeaver client, does anyone know of a client that WILL support this? It seems fairly basic: managing the size of CSV files. As it stands now I'm having to re-run the query for each chunk, export to a different file each time, and slice the data with OFFSET/FETCH. It's tedious, but I can do this if it's my only recourse.

jaydel
  • 131

1 Answer

2

As of today (2020-05), DBeaver can only split the exported output by file size, not by number of rows. See https://github.com/dbeaver/dbeaver/issues/1646

motobói
  • 762