I am trying to export some query results to a CSV or TAB-delimited file. One of the fields is a text blob that may contain special characters, including single and double quotation marks (", '), newlines (\n) and tabs (\t).
+------+------+--------------------------------------------------------+
| col1 | col2 | text                                                   |
+------+------+--------------------------------------------------------+
|    1 |  foo | Oh hey why not "this" or t'is                          |
or a newline while we are at it |
+------+------+--------------------------------------------------------+
This is the query I am using for the CSV file; the TAB-delimited version (shown after it) is the same query with '\t' in place of ','.
SELECT col1, col2, text
FROM   mytable
INTO OUTFILE '/tmp/foo.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
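For the TAB-delimited attempt I run the same statement with the delimiter swapped (the .tsv path below is just my own naming):

SELECT col1, col2, text
FROM   mytable
INTO OUTFILE '/tmp/foo.tsv'
FIELDS TERMINATED BY '\t'
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';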
Neither the CSV nor the TAB-delimited output works: any commas, quotes, tabs or newlines in the text field end up splitting the blob across multiple lines and/or columns in the exported file.
Actual question:
Is there any way to escape special characters such as ", ', \t and \n in the text field when writing to a CSV or TAB-delimited file, or do they need to be replaced before the file is written?
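By "escape" I mean something along the lines of an explicit ESCAPED BY clause, sketched below. I have not confirmed whether this produces output that other tools can read back correctly when the field contains newlines, and the '"' escape character is just my guess:

SELECT col1, col2, text
FROM   mytable
INTO OUTFILE '/tmp/foo_escaped.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n';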
If they do need to be replaced, I would start from the code in this question using the REPLACE function, but I'd prefer something that preserves the original text.
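To show what I mean by the replacement route, this is roughly where I would start. It is only a sketch: the escape sequences, the text_escaped alias and the output path are my own choices, not something I have settled on.

-- Sketch: turn the problem characters into visible escape sequences
-- before writing, so no field contains a raw quote, newline or tab.
SELECT col1,
       col2,
       REPLACE(REPLACE(REPLACE(REPLACE(text,
           '\\', '\\\\'),   -- backslashes first, so later escapes stay unambiguous
           '"',  '\\"'),    -- embedded double quotes
           '\n', '\\n'),    -- newlines
           '\t', '\\t')     -- tabs
       AS text_escaped
FROM   mytable
INTO OUTFILE '/tmp/foo_replaced.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';

The downside is that whatever reads the file would then have to un-escape the values, which is why I would rather avoid this route.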
Thanks.