PostgreSQL query/error logging for .sql scripts

Here is the problem: I need to know how to get all PostgreSQL output from an executed .sql script to log to a file, including errors and query results.

I have already surrounded my script with opening and closing \o commands, but this only logs query results, which tells me nothing about what did and did not succeed. I have tried piping the results of a call to PostgreSQL through Unix tools like so:

    $~: psql < filename.sql | tee &> filename.log

... with no success. Filename.log in this case ends up completely empty.
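
Here is a minimal sketch of the \o approach described above (the file names and table name are placeholders), which shows why the resulting log contains only result sets:

    # Sketch of the \o approach: \o captures query RESULTS only; the error
    # from the failing statement still goes to the terminal (stderr), so it
    # never appears in results.log.
    psql <<'SQL'
    \o results.log
    SELECT 1;
    SELECT * FROM table_that_does_not_exist;
    \o
    SQL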

I don't want to activate universal logging, as I am only concerned with the success/failure of the scripts I have written and keeping a record of this for future reference.

In Oracle I used SPOOL, and in MySQL I used TEE. Both of these worked well enough for my needs, but \o does not have this functionality, and neither does activating logging, since that writes to a single file and I want my logs separated by which script produced them.

Any help would be appreciated.

EDIT: The version I am using is 9.1.

EDIT: The problem I am having is that using -f and using < to execute .sql files gives me essentially the same results; neither logs the errors, only the query results. I need to somehow get stderr (if that's what is used) to print its messages to a file and not just the command line, so that the file looks essentially identical to the command-line output of running the script, with query results and errors mixed in. The reason I need this is that it makes debugging a .sql file much easier and lets me keep targeted records while avoiding universal error logging.

Lally answered 12/7, 2012 at 15:43 Comment(0)

I was using the wrong redirect syntax on the command line. While this is not an ideal solution in my opinion (I have to remember to modify any command I ever run to redirect output to a file), it works, and that's all I care about.

To redirect both stdout and stderr to a file, logging the results of a single .sql file (errors and query results included), one can do the following:

    psql -f scriptname.sql &> logname.txt

Here is the breakdown: -f (as alluded to in the answer from Frank Heikens) is a better option than < for reading a .sql script because it shows the line numbers that triggered the errors. &> is a Bash redirection that puts both stdout and stderr into the file that follows. To limit the log to just errors (just stderr output), simply use 2> instead of &>. There is no need for a pipe because both streams go to the same place.
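
For example, an errors-only log of the same script (same placeholder names) would be:

    # Capture only stderr (error messages); query results still print to the
    # terminal instead of the file.
    psql -f scriptname.sql 2> logname.txt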

You will notice, however, that the &> command above sends all of the output into the file, so you do not see any of it on the command line. This is not ideal for debugging .sql scripts, so the following command can be used instead:

    psql -f scriptname.sql 2>&1 | tee logname.txt

The breakdown here is that 2>&1 sends stderr to the same place as stdout (much as &> does), but it is the form that works in front of a pipe, where Bash would otherwise reject the &> syntax as an error (don't ask me why). tee simply sends the output both to the console and to the given file. A filename in any of the above commands can be a full path, can include ~ (for the home directory), and can be anything else Linux will accept as a valid filename and/or path.

Still, this solution is not ideal as it is external to the .sql scripts that I need to log. However, this method does work. I assume that the history PostgreSQL has with Linux is partially to blame for the dependence on command-line tools to do this sort of thing.
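
Since I want one log per script, a thin wrapper can derive each log name from the script name so the redirect does not have to be retyped. A minimal sketch, assuming the scripts live in the current directory:

    # Run every .sql file in the current directory and give each one its own
    # log, with query results and errors interleaved exactly as on the terminal.
    for script in *.sql; do
        psql -f "$script" 2>&1 | tee "${script%.sql}.log"
    done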

Lally answered 14/7, 2012 at 3:45 Comment(1)
Thanks! This is exactly the answer I was looking for, and an explanation to go with it. Thanks again! – Menis

This also works, using the -a, -b, and -e options for "verbose logging":

    psql -f scriptname.sql -a -b -e &> logname.text

psql --help shows:

Input and output options:
  -a, --echo-all           echo all input from script
  -b, --echo-errors        echo failed commands
  -e, --echo-queries       echo commands sent to server
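
If you still want to watch the output while it is being logged, these echo options can be combined with the tee form from the accepted answer (file names are placeholders):

    # Echo all input, failed commands, and queries, logging everything while
    # keeping it visible on the terminal.
    psql -f scriptname.sql -a -b -e 2>&1 | tee logname.txt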
Towboat answered 15/3, 2017 at 20:2 Comment(0)

Use -f (and other settings) to feed psql.
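
For example (the database name is a placeholder):

    # -f makes psql report the file name and line number of any statement
    # that fails.
    psql -d mydb -f filename.sql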

Erida answered 12/7, 2012 at 16:11 Comment(1)
While this is a good practice (it is more full-featured than < because it provides the line numbers that triggered the errors), it still doesn't put error output into a file. I've tried -L, -o, \o, and a couple of others from the link, but none have had the desired effect (error output AND query results funneled into a file for logging purposes). – Lally
