Greenplum DBA - Useful log files

1. Greenplum Database Log Files
a) Greenplum Database server log files are created daily in the pg_log directory of the master and each segment instance.
b) The naming convention is: gpdb-YYYY-MM-DD.log

Note: Greenplum Database log output tends to be voluminous (especially at higher debug levels), and you do not need to save it indefinitely. Administrators should rotate the log files periodically so that new log files are started and old ones are removed after a reasonable period of time. In practice this means implementing a script or program that periodically cleans up old log files in the pg_log directory of the master and each segment instance.
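Such a cleanup job can be sketched in a few lines of Python. The `prune_gp_logs` function name, the directory argument, and the 30-day default below are illustrative choices, not part of Greenplum:

```python
import os
import time

def prune_gp_logs(pg_log_dir, retention_days=30):
    """Delete gpdb-YYYY-MM-DD.log files older than retention_days.

    Run against the pg_log directory of the master and of each
    segment instance. The 30-day default is an example, not a
    Greenplum recommendation.
    """
    cutoff = time.time() - retention_days * 86400
    removed = []
    for name in os.listdir(pg_log_dir):
        path = os.path.join(pg_log_dir, name)
        # Only touch files following the gpdb-YYYY-MM-DD.log convention.
        if (name.startswith("gpdb-") and name.endswith(".log")
                and os.path.isfile(path) and os.path.getmtime(path) < cutoff):
            os.remove(path)
            removed.append(name)
    return sorted(removed)
```

Scheduled from cron on every host, this keeps pg_log bounded while leaving non-log files untouched.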

To learn more about Greenplum Database log files, including sample scripts for cleaning up old log files in the pg_log directory of the master and each segment instance, visit www.greenplumdba.com (registered users only).

You can also use views in the gp_toolkit schema to examine the Greenplum Database server log files. Four views are available:
1. gp_log_command_timings
2. gp_log_database
3. gp_log_master_concise
4. gp_log_system

2. Greenplum Database Management Utility Log Files
Log files for the Greenplum Database management utilities are written to ~/gpAdminLogs by default. The naming convention for management log files is:
<script_name>_<date>.log
The log entry format is:
<timestamp>:<utility>:<host>:<user>:[INFO|WARN|FATAL]:<message>
Each time a utility is run, its log entries are appended to that utility's daily log file.
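A small parser for the entry format above can be sketched in Python. The `parse_mgmt_log_line` function and the field handling are illustrative; real utility logs may decorate the timestamp or level field slightly differently:

```python
import re

_LEVELS = "INFO|WARN|FATAL"

def parse_mgmt_log_line(line):
    """Split '<timestamp>:<utility>:<host>:<user>:[LEVEL]:<message>'.

    Anchors on the bracketed level token so that colons inside the
    timestamp cannot confuse the split.
    """
    m = re.match(r"^(.*?):\[(%s)\]:(.*)$" % _LEVELS, line)
    if m is None:
        return None
    prefix, level, message = m.groups()
    if prefix.count(":") < 3:
        return None  # not enough fields before the level token
    # The last three prefix fields are utility, host, and user;
    # whatever remains on the left is the timestamp.
    timestamp, utility, host, user = prefix.rsplit(":", 3)
    return {"timestamp": timestamp, "utility": utility, "host": host,
            "user": user, "level": level, "message": message}
```

Splitting from the right is what makes this robust when the timestamp itself contains colons.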

Gathering Information for Greenplum Support
The gpdetective utility collects information from a running Greenplum Database system and creates a bzip2-compressed tar output file. You can then send the output file to Greenplum Customer Support to aid the diagnosis of Greenplum Database errors or system failures. Run gpdetective on your master host. To learn more, visit www.greenplumdba.com (registered users only).

Greenplum Chorus Log Files (if you use Greenplum Chorus in your environment; to learn more about Greenplum Chorus, visit www.greenplumdba.com, registered users only)

Log levels
Depending on the log level set in chorus.properties, the volume of the log files can vary drastically. Supported log levels are:
• debug
• info
• warn
• error
• fatal

production.log
The Rails production.log file is stored in: <chorus-root>/shared/log/production.log
This log contains information on requests sent to the Chorus web server, along with various debugging information: for example, server errors, file-not-found errors, permission-denied errors, and others.

worker.production.log
The Rails worker.production.log file is stored in: <chorus-root>/shared/log/worker.production.log
It contains logs for the background worker threads that Chorus uses to perform asynchronous tasks such as database imports and instance status checks.

scheduler.production.log
The Rails scheduler.production.log file is stored in: <chorus-root>/shared/log/scheduler.production.log
It contains information about jobs that the scheduler issues to different background workers. This will mainly show that a task was scheduled. See the worker.production.log for more detailed information about what happened during execution of a task.

solr-production.log
The Rails solr-production.log file is stored in: <chorus-root>/shared/log/solr-production.log
It contains information about Solr search queries issued against Chorus.

nginx
nginx maintains access.log and error.log files in <chorus-root>/shared/log/nginx.

syslog
As an alternative to the log files listed above, all logs can be combined in one file by using syslog as the logger.
To turn on syslog as the logger, put logging.syslog = true in <chorus>/shared/chorus.properties.

Logrotate
You can use the logrotate utility to rotate your log files and prevent them from accumulating. By running logrotate your_logrotate.conf from a cron job, you can make sure the logs are rotated at preset intervals.
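For example, a crontab entry along these lines would run logrotate nightly; the binary path, state-file path, and schedule shown are illustrative and depend on your installation:

```
# Run logrotate every night at 02:00 against the Chorus log config
# (paths are examples; adjust to your installation).
0 2 * * * /usr/sbin/logrotate --state /home/chorus/.logrotate.status /home/chorus/your_logrotate.conf
```

Using a per-user --state file avoids needing write access to the system-wide logrotate status file.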

Here is an example of a your_logrotate.conf configuration file that rotates all the important Chorus log files:
<chorus>/shared/log/production.log
<chorus>/shared/log/nginx/access.log
<chorus>/shared/log/nginx/error.log
<chorus>/shared/log/solr-production.log
<chorus>/shared/log/worker.production.log
<chorus>/shared/log/scheduler.production.log
{
    daily
    rotate 4
    copytruncate
    size 10M
}
See the logrotate manual page for more details on the features of logrotate: http://linuxcommand.org/man_pages/logrotate8.html.
Note: If you use syslog, you don't need to rotate your logs manually—syslog rotates the log files for you.