I am running a series of Python scripts from a bash script, e.g.:
nohup python $IMPORTER_PATH/importer.py -t styles -e $ELASTIC_URL -j $STYLES_DATA_PATH -b -f &>> "$LOG_PATH/styles-$(date "+%d-%b-%Y").log"
nohup python $IMPORTER_PATH/importer.py -t objects -e $ELASTIC_URL -j $OBJECTS_DATA_PATH -b -f &>> "$LOG_PATH/objects-$(date "+%d-%b-%Y").log"
At the moment I am successfully capturing both stdout and stderr in a log file through the use of &>>.
My Python script had been littered with useful print statements such as print("Process started"), so the bash-generated log file did have some use.
However, I have recently been cleaning up my logging and using the Python logging module to generate more focused log files.
Here's an extract from my importer.py script:
import argparse
import logging

def parse_args(args):
    parser = argparse.ArgumentParser()
    parser.add_argument("-t", "--type", help="Type of data to import")
    return parser.parse_args(args)

def main(argv):
    # Set up logging levels
    logging.basicConfig(filename='importer.log',
                        filemode='w', level=logging.INFO)
    es_log = logging.getLogger("elasticsearch")
    es_log.setLevel(logging.WARNING)

    args = parse_args(argv)
    logging.info('Started with %s', args)
The problem now is that importer.log is created in whatever directory the bash script is run from, and because I have set filemode='w', it gets overwritten each time one of the Python scripts is called.
Is there a way for me to redirect the output destined for importer.log to my dated log file, such as "$LOG_PATH/styles-$(date "+%d-%b-%Y").log"?
Or should I add another argument to importer.py's ArgumentParser and pass in the destination path and file name?
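If the ArgumentParser route is the way to go, this is roughly what I have in mind; the -l/--log-file flag is just for illustration and isn't in my script yet:

def parse_args(args):
    parser = argparse.ArgumentParser()
    parser.add_argument("-t", "--type", help="Type of data to import")
    # Hypothetical flag, for illustration only
    parser.add_argument("-l", "--log-file", default="importer.log",
                        help="Destination path and file name for the log")
    return parser.parse_args(args)

def main(argv):
    # Parse args first so the log destination is known; filemode='a'
    # so repeated runs append rather than overwrite
    args = parse_args(argv)
    logging.basicConfig(filename=args.log_file,
                        filemode='a', level=logging.INFO)
    logging.getLogger("elasticsearch").setLevel(logging.WARNING)
    logging.info('Started with %s', args)

Each bash call would then pass the dated file explicitly, e.g. by adding -l "$LOG_PATH/styles-$(date "+%d-%b-%Y").log" to the importer.py invocation.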