Hello,
I can't get babar.log aggregated into the YARN logs. When I run the command to fetch the logs:
yarn logs --applicationId application_XXXX_YYYY > myAppLog.log
the resulting myAppLog.log doesn't contain the traces from babar.log. It contains only my application's and YARN's log messages.
This is not a problem when I launch the application from the Linux command line (calling spark2-submit directly), because the file babar.log is created in the same directory. But when I launch it from an Oozie workflow (production environment), the file babar.log disappears when the container terminates, and its content is not aggregated.
I realized that the system properties yarn.app.container.log.dir and spark.yarn.app.container.log.dir are null, so babar falls back to a local directory for the log: ./log. Could that be the reason? Has anyone observed the same problem?
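To check whether the container log directory properties resolve inside the container, a small sketch like the one below can be run from the driver or an executor. The fallback chain here (and the ./log default) is an assumption modelling the behaviour described above, not babar's actual implementation:

```java
public class LogDirCheck {

    // Resolve the directory a log file would land in, mirroring the
    // fallback described above: try the YARN property, then the Spark
    // variant, then fall back to a local "./log" directory (assumed
    // default; a file there is lost when the container terminates and
    // is never picked up by YARN log aggregation).
    static String resolveLogDir() {
        String dir = System.getProperty("yarn.app.container.log.dir");
        if (dir == null) {
            dir = System.getProperty("spark.yarn.app.container.log.dir");
        }
        return (dir != null) ? dir : "./log";
    }

    public static void main(String[] args) {
        // If this prints "./log" inside a container launched by Oozie,
        // the properties are not being propagated to that JVM.
        System.out.println("log file would be written under: " + resolveLogDir());
    }
}
```

If the property turns out to be null in the Oozie-launched JVM, the log file ends up in the container's working directory, which is deleted on container exit, so aggregation never sees it.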