Logs

See the research on log aggregation.

We follow the first solution described in that document: logging to stdout, with no need for a forwarder sidecar pod.

Where do I find the logs?

First, you have to get access to Splunk (CMDB ID is 'PCKT-002').

Then go to https://rhcorporate.splunkcloud.com and open Search & Reporting.

You should be able to see production logs using this query:

index="rh_paas" kubernetes.namespace_name="packit--prod"

and staging logs using this query:

index="rh_paas_preprod" kubernetes.namespace_name="packit--stg"

If the above query doesn't return any results, request access to the rh_paas index.

Caution

If you cannot see Access to Additional Datasets (as suggested by the instructions), choose Update Permissions as the Request Type and ask for access to the rh_paas index in the additional details.

The more specific the search, the faster it will be. You should specify at least the index and kubernetes.namespace_name. If you want to export the results, also exclude the _raw field (it contains the complete JSON structure) and include only the fields you need, such as message or kubernetes.pod_name; otherwise you'll most likely hit the export quota. You can start with the examples above and tune them from there. For example:

  • add | reverse if you want to see the results ordered from oldest to newest
  • add | fields - _time, _raw | fields message to keep only the message field, without the duplicated timestamp
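Putting those pieces together, a staging query that keeps only the message and pod name might look like this (a sketch combining the staging example with the fields mentioned above):

index="rh_paas_preprod" kubernetes.namespace_name="packit--stg" | reverse | fields - _time, _raw | fields message, kubernetes.pod_name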

All in one URL here: now just export it to CSV and you have almost the same log file as you'd get by exporting logs from a worker pod.

For more info, see (Red Hat internal):