Loki isolation problem: separate endpoints but shared logs

Hi everyone,

I have a Kubernetes setup with three separate environments:

  • dev
  • test
  • prod

Each environment contains multiple Kubernetes clusters (around 5 per environment).

In each environment, one cluster runs a dedicated Loki instance, and every cluster runs Grafana Alloy to collect logs and ship them to that environment's Loki instance.

Example:

  • dev clusters → send logs to Loki in dev
  • test clusters → send logs to Loki in test
  • prod clusters → send logs to Loki in prod

Each Alloy instance has its own Loki endpoint configured correctly. However, I still see logs from other environments when querying Loki.
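For reference, the shipping side on each cluster looks roughly like this (simplified sketch; the URL, component label, and label values are illustrative, and the real config also contains the discovery and `loki.source` components):

```alloy
// Per-environment log shipping (dev shown; test/prod differ only in URL/label)
loki.write "default" {
  endpoint {
    url = "https://loki.observability-dev.example.com/loki/api/v1/push"
  }
  external_labels = {
    environment = "dev",
  }
}
```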

Important detail:
All Loki instances use the same S3 bucket, but I created separate folders/prefixes:

observability-dev/
observability-test/
observability-prod/

My expectation was:

  • dev Loki should only see dev logs
  • test Loki should only see test logs
  • prod Loki should only see prod logs

But currently the data appears mixed across environments.

My questions:

  1. Is using different “folders” inside the same S3 bucket enough for isolation in Loki?
  2. Do I need to configure explicit S3 prefixes/path_prefix somewhere in Loki configuration?
  3. Would labels like environment=dev help here, or are labels only for querying/filtering and not for real isolation?
  4. What is considered best practice for separating environments in Loki?

Thanks!
Antonio

Hi @tonyy

In my view, the cleanest fix is to create a separate S3 bucket per environment and point each Loki instance at its own bucket. That also makes it easier to manage IAM permissions and access policies per environment.
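As a sketch, each environment's Loki would then get its own storage section along these lines (bucket names, region, and schema dates are examples; adapt to your deployment and Loki version):

```yaml
# loki.yaml for the prod environment; dev/test point at their own buckets
common:
  storage:
    s3:
      bucketnames: observability-prod   # dev: observability-dev, test: observability-test
      region: eu-central-1

schema_config:
  configs:
    - from: "2024-01-01"
      store: tsdb
      object_store: s3
      schema: v13
      index:
        prefix: index_
        period: 24h
```

With one bucket per environment there is no shared index or chunk data, so the instances cannot see each other's logs regardless of prefix handling.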

In addition, the data will be easier to maintain and work with later, for example for analytics or ad-hoc querying.
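On your other questions: labels like `environment=dev` only help with querying and filtering; they are not an isolation boundary. If you need hard isolation beyond separate buckets (or ever want several environments to share one Loki), the standard mechanism is Loki's built-in multi-tenancy: set `auth_enabled: true` in the Loki config so every request must carry an `X-Scope-OrgID` header, and have each environment's Alloy send its own tenant ID. A minimal sketch (URL is a placeholder):

```alloy
// Each environment writes as its own tenant; Loki keeps tenant data separate
loki.write "default" {
  endpoint {
    url       = "https://loki.example.com/loki/api/v1/push"
    tenant_id = "dev"   // "test" / "prod" on the other environments' clusters
  }
}
```

Queries then only ever see the tenant named in the request, which is the isolation you were hoping the folder prefixes would give you.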