Increased memory usage in version 2.11.18 #11423

Closed
aucampia opened this issue Jan 9, 2025 · 14 comments

aucampia commented Jan 9, 2025

Welcome!

  • Yes, I've searched similar issues on GitHub and didn't find any.
  • Yes, I've searched similar issues on the Traefik community forum and didn't find any.

What did you do?

Upgraded from version 2.11.16 to 2.11.18. We did not expect any change in memory usage.

What did you see instead?

It seems Træfik now uses at least 2x as much memory as before, and it may even be leaking memory.

(screenshots: memory usage graphs)

What version of Traefik are you using?

2.11.18

What is your environment & configuration?

N/A

If applicable, please paste the log output in DEBUG level

No response

nmengin (Contributor) commented Jan 9, 2025

Hey @aucampia,

Thanks for reaching out.

Could you provide a minimal reproducible case (for instance, full Kubernetes manifest to reproduce the issue)?

nmengin added the kind/bug/possible and area/server labels and removed the status/0-needs-triage label on Jan 9, 2025
@romainhealth

Hi,

We are also experiencing this issue with the latest Traefik version 2.11.18.
We are noticing memory usage spikes, which ultimately lead to our Docker Swarm Managers being killed.

On January 8th, we updated to version 2.11.18.
(screenshot: memory usage graph)

Traefik is running in our Docker Swarm environment. We have 3 managers, 2 workers, and over 20 services.

version: '3.3'

services:
  proxy:
    image: traefik:v2.11.16
    command: 
      - --api=true
      - --providers.docker
      - --providers.docker.swarmMode=true
      - --entryPoints.web-secure.address=:443
      - --entryPoints.web-secure.transport.respondingTimeouts.readTimeout=150
      - --entryPoints.metrics.address=:8082
      - --metrics=true
      - --metrics.prometheus=true
      - --metrics.prometheus.entryPoint=metrics
      - --ping=true
      - --entryPoints.ping.address=:8083
      - --ping.entryPoint=ping
    ports:
      - "443:443"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    networks:
      sample:
      management:
    deploy:
      mode: global
      labels:
        - "traefik.enable=true"
        - "traefik.http.routers.traefik.rule=(Host(`${url}`) && PathPrefix(`/api`)) || (Host(`${url}`) && PathPrefix(`/dashboard`))"
        - "traefik.http.services.traefik.loadbalancer.server.port=8080"
        - "traefik.http.routers.traefik.service=api@internal"
        - "traefik.http.routers.traefik.tls=true"
        - "traefik.http.routers.traefik.middlewares=auth"
        - "traefik.http.middlewares.auth.basicauth.users=sample:sample"
        - "traefik.http.routers.traefikping.rule=(Host(`${url}`) && PathPrefix(`/ping`))"
        - "traefik.http.routers.traefikping.service=ping@internal"
        - "traefik.http.routers.traefikping.tls=true"
        - "traefik.http.services.traefikping.loadbalancer.server.port=8083"
      placement:
        constraints:
          - node.role == manager
networks:
  sample:
    external: true
  management:
    external: true

volumes:
  logs:

@johnbizokk

We also faced this problem.

(screenshot: memory usage graph)

Bobris commented Jan 16, 2025

We have the same problem after upgrading from 2.11.16 to 2.11.18; pods are starting to be OOM-killed.

@aucampia (Author)

Sorry, I accidentally closed this; please re-open it.

mmatur reopened this on Jan 16, 2025
emilevauge (Member) commented Jan 16, 2025

Here are the PRs merged between 2.11.16 and 2.11.18:

The most likely root cause is the bump of the golang.org/x/net library to v0.33.0. There is no release note associated with v0.33.0. The Go team released v0.34.0 on Jan 6; it might be worth a try.

In the meantime, can anyone provide a pprof profile of Traefik while facing the issue?
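
For anyone who can reproduce this: a minimal sketch of how such a profile could be captured, assuming the static configuration is passed as CLI flags like in the compose file above (host and port below are placeholders; adjust to how the API is exposed in your setup):

      - --api=true
      - --api.debug=true   # also exposes the Go /debug/pprof/* endpoints on api@internal

A heap profile can then be fetched with go tool pprof against http://<traefik-host>:<api-port>/debug/pprof/heap and attached to this issue.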

@HomoSapiens

We have the same problem. Memory consumption is much higher and keeps increasing.

rtribotte self-assigned this on Jan 20, 2025
mmatur self-assigned this on Jan 20, 2025
mmatur (Member) commented Jan 23, 2025

Hi,

We’ve built a test version of v2.11.18 with the x/net dependency downgraded to version v0.30.0. You can find the changes here.

A corresponding Docker image is available on Docker Hub.

Warning:
This is a test image intended solely for debugging purposes, to determine whether the upgrade of x/net to version v0.33.0 could be impacting memory consumption.

Let us know your findings!

RemiBou commented Jan 23, 2025

I saw this as well when upgrading from 2.11.16 to 2.11.17.

RemiBou commented Jan 23, 2025

A pprof profile shows heavy memory usage from logrus; maybe it's #11344.

(screenshot: pprof profile showing logrus allocations)

@avivklas

I think I found something. In #11344, a new logger object is initialized per request.

@emilevauge (Member)

Indeed, this is the issue:
https://github.com/traefik/traefik/pull/11344/files#diff-f7d7f0e8fef165ce3ca78be8f4d887b323d564a29b25d416a6a7d2b0e9ff7df7R86
ErrorLog: stdlog.New(log.WithoutContext().WriterLevel(logrus.DebugLevel), "", 0),
Thanks a lot @RemiBou for helping on this; we will be able to fix it ASAP.
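
For context, a minimal sketch (not Traefik's actual code) of why that line can leak: logrus's WriterLevel returns an *io.PipeWriter and starts a goroutine that scans the reader side of the pipe, so every writer that is created and never closed keeps that goroutine and its buffers alive.

package sketch

import (
	stdlog "log"

	"github.com/sirupsen/logrus"
)

// newErrorLog mirrors the suspected pattern: WriterLevel creates an io.Pipe
// and spawns a goroutine that forwards everything written to the pipe into
// logrus at debug level. The returned *io.PipeWriter must be closed for that
// goroutine to exit; creating one of these per proxy (or per configuration
// reload) without ever closing it accumulates goroutines and buffers.
func newErrorLog() *stdlog.Logger {
	w := logrus.StandardLogger().WriterLevel(logrus.DebugLevel)
	return stdlog.New(w, "", 0) // w is never closed
}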

nmengin added the priority/P0 and kind/bug/confirmed labels and removed the kind/bug/possible and contributor/waiting-for-feedback labels on Jan 23, 2025
RemiBou commented Jan 24, 2025

I think a close on the logger is missing, and it happens mostly when you have a lot of config changes.
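
One possible shape of such a fix, as a sketch only (the actual patch may differ): keep a handle on the pipe writer so it can be closed when the proxy that uses it is discarded after a configuration change.

package sketch

import (
	"io"
	stdlog "log"

	"github.com/sirupsen/logrus"
)

// errorLogger bundles the std-lib logger with the pipe writer that feeds
// logrus, so the writer can be closed when the owning proxy is discarded.
type errorLogger struct {
	*stdlog.Logger
	w io.Closer
}

func newErrorLogger() *errorLogger {
	w := logrus.StandardLogger().WriterLevel(logrus.DebugLevel)
	return &errorLogger{Logger: stdlog.New(w, "", 0), w: w}
}

// Close releases the pipe writer; the goroutine started by WriterLevel then
// sees EOF and exits instead of lingering after the configuration changes.
func (e *errorLogger) Close() error {
	return e.w.Close()
}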

@traefiker (Contributor)

Closed by #11487.
