Data race detected in tests when using conn pool v5 #1356

Closed

pencilleaf opened this issue Oct 26, 2022 · 1 comment

pencilleaf commented Oct 26, 2022

I am facing some data race issues when moving from pgx/v4 to pgx/v5.

For context, I am using a wrapper that initializes a connection pool to a Postgres database via pgx with some default values, using pgxpool.ConnectConfig in v4 and now pgxpool.NewWithConfig in v5. The test cases for this wrapper fail when run with -race, which did not happen with pgx/v4. In most of these test cases, I create a new pgxpool and close it in each subtest. I can't share the wrapper here, but below is an example I rewrote without it, and it runs into the same issue. I'm using dockertest with the postgres:14 image.

package test

import (
	"context"
	"fmt"
	"net"
	"testing"
	"time"

	"github.com/jackc/pgx/v5/pgxpool"
)

// testHost and testPort are assumed to be set by the dockertest setup
// (e.g. in TestMain), which is not shown here.

func TestNewWithConfig(t *testing.T) {
	t.Parallel()

	var (
		defaultMaxConns     = 25
		defaultMaxIdleConns = 25
		defaultMaxLifeTime  = 5 * time.Minute
	)

	type args struct {
		User         string
		Password     string
		Host         string
		Port         string
		Database     string
		MaxConns     int32
		MaxIdleConns int32
		MaxLifeTime  time.Duration
	}

	tests := []struct {
		name string
		args args
	}{
		{
			name: "happy",
			args: args{
				Host:         testHost,
				Port:         testPort,
				User:         "postgres",
				Password:     "postgres",
				Database:     "datawarehouse",
				MaxConns:     int32(defaultMaxConns),
				MaxIdleConns: int32(defaultMaxIdleConns),
				MaxLifeTime:  defaultMaxLifeTime,
			},
		},
		{
			name: "happy 2",
			args: args{
				Host:         testHost,
				Port:         testPort,
				User:         "postgres",
				Password:     "postgres",
				Database:     "datawarehouse",
				MaxConns:     int32(defaultMaxConns),
				MaxIdleConns: int32(defaultMaxIdleConns),
				MaxLifeTime:  defaultMaxLifeTime,
			},
		},
		{
			name: "happy 3",
			args: args{
				Host:         testHost,
				Port:         testPort,
				User:         "postgres",
				Password:     "postgres",
				Database:     "datawarehouse",
				MaxConns:     int32(defaultMaxConns),
				MaxIdleConns: int32(defaultMaxIdleConns),
				MaxLifeTime:  defaultMaxLifeTime,
			},
		},
	}

	for _, tt := range tests {
		tt := tt
		t.Run(tt.name, func(t *testing.T) {
			t.Parallel()

			ctx := context.Background()

			databaseURL := fmt.Sprintf(
				"postgres://%s:%s@%s/%s",
				tt.args.User, tt.args.Password, net.JoinHostPort(tt.args.Host, tt.args.Port), tt.args.Database,
			)

			pgxConf, err := pgxpool.ParseConfig(databaseURL)
			if err != nil {
				t.Fatalf("parse config: %s", err)
			}

			pgxConf.MaxConns = tt.args.MaxConns
			pgxConf.MinConns = tt.args.MaxIdleConns
			pgxConf.MaxConnLifetime = tt.args.MaxLifeTime

			db, err := pgxpool.NewWithConfig(ctx, pgxConf)
			if err != nil {
				t.Fatalf("connect config: %s", err)
			}

			defer db.Close()

			if err := db.Ping(ctx); err != nil {
				t.Fatalf("failed to ping database: %s", err)
			}
		})
	}
}

When running this test with the -race flag, e.g. go test ./... -race, I sometimes get a data race:

WARNING: DATA RACE
Read at 0x00c000036cb8 by goroutine 88:
  runtime.raceread()
      <autogenerated>:1 +0x10
  github.com/jackc/puddle/v2.(*Pool[...]).CreateResource()
      /Users/claire.tan/go/pkg/mod/github.com/jackc/puddle/v2@v2.0.0/pool.go:514 +0x27c
  github.com/jackc/pgx/v5/pgxpool.(*Pool).createIdleResources.func1()
      /Users/claire.tan/go/pkg/mod/github.com/jackc/pgx/v5@v5.0.4/pgxpool/pool.go:476 +0x74

Previous write at 0x00c000036cb8 by goroutine 50:
  runtime.racewrite()
      <autogenerated>:1 +0x10
  github.com/jackc/puddle/v2.(*Pool[...]).Close()
      /Users/claire.tan/go/pkg/mod/github.com/jackc/puddle/v2@v2.0.0/pool.go:183 +0x440
  github.com/jackc/pgx/v5/pgxpool.(*Pool).Close.func1()
      /Users/claire.tan/go/pkg/mod/github.com/jackc/pgx/v5@v5.0.4/pgxpool/pool.go:362 +0x60
  sync.(*Once).doSlow()
      /opt/homebrew/Cellar/go/1.19.2/libexec/src/sync/once.go:74 +0xb0
  sync.(*Once).Do()
      /opt/homebrew/Cellar/go/1.19.2/libexec/src/sync/once.go:65 +0x40
  github.com/jackc/pgx/v5/pgxpool.(*Pool).Close()
      /Users/claire.tan/go/pkg/mod/github.com/jackc/pgx/v5@v5.0.4/pgxpool/pool.go:360 +0x48
  github.com/claire-tjq/test.TestNewWithConfig.func1.1()
      /Users/claire.tan/go/src/github.com/claire-tjq/test/pgxpool_test.go:199 +0x34
  runtime.deferreturn()
      /opt/homebrew/Cellar/go/1.19.2/libexec/src/runtime/panic.go:476 +0x30
  testing.tRunner()
      /opt/homebrew/Cellar/go/1.19.2/libexec/src/testing/testing.go:1446 +0x188
  testing.(*T).Run.func1()
      /opt/homebrew/Cellar/go/1.19.2/libexec/src/testing/testing.go:1493 +0x40

Goroutine 88 (running) created at:
  github.com/jackc/pgx/v5/pgxpool.(*Pool).createIdleResources()
      /Users/claire.tan/go/pkg/mod/github.com/jackc/pgx/v5@v5.0.4/pgxpool/pool.go:474 +0x9c
  github.com/jackc/pgx/v5/pgxpool.NewWithConfig.func3()
      /Users/claire.tan/go/pkg/mod/github.com/jackc/pgx/v5@v5.0.4/pgxpool/pool.go:249 +0x58

Goroutine 50 (finished) created at:
  testing.(*T).Run()
      /opt/homebrew/Cellar/go/1.19.2/libexec/src/testing/testing.go:1493 +0x55c
  github.com/claire-tjq/test.TestNewWithConfig()
      /Users/claire.tan/go/src/github.com/claire-tjq/test/pgxpool_test.go:175 +0x26c
  testing.tRunner()
      /opt/homebrew/Cellar/go/1.19.2/libexec/src/testing/testing.go:1446 +0x188
  testing.(*T).Run.func1()
      /opt/homebrew/Cellar/go/1.19.2/libexec/src/testing/testing.go:1493 +0x40
==================
--- FAIL: TestNewWithConfig (0.00s)
    --- FAIL: TestNewWithConfig/happy_2 (0.34s)
        testing.go:1319: race detected during execution of test

Version

go version go1.19.2 darwin/arm64
github.com/jackc/pgx/v5 v5.0.4

jackc added a commit to jackc/puddle that referenced this issue Oct 28, 2022
Otherwise, if many constructors are underway while the pool is closed it
is possible destructWG will get to 0 while the Wait in Close is underway
and then Add will be called again which is disallowed.

From the docs:

Note that calls with a positive delta that occur when the counter is
zero must happen before a Wait.

jackc/pgx#1356
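
To make the quoted rule concrete, here is a minimal standalone sketch of the disallowed pattern (my own illustration, not puddle's code): the counter reaches zero while a Wait is in flight and is then re-armed with a positive Add.

package main

import (
	"sync"
	"time"
)

func main() {
	var wg sync.WaitGroup

	wg.Add(1)
	go func() {
		wg.Done() // counter drops to 0 while main may already be blocked in Wait
		wg.Add(1) // positive Add from zero, concurrent with Wait: disallowed
		time.Sleep(10 * time.Millisecond)
		wg.Done()
	}()

	wg.Wait()
}

Depending on timing, this either panics with "sync: WaitGroup misuse: Add called concurrently with Wait" or is reported by -race, which is the failure mode the commit message describes.
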
jackc commented Oct 28, 2022

The problem was in-progress connections while the pool was being closed. It could cause a WaitGroup to go to 0 and back to 1 during a Wait() call.

Fixed in jackc/puddle@d344a18.
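
For illustration only, here is a hedged sketch of one way to uphold that rule (assumed names, not the actual puddle change): make the closed check and the Add atomic with respect to Close, so the counter can never be re-armed once Close has started its Wait.

package pool

import "sync"

type pool struct {
	mux        sync.Mutex
	closed     bool
	destructWG sync.WaitGroup
}

// beginConstruct registers an in-progress constructor, refusing once the
// pool is closed.
func (p *pool) beginConstruct() bool {
	p.mux.Lock()
	defer p.mux.Unlock()
	if p.closed {
		return false // Close may already be in Wait; must not Add
	}
	p.destructWG.Add(1) // ordered before Close's Wait by the mutex
	return true
}

// Close marks the pool closed, then waits for all registered constructors.
func (p *pool) Close() {
	p.mux.Lock()
	p.closed = true
	p.mux.Unlock()
	p.destructWG.Wait() // every successful Add happened before this Wait
}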

jackc closed this as completed Oct 28, 2022