I'm migrating from AWS Aurora PG to Heroku PG and I'm currently refactoring my queries. Using the Node-Postgres library in my Serverless Framework API, I now have to manage opening and closing the connection, something I didn't have to do with the AWS-SDK.
I am following this article to set up my queries. Taken from the article:
const pg = require('pg');

let pgPool;

const setupPgPool = () => {
  // pgConfig is the host/user/password/database object
  pgPool = new pg.Pool(pgConfig);
};

module.exports.hello = async () => {
  if (!pgPool) {
    // "Cold start". Get Heroku Postgres creds and create connection pool.
    await setupPgPool();
  }
  // Else, backing container "warm". Use existing connection pool.
  try {
    const result = await pgPool.query('SELECT now()');
    // Response body must be JSON.
    return {
      statusCode: 200,
      body: JSON.stringify({
        output: {
          currTimePg: result.rows[0].now,
        },
      }),
    };
  } catch (e) {
    // Return error message in response body for easy debugging.
    // INSECURE - CHANGE FOR PROD
    return {
      statusCode: 500,
      body: JSON.stringify({
        error: e.message,
      }),
    };
  }
};
I'm trying to solve the issue of where to put the pgPool. My folder structure is like so:
src/
  controllers/
    auth/
      createUser.js
  models/
    authModel.js
    userModel.js
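One common answer to the "where does pgPool live" question is a small shared module (a hypothetical src/db.js, not something from the article) that lazily creates the pool on first use and hands the same instance to every model. Here is a minimal sketch of that memoization pattern; makePool stands in for `() => new pg.Pool(pgConfig)` and is an assumption:

```javascript
// db.js (hypothetical): lazily create one pool per process and reuse it.
// `makePool` is a stand-in for `() => new pg.Pool(pgConfig)`.
const createLazyPool = (makePool) => {
  let pool; // cached for the lifetime of the warm container

  return () => {
    if (!pool) {
      pool = makePool(); // "cold start": first call creates the pool
    }
    return pool; // every later call reuses the same instance
  };
};

module.exports = { createLazyPool };
```

Models would then call the returned getter instead of each keeping their own `let pgPool`, so a single warm container shares one pool across authModel and userModel.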
To simplify, let's say my createUser handler looks like this:
module.exports = (App) => {
  App.controllers.createAuth = async (event, context, cb) => {
    const body = JSON.parse(event.body)
    const { email, password } = body
    try {
      const hash = createHash(password)
      // Auth is a class that contains all my auth table read/write functions to the db
      const Auth = new App.Auth()
      const authRecordId = await Auth.createAuth(email, hash)
      // User is a class that contains all my user table read/write functions to the db
      const User = new App.User()
      await User.createUser(authRecordId)
      const token = createToken(authRecordId)
      return cb(null, utils.res(200, { token }))
    } catch (error) {
      return cb(null, utils.res(500, { message: 'There was an error. Please try again' }))
    }
  }
}
Inside my authModel I have:
let pgPool

const setupPgPool = () => {
  // pgConfig is the host/user/password/database object
  pgPool = new pg.Pool(pgConfig)
}

Auth.prototype.createAuth = async (email, hash) => {
  const sql = `
    INSERT INTO auth (email, hash)
    VALUES (lower($1), $2)
    RETURNING id;
  `
  const values = [ email, hash ]
  try {
    if (!pgPool) {
      await setupPgPool()
    }
    const { rows } = await pgPool.query(sql, values)
    await pgPool.end()
    return rows[0].id
  } catch (err) {
    throw err
  }
}
The userModel query looks very similar (set up pgPool, run the query, end the connection, return the data). The issue is that pgPool never survives between invocations, so I always have to await setupPgPool before running a query. I also need to run await pgPool.end(), otherwise my queries hang. Is this really the best practice for a Serverless Framework API, as the article suggests?
Lastly, should I instead open the connection in my handler and pass pgPool as an argument to my models? That way, if I have to make more than one query, I wouldn't have to await setupPgPool every time I read from or write to the database.
EDIT: I was refactoring my first handler as I was writing this. After implementing the pattern from the article, I am now running into this error: Cannot use a pool after calling end on the pool. It seems to be because I am closing the connection.
EDIT 2: I commented out the await pgPool.end() lines and now my calls hang again... not really sure what to do.
Comment: Use finally to release the client back to the pool, and only end() the pool when the container/process/etc. is shut down, e.g. on a process signal. And you should configure the pool size according to the scaling behaviour: 1 would suffice, as one Node.js process never needs to use two separate Postgres clients. Acquiring more for some reason would be rather inefficient. (Disclaimer - this might have changed since I last researched the topic.) But this is probably not very relevant until you really need to scale up…
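The pattern that comment describes can be sketched as follows: check a client out of the pool per unit of work, always release it in a finally block, and only end() the pool on a shutdown signal. The helper names are hypothetical, and the pool here would be the real pg.Pool instance in your app:

```javascript
// Run work against one checked-out client and ALWAYS release it back to
// the pool in `finally`, instead of calling pool.end() after each query.
const withClient = async (pool, fn) => {
  const client = await pool.connect();
  try {
    return await fn(client);
  } finally {
    client.release(); // back to the pool; the pool itself stays open
  }
};

// End the pool only once, when the whole process is shutting down.
const installShutdown = (pool) => {
  process.once('SIGTERM', () => pool.end());
};

module.exports = { withClient, installShutdown };
```

A model method would then read roughly as `withClient(pool, (c) => c.query(sql, values))`, which avoids both failure modes from the edits: the pool is never ended mid-lifetime (no "Cannot use a pool after calling end" error) and clients are always returned (no exhausted pool hanging later queries).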