I have a Postgres table with millions of records in it. I want to add a new column called "time_modified" to that table, populated with the value of another column, "last_event_time". Running a migration script takes a long time, so I need a simple solution that is safe to run in production.
2 Answers
Assuming that the columns are timestamps, you can try:
alter table my_table add time_modified text;
alter table my_table alter time_modified type timestamp using last_event_time;
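For comparison, the straightforward approach is a plain UPDATE after adding the column. It takes row locks on every updated row instead of rewriting the table under an exclusive lock (a sketch, using the table and column names from the question):

```sql
-- adding a nullable column without a default is fast (metadata-only)
alter table my_table add time_modified timestamp;

-- copying the values locks every updated row until the transaction commits
update my_table set time_modified = last_event_time;
```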
5 Comments
klin
It depends on server load and other clients. Once we obtain a lock the operation should be much faster than an update. I would do this on Sunday at 4 am ;)
Nagaraj Vittal
Thanks klin. This solution is better than an update, but my only concern is locking the table. Is there a better option?
klin
I don't think there is a better solution. You have to choose between locking all rows for write and an exclusive lock on the table, and it depends on the current server load. I think the key is to choose the right moment.
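One way to reduce the risk when picking the moment is to set a lock_timeout, so the ALTER gives up quickly instead of queueing behind long-running transactions and blocking everything queued after it (a sketch; the 5-second value is an arbitrary assumption):

```sql
begin;
-- abort the ALTER if the exclusive lock cannot be acquired within 5 seconds
set local lock_timeout = '5s';
alter table my_table alter time_modified type timestamp using last_event_time;
commit;
```

If the statement times out, nothing has changed and you can simply retry later.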
klin
If the server is permanently heavily loaded, you can perform updates in chunks. Of course, these partial updates have to be executed in their own transactions (not all in a single transaction).
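A chunked update could look like this (a sketch, assuming an integer primary key `id`; run each statement in its own transaction, then move on to the next range):

```sql
-- repeat with the next id range until the whole table is covered;
-- the "is null" check makes the statement safe to re-run
update my_table
   set time_modified = last_event_time
 where id >= 1 and id < 100001
   and time_modified is null;
```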
I suggest using a function with pg_sleep, which waits between iterations of the loop:
SELECT pg_sleep(seconds);
This way you avoid taking an exclusive lock on your_table, but the total execution time is long.
alter table your_table add time_modified timestamp;

CREATE OR REPLACE FUNCTION update_new_column()
RETURNS void AS
$BODY$
DECLARE
    rec record;
BEGIN
    -- update one row at a time, sleeping 10 ms between rows
    FOR rec IN (SELECT id, last_event_time FROM your_table) LOOP
        UPDATE your_table SET time_modified = rec.last_event_time WHERE id = rec.id;
        PERFORM pg_sleep(0.01);
    END LOOP;
END;
$BODY$
LANGUAGE plpgsql VOLATILE;

and execute the function:

select update_new_column();
Run the update only after you have added the column.