
I'm working on adding PostgreSQL support to an existing application that currently uses Microsoft SQL Server. So far I have converted all tables and stored procedures to PL/pgSQL.

I want to test the results on both DBs and ensure that they match. I've tested the tables using a linked server and triggers on PostgreSQL. However, this is repetitive and will continue to be when I test my 140 stored procedures and PostgreSQL functions.

Is there an easier way to compare the results, and also the table changes made, to ensure both produce the same output? Any insight would be helpful.

2 Answers


To test a migrated DB you need to perform the same operations on both databases and then compare the data between the two databases table by table.
Search the internet for data-comparison tools that can automate some of the data comparisons between SQL Server and PostgreSQL. A bit of Googling turned up this: https://dbconvert.com/mssql/postgresql/.
I would also recommend running traces that record all SP/batch executions and the parameters used while running your test cases, to make it easier to find and reproduce problems.
P.S. I don't know PostgreSQL well enough to recommend specific actions for it, but I do know a bit about DB migration.
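
As a rough illustration of the "run the same operations, then compare" idea, here is a minimal sketch of a harness that calls a SQL Server stored procedure and its PL/pgSQL counterpart with the same parameters and compares the result sets. It assumes the pyodbc and psycopg2 drivers; the connection strings, procedure/function names and parameters are placeholders, and column types that differ between drivers (decimals, dates) may need normalising before the comparison is meaningful.

    import pyodbc
    import psycopg2

    # Placeholder connection strings -- replace with your own.
    MSSQL_DSN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=mssql_host;DATABASE=mydb;UID=user;PWD=secret"
    PG_DSN = "host=pg_host dbname=mydb user=user password=secret"

    def fetch_mssql(proc_name, params):
        # Run a stored procedure on SQL Server and return its rows.
        with pyodbc.connect(MSSQL_DSN) as conn:
            cur = conn.cursor()
            placeholders = ", ".join("?" for _ in params)
            cur.execute(f"EXEC {proc_name} {placeholders}", params)
            return sorted((tuple(r) for r in cur.fetchall()), key=str)

    def fetch_pg(func_name, params):
        # Run the equivalent PL/pgSQL function on PostgreSQL and return its rows.
        with psycopg2.connect(PG_DSN) as conn:
            cur = conn.cursor()
            placeholders = ", ".join("%s" for _ in params)
            cur.execute(f"SELECT * FROM {func_name}({placeholders})", params)
            return sorted((tuple(r) for r in cur.fetchall()), key=str)

    # Hypothetical test cases: (SQL Server procedure, PostgreSQL function, parameters).
    cases = [
        ("dbo.GetCustomerOrders", "get_customer_orders", (42,)),
        ("dbo.GetInvoiceTotal", "get_invoice_total", (1001,)),
    ]

    for proc, func, params in cases:
        if fetch_mssql(proc, params) != fetch_pg(func, params):
            print(f"MISMATCH: {proc} vs {func} with params {params}")
        else:
            print(f"OK: {proc}")

With 140 procedures, driving the case list from a file (or from the trace output mentioned above) keeps the test runs repeatable.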


One way to achieve this would be to create scripts that dump each table in both databases to a CSV file (sorted the same way in both databases), then combine all the CSV files into one (e.g. by file/table name, alphabetically), and finally compare the file produced from one database against the file produced from the other using diff or vimdiff.
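
For example, a minimal sketch of this dump-and-diff approach, assuming the pyodbc and psycopg2 drivers; the connection strings, table names and key columns are placeholders. Sorting by a numeric key avoids collation differences between the two engines, and driver-level string formatting of dates and decimals can differ, so some normalisation may be needed before the files compare cleanly.

    import csv
    import filecmp
    import pyodbc
    import psycopg2

    # Placeholder connection strings -- replace with your own.
    MSSQL_DSN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=mssql_host;DATABASE=mydb;UID=user;PWD=secret"
    PG_DSN = "host=pg_host dbname=mydb user=user password=secret"

    # Illustrative tables and the column(s) that give a deterministic sort order.
    TABLES = {"customers": "customer_id", "orders": "order_id"}

    def dump(cursor, table, order_by, out_path):
        # Write one table to CSV, sorted so both dumps line up row for row.
        cursor.execute(f"SELECT * FROM {table} ORDER BY {order_by}")
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            for row in cursor.fetchall():
                writer.writerow(["" if v is None else str(v) for v in row])

    with pyodbc.connect(MSSQL_DSN) as ms, psycopg2.connect(PG_DSN) as pg:
        ms_cur, pg_cur = ms.cursor(), pg.cursor()
        for table, key in TABLES.items():
            dump(ms_cur, table, key, f"{table}_mssql.csv")
            dump(pg_cur, table, key, f"{table}_pg.csv")
            same = filecmp.cmp(f"{table}_mssql.csv", f"{table}_pg.csv", shallow=False)
            print(f"{table}: {'match' if same else 'DIFFERENT -- diff the two CSV files'}")

When a table reports DIFFERENT, running diff or vimdiff on the two CSV files shows exactly which rows disagree.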

Comments

And what if his database is more than 100MB in size?
@Alex perhaps you mean 100GB? 100MB is nothing by today's standards. vimdiff will handle a 100MB file without breaking a sweat.
And you're going to scroll up and down, left and right, trying to find the ID of the row and the name of the table? What if you have dozens of subtle differences in strings? IMO this is only one step up from manually eyeballing data cell by cell between two tables.
Just to clarify, I am not talking about the technical limitations of opening 100MB text files, but the usability of comparing the differences.
@Alex I use methods like this all the time and find it very convenient for debugging, since it is scriptable and reportable. I have a legacy database in the process of migrating to PostgreSQL and need to compare them a few times a day (and it is much bigger than 100MB); this is one of the methods I use, and I get a report by email whenever there's a sync failure.