2

I have a MySQL database that I use only for logging. It consists of several simple, nearly identical MyISAM tables. There is always one local client (i.e. located on the same machine) that only writes data to the database, and several remote clients that only read data.

What I need is to insert bulk data from the local client as fast as possible.

I have already tried many approaches to make this faster, such as reducing the number of INSERT statements by increasing the length of the VALUES list, using LOAD DATA .. INFILE, and others.

Now it seems to me that I've hit the limit imposed by parsing values from strings into their target data types (it doesn't matter whether that parsing happens in a query or in a text file).

So the question is:

does MySQL provide some means of manipulating data directly for local clients (i.e. not using SQL)? Maybe there is some API that allows inserting data by simply passing a pointer.

Once again: I don't want to optimize SQL code or invoke the same queries in a script as hd1 advised. What I want is to pass a buffer of data directly to the database engine. This means I don't want to invoke SQL at all. Is that possible?

3
  • Where do you want to pass a pointer? Commented May 28, 2013 at 6:57
  • I mean that I would like some function like void insert(char* tableName, void* data, int dataLen), where data would contain the column values. That way I could insert data into a table and avoid using SQL (i.e. parsing). Or something like this. Commented May 28, 2013 at 11:55
  • Here's a comment for everyone who is advising me to use one big insert, to lock/unlock tables, or other SQL optimizations. I've already tried all of these approaches. I'm not interested in optimizing SQL. What I want is to pass data directly to the database engine using some API, by passing a buffer to it. Commented May 28, 2013 at 12:07

8 Answers

2

Use MySQL's LOAD DATA command:

Write the data to a file in CSV format, then execute this statement:

LOAD DATA INFILE 'somefile.csv' INTO TABLE mytable

For more info, see the documentation
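Since the writer in the question is a local C-style client, a rough sketch of that workflow with the MySQL C API could look like the following. The file path, table name and columns are invented for illustration, and server-side LOAD DATA INFILE also requires the FILE privilege and a path allowed by secure_file_priv:

/* Sketch: dump one batch to CSV, then load it with a single statement. */
#include <mysql.h>
#include <stdio.h>

int bulk_load(MYSQL *conn)                 /* conn: an already-connected handle */
{
    FILE *f = fopen("/tmp/log_batch.csv", "w");   /* hypothetical path */
    if (!f)
        return -1;
    for (int i = 0; i < 1000; i++)                /* hypothetical sample rows */
        fprintf(f, "%d,%f\n", i, i * 0.5);
    fclose(f);

    /* Client and server share the machine, so the server can read the
       file directly; no LOCAL keyword is needed. */
    const char *sql =
        "LOAD DATA INFILE '/tmp/log_batch.csv' INTO TABLE log_table "
        "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'";
    return mysql_query(conn, sql);
}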


2 Comments

I have tried it, but I think there's still some overhead from parsing the data in the text file. This approach is really fast, but not much faster than executing one big insert with the same data as in the CSV file. Is there a way of passing data as-is? Maybe some tweaks to the MySQL source, or anything like that.
The LOAD command is way faster than one big insert. And no, there is no "direct inject" method. This is the closest thing to what you want. It's what I would use if I needed something like this.
1

Other than LOAD DATA INFILE, I'm not sure there is any other way to get data into MySQL without using SQL. If you want to avoid parsing multiple times, you should use a client library that supports parameter binding; the query can be parsed and prepared once and then executed multiple times with different data.

However, I highly doubt that parsing the query is your bottleneck. Is this a dedicated database server? What kind of hard disks are being used? Are they fast? Does your RAID controller have battery backed RAM? If so, you can optimize disk writes. Why aren't you using InnoDB instead of MyISAM?
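As a rough sketch of what that parameter binding looks like with the MySQL C client library (the connection details, table and column names below are invented for illustration): the statement is parsed once, and each execute ships the bound C variables over the binary protocol instead of re-sending text values:

/* Sketch: prepare once, bind native C buffers, execute many times. */
#include <mysql.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    MYSQL *conn = mysql_init(NULL);
    if (!mysql_real_connect(conn, "localhost", "user", "pass", "logdb", 0, NULL, 0)) {
        fprintf(stderr, "connect: %s\n", mysql_error(conn));
        return 1;
    }

    MYSQL_STMT *stmt = mysql_stmt_init(conn);
    const char *sql = "INSERT INTO log_table (ts, value) VALUES (?, ?)";
    if (mysql_stmt_prepare(stmt, sql, strlen(sql)) != 0) {   /* parsed once */
        fprintf(stderr, "prepare: %s\n", mysql_stmt_error(stmt));
        return 1;
    }

    long long ts = 0;
    double value = 0.0;

    MYSQL_BIND bind[2];
    memset(bind, 0, sizeof(bind));
    bind[0].buffer_type = MYSQL_TYPE_LONGLONG;   /* raw C buffers, no text */
    bind[0].buffer      = &ts;
    bind[1].buffer_type = MYSQL_TYPE_DOUBLE;
    bind[1].buffer      = &value;
    mysql_stmt_bind_param(stmt, bind);

    for (int i = 0; i < 1000; i++) {             /* hypothetical sample rows */
        ts    = i;
        value = i * 0.5;
        if (mysql_stmt_execute(stmt) != 0) {     /* values sent in binary form */
            fprintf(stderr, "execute: %s\n", mysql_stmt_error(stmt));
            return 1;
        }
    }

    mysql_stmt_close(stmt);
    mysql_close(conn);
    return 0;
}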

5 Comments

Binding parameters sounds very interesting. I'll try it.
As for the database server, it's a somewhat exotic application of MySQL. I have a PLC (Programmable Logic Controller) that has to log some data on itself (to a MySQL db) and then pass it further to some log servers. This PLC has SSD drives, so they should be very fast, yet they are quite small. That's also why I use MyISAM: I can then pack the db files using myisampack.
Not all SSD drives are fast at writing data. You can also try using INSERT DELAYED: dev.mysql.com/doc/refman/5.5/en/insert-delayed.html. The docs say it is useful for logging situations.
Well, I'm pretty sure that writing data to disk is not the bottleneck, because according to the profiling information the disk driver uses much less CPU time than the mysql daemon.
Disk drivers shouldn't use a lot of CPU. The disk itself would be the bottleneck because it is much much slower than memory.
0

With MySQL you can insert multiple tuples with one insert statement. I don't have an example, because I did this several years ago and don't have the source anymore.

Comments

0

Consider, as mentioned, using one INSERT with multiple values:

INSERT INTO table_name (col1, col2) VALUES (1, 'A'), (2, 'B'), (3, 'C'), ( ... )

This means you only have to hit your database with one bigger query instead of several smaller ones. It's easier to carry the entire couch through the door once than to run back and forth with all the disassembled pieces, opening the door every time. :)

Apart from that, you can also run LOCK TABLES table_name WRITE before the INSERT and UNLOCK TABLES afterwards. That ensures that nothing else is inserted in the meantime.

Lock tables
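If the writer is a C client, a minimal sketch combining both suggestions (reusing the hypothetical table_name/col1/col2 from above) could be:

/* Sketch: one multi-row INSERT wrapped in LOCK TABLES / UNLOCK TABLES. */
#include <mysql.h>

int locked_batch_insert(MYSQL *conn)        /* conn: an already-connected handle */
{
    if (mysql_query(conn, "LOCK TABLES table_name WRITE") != 0)
        return -1;

    int rc = mysql_query(conn,
        "INSERT INTO table_name (col1, col2) VALUES (1, 'A'), (2, 'B'), (3, 'C')");

    mysql_query(conn, "UNLOCK TABLES");     /* always release the lock */
    return rc;
}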

Comments

0

INSERT into foo (foocol1, foocol2) VALUES ('foocol1val1', 'foocol2val1'),('foocol1val2','foocol2val2') and so on should sort you out. More information and sample code can be found here. If you have further problems, do leave a comment.

UPDATE

If you don't want to use SQL directly, then try this shell script to do as many inserts as you want. Put it in a file, say insertToDb.sh, and get on with your day/evening:

#!/bin/sh
mysql --user=me --password=foo dbname -h foo.example.com -e "insert into tablename (col1, col2) values ($1, $2);"

Invoke as sh insertToDb.sh col1value col2value. If I've still misunderstood your question, leave another comment.

1 Comment

As I already said, I am already using this approach. The question is not how to optimize the query, but how to insert data without using queries at all.
0

After some investigation, I found no way of passing data directly to the MySQL database engine (i.e. without parsing it).

My aim was to speed up communication between the local client and the db server as much as possible. The idea was that if the client is local, it could use some API functions to pass data to the db engine, thus avoiding SQL (i.e. parsing) and the values embedded in it. The closest solution was proposed by bobwienholt (using a prepared statement and binding parameters), but LOAD DATA .. INFILE turned out to be a bit faster in my case.

Comments

-1

The best way to insert data into MS SQL without using INSERT INTO or UPDATE queries is just to use the MS SQL interface. Right-click on the table name and select "Edit top 200 rows". Then you will be able to add data to the database directly by typing into each cell. To enable searching, or to use SELECT or other SQL commands, just right-click on any of the 200 rows you have selected, go to Pane, then select SQL, and you can add an SQL command. Check it out. :D

1 Comment

MS SQL is not MySQL, which is what OP was asking about
-1

Without using an insert statement, use "Sqllite Studio" for inserting data into MySQL. It's free and open source, so you can download it and check.

Comments
