
I'm working with a data set that varies in size from very small to as large as hundreds of millions of elements.

When working with a contiguous data set, is there any difference in functionality or performance between assigning a new value to a pointer and using pointer arithmetic to advance to my desired location?

For example, when progressing to the next element of the data, I could simply increment my pointer by one, or assign my working pointer that element's memory address (assuming I already had it for whatever reason).

I'm operating under Windows, compiling with Visual Studio 2012.

  • Is there a predictable pattern in which you will be accessing members of this data structure, or is it going to be completely random access? Commented Mar 19, 2013 at 2:51
  • It is split into blocks of predictable data, but the distances between the blocks are not always the same. So, a little bit of both. Commented Mar 19, 2013 at 2:53

1 Answer

As for performance: according to a recent talk by Andrei Alexandrescu (see this link, which points to a video of the talk), you should prefer indexing into an array over pointer arithmetic for contiguous accesses on modern machines.

However, there is one timeless rule for optimization: measure it! :)

Without more information, I have nothing to say regarding differences in functionality other than "no".
