I'm working with a data set that varies in size from potentially very small to as large as hundreds of millions of elements.
When working with a contiguous data set, is there any difference in functionality or performance between assigning a new value to a pointer and using pointer arithmetic to advance to my desired location?
For example, to progress to the next element of the data, I could simply increment my pointer by one, or I could assign my working pointer that element's address directly (assuming I already had the address for whatever reason).
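To make the comparison concrete, here's a minimal sketch of the two approaches I mean (the names `data`, `p`, and `target` are just placeholders for illustration):

    int data[8] = { 0, 1, 2, 3, 4, 5, 6, 7 }; // stand-in for the real data set
    int* p = data;

    // Approach 1: pointer arithmetic -- step to the next element
    ++p;

    // Approach 2: direct assignment -- I already have the address I want
    int* target = &data[1];
    p = target;

Both leave `p` pointing at the same element; my question is whether the compiler treats them differently in any way that matters at scale.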
I'm operating under Windows, using Visual Studio 2012 as the compiler.