I have read from different sources that the size of an integer variable is unbounded in Python and that it grows with the size of the integer itself. I have a few questions related to code that I am working on in this context.
- Is there a lower bound? Or does it literally start at 1 byte and grow with the size of the integer?
- Can a lower bound, or any bound for that matter, be applied to integer variables in Python?
- If I have an array of integers, is it likely that each index holds a different number of bytes depending on the integer it contains, or does Python guarantee a uniform size in arrays?
What I am trying to do is calculate memory bandwidth by noting the time taken to sum up a really huge array, then using that time and the size of the array to roughly estimate the bandwidth. To do this I need to know how many bytes are read from memory, and if the elements are not uniform in size it is infeasible to check each individual index, as the array has around 10M elements. Any alternate suggestions?
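For reference, this is a rough sketch of the kind of measurement I have in mind, assuming a NumPy array of fixed-width `int64` elements (the dtype and array size here are just placeholders, not my actual code):

```python
import time
import numpy as np

N = 10_000_000  # roughly the array size I am working with

# NumPy stores elements as fixed-width machine integers (8 bytes each for
# int64), so the total number of bytes read is simply N * itemsize.
a = np.ones(N, dtype=np.int64)
nbytes = a.nbytes  # N * 8

start = time.perf_counter()
total = a.sum()
elapsed = time.perf_counter() - start

bandwidth_gb_s = nbytes / elapsed / 1e9
print(f"summed {nbytes} bytes in {elapsed:.4f} s -> ~{bandwidth_gb_s:.2f} GB/s")
```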
1. `sys.getsizeof(0)` gives the lower bound. 2. I don't understand the question. 3. Arrays hold references to integer objects, which vary in size.
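For example, on CPython the object size grows with the magnitude of the value (the exact byte counts depend on the build and version):

```python
import sys

# Every int carries fixed object overhead; the size then grows as the value
# needs more internal "digits" (a CPython implementation detail).
for value in (0, 1, 2**30, 2**60, 2**1000):
    print(value.bit_length(), sys.getsizeof(value))
```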