I have this list:
bytes = ['11010001', '00100111']
I want to write the contents of bytes to my own binary file, one byte per element. So I iterate over the list, convert each element from its binary string representation to an integer, and write it to the file as the single character with that value.
output = open(location + filename + '.enchuff', 'wb')
for byte in bytes:
    chunk = int(byte, base=2)    # parse the 8-character binary string into an int
    output.write(chr(chunk))     # write it as a single character
output.close()
It works well, but the problem starts when the bytes list gets big. I generate it from another file, and when I feed in, say, a 100 MB file, the list gets REALLY long and my program hangs on the for loop. I guess the loop is the problem, since it probably iterates over hundreds of thousands of elements and writes every single one of them individually. My memory consumption also jumps at that point to as much as 4 GB of RAM. Is there any other way to do this faster while preserving precious RAM?
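In case it clarifies what I'm after, here is a rough sketch of the kind of batched approach I was wondering about (untested on large inputs; the bytearray buffering and the single write call are just my guess at what might help, not something I've benchmarked):

# Sketch: accumulate the byte values in a bytearray and write them in one call,
# instead of calling write() once per element.
out_bytes = bytearray()
for byte in bytes:
    out_bytes.append(int(byte, base=2))   # each 8-character binary string -> one value 0..255

with open(location + filename + '.enchuff', 'wb') as output:
    output.write(out_bytes)               # single write of the whole buffer

I don't know whether building the whole bytearray up front is actually any better for RAM, which is part of what I'm asking.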