I came across this question because I wanted to do the same thing.
Mine was an X/Y problem, however: I was implementing a function and thought that I wanted to convert the input array of bits into an integer.
What I really wanted to do was to use an IntFlag from the standard library's enum module instead of my custom implementation.
Here's the standard example from the documentation:
>>> from enum import IntFlag
>>> class Perm(IntFlag):
...     R = 4
...     W = 2
...     X = 1
...
>>> Perm.R | Perm.W
<Perm.R|W: 6>
>>> Perm.R + Perm.W
6
>>> RW = Perm.R | Perm.W
>>> Perm.R in RW
True
It also works the other way:
>>> Perm(7)
<Perm.R|W|X: 7>
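As a side note: on Python 3.11+ you can even iterate over a composite value to recover the individual flags (earlier versions don't support iterating members, so treat this as version-dependent):
>>> list(Perm(7))  # Python 3.11+ only; decomposition order may differ
[<Perm.X: 1>, <Perm.W: 2>, <Perm.R: 4>]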
My use case was for a bitmask:
class BitMask(IntFlag):
    flag1 = 2**0
    flag2 = 2**1
    flag3 = 2**2
    flag4 = 2**3
    flag5 = 2**4
    # etcetera
With an IntFlag I can get the integer value while still retaining the useful information about what each bit represents. For example, 11101001 in binary is 233 in decimal.
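To tie this back to my original X/Y problem, here is a minimal sketch of turning an input array of bits straight into a BitMask; the bits_to_mask helper and the least-significant-bit-first input order are my own assumptions, not part of the enum API:
from enum import IntFlag

class BitMask(IntFlag):
    flag1 = 2**0
    flag2 = 2**1
    flag3 = 2**2
    flag4 = 2**3
    flag5 = 2**4

def bits_to_mask(bits):
    # Fold a sequence of 0/1 bits into a single integer value,
    # assuming the first element is the least significant bit.
    value = 0
    for position, bit in enumerate(bits):
        if bit:
            value |= 1 << position
    return BitMask(value)

mask = bits_to_mask([1, 0, 0, 1, 1])
print(int(mask))              # 25
print(BitMask.flag4 in mask)  # True
The result behaves like a plain int wherever one is needed, but it still prints with the flag names (the exact repr varies between Python versions), so nothing is lost by collapsing the bit array into a single value.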