When Flash memory is used to store deep neural network weights, inference accuracy can degrade due to Flash memory state errors. To avoid this accuracy loss, recent studies have relied on error correction codes (ECC) or parity bits, which incur power and storage overhead. In this study, we propose a weight-bit inversion method that reduces Flash memory state errors without using ECC or parity. The method first applies WISE (Weight-bit Inversion for State Elimination), which eliminates the most error-prone state of MLC NAND, improving both error robustness and MSB-page read speed. If the accuracy loss caused by WISE's weight inversion is unacceptable, we apply WISER (Weight-bit Inversion for State Error Reduction), which reduces the number of weights mapped to the error-prone state while minimizing changes to the weight values. Simulation results show that, after 16K program/erase cycles of NAND Flash, WISER reduces the CIFAR-100 accuracy loss by 1.33X for LeNet-5 and by 2.92X for VGG-16 compared to previous methods.
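To illustrate the general idea of selective weight-bit inversion, the minimal sketch below assumes 8-bit quantized weights stored across four 2-bit MLC cells and a hypothetical error-prone cell pattern of `0b11`; it inverts a weight only when doing so lowers the number of error-prone cells. This is an assumed, simplified illustration, not the exact WISE/WISER mapping described in the paper, and the per-weight inversion flag is an assumption of this sketch.

```python
import numpy as np

# Assumption for illustration: each 8-bit weight occupies four 2-bit MLC cells,
# and the cell bit pattern below is the most error-prone state.
ERROR_PRONE = 0b11

def count_error_prone_cells(byte_val):
    """Count the 2-bit cell patterns in one byte that map to the error-prone state."""
    return sum(((byte_val >> shift) & 0b11) == ERROR_PRONE for shift in (0, 2, 4, 6))

def invert_if_beneficial(byte_val):
    """Invert all bits of a weight byte when that lowers the number of
    error-prone cell states; return (stored_byte, inverted_flag)."""
    inverted = (~byte_val) & 0xFF
    if count_error_prone_cells(inverted) < count_error_prone_cells(byte_val):
        return inverted, True
    return byte_val, False

# Example: quantized weights as unsigned 8-bit values.
weights = np.array([0b11111100, 0b00010010, 0b11011101], dtype=np.uint8)
stored = [invert_if_beneficial(int(w)) for w in weights]
print(stored)  # [(3, True), (18, False), (34, True)]
```

In this toy version, the inversion flag would have to be stored and applied when the weight is read back; how that flag is kept without incurring the parity-style overhead the paper aims to avoid is exactly the kind of design question the proposed WISE/WISER schemes address.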