Welp, I am at a loss. I took readings at 10A, then went to 200mA. I'm not sure if this means 250mA or 15mA. Either I'm good, or I have one hell of a draw.
What's the connection to the battery?
All you did was move the red lead from "10A" to "VΩ" and nothing else?
Your actual current is closer to 15.1mA than 250mA.
To understand why requires being an EE...
If you're reading this, then OK.
The 10A scale on this meter apparently doesn't read very low values correctly. It's specified at ±(2% of reading + 5 counts) with 10mA resolution, which means at full scale the error could be as much as 250mA. Since that's about what it's showing for a very low actual current, it's essentially just displaying at its lower limit as soon as it sees any current flow. It would probably show "0.25" until you got higher than ~375mA on the 10A range (normally you'd expect it to read zero below its minimum resolution). Above that point I'd expect readings to fall within the ±2%-ish accuracy.
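To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python, taking the spec above at face value (not anything pulled from the meter's datasheet):

```python
def worst_case_error_mA(reading_mA, pct, counts, resolution_mA):
    """Worst-case DMM error per the usual +/-(% of reading + counts) spec."""
    return (pct / 100.0) * reading_mA + counts * resolution_mA

# 10A range: +/-(2% of reading + 5 counts), 10 mA per count
print(worst_case_error_mA(10000, 2, 5, 10))  # 250.0 mA at full scale
print(worst_case_error_mA(15, 2, 5, 10))     # 50.3 mA even with only ~15 mA flowing
```

In other words, the allowed error on the 10A range is several times bigger than the current you're actually trying to measure, so whatever it displays down there is noise.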
Edit: Now that I think about it, I notice you may have the meter's polarity reversed. That could trip up the ADC into rolling over. If you reverse the leads, I suspect the reading will fall into line. On 10A it might show anywhere from 10mA up to as high as 30mA, depending on how the error is handled.
When you switch to the 200mA range the accuracy is ±(1% of reading + 5 counts) with 100μA (0.1mA) resolution, so an indication of 15.1mA could be off by roughly 0.65mA either way (1% of 15.1mA is about 0.15mA, plus 5 counts at 0.1mA is another 0.5mA).
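Same arithmetic for the 200mA range, again just a sketch using the spec as quoted:

```python
# 200mA range: +/-(1% of reading + 5 counts), 0.1 mA per count
reading_mA = 15.1
err_mA = 0.01 * reading_mA + 5 * 0.1           # 0.151 + 0.5 = 0.651 mA
print(f"{reading_mA} mA +/- {err_mA:.2f} mA")  # roughly 14.4 to 15.8 mA
```

So on the correct range the reading is good to better than a milliamp, which is plenty for deciding whether you actually have a draw.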
To go any further would require explanation of analog-to-digital conversion, e.g. ADC bit-width (this appears to be a 10-bit ADC), signal-to-noise, sampling, binary math and display issues. Suffice to say this meter's guts are not exactly pushing any performance envelopes.
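For what it's worth, the 10-bit guess is consistent with the 10mA resolution above. Assuming (purely an assumption) a 10-bit converter spanning the full 10A range:

```python
# Sanity check of the 10-bit ADC guess (an assumption, not a datasheet figure)
full_scale_A = 10.0
steps = 2 ** 10                      # 1024 codes from a 10-bit converter
lsb_mA = full_scale_A / steps * 1000
print(f"1 LSB ~= {lsb_mA:.1f} mA")   # ~9.8 mA, i.e. about the 10 mA resolution quoted above
```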