First you need to cancel out the contact and lead resistance of the ohmmeter connections: touch both probes to ONE lead of the resistor to get a baseline reading, then move one probe to the other lead of the test resistor to get the full reading across the part, then subtract the baseline.
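If you want the subtraction spelled out, here is a minimal Python sketch; the two readings below are made-up numbers, just plug in your own:

# reading with both probes on the SAME lead: only probe/contact/lead resistance
baseline_ohms = 0.12      # example value, not a real measurement
# reading with one probe moved to the other lead: baseline plus the part itself
across_ohms = 0.45        # example value
unknown_ohms = across_ohms - baseline_ohms
print(f"corrected reading: {unknown_ohms:.3f} ohms")   # 0.330 ohms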
The ESR meters I have seen force you to zero the leads every time they are turned on. Accuracy of tens of milliohms on two-wire readings is pretty good, and I generally use that instead of digging out my 4-wire setup.
Usually a 4-wire (Kelvin) resistance reading is done: two wires push the test current through the part, and two other wires read the voltage across it, with the high-impedance voltmeter connected to the resistor leads at different points than the test-current leads so the lead and contact drops stay out of the reading. Been doing that professionally since the late '70s.
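The arithmetic behind the 4-wire reading is just Ohm's law on the sense pair; a rough Python sketch with placeholder numbers (assumed, not measured):

# 4-wire (Kelvin) idea: the voltage is sensed right at the part,
# so the drop across the current-carrying leads never enters the result
force_current_amps = 1.000    # known test current pushed through the force leads
sense_volts = 0.0473          # voltage read at the resistor body (placeholder)
resistance_ohms = sense_volts / force_current_amps
print(f"{resistance_ohms * 1000:.1f} milliohms")       # 47.3 milliohms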
Poor man's improvised check: get a resistor somewhere in the 20 to 400 ohm range and a 12 volt supply. Put that resistor in series with the unknown sub-1-ohm resistor and the DMM set for current, across the 12 volts. Current will flow; write the current reading down.
series circuit:
12v-----120 ohm resistor-----unknown sub 1 ohm resistor---------dmm set for current--------ground
(12 V across 12 ohms gives 1 amp and 12 watts; 12 V across 120 ohms gives 0.1 amp and 1.2 watts, and so forth.)
Remove the DMM from the series path, but reconnect the rest of the circuit so the current keeps flowing.
12v-----120 ohm resistor-----unknown sub 1 ohm resistor----------ground
.....................................^............................................^...................
.....................................|---dmm set for voltage----------|
The current will be SLIGHTLY different without the DMM's burden resistance in the circuit, but not by much.
Take the DMM and measure the VOLTAGE across your sub-1-ohm resistor.
At 1 amp of test current, the voltage across the unknown resistor translates directly to ohms: 0.300 V is 0.3 ohms.
At 0.1 amp of test current, 0.030 V is 0.3 ohms.
Apply Ohm's law: volts divided by amps is ohms.
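If it helps, the whole check boils down to a few lines of Python; every number below is a placeholder, substitute your own readings:

supply_volts = 12.0
series_ohms = 120.0              # the 20 to 400 ohm series resistor
measured_current_amps = 0.0992   # step 1: DMM in series, current you wrote down
measured_volts = 0.0301          # step 2: DMM across the unknown resistor
unknown_ohms = measured_volts / measured_current_amps   # Ohm's law: V / I
print(f"unknown resistor is roughly {unknown_ohms:.3f} ohms")          # ~0.303 ohms
# sanity check: expected current should be close to supply / (series + unknown)
print(f"expected current: {supply_volts / (series_ohms + unknown_ohms):.4f} A")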
Way back when, I used constant test currents of TEN amps, working with mercury displacement relays.
Trying to get THAT temperature-stable over time was a real treat.