I get most of this - I expect it'll be clearer and faster with practice, but I get the principle that it is logarithmic in powers of 10. I also understand the underlying principle of decibel gain/loss related to a ratio:
dB = 10 log10(Pout/Pin) for power
dB = 20 log10(Vout/Vin) for volts RMS (or, generically, a ratio of amplitudes)
So for power, a +3 dB gain is a doubling of power; for volts, a +6 dB gain is a doubling of volts (or amplitude).
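To check that I really had those two formulas, I put them into a quick Python sketch (the function names are just my own labels, nothing official):

```python
import math

def db_power(p_out, p_in):
    """Gain in dB from a power ratio: 10 * log10(Pout / Pin)."""
    return 10 * math.log10(p_out / p_in)

def db_voltage(v_out, v_in):
    """Gain in dB from a voltage (amplitude) ratio: 20 * log10(Vout / Vin)."""
    return 20 * math.log10(v_out / v_in)

print(db_power(2, 1))    # ~3.01 dB: doubling the power is about +3 dB
print(db_voltage(2, 1))  # ~6.02 dB: doubling the voltage is about +6 dB
```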
I then come across a statement like this:
"dBm: This is the value of a signal as referenced to 1 milliwatt. It is an absolute value. A 0 dBm signal equals 1 milliwatt. A 1 watt signal is equal to +30 dBm. If you had a 100-watt transmitter, that would equate to a +50.0 dBm level. If you had a 2 microvolt signal, that would equate to a -100 dBm signal. If you have a 0.5 microvolt (uV) signal, that would equate to -112 dBm."
I can follow the first part, no problem. If 0 dBm is 0.001 W, then for 1 W, dBm = 10 log10(1/0.001) = 30; for 100 W, dBm = 10 log10(100/0.001) = 50. Generically, 1 W is 10^3 times 1 mW and log10(1000) is 3, so 10 × 3 = 30; ditto, 100 W is 10^5 times 1 mW, log10(100000) is 5, so 10 × 5 = 50.
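The same thing as a small Python sketch, just to confirm I follow the 1 mW reference (again, the function name is mine):

```python
import math

def dbm_from_watts(p_watts):
    """Absolute power in dBm: 10 * log10(P / 1 mW)."""
    return 10 * math.log10(p_watts / 0.001)

print(dbm_from_watts(0.001))  # 0.0 dBm   (1 mW)
print(dbm_from_watts(1.0))    # 30.0 dBm  (1 W)
print(dbm_from_watts(100.0))  # 50.0 dBm  (100 W)
```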
The second part of the statement, though, feels like it's making an assumption, one that is maybe so common to those in the know that it doesn't need stating? That doesn't include me.
To relate a voltage value to a power value I need to know the load in ohms (or the current in amps), don't I? Take, as an example, an assumed 50 ohm load, because that (nearly) works with the quoted figures; then from the formula P = V^2/R:
2 uV @ 50 ohms: P = (2 × 10^-6)^2 / 50 = (4 × 10^-12) / 50 = 8 × 10^-14 W
dBm = 10 log10(8 × 10^-14 / 0.001) = -100 (actually, pretty much -101)
0.5 uV @ 50 ohms (not showing the working out!): P = 5 × 10^-15 W
dBm = 10 log10(5 × 10^-15 / 0.001) = -113 (although the statement above has -112)
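That working again as a Python sketch, assuming the 50 ohm load, which is my guess at the unstated assumption rather than anything the quoted statement actually says:

```python
import math

def dbm_from_volts(v_rms, r_ohms=50.0):
    """RMS voltage into an assumed load -> dBm, via P = V^2 / R then 10 * log10(P / 1 mW)."""
    p_watts = (v_rms ** 2) / r_ohms
    return 10 * math.log10(p_watts / 0.001)

print(dbm_from_volts(2e-6))    # ~ -101.0 dBm for 2 uV into 50 ohms
print(dbm_from_volts(0.5e-6))  # ~ -113.0 dBm for 0.5 uV into 50 ohms
```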
I thought working with decibels was meant to be quick, so I think I'm missing a trick here. I couldn't work out in my head that 0.5 uV is -112 dBm (or -113 dBm), so I feel like I'm doing something wrong.

