The correct answer is: B. Multimeter B is more sensitive.
Multimeter A has a sensitivity of $\frac{10\,\text{k}\Omega}{\text{V}}$, which means its input resistance on a 1 V range is 10 kΩ. Multimeter B has a sensitivity of $\frac{30\,\text{k}\Omega}{\text{V}}$, so its input resistance on the same 1 V range is 30 kΩ. Sensitivity in Ω/V is the reciprocal of the full-scale deflection current: multimeter B reaches full-scale deflection with only about 33 µA, compared with 100 µA for multimeter A. Therefore, multimeter B is more sensitive than multimeter A.
A more sensitive multimeter has a higher input impedance on any given range. It therefore draws less current from the circuit being measured, which reduces loading errors in the reading.
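The arithmetic above can be sketched in a few lines. This is a minimal illustration using the two sensitivity values from the answer; the helper names and the choice of a 1 V range are for demonstration only.

```python
def input_resistance(sensitivity_ohms_per_volt: float, range_volts: float) -> float:
    """Input resistance of an analog meter on a given voltage range."""
    return sensitivity_ohms_per_volt * range_volts

def full_scale_current(sensitivity_ohms_per_volt: float) -> float:
    """Current (in amperes) for full-scale deflection: 1 / sensitivity."""
    return 1.0 / sensitivity_ohms_per_volt

for name, s in [("A", 10_000), ("B", 30_000)]:
    r = input_resistance(s, 1.0)     # ohms, on the 1 V range
    i = full_scale_current(s)        # amperes
    print(f"Multimeter {name}: R_in = {r / 1000:.0f} kOhm, "
          f"full-scale current = {i * 1e6:.1f} uA")
```

Running this prints 10 kΩ / 100 µA for meter A and 30 kΩ / 33.3 µA for meter B, showing that the higher Ω/V rating corresponds to the higher input resistance and the smaller current drawn.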