My idiot's guide tells me I should measure amps in series with a component (in this case an LED).
I did so, but the thing is this:
I first measured the LED with the multimeter probes on either side of it (is this parallel?) and the reading agreed accurately with Ohm's Law. When I put them in series (before the LED) the reading was way off my calculations.
Also, how can I get such wild fluctuations in amps depending on where I measure? The circuit had an LED, a switch and a 100 Ohm resistor, on 3 V DC.
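For reference, this is roughly the calculation I was comparing against — a minimal sketch that assumes the full 3 V supply appears across the 100 Ohm resistor (i.e. it ignores any voltage drop across the LED itself, which may well be my mistake):

```python
# Naive Ohm's Law calculation (assumption: the full 3 V supply
# appears across the 100 ohm resistor; the LED's forward voltage
# drop is ignored).
V = 3.0    # supply voltage, volts
R = 100.0  # resistor value, ohms

I = V / R  # Ohm's Law: I = V / R
print(f"Expected current: {I * 1000:.0f} mA")  # prints "Expected current: 30 mA"
```

So I was expecting on the order of 30 mA, and the series reading didn't match that.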
Thanks in anticipation.