The old turn-signal flashers used a bimetallic strip that curved as it heated up, broke contact, cooled, and remade contact.
It seems obvious that the flasher rate would vary with load. Lower resistance, higher current, bimetallic strip heats up faster ...
I first saw the modern Triton electronic flashers in the 1970s. There was a capacitor inside, and I presumed they operated on the R-C time-constant principle.
For whatever reason, I assumed that the resistance was internal, and that the flasher rate would be independent of external load.
However, it looks like the rate still changes when a bulb is out (and the load therefore changes).
Can anyone shed light on this for me? (Arg, pun not intended!)
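To make the distinction I'm asking about concrete, here's a rough back-of-the-envelope sketch (all component values are made up for illustration, not taken from any real flasher): a thermal flasher trips after the strip absorbs a fixed amount of heat, and heating power goes as I²R, so trip time scales with the square of the lamp-load resistance; a pure internal R-C relaxation oscillator (555-style, charging between 1/3 and 2/3 of supply) has a half-period of R·C·ln(2) with no lamp term in it at all.

```python
import math

def thermal_trip_time(v_supply, load_ohms, heat_needed_j, strip_ohms=0.5):
    # Bimetal strip trips after absorbing a fixed amount of heat.
    # Heating power is I^2 * R_strip with I = V / R_load, so the
    # trip time scales with the SQUARE of the load resistance.
    current = v_supply / load_ohms
    power = current**2 * strip_ohms
    return heat_needed_j / power

def rc_half_period(r_ohms, c_farads):
    # Relaxation-oscillator half-period between 1/3 and 2/3 Vcc:
    # t = R * C * ln(2).  The lamp load never appears.
    return r_ohms * c_farads * math.log(2)

# Hypothetical numbers: two bulbs in parallel vs. one bulb out
# (one bulb out doubles the effective load resistance).
two_bulbs = thermal_trip_time(12.0, 2.7, 5.0)
one_bulb  = thermal_trip_time(12.0, 5.4, 5.0)
print(one_bulb / two_bulbs)          # thermal flasher: 4x slower with a bulb out
print(rc_half_period(100e3, 10e-6))  # R-C flasher: fixed, ~0.69 s regardless of load
```

So if the flasher really were a self-contained R-C oscillator, losing a bulb shouldn't change the rate, which is exactly why the observed behavior puzzles me.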