We built a demo model to connect the dots for engineers. In the overview image, wall AC electricity feeds through power meter M1 to an AC/DC converter. It is not especially efficient, but it has a part number an engineer can look up a data sheet for. The power supply's output goes through power meter M2 and then to our device (NRG Amp in the image). From there it feeds through power meter M3 to either the blower motor (inductive load) or the LED light (resistive load). We've filmed dozens of 4-hour test segments so engineers could watch the meters ratchet up.
In the end, output from the commercial power supply was consistently half the output of the NRG Amp, regardless of load. Put another way: 1 unit of electricity into the NRG Amp, 2 units of work out. The meters image shows 6 watt-hours of electricity drawn from the grid, but only 4 watt-hours of DC energy out of the power supply, so the supply is only about 67% efficient. The load, however, performed 8 watt-hours worth of work. If the device were battery powered, there would be no AC-to-DC losses. (Or a more efficient AC/DC power supply would reduce those losses.)
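A quick sketch of that arithmetic, using the meter readings described above (the variable names are just for illustration):

```python
# Energy readings from the demo's three meters, in watt-hours.
e_grid = 6.0      # M1: AC energy drawn from the wall
e_supply = 4.0    # M2: DC energy out of the AC/DC converter
e_load = 8.0      # M3: energy delivered to the load

supply_efficiency = e_supply / e_grid   # AC/DC conversion efficiency
apparent_gain = e_load / e_supply       # claimed output-to-input ratio

print(f"Power supply efficiency: {supply_efficiency:.0%}")   # 67%
print(f"Load energy vs. device input: {apparent_gain:.1f}x")  # 2.0x
```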
In case you're thinking batteries: a 4-hour test at 1 amp and 13.8 volts consumes 1 × 13.8 × 4 = 55.2 watt-hours. How many batteries would we have to stuff into that small box to cover even half of that (27.6 watt-hours)?
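The back-of-the-envelope battery math works out like this (the AA capacity figure is a rough typical value, not something measured in the demo):

```python
# Energy consumed over the 4-hour test described above.
current_a = 1.0    # amps
voltage_v = 13.8   # volts
hours = 4.0

energy_wh = current_a * voltage_v * hours  # total energy, watt-hours
half_wh = energy_wh / 2                    # half of it

print(energy_wh, half_wh)  # 55.2 27.6

# For scale: a single AA alkaline cell holds very roughly 3 Wh,
# so hiding even half the test energy would take a pile of cells.
cells_needed = half_wh / 3.0
print(f"~{cells_needed:.0f} AA cells to cover half")
```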
Does this sort of answer your questions?