Edit: The original post contains an error, because my calculation for the second example was wrong.

The first example suggests 10 amps per 100 watts. Based on their example, a 24 watt TV will draw approx. 2.4 amps (2.4 amp hours for each hour of use) through an inverter.

The second example suggests dividing 120 (the 120v TV) by 12 (the 12v battery) and multiplying the result (10) by the TV's amps in AC mode (0.20). Based on their example, the TV will draw approx. 2.0 amps through an inverter:
(a 24 watt, 120v TV draws 0.20 amps)......(120 divided by 12 is 10).....(0.20 amps multiplied by 10 is 2.0)

As you can see they're close. Better to err on the side of caution and use the first example. Sorry for the confusion.
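For anyone who wants to double-check, here is the comparison as a quick Python sketch; the 24 W, 120 V, and 12 V figures are the ones assumed in this thread, and inverter losses are ignored:

```python
# Two rule-of-thumb estimates for the DC amps a 12 V battery supplies
# while an inverter runs a 24 W, 120 VAC TV. Figures come from the
# posts above; inverter losses are ignored.

watts = 24.0        # TV power draw on the AC side
ac_volts = 120.0    # inverter output voltage
dc_volts = 12.0     # battery voltage

# Rule 1: roughly 10 A of battery current per 100 W of AC load
dc_amps_rule1 = watts * 10.0 / 100.0

# Rule 2: AC-side amps scaled up by the voltage ratio
ac_amps = watts / ac_volts
dc_amps_rule2 = ac_amps * (ac_volts / dc_volts)

print(dc_amps_rule1)  # rule 1 estimate, about 2.4 A
print(dc_amps_rule2)  # rule 2 estimate, about 2.0 A
```

Both are really the same Watts = Volts x Amps relationship with a different fudge factor, which is why the two answers land so close together.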

I'm researching information in prep for a solar system, and I came across confusing information. Maybe these are both correct, but I can't see how. I have a 24 watt AC TV and I'm trying to determine how many amp hours (Ah) it will demand using an inverter and a single 12V battery. Here's the information. Can you clear this up?
________________________________________________________________

Information Source One:

Put simply, for every 100 watts of AC power that your inverter is producing, it needs to draw about 10 amps from your 12 volt battery system.

One of the biggest mistakes made by those just starting out is not understanding the relationship between amps and amp-hour requirements of 120 volt AC items versus the effects on their DC low voltage batteries.

For example, say you have a 24 volt nominal system and an inverter powering a load of 3 amps, 120VAC, which has a duty cycle of 4 hours per day. You would have a 12 amp hour load (3A X 4 hrs=12 ah).
However, in order to determine the true drain on your batteries you have to divide your nominal battery voltage (24v) into the voltage of the load (120v), which is 5, and then multiply this times your 120vac amp hours (5 x 12 ah). So in this case the calculation would be 60 amp hours drained from your batteries – not the 12 ah.
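That quoted example works out like this as a short Python sketch (the 3 A load, 4 h/day duty cycle, and 24 V bank are all numbers taken from the text above):

```python
# Source One's example: a 3 A, 120 VAC load running 4 h/day,
# fed from a 24 V nominal battery bank through an inverter.

load_ac_amps = 3.0
load_ac_volts = 120.0
hours_per_day = 4.0
battery_volts = 24.0

ac_amp_hours = load_ac_amps * hours_per_day        # 12 Ah at 120 VAC
voltage_ratio = load_ac_volts / battery_volts      # 120 / 24 = 5
battery_amp_hours = ac_amp_hours * voltage_ratio   # drain on the batteries

print(battery_amp_hours)  # 60.0 Ah, matching the quoted figure
```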

The first source suggests that a 24 watt TV requires about 2.4 amps per hour (24 watts divided by 10 = 2.4 amps per hour).

Assuming the TV is 120 volts, the second source of information suggests that a 24 watt TV will require about 12.4 amps per hour (120V TV divided by 12V Battery = 10). (2.4 amps per hour times 10 is 12.4 amps per hour).

What am I missing?

* This post was
edited 03/09/12 05:35pm by ZZSPIRAL1 *

Ohm's law! (Strictly speaking, the power formula:)
Watts divided by volts = amps.
Inverting 12VDC up to 120VAC is roughly ten to one,
so every amp of 120VAC load will require approx. 10 amps DC.
To make it really simple, look up Ohm's law.

Don,Lorri,Max (The Rescue Flat Coat Retriever?)
The Other Dallas

An Ampere is a given quantity of electrons per second
so, an Ampere per hour would be electrons per second per hour,
which is a measure of acceleration......

Now that this is solved (?), others will solve the problem.

L NORMAN WADDELL
30 FOOT ALLEGRO
SATURN TOAD
WIFE AND 2 DOGS SUGAR BEAR & COCO BEAR

As Don noted, divide your watts by voltage to get amps: 12V for the battery side, 120V for the output of the inverter. For amp hours, multiply the battery amps by the length of time you use the load.

In the case of your 24 watt TV, if it draws 24 watts, at 12V that's 2 amps. (At 120V it is only drawing 0.2 amps.) At 4 hours per day you will use 8 amp-hours. Of course that doesn't include the loss created by the inverter, but unless you are using a huge inverter it won't be more than 5%-10%.

The 24 watt TV will draw about 2 amps at 12 volts, plus a little bit since the inverter is not 100% efficient, so maybe 2.2 amps. Multiply that result by the number of hours you will use the TV and that will be the amp-hours.
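A sketch of that arithmetic, treating the inverter efficiency as an assumed 90% (small inverters are typically in the 90-95% range):

```python
watts = 24.0                # TV power draw
battery_volts = 12.0
inverter_efficiency = 0.90  # assumed figure, not a measurement
hours = 4.0                 # assumed daily viewing time

# Battery current is the AC load divided by voltage, bumped up
# to cover inverter losses.
battery_amps = watts / battery_volts / inverter_efficiency  # about 2.2 A
amp_hours = battery_amps * hours                            # about 8.9 Ah

print(round(battery_amps, 2), round(amp_hours, 2))
```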

If your TV is a 12v TV that uses an adapter to run off 120v you could run the TV directly off the battery and not have the inefficient inverter producing heat and wasting power.

Class C, 2004/5 Four Winds Dutchman Express 28A, Chevy chassis
2010 Subaru Impreza Sedan
Camped in 45 states, 7 Provinces and 1 Territory

If you feed your inverter from 12VDC to produce that same number of watts, then you will need 10 times the amps, or about 2 amps.

For every amp consumed at 120VAC you will need about 10 times more when using 12VDC.

So if you have a battery rated at 100Ah, and assuming you will use it to 50% discharge, then you can run your TV for 25 hours straight (50Ah / 2A = 25h).
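That runtime estimate as a sketch; the 100 Ah capacity and 50% usable-discharge figures are the assumptions from this post, and inverter losses are ignored:

```python
battery_capacity_ah = 100.0  # assumed battery rating
usable_fraction = 0.5        # discharge only to 50%, per the post
tv_dc_amps = 2.0             # 24 W / 12 V, ignoring inverter losses

# Usable charge divided by draw gives hours of runtime.
runtime_hours = battery_capacity_ah * usable_fraction / tv_dc_amps

print(runtime_hours)  # 25.0 hours, matching the figure above
```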

I hope it helps.

All the best,
1st Mate (The Admiral) & Sailingnuts
2005 Winnebago Journey 36' DP, 350 CAT
2014 Jeep Grand Cherokee Overland TOAD

The first source suggests that a 24 watt TV requires about 2.4 amps per hour (24 watts divided by 10 = 2.4 amps per hour).

Assuming the TV is 120 volts, the second source of information suggests that a 24 watt TV will require about 12.4 amps per hour (120V TV divided by 12V Battery = 10). (2.4 amps per hour times 10 is 12.4 amps per hour).

What am I missing?

As I read the two examples, the first is using a 12V battery system for its calculations and the second is using a 24V DC system to supply the inverter.

Everything starts with the TV at 24 watts. That requires 0.2 AC amps. The 12V DC system must supply about 10 times that or about 2 amps. The 24V DC system must supply about 5 times that or 1 amp.

I suspect that you have a 12V DC system so the first example is correct. The confusion comes from mixing the DC voltages.

An Ampere is a given quantity of electrons per second
so, an Ampere per hour would be electrons per second per hour,
which is a measure of acceleration......

Now that this is solved (?), others will solve the problem.

Not totally correct, since all electric bills are charged in kWh, which is basically amps times volts times time, with the time period generally an hour. Thus they charge you for the power used per hour, averaged out.

2001 standard box 7.3L E-350 PSD Van with 4.10 rear and 2007 Holiday Rambler Aluma-Lite 8306S Been RV'ing since 1974. RAINKAP INSTALL////ETERNABOND INSTALL

Not totally correct, since all electric bills are charged in kWh, which is basically amps times volts times time, with the time period generally an hour. Thus they charge you for the power used per hour, averaged out.

Here is how I understand it. kWh is a measure of energy, and the total on your energy bill is the total energy used over the billing period.
Amp-hours are what we use to measure consumption in our 12V systems; strictly they measure charge, but at a fixed voltage they work as a measure of energy too. They are not directly comparable to kWh because of the different voltages involved, but at a known voltage both come down to energy.

I think you are saying that if the amps over any time period (say 30 days) were averaged, and then that average was multiplied by the time period, we would have the total energy used in that time period (amp-days, which at a fixed voltage would be proportional to kWh). Does that make sense?
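At a known, fixed voltage the two units convert directly. A small sketch, assuming a 12 V system and a 100 Ah figure purely for illustration:

```python
# kWh is energy (watts x hours); amp-hours are charge. At a fixed
# system voltage they convert directly: energy = volts x amp-hours.

system_volts = 12.0   # assumed system voltage
amp_hours = 100.0     # illustrative figure only

energy_kwh = system_volts * amp_hours / 1000.0   # Ah -> kWh
back_to_ah = energy_kwh * 1000.0 / system_volts  # kWh -> Ah

print(energy_kwh, back_to_ah)  # 1.2 kWh round-trips to 100 Ah
```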

It is interesting how different people approach the same subject. I always enjoy these conversations because it makes me rethink my way of thinking.