
  • Light bulbs and Ohm's law

    Hi all,

    I'm having difficulty determining an adequate load for a load test.

    Do light bulbs adhere to Ohm's law?

    Either my meter is faulty or, going by the test data I'm getting, they don't.

    My understanding was that a bulb is just a resistor inside a glass vacuum.

    I originally measured a 1 ohm bulb (it now reads between 1.5 and 3 ohm; the battery in my meter is low).
    By Ohm's law it should draw 12 A from a 12 V battery (or at least 4 A if it were 3 ohm).

    My meter is only reading 1.5 A, well below what I expected.

    Even when connected to a 13.7 V, 10 A power supply it only reads 1.84 A, and the bulb is quite bright.

    So either my meter is faulty or my understanding is completely wrong.

    If somebody could shed some light on this (please excuse the pun), it would be greatly appreciated. =)

    Thanks in advance,

    Mattb
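
    A quick sanity check of the numbers above (a Python sketch; the bulb is assumed to be purely resistive at its operating point): working Ohm's law backwards from the measured current gives the bulb's effective "hot" resistance.

    def implied_resistance(volts, amps):
        """Ohm's law rearranged: R = V / I."""
        return volts / amps

    # Figures from the post above
    print(implied_resistance(12.0, 1.5))    # ~8.0 ohm on the 12 V battery
    print(implied_resistance(13.7, 1.84))   # ~7.4 ohm on the 13.7 V supply
    # Both are far above the ~1 ohm cold meter reading, so the filament's
    # resistance must rise sharply once it is hot and glowing.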

  • #2
    In the project file "Bedini Monopole 3 Group Experiment", page 12, "Selecting charge/discharge criteria", you will find the hint.
    Here are the calculations.

    You need high-wattage resistors.
    Put several resistors of sufficient wattage in series to get close to your target resistance.
    You will find high-wattage resistors like that in electronics shops:
    VITROHM Hochlast-Widerstand 25 W RE6155B 10R 5% | ELV-Elektronik

    Other loads, such as lamps, will work too.

    The project file calculates with an end point of 12.2 V:
    17 Ah -> C20 = 17 Ah / 20 = 0.85 A = 850 mA
    discharging down to 12.2 V
    12.2 V / 0.85 A = 14.35 ohm
    watts: 12.2 V * 0.85 A = 10.37 W

    I calculate the start point too, at 13.5 V, with 14.35 ohm:
    13.5 V / 14.35 ohm = 0.94 A
    13.5 V * 0.94 A = 12.69 W

    #
    ##
    #

    10 Ah battery -> C20 rate is 10 Ah / 20 = 0.5 A = 500 mA
    discharging down to 12.2 V
    12.2 V / 0.5 A = 24.4 ohm
    watts: 12.2 V * 0.5 A = 6.1 W

    Starting at 13.5 V, with 24.4 ohm:
    13.5 V / 24.4 ohm = 0.55 A
    13.5 V * 0.55 A = 7.47 W

    #
    ##
    #

    7 Ah -> C20 = 7 Ah / 20 = 0.35 A = 350 mA
    discharging down to 12.2 V
    12.2 V / 0.35 A = 34.86 ohm
    watts: 12.2 V * 0.35 A = 4.27 W

    Starting at 13.5 V, with 34.86 ohm:
    13.5 V / 34.86 ohm = 0.39 A
    13.5 V * 0.39 A = 5.23 W

    #
    ##
    #
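
    The same calculation, generalised to any battery capacity (a minimal Python sketch; the hypothetical helper c20_load uses the 12.2 V end point and 13.5 V start point from the figures above):

    def c20_load(capacity_ah, v_end=12.2, v_start=13.5):
        i_c20 = capacity_ah / 20.0        # C20 rate in amps
        r = v_end / i_c20                 # resistor sized at the end point
        p_end = v_end * i_c20             # dissipation at the end point
        p_start = v_start ** 2 / r        # dissipation at the start point
        return i_c20, r, p_end, p_start

    # Reproduces the three worked examples (small rounding differences aside)
    for ah in (17, 10, 7):
        i, r, p_end, p_start = c20_load(ah)
        print(f"{ah} Ah: {i:.2f} A, {r:.2f} ohm, "
              f"{p_end:.2f} W at 12.2 V, {p_start:.2f} W at 13.5 V")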

    • #3
      I think the resistance goes up once the bulb is lit. It lights because the battery supplies a much higher voltage than your ohmmeter does. When the wire gets hot, the resistance goes up.

      The ohmmeter probably applies only something like 2 V to test the resistance of a "cold" wire. Once the wire/bulb has been operating for a while on real voltage, it becomes hot, its resistance goes up, and that results in less current draw from the same 12 V source.

      Maybe run your light bulb on a tiny battery that can't even light it, and see whether a tiny voltage with your 1 ohm measurement gives the current you initially expected.

      This is just an educated guess, but I know that if you cool a wire enough its resistance can drop toward zero, so I figure heating a wire raises its resistance. Your meter isn't heating the wire, but your battery is, so you're getting different readings.

      Maybe run the bulb hot, then quickly check the resistance right after disconnecting, while the bulb is still hot?

      -Ben
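
      To put rough numbers on this (a sketch only: the linear model R(T) = R0 * (1 + alpha * (T - T0)) with alpha of about 0.0045 per kelvin is a textbook approximation for tungsten, and real filaments deviate from it at high temperature):

      def hot_resistance(r_cold, t_hot_c, t_cold_c=20.0, alpha=0.0045):
          # Linear temperature-coefficient model for a metal conductor
          return r_cold * (1 + alpha * (t_hot_c - t_cold_c))

      for t in (500, 1000, 2000):         # filament temperature in deg C
          print(f"{t} C: ~{hot_resistance(1.0, t):.1f} ohm")
      # A 1 ohm cold filament near 2000 C comes out around 10 ohm, the same
      # order as the ~8 ohm implied by 1.5 A at 12 V, so the "low" current
      # reading is expected rather than a meter fault.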

      • #4
        Mattb,

        Just connect an ammeter between the battery and the bulb. Make sure the current is within the C20 rate of the battery. No need to worry about calculating the resistance of the bulb.


        John K.
        Last edited by John_Koorn; 08-08-2012, 03:23 PM. Reason: Damn you autocorrect :)
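
        A one-line check in the same spirit (a sketch with hypothetical figures; substitute your own battery capacity and ammeter reading):

        def within_c20(measured_amps, capacity_ah):
            c20 = capacity_ah / 20.0      # C20 rate in amps
            return measured_amps <= c20, c20

        ok, c20 = within_c20(measured_amps=0.8, capacity_ah=17)
        print(f"C20 rate {c20:.2f} A, load OK: {ok}")   # 0.85 A, True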
