Say I want to increase the signal output distance. I know the output wavelength, I know the voltage I output at (say 5 V), and I know my circuit draws about 500 mA and delivers this to the antenna, which radiates the signal out.

So which is better:

12 V amplified to 1 A to broadcast (12 W)
or
120 V radiating out just 0.1 A (12 W)?

Which would be better, and why? And what exactly is the dB value? Is it volts, amps, or both?

A dB value always has to be referenced to something. For example, you might see the term dBV: a voltage level referenced to 1 V. In audio engineering, voltage levels are often stated in dBm. That is the power referenced to 1 mW in a 600 ohm load, a common load impedance in audio, which corresponds to 0.775 V. So an audio level of +6 dBm (4x the power) is approx. 2 x 0.775 V.
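To make those reference levels concrete, here is a small sketch of the arithmetic (the function names are just illustrative): dBV is 20·log10 of a voltage ratio, while dBm is 10·log10 of a power ratio, converted back to volts through the load impedance.

```python
import math

def dbv(volts):
    # dBV: voltage level referenced to 1 V -> 20*log10(V/1V)
    return 20 * math.log10(volts / 1.0)

def dbm_to_volts(dbm, load_ohms=600.0):
    # dBm: power referenced to 1 mW; convert to the RMS voltage
    # that delivers that power into the given load (600 ohms in audio)
    power_w = 1e-3 * 10 ** (dbm / 10)
    return math.sqrt(power_w * load_ohms)

print(round(dbm_to_volts(0), 3))  # 0 dBm into 600 ohms -> 0.775 V
print(round(dbm_to_volts(6), 3))  # +6 dBm -> about 2 x 0.775 V
```

Note the factor difference: +6 dB means 4x the power but only 2x the voltage, because power goes as voltage squared.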

But with electromagnetic radiation, the "resistance" the wave sees (the impedance of free space) is fixed at about 377 ohms, so for a given radiated power there is only one combination of electric and magnetic field strength available. You don't get a choice. Sorry.
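A quick numerical sketch of both points above: the two feed arrangements deliver identical power, and once the wave is in free space the ratio of electric to magnetic field is pinned at the impedance of free space regardless of what the feed looked like (the power density value here is just an arbitrary example).

```python
import math

Z0 = 376.73  # impedance of free space, ohms

# Both feed arrangements from the question deliver the same power
p1 = 12.0 * 1.0    # 12 V at 1 A
p2 = 120.0 * 0.1   # 120 V at 0.1 A
print(p1, p2)      # 12 W either way

# For a plane wave carrying power density S (W/m^2):
#   S = E^2 / Z0 = H^2 * Z0, so E/H = Z0 always.
S = 1.0                 # example power density, W/m^2
E = math.sqrt(S * Z0)   # electric field, V/m
H = math.sqrt(S / Z0)   # magnetic field, A/m
print(round(E / H, 2))  # ~376.73 ohms, independent of the feed
```

So the "which is better" choice exists only inside the transmitter circuitry; the radiated wave itself has no memory of it.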