all home built solar system


Author Message
Warpspeed
Guru

Joined: 09/08/2007
Location: Australia
Posts: 4406
Posted: 07:59am 19 Apr 2021      

There have historically been quite a few issues with excessive gate voltages, but things have slowly changed over the years, and it's now no longer the problem it once was.

Back in the early mosfet days, mosfet gates were commonly rated for +/-20v max, but it was found that 15v of drive could lead to long term unreliability, and 12v drive was considered optimum for best reliability. Mosfet technology and quality control have come a very long way over the last few decades.

You will commonly see a 10K or 4K7 resistor connected gate to source. The purpose of that is that if the gate is left open circuit, and you suddenly switch say +400 volts onto the drain at power up, the gate-drain capacitance will often couple enough energy into the gate circuit to blow the gate oxide layer before the mosfet can turn on to protect itself. Normally this would never happen with a gate driver chip connected, but it's always been good practice to fit a resistor in high voltage circuits.
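As a back-of-the-envelope sketch of why a floating gate is dangerous: with the gate open circuit, the gate-drain and gate-source capacitances form a capacitive divider, so a fast drain step couples a fraction of it straight onto the gate. The capacitance values below are made-up round numbers for illustration, not from any particular datasheet.

```python
# Rough estimate of the voltage capacitively coupled onto an open mosfet
# gate when the drain steps suddenly to a high voltage.
# Hypothetical part values, purely for illustration.

C_gd = 100e-12    # gate-drain (Miller) capacitance, farads (assumed)
C_gs = 2000e-12   # gate-source capacitance, farads (assumed)
V_drain_step = 400.0  # volts suddenly applied to the drain at power up

# With the gate floating, Cgd and Cgs act as a capacitive divider:
V_gate = V_drain_step * C_gd / (C_gd + C_gs)
print(f"Coupled gate voltage: {V_gate:.1f} V")  # ~19 V, right at a +/-20 V rating

# A gate-source resistor gives that injected charge a discharge path.
# Time constant with a 10K resistor across the gate:
R_gs = 10e3
tau = R_gs * (C_gd + C_gs)
print(f"Discharge time constant with 10K: {tau * 1e6:.0f} us")
```

Even with these modest assumed values, the divider puts the gate right up near a typical +/-20 V absolute maximum rating, which is why the resistor was considered cheap insurance on high voltage circuits.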

Some of the switch mode controller chips use bipolar transistors, as the internal op amps and comparators are bipolar, and so is the bipolar gate driver output stage. The application notes warn that a big heavy duty Schottky diode MUST be connected between gate and source to prevent any negative spikes from latching the substrate of the driver chip.

Many if not most gate driver chips these days have no analog circuitry inside, and are totally CMOS. The inherent source-drain reverse diodes in the driver output stage mosfets totally protect both the driver chip and the mosfet gate from any excessive voltage transients.

So it's no longer common practice to fit Schottky diodes or resistors (or Transorbs) into mosfet gate circuits. Sometimes you still see them, but they really do nothing useful.

Peter is quite right, Mad's totem pole has solved a lot of issues.