There may be a wrong assumption in my question, so forgive me; I am here to learn.
So I am controlling a light (LED) with an MCU that outputs PWM into an N-channel MOSFET, as described in this little image:
If I check the voltage on the gate, I can clearly see it go up linearly from 0 to 3.3 V (VCC is 5 V).
But when you look at the LED, it is really not dimming linearly at all, and it has these steps.
As we know, I_D is proportional to (V_GS − V_T)^2, and that means that if the gate voltage goes up linearly, the output current will be nonlinear.
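To illustrate that square-law relationship, here is a minimal sketch. The threshold voltage `V_T = 2.0 V` and the constant `k = 0.5 A/V^2` are made-up placeholder values, not taken from any real datasheet:

```python
# Square-law model: I_D = k * (V_GS - V_T)^2 when V_GS > V_T, else 0.
# V_T and k below are illustrative placeholders, not real datasheet values.
def drain_current(v_gs, v_t=2.0, k=0.5):
    if v_gs <= v_t:
        return 0.0          # below threshold: MOSFET is off
    return k * (v_gs - v_t) ** 2

# A linear gate ramp from 0 to 3.3 V produces a very nonlinear current:
for v in [0.0, 1.0, 2.0, 2.5, 3.0, 3.3]:
    print(f"V_GS = {v:.1f} V -> I_D = {drain_current(v) * 1000:.1f} mA")
```

Note how the output stays at zero until V_GS crosses V_T, then rises quadratically, which matches the "steps" and nonlinearity described above.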
But I do want a completely linear experience, and I was wondering a few things:

1. Because I_D is proportional to (V_GS − V_T)^2, I could apply the inverse function (a square root) to the slider value before setting the gate voltage, so that when I move a linear slider up, the output may be linear (??)
2. Could the fact that it goes from 0 to −3 V have to do with the specific MOSFET, and be what creates this effect?
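The idea in point 1 can be sketched as follows, assuming the same square-law model. The values `V_T`, `V_MAX`, and `K` are made-up placeholders; the point is only that mapping a linear slider through a square root before setting the gate voltage cancels the quadratic:

```python
import math

# Placeholder constants (not from a real part's datasheet):
V_T, V_MAX, K = 2.0, 3.3, 0.5

def gate_voltage(s):
    """Map a linear slider s in [0, 1] to a gate voltage giving I_D proportional to s."""
    return V_T + math.sqrt(s) * (V_MAX - V_T)

def drain_current(v_gs):
    # Square-law model: zero below threshold, quadratic above it.
    return 0.0 if v_gs <= V_T else K * (v_gs - V_T) ** 2

# With the square-root pre-mapping, equal slider steps give equal current steps:
for s in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(f"s = {s:.2f} -> I_D = {drain_current(gate_voltage(s)) * 1000:.1f} mA")
```

Algebraically: I_D = K·(√s·(V_MAX − V_T))^2 = K·(V_MAX − V_T)^2·s, which is linear in s. (Whether this matches perceived brightness is a separate question, since the eye's response is itself nonlinear.)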
Bottom line: I see all the time that you can use a MOSFET to dim a light, but it is not a linear animal, and usually these circuits don't add any other parts to compensate.