If a sensor's output is -10 V to +10 V and I connect it to a differential amplifier to get an output of 0 to 3.3 V, what input range should I design the differential amplifier for: -10 V to +10 V, or only 0 to +10 V (or -10 V to 0 V)?
If I design for -10 V to +10 V, the amplifier sees a 20 V span, and with the right gain I can map that span to 0 to 3.3 V. But if I design for 0 to +10 V (or -10 V to 0 V), the span is only 10 V, so with the same gain I only get half the output range.
Can anyone please explain?
A differential amplifier amplifies the difference between its two inputs. A differential amplifier that scales a ±10 V signal to 0 to 3.3 V expects a maximum difference of 20 V between its terminals; that full 20 V span is what maps to the full 3.3 V output swing. That's why the gain must be set for a 20 V input span, not a 10 V one.
If your sensor outputs -10 V to +10 V, apply a fixed -10 V reference to the other (inverting) input of the differential amplifier, so the difference Vin − (−10 V) spans 0 V to 20 V. With a gain of 3.3/20 = 0.165, a sensor output of 0 V then corresponds to 1.65 V out, -10 V to 0 V out, and +10 V to 3.3 V out.
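The offset-and-scale mapping described above can be checked numerically. Below is a minimal Python sketch; the `diff_amp_out` helper is hypothetical and models an ideal amplifier (no offset error, no loading), with the -10 V reference on the inverting input and a gain of 3.3/20:

```python
def diff_amp_out(v_in, v_ref=-10.0, gain=3.3 / 20.0):
    """Ideal differential amplifier: Vout = gain * (V+ - V-).

    v_ref is the voltage on the inverting input; with v_ref = -10 V
    and gain = 0.165, a -10..+10 V input maps to 0..3.3 V.
    """
    return gain * (v_in - v_ref)

# Check the three landmark points from the answer above.
for v in (-10.0, 0.0, 10.0):
    print(f"Vin = {v:+5.1f} V  ->  Vout = {diff_amp_out(v):.2f} V")
```

Note that if the gain were set assuming only a 10 V input span (gain = 0.33), a full ±10 V input would drive the output to 6.6 V, which is the "half/double range" mismatch the question describes.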
Answered by Mark Omo on January 6, 2022