I have wired two different DHT22 sensors to an Arduino Nano board. The two temperature readings are roughly comparable, but the sensors produce very different humidity readings.

The sensors are two DHT22 modules (product photos omitted).

The simple sketch is built on one of the basic library examples, as follows:

    #include "DHT.h"
    #define DHTPIN_2 2     // Digital pin connected to the DHT sensor_1
    #define DHTPIN_3 3     // Digital pin connected to the DHT sensor_2
    #define DHTTYPE DHT22   // DHT 22  (AM2302), AM2321
    #define DHTTYPE_11 DHT11   // DHT 11 (unused in this sketch)

    DHT dht_1(DHTPIN_2, DHTTYPE);
    DHT dht_2(DHTPIN_3, DHTTYPE);


    void setup() {
      Serial.begin(9600);   // match this baud rate in the Serial Monitor
      Serial.println(F("DHT22 test!"));
      dht_1.begin();
      dht_2.begin();
    }

    void loop() {
      delay(5000);   // DHT22 needs at least ~2 s between reads
      float h_1 = dht_1.readHumidity();
      float t_1 = dht_1.readTemperature();
      float h_2 = dht_2.readHumidity();
      float t_2 = dht_2.readTemperature();

      // Check if any reads failed and exit early (to try again).
      if (isnan(h_1) || isnan(t_1)) {
        Serial.println(F("Failed to read from DHT sensor_1!"));
        return;
      }
      if (isnan(h_2) || isnan(t_2)) {
        Serial.println(F("Failed to read from DHT sensor_2!"));
        return;
      }

      Serial.println(F("Humidities:"));
      Serial.print(F("s_1: "));
      Serial.print(h_1);
      Serial.print(F(" s_2: "));
      Serial.println(h_2);
      Serial.println(F("Temperatures:"));
      Serial.print(F("s_1: "));
      Serial.print(t_1);
      Serial.print(F(" s_2: "));
      Serial.println(t_2);
    }

The resulting output on the Serial Monitor is:

    Temperatures:
    s_1: 25.30 s_2: 23.40
    Humidities:
    s_1: 6.20 s_2: 66.40
    Temperatures:
    s_1: 25.20 s_2: 23.40
    Humidities:
    s_1: 6.00 s_2: 66.10
    Temperatures:
    s_1: 25.10 s_2: 23.40
    ...

The humidity reading from sensor_1 seems defective; it is roughly a tenth of the value reported by sensor_2. Is there a fix for that? Is it possible to recalibrate sensor_1, or is it broken?

Edit: after swapping the sensors between the two pins (as suggested by jsotola), the output is:

    Temperatures:
    s_1: 22.60 s_2: 24.70
    Humidities:
    s_1: 67.80 s_2: 7.30

It's the sensor, it seems.

1 Answer


The difference in humidity readings may be traced to any number of causes, including a faulty part, incorrect programming, or a low supply voltage (please post the voltage at which you are running the sensors), to name a few.
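If the supply is in doubt, here is a minimal sketch for checking it, assuming an ATmega328-based Nano: it estimates VCC by measuring the internal ~1.1 V bandgap reference against it (the 1125300 constant is nominal and usually needs per-board calibration).

    long readVcc() {
      // Select the internal ~1.1 V bandgap channel, with AVcc as reference
      ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
      delay(2);                          // let the reference voltage settle
      ADCSRA |= _BV(ADSC);               // start a conversion
      while (bit_is_set(ADCSRA, ADSC));  // wait for it to finish
      long result = ADCL;                // must read ADCL before ADCH
      result |= ADCH << 8;
      return 1125300L / result;          // 1.1 V * 1023 * 1000 -> VCC in mV
    }

The DHT22 is specified for roughly 3.3 V to 6 V, so a VCC sagging well below 5 V under load is worth ruling out.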

Let us speculate that one of the sensors is reporting relative humidity and the other absolute humidity. Absolute humidity does not consider temperature, whereas relative humidity does. From this web site we get the following conversion equation:

    AH = (RH × P_sat) / (462.5 × (273.15 + T) × 100)

where AH is the absolute humidity in kg/m³, RH is the relative humidity in %, P_sat is the saturation vapor pressure in Pa, T is the temperature in C, and 462.5 J/(kg·K) is the specific gas constant of water vapor.

This equation converts relative humidity to absolute humidity. Finding the saturation vapor pressure, measured in pascals, is not trivial, so we will use this web site to calculate that value.
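If you would rather compute the conversion in code, the following is a rough sketch, assuming a Magnus-type approximation (Alduchov-Eskridge coefficients) for the saturation vapor pressure; it matches the web calculator to within a few tenths of a percent at room temperature.

    #include <math.h>

    // Saturation vapor pressure in Pa for a temperature in deg C,
    // using a Magnus-type approximation (Alduchov-Eskridge coefficients).
    float saturationVaporPressure(float tempC) {
      return 610.94 * exp(17.625 * tempC / (tempC + 243.04));
    }

    // Absolute humidity in g/m^3 from relative humidity in % and temperature
    // in deg C, using the same 462.5 J/(kg*K) gas constant as the hand
    // calculation below.
    float absoluteHumidity(float relHumidity, float tempC) {
      float pSat = saturationVaporPressure(tempC);
      float ahKgPerM3 = (relHumidity * pSat) / (462.5 * (273.15 + tempC) * 100.0);
      return ahKgPerM3 * 1000.0;   // kg/m^3 -> g/m^3
    }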

Let us use the sample readings from the edit above:

  • Temperature 1: 22.60 C
  • Temperature 2: 24.70 C
  • (assumed) Relative Humidity: 67.80 %
  • (assumed) Absolute Humidity: 7.30 g/m³

So the average temperature is 23.65 C. Using this temperature and the above web page, the saturation vapor pressure is 2923 Pa.

    AH = (67.80 × 2923) / (462.5 × (273.15 + 23.65) × 100)
    AH = 0.014437 kg/m³

As absolute humidity is normally written in grams per cubic meter:

    AH = 0.014437 × 1000 ≈ 14.44 g/m³
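(For reference, absoluteHumidity(67.80, 23.65) from the sketch above returns about 14.40; the small difference from the hand calculation comes from the Magnus approximation of the vapor pressure.)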

14.44 is not 7.30, so the relative-versus-absolute hypothesis does not explain the discrepancy either. Consider your options. Check your circuit, voltages, and software (does varying the amount of time between samples change the values?). Also consider running more tests under better conditions, such as placing both sensors inside the same sealed container over a long period of time to mitigate exposure to different conditions.
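As a rough sketch of such a test, reusing the dht_1 and dht_2 objects from the question (the helper name and constants here are made up for illustration), averaging a batch of paired readings makes a constant offset stand out from noise, and SAMPLE_MS lets you vary the time between samples:

    // Hypothetical helper; assumes the dht_1/dht_2 objects from the question.
    const unsigned long SAMPLE_MS = 10000;  // vary the time between samples here
    const int N_SAMPLES = 30;

    void compareHumidity() {
      float sum1 = 0, sum2 = 0;
      int good = 0;
      for (int i = 0; i < N_SAMPLES; i++) {
        delay(SAMPLE_MS);
        float h1 = dht_1.readHumidity();
        float h2 = dht_2.readHumidity();
        if (!isnan(h1) && !isnan(h2)) {   // keep only paired, valid reads
          sum1 += h1;
          sum2 += h2;
          good++;
        }
      }
      if (good > 0) {
        Serial.print(F("mean s_1: "));
        Serial.print(sum1 / good);
        Serial.print(F("  mean s_2: "));
        Serial.print(sum2 / good);
        Serial.print(F("  ratio s_1/s_2: "));
        Serial.println((sum1 / good) / (sum2 / good));
      }
    }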


Added later...

Evidently, problems with the DHT11 humidity sensor are well known.
