I have a Fluke Ti25 Thermal Imaging Camera that I used on a 1,200 W PFC controller to measure the power FET and power diode temperatures. The Fluke camera has an emissivity adjustment, which I set to its copper value (I don't recall the number); I chose this because my heat sink was a copper bar. Using both my finger (with the power off, of course) and a thermocouple, I'm confident the actual temperature was about 85 °C, but the Fluke IR camera read 15 °C. Clearly this reading was wrong: my lab is in Southern California, the measurement was made in late summer, the lab never gets below 70 °F (21.1 °C), and the heat sink was hot to the touch.
The literature says the error may come from reflected energy rather than emitted energy, so I changed the angle of the camera with respect to the heat sink, but it didn't make much difference. The literature also says to put black tape on the surface or to paint it black, but both of those would be problematic on my heat sink, since they would interfere with convection cooling from the copper bar. The devices were a TO-247 FET and a TO-247 SiC diode on a 4" x 1" x 0.19"-thick copper bar, each mounted with a 6-32 screw and nut and tightened appropriately.
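For reference, here is my understanding of the radiometric balance the literature describes, as a minimal sketch (a simplified Stefan-Boltzmann model with no atmosphere term; the 0.05 and 0.10 emissivity values are illustrative assumptions, not Fluke's actual presets). It shows why, at bare copper's very low emissivity, the camera signal is dominated by the reflected background, so even a small mismatch between the set and actual emissivity throws the reported temperature off by tens of degrees:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def camera_signal(t_obj_c, t_refl_c, emissivity):
    """Radiance-proportional signal at the lens: emitted plus reflected parts."""
    t_obj = t_obj_c + 273.15
    t_refl = t_refl_c + 273.15
    return emissivity * SIGMA * t_obj**4 + (1.0 - emissivity) * SIGMA * t_refl**4

def reported_temp_c(signal, t_refl_c, emissivity_setting):
    """Temperature the camera infers when it assumes `emissivity_setting`."""
    t_refl = t_refl_c + 273.15
    t4 = (signal - (1.0 - emissivity_setting) * SIGMA * t_refl**4) / (
        emissivity_setting * SIGMA
    )
    return t4 ** 0.25 - 273.15

# Bare copper bar at a true 85 C, reflecting 25 C surroundings.
# Emissivity values here are hypothetical, chosen only to illustrate.
sig = camera_signal(85.0, 25.0, 0.05)
print(reported_temp_c(sig, 25.0, 0.05))  # matched setting recovers ~85 C
print(reported_temp_c(sig, 25.0, 0.10))  # small mismatch -> reads ~59 C
```

With an emissivity of only 0.05, roughly 95% of what the camera sees is the reflected room, which is why changing the viewing angle barely helped and why a high-emissivity patch (tape or paint) is the usual fix.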
Has anyone else had similar problems, and if so, do you have any tips on how to get more accurate temperature values from the IR camera?