3

Could the same principle behind LEDs be applied to chip (silicon or gallium) design to shift the infrared heat towards the blue range and then direct it away from the chip with fiber, allowing the chip to remain cooler? This would change the wavelength of the radiation from infrared (heat) to the blue or green range, letting the chip run cooler by redirecting the light.

JRE
Xeno
    The laws of thermodynamics place strict limits on what you can do with waste heat, including how you can radiate it. So while you can make a chip that can lose energy by emitting light, you typically cannot use the same energy to make blue light as you use to do computation or other useful work. That must be lost as IR light as determined by the chip's temperature. – user1850479 Feb 18 '24 at 22:47
  • But isn't some of the heat due to the band gap of the semiconductor? – Xeno Feb 18 '24 at 22:51
  • @Xeno -- interesting question. Keep going. Keep that out-of-the-box thinking alive. And keep cultivating the skills that you have so that either you can prove (at least one of) your ideas yourself, or pay somebody else to prove them for you. You can also try the crowdfunding route, but one idea is likely all you'll get. After that, you're too much of an investment risk (until you've made it in some other area, at least). – MicroservicesOnDDD Feb 19 '24 at 01:11
  • @MicroservicesOnDDD Thanks! I often think out-of-the-box; it's given me a lot of ideas, some of which are starting to take shape and many of which are long dead. I have 2 or 3 I am working on right now (in which I still haven't found any flaws, yet...), one of which I was hoping this could help with, but I have a lot to learn about electrical engineering. – Xeno Feb 19 '24 at 04:34

2 Answers

8

Not really. The main sources of heat generation inside e.g. a computer processor are not recombination-based and would therefore not be affected by changing the bandgap. In addition, the processor voltage would likely have to rise and the layout would need to change to make sure that more light left the chip than was reabsorbed inside it, both of which would probably negate whatever small heat dissipation is achievable by turning it into light.
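
To put rough numbers on the voltage point: this is only an illustrative sketch, and the wavelength, bandgap, and core-voltage figures below are assumed textbook values rather than data from any particular part.

    # Rough, illustrative numbers: compare the energy of a blue photon
    # with silicon's bandgap and a typical logic supply voltage.
    h = 6.626e-34   # Planck constant, J*s
    c = 2.998e8     # speed of light, m/s
    q = 1.602e-19   # elementary charge, C (also J per eV)

    wavelength_blue = 460e-9                    # assumed blue wavelength, m
    E_blue_eV = h * c / (wavelength_blue * q)   # photon energy, ~2.7 eV

    E_gap_si_eV = 1.12   # silicon bandgap at room temperature, eV
    V_core = 1.0         # assumed CPU core voltage, V (~1 eV per carrier)

    print(f"Blue photon energy: {E_blue_eV:.2f} eV")
    print(f"Silicon bandgap:    {E_gap_si_eV:.2f} eV")
    print(f"Energy per carrier at {V_core:.1f} V core voltage: {V_core:.1f} eV")

    # A blue emitter needs a bandgap (and drive voltage) around 2.7-3 eV,
    # well above both silicon's gap and today's ~1 V logic supplies.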

vir
  • Doesn't the bandgap release IR, thereby increasing processor heat? – Xeno Feb 18 '24 at 22:50
  • 1
    Not IR photons, but phonons, which are vibrations in the crystal lattice. – vir Feb 18 '24 at 22:51
  • So the bandgap does not release IR photons and is irrelevant to the processor temperature? – Xeno Feb 18 '24 at 22:53
  • Am I wrong to assume that a higher bandgap would decrease the net temperature of the processor? – Xeno Feb 18 '24 at 22:55
  • 1
    Yes, that assumption is incorrect. – vir Feb 18 '24 at 22:57
  • So the bandgap is irrelevant to the heat produced? – Xeno Feb 18 '24 at 22:58
  • If you increase the bandgap to the level where you are producing visible light photons, other sources of heat generation in the processor will have surpassed the light generation many times over. – vir Feb 18 '24 at 23:01
  • But higher bandgaps also allow for higher temperatures and higher frequencies, so overall it would be a positive improvement in performance, would it not? – Xeno Feb 18 '24 at 23:03
  • 1
    I have no idea how you came to that conclusion. – vir Feb 18 '24 at 23:07
  • Higher band gaps contract as heat increases, giving more tolerance to temperature – Xeno Feb 18 '24 at 23:08
  • I'm sorry, I just realized I have shifted off topic toward wide-bandgap transistors and am no longer on light. Thank you for your answer. – Xeno Feb 18 '24 at 23:11
  • Got it: emitting light has no benefit because it would induce more heat; even in high-heat-tolerance wide-bandgap devices it would increase heat and not be at all helpful. – Xeno Feb 18 '24 at 23:14
  • Rather than switching applications, what about applications where you intend to dissipate a controlled amount of energy in the transistor due to using it as a variable resistor, like high power AB amplifiers? Would you be able to get away with converting less of the energy to heat, even if the same overall energy is consumed? It wouldn't increase efficiency, but might allow higher amplification at the same temperature. – user1937198 Feb 19 '24 at 13:18
  • Still doubtful, I think. You'd have to change the material around to allow photon emission and the fraction of energy emitted as photons would probably be less than that of a "pure" LED, which itself is <50%. Then, you'd have the problem of getting rid of a bunch of heat and how to get several watts of light out of the same package. – vir Feb 19 '24 at 18:03
3

I think you are mixing two related but distinct concepts here: heat and infrared emission.

While a certain amount of heat is generated in an electronic device, it is not infrared light at any point. It is heat, a chaotic movement of the particles, and it is usually carried away by conduction and convection.

On the other hand, these days we do have some very efficient LEDs. There, we force the electrons to jump the bandgap under conditions that don't favor dissipating the energy as heat, so the electrons have no choice other than to release their energy as light. Even there, some 50% (or more) of the energy becomes heat.
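
As a best-case sketch of that limit: the wavelength and forward voltage below are assumed, typical figures, and the ratio is only an upper bound per carrier.

    # Upper-bound sketch: fraction of the drive energy a blue LED could
    # emit as light. Wavelength and forward voltage are assumed values.
    h, c, q = 6.626e-34, 2.998e8, 1.602e-19

    wavelength = 450e-9   # assumed emission wavelength, m
    V_forward = 3.0       # assumed forward voltage, V

    E_photon_eV = h * c / (wavelength * q)     # ~2.76 eV
    ideal_fraction = E_photon_eV / V_forward   # photon energy / drive energy

    print(f"Photon energy: {E_photon_eV:.2f} eV")
    print(f"Ideal light fraction per carrier: {ideal_fraction:.0%}")

    # Non-radiative recombination, extraction losses and efficiency droop
    # push real devices well below this bound, so a sizeable share of the
    # input power still ends up as heat in the package.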

In a transistor (be it bipolar or MOS, or a whole lot of MOS transistors in a computer chip), few electrons find their way across the bandgap, and silicon itself, with its indirect bandgap, is well known to favor heat over photons when they do. Silicon is also not very transparent, so the few photons that are generated have little chance of getting out of the crystal.

In short, there is little to be gained from extracting the photons from the chip.
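
For a sense of scale on why conduction does the work: the area, temperatures, and power below are assumed round numbers, and the lid is treated as an ideal black body, which overstates what a real package can radiate.

    # Order-of-magnitude sketch: how much heat could a CPU package radiate?
    # Area, temperatures, and power are assumed round numbers.
    sigma = 5.670e-8         # Stefan-Boltzmann constant, W/(m^2*K^4)

    area = 30e-3 * 30e-3     # assumed exposed lid area, m^2 (30 mm x 30 mm)
    T_case = 273.15 + 80     # assumed case temperature, K (80 degC)
    T_amb = 273.15 + 25      # ambient temperature, K (25 degC)

    P_radiated = sigma * area * (T_case**4 - T_amb**4)   # emissivity = 1
    P_dissipated = 65.0                                  # assumed CPU power, W

    print(f"Radiated (black-body best case): {P_radiated:.2f} W")
    print(f"Fraction of a {P_dissipated:.0f} W budget: {P_radiated / P_dissipated:.1%}")

    # Even the ideal black-body figure is a fraction of a watt, so essentially
    # all of the heat has to leave by conduction into the heatsink.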

fraxinus