You haven't given sufficient information. What we need to know is the opacity as a function of wavelength, or more specifically, at what temperature the optical depth becomes unity as a function of wavelength.

For the case of the temperature gradient, let us assume that the physical thickness of the slab is big enough not to allow any radiation through it. Further, assume that the opacity is wavelength-independent. In that case the spectrum will be roughly a Planck function at the temperature where the optical depth into the slab is around 2/3, and the surface brightness would be higher than $\sigma T_E^4$. If the opacity depends on wavelength, then you get a pseudo-blackbody spectrum with absorption features - much like the solar spectrum.

A slab at uniform temperature is a physical impossibility if it is emitting radiation, but assuming you could approximate to that, the temperature at which the optical depth reaches 2/3 would be lower than in the steeper temperature-gradient case, so the surface brightness would be closer to $\sigma T_E^4$. If the Sun had the same (or similar) temperature as its effective temperature all the way through, then its spectrum would be that of a blackbody at that temperature and there would be no absorption features.

It is quite difficult, but not impossible, to do the integration if $T$ is just a linear function. First, assume the temperature of a slab of length $L$ is a function of the variable $x$, with $T(0) = T_e$ and $T(L) = T_b$. In the linear case, to simplify, we have
$$T(x) = T_e + \frac{x}{L}\,(T_b - T_e).$$
Then, if you want to know the radiation from the surface of the slab $A_s$ to another surface $A_o$, the heat transfer in W/m² is given by the hard definition of the view factor:
$$F_{s\to o} = \frac{1}{A_s}\int_{A_s}\int_{A_o}\frac{\cos\theta_s\,\cos\theta_o}{\pi r^2}\,\mathrm{d}A_o\,\mathrm{d}A_s.$$
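The "Planck function at the temperature where the optical depth is around 2/3" argument can be sketched numerically. This is a minimal illustration, not taken from the original post: the temperatures (a solar-like effective temperature of 5772 K, and a hypothetical hotter 6000 K for the layer at $\tau \approx 2/3$ in the steep-gradient case) are assumed values chosen only to show that the gradient case is brighter at every wavelength.

```python
import numpy as np

# Physical constants (SI)
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
kB = 1.381e-23  # Boltzmann constant, J/K
sigma = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def planck(wavelength_m, T):
    """Planck spectral radiance B_lambda(T), W m^-3 sr^-1."""
    x = h * c / (wavelength_m * kB * T)
    return 2.0 * h * c**2 / wavelength_m**5 / np.expm1(x)

wl = np.linspace(100e-9, 3000e-9, 500)

# Uniform slab: a pure blackbody at the effective temperature.
spec_uniform = planck(wl, 5772.0)
# Steeper gradient: the tau ~ 2/3 layer is hotter (assumed 6000 K),
# so the emergent spectrum is brighter at every wavelength.
spec_gradient = planck(wl, 6000.0)

# Surface brightness scales as sigma T^4.
print(sigma * 6000.0**4 > sigma * 5772.0**4)  # → True
```

Because the Planck function increases monotonically with temperature at every wavelength, `spec_gradient` exceeds `spec_uniform` across the whole grid, which is the sense in which the gradient case has a surface brightness above $\sigma T_E^4$.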
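The linear profile with $T(0) = T_e$ and $T(L) = T_b$ is easy to integrate numerically. The sketch below (with made-up endpoint temperatures, not values from the original) evaluates the mean of $\sigma T(x)^4$ along the slab; the full surface-to-surface exchange would additionally weight this by the view factor, which is omitted here for brevity.

```python
import numpy as np

sigma = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def T_linear(x, L, T_e, T_b):
    """Linear temperature profile: T(0) = T_e, T(L) = T_b."""
    return T_e + (x / L) * (T_b - T_e)

# Illustrative (assumed) values for the slab.
L, T_e, T_b = 1.0, 300.0, 600.0

# Trapezoidal integration of sigma * T(x)^4 over the slab, divided by L.
x = np.linspace(0.0, L, 1001)
f = sigma * T_linear(x, L, T_e, T_b)**4
mean_emission = np.sum((f[:-1] + f[1:]) * np.diff(x) / 2.0) / L

# T^4 is convex, so the mean of sigma T^4 exceeds sigma (mean T)^4.
print(mean_emission > sigma * (0.5 * (T_e + T_b))**4)  # → True
```

For a linear profile the integral even has a closed form, $\int_0^L \sigma T^4\,\mathrm{d}x / L = \sigma (T_b^5 - T_e^5)/[5L'(T_b - T_e)/L \cdot L] = \sigma (T_b^5 - T_e^5)/[5(T_b - T_e)]$, which the numerical result matches; it is the view-factor weighting, not the $T(x)$ dependence, that makes the full problem difficult.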