Abstract
NUMEROUS investigations of the effect of dispersion on atomic scattering factors for X-rays have shown that, when the wave-length of the X-radiation is comparable with the wave-length corresponding to an absorption edge of the scattering atom, the value of the scattering factor f is lowered by an amount Δf depending on the proximity of the incident wave-length to the absorption edge; the effect is in many respects similar to anomalous dispersion in the optical region. The agreement between experiment and theory regarding the magnitude of Δf has been very rough and, as Williams1 has pointed out, “the results obtained by different observers do not show a consistent departure from the calculated results”. It is clear, therefore, that the blame cannot be laid wholly on the assumptions involved in the calculations.
References
1. E. J. Williams, Proc. Roy. Soc., A, 143, 358; 1934.
2. H. Hönl, Ann. Phys., 18, 625; 1933.
Cite this article
BRINDLEY, G., SPIERS, F. Effect of Dispersion and of Lattice Distortion on the Atomic Scattering Factor of Copper for X-Rays. Nature 134, 850 (1934). https://doi.org/10.1038/134850a0