To find the distance between the layers in a crystalline solid that produces a diffraction peak at 2θ = 60°, we first determine θ. Since 2θ = 60°, it follows that θ = 30°. We then use Bragg's law for diffraction:

nλ = 2d sin θ

Given:
Wavelength of the X-rays: λ = 1.54 Å = 1.54×10⁻¹⁰ m
Order of reflection: n = 1
sin 30° = 0.5

Rearranging Bragg's law to solve for d:

d = nλ / (2 sin θ)

Substituting the given values:

d = (1 × 1.54×10⁻¹⁰ m) / (2 × 0.5) = 1.54×10⁻¹⁰ m = 1.54×10⁻⁸ cm

Thus, the distance between the layers is 1.54×10⁻⁸ cm.
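As a quick check, the same calculation can be sketched in a few lines of Python (the variable names are illustrative, not from the original problem):

```python
import math

# Bragg's law: n * lambda = 2 * d * sin(theta)
wavelength = 1.54e-10   # lambda in meters (1.54 Angstrom)
n = 1                   # first-order reflection
theta_deg = 60.0 / 2    # theta is half the diffraction angle 2*theta = 60 degrees

# Solve Bragg's law for the interlayer spacing d
d = n * wavelength / (2 * math.sin(math.radians(theta_deg)))

print(d)        # spacing in meters  (~1.54e-10 m)
print(d * 100)  # spacing in centimeters (~1.54e-8 cm)
```

Because sin 30° = 0.5, the factor 2 sin θ equals 1 and the spacing simply equals the wavelength here.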