Also, you should know, microphones don’t detect sound the way our ears do – they lose a LOT of auditory information. A couple of lossy quirks of microphones, vis-a-vis our auditory system, need to be noted. They dramatically affect the way we use mics.
Quirk number one is that microphones cannot distinguish the angle of arrival of various sound artifacts (as our ears do); all artifacts are merged into a single wavetrace that contains no directional information. At the same time, the spectrum of that wavetrace is colored by the microphone's inability to respond to all frequencies equally from all directions (its off-axis response).
Quirk number two is that microphones cannot integrate sound artifacts over time and sort them by phase (as our ears do), so that all early reflections (profoundly useful spatial cues for us humans) end up being interference effects for the microphone.
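That interference is easy to see in a toy model: a direct sound plus a single early reflection sum at the diaphragm into a comb filter, with deep notches at regular frequency intervals. A minimal Python sketch, where the 1 ms delay and the reflection gain are illustrative assumptions, not measurements:

```python
# Sketch of why an early reflection becomes "interference" at the mic:
# the direct path plus one delayed copy act as a comb filter.
# The delay and gain below are illustrative assumptions.
import numpy as np

fs = 48_000                 # sample rate in Hz
delay = 48                  # reflection arrives 1 ms late (~34 cm of extra path)
gain = 0.7                  # reflection slightly quieter than the direct sound

# Impulse response of "direct sound + one early reflection" as the mic sums them
h = np.zeros(delay + 1)
h[0] = 1.0
h[delay] = gain

# Magnitude response on a 1 Hz grid: the comb
H = np.abs(np.fft.rfft(h, n=fs))

print(round(H[500], 3))     # first notch: |1 - 0.7| = 0.3 at fs/(2*delay) = 500 Hz
print(round(H[1000], 3))    # first peak:  |1 + 0.7| = 1.7 at 1000 Hz
```

Our brains integrate that same reflection into a sense of the room; the mic just bakes the notches into the recording, which is why a boomy location that sounds fine to the ear can come back unusable in dailies.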
The net result of these quirks is that a great deal of the sonic information we humans use to make sense of the world around us is lost at the microphone. The two-dimensional map of energy over time that comes down the mic cable is NOT a reasonable representation of the aural information we actually use.
This is all academic until you get on set and an inexperienced producer or director wonders why sound needs another take because of an airplane they couldn't hear, or why they need to hold the roll because of a nearby leaf blower. The mic hears everything, in a much different way than our ears (and, more specifically, our brains) do; what seems like a negligible noise on set becomes insurmountable once it hits the mic diaphragm. Understanding how mics work is akin to understanding how lenses capture light: you can do more with them when you fully grasp their capabilities and limitations.