Why doesn't your microphone preamplifier need a meter?
Some microphone preamplifiers have meters for every channel. Others, such as the Focusrite Octopre LE, have just one meter that can be switched to whichever channel you need to check.
But the strange thing is that microphone preamplifiers don't need meters at all.
It's all a matter of gain structure. Let's see how it works in a conventional mixing console...
A conventional mixing console has a microphone preamplifier on every input channel. There are no meters specifically dedicated to the preamps, but there will be meters on the group and master outputs.
The usual way to set the microphone preamplifier gain on a conventional mixing console is to press the PFL (solo) button on that channel. Not only do you hear the channel in isolation, but the channel's signal, taken from a point before the fader, is sent to the meters.
All you need to do is set the microphone preamplifier gain so that there is a good healthy reading on the meters.
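Since dBu levels are logarithmic, the gain needed to bring a mic-level signal up to a healthy meter reading is a simple subtraction. Here is a minimal sketch; the -50 dBu mic level and 0 dBu target are illustrative assumptions, not figures from any particular console:

```python
# Hypothetical example of working out preamp gain from levels in dBu.
# The figures used here are illustrative assumptions, not standards.

def required_gain_db(source_level_dbu: float, target_level_dbu: float) -> float:
    """Gain (in dB) needed to raise a source level to a target meter level.

    Levels in dBu are logarithmic, so the required gain is a subtraction.
    """
    return target_level_dbu - source_level_dbu

# A mic delivering around -50 dBu needs about 50 dB of gain
# to reach a healthy 0 dBu reading on the meters.
print(required_gain_db(-50.0, 0.0))  # → 50.0
```

In practice you turn the gain knob until the meter reads correctly rather than calculating anything, but the arithmetic shows why gain controls are marked in dB.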
Let's substitute the conventional mixing console with a separate microphone preamplifier, or a set of them, and a digital audio workstation (DAW).
Each individual preamp is connected to one channel of the DAW. And since any DAW worth using has metering of the input level on every channel, these meters take the place of the meters of the conventional console.
So now you have metering of the output of the mic preamps all the time, and the preamp itself doesn't need a meter at all.
However, there is a question of gain structure...
In a conventional mixing console, the highest possible level is the same for the output of the microphone preamplifier and all of the following circuitry. So unless the designer has done something really odd, if the output signal from the preamp section is within the maximum allowable level, then it is OK for everything that follows.
However, this is not necessarily the case for the combination of a microphone preamplifier and a DAW.
Ideally the maximum output level of the preamplifier should correspond to the clipping point of the audio interface of the DAW.
That way, the signal-to-noise ratio is optimized. However, the maximum output level of the preamp may be higher than the clipping point of the interface, in which case a lower gain setting will be used, leaving some unused - and unusable - headroom in the preamp.
It may be possible to reduce the input sensitivity of the interface. But the likely method is simply to reduce the level before the active circuitry, which doesn't really make things any better.
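The mismatch is easy to quantify. As a rough sketch, with made-up but plausible figures (a preamp capable of +28 dBu feeding an interface that clips at +24 dBu; these are illustrative assumptions, not values from the text):

```python
# Hypothetical figures: a preamp with +28 dBu maximum output feeding
# an audio interface that clips at +24 dBu. Illustrative assumptions only.

def unusable_headroom_db(preamp_max_dbu: float, interface_clip_dbu: float) -> float:
    """Headroom in the preamp that can never be used, in dB.

    Any output capability above the interface's clipping point is wasted:
    the gain must be set lower to avoid clipping the interface, so the
    preamp's top few dB are never exercised.
    """
    return max(0.0, preamp_max_dbu - interface_clip_dbu)

print(unusable_headroom_db(28.0, 24.0))  # → 4.0 dB of wasted headroom
print(unusable_headroom_db(22.0, 24.0))  # → 0.0, the levels are matched
```

When the result is zero (or the preamp's maximum exactly equals the interface's clipping point), the combination behaves like the single-manufacturer console: every dB of the preamp's range is usable.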
In an ideal world, the output capability of the preamp would be matched to the requirements of the interface at the design stage. But since they are in all probability made by different manufacturers, this would only happen by coincidence.
To be honest, few people will notice any difference. But the conventional mixing console is optimized right out of the box. The preamp/DAW combination may always be a compromise.