If the signal is periodic, two cases arise: either an integral number of cycles fits exactly into the measurement time, or it does not.
There is a direct relation between a signal's duration in time and the width of its frequency spectrum: the shorter the duration, the broader the spectrum.
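A minimal numerical sketch of this relation, assuming NumPy is available (the sample rate and FFT length are arbitrary choices for illustration): the spectrum of a rectangular pulse of duration T has a main lobe whose first null sits near 1/T, so halving the duration roughly doubles the spectral width.

```python
import numpy as np

FS = 1000.0              # sample rate in Hz (hypothetical value)
N = 4096                 # FFT length; zero-padding interpolates the spectrum
t = np.arange(N) / FS
freqs = np.fft.rfftfreq(N, d=1 / FS)

def first_null_hz(duration):
    """Frequency of the first spectral null of a rectangular pulse
    lasting `duration` seconds; for such a pulse it lies near 1/duration."""
    pulse = (t < duration).astype(float)
    s = np.abs(np.fft.rfft(pulse))
    i = 1
    while s[i] < s[i - 1]:      # walk down the main lobe from DC
        i += 1
    return freqs[i - 1]

# Halving the duration roughly doubles the width of the main lobe:
# first_null_hz(0.5) is near 2 Hz, first_null_hz(0.25) near 4 Hz.
```

The same effect is at work for the end 'glitches' discussed below: being short, they occupy a wide band.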
The 'glitches' are short signals, so they have a broad frequency spectrum, and this broadening is superimposed on the frequency spectrum of the actual signal.
This broadening of the frequency spectrum determines the frequency resolution - the ability to resolve (that is, to distinguish between) two adjacent frequency components.
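A small experiment, again a sketch with hypothetical parameter values, illustrates resolution as a function of measurement time: two tones 1 Hz apart are only resolved (a clear valley appears between their peaks) when the record is long enough that 1/T is smaller than their spacing.

```python
import numpy as np

FS = 1000.0   # sample rate, Hz (hypothetical)
NPAD = 8000   # zero-padded FFT length, giving 0.125 Hz bin spacing

def dip_ratio(record_seconds, f1=50.0, f2=51.0):
    """Spectrum magnitude midway between two tones, relative to the
    stronger peak.  A small ratio means the tones are resolved."""
    t = np.arange(int(record_seconds * FS)) / FS
    x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
    s = np.abs(np.fft.rfft(x, n=NPAD))
    freqs = np.fft.rfftfreq(NPAD, d=1 / FS)
    at = lambda f: s[np.argmin(np.abs(freqs - f))]
    return at((f1 + f2) / 2) / max(at(f1), at(f2))

# 2 s record:    1/T = 0.5 Hz  -> deep valley, the tones are resolved
# 0.25 s record: 1/T = 4 Hz    -> the 1 Hz spacing cannot be resolved
```

Note that the zero-padding only interpolates the spectrum for plotting; it does not improve the resolution, which is fixed by the record length alone.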
Only in the one happy circumstance where an integral number of cycles fits exactly into the measurement time do we get the expected frequency spectrum. In all other cases the frequency spectrum is broadened by the 'glitches' at the ends. Matters are made worse because the size of the glitch depends on where in the cycle the measurement happens to start - so the broadening will change if the measurement is repeated.
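The contrast between the happy case and all the others can be shown directly; the measurement window and cycle counts below are illustrative choices, not values from the text. With a whole number of cycles, essentially all the spectral energy lands in a single FFT bin; add half a cycle and the end glitch smears a large part of it across the rest of the spectrum.

```python
import numpy as np

def peak_energy_fraction(cycles, n=1024):
    """Fraction of total spectral energy in the single largest FFT bin
    for a sine completing `cycles` cycles in the measurement window."""
    t = np.arange(n) / n                      # one measurement window
    x = np.sin(2 * np.pi * cycles * t)
    p = np.abs(np.fft.rfft(x)) ** 2
    return p.max() / p.sum()

# 32 cycles fit exactly: nearly all energy lands in one bin.
# 32.5 cycles: the end glitch spreads energy across many bins.
```

Repeating the non-integer case with a different starting phase changes the glitch, and with it the detailed shape of the smeared spectrum - exactly the repeat-to-repeat variation described above.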
For example, a sine wave 'should' have a frequency spectrum consisting of a single line. But in practice, if measured say by a spectrum analyser, the frequency spectrum will be a broad line - with the sides flapping up and down like Batman's cloak. When we do see a perfect single-line spectrum - for example in the charts sometimes provided with analogue-to-digital converter chips - it has in fact been obtained by tuning the signal frequency carefully so that an exact number of periods fits the measurement time, giving the best obtainable frequency spectrum.