ABSTRACT
A real-time jitter meter is used to measure and digitally sample the pulse-to-pulse timing error in a laser pulse train. The jitter meter is self-referenced using a single-pulse delay line interferometer and measures timing jitter using optical heterodyne detection between two frequency channels of the pulse train. Jitter sensitivity down to 3×10⁻¹⁰ fs²/Hz at 500 MHz has been demonstrated with a pulse-to-pulse noise floor of 1.6 fs. As a proof of principle, the digital correction of the output of a high-frequency photonic analog-to-digital converter (PADC) is demonstrated with an emulated jitter signal. Up to 23 dB of jitter correction, down to the noise floor of the PADC, is accomplished with radio-frequency modulation up to 40 GHz.
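The digital jitter correction described above can be illustrated with a minimal sketch: given a record of ADC samples and the per-sample timing error measured by the jitter meter, the record is resampled back onto the ideal uniform time grid. This is an assumed illustrative implementation, not the authors' algorithm; the sample rate and tone frequency are taken from the abstract, but the jitter magnitude is exaggerated here so the effect is visible with ordinary interpolation.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical sketch of per-sample jitter correction by resampling.
fs = 500e6                              # nominal sample rate (500 MHz pulse train)
n = 2048
t_uniform = np.arange(n) / fs           # ideal (uniform) sample instants

rng = np.random.default_rng(0)
jitter = 10e-12 * rng.standard_normal(n)  # emulated timing error (exaggerated to 10 ps rms)
t_actual = t_uniform + jitter           # instants at which the ADC actually sampled

f_rf = 20e6                             # well-oversampled test tone (illustrative choice)
samples = np.sin(2 * np.pi * f_rf * t_actual)   # jittered record
ideal = np.sin(2 * np.pi * f_rf * t_uniform)    # what a jitter-free ADC would record

# Digital correction: treat the record as samples of a smooth waveform taken at
# the *measured* instants t_actual, and interpolate back onto the uniform grid.
corrected = CubicSpline(t_actual, samples)(t_uniform)

err_before = np.std(samples - ideal)
err_after = np.std(corrected - ideal)
print(err_after < err_before)           # resampling reduces the jitter-induced error
```

In the actual system the correction would be driven by the jitter meter's digitized output rather than an emulated random sequence, and the achievable suppression is bounded by the jitter meter's own noise floor.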
ABSTRACT
Spectral phase ripple associated with novel dispersive devices can distort broadband optical signals. We present a digital postprocessing algorithm that corrects this distortion by exploiting the static, deterministic nature of the ripple. The algorithm is demonstrated with empirical data for several systems employing chirped fiber Bragg gratings (CFBGs). Applying this technique in a photonic time-stretch system incorporating CFBGs improves the signal fidelity by 9 dB. Simulations and experiments show that the algorithm, which can be reduced to a simple interpolation and matrix multiplication, also mitigates additive noise. Notably, distortion correction yields signal fidelity superior to that of an ideal dispersive element.
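The "interpolation and matrix multiplication" structure of the correction can be sketched as follows. This is an assumed illustrative implementation, not the authors' algorithm: a coarsely measured (here, synthetic) phase ripple is interpolated onto the signal's frequency grid, and the inverse phase is applied as a single complex matrix acting on the time-domain samples.

```python
import numpy as np

n = 256
t = np.arange(n)
signal = np.exp(-0.5 * ((t - n / 2) / 10.0) ** 2)   # clean test pulse (illustrative)

# Interpolation step: resample a coarsely sampled phase-ripple measurement
# (hypothetical sinusoidal ripple, in radians) onto the signal's frequency grid.
freqs = np.fft.fftfreq(n)
f_meas = np.linspace(-0.5, 0.5, 65)
phi_meas = 0.5 * np.sin(2 * np.pi * 20 * f_meas)
ripple = np.interp(freqs, f_meas, phi_meas)

# Emulate the CFBG distortion: the static ripple multiplies the spectrum.
distorted = np.fft.ifft(np.fft.fft(signal) * np.exp(1j * ripple))

# Matrix-multiplication step: fold the inverse-phase filter into one matrix,
# C = F^{-1} diag(e^{-i phi}) F, acting directly on the time samples.
F = np.fft.fft(np.eye(n), axis=0)                   # DFT matrix (F @ x == fft(x))
C = np.conj(F) @ np.diag(np.exp(-1j * ripple)) @ F / n
recovered = C @ distorted

print(np.max(np.abs(recovered - signal)))           # residual is at numerical precision
```

Because the ripple is static, the matrix C can be precomputed once from a calibration measurement and reused, so the per-record cost is a single matrix-vector product.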