SECTION 7 PERFORMANCE TEST
7.3.3 Output level accuracy
(1) Test specifications
• Output level accuracy ±1 dB (–127 dBm ≤ level ≤ +17 dBm)
±3 dB (level < –127 dBm)
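The two tolerance bands above can be expressed as a small helper function (a sketch only; the function name is ours, not Anritsu's):

```python
def level_accuracy_spec_db(set_level_dbm: float) -> float:
    """Return the output level accuracy tolerance (+/- dB) for a given
    MG3641A/MG3642A output level setting, per the test specification."""
    if -127.0 <= set_level_dbm <= 17.0:
        return 1.0   # +/-1 dB from -127 dBm up to +17 dBm
    if set_level_dbm < -127.0:
        return 3.0   # +/-3 dB below -127 dBm
    raise ValueError("setting above the +17 dBm maximum output level")
```

For example, `level_accuracy_spec_db(0)` returns `1.0` and `level_accuracy_spec_db(-130)` returns `3.0`.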
(2) Measuring instrument for test
• MS2602A Spectrum analyzer (100 kHz to 2080 MHz)
• Pre-amplifier (100 kHz to 2080 MHz, gain 40 dB, noise figure ≤10 dB)
• Fixed attenuator (100 kHz to 2080 MHz, attenuation 3 dB)
(3) Setup
Fig. 7-3 Output Level Accuracy Test
(4) Test procedure (For the test, use the calculation sheet of the performance test result sheet in Appendix E.)
STEP PROCEDURE
1. Execute the internal calibration (ALL CAL) of the MS2602A.
2. Set the MG3641A/MG3642A frequency to the measurement frequency, and the level to +17 dBm.
Do not use the pre-amplifier for this measurement.
3. Set the MS2602A to time-domain sweep mode, reference level +22 dBm, RF attenuator 45 dB, RBW
10 Hz, VBW 10 Hz, and sweep time 50 ms. Fine-adjust the center frequency around the measurement
frequency to maximize the marker level.
4. Record the marker level displayed. (Ma+17)
5. Set the MG3641A/MG3642A output level to +16 dBm, and the MS2602A reference level to +21 dBm.
Record the marker level displayed. (Ma+16)
6. Decreasing the MG3641A/MG3642A output level in 1 dB steps down to –8 dBm, repeat the
measurement at each setting and record the marker levels displayed. (Ma+15 to Ma–8)
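The recorded marker levels Ma+17 through Ma–8 can be reduced to level-accuracy errors and checked against the ±1 dB specification with a short script (a sketch only; the sample readings below are hypothetical and do not come from the result sheet in Appendix E):

```python
# Sketch: compute the output level accuracy error for each MG3641A/MG3642A
# setting from the recorded MS2602A marker levels (hypothetical values).

SPEC_DB = 1.0  # +/-1 dB spec applies for settings from -127 dBm to +17 dBm

# Set output level (dBm) -> recorded marker level Ma (dBm)
readings = {17: 17.2, 16: 15.9, 15: 14.8, 0: -0.3, -8: -8.4}

for set_level in sorted(readings, reverse=True):
    error = readings[set_level] - set_level   # level accuracy error in dB
    ok = abs(error) <= SPEC_DB                # pass/fail against +/-1 dB
    print(f"{set_level:+4d} dBm: error {error:+.1f} dB  "
          f"{'PASS' if ok else 'FAIL'}")
```

Each error is simply the measured marker level minus the set level; in practice the entries of `readings` would be transcribed from the calculation sheet.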
[Fig. 7-3 setup diagram: MG3641A/MG3642A output → 3 dB fixed attenuator → pre-amplifier → MS2602A spectrum analyzer INPUT]