R&D Testing Procedures

All defined R&D test procedures are listed here. These tests are meant as a tool for Ettus R&D to enable faster and more reliable development. Note these tests are no replacement for manufacturing or production tests, and should not be treated as such. Instead, they are meant to catch common failure modes during development. As a result, test definitions are fairly light-weight.

GPSDO Tests

| Test Code | Device | Peripherals | Manual Test Procedure | Automatic Test Procedure |
|---|---|---|---|---|
| GPS-X310-TCXO-v1 | USRP X310 | Jackson Labs TCXO | GPSDO: Manual Test Procedure | GPSDO: Automatic Test Procedure |
| GPS-X310-OCXO-v1 | USRP X310 | Jackson Labs OCXO | GPSDO: Manual Test Procedure | GPSDO: Automatic Test Procedure |
| GPS-X300-TCXO-v1 | USRP X300 | Jackson Labs TCXO | GPSDO: Manual Test Procedure | GPSDO: Automatic Test Procedure |
| GPS-X300-OCXO-v1 | USRP X300 | Jackson Labs OCXO | GPSDO: Manual Test Procedure | GPSDO: Automatic Test Procedure |
| GPS-B200-TCXO-v1 | USRP B200 | Jackson Labs TCXO | GPSDO: Manual Test Procedure | GPSDO: Automatic Test Procedure |
| GPS-B210-TCXO-v1 | USRP B210 | Jackson Labs TCXO | GPSDO: Manual Test Procedure | GPSDO: Automatic Test Procedure |

Recommendations

For cursory testing, not all tests within a device family are required (e.g., only testing the OCXO on any X-Series, and testing the TCXO on any B-Series is sufficient).

The following tests are recommended for a minimum test (N stands for the latest version of this test):

  • One of GPS-X310-OCXO-vN or GPS-X300-OCXO-vN
  • One of GPS-B210-TCXO-vN or GPS-B200-TCXO-vN

Requirements

All of these tests require a device that is GPSDO capable (e.g., X3x0, B2x0, N2x0). For those devices that have a separate GPS component (such as the Jackson Labs GPSDOs), this component is also required (called the "peripheral" in the following).

GPSDO: Manual Test Procedure

  1. Without connecting the peripheral to the device, run uhd_usrp_probe on the device and verify that the absence of a GPSDO is correctly reported. No warnings or errors may be printed.
  2. This and the following tests are run with the peripheral connected (power down the device before connecting the peripheral): Run uhd_usrp_probe and verify that the GPSDO is correctly reported. The GPSDO must be reported as found, and no errors or warnings may be printed.
  3. OCXO only: Without connecting the GPS antenna input, run query_gpsdo_sensors. To pass, the tool must report the GPSDO as found and lock to the external reference, but report that it is not locked to GPS. In case of success, the tool will report a valid GPS time and a string such as "GPS and UHD Device time are aligned".
  4. Connect a GPS antenna to the input and make sure it is in a position to receive GPS satellite data. Confirm that GPS lock is reported using query_gpsdo_sensors within 20 minutes of connecting the antenna. The tool query_gpsdo_sensors will print a string such as "GPS Locked" in case of success.

All of these tests must pass for a 'pass' validation.
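
Steps 3 and 4 lend themselves to light scripting. The following is only a minimal sketch, not an official test script: it assumes query_gpsdo_sensors is in the PATH, that it returns a non-zero exit code when a sensor check fails, and that the "GPS Locked" string quoted above matches what your UHD version prints (verify all three before relying on it):

    #!/bin/bash
    # Sketch only: wrap query_gpsdo_sensors and grep for the success string
    # quoted in step 4 above. Adjust the pattern if your UHD version differs.
    OUTPUT=$(query_gpsdo_sensors 2>&1)
    STATUS=$?
    echo "$OUTPUT"
    if [ $STATUS -ne 0 ]; then
        echo "FAIL: query_gpsdo_sensors exited with a non-zero status"
        exit 1
    fi
    if ! echo "$OUTPUT" | grep -q "GPS Locked"; then
        echo "FAIL: no GPS lock reported"
        exit 1
    fi
    echo "PASS"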

GPSDO: Automatic Test Procedure

tbd

Devtests

| Test Code | Device | Peripherals | Manual Test Procedure | Automatic Test Procedure |
|---|---|---|---|---|
| DEVTEST-X310-XG-v1 | USRP X310 | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |
| DEVTEST-X310-HG-v1 | USRP X310 | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |
| DEVTEST-X300-XG-v1 | USRP X300 | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |
| DEVTEST-X300-HG-v1 | USRP X300 | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |
| DEVTEST-E310-SG1-v1 | USRP E310-SG1 | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |
| DEVTEST-E310-SG3-v1 | USRP E310-SG3 | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |
| DEVTEST-B200-v1 | USRP B200 | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |
| DEVTEST-B210-v1 | USRP B210 | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |
| DEVTEST-B200m-v1 | USRP B200mini | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |
| DEVTEST-B205m-v1 | USRP B205mini | None | Devtest: Manual Test Procedure | Devtest: Automatic Test Procedure |

The devtests are hardware tests built into the UHD make system. They can be run directly from the build directory and require no configuration. Devtests are designed to always run, regardless of the actual device configuration. This means, by definition, that devtests cannot require special cabling, specific daughterboards, etc.

Note: The actual devtests can change, since they're part of the code. This does not require a version bump on the test code.

Requirements

Devtests are only defined for some devices. When running a devtest, all peripherals must be disconnected (e.g., no daughterboards on the X-Series, no GPSDOs on the B- and X-Series).

Devtest: Manual Test Procedure

X3x0 procedure

  1. Make sure no peripherals are connected to the device (no daughterboards, no GPSDO, front panel GPIO is unconnected).
  2. When testing the HG image, run a test once for each connection (1 GigE and 10 GigE). When testing the XG image, a test on either connection (SFP0 or SFP1) is sufficient. In both cases, also test via PCIe.
  3. When the device is connected, simply run make test_x3x0 from the command line in the build directory. If multiple devices are connected, they will all be tested; there is no need to connect only a single device at a time, since the devtests run sequentially anyway.
  4. Devtest must report no failures for a 'pass' validation.

B2xx procedure

Note: The test codes with an 'm' suffix refer to B200mini and B205mini, respectively.

  1. Make sure no peripherals are connected to the device (no GPSDO if applicable, GPIO pins unconnected)
  2. Test once via USB3, once via USB2.
  3. Simply run make test_b2xx
  4. Devtest must report no failures for a 'pass' validation.

E310 procedure

  1. Make sure GPIO pins are unconnected.
  2. Tests need to be run natively on the device. If the build environment is available on the device, running make test_e3xx is sufficient.
  3. In general, there is no build environment on the device (e.g. when doing a typical sshfs mount of an environment). In this case, copy the contents of the devtest directory onto the device and run the following command; the environment variables must point to the location of the devtest code, the location of the UHD examples (such as benchmark_rate), and the directory where log files should be stored, respectively (see the example after this list):
    $DEVTEST_DIR/run_testsuite.py --src-dir $DEVTEST_DIR \
                                  --devtest-pattern e3xx \
                                  --build-type na \
                                  --build-dir $EXAMPLES_DIR \
                                  --device-filter e3x0 \
                                  --log-dir $LOG_DIR
    
  4. Devtest must report no failures for a 'pass' validation.
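
As an example of step 3, the three environment variables might be set as follows before invoking the command. All paths below are placeholders and depend on where the devtest code, the UHD examples, and the log directory actually live on your E310:

    # Placeholder paths -- adjust for your device.
    export DEVTEST_DIR=/home/root/devtest      # copied devtest code
    export EXAMPLES_DIR=/usr/lib/uhd/examples  # location of benchmark_rate etc.
    export LOG_DIR=/home/root/devtest-logs     # where to store log files
    $DEVTEST_DIR/run_testsuite.py --src-dir $DEVTEST_DIR \
                                  --devtest-pattern e3xx \
                                  --build-type na \
                                  --build-dir $EXAMPLES_DIR \
                                  --device-filter e3x0 \
                                  --log-dir $LOG_DIR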

Devtest: Automatic Test Procedure

As all of these tests run unsupervised, they can be executed automatically given the correct device setup. The return code of the tests can be used to check for pass/fail conditions (return code 0 means 'pass').
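
For example, a wrapper for an X3x0 run could look like the following sketch (the build directory path is a placeholder; pick test_b2xx or test_e3xx for the other device families):

    cd /path/to/uhd/host/build   # placeholder: your UHD build directory
    if make test_x3x0; then      # return code 0 means 'pass'
        echo "DEVTEST-X3x0: PASS"
    else
        echo "DEVTEST-X3x0: FAIL"
        exit 1
    fi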

FPGA: Testing through Simulations

Test benches provide a faster way to verify the design through simulations.

| Test Code | Device | Peripherals | Manual Test Procedure | Automatic Test Procedure |
|---|---|---|---|---|
| FPGATB-v1 | None | None | Manual Test Procedure | Automatic Test Procedure |

Requirements

These tests are simulations and do not require any device. Xilinx Vivado 2015.4 should be installed.

Manual Test Procedure

  1. Go to the appropriate FPGA directory, depending on which test needs to be run:
    1. NoC block test benches: Most of the NoC blocks have a test bench written in SystemVerilog that provides stimuli to the noc_block to verify it. The test bench for a block resides in <fpga-dir>/usrp3/lib/rfnoc/*_tb.
    2. Unit test benches: A few sub-blocks, such as noc_shell and sine_tone, are used within the bigger noc_blocks and have their own test benches. These reside in <fpga-dir>/usrp3/lib/sim/rfnoc/*.
    3. Radio test bench: The radio test bench resides in <fpga-dir>/usrp3/lib/radio/noc_block_radio_core_tb/.
    4. Device-specific test benches: IP specific to a device has test benches in <fpga-dir>/usrp3/top/x300/sim/*, e.g. the DMA and PCIe test benches.
  2. Setup the environment by running source <fpga-dir>/usrp3/top/<device>/setupenv.sh.
  3. Go to the test bench directory and run the test bench with 'make xsim' or 'make vsim'.
  4. All of these tests must report no failures for a 'pass' validation. Example testbench output:
========================================================
TESTBENCH STARTED: noc_block_skeleton
========================================================
[TEST CASE 1] (t=000000000) BEGIN: Wait for Reset...
[TEST CASE 1] (t=000001002) DONE... Passed
[TEST CASE 2] (t=000001002) BEGIN: Check NoC ID...
Read Skeleton NOC ID: 1234000000000000
[TEST CASE 2] (t=000001238) DONE... Passed
[TEST CASE 3] (t=000001238) BEGIN: Connect RFNoC blocks...
Connecting noc_block_tb (SID: 1:0) to noc_block_skeleton (SID: 0:0)
Connecting noc_block_skeleton (SID: 0:0) to noc_block_tb (SID: 1:0)
[TEST CASE 3] (t=000005457) DONE... Passed
[TEST CASE 4] (t=000005457) BEGIN: Write / readback user registers...
[TEST CASE 4] (t=000006888) DONE... Passed
[TEST CASE 5] (t=000006888) BEGIN: Test sequence...
[TEST CASE 5] (t=000007403) DONE... Passed
========================================================
TESTBENCH FINISHED: noc_block_skeleton
- Time elapsed: 7500 ns
- Tests Expected: 5
- Tests Run: 5
- Tests Passed: 5
Result: PASSED
========================================================

Failing tests can be debugged by inspecting the waveform in the Vivado GUI, which can be launched with 'make GUI=1 xsim'. More details on debugging: https://kb.ettus.com/Debugging_FPGA_images
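
Steps 2 through 4 can also be chained in a small script. The sketch below assumes an X300-family environment script, uses a placeholder test bench directory, and checks for the "Result: PASSED" line shown in the example output above; adjust it if your test bench reports results differently:

    FPGA_DIR=/path/to/fpga                               # placeholder for <fpga-dir>
    source $FPGA_DIR/usrp3/top/x300/setupenv.sh          # pick the correct <device>
    cd $FPGA_DIR/usrp3/lib/rfnoc/noc_block_skeleton_tb   # placeholder test bench dir
    make xsim 2>&1 | tee xsim.log
    if grep -q "Result: PASSED" xsim.log; then
        echo "testbench PASS"
    else
        echo "testbench FAIL"
        exit 1
    fi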

Automatic Test Procedure

Go to <fpga-dir>/usrp3/ and run 'build.py xsim all'. All tests should report 'PASS'.

FPGA DSP Verification

| Test Code | Device | Peripherals | Manual Test Procedure | Automatic Test Procedure |
|---|---|---|---|---|
| FPGADSPVERIF-X310-HG-v1 | USRP X310 | 2x UBX | FPGA DSP Verification: Manual Test Procedure | FPGA DSP Verification: Automatic Test Procedure |
| FPGADSPVERIF-X310-XG-v1 | USRP X310 | 2x UBX | FPGA DSP Verification: Manual Test Procedure | FPGA DSP Verification: Automatic Test Procedure |
| FPGADSPVERIF-X300-HG-v1 | USRP X300 | 2x UBX | FPGA DSP Verification: Manual Test Procedure | FPGA DSP Verification: Automatic Test Procedure |
| FPGADSPVERIF-X300-XG-v1 | USRP X300 | 2x UBX | FPGA DSP Verification: Manual Test Procedure | FPGA DSP Verification: Automatic Test Procedure |
| FPGADSPVERIF-E310-SG1-v1 | USRP E310 SG1 | None | FPGA DSP Verification: Manual Test Procedure | FPGA DSP Verification: Automatic Test Procedure |
| FPGADSPVERIF-E310-SG3-v1 | USRP E310 SG3 | None | FPGA DSP Verification: Manual Test Procedure | FPGA DSP Verification: Automatic Test Procedure |

Requirements

  • Signal generator and spectrum analyzer
  • X300 & X310 with 2x UBX daughterboard
  • E310 SG1 & SG3 with SSH access

FPGA DSP Verification: Manual Test Procedure

This procedure tests the DDC and DUC signal quality and the block's capability to change sample rate while streaming.

RX testing

  1. Run calibration on device, if applicable
  2. Using a signal generator, inject a sine tone into RX channel 0 at 915.5 MHz @ -40 dBm
  3. Inspect the received spectrum using uhd_fft
    • X3x0: uhd_fft -f 915e6 -s 10e6 -g 10
    • E3xx: uhd_fft -f 915e6 -s 2e6 -g 50
    • Embedded devices will require either using network mode or using X forwarding over ssh to run the app natively
  4. In the GUI, inspect the spectrum. There should be a strong tone at the test tone frequency. There may be a small tone at the carrier frequency due to DC offset and a quadrature image due to IQ imbalance.
  5. Step the input tone through the frequencies outlined below. The tone should shift from left to right as the frequency changes and may have some amplitude variation, especially at the band edges.
    • X3x0: 910 MHz to 920 MHz in 1 MHz steps
    • E3xx: 914 MHz to 916 MHz in 200 kHz steps
  6. Set the input tone back to 915.5 MHz. Change the sampling rate as outlined below. The spectrum should reflect the change in sample rate.
    • X3x0: 1, 5, 20, 33.333, 50, 66.666, 100, 200 MHz
    • E3x0: 0.1, 0.5, 1, 1.143, 1.684 MHz
  7. Repeat on each RX channel of the device.
  8. This test fails if:
    • DC offset and IQ imbalance tones are unusually large
    • There are any other strong tones or spectrum distortion
    • The spectrum changes significantly between frequencies or sample rates
      • An initial transient distortion is acceptable
      • Amplitude variation on the order of +/-10 dB is acceptable
    • Console reports any of the following:
      • Overruns 'O' if continuous and not due to host computer's lack of processing performance
      • Dropped packets 'D'
      • Sequence number errors 'S'
      • Timeouts

TX testing

  1. Run calibration on device, if applicable
  2. Using uhd_siggen_gui, generate a sine tone on TX channel 0 at 915.5 MHz:
    • X3x0: uhd_siggen_gui -f 915e6 -s 10e6 -g 10 -x 500e3 --sine
    • E3xx: uhd_siggen_gui -f 915e6 -s 2e6 -g 50 -x 500e3 --sine
  3. Using a spectrum analyzer, inspect the output spectrum. There should be a strong tone at the test tone frequency. There may be a small tone at the carrier frequency due to DC offset and a quadrature image due to IQ imbalance.
  4. Using the GUI, test the following offset frequencies. The tone should shift from left to right as the frequency changes and may have some amplitude variation, especially at the band edges.
    • X3x0: -5 to +5 MHz in 1 MHz steps
    • E3xx: -1 to +1 MHz in 200 kHz steps
  5. Set output tone offset back to 500e3. Change sampling rate as outlined below. The spectrum should not significantly differ between sample rates.
    • X3x0: 1, 5, 20, 33.333, 50, 66.666, 100, 200 MHz
    • E3x0: 0.1, 0.5, 1, 1.143, 1.684 MHz
  6. Repeat on each TX channel of the device
  7. This test fails if:
    • DC offset and IQ imbalance tones are unusually large
    • There are any other strong tones or spectrum distortion
    • The spectrum changes significantly between sample rates
      • An initial transient distortion is acceptable
    • Console reports any of the following:
      • Underruns 'U' if continuous and not due to host computer's lack of processing performance
      • Late packets 'L'
      • Sequence number errors 'S'

FPGA DSP Verification: Automatic Test Procedure

tbd

FPGA Functional Verification

| Test Code | Device | Peripherals | Manual Test Procedure | Automatic Test Procedure |
|---|---|---|---|---|
| FPGAFUNCVERIF-X310-HG-v1 | USRP X310 | 2x UBX | FPGA Functional Verification: Manual Test Procedure | FPGA Functional Verification: Automatic Test Procedure |
| FPGAFUNCVERIF-X310-XG-v1 | USRP X310 | 2x UBX | FPGA Functional Verification: Manual Test Procedure | FPGA Functional Verification: Automatic Test Procedure |
| FPGAFUNCVERIF-X300-HG-v1 | USRP X300 | 2x UBX | FPGA Functional Verification: Manual Test Procedure | FPGA Functional Verification: Automatic Test Procedure |
| FPGAFUNCVERIF-X300-XG-v1 | USRP X300 | 2x UBX | FPGA Functional Verification: Manual Test Procedure | FPGA Functional Verification: Automatic Test Procedure |
| FPGAFUNCVERIF-E310-SG1-v1 | USRP E310 SG1 | None | FPGA Functional Verification: Manual Test Procedure | FPGA Functional Verification: Automatic Test Procedure |
| FPGAFUNCVERIF-E310-SG3-v1 | USRP E310 SG3 | None | FPGA Functional Verification: Manual Test Procedure | FPGA Functional Verification: Automatic Test Procedure |

The FPGA functional verification tests exercise the Digital Downconverter (DDC), Digital Upconverter (DUC), and Radio Core RFNoC blocks.

Requirements

  • X300 & X310 with two daughterboards
    • 2x UBX recommended
    • HG tests require a single 10 GigE connection; XG requires two for the 2x RX 200 MSPS test
    • 1 GigE and PCIe adapters and cabling for optional tests
  • E310 SG1 & SG3 with SSH access

FPGA Functional Verification: Manual Test Procedure

This procedure verifies that the DDC, DUC, and Radio Core can run at various sample rates and channel configurations without any data flow issues.

  1. Run benchmark_rate using the parameters outlined in the tables below
  2. Unless otherwise noted, to pass each test:
    • Benchmark rate must run without reporting any of the following:
      • Underruns 'U'
      • Overruns 'O'
      • Dropped packets 'D'
      • Sequence number errors 'S'
      • Late commands 'L'
      • Timeouts
    • Appropriate TX/RX LEDs must be illuminated
  3. Unless specified in 'Notes' column, use default values for unlisted parameters
  4. Example commands:
    • X3x0: benchmark_rate --tx_rate 1e6 --rx_rate 1e6 --channels 0,1 --duration 120
    • E3xx: benchmark_rate --args="master_clock_rate=10e6" --tx_rate 1e6 --rx_rate 1e6 --channels 0,1 --duration 120
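
The individual runs from the tables below can also be scripted. A sketch for one X3x0 row follows; it assumes benchmark_rate is in the PATH and prints its usual end-of-run summary as "Num <error type> ...: <count>" lines, so verify the exact wording against your UHD version before trusting the grep:

    # Sketch: sweep the 2x RX & 2x TX rates from the 10 GigE table and flag any
    # run whose summary reports a non-zero error counter. Initial TX underruns
    # within the first 5 seconds may still need to be judged manually.
    for RATE in 10e6 50e6; do
        benchmark_rate --tx_rate $RATE --rx_rate $RATE --channels 0,1 \
                       --duration 60 2>&1 | tee bmr_$RATE.log
        if grep -E "Num (dropped|overruns|sequence|underruns|late|timeouts)" bmr_$RATE.log \
               | grep -Evq ":[[:space:]]*0$"; then
            echo "FAIL at rate $RATE"
            exit 1
        fi
        echo "PASS at rate $RATE"
    done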

USRP X3x0: 10 GigE Interface

  • Required images to test: X310 HG
  • Optional images to test: X310 XG, X300 HG, X300 XG
  • Note: On TX tests, initial Us within the first 5 seconds can be ignored and do not fail the test
| Channels | Sample Rates | Duration (s) | Notes |
|---|---|---|---|
| 1x RX | 10e6, 50e6, 100e6, 200e6 | 60 | Test both channels |
| 2x RX | 10e6, 50e6, 100e6 | 60 | |
| 2x RX | 200e6 | 60 | 2x 10G, XG only |
| 1x TX | 10e6, 50e6, 100e6, 200e6 | 60 | Test both channels |
| 2x TX | 10e6, 50e6, 100e6 | 60 | |
| 1x RX & 1x TX | 10e6, 50e6, 100e6 | 60 | Test both channels |
| 1x RX & 1x TX | 200e6 | 60 | Use channel 0 |
| 2x RX & 2x TX | 10e6, 50e6 | 60 | |
| 1x RX & 1x TX | 200e6 | 3600 | Use channel 1 |
| 2x RX & 2x TX | 100e6 | 3600 | |

USRP X3x0: 1 GigE Interface

  • Required images to test: None
  • Optional images to test: X310 HG, X310 XG, X300 HG, X300 XG
  • Note: On TX tests, initial Us within the first 5 seconds can be ignored and do not fail the test
| Channels | Sample Rates | Duration (s) |
|---|---|---|
| 1x RX | 1e6, 10e6, 25e6, 50e6 | 60 |
| 2x RX | 1e6, 10e6, 25e6 | 60 |
| 1x TX | 1e6, 10e6, 25e6, 50e6 | 60 |
| 2x TX | 1e6, 10e6, 25e6 | 60 |
| 1x RX & 1x TX | 1e6, 10e6, 25e6, 50e6 | 60 |
| 2x RX & 2x TX | 1e6, 10e6, 25e6 | 60 |

USRP X3x0: PCIe Interface

  • Required images to test: None
  • Optional images to test: X310 HG, X310 XG, X300 HG, X300 XG
  • Note: On TX tests, initial Us within the first 5 seconds can be ignored and do not fail the test
| Channels | Sample Rates | Duration (s) |
|---|---|---|
| 1x RX | 10e6, 50e6, 100e6, 200e6 | 60 |
| 2x RX | 10e6, 50e6, 100e6 | 60 |
| 1x TX | 10e6, 50e6, 100e6, 200e6 | 60 |
| 2x TX | 10e6, 50e6, 100e6 | 60 |
| 1x RX & 1x TX | 10e6, 50e6, 100e6 | 60 |
| 1x RX & 1x TX | 200e6 | 60 |
| 2x RX & 2x TX | 10e6, 50e6 | 60 |


USRP E3xx (SG3 Required, SG1 Optional)

| Channels | Master Clock Rates | Sample Rate | Duration (s) | Notes |
|---|---|---|---|---|
| 1x RX | 1e6, 10e6, 61.44e6 | 1e6 | 60 | Test both channels |
| 1x TX | 1e6, 10e6, 61.44e6 | 1e6 | 60 | Test both channels |
| 2x RX | 1e6, 10e6, 30.72e6 | 1e6 | 60 | |
| 2x TX | 1e6, 10e6, 30.72e6 | 1e6 | 60 | |
| 1x RX & 1x TX | 1e6, 10e6, 61.44e6 | 1e6 | 60 | Test both channels |
| 1x RX & 1x TX | 61.44e6 | 1e6 | 60 | Use channel 1 |
| 2x RX & 2x TX | 1e6, 10e6, 30.72e6 | 1e6 | 60 | |
| 1x RX & 1x TX | 61.44e6 | 1e6 | 3600 | Use channel 0 |
| 2x RX & 2x TX | 30.72e6 | 1e6 | 3600 | |

Note: Any sample rate warnings can be ignored.

FPGA Functional Verification: Automatic Test Procedure

tbd

Phase alignment tests

| Test Code | Device | Peripherals | Manual Test Procedure | Automatic Test Procedure |
|---|---|---|---|---|
| PHASE-Twin-RX-v1 | 2x TwinRX | 2x X3x0 + Octoclock + Signal generator + LO sharing cables | TwinRX specifics for phase alignment testing | TwinRX specifics for phase alignment testing |
| PHASE-UBX-40-RX-v1 | 2x UBX-40 | 2x X3x0 + Octoclock + Signal generator | Manual phase alignment testing (Receiver) | Automatic phase alignment testing (Receiver) |
| PHASE-UBX-160-RX-v1 | 2x UBX-160 | 2x X3x0 + Octoclock + Signal generator | Manual phase alignment testing (Receiver) | Automatic phase alignment testing (Receiver) |
| PHASE-SBX-40-RX-v1 | 2x SBX-40 | 2x X3x0 + Octoclock + Signal generator | Manual phase alignment testing (Receiver) | Automatic phase alignment testing (Receiver) |
| PHASE-SBX-120-RX-v1 | 2x SBX-120 | 2x X3x0 + Octoclock + Signal generator | Manual phase alignment testing (Receiver) | Automatic phase alignment testing (Receiver) |
| PHASE-N2x0-MIMO-v1 | 2x N2x0 + MIMO cable | 2x SBX + Signal generator | N2x0 with MIMO cable specifics for phase alignment testing | N2x0 with MIMO cable specifics for phase alignment testing |

| Device | Frequency Range | Number of bands |
|---|---|---|
| TwinRX | 10 - 6000 MHz | 12 |
| UBX-{160, 40} | 10 - 6000 MHz | 12 |
| SBX-{120, 40} | 400 - 4400 MHz | 7 |

Phase alignment testing is necessary to verify that device synchronization across multiple daughter- and motherboards works as expected for CBX, SBX and UBX daughterboards. To enable efficient phase alignment testing, a GNU Radio out-of-tree module, gr-usrptest, is provided in tools/gr-usrptest. It is required for the RX test cases and may later be required for the TX test cases.

To test phase alignment, we measure the phase offset between DUTs at an offset of 2 MHz from the selected center frequency. The phase difference for a given center frequency must stay the same across retunes and power cycles of the DUT.

Correct synchronization with PPS and 10 MHz references is required for these tests.

Manual phase alignment testing (Receiver)

  1. Gather the required peripherals and DUTs, plus one splitter and enough coaxial cables. Provide a connection from the host computer to the USRPs.
  2. Connect the output of the signal generator to the splitter, and each splitter output to a DUT.
  3. Make sure unused splitter outputs are terminated with 50 Ohm terminators.
  4. Connect the USRPs to the switch with Ethernet cables, and connect the host computer to the switch.
  5. Install the gr-usrptest OOT module on the host system (requires GNU Radio, >v3.7.10.1 recommended, and UHD already installed).
  6. Load correct FPGA images on X3x0 devices (via JTAG cable or with uhd_image_loader)
  7. Configure Network (USRPs and host interface)
  8. In tools/gr-usrptest/apps (or already in your $PATH if gr-usrptest is installed) run:
    ./usrp_phasealignment.py --s 10e6 -runs 10 --duration 2.0 --plot --auto \
                 --sync pps --time-source external --clock-source external \
                 --args "addr0=<address first X3x0>,addr1=<address second X3x0>" \
                 --channels <first channel, second channel (e.g. 0,2)> \
                 -f <frequency> \
                 --freq-bands <# frequency bands> \
                 --start-freq <lowest daughterboard frequency> --stop-freq <highest daughterboard frequency>
    
  9. Tune the signal generator to the displayed frequency + 2 MHz and start the measurement.
  10. Inspect the error plot to check that the phase difference stays the same across retunes. Drift over time must be significantly lower than 1 degree, and the deviation must be well below 2 degrees.
  11. Verify the result with the terminal output and note the result for the current test frequency.
  12. Repeat steps 9-11 for the remaining bands.

Automatic phase alignment testing (Receiver)

tbd

TwinRX specifics for phase alignment testing

Phase alignment testing with TwinRX works as described above, with additional test cases and command-line options. TwinRX offers LO sharing within one board and across boards. For uhd_app and derived applications involving our tools and GNU Radio, we have introduced --lo-source {internal, companion, external} and --lo-export {True, False} arguments to apply the LO sharing features on TwinRX daughterboards. Two test cases have to pass:

  • Phase alignment if sharing LO with companion receiver on a single daughterboard
  • Phase alignment if sharing LO with external TwinRX daughterboard

When testing TwinRX, put the two daughterboards in separate motherboards and connect the LO sharing cables. Set up the USRPs as described above and supply additional command-line arguments to ./usrp_phasealignment.py: use four receive channels (--channels 0,1,2,3) and specify --spec "A:0 A:1 B:0 B:1" to address both receiver channels on each daughterboard. Also supply --lo-export True,False,False,False and --lo-source internal,companion,external,external if your LO sharing setup exports the LO from the first motherboard to the second; otherwise, adjust the LO sharing arguments accordingly.
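
Putting these options together, a full TwinRX invocation might look like the sketch below. The addresses and frequency parameters are the same placeholders as in the base command above; the sample-rate, run-count, and plotting options from that command are omitted here, and the exact option spellings should be checked against ./usrp_phasealignment.py --help:

    ./usrp_phasealignment.py \
        --args "addr0=<address first X3x0>,addr1=<address second X3x0>" \
        --sync pps --time-source external --clock-source external \
        --channels 0,1,2,3 --spec "A:0 A:1 B:0 B:1" \
        --lo-export True,False,False,False \
        --lo-source internal,companion,external,external \
        -f <frequency> --freq-bands <# frequency bands> \
        --start-freq <lowest daughterboard frequency> \
        --stop-freq <highest daughterboard frequency>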

N2x0 with MIMO cable specifics for phase alignment testing

Phase alignment testing with N210 devices and a MIMO cable works as in the X3x0 case, but no Octoclock is needed for device synchronization. Instead, the two N210 devices are connected with a MIMO cable, and only one N210 is connected to the host computer with an Ethernet cable. Supply --time-source internal,mimo and --clock-source internal,mimo on the command line to instruct the N2x0s to share time and clock over the MIMO cable.

Defining R&D Tests

Tests can be added any time to define procedures for pass/fail validation. Any test must include the following:

  • An unambiguous test code. This code consists of a short identifier for the test type, a short description of the devices required, and a version suffix. Example: GPS-X310-OCXO-v1 is a GPS-related test, requires an X310 and an OCXO to run, and is version 1 of this test.
  • A manual testing procedure. This must unambiguously define a set of tasks, and clearly identify whether or not a test has failed or passed. Tests do not require any other defined outcome other than 'pass' and 'fail'.
  • Optional, but highly recommended: An automatic test procedure. This must consist of a command, a script, or a set of commands that can be executed automatically and that reports a failure condition by returning a non-zero value (see the sketch below).
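
As an illustration of that contract, an automatic test procedure can be as small as a wrapper that propagates a failure as a non-zero exit code; the test command below is a placeholder to be replaced with the actual test invocation:

    #!/bin/bash
    # Minimal automatic-test skeleton: run the actual test command (placeholder)
    # and report failure through the exit code, as required above.
    set -o pipefail
    if ! <test command> 2>&1 | tee test.log; then
        echo "FAIL"
        exit 1
    fi
    echo "PASS"
    exit 0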

Basic understanding of the operation of USRPs by the test operator should be assumed when authoring test procedures. The descriptions should be as short as possible to fully describe, unambiguously, how to reach a pass/fail conclusion.

Test procedures may be updated at any time. If this happens, a new test code must be generated, with the version number increased. Old test codes are considered deprecated (if there exists a version 2 of a test, version 1 should not be run any more).