Utilities

Local running tool

Local experiment running tool

usage: artiq_run [-h] [-v] [-q] [--device-db DEVICE_DB]
                 [--dataset-db DATASET_DB] [-e EXPERIMENT] [-o HDF5]
                 FILE [ARGUMENTS [ARGUMENTS ...]]
Positional arguments:
file file containing the experiment to run
arguments run arguments
Options:
-v=0, --verbose=0
 increase logging level
-q=0, --quiet=0
 decrease logging level
--device-db=device_db.py
 device database file (default: ‘device_db.py’)
--dataset-db=dataset_db.pyon
 dataset file (default: ‘dataset_db.pyon’)
-e, --experiment
 experiment to run
-o, --hdf5 write results to specified HDF5 file (default: print them)

Remote Procedure Call tool

ARTIQ RPC tool

usage: artiq_rpctool [-h]
                     SERVER PORT {list-targets,list-methods,call,interactive}
                     ...
Positional arguments:
server hostname or IP of the controller to connect to
port TCP port to use to connect to the controller
Sub-commands:
list-targets

list existing targets

usage: artiq_rpctool list-targets [-h]
list-methods

list target’s methods

usage: artiq_rpctool list-methods [-h] [-t TARGET]
Options:
-t, --target target name
call

call a target’s method

usage: artiq_rpctool call [-h] [-t TARGET] METHOD ...
Positional arguments:
method method name
args arguments
Options:
-t, --target target name
interactive

enter interactive mode (default)

usage: artiq_rpctool interactive [-h] [-t TARGET]
Options:
-t, --target target name

This tool is the preferred way of handling simple ARTIQ controllers. For very simple cases, instead of writing a dedicated client, you can use this tool to call remote functions of an ARTIQ controller.

  • Listing existing targets

    The list-targets sub-command will print to standard output the target list of the remote server:

    $ artiq_rpctool hostname port list-targets
    
  • Listing callable functions

    The list-methods sub-command will print to standard output a sorted list of the functions you can call on the remote server’s target.

    The list will contain function names, signatures (arguments) and docstrings.

    If the server has only one target, you can do:

    $ artiq_rpctool hostname port list-methods
    

    Otherwise you need to specify the target, using the -t target option:

    $ artiq_rpctool hostname port list-methods -t target_name
    
  • Remotely calling a function

    The call sub-command will call a function on the specified remote server’s target, passing the specified arguments. Like with the previous sub-command, you only need to provide the target name (with -t target) if the server hosts several targets.

    The following example will call the set_attenuation method of the Lda controller with the argument 5:

    $ artiq_rpctool ::1 3253 call -t lda set_attenuation 5
    

    In general, to call a function named f with N arguments named respectively x1, x2, ..., xN you can do:

    $ artiq_rpctool hostname port call -t target f x1 x2 ... xN
    

    You can use Python syntax to compute arguments, as they are passed to the eval() primitive. The numpy package is available in the namespace as np. Use quotes around arguments that contain spaces:

    $ artiq_rpctool hostname port call -t target f '3 * 4 + 2' True '[1, 2]'
    $ artiq_rpctool ::1 3256 call load_sample_values 'np.array([1.0, 2.0], dtype=float)'
    

    If the called function returns a value other than None, it is printed to standard output, as in the standard Python interactive console:

    $ artiq_rpctool ::1 3253 call get_attenuation
    5.0 dB
    
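The argument evaluation described above can be sketched in plain Python. Here, parse_args is a hypothetical helper, not part of artiq_rpctool; the real tool also makes numpy available as np in the eval() namespace:

```python
def parse_args(raw_args, namespace=None):
    # Each command-line argument string is evaluated as a Python
    # expression; the result becomes the actual call argument.
    namespace = dict(namespace or {})
    return [eval(arg, namespace) for arg in raw_args]

# '3 * 4 + 2' evaluates to the integer 14, 'True' to a bool,
# '[1, 2]' to a list.
print(parse_args(["3 * 4 + 2", "True", "[1, 2]"]))  # [14, True, [1, 2]]
```

This is why quoting matters on the shell: each whitespace-separated word reaches eval() as a separate expression.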

Static compiler

This tool compiles an experiment into an ELF file. It is primarily used to prepare binaries for the default experiment loaded in non-volatile storage of the core device. Experiments compiled with this tool are not allowed to use RPCs, and their run entry point must be a kernel.

ARTIQ static compiler

usage: artiq_compile [-h] [-v] [-q] [--device-db DEVICE_DB]
                     [--dataset-db DATASET_DB] [-e EXPERIMENT] [-o OUTPUT]
                     FILE [ARGUMENTS [ARGUMENTS ...]]
Positional arguments:
file file containing the experiment to compile
arguments run arguments
Options:
-v=0, --verbose=0
 increase logging level
-q=0, --quiet=0
 decrease logging level
--device-db=device_db.py
 device database file (default: ‘device_db.py’)
--dataset-db=dataset_db.pyon
 dataset file (default: ‘dataset_db.pyon’)
-e, --experiment
 experiment to compile
-o, --output output file

Flash storage image generator

This tool compiles key/value pairs into a binary image suitable for flashing into the flash storage space of the core device.

ARTIQ flash storage image generator

usage: artiq_mkfs [-h] [-s KEY STRING] [-f KEY FILENAME] output
Positional arguments:
output output file
Options:
-s=[] add string
-f=[] add file contents

Flashing/Loading tool

ARTIQ flashing/deployment tool

usage: artiq_flash [-h] [-t TARGET] [-m ADAPTER] [--target-file TARGET_FILE]
                   [-f STORAGE] [-d DIR]
                   [ACTION [ACTION ...]]
Positional arguments:
action actions to perform
Options:
-t=kc705, --target=kc705
 target board, default: kc705
-m=nist_clock, --adapter=nist_clock
 target adapter, default: nist_clock
--target-file use alternative OpenOCD target file
-f, --storage write file to storage area
-d, --dir look for files in this directory

Valid actions:

  • proxy: load the flash proxy gateware bitstream
  • gateware: write gateware bitstream to flash
  • bios: write bios to flash
  • runtime: write runtime to flash
  • storage: write storage image to flash
  • load: load gateware bitstream into device (volatile but fast)
  • start: trigger the target to (re)load its gateware bitstream from flash

Prerequisites:

  • Connect the board through its/a JTAG adapter.
  • Have OpenOCD installed and in your $PATH.
  • Have access to the JTAG adapter’s devices. Udev rules from OpenOCD: ‘sudo cp openocd/contrib/99-openocd.rules /etc/udev/rules.d’ and replug the device. Ensure you are a member of the plugdev group: ‘sudo adduser $USER plugdev’ and re-login.

Core device configuration tool

The artiq_coreconfig utility gives remote access to the flash storage of the core device.

To use this tool, you need to specify a device_db.py device database file which contains a comm device (an example is provided in examples/master/device_db.py). This tells the tool how to connect to the core device and with which parameters (e.g. IP address, TCP port). When not specified, the artiq_coreconfig utility will assume that there is a file named device_db.py in the current directory.

To read the record whose key is mac:

$ artiq_coreconfig read mac

To write the value test_value in the key my_key:

$ artiq_coreconfig write -s my_key test_value
$ artiq_coreconfig read my_key
b'test_value'

You can also write entire files in a record using the -f parameter. This is useful, for instance, to write the startup and idle kernels into the flash storage:

$ artiq_coreconfig write -f idle_kernel idle.elf
$ artiq_coreconfig read idle_kernel | head -c9
b'\x7fELF

You can write several records at once:

$ artiq_coreconfig write -s key1 value1 -f key2 filename -s key3 value3

To remove the previously written key my_key:

$ artiq_coreconfig delete my_key

You can remove several keys at once:

$ artiq_coreconfig delete key1 key2

To erase the entire flash storage area:

$ artiq_coreconfig erase

You do not need to remove a record in order to change its value, just overwrite it:

$ artiq_coreconfig write -s my_key some_value
$ artiq_coreconfig write -s my_key some_other_value
$ artiq_coreconfig read my_key
b'some_other_value'

ARTIQ core device configuration tool

usage: artiq_coreconfig [-h] [-v] [-q] [--device-db DEVICE_DB]
                        {read,write,delete,erase} ...
Options:
-v=0, --verbose=0
 increase logging level
-q=0, --quiet=0
 decrease logging level
--device-db=device_db.py
 device database file (default: ‘device_db.py’)
Sub-commands:
read

read key from core device config

usage: artiq_coreconfig read [-h] KEY
Positional arguments:
key key to be read from core device config
write

write key-value records to core device config

usage: artiq_coreconfig write [-h] [-s KEY STRING] [-f KEY FILENAME]
Options:
-s=[], --string=[]
 key-value records to be written to core device config
-f=[], --file=[]
 key and file whose contents are to be written to core device config
delete

delete key from core device config

usage: artiq_coreconfig delete [-h] KEY [KEY ...]
Positional arguments:
key keys to be deleted from core device config
erase

fully erase core device config

usage: artiq_coreconfig erase [-h]

Core device log download tool

ARTIQ core device log tool

usage: artiq_corelog [-h] [-v] [-q] [--device-db DEVICE_DB]
                     {clear,set_level,set_uart_level} ...
Options:
-v=0, --verbose=0
 increase logging level
-q=0, --quiet=0
 decrease logging level
--device-db=device_db.py
 device database file (default: ‘device_db.py’)
Sub-commands:
clear

clear log buffer

usage: artiq_corelog clear [-h]
set_level

set minimum level for messages to be logged

usage: artiq_corelog set_level [-h] LEVEL
Positional arguments:
level log level (one of: OFF ERROR WARN INFO DEBUG TRACE)
set_uart_level

set minimum level for messages to be logged to UART

usage: artiq_corelog set_uart_level [-h] LEVEL
Positional arguments:
level log level (one of: OFF ERROR WARN INFO DEBUG TRACE)

Core device RTIO analyzer tool

ARTIQ core device RTIO analysis tool

usage: artiq_coreanalyzer [-h] [-v] [-q] [--device-db DEVICE_DB]
                          [-r READ_DUMP] [-p] [-w WRITE_VCD] [-d WRITE_DUMP]
Options:
-v=0, --verbose=0
 increase logging level
-q=0, --quiet=0
 decrease logging level
--device-db=device_db.py
 device database file (default: ‘device_db.py’)
-r, --read-dump
 read raw dump file instead of accessing device
-p=False, --print-decoded=False
 print raw decoded messages
-w, --write-vcd
 format and write contents to VCD file
-d, --write-dump
 write raw dump file

Data to InfluxDB bridge

ARTIQ data to InfluxDB bridge

usage: artiq_influxdb [-h] [--server-master SERVER_MASTER]
                      [--port-master PORT_MASTER]
                      [--retry-master RETRY_MASTER] [--baseurl-db BASEURL_DB]
                      [--user-db USER_DB] [--password-db PASSWORD_DB]
                      [--database DATABASE] [--table TABLE]
                      [--pattern-file PATTERN_FILE] [--bind BIND]
                      [--no-localhost-bind] [--port-control PORT_CONTROL] [-v]
                      [-q]
Options:
--server-master=::1
 hostname or IP of the master to connect to
--port-master=3250
 TCP port to use to connect to the master
--retry-master=5.0
 retry timer for reconnecting to master
--baseurl-db=http://localhost:8086
 base URL to access InfluxDB (default: http://localhost:8086)
--user-db= InfluxDB username
--password-db= InfluxDB password
--database=db database name to use
--table=lab table name to use
--pattern-file=influxdb_patterns.cfg
 file to load the patterns from (default: influxdb_patterns.cfg). If the file is not found, no patterns are loaded (everything is logged).
--bind=[] additional hostname or IP addresses to bind to; use ‘*’ to bind to all interfaces (default: none)
--no-localhost-bind=False
 do not implicitly also bind to localhost addresses
--port-control=3248
 TCP port to listen on for control connections (default: 3248)
-v=0, --verbose=0
 increase logging level
-q=0, --quiet=0
 decrease logging level

Pattern matching works as follows. The default action on a key (dataset name) is to log it. The patterns are then traversed in order and glob-matched against the key. The optional + and - pattern prefixes specify whether to log or ignore keys matching the rest of the pattern; in the absence of a prefix, matching keys are ignored. The last matching pattern takes precedence.
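
The matching rules above can be sketched with the stdlib fnmatch module. Here, should_log is a hypothetical helper, not the tool's actual implementation, and it assumes that a + prefix means log and a - prefix means ignore (consistent with unprefixed patterns defaulting to ignore):

```python
import fnmatch

def should_log(key, patterns):
    # Default action on a key is to log it.
    log = True
    for pattern in patterns:
        if pattern.startswith("+"):
            action, body = True, pattern[1:]   # assumed: + means log
        elif pattern.startswith("-"):
            action, body = False, pattern[1:]  # assumed: - means ignore
        else:
            action, body = False, pattern      # no prefix: ignore
        # Patterns are traversed in order and glob-matched with the key;
        # the last matching pattern takes precedence.
        if fnmatch.fnmatch(key, body):
            log = action
    return log

# '*' ignores everything, then '+photon_*' re-enables matching keys.
print(should_log("photon_count", ["*", "+photon_*"]))  # True
print(should_log("temperature", ["*", "+photon_*"]))   # False
```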