The default serial parameters are 9600 baud, 8n1, no flow control.
Commands and numbers are plain ASCII - so, for example, “A10” requests averaging 10 samples before results are returned. For almost all of the “set parameter” commands, “?{letter}” returns the current value - so “?A” returns the currently set averaging count.
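Since the commands are just ASCII bytes, building them needs no library. A minimal sketch (the helper names are mine; the pyserial usage in the comments assumes a typical port name and that replies are line-terminated - check against the real device):

```python
def set_cmd(letter: str, value: int) -> bytes:
    """Build a set-parameter command, e.g. set_cmd('A', 10) -> b'A10'."""
    return f"{letter}{value}".encode("ascii")

def query_cmd(letter: str) -> bytes:
    """Build a query for the current value, e.g. query_cmd('A') -> b'?A'."""
    return b"?" + letter.encode("ascii")

# With pyserial, these bytes would be written straight to the port, e.g.:
#   ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=2)  # 8N1, no flow control
#   ser.write(set_cmd("A", 10))
#   ser.write(query_cmd("A")); print(ser.readline())
```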
A{int}: Number of spectra to average before returning results. Results are delayed until all spectra have been acquired and averaged.
I{int}: Set the integration time in milliseconds - how long the detector counts for one sample. Valid range is 50-65000.
K{int}: Set the baud rate. 0: 115200, 1: 38400, 2: 19200, 3: 9600, 4: 4800, 5: 2400, 6: 1200, 7: 600
Q: Reset the spectrometer, returning all values to default.
S: Initiate a scan and return the values, either in ASCII or binary mode.
a: Set ASCII mode. Results are returned in human-readable counts on the terminal.
b: Set binary mode. The results are compressed, as described below.
In binary mode, each pixel is compared to the previous value. If the difference fits in +/- 127, a single signed 8-bit int encodes it. If the difference is larger, a three-byte sequence is sent: 0x80 (a flag meaning the next two bytes carry the full value), then the high byte and the low byte of the 16-bit unsigned value. It seems straightforward enough to decode; I’ve just not written software to do it, because Spectrum Studio in a VM is fine with me.
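A decoder for that scheme might look like the sketch below. The function name is mine, and I'm assuming each scan's first pixel arrives as a 0x80-prefixed absolute value (the description doesn't say how the stream starts) and that the 16-bit value is big-endian, high byte first, as described:

```python
def decode_scan(data: bytes) -> list[int]:
    """Decode one binary-mode scan: a signed byte per pixel when the
    delta from the previous pixel fits in +/-127, otherwise the escape
    byte 0x80 followed by the full 16-bit value, high byte first."""
    pixels = []
    prev = 0
    i = 0
    while i < len(data):
        b = data[i]
        if b == 0x80:
            # Escape flag: next two bytes are the absolute 16-bit value.
            prev = (data[i + 1] << 8) | data[i + 2]
            i += 3
        else:
            # Interpret the byte as a signed 8-bit delta.
            delta = b - 256 if b > 127 else b
            prev += delta
            i += 1
        pixels.append(prev)
    return pixels

# e.g. 0x80 0x01 0x00 (abs 256), +5, -5:
# decode_scan(bytes([0x80, 0x01, 0x00, 0x05, 0xFB])) -> [256, 261, 256]
```

Note that 0x80 interpreted as a signed byte would be -128, which is outside the +/-127 delta range - so it is free to serve as the escape flag without ambiguity.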
Tom's Electronics pages / tom@mmto.org