srttools package

Submodules

srttools.calibration module

Produce calibrated light curves.

SDTlcurve is a script that, given a list of cross scans from different sources, is able to recognize calibrators and use them to convert the observed counts into a flux density value in Jy.

class srttools.calibration.CalibratorTable(*args, **kwargs)

Bases: Table

Table composed of fitted and tabulated fluxes.

Initialize the object.

Jy_over_counts(channel=None, elevation=None, map_unit='Jy/beam', good_mask=None)

Compute the Jy/Counts conversion corresponding to a given map unit.

Parameters:
channel : str

Channel name (e.g. ‘Feed0_RCP’, ‘Feed0_LCP’ etc.)

elevation : float or array-like

The elevation or a list of elevations

map_unit : str

A valid unit for the calibrated map (See the keys of FLUX_QUANTITIES)

good_mask : array of bools, default None

This mask can be used to specify the valid entries of the table. If None, the mask is set to an array of True values

Returns:
fc : float or array-like

One conversion value for each elevation

fce : float or array-like

the uncertainties corresponding to each fc

Jy_over_counts_rough(channel=None, map_unit='Jy/beam', good_mask=None)

Get the conversion from counts to Jy.

Returns:
fc : float

flux density / counts ratio

fce : float

uncertainty on fc

Other Parameters:
channel : str

Name of the data channel

map_unit : str

A valid unit for the calibrated map (See the keys of FLUX_QUANTITIES)

good_mask : array of bools, default None

This mask can be used to specify the valid entries of the table. If None, the mask is set to an array of True values

beam_width(channel=None)

Calculate the (weighted) mean beam width, in radians.

Checks for invalid (nan and such) values.

calculate_src_flux(channel=None, map_unit='Jy/beam', source=None)

Calculate source flux and error, pointing by pointing.

Uses the conversion factors calculated from the tabulated fluxes of all sources but the current one, and the fitted Gaussian amplitude for the current source. Updates the calibrator table and returns the average flux.

Parameters:
channel : str or list of str

Data channel

map_unit : str

Units in the map (default Jy/beam)

source : str

Source name. Must match one of the sources in the table. Defaults to None

Returns:
mean_flux : array of floats

Array with one flux value per input channel

mean_flux_err : array of floats

Uncertainties corresponding to mean_flux

calibrate()

Calculate the calibration constants.

The following conversion functions are calculated for each tabulated cross scan belonging to a calibrator:

  • ‘Flux/Counts’ and ‘Flux/Counts Err’: Tabulated flux density divided by the _height_ of the fitted Gaussian. This is used, e.g. to calibrate images in Jy/beam, as it calibrates the local amplitude to the flux density

  • ‘Flux Integral/Counts’ and ‘Flux Integral/Counts Err’: Tabulated flux density divided by the _volume_ of the 2D Gaussian corresponding to the fitted cross scans, assuming a symmetrical beam (which is generally not the case, but a good approximation). This is used, e.g., to perform the calibration in Jy/pixel: Each pixel will be normalized to the expected total flux in the corresponding pixel area
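
The following sketch, with made-up numbers (not srttools code), shows how the amplitude and the volume of the fitted Gaussian enter the two conversion factors; for a symmetric 2D Gaussian of amplitude A and width sigma, the volume is 2 * pi * A * sigma**2:

import numpy as np

flux_jy = 5.0      # tabulated flux density of the calibrator, in Jy
amplitude = 250.0  # height of the fitted Gaussian, in counts
sigma = 0.001      # fitted width, in radians (symmetric beam assumed)

# 'Flux/Counts': calibrates the local amplitude (for Jy/beam maps)
flux_over_counts = flux_jy / amplitude

# 'Flux Integral/Counts': uses the volume of the 2D Gaussian (for Jy/pixel maps)
flux_integral_over_counts = flux_jy / (2 * np.pi * amplitude * sigma**2)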

check_consistency(channel=None, epsilon=0.05)

Check the consistency of calculated and fitted flux densities.

For each source in the srttools’ calibrator list, use calculate_src_flux to calculate the source flux ignoring the tabulated value, and compare the calculated and tabulated values.

Returns:
retval : bool

True if, for all calibrators, the tabulated and calculated values of the flux are consistent. False otherwise.

check_not_empty()

Check that table is not empty.

Returns:
good : bool

True if all checks pass, False otherwise.

check_up_to_date()

Check that the calibration information is up to date.

Returns:
good : bool

True if all checks pass, False otherwise.

compute_conversion_function(map_unit='Jy/beam', good_mask=None)

Compute the conversion between Jy and counts.

Tries to get a meaningful second-degree polynomial fit over elevation, and reverts to the rough function Jy_over_counts_rough() in case statsmodels is not installed; in this latter case, only the baseline value is given for flux conversion and error. These values are saved in the calibration_coeffs and calibration_uncerts attributes of CalibratorTable, and a dictionary called calibration is also created. For each channel, this dictionary contains either None or an object. This object is the output of a fit procedure in statsmodels: the method object.predict(X) returns the calibration corresponding to elevation X.

counts_over_Jy(channel=None, elevation=None)

Get the conversion from Jy to counts.

from_scans(scan_list=None, debug=False, freqsplat=None, config_file=None, nofilt=False, plot=False)

Load source table from a list of scans.

For each scan, a fit is performed. Since we are assuming point-like sources here, the fit is a Gaussian plus a slope. The centroid, width and amplitude of the fit fill out new rows of the CalibratorTable (‘Fit RA’ or ‘Fit Dec’, ‘Width’ and ‘Counts’ respectively).

Parameters:
scan_list : list of str

List of files containing cross scans to be fitted

config_file : str

File containing the configuration (list of directories etc.)

Returns:
retval : bool

True if at least one scan was correctly processed

Other Parameters:
debug : bool

Print debug information

freqsplat : str

List of frequencies to be merged into one. See srttools.scan.interpret_frequency_range()

nofilt : bool

Do not filter the noisy channels of the scan. See srttools.scan.clean_scan_using_variability

plot : bool

Plot diagnostic plots? Default False, True if debug is True.
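
A minimal workflow sketch, using only methods documented on this page (the file names are placeholders):

from srttools.calibration import CalibratorTable

caltable = CalibratorTable()
caltable.from_scans(['calibrator_scan_0.fits', 'calibrator_scan_1.fits'])
caltable.update()  # get_fluxes + calibrate + compute_conversion_function
fc, fce = caltable.Jy_over_counts(channel='Feed0_RCP')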

get_fluxes()

Get the tabulated fluxes of the sources, if they are listed as calibrators.

Updates the table.

plot_two_columns(xcol, ycol, xerrcol=None, yerrcol=None, ax=None, channel=None, xfactor=1, yfactor=1, color=None, test=False)

Plot the data corresponding to two given columns.

show()

Show a summary of the calibration.

update()

Update the calibration information.

Executes get_fluxes, calibrate and compute_conversion_function.

write(fname, *args, **kwargs)

Same as Table.write, but adds path information for HDF5.

srttools.calibration.read_calibrator_config()

Read the configuration of calibrators in data/calibrators.

Returns:
configs : dict

Dictionary containing the configuration for each calibrator. Each key is the name of a calibrator. Each entry is another dictionary, in one of the following formats:

1) {‘Kind’ : ‘FreqList’, ‘Frequencies’ : […], ‘Bandwidths’ : […], ‘Fluxes’ : […], ‘Flux Errors’ : […]}

where ‘Frequencies’ is the list of observing frequencies in GHz, ‘Bandwidths’ is the list of bandwidths in GHz, ‘Fluxes’ is the list of flux densities in Jy from the literature and ‘Flux Errors’ are the uncertainties on those fluxes.

2) {‘Kind’ : ‘CoeffTable’, ‘CoeffTable’ : {‘coeffs’ : ‘time, a0, a0e, a1, a1e, a2, a2e, a3, a3e
2010.0,0 …’}}

where the ‘coeffs’ key contains the table of coefficients a la Perley & Butler, ApJS 204, 19 (2013), as a comma-separated string.

Examples

>>> calibs = read_calibrator_config() 
>>> calibs['DummyCal']['Kind']
'CoeffTable'
>>> 'coeffs' in calibs['DummyCal']['CoeffTable']
True

srttools.convert module

srttools.convert.convert_to_complete_fitszilla(fname, outname)
srttools.convert.launch_classfits_creator(name, label, test=False, save_locally=False)
srttools.convert.launch_convert_coords(name, label, save_locally=False)
srttools.convert.launch_mbfits_creator(name, label, test=False, wrap=False, detrend=False, save_locally=False)
srttools.convert.launch_sdfits_creator(name, label, test=False, save_locally=False)
srttools.convert.main_convert(args=None)
srttools.convert.match_srt_name(name)

Examples

>>> matchobj = match_srt_name('blabla/20180212-150835-S0000-3C84_RA/')
>>> matchobj.group(1)
'20180212'
>>> matchobj.group(2)
'150835'
>>> matchobj.group(3)
'S0000'
>>> matchobj.group(4)
'3C84_RA'

srttools.converters.classfits module

Convert fitszilla data to a CLASS-compatible fits format.

The conversion makes use of three types of pointings:

  • SIGNAL or ON, meaning on-source data

  • REFERENCE or OFF, meaning data taken from an “empty” region in the sky

  • CAL, meaning data taken with the calibration mark on (can be CALOFF or CALON depending on where the antenna is pointed)

The conversion is performed as follows:

  1. Load all subscans from a given scan (= a subdirectory with a summary.fits file) and classify each of them as ON, OFF, CALON, CALOFF.

  2. Try to infer patterns in the observation of the kind nON, mOFF, rCAL, and group the data for each repetition of the pattern.

  3. Inside each group, average all the OFF spectra and the CAL spectra, but not the ON ones (unless specified).

     3a. If specified, smooth the OFF and CAL spectra.

  4. Normalize the flux of each ON measurement, where <OFF> denotes the average OFF spectrum:

     4a. Normalize the spectrum with respect to the sky spectrum:

         S = (ON - <OFF>) / <OFF>

     4b. Normalize the calibration signal by the sky spectrum:

         C = (CALOFF - <OFF>) / <OFF>

     4c. If specified, obtain a similar calibration factor with the CALON signal.

     4d. Scale the normalized sky spectrum by the normalized calibration spectrum and the calibration temperature Tc, to obtain the antenna temperature Ta:

         Ta = (S / C) * Tc
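
A toy numerical illustration of steps 4a-4d, with made-up one-dimensional spectra (not actual srttools code):

import numpy as np

# Hypothetical averaged spectra, in counts
on = np.array([10.0, 12.0, 11.0])           # ON-source spectrum
off_mean = np.array([9.0, 10.0, 10.0])      # average OFF spectrum, <OFF>
caloff_mean = np.array([11.0, 12.5, 12.0])  # average CALOFF spectrum
tcal = 5.0                                  # calibration temperature Tc, in K

S = (on - off_mean) / off_mean              # step 4a
C = (caloff_mean - off_mean) / off_mean     # step 4b
Ta = S / C * tcal                           # step 4d: antenna temperature, in K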
    
class srttools.converters.classfits.CLASSFITS_creator(dirname, scandir=None, average=True, use_calon=False, test=False)

Bases: object

CLASS-compatible FITS creator object.

Initialization.

Initialization is easy. If scandir is given, the conversion is done right away.

Parameters:
dirname : str

Output directory for products

Other Parameters:
scandir : str

Input data directory (to be clear, the directory containing a set of subscans plus a summary.fits file)

average : bool, default True

Average all spectra of a given configuration?

use_calon : bool, default False

If False, only the OFF + CAL is used for the calibration. If True, also the ON + CAL is used and the calibration constant is averaged with that obtained through OFF + CAL.

test : bool

Only use for unit tests

calibrate_all(use_calon=False)

Calibrate the scan in all available ways.

The basic calibration is (on - off)/off, where on and off are on-source and off-source spectra respectively.

New HDU lists are produced and added to the existing, uncalibrated ones.

If the calibration mark has been used in some scan, an additional calibration is applied and further HDU lists are produced.

Other Parameters:
use_calon : bool, default False

If False, only the OFF + CAL is used for the calibration. If True, also the ON + CAL is used and the calibration constant is averaged with that obtained through OFF + CAL.

fill_in_summary(summaryfile)

Fill in the information contained in the summary.fits file.

get_scan(scandir, average=False)

Treat the data and produce the output, uncalibrated files.

Fills in the self.tables attribute with a dictionary of HDU lists containing a primary header and a MATRIX extension in CLASS-compatible FITS format.

Parameters:
scandir : str

Input data directory (to be clear, the directory containing a set of subscans plus a summary.fits file)

Returns:
tables : dict

The dictionary of HDU lists also stored in self.tables

Other Parameters:
average : bool, default False

Average all spectra of a given configuration?

write_tables_to_disk()

Write all HDU lists produced so far to separate FITS files.

srttools.converters.classfits.cal_is_on(subscan)

Is the calibration mark on? Try to understand.

srttools.converters.classfits.create_variable_length_column(values, max_length=2048, name='SPECTRUM', unit='K')

If we want to use variable length arrays, this is what we should do.

Examples

>>> col = create_variable_length_column([[1, 2]])
>>> isinstance(col, fits.Column)
True
srttools.converters.classfits.find_cycles(table, list_of_keys)

Find cyclic patterns in table.

Parameters:
table : astropy.table.Table object, or compatible

Input table

list_of_keys : list

List of keywords of the table to cycle on

Examples

>>> import doctest
>>> doctest.ELLIPSIS_MARKER = '-etc-'
>>> table = Table(data=[[0, 0, 1, 1, 0, 0, 1, 1],
...                     [0, 0, 0, 0, 0, 0, 0, 0]], names=['A', 'B'])
>>> list_of_keys = ['A', 'B']
>>> new_table = find_cycles(table, list_of_keys) 
-etc-
>>> np.all(new_table['CYCLE'] == [0, 0, 0, 0, 1, 1, 1, 1])
True
>>> table = Table(data=[[0, 0, 1, 1, 0, 0, 1, 1],
...                     [0, 1, 0, 1, 0, 1, 0, 1]], names=['A', 'B'])
>>> list_of_keys = ['A', 'B']
>>> new_table = find_cycles(table, list_of_keys) 
-etc-
>>> np.all(new_table['CYCLE'] == [0, 0, 0, 0, 1, 1, 1, 1])
True
>>> table = Table(data=[[0, 1, 0, 1], [1, 0, 1, 0]], names=['A', 'B'])
>>> list_of_keys = ['A', 'B']
>>> new_table = find_cycles(table, list_of_keys) 
-etc-
>>> np.all(new_table['CYCLE'] == [0, 0, 1, 1])
True
srttools.converters.classfits.get_model_HDUlist(additional_columns=None, **kwargs)

Produce a model CLASS-compatible HDUlist.

srttools.converters.classfits.normalize_on_off_cal(table, smooth=False, apply_cal=True, use_calon=False)

Do the actual onoff/onoffcal calibration.

The first step is to combine the ON-source signal with the closest OFF-source signal (alternatively, SIGNAL/on-source vs REFERENCE/off-source) as (ON - OFF)/OFF.

If a signal with calibration mark is present (CAL), and apply_cal is True, an additional calibration factor is applied. If CAL is applied only to the OFF/REFERENCE signal (let us call it OFFCAL), a single calibration factor is calculated: OFF/(OFFCAL - OFF). If the CAL signal is also applied to the ON signal, and use_calon is True, an additional factor ON/(ONCAL - ON) is calculated and averaged with the previous one.

Parameters:
table : astropy.table.Table object

The data table

Other Parameters:
smooth : bool, default False

Run a median filter on the reference data (the ones that go to the denominator) before calculating the ratio

apply_cal : bool, default True

If a CAL (calib. mark on) signal is present, use it!

use_calon : bool, default False

If False, only the OFF + CAL is used for the calibration. If True, also the ON + CAL is used and the calibration constant is averaged with that obtained through OFF + CAL.

srttools.converters.classfits.on_or_off(subscan, feed)

Try to infer if a given subscan is ON or OFF.

srttools.fit module

Useful fitting functions.

srttools.fit.align(xs, ys)

Align all scans to the first one.

Parameters:
xs : list of array-like [array1, array2, …]

list of arrays containing the x values of each scan

ys : list of array-like [array1, array2, …]

list of arrays containing the y values of each scan

Returns:
qs : array-like

The list of intercepts maximising the alignment, one for each scan

ms : array-like

The list of slopes maximising the alignment, one for each scan

srttools.fit.baseline_als(x, y, **kwargs)

Baseline Correction with Asymmetric Least Squares Smoothing.

If the input arrays are larger than 300 elements, ignores outlier_purging and executes the baseline calculation on a small subset of the (median-filtered) y-array.

Parameters:
x : array-like

the sample time/number/position

y : array-like

the data series corresponding to x

lam : float

the lambda parameter of the ALS method. This controls how much the baseline can adapt to local changes. A higher value corresponds to a stiffer baseline

p : float

the asymmetry parameter of the ALS method. This controls the overall slope tolerated for the baseline. A higher value corresponds to a higher possible slope

Returns:
y_subtracted : array-like, same size as y

The initial time series, with the fitted trend subtracted

baseline : array-like, same size as y

Fitted baseline. Only returned if return_baseline is True

Other Parameters:
niter : int

The number of iterations to perform

return_baseline : bool

return the baseline?

offset_correction : bool

also correct for an offset to align with the running mean of the scan

outlier_purging : bool

Purge outliers before the fit?

mask : array of bools

Mask indicating the good x and y data. True for good, False for bad
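
For reference, a minimal sketch of the core ALS iteration (the classic Eilers & Boelens recipe; the srttools implementation adds the outlier purging, offset correction and masking described above):

import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.001, niter=10):
    # Minimize sum(w * (y - z)**2) + lam * sum(diff(z, 2)**2), with
    # asymmetric weights: w = p above the baseline, 1 - p below it.
    n = len(y)
    D = sparse.diags([1, -2, 1], [0, -1, -2], shape=(n, n - 2))
    w = np.ones(n)
    for _ in range(niter):
        W = sparse.spdiags(w, 0, n, n)
        z = spsolve(W + lam * D.dot(D.T), w * y)
        w = p * (y > z) + (1 - p) * (y < z)
    return z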

srttools.fit.baseline_rough(x, y, start_pars=None, return_baseline=False, mask=None)

Rough function to subtract the baseline.

Parameters:
x : array-like

the sample time/number/position

y : array-like

the data series corresponding to x

start_pars : [q0, m0], floats

Intercept and slope of linear function

Returns:
y_subtracted : array-like, same size as y

The initial time series, with the fitted trend subtracted

baseline : array-like, same size as y

Fitted baseline

Other Parameters:
return_baseline : bool

return the baseline?

mask : array of bools

Mask indicating the good x and y data. True for good, False for bad

srttools.fit.contiguous_regions(condition)

Find contiguous True regions of the boolean array “condition”.

Return a 2D array where the first column is the start index of the region and the second column is the end index.

Parameters:
condition : boolean array
Returns:
idx : [[i0_0, i0_1], [i1_0, i1_1], …]

A list of integer couples, with the start and end of each True block in the original array

Notes

From https://stackoverflow.com/questions/4494404/find-large-number-of-consecutive-values-fulfilling-condition-in-a-numpy-array
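
A self-contained sketch of the same recipe (end indices exclusive, as in the linked answer):

import numpy as np

def contiguous_regions_sketch(condition):
    # Find where the boolean array flips, then pair up starts and ends.
    d = np.diff(condition.astype(int))
    starts = np.where(d == 1)[0] + 1
    ends = np.where(d == -1)[0] + 1
    if condition[0]:
        starts = np.r_[0, starts]
    if condition[-1]:
        ends = np.r_[ends, condition.size]
    return np.column_stack((starts, ends))

condition = np.array([False, True, True, False, True, True, True])
print(contiguous_regions_sketch(condition))  # [[1 3] [4 7]]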

srttools.fit.fit_baseline_plus_bell(x, y, ye=None, kind='gauss')

Fit a function composed of a linear baseline plus a bell function.

Parameters:
x : array-like

the sample time/number/position

y : array-like

the data series corresponding to x

Returns:
mod_out : Astropy.modeling.model object

The fitted model

fit_info : dict

Fit info from the Astropy fitting routine.

Other Parameters:
ye : array-like

the errors on the data series

kind : str

Can be ‘gauss’ or ‘lorentz’
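
A sketch of this kind of composite fit done directly with astropy.modeling (a Linear1D baseline plus a Gaussian1D bell; all values are arbitrary):

import numpy as np
from astropy.modeling import models, fitting

x = np.linspace(-5, 5, 201)
y = 0.1 * x + 2 + 3 * np.exp(-0.5 * (x / 0.5) ** 2)

model = models.Linear1D(slope=0., intercept=2.) + \
    models.Gaussian1D(amplitude=2., mean=0., stddev=1.)
fitter = fitting.LevMarLSQFitter()
mod_out = fitter(model, x, y)  # the fitted composite model
fit_info = fitter.fit_info     # fit info from the Astropy routine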

srttools.fit.linear_fit(x, y, start_pars, return_err=False)

A linear fit to any set of data.

Parameters:
x : array-like
y : array-like
start_pars : [q0, m0], floats

Intercept and slope of linear function

Returns:
par : [q, m], floats

Fitted intercept and slope of the linear function

srttools.fit.linear_fun(x, q, m)

A linear function.

Parameters:
x : float or array

The independent variable

m : float

The slope

q : float

The intercept

Returns:
y : float or array

The dependent variable

srttools.fit.offset(x, off)

An offset.

srttools.fit.offset_fit(x, y, offset_start=0, return_err=False)

Fit a constant offset to the data.

Parameters:
x : array-like
y : array-like
offset_start : float

Constant offset, initial value

Returns:
offset : float

Fitted offset

srttools.fit.purge_outliers(y, window_size=5, up=True, down=True, mask=None, plot=False)

Remove obvious outliers.

Attention: This is known to throw false positives on bona fide, very strong Gaussian peaks

srttools.fit.ref_mad(array, window=1)

Reference Median Absolute Deviation of an array, after subtracting a rolling median.

If a data series is noisy, it is difficult to determine the underlying statistics of the original series. Here, the MAD is calculated in a rolling window, and the minimum is saved, because it will likely be the interval with the least noise.

Parameters:
array : numpy.array object or list

Input data

window : int or float

Number of bins of the window

Returns:
ref_std : float

The reference MAD

srttools.fit.ref_std(array, window=1)

Minimum standard deviation along an array.

If a data series is noisy, it is difficult to determine the underlying standard deviation of the original series. Here, the standard deviation is calculated in a rolling window, and the minimum is saved, because it will likely be the interval with the least noise.

Parameters:
array : numpy.array object or list

Input data

window : int or float

Number of bins of the window

Returns:
ref_std : float

The reference standard deviation

srttools.fit.total_variance(xs, ys, params)

Calculate the total variance of a series of scans.

This function subtracts a linear function from each of the scans (excluding the first one) and calculates the total variance.

Parameters:
xs : list of array-like [array1, array2, …]

list of arrays containing the x values of each scan

ys : list of array-like [array1, array2, …]

list of arrays containing the y values of each scan

params : list of array-like [[q0, m0], [q1, m1], …]

list of arrays containing the parameters [q, m] for each scan.

Returns:
total_variance : float

The total variance of the baseline-subtracted scans.

srttools.global_fit module

Functions to clean images by fitting linear trends to the initial scans.

srttools.global_fit.display_intermediate(scanset, chan='Feed0_RCP', feed=0, excluded=None, parfile=None, factor=1)

Display the intermediate steps of global_fitting.

Parameters:
scanset : a ScanSet instance

The scanset to be fit

Other Parameters:
chan : str

channel of the scanset to be fit. Defaults to "Feed0_RCP"

feed : int

feed of the scanset to be fit. Defaults to 0

excluded : [[centerx0, centery0, radius0]]

List of circular regions to exclude from fitting (e.g. strong sources that might alter the total rms)

parfile : str

File containing the parameters, in the same format saved by _callback

srttools.global_fit.fit_full_image(scanset, chan='Feed0_RCP', feed=0, excluded=None, par=None)

Get a clean image by subtracting linear trends from the initial scans.

Parameters:
scanset : a ScanSet instance

The scanset to be fit

Returns:
new_counts : array-like

The new Counts column for scanset, where a baseline has been subtracted from each scan to produce the cleanest image background.

Other Parameters:
chan : str

channel of the scanset to be fit. Defaults to "Feed0_RCP"

feed : int

feed of the scanset to be fit. Defaults to 0

excluded : [[centerx0, centery0, radius0]]

List of circular regions to exclude from fitting (e.g. strong sources that might alter the total rms)

par : [m0, q0, m1, q1, …] or None

Initial parameters – slope and intercept for linear trends to be subtracted from the scans

srttools.histograms module

This module contains a fast replacement for numpy’s histogramdd and histogram2d. Two changes were made. The first was replacing

np.digitize(a, b)

with

np.searchsorted(b, a, 'right')

This performance bug is explained at https://github.com/numpy/numpy/issues/2656. The speedup is around 2x for a large number of bins (roughly >100). It assumes that the bins are monotonic.

The other change is to allow lists of weight arrays. This is advantageous for resampling, as there is just one set of coordinates but several data arrays (= weights), so repeated computations are prevented.
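
The equivalence of the two calls, for monotonically increasing bins, can be checked directly:

import numpy as np

a = np.array([0.5, 3.2, 7.7, 9.9])
b = np.linspace(0, 10, 11)  # monotonically increasing bin edges
print(np.digitize(a, b))                    # [ 1  4  8 10]
print(np.searchsorted(b, a, side='right'))  # same result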

srttools.histograms.histogram2d(x, y, bins=10, bin_range=None, normed=False, weights=None)

Compute the bi-dimensional histogram of two data samples.

Parameters:
x : array_like, shape (N,)

An array containing the x coordinates of the points to be histogrammed.

y : array_like, shape (N,)

An array containing the y coordinates of the points to be histogrammed.

bins : int or [int, int] or array_like or [array, array], optional

The bin specification:

  • If int, the number of bins for the two dimensions (nx=ny=bins).

  • If [int, int], the number of bins in each dimension (nx, ny = bins).

  • If array_like, the bin edges for the two dimensions (x_edges=y_edges=bins).

  • If [array, array], the bin edges in each dimension (x_edges, y_edges = bins).

bin_range : array_like, shape(2,2), optional

The leftmost and rightmost edges of the bins along each dimension (if not specified explicitly in the bins parameters): [[xmin, xmax], [ymin, ymax]]. All values outside of this range will be considered outliers and not tallied in the histogram.

normed : bool, optional

If False, returns the number of samples in each bin. If True, returns the bin density bin_count / sample_count / bin_area.

weights : array_like, shape(N,), optional

An array of values w_i weighing each sample (x_i, y_i). Weights are normalized to 1 if normed is True. If normed is False, the values of the returned histogram are equal to the sum of the weights belonging to the samples falling into each bin. Weights can also be a list of (weight arrays or None), in which case a list of histograms is returned as H.

Returns:
H : ndarray, shape(nx, ny)

The bi-dimensional histogram of samples x and y. Values in x are histogrammed along the first dimension and values in y are histogrammed along the second dimension.

xedges : ndarray, shape(nx,)

The bin edges along the first dimension.

yedges : ndarray, shape(ny,)

The bin edges along the second dimension.

See also

histogram()

1D histogram

histogramdd()

Multidimensional histogram

Notes

When normed is True, then the returned histogram is the sample density, defined such that the sum over bins of the product bin_value * bin_area is 1.

Please note that the histogram does not follow the Cartesian convention where x values are on the abscissa and y values on the ordinate axis. Rather, x is histogrammed along the first dimension of the array (vertical), and y along the second dimension of the array (horizontal). This ensures compatibility with histogramdd.

Examples

>>> import matplotlib as mpl
>>> import matplotlib.pyplot as plt

Construct a 2D-histogram with variable bin width. First define the bin edges:

>>> xedges = [0, 1, 1.5, 3, 5]
>>> yedges = [0, 2, 3, 4, 6]

Next we create a histogram H with random bin content:

>>> x = np.random.normal(3, 1, 100)
>>> y = np.random.normal(1, 1, 100)
>>> H, xedges, yedges = np.histogram2d(y, x, bins=(xedges, yedges))

Or we fill the histogram H with a determined bin content:

>>> H = np.ones((4, 4)).cumsum().reshape(4, 4)
>>> print(H[::-1])  # This shows the bin content in the order as plotted
[[ 13.  14.  15.  16.]
 [  9.  10.  11.  12.]
 [  5.   6.   7.   8.]
 [  1.   2.   3.   4.]]

Imshow can only do an equidistant representation of bins:

>>> fig = plt.figure(figsize=(7, 3))
>>> ax = fig.add_subplot(131)
>>> _ = ax.set_title('imshow: equidistant')
>>> im = plt.imshow(H, interpolation='nearest', origin='lower',
...                 extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]])

pcolormesh can display exact bin edges:

>>> ax = fig.add_subplot(132)
>>> _ = ax.set_title('pcolormesh: exact bin edges')
>>> X, Y = np.meshgrid(xedges, yedges)
>>> _ = ax.pcolormesh(X, Y, H)
>>> _ = ax.set_aspect('equal')

NonUniformImage displays exact bin edges with interpolation:

>>> ax = fig.add_subplot(133)
>>> _ = ax.set_title('NonUniformImage: interpolated')
>>> im = mpl.image.NonUniformImage(ax, interpolation='bilinear')
>>> xcenters = xedges[:-1] + 0.5 * (xedges[1:] - xedges[:-1])
>>> ycenters = yedges[:-1] + 0.5 * (yedges[1:] - yedges[:-1])
>>> _ = im.set_data(xcenters, ycenters, H)
>>> _ = ax.images.append(im)
>>> _ = ax.set_xlim(xedges[0], xedges[-1])
>>> _ = ax.set_ylim(yedges[0], yedges[-1])
>>> _ = ax.set_aspect('equal')
>>> _ = plt.close(fig)
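
A sketch of the list-of-weights extension described above (the exact layout of the returned H for list weights is inferred from the description):

import numpy as np
from srttools.histograms import histogram2d

x = np.random.uniform(0, 1, 1000)
y = np.random.uniform(0, 1, 1000)

# One set of coordinates, two weight arrays: the binning is computed once,
# and H is a list with one histogram per weight array.
H, xedges, yedges = histogram2d(x, y, bins=20,
                                weights=[np.ones_like(x), x * y])
H_counts, H_weighted = H
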
srttools.histograms.histogramdd(sample, bins=10, bin_range=None, normed=False, weights=None)

Compute the multidimensional histogram of some data.

Parameters:
sample : array_like

The data to be histogrammed. It must be an (N,D) array or data that can be converted to such. The rows of the resulting array are the coordinates of points in a D dimensional polytope.

bins : sequence or int, optional

The bin specification:

  • A sequence of arrays describing the bin edges along each dimension.

  • The number of bins for each dimension (nx, ny, … = bins)

  • The number of bins for all dimensions (nx=ny=…=bins).

bin_range : sequence, optional

A sequence of lower and upper bin edges to be used if the edges are not given explicitly in bins. Defaults to the minimum and maximum values along each dimension.

normed : bool, optional

If False, returns the number of samples in each bin. If True, returns the bin density bin_count / sample_count / bin_volume.

weights : array_like (N,), optional

An array of values w_i weighing each sample (x_i, y_i, z_i, …). Weights are normalized to 1 if normed is True. If normed is False, the values of the returned histogram are equal to the sum of the weights belonging to the samples falling into each bin. Weights can also be a list of (weight arrays or None), in which case a list of histograms is returned as H.

Returns:
H : ndarray

The multidimensional histogram of sample x. See normed and weights for the different possible semantics.

edges : list

A list of D arrays describing the bin edges for each dimension.

See also

histogram

1-D histogram

histogram2d

2-D histogram

Examples

>>> r = np.random.randn(100,3)
>>> H, edges = np.histogramdd(r, bins = (5, 8, 4))
>>> H.shape, edges[0].size, edges[1].size, edges[2].size
((5, 8, 4), 6, 9, 5)

srttools.imager module

Produce calibrated images.

SDTimage is a script that, given a list of cross scans composing an on-the-fly map, is able to calculate the map and save it in FITS format after cleaning the data.

class srttools.imager.ScanSet(data=None, norefilt=True, config_file=None, freqsplat=None, nofilt=False, nosub=False, plot=False, debug=False, **kwargs)

Bases: Table

Class obtained from a set of scans.

Once the scans are loaded, this class contains all functionality that will be used to produce (calibrated or uncalibrated) maps with WCS information.

Parameters:
data : str or None

data can be one of the following:

  • a config file, containing the information on the scans to load

  • an HDF5 archive, containing a former scanset

  • another ScanSet or an Astropy Table

config_file : str

Config file containing the parameters for the images and the directories containing the image and calibration data

norefilt : bool

See srttools.scan.Scan

freqsplat : str

See srttools.scan.interpret_frequency_range

nofilt : bool

See srttools.scan.clean_scan_using_variability

nosub : bool

See srttools.scan.Scan

Other Parameters:
kwargs : additional arguments

These will be passed to Scan initializers

Examples

>>> scanset = ScanSet()  # An empty scanset
>>> isinstance(scanset, ScanSet)
True
analyze_coordinates(frame='icrs')

Save statistical information on coordinates.

apply_user_filter(user_func=None, out_column=None)

Apply a user-supplied function as filter.

Parameters:
user_func : function

This function needs to accept a srttools.imager.ScanSet object as its only argument. It has to return an array with the same length as a column of the input dataset

out_column : str

column where the results will be stored

Returns:
retval : array

the result of user_func
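
A hypothetical filter function (the ‘el’ column name is an assumption, for illustration only):

import numpy as np

def low_elevation(scanset):
    # One boolean per row: True for samples below 30 degrees of elevation
    return scanset['el'] < np.radians(30)

# flagged = scanset.apply_user_filter(user_func=low_elevation,
#                                     out_column='low_el')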

barycenter_times()

Create barytime column with observing times converted to TDB.

calculate_beam_fom(im, cm=None, radius=0.3, label=None, use_log=False, show_plot=False)

Calculate various figures of merit (FOMs) in an image.

These FOMs are useful to single out asymmetries in a beam shape: for example, when characterizing the beam of the radio telescope using a map of a calibrator, it is useful to understand if there are lobes appearing only in one direction.

Parameters:
im : 2-d array

The image to be analyzed

Returns:
results_dict : dict

Dictionary containing the results

Other Parameters:
cm : [int, int]

‘Center of mass’ of the image

radius : float

The radius around the center of mass, in percentage of the image size (0 <= radius <= 0.5)

use_log : bool

Rescale the image to a log scale before calculating the coefficients. The scale is the same documented in the ds9 docs, for consistency. After normalizing the image from 0 to 1, the log-rescaled image is log(a*x + 1) / log(a), with x the normalized image and a a constant fixed here at 1000 (see the sketch after this parameter list)

show_plot : bool, default False

show the plots immediately
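
A minimal sketch of the ds9-style log rescaling described for use_log:

import numpy as np

def ds9_log_rescale(img, a=1000.0):
    # Normalize the image to [0, 1], then apply log(a * x + 1) / log(a)
    x = (img - img.min()) / (img.max() - img.min())
    return np.log(a * x + 1) / np.log(a)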

calculate_delta_altaz()

Construction of delta altaz coordinates.

Calculate the delta of the altazimuthal coordinates with respect to the position of the source.

calculate_images(no_offsets=False, frame='icrs', calibration=None, elevation=None, map_unit='Jy/beam', calibrate_scans=False, direction=None, onlychans=None)

Obtain image from all scans.

Other Parameters:
no_offsets: bool

use positions from feed 0 for all feeds.

frame: str

One of icrs, altaz, sun, galactic, default icrs

calibration: CalibratorTable

Optional Calibrator table for calibration, default None

elevation: Angle

Optional elevation angle. Defaults to mean elevation

map_unit: str

Only used if calibration is not None. One of Jy/beam or Jy/pixel

calibrate_scans: bool

Calibrate from subscans instead of from the binned image. Slower but more precise

direction: int

Optional: only select horizontal or vertical scans for this image. 0 if horizontal, 1 if vertical, defaults to None that means that all scans will be used.

onlychans: List

List of channels for which images are calculated. Defaults to all channels.

calculate_zernike_moments(im, cm=None, radius=0.3, norder=8, label=None, use_log=False)

Calculate the Zernike moments of the image.

These moments are useful to single out asymmetries in the image: for example, when characterizing the beam of the radio telescope using a map of a calibrator, it is useful to calculate these moments to understand if the beam is radially symmetric or has distorted side lobes.

Parameters:
im : 2-d array

The image to be analyzed

cm : [int, int]

‘Center of mass’ of the image

radius : float

The radius around the center of mass, in percentage of the image size (0 <= radius <= 0.5)

norder : int

Maximum order of the moments to calculate

Returns:
moments_dict : dict

Dictionary containing the order, the sub-index and the moment, e.g. {0: {0: 0.3}, 1: {0: 1e-16}, 2: {0: 0.95, 2: 6e-19}, …} Moments are symmetrical, so only the unique values are reported.

calibrate_images(calibration, elevation=0.7853981633974483, map_unit='Jy/beam', direction=None)

Calibrate the images.

property chan_columns
convert_coordinates(frame='icrs')

Convert the coordinates from sky to pixel.

create_wcs(frame='icrs')

Create a wcs object from the pointing information.

destripe_images(niter=10, npix_tol=None, **kwargs)
find_scans_through_pixel(x, y, test=False)

Find scans passing through a pixel.

fit_full_images(chans=None, fname=None, save_sdev=False, no_offsets=False, frame='icrs', calibration=None, excluded=None, par=None, map_unit='Jy/beam')

Flatten the baseline with a global fit.

Fit a linear trend to each scan to minimize the scatter in an image

get_coordinates(frame='icrs')

Give the coordinates as pairs of RA, DEC.

get_obstimes()

Get astropy.time.Time at the telescope location.

get_opacity(datadir=None, dirlist=None)

List all scans contained in the directory listed in config.

interactive_display(ch=None, recreate=False, test=False)

Modify original scans from the image display.

list_scans(datadir=None, dirlist=None)

List all scans contained in the directory listed in config.

load_scans(scan_list, freqsplat=None, nofilt=False, debug=False, **kwargs)

Load the scans in the list one by one.

read_images_from_meta()
reprocess_scans_through_pixel(x, y, test=False)

Given a pixel in the image, find all scans passing through it.

rerun_scan_analysis(x, y, key, test=False)

Rerun the analysis of single scans.

save_ds9_images(fname=None, save_sdev=False, scrunch=False, no_offsets=False, frame='icrs', calibration=None, map_unit='Jy/beam', calibrate_scans=False, destripe=False, npix_tol=None, bad_chans=[])

Save a ds9-compatible file with one image per extension.

property scan_list
scrunch_images(bad_chans=[])

Sum the images from all channels.

update_meta_with_images()
update_scan(sname, sid, dim, zap_info, fit_info, flag_info, test=False)

Update a scan in the scanset after filtering.

write(fname, *args, **kwargs)

Same as Table.write, but adds path information for HDF5.

srttools.interactive_filter module

Interactive operations.

class srttools.interactive_filter.DataSelector(xs, ys, ax1, ax2, masks=None, xlabel=None, title=None, test=False)

Bases: object

Plot and process scans interactively.

Initialize.

align_all()

Align all the other scans to the selected one.

base(event)

Add an interval to the ones that will be used by baseline sub.

flag(value=True)
on_click(event)

Dummy function, in case I want to do something with a click.

on_key(event)

Do something when the keyboard is used.

on_pick(event)

Do this when I pick a line in the plot.

plot_all(silent=False)

Plot everything.

print_info()

Print info on the current scan.

Info includes zapped intervals and fit parameters.

print_instructions()

Print to terminal some instructions for the interactive window.

quit()
reset()
subtract_baseline()

Subtract the baseline based on the selected intervals.

subtract_model(channel)

Subtract the model from the scan.

zap(event)

Create a zap interval.

class srttools.interactive_filter.ImageSelector(data, ax, fun=None, test=False)

Bases: object

Return xs and ys of the image, and the key that was pressed.

Attributes:
img : array

the image

ax : pyplot.axis instance

the axis where the image will be plotted

fun : function

the function to call when a key is pressed. It must accept three arguments: x, y and key

Initialize an ImageSelector class.

Parameters:
data : array

the image

ax : pyplot.axis instance

the axis where the image will be plotted

fun : function, optional

the function to call when a key is pressed. It must accept three arguments: x, y and key

on_key(event)

Do this when the keyboard is pressed.

plot_img()

Plot the image on the interactive display.

exception srttools.interactive_filter.PlotWarning

Bases: UserWarning

exception srttools.interactive_filter.TestWarning

Bases: UserWarning

srttools.interactive_filter.create_empty_info(keys)
class srttools.interactive_filter.intervals

Bases: object

A list of xs and ys of the points taken during interactive selection.

Initialize.

add(point)

Append points.

clear()

Clear.

srttools.interactive_filter.mask(xs, border_xs, invert=False)

Create mask from a list of interval borders.

Parameters:
xs : array

the array of values to filter

border_xs : array

the list of borders. Should be an even number of positions

Returns:
mask : array

Array of boolean values, that work as a mask to xs

Other Parameters:
invert : bool

Mask value is False if invert = False, and vice versa. E.g. for zapped intervals, invert = False. For baseline fit selections, invert = True
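
A usage sketch, zapping the interval between x = 3 and x = 6 (the treatment of points exactly on a border is an implementation detail):

import numpy as np
from srttools.interactive_filter import mask

xs = np.arange(10)
# invert=False: points inside the borders are masked out (False)
zap_mask = mask(xs, np.array([3, 6]))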

srttools.interactive_filter.select_data(xs, ys, masks=None, title=None, xlabel=None, test=False)

Open a DataSelector window.

srttools.io module

Input/output functions.

srttools.io.correct_offsets(obs_angle, xoffset, yoffset)

Correct feed offsets for derotation angle.

All angles are in radians.

Examples

>>> x = 2 ** 0.5
>>> y = 2 ** 0.5
>>> angle = np.pi / 4
>>> xoff, yoff = correct_offsets(angle, x, y)
>>> np.allclose([xoff, yoff], 2 ** 0.5)
True
srttools.io.detect_data_kind(fname)

Placeholder for function that recognizes data format.

srttools.io.get_chan_columns(table)
srttools.io.get_rest_angle(xoffsets, yoffsets)

Calculate the rest angle for multifeed.

The first feed is assumed to be at position 0; for it, the return value is 0.

Examples

>>> xoffsets = [0.0, -0.0382222, -0.0191226, 0.0191226, 0.0382222,
...             0.0191226, -0.0191226]
>>> yoffsets = [0.0, 0.0, 0.0331014, 0.0331014, 0.0, -0.0331014,
...             -0.0331014]
>>> np.allclose(get_rest_angle(xoffsets, yoffsets).to(u.deg).value,
...             np.array([0., 180., 120., 60., 360., 300., 240.]))
True
srttools.io.mkdir_p(path)

Safe mkdir function.

Parameters:
path : str

Name of the directory/ies to create

Notes

Found at https://stackoverflow.com/questions/600268/mkdir-p-functionality-in-python

srttools.io.observing_angle(rest_angle, derot_angle)

Calculate the observing angle of the multifeed.

If values have no units, they are assumed in radians

Parameters:
rest_angle : float or Astropy quantity, angle

rest angle of the feeds

derot_angle : float or Astropy quantity, angle

derotator angle

Examples

>>> observing_angle(0 * u.rad, TWOPI * u.rad).to(u.rad).value
0.0
>>> observing_angle(0, TWOPI).to(u.rad).value
0.0
srttools.io.print_obs_info_fitszilla(fname)

Placeholder for function that prints out observing information.

srttools.io.read_data(fname)

Read the data, whatever the format, and return them.

srttools.io.read_data_fitszilla(fname)
srttools.io.root_name(fname)

Return the file name without extension.

srttools.opacity module

srttools.opacity.calculate_opacity(file, plot=True, tatm=None, tau0=None, t0=None)

Calculate opacity from a skydip scan.

Atmosphere temperature is fixed, from Buffa et al.’s calculations.

Parameters:
file : str

File name of the skydip scan, in FITS format

plot : bool

Plot diagnostics about the fit

Returns:
opacities : dict

Dictionary containing the opacities calculated for each channel, plus the time in the middle of the observation.

Other Parameters:
tatm : float

Atmospheric temperature (fixed in the fit). The default value is calculated from an empirical formula

tau0 : float

Initial opacity in the fit. The default value is np.log(2 / (1 + np.sqrt(1 - 4 * (t30 - t90) / tatm))), where t30 and t90 are the Tsys values calculated at 30 and 90 degrees of elevation respectively.

t0 : float

Initial value for Tsys in the fit.

srttools.opacity.exptau(airmass, tatm, tau, t0)

Function to fit to the T vs airmass data.
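
A common parametrization for this kind of skydip model is sketched below; the exact form used by srttools may differ:

import numpy as np

def exptau_sketch(airmass, tatm, tau, t0):
    # The atmosphere, at temperature tatm and zenith opacity tau, becomes
    # optically thicker at high airmass; t0 collects the constant terms.
    return t0 + tatm * (1 - np.exp(-tau * airmass))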

srttools.opacity.main_opacity(args=None)

srttools.read_config module

Read the configuration file.

srttools.read_config.get_config_file()

Get the current config file.

srttools.read_config.read_config(fname=None)

Read a config file and return a dictionary of all entries.

srttools.read_config.sample_config_file(fname='sample_config_file.ini')

Create a sample config file, to be modified by hand.

srttools.scan module

Scan class.

class srttools.scan.Scan(data=None, config_file=None, norefilt=True, interactive=False, nosave=False, debug=False, plot=False, freqsplat=None, nofilt=False, nosub=False, avoid_regions=None, save_spectrum=False, debug_file_format=None, **kwargs)

Bases: Table

Class containing a single scan.

Load a Scan object.

Parameters:
data : str or None

data can be one of the following: None, in which case an empty Scan object is created; a FITS or HDF5 archive, containing an on-the-fly or cross scan in one of the accepted formats; another srttools.scan.Scan or astropy.table.Table object

config_file : str

Config file containing the parameters for the images and the directories containing the image and calibration data

norefilt : bool

If an HDF5 archive is present with the same basename as the input FITS file, do not re-run the filtering (default True)

freqsplat : str

See srttools.scan.interpret_frequency_range

nofilt : bool

See srttools.scan.clean_scan_using_variability

nosub : bool

Do not run the baseline subtraction.

Other Parameters:
kwargs : additional arguments

These will be passed to the astropy.table.Table initializer

baseline_subtract(kind='als', plot=False, avoid_regions=None, **kwargs)

Subtract the baseline.

Parameters:
kind : str

If ‘als’, use the Asymmetric Least Square fitting in srttools.fit.baseline_als(), using a very stiff baseline (lam=1e11). If ‘rough’, use srttools.fit.baseline_rough() instead.

Other Parameters:
plot : bool

Plot diagnostic information in an image with the same basename as the FITS file plus an additional label corresponding to the channel, in PNG format.

avoid_regions : [[r0_ra, r0_dec, r0_radius], [r1_ra, r1_dec, r1_radius]]

Exclude these regions from the fit

chan_columns()

List columns containing samples.

check_order()

Check that times in a scan are monotonically increasing.

clean_and_splat(good_mask=None, freqsplat=None, noise_threshold=5, debug=True, plot=True, save_spectrum=False, nofilt=False)

Clean from RFI.

The cleaning is very rough for now; it will become more sophisticated eventually.

Parameters:
good_mask : boolean array

this mask specifies intervals that should never be discarded as RFI, for example because they contain spectral lines

freqsplat : str

List of frequencies to be merged into one. See srttools.scan.interpret_frequency_range()

noise_threshold : float

The threshold, in sigmas, over which a given channel is considered noisy

Returns:
masks : dictionary of boolean arrays

this dictionary contains, for each detector/polarization, True values for good spectral channels, and False for bad channels.

Other Parameters:
save_spectrum : bool, default False

Save the spectrum into a ‘ChX_spec’ column

debug : bool, default True

Be verbose

plot : bool, default True

Save images with quicklook information on single scans

nofilt : bool

Do not filter noisy channels (see clean_scan_using_variability())

get_info_string(ch)
interactive_filter(save=True, test=False)

Run the interactive filter.

root_name(fname)
save(fname=None)

Call self.write with a default filename, or specify it.

write(fname, *args, **kwargs)

Same as Table.write, but adds path information for HDF5.

srttools.scan.clean_scan_using_variability(dynamical_spectrum, length, bandwidth, good_mask=None, freqsplat=None, noise_threshold=5.0, debug=True, plot=True, nofilt=False, outfile='out', label='', smoothing_window=0.05, debug_file_format='png', info_string='Empty info string', dpi=100)

Clean a spectroscopic scan using the difference of channel variability.

From the dynamical spectrum, i.e. the list of spectra obtained in each sample of a scan, we calculate the rms variability of each frequency channel. This forms a sort of rms spectrum. We calculate the baseline of this spectrum, and all channels whose rms is above noise_threshold times the reference median absolute deviation (srttools.fit.ref_mad()), calculated with a minimum window of 20 samples, are cut and assigned an interpolated value between the closest valid points. The baseline is calculated with srttools.fit.baseline_als(), using a lambda value depending on the number of channels, with a formula that has been shown to work in a few standard cases but might be modified in the future.

Parameters:
dynamical_spectrum : 2-d array

Array of shape MxN, with M spectra of N elements each.

length : float

Duration in seconds of the scan (assumed to have constant sample time)

bandwidth : float

Bandwidth in MHz

Returns:
results : object

The attributes of this object are:

lc : array-like

The cleaned light curve

freqmin : float

Minimum frequency in MHz, referred to the local oscillator

freqmax : float

Maximum frequency in MHz, referred to the local oscillator

Other Parameters:
good_mask : boolean array

this mask specifies channels that should never be discarded as RFI, for example because they contain spectral lines

freqsplat : str

List of frequencies to be merged into one. See srttools.scan.interpret_frequency_range()

noise_threshold : float

The threshold, in sigmas, over which a given channel is considered noisy

debug : bool

Print out debugging information

plot : bool

Plot stuff

nofilt : bool

Do not filter noisy channels (set noise_threshold to 1e32)

outfile : str

Root file name for the diagnostics plots (outfile_label.png)

label : str

Label to append to the filename (outfile_label.png)

smoothing_window : float

Width of the smoothing window, as a fraction of the spectral length
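
A sketch of the first step described above, the per-channel rms "spectrum" of the dynamical spectrum (random data for illustration):

import numpy as np

# M spectra of N channels each (M samples along the scan)
dynamical_spectrum = np.random.normal(loc=100., scale=1., size=(500, 1024))

# rms variability of each frequency channel: the channels standing out
# from the baseline of this "rms spectrum" are flagged as RFI
rms_spectrum = np.std(dynamical_spectrum, axis=0)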

srttools.scan.interpret_frequency_range(freqsplat, bandwidth, nbin)

Interpret the frequency range specified in freqsplat.

Parameters:
freqsplat : str

Frequency specification. If None, it defaults to the interval 10%-90% of the bandwidth. If ‘:’, it considers the full bandwidth. If ‘f0:f1’, where f0 and f1 are floats/ints, f0 and f1 are interpreted as start and end frequency in MHz, referred to the local oscillator (LO; e.g., if ‘100:400’, at 6.9 GHz this will mean the interval 7.0-7.3 GHz)

bandwidth : float

The bandwidth in MHz

nbin : int

The number of bins in the spectrum

Returns:
freqmin : float

The minimum frequency in the band (ref. to LO), in MHz

freqmax : float

The maximum frequency in the band (ref. to LO), in MHz

binmin : int

The minimum spectral bin

binmax : int

The maximum spectral bin

Examples

>>> interpret_frequency_range(None, 1024, 512)
(102.4, 921.6, 51, 459)
>>> interpret_frequency_range('default', 1024, 512)
(102.4, 921.6, 51, 459)
>>> interpret_frequency_range(':', 1024, 512)
(0, 1024, 0, 511)
>>> interpret_frequency_range('all', 1024, 512)
(0, 1024, 0, 511)
>>> interpret_frequency_range('200:800', 1024, 512)
(200.0, 800.0, 100, 399)
srttools.scan.list_scans(datadir, dirlist)

List all scans contained in the directory listed in config.

srttools.simulate module

Functions to simulate scans and maps.

srttools.simulate.save_scan(times, ra, dec, channels, filename='out.fits', other_columns=None, other_keywords=None, scan_type=None, src_ra=None, src_dec=None, srcname='Dummy', counts_to_K=0.03)

Save a simulated scan in fitszilla format.

Parameters:
times : iterable

times corresponding to each bin center, in seconds

ra : iterable

RA corresponding to each bin center

dec : iterable

Dec corresponding to each bin center

channels : {‘Ch0’: array([…]), ‘Ch1’: array([…]), …}

Dictionary containing the count array. Keys represent the name of the channel

filename : str

Output file name

srcname : str

Name of the source

counts_to_K : float, array or dict

Conversion factor between counts and K. If array, it has to be the same length as channels.keys()

srttools.simulate.simulate_map(dt=0.04, length_ra=120.0, length_dec=120.0, speed=4.0, spacing=0.5, count_map=None, noise_amplitude=1.0, width_ra=None, width_dec=None, outdir='sim/', baseline='flat', mean_ra=180, mean_dec=70, srcname='Dummy', channel_ratio=1, nbin=1, debug=False, start_time=0, radec_labels_in_srcname=False)

Simulate a map.

Parameters:
dt : float

The integration time in seconds

length_ra, length_dec : float

Lengths of the scans along RA and Dec, in arcminutes

speed : float

Speed of the scan in arcminutes / second

noise_amplitude : float

Noise level in counts

spacing : float

Spacing between scans, in arcminutes

baseline : str

“flat”, “slope” (linearly increasing/decreasing), “messy” (random walk) or a number (which gives an amplitude to the random-walk baseline, that is 20 for “messy”).

count_map : function

Flux distribution function, centered on zero

outdir : str or iterable (str, str)

If a single string, put all files in that directory; if two strings, put RA and DEC scans in the two directories.

channel_ratio : float

Ratio between the counts in the two channels

radec_labels_in_srcname : bool

If True, add RA and Dec labels to the source name
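
A minimal usage sketch, writing a small simulated map with the default source and baseline settings under ./sim/:

from srttools.simulate import simulate_map

simulate_map(dt=0.04, length_ra=30., length_dec=30., speed=4.,
             spacing=0.5, outdir='sim/')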

srttools.simulate.simulate_scan(dt=0.04, length=120.0, speed=4.0, shape=None, noise_amplitude=1.0, center=0.0, baseline='flat', nbin=1, calon=False, nsamples=None)

Simulate a scan.

Parameters:
dt : float

The integration time in seconds

length : float

Length of the scan in arcminutes

speed : float

Speed of the scan in arcminutes / second

shape : function

Function that describes the shape of the scan. If None, a constant scan is assumed. The zero point of the scan is in the center of it

noise_amplitude : float

Noise level in counts

center : float

Center coordinate in degrees

baseline : str, number or tuple

“flat”, “slope” (linearly increasing/decreasing), “messy” (random walk), a number (which gives an amplitude to the random-walk baseline, that is 20 for “messy”), or a tuple (m, q, messy_amp) giving the maximum and minimum absolute-value slope and intercept, and the random-walk amplitude.