PyHRF is a set of tools for within-subject fMRI data analysis, focused on the characterization of the hemodynamic response.
Within the chain of fMRI data processing, these tools provide alternatives to the classical within-subject GLM estimation step. The inputs are preprocessed within-subject data and the outputs are statistical maps and/or fitted HRFs.
The package is mainly written in Python and provides the implementation of two main estimation methods; check the PyHRF website for details.
pyhrf.
Verbose
(verbosity=0, log=<open file '<stdout>', mode 'w'>)¶Bases: pyhrf._verbose.Verbose
This is a dummy class implementing the original Verbose class.
It exists only to be able to raise a warning when this old implementation is used.
old_to_new_log_dict
= {0: 30, 1: 20, 2: 20, 3: 20, 4: 20, 5: 10, 6: 10}¶Module for BOLD signal synthesis according to the linear and time-invariant (LTI) model described in:
- Makni, S., Ciuciu, P., Idier, J., & Poline, J.-B. (2005). Joint detection-estimation of brain activity in functional MRI: a Multichannel Deconvolution solution. IEEE Transactions on Signal Processing, 53(9), 3488–3502. http://doi.org/10.1109/TSP.2005.853303
It also provides a set of plotting functions (based on matplotlib) - see L{boldsynth.plot}
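As a rough illustration of that LTI model (this is not PyHRF code; the HRF shape, onsets and response levels below are made-up values), the stimulus-induced part of the signal is the sum over conditions of the binary paradigm sequence convolved with the HRF and scaled by the response level, then downsampled to the scan rate and corrupted by noise:

import numpy as np

nb_scans, dt, tr = 100, 0.6, 2.4
hrf = np.exp(-0.5 * ((np.arange(0, 25, dt) - 6.) / 2.) ** 2)    # crude stand-in HRF (no undershoot)
onsets = {'audio': [3.0, 30.0, 57.0], 'video': [15.0, 42.0]}    # onset times in seconds
nrls = {'audio': 2.0, 'video': 1.2}                             # neural response levels

n_pts = int(nb_scans * tr / dt)
signal = np.zeros(n_pts)
for cond, ons in onsets.items():
    x = np.zeros(n_pts)                                   # binary paradigm on the dt grid
    x[np.round(np.array(ons) / dt).astype(int)] = 1.
    signal += nrls[cond] * np.convolve(x, hrf)[:n_pts]    # a^m X^m h for condition m
dsf = int(tr / dt)                                        # downsampling factor
bold = signal[::dsf] + 0.3 * np.random.randn(nb_scans)    # downsample to TR and add noise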
pyhrf.boldsynth.pottsfield.swendsenwang.
CptDefaultGraphLinks
(RefGraph)¶computes a default list GraphLinks from RefGraph
input:
output:
pyhrf.boldsynth.pottsfield.swendsenwang.
CptDefaultGraphNodesLabels
(RefGraph)¶computes a default list GraphNodesLabels from RefGraph
input:
output:
pyhrf.boldsynth.pottsfield.swendsenwang.
CptDefaultGraphWeight
(RefGraph)¶computes a default list GraphWeight from RefGraph. Each edge weight is set to 1.0.
input:
output:
pyhrf.boldsynth.pottsfield.swendsenwang.
CptRefGrphNgbhPosi
(RefGraph)¶computes the critical list RefGrphNgbhPosi from RefGraph
input:
output:
pyhrf.boldsynth.pottsfield.swendsenwang.
Cpt_U_graph
(RefGraph, GraphNodesLabels, GraphWeight=None)¶Computes an estimation of U(Graph)
inputs:
output:
pyhrf.boldsynth.pottsfield.swendsenwang.
Cpt_Vec_U_graph
(RefGraph, beta, LabelsNb, SamplesNb, GraphWeight=None, GraphNodesLabels=None, GraphLinks=None, RefGrphNgbhPosi=None)¶Computes a given number of U for fields generated according to a given normalization constant Beta. Swendsen-Wang sampling is used to generate fields.
input:
output:
pyhrf.boldsynth.pottsfield.swendsenwang.
GraphBetaMix
(RefGraph, GraphNodesLabels, beta=0.5, NbLabels=2, NbIt=5, weights=None)¶Generate a partition in GraphNodesLabels with respect to beta.
input:
output:
pyhrf.boldsynth.pottsfield.swendsenwang.
GraphToImage
(GraphNodesCoord, GraphNodesLabels, NBZ, NBY, NBX)¶Computes a 3D image from a connectivity graph.
input:
output:
pyhrf.boldsynth.pottsfield.swendsenwang.
ImageToGraph
(Image, Mask, LabelOI=1, ConnectivityType=6)¶Computes the connectivity graph of an image restricted to the voxels of a 3D mask.
inputs:
outputs:
pyhrf.boldsynth.pottsfield.swendsenwang.
MaskToGraph
(Mask, LabelOI=1, ConnectivityType=6)¶Computes the connectivity graph of the voxels in a 3D mask.
inputs:
outputs:
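For orientation, the neighbour-list graph produced from a mask can be pictured with plain numpy as below (an illustrative reconstruction of the 6-connectivity case, not the actual MaskToGraph code; its real outputs may contain additional items such as node coordinates):

import numpy as np

mask = np.zeros((3, 3, 3), dtype=int)
mask[1, 1, :] = 1                                        # three voxels labelled 1
coords = np.array(np.where(mask == 1)).T                 # coordinates of the masked voxels
index = {tuple(c): i for i, c in enumerate(coords)}      # voxel coordinate -> node index
offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
           (0, -1, 0), (0, 0, 1), (0, 0, -1)]            # 6-connectivity
graph = [[index[tuple(c + np.array(o))] for o in offsets
          if tuple(c + np.array(o)) in index]
         for c in coords]
# each entry of 'graph' lists the node indices of a voxel's in-mask neighbours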
pyhrf.boldsynth.pottsfield.swendsenwang.
SwendsenWangSampler_graph
(RefGraph, GraphNodesLabels, beta, NbLabels, GraphLinks=None, RefGrphNgbhPosi=None, method=1, weights=None)¶image sampling with Swendsen-Wang algorithm
input:
output:
pyhrf.boldsynth.pottsfield.swendsenwang.
linkNodes
(RefGraph, beta, GraphNodesLabels, GraphLinks, RefGrphNgbhPosi)¶pyhrf.boldsynth.pottsfield.swendsenwang.
linkNodesSets
(RefGraph, beta, GraphNodesLabels, links, weights=None)¶pyhrf.boldsynth.pottsfield.swendsenwang.
pickLabels
(RefGraph, GraphLinks, GraphNodesLabels, NbLabels, TempVec, NextTempVec)¶pyhrf.boldsynth.pottsfield.swendsenwang.
set_cluster_labels
(links, labels, nbClasses)¶pyhrf.boldsynth.pottsfield.swendsenwang.
walkCluster
(i, links, labels, l, remainingIndexes)¶pyhrf.boldsynth.field.
count_homo_cliques
(graph, labels, weights=None)¶pyhrf.boldsynth.field.
genPepperSaltField
(size, nbLabels, initProps=None)¶pyhrf.boldsynth.field.
genPotts
(graph, beta, nbLabels=2, labelsIni=None, method='SW', weights=None)¶Simulate a realisation of a Potts field with spatial correlation amount ‘beta’. ‘graph’ is a list of lists, i.e. the neighbours of each node index. ‘nbLabels’ is the number of labels. ‘method’ can be either ‘SW’ (Swendsen-Wang) or ‘gibbs’.
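A minimal usage sketch, assuming the neighbour-list graph convention described above (the exact type and shape of the returned labels should be checked against the installed PyHRF version):

from pyhrf.boldsynth.field import genPotts

graph = [[1, 2], [0, 3], [0, 3], [1, 2]]   # 2x2 lattice, 4-connectivity, as neighbour lists
labels = genPotts(graph, beta=0.7, nbLabels=2, method='SW')
# 'labels' is expected to hold one label in {0, 1} per node; larger beta gives
# spatially smoother (more correlated) label configurations.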
pyhrf.boldsynth.field.
genPottsMap
(mask, beta, nbLabels, method='SW')¶pyhrf.boldsynth.field.
potts_generator
(**args)¶pyhrf.boldsynth.field.
random_field_generator
(size, nbClasses)¶pyhrf.boldsynth.hrf.
bezierCurve
(p1, pc1, p2, pc2, xPrecision)¶pyhrf.boldsynth.hrf.
buildFiniteDiffMatrix
(order, size)¶pyhrf.boldsynth.hrf.
genBezierHRF
(timeAxis=array([ 0. , 0.6, 1.2, 1.8, 2.4, 3. , 3.6, 4.2, 4.8, 5.4, 6. , 6.6, 7.2, 7.8, 8.4, 9. , 9.6, 10.2, 10.8, 11.4, 12. , 12.6, 13.2, 13.8, 14.4, 15. , 15.6, 16.2, 16.8, 17.4, 18. , 18.6, 19.2, 19.8, 20.4, 21. , 21.6, 22.2, 22.8, 23.4, 24. , 24.6, 25.2]), pic=[6, 1], picw=2, ushoot=[15, -0.2], ushootw=3, normalize=False)¶pyhrf.boldsynth.hrf.
genCanoBezierHRF
(duration=25.0, dt=0.6, normalize=False)¶pyhrf.boldsynth.hrf.
genExpHRF
(timeAxis=array([ 0., 0.5, 1., 1.5, 2., 2.5, 3., 3.5, 4., 4.5, 5., 5.5, 6., 6.5, 7., 7.5, 8., 8.5, 9., 9.5, 10., 10.5, 11., 11.5, 12., 12.5, 13., 13.5, 14., 14.5, 15., 15.5, 16., 16.5, 17., 17.5, 18., 18.5, 19., 19.5, 20., 20.5, 21., 21.5, 22., 22.5, 23., 23.5, 24., 24.5]), ttp=6, pa=1, pw=0.2, ttu=11, ua=0.2, uw=0.01)¶pyhrf.boldsynth.hrf.
genGaussianSmoothHRF
(zc, length, eventdt, rh, order=2)¶pyhrf.boldsynth.hrf.
genPriorCov
(zc, pprcov, dt)¶pyhrf.boldsynth.hrf.
getCanoHRF
(duration=25, dt=0.6, hrf_from_spm=True, delay_of_response=6.0, delay_of_undershoot=16.0, dispersion_of_response=1.0, dispersion_of_undershoot=1.0, ratio_resp_under=6.0, delay=0.0)¶Compute the canonical HRF.
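The default parameters above (delay of response 6 s, delay of undershoot 16 s, unit dispersions, response/undershoot ratio 6) are those of the usual SPM-style double-gamma HRF. A standalone sketch of that form is given below; it is not guaranteed to be numerically identical to getCanoHRF, which should be preferred in practice:

import numpy as np
from scipy.stats import gamma

def double_gamma_hrf(duration=25.0, dt=0.6, delay_resp=6.0, delay_under=16.0,
                     disp_resp=1.0, disp_under=1.0, ratio=6.0):
    # response gamma minus a scaled undershoot gamma, sampled every dt seconds
    t = np.arange(0.0, duration + dt, dt)
    h = (gamma.pdf(t, delay_resp / disp_resp, scale=disp_resp)
         - gamma.pdf(t, delay_under / disp_under, scale=disp_under) / ratio)
    return t, h / np.abs(h).max()   # peak-normalised

time_axis, hrf = double_gamma_hrf()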
pyhrf.boldsynth.hrf.
getCanoHRF_tderivative
(duration=25.0, dt=0.5)¶pyhrf.boldsynth.scenarios.
build_ctrl_tag_matrix
(asl_shape)¶pyhrf.boldsynth.scenarios.
calc_asl_shape
(bold_stim_induced, dsf)¶pyhrf.boldsynth.scenarios.
createBiGaussCovarNRL
(condition_defs, labels, covariance)¶pyhrf.boldsynth.scenarios.
create_3Dlabels_Potts
(condition_defs, beta, dims, mask)¶pyhrf.boldsynth.scenarios.
create_AR_noise
(bold_shape, v_noise, order=2, v_corr=0.1)¶pyhrf.boldsynth.scenarios.
create_Xh
(nrls, rastered_paradigm, hrf, condition_defs, dt, hrf_territories=None)¶Retrieve the product X.h
pyhrf.boldsynth.scenarios.
create_alpha_for_hrfgroup
(alpha_var)¶Create alpha from a normal distribution, for one subject
pyhrf.boldsynth.scenarios.
create_asl_from_stim_induced
(bold_stim_induced, perf_stim_induced, ctrl_tag_mat, dsf, perf_baseline, noise, drift=None, outliers=None)¶Downsample the stim_induced signal according to the downsampling factor ‘dsf’ and add noise and drift (nuisance signals), which have to be at the downsampled temporal resolution.
pyhrf.boldsynth.scenarios.
create_bigaussian_nrls
(labels, mean_act, var_act, var_inact)¶Simulate bi-Gaussian NRLs (zero-centered inactive component)
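Conceptually this amounts to drawing each voxel's NRL from one of two Gaussians depending on its activation label; a pure-numpy sketch with made-up values:

import numpy as np

rng = np.random.RandomState(0)
labels = rng.binomial(1, 0.4, size=50)                 # 1 = activated voxel, 0 = inactivated
mean_act, var_act, var_inact = 5.0, 0.5, 0.3
nrls = np.where(labels == 1,
                rng.normal(mean_act, np.sqrt(var_act), size=labels.size),
                rng.normal(0.0, np.sqrt(var_inact), size=labels.size))   # zero-centered inactive class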
pyhrf.boldsynth.scenarios.
create_bold
(stim_induced_signal, dsf, noise, drift=None, outliers=None)¶A shortcut for the function create_bold_from_stim_induced
pyhrf.boldsynth.scenarios.
create_bold_controlled_variance
(stim_induced_signal, alpha, nb_voxels, dsf, nrls, Xh, drift=None, outliers=None)¶Create BOLD with controlled explained variance; alpha is the percentage of the total variance that is explained.
pyhrf.boldsynth.scenarios.
create_bold_from_stim_induced
(stim_induced_signal, dsf, noise, drift=None, outliers=None)¶Downsample the stim_induced signal according to the downsampling factor ‘dsf’ and add noise and drift (nuisance signals), which have to be at the downsampled temporal resolution.
pyhrf.boldsynth.scenarios.
create_bold_from_stim_induced_RealNoise
(stim_induced_signal, dsf, noise, drift)¶Downsample the stim_induced signal according to the downsampling factor ‘dsf’ and add noise and drift (nuisance signals), which have to be at the downsampled temporal resolution.
pyhrf.boldsynth.scenarios.
create_bold_stim_induced_signal
(brls, rastered_paradigm, brf, condition_defs, dt, hrf_territories=None)¶Create a stimulus induced signal for ASL from BOLD response levels, paradigm and BRF (sum_{m=1}^M a^m X^m h + sum_{m=1}^M c^m W X^m g). For each condition, compute the convolution of the paradigm binary sequence ‘rastered_paradigm’ with the given BRF and multiply by brls. Finally compute the sum over conditions.
Return an ASL array of shape (nb scans, nb voxels)
pyhrf.boldsynth.scenarios.
create_canonical_hrf
(hrf_duration=25.0, dt=0.5)¶pyhrf.boldsynth.scenarios.
create_connected_label_clusters
(condition_defs, activ_label_graph)¶pyhrf.boldsynth.scenarios.
create_drift_coeffs
(bold_shape, drift_order, drift_coeff_var)¶pyhrf.boldsynth.scenarios.
create_drift_coeffs_asl
(asl_shape, drift_order, drift_var)¶pyhrf.boldsynth.scenarios.
create_gaussian_hrf_subject
(hrf_group, var_subject_hrf, dt, alpha=0.0)¶Creation of the HRF for one subject. Use the group-level HRF and a per-subject variance (var_subjects_hrfs must be a list). Simulated HRFs must be smooth enough: correlation between temporal coefficients.
pyhrf.boldsynth.scenarios.
create_gaussian_noise
(bold_shape, v_noise, m_noise=0.0)¶pyhrf.boldsynth.scenarios.
create_gaussian_noise_asl
(asl_shape, v_gnoise, m_noise=0.0)¶pyhrf.boldsynth.scenarios.
create_gaussian_nrls_sessions_and_mean
(nrls, condition_defs, labels, var_sess)¶Creation of NRLs by session (and by voxel and condition), for one session ‘sess’. The session NRLs vary around an NRL mean (nrls_bar) defined by voxel and condition (var_sess is the variance of the session-specific NRLs around nrl_bar). Here “nrls” is nrls_bar, the mean over subjects!
pyhrf.boldsynth.scenarios.
create_gsmooth_hrf
(hrf_duration=25.0, dt=0.5, order=2, hrf_var=1.0, zc=True, normalize_hrf=True)¶Create a smooth HRF according to the multivariate Gaussian prior used in JDE. hrf_duration and dt are the HRF duration and temporal resolution, respectively (in sec.). order is the derivative order constraining the covariance matrix. hrf_var is the HRF variance. zc is a flag to impose zeros at the beginning and the end of the HRF.
Return: a numpy array of HRF coefficients
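A plain-numpy sketch of that prior (assumptions: a squared second-order finite-difference operator as precision matrix, a pseudo-inverse for the covariance, and hand-set endpoints for the zero constraint; the actual PyHRF implementation may differ in normalisation details):

import numpy as np

dt, duration, hrf_var = 0.5, 25.0, 1.0
n = int(duration / dt) + 1
D = np.zeros((n - 2, n))                      # second-order finite-difference operator
for i in range(n - 2):
    D[i, i:i + 3] = [1.0, -2.0, 1.0]
R = D.T.dot(D)                                # smoothness prior precision (rank-deficient)
cov = hrf_var * np.linalg.pinv(R)             # covariance via pseudo-inverse
hrf = np.random.multivariate_normal(np.zeros(n), cov, check_valid='ignore')
hrf[0] = hrf[-1] = 0.0                        # zero constraint at both ends (zc=True)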
pyhrf.boldsynth.scenarios.
create_hrf
(picw, pic, under=2, hrf_duration=25.0, dt=0.5)¶pyhrf.boldsynth.scenarios.
create_hrf_from_territories
(hrf_territories, primary_hrfs)¶pyhrf.boldsynth.scenarios.
create_labels_Potts
(condition_defs, beta, nb_voxels)¶pyhrf.boldsynth.scenarios.
create_labels_vol
(condition_defs)¶Create a set of labels from the field “label_map” in condition_defs. Available choices for the field label_map: - ‘random_small’: binary labels are randomly generated with shape (1,5,5) - a tag (str): corresponds to a png file in pyhrf data files - a 3D numpy array containing the labels
pyhrf.boldsynth.scenarios.
create_language_paradigm
(condition_defs)¶pyhrf.boldsynth.scenarios.
create_localizer_paradigm
(condition_defs, paradigm_label='av')¶pyhrf.boldsynth.scenarios.
create_localizer_paradigm_a
(condition_defs)¶pyhrf.boldsynth.scenarios.
create_localizer_paradigm_avd
(condition_defs)¶pyhrf.boldsynth.scenarios.
create_multisess_stim_induced_signal
(nrls_session, rastered_paradigm, hrf, condition_defs, dt, hrf_territories=None)¶Create a stimulus induced signal from neural response levels, paradigm and HRF (sum_{m=1}^M a^m X^m h) For each condition, compute the convolution of the paradigm binary sequence ‘rastered_paradigm’ with the given HRF and multiply by nrls. Finally compute the sum over conditions.
Return a bold array of shape (nb scans, nb voxels)
pyhrf.boldsynth.scenarios.
create_multisess_stim_induced_signal_asl
(prls_session, rastered_paradigm, prf, condition_defs, dt, hrf_territories=None)¶Create a stimulus induced signal from neural response levels, paradigm and HRF (sum_{m=1}^M a^m X^m h) For each condition, compute the convolution of the paradigm binary sequence ‘rastered_paradigm’ with the given HRF and multiply by nrls. Finally compute the sum over conditions.
Return a bold array of shape (nb scans, nb voxels)
pyhrf.boldsynth.scenarios.
create_null_drift
(bold_shape)¶pyhrf.boldsynth.scenarios.
create_outliers
(bold_shape, stim_induced_signal, nb_outliers, outlier_scale=5.0)¶pyhrf.boldsynth.scenarios.
create_paradigm_un_evnt
(condition_defs)¶pyhrf.boldsynth.scenarios.
create_perf_baseline
(asl_shape, perf_baseline_var, perf_baseline_mean=0.0)¶pyhrf.boldsynth.scenarios.
create_perf_stim_induced_signal
(prls, rastered_paradigm, prf, condition_defs, dt, hrf_territories=None)¶Create a stimulus induced signal for ASL from perfusion response levels, paradigm and PRF (sum_{m=1}^M c^m X^m g). For each condition, compute the convolution of the paradigm binary sequence ‘rastered_paradigm’ with the given PRF and multiply by prls. Finally compute the sum over conditions.
Return an ASL array of shape (nb scans, nb voxels)
pyhrf.boldsynth.scenarios.
create_polynomial_drift
(bold_shape, tr, drift_order, drift_var)¶pyhrf.boldsynth.scenarios.
create_polynomial_drift_from_coeffs
(bold_shape, tr, drift_order, drift_coeffs, drift_mean=0.0, drift_amplitude=1.0)¶pyhrf.boldsynth.scenarios.
create_polynomial_drift_from_coeffs_asl
(asl_shape, tr, drift_order, drift_coeffs)¶pyhrf.boldsynth.scenarios.
create_prf
(prf_duration=25.0, dt=0.5)¶pyhrf.boldsynth.scenarios.
create_small_bold_simulation
(snr='high', output_dir=None, simu_items=None)¶pyhrf.boldsynth.scenarios.
create_stim_induced_signal
(nrls, rastered_paradigm, hrf, dt)¶Create a stimulus induced signal from neural response levels, paradigm and HRF (sum_{m=1}^M a^m X^m h) For each condition, compute the convolution of the paradigm binary sequence ‘rastered_paradigm’ with the given HRF and multiply by nrls. Finally compute the sum over conditions.
Return a bold array of shape (nb scans, nb voxels)
pyhrf.boldsynth.scenarios.
create_stim_induced_signal_Parsi
(nrls, rastered_paradigm, hrf, condition_defs, dt, w)¶Create a stimulus induced signal from neural response levels, paradigm and HRF (sum_{m=1}^M a^m w^m X^m h) For each condition, compute the convolution of the paradigm binary sequence ‘rastered_paradigm’ with the given HRF and multiply by nrls and W. Finally compute the sum over conditions.
Return a bold array of shape (nb scans, nb voxels)
pyhrf.boldsynth.scenarios.
create_time_invariant_gaussian_brls
(condition_defs, labels)¶BOLD response levels for ASL
pyhrf.boldsynth.scenarios.
create_time_invariant_gaussian_nrls
(condition_defs, labels)¶pyhrf.boldsynth.scenarios.
create_time_invariant_gaussian_prls
(condition_defs, labels)¶Perfusion response levels for ASL
pyhrf.boldsynth.scenarios.
create_varying_hrf
(hrf_duration=25.0, dt=0.5)¶pyhrf.boldsynth.scenarios.
duplicate_brf
(nb_voxels, primary_brf)¶Duplicate brf over all voxels. Return an array of shape (nb_voxels, len(brf))
pyhrf.boldsynth.scenarios.
duplicate_hrf
(nb_voxels, primary_hrf)¶Duplicate hrf over all voxels. Return an array of shape (nb_voxels, len(hrf))
pyhrf.boldsynth.scenarios.
duplicate_noise_var
(nb_voxels, v_gnoise)¶Duplicate variance of noise over all voxels. Return an array of shape (nb_voxels, var noise)
pyhrf.boldsynth.scenarios.
duplicate_prf
(nb_voxels, primary_prf)¶Duplicate prf over all voxels. Return an array of shape (nb_voxels, len(prf))
pyhrf.boldsynth.scenarios.
flatten_labels_vol
(labels_vol)¶pyhrf.boldsynth.scenarios.
get_bold_shape
(stim_induced_signal, dsf)¶pyhrf.boldsynth.scenarios.
load_drawn_labels
(name)¶pyhrf.boldsynth.scenarios.
load_hrf_territories
(nb_hrf_territories=0, hrf_territories_name=None)¶pyhrf.boldsynth.scenarios.
load_many_hrf_territories
(nb_hrf_territories)¶pyhrf.boldsynth.scenarios.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive, int_like or int-convertible arguments are provided, randn generates an array of shape (d0, d1, ..., dn), filled with random floats sampled from a univariate “normal” (Gaussian) distribution of mean 0 and variance 1 (if any of the d_i are floats, they are first converted to integers by truncation). A single float randomly sampled from the distribution is returned if no argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
Returns: Z – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied.
Return type: ndarray or float
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.boldsynth.scenarios.
rasterize_paradigm
(paradigm, dt, condition_defs)¶Return binary sequences of onsets approximated on temporal grid of temporal resolution dt, for all conditions. ‘paradigm’ is expected to be an instance of ‘pyhrf.paradigm.mpar.Paradigm’
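For a single condition the idea is simply to mark, on a grid of step dt, the bins closest to the onset times; a pure-numpy sketch with made-up onsets:

import numpy as np

dt, duration = 0.6, 30.0
onsets = np.array([0.0, 7.5, 18.2])                 # onset times in seconds
grid = np.arange(0.0, duration, dt)
rastered = np.zeros(grid.size, dtype=int)
rastered[np.round(onsets / dt).astype(int)] = 1     # nearest grid bin for each onset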
pyhrf.boldsynth.scenarios.
save_simulation
(simulation, output_dir)¶short-hand for simulation_save_vol_outputs
pyhrf.boldsynth.scenarios.
simulation_save_vol_outputs
(simulation, output_dir, bold_3D_vols_dir=None, simulation_graph_output=None, prefix=None, vol_meta=None)¶simulation_graph_output : None, ‘simple’, ‘thumbnails’ #TODO
pyhrf.boldsynth.spatialconfig.
Mapper1D
(mapping, expandedShape)¶Handles a mapping between an nD coordinate space (expanded) and a 1D coordinate space (flattened). Can be applied to numpy.ndarray objects
createExpandedArray
(flatShape, type, mappedAxis=0, fillValue=0)¶expandArray
(a, mappedAxis=0, dest=None, fillValue=0)¶Expand dimensions of ‘a’ following the predefined mapping. ‘mappedAxis’ is the axis index to be expanded in ‘a’. If dest is not None, then map values from ‘a’ to dest. If dest is None, return a new array and fill positions not involved in the mapping with ‘fillValue’.
flattenArray
(array, firstMappedAxis=0)¶Reduce dimensions of ‘array’. ‘firstMappedAxis’ is index of the axis to be reduced (other mapped axes are assumed to follow this one).
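The flatten/expand round-trip that Mapper1D manages can be pictured with a boolean mask (this is plain numpy, not the Mapper1D API):

import numpy as np

mask = np.zeros((2, 3, 3), dtype=bool)
mask[0, 1, :] = True                      # three positions take part in the mapping
vol = np.random.rand(2, 3, 3)

flat = vol[mask]                          # nD -> 1D: keep only the mapped positions
expanded = np.full(mask.shape, 0.0)       # 1D -> nD: unmapped positions get the fill value
expanded[mask] = flat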
pyhrf.boldsynth.spatialconfig.
NeighbourhoodSystem
(neighboursSets)¶fromLattice
(kerMask=None, depth=1, torusFlag=False)¶Creates a NeighbourhoodSystem instance from an n-dimensional lattice
fromMesh
()¶getMaxIndex
()¶getMaxNeighbours
()¶getNeighbours
(nodeId)¶getNeighboursArrays
()¶getNeighboursLists
()¶getNeighboursSets
()¶kerMask2D_4n
= array([[-1, 0], [ 1, 0], [ 0, 1], [ 0, -1]])¶kerMask3D_6n
= array([[ 1, 0, 0], [ 0, 1, 0], [ 0, 0, 1], [-1, 0, 0], [ 0, -1, 0], [ 0, 0, -1]])¶sub
(nodeIds)¶pyhrf.boldsynth.spatialconfig.
PottsField
(classNames, spConf, initProps=None)¶pyhrf.boldsynth.spatialconfig.
RegularLatticeMapping
(shape=None, mapping=None, order=1, depth=1)¶Bases: pyhrf.boldsynth.spatialconfig.SpatialMapping
Define a SpatialMapping on a 3D regular lattice.
buildNeighboursCoordLists
()¶buildNeighboursIndexLists
()¶c
= [1, 1, 1]¶createFromGUI
()¶Creates the actual object based on the parameters
getClosestNeighboursIndexes
(idVoxel)¶getCoord
(index)¶getIndex
(coord)¶getMapping
()¶getNbCliques
()¶getNbVoxels
()¶getNdArrayMask
()¶getNeighboursCoordLists
()¶getNeighboursCoords
(idvoxel)¶getNeighboursIndexLists
()¶getNeighboursIndexes
(idVoxel)¶getRoiMask
()¶Return a binary or n-ary 3D mask which has the shape of the target data
getTargetAxesNames
()¶mapVoxData
(data, fillValue=0)¶nbNeighboursOrder1
= 6¶nbNeighboursOrder2
= 26¶order1Mask
= array([[ 1, 0, 0], [ 0, 1, 0], [ 0, 0, 1], [-1, 0, 0], [ 0, -1, 0], [ 0, 0, -1]])¶order2Mask
= array([[ 0, 0, -1], [ 0, 0, 1], [ 0, -1, 0], [ 0, -1, -1], [ 0, -1, 1], [ 0, 1, 0], [ 0, 1, -1], [ 0, 1, 1], [-1, 0, 0], [-1, 0, -1], [-1, 0, 1], [-1, -1, 0], [-1, -1, -1], [-1, -1, 1], [-1, 1, 0], [-1, 1, -1], [-1, 1, 1], [ 1, 0, 0], [ 1, 0, -1], [ 1, 0, 1], [ 1, -1, 0], [ 1, -1, -1], [ 1, -1, 1], [ 1, 1, 0], [ 1, 1, -1], [ 1, 1, 1]])¶pyhrf.boldsynth.spatialconfig.
RegularLatticeMapping2
(maskLattice, kerMask=None, nsDepth=1, parentMapping=None, torusFlag=False)¶Bases: pyhrf.boldsynth.spatialconfig.SpatialMapping2
flattenData
(data, firstMappedAxis=0)¶mapData
(data, mappedAxis=0, fillValue=0)¶pyhrf.boldsynth.spatialconfig.
SpatialMapping
¶Interface specification for the handling of a mapping between integer indexes and positions in a 3D space.
getCoord
()¶Return coord mapped with ‘index’
getIndex
()¶Return index mapped with ‘coord’
getMapping
()¶Return a mapping object (list or dict) which maps an integer index to its 3D coordinates.
getNbVoxels
()¶Return the total number of mapped positions
getNdArrayMask
()¶Return the set of mapped 3D coordinates in a tuple usable as a mask for numpy.ndarray
getNeighboursCoordLists
()¶Get lists of neighbours for all positions @param idVoxel: index of the voxel @return: a mapping object (list or dict) which maps each integer index to a list of 3D coordinates (the neighbours).
getNeighboursCoords
(idVoxel)¶@param idVoxel: index of the voxel @return: the list of 3D coordinates corresponding to the neighbours of the specified voxel.
getNeighboursIndexLists
()¶Get lists of neighbours for all positions @param idVoxel: index of the voxel @return: a mapping object (list or dict) which maps each integer index to a list of integer indexes (the neighbours).
getNeighboursIndexes
(idVoxel)¶@param idVoxel: index of the voxel @return: the list of integer indexes corresponding to the neighbours of the specified voxel.
getRoiMask
()¶Return a binary or n-ary mask which has the shape of the target data
pyhrf.boldsynth.spatialconfig.
SpatialMapping2
(positions, ns, parentMapping=None, parentIndex=None)¶fromLattice
(kerMask=None, nsDepth=1, torusFlag=False)¶fromMesh
(positions)¶getPositions
()¶sub
(nodeIds)¶pyhrf.boldsynth.spatialconfig.
StateField
(classNames, spConf, initProps=None)¶Class handling a field of states: a set of integers (i.e. labels) whose ranks can be spatially mapped to 3D coordinates. Each label refers to a class which is identified by an ID and a name.
generate
()¶Generate values for every state. By default, if initProportions is set, generate values according to it. State values will be ordered by class ID
getClassId
(className)¶Return the class id corresponding to the string ‘className’
getClassName
(classId)¶Return the class name corresponding to the integer ‘classId’
getClassNames
()¶Return all the class names
getFieldValues
()¶Return all field values.
getMappedFieldValues
()¶getNbClasses
()¶getSize
()¶Return the size of the field.
randomize
()¶Randomize state values with a random permutation.
setFieldValues
(values, mask=None)¶Copy the content of ‘values’ to state values masked by ‘mask’.
setFieldValues0
(values, mask=None)¶Copy the content of ‘values’ to state values masked by ‘mask’.
updateClassCounts
()¶Compute the size of every class.
pyhrf.boldsynth.spatialconfig.
UnboundSpatialMapping
(nbVoxels=100)¶Bases: pyhrf.boldsynth.spatialconfig.SpatialMapping
Convenient class to provide an implementation of SpatialMapping when there is no mapping.
createFromGUI
()¶Creates the actual object based on the parameters
getCoord
(index)¶getIndex
(coord)¶getMapping
()¶getNbVoxels
()¶getNdArrayMask
()¶getNeighboursCoordLists
()¶getNeighboursCoords
(idVoxel)¶getNeighboursIndexLists
()¶getNeighboursIndexes
(idVoxel)¶getRoiMask
()¶Return a binary or n-ary 3D mask which has the shape of the target data
pyhrf.boldsynth.spatialconfig.
flattenElements
(l)¶pyhrf.boldsynth.spatialconfig.
getRotationMatrix
(axis, angle)¶Compute the 3x3 matrix for the 3D rotation defined by ‘angle’ and the direction ‘axis’.
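For reference, the standard axis-angle construction is Rodrigues' formula; getRotationMatrix is assumed to compute an equivalent rotation, possibly with different sign or axis conventions:

import numpy as np

def rotation_matrix(axis, angle):
    # 3x3 rotation matrix for a rotation of 'angle' radians about 'axis'
    u = np.asarray(axis, dtype=float)
    u = u / np.linalg.norm(u)
    K = np.array([[0., -u[2], u[1]],
                  [u[2], 0., -u[0]],
                  [-u[1], u[0], 0.]])      # cross-product matrix of the unit axis
    return np.eye(3) + np.sin(angle) * K + (1. - np.cos(angle)) * K.dot(K)

R = rotation_matrix([0., 0., 1.], np.pi / 2)   # 90-degree rotation about z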
pyhrf.boldsynth.spatialconfig.
hashMask
(m)¶pyhrf.boldsynth.spatialconfig.
lattice_indexes
(mask)¶pyhrf.boldsynth.spatialconfig.
maskToMapping
(m)¶pyhrf.boldsynth.spatialconfig.
mask_to_coords
(m)¶pyhrf.jde.nrl.ar.
NRLARSampler
(do_sampling=True, val_ini=None, contrasts={}, do_label_sampling=True, use_true_nrls=False, use_true_labels=False, labels_ini=None, ppm_proba_threshold=0.05, ppm_value_threshold=0, ppm_value_multi_threshold=array([ 0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1., 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2., 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 3., 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4. ]), mean_activation_threshold=4, rescale_results=False, wip_variance_computation=False)¶Bases: pyhrf.jde.nrl.bigaussian.NRLSampler
Class handling the Gibbs sampling of Neural Response Levels according to:
Makni, S., Ciuciu, P., Idier, J., & Poline, J. (2006). Joint Detection-Estimation of Brain Activity in fMRI using an Autoregressive Noise Model. In 3rd IEEE International Symposium on Biomedical Imaging: Macro to Nano, 2006. (pp. 1048–1051). IEEE. https://doi.org/10.1109/ISBI.2006.1625101
Inherits the abstract class C{ GibbsSamplerVariable}.
cleanMemory
()¶computeMeanVarClassApost
(j, variables)¶computeVarYTilde
(varXh, varMBYPl)¶linkToData
(dataInput)¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶#TODO : comment
pyhrf.jde.nrl.bigaussian.
BiGaussMixtureParamsSampler
(do_sampling=True, use_true_value=False, val_ini=None, hyper_prior_type='Jeffreys', activ_thresh=4.0, var_ci_pr_alpha=2.04, var_ci_pr_beta=0.5, var_ca_pr_alpha=2.01, var_ca_pr_beta=0.5, mean_ca_pr_mean=5.0, mean_ca_pr_var=20.0)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : comment
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶computeWithProperPriors
(j, cardCIj, cardCAj)¶finalizeSampling
()¶getCurrentMeans
()¶getCurrentVars
()¶getOutputs
()¶get_string_value
(v)¶linkToData
(dataInput)¶parametersComments
= {'activ_thresh': 'Threshold for the max activ mean above which the region is considered activating', 'hyper_prior_type': "Either 'proper' or 'Jeffreys'"}¶parametersToShow
= []¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateObsersables
()¶pyhrf.jde.nrl.bigaussian.
BiGaussMixtureParamsSamplerWithRelVar
(do_sampling=True, use_true_value=False, val_ini=None, hyper_prior_type='Jeffreys', activ_thresh=4.0, var_ci_pr_alpha=2.04, var_ci_pr_beta=0.5, var_ca_pr_alpha=2.01, var_ca_pr_beta=0.5, mean_ca_pr_mean=5.0, mean_ca_pr_var=20.0)¶Bases: pyhrf.jde.nrl.bigaussian.BiGaussMixtureParamsSampler
computeWithProperPriorsWithRelVar
(nrlsj, j, cardCIj, cardCAj, wj)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.nrl.bigaussian.
BiGaussMixtureParamsSamplerWithRelVar_OLD
(do_sampling=True, use_true_value=False, val_ini=None, hyper_prior_type='Jeffreys', activ_thresh=4.0, var_ci_pr_alpha=2.04, var_ci_pr_beta=0.5, var_ca_pr_alpha=2.01, var_ca_pr_beta=0.5, mean_ca_pr_mean=5.0, mean_ca_pr_var=20.0)¶Bases: pyhrf.jde.nrl.bigaussian.BiGaussMixtureParamsSampler
computeWithProperPriorsWithRelVar
(nrlsj, j, cardCIj, cardCAj, wj)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.nrl.bigaussian.
MixtureWeightsSampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : comment
checkAndSetInitValue
(variables)¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.nrl.bigaussian.
NRLSampler
(do_sampling=True, val_ini=None, contrasts={}, do_label_sampling=True, use_true_nrls=False, use_true_labels=False, labels_ini=None, ppm_proba_threshold=0.05, ppm_value_threshold=0, ppm_value_multi_threshold=array([ 0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1., 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2., 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 3., 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4. ]), mean_activation_threshold=4, rescale_results=False, wip_variance_computation=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
Class handling the Gibbs sampling of Neural Response Levels with a prior bi-gaussian mixture model. It handles independent and spatial versions.
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶FALSE_NEG
= 3¶FALSE_POS
= 2¶L_CA
= 1¶L_CI
= 0¶PPMcalculus
(apost_mean_activ, apost_var_activ, apost_mean_inactiv, apost_var_inactiv, labels_activ, labels_inactiv)¶Function to calculate the probability that the NRL in voxel j, condition m, is greater than a given threshold value
ThresholdPPM
(threshold_pval)¶calcFracLambdaTilde
(cond, c1, c2, variables)¶checkAndSetInitLabels
(variables)¶checkAndSetInitNRL
(variables)¶checkAndSetInitValue
(variables)¶cleanMemory
()¶cleanObservables
()¶computeAA
(nrls, destaa)¶computeComponentsApost
(variables, j, gTQg)¶computeContrasts
()¶computeVarXhtQ
(h, varXQ)¶computeVarYTildeOpt
(varXh)¶compute_summary_stats
()¶countLabels
(labels, voxIdx, cardClass)¶finalizeSampling
()¶getClassifRate
()¶getFinalLabels
(thres=None)¶getOutputs
()¶getRocData
(dthres=0.005)¶get_final_summary
()¶initObservables
()¶init_contrasts
()¶linkToData
(dataInput)¶markWrongLabels
(labels)¶parametersComments
= {'contrasts': 'Define contrasts as arithmetic expressions.\nCondition names used in expressions must be consistent with those specified in session data above'}¶parametersToShow
= ['contrasts']¶printState
(_)¶reportDetection
()¶sampleLabels
(cond, variables)¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
sampleNrlsParallel
(varXh, rb, h, varLambda, varCI, varCA, meanCA, gTQg, variables)¶sampleNrlsSerial
(rb, h, varCI, varCA, meanCA, gTQg, variables)¶samplingWarmUp
(variables)¶#TODO : comment
saveCurrentValue
(it)¶saveObservables
(it)¶updateObsersables
()¶pyhrf.jde.nrl.bigaussian.
NRLSamplerWithRelVar
(do_sampling=True, val_ini=None, contrasts={}, do_label_sampling=True, use_true_nrls=False, use_true_labels=False, labels_ini=None, ppm_proba_threshold=0.05, ppm_value_threshold=0, ppm_value_multi_threshold=array([ 0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1., 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2., 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 3., 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4. ]), mean_activation_threshold=4, rescale_results=False, wip_variance_computation=False)¶Bases: pyhrf.jde.nrl.bigaussian.NRLSampler
calcFracLambdaTildeWithIRRelCond
(cond, c1, c2, variables, nbVox, moyqvoxj, t1, t2)¶calcFracLambdaTildeWithRelCond
(l, nbVox, moyqvoxj, t1, t2)¶computeComponentsApostWithRelVar
(variables, j, gTQg, w)¶computeSumWAxh
(wa, varXh)¶computeVarYTildeOptWithRelVar
(varXh, w)¶computeWA
(a, w, wa)¶computemoyqvox
(cardClass, nbVox)¶Compute mean of labels in ROI (without the label of voxel i)
createWAxh
(aXh, w)¶deltaWCorr0
(nbVox, moyqvoxj, t1, t2)¶deltaWCorr1
(nbVox, moyqvoxj, t1, t2)¶sampleLabelsWithRelVar
(cond, variables)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
sampleNrlsParallelWithRelVar
(varXh, rb, h, varLambda, varCI, varCA, meanCA, gTQg, variables, w)¶sampleNrlsSerialWithRelVar
(rb, h, gTQg, variables, w, t1, t2)¶samplingWarmUp
(variables)¶#TODO : comment
subtractYtildeWithRelVar
()¶pyhrf.jde.nrl.bigaussian.
NRL_Multi_Sess_Sampler
(parameters=None, xmlHandler=None, xmlLabel=None, xmlComment=None)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
P_OUTPUT_NRL
= 'writeResponsesOutput'¶P_SAMPLE_FLAG
= 'sampleFlag'¶P_TrueNrlFilename
= 'TrueNrlFilename'¶P_USE_TRUE_NRLS
= 'useTrueNrls'¶P_VAL_INI
= 'initialValue'¶checkAndSetInitValue
(variables)¶cleanMemory
()¶computeAA
(nrls, destaa)¶computeComponentsApost
(variables, m, varXh, s)¶computeVarYTildeSessionOpt
(varXh, s)¶defaultParameters
= {'TrueNrlFilename': None, 'initialValue': None, 'sampleFlag': True, 'useTrueNrls': False, 'writeResponsesOutput': True}¶finalizeSampling
()¶getOutputs
()¶linkToData
(dataInput)¶parametersComments
= {'TrueNrlFilename': 'Define the filename of simulated NRLs.\nIt is taken into account when NRLs is not sampled.'}¶parametersToShow
= ['writeResponsesOutput']¶sampleNextAlt
(variables)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶#TODO : comment
saveCurrentValue
(it)¶pyhrf.jde.nrl.bigaussian.
Variance_GaussianNRL_Multi_Sess
(parameters=None, xmlHandler=None, xmlLabel=None, xmlComment=None)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
P_SAMPLE_FLAG
= 'sampleFlag'¶P_USE_TRUE_VALUE
= 'useTrueValue'¶P_VAL_INI
= 'initialValue'¶checkAndSetInitValue
(variables)¶defaultParameters
= {'initialValue': array([ 1.]), 'sampleFlag': False, 'useTrueValue': False}¶linkToData
(dataInput)¶parametersToShow
= ['useTrueValue']¶sampleNextInternal
(variables)¶pyhrf.jde.nrl.bigaussian_drift.
BiGaussMixtureParams_Multi_Sess_NRLsBar_Sampler
(parameters=None, xmlHandler=None, xmlLabel=None, xmlComment=None)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : comment
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶P_ACTIV_THRESH
= 'mean_activation_threshold'¶P_HYPER_PRIOR
= 'hyperPriorType'¶P_MEAN_CA_PR_MEAN
= 'meanCAPrMean'¶P_MEAN_CA_PR_VAR
= 'meanCAPrVar'¶P_SAMPLE_FLAG
= 'sampleFlag'¶P_USE_TRUE_VALUE
= 'useTrueValue'¶P_VAL_INI
= 'initialValue'¶P_VAR_CA_PR_ALPHA
= 'varCAPrAlpha'¶P_VAR_CA_PR_BETA
= 'varCAPrBeta'¶P_VAR_CI_PR_ALPHA
= 'varCIPrAlpha'¶P_VAR_CI_PR_BETA
= 'varCIPrBeta'¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶computeWithProperPriors
(j, cardCIj, cardCAj)¶defaultParameters
= {'hyperPriorType': 'Jeffrey', 'initialValue': None, 'meanCAPrMean': 5.0, 'meanCAPrVar': 20.0, 'mean_activation_threshold': 4.0, 'sampleFlag': True, 'useTrueValue': False, 'varCAPrAlpha': 2.01, 'varCAPrBeta': 0.5, 'varCIPrAlpha': 2.04, 'varCIPrBeta': 2.08}¶finalizeSampling
()¶getCurrentMeans
()¶getCurrentVars
()¶getOutputs
()¶get_string_value
(v)¶linkToData
(dataInput)¶parametersComments
= {'hyperPriorType': "Either 'proper' or 'Jeffrey'", 'mean_activation_threshold': 'Threshold for the max activ mean above which the region is considered activating'}¶parametersToShow
= ['initialValue', 'sampleFlag', 'mean_activation_threshold', 'useTrueValue', 'hyperPriorType', 'meanCAPrMean', 'meanCAPrVar', 'varCIPrAlpha', 'varCIPrBeta', 'varCAPrAlpha', 'varCAPrBeta']¶sampleNextInternal
(variables)¶updateObsersables
()¶pyhrf.jde.nrl.bigaussian_drift.
NRL_Drift_Sampler
(do_sampling=True, val_ini=None, contrasts={}, do_label_sampling=True, use_true_nrls=False, use_true_labels=False, labels_ini=None, ppm_proba_threshold=0.05, ppm_value_threshold=0, ppm_value_multi_threshold=array([ 0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1., 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2., 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 3., 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4. ]), mean_activation_threshold=4, rescale_results=False, wip_variance_computation=False)¶Bases: pyhrf.jde.nrl.bigaussian.NRLSampler
Class handling the Gibbs sampling of Neural Response Levels in the case of joint drift sampling.
computeVarYTildeOpt
(varXh)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
sampleNrlsSerial
(rb, h, varCI, varCA, meanCA, gTg, variables)¶pyhrf.jde.nrl.bigaussian_drift.
NRL_Drift_SamplerWithRelVar
(do_sampling=True, val_ini=None, contrasts={}, do_label_sampling=True, use_true_nrls=False, use_true_labels=False, labels_ini=None, ppm_proba_threshold=0.05, ppm_value_threshold=0, ppm_value_multi_threshold=array([ 0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1., 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2., 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 3., 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4. ]), mean_activation_threshold=4, rescale_results=False, wip_variance_computation=False)¶Bases: pyhrf.jde.nrl.bigaussian.NRLSamplerWithRelVar
Class handling the Gibbs sampling of Neural Response Levels in the case of joint drift sampling and relevant variable.
computeVarYTildeOptWithRelVar
(varXh, w)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
sampleNrlsSerialWithRelVar
(rb, h, gTg, variables, w, t1, t2)¶pyhrf.jde.nrl.bigaussian_drift.
NRLsBar_Drift_Multi_Sess_Sampler
(do_sampling=True, val_ini=None, contrasts={}, do_label_sampling=True, use_true_nrls=False, use_true_labels=False, labels_ini=None, ppm_proba_threshold=0.05, ppm_value_threshold=0, ppm_value_multi_threshold=array([ 0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1., 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2., 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 3., 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4. ]), mean_activation_threshold=4, rescale_results=False, wip_variance_computation=False)¶Bases: pyhrf.jde.nrl.bigaussian.NRLSampler
Class handling the Gibbs sampling of Neural Response Levels in the case of joint drift sampling.
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
sampleNrlsSerial
(varCI, varCA, meanCA, variables)¶samplingWarmUp
(variables)¶#TODO : comment
pyhrf.jde.nrl.bigaussian_drift.
permutation
(x)¶Randomly permute a sequence, or return a permuted range.
If x is a multi-dimensional array, it is only shuffled along its first index.
Parameters: x (int or array_like) – If x is an integer, randomly permute np.arange(x). If x is an array, make a copy and shuffle the elements randomly.
Returns: out – Permuted sequence or array range.
Return type: ndarray
Examples
>>> np.random.permutation(10)
array([1, 7, 4, 3, 0, 9, 2, 5, 8, 6])
>>> np.random.permutation([1, 4, 9, 12, 15])
array([15, 1, 9, 4, 12])
>>> arr = np.arange(9).reshape((3, 3))
>>> np.random.permutation(arr)
array([[6, 7, 8],
[0, 1, 2],
[3, 4, 5]])
pyhrf.jde.nrl.bigaussian_drift.
rand
(d0, d1, ..., dn)¶Random values in a given shape.
Create an array of the given shape and populate it with random samples from a uniform distribution over [0, 1).
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
Returns: out – Random values.
Return type: ndarray, shape (d0, d1, ..., dn)
See also
random()
Notes
This is a convenience function. If you want an interface that takes a shape-tuple as the first argument, refer to np.random.random_sample .
Examples
>>> np.random.rand(3,2)
array([[ 0.14022471, 0.96360618], #random
[ 0.37601032, 0.25528411], #random
[ 0.49313049, 0.94909878]]) #random
pyhrf.jde.nrl.bigaussian_drift.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive, int_like or int-convertible arguments are provided, randn generates an array of shape (d0, d1, ..., dn), filled with random floats sampled from a univariate “normal” (Gaussian) distribution of mean 0 and variance 1 (if any of the d_i are floats, they are first converted to integers by truncation). A single float randomly sampled from the distribution is returned if no argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
Returns: Z – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied.
Return type: ndarray or float
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.jde.nrl.gammagaussian.
GamGaussMixtureParamsSampler
(parameters=None, xmlHandler=None, xmlLabel=None, xmlComment=None)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : comment
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Shape_Activ', 'Scale_Activ', 'Var_Inactiv']¶P_SAMPLE_FLAG
= 'sampleFlag'¶P_SCALE_CA_PR_ALPHA
= 'scaleCAPrAlpha'¶P_SCALE_CA_PR_BETA
= 'scaleCAPrBeta'¶P_SHAPE_CA_PR_MEAN
= 'shapeCAPrMean'¶P_VAL_INI
= 'initialValue'¶P_VAR_CI_PR_ALPHA
= 'varCIPrAlpha'¶P_VAR_CI_PR_BETA
= 'varCIPrBeta'¶checkAndSetInitValue
(variables)¶defaultParameters
= {'initialValue': None, 'sampleFlag': 1, 'scaleCAPrAlpha': 2.5, 'scaleCAPrBeta': 1.5, 'shapeCAPrMean': 10.0, 'varCIPrAlpha': 2.5, 'varCIPrBeta': 0.5}¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.nrl.gammagaussian.
InhomogeneousNRLSampler
(parameters=None, xmlHandler=None, xmlLabel=None, xmlComment=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
Class handling the Gibbs sampling of Neural Response Levels according to:
Inherits the abstract class C{GibbsSamplerVariable}. #TODO : comment attributes
L_CA
= 1¶L_CI
= 0¶P_BETA
= 'beta'¶P_LABELS_COLORS
= 'labelsColors'¶P_LABELS_INI
= 'labelsIni'¶P_SAMPLE_FLAG
= 'sampleFlag'¶P_SAMPLE_LABELS
= 'sampleLabels'¶P_TRUE_LABELS
= 'trueLabels'¶P_VAL_INI
= 'initialValue'¶calcEnergy
(voxIdx, label, cond)¶checkAndSetInitValue
(variables)¶computeMean
()¶computeMeanClassApost
(j, nrls, varXhj, rb)¶computeVarYTilde
(varXh)¶computeVariablesApost
(varCI, shapeCA, scaleCA, rb, varXh, varLambda)¶countLabels
()¶defaultParameters
= {'beta': 0.4, 'initialValue': None, 'labelsColors': array([ 0., 0.]), 'labelsIni': None, 'sampleFlag': 1, 'sampleLabels': 1}¶finalizeSampling
()¶linkToData
(dataInput)¶sampleLabels
(cond, varCI, varCA, meanCA)¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶#TODO : comment
pyhrf.jde.nrl.habituation.
LaplacianPdf
(beta, r0Hab, a, b, N=1)¶pyhrf.jde.nrl.habituation.
NRLwithHabSampler
¶Bases: pyhrf.jde.nrl.bigaussian.NRLSampler
Class handling the Gibbs sampling of Neural Response Levels in combination with habituation speed factor sampling. The underlying model is exponentially decaying. #TODO : comment attributes
P_HABITS_INI
= 'habitIni'¶P_HAB_ALGO_PARAM
= 'paramLexp'¶P_OUTPUT_RATIO
= 'outputRatio'¶P_SAMPLE_HABITS
= 'sampleHabit'¶P_TRUE_HABITS
= 'trueHabits'¶checkAndSetInitHabit
(variables)¶checkAndSetInitValue
(variables)¶cleanMemory
()¶cleanObservables
()¶computeComponentsApost
(variables, j, XhtQXh)¶computeVarXhtQ
(Q)¶computeVarYTildeHab
(varXh)¶computeVarYTildeHabOld
(varXh)¶finalizeSampling
()¶getOutputs
()¶habitCondSampler
(j, rb, varHRF)¶habitCondSamplerParallel
(rb, h)¶habitCondSamplerSerial
(rb, h)¶initObservables
()¶linkToData
(dataInput)¶parametersComments
= {'contrasts': 'Define contrasts as arithmetic expressions.\nCondition names used in expressions must be consistent with those specified in session data above', 'paramLexp': 'lambda-like parameter of the Laplacian distribution in habit sampling\n recommended between 1. and 10.'}¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
sampleNrlsParallel
(rb, h, varLambda, varCI, varCA, meanCA, varXhtQXh, variables)¶sampleNrlsSerial
(varXh, rb, h, varCI, varCA, meanCA, variables)¶sampleNrlsSerial_bak
(rb, h, varLambda, varCI, varCA, meanCA, varXhtQXh, variables)¶samplingWarmUp
(variables)¶#TODO : comment
saveCurrentValue
()¶setupGamma
()¶setupTimeNrls
()¶spExtract
(spInd, mtrx, cond)¶updateGammaTimeNRLs
(nc, nv)¶updateObsersables
()¶updateXh
(varHRF)¶updateYtilde
()¶pyhrf.jde.nrl.habituation.
sparsedot
(X, A, mask, taille)¶pyhrf.jde.nrl.habituation.
sparsedotdimun
(X, A, mask, lenght)¶pyhrf.jde.nrl.habituation.
subcptGamma
(nrl, habit, nbTrials, deltaOns)¶pyhrf.jde.nrl.trigaussian.
GGGNRLSampler
(do_sampling=True, val_ini=None, contrasts={}, do_label_sampling=True, use_true_nrls=False, use_true_labels=False, labels_ini=None, ppm_proba_threshold=0.05, ppm_value_threshold=0, ppm_value_multi_threshold=array([ 0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1., 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2., 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 3., 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4. ]), mean_activation_threshold=4, rescale_results=False, wip_variance_computation=False)¶Bases: pyhrf.jde.nrl.bigaussian.NRLSampler
CLASSES
= array([0, 1, 2])¶CLASS_NAMES
= ['inactiv', 'activ', 'deactiv']¶FALSE_NEG
= 4¶FALSE_POS
= 3¶L_CA
= 1¶L_CD
= 2¶L_CI
= 0¶sampleLabels
(cond, variables)¶pyhrf.jde.nrl.trigaussian.
TriGaussMixtureParamsSampler
(do_sampling=True, use_true_value=False, val_ini=None, hyper_prior_type='Jeffreys', activ_thresh=4.0, var_ci_pr_alpha=2.04, var_ci_pr_beta=0.5, var_ca_pr_alpha=2.01, var_ca_pr_beta=0.5, var_cd_pr_alpha=2.01, var_cd_pr_beta=0.5, mean_ca_pr_mean=5.0, mean_ca_pr_var=20.0, mean_cd_pr_mean=-20.0, mean_cd_pr_var=20.0)¶Bases: pyhrf.jde.nrl.bigaussian.BiGaussMixtureParamsSampler
I_MEAN_CD
= 3¶I_VAR_CD
= 4¶L_CD
= 2¶NB_PARAMS
= 5¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv', 'Mean_Deactiv', 'Var_Deactiv']¶P_MEAN_CD_PR_MEAN
= 'meanCDPrMean'¶P_MEAN_CD_PR_VAR
= 'meanCDPrVar'¶P_VAR_CD_PR_ALPHA
= 'varCDPrAlpha'¶P_VAR_CD_PR_BETA
= 'varCDPrBeta'¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCDj)¶finalizeSampling
()¶getCurrentMeans
()¶getCurrentVars
()¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl.
ASLSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, bold_response_levels=<pyhrf.jde.asl.BOLDResponseLevelSampler object>, perf_response_levels=<pyhrf.jde.asl.PerfResponseLevelSampler object>, labels=<pyhrf.jde.asl.LabelSampler object>, noise_var=<pyhrf.jde.asl.NoiseVarianceSampler object>, brf=<pyhrf.jde.asl.BOLDResponseSampler object>, brf_var=<pyhrf.jde.asl.BOLDResponseVarianceSampler object>, prf=<pyhrf.jde.asl.PerfResponseSampler object>, prf_var=<pyhrf.jde.asl.PerfResponseVarianceSampler object>, bold_mixt_params=<pyhrf.jde.asl.BOLDMixtureSampler object>, perf_mixt_params=<pyhrf.jde.asl.PerfMixtureSampler object>, drift=<pyhrf.jde.asl.DriftCoeffSampler object>, drift_var=<pyhrf.jde.asl.DriftVarianceSampler object>, perf_baseline=<pyhrf.jde.asl.PerfBaselineSampler object>, perf_baseline_var=<pyhrf.jde.asl.PerfBaselineVarianceSampler object>, check_final_value=None, output_fit=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
computeFit
()¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶inputClass
¶alias of WN_BiG_ASLSamplerInput
parametersToShow
= ['nb_its', 'bold_response_levels', 'brf', 'brf_var', 'prf', 'prf_var']¶pyhrf.jde.asl.
BOLDMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl.MixtureParamsSampler
, pyhrf.xmlio.Initable
get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl.
BOLDResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl.ResponseLevelSampler
, pyhrf.xmlio.Initable
computeVarYTildeOpt
(update_perf=False)¶If update_perf is True, then also update sumcXg and prl.ytilde. update_perf should only be used at the initialization of variable values.
getOutputs
()¶samplingWarmUp
(v)¶pyhrf.jde.asl.
BOLDResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum cWXg - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶pyhrf.jde.asl.
BOLDResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl.ResponseVarianceSampler
, pyhrf.xmlio.Initable
pyhrf.jde.asl.
DriftCoeffSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv
get_final_value
()¶get_true_value
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateNorm
()¶pyhrf.jde.asl.
DriftVarianceSampler
(val_ini=array([ 1.]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl.
LabelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶compute_ext_field
()¶countLabels
()¶get_MAP_labels
()¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden and perform precalculations.
pyhrf.jde.asl.
MixtureParamsSampler
(name, response_level_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶get_current_means
()¶get_current_vars
()¶get_true_values_from_simulation_dict
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.asl.
NoiseVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl.
PerfBaselineSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_residuals
()¶compute_wa
(a=None)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl.
PerfBaselineVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl.
PerfMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl.MixtureParamsSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl.
PerfResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl.ResponseLevelSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶computeVarYTildeOpt
()¶pyhrf.jde.asl.
PerfResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False, diff_res=True)¶Bases: pyhrf.jde.asl.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum aXh - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶pyhrf.jde.asl.
PerfResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl.ResponseVarianceSampler
, pyhrf.xmlio.Initable
pyhrf.jde.asl.
ResponseLevelSampler
(name, response_name, mixture_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶computeRR
()¶computeVarYTildeOpt
()¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶updateObsersables
()¶pyhrf.jde.asl.
ResponseSampler
(name, response_level_name, variance_name, smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
Generic parent class to perfusion response & BOLD response samplers
calcXResp
(resp, stackX=None)¶checkAndSetInitValue
(variables)¶computeYTilde
()¶get_mat_X
()¶get_mat_XtX
()¶get_rlrl
()¶get_stackX
()¶get_ybar
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶setFinalValue
()¶updateNorm
()¶updateXResp
()¶pyhrf.jde.asl.
ResponseVarianceSampler
(name, response_name, val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(v)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶pyhrf.jde.asl.
WN_BiG_ASLSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.WN_BiG_Drift_BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.asl.
b
()¶pyhrf.jde.asl.
compute_StS_StY
(rls, v_b, mx, mxtx, ybar, rlrl, yaj, ajak_vb)¶yaj and ajak_vb are only used to store intermediate quantities, they’re not inputs.
pyhrf.jde.asl.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive, int_like or int-convertible arguments are provided, randn generates an array of shape (d0, d1, ..., dn), filled with random floats sampled from a univariate “normal” (Gaussian) distribution of mean 0 and variance 1 (if any of the d_i are floats, they are first converted to integers by truncation). A single float randomly sampled from the distribution is returned if no argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
Returns: Z – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied.
Return type: ndarray or float
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.jde.asl.
simulate_asl
(output_dir=None, noise_scenario='high_snr', spatial_size='tiny', v_noise=None, dt=0.5, tr=2.5)¶pyhrf.jde.asl_2steps.
dummy_jde
(fmri_data, dt)¶pyhrf.jde.asl_2steps.
jde_analyse_2steps_v1
(output_dir, fmri_data, dt, nb_iterations, brf_var=None, do_sampling_brf_var=False, prf_var=None, do_sampling_prf_var=False)¶Returns: dict of outputs
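The two functions above can be chained: simulate_asl (from pyhrf.jde.asl) generates synthetic ASL data and jde_analyse_2steps_v1 runs the two-step JDE analysis on it. The sketch below is illustrative only: it assumes that the object returned by simulate_asl can be passed directly as fmri_data, and output_dir and the iteration count are placeholder values.

import tempfile
from pyhrf.jde.asl import simulate_asl
from pyhrf.jde.asl_2steps import jde_analyse_2steps_v1

output_dir = tempfile.mkdtemp()   # placeholder output directory
dt = 0.5                          # temporal resolution of the responses (s)

# Assumption: simulate_asl returns data usable as the fmri_data argument.
fmri_data = simulate_asl(output_dir=output_dir, noise_scenario='high_snr',
                         spatial_size='tiny', dt=dt, tr=2.5)

# Run the two-step analysis; the result is a dict of outputs.
outputs = jde_analyse_2steps_v1(output_dir, fmri_data, dt, nb_iterations=100)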
pyhrf.jde.asl_2steps.
physio_build_jde_mcmc_sampler
(nb_iterations, rf_prior, flag_zc=False, brf_var_ini=None, prf_var_ini=None, do_sampling_brf_var=False, do_sampling_prf_var=False, prf_ini=None, do_sampling_prf=True, prls_ini=None, do_sampling_prls=True, brf_ini=None, do_sampling_brf=True, brls_ini=None, do_sampling_brls=True, perf_bl_ini=None, do_sampling_perf_bl=True, do_sampling_perf_var=True, drift_ini=None, do_sampling_drift=True, drift_var_ini=None, do_sampling_drift_var=True, noise_var_ini=None, labels_ini=None, do_sampling_labels=True)¶pyhrf.jde.asl_physio.
ASLPhysioSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, bold_response_levels=<pyhrf.jde.asl_physio.BOLDResponseLevelSampler object>, perf_response_levels=<pyhrf.jde.asl_physio.PerfResponseLevelSampler object>, labels=<pyhrf.jde.asl_physio.LabelSampler object>, noise_var=<pyhrf.jde.asl_physio.NoiseVarianceSampler object>, brf=<pyhrf.jde.asl_physio.PhysioBOLDResponseSampler object>, brf_var=<pyhrf.jde.asl_physio.PhysioBOLDResponseVarianceSampler object>, prf=<pyhrf.jde.asl_physio.PhysioPerfResponseSampler object>, prf_var=<pyhrf.jde.asl_physio.PhysioPerfResponseVarianceSampler object>, bold_mixt_params=<pyhrf.jde.asl_physio.BOLDMixtureSampler object>, perf_mixt_params=<pyhrf.jde.asl_physio.PerfMixtureSampler object>, drift=<pyhrf.jde.asl_physio.DriftCoeffSampler object>, drift_var=<pyhrf.jde.asl_physio.DriftVarianceSampler object>, perf_baseline=<pyhrf.jde.asl_physio.PerfBaselineSampler object>, perf_baseline_var=<pyhrf.jde.asl_physio.PerfBaselineVarianceSampler object>, check_final_value=None, output_fit=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
computeFit
()¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶inputClass
¶alias of WN_BiG_ASLSamplerInput
parametersToShow
= ['nb_its', 'response_levels', 'hrf', 'hrf_var']¶
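As a usage hint for the ASLPhysioSampler class documented above, the constructor is typically configured through the keyword arguments listed in its signature, with the component samplers left at their defaults. The values below are purely illustrative, not recommended settings.

from pyhrf.jde.asl_physio import ASLPhysioSampler

# Illustrative settings only; brf, prf, labels, etc. keep their default
# sampler instances as shown in the signature above.
sampler = ASLPhysioSampler(nb_iterations=500, burnin=0.3, output_fit=False)

pyhrf.jde.asl_physio.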
BOLDMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio.MixtureParamsSampler
, pyhrf.xmlio.Initable
get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio.
BOLDResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio.ResponseLevelSampler
, pyhrf.xmlio.Initable
computeVarYTildeOpt
(update_perf=False)¶If update_perf is True, then sumcXg and prl.ytilde are also updated. update_perf should only be used at the initialization of variable values.
getOutputs
()¶samplingWarmUp
(v)¶pyhrf.jde.asl_physio.
DriftCoeffSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
updateNorm
()¶pyhrf.jde.asl_physio.
DriftVarianceSampler
(val_ini=array([ 1.]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio.
LabelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶compute_ext_field
()¶countLabels
()¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio.
MixtureParamsSampler
(name, response_level_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶get_current_means
()¶get_current_vars
()¶get_true_values_from_simulation_dict
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶
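The index constants and PARAMS_NAMES above describe how the three per-condition mixture parameters are ordered. A tiny, purely illustrative check of that correspondence (only the class attributes documented above are used):

from pyhrf.jde.asl_physio import MixtureParamsSampler as MPS

# Maps {0: 'Mean_Activ', 1: 'Var_Activ', 2: 'Var_Inactiv'}
idx_to_name = dict(zip([MPS.I_MEAN_CA, MPS.I_VAR_CA, MPS.I_VAR_CI],
                       MPS.PARAMS_NAMES))
assert len(idx_to_name) == MPS.NB_PARAMS

pyhrf.jde.asl_physio.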
NoiseVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio.
PerfBaselineSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_residuals
()¶compute_wa
(a=None)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio.
PerfBaselineVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio.
PerfMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio.MixtureParamsSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio.
PerfResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio.ResponseLevelSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶computeVarYTildeOpt
()¶pyhrf.jde.asl_physio.
PhysioBOLDResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=0.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum cWXg - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample BRF
changes to mean: changes to var:
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio.
PhysioBOLDResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio.ResponseVarianceSampler
, pyhrf.xmlio.Initable
pyhrf.jde.asl_physio.
PhysioPerfResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=0.0, val_ini=None, do_sampling=True, use_true_value=False, diff_res=True, prior_type='physio_stochastic_regularized')¶Bases: pyhrf.jde.asl_physio.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum aXh - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample PRF with physio prior
changes to mean: add a factor of Omega h Sigma_g^-1 v_g^-1
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio.
PhysioPerfResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio.ResponseVarianceSampler
, pyhrf.xmlio.Initable
pyhrf.jde.asl_physio.
ResponseLevelSampler
(name, response_name, mixture_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶computeRR
()¶computeVarYTildeOpt
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶setFinalValue
()¶pyhrf.jde.asl_physio.
ResponseSampler
(name, response_level_name, variance_name, smooth_order=2, zero_constraint=False, duration=25.0, normalise=0.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
Generic parent class to perfusion response & BOLD response samplers
calcXResp
(resp, stackX=None)¶checkAndSetInitValue
(variables)¶computeYTilde
()¶getOutputs
()¶get_mat_X
()¶get_mat_XtX
()¶get_rlrl
()¶get_stackX
()¶get_ybar
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶setFinalValue
()¶updateNorm
()¶updateXResp
()¶pyhrf.jde.asl_physio.
ResponseVarianceSampler
(name, response_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(v)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Sample variance of BRF or PRF
TODO: change code below --> no changes necessary so far
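The "Sample variance of BRF or PRF" step above is a conjugate Gibbs update. As a generic illustration only (the exact shape/scale expressions, priors and normalisation used by pyhrf are not reproduced here), such a draw usually takes an inverse-gamma form; the helper name, arguments and prior matrix below are assumptions made for the sketch.

import numpy as np

def draw_response_variance(resp, smooth_mat, rng=np.random):
    # Generic conjugate draw: v | resp ~ InvGamma(D/2, resp' R resp / 2),
    # where R is a smoothness prior matrix (cf. smooth_order). Illustrative
    # only; not the exact update implemented in pyhrf.
    d = len(resp)
    scale = 0.5 * resp.dot(smooth_mat.dot(resp))
    # An InvGamma(a, b) sample can be drawn as 1 / Gamma(a, scale=1/b)
    return 1.0 / rng.gamma(shape=0.5 * d, scale=1.0 / scale)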
pyhrf.jde.asl_physio.
WN_BiG_ASLSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.WN_BiG_Drift_BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.asl_physio.
b
()¶pyhrf.jde.asl_physio.
compute_StS_StY
(rls, v_b, mx, mxtx, ybar, rlrl, yaj, ajak_vb)¶yaj and ajak_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio.
compute_StS_StY_deterministic
(brls, prls, v_b, mx, mxtx, mwx, mxtwx, mwxtwx, ybar, rlrl_bold, rlrl_perf, brlprl, omega, yj, ajak_vb)¶yj, ajak_vb and cjck_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio.
compute_bRpR
(brl, prl, nbConditions, nbVoxels)¶pyhrf.jde.asl_physio_1step.
ASLPhysioSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, bold_response_levels=<pyhrf.jde.asl_physio_1step.BOLDResponseLevelSampler object>, perf_response_levels=<pyhrf.jde.asl_physio_1step.PerfResponseLevelSampler object>, labels=<pyhrf.jde.asl_physio_1step.LabelSampler object>, noise_var=<pyhrf.jde.asl_physio_1step.NoiseVarianceSampler object>, brf=<pyhrf.jde.asl_physio_1step.PhysioBOLDResponseSampler object>, brf_var=<pyhrf.jde.asl_physio_1step.PhysioBOLDResponseVarianceSampler object>, prf=<pyhrf.jde.asl_physio_1step.PhysioPerfResponseSampler object>, prf_var=<pyhrf.jde.asl_physio_1step.PhysioPerfResponseVarianceSampler object>, bold_mixt_params=<pyhrf.jde.asl_physio_1step.BOLDMixtureSampler object>, perf_mixt_params=<pyhrf.jde.asl_physio_1step.PerfMixtureSampler object>, drift=<pyhrf.jde.asl_physio_1step.DriftCoeffSampler object>, drift_var=<pyhrf.jde.asl_physio_1step.DriftVarianceSampler object>, perf_baseline=<pyhrf.jde.asl_physio_1step.PerfBaselineSampler object>, perf_baseline_var=<pyhrf.jde.asl_physio_1step.PerfBaselineVarianceSampler object>, check_final_value=None, output_fit=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
computeFit
()¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶inputClass
¶alias of WN_BiG_ASLSamplerInput
parametersToShow
= ['nb_its', 'response_levels', 'hrf', 'hrf_var']¶pyhrf.jde.asl_physio_1step.
BOLDMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step.MixtureParamsSampler
, pyhrf.xmlio.Initable
get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_1step.
BOLDResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step.ResponseLevelSampler
, pyhrf.xmlio.Initable
computeVarYTildeOpt
(update_perf=False)¶If update_perf is True, then sumcXg and prl.ytilde are also updated. update_perf should only be used at the initialization of variable values.
getOutputs
()¶samplingWarmUp
(v)¶pyhrf.jde.asl_physio_1step.
DriftCoeffSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
updateNorm
()¶pyhrf.jde.asl_physio_1step.
DriftVarianceSampler
(val_ini=array([ 1.]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_1step.
LabelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶compute_ext_field
()¶countLabels
()¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_1step.
MixtureParamsSampler
(name, response_level_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶get_current_means
()¶get_current_vars
()¶get_true_values_from_simulation_dict
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.asl_physio_1step.
NoiseVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_1step.
PerfBaselineSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_residuals
()¶compute_wa
(a=None)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_1step.
PerfBaselineVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_1step.
PerfMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step.MixtureParamsSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_1step.
PerfResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step.ResponseLevelSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶computeVarYTildeOpt
()¶pyhrf.jde.asl_physio_1step.
PhysioBOLDResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=0.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum cWXg - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample BRF
changes to mean: changes to var:
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_1step.
PhysioBOLDResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step.ResponseVarianceSampler
, pyhrf.xmlio.Initable
pyhrf.jde.asl_physio_1step.
PhysioPerfResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=0.0, val_ini=None, do_sampling=True, use_true_value=False, diff_res=True, prior_type='physio_stochastic_regularized')¶Bases: pyhrf.jde.asl_physio_1step.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum aXh - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample PRF with physio prior
changes to mean: add a factor of Omega h Sigma_g^-1 v_g^-1
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_1step.
PhysioPerfResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step.ResponseVarianceSampler
, pyhrf.xmlio.Initable
pyhrf.jde.asl_physio_1step.
ResponseLevelSampler
(name, response_name, mixture_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶computeRR
()¶computeVarYTildeOpt
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶setFinalValue
()¶pyhrf.jde.asl_physio_1step.
ResponseSampler
(name, response_level_name, variance_name, smooth_order=2, zero_constraint=False, duration=25.0, normalise=0.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
Generic parent class to perfusion response & BOLD response samplers
calcXResp
(resp, stackX=None)¶checkAndSetInitValue
(variables)¶computeYTilde
()¶getOutputs
()¶get_mat_X
()¶get_mat_XtX
()¶get_rlrl
()¶get_stackX
()¶get_ybar
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶setFinalValue
()¶updateNorm
()¶updateXResp
()¶pyhrf.jde.asl_physio_1step.
ResponseVarianceSampler
(name, response_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(v)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Sample variance of BRF or PRF
TODO: change code below --> no changes necessary so far
pyhrf.jde.asl_physio_1step.
WN_BiG_ASLSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.WN_BiG_Drift_BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.asl_physio_1step.
b
()¶pyhrf.jde.asl_physio_1step.
compute_StS_StY
(rls, v_b, mx, mxtx, ybar, rlrl, yaj, ajak_vb)¶yaj and ajak_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio_1step.
compute_StS_StY_deterministic
(brls, prls, v_b, mx, mxtx, mwx, mxtwx, mwxtwx, ybar, rlrl_bold, rlrl_perf, brlprl, omega, yj, ajak_vb)¶yj, ajak_vb and cjck_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio_1step.
compute_bRpR
(brl, prl, nbConditions, nbVoxels)¶pyhrf.jde.asl_physio_1step_params.
ASLPhysioSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, bold_response_levels=<pyhrf.jde.asl_physio_1step_params.BOLDResponseLevelSampler object>, perf_response_levels=<pyhrf.jde.asl_physio_1step_params.PerfResponseLevelSampler object>, labels=<pyhrf.jde.asl_physio_1step_params.LabelSampler object>, noise_var=<pyhrf.jde.asl_physio_1step_params.NoiseVarianceSampler object>, brf=<pyhrf.jde.asl_physio_1step_params.PhysioBOLDResponseSampler object>, brf_var=<pyhrf.jde.asl_physio_1step_params.PhysioBOLDResponseVarianceSampler object>, prf=<pyhrf.jde.asl_physio_1step_params.PhysioPerfResponseSampler object>, prf_var=<pyhrf.jde.asl_physio_1step_params.PhysioPerfResponseVarianceSampler object>, bold_mixt_params=<pyhrf.jde.asl_physio_1step_params.BOLDMixtureSampler object>, perf_mixt_params=<pyhrf.jde.asl_physio_1step_params.PerfMixtureSampler object>, drift=<pyhrf.jde.asl_physio_1step_params.DriftCoeffSampler object>, drift_var=<pyhrf.jde.asl_physio_1step_params.DriftVarianceSampler object>, perf_baseline=<pyhrf.jde.asl_physio_1step_params.PerfBaselineSampler object>, perf_baseline_var=<pyhrf.jde.asl_physio_1step_params.PerfBaselineVarianceSampler object>, check_final_value=None, output_fit=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
computeFit
()¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶inputClass
¶alias of WN_BiG_ASLSamplerInput
parametersToShow
= ['nb_its', 'response_levels', 'hrf', 'hrf_var']¶pyhrf.jde.asl_physio_1step_params.
BOLDMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step_params.MixtureParamsSampler
, pyhrf.xmlio.Initable
get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_1step_params.
BOLDResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step_params.ResponseLevelSampler
, pyhrf.xmlio.Initable
computeVarYTildeOpt
(update_perf=False)¶If update_perf is True, then sumcXg and prl.ytilde are also updated. update_perf should only be used at the initialization of variable values.
getOutputs
()¶samplingWarmUp
(v)¶pyhrf.jde.asl_physio_1step_params.
DriftCoeffSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
updateNorm
()¶pyhrf.jde.asl_physio_1step_params.
DriftVarianceSampler
(val_ini=array([ 1.]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_1step_params.
LabelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶compute_ext_field
()¶countLabels
()¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_1step_params.
MixtureParamsSampler
(name, response_level_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶get_current_means
()¶get_current_vars
()¶get_true_values_from_simulation_dict
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.asl_physio_1step_params.
NoiseVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_1step_params.
PerfBaselineSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_residuals
()¶compute_wa
(a=None)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_1step_params.
PerfBaselineVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_1step_params.
PerfMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step_params.MixtureParamsSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_1step_params.
PerfResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step_params.ResponseLevelSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶computeVarYTildeOpt
()¶pyhrf.jde.asl_physio_1step_params.
PhysioBOLDResponseSampler
(phy_params={'E0': 0.8, 'TE': 0.04, 'V0': 0.02, 'alpha_w': 0.2, 'buxton': False, 'e': 0.4, 'eps': 0.5, 'eps_max': 10.0, 'linear': True, 'model': 'RBM', 'model_name': 'Friston00', 'obata': False, 'r0': 100, 'tau_f': 2.5, 'tau_m': 1.0, 'tau_s': 1.25, 'vt0': 80.6}, smooth_order=2, zero_constraint=True, duration=25.0, normalise=0.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step_params.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum cWXg - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample BRF
changes to mean: changes to var:
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_1step_params.
PhysioBOLDResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step_params.ResponseVarianceSampler
, pyhrf.xmlio.Initable
pyhrf.jde.asl_physio_1step_params.
PhysioPerfResponseSampler
(phy_params={'E0': 0.8, 'TE': 0.04, 'V0': 0.02, 'alpha_w': 0.2, 'buxton': False, 'e': 0.4, 'eps': 0.5, 'eps_max': 10.0, 'linear': True, 'model': 'RBM', 'model_name': 'Friston00', 'obata': False, 'r0': 100, 'tau_f': 2.5, 'tau_m': 1.0, 'tau_s': 1.25, 'vt0': 80.6}, smooth_order=2, zero_constraint=True, duration=25.0, normalise=0.0, val_ini=None, do_sampling=True, use_true_value=False, diff_res=True, prior_type='physio_stochastic_regularized')¶Bases: pyhrf.jde.asl_physio_1step_params.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum aXh - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample PRF with physio prior
changes to mean: add a factor of Omega h Sigma_g^-1 v_g^-1
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_1step_params.
PhysioPerfResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_1step_params.ResponseVarianceSampler
, pyhrf.xmlio.Initable
pyhrf.jde.asl_physio_1step_params.
ResponseLevelSampler
(name, response_name, mixture_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶computeRR
()¶computeVarYTildeOpt
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶setFinalValue
()¶pyhrf.jde.asl_physio_1step_params.
ResponseSampler
(name, response_level_name, variance_name, phy_params, smooth_order=2, zero_constraint=False, duration=25.0, normalise=0.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
Generic parent class to perfusion response & BOLD response samplers
calcXResp
(resp, stackX=None)¶checkAndSetInitValue
(variables)¶computeYTilde
()¶getOutputs
()¶get_mat_X
()¶get_mat_XtX
()¶get_rlrl
()¶get_stackX
()¶get_ybar
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶setFinalValue
()¶updateNorm
()¶updateXResp
()¶pyhrf.jde.asl_physio_1step_params.
ResponseVarianceSampler
(name, response_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(v)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Sample variance of BRF or PRF
TODO: change code below --> no changes necessary so far
pyhrf.jde.asl_physio_1step_params.
WN_BiG_ASLSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.WN_BiG_Drift_BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.asl_physio_1step_params.
b
()¶pyhrf.jde.asl_physio_1step_params.
compute_StS_StY
(rls, v_b, mx, mxtx, ybar, rlrl, yaj, ajak_vb)¶yaj and ajak_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio_1step_params.
compute_StS_StY_deterministic
(brls, prls, v_b, mx, mxtx, mwx, mxtwx, mwxtwx, ybar, rlrl_bold, rlrl_perf, brlprl, omega, yj, ajak_vb)¶yj, ajak_vb and cjck_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio_1step_params.
compute_bRpR
(brl, prl, nbConditions, nbVoxels)¶Physio prior, deterministic version where the forward model is changed. TODO: clean to remove stochastic parts.
pyhrf.jde.asl_physio_det_fwdm.
ASLPhysioSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, bold_response_levels=<pyhrf.jde.asl_physio_det_fwdm.BOLDResponseLevelSampler object>, perf_response_levels=<pyhrf.jde.asl_physio_det_fwdm.PerfResponseLevelSampler object>, labels=<pyhrf.jde.asl_physio_det_fwdm.LabelSampler object>, noise_var=<pyhrf.jde.asl_physio_det_fwdm.NoiseVarianceSampler object>, brf=<pyhrf.jde.asl_physio_det_fwdm.PhysioBOLDResponseSampler object>, brf_var=<pyhrf.jde.asl_physio_det_fwdm.PhysioBOLDResponseSampler object>, prf=<pyhrf.jde.asl_physio_det_fwdm.PhysioPerfResponseSampler object>, prf_var=<pyhrf.jde.asl_physio_det_fwdm.PhysioPerfResponseSampler object>, bold_mixt_params=<pyhrf.jde.asl_physio_det_fwdm.BOLDMixtureSampler object>, perf_mixt_params=<pyhrf.jde.asl_physio_det_fwdm.PerfMixtureSampler object>, drift=<pyhrf.jde.asl_physio_det_fwdm.DriftCoeffSampler object>, drift_var=<pyhrf.jde.asl_physio_det_fwdm.DriftVarianceSampler object>, perf_baseline=<pyhrf.jde.asl_physio_det_fwdm.PerfBaselineSampler object>, perf_baseline_var=<pyhrf.jde.asl_physio_det_fwdm.PerfBaselineVarianceSampler object>, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
computeFit
()¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶inputClass
¶alias of WN_BiG_ASLSamplerInput
parametersToShow
= ['nb_its', 'response_levels', 'hrf', 'hrf_var']¶pyhrf.jde.asl_physio_det_fwdm.
BOLDMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_det_fwdm.MixtureParamsSampler
, pyhrf.xmlio.Initable
get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_det_fwdm.
BOLDResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_det_fwdm.ResponseLevelSampler
, pyhrf.xmlio.Initable
computeVarYTildeOpt
(update_perf=False)¶If update_perf is True, then sumcXg and prl.ytilde are also updated. update_perf should only be used at the initialization of variable values.
getOutputs
()¶samplingWarmUp
(v)¶pyhrf.jde.asl_physio_det_fwdm.
DriftCoeffSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateNorm
()¶pyhrf.jde.asl_physio_det_fwdm.
DriftVarianceSampler
(val_ini=array([ 1.]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_det_fwdm.
LabelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶compute_ext_field
()¶countLabels
()¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_det_fwdm.
MixtureParamsSampler
(name, response_level_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶get_current_means
()¶get_current_vars
()¶get_true_values_from_simulation_dict
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.asl_physio_det_fwdm.
NoiseVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_det_fwdm.
PerfBaselineSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_residuals
()¶compute_wa
(a=None)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_det_fwdm.
PerfBaselineVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_det_fwdm.
PerfMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_det_fwdm.MixtureParamsSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_det_fwdm.
PerfResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_det_fwdm.ResponseLevelSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶computeVarYTildeOpt
()¶pyhrf.jde.asl_physio_det_fwdm.
PhysioBOLDResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False, use_omega=True, deterministic=False)¶Bases: pyhrf.jde.asl_physio_det_fwdm.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum cWXg - Pl - wa
get_mat_X
()¶get_mat_XtWX
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample BRF
changes to mean: changes to var:
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_det_fwdm.
PhysioBOLDResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_det_fwdm.ResponseVarianceSampler
, pyhrf.xmlio.Initable
sampleNextInternal
(v)¶Sample variance of BRF
TODO: change code below --> no changes necessary so far
pyhrf.jde.asl_physio_det_fwdm.
PhysioPerfResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False, diff_res=True, regularize=True, deterministic=False)¶Bases: pyhrf.jde.asl_physio_det_fwdm.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum aXh - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample PRF with physio prior
changes to mean: add a factor of Omega h Sigma_g^-1 v_g^-1
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_det_fwdm.
PhysioPerfResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_det_fwdm.ResponseVarianceSampler
, pyhrf.xmlio.Initable
sampleNextInternal
(v)¶Sample variance of PRF
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_det_fwdm.
ResponseLevelSampler
(name, response_name, mixture_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶computeRR
()¶computeVarYTildeOpt
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶pyhrf.jde.asl_physio_det_fwdm.
ResponseSampler
(name, response_level_name, variance_name, smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False, deterministic=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
Generic parent class to perfusion response & BOLD response samplers
calcXResp
(resp, stackX=None)¶checkAndSetInitValue
(variables)¶computeYTilde
()¶get_mat_X
()¶get_mat_XtX
()¶get_rlrl
()¶get_stackX
()¶get_ybar
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶setFinalValue
()¶updateNorm
()¶updateXResp
()¶pyhrf.jde.asl_physio_det_fwdm.
ResponseVarianceSampler
(name, response_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(v)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶pyhrf.jde.asl_physio_det_fwdm.
WN_BiG_ASLSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.WN_BiG_Drift_BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.asl_physio_det_fwdm.
b
()¶pyhrf.jde.asl_physio_det_fwdm.
compute_StS_StY
(rls, v_b, mx, mxtx, ybar, rlrl, yaj, ajak_vb)¶yaj and ajak_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio_det_fwdm.
compute_StS_StY_deterministic
(brls, prls, v_b, mx, mxtx, mx_perf, mxtx_perf, mxtwx, ybar, rlrl_bold, rlrl_perf, brlprl, yj, ajak_vb, cjck_vb, omega, W)¶yaj and ajak_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio_det_fwdm.
compute_bRpR
(brl, prl, nbConditions, nbVoxels)¶pyhrf.jde.asl_physio_hierarchical.
ASLPhysioSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, bold_response_levels=<pyhrf.jde.asl_physio_hierarchical.BOLDResponseLevelSampler object>, perf_response_levels=<pyhrf.jde.asl_physio_hierarchical.PerfResponseLevelSampler object>, labels=<pyhrf.jde.asl_physio_hierarchical.LabelSampler object>, noise_var=<pyhrf.jde.asl_physio_hierarchical.NoiseVarianceSampler object>, truebrf=<pyhrf.jde.asl_physio_hierarchical.PhysioTrueBOLDResponseSampler object>, truebrf_var=<pyhrf.jde.asl_physio_hierarchical.PhysioTrueBOLDResponseVarianceSampler object>, brf=<pyhrf.jde.asl_physio_hierarchical.PhysioBOLDResponseSampler object>, brf_var=<pyhrf.jde.asl_physio_hierarchical.PhysioBOLDResponseVarianceSampler object>, prf=<pyhrf.jde.asl_physio_hierarchical.PhysioPerfResponseSampler object>, prf_var=<pyhrf.jde.asl_physio_hierarchical.PhysioPerfResponseVarianceSampler object>, bold_mixt_params=<pyhrf.jde.asl_physio_hierarchical.BOLDMixtureSampler object>, perf_mixt_params=<pyhrf.jde.asl_physio_hierarchical.PerfMixtureSampler object>, drift=<pyhrf.jde.asl_physio_hierarchical.DriftCoeffSampler object>, drift_var=<pyhrf.jde.asl_physio_hierarchical.DriftVarianceSampler object>, perf_baseline=<pyhrf.jde.asl_physio_hierarchical.PerfBaselineSampler object>, perf_baseline_var=<pyhrf.jde.asl_physio_hierarchical.PerfBaselineVarianceSampler object>, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
computeFit
()¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶inputClass
¶alias of WN_BiG_ASLSamplerInput
parametersToShow
= ['nb_its', 'response_levels', 'hrf', 'hrf_var']¶pyhrf.jde.asl_physio_hierarchical.
BOLDMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_hierarchical.MixtureParamsSampler
, pyhrf.xmlio.Initable
get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_hierarchical.
BOLDResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_hierarchical.ResponseLevelSampler
, pyhrf.xmlio.Initable
computeVarYTildeOpt
(update_perf=False)¶If update_perf is True, then sumcXg and prl.ytilde are also updated. update_perf should only be used at the initialization of variable values.
getOutputs
()¶samplingWarmUp
(v)¶pyhrf.jde.asl_physio_hierarchical.
DriftCoeffSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
updateNorm
()¶pyhrf.jde.asl_physio_hierarchical.
DriftVarianceSampler
(val_ini=array([ 1.]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_hierarchical.
LabelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶compute_ext_field
()¶countLabels
()¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_hierarchical.
MixtureParamsSampler
(name, response_level_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶get_current_means
()¶get_current_vars
()¶get_true_values_from_simulation_dict
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.asl_physio_hierarchical.
NoiseVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_hierarchical.
PerfBaselineSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_residuals
()¶compute_wa
(a=None)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_hierarchical.
PerfBaselineVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_hierarchical.
PerfMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_hierarchical.MixtureParamsSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_hierarchical.
PerfResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_hierarchical.ResponseLevelSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶computeVarYTildeOpt
()¶pyhrf.jde.asl_physio_hierarchical.
PhysioBOLDResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False, prior_type='not_regularized')¶Bases: pyhrf.jde.asl_physio_hierarchical.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum cWXg - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample BRF
changes to mean: changes to var:
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_hierarchical.
PhysioBOLDResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(v)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Sample variance of BRF
pyhrf.jde.asl_physio_hierarchical.
PhysioPerfResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False, diff_res=True, prior_type='not_regularized')¶Bases: pyhrf.jde.asl_physio_hierarchical.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum aXh - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample PRF with physio prior
changes to mean: add a factor of Omega h Sigma_g^-1 v_g^-1
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_hierarchical.
PhysioPerfResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(v)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Sample variance of PRF
pyhrf.jde.asl_physio_hierarchical.
PhysioTrueBOLDResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False, prior_type='regularized')¶Bases: pyhrf.jde.asl_physio_hierarchical.ResponseSampler
, pyhrf.xmlio.Initable
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample TRUE BRF
pyhrf.jde.asl_physio_hierarchical.
PhysioTrueBOLDResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(v)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Sample variance of BRF
pyhrf.jde.asl_physio_hierarchical.
ResponseLevelSampler
(name, response_name, mixture_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶computeRR
()¶computeVarYTildeOpt
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶pyhrf.jde.asl_physio_hierarchical.
ResponseSampler
(name, response_level_name, variance_name, prior_type, smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
Generic parent class to perfusion response & BOLD response samplers
calcXResp
(resp, stackX=None)¶checkAndSetInitValue
(variables)¶computeYTilde
()¶getOutputs
()¶get_mat_X
()¶get_mat_XtX
()¶get_rlrl
()¶get_stackX
()¶get_ybar
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶setFinalValue
()¶updateNorm
()¶updateXResp
()¶pyhrf.jde.asl_physio_hierarchical.
WN_BiG_ASLSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.WN_BiG_Drift_BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.asl_physio_hierarchical.
b
()¶pyhrf.jde.asl_physio_hierarchical.
compute_StS_StY
(rls, v_b, mx, mxtx, ybar, rlrl, yaj, ajak_vb)¶yaj and ajak_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio_hierarchical.
compute_StS_StY_deterministic
(brls, prls, v_b, mx, mxtx, mwx, mxtwx, mwxtwx, ybar, rlrl_bold, rlrl_perf, brlprl, omega, yj, ajak_vb)¶yj, ajak_vb and cjck_vb are only used to store intermediate quantities; they are not inputs.
pyhrf.jde.asl_physio_hierarchical.
compute_bRpR
(brl, prl, nbConditions, nbVoxels)¶pyhrf.jde.asl_physio_joint.
ASLPhysioSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, bold_response_levels=<pyhrf.jde.asl_physio_joint.BOLDResponseLevelSampler object>, perf_response_levels=<pyhrf.jde.asl_physio_joint.PerfResponseLevelSampler object>, labels=<pyhrf.jde.asl_physio_joint.LabelSampler object>, noise_var=<pyhrf.jde.asl_physio_joint.NoiseVarianceSampler object>, brf=<pyhrf.jde.asl_physio_joint.PhysioBOLDResponseSampler object>, prf=<pyhrf.jde.asl_physio_joint.PhysioPerfResponseSampler object>, prfbrf_var=<pyhrf.jde.asl_physio_joint.PhysioJointResponseVarianceSampler object>, bold_mixt_params=<pyhrf.jde.asl_physio_joint.BOLDMixtureSampler object>, perf_mixt_params=<pyhrf.jde.asl_physio_joint.PerfMixtureSampler object>, drift=<pyhrf.jde.asl_physio_joint.DriftCoeffSampler object>, drift_var=<pyhrf.jde.asl_physio_joint.DriftVarianceSampler object>, perf_baseline=<pyhrf.jde.asl_physio_joint.PerfBaselineSampler object>, perf_baseline_var=<pyhrf.jde.asl_physio_joint.PerfBaselineVarianceSampler object>, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
computeFit
()¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶inputClass
¶alias of WN_BiG_ASLSamplerInput
parametersToShow
= ['nb_its', 'response_levels', 'hrf', 'hrf_var']¶pyhrf.jde.asl_physio_joint.
BOLDMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_joint.MixtureParamsSampler
, pyhrf.xmlio.Initable
get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_joint.
BOLDResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_joint.ResponseLevelSampler
, pyhrf.xmlio.Initable
computeVarYTildeOpt
(update_perf=False)¶If update_perf is True, then sumcXg and prl.ytilde are also updated. update_perf should only be used at the initialization of variable values.
getOutputs
()¶samplingWarmUp
(v)¶pyhrf.jde.asl_physio_joint.
DriftCoeffSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
updateNorm
()¶pyhrf.jde.asl_physio_joint.
DriftVarianceSampler
(val_ini=array([ 1.]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_joint.
LabelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶compute_ext_field
()¶countLabels
()¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_joint.
MixtureParamsSampler
(name, response_level_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶get_current_means
()¶get_current_vars
()¶get_true_values_from_simulation_dict
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.asl_physio_joint.
NoiseVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_y_tilde
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_joint.
PerfBaselineSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶compute_residuals
()¶compute_wa
(a=None)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_joint.
PerfBaselineVarianceSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.asl_physio_joint.
PerfMixtureSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_joint.MixtureParamsSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶get_true_values_from_simulation_cdefs
(cdefs)¶pyhrf.jde.asl_physio_joint.
PerfResponseLevelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_joint.ResponseLevelSampler
, pyhrf.xmlio.Initable
checkAndSetInitValue
(variables)¶computeVarYTildeOpt
()¶pyhrf.jde.asl_physio_joint.
PhysioBOLDResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.asl_physio_joint.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum cWXg - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample BRF. The corresponding changes to the posterior mean and variance are applied at this step.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.asl_physio_joint.
PhysioJointResponseVarianceSampler
(val_ini=array([ 0.001]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
, pyhrf.xmlio.Initable
checkAndSetInitValue
(v)¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Sample joint variance of BRF and PRF
pyhrf.jde.asl_physio_joint.
PhysioPerfResponseSampler
(smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False, diff_res=True)¶Bases: pyhrf.jde.asl_physio_joint.ResponseSampler
, pyhrf.xmlio.Initable
computeYTilde
()¶y - sum aXh - Pl - wa
get_mat_X
()¶get_mat_XtX
()¶get_stackX
()¶sampleNextInternal
(variables)¶Sample PRF with the physiological prior.
Changes to the posterior mean: an extra term Omega h Sigma_g^-1 v_g^-1 is added (see the sketch after this class entry).
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
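The note on the PRF update above is terse, so here is one plausible reading, stated purely as an assumption (the symbols X_g, v_b, Sigma_g, v_g, Omega and the residual ytilde are illustrative, not taken from the code): with a physiological prior g ~ N(Omega h, v_g Sigma_g) and a Gaussian likelihood, a conjugate update of the PRF g would read
\Sigma_{\text{post}} = \left( \frac{X_g^{\top} X_g}{v_b} + \frac{\Sigma_g^{-1}}{v_g} \right)^{-1},
\qquad
\mu_{\text{post}} = \Sigma_{\text{post}} \left( \frac{X_g^{\top} \tilde{y}}{v_b} + \frac{\Sigma_g^{-1} \, \Omega h}{v_g} \right),
where the last term is the "factor of Omega h Sigma_g^-1 v_g^-1" mentioned above.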
pyhrf.jde.asl_physio_joint.
ResponseLevelSampler
(name, response_name, mixture_name, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶computeRR
()¶computeVarYTildeOpt
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶pyhrf.jde.asl_physio_joint.
ResponseSampler
(name, response_level_name, variance_name, smooth_order=2, zero_constraint=True, duration=25.0, normalise=1.0, val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
Generic parent class to perfusion response & BOLD response samplers
calcXResp
(resp, stackX=None)¶checkAndSetInitValue
(variables)¶computeYTilde
()¶getOutputs
()¶get_mat_X
()¶get_mat_XtX
()¶get_rlrl
()¶get_stackX
()¶get_ybar
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶setFinalValue
()¶updateNorm
()¶updateXResp
()¶pyhrf.jde.asl_physio_joint.
WN_BiG_ASLSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.WN_BiG_Drift_BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.asl_physio_joint.
b
()¶pyhrf.jde.asl_physio_joint.
compute_StS_StY
(rls, v_b, mx, mxtx, ybar, rlrl, yaj, ajak_vb)¶yaj and ajak_vb are only used to store intermediate quantities, they’re not inputs.
pyhrf.jde.asl_physio_joint.
compute_bRpR
(brl, prl, nbConditions, nbVoxels)¶pyhrf.jde.beta.
BetaSampler
(do_sampling=True, use_true_value=False, val_ini=array([ 0.7]), sigma=0.05, pr_beta_cut=1.2, pf_method='es', pf=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶getOutputs
()¶get_string_value
(v)¶linkToData
(dataInput)¶loadBetaGrid
()¶parametersComments
= {'pf_method': 'either "es" (extrapolation scheme) or "ps" (path sampling)'}¶parametersToShow
= ['do_sampling', 'val_ini']¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
saveCurrentValue
(it)¶pyhrf.jde.beta.
Cpt_AcceptNewBeta_Graph
(RefGraph, GraphNodesLabels, VecEstim_lnZ, VecBetaVal, CurrentBeta, sigma, thresh=1.2, GraphWeight=None)¶Starting from a given Beta vector (1 value for each condition) CurrentBeta, computes new Beta values in NewBeta using a Metropolis-Hastings step.
Returns: NewBeta – contains the accepted beta value at the next iteration.
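To make the idea concrete, here is a minimal, illustrative sketch of such a Metropolis-Hastings update (the function name, the linear interpolation of ln(Z) from a precomputed grid, and the simple rejection of negative proposals are assumptions, not PyHRF's exact routine):
import numpy as np

def mh_beta_step(current_beta, u_labels, beta_grid, lnz_grid, sigma=0.05):
    # Target: p(labels | beta) proportional to exp(beta * U(labels) - ln Z(beta)).
    # Gaussian random-walk proposal of width sigma around the current beta.
    proposal = current_beta + sigma * np.random.randn()
    if proposal < 0.0:
        return current_beta  # beta is a non-negative regularisation parameter
    lnz_cur = np.interp(current_beta, beta_grid, lnz_grid)
    lnz_new = np.interp(proposal, beta_grid, lnz_grid)
    log_ratio = (proposal - current_beta) * u_labels - (lnz_new - lnz_cur)
    if np.log(np.random.rand()) < log_ratio:
        return proposal   # accept
    return current_beta   # reject
In PyHRF there is one beta per experimental condition, so an update of this kind would be applied condition by condition; the upper bound implied by thresh is ignored in this sketch.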
pyhrf.jde.beta.
Cpt_Distrib_P_beta_graph
(RefGraph, GraphNodesLabels, VecEstim_lnZ, VecBetaVal, thresh=1.5, GraphWeight=None)¶Computes the distribution P(beta | GraphNodesLabels) on the beta grid VecBetaVal.
Returns: Vec_P_Beta – contains the estimated values of P(beta) on the beta grid.
pyhrf.jde.beta.
Cpt_Exact_lnZ_graph
(RefGraph, beta, LabelsNb, GraphWeight=None)¶Computes the logarithm of the exact partition function.
Returns: exact_lnZ – the exact value of ln(Z).
pyhrf.jde.beta.
Cpt_Expected_U_graph
(RefGraph, beta, LabelsNb, SamplesNb, GraphWeight=None, GraphNodesLabels=None, GraphLinks=None, RefGrphNgbhPosi=None)¶Useless now!
Estimates the expectation of U for a given normalization constant Beta and a given mask shape. Swendsen-Wang sampling is used to assess the expectation on significant images depending on beta.
Returns: ExpectU – the expectation of U.
pyhrf.jde.beta.
Cpt_Vec_Estim_lnZ_Graph
(RefGraph, LabelsNb, SamplesNb=40, BetaMax=1.4, BetaStep=0.05, GraphWeight=None)¶Estimates ln(Z) for fields of a given size and Beta values between 0 and BetaMax. Estimates of ln(Z) are first computed on a coarse grid of Beta values. They are then computed and returned on a fine grid. No approximation using precomputed partition function is performed here.
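As an illustration of how such a ln(Z) grid can be built, here is a sketch of thermodynamic integration, assuming Monte-Carlo estimates of E[U] at each beta are available (for instance from Cpt_Vec_U_graph); the coarse-to-fine refinement mentioned above is omitted and the function name is hypothetical:
import numpy as np

def lnz_grid_from_expected_u(expected_u, beta_grid, nb_sites, nb_labels):
    # ln Z(0) = nb_sites * ln(nb_labels), and d ln Z / d beta = E_beta[U],
    # so the grid is filled by trapezoidal integration of the E[U] estimates.
    # Assumes beta_grid[0] == 0 and expected_u[k] estimates E[U] at beta_grid[k].
    lnz = np.empty(len(beta_grid))
    lnz[0] = nb_sites * np.log(nb_labels)
    for k in range(1, len(beta_grid)):
        step = beta_grid[k] - beta_grid[k - 1]
        lnz[k] = lnz[k - 1] + 0.5 * step * (expected_u[k] + expected_u[k - 1])
    return lnz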
pyhrf.jde.beta.
Cpt_Vec_Estim_lnZ_Graph_fast
(RefGraph, LabelsNb, MaxErrorAllowed=5, BetaMax=1.4, BetaStep=0.05)¶Estimate ln(Z(beta)) of Potts fields. The default Beta grid is between 0. and 1.4 with a step of 0.05. Extrapolation algorithm is used. Fast estimates are only performed for Ising fields (2 labels). Reference partition functions were pre-computed on Ising fields designed on regular and non-regular grids. They all respect a 6-connectivity system.
pyhrf.jde.beta.
Cpt_Vec_Estim_lnZ_Graph_fast2
(RefGraph, BetaMax=1.4, BetaStep=0.05)¶Estimate ln(Z(beta)) of Ising fields (2 labels). The default Beta grid is between 0. and 1.4 with a step of 0.05. Bilinear estimation with the number of sites and cliques is used. The bilinear functions were estimated using bilinear regression on reference partition functions on 240 non-regular grids and with respect to a 6-connectivity system. (Pfs are found in LoadBaseLogPartFctRef -> PFs 0:239)
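The bilinear shortcut can be pictured as below; this is only a sketch under the assumption that, for every beta on the grid, two regression coefficients (one weighting the number of sites, one the number of cliques) were fitted offline on the reference partition functions. Names are illustrative:
def lnz_bilinear_grid(site_coeffs, clique_coeffs, nb_sites, nb_cliques):
    # One predicted ln Z value per beta on the grid: a weighted combination of
    # the graph's number of sites and number of cliques (edges).
    return [a * nb_sites + b * nb_cliques
            for a, b in zip(site_coeffs, clique_coeffs)]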
pyhrf.jde.beta.
Cpt_Vec_Estim_lnZ_Graph_fast3
(RefGraph, LabelsNb, MaxErrorAllowed=5, BetaMax=1.4, BetaStep=0.05)¶Estimate ln(Z(beta)) of Potts fields. The default Beta grid is between 0. and 1.4 with a step of 0.05. Extrapolation algorithm is used. Fast estimates are only performed for Ising fields (2 labels). Reference partition functions were pre-computed on Ising fields designed on regular and non-regular grids. They all respect a 6-connectivity system.
pyhrf.jde.beta.
Cpt_Vec_Estim_lnZ_OLD_Graph
(RefGraph, LabelsNb, SamplesNb=50, BetaMax=1.0, BetaStep=0.01, GraphWeight=None)¶Useless now!
Estimates ln(Z) for fields of a given size and Beta values between 0 and BetaMax.
pyhrf.jde.beta.
Cpt_Vec_Estim_lnZ_Onsager
(n, BetaMax=1.2, BetaStep=0.05)¶Estimate ln(Z(beta)) on a grid of beta values using the Onsager technique (2D periodic fields - 2 labels - 4-connectivity).
pyhrf.jde.beta.
Estim_lnZ_Onsager
(n, beta)¶Estimate ln(Z(beta)) using the Onsager technique (2D periodic fields - 2 labels - 4-connectivity).
Returns: LogZ – the ln(Z(beta)) estimate.
pyhrf.jde.beta.
Estim_lnZ_ngbhd_graph
(RefGraph, beta_Ngbhd, beta_Ref, lnZ_ref, VecU_ref, LabelsNb)¶Estimates ln(Z) for beta=beta_Ngbhd. beta_Ngbhd is assumed to be close to beta_Ref, for which ln(Z) is known (lnZ_ref) and for which the energies U of fields generated according to it have already been computed (VecU_ref).
Returns: lnZ_Ngbhd – ln(Z) for beta=beta_Ngbhd.
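A minimal sketch of this neighbourhood extrapolation, under the assumption that it rests on the reweighting identity ln Z(beta') - ln Z(beta_ref) = ln E_ref[exp((beta' - beta_ref) U)], estimated from the sampled energies VecU_ref (the actual implementation may instead use a first-order expansion of this identity):
import numpy as np

def extrapolate_lnz(beta_ngbhd, beta_ref, lnz_ref, vec_u_ref):
    # Monte-Carlo estimate of the reweighting identity, with a log-sum-exp
    # shift for numerical stability.
    d = (beta_ngbhd - beta_ref) * np.asarray(vec_u_ref, dtype=float)
    m = d.max()
    return lnz_ref + m + np.log(np.mean(np.exp(d - m)))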
pyhrf.jde.beta.
LoadBaseLogPartFctRef
()¶pyhrf.jde.beta.
beta_estim_obs_field
(graph, labels, gridLnz, method='MAP', weights=None)¶Estimate the amount of spatial correlation (beta) of an observed Ising field. graph is the neighbours list defining the topology; labels is the field realisation; gridLnz is the log-partition function associated with the topology, i.e. a grid where gridLnz[0] stores values of lnZ and gridLnz[1] stores the corresponding values of beta.
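For method='MAP', the grid search can be sketched as follows (a flat prior on beta, no interpolation between grid points and no weighting are assumed; u_labels stands for the Potts energy U of the observed labels, e.g. computed with Cpt_U_graph):
import numpy as np

def map_beta_from_grid(u_labels, grid_lnz):
    # log p(labels | beta) = beta * U(labels) - ln Z(beta), up to a constant,
    # evaluated on the precomputed grid and maximised over beta.
    lnz, betas = np.asarray(grid_lnz[0]), np.asarray(grid_lnz[1])
    log_post = betas * u_labels - lnz
    return betas[np.argmax(log_post)]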
pyhrf.jde.beta.
logpf_ising_onsager
(size, beta)¶Calculate the log partition function in terms of beta for an Ising field of size ‘size’. ‘beta’ can be a scalar or a numpy.array. Assumptions: the field is 2D, square, toroidal and has 4-connectivity.
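A sketch of the classical Onsager closed form such a routine can rely on, written for the usual ±1-spin convention of the 2D Ising model; the {0,1}-label convention used elsewhere in PyHRF may rescale beta, so this is purely illustrative (the function name and the reading of size as the number of sites are assumptions):
import numpy as np

def onsager_log_partition(size, beta):
    # Per-site Onsager free energy of the infinite 2D Ising model
    # (4-connectivity, toroidal limit), multiplied by the number of sites.
    beta = np.atleast_1d(np.asarray(beta, dtype=float))
    t = np.linspace(0.0, np.pi, 201)
    t1, t2 = np.meshgrid(t, t, indexing='ij')
    lnz = np.empty_like(beta)
    for i, b in enumerate(beta):
        arg = np.cosh(2 * b) ** 2 - np.sinh(2 * b) * (np.cos(t1) + np.cos(t2))
        integrand = np.log(np.maximum(arg, 1e-12))  # guard the critical-point singularity
        # 1/(2*pi^2) * integral over [0, pi]^2 equals 1/(8*pi^2) over [0, 2*pi]^2
        lnz[i] = np.log(2.0) + np.trapz(np.trapz(integrand, t), t) / (2.0 * np.pi ** 2)
    return size * (lnz if lnz.size > 1 else lnz[0])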
pyhrf.jde.drift.
DriftARSampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
Gibbs sampler of the parameters modelling the low frequency drift in the fMRI time course, in the case of AR noise
checkAndSetInitValue
(variables)¶computeVarYTilde
(varNrls, varXh)¶fillOutputs2
(outputs, iROI=-1)¶finalizeSampling
()¶initOutputs2
(outputs, nbROI=-1)¶linkToData
(dataInput)¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶#TODO : comment
updateNorm
()¶updateVarYmDrift
()¶pyhrf.jde.drift.
DriftSampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
Gibbs sampler of the parameters modelling the low frequency drift in the fMRI time course, in the case of white noise.
checkAndSetInitValue
(variables)¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateNorm
()¶pyhrf.jde.drift.
DriftSamplerWithRelVar
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.jde.drift.DriftSampler
Gibbs sampler of the parameters modelling the low frequency drift in the fMRI time course, in the case of white noise.
checkAndSetInitValue
(variables)¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateNorm
()¶pyhrf.jde.drift.
ETASampler
(do_sampling=True, use_true_value=False, val_ini=array([ 1.]))¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
Gibbs sampler of the variance of the Inverse Gamma prior used to regularise the estimation of the low frequency drift embedded in the fMRI time course
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.drift.
ETASampler_MultiSess
(do_sampling=True, use_true_value=False, val_ini=array([ 1.]))¶Bases: pyhrf.jde.drift.ETASampler
linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.drift.
sampleDrift
(varInvSigma_drift, ptLambdaY, dim)¶pyhrf.jde.hrf.
HRFARSampler
(do_sampling=True, use_true_value=False, val_ini=None, duration=25.0, zero_constraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', do_voxelwise_outputs=False, compute_ah_online=False, output_ah=False)¶Bases: pyhrf.jde.hrf.HRFSampler
#This class implements the sampling of the HRF when modelling a serially correlated AR(1) noise process in the data. The structure of this noise is spatially varying in the sense that there is one AR parameter in combination with one noise variance per voxel.
computeStDS_StDY
(reps, noiseInvCov, nrls, varMBYPl)¶finalizeSampling
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.hrf.
HRFSampler
(do_sampling=True, use_true_value=False, val_ini=None, duration=25.0, zero_constraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', do_voxelwise_outputs=False, compute_ah_online=False, output_ah=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : HRF sampler for BiGaussian NLR mixture
calcXh
(hrf)¶checkAndSetInitValue
(variables)¶computeStDS_StDY
(rb, nrls, aa)¶detectSignError
()¶finalizeSampling
()¶getCurrentVar
()¶getFinalVar
()¶getOutputs
()¶getScaleFactor
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
get_final_value
()¶Used to compare with simulated value
initObservables
()¶linkToData
(dataInput)¶parametersComments
= {'covar_hack': 'Divide the term coming from the likelihood by the nb of voxels\n when computing the posterior covariance. The aim is to balance\n the contribution coming from the prior with that coming from the likelihood.\n Note: this hack is only taken into account when "singleHRf" is used for "prior_type"', 'do_sampling': 'Flag for the HRF estimation (True or False).\nIf set to False then the HRF is fixed to a canonical form.', 'duration': 'HRF length in seconds', 'normalise': 'If 1. : Normalise samples of Hrf and NRLs when they are sampled.\nIf 0. : Normalise posterior means of Hrf and NRLs when they are sampled.\nelse : Do not normalise.', 'prior_type': 'Type of prior:\n - "singleHRF": one HRF modelled for the whole parcel ~N(0,v_h*R).\n - "voxelwiseIID": one HRF per voxel, all HRFs are iid ~N(0,v_h*R).', 'zero_constraint': 'If True: impose first and last value = 0.\nIf False: no constraint.'}¶parametersToShow
= ['do_sampling', 'duration', 'zero_constraint']¶reportCurrentVal
()¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
setFinalValue
()¶updateNorm
()¶updateObsersables
()¶updateXh
()¶pyhrf.jde.hrf.
HRFSamplerWithRelVar
(do_sampling=True, use_true_value=False, val_ini=None, duration=25.0, zero_constraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', do_voxelwise_outputs=False, compute_ah_online=False, output_ah=False)¶Bases: pyhrf.jde.hrf.HRFSampler
This class introduces a new variable w (relevant variable) that takes its values in {0, 1}.
computeStDS_StDY_WithRelVar
(rb, nrls, aa, w)¶finalizeSampling
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.hrf.
HRF_Drift_Sampler
(do_sampling=True, use_true_value=False, val_ini=None, duration=25.0, zero_constraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', do_voxelwise_outputs=False, compute_ah_online=False, output_ah=False)¶Bases: pyhrf.jde.hrf.HRFSampler
Class handling the Gibbs sampling of Neural Response Levels in the case of joint drift sampling.
computeStDS_StDY
(rb, nrls, aa)¶pyhrf.jde.hrf.
HRF_Drift_SamplerWithRelVar
(do_sampling=True, use_true_value=False, val_ini=None, duration=25.0, zero_constraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', do_voxelwise_outputs=False, compute_ah_online=False, output_ah=False)¶Bases: pyhrf.jde.hrf.HRFSamplerWithRelVar
Class handling the Gibbs sampling of Neural Response Levels in the case of joint drift sampling.
computeStDS_StDY_WithRelVar
(rb, nrls, aa, w)¶pyhrf.jde.hrf.
HRF_two_parts_Sampler
(do_sampling=True, use_true_value=False, val_ini=None, duration=25.0, zero_constraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', do_voxelwise_outputs=False, compute_ah_online=False, output_ah=False)¶Bases: pyhrf.jde.hrf.HRFSampler
calcXh
(hrf)¶checkAndSetInitValue
(variables)¶computeStDS_StDY
(rb, nrls, aa)¶detectSignError
()¶finalizeSampling
()¶getCurrentVar
()¶getFinalVar
()¶getOutputs
()¶getScaleFactor
()¶initObservables
()¶linkToData
(dataInput)¶reportCurrentVal
()¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
setFinalValue
()¶updateNorm
()¶updateObsersables
()¶updateXh
()¶pyhrf.jde.hrf.
HRFwithHabSampler
(do_sampling=True, use_true_value=False, val_ini=None, duration=25.0, zero_constraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', do_voxelwise_outputs=False, compute_ah_online=False, output_ah=False)¶Bases: pyhrf.jde.hrf.HRFSampler
computeStDS_StDY
(rb, sumaX, Q)¶finalizeSampling
()¶getScaleFactor
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateNorm
()¶pyhrf.jde.hrf.
RHSampler
(do_sampling=True, use_true_value=False, val_ini=array([ 0.1]), prior_mean=0.001, prior_var=10)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : comment
checkAndSetInitValue
(variables)¶getOutputs
()¶get_final_value
()¶linkToData
(dataInput)¶parametersToShow
= ['do_sampling', 'val_ini']¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.hrf.
ScaleSampler
(do_sampling=False, use_true_value=False, val_ini=array([ 1.]))¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.hrf.
buildDiagGaussianMat
(size, width)¶pyhrf.jde.hrf.
msqrt
(cov)¶sig = msqrt(cov)
Return a matrix square root of a covariance matrix. Tries Cholesky factorization first, and factorizes by diagonalization if that fails.
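A minimal sketch of that strategy (not the actual implementation):
import numpy as np

def matrix_sqrt(cov):
    # Cholesky first: for a symmetric positive definite covariance this returns
    # a lower-triangular factor L with L.dot(L.T) == cov.
    try:
        return np.linalg.cholesky(cov)
    except np.linalg.LinAlgError:
        # Fall back to diagonalization, clipping the tiny negative eigenvalues
        # that a merely positive semi-definite covariance can exhibit.
        w, v = np.linalg.eigh(cov)
        return v.dot(np.diag(np.sqrt(np.clip(w, 0.0, None)))).dot(v.T)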
pyhrf.jde.hrf.
sampleHRF_single_hrf
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox)¶pyhrf.jde.hrf.
sampleHRF_single_hrf_hack
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox)¶pyhrf.jde.hrf.
sampleHRF_voxelwise_iid
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox)¶pyhrf.jde.jde_multi_sess.
BOLDGibbs_Multi_SessSampler
(nb_its=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, response_levels_sess=<pyhrf.jde.jde_multi_sess.NRL_Multi_Sess_Sampler object>, response_levels_mean=<pyhrf.jde.jde_multi_sess.NRLsBar_Drift_Multi_Sess_Sampler object>, beta=<pyhrf.jde.beta.BetaSampler object>, noise_var=<pyhrf.jde.jde_multi_sess.NoiseVariance_Drift_Multi_Sess_Sampler object>, hrf=<pyhrf.jde.jde_multi_sess.HRF_MultiSess_Sampler object>, hrf_var=<pyhrf.jde.hrf.RHSampler object>, mixt_weights=<pyhrf.jde.nrl.bigaussian.MixtureWeightsSampler object>, mixt_params=<pyhrf.jde.nrl.bigaussian.BiGaussMixtureParamsSampler object>, scale=<pyhrf.jde.hrf.ScaleSampler object>, drift=<pyhrf.jde.jde_multi_sess.Drift_MultiSess_Sampler object>, drift_var=<pyhrf.jde.jde_multi_sess.ETASampler_MultiSess object>, stop_crit_threshold=-1, stop_crit_from_start=False, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
cleanObservables
()¶computeFit
()¶computePMStimInducedSignal
()¶compute_crit_diff
(old_vals, means=None)¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶initGlobalObservables
()¶inputClass
¶alias of BOLDSampler_Multi_SessInput
parametersComments
= {'obs_hist_pace': 'See comment for samplesHistoryPaceSave.', 'smpl_hist_pace': 'To save the samples at each iteration\nIf x<0: no save\n If 0<x<1: define the fraction of iterations for which samples are saved\nIf x>=1: define the step in iterations number between saved samples.\nIf x=1: save samples at each iteration.'}¶parametersToShow
= ['nb_its', 'response_levels_sess', 'response_levels_mean', 'hrf', 'hrf_var']¶saveGlobalObservables
(it)¶stop_criterion
(it)¶updateGlobalObservables
()¶pyhrf.jde.jde_multi_sess.
BOLDSampler_Multi_SessInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Class holding data needed by the sampler: BOLD time courses for each voxel, onsets and voxel topology. It also performs some precalculations such as the convolution matrix based on the onsets (L{stackX}) -- multi-sessions version.
buildCosMat
(paramLFD, ny)¶buildOtherMatX
()¶buildParadigmConvolMatrix
(zc, estimDuration, availableDataIndex, parData)¶buildPolyMat
(paramLFD, n)¶calcDt
(dtMin)¶chewUpOnsets
(dt, hrfZc, hrfDuration)¶cleanMem
()¶cleanPrecalculations
()¶makePrecalculations
()¶setLFDMat
(paramLFD, typeLFD)¶Build the low frequency basis from polynomial basis functions.
pyhrf.jde.jde_multi_sess.
BiGaussMixtureParams_Multi_Sess_NRLsBar_Sampler
(do_sampling=True, use_true_value=False, val_ini=None, hyper_prior_type='Jeffreys', activ_thresh=4.0, var_ci_pr_alpha=2.04, var_ci_pr_beta=0.5, var_ca_pr_alpha=2.01, var_ca_pr_beta=0.5, mean_ca_pr_mean=5.0, mean_ca_pr_var=20.0)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶computeWithProperPriors
(j, cardCIj, cardCAj)¶finalizeSampling
()¶getCurrentMeans
()¶getCurrentVars
()¶getOutputs
()¶get_string_value
(v)¶linkToData
(dataInput)¶parametersComments
= {'activ_thresh': 'Threshold for the max activ mean above which the region is considered activating', 'hyper_prior_type': "Either 'proper' or 'Jeffreys'"}¶parametersToShow
= []¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateObsersables
()¶pyhrf.jde.jde_multi_sess.
Drift_MultiSess_Sampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶getOutputs
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
get_final_value
()¶get_true_value
()¶linkToData
(dataInput)¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateNorm
()¶pyhrf.jde.jde_multi_sess.
ETASampler_MultiSess
(do_sampling=True, use_true_value=False, val_ini=array([ 1.]))¶Bases: pyhrf.jde.drift.ETASampler
linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sess.
HRF_MultiSess_Sampler
(do_sampling=True, use_true_value=False, val_ini=None, duration=25.0, zero_constraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', do_voxelwise_outputs=False, compute_ah_online=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
HRF sampler for multisession model
calcXh
(hrf)¶checkAndSetInitValue
(variables)¶computeStDS_StDY
(rb_allSess, nrls_allSess, aa_allSess)¶computeStDS_StDY_from_HRFSampler
(rb, nrls, aa)¶Just for comparison purposes; should eventually be removed.
computeStDS_StDY_one_session
(rb, nrls, aa, sess)¶finalizeSampling
()¶getCurrentVar
()¶getFinalVar
()¶getOutputs
()¶getScaleFactor
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
initObservables
()¶linkToData
(dataInput)¶parametersComments
= {'covar_hack': 'Divide the term coming from the likelihood by the nb of voxels\n when computing the posterior covariance. The aim is to balance\n the contribution coming from the prior with that coming from the likelihood.\n Note: this hack is only taken into account when "singleHRf" is used for "prior_type"', 'do_sampling': 'Flag for the HRF estimation (True or False).\nIf set to False then the HRF is fixed to a canonical form.', 'duration': 'HRF length in seconds', 'normalise': 'If 1. : Normalise samples of Hrf and NRLs when they are sampled.\nIf 0. : Normalise posterior means of Hrf and NRLs when they are sampled.\nelse : Do not normalise.', 'prior_type': 'Type of prior:\n - "singleHRF": one HRF modelled for the whole parcel ~N(0,v_h*R).\n - "voxelwiseIID": one HRF per voxel, all HRFs are iid ~N(0,v_h*R).', 'zero_constraint': 'If True: impose first and last value = 0.\nIf False: no constraint.'}¶parametersToShow
= ['do_sampling', 'duration', 'zero_constraint']¶reportCurrentVal
()¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
setFinalValue
()¶updateNorm
()¶updateObsersables
()¶updateXh
()¶pyhrf.jde.jde_multi_sess.
NRL_Multi_Sess_Sampler
(do_sampling=True, val_ini=None, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶cleanMemory
()¶computeAA
(nrls, destaa)¶computeComponentsApost
(s, m, varXh)¶computeVarYTildeSessionOpt
(varXh, s)¶finalizeSampling
()¶getOutputs
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
is_accurate
()¶linkToData
(dataInput)¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶#TODO : comment
saveCurrentValue
(it)¶pyhrf.jde.jde_multi_sess.
NRLsBar_Drift_Multi_Sess_Sampler
(do_sampling=True, val_ini=None, contrasts={}, do_label_sampling=True, use_true_nrls=False, use_true_labels=False, labels_ini=None, ppm_proba_threshold=0.05, ppm_value_threshold=0, ppm_value_multi_threshold=array([ 0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1., 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2., 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 3., 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4. ]), mean_activation_threshold=4, rescale_results=False, wip_variance_computation=False)¶Bases: pyhrf.jde.nrl.bigaussian.NRLSampler
Class handling the Gibbs sampling of Neural Response Levels in the case of joint drift sampling.
checkAndSetInitValue
(variables)¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
is_accurate
()¶linkToData
(dataInput)¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
sampleNrlsSerial
(varCI, varCA, meanCA, variables)¶samplingWarmUp
(variables)¶#TODO : comment
setFinalValue
()¶pyhrf.jde.jde_multi_sess.
NoiseVariance_Drift_Multi_Sess_Sampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sess.
Variance_GaussianNRL_Multi_Sess
(do_sampling=True, use_true_value=False, val_ini=array([ 1.]))¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sess.
b
()¶pyhrf.jde.jde_multi_sess.
permutation
(x)¶Randomly permute a sequence, or return a permuted range.
If x is a multi-dimensional array, it is only shuffled along its first index.
Parameters: x (int or array_like) – If x is an integer, randomly permute np.arange(x). If x is an array, make a copy and shuffle the elements randomly.
Returns: out (ndarray) – Permuted sequence or array range.
Examples
>>> np.random.permutation(10)
array([1, 7, 4, 3, 0, 9, 2, 5, 8, 6])
>>> np.random.permutation([1, 4, 9, 12, 15])
array([15, 1, 9, 4, 12])
>>> arr = np.arange(9).reshape((3, 3))
>>> np.random.permutation(arr)
array([[6, 7, 8],
[0, 1, 2],
[3, 4, 5]])
pyhrf.jde.jde_multi_sess.
rand
(d0, d1, ..., dn)¶Random values in a given shape.
Create an array of the given shape and populate it with random samples from a uniform distribution over [0, 1).
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array, should all be positive. If no argument is given a single Python float is returned.
Returns: out (ndarray, shape (d0, d1, ..., dn)) – Random values.
See also
random()
Notes
This is a convenience function. If you want an interface that takes a shape-tuple as the first argument, refer to np.random.random_sample .
Examples
>>> np.random.rand(3,2)
array([[ 0.14022471, 0.96360618], #random
[ 0.37601032, 0.25528411], #random
[ 0.49313049, 0.94909878]]) #random
pyhrf.jde.jde_multi_sess.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive, int_like or int-convertible arguments are provided, randn generates an array of shape (d0, d1, ..., dn), filled with random floats sampled from a univariate “normal” (Gaussian) distribution of mean 0 and variance 1 (if any of the d_i are floats, they are first converted to integers by truncation). A single float randomly sampled from the distribution is returned if no argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array, should be all positive. If no argument is given a single Python float is returned.
Returns: Z (ndarray or float) – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied.
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.jde.jde_multi_sess.
sampleHRF_single_hrf
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox)¶pyhrf.jde.jde_multi_sess.
sampleHRF_single_hrf_hack
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox)¶pyhrf.jde.jde_multi_sess.
sampleHRF_voxelwise_iid
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox, nbSess)¶pyhrf.jde.jde_multi_sess.
simulate_sessions
(output_dir, snr_scenario='high_snr', spatial_size='tiny')¶pyhrf.jde.jde_multi_sess.
simulate_single_session
(output_dir, var_sessions_nrls, cdefs, nrls_bar, labels, labels_vol, v_noise, drift_coeff_var, drift_amplitude)¶pyhrf.jde.jde_multi_sujets.
BOLDGibbs_Multi_SubjSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, response_levels=<pyhrf.jde.jde_multi_sujets.NRLs_Sampler object>, noise_var=<pyhrf.jde.jde_multi_sujets.NoiseVariance_Drift_MultiSubj_Sampler instance>, hrf_subj=<pyhrf.jde.jde_multi_sujets.HRF_Sampler instance>, hrf_var_subj=<pyhrf.jde.jde_multi_sujets.HRFVarianceSubjectSampler instance>, hrf_group=<pyhrf.jde.jde_multi_sujets.HRF_Group_Sampler instance>, hrf_var_group=<pyhrf.jde.jde_multi_sujets.RHGroupSampler instance>, mixt_params=<pyhrf.jde.jde_multi_sujets.MixtureParamsSampler instance>, labels=<pyhrf.jde.jde_multi_sujets.LabelSampler instance>, drift=<pyhrf.jde.jde_multi_sujets.Drift_MultiSubj_Sampler instance>, drift_var=<pyhrf.jde.jde_multi_sujets.ETASampler_MultiSubj instance>, stop_crit_threshold=-1, stop_crit_from_start=False, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
cleanObservables
()¶computeFit
()¶computePMStimInducedSignal
()¶compute_crit_diff
(old_vals, means=None)¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶initGlobalObservables
()¶inputClass
¶alias of BOLDSampler_MultiSujInput
parametersComments
= {'obs_hist_pace': 'See comment for samplesHistoryPaceSave.', 'smpl_hist_pace': 'To save the samples at each iteration\nIf x<0: no save\n If 0<x<1: define the fraction of iterations for which samples are saved\nIf x>=1: define the step in iterations number between backup copies.\nIf x=1: save samples at each iteration.'}¶parametersToShow
= ['nb_iterations', 'response_levels', 'hrf_subj', 'hrf_var_subj', 'hrf_group', 'hrf_var_group']¶saveGlobalObservables
(it)¶stop_criterion
(it)¶updateGlobalObservables
()¶pyhrf.jde.jde_multi_sujets.
BOLDSampler_MultiSujInput
(GroupData, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Class holding data needed by the sampler: BOLD time courses for each voxel, onsets and voxel topology. It also performs some precalculations such as the convolution matrix based on the onsets (L{stackX}) -- multi-subjects version (cf. merge_fmri_subjects in core.py).
buildCosMat
(paramLFD, ny)¶buildOtherMatX
()¶buildParadigmConvolMatrix
(zc, estimDuration, availableDataIndex, parData)¶buildPolyMat
(paramLFD, n)¶calcDt
(dtMin)¶chewUpOnsets
(dt, hrfZc, hrfDuration)¶cleanMem
()¶makePrecalculations
()¶setLFDMat
(paramLFD, typeLFD)¶Build the low frequency basis from polynomial basis functions.
pyhrf.jde.jde_multi_sujets.
Drift_MultiSubj_Sampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
Gibbs sampler of the parameters modelling the low frequency drift in the fMRI time course, in the case of white noise.
checkAndSetInitValue
(variables)¶getOutputs
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶get_final_value
()¶get_true_value
()¶linkToData
(dataInput)¶sampleNextAlt
(variables)¶sampleNextInternal
(variables)¶updateNorm
()¶pyhrf.jde.jde_multi_sujets.
ETASampler_MultiSubj
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
Gibbs sampler of the variance of the Inverse Gamma prior used to regularise the estimation of the low frequency drift embedded in the fMRI time course
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.jde_multi_sujets.
HRFVarianceSubjectSampler
(val_ini=array([ 0.15]), do_sampling=True, use_true_value=False, prior_mean=0.001, prior_var=10.0)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : comment
checkAndSetInitValue
(variables)¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.jde_multi_sujets.
HRF_Group_Sampler
(val_ini=None, do_sampling=True, use_true_value=False, duration=25.0, zero_contraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', regularise=True, only_hrf_subj=False, compute_ah_online=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
HRF sampler for multisubjects model
P_COMPUTE_AH_ONLINE
= 'compute_ah_online'¶P_COVAR_HACK
= 'hackCovarApost'¶P_DERIV_ORDER
= 'derivOrder'¶P_DURATION
= 'duration'¶P_NORMALISE
= 'normalise'¶P_OUTPUT_PMHRF
= 'writeHrfOutput'¶P_PRIOR_TYPE
= 'priorType'¶P_REGULARIZE
= 'regularize_hrf'¶P_SAMPLE_FLAG
= 'sampleFlag'¶P_USE_TRUE_VALUE
= 'useTrueValue'¶P_VAL_INI
= 'initialValue'¶P_VOXELWISE_OUTPUTS
= 'voxelwiseOutputs'¶P_ZERO_CONSTR
= 'zeroConstraint'¶checkAndSetInitValue
(variables)¶defaultParameters
= {'compute_ah_online': False, 'derivOrder': 2, 'duration': 25, 'hackCovarApost': False, 'initialValue': None, 'normalise': 1.0, 'priorType': 'voxelwiseIID', 'regularize_hrf': True, 'sampleFlag': True, 'useTrueValue': False, 'voxelwiseOutputs': False, 'writeHrfOutput': True, 'zeroConstraint': True}¶finalizeSampling
()¶getCurrentVar
()¶getFinalVar
()¶getOutputs
()¶getScaleFactor
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶get_true_value
()¶linkToData
(dataInput)¶parametersComments
= {'duration': 'HRF length in seconds', 'hackCovarApost': 'Divide the term coming from the likelihood by the nb of voxels\n when computing the posterior covariance. The aim is to balance\n the contribution coming from the prior with that coming from the likelihood.\n Note: this hack is only taken into account when "singleHRf" is used for "prior_type"', 'normalise': 'If 1. : Normalise samples of Hrf, NRLs and Mixture Parameters when they are sampled.\nIf 0. : Normalise posterior means of Hrf, NRLs and Mixture Parameters when they are sampled.\nelse : Do not normalise.', 'priorType': 'Type of prior:\n - "singleHRF": one HRF modelled for the whole parcel ~N(0,v_h*R).\n - "voxelwiseIID": one HRF per voxel, all HRFs are iid ~N(0,v_h*R).', 'sampleFlag': 'Flag for the HRF estimation (True or False).\nIf set to False then the HRF is fixed to a canonical form.', 'zeroConstraint': 'If True: impose first and last value = 0.\nIf False: no constraint.'}¶parametersToShow
= ['duration', 'zeroConstraint', 'sampleFlag', 'writeHrfOutput']¶reportCurrentVal
()¶sampleNextAlt
(variables)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶setFinalValue
()¶updateNorm
()¶updateObsersables
()¶pyhrf.jde.jde_multi_sujets.
HRF_Sampler
(val_ini=None, do_sampling=True, use_true_value=False, duration=25.0, zero_contraint=True, normalise=1.0, deriv_order=2, covar_hack=False, prior_type='voxelwiseIID', regularise=True, only_hrf_subj=False, compute_ah_online=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
HRF sampler for multi subject model
calcXh
(hrfs)¶checkAndSetInitValue
(variables)¶computeStDS_StDY
(rb_allSubj, nrls_allSubj, aa_allSubj)¶computeStDS_StDY_one_subject
(rb, nrls, aa, subj)¶finalizeSampling
()¶getCurrentVar
()¶getFinalVar
()¶getOutputs
()¶getScaleFactor
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶get_true_value
()¶initObservables
()¶linkToData
(dataInput)¶parametersComments
= {'covar_hack': 'Divide the term coming from the likelihood by the nb of voxels\n when computing the posterior covariance. The aim is to balance\n the contribution coming from the prior with that coming from the likelihood.\n Note: this hack is only taken into account when "singleHRf" is used for "prior_type"', 'do_sampling': 'Flag for the HRF estimation (True or False).\nIf set to False then the HRF is fixed to a canonical form.', 'duration': 'HRF length in seconds', 'normalise': 'If 1. : Normalise samples of Hrf, NRLs andMixture Parameters when they are sampled.\nIf 0. : Normalise posterior means of Hrf, NRLs and Mixture Parameters when they are sampled.\nelse : Do not normalise.', 'prior_type': 'Type of prior:\n - "singleHRF": one HRF modelled for the whole parcel ~N(0,v_h*R).\n - "voxelwiseIID": one HRF per voxel, all HRFs are iid ~N(0,v_h*R).', 'zero_contraint': 'If True: impose first and last value = 0.\nIf False: no constraint.'}¶reportCurrentVal
()¶sampleNextAlt
(variables)¶sampleNextInternal
(variables)¶samplingWarmUp
(variables)¶setFinalValue
()¶updateNorm
()¶updateObsersables
()¶updateXh
()¶pyhrf.jde.jde_multi_sujets.
LabelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶FALSE_NEG
= 3¶FALSE_POS
= 2¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶compute_ext_field
()¶countLabels
()¶linkToData
(dataInput)¶sampleNextInternal
(v)¶samplingWarmUp
(v)¶pyhrf.jde.jde_multi_sujets.
MixtureParamsSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, s, cardCIj, cardCAj)¶get_current_means
()¶return array of shape (class, subject, condition)
get_current_vars
()¶return array of shape (class, subject, condition)
get_true_values_from_simulation_cdefs
(cdefs)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.jde_multi_sujets.
NRLs_Sampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶computeAA
()¶computeVarYTildeOpt
(varXh, s)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶pyhrf.jde.jde_multi_sujets.
NoiseVariance_Drift_MultiSubj_Sampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.jde_multi_sujets.
RHGroupSampler
(val_ini=array([ 0.15]), do_sampling=True, use_true_value=False, prior_mean=0.001, prior_var=10.0)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : comment
checkAndSetInitValue
(variables)¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.jde_multi_sujets.
Variance_GaussianNRL_Multi_Subj
(val_ini=array([ 1.]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶pyhrf.jde.jde_multi_sujets.
b
()¶pyhrf.jde.jde_multi_sujets.
create_gaussian_hrf_subject_and_group
(hrf_group_base, hrf_group_var_base, hrf_subject_var_base, dt, alpha=0.0)¶pyhrf.jde.jde_multi_sujets.
create_unnormed_gaussian_hrf_subject
(unnormed_hrf_group, unnormed_var_subject_hrf, dt, alpha=0.0)¶Creation of the HRF for each subject. Uses the group-level HRF and a variance for each subject (var_subjects_hrfs must be a list). Simulated HRFs must be smooth enough, i.e. there is correlation between temporal coefficients.
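As an illustration only (the covariance below is an assumption: smooth deviations are obtained from a second-order finite-difference operator, mirroring the order-2 smoothness priors used by the HRF samplers, and the function name is hypothetical):
import numpy as np

def draw_subject_hrf(hrf_group, var_subject):
    # Draw a subject-level HRF around the group HRF with temporally smooth
    # Gaussian deviations of variance var_subject.
    n = len(hrf_group)
    d2 = np.diff(np.eye(n), n=2, axis=0)              # second-order differences
    cov = var_subject * np.linalg.inv(d2.T.dot(d2) + 1e-6 * np.eye(n))
    return hrf_group + np.random.multivariate_normal(np.zeros(n), cov)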
pyhrf.jde.jde_multi_sujets.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive, int_like or int-convertible arguments are provided, randn generates an array of shape (d0, d1, ..., dn), filled with random floats sampled from a univariate “normal” (Gaussian) distribution of mean 0 and variance 1 (if any of the d_i are floats, they are first converted to integers by truncation). A single float randomly sampled from the distribution is returned if no argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array, should be all positive. If no argument is given a single Python float is returned.
Returns: Z (ndarray or float) – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied.
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.jde.jde_multi_sujets.
rescale_hrf_group
(unnormed_primary_hrf, unnormed_hrf_group)¶pyhrf.jde.jde_multi_sujets.
rescale_hrf_subj
(unnormed_primary_hrf)¶pyhrf.jde.jde_multi_sujets.
rescale_hrf_subj_var
(unnormed_primary_hrf, unnormed_var_subject_hrf)¶pyhrf.jde.jde_multi_sujets.
sampleHRF_single_hrf
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox, hgroup)¶pyhrf.jde.jde_multi_sujets.
sampleHRF_single_hrf_hack
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox, hgroup)¶pyhrf.jde.jde_multi_sujets.
sampleHRF_voxelwise_iid
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox, hgroup, nbsubj)¶pyhrf.jde.jde_multi_sujets.
simulate_single_subject
(output_dir, cdefs, var_subject_hrf, labels, labels_vol, v_noise, drift_coeff_var, drift_amplitude, hrf_group_level, var_hrf_group, dt=0.6, dsf=4)¶pyhrf.jde.jde_multi_sujets.
simulate_subjects
(output_dir, snr_scenario='high_snr', spatial_size='tiny', hrf_group=None, nbSubj=10)¶Simulate data for multiple subjects (10 subjects by default).
pyhrf.jde.jde_multi_sujets_alpha.
AlphaVar_Sampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
Gibbs sampler of the variance of the Inverse Gamma prior used to regularise the estimation of the low frequency drift embedded in the fMRI time course
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sujets_alpha.
Alpha_hgroup_Sampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶parametersToShow
= []¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sujets_alpha.
BOLDGibbs_Multi_SubjSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, bold_response_levels_subj=<pyhrf.jde.jde_multi_sujets_alpha.NRLs_Sampler object>, labels=<pyhrf.jde.jde_multi_sujets_alpha.LabelSampler object>, noise_var=<pyhrf.jde.jde_multi_sujets_alpha.NoiseVariance_Drift_MultiSubj_Sampler object>, hrf_subj=<pyhrf.jde.jde_multi_sujets_alpha.HRF_Sampler object>, hrf_subj_var=<pyhrf.jde.jde_multi_sujets_alpha.HRFVarianceSubjectSampler object>, hrf_group=<pyhrf.jde.jde_multi_sujets_alpha.HRF_Group_Sampler object>, hrf_group_var=<pyhrf.jde.jde_multi_sujets_alpha.RHGroupSampler object>, mixt_params=<pyhrf.jde.jde_multi_sujets_alpha.MixtureParamsSampler object>, drift=<pyhrf.jde.jde_multi_sujets_alpha.Drift_MultiSubj_Sampler object>, drift_var=<pyhrf.jde.jde_multi_sujets_alpha.ETASampler_MultiSubj object>, alpha=<pyhrf.jde.jde_multi_sujets_alpha.Alpha_hgroup_Sampler object>, alpha_var=<pyhrf.jde.jde_multi_sujets_alpha.AlphaVar_Sampler object>, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
cleanObservables
()¶computeFit
()¶computePMStimInducedSignal
()¶compute_crit_diff
(old_vals, means=None)¶default_nb_its
= 3000¶finalizeSampling
()¶getGlobalOutputs
()¶initGlobalObservables
()¶inputClass
¶alias of BOLDSampler_MultiSujInput
saveGlobalObservables
(it)¶stop_criterion
(it)¶updateGlobalObservables
()¶pyhrf.jde.jde_multi_sujets_alpha.
BOLDSampler_MultiSujInput
(GroupData, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Class holding data needed by the sampler: BOLD time courses for each voxel, onsets and voxel topology. It also performs some precalculations such as the convolution matrix based on the onsets (L{stackX}) -- multi-subjects version (cf. merge_fmri_subjects in core.py).
buildCosMat
(paramLFD, ny)¶buildOtherMatX
()¶buildParadigmConvolMatrix
(zc, estimDuration, availableDataIndex, parData)¶buildPolyMat
(paramLFD, n)¶calcDt
(dtMin)¶chewUpOnsets
(dt, hrfZc, hrfDuration)¶cleanMem
()¶makePrecalculations
()¶setLFDMat
(paramLFD, typeLFD)¶Build the low frequency basis from polynomial basis functions.
pyhrf.jde.jde_multi_sujets_alpha.
BiGaussMixtureParamsSampler
(val_ini=None, do_sampling=True, use_true_value=False, prior_type='Jeffrey', var_ci_pr_alpha=2.04, var_ci_pr_beta=2.08, var_ca_pr_alpha=2.01, var_ca_pr_beta=0.5, mean_ca_pr_mean=5.0, mean_ca_pr_var=20.0, mean_activation_threshold=4.0)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, cardCIj, cardCAj)¶computeWithProperPriors
(j, cardCIj, cardCAj)¶finalizeSampling
()¶getCurrentMeans
()¶getCurrentVars
()¶getOutputs
()¶get_string_value
(v)¶linkToData
(dataInput)¶parametersComments
= {'mean_activation_threshold': 'Threshold for the max activ mean above which the region is considered activating', 'prior_type': "Either 'proper' or 'Jeffrey'"}¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateObsersables
()¶pyhrf.jde.jde_multi_sujets_alpha.
Drift_MultiSubj_Sampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
Gibbs sampler of the parameters modelling the low frequency drift in the fMRI time course, in the case of white noise.
P_SAMPLE_FLAG
= 'sampleFlag'¶P_USE_TRUE_VALUE
= 'useTrueValue'¶P_VAL_INI
= 'initialValue'¶checkAndSetInitValue
(variables)¶defaultParameters
= {'initialValue': None, 'sampleFlag': True, 'useTrueValue': False}¶getOutputs
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
get_final_value
()¶get_true_value
()¶linkToData
(dataInput)¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
updateNorm
()¶pyhrf.jde.jde_multi_sujets_alpha.
ETASampler_MultiSubj
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
Gibbs sampler of the variance of the Inverse Gamma prior used to regularise the estimation of the low frequency drift embedded in the fMRI time course
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sujets_alpha.
HRFVarianceSubjectSampler
(val_ini=array([ 0.05]), do_sampling=False, use_true_value=False, pr_mean=0.001, pr_var=10.0)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
PR_MEAN
= 0.001¶PR_VAR
= 10.0¶VAL_INI
= 0.05¶checkAndSetInitValue
(variables)¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sujets_alpha.
HRF_Group_Sampler
(val_ini=None, do_sampling=True, use_true_value=False, duration=25.0, zero_constraint=True, normalise=1.0, derivOrder=2, output_hrf_pm=True, hack_covar_apost=False, prior_type='voxelwiseIID', compute_ah_online=False, regularize_hrf=True, model_subjects_only=False, voxelwise_outputs=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
HRF sampler for multisubjects model
checkAndSetInitValue
(variables)¶finalizeSampling
()¶getCurrentVar
()¶getFinalVar
()¶getOutputs
()¶getScaleFactor
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
linkToData
(dataInput)¶parametersComments
= {'do_sampling': 'Flag for the HRF estimation (True or False).\nIf set to False then the HRF is fixed to a canonical form.', 'duration': 'HRF length in seconds', 'hack_covar_apost': 'Divide the term coming from the likelihood by the nb of voxels\n when computing the posterior covariance. The aim is to balance\n the contribution coming from the prior with that coming from the likelihood.\n Note: this hack is only taken into account when "singleHRf" is used for "prior_type"', 'model_subjects_only': 'If 1: Put hrf group at zero and only estimate hrf by subjects.If 0: Perform group hemodynamic estimation, hrf group sampled', 'normalise': 'If 1. : Normalise samples of Hrf, NRLs and Mixture Parameters when they are sampled.\nIf 0. : Normalise posterior means of Hrf, NRLs and Mixture Parameters when they are sampled.\nelse : Do not normalise.', 'prior_type': 'Type of prior:\n - "singleHRF": one HRF modelled for the whole parcel ~N(0,v_h*R).\n - "voxelwiseIID": one HRF per voxel, all HRFs are iid ~N(0,v_h*R).', 'zero_constraint': 'If True: impose first and last value = 0.\nIf False: no constraint.'}¶parametersToShow
= ['duration', 'zero_constraint', 'do_sampling', 'output_hrf_pm']¶reportCurrentVal
()¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
setFinalValue
()¶updateNorm
()¶updateObsersables
()¶pyhrf.jde.jde_multi_sujets_alpha.
HRF_Sampler
(val_ini=None, do_sampling=True, use_true_value=False, duration=25.0, zero_constraint=True, normalise=1.0, derivOrder=2, output_hrf_pm=True, hack_covar_apost=False, prior_type='voxelwiseIID', compute_ah_online=False, regularize_hrf=True, model_subjects_only=False, voxelwise_outputs=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
HRF sampler for the multi-subject model
calcXh
(hrfs)¶checkAndSetInitValue
(variables)¶computeStDS_StDY
(rb_allSubj, nrls_allSubj, aa_allSubj)¶computeStDS_StDY_one_subject
(rb, nrls, aa, subj)¶finalizeSampling
()¶getCurrentVar
()¶getFinalVar
()¶getOutputs
()¶getScaleFactor
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
initObservables
()¶linkToData
(dataInput)¶parametersComments
= {'do_sampling': 'Flag for the HRF estimation (True or False).\nIf set to False then the HRF is fixed to a canonical form.', 'duration': 'HRF length in seconds', 'hack_covar_apost': 'Divide the term coming from the likelihood by the nb of voxels\n when computing the posterior covariance. The aim is to balance\n the contribution coming from the prior with that coming from the likelihood.\n Note: this hack is only taken into account when "singleHRf" is used for "prior_type"', 'normalise': 'If 1. : Normalise samples of Hrf, NRLs and Mixture Parameters when they are sampled.\nIf 0. : Normalise posterior means of Hrf, NRLs and Mixture Parameters when they are sampled.\nelse : Do not normalise.', 'prior_type': 'Type of prior:\n - "singleHRF": one HRF modelled for the whole parcel ~N(0,v_h*R).\n - "voxelwiseIID": one HRF per voxel, all HRFs are iid ~N(0,v_h*R).', 'zero_constraint': 'If True: impose first and last value = 0.\nIf False: no constraint.'}¶parametersToShow
= ['duration', 'zero_constraint', 'do_sampling', 'output_hrf_pm']¶reportCurrentVal
()¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
setFinalValue
()¶updateNorm
()¶updateObsersables
()¶updateXh
()¶pyhrf.jde.jde_multi_sujets_alpha.
LabelSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶compute_ext_field
()¶countLabels
()¶linkToData
(dataInput)¶sampleNextInternal
(v)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
samplingWarmUp
(v)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
pyhrf.jde.jde_multi_sujets_alpha.
MixtureParamsSampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
I_MEAN_CA
= 0¶I_VAR_CA
= 1¶I_VAR_CI
= 2¶L_CA
= 1¶L_CI
= 0¶NB_PARAMS
= 3¶PARAMS_NAMES
= ['Mean_Activ', 'Var_Activ', 'Var_Inactiv']¶checkAndSetInitValue
(variables)¶computeWithJeffreyPriors
(j, s, cardCIj, cardCAj)¶get_current_means
()¶return array of shape (class, subject, condition)
get_current_vars
()¶return array of shape (class, subject, condition)
get_true_values_from_simulation_cdefs
(cdefs)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sujets_alpha.
NRLs_Sampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶FALSE_NEG
= 3¶FALSE_POS
= 2¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶computeAA
()¶computeVarYTildeOpt
(varXh, s)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶pyhrf.jde.jde_multi_sujets_alpha.
NoiseVariance_Drift_MultiSubj_Sampler
(val_ini=None, do_sampling=True, use_true_value=False)¶Bases: pyhrf.jde.noise.NoiseVariance_Drift_Sampler
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶parametersToShow
= []¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sujets_alpha.
RHGroupSampler
(val_ini=array([ 0.05]), do_sampling=False, use_true_value=False, pr_mean=0.001, pr_var=10.0)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : comment
PR_MEAN
= 0.001¶PR_VAR
= 10.0¶VAL_INI
= 0.05¶checkAndSetInitValue
(variables)¶getOutputs
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sujets_alpha.
Variance_GaussianNRL_Multi_Subj
(val_ini=array([ 1.]), do_sampling=True, use_true_value=False)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
pyhrf.jde.jde_multi_sujets_alpha.
b
()¶pyhrf.jde.jde_multi_sujets_alpha.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive int-like arguments are provided, randn generates an array of shape (d0, d1, ..., dn), filled with random floats sampled from a univariate “normal” (Gaussian) distribution of mean 0 and variance 1 (if any of the d_i are floats, they are first converted to integers by truncation). A single float randomly sampled from the distribution is returned if no argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
Returns: Z – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied.
Return type: ndarray or float
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.jde.jde_multi_sujets_alpha.
sampleHRF_single_hrf
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox, hgroup, reg)¶pyhrf.jde.jde_multi_sujets_alpha.
sampleHRF_single_hrf_hack
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox, hgroup)¶pyhrf.jde.jde_multi_sujets_alpha.
sampleHRF_voxelwise_iid
(stLambdaS, stLambdaY, varR, rh, nbColX, nbVox, hgroup, only_hrf_subj, reg, nbsubj)¶pyhrf.jde.jde_multi_sujets_alpha.
simulate_single_subject
(output_dir, cdefs, var_subject_hrf, labels, labels_vol, v_noise, drift_coeff_var, drift_amplitude, hrf_group_level, alpha_var, dt=0.6, dsf=4)¶pyhrf.jde.jde_multi_sujets_alpha.
simulate_subjects
(output_dir, snr_scenario='high_snr', spatial_size='tiny', hrf_group=array([ 0. , 0.00078678, 0.01381744, 0.0575847 , 0.13317542, 0.22304737, 0.30459629, 0.36130416, 0.38656651, 0.38221983, 0.35502768, 0.3133342 , 0.26480303, 0.21531699, 0.16874149, 0.12718515, 0.09146061, 0.06155495, 0.03701372, 0.01720819, 0.00149434, -0.01071142, -0.01991078, -0.02653129, -0.03094522, -0.03348818, -0.03447231, -0.03419243, -0.0329265 , -0.03093224, -0.02844234, -0.02565985, -0.02275507, -0.01986438, -0.01709107, -0.0145079 , -0.01216086, -0.01007359, -0.00825211, -0.00668921, -0.00536856, -0.00426813, 0. ]), nbSubj=10, vars_hrfs=[0.0006, 0.0004, 9e-05, 2e-05, 1.5e-05, 2e-05, 0.0001, 3e-05, 7.5e-05, 3.2e-05], vars_noise=[0.2, 0.5, 0.4, 0.8, 0.6, 2.1, 2.5, 3.1, 2.75, 7.3], alpha_var=0.6)¶Simulate data for multiple subjects (5 subjects by default)
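A minimal usage sketch of this simulation helper, assuming only the documented defaults above (the output directory is a placeholder; nbSubj defaults to 10 in the signature shown above):

from pyhrf.jde.jde_multi_sujets_alpha import simulate_subjects

# Simulate a small multi-subject dataset with the default group HRF,
# per-subject HRF variances and noise variances listed above.
simu = simulate_subjects(output_dir='./simu_multi_subj',   # placeholder path
                         snr_scenario='high_snr',
                         spatial_size='tiny',
                         nbSubj=10)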
pyhrf.jde.models.
ARN_BiG_BOLDSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.models.
BOLDGibbsSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, response_levels=<pyhrf.jde.nrl.bigaussian.NRLSampler object>, beta=<pyhrf.jde.beta.BetaSampler object>, noise_var=<pyhrf.jde.noise.NoiseVarianceSampler object>, hrf=<pyhrf.jde.hrf.HRFSampler object>, hrf_var=<pyhrf.jde.hrf.RHSampler object>, mixt_weights=<pyhrf.jde.nrl.bigaussian.MixtureWeightsSampler object>, mixt_params=<pyhrf.jde.nrl.bigaussian.BiGaussMixtureParamsSampler object>, scale=<pyhrf.jde.hrf.ScaleSampler object>, stop_crit_threshold=-1, stop_crit_from_start=False, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
cleanObservables
()¶computeFit
()¶computePMStimInducedSignal
()¶compute_crit_diff
(old_vals, means=None)¶default_nb_its
= 3000¶getGlobalOutputs
()¶initGlobalObservables
()¶inputClass
¶alias of WN_BiG_BOLDSamplerInput
parametersComments
= {'obs_hist_pace': 'See comment for samplesHistoryPaceSave.', 'smpl_hist_pace': 'To save the samples at each iteration\nIf x<0: no save\n If 0<x<1: define the fraction of iterations for which samples are saved\nIf x>=1: define the step in iterations number between saved samples.\nIf x=1: save samples at each iteration.'}¶parametersToShow
= ['nb_iterations', 'response_levels', 'hrf', 'hrf_var']¶saveGlobalObservables
(it)¶stop_criterion
(it)¶updateGlobalObservables
()¶pyhrf.jde.models.
BOLDGibbsSampler_AR
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, response_levels=<pyhrf.jde.nrl.ar.NRLARSampler object>, beta=<pyhrf.jde.beta.BetaSampler object>, noise_var=<pyhrf.jde.noise.NoiseVarianceARSampler object>, noise_arp=<pyhrf.jde.noise.NoiseARParamsSampler object>, hrf=<pyhrf.jde.hrf.HRFARSampler object>, hrf_var=<pyhrf.jde.hrf.RHSampler object>, mixt_weights=<pyhrf.jde.nrl.bigaussian.MixtureWeightsSampler object>, mixt_params=<pyhrf.jde.nrl.bigaussian.BiGaussMixtureParamsSampler object>, scale=<pyhrf.jde.hrf.ScaleSampler object>, drift=<pyhrf.jde.drift.DriftARSampler object>, drift_var=<pyhrf.jde.drift.ETASampler object>, stop_crit_threshold=-1, stop_crit_from_start=False, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
cleanObservables
()¶computeFit
()¶computePMStimInducedSignal
()¶compute_crit_diff
(old_vals, means=None)¶default_nb_its
= 3000¶getGlobalOutputs
()¶initGlobalObservables
()¶inputClass
¶alias of ARN_BiG_BOLDSamplerInput
parametersComments
= {'obs_hist_pace': 'See comment for samplesHistoryPaceSave.', 'smpl_hist_pace': 'To save the samples at each iteration\nIf x<0: no save\n If 0<x<1: define the fraction of iterations for which samples are saved\nIf x>=1: define the step in iterations number between saved samples.\nIf x=1: save samples at each iteration.'}¶parametersToShow
= ['nb_iterations', 'response_levels', 'hrf', 'hrf_var']¶saveGlobalObservables
(it)¶stop_criterion
(it)¶updateGlobalObservables
()¶pyhrf.jde.models.
BOLDSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Class holding the data needed by the sampler: BOLD time courses for each voxel, onsets and voxel topology. It also performs some precalculations, such as building the convolution matrix based on the onsets (L{stackX}).
buildCosMat
(paramLFD, ny)¶buildOtherMatX
()¶buildParadigmConvolMatrix
(zc, estimDuration, availableDataIndex, parData)¶buildParadigmSingleCondMatrix
(zc, estimDuration, availableDataIndex, parData)¶buildPolyMat
(paramLFD, n)¶calcDt
(dtMin)¶chewUpOnsets
(dt, hrfZc, hrfDuration)¶cleanMem
()¶cleanPrecalculations
()¶makePrecalculations
()¶setLFDMat
(paramLFD, typeLFD)¶Build the low frequency basis from polynomial basis functions.
pyhrf.jde.models.
BOLDSampler_Multi_SessInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Class holding the data needed by the sampler: BOLD time courses for each voxel, onsets and voxel topology. It also performs some precalculations, such as building the convolution matrix based on the onsets (L{stackX}) — multi-session version.
buildCosMat
(paramLFD, ny)¶buildOtherMatX
()¶buildParadigmConvolMatrix
(zc, estimDuration, availableDataIndex, parData)¶buildPolyMat
(paramLFD, n)¶calcDt
(dtMin)¶chewUpOnsets
(dt, hrfZc, hrfDuration)¶cleanMem
()¶cleanPrecalculations
()¶makePrecalculations
()¶setLFDMat
(paramLFD, typeLFD)¶Build the low frequency basis from polynomial basis functions.
pyhrf.jde.models.
CallbackCritDiff
¶Bases: pyhrf.jde.samplerbase.GSDefaultCallbackHandler
callback
(it, variables, samplerEngine)¶Execute the action to be performed after each Gibbs Sampling step (here: nothing). Should be overridden to define more specialized actions. @param it: the number of iterations elapsed in the current sampling process. @param samplerEngine: the parent Gibbs sampler object. @param vars: variables involved in the sampling process (list of C{GibbsSamplerVariable} whose index is defined in L{samplerEngine}).
pyhrf.jde.models.
Drift_BOLDGibbsSampler
(nb_iterations=3000, obs_hist_pace=-1, glob_obs_hist_pace=-1, smpl_hist_pace=-1, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, response_levels=<pyhrf.jde.nrl.bigaussian_drift.NRL_Drift_Sampler object>, beta=<pyhrf.jde.beta.BetaSampler object>, noise_var=<pyhrf.jde.noise.NoiseVariance_Drift_Sampler object>, hrf=<pyhrf.jde.hrf.HRF_Drift_Sampler object>, hrf_var=<pyhrf.jde.hrf.RHSampler object>, mixt_weights=<pyhrf.jde.nrl.bigaussian.MixtureWeightsSampler object>, mixt_params=<pyhrf.jde.nrl.bigaussian.BiGaussMixtureParamsSampler object>, scale=<pyhrf.jde.hrf.ScaleSampler object>, drift=<pyhrf.jde.drift.DriftSampler object>, drift_var=<pyhrf.jde.drift.ETASampler object>, stop_crit_threshold=-1, stop_crit_from_start=False, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
computeFit
()¶default_nb_its
= 3000¶inputClass
¶alias of WN_BiG_Drift_BOLDSamplerInput
parametersToShow
= ['nb_iterations', 'response_levels', 'hrf', 'hrf_var']¶pyhrf.jde.models.
Hab_WN_BiG_BOLDSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.WN_BiG_BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.models.
WN_BiG_BOLDSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.models.
WN_BiG_Drift_BOLDSamplerInput
(data, dt, typeLFD, paramLFD, hrfZc, hrfDuration)¶Bases: pyhrf.jde.models.BOLDSamplerInput
cleanPrecalculations
()¶makePrecalculations
()¶pyhrf.jde.models.
W_BOLDGibbsSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, response_levels=<pyhrf.jde.nrl.bigaussian.NRLSamplerWithRelVar object>, beta=<pyhrf.jde.beta.BetaSampler object>, noise_var=<pyhrf.jde.noise.NoiseVarianceSampler object>, hrf=<pyhrf.jde.hrf.HRFSamplerWithRelVar object>, hrf_var=<pyhrf.jde.hrf.RHSampler object>, mixt_weights=<pyhrf.jde.nrl.bigaussian.MixtureWeightsSampler object>, mixt_params=<pyhrf.jde.nrl.bigaussian.BiGaussMixtureParamsSamplerWithRelVar object>, scale=<pyhrf.jde.hrf.ScaleSampler object>, relevantVariable=<pyhrf.jde.wsampler.WSampler object>, stop_crit_threshold=-1, stop_crit_from_start=False, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
default_nb_its
= 3000¶inputClass
¶alias of WN_BiG_BOLDSamplerInput
parametersToShow
= ['nb_iterations', 'response_levels', 'hrf', 'hrf_var']¶pyhrf.jde.models.
W_Drift_BOLDGibbsSampler
(nb_iterations=3000, obs_hist_pace=-1.0, glob_obs_hist_pace=-1, smpl_hist_pace=-1.0, burnin=0.3, callback=<pyhrf.jde.samplerbase.GSDefaultCallbackHandler object>, response_levels=<pyhrf.jde.nrl.bigaussian_drift.NRL_Drift_SamplerWithRelVar object>, beta=<pyhrf.jde.beta.BetaSampler object>, noise_var=<pyhrf.jde.noise.NoiseVariance_Drift_Sampler object>, hrf=<pyhrf.jde.hrf.HRF_Drift_SamplerWithRelVar object>, hrf_var=<pyhrf.jde.hrf.RHSampler object>, mixt_weights=<pyhrf.jde.nrl.bigaussian.MixtureWeightsSampler object>, mixt_params=<pyhrf.jde.nrl.bigaussian.BiGaussMixtureParamsSamplerWithRelVar object>, scale=<pyhrf.jde.hrf.ScaleSampler object>, condion_relevance=<pyhrf.jde.wsampler.W_Drift_Sampler object>, drift=<pyhrf.jde.drift.DriftSamplerWithRelVar object>, drift_var=<pyhrf.jde.drift.ETASampler object>, stop_crit_threshold=-1, stop_crit_from_start=False, check_final_value=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSampler
default_nb_its
= 3000¶inputClass
¶alias of WN_BiG_Drift_BOLDSamplerInput
parametersToShow
= ['nb_iterations', 'response_levels', 'hrf', 'hrf_var']¶pyhrf.jde.models.
computePl
(drift, varP, dest=None)¶pyhrf.jde.models.
computeSumjaXh
(nrl, matXh, dest=None)¶pyhrf.jde.models.
computeXh
(hrf, varX, dest=None)¶pyhrf.jde.models.
computeYBar
(varMBY, varPl, dest=None)¶pyhrf.jde.models.
computeYTilde
(sumj_aXh, varMBY, dest=None)¶pyhrf.jde.models.
computeYTilde_Pl
(sumj_aXh, yBar, dest=None)¶pyhrf.jde.models.
computehXQXh
(hrf, matXQX, dest=None)¶pyhrf.jde.models.
permutation
(x)¶Randomly permute a sequence, or return a permuted range.
If x is a multi-dimensional array, it is only shuffled along its first index.
Parameters: x (int or array_like) – If x is an integer, randomly permute np.arange(x). If x is an array, make a copy and shuffle the elements randomly.
Returns: out – Permuted sequence or array range.
Return type: ndarray
Examples
>>> np.random.permutation(10)
array([1, 7, 4, 3, 0, 9, 2, 5, 8, 6])
>>> np.random.permutation([1, 4, 9, 12, 15])
array([15, 1, 9, 4, 12])
>>> arr = np.arange(9).reshape((3, 3))
>>> np.random.permutation(arr)
array([[6, 7, 8],
[0, 1, 2],
[3, 4, 5]])
pyhrf.jde.models.
rand
(d0, d1, ..., dn)¶Random values in a given shape.
Create an array of the given shape and populate it with random samples from a uniform distribution over [0, 1).
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
Returns: out – Random values.
Return type: ndarray, shape (d0, d1, ..., dn)
See also
random()
Notes
This is a convenience function. If you want an interface that takes a shape-tuple as the first argument, refer to np.random.random_sample .
Examples
>>> np.random.rand(3,2)
array([[ 0.14022471, 0.96360618], #random
[ 0.37601032, 0.25528411], #random
[ 0.49313049, 0.94909878]]) #random
pyhrf.jde.models.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution (re-export of numpy.random.randn; see the full description under pyhrf.jde.jde_multi_sujets_alpha above).
pyhrf.jde.models.
simulate_bold
(output_dir=None, noise_scenario='high_snr', spatial_size='tiny', normalize_hrf=True)¶pyhrf.jde.noise.
NoiseARParamsSampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
MH_ARsampling_gauss_proposal
(sig2, M)¶MH_ARsampling_optim
(A, reps, M)¶P_SAMPLE_FLAG
= 'sampleFlag'¶P_USE_TRUE_VALUE
= 'useTrueValue'¶P_VAL_INI
= 'initialValue'¶checkAndSetInitValue
(variables)¶computeInvAutoCorrNoise
(ARp)¶defaultParameters
= {'initialValue': None, 'sampleFlag': True, 'useTrueValue': False}¶finalizeSampling
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
pyhrf.jde.noise.
NoiseVarianceARSampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.jde.noise.NoiseVarianceSampler
checkAndSetInitValue
(variables)¶computeVarYTilde
(varNrls, varXh, varMBYPl)¶finalizeSampling
()¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
pyhrf.jde.noise.
NoiseVarianceSampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
#TODO : comment
checkAndSetInitValue
(variables)¶computeMXhQXh
(h, varXQX)¶compute_aaXhQXhi
(aa, i)¶finalizeSampling
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
sampleNextInternal_bak
(variables)¶pyhrf.jde.noise.
NoiseVarianceSamplerWithRelVar
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.jde.noise.NoiseVarianceSampler
computeWW
(w, destww)¶compute_aawwXhQXhi
(ww, aa, i)¶finalizeSampling
()¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
pyhrf.jde.noise.
NoiseVariance_Drift_Sampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
checkAndSetInitValue
(variables)¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
pyhrf.jde.noise.
NoiseVariancewithHabSampler
(do_sampling=True, use_true_value=False, val_ini=None)¶Bases: pyhrf.jde.noise.NoiseVarianceSampler
#TODO: sampling procedure for noise variance parameters (white noise) in case of habituation modeling wrt magnitude
finalizeSampling
()¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
pyhrf.jde.samplerbase.
DuplicateVariableException
(vName, v)¶Bases: exceptions.Exception
pyhrf.jde.samplerbase.
GSDefaultCallbackHandler
¶Bases: pyhrf.xmlio.Initable
Class handling default action after Gibbs Sampling step (nothing). Should be inherited to define more specialized actions (such as plotting and reporting).
callback
(it, vars, samplerEngine)¶Execute the action to be performed after each Gibbs Sampling step (here: nothing). Should be overridden to define more specialized actions. @param it: the number of iterations elapsed in the current sampling process. @param samplerEngine: the parent Gibbs sampler object. @param vars: variables involved in the sampling process (list of C{GibbsSamplerVariable} whose index is defined in L{samplerEngine}).
pyhrf.jde.samplerbase.
GSPrintCallbackHandler
(pace)¶Bases: pyhrf.jde.samplerbase.GSDefaultCallbackHandler
Class defining behaviour after each Gibbs Sampling step : printing reports to stdout.
callback
(it, variables, samplerEngine)¶Execute the action to be performed after each Gibbs Sampling step (here: nothing). Should be overridden to define more specialized actions. @param it: the number of iterations elapsed in the current sampling process. @param samplerEngine: the parent Gibbs sampler object. @param vars: variables involved in the sampling process (list of C{GibbsSamplerVariable} whose index is defined in L{samplerEngine}).
pyhrf.jde.samplerbase.
GibbsSampler
(variables, nbIt, smplHistoryPace=-1, obsHistoryPace=-1, nbSweeps=None, callbackObj=None, randomSeed=None, globalObsHistoryPace=-1, check_ftval=None, output_fit=False)¶Generic class for a Gibbs sampler, which gathers the operations common to any Gibbs sampling: variable initialisation, observable updates (posterior means), outputs …
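As an illustration of the engine described above, here is a minimal, hypothetical sketch; 'sampler_variables' stands for a list of GibbsSamplerVariable instances and 'data_input' for a sampler input object (e.g. a BOLDSamplerInput) prepared elsewhere:

from pyhrf.jde.samplerbase import GibbsSampler

# 'sampler_variables' and 'data_input' are placeholders built elsewhere.
sampler = GibbsSampler(sampler_variables, nbIt=3000)
sampler.linkToData(data_input)    # bind the variables to the observed data
sampler.runSampling()             # run the full Gibbs sampling loop
outputs = sampler.getOutputs()    # posterior means and other tracked observables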
computeFit
()¶finalizeSampling
()¶getFitAxes
()¶getGlobalOutputs
()¶getOutputs
()¶getShortProfile
()¶getTinyProfile
()¶get_variable
(label)¶initGlobalObservables
()¶iterate_sampling
()¶linkToData
(dataInput)¶regVarsInPipeline
()¶runSampling
(atomData=None)¶Launch a complete sampling process by calling the function L{GibbsSamplerVariable.sampleNext()} of each variable. Call the callback function after each iteration. Measure time elapsed and store it in L{tSamplinOnly} and L{analysis_duration}
saveGlobalObservables
(it)¶set_nb_iterations
(n)¶stop_criterion
(it)¶updateGlobalObservables
()¶pyhrf.jde.samplerbase.
GibbsSamplerVariable
(name, valIni=None, trueVal=None, sampleFlag=1, useTrueValue=False, axes_names=None, axes_domains=None, value_label='value')¶checkAndSetInitValue
(variables)¶check_final_value
()¶chooseSampleNext
(flag)¶cleanObservables
()¶finalizeSampling
()¶getMean
()¶(Work in progress.) Compute the mean over MCMC iterations within the window defined by itStart, itEnd and pace. By default, itStart is set to ‘nbSweeps’ and itEnd to the last iteration.
getMeanHistory
()¶getOutputs
()¶get_accuracy
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
get_final_summary
()¶get_final_value
()¶get_string_value
(v)¶get_summary
()¶get_true_value
()¶get_variable
(label)¶Return a sibling GibbsSamplerVariable
initObservables
()¶linkToData
()¶manageMapping
(cuboid)¶manageMappingInit
(shape, axes_names)¶record_trajectories
(it)¶registerNbIterations
(nbIt)¶roiMapped
()¶sampleNextAlt
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is not activated.
sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
samplingWarmUp
(variables)¶Called before the launch of the main sampling loop by the sampler engine. Should be overridden to perform precalculations.
saveCurrentValue
(it)¶saveObservables
(it)¶setFinalValue
()¶setSamplerEngine
(sampler)¶track_obs_quantity
(q, name, axes_names=None, axes_domains=None, history_pace=None)¶track_sampled_quantity
(q, name, axes_names=None, axes_domains=None, history_pace=None)¶updateObsersables
()¶pyhrf.jde.samplerbase.
Trajectory
(variable, axes_names, axes_domains, history_pace, history_start, max_iterations, first_saved_iteration=-1)¶Keep track of a numpy array that is modified _inplace_ iteratively
get_last
()¶Return the last saved element
record
(iteration)¶Record the current value in the history for the given iteration.
to_cuboid
()¶Pack the current trajectory in a xndarray
pyhrf.jde.samplerbase.
VariableTypeException
(vClass, vName, v)¶Bases: exceptions.Exception
pyhrf.jde.wsampler.
WSampler
(do_sampling=True, use_true_value=False, val_ini=None, pr_sigmoid_slope=1.0, pr_sigmoid_thresh=0.0)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶computeProbW1
(Qgj, gTQgj, rb, moyqj, t1, t2, mCAj, vCIj, vCAj, j, cardClassCAj)¶ProbW1 is the probability that the condition is relevant. It is a vector of length nbcond.
computeVarXhtQ
(h, matXQ)¶computemoyq
(cardClassCA, nbVoxels)¶Compute mean of labels in ROI
finalizeSampling
()¶getOutputs
()¶initObservables
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
saveCurrentValue
(it)¶saveObservables
(it)¶threshold_W
(meanW, thresh)¶updateObsersables
()¶pyhrf.jde.wsampler.
W_Drift_Sampler
(do_sampling=True, use_true_value=False, val_ini=None, pr_sigmoid_slope=1.0, pr_sigmoid_thresh=0.0)¶Bases: pyhrf.xmlio.Initable
, pyhrf.jde.samplerbase.GibbsSamplerVariable
CLASSES
= array([0, 1])¶CLASS_NAMES
= ['inactiv', 'activ']¶L_CA
= 1¶L_CI
= 0¶checkAndSetInitValue
(variables)¶computeProbW1
(gj, gTgj, rb, t1, t2, mCAj, vCIj, vCAj, j, cardClassCAj)¶ProbW1 is the probability that the condition is relevant. It is a vector of length nbcond.
computemoyq
(cardClassCA, nbVoxels)¶Compute mean of labels in ROI
finalizeSampling
()¶getOutputs
()¶initObservables
()¶linkToData
(dataInput)¶sampleNextInternal
(variables)¶Define the behaviour of the variable at each sampling step when its sampling is activated. Must be overridden in child classes.
saveCurrentValue
(it)¶saveObservables
(it)¶threshold_W
(meanW, thresh)¶updateObsersables
()¶pyhrf.sandbox.data_parser.
StructuredDataParser
(directory_labels, allowed_subdirectories, directory_forgers=None, file_forgers=None, root_path=None)¶get_file_blocs
(file_defs, **subdirs)¶get_files
(file_def, **subdirs)¶set_root
(root)¶pyhrf.sandbox.data_parser.
apply_to_dict
(d, f)¶pyhrf.sandbox.data_parser.
check_subdirs
(path, labels, tree, not_found=None)¶pyhrf.sandbox.data_parser.
forge_nrl_files
(conditions)¶construct list of nrl files from list of conditions
pyhrf.sandbox.data_parser.
safe_init
(x, default)¶pyhrf.sandbox.data_parser.
safe_list
(l)¶pyhrf.sandbox.data_parser.
same
(x)¶identity function
pyhrf.sandbox.data_parser.
unformat_nrl_file
(file_nrl)¶pyhrf.sandbox.func_BMA_consensus_clustering.
BMA_consensus_cluster_parallel
(cfg, remote_path, remote_BOLD_fn, remote_mask_fn, Y, nifti_masker, num_vox, K_clus, K_clusters, parc, alpha, prop, nbItRFIR, onsets, durations, output_sub_parc, rescale=True, averg_bold=False)¶Performs all steps for one clustering case (for a given K_clus and a given parcellation number l). remote_path: path on the cluster where results will be stored.
pyhrf.sandbox.func_BMA_consensus_clustering.
compute_consensus_clusters_parallel
(K_clus, consensus_matrices, clustcount_matrices, totalcount_matrices, num_voxels, remote_mask_fn, clusters_consensi)¶Performs parcellation for a list of subjects
Directory structure:
subject:
- preprocessed_data –> GM+WM mask, functional data, normalised tissue masks
- t_maps –> T-maps previously computed with GLM (nipy, SALMA)
- parcellation –> Output
pyhrf.sandbox.make_parcellation.
make_mask
(mask, volume, mask_file)¶pyhrf.sandbox.make_parcellation.
make_parcellation
(subject, dest_dir='parcellation', roi_mask_file=None)¶Perform a functional parcellation from input fmri data
Return: parcellation file name (str)
Hierarchical Agglomerative Clustering
These routines perform hierarchical agglomerative clustering of input data. Currently, only Ward’s algorithm is implemented.
Authors : Vincent Michel, Bertrand Thirion, Alexandre Gramfort, Gael Varoquaux Modified: Aina Frau License: BSD 3 clause
pyhrf.sandbox.parcellation.
FWHM
(Y)¶pyhrf.sandbox.parcellation.
GLM_method
(name, data0, ncond, dt=0.5, time_length=25.0, ndelays=0)¶pyhrf.sandbox.parcellation.
Memory
(*args, **kwargs)¶pyhrf.sandbox.parcellation.
Ward
(n_clusters=2, memory=None, connectivity=None, copy=True, n_components=None, compute_full_tree='auto', dist_type='uward', cov_type='spherical', save_history=False)¶Bases: pyhrf.sandbox.parcellation.BaseEstimator
, pyhrf.sandbox.parcellation.ClusterMixin
Ward hierarchical clustering: constructs a tree and cuts it.
children_
¶array-like, shape = [n_nodes, 2] – List of the children of each node. Leaves of the tree do not appear.
labels_
¶array [n_samples] – cluster labels for each point
n_leaves_
¶int – Number of leaves in the hierarchical tree.
n_components_
¶int – The estimated number of connected components in the graph.
fit
(X, var=None, act=None, var_ini=None, act_ini=None)¶Fit the hierarchical clustering on the data
Parameters: X (array-like, shape = [n_samples, n_features]) – The samples, a.k.a. observations.
Returns: self
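A short, hedged usage sketch of the estimator documented above; X and connectivity are placeholders for an (n_samples, n_features) feature array and a sparse spatial adjacency matrix built elsewhere:

from pyhrf.sandbox.parcellation import Ward

# X: (n_samples, n_features) array; connectivity: sparse adjacency matrix (placeholders).
ward = Ward(n_clusters=10, connectivity=connectivity)
ward.fit(X)                      # build the hierarchical tree and cut it
parcel_labels = ward.labels_     # one cluster label per sample/position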
pyhrf.sandbox.parcellation.
WardAgglomeration
(n_clusters=2, memory=None, connectivity=None, copy=True, n_components=None, compute_full_tree='auto', dist_type='uward', cov_type='spherical', save_history=False)¶Bases: pyhrf.sandbox.parcellation.AgglomerationTransform
, pyhrf.sandbox.parcellation.Ward
Feature agglomeration based on Ward hierarchical clustering
children_
¶array-like, shape = [n_nodes, 2] – List of the children of each node. Leaves of the tree do not appear.
labels_
¶array [n_samples] – cluster labels for each point
n_leaves_
¶int – Number of leaves in the hierarchical tree.
fit
(X, y=None, **params)¶Fit the hierarchical clustering on the data
Parameters: X (array-like, shape = [n_samples, n_features]) – The data.
Returns: self
pyhrf.sandbox.parcellation.
align_parcellation
(p1, p2, mask=None)¶Align two parcellations p1 and p2: find the labelling of p2 that minimises the number of positions to remove in order to obtain equal partitions. Returns: p2 aligned to p1.
pyhrf.sandbox.parcellation.
assert_parcellation_equal
(p1, p2, mask=None, tol=0, tol_pos=None)¶pyhrf.sandbox.parcellation.
calculate_uncertainty
(dm, g)¶pyhrf.sandbox.parcellation.
compute_fwhm
(F, dt, a=0)¶pyhrf.sandbox.parcellation.
compute_hrf
(method, my_glm, can, ndelays, i)¶pyhrf.sandbox.parcellation.
compute_mixt_dist
(features, alphas, coord_row, coord_col, cluster_masks, moments, cov_type, res)¶Within one given territory: bi-Gaussian mixture model with known posterior weights. Estimation: the mixture weight is the mean of the posterior weights, the mean is estimated by the weighted sample mean, and the variance is estimated by the weighted sample variance.
pyhrf.sandbox.parcellation.
compute_mixt_dist_skgmm
(features, alphas, coord_row, coord_col, cluster_masks, moments, cov_type, res)¶pyhrf.sandbox.parcellation.
compute_uward_dist
(m_1, m_2, coord_row, coord_col, variance, actlev, res)¶Function computing the Ward distance (inertia).
pyhrf.sandbox.parcellation.
compute_uward_dist2
(m_1, features, alphas, coord_row, coord_col, cluster_masks, res)¶Function computing Ward distance: In this case we are using the model-based definition to compute the inertia
pyhrf.sandbox.parcellation.
feature_extraction
(fmri_data, method, dt=0.5, time_length=25.0, ncond=1)¶fmri_data (pyhrf.core.FmriData): single ROI fMRI data
pyhrf.sandbox.parcellation.
generate_features
(parcellation, act_labels, feat_levels, noise_var=0.0)¶Generate noisy features with different levels across positions depending on parcellation and activation clusters.
Returns: The simulated features.
Return type: np.array((n_positions, n_features), float)
pyhrf.sandbox.parcellation.
hc_get_heads
(parents, copy=True)¶Return the heads of the forest, as defined by parents.
Parameters: parents (array of integers) – the parent structure defining the forest (ensemble of trees); copy (boolean) – if copy is False, the input ‘parents’ array is modified in place.
pyhrf.sandbox.parcellation.
hrf_canonical_derivatives
(tr, oversampling=2.0, time_length=25.0)¶pyhrf.sandbox.parcellation.
informedGMM
(features, alphas)¶Given a set of features, parameters (mu, v, lambda) and alphas, update the parameters. WARNING: only works for nb features = 1.
pyhrf.sandbox.parcellation.
informedGMM_MV
(fm, am, cov_type='spherical')¶Given a set of multivariate features, parameters (mu, v, lambda), and alphas: fit a GMM where posterior weights are known (alphas)
pyhrf.sandbox.parcellation.
loglikelihood_computation
(fm, mu0, v0, mu1, v1, a)¶pyhrf.sandbox.parcellation.
mixtp_to_str
(mp)¶pyhrf.sandbox.parcellation.
norm2_bc
(a, b)¶broadcast the computation of ||a-b||^2 where size(a) = (m,n), size(b) = n
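A one-line numpy sketch of what this broadcastable squared distance amounts to (an assumption based on the shapes given above):

import numpy as np

def norm2_bc_sketch(a, b):
    # ||a - b||^2 per row of a (shape (m, n)), broadcasting b (shape (n,)).
    return np.sum((a - b) ** 2, axis=1)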
pyhrf.sandbox.parcellation.
parcellation_hemodynamics
(fmri_data, feature_extraction_method, parcellation_method, nb_clusters)¶Perform a hemodynamic-driven parcellation on masked fMRI data
Returns: parcellation array (numpy array of integers) with flattened spatial axes
Examples #TODO
pyhrf.sandbox.parcellation.
render_ward_tree
(tree, fig_fn, leave_colors=None)¶pyhrf.sandbox.parcellation.
represent_features
(features, labels, ampl, territories, t, fn)¶Generate chart with features represented.
Returns: the features represented in 2D
pyhrf.sandbox.parcellation.
spatial_ward
(features, graph, nb_clusters=0)¶pyhrf.sandbox.parcellation.
spatial_ward_sk
(features, graph, nb_clusters=0)¶pyhrf.sandbox.parcellation.
spatial_ward_with_uncertainty
(features, graph, variance, activation, var_ini=None, act_ini=None, nb_clusters=0, dist_type='uward', cov_type='spherical', save_history=False)¶Parcellation the given features with the spatial Ward algorithm, taking into account uncertainty on features (variance) and activation level:
pyhrf.sandbox.parcellation.
squared_error
(n, m)¶pyhrf.sandbox.parcellation.
ward_tree
(X, connectivity=None, n_components=None, copy=True, n_clusters=None, var=None, act=None, var_ini=None, act_ini=None, dist_type='uward', cov_type='spherical', save_history=False)¶Ward clustering based on a Feature matrix.
The inertia matrix uses a Heapq-based representation.
This is the structured version, which takes into account some topological structure between samples.
pyhrf.sandbox.parcellation.
ward_tree_save
(tree, output_dir, mask)¶pyhrf.sandbox.physio.
buildOrder1FiniteDiffMatrix_central
(size, dt)¶Returns a Toeplitz matrix for central differences, corrected on the first and last points (since there is no rf[-1] or rf[size] to average with).
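A plausible construction of such an operator, given as an illustration only (the exact endpoint correction used here may differ):

import numpy as np

def central_diff_matrix_sketch(size, dt):
    # First-order derivative operator: central differences inside,
    # one-sided differences at the two endpoints (one possible correction).
    d = np.zeros((size, size))
    for i in range(1, size - 1):
        d[i, i - 1], d[i, i + 1] = -0.5 / dt, 0.5 / dt
    d[0, 0], d[0, 1] = -1.0 / dt, 1.0 / dt        # forward difference at the start
    d[-1, -2], d[-1, -1] = -1.0 / dt, 1.0 / dt    # backward difference at the end
    return d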
pyhrf.sandbox.physio.
calc_linear_rfs
(simu_brf, simu_prf, phy_params, dt, normalized_rfs=True)¶Calculate ‘prf given brf’ and ‘brf given prf’ based on a linearization around the steady state of the physiological model, as described in Friston 2000.
- Input:
- simu_brf, simu_prf: brf and prf from the physiological simulation from which you wish to calculate the respective prf and brf. Assumed to be of size (1, hrf.size)
- phy_params
- normalized_rfs: set to True if simu_hrfs are normalized
- Output:
- calc_brf, calc_prf: np.arrays of shape (hrf.size, 1)
- q_linear, v_linear: q and v calculated according to the linearized model
Note: These calculations do not account for any rescaling between brf and prf. This means the input simu_brf, simu_prf should NOT be rescaled.
pyhrf.sandbox.physio.
create_asl_from_stim_induced
(bold_stim_induced_rescaled, perf_stim_induced, ctrl_tag_mat, dsf, perf_baseline, noise, drift=None, outliers=None)¶Downsample the stim_induced signal according to the downsampling factor ‘dsf’ and add noise and drift (nuisance signals), which have to be at the downsampled temporal resolution.
pyhrf.sandbox.physio.
create_bold_from_hbr_and_cbv
(physiological_params, hbr, cbv)¶Compute BOLD signal from HbR and blood volume variations obtained by a physiological model
pyhrf.sandbox.physio.
create_evoked_physio_signals
(physiological_params, paradigm, neural_efficacies, dt, integration_step=0.05)¶Generate evoked hemodynamics signals by integrating a physiological model.
Returns: All generated signals, indexed along the first axis.
Return type: np.array((nb_signals, nb_scans, nb_voxels), float)
pyhrf.sandbox.physio.
create_omega_prf
(primary_brf, dt, physiological_params)¶pyhrf.sandbox.physio.
create_physio_brf
(physiological_params, response_dt=0.5, response_duration=25.0, return_brf_q_v=False)¶Generate a BOLD response function by integrating a physiological model and setting its driving input signal to a single impulse.
pyhrf.sandbox.physio.
create_physio_prf
(physiological_params, response_dt=0.5, response_duration=25.0, return_prf_q_v=False)¶Generate a perfusion response function by setting the input driving signal of the given physiological model with a single impulse.
pyhrf.sandbox.physio.
create_tbg_neural_efficacies
(physiological_params, condition_defs, labels)¶Create neural efficacies from a truncated bi-Gaussian mixture.
TODO: settle how to relate brls and prls to neural efficacies
Returns: the generated neural efficacies
Return type: np.array(np.array((nb_cond, nb_vox), float))
pyhrf.sandbox.physio.
linear_rf_operator
(rf_size, phy_params, dt, calculating_brf=False)¶pyhrf.sandbox.physio.
phy_integrate_euler
(phy_params, tstep, stim, epsilon, Y0=None)¶Integrate the ODFs of the physiological model with the Euler method.
TODO: should the output signals be rescaled wrt their value at rest?
Returns: the integrated physiological signals, indexed along the first axis.
Return type: np.array((4, nb_steps), float)
pyhrf.sandbox.physio.
plot_calc_hrf
(hrf1_simu, hrf1_simu_name, hrf1_calc, hrf1_calc_name, hrf2_simu, hrf2_simu_name, dt)¶pyhrf.sandbox.physio.
rescale_bold_over_perf
(bold_stim_induced, perf_stim_induced, bold_perf_ratio=5.0)¶pyhrf.sandbox.physio.
run_calc_linear_rfs
()¶Choose physio parameters. Choose to generate simu_rfs from multiple or single stimulus.
TODO:
pyhrf.sandbox.physio.
simulate_asl_full_physio
(output_dir=None, noise_scenario='high_snr', spatial_size='tiny')¶Generate ASL data by integrating a physiological dynamical system.
dict (<item_label (str)> : <simulated_item (np.ndarray)>) -> a dictionary mapping names of simulated items to their values
TODO: use magnetization model to properly simulate final ASL signal
pyhrf.sandbox.physio.
simulate_asl_phylin_prf
(output_dir=None, noise_scenario='high_snr', spatial_size='tiny')¶Generate ASL data according to a LTI system, with canonical BRF and PRF = Omega.BRF.
dict (<item_label (str)> : <simulated_item (np.ndarray)>) -> a dictionary mapping names of simulated items to their values
pyhrf.sandbox.physio.
simulate_asl_physio_rfs
(output_dir=None, noise_scenario='high_snr', spatial_size='tiny', v_noise=None)¶Generate ASL data according to a LTI system, with PRF and BRF generated from a physiological model.
dict (<item_label (str)> : <simulated_item (np.ndarray)>) -> a dictionary mapping names of simulated items to their values
pyhrf.sandbox.physio_params.
buildOrder1FiniteDiffMatrix_central
(size, dt)¶Returns a Toeplitz matrix for central differences, corrected on the first and last points (since there is no rf[-1] or rf[size] to average with).
pyhrf.sandbox.physio_params.
calc_linear_rfs
(simu_brf, simu_prf, phy_params, dt, normalized_rfs=True)¶Calculate ‘prf given brf’ and ‘brf given prf’ based on a linearization around the steady state of the physiological model, as described in Friston 2000.
Note: These calculations do not account for any rescaling between brf and prf. This means the input simu_brf, simu_prf should NOT be rescaled.
pyhrf.sandbox.physio_params.
create_bold_from_hbr_and_cbv
(physiological_params, hbr, cbv)¶Compute BOLD signal from HbR and blood volume variations obtained by a physiological model
pyhrf.sandbox.physio_params.
create_evoked_physio_signals
(physiological_params, paradigm, neural_efficacies, dt, integration_step=0.05)¶Generate evoked hemodynamics signals by integrating a physiological model.
Returns: All generated signals, indexed along the first axis.
Return type: np.array((nb_signals, nb_scans, nb_voxels), float)
pyhrf.sandbox.physio_params.
create_k_parameters
(physiological_params)¶Create field strength dependent parameters k1, k2, k3
pyhrf.sandbox.physio_params.
create_omega_prf
(primary_brf, dt, phy_params)¶create prf from omega and brf
pyhrf.sandbox.physio_params.
create_physio_brf
(physiological_params, response_dt=0.5, response_duration=25.0, return_brf_q_v=False)¶Generate a BOLD response function by integrating a physiological model and setting its driving input signal to a single impulse.
pyhrf.sandbox.physio_params.
create_physio_prf
(physiological_params, response_dt=0.5, response_duration=25.0, return_prf_q_v=False)¶Generate a perfusion response function by setting the input driving signal of the given physiological model with a single impulse.
pyhrf.sandbox.physio_params.
create_tbg_neural_efficacies
(physiological_params, condition_defs, labels)¶Create neural efficacy from a truncated bi-Gaussian mixture.
pyhrf.sandbox.physio_params.
linear_rf_operator
(rf_size, phy_params, dt, calculating_brf=False)¶pyhrf.sandbox.physio_params.
phy_integrate_euler
(phy_params, tstep, stim, epsilon, Y0=None)¶Integrate the ODFs of the physiological model with the Euler method.
pyhrf.sandbox.stats.
GSVariable
(name, initialization, do_sampling=True, axes_names=None, axes_domains=None)¶check_against_truth
(atol, rtol, inaccuracy_handling='print')¶check_initialization_arg
(ia)¶enable_sampling
(flag=True)¶get_accuracy_against_truth
(abs_error, rel_error, fv, tv, atol, rtol)¶Return the accuracy of the estimate fv, compared to the true value tv.
get_custom_init
()¶Must return a numpy.ndarray. Consider initializing with a good guess so that sampling converges more quickly.
get_estim_value_for_check
()¶get_random_init
()¶Must return a random numpy.ndarray that will then be used as init value for sampling. For example, it can be a sample from the prior distribution. This function will also be used to test for the sensitivity to initialization.
get_true_value_for_check
()¶get_variable
(vname)¶get_variable_value
(vname)¶Short-hand to get variable among all those defined in the parent sampler
init_observables
()¶init_sampling
()¶reset
()¶sample
()¶Draw a sample conditionally to the current Gibbs Sampler state. Must return a numpy.ndarray.
Variables which have been registered in the parent GibbsSampler object can be retrieved via methods self.get_variable(var_name) and self.get_variable_value(var_name)
set_init_value
()¶Set the initial value of self.current_value, depending on the initialization scenario (random, custom, truth).
set_initialization
(init)¶set_outputs
(outputs, output_type='ndarray')¶
Return: None
set_true_value
(true_value)¶track_obs_quantity
(name, quantity, history_pace=None, axes_names=None, axes_domains=None)¶track_sampled_quantity
(name, quantity, history_pace=None, axes_names=None, axes_domains=None)¶update_observables
()¶Update quantities after the burnin period
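To make the GSVariable interface above concrete, here is a hypothetical minimal subclass; the variable name, the 'random' initialization scenario and the gamma draws are illustrative assumptions, not pyhrf code:

import numpy as np
from pyhrf.sandbox.stats import GSVariable

class NoiseVarSketch(GSVariable):
    """Toy scalar variable illustrating the GSVariable interface."""

    def __init__(self):
        GSVariable.__init__(self, 'noise_var', initialization='random')

    def get_random_init(self):
        # Random starting point, e.g. a draw from a vague positive prior.
        return np.random.gamma(1., 1., size=(1,))

    def sample(self):
        # Toy conditional draw; a real sampler would condition on the other
        # registered variables, e.g. via self.get_variable_value('nrl').
        return np.random.gamma(2., 1., size=(1,))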
pyhrf.sandbox.stats.
GibbsSampler
(sampled_variables, nb_its_max, obs_pace=1, burnin=0.3, sample_hist_pace=-1, obs_hist_pace=-1)¶check_against_truth
(default_atol=0.1, default_rtol=0.1, var_specific_atol=None, var_specific_rtol=None, inaccuracy_handling='print')¶get_outputs
(output_type='ndarray')¶output_type : ‘ndarray’ or ‘cuboid’
get_variable
(vname)¶get_variable_value
(vname)¶iterate_sampling
()¶reset
()¶run
()¶set_initialization
(vname, init)¶set_true_value
(vname, true_value)¶set_true_values
(true_values)¶set_variable
(name, var)¶set_variables
(var_dict)¶stop_criterion
(iteration)¶track_obs_quantity
(name, q, history_pace=None, axes_names=None, axes_domains=None)¶track_sampled_quantity
(name, q, history_pace=None, axes_names=None, axes_domains=None)¶pyhrf.sandbox.stats.
Trajectory
(variable, history_pace, history_start, max_iterations, init_iteration=None, axes_names=None, axes_domains=None)¶Keep track of a numpy array that is modified _inplace_ iteratively. TODO: when mature, this should be moved to pyhrf.ndarray and should replace pyhrf.jde.samplerbase.Trajectory.
get_last
()¶Return the last saved element
to_cuboid
()¶Pack the current trajectory in a xndarray
update
(iteration)¶Record the current variable value
pyhrf.stats.misc.
acorr
(x, maxlags=10, scale='var')¶pyhrf.stats.misc.
compute_T_Pvalue
(betas, stds_beta, mask_file, null_hyp=True)¶Compute the T-value statistic and p-value for all voxels, based on the estimates and their standard deviations (beta and std_beta). beta: shape (nb_vox, 1); std: shape (1). The null hypothesis is assumed if null_hyp is True.
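A hedged sketch of the underlying computation (a Gaussian null is assumed here for the p-values; the actual function may use a different null distribution and additionally applies the mask):

import numpy as np
from scipy.stats import norm

def t_and_p_sketch(betas, stds_beta):
    # betas: voxel-wise estimates; stds_beta: their standard deviations.
    tvalues = betas / stds_beta   # standardized statistic under beta = 0
    pvalues = norm.sf(tvalues)    # one-sided p-values (Gaussian null, an assumption)
    return tvalues, pvalues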
pyhrf.stats.misc.
compute_roc_labels
(mlabels, true_labels, dthres=0.005, lab_ca=1, lab_ci=0, false_pos=2, false_neg=3)¶pyhrf.stats.misc.
compute_roc_labels_scikit
(e_labels, true_labels)¶pyhrf.stats.misc.
cpt_ppm_a_apost
(means, variances, props, alpha=0.05)¶pyhrf.stats.misc.
cpt_ppm_a_mcmc
(samples, alpha=0.05)¶Compute a Posterior Probability Map (fixed alpha) from NRL MCMC samples. Expected shape of ‘samples’: (sample, voxel)
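A minimal Monte Carlo sketch of this PPM, assuming the (sample, voxel) shape stated above (an illustration, not necessarily the exact implementation):

import numpy as np

def ppm_a_mcmc_sketch(samples, alpha=0.05):
    # Voxel-wise P(nrl > alpha), estimated by the fraction of MCMC draws above alpha.
    return (samples > alpha).mean(axis=0)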
pyhrf.stats.misc.
cpt_ppm_a_norm
(mean, variance, alpha=0.0)¶Compute a Posterior Probability Map (fixed alpha) by assuming a Gaussian distribution.
Returns: ppm – Posterior Probability Map evaluated at alpha
Return type: array_like
pyhrf.stats.misc.
cpt_ppm_g_apost
(means, variances, props, gamma=0.0)¶Compute a Posterior Probability Map (fixed gamma) from posterior Gaussian mixture component estimates. Expected shape of ‘means’, ‘variances’ and ‘props’: (nb_classes, voxel)
pyhrf.stats.misc.
cpt_ppm_g_mcmc
(samples, gamma=0.0)¶Compute a Posterior Probability Map (fixed gamma) from NRL MCMC samples. Expected shape of ‘samples’: (sample, voxel)
pyhrf.stats.misc.
cpt_ppm_g_norm
(mean, variance, gamma=0.95)¶Compute a Posterior Probability Map (fixed gamma) by assuming a Gaussian distribution.
Returns: ppm – Posterior Probability Map corresponding to the upper tail probability gamma
Return type: ndarray or scalar
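Hedged sketches of the two Gaussian PPM variants above (cpt_ppm_a_norm and cpt_ppm_g_norm), written with scipy; the exact pyhrf implementation may differ:

import numpy as np
from scipy.stats import norm

def ppm_a_norm_sketch(mean, variance, alpha=0.0):
    # Voxel-wise P(X > alpha) for X ~ N(mean, variance).
    return norm.sf(alpha, loc=mean, scale=np.sqrt(variance))

def ppm_g_norm_sketch(mean, variance, gamma=0.95):
    # Voxel-wise threshold whose upper-tail probability is gamma under N(mean, variance).
    return norm.isf(gamma, loc=mean, scale=np.sqrt(variance))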
pyhrf.stats.misc.
cumFreq
(data, thres=None)¶pyhrf.stats.misc.
gm_cdf
(x, means, variances, props)¶Compute the cumulative distribution function of a Gaussian mixture, i.e. p(x < a) = sum_i props_i * Nc(a; mean_i, variance_i), where Nc is the Gaussian CDF.
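Illustrative formulas for these mixture quantities (gm_cdf here, gm_mean and gm_var listed just below), written as a small numpy/scipy sketch:

import numpy as np
from scipy.stats import norm

def gm_cdf_sketch(x, means, variances, props):
    # P(X <= x) for a Gaussian mixture: component CDFs weighted by props.
    return np.sum(props * norm.cdf(x, loc=means, scale=np.sqrt(variances)))

def gm_mean_sketch(means, variances, props):
    return np.sum(props * means)

def gm_var_sketch(means, variances, props):
    m = np.sum(props * means)
    return np.sum(props * (variances + means ** 2)) - m ** 2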
pyhrf.stats.misc.
gm_mean
(means, variances, props)¶pyhrf.stats.misc.
gm_var
(means, variances, props)¶pyhrf.stats.misc.
mark_wrong_labels
(labels, true_labels, lab_ca=1, lab_ci=0, false_pos=2, false_neg=3)¶pyhrf.stats.misc.
threshold_labels
(labels, thresh=None, act_class=1)¶Threshold input labels, which are assumed to be of shape (nb classes, nb conds, nb vox). If thresh is None, take the argmax over classes. Otherwise, apply the threshold to the labels of the activating class (act_class); suitable for the 2-class case only.
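A small sketch of the thresholding rule described above (shapes as stated; this is an illustration, not necessarily the exact implementation):

import numpy as np

def threshold_labels_sketch(labels, thresh=None, act_class=1):
    # labels: (nb_classes, nb_conds, nb_vox) posterior class probabilities.
    if thresh is None:
        return np.argmax(labels, axis=0)              # most probable class
    return (labels[act_class] > thresh).astype(int)   # 2-class thresholding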
pyhrf.stats.random.
BetaGenerator
(mean=0.5, var=0.1)¶Bases: pyhrf.stats.random.RandomGenerator
Class encapsulating the beta random generator of numpy
generate
(size)¶pyhrf.stats.random.
GammaGenerator
(mean=1.0, var=1.0)¶Bases: pyhrf.stats.random.RandomGenerator
Class encapsulating the gamma random generator of numpy
generate
(size)¶pyhrf.stats.random.
GaussianGenerator
(mean=0.0, var=1.0)¶Bases: pyhrf.stats.random.RandomGenerator
Class encapsulating the gaussian random generator of numpy
generate
(size)¶pyhrf.stats.random.
IndependentMixtureLaw
(states, generators)¶Class handling the generation of values following an independent mixture law. Requires the prior generator of label values.
generate
()¶Generate realisations of the mixture law.
pyhrf.stats.random.
LogNormalGenerator
(meanLogN=1.0, varLogN=1.0)¶Bases: pyhrf.stats.random.RandomGenerator
Class encapsulating the log normal generator of numpy
generate
(size)¶pyhrf.stats.random.
RandomGenerator
¶Abstract class to ensure the definition of the function generate.
generate
(size)¶pyhrf.stats.random.
UniformGenerator
(minV=0.0, maxV=1.0)¶Bases: pyhrf.stats.random.RandomGenerator
Class encapsulating the random generator
generate
(size)¶pyhrf.stats.random.
ZeroGenerator
¶Bases: pyhrf.stats.random.RandomGenerator
Class encapsulating the null (all-zero) distribution.
generate
(size)¶pyhrf.stats.random.
gm_sample
(means, variances, props, n=1)¶pyhrf.stats.random.
rand
(d0, d1, ..., dn)¶Random values in a given shape (re-export of numpy.random.rand; see the full description under pyhrf.jde.models above).
pyhrf.stats.random.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution (re-export of numpy.random.randn; see the full description under pyhrf.jde.jde_multi_sujets_alpha above).
pyhrf.stats.random.
rpnorm
(n, m, s)¶Random numbers from the positive normal distribution. rpnorm(n,m,s) is a vector of length n with random entries, generated from a positive normal distribution with mean m and standard deviation s.
Original matlab code from: (c) Vincent Mazet, 06/2005 Centre de Recherche en Automatique de Nancy, France vincent.mazet@cran.uhp-nancy.fr
Reference: V. Mazet, D. Brie, J. Idier, ‘Simulation of Positive Normal Variables using several Proposal Distributions’, IEEE Workshop Statistical Signal Processing 2005, july 17-20 2005, Bordeaux, France.
Adapted by Thomas VINCENT: thomas.vincent@cea.fr
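An equivalent way to draw such positive-normal samples with scipy (this uses scipy.stats.truncnorm rather than the proposal-based scheme of the reference above):

import numpy as np
from scipy.stats import truncnorm

def rpnorm_sketch(n, m, s):
    # n draws from N(m, s^2) truncated to [0, +inf); truncnorm expects bounds
    # expressed in standard-deviation units relative to the mean.
    a = (0.0 - m) / s
    return truncnorm.rvs(a, np.inf, loc=m, scale=s, size=n)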
pyhrf.stats.random.
truncRandn
(size, mu=0.0, sigma=1.0, a=0.0, b=inf)¶pyhrf.test.
rand
(d0, d1, ..., dn)¶Random values in a given shape (re-export of numpy.random.rand; see the full description under pyhrf.jde.models above).
pyhrf.test.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution (re-export of numpy.random.randn; see the full description under pyhrf.jde.jde_multi_sujets_alpha above).
pyhrf.test.analysertest.
BetaEstimESTest
(methodName='runTest')¶Bases: unittest.case.TestCase
test_obs_2Dfield_MAP
()¶Test estimation of beta with an observed 2D field. Partition function estimation method : extrapolation scheme. Use the MAP on p(beta|label).
test_obs_3Dfield_MAP
()¶Test estimation of beta with an observed field: a small 3D case. Partition function estimation method : extrapolation scheme. Use the MAP on p(beta|label).
test_obs_field_ML
()¶Test estimation of beta with an observed field: a small 2D case. Partition function estimation method : extrapolation scheme. Use the ML on p(label|beta). PF estimation: Onsager
pyhrf.test.boldsynthTest.
FieldFuncsTest
(methodName='runTest')¶Bases: unittest.case.TestCase
test_count_homo_cliques
()¶test_count_homo_cliques1
()¶test_count_homo_cliques2
()¶test_potts_gibbs
()¶test_swendsenwang
()¶pyhrf.test.boldsynthTest.
Mapper1DTest
(methodName='runTest')¶Bases: unittest.case.TestCase
test3D
()¶testIncompleteMapping
()¶testIrregularMapping
()¶pyhrf.test.boldsynthTest.
rand
(d0, d1, ..., dn)¶Random values in a given shape (re-export of numpy.random.rand; see the full description under pyhrf.jde.models above).
pyhrf.test.boldsynthTest.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive, int_like or int-convertible arguments are provided,
randn generates an array of shape (d0, d1, ..., dn)
, filled
with random floats sampled from a univariate “normal” (Gaussian)
distribution of mean 0 and variance 1 (if any of the d_i are
floats, they are first converted to integers by truncation). A single
float randomly sampled from the distribution is returned if no
argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
Returns: Z – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied.
Return type: ndarray or float
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.test.commandTest.
MiscCommandTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_gls_default
()¶test_gls_recursive
()¶test_gls_recursive_group
()¶test pyhrf_gls command in recursive mode with file groups specified by a regular expression
pyhrf.test.commandTest.
TreatmentCommandTest
(methodName='runTest')¶Bases: unittest.case.TestCase
makeQuietOutputs
(xmlFile)¶setDummyInputData
(xmlFile)¶setSimulationData
(xmlFile, simu_file)¶setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
testDetectEstimDefault
()¶testHrfEstim
()¶test_WNSGGMS
()¶test_WNSGGMS_surf_cmd
()¶test_buildcfg_contrasts
()¶test_buildcfg_jde_loc_vol_default
()¶test_buildcfg_jde_locav_surf_default
()¶test_buildcfg_jde_locav_vol_default
()¶pyhrf.test.graphtest.
GraphTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_bfs
()¶test_from_lattice1
()¶Test default behaviour of graph_from_lattice, in 2D
test_from_lattice2
()¶Test graph_from_lattice in 3D with another kernel mask
test_from_lattice_toro
()¶Test graph_from_lattice, 2D toroidal case
test_from_lattice_toro_huge
()¶Test graph_from_lattice, 2D toroidal case
test_from_mesh
()¶test_graph_is_sane
()¶test_parcels_to_graphs
()¶test_pyhrf_extract_cc_vol
()¶test_split_vol_cc_2D
()¶test_split_vol_cc_3D
()¶test_sub_graph
()¶pyhrf.test.iotest.
DataLoadTest
(methodName='runTest')¶Bases: unittest.case.TestCase
test_frmi_vol
()¶Test volumic data loading
test_paradigm_csv
()¶test_paradigm_csv2
()¶test_paradigm_csv3
()¶test_paradigm_csv4
()¶pyhrf.test.iotest.
FileHandlingTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_split4DVol
()¶test_split_ext
()¶pyhrf.test.iotest.
GiftiTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_load_fmri_surf_data
()¶Test surfacic data loading
test_read_default_real_data_tiny
()¶test_read_tex_gii_label
()¶test_write_tex_gii_2D_float
()¶test_write_tex_gii_float
()¶test_write_tex_gii_labels
()¶test_write_tex_gii_time_series
()¶pyhrf.test.iotest.
NiftiTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_process_history_extension
()¶pyhrf.test.iotest.
RxCopyTest
(methodName='runTest')¶Bases: unittest.case.TestCase
assert_file_exists
(fn, test_exists=True)¶setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_advanced
()¶test_basic
()¶test_callback
()¶test_dry
()¶test_duplicates_targets
()¶test_replacement
()¶test_with_subfolders
()¶pyhrf.test.iotest.
SPMIOTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_load_regnames_SPM12
()¶test_load_regnames_SPM5
()¶test_load_regnames_SPM8
()¶pyhrf.test.iotest.
xndarrayIOTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_save_nii_3D
()¶test_save_nii_4D
()¶test_save_nii_multi
()¶pyhrf.test.jdetest.
ASLPhysioTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_default_jde_small_simulation
()¶Test ASL Physio sampler on small simulation with small nb of iterations. Estimation accuracy is not tested.
pyhrf.test.jdetest.
ASLTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_default_jde_small_simulation
()¶Test ASL sampler on small simulation with small nb of iterations. Estimation accuracy is not tested.
test_simulation
()¶pyhrf.test.jdetest.
JDETest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
testDefaultWithOutputs
()¶test_parcellation
()¶test_surface_treatment
()¶pyhrf.test.jdetest.
MultiSessTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_default_jde_small_simulation
()¶Test JDE multi-sessions sampler on small simulation with small nb of iterations. Estimation accuracy is not tested.
pyhrf.test.jdetest.
PartitionFunctionTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
testExtrapolation2C
()¶pyhrf.test.jdetest.
test_suite
()¶pyhrf.test.statsTest.
PPMTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
test_gm_cdf
()¶test_gm_sample_active
()¶test_gm_sample_half
()¶test_gm_sample_inactive
()¶test_ppm_a_mcmc
()¶test_ppm_a_norm
()¶test_ppm_g_apost
()¶test_ppm_g_mcmc
()¶test_ppm_g_norm
()¶pyhrf.test.test.
rand
(d0, d1, ..., dn)¶Random values in a given shape.
Create an array of the given shape and populate it with
random samples from a uniform distribution
over [0, 1)
.
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
Returns: out – Random values.
Return type: ndarray, shape (d0, d1, ..., dn)
See also
random()
Notes
This is a convenience function. If you want an interface that takes a shape-tuple as the first argument, refer to np.random.random_sample .
Examples
>>> np.random.rand(3,2)
array([[ 0.14022471, 0.96360618], #random
[ 0.37601032, 0.25528411], #random
[ 0.49313049, 0.94909878]]) #random
pyhrf.test.test.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive, int_like or int-convertible arguments are provided,
randn generates an array of shape (d0, d1, ..., dn)
, filled
with random floats sampled from a univariate “normal” (Gaussian)
distribution of mean 0 and variance 1 (if any of the d_i are
floats, they are first converted to integers by truncation). A single
float randomly sampled from the distribution is returned if no
argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given, a single Python float is returned.
Returns: Z – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied.
Return type: ndarray or float
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.test.test_glm.
NipyGLMTest
(methodName='runTest')¶Bases: unittest.case.TestCase
makeQuietOutputs
(xmlFile)¶setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_command_line
()¶test_fir_glm
()¶test_glm_contrasts
()¶test_glm_default_real_data
()¶test_glm_with_files
()¶pyhrf.test.test_glm.
test_suite
()¶pyhrf.test.test_jde_multi_subj.
MultiSubjTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_quick
()¶Test running of JDE multi subject (do not test result accuracy)
pyhrf.test.test_jde_multi_subj.
simulate_subjects
(output_dir, snr_scenario='high_snr', spatial_size='tiny', hrf_group=None, nb_subjects=15, vhrf=0.1, vhrf_group=0.1)¶Simulate data for multiple subjects (15 subjects by default)
pyhrf.test.test_jde_vem_asl.
VEMASLTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_jdevemanalyser
()¶Test BOLD VEM sampler on small simulation with small nb of iterations. Estimation accuracy is not tested.
pyhrf.test.test_jde_vem_bold.
VEMBOLDTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_jdevemanalyser
()¶Test BOLD VEM sampler on small simulation with small nb of iterations. Estimation accuracy is not tested.
test_vem_bold_constrained
(**kwargs)¶Test BOLD VEM constraint function. Estimation accuracy is not tested.
test_vem_bold_constrained_python
(**kwargs)¶Test BOLD VEM constraint function. Estimation accuracy is not tested.
pyhrf.test.test_jde_vem_tools.
VEMToolsTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_PolyMat
()¶test_buildFiniteDiffMatrix
()¶test_computeFit
()¶test_compute_mat_X2
()¶test_create_conditions
()¶test_create_neighbours
()¶test_entropyA
()¶test_entropyH
()¶test_entropyZ
()¶test_expectA
()¶test_expectH
()¶test_expectZ
()¶test_free_energy
()¶Test of vem tool to compute free energy
test_gradient
()¶test_matrix
()¶test_max_L
()¶test_max_beta
()¶test_max_mu_sigma
()¶test_max_sigmaH
()¶test_max_sigmaH_prior
()¶test_max_sigma_noise
()¶test_maximum
()¶test_normpdf
()¶test_polyFit
()¶pyhrf.test.test_jde_vem_tools_UtilsC.
VEMToolsTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_expectA
()¶test_expectH
()¶test_expectZ
()¶test_max_L
()¶test_max_sigma_noise
()¶pyhrf.test.test_jde_vem_tools_asl.
VEMToolsTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_PolyMat
()¶test_buildFiniteDiffMatrix
()¶test_computeFit
()¶test_compute_mat_X2
()¶test_entropyA
()¶test_entropyH
()¶test_entropyZ
()¶test_matrix
()¶test_max_sigmaH
()¶test_max_sigmaH_prior
()¶test_maximum
()¶test_normpdf
()¶test_polyFit
()¶pyhrf.test.test_ndarray.
TestHtml
(methodName='runTest')¶Bases: unittest.case.TestCase
assert_html_equal
(html, expected)¶setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_plot
()¶test_table_header
()¶test_txt_1d_col_axes_only
()¶test_txt_1d_row_axes_only
()¶test_txt_tooltip
()¶pyhrf.test.test_ndarray.
xndarrayTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_cartesian_eval
()¶Test the multiple evaluations of a function that returns a xndarray, over the cartesian products of given arguments.
test_combine_domains
()¶test_equality
()¶test_expansion
()¶test_explode
()¶test_fill
()¶TODO
test_flatten_and_expand
()¶test_init
()¶test_merge
()¶TODO !!!
test_operations
()¶test_save_as_gii
()¶test_save_as_nii
()¶test_set_orientation
()¶test_split
()¶test_squeeze
()¶test_stack
()¶test_sub_cuboid
()¶test_sub_cuboid_with_float_domain
()¶test_to_latex_1d
()¶test_to_latex_3d
()¶test_to_latex_3d_col_align
()¶test_to_latex_3d_hide_name_style
()¶test_to_latex_3d_inner_axes
()¶test_to_latex_3d_join_style
()¶test_tree_to_xndarray
()¶test_unstack_2D
()¶test_unstack_empty_inner_axes
()¶test_xmapping
()¶test_xmapping_inconsistent_domain
()¶test_xmapping_inconsistent_mapping_value
()¶pyhrf.test.test_paradigm.
ParadigmTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_merge_onsets
()¶test_onsets_loc_av
()¶test_to_nipy_Block
()¶Test event-related paradigm
test_to_nipy_Block_2sess
()¶Test event-related paradigm
test_to_nipy_ER
()¶Test event-related paradigm
test_to_nipy_ER_2sess
()¶Test event-related paradigm
test_to_spm_mat_1st_level
()¶pyhrf.test.test_parallel.
ParallelTest
(methodName='runTest')¶Bases: unittest.case.TestCase
test_remote_map_local
()¶test_remote_map_local_cartesian_args
()¶test_remote_map_serial
()¶pyhrf.test.test_parallel.
foo
(a, b)¶pyhrf.test.test_parallel.
foo_raise
(a, b)¶pyhrf.test.test_parcellation.
CmdParcellationTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_voronoi_with_seeds
()¶test_ward_spatial_cmd
(**kwargs)¶test_ward_spatial_real_data
(**kwargs)¶pyhrf.test.test_parcellation.
MeasureTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
test_intersection_matrix
()¶test_parcellation_distance
(**kwargs)¶pyhrf.test.test_plot.
PlotCommandTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_plot_func_slice_func_only
()¶test_plot_func_slice_func_only_multiple_slices
()¶test_plot_func_slice_func_roi
()¶test_plot_func_slice_func_roi_anat
()¶test_plot_func_slice_func_roi_anat_multiple_slices
()¶pyhrf.test.test_rfir.
RFIRTest
(methodName='runTest')¶Bases: unittest.case.TestCase
Test the Regularized FIR (RFIR)-based methods implemented in pyhrf.rfir
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_rfir_on_small_simulation
()¶Check that pyhrf.rfir runs properly and that the returned outputs contain the expected items
pyhrf.test.test_sandbox_physio.
SimulationTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_create_evoked_physio_signal
()¶test_create_physio_brf
()¶test_create_physio_prf
()¶test_create_tbg_neural_efficacies
()¶Test the generation of neural efficacies from a truncated bi-Gaussian mixture
test_phy_integrate_euler
()¶test_simulate_asl_full_physio
()¶test_simulate_asl_full_physio_outputs
()¶test_simulate_asl_physio_rfs
()¶pyhrf.test.test_treatment.
CmdInputTest
(methodName='runTest')¶Bases: unittest.case.TestCase
Test extraction of information from the command line to create an FmriTreatment
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_spm12_option_parse
()¶Test parsing of option “-s SPM.mat” (SPM12)
test_spm5_option_parse
()¶Test parsing of option “-s SPM.mat” (SPM5)
test_spm8_option_parse
()¶Test parsing of option “-s SPM.mat” (SPM8)
pyhrf.test.test_treatment.
TreatmentTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_default_jde_cmd_parallel_local
()¶test_default_treatment
()¶test_default_treatment_parallel_LAN
()¶test_default_treatment_parallel_cluster
()¶test_default_treatment_parallel_local
()¶test_jde_estim_from_treatment_pck
()¶test_parallel_local
()¶test_pickle_treatment
()¶test_remote_dir_writable
()¶test_sub_treatment
()¶pyhrf.test.test_xml.
A
(p=1, c='a')¶Bases: pyhrf.xmlio.Initable
pyhrf.test.test_xml.
B
(obj_t=array([5]))¶Bases: pyhrf.xmlio.Initable
from_stuff
(a=2, b=5)¶pyhrf.test.test_xml.
BaseTest
(methodName='runTest')¶Bases: unittest.case.TestCase
testNumpy
()¶test_basic_types
()¶test_bool
()¶test_list_of_int
()¶test_list_of_misc
()¶test_list_of_str
()¶test_ordered_dict
()¶test_tuple_of_misc
()¶pyhrf.test.test_xml.
C
¶Bases: pyhrf.xmlio.Initable
pyhrf.test.test_xml.
ChildClass
(p_child=2)¶Bases: pyhrf.xmlio.Initable
pyhrf.test.test_xml.
D
(p=2)¶Bases: pyhrf.xmlio.Initable
pyhrf.test.test_xml.
InitableTest
(methodName='runTest')¶Bases: unittest.case.TestCase
test_JDEMCMCAnalyzerXML
()¶test_JDEMCMCAnalyzer_Uinode_bijection
()¶test_TreatmentXML
()¶test_bijection_from_classmethod_init
()¶test_bijection_from_init
()¶test_bijection_from_init_no_arg
()¶test_classmethod_init
()¶test_init
()¶test_pickle_classmethod
()¶test_xml_from_classmethod_init
()¶test_xml_from_init
()¶pyhrf.test.test_xml.
T
(param_a=1)¶Bases: pyhrf.xmlio.Initable
from_param_c
(param_c=array([56]))¶pyhrf.test.test_xml.
TestXML
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_simple_bijection
()¶pyhrf.test.test_xml.
TopClass
(p_top='1')¶Bases: pyhrf.xmlio.Initable
pyhrf.test.test_xml.
XMLableTest
(methodName='runTest')¶Bases: unittest.case.TestCase
testDynamicParamsHierachic
()¶testDynamicParamsSingleClass
()¶test_set_init_param
()¶pyhrf.test.test_xml.
create_t
()¶pyhrf.test.toolsTest.
CachedEvalTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_code_digest
()¶test_simple
()¶test_simple_args
()¶test_slow_func
()¶pyhrf.test.toolsTest.
CartesianTest
(methodName='runTest')¶Bases: unittest.case.TestCase
testCartesianBasic
()¶test_cartesian_apply
()¶test_cartesian_apply_parallel
()¶pyhrf.test.toolsTest.
CropTest
(methodName='runTest')¶Bases: unittest.case.TestCase
testBasic
()¶pyhrf.test.toolsTest.
DiagBlockTest
(methodName='runTest')¶Bases: unittest.case.TestCase
testAll2D
()¶testFrom1D
()¶testFromNdarray
()¶testRepFrom1D
()¶testRepFrom2D
()¶testRepFromBlocks
()¶pyhrf.test.toolsTest.
DictToStringTest
(methodName='runTest')¶Bases: unittest.case.TestCase
testBasic
()¶testOnHierachicDict
()¶testOnNumpyArray
()¶testOnSpmMat
()¶pyhrf.test.toolsTest.
GeometryTest
(methodName='runTest')¶Bases: unittest.case.TestCase
test_convex_hull
()¶test_distance
()¶pyhrf.test.toolsTest.
MiscTest
(methodName='runTest')¶Bases: unittest.case.TestCase
test_decorator_do_if_file_exist
()¶test_decorator_do_if_file_exist2
()¶test_decorator_do_if_file_exist_force
()¶pyhrf.test.toolsTest.
PeelVolumeTest
(methodName='runTest')¶Bases: unittest.case.TestCase
testPeel
()¶pyhrf.test.toolsTest.
PipelineTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
testBadDepTreeInit
()¶testGoodDepTreeInit
()¶testRepr
()¶test_cached
()¶test_func_default_args
()¶test_multiple_output_values
()¶pyhrf.test.toolsTest.
ResampleTest
(methodName='runTest')¶Bases: unittest.case.TestCase
testLargerTargetGrid
()¶testResampleToGrid
()¶pyhrf.test.toolsTest.
TableStringTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
test1Darray
()¶test2Darray
()¶test2Darray_latex
()¶test3Darray
()¶test4Darray
()¶pyhrf.test.toolsTest.
computeB
(a, e)¶pyhrf.test.toolsTest.
computeC
(a)¶pyhrf.test.toolsTest.
computeD
(f, b, c)¶pyhrf.test.toolsTest.
computeF
(g, e)¶pyhrf.test.toolsTest.
computeJ
(i, l)¶pyhrf.test.toolsTest.
computeK
(j)¶pyhrf.test.toolsTest.
computeL
(k)¶pyhrf.test.toolsTest.
foo
(a, b, c=1, d=2)¶pyhrf.test.toolsTest.
foo_a
(c=1)¶pyhrf.test.toolsTest.
foo_default_arg
(a, d=1)¶pyhrf.test.toolsTest.
foo_func
(a, b)¶pyhrf.test.toolsTest.
foo_multiple_returns
(e)¶pyhrf.test.toolsTest.
slow_func
(a, b)¶pyhrf.tools.aexpression.
ArithmeticExpression
(expression, **variables)¶Bases: object
Mathematical expression evaluator class. Set the expression member, define the functions and variables, then call evaluate() to get the result of the mathematical expression given as a string.
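A minimal usage sketch, assuming the constructor registers the keyword variables and that evaluate() returns the numeric result directly (the exact return type is not stated here):
>>> from pyhrf.tools.aexpression import ArithmeticExpression
>>> expr = ArithmeticExpression('a * x + b', a=2., b=1., x=3.)
>>> expr.evaluate()
7.0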
addDefaultFunctions
()¶Add the following Python functions to be used in a mathematical expression: acos asin atan atan2 ceil cos cosh degrees exp fabs floor fmod frexp hypot ldexp log log10 modf pow radians sin sinh sqrt tan tanh
addDefaultVariables
()¶Add e and pi to the list of defined variables.
call_if_func
(x)¶check
()¶evaluate
()¶Evaluate the mathematical expression given as a string in the expression member variable.
functions
= None¶Dictionary of functions that can be used in the expression.
getFunctionNames
()¶Return a List of defined function names in sorted order.
getVariableNames
()¶Return a List of defined variables names in sorted order.
setVariable
(name, value)¶Define the value of a variable defined by name
variables
= None¶pyhrf.tools.aexpression.
ArithmeticExpressionNameError
¶Bases: exceptions.Exception
pyhrf.tools.aexpression.
ArithmeticExpressionSyntaxError
¶Bases: exceptions.Exception
pyhrf.tools.backports.
OrderedDict
(*args, **kwds)¶Bases: dict
Dictionary that remembers insertion order
clear
() → None. Remove all items from od.¶copy
() → a shallow copy of od¶fromkeys
(S[, v]) → New ordered dictionary with keys from S¶and values equal to v (which defaults to None).
items
() → list of (key, value) pairs in od¶iteritems
()¶od.iteritems -> an iterator over the (key, value) items in od
iterkeys
() → an iterator over the keys in od¶itervalues
()¶od.itervalues -> an iterator over the values in od
keys
() → list of keys in od¶pop
(k[, d]) → v, remove specified key and return the corresponding value.¶If key is not found, d is returned if given, otherwise KeyError is raised.
popitem
() → (k, v), return and remove a (key, value) pair.¶Pairs are returned in LIFO order if last is true or FIFO order if false.
setdefault
(k[, d]) → od.get(k,d), also set od[k]=d if k not in od¶update
(E, **F) → None. Update od from dict/iterable E and F.¶If E is a dict instance, does: for k in E: od[k] = E[k] If E has a .keys() method, does: for k in E.keys(): od[k] = E[k] Or if E is an iterable of items, does: for k, v in E: od[k] = v In either case, this is followed by: for k, v in F.items(): od[k] = v
values
() → list of values in od¶viewitems
() → a set-like object providing a view on od's items¶viewkeys
() → a set-like object providing a view on od's keys¶viewvalues
() → an object providing a view on od's values¶This module implements a function to detect the number of CPUs available to the Python process.
This is licensed under the CC-BY-SA 3.0 and written by Bakuriu (https://stackoverflow.com/users/510937/bakuriu), ohspite (https://stackoverflow.com/users/891129/ohspite), and Philipp Hagemeister (https://stackoverflow.com/users/35070/phihag). See https://stackoverflow.com/a/1006301
pyhrf.tools.cpus.
available_cpu_count
()¶Number of available virtual or physical CPUs on this system, i.e. user/real as output by time(1) when called with an optimally scaling userspace-only program
This package provides a means to print colored messages to a standard terminal when color is available; otherwise messages are printed in black and white. If stdout is redirected to a file or piped to another program, the output falls back to black and white to avoid issues with the escape characters that define colors in terminals. Note that all messages are printed to stdout.
To use these functionalities, use the 'msg' instance. Here are some typical uses:
msg.info('something cool happened'); msg.error('too bad, an error'); msg.warning('something strange but not fatal'); msg.write_list(('no color', ('color in red', 'red'))); msg.write('simple colored write function'); msg.string('string to colored string')
pyhrf.tools.message.
MessageColor
¶Bases: object
error
(msg)¶haveColor
()¶info
(msg)¶string
(msg, color='back')¶warning
(msg)¶write
(msg, color='back')¶write_list
(msg_list)¶pyhrf.tools.misc.
AnsiColorizer
¶Format strings with an ANSI escape sequence to encode color
BEGINC
= '\x1b['¶COLORS
= {'blue': '94', 'green': '92', 'purple': '95', 'red': '91', 'yellow': '93'}¶ENDC
= '\x1b[0m'¶disable
()¶enable
()¶no_tty_check
()¶tty_check
()¶pyhrf.tools.misc.
Extract_TTP_whM_from_group
(hrfs_pck_file, dt, model, Path_data, acq)¶Extract TTP and whM from a group of HRFs whose values are saved in a .pck file (size nb_subjects * nb_coeff_hrf)
pyhrf.tools.misc.
Extract_TTP_whM_hrf
(hrf, dt)¶Extract TTP and whM from an hrf
pyhrf.tools.misc.
PPMcalculus_jde
(threshold_value, apost_mean_activ_fn, apost_var_activ_fn, apost_mean_inactiv_fn, apost_var_inactiv_fn, labels_activ_fn, labels_inactiv_fn, nrls_fn, mask_file, null_hyp=True)¶Function to calculate the probability that the NRL in voxel j, condition m, exceeds a given threshold_value. The computation is done for all voxels, and the T-value is computed according to the null hypothesis.
pyhrf.tools.misc.
Pipeline
(quantities)¶THE_ROOT
= 0¶add_root
(label)¶checkGraph
()¶Check the validity of the built graph (acyclicity, uniqueness and absence of short-circuits)
detectCyclity
(viewedNodes)¶detectShortCircuit
(curRoot, curDepth, depths)¶Recursive method which detects and corrects short-circuits
get_func
(f)¶get_value
(label)¶Return the value associated with ‘label’
get_values
()¶Return all computed values. Perform a full update if not done yet.
init_dependencies
(quantities)¶removeShortCircuits
(label, depths)¶reportChange
(rootLabel)¶Trigger update of the sub graph starting at the given root
reprAllDeps
()¶Build a string representing the whole graph: a concatenation of the representations of all nodes (see reprDep)
reprDep
(label)¶Build a string representing all dependencies and dependers of the variable label. The returned string is in the form
label
depee1 <-
depee2 <-
-> deper1
-> deper2
resolve
()¶save_graph_plot
(image_filename, images=None)¶setDepths
(label, depths, curDepth)¶update_all
()¶update_quantity
(label, updated)¶update_subgraph
(root)¶pyhrf.tools.misc.
add_prefix
(fn, prefix)¶Add a prefix at the beginning of a file name.
>>> add_prefix('./my_file.txt', 'my_prefix_')
'./my_prefix_my_file.txt'
pyhrf.tools.misc.
add_suffix
(fn, suffix)¶Add a suffix before file extension.
>>> add_suffix('./my_file.txt', '_my_suffix')
'./my_file_my_suffix.txt'
pyhrf.tools.misc.
apply_to_leaves
(tree, func, funcArgs=None, funcKwargs=None)¶Apply function ‘func’ to all leaves in given ‘tree’ and return a new tree.
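For illustration, a hedged sketch of the expected behaviour on a small nested dict (the tree is assumed here to be a plain dict of dicts):
>>> from pyhrf.tools.misc import apply_to_leaves
>>> apply_to_leaves({'a': {'b': 2}}, lambda x: x ** 2)
{'a': {'b': 4}}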
pyhrf.tools.misc.
array_summary
(a, precision=4)¶pyhrf.tools.misc.
assert_file_exists
(fn)¶pyhrf.tools.misc.
assert_path_not_in_src
(p)¶pyhrf.tools.misc.
attrs_to_string
(attrs)¶pyhrf.tools.misc.
buildPolyMat
(paramLFD, n, dt)¶pyhrf.tools.misc.
cache_exists
(func, args=None, prefix=None, path='./', digest_code=False)¶pyhrf.tools.misc.
cache_filename
(func, args=None, prefix=None, path='./', digest_code=False)¶pyhrf.tools.misc.
cached_eval
(func, args=None, new=False, save=True, prefix=None, path='./', return_file=False, digest_code=False, gzip_mode='cmd')¶pyhrf.tools.misc.
calc_nc2D
(a, b)¶pyhrf.tools.misc.
cartesian
(*sequences)¶Generate the “cartesian product” of all ‘sequences’. Each member of the product is a list containing an element taken from each original sequence.
Note: equivalent to itertools.product, which is at least 2x faster !!
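A hedged example of what the product looks like, assuming members are yielded in the same order as itertools.product:
>>> from pyhrf.tools.misc import cartesian
>>> list(cartesian([1, 2], ['a', 'b']))
[[1, 'a'], [1, 'b'], [2, 'a'], [2, 'b']]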
pyhrf.tools.misc.
cartesian_apply
(varying_args, func, fixed_args=None, nb_parallel_procs=1, joblib_verbose=0)¶Apply function func iteratively on the cartesian product of varying_args with fixed args fixed_args. Produce a tree (nested dicts) mapping arg values to the corresponding evaluation of function func
Returns: nested dicts – each node is an argument value from varying_args and each leaf is the result of the evaluation of the function. The order of the tree levels corresponds to the order in the input OrderedDict of varying arguments.
Return type: tree
Examples
>>> from pyhrf.tools import cartesian_apply
>>> from pyhrf.tools.backports import OrderedDict
>>> def foo(a,b,c): return a + b + c
>>> v_args = OrderedDict( [('a',[0,1]), ('b',[1,2])] )
>>> fixed_args = {'c': 3}
>>> cartesian_apply(v_args, foo, fixed_args) == { 0 : { 1:4, 2:5}, 1 : { 1:5, 2:6} }
True
pyhrf.tools.misc.
cartesian_combine_args
(varying_args, fixed_args=None)¶Construct the cartesian product of varying_args and append fixed_args to it.
Examples
>>> from pyhrf.tools import cartesian_combine_args
>>> vargs = {'my_arg1' : ['a','b','c'],'my_arg2' : [2, 5, 10],}
>>> fargs = { 'my_arg3' : 'fixed_value' }
>>> res = cartesian_combine_args(vargs, fargs)
>>> res == [{'my_arg1': 'a', 'my_arg2': 2, 'my_arg3': 'fixed_value'},
... {'my_arg1': 'b', 'my_arg2': 2, 'my_arg3': 'fixed_value'},
... {'my_arg1': 'c', 'my_arg2': 2, 'my_arg3': 'fixed_value'},
... {'my_arg1': 'a', 'my_arg2': 5, 'my_arg3': 'fixed_value'},
... {'my_arg1': 'b', 'my_arg2': 5, 'my_arg3': 'fixed_value'},
... {'my_arg1': 'c', 'my_arg2': 5, 'my_arg3': 'fixed_value'},
... {'my_arg1': 'a', 'my_arg2': 10, 'my_arg3': 'fixed_value'},
... {'my_arg1': 'b', 'my_arg2': 10, 'my_arg3': 'fixed_value'},
... {'my_arg1': 'c', 'my_arg2': 10, 'my_arg3': 'fixed_value'}]
True
pyhrf.tools.misc.
cartesian_eval
(func, varargs, fixedargs=None)¶pyhrf.tools.misc.
cartesian_params
(**kwargs)¶pyhrf.tools.misc.
cartesian_test
()¶pyhrf.tools.misc.
check_files_series
(fseries, verbose=False)¶pyhrf.tools.misc.
closestsorted
(a, val)¶pyhrf.tools.misc.
condense_series
(numbers)¶pyhrf.tools.misc.
convex_hull_mask
(mask)¶Compute the convex hull of the positions defined by the given binary mask
Parameters: mask – binary mask of positions to build the convex hull from
Returns: a numpy.ndarray binary mask of positions within the convex hull
pyhrf.tools.misc.
crop_array
(a, m=None, extension=0)¶Return a sub-array where as many zeros as possible are discarded. The bounding box of the mask is increased by 'extension'.
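A small sketch of the expected effect (only the shape is shown, to stay independent of the exact array repr):
>>> import numpy as np
>>> from pyhrf.tools.misc import crop_array
>>> a = np.zeros((5, 5))
>>> a[2, 3] = 1.
>>> crop_array(a).shape
(1, 1)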
pyhrf.tools.misc.
cuboidPrinter
(c)¶pyhrf.tools.misc.
describeRois
(roiMask)¶pyhrf.tools.misc.
diagBlock
(mats, rep=0)¶Construct a diagonal block matrix from blocks which can be 1D or 2D arrays. 1D arrays are taken as column vectors. If 'rep' is a non-null positive number then blocks are diagonally repeated 'rep' times.
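An illustrative sketch (the resulting shape is the sum of the block shapes; exact values are not shown here):
>>> import numpy as np
>>> from pyhrf.tools.misc import diagBlock
>>> diagBlock([np.eye(2), np.ones((3, 3))]).shape
(5, 5)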
pyhrf.tools.misc.
distance
(c1, c2, coord_system=None)¶pyhrf.tools.misc.
do_if_nonexistent_file
(*dargs, **kwargs)¶pyhrf.tools.misc.
extractRoiMask
(nmask, roiId)¶pyhrf.tools.misc.
extract_file_series
(files)¶Group all file names sharing a common prefix followed by a number, i.e. <prefix><number><extension>. Return a dictionary with two levels (<tag>, <extension>), mapped to all corresponding series indexes found.
pyhrf.tools.misc.
foo
(*args, **kwargs)¶pyhrf.tools.misc.
format_duration
(dt)¶pyhrf.tools.misc.
format_serie
(istart, iend)¶pyhrf.tools.misc.
gaussian_blur
(a, shape)¶pyhrf.tools.misc.
gaussian_kernel
(shape)¶Returns a normalized ND gauss kernel array for convolutions
pyhrf.tools.misc.
get_2Dtable_string
(val, rownames, colnames, precision=4, col_sep='|', line_end='', line_start='', outline_char=None)¶Return a nice tabular string representation of a 2D numeric array #TODO : make colnames and rownames optional
pyhrf.tools.misc.
get_cache_filename
(args, path='./', prefix=None, gz=True)¶pyhrf.tools.misc.
get_leaf
(element, branch)¶Return the nested leaf element corresponding to all dictionary keys in branch from element
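A hedged sketch of the expected behaviour on a small nested dict:
>>> from pyhrf.tools.misc import get_leaf
>>> get_leaf({'a': {'b': 42}}, ['a', 'b'])
42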
pyhrf.tools.misc.
group_file_series
(series, group_rules=None)¶pyhrf.tools.misc.
has_ext
(fn, ext)¶pyhrf.tools.misc.
hash_func_input
(func, args, digest_code)¶pyhrf.tools.misc.
html_body
(s)¶pyhrf.tools.misc.
html_cell
(s, cell_type='d', attrs=None)¶pyhrf.tools.misc.
html_div
(s, attrs=None)¶pyhrf.tools.misc.
html_doc
(s)¶pyhrf.tools.misc.
html_head
(s)¶pyhrf.tools.misc.
html_img
(fn, attrs=None)¶pyhrf.tools.misc.
html_list_to_row
(l, cell_types, attrs)¶pyhrf.tools.misc.
html_row
(s)¶pyhrf.tools.misc.
html_style
(s)¶pyhrf.tools.misc.
html_table
(s, border=None)¶pyhrf.tools.misc.
icartesian_combine_args
(varying_args, fixed_args=None)¶Same as cartesian_combine_args but return an iterator over the list of argument combinations
pyhrf.tools.misc.
inspect_default_args
(args, defaults)¶pyhrf.tools.misc.
is_importable
(module_name, func_name=None)¶Return True if given module_name (str) is importable
pyhrf.tools.misc.
map_dict
(func, d)¶pyhrf.tools.misc.
montecarlo
(datagen, festim, nbit=None)¶Perform a Monte Carlo loop with data generator 'datagen' and estimation function 'festim'. 'datagen' has to be iterable. 'festim' must return an object on which the ** and + operators can be applied. If 'nbit' is provided, it is used as the maximum number of iterations; otherwise the loop runs until datagen stops.
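A hedged usage sketch; the generator and estimator below are made up for illustration, and the returned summary (assumed to be mean- and variance-like accumulations of the estimates) is not asserted:
>>> import numpy as np
>>> from pyhrf.tools.misc import montecarlo
>>> datagen = (np.random.randn(100) + 5. for _ in range(200))  # hypothetical data generator
>>> result = montecarlo(datagen, np.mean, nbit=200)            # estimate the mean on each draw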
pyhrf.tools.misc.
my_func
(**kwargs)¶pyhrf.tools.misc.
nc2DGrid
(maxSize)¶pyhrf.tools.misc.
non_existent_canditate
(f, start_idx=1)¶pyhrf.tools.misc.
non_existent_file
(f)¶pyhrf.tools.misc.
now
()¶pyhrf.tools.misc.
peelVolume3D
(volume, backgroundLabel=0)¶pyhrf.tools.misc.
polyError
(expression, message)¶Bases: exceptions.Exception
pyhrf.tools.misc.
polyFit
(signal, tr=1.0, order=5)¶Polynomial fit of signals. 'signal' is a 2D matrix with the first axis being time and the second being position. 'tr' is the time resolution (dt). 'order' is the order of the polynomial. Return the orthogonal polynomial basis matrix (P) and the fitted coefficients (l), such that dot(P, l) yields the fitted polynomials.
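An illustrative sketch of a drift fit; shapes and values are not asserted since they depend on how the basis is built:
>>> import numpy as np
>>> from pyhrf.tools.misc import polyFit
>>> signal = np.random.randn(100, 10)        # time x position
>>> P, l = polyFit(signal, tr=2.4, order=3)  # orthogonal polynomial basis and coefficients
>>> trend = np.dot(P, l)                     # fitted low-frequency trend, same shape as signal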
pyhrf.tools.misc.
rebin
(a, newshape)¶Rebin an array to a new shape. Can be used to rebin a functional image onto an anatomical image.
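A minimal sketch, assuming newshape divides the original shape evenly:
>>> import numpy as np
>>> from pyhrf.tools.misc import rebin
>>> a = np.arange(16.).reshape(4, 4)
>>> rebin(a, (2, 2)).shape
(2, 2)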
pyhrf.tools.misc.
replace_ext
(fn, ext)¶pyhrf.tools.misc.
report_arrays_in_obj
(o)¶pyhrf.tools.misc.
resampleSignal
(s, osf)¶pyhrf.tools.misc.
resampleToGrid
(x, y, xgrid)¶pyhrf.tools.misc.
rescale_values
(a, v_min=0.0, v_max=1.0, axis=None)¶pyhrf.tools.misc.
root3
(listCoeffs)¶pyhrf.tools.misc.
set_leaf
(tree, branch, leaf, branch_classes=None)¶Set the nested leaf element corresponding to all dictionary keys defined in branch, within tree
pyhrf.tools.misc.
stack_trees
(trees, join_func=None)¶Stack trees (Python dictionaries) with identical structures into one tree so that each leaf of the resulting tree is a list of the corresponding leaves across the input trees. 'trees' is a list of dicts.
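A hedged sketch of the expected result with the default join function:
>>> from pyhrf.tools.misc import stack_trees
>>> stack_trees([{'a': {'b': 1}}, {'a': {'b': 2}}])
{'a': {'b': [1, 2]}}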
pyhrf.tools.misc.
swap_layers
(t, labels, l1, l2)¶Create a new tree from t where layers labeled by l1 and l2 are swapped. labels contains the branch labels of t.
pyhrf.tools.misc.
swapaxes
(array, a1, a2)¶pyhrf.tools.misc.
time_diff_str
(diff)¶pyhrf.tools.misc.
tree
(branched_leaves)¶pyhrf.tools.misc.
treeBranches
(tree, branch=None)¶pyhrf.tools.misc.
treeBranchesClasses
(tree, branch=None)¶pyhrf.tools.misc.
tree_items
(tree)¶pyhrf.tools.misc.
tree_leaves
(tree)¶pyhrf.tools.misc.
tree_rearrange
(t, oldlabels, newlabels)¶Create a new tree from t where layers are rearranged following newlabels. oldlabels contains the branch labels of t.
pyhrf.tools.misc.
undrift
(signal, tr, order=5)¶Remove the low frequency trend from ‘signal’ by a polynomial fit. Assume axis 3 of ‘signal’ is time.
pyhrf.tools.misc.
unstack_trees
(tree)¶Return a list of trees from a tree whose leaves are all lists with the same number of items.
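The inverse operation, sketched under the same assumptions as the stack_trees example above:
>>> from pyhrf.tools.misc import unstack_trees
>>> unstack_trees({'a': {'b': [1, 2]}})
[{'a': {'b': 1}}, {'a': {'b': 2}}]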
pyhrf.ui.analyser_ui.
FMRIAnalyser
(outputPrefix='', roiAverage=False, pass_error=True, gzip_outputs=False)¶Bases: pyhrf.xmlio.Initable
P_OUTPUT_PREFIX
= 'outputPrefix'¶P_ROI_AVERAGE
= 'averageRoiBold'¶analyse
(data, output_dir=None)¶Launch the wrapped analyser onto the given data
Returns: a list of analysis results – list of tuple(FmriData, None|output of analyse_roi, str) = list of tuple(parcel data, analysis results, analysis report). See method analyse_roi_wrap.
analyse_roi
(roiData)¶analyse_roi_wrap
(roiData)¶Wrap the analyse_roi method to catch potential exception
analyse_roi_wrap_bak
(roiData)¶clean_output_files
(output_dir)¶enable_draft_testing
()¶filter_crashed_results
(results)¶get_label
()¶joinOutputs
(cuboids, roiIds, mappers)¶make_outputs_multi_subjects
(data_rois, irois, all_outputs, targetAxes, ext, meta_data, output_dir)¶make_outputs_single_subject
(data_rois, irois, all_outputs, targetAxes, ext, meta_data, output_dir)¶outputResults
(results, output_dir, filter='.\\A')¶Return: a tuple (dictionary of outputs, output file names)
outputResults_back_compat
(results, output_dir, filter='.\\A')¶parametersComments
= {'averageRoiBold': 'Average BOLD signals within each ROI before analysis.', 'outputPrefix': 'Tag to prefix every output name'}¶parametersToShow
= ['averageRoiBold', 'outputPrefix']¶set_gzip_outputs
(gzip_outputs)¶set_pass_errors
(pass_error)¶split_data
(fdata, output_dir=None)¶pyhrf.ui.glm_analyser.
GLMAnalyser
(outputPrefix='glm_')¶Bases: pyhrf.ui.analyser_ui.FMRIAnalyser
analyse_roi
(fdata)¶get_label
()¶pyhrf.ui.glm_ui.
GLMAnalyser
(contrasts={'dummy_contrast_example': '3*audio-video/3'}, contrast_test_baseline=0.0, hrf_model='Canonical', drift_model='Cosine', hfcut=128.0, residuals_model='spherical', fit_method='ols', outputPrefix='glm_', rescale_results=False, rescale_factor_file=None, fir_delays=[0], output_fit=False)¶Bases: pyhrf.ui.analyser_ui.FMRIAnalyser
analyse_roi
(fdata)¶get_label
()¶parametersComments
= {'fit_method': 'Either "ols" or "kalman"', 'residuals_model': 'Either "spherical" or "ar1". If "ar1" then the kalman fit method is used'}¶parametersToShow
= []¶pyhrf.ui.jde.
JDEAnalyser
(outputPrefix='jde_', pass_error=True)¶Bases: pyhrf.ui.analyser_ui.FMRIAnalyser
get_label
()¶pyhrf.ui.jde.
JDEMCMCAnalyser
(sampler=<pyhrf.jde.models.BOLDGibbsSampler object>, osfMax=4, dtMin=0.4, dt=0.6, driftParam=4, driftType='polynomial', outputPrefix='jde_mcmc_', randomSeed=None, pass_error=True, copy_sampler=True)¶Bases: pyhrf.ui.jde.JDEAnalyser
Class that wraps a JDE Gibbs Sampler to launch an fMRI analysis. TODO: remove parameters about dt and osf (they should go in the HRF Sampler class) and drift (should go in the Drift Sampler class).
P_DRIFT_LFD_PARAM
= 'driftParam'¶P_DRIFT_LFD_TYPE
= 'driftType'¶P_DT
= 'dt'¶P_DTMIN
= 'dtMin'¶P_OSFMAX
= 'osfMax'¶P_RANDOM_SEED
= 'randomSeed'¶P_SAMPLER
= 'sampler'¶analyse_roi
(atomData)¶Launch the JDE Gibbs Sampler on a parcel-specific data set atomData.
Parameters: atomData (pyhrf.core.FmriData) – parcel-specific data
Returns: JDE sampler object
enable_draft_testing
()¶packSamplerInput
(roiData)¶parametersComments
= {'driftParam': 'Parameter of the drift modelling.\nIf drift is "polynomial" then this is the order of the polynom.\nIf drift is "cosine" then this is the cut-off period in second.', 'driftType': 'Either "cosine" or "polynomial" or "None"', 'dt': "If different from 0 or None:\nactual time resolution for the oversampled estimated signal (dtMin is ignored).\n Better when it's a multiple of the time of repetition", 'dtMin': 'Minimum time resolution for the oversampled estimated signal', 'sampler': 'Set of parameters for the sampling scheme'}¶parametersToShow
= ['dtMin', 'dt', 'driftType', 'driftParam', 'sampler']¶pyhrf.ui.jde.
jde_analyse
(data=None, nbIterations=3, hrfModel='estimated', hrfNorm=1.0, hrfTrick=False, sampleHrfVar=True, hrfVar=1e-05, keepSamples=False, samplesHistPace=1)¶pyhrf.ui.jde.
runEstimationBetaEstim
(params)¶pyhrf.ui.jde.
runEstimationSupervised
(params)¶pyhrf.ui.rfir_ui.
RFIRAnalyser
(HrfEstimator=<pyhrf.rfir.RFIREstim object>, outputPrefix='hrf_')¶Bases: pyhrf.ui.analyser_ui.FMRIAnalyser
analyse_roi
(atomData)¶parametersToShow
= ['HrfEstimator']¶pyhrf.ui.treatment.
FMRITreatment
(fmri_data=FmriData(onsets=OrderedDict([('audio', [array([ 15. , 20.7, 29.7, 35.4, 44.7, 48. , 83.4, 89.7, 108. , 119.4, 135. , 137.7, 146.7, 173.7, 191.7, 236.7, 251.7, 284.4, 293.4, 296.7])]), ('video', [array([ 0. , 2.4, 8.7, 33. , 39. , 41.7, 56.4, 59.7, 75. , 96. , 122.7, 125.4, 131.4, 140.4, 149.4, 153. , 156. , 159. , 164.4, 167.7, 176.7, 188.4, 195. , 198. , 201. , 203.7, 207. , 210. , 218.7, 221.4, 224.7, 234. , 246. , 248.4, 260.4, 264. , 266.7, 269.7, 278.4, 288. ])])]),bold=array([[ 126.23066711, 115.65962982, 122.31136322, ..., 116.81171417, 116.13331604, 111.56729126], [ 127.20968628, 114.09086609, 119.19332886, ..., 117.53836823, 115.68979645, 111.80612183], [ 125.94182587, 115.54189301, 122.27174377, ..., 113.3349762 , 114.11522675, 107.98921967], ..., [ 122.55458069, 110.07359314, 120.74815369, ..., 118.938797 , 117.44371796, 113.07602692], [ 120.98430634, 110.80142212, 118.64087677, ..., 117.53933716, 115.92312622, 110.41734314], [ 124.12675476, 114.32633209, 121.71891785, ..., 117.40695953, 117.4828949 , 110.78852081]], dtype=float32),tr=2.4,sessionsScans=[array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124])],roiMask=array([[[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]], [[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]], [[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]], ..., [[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]], [[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]], [[0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], ..., [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0], [0, 0, 0, ..., 0, 0, 0]]], dtype=int32),graph=None,stimDurations=OrderedDict([('audio', [array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])]), ('video', [array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])])]),meta_obj=(array([[ -3. , 0. , 0. , 78. ], [ 0. , 3. , 0. , -93. ], [ 0. , 0. , 3. , -67.5], [ 0. , 0. , 0. , 1. ]]), <nibabel.nifti1.Nifti1Header object>)), analyser=<pyhrf.ui.jde.JDEMCMCAnalyser object>, output_dir='./', make_outputs=True, result_dump_file=None)¶Bases: pyhrf.xmlio.Initable
already_done
()¶clean_output_files
()¶dump_roi_datasets
(dry=False, output_dir=None)¶enable_draft_testing
()¶execute
()¶get_data_files
()¶output
(result, dump_result=True, outputs=True)¶parametersComments
= {'analyser': 'Define parameters of the analysis which will be applied to the previously defined data', 'fmri_data': 'FMRI data definition', 'make_outputs': 'Make outputs from analysis results', 'output_dir': 'Output directory where to store analysis results', 'result_dump_file': 'File to save the analyser result (uses pickle).'}¶parametersToShow
= ['fmri_data', 'output_dir', 'analyser']¶pickle_result
(result)¶replace_data_dir
(d)¶run
(parallel=None, n_jobs=None)¶Run the analysis: load data, run estimation, output results
split
(dump_sub_results=None, make_sub_outputs=None, output_dir=None, output_file_list=None)¶xmlComment
= "Group all parameters for a within-subject analysis.\nTwo main parts:\n - data definition ('fmri_data')\n - analysis parameters ('analyser')."¶pyhrf.ui.treatment.
append_common_treatment_options
(parser)¶pyhrf.ui.treatment.
create_treatment
(boldFiles, parcelFile, dt, tr, paradigmFile, nbIterations=4000, writeXmlSetup=True, parallelize=False, outputDir=None, outputSuffix=None, outputPrefix=None, contrasts=None, beta=0.6, estimBeta=True, pfMethod='ps', estimHrf=True, hrfVar=0.01, roiIds=None, nbClasses=2, gzip_rdump=False, make_outputs=True, vbjde=False, simulation_file=None)¶pyhrf.ui.treatment.
create_treatment_surf
(boldFiles, parcelFile, meshFile, dt, tr, paradigmFile, nbIterations=4000, writeXmlSetup=True, parallelize=False, outputDir=None, outputSuffix=None, outputPrefix=None, contrasts=';', beta=0.6, estimBeta=True, pfMethod='ps', estimHrf=True, hrfVar=0.01, roiIds=None, nbClasses=2, gzip_rdump=False, simulation_file=None, make_outputs=True)¶pyhrf.ui.treatment.
exec_t
(t)¶pyhrf.ui.treatment.
jde_surf_from_files
(boldFiles=['/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/real_data_surf_tiny_bold.gii'], parcelFile='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/real_data_surf_tiny_parcellation.gii', meshFile='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/real_data_surf_tiny_mesh.gii', dt=0.6, tr=2.4, paradigm_csv_file='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/paradigm_loc_av.csv', nbIterations=4000, writeXmlSetup=True, parallelize=None, outputDir=None, outputSuffix=None, outputPrefix=None, contrasts=None, beta=0.6, estimBeta=True, pfMethod='ps', estimHrf=True, hrfVar=0.01, roiIds=None, force_relaunch=False, nbClasses=2, gzip_rdump=False, dry=False, simulation_file=None)¶pyhrf.ui.treatment.
jde_vol_from_files
(boldFiles=['/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/subj0_bold_session0.nii.gz'], parcelFile='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/subj0_parcellation.nii.gz', dt=0.6, tr=2.4, paradigm_csv_file='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/paradigm_loc_av.csv', nbIterations=4000, writeXmlSetup=True, parallelize=None, outputDir=None, outputSuffix=None, outputPrefix=None, contrasts=None, beta=0.6, estimBeta=True, pfMethod='ps', estimHrf=True, hrfVar=0.01, roiIds=None, force_relaunch=False, nbClasses=2, gzip_rdump=False, dry=False, make_outputs=True, vbjde=False, simulation_file=None)¶pyhrf.ui.treatment.
make_outfile
(fn, path, pre='', suf='')¶pyhrf.ui.treatment.
parse_data_options
(options)¶Return an FmriData object corresponding to input options
pyhrf.ui.treatment.
run_pyhrf_cmd_treatment
(cfg_cmd, exec_cmd, default_cfg_file, default_profile_file, label_for_cluster)¶pyhrf.ui.vb_jde_analyser.
JDEVEMAnalyser
(hrfDuration=25.0, sigmaH=0.1, fast=True, computeContrast=True, nbClasses=2, PLOT=False, nItMax=100, nItMin=1, scale=False, beta=1.0, estimateSigmaH=True, estimateHRF=True, TrueHrfFlag=False, HrfFilename='hrf.nii', estimateDrifts=True, hyper_prior_sigma_H=1000, dt=0.6, estimateBeta=True, contrasts=None, simulation=False, estimateLabels=True, LabelsFilename=None, MFapprox=False, estimateMixtParam=True, constrained=False, InitVar=0.5, InitMean=2.0, MiniVemFlag=False, NbItMiniVem=5, zero_constraint=True, output_drifts=False, drifts_type='poly')¶Bases: pyhrf.ui.jde.JDEAnalyser
Class that handles the parcel-wise JDE VEM analysis, which is run according to the input data parcellation by default, and also takes care of merging parcel-specific outputs at the end of the analysis.
analyse_roi
(roiData)¶ROI analysis of the fMRI data.
Parameters: roiData (FmriData) – fMRI data to be analyzed.
Returns: packed outputs.
Return type: dict of xndarray
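A hedged sketch of a typical within-subject VEM analysis, combining this analyser with an FMRITreatment; the parameter values and output directory below are placeholders, and the default data shipped with the package is used when fmri_data is left unset:
>>> from pyhrf.ui.vb_jde_analyser import JDEVEMAnalyser
>>> from pyhrf.ui.treatment import FMRITreatment
>>> analyser = JDEVEMAnalyser(dt=0.6, nItMax=50, estimateHRF=True)
>>> treatment = FMRITreatment(analyser=analyser, output_dir='./vem_results')
>>> treatment.run()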
parametersComments
= {'HrfFilename': 'HRF filename', 'InitMean': 'initial value of active gaussian means', 'InitVar': 'initial value of active and inactive gaussian variances', 'LabelsFilename': 'labels filename', 'MFapprox': 'using of the Mean Field approximation in labels estimation', 'MiniVemFlag': 'if true, estimate the best initialisation of MixtParam and gamma_h', 'NbItMiniVem': 'number of iterations in Mini VEM algorithm', 'PLOT': 'plotting flag for convergence curves', 'TrueHrfFlag': 'if true, HRF will be fixed to the simulated value', 'beta': 'initial value of spatial Potts regularization parameter', 'constrained': 'adding constrains: positivity and norm = 1 ', 'contrasts': 'contrasts to be evaluated', 'drifts_type': 'type of the drift basis (default="polynomial")', 'dt': 'time resolution of the estimated HRF (in seconds)', 'estimateBeta': 'estimate or not the Potts spatial regularization parameter', 'estimateDrifts': 'explicit drift estimation (if false, drifts are marginalized', 'estimateHRF': 'estimate or not the HRF', 'estimateLabels': 'estimate or not the labels', 'estimateMixtParam': 'estimate or not the mixture parameters', 'estimateSigmaH': 'estimate or not the HRF variance', 'fast': 'running fast VEM with C extensions', 'hrfDuration': 'duration of the HRF (in seconds)', 'hyper_prior_sigma_H': 'parameter of the hyper-prior on sigma_H (if zero, no prior is applied)', 'nItMax': 'maximum number of iterations', 'nItMin': 'minimum number of iterations', 'nbClasses': 'number of classes for the response levels', 'scale': 'if true, the data fidelity term is divide by the number of voxels, otherwise it does nothing', 'sigmaH': 'initial HRF variance', 'simulation': 'indicates whether the run corresponds to a simulation example or not'}¶parametersToShow
= ['dt', 'hrfDuration', 'nItMax', 'nItMin', 'estimateSigmaH', 'estimateHRF', 'TrueHrfFlag', 'HrfFilename', 'estimateBeta', 'estimateLabels', 'LabelsFilename', 'MFapprox', 'estimateDrifts', 'estimateMixtParam', 'InitVar', 'InitMean', 'scale', 'nbClasses', 'fast', 'PLOT', 'sigmaH', 'contrasts', 'hyper_prior_sigma_H', 'constrained', 'simulation', 'MiniVemFlag', 'NbItMiniVem']¶pyhrf.ui.vb_jde_analyser.
change_dim
(labels)¶Change labels dimension from (ncond, nclass, nvox) to (nclass, ncond, nvox).
pyhrf.ui.vb_jde_analyser.
run_analysis
(**params)¶Function to run the JDE VEM analyzer with parallel computation
pyhrf.ui.vb_jde_analyser_asl_fast.
JDEVEMAnalyser
(hrfDuration=25.0, dt=0.5, fast=True, constrained=False, nbClasses=2, PLOT=False, nItMax=1, nItMin=1, scale=False, beta=1.0, simulation=None, fmri_data=None, computeContrast=True, estimateH=True, estimateG=True, use_hyperprior=False, estimateSigmaH=True, estimateSigmaG=True, positivity=False, sigmaH=0.0001, sigmaG=0.0001, sigmaMu=0.0001, physio=True, gammaH=1000, gammaG=1000, zero_constrained=False, estimateLabels=True, estimateMixtParam=True, contrasts=None, InitVar=0.5, InitMean=2.0, estimateA=True, estimateC=True, estimateBeta=True, estimateNoise=True, estimateLA=True, phy_params={'E0': 0.34, 'TE': 0.018, 'V0': 1, 'alpha_w': 0.33, 'buxton': False, 'e': 1.43, 'eps': 0.54, 'eps_max': 10.0, 'linear': False, 'model': 'RBM', 'model_name': 'Khalidov11', 'obata': False, 'r0': 100, 'tau_f': 2.46, 'tau_m': 0.98, 'tau_s': 1.54, 'vt0': 80.6}, prior='omega', n_session=1)¶Bases: pyhrf.ui.jde.JDEAnalyser
analyse_roi
(roiData)¶finalizeEstimation
(true_labels, labels, nvox, true_brf, estimated_brf, true_prf, estimated_prf, true_brls, brls, true_prls, prls, true_drift, PL, L, true_noise, noise)¶parametersComments
= {'InitMean': 'Initiale value of active gaussian means', 'InitVar': 'Initiale value of active and inactive gaussian variances', 'PLOT': 'plotting flag for convergence curves', 'beta': 'initial value of spatial Potts regularization parameter', 'constrained': 'adding constrains: positivity and norm = 1 ', 'dt': 'time resolution of the estimated HRF in seconds', 'estimateBeta': 'estimate or not the Potts spatial regularization parameter', 'estimateG': 'estimate or not the PRF', 'estimateH': 'estimate or not the HRF', 'estimateLA': 'Explicit drift and perfusion baseline estimation', 'estimateLabels': 'estimate or not the Labels', 'estimateMixtParam': 'estimate or not the mixture parameters', 'estimateSigmaG': 'estimate or not the PRF variance', 'estimateSigmaH': 'estimate or not the HRF variance', 'fast': 'running fast VEM with C extensions', 'hrfDuration': 'duration of the HRF in seconds', 'nItMax': 'maximum iteration number', 'nItMin': 'minimum iteration number', 'nbClasses': 'number of classes for the response levels', 'scale': 'flag for the scaling factor applied to the data fidelity term during m_h step.\nIf scale=False then do nothing, else divide the data fidelity term by the number of voxels', 'sigmaG': 'Initial PRF variance', 'sigmaH': 'Initial HRF variance', 'simulation': 'indicates whether the run corresponds to a simulation example or not', 'zero_constrained': 'putting first and last point of the HRF to zero '}¶parametersToShow
= ['dt', 'hrfDuration', 'nItMax', 'nItMin', 'estimateSigmaH', 'estimateSigmaG', 'estimateH', 'estimateG', 'estimateBeta', 'estimateLabels', 'estimateLA', 'estimateMixtParam', 'InitVar', 'InitMean', 'scale', 'nbClasses', 'fast', 'PLOT', 'sigmaH', 'sigmaG']¶pyhrf.ui.vb_jde_analyser_asl_fast.
change_dim
(labels)¶Change labels dimension from (ncond, nclass, nvox) to (nclass, ncond, nvox)
pyhrf.ui.vb_jde_analyser_asl_fast.
run_analysis
(**params)¶pyhrf.ui.vb_jde_analyser_bold_fast.
JDEVEMAnalyser
(hrfDuration=25.0, dt=0.5, fast=True, constrained=False, nbClasses=2, PLOT=False, nItMax=1, nItMin=1, scale=False, beta=1.0, simulation=None, fmri_data=None, computeContrast=True, estimateH=True, estimateG=True, use_hyperprior=False, estimateSigmaH=True, estimateSigmaG=True, positivity=False, sigmaH=0.0001, sigmaG=0.0001, sigmaMu=0.0001, physio=True, gammaH=1000, gammaG=1000, zero_constrained=False, estimateLabels=True, estimateMixtParam=True, contrasts=None, InitVar=0.5, InitMean=2.0, estimateA=True, estimateC=True, estimateBeta=True, estimateNoise=True, estimateLA=True, phy_params={'E0': 0.34, 'TE': 0.018, 'V0': 1, 'alpha_w': 0.33, 'buxton': False, 'e': 1.43, 'eps': 0.54, 'eps_max': 10.0, 'linear': False, 'model': 'RBM', 'model_name': 'Khalidov11', 'obata': False, 'r0': 100, 'tau_f': 2.46, 'tau_m': 0.98, 'tau_s': 1.54, 'vt0': 80.6}, prior='no', n_session=1)¶Bases: pyhrf.ui.jde.JDEAnalyser
analyse_roi
(roiData)¶finalizeEstimation
(true_labels, labels, nvox, true_brf, estimated_brf, true_prf, estimated_prf, true_brls, brls, true_prls, prls, true_drift, PL, L, true_noise, noise)¶parametersComments
= {'InitMean': 'Initiale value of active gaussian means', 'InitVar': 'Initiale value of active and inactive gaussian variances', 'PLOT': 'plotting flag for convergence curves', 'beta': 'initial value of spatial Potts regularization parameter', 'constrained': 'adding constrains: positivity and norm = 1 ', 'dt': 'time resolution of the estimated HRF in seconds', 'estimateBeta': 'estimate or not the Potts spatial regularization parameter', 'estimateG': 'estimate or not the PRF', 'estimateH': 'estimate or not the HRF', 'estimateLA': 'Explicit drift and perfusion baseline estimation', 'estimateLabels': 'estimate or not the Labels', 'estimateMixtParam': 'estimate or not the mixture parameters', 'estimateSigmaG': 'estimate or not the PRF variance', 'estimateSigmaH': 'estimate or not the HRF variance', 'fast': 'running fast VEM with C extensions', 'hrfDuration': 'duration of the HRF in seconds', 'nItMax': 'maximum iteration number', 'nItMin': 'minimum iteration number', 'nbClasses': 'number of classes for the response levels', 'scale': 'flag for the scaling factor applied to the data fidelity term during m_h step.\nIf scale=False then do nothing, else divide the data fidelity term by the number of voxels', 'sigmaG': 'Initial PRF variance', 'sigmaH': 'Initial HRF variance', 'simulation': 'indicates whether the run corresponds to a simulation example or not', 'zero_constrained': 'putting first and last point of the HRF to zero '}¶parametersToShow
= ['dt', 'hrfDuration', 'nItMax', 'nItMin', 'estimateSigmaH', 'estimateSigmaG', 'estimateH', 'estimateG', 'estimateBeta', 'estimateLabels', 'estimateLA', 'estimateMixtParam', 'InitVar', 'InitMean', 'scale', 'nbClasses', 'fast', 'PLOT', 'sigmaH', 'sigmaG']¶pyhrf.ui.vb_jde_analyser_bold_fast.
change_dim
(labels)¶Change labels dimension from (ncond, nclass, nvox) to (nclass, ncond, nvox)
pyhrf.ui.vb_jde_analyser_bold_fast.
run_analysis
(**params)¶pyhrf.validation.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive, int_like or int-convertible arguments are provided, randn generates an array of shape (d0, d1, ..., dn), filled with random floats sampled from a univariate “normal” (Gaussian) distribution of mean 0 and variance 1 (if any of the d_i are floats, they are first converted to integers by truncation). A single float randomly sampled from the distribution is returned if no argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: | d0, d1, ..., dn (int, optional) – The dimensions of the returned array, should all be positive. If no argument is given a single Python float is returned. |
---|---|
Returns: | Z – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied. |
Return type: | ndarray or float |
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.validation.config.
clean_cache
()¶pyhrf.validation.config.
figfn
(fn)¶Append the figure file extension to fn
pyhrf.validation.config.
is_tmp_file
(fn)¶pyhrf.validation.valid_beta_estim.
ObsField2DTest
(methodName='runTest')¶Bases: unittest.case.TestCase
Test estimation of beta on observed 2D fields
MC_comp_pfmethods_ML
(shape)¶MC_comp_pfmethods_ML_3C
(shape)¶setUp
()¶Hook method for setting up the test fixture before exercising it.
test_MC_comp_pfmethods_ML_100x100
()¶test_MC_comp_pfmethods_ML_10x10
()¶test_MC_comp_pfmethods_ML_30x30
()¶test_MC_comp_pfmethods_ML_3C_10x10
()¶test_MC_comp_pfmethods_ML_3C_20x20
()¶test_MC_comp_pfmethods_ML_3C_30x30
()¶test_MC_comp_pfmethods_ML_3C_50x50
()¶test_single_Onsager_MAP
()¶PF method: Onsager. MAP on p(label|beta).
test_single_Onsager_ML
()¶PF method: Onsager. ML on p(beta|label).
test_single_PFES_MAP
()¶PF estimation method : extrapolation scheme. MAP on p(beta|label).
test_single_PFES_ML
()¶PF estimation method : extrapolation scheme. ML on p(label|beta).
test_single_PFPS_MAP
()¶PF estimation method : path sampling. MAP on p(beta|label).
test_single_PFPS_ML
()¶PF estimation method : path sampling. ML on p(label|beta).
test_single_surface_PFPS_ML
()¶PF estimation method : path sampling. ML on p(label|beta). topology from a surfacic RDI
pyhrf.validation.valid_beta_estim.
beta_estim_obs_field_mc
(graph, nbClasses, beta, gridLnz, mcit=1, cachePotts=False)¶pyhrf.validation.valid_beta_estim.
dist
(x, y)¶pyhrf.validation.valid_beta_estim.
randn
(d0, d1, ..., dn)¶Return a sample (or samples) from the “standard normal” distribution.
If positive, int_like or int-convertible arguments are provided, randn generates an array of shape (d0, d1, ..., dn), filled with random floats sampled from a univariate “normal” (Gaussian) distribution of mean 0 and variance 1 (if any of the d_i are floats, they are first converted to integers by truncation). A single float randomly sampled from the distribution is returned if no argument is provided.
This is a convenience function. If you want an interface that takes a tuple as the first argument, use numpy.random.standard_normal instead.
Parameters: | d0, d1, ..., dn (int, optional) – The dimensions of the returned array, should all be positive. If no argument is given a single Python float is returned. |
---|---|
Returns: | Z – A (d0, d1, ..., dn)-shaped array of floating-point samples from the standard normal distribution, or a single such float if no parameters were supplied. |
Return type: | ndarray or float |
See also
random.standard_normal()
Notes
For random samples from N(mu, sigma^2), use:
sigma * np.random.randn(...) + mu
Examples
>>> np.random.randn()
2.1923875335537315 #random
Two-by-four array of samples from N(3, 6.25):
>>> 2.5 * np.random.randn(2, 4) + 3
array([[-4.49401501, 4.00950034, -1.81814867, 7.29718677], #random
[ 0.39924804, 4.68456316, 4.99394529, 4.84057254]]) #random
pyhrf.validation.valid_beta_estim.
test_beta_estim_obs_fields
(graphs, betas, nbLabels, pfmethod, mcit=1)¶pyhrf.validation.valid_jde_asl.
ASLTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_all
()¶Validate estimation of full ASL model at high SNR
test_brf
()¶Validate estimation of BRF at high SNR
test_brls
()¶Validate estimation of BRLs at high SNR
test_drift
()¶Validate estimation of drift at high SNR
test_labels
()¶Validate estimation of labels at high SNR
test_noise_var
()¶Validate estimation of noise variances at high SNR
test_prf
()¶Validate estimation of PRF
test_prls
()¶Validate estimation of PRLs at high SNR
pyhrf.validation.valid_jde_asl_physio.
ASLPhysioHierarchicalTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_brf
()¶Validate estimation of BRF
test_mu
()¶Validate estimation of mu
test_mu_brf
()¶Validate estimation of mu and brf
test_mu_brf_prf
()¶Validate estimation of mu, brf and prf
test_mu_prf
()¶Validate estimation of mu, brf and prf
test_prf
()¶Validate estimation of PRF
pyhrf.validation.valid_jde_asl_physio.
ASLTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_all
()¶Validate estimation of full ASL model at high SNR
test_brf_basic_reg
()¶Validate estimation of BRF at high SNR
test_brf_physio_det
()¶Validate estimation of BRF at high SNR
test_brf_physio_nonreg
()¶Validate estimation of BRF at high SNR
test_brf_physio_reg
()¶Validate estimation of BRF at high SNR
test_brf_var
()¶Validate estimation of BRF at high SNR
test_brls
()¶Validate estimation of BRLs at high SNR
test_drift
()¶Validate estimation of drift at high SNR
test_drift_var
()¶Validate estimation of drift at high SNR
test_labels
()¶Validate estimation of labels at high SNR
test_noise_var
()¶Validate estimation of noise variances at high SNR
test_perf_baseline
()¶Validate estimation of drift at high SNR
test_perf_baseline_var
()¶Validate estimation of drift at high SNR
test_prf_basic_reg
()¶Validate estimation of BRF at high SNR
test_prf_physio_det
()¶Validate estimation of BRF at high SNR
test_prf_physio_nonreg
()¶Validate estimation of BRF at high SNR
test_prf_physio_reg
()¶Validate estimation of PRF
test_prf_var
()¶Validate estimation of PRF
test_prls
()¶Validate estimation of PRLs at high SNR
pyhrf.validation.valid_jde_asl_physio_alpha.
ASLTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_all
()¶Validate estimation of full ASL model at high SNR
test_brf_basic_reg
()¶Validate estimation of BRF at high SNR
test_brf_physio_det
()¶Validate estimation of BRF at high SNR
test_brf_physio_nonreg
()¶Validate estimation of BRF at high SNR
test_brf_physio_reg
()¶Validate estimation of BRF at high SNR
test_brf_var
()¶Validate estimation of BRF at high SNR
test_brls
()¶Validate estimation of BRLs at high SNR
test_drift
()¶Validate estimation of drift at high SNR
test_drift_var
()¶Validate estimation of drift at high SNR
test_labels
()¶Validate estimation of labels at high SNR
test_noise_var
()¶Validate estimation of noise variances at high SNR
test_perf_baseline
()¶Validate estimation of drift at high SNR
test_perf_baseline_var
()¶Validate estimation of drift at high SNR
test_prf_basic_reg
()¶Validate estimation of BRF at high SNR
test_prf_physio_det
()¶Validate estimation of BRF at high SNR
test_prf_physio_nonreg
()¶Validate estimation of BRF at high SNR
test_prf_physio_reg
()¶Validate estimation of PRF
test_prf_var
()¶Validate estimation of PRF
test_prls
()¶Validate estimation of PRLs at high SNR
pyhrf.validation.valid_jde_bold_mono_subj_multi_sess.
MultiSessTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_bmixt_sampler
()¶test_default_jde_small_simulation
()¶Test JDE multi-sessions sampler on small simulation with small nb of iterations. Estimation accuracy is not tested.
test_drift_and_var_sampler
()¶test_drift_sampler
()¶test_drift_var_sampler
()¶test_full_sampler
()¶Test JDE Multi-sessions sampler on simulation with normal size. Estimation accuracy is tested.
test_hrf_sampler
()¶test_hrf_var_sampler
()¶test_label_sampler
()¶test_noise_var_sampler
()¶test_nrl_bar_only_sampler
()¶test_nrl_bar_sampler
()¶test_nrl_by_session_hrf
()¶test_nrl_by_session_sampler
()¶test_nrl_by_session_var_sampler
()¶test_simulation
()¶pyhrf.validation.valid_jde_bold_mono_subj_sess.
JDETest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_full_sampler
()¶Test JDE on simulation with normal size. Estimation accuracy is tested.
test_hrf_var_sampler
()¶test_hrf_var_sampler_2
()¶test_hrf_with_var_sampler
()¶test_hrf_with_var_sampler_2
()¶test_noise_var_sampler
()¶pyhrf.validation.valid_jde_vem_asl.
ASLTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_E_step
()¶Validate estimation of perfusion component at high SNR
test_all
()¶Validate estimation of full ASL model at high SNR
test_beta
()¶Validate estimation of drift at high SNR
test_bold
()¶Validate estimation of bold component at high SNR
test_brf
()¶Validate estimation of BRF at high SNR
test_brls
()¶Validate estimation of BRLs at high SNR
test_la
()¶Validate estimation of drift at high SNR
test_labels
()¶Validate estimation of labels at high SNR
test_mp
()¶Validate estimation of drift at high SNR
test_noise_var
()¶Validate estimation of noise variances at high SNR
test_perfusion
()¶Validate estimation of perfusion component at high SNR
test_prf
()¶Validate estimation of PRF
test_prls
()¶Validate estimation of PRLs at high SNR
test_sigmaG
()¶Validate estimation of drift at high SNR
test_sigmaH
()¶Validate estimation of drift at high SNR
pyhrf.validation.valid_rfir.
RFIRTest
(methodName='runTest')¶Bases: unittest.case.TestCase
Test the Regularized FIR (RFIR)-based methods implemented in pyhrf.rfir
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_results_small_simulation
()¶TODO: move to validation
test_rfir_on_small_simulation
()¶Check if pyhrf.rfir runs properly and that returned outputs contains the expected items
pyhrf.validation.valid_rndm_field.
PartitionFunctionTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
test_comparison
()¶test_extrapolation
()¶test_onsager
()¶test_onsager1
()¶test_path_sampling
()¶pyhrf.validation.valid_rndm_field.
PottsTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
test_SW_nrj
()¶test_SW_nrj_2C_3C
()¶test_gibbs
()¶test_sw_nrj
()¶test_sw_sampling
()¶pyhrf.validation.valid_rndm_field.
field_energy_calculator
(graph)¶pyhrf.validation.valid_sandbox_parcellation.
FeatureExtractionTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_feature_extraction
()¶test_generate_features
()¶pyhrf.validation.valid_sandbox_parcellation.
ParcellationTest
(methodName='runTest')¶Bases: unittest.case.TestCase
save_parcellation_outputs
(pobj, mask)¶setUp
()¶Hook method for setting up the test fixture before exercising it.
tearDown
()¶Hook method for deconstructing the test fixture after testing it.
test_gmm_from_forged_features
()¶Test spatial Ward with uncertainty on forged features
test_hemodynamic_parcellation_GMM_2D_high_SNR
()¶test GMM-based parcellation on features extracted from a 2D artificial fMRI data set, at high SNR
test_hemodynamic_parcellation_wpu_2D_high_SNR
()¶test WPU on features extracted from a 2D artificial fMRI data set, at high SNR
test_mixtdist
()¶Check that merge is in favour of non-active at the same feature level, starting from singleton clusters.
test_parcellation_history
()¶test_parcellation_mmp_act_level_1D
()¶Test the ability of MMP to ‘jump’ non-activating positions (1D case).
test_parcellation_mmp_act_level_2D
()¶Test the ability of MMP to ‘jump’ non-activating positions (2D case).
test_parcellation_spatialWard_2
()¶Test WPU on a simple case.
test_parcellation_spatialWard_400_nonoise
()¶test_parcellation_spatialWard_400_variance
()¶test_parcellation_spatialWard_5_sklearn
()¶test_parcellation_spatialWard_act_level_1D
()¶Test the ability of WPU to ‘jump’ non-activating positions (1D case).
test_parcellation_spatialWard_act_level_2D
()¶Test the ability of WPU to ‘jump’ non-activating positions (2D case).
test_parcellation_spatialWard_variance_1D
()¶Test the ability of WPU to ‘jump’ non-activating positions (1D case).
test_parcellation_spatialWard_variance_2D
()¶Test the sensitivity to variance (2D case).
test_render_ward_tree
()¶test_spatialward_against_modelbasedspatialward
()¶Check that pyhrf’s spatial Ward parcellation gives the same results as scikit’s spatial Ward parcellation
test_spatialward_against_ward_sk
()¶Check that pyhrf’s spatial Ward parcellation gives the same results as scikit’s spatial Ward parcellation
test_spatialward_from_forged_features
()¶Test spatial Ward on forged features
test_uspatialward_formula
()¶Check that pyhrf’s uncertain spatial Ward parcellation gives the same results as the modified uncertain spatial Ward formula
test_uward_tree_save
()¶test_ward_distance_1D_v1
()¶test_ward_distance_1D_v2
()¶test_ward_distance_2D
()¶test_ward_tree_save
()¶test_wpu_from_forged_features
()¶Test spatial Ward with uncertainty on forged features
pyhrf.validation.valid_sandbox_parcellation.
StatTest
(methodName='runTest')¶Bases: unittest.case.TestCase
setUp
()¶Hook method for setting up the test fixture before exercising it.
test_gmm_known_alpha0
()¶Test biGMM update with posterior weights equal to 0
test_gmm_known_weights_difvars_noise
()¶Test biGMM fit with known post weights, from biGMM samples (no noise) 1D case.
test_gmm_known_weights_difvars_noisea
()¶Test biGMM fit with known post weights, from biGMM samples (no noise) 1D case.
test_gmm_known_weights_noise
()¶Test biGMM fit with known post weights, from biGMM samples (no noise) 1D case.
test_gmm_known_weights_noisea
()¶Test biGMM fit with known post weights, from biGMM samples (no noise) 1D case.
test_gmm_known_weights_simu_1D
()¶Test biGMM fit with known post weights, from biGMM samples (no noise) 1D case.
test_gmm_likelihood
()¶Test the log likelihood computation
test_informedGMM_parameters
()¶Check that merge is in favour of non-active at the same feature level, starting from singleton clusters.
test_norm_bc
()¶pyhrf.validation.valid_sandbox_parcellation.
create_features
(size='2D', feat_contrast='high', noise_var=0.0, n_features=2)¶pyhrf.validation.valid_sandbox_parcellation.
simulate_fmri_data
(scenario='high_snr', output_path=None)¶VEM BOLD Constrained
File that contains functions for BOLD data analysis with positivity and l2-norm=1 constraints.
It imports functions from vem_tools.py in pyhrf/vbjde
pyhrf.vbjde.vem_asl_models_fast.
Main_vbjde_physio
(graph, Y, Onsets, durations, Thrf, K, TR, beta, dt, scale=1, estimateSigmaH=True, estimateSigmaG=True, sigmaH=0.05, sigmaG=0.05, gamma_h=0, gamma_g=0, NitMax=-1, NitMin=1, estimateBeta=True, PLOT=False, contrasts=[], computeContrast=False, idx_first_tag=0, simulation=None, sigmaMu=None, estimateH=True, estimateG=True, estimateA=True, estimateC=True, estimateZ=True, estimateNoise=True, estimateMP=True, estimateLA=True, use_hyperprior=False, positivity=False, constraint=False, phy_params={'E0': 0.34, 'TE': 0.018, 'V0': 1, 'alpha_w': 0.33, 'buxton': False, 'e': 1.43, 'eps': 0.54, 'eps_max': 10.0, 'linear': False, 'model': 'RBM', 'model_name': 'Khalidov11', 'obata': False, 'r0': 100, 'tau_f': 2.46, 'tau_m': 0.98, 'tau_s': 1.54, 'vt0': 80.6}, prior='omega', zc=False)¶VEM BOLD Constrained
File that contains functions for BOLD data analysis with positivity and l2-norm=1 constraints.
It imports functions from vem_tools.py in pyhrf/vbjde
pyhrf.vbjde.vem_asl_models_fast_ms.
Main_vbjde_physio
(graph, Y, Onsets, durations, Thrf, K, TR, beta, dt, scale=1, estimateSigmaH=True, estimateSigmaG=True, sigmaH=0.05, sigmaG=0.05, gamma_h=0, gamma_g=0, NitMax=-1, NitMin=1, estimateBeta=True, PLOT=False, contrasts=[], computeContrast=False, idx_first_tag=0, simulation=None, sigmaMu=None, estimateH=True, estimateG=True, estimateA=True, estimateC=True, estimateZ=True, estimateNoise=True, estimateMP=True, estimateLA=True, use_hyperprior=False, positivity=False, constraint=False, phy_params={'E0': 0.34, 'TE': 0.018, 'V0': 1, 'alpha_w': 0.33, 'buxton': False, 'e': 1.43, 'eps': 0.54, 'eps_max': 10.0, 'linear': False, 'model': 'RBM', 'model_name': 'Khalidov11', 'obata': False, 'r0': 100, 'tau_f': 2.46, 'tau_m': 0.98, 'tau_s': 1.54, 'vt0': 80.6}, prior='omega', zc=False)¶This module implements the VEM for BOLD data.
The function uses the C extension for expectation and maximization steps (see src/pyhrf/vbjde/utilsmodule.c file).
Notes
TODO: add some refs?
pyhrf.vbjde.vem_bold.
eps
¶float – mimics the machine epsilon to avoid zero values
pyhrf.vbjde.vem_bold.
logger
¶logger – logger instance identifying this module, used to log information
pyhrf.vbjde.vem_bold.
jde_vem_bold
(graph, bold_data, onsets, durations, hrf_duration, nb_classes, tr, beta, dt, estimate_sigma_h=True, sigma_h=0.05, it_max=-1, it_min=0, estimate_beta=True, contrasts=None, compute_contrasts=False, hrf_hyperprior=0, estimate_hrf=True, constrained=False, zero_constraint=True, drifts_type='poly', seed=6537546)¶This is the main function that computes the VEM analysis on BOLD data. This function uses optimized python functions.
Parameters: |
|
---|---|
Returns: |
|
Notes
See the article “A novel definition of the multivariate coefficient of variation” for more information about the coefficient of variation.
VEM BOLD Constrained
File that contains functions for BOLD data analysis with positivity and l2-norm=1 constraints.
It imports functions from vem_tools.py in pyhrf/vbjde
pyhrf.vbjde.vem_bold_constrained.
Main_vbjde_Extension_constrained
(graph, Y, Onsets, Thrf, K, TR, beta, dt, scale=1, estimateSigmaH=True, sigmaH=0.05, NitMax=-1, NitMin=1, estimateBeta=True, PLOT=False, contrasts=[], computeContrast=False, gamma_h=0, estimateHRF=True, TrueHrfFlag=False, HrfFilename='hrf.nii', estimateLabels=True, LabelsFilename='labels.nii', MFapprox=False, InitVar=0.5, InitMean=2.0, MiniVEMFlag=False, NbItMiniVem=5)¶pyhrf.vbjde.vem_bold_constrained.
Main_vbjde_Extension_constrained_stable
(graph, Y, Onsets, Thrf, K, TR, beta, dt, scale=1, estimateSigmaH=True, sigmaH=0.05, NitMax=-1, NitMin=1, estimateBeta=True, PLOT=False, contrasts=[], computeContrast=False, gamma_h=0)¶Version modified by Lofti from Christine’s version
pyhrf.vbjde.vem_bold_constrained.
Main_vbjde_Python_constrained
(graph, Y, Onsets, Thrf, K, TR, beta, dt, scale=1, estimateSigmaH=True, sigmaH=0.1, NitMax=-1, NitMin=1, estimateBeta=False, PLOT=False)¶VEM BOLD Constrained
File that contains functions for BOLD data analysis with positivity and l2-norm=1 constraints.
It imports functions from vem_tools.py in pyhrf/vbjde
pyhrf.vbjde.vem_bold_models_fast.
Main_vbjde_physio
(graph, Y, Onsets, durations, Thrf, K, TR, beta, dt, scale=1, estimateSigmaH=True, estimateSigmaG=True, sigmaH=0.05, sigmaG=0.05, gamma_h=0, gamma_g=0, NitMax=-1, NitMin=1, estimateBeta=True, PLOT=False, contrasts=[], computeContrast=False, idx_first_tag=0, simulation=None, sigmaMu=None, estimateH=True, estimateG=True, estimateA=True, estimateC=True, estimateZ=True, estimateNoise=True, estimateMP=True, estimateLA=True, use_hyperprior=False, positivity=False, constraint=False, phy_params={'E0': 0.34, 'TE': 0.018, 'V0': 1, 'alpha_w': 0.33, 'buxton': False, 'e': 1.43, 'eps': 0.54, 'eps_max': 10.0, 'linear': False, 'model': 'RBM', 'model_name': 'Khalidov11', 'obata': False, 'r0': 100, 'tau_f': 2.46, 'tau_m': 0.98, 'tau_s': 1.54, 'vt0': 80.6}, prior='omega')¶VEM BOLD Constrained
File that contains functions for BOLD data analysis with positivity and l2-norm=1 constraints.
It imports functions from vem_tools.py in pyhrf/vbjde
pyhrf.vbjde.vem_bold_models_fast_ms.
Main_vbjde_physio
(graph, Y, Onsets, durations, Thrf, K, TR, beta, dt, scale=1, estimateSigmaH=True, estimateSigmaG=True, sigmaH=0.05, sigmaG=0.05, gamma_h=0, gamma_g=0, NitMax=-1, NitMin=1, estimateBeta=True, PLOT=False, contrasts=[], computeContrast=False, idx_first_tag=0, simulation=None, sigmaMu=None, estimateH=True, estimateG=True, estimateA=True, estimateC=True, estimateZ=True, estimateNoise=True, estimateMP=True, estimateLA=True, use_hyperprior=False, positivity=False, constraint=False, phy_params={'E0': 0.34, 'TE': 0.018, 'V0': 1, 'alpha_w': 0.33, 'buxton': False, 'e': 1.43, 'eps': 0.54, 'eps_max': 10.0, 'linear': False, 'model': 'RBM', 'model_name': 'Khalidov11', 'obata': False, 'r0': 100, 'tau_f': 2.46, 'tau_m': 0.98, 'tau_s': 1.54, 'vt0': 80.6}, prior='omega', zc=False)¶TOOLS and FUNCTIONS for VEM JDE Used in different versions of VEM
pyhrf.vbjde.vem_tools.
A_Entropy
(Sigma_A, M, J)¶pyhrf.vbjde.vem_tools.
Compute_FreeEnergy
(y_tilde, m_A, Sigma_A, mu_Ma, sigma_Ma, m_H, Sigma_H, AuxH, R, R_inv, sigmaH, sigmaG, m_C, Sigma_C, mu_Mc, sigma_Mc, m_G, Sigma_G, AuxG, q_Z, neighboursIndexes, Beta, Gamma, gamma, gamma_h, gamma_g, sigma_eps, XX, W, J, D, M, N, K, hyp, Gamma_X, Gamma_WX, plot=False, bold=False, S=1)¶pyhrf.vbjde.vem_tools.
H_Entropy
(Sigma_H, D)¶pyhrf.vbjde.vem_tools.
PolyMat
(Nscans, paramLFD, tr)¶Build polynomial basis
pyhrf.vbjde.vem_tools.
Q_Entropy
(q_Z, M, J)¶pyhrf.vbjde.vem_tools.
Q_expectation_Ptilde
(q_Z, neighboursIndexes, Beta, gamma, K, M)¶pyhrf.vbjde.vem_tools.
RF_Entropy
(Sigma_RF, D)¶pyhrf.vbjde.vem_tools.
RF_expectation_Ptilde
(m_X, Sigma_X, sigmaX, R, R_inv, D)¶pyhrf.vbjde.vem_tools.
RL_Entropy
(Sigma_RL, M, J)¶pyhrf.vbjde.vem_tools.
RL_expectation_Ptilde
(m_X, Sigma_X, mu_Mx, sigma_Mx, q_Z)¶pyhrf.vbjde.vem_tools.
Z_Entropy
(q_Z, M, J)¶pyhrf.vbjde.vem_tools.
beta_gradient
(beta, labels_proba, labels_neigh, neighbours_indexes, gamma, gradient_method='m1')¶Computes the gradient of the beta function.
The maximization with respect to beta requires the computation of the derivative of the corresponding free-energy term with respect to beta. Two formulations of this gradient are implemented, selected with gradient_method ('m1' or 'm2').
Parameters: | |
---|---|
Returns: | gradient – the gradient estimated in beta |
Return type: |
pyhrf.vbjde.vem_tools.
beta_maximization
(beta, labels_proba, neighbours_indexes, gamma)¶Computes the Beta Maximization step of the JDE VEM algorithm.
The maximization over each beta_m corresponds to the M-step obtained for a standard hidden MRF model.
Parameters: |
|
---|---|
Returns: |
|
Notes
See beta_gradient()
function.
pyhrf.vbjde.vem_tools.
buildFiniteDiffMatrix
(order, size, regularization=None)¶Build the finite difference matrix used for the hrf regularization prior.
Parameters: | |
---|---|
Returns: | diffMat – the finite difference matrix |
Return type: | ndarray, shape (size, size) |
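As a hedged illustration, a generic way to build such a finite difference matrix and the smoothness prior typically derived from it (boundary handling may differ from PyHRF's actual implementation):
import numpy as np

def finite_diff_matrix(order, size):
    # Generic sketch: apply np.diff repeatedly to the identity matrix and
    # pad with zero rows to keep the matrix square.
    d = np.eye(size)
    for _ in range(order):
        d = np.vstack([np.diff(d, axis=0), np.zeros((1, size))])
    return d

D2 = finite_diff_matrix(2, 5)  # second-order differences on a 5-point HRF grid
R = D2.T.dot(D2)               # typical smoothness prior matrix built from it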
pyhrf.vbjde.vem_tools.
computeFit
(hrf_mean, nrls_mean, X, nb_voxels, nb_scans)¶Compute the estimated induced signal by each stimulus.
Parameters: |
|
---|---|
Returns: | |
Return type: | ndarray |
pyhrf.vbjde.vem_tools.
computeFit_asl
(H, m_A, G, m_C, W, XX)¶Compute Fit
pyhrf.vbjde.vem_tools.
compute_contrasts
(condition_names, contrasts, m_A, m_C, Sigma_A, Sigma_C, M, J)¶pyhrf.vbjde.vem_tools.
compute_mat_X_2
(nbscans, tr, lhrf, dt, onsets, durations=None)¶pyhrf.vbjde.vem_tools.
constraint_norm1_b
(Ftilde, Sigma_F, positivity=False, perfusion=None)¶Constrain with optimization strategy
pyhrf.vbjde.vem_tools.
contrasts_mean_var_classes
(contrasts, condition_names, nrls_mean, nrls_covar, nrls_class_mean, nrls_class_var, nb_contrasts, nb_classes, nb_voxels)¶Computes the contrasts nrls from the conditions nrls and the mean and variance of the gaussian classes of the contrasts (in the cases of all inactive conditions and all active conditions).
Parameters: |
|
---|---|
Returns: |
|
pyhrf.vbjde.vem_tools.
cosine_drifts_basis
(nb_scans, param_lfd, tr)¶Build cosine drifts basis.
Parameters: | |
---|---|
Returns: | drifts_basis – K is determined by the scipy.linalg.orth function and corresponds to the effective rank of the matrix it is applied to (see function’s docstring) |
Return type: | ndarray, shape (nb_scans, K) |
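A hedged sketch of a cosine drift basis orthonormalized with scipy.linalg.orth; interpreting param_lfd as a high-pass period in seconds is an assumption here, not necessarily PyHRF's exact convention:
import numpy as np
from scipy.linalg import orth

def cosine_drift_basis(nb_scans, param_lfd, tr):
    n = np.arange(nb_scans)
    # Number of cosine regressors kept below the assumed cutoff period.
    order = max(int(np.floor(2.0 * nb_scans * tr / param_lfd)), 1)
    basis = [np.ones(nb_scans)]
    for k in range(1, order + 1):
        basis.append(np.cos(np.pi * k * (2 * n + 1) / (2.0 * nb_scans)))
    # orth keeps only the effective rank, matching the K of the docstring.
    return orth(np.array(basis).T)

drifts_basis = cosine_drift_basis(nb_scans=128, param_lfd=128.0, tr=2.4)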
pyhrf.vbjde.vem_tools.
covariance_matrix
(order, D, dt)¶pyhrf.vbjde.vem_tools.
create_conditions
(onsets, durations, nb_conditions, nb_scans, hrf_len, tr, dt)¶Generate the occurrences matrix.
Parameters: |
|
---|---|
Returns: |
|
pyhrf.vbjde.vem_tools.
create_neighbours
(graph)¶Transforms the graph list into an ndarray. This is done for performance purposes. Sets the empty neighbours to -1.
Parameters: | graph (list of ndarray) – each graph[i] represents the list of neighbours of the ith voxel |
---|---|
Returns: | neighbours_indexes |
Return type: | ndarray |
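A minimal sketch of the padding described above, with a hypothetical three-voxel chain graph:
import numpy as np

graph = [np.array([1]), np.array([0, 2]), np.array([1])]
max_neighbours = max(len(nl) for nl in graph)
neighbours_indexes = -np.ones((len(graph), max_neighbours), dtype=int)
for i, nl in enumerate(graph):
    neighbours_indexes[i, :len(nl)] = nl
# Empty slots keep the -1 sentinel mentioned in the docstring.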
pyhrf.vbjde.vem_tools.
drifts_coeffs_fit
(signal, drift_basis)¶# TODO
Parameters: |
|
---|---|
Returns: | drift_coeffs |
Return type: | ndarray, shape |
pyhrf.vbjde.vem_tools.
expectation_A_asl
(H, G, m_C, W, XX, Gamma, Gamma_X, q_Z, mu_Ma, sigma_Ma, J, y_tilde, Sigma_H, sigma_eps_m)¶Expectation-A step:
Returns: | |
---|---|
Return type: | m_A, Sigma_A of probability distribution p_A of the current iteration |
pyhrf.vbjde.vem_tools.
expectation_A_ms
(m_A, Sigma_A, H, G, m_C, W, XX, Gamma, Gamma_X, q_Z, mu_Ma, sigma_Ma, J, y_tilde, Sigma_H, sigma_eps_m, N, M, D, S)¶Expectation-A step:
Returns: | |
---|---|
Return type: | m_A, Sigma_A of probability distribution p_A of the current iteration |
pyhrf.vbjde.vem_tools.
expectation_C_asl
(G, H, m_A, W, XX, Gamma, Gamma_X, q_Z, mu_Mc, sigma_Mc, J, y_tilde, Sigma_G, sigma_eps_m)¶Expectation-C step:
Returns: | |
---|---|
Return type: | m_C, Sigma_C of probability distribution p_C of the current iteration |
pyhrf.vbjde.vem_tools.
expectation_C_ms
(m_C, Sigma_C, G, H, m_A, W, XX, Gamma, Gamma_X, q_Z, mu_Mc, sigma_Mc, J, y_tilde, Sigma_G, sigma_eps_m, N, M, D, S)¶Expectation-C step:
Returns: | |
---|---|
Return type: | m_C, Sigma_C of probability distribution p_C of the current iteration |
pyhrf.vbjde.vem_tools.
expectation_G_asl
(Sigma_C, m_C, m_A, H, XX, W, WX, Gamma, Gamma_WX, XW_Gamma_WX, J, y_tilde, cov_noise, R_inv, sigmaG, prior_mean_term, prior_cov_term)¶Expectation-G step:
Returns: | |
---|---|
Return type: | m_G, Sigma_G of probability distribution p_G of the current iteration |
pyhrf.vbjde.vem_tools.
expectation_G_ms
(Sigma_C, m_C, m_A, H, XX, W, WX, Gamma, Gamma_WX, XW_Gamma_WX, J, y_tilde, cov_noise, R_inv, sigmaG, prior_mean_term, prior_cov_term, N, M, D, S)¶Expectation-G step:
Returns: | |
---|---|
Return type: | m_G, Sigma_G of probability distribution p_G of the current iteration |
pyhrf.vbjde.vem_tools.
expectation_H_asl
(Sigma_A, m_A, m_C, G, XX, W, Gamma, Gamma_X, X_Gamma_X, J, y_tilde, cov_noise, R_inv, sigmaH, prior_mean_term, prior_cov_term)¶Expectation-H step:
Returns: | |
---|---|
Return type: | m_H, Sigma_H of probability distribution p_H of the current iteration |
pyhrf.vbjde.vem_tools.
expectation_H_ms
(Sigma_A, m_A, m_C, G, XX, W, Gamma, Gamma_X, X_Gamma_X, J, y_tilde, cov_noise, R_inv, sigmaH, prior_mean_term, prior_cov_term, N, M, D, S)¶Expectation-H step:
Returns: | |
---|---|
Return type: | m_H, Sigma_H of probability distribution p_H of the current iteration |
pyhrf.vbjde.vem_tools.
expectation_H_ms_concat
(Sigma_A, m_A, m_C, G, XX, W, Gamma, Gamma_X, X_Gamma_X, J, y_tilde, cov_noise, R_inv, sigmaH, prior_mean_term, prior_cov_term, S)¶Expectation-H step:
Returns: | |
---|---|
Return type: | m_H, Sigma_H of probability distribution p_H of the current iteration |
pyhrf.vbjde.vem_tools.
expectation_Ptilde_Likelihood
(y_tilde, m_A, Sigma_A, H, Sigma_H, m_C, Sigma_C, G, Sigma_G, XX, W, sigma_eps, Gamma, J, D, M, N, Gamma_X, Gamma_WX)¶pyhrf.vbjde.vem_tools.
expectation_Q_asl
(Sigma_A, m_A, Sigma_C, m_C, sigma_Ma, mu_Ma, sigma_Mc, mu_Mc, Beta, p_q_t, p_Q, neighbours_indexes, graph, M, J, K)¶pyhrf.vbjde.vem_tools.
expectation_Q_async_asl
(Sigma_A, m_A, Sigma_C, m_C, sigma_Ma, mu_Ma, sigma_Mc, mu_Mc, Beta, p_q_t, p_Q, neighbours_indexes, graph, M, J, K)¶pyhrf.vbjde.vem_tools.
expectation_Q_ms
(Sigma_A, m_A, Sigma_C, m_C, sigma_Ma, mu_Ma, sigma_Mc, mu_Mc, Beta, p_q_t, p_Q, neighbours_indexes, graph, M, J, K, S)¶pyhrf.vbjde.vem_tools.
expectation_ptilde_hrf
(hrf_mean, hrf_covar, sigma_h, hrf_regu_prior, hrf_regu_prior_inv, hrf_len)¶Expectation with respect to p_tilde hrf.
pyhrf.vbjde.vem_tools.
expectation_ptilde_labels
(labels_proba, neighbours_indexes, beta, nb_conditions, nb_classes)¶Expectation with respect to p_tilde q (or z).
pyhrf.vbjde.vem_tools.
expectation_ptilde_likelihood
(data_drift, nrls_mean, nrls_covar, hrf_mean, hrf_covar, occurence_matrix, noise_var, noise_struct, nb_voxels, nb_scans)¶Expectation with respect to likelihood.
Parameters: |
|
---|---|
Returns: | ptilde_likelihood |
Return type: |
pyhrf.vbjde.vem_tools.
expectation_ptilde_nrls
(labels_proba, nrls_class_mean, nrls_class_var, nrls_mean, nrls_covar)¶Expectation with respect to p_tilde a.
pyhrf.vbjde.vem_tools.
fit_hrf_two_gammas
(hrf_mean, dt, duration)¶Fits the estimated HRF to the standard two gammas model.
Parameters: | |
---|---|
Returns: |
|
pyhrf.vbjde.vem_tools.
free_energy_computation
(nrls_mean, nrls_covar, hrf_mean, hrf_covar, hrf_len, labels_proba, data_drift, occurence_matrix, noise_var, noise_struct, nb_conditions, nb_voxels, nb_scans, nb_classes, nrls_class_mean, nrls_class_var, neighbours_indexes, beta, sigma_h, hrf_regu_prior, hrf_regu_prior_inv, gamma, hrf_hyperprior)¶Compute the free energy functional.
where E_q[·] denotes the expectation with respect to q and H(q) is the entropy of q.
Returns: | free_energy |
---|---|
Return type: | float |
pyhrf.vbjde.vem_tools.
fun
(Beta, p_Q, Qtilde_sumneighbour, neighboursIndexes, gamma)¶function to minimize
pyhrf.vbjde.vem_tools.
grad_fun
(Beta, p_Q, Qtilde_sumneighbour, neighboursIndexes, gamma)¶gradient of the function to minimize
pyhrf.vbjde.vem_tools.
hrf_entropy
(hrf_covar, hrf_len)¶Compute the entropy of the hemodynamic response function. The entropy of a multivariate normal distribution is (1/2) * ln((2*pi*e)^n * |Sigma|), where n is the dimensionality of the vector space and |Sigma| is the determinant of the covariance matrix.
Parameters: |
|
---|---|
Returns: | entropy |
Return type: |
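A minimal numpy sketch of that multivariate normal entropy, computed from the covariance matrix via its log-determinant:
import numpy as np

def gaussian_entropy(covar):
    # Entropy of N(m, covar): 0.5 * log((2*pi*e)**n * det(covar)).
    n = covar.shape[0]
    sign, logdet = np.linalg.slogdet(covar)
    return 0.5 * (n * np.log(2 * np.pi * np.e) + logdet)

entropy = gaussian_entropy(np.eye(4) * 0.5)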
pyhrf.vbjde.vem_tools.
hrf_expectation
(nrls_covar, nrls_mean, occurence_matrix, noise_struct, hrf_regu_prior_inv, sigmaH, nb_voxels, y_tilde, noise_var, prior_mean_term=0.0, prior_cov_term=0.0)¶Computes the VE-H step of the JDE-VEM algorithm.
The update uses the entries of the mean vector and covariance matrix of the current distribution over the neural response levels.
Parameters: |
|
---|---|
Returns: |
|
pyhrf.vbjde.vem_tools.
labels_entropy
(labels_proba)¶Compute the labels entropy.
Parameters: | labels_proba (ndarray, shape (nb_conditions, nb_classes, nb_voxels)) – Probability of each voxel to be in one class |
---|---|
Returns: | entropy |
Return type: | float |
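A hedged sketch of a categorical entropy over such a label-probability array (the clipping against log(0) is an assumption, not necessarily PyHRF's exact guard):
import numpy as np

def categorical_entropy(labels_proba, eps=1e-50):
    # -sum p*log(p), summed over conditions, classes and voxels.
    p = np.clip(labels_proba, eps, 1.0)
    return float(-np.sum(labels_proba * np.log(p)))

q_z = np.full((2, 2, 10), 0.5)  # hypothetical (nb_conditions, nb_classes, nb_voxels)
h = categorical_entropy(q_z)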
pyhrf.vbjde.vem_tools.
labels_expectation
(nrls_covar, nrls_mean, nrls_class_var, nrls_class_mean, beta, labels_proba, neighbours_indexes, nb_conditions, nb_classes, nb_voxels=None, parallel=True, nans_init=False)¶Computes the E-Z (or E-Q) step of the JDE-VEM algorithm.
Using the mean-field approximation, the posterior over the labels is approximated by a factorized density whose factors are updated at each iteration according to a specific scheme, involving a particular configuration of the neighbouring labels and the entries of the NRL covariance matrix.
Notes
The mean-field fixed point equation is defined in:
Celeux, G., Forbes, F., & Peyrard, N. (2003). EM procedures using mean field-like approximations for Markov model-based image segmentation. Pattern Recognition, 36(1), 131–144. https://doi.org/10.1016/S0031-3203(02)00027-4
Parameters: |
|
---|---|
Returns: | labels_proba |
Return type: | ndarray, shape (nb_conditions, nb_classes, nb_voxels) |
pyhrf.vbjde.vem_tools.
maximization_LA_asl
(Y, m_A, m_C, XX, WP, W, WP_Gamma_WP, H, G, Gamma)¶pyhrf.vbjde.vem_tools.
maximization_Mu_asl
(H, G, matrix_covH, matrix_covG, sigmaH, sigmaG, sigmaMu, Omega, R_inv)¶pyhrf.vbjde.vem_tools.
maximization_beta_m2_asl
(beta, p_Q, Qtilde_sumneighbour, Qtilde, neighboursIndexes, maxNeighbours, gamma, MaxItGrad, gradientStep)¶pyhrf.vbjde.vem_tools.
maximization_beta_m2_scipy_asl
(Beta, p_Q, Qtilde_sumneighbour, Qtilde, neighboursIndexes, maxNeighbours, gamma, MaxItGrad, gradientStep)¶Maximize beta
pyhrf.vbjde.vem_tools.
maximization_beta_m4_asl
(beta, p_Q, Qtilde_sumneighbour, Qtilde, neighboursIndexes, maxNeighbours, gamma, MaxItGrad, gradientStep)¶pyhrf.vbjde.vem_tools.
maximization_class_proba
(labels_proba, nrls_mean, nrls_covar)¶Computes the M-(mu, sigma) step of the JDE-VEM algorithm.
pyhrf.vbjde.vem_tools.
maximization_drift_coeffs
(data, nrls_mean, occurence_matrix, hrf_mean, noise_struct, drift_basis)¶Computes the M-(l, Gamma) step of the JDE-VEM algorithm. In the AR(1) case:
pyhrf.vbjde.vem_tools.
maximization_mu_sigma_asl
(q_Z, m_X, Sigma_X)¶pyhrf.vbjde.vem_tools.
maximization_mu_sigma_ms
(q_Z, m_X, Sigma_X, M, J, S, K)¶pyhrf.vbjde.vem_tools.
maximization_noise_var
(occurence_matrix, hrf_mean, hrf_covar, nrls_mean, nrls_covar, noise_struct, data_drift, nb_scans)¶Computes the M-sigma_epsilon step of the JDE-VEM algorithm.
pyhrf.vbjde.vem_tools.
maximization_sigmaH
(D, Sigma_H, R, m_H)¶Computes the M-sigma_h step of the JDE-VEM algorithm.
pyhrf.vbjde.vem_tools.
maximization_sigmaH_prior
(D, Sigma_H, R, m_H, gamma_h)¶Computes the M-sigma_h step of the JDE-VEM algorithm with a prior.
pyhrf.vbjde.vem_tools.
maximization_sigma_asl
(D, Sigma_H, R_inv, m_H, use_hyp, gamma_h)¶pyhrf.vbjde.vem_tools.
maximization_sigma_noise_asl
(XX, m_A, Sigma_A, H, m_C, Sigma_C, G, Sigma_H, Sigma_G, W, y_tilde, Gamma, Gamma_X, Gamma_WX, N)¶Maximization sigma_noise
pyhrf.vbjde.vem_tools.
maximum
(iterable)¶Return the maximum and the index of the maximum of an iterable.
Parameters: | iterable (iterable or numpy array) – |
---|---|
Returns: |
|
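For illustration, the numpy equivalent:
import numpy as np

values = np.array([0.1, 2.5, -1.0])
max_value, max_index = values.max(), int(values.argmax())  # (2.5, 1)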
pyhrf.vbjde.vem_tools.
mult
(v1, v2)¶Multiply two vectors.
The first vector is made vertical and the second one horizontal. The result will be a matrix of size len(v1), len(v2).
Parameters: |
|
---|---|
Returns: | x |
Return type: | ndarray, shape (len(v1), len(v2)) |
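This is effectively an outer product; a minimal numpy equivalent:
import numpy as np

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([10.0, 20.0])
x = np.outer(v1, v2)  # shape (len(v1), len(v2)), as documented for mult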
pyhrf.vbjde.vem_tools.
norm1_constraint
(function, variance)¶Returns the function constrained with optimization strategy.
Parameters: |
|
---|---|
Returns: | optimized_function |
Return type: | numpy array |
Raises: |
|
pyhrf.vbjde.vem_tools.
normpdf
(x, mu, sigma)¶pyhrf.vbjde.vem_tools.
nrls_entropy
(nrls_covar, nb_conditions)¶Compute the entropy of neural response levels. The entropy of a multivariate normal distribution is (1/2) * ln((2*pi*e)^n * |Sigma|), where n is the dimensionality of the vector space and |Sigma| is the determinant of the covariance matrix.
Parameters: |
|
---|---|
Returns: | entropy |
Return type: |
pyhrf.vbjde.vem_tools.
nrls_expectation
(hrf_mean, nrls_mean, occurence_matrix, noise_struct, labels_proba, nrls_class_mean, nrls_class_var, nb_conditions, y_tilde, nrls_covar, hrf_covar, noise_var)¶Computes the VE-A step of the JDE-VEM algorithm.
Parameters: |
|
---|---|
Returns: |
|
pyhrf.vbjde.vem_tools.
plot_convergence
(ni, M, cA, cC, cH, cG, cAH, cCG, SUM_q_Z, mua1, muc1, FE)¶pyhrf.vbjde.vem_tools.
plot_response_functions_it
(ni, NitMin, M, H, G, Mu=None, prior=None)¶pyhrf.vbjde.vem_tools.
polyFit
(signal, tr, order, p)¶pyhrf.vbjde.vem_tools.
poly_drifts_basis
(nb_scans, param_lfd, tr)¶Build polynomial drifts basis.
Parameters: | |
---|---|
Returns: | drifts_basis – K is determined by the scipy.linalg.orth function and corresponds to the effective rank of the matrix it is applied to (see function’s docstring) |
Return type: | ndarray, shape (nb_scans, K) |
pyhrf.vbjde.vem_tools.
ppm_contrasts
(contrasts_mean, contrasts_var, contrasts_class_mean, contrasts_class_var, threshold_a='std_inact', threshold_g=0.95)¶Computes the PPM for the given contrast using either the standard deviation of the “all inactive conditions” class Gaussian (default) or the intersection of the [all inactive conditions] and [all active conditions] class Gaussians as the threshold for PPM_a, and 0.95 (default) for PPM_g. Be careful: this computation considers the mean of the inactive class as zero.
Parameters: |
|
---|---|
Returns: |
|
pyhrf.vbjde.vem_tools.
ppms_computation
(elements_mean, elements_var, class_mean, class_var, threshold_a='std_inact', threshold_g=0.9)¶Considering elements_mean and elements_var from a Gaussian distribution, computes the posterior probability maps using, for the alpha threshold, either the standard deviation of the [all inactive conditions] Gaussian class or the intersection of the [all (in)active conditions] Gaussian classes; and, for the gamma threshold, 0.9 (default).
The posterior probability map (PPM) per experimental condition is computed as the posterior probability that the response level exceeds the effect-size threshold. Note that there are two thresholds to set: the effect-size threshold, which defines the posterior probability map for each condition, and the probability threshold used to assess a certain level of significance. By default, the effect-size threshold for each experimental condition m is chosen as the intersection of the two Gaussian densities of the Gaussian Mixture Model (GMM) that represent active (i=0) and non-active (i=1) voxels for that condition.
Be careful, this computation considers the mean of the inactive class as zero.
Notes
nb_elements refers either to the number of contrasts (for the PPMs contrasts computation) or for the number of conditions (for the PPMs nrls computation).
Parameters: |
|
---|---|
Returns: |
|
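A hedged per-voxel sketch of this Gaussian PPM computation: the posterior probability of exceeding the effect-size threshold is one minus the normal CDF, and the resulting map is then compared to the probability threshold (0.9 by default); choosing the effect-size threshold from the GMM intersection is a separate step:
import numpy as np
from scipy.stats import norm

def ppm_gaussian(post_mean, post_var, effect_threshold):
    # P(a > effect_threshold) for a Gaussian posterior N(post_mean, post_var).
    return 1.0 - norm.cdf(effect_threshold, loc=post_mean, scale=np.sqrt(post_var))

# Hypothetical per-voxel posterior moments for one condition.
ppm = ppm_gaussian(np.array([0.2, 1.5]), np.array([0.25, 0.25]), effect_threshold=1.0)
active = ppm > 0.9  # probability threshold, 0.9 by default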
pyhrf.vbjde.vem_tools.
roc_curve
(dvals, labels, rocN=None, normalize=True)¶Compute ROC curve coordinates and area
returns (FP coordinates, TP coordinates, AUC )
pyhrf.vbjde.vem_tools.
sum_over_neighbours
(neighbours_indexes, array_to_sum)¶Sums the array_to_sum over the neighbours in the graph.
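A hedged sketch of this neighbour-wise sum, reusing the -1 padding convention of create_neighbours:
import numpy as np

def sum_over_neighbours_sketch(neighbours_indexes, values):
    # Append a zero so that the -1 padding index points at a neutral element.
    padded = np.concatenate([values, [0.0]])
    return padded[neighbours_indexes].sum(axis=1)

neigh = np.array([[1, -1], [0, 2], [1, -1]])  # hypothetical padded adjacency
sums = sum_over_neighbours_sketch(neigh, np.array([1.0, 2.0, 3.0]))  # [2., 4., 2.]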
pyhrf.xmliobak.xmlbase.
FuncWrapper
(func, params=None)¶pyhrf.xmliobak.xmlbase.
TypedXMLHandler
(write_callback=None)¶Class handling the xml format with the following generic document structure:
<root>
<tagName 'type'=tagType>
tagContent
</tagName>
</root>
The root tag is mandatory, so is the ‘type’ attribute for every other tag. This class can parse an xml string and build a dictionary of python objects according to each tag (see parseXMLString()). Conversely, it can build the xml string corresponding to a list or dictionary of objects ( see to_xml()). This class is based on the xml.dom python module and relies on the DOM structure to parse XML. XML input/output is handled via a mapping between a type attribute of a tag and a static handling function. This class handles the following basic python types: string, int, float, array.array, list, dict. One can add other type-specific read or write handlers with the functions addDOMTagReader() and addDOMWriter().
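A short round-trip sketch with the module-level to_xml / from_xml helpers documented further below (the exact tag layout is determined by the handler):
from pyhrf.xmliobak.xmlbase import to_xml, from_xml

# Basic python types (string, int, float, list, dict) are supported.
params = {'tr': 2.4, 'conditions': ['audio', 'video'], 'nb_iterations': 100}
xml_string = to_xml(params, objName='analysis_params', pretty=True)
restored = from_xml(xml_string)  # should give back an equivalent dict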
Reading XML:
specific handlers are added with method addDOMTagReader(stype, myReadHandler) which maps the function myReadHandler to the string stype.
a tag reading handler must have the following prototype:
myReadHandler(domTreeWalker):
# interpret and process tag content
# return built python object
, where domTreeWalker is an instance of the _xmlplus.dom.TreeWalker.TreeWalker class.
useful things to use the domTreeWalker and implement a handler:
Writing XML:
handlers are added with method addDOMTagWriter(pythonType, myWriteHandler), where pythonType is of python type ‘type’ and myWriteHandler a function.
a tag writing handler must have the following prototype:
myWriteHandler(domDocument, node, pyObj):
where:
useful things to write handlers :
ATTRIBUTE_LABEL_META
= 'meta'¶ATTRIBUTE_LABEL_PYTHON_CLASS
= 'pythonClass'¶ATTRIBUTE_LABEL_PYTHON_CLASS_INIT_MODE
= 'pythonInitMode'¶ATTRIBUTE_LABEL_PYTHON_FUNCTION
= 'pythonFunction'¶ATTRIBUTE_LABEL_PYTHON_MODULE
= 'pythonModule'¶ATTRIBUTE_LABEL_TYPE
= 'type'¶TYPE_LABEL_ARRAY
= 'array'¶TYPE_LABEL_BOOL
= 'bool'¶TYPE_LABEL_DICT
= 'struct'¶TYPE_LABEL_FLOAT
= 'double'¶TYPE_LABEL_INT
= 'int'¶TYPE_LABEL_KEY_VAL_PAIR
= 'dictItem'¶TYPE_LABEL_LIST
= 'list'¶TYPE_LABEL_NONE
= 'none'¶TYPE_LABEL_ODICT
= 'ordered_struct'¶TYPE_LABEL_STRING
= 'char'¶TYPE_LABEL_TUPLE
= 'tuple'¶TYPE_LABEL_XML_INCLUDE
= 'include'¶arrayDOMWriter
(node, arrayObj, xmlHandler)¶arrayTagDOMReader
(xmlHandler)¶boolDOMWriter
(node, boolObj, xmlHandler)¶boolTagDOMReader
(xmlHandler)¶buildXMLString
(obj, label=None, pretty=False)¶createDocument
()¶dictDOMWriter
(node, dictObj, xmlHandler, atype=None)¶dictTagDOMReader
(xmlHandler, init_class=None)¶floatDOMWriter
(node, floatObj, xmlHandler)¶floatTagDOMReader
(xmlHandler)¶includeTagDOMReader
(xmlHandler)¶inspect_and_append_to_DOM_tree
(doc, node, obj)¶inspectable
(obj)¶intDOMWriter
(node, intObj, xmlHandler)¶intTagDOMReader
(xmlHandler)¶listDOMWriter
(node, listObj, xmlHandler)¶listTagDOMReader
(xmlHandler)¶mountDefaultHandlers
()¶noneDOMWriter
(node, noneObj, xmlHandler)¶noneTagDOMReader
(xmlHandler)¶odictDOMWriter
(node, dictObj, xmlHandler)¶odictTagDOMReader
(xmlHandler)¶packHandlers
()¶parseXMLString
(xml)¶readDOMData
(walker)¶rootTagDOMReader
(walker)¶stringDOMWriter
(node, stringObj, xmlHandler)¶stringTagDOMReader
(xmlHandler)¶tupleDOMWriter
(node, tupleObj, xmlHandler)¶tupleTagDOMReader
(xmlHandler)¶writeDOMData
(doc, node, obj, label, comment=None, meta=None)¶pyhrf.xmliobak.xmlbase.
XMLParamDrivenClass
(parameters=None, xmlHandler=<pyhrf.xmliobak.xmlbase.TypedXMLHandler instance>, xmlLabel=None, xmlComment=None)¶Base “abstract” class to handle parameters with clear specification and default values. Recursive aggregation is available to handle aggregated variables which also require parameter specifications.
appendParametersToDOMTree
(doc, node)¶defaultParameters
= {}¶fetchDefaultParameters
()¶parametersComments
= {}¶parametersMeta
= {}¶parametersToXml
(tagName=None, pretty=False)¶updateParameters
(newp)¶pyhrf.xmliobak.xmlbase.
XMLParamDrivenClassInitException
¶pyhrf.xmliobak.xmlbase.
XMLable
(**kwargs)¶get_parameters_comments
()¶get_parameters_meta
()¶get_parameters_to_show
()¶override_init
(param_name, init_obj, init_params=None)¶override_value
(param_name, value)¶set_init
(init_func, init_params=None)¶pyhrf.xmliobak.xmlbase.
XMLable2
¶Bases: object
check_init_func
(params=None)¶get_parameters_comments
()¶get_parameters_meta
()¶get_parameters_to_show
()¶override_param_init
(init_func, **params)¶TODO (if needed)
set_init
(init_func, **init_params)¶set_init_param
(param_name, param_value)¶pyhrf.xmliobak.xmlbase.
from_xml
(s, handler=<pyhrf.xmliobak.xmlbase.TypedXMLHandler instance>)¶pyhrf.xmliobak.xmlbase.
getargspec
(func)¶pyhrf.xmliobak.xmlbase.
match_init_args
(c, argsDict)¶pyhrf.xmliobak.xmlbase.
read_xml
(fn)¶pyhrf.xmliobak.xmlbase.
to_xml
(o, handler=<pyhrf.xmliobak.xmlbase.TypedXMLHandler instance>, objName='anonymObject', pretty=False)¶pyhrf.xmliobak.xmlbase.
write_xml
(obj, fn)¶pyhrf.xmliobak.xmlmatlab.
MatlabXMLHandler
¶Bases: pyhrf.xmliobak.xmlbase.TypedXMLHandler
TYPE_LABEL_CELL
= 'cell'¶TYPE_LABEL_DOUBLE
= 'double'¶cellDOMWriter
(node, arrayObj, xmlHandler)¶cellTagDOMReader
(xmlHandler)¶doubleDOMWriter
(node, arrayObj, xmlHandler)¶doubleTagDOMReader
(xmlHandler)¶packHandlers
()¶pyhrf.xmliobak.xmlnumpy.
NumpyXMLHandler
(write_callback=None)¶Bases: pyhrf.xmliobak.xmlbase.TypedXMLHandler
NUMPY_ARRAY_TAG_NAME
= 'numpy.ndarray'¶NUMPY_INT16_TAG_NAME
= 'numpy.int16'¶NUMPY_INT32_TAG_NAME
= 'numpy.int32'¶arrayDOMWriter
(node, arrayObj, xmlHandler)¶arrayTagDOMReader
(xmlHandler)¶int16DOMWriter
(node, intObj, xmlHandler)¶int16TagDOMReader
(xmlHandler)¶numpyObjectTagDOMReader
(xmlHandler)¶numpyObjectTagDOMWriter
(node, obj, xmlHandler)¶packHandlers
()¶Loads and allows configuration of PyHRF.
pyhrf.configuration.
ConfigurationError
¶Bases: exceptions.Exception
Exception class for configuration parsing errors.
pyhrf.configuration.
cfg_error_report
(cfg, refcfg)¶pyhrf.configuration.
load_configuration
(filename, refcfg, mode='file_only')¶Load configuration file from ‘filename’ and check it against ‘refcfg’. If mode is ‘file_only’ then only configuration in filename is returned. If mode is ‘update’ then the loaded configuration is updated with ‘refcfg’ to load defaults for unprovided parameters.
pyhrf.configuration.
write_configuration
(cfg_dict, filename, section_order=None)¶pyhrf.core.
Condition
(**kwargs)¶Bases: pyhrf.core.AttrClass
Represents an activation condition
FMRISessionSimulationData(onsets={'audio': array([ 15. , 20.7, 29.7, 35.4, 44.7, 48. , 83.4, 89.7,
108. , 119.4, 135. , 137.7, 146.7, 173.7, 191.7, 236.7,
251.7, 284.4, 293.4, 296.7]), 'video': array([ 0. , 2.4, 8.7, 33. , 39. , 41.7, 56.4, 59.7,
75. , 96. , 122.7, 125.4, 131.4, 140.4, 149.4, 153. ,
156. , 159. , 164.4, 167.7, 176.7, 188.4, 195. , 198. ,
201. , 203.7, 207. , 210. , 218.7, 221.4, 224.7, 234. ,
246. , 248.4, 260.4, 264. , 266.7, 269.7, 278.4, 288. ])}, durations={'audio': array([], dtype=float64), 'video': array([], dtype=float64)}, simulation_file='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/simu.pck')
Bases: pyhrf.xmlio.Initable
FMRISessionSimulationData.
to_dict
()¶FMRISessionSurfacicData(onsets={'audio': array([ 15. , 20.7, 29.7, 35.4, 44.7, 48. , 83.4, 89.7,
108. , 119.4, 135. , 137.7, 146.7, 173.7, 191.7, 236.7,
251.7, 284.4, 293.4, 296.7]), 'video': array([ 0. , 2.4, 8.7, 33. , 39. , 41.7, 56.4, 59.7,
75. , 96. , 122.7, 125.4, 131.4, 140.4, 149.4, 153. ,
156. , 159. , 164.4, 167.7, 176.7, 188.4, 195. , 198. ,
201. , 203.7, 207. , 210. , 218.7, 221.4, 224.7, 234. ,
246. , 248.4, 260.4, 264. , 266.7, 269.7, 278.4, 288. ])}, durations={'audio': array([], dtype=float64), 'video': array([], dtype=float64)}, bold_file='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/real_data_surf_tiny_bold.gii')
Bases: pyhrf.xmlio.Initable
FMRISessionSurfacicData.
to_dict
()¶FMRISessionVolumicData(onsets={'audio': array([ 15. , 20.7, 29.7, 35.4, 44.7, 48. , 83.4, 89.7,
108. , 119.4, 135. , 137.7, 146.7, 173.7, 191.7, 236.7,
251.7, 284.4, 293.4, 296.7]), 'video': array([ 0. , 2.4, 8.7, 33. , 39. , 41.7, 56.4, 59.7,
75. , 96. , 122.7, 125.4, 131.4, 140.4, 149.4, 153. ,
156. , 159. , 164.4, 167.7, 176.7, 188.4, 195. , 198. ,
201. , 203.7, 207. , 210. , 218.7, 221.4, 224.7, 234. ,
246. , 248.4, 260.4, 264. , 266.7, 269.7, 278.4, 288. ])}, durations={'audio': array([], dtype=float64), 'video': array([], dtype=float64)}, bold_file='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/subj0_bold_session0.nii.gz')
Bases: pyhrf.xmlio.Initable
FMRISessionVolumicData.
parametersComments
= {'bold_file': 'Data file containing the 3D+time BOLD signal (nifti format)', 'durations': 'Durations of experimental simtuli in seconds.\nIt has to consistent with the definition of onsets', 'onsets': 'Onsets of experimental simtuli in seconds. \nDictionnary mapping stimulus name to the actual list of onsets.'}¶FMRISessionVolumicData.
to_dict
()¶pyhrf.core.
FmriData
(onsets, bold, tr, sessionsScans, roiMask, graphs=None, stimDurations=None, meta_obj=None, simulation=None, backgroundLabel=0, data_files=None, data_type=None, edge_lengths=None, mask_loaded_from_file=False, extra_data=None)¶Bases: pyhrf.xmlio.Initable
onsets
¶a dictionary mapping a stimulus name to a list of session onsets. Each item of this list is a 1D numpy float array of onsets for a given session.
stimDurations
¶same as ‘onsets’ but stores durations of stimuli
roiMask
¶numpy int array of roi labels (0 stands for the background). Shape depends on the data form (3D volumic or 1D surfacic)
bold
¶either a 4D numpy float array with axes [sag,cor,ax,scan] and then spatial axes must have the same shape as roiMask, or a 2D numpy float array with axes [scan, position] and position axis must have the same length as the number of positions within roiMask (without background). Sessions are stacked in the scan axis
sessionsScans
¶a list of session indexes along scan axis.
tr
¶Time of repetition of the BOLD signal
simulation
¶if not None then it should be a list of simulation instances.
meta_obj
¶extra information associated to data
average
(flag=True)¶build_graphs
(force=False)¶compute_average
()¶discard_rois
(roi_ids)¶discard_small_rois
(min_size)¶from_simu_ui
(sessions_data=None)¶from_simulation_dict
(simulation, mask=None)¶from_surf_files
(paradigm_csv_file='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/paradigm_loc_av.csv', bold_files=None, tr=2.4, mesh_file='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/real_data_surf_tiny_mesh.gii', mask_file=None)¶Return FmriData representation from surf files
from_surf_ui
(sessions_data=None, tr=2.4, mask_file='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/real_data_surf_tiny_parcellation.gii', mesh_file='/home/docs/checkouts/readthedocs.org/user_builds/pyhrf/envs/stable/lib/python2.7/site-packages/pyhrf-0.5.0-py2.7-linux-x86_64.egg/pyhrf/datafiles/real_data_surf_tiny_mesh.gii')¶Convenient creation function intended to be used for XML I/O. ‘session_data’ is a list of FMRISessionVolumicData objects. ‘tr’ is the time of repetition. ‘mask_file’ is a path to a functional mask file.
This represents the following hierarchy:
- FMRIData:
- list of session data:
[ * data for session 1:
- onsets for session 1,
- durations for session 1,
- fmri data file for session 1 (gii)
* data for session 2:
- onsets for session 2,
- durations for session 2,
- fmri data file for session 2 (gii)
],
- time of repetition
- mask file
- mesh file
from_vol_files
¶from_vol_files_rel
¶from_vol_ui
¶getSummary
(long=False)¶get_condition_names
()¶get_data_files
()¶get_extra_data
(label, default)¶get_graph
()¶get_joined_durations
()¶get_joined_onsets
()¶get_nb_rois
()¶Return the number of parcels (background id is discarded)
get_nb_vox_in_mask
()¶get_roi_id
()¶In case of FMRI data containing only one ROI, return the id of this ROI. If data contains several ROIs then raise an exception
get_roi_mask
()¶keep_only_rois
(roiIds)¶parametersComments
= {'mask_file': 'Input n-ary mask file (= parcellation). Only positive integers are allowed. \nAll zeros are treated as background positions.', 'sessions_data': 'List of data definition for all sessions', 'tr': 'repetition time in seconds'}¶parametersToShow
= ['tr', 'sessions_data', 'mask_file']¶roiMask
roi_split
(mask=None)¶save
(output_dir)¶Save paradigm to output_dir/paradigm.csv, BOLD to output_dir/bold.nii, mask to output_dir/mask.nii #TODO: handle multi-session
Return: tuple of file names in this order: (paradigm, bold, mask)
set_extra_data
(label, value)¶store_mask_sparse
(roiMask)¶pyhrf.core.
FmriGroupData
(list_subjects)¶Bases: pyhrf.xmlio.Initable
Used for group-level hemodynamic analysis. Encapsulates FmriData objects for all subjects. All subjects must have the same number of ROIs.
build_graphs
(force=False)¶getSummary
(long=False)¶get_roi_id
()¶roi_split
()¶Retrieve a list of FmriGroupData objects, each containing the data for all subjects in one ROI
pyhrf.core.
get_data_file_name
(filename)¶Return the path of a given filename.
pyhrf.core.
get_roi_simulation
(simu_sessions, mask, roi_id)¶Extract the ROI from the given simulation dict. Parameters: simu (dict) – dictionary of simulated quantities; mask (np.ndarray) – binary mask defining the spatial extent of the ROI; roi_id (int) – the id of the ROI to extract.
Returns: | dict of roi-specific simulation items |
---|
pyhrf.core.
get_src_doc_path
()¶Return the documentation path of pyhrf.
pyhrf.core.
get_src_path
()¶Return the source path of pyhrf.
pyhrf.core.
get_tmp_path
(tag='pyhrf_')¶Return a temporary path.
pyhrf.core.
list_data_file_names
()¶List all the data filenames.
pyhrf.core.
load_surf_bold_mask
(bold_files, mesh_file, mask_file=None)¶pyhrf.core.
load_vol_bold_and_mask
(bold_files, mask_file)¶pyhrf.core.
merge_fmri_sessions
(fmri_data_sets)¶fmri_data_sets: list of FmriData objects. Each FmriData object is assumed to contain only one session
pyhrf.core.
merge_fmri_subjects
(fmri_data_sets, roiMask, backgroundLabel=0)¶fmri_data_sets: list of FmriData objects, for different subjects. In case of multisession data, merging of fmri data over sessions must be done for each subject before using this function. roiMask: multi_subject parcellation (nparray)
pyhrf.glm.
glm_nipy
(fmri_data, contrasts=None, hrf_model='Canonical', drift_model='Cosine', hfcut=128, residuals_model='spherical', fit_method='ols', fir_delays=[0], rescale_results=False, rescale_factor=None)¶Perform a GLM analysis on fMRI data using the implementation of Nipy.
Parameters: |
|
---|---|
Returns: | (glm instance, design matrix, dict of contrasts of objects) |
Examples:
>>> from pyhrf.core import FmriData
>>> from pyhrf.glm import glm_nipy
>>> g, dmtx, con = glm_nipy(FmriData.from_vol_ui())
>>> g, dmtx, con = glm_nipy(FmriData.from_vol_ui(), contrasts={'A-V': 'audio-video'})
pyhrf.glm.
glm_nipy_from_files
(bold_file, tr, paradigm_csv_file, output_dir, mask_file, session=0, contrasts=None, con_test_baseline=0.0, hrf_model='Canonical', drift_model='Cosine', hfcut=128, residuals_model='spherical', fit_method='ols', fir_delays=[0])¶#TODO: handle surface data hrf_model : Canonical | Canonical with Derivative | FIR
Module to handle graphs. Base structure: undirected, unweighted graph, stored as a list of neighbour indexes (list of numpy arrays).
pyhrf.graph.
bfs_set_label
(g, root, data, value, radius)¶pyhrf.graph.
bfs_sub_graph
(g, root, radius)¶pyhrf.graph.
breadth_first_search
(graph, root=0, visitable=None)¶Traverses a graph in breadth-first order.
The first argument should be the tree root; visitable should be an iterable containing all searchable nodes.
pyhrf.graph.
center_mask_at
(mask, pos, indexes, toroidal=False)¶pyhrf.graph.
center_mask_at_v01
(mask, pos, shape)¶pyhrf.graph.
center_mask_at_v02
(mask, pos, shape)¶pyhrf.graph.
connected_components
(g)¶pyhrf.graph.
connected_components_iter
(g)¶pyhrf.graph.
flatten_and_graph
(data, mask=None, kerMask=None, depth=1, toroidal=False)¶pyhrf.graph.
graph_from_lattice
(mask, kerMask=None, depth=1, toroidal=False)¶Creates a graph from an n-dimensional lattice. 'mask' defines the valid positions over which the graph is built. 'kerMask' is a mask in coordinate form (tuple of arrays) which defines the neighbourhood system, i.e. the relative positions of the neighbours of a given position in the lattice.
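A hedged example of what a call can look like (kerMask3D_6n is the 6-neighbourhood kernel mentioned for split_mask_into_cc_iter below; its import location in pyhrf.graph is assumed):
import numpy as np
from pyhrf.graph import graph_from_lattice, kerMask3D_6n

# three voxels in a row inside a 3x3x3 lattice
mask = np.zeros((3, 3, 3), dtype=int)
mask[1, 1, :] = 1
g = graph_from_lattice(mask, kerMask=kerMask3D_6n)
# g[i] holds the neighbour indexes of the i-th in-mask position
print([len(nb) for nb in g])  # expected: [1, 2, 1]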
pyhrf.graph.
graph_from_lattice3D
(mask, kerMask=None, depth=1, toroidal=False)¶Creates a graph from an n-dimensional lattice. 'mask' defines the valid positions over which the graph is built. 'kerMask' is a mask in coordinate form (tuple of arrays) which defines the neighbourhood system, i.e. the relative positions of the neighbours of a given position in the lattice.
pyhrf.graph.
graph_from_mesh
(polygonList)¶Return the list of neighbours indexes for each position, from a list of polygons. Each polygon is a triplet.
pyhrf.graph.
graph_is_sane
(g, toroidal=False)¶Check the structure of graph 'g', which is a list of neighbour indexes. Return True if the check passes, else False:
- every nid in g[id] must verify id in g[nid] (symmetric neighbourhood)
- every neighbour list must have unique elements
- no isolated node
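The invariants above can be restated as a small, independent check (illustrative sketch, not the library's implementation):
def check_graph_invariants(g):
    # g is a list of neighbour index arrays, one per node
    for node, neighbours in enumerate(g):
        if len(neighbours) == 0:                         # no isolated node
            return False
        if len(set(neighbours)) != len(neighbours):      # unique neighbours
            return False
        if any(node not in g[nb] for nb in neighbours):  # symmetric neighbourhood
            return False
    return True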
pyhrf.graph.
graph_nb_cliques
(graph)¶pyhrf.graph.
graph_pool_indexes
(g)¶pyhrf.graph.
graph_pygraph
(g)¶pyhrf.graph.
graph_to_sparse_matrix
(graph)¶Creates a connectivity sparse matrix from the adjacency graph (list of neighbour lists).
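For illustration, a conversion of this kind can be sketched as follows (independent of pyhrf's own implementation, which may use a different sparse format):
import numpy as np
from scipy.sparse import coo_matrix

def adjacency_to_sparse(g):
    # one entry per (node, neighbour) pair of the adjacency lists
    rows = np.concatenate([np.repeat(i, len(nb)) for i, nb in enumerate(g)])
    cols = np.concatenate([np.asarray(nb, dtype=int) for nb in g])
    data = np.ones(rows.shape[0], dtype=np.int8)
    return coo_matrix((data, (rows, cols)), shape=(len(g), len(g)))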
pyhrf.graph.
parcels_to_graphs
(parcellation, kerMask, toKeep=None, toDiscard=None)¶Compute a graph for each parcel of the input parcellation. A graph is simply defined as a list of neighbour indexes.
Returns: a dictionary mapping a roi ID to its graph
pyhrf.graph.
split_mask_into_cc_iter
(mask, min_size=0, kerMask=None)¶Return an iterator over all connected components (CC) within input mask. CC which are smaller than min_size are discarded. kerMask defines the connectivity, e.g., kerMask3D_6n for 6-neighbours in 3D.
Examples
vol = np.array([[1,1,0,1,1],
                [1,1,0,1,1],
                [0,0,0,0,0],
                [1,0,1,1,0],
                [0,0,1,1,0]], dtype=int)
for cc in split_mask_into_cc_iter(vol):
    print cc
Should output:
np.array([[1,1,0,0,0],
          [1,1,0,0,0],
          [0,0,0,0,0],
          [0,0,0,0,0],
          [0,0,0,0,0]])
np.array([[0,0,0,1,1],
          [0,0,0,1,1],
          [0,0,0,0,0],
          [0,0,0,0,0],
          [0,0,0,0,0]])
pyhrf.graph.
sub_graph
(graph, nodes)¶Module to distribute shell commands across a network
Original Author: Mathieu Perrot Extended by: Thomas Vincent
pyhrf.grid.
DispatchedTasksManager
(*args, **kwargs)¶Bases: pyhrf.grid.TasksManager
start
()¶wait_for_end_or_cmd
(print_number, tasks_number, cmd)¶pyhrf.grid.
HierarchicalTasksManager
(*args, **kwargs)¶Bases: pyhrf.grid.DispatchedTasksManager
start
()¶pyhrf.grid.
HostsManager
(list)¶Bases: object
available_status
= 0¶isup
(hostname)¶not_available_status
= 1¶probe
(hostname)¶unknown_host_status
= 2¶unknown_status
= 3¶update_all_hosts
()¶update_host_status
(host, status)¶pyhrf.grid.
OneTaskManager
(*args, **kwargs)¶Bases: pyhrf.grid.TasksManager
abnormal_stop
(task)¶start
()¶pyhrf.grid.
ProbeHost
(hosts_manager, hostname)¶Bases: threading.Thread
run
()¶Method representing the thread’s activity.
You may override this method in a subclass. The standard run() method invokes the callable object passed to the object’s constructor as the target argument, if any, with sequential and keyword arguments taken from the args and kwargs arguments, respectively.
pyhrf.grid.
RepeatedTasksManager
(*args, **kwargs)¶Bases: pyhrf.grid.TasksManager
abnormal_stop
(task)¶start
()¶pyhrf.grid.
Task
(task)¶Bases: object
Only one task that will be computed on only one host.
get
()¶pyhrf.grid.
TaskHierarchical
(rule, tasks_dic)¶Bases: pyhrf.grid.Task
Hierarchical dependencies between TaskList objects.
init
()¶next
()¶pyhrf.grid.
TaskList
(tasks)¶Bases: pyhrf.grid.Task
List of independent tasks. Each one can be computed on a different host.
append
(task)¶next
()¶pyhrf.grid.
TasksManager
(timeslot, user, tasks, hosts_manager, log, brokenfd, time_limit=86400)¶Bases: object
abnormal_stop
(task)¶print_status
(n, size)¶wait_to_be_ready
()¶pyhrf.grid.
TasksStarter
(tasks_manager, host, task, time_limit=86400)¶Bases: threading.Thread
kill
()¶run
()¶Method representing the thread’s activity.
You may override this method in a subclass. The standard run() method invokes the callable object passed to the object’s constructor as the target argument, if any, with sequential and keyword arguments taken from the args and kwargs arguments, respectively.
pyhrf.grid.
TimeSlot
(start, end)¶Bases: object
Define a contiguous timeslot.
start, end: in seconds since the beginning of the day.
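A minimal sketch of how such a slot can be used (the hour_to_seconds helper is ours, not part of pyhrf.grid; is_inside is listed just below):
from pyhrf.grid import TimeSlot

def hour_to_seconds(h, m=0):
    return h * 3600 + m * 60

working_hours = TimeSlot(hour_to_seconds(9), hour_to_seconds(18))
print(working_hours.is_inside(hour_to_seconds(12, 30)))  # True
print(working_hours.is_inside(hour_to_seconds(20)))      # False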
is_inside
(time)¶is_inside_now
()¶pyhrf.grid.
TimeSlotList
(list)¶Bases: pyhrf.grid.TimeSlot
Define non-contiguous timeslots.
list: list of TimeSlot objects.
is_inside
(time)¶pyhrf.grid.
User
(name=None, passwd=None, keytype=None)¶Bases: object
Define the user launching tasks and the identification process.
key
()¶pyhrf.grid.
broken_help
(cmd)¶pyhrf.grid.
create_options
(argv)¶pyhrf.grid.
hosts_help
(cmd)¶pyhrf.grid.
kill_threads
()¶pyhrf.grid.
log_help
(cmd)¶pyhrf.grid.
main
()¶pyhrf.grid.
main_safe
()¶pyhrf.grid.
mode_help
(cmd)¶pyhrf.grid.
parse_options
(parser)¶pyhrf.grid.
quit
(signal, frame)¶pyhrf.grid.
read_hierarchic_tasks
(tasks_file)¶pyhrf.grid.
read_hosts
(hosts)¶pyhrf.grid.
read_tasks
(tasks, mode)¶pyhrf.grid.
read_timeslot
(timeslot)¶pyhrf.grid.
remote_dir_is_writable
(user, hosts, path)¶Test if path is writable from each host in hosts. Bash commands are sent to each host via ssh using the given user login.
pyhrf.grid.
run_grid
(mode, hosts_list, keytype, tasks, timeslot, brokenfile=None, logfile=None, user=None, passwd=None, time_limit=86400)¶pyhrf.grid.
tasks_help
(cmd)¶pyhrf.grid.
timeslot_help
(cmd)¶This module provides classes and functions to handle multi-dimensional numpy array (ndarray) objects and extend them with some semantics (axes labels and axes domains). See the xndarray class. (TODO: make xndarray inherit numpy.ndarray?)
pyhrf.ndarray.
ArrayMappingError
¶Bases: exceptions.Exception
pyhrf.ndarray.
expand_array_in_mask
(flat_data, mask, flat_axis=0, dest=None, m=None)¶Map the flat_axis of flat_data onto the region within mask. flat_data is then reshaped so that flat_axis is replaced with mask.shape.
Notes
m is the result of np.where(mask); it can be passed to speed up the computation if already available.
Examples
>>> a = np.array([1,2,3])
>>> m = np.array([[0,1,0], [0,1,1]] )
>>> expand_array_in_mask(a,m)
array([[0, 1, 0],
[0, 2, 3]])
>>> a = np.array([[1,2,3],[4,5,6]])
>>> m = np.array([[0,1,0], [0,1,1]] )
>>> expand_array_in_mask(a,m,flat_axis=1)
array([[[0, 1, 0],
[0, 2, 3]],
[[0, 4, 0],
[0, 5, 6]]])
pyhrf.ndarray.
merge
(arrays, mask, axis, fill_value=0)¶Merge the given arrays into a single array according to the given mask, with the given axis being mapped to those of mask. Assume that arrays[id] corresponds to mask==id and that all arrays are in the same orientation.
pyhrf.ndarray.
split_and_save
(cub, axes, fn, meta_data=None, set_MRI_orientation=False, output_dir=None, format_dvalues=False)¶pyhrf.ndarray.
stack_cuboids
(c_list, axis, domain=None, axis_pos='first')¶Stack xndarray instances in list c_list along a new axis label axis. If domain (numpy array or list) is provided, it is associated to the new axis. All cuboids in c_list must have the same orientation and domains. axis_pos defines the position of the new axis: either first or last.
Examples
>>> import numpy as np
>>> from pyhrf.ndarray import xndarray, stack_cuboids
>>> c1 = xndarray(np.arange(4*3).reshape(4,3), ['x','y'])
>>> c1
axes: ['x', 'y'], array([[ 0, 1, 2],
[ 3, 4, 5],
[ 6, 7, 8],
[ 9, 10, 11]])
>>> c2 = xndarray(np.arange(4*3).reshape(4,3)*2, ['x','y'])
>>> c2
axes: ['x', 'y'], array([[ 0, 2, 4],
[ 6, 8, 10],
[12, 14, 16],
[18, 20, 22]])
>>> c_stacked = stack_cuboids([c1,c2], 'stack_axis', ['c1','c2'])
>>> print c_stacked.descrip()
* shape : (2, 4, 3)
* dtype : int64
* orientation: ['stack_axis', 'x', 'y']
* value label: value
* axes domains:
'stack_axis': array(['c1', 'c2'],
dtype='|S2')
'x': arange(0,3,1)
'y': arange(0,2,1)
TODO: enable broadcasting (?)
pyhrf.ndarray.
tree_to_xndarray
(tree, level_labels=None)¶Stack all arrays within input tree into a single array.
Returns: xndarray object
Examples
>>> from pyhrf.ndarray import xndarray, tree_to_xndarray
>>> d = {1: {.1: xndarray([1,2], axes_names=['inner_axis']),
...          .2: xndarray([3,4], axes_names=['inner_axis'])},
...      2: {.1: xndarray([1,2], axes_names=['inner_axis']),
...          .2: xndarray([3,4], axes_names=['inner_axis'])}}
>>> tree_to_xndarray(d, ['level_1', 'level_2'])
axes: ['level_1', 'level_2', 'inner_axis'], array([[[1, 2],
[3, 4]],
[[1, 2],
[3, 4]]])
pyhrf.ndarray.
xndarray
(narray, axes_names=None, axes_domains=None, value_label='value', meta_data=None)¶Handles a multidimensional numpy array with axes that are labeled and mapped to domain values.
Examples
>>> c = xndarray( [ [4,5,6],[8,10,12] ], ['time','position'], {'time':[0.1,0.2]} )
Will represent the following situation:
position
------->
4 5 6 | t=0.1 |time
8 10 12 | t=0.2 v
add
(c, dest=None)¶astype
(t)¶cexpand
(cmask, axis, dest=None)¶Same as expand but mask is a cuboid
TODO: + unit test
cflatten
(cmask, new_axis)¶copy
(copy_meta_data=False)¶Return a copy of the current cuboid. Domains are copied with a shallow dictionary copy.
descrip
()¶Return a printable string describing the cuboid.
descrip_shape
()¶divide
(c, dest=None)¶expand
(mask, axis, target_axes=None, target_domains=None, dest=None, do_checks=True, m=None)¶Create a new xndarray instance (or store into an existing dest cuboid) where axis is expanded and values are mapped according to mask.
Examples
>>> import numpy as np
>>> from pyhrf.ndarray import xndarray
>>> c_flat = xndarray(np.arange(2*6).reshape(2,6).astype(np.int64), ['condition', 'voxel'], {'condition' : ['audio','video']})
>>> print c_flat.descrip()
* shape : (2, 6)
* dtype : int64
* orientation: ['condition', 'voxel']
* value label: value
* axes domains:
'condition': array(['audio', 'video'],
dtype='|S5')
'voxel': arange(0,5,1)
>>> mask = np.zeros((4,4,4), dtype=int)
>>> mask[:3,:2,0] = 1
>>> c_expanded = c_flat.expand(mask, 'voxel', ['x','y','z'])
>>> print c_expanded.descrip()
* shape : (2, 4, 4, 4)
* dtype : int64
* orientation: ['condition', 'x', 'y', 'z']
* value label: value
* axes domains:
'condition': array(['audio', 'video'],
dtype='|S5')
'x': arange(0,3,1)
'y': arange(0,3,1)
'z': arange(0,3,1)
explode
(cmask, new_axis='position')¶Explode array according to the given n-ary mask so that axes matching those of mask are flattened into new_axis.
Returns: dict of xndarray that maps a mask value to a xndarray.
explode_a
(mask, axes, new_axis)¶Explode array according to the given n-ary mask so that axes are flattened into new_axis.
Returns: dict of xndarray that maps a mask value to a xndarray.
fill
(c)¶flatten
(mask, axes, new_axis)¶Flatten the cuboid.
TODO: +unit test
get_axes_domains
()¶Return domains associated to axes as a dict (axis_name:domain array)
get_axes_ids
(axes_names)¶Return the index of all axes in given axes_names
get_axis_id
(axis_name)¶Return the id of an axis from the given name.
get_axis_name
(axis_id)¶Return the name of an axis from the given index ‘axis_id’.
get_domain
(axis_id)¶Return the domain of the axis axis_id
Examples
>>> from pyhrf.ndarray import xndarray
>>> c = xndarray(np.random.randn(10,2), axes_names=['x','y'], axes_domains={'y' : ['plop','plip']})
>>> (c.get_domain('y') == np.array(['plop', 'plip'], dtype='|S4')).all()
True
>>> c.get_domain('x') #default domain made of slice indexes
array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
get_domain_idx
(axis, value)¶Get slice index from domain value for axis ‘axis’.
get_dshape
()¶Return the shape of the array as dict mapping an axis name to the corresponding size
get_extra_info
(fmt='dict')¶get_ndims
()¶get_new_axis_name
()¶Return an axis label not already in use. Format is: dim%d
get_orientation
()¶get_voxel_size
(axis)¶Return the size of a voxel along ‘axis’, only if meta data is available.
has_axes
(axes)¶has_axis
(axis)¶len
(axis)¶load
()¶Load cuboid from file. Supported format: nifti1. Extra axis information is retrieved from a nifti extension if available. If it’s not available, label the axes as: (sagittal, coronal, axial[, time]).
TODO: gifti.
map_onto
(xmapping)¶Reshape the array by mapping the axis corresponding to xmapping.value_label onto the shape of xmapping.
Parameters: xmapping (xndarray) – array whose attribute value_label matches an axis of the current array
Returns: a new array (xndarray) where values from the current array have been mapped according to xmapping
Examples
>>> from pyhrf.ndarray import xndarray
>>> import numpy as np
>>> # data with a region axis:
>>> data = xndarray(np.arange(2*4).reshape(2,4).T * .1, ['time', 'region'], {'time':np.arange(4)*.5, 'region':[2, 6]})
>>> data
axes: ['time', 'region'], array([[ 0. , 0.4],
[ 0.1, 0.5],
[ 0.2, 0.6],
[ 0.3, 0.7]])
>>> # 2D spatial mask of regions:
>>> region_map = xndarray(np.array([[2,2,2,6], [6,6,6,0], [6,6,0,0]]), ['x','y'], value_label='region')
>>> # expand region-specific data into region mask
>>> # (duplicate values)
>>> data.map_onto(region_map)
axes: ['x', 'y', 'time'], array([[[ 0. , 0.1, 0.2, 0.3],
[ 0. , 0.1, 0.2, 0.3],
[ 0. , 0.1, 0.2, 0.3],
[ 0.4, 0.5, 0.6, 0.7]],
[[ 0.4, 0.5, 0.6, 0.7],
[ 0.4, 0.5, 0.6, 0.7],
[ 0.4, 0.5, 0.6, 0.7],
[ 0. , 0. , 0. , 0. ]],
[[ 0.4, 0.5, 0.6, 0.7],
[ 0.4, 0.5, 0.6, 0.7],
[ 0. , 0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. ]]])
max
(axis=None)¶mean
(axis=None)¶min
(axis=None)¶multiply
(c, dest=None)¶ptp
(axis=None)¶reorient
(orientation)¶Return a cuboid with new orientation. If cuboid is already in the right orientation, then return the current cuboid. Else, create a new one.
repeat
(n, new_axis, domain=None)¶Return a new cuboid with self’s data repeated ‘n’ times along a new axis labelled ‘new_axis’. Associated ‘domain’ can be provided.
rescale_values
(v_min=0.0, v_max=1.0, axis=None)¶roll
(axis, pos=-1)¶Roll xndarray by making ‘axis’ the last axis. ‘pos’ is either 0 or -1 (first or last, respectively) TODO: handle all pos.
save
(file_name, meta_data=None, set_MRI_orientation=False)¶Save cuboid to a file. Supported format: Nifti1. 'meta_data' should be a 2-element tuple: (affine matrix, Nifti1Header instance). If provided, the meta_data attribute of the cuboid is ignored. All extra axis information is stored as an extension.
set_MRI_orientation
()¶Set orientation to sagittal, coronal, axial, [time|iteration|condition]. Priority for the 4th axis: time > condition > iteration. The remaining axes are sorted in alphabetical order.
set_axis_domain
(axis_id, domain)¶Set the value domain mapped to axis_id as domain
Returns: None
set_orientation
(axes)¶Set the cuboid orientation (inplace) according to input axes labels
split
(axis)¶Split a cuboid along given axis. Return an OrderedDict of cuboids.
squeeze
(axis=None)¶Remove all dims which have length=1. ‘axis’ selects a subset of the single-dimensional axes.
squeeze_all_but
(axes)¶std
(axis=None)¶sub_cuboid
(orientation=None, **kwargs)¶Return a sub cuboid. ‘kwargs’ allows argument in the form: axis=slice_value.
sub_cuboid_from_slices
(orientation=None, **kwargs)¶Return a sub cuboid. ‘kwargs’ allows argument in the form: axis=slice_index.
substract
(c, dest=None)¶sum
(axis=None)¶swapaxes
(a1, a2)¶Swap axes a1 and a2
Returns: A new cuboid wrapping a swapped view of the numpy array
to_html_table
(row_axes, col_axes, inner_axes, cell_format='txt', plot_dir=None, rel_plot_dir=None, plot_fig_prefix='xarray_', plot_style='image', plot_args=None, tooltip=False, border=None)¶Render the array as an html table whose column headers correspond to domain values and axis names defined by col_axes, row headers defined by row_axes and inner cell axes defined by inner_axes Data within a cell can be render as text or as a plot figure (image files are produced)
Returns: html code (str)
to_latex
(row_axes=None, col_axes=None, inner_axes=None, inner_separator=' | ', header_styles=None, hval_fmt=None, val_fmt='%1.2f', col_align=None)¶to_tree
(level_axes, leaf_axes)¶Convert to a nested dictionary where each key is a domain value and each leaf is an array, or a scalar value if leaf_axes is empty.
Returns: an OrderedDict such as {dv_axis1: {dv_axis2: {...: xndarray | scalar}}}
Example:
>>> from pyhrf.ndarray import xndarray
>>> import numpy as np
>>> c = xndarray(np.arange(4).reshape(2,2), axes_names=['a1','ia'], axes_domains={'a1':['out_dv1', 'out_dv2'], 'ia':['in_dv1', 'in_dv2']})
>>> c.to_tree(['a1'], ['ia'])
OrderedDict([('out_dv1', axes: ['ia'], array([0, 1])), ('out_dv2', axes: ['ia'], array([2, 3]))])
unstack
(outer_axes, inner_axes)¶Unstack the array along outer_axes and produce a xndarray of xndarrays
Returns: xndarray object
Example:
>>> from pyhrf.ndarray import xndarray
>>> import numpy as np
>>> c = xndarray(np.arange(4).reshape(2,2), axes_names=['a1','ia'], axes_domains={'a1':['out_dv1', 'out_dv2'], 'ia':['in_dv1', 'in_dv2']})
>>> c.unstack(['a1'], ['ia'])
axes: ['a1'], array([axes: ['ia'], array([0, 1]), axes: ['ia'], array([2, 3])], dtype=object)
var
(axis=None)¶xndarray_like
(data=None)¶Return a new cuboid with axes, domains and value label copied from the current cuboid 'c'. If 'data' is provided then set it as the new cuboid's data, else a zero array like c.data is used.
TODO: test
pyhrf.ndarray.
xndarray_like
(c, data=None)¶pyhrf.paradigm.
Paradigm
(stimOnsets, sessionDurations=None, stimDurations=None)¶delete_condition
(cond)¶from_csv
(csvFile, delim=None)¶Create a Paradigm object from a CSV file whose columns are: session, task name, stimulation onset, stimulation duration, [amplitude].
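A hypothetical CSV following that column order (the header line and the comma delimiter are illustrative assumptions, not a documented requirement):
csv_content = ("session,task,onset,duration,amplitude\n"
               "0,audio,0.0,3.0,1\n"
               "0,audio,24.0,3.0,1\n"
               "0,video,12.0,3.0,1\n"
               "0,video,36.0,3.0,1\n")
with open('paradigm_example.csv', 'w') as f:
    f.write(csv_content)
# paradigm = Paradigm.from_csv('paradigm_example.csv', delim=',')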
from_session_dict
(d, sessionDurations=None)¶from_spm_mat
(spm_mat_file)¶TODO: handle session durations
get_info
(long=True)¶get_joined_and_rastered
(dt)¶get_joined_durations
()¶For each condition, join stimulus durations of all sessions.
get_joined_durations_dim
()¶For each condition, join stimulus durations of all sessions.
get_joined_onsets
()¶For each condition, join onsets of all sessions.
get_joined_onsets_dim
()¶For each condition, join onsets of all sessions.
get_nb_trials
()¶get_rastered
(dt, tMax=None)¶Return binary sequences of stimulus arrivals. Each stimulus event is approximated to the closest time point on the time grid defined by dt, e.g.:
{'cond1': [np.array([0, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1]),
           np.array([0, 1, 1, 1, 0, 0, 1, 0, 1, 0, 0])],
 'cond2': [np.array([0, 0, 0, 1, 0, 0, 1, 1, 1, 0, 0]),
           np.array([1, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0])]}
get_stimulus_names
()¶get_t_max
()¶join_sessions
()¶save_csv
(csvFile)¶save_spm_mat_for_1st_level_glm
(mat_file, session=0)¶to_nipy_paradigm
()¶pyhrf.paradigm.
check_stim_durations
(stim_onsets, stimDurations)¶If no durations are specified (stimDurations is None or an empty np.array) then assume spiked stimuli: return a sequence of zeros with the same shape as the onsets sequence. Otherwise, check that durations have the same shape as onsets.
pyhrf.paradigm.
contrasts_to_spm_vec
(condition_list, contrasts)¶pyhrf.paradigm.
extend_sampled_events
(sampled_events, sampled_durations)¶Add events to encode stimulus duration
pyhrf.paradigm.
merge_onsets
(onsets, new_condition, criterion=None, durations=None, discard=None)¶Convention for definition of onsets or durations.
OrderedDict({
    'condition_name': [<array of timings for sess1>,
                       <array of timings for sess2>,
                       ...],
    ...
})
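Illustrative onsets/durations dictionaries following this convention (values made up), with zero durations standing for spiked stimuli as described for check_stim_durations below:
import numpy as np
from collections import OrderedDict

onsets = OrderedDict([
    ('audio', [np.array([0., 24., 48.]), np.array([6., 30.])]),  # two sessions
    ('video', [np.array([12., 36.]), np.array([18., 42.])]),
])
durations = OrderedDict([(name, [np.zeros_like(o) for o in sess])
                         for name, sess in onsets.items()])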
pyhrf.paradigm.
restarize_events
(events, durations, dt, t_max)¶Build a binary sequence of events. Each event start is approximated to the nearest time point on the time grid defined by dt and t_max.
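The rasterization idea can be sketched independently as follows (not the library code; boundary handling may differ):
import numpy as np

def rasterize(events, dt, t_max):
    # grid points are 0, dt, 2*dt, ..., t_max; mark the closest one per event
    raster = np.zeros(int(np.round(t_max / dt)) + 1, dtype=int)
    for t in events:
        raster[int(np.round(t / dt))] = 1
    return raster

rasterize([0.0, 2.4, 5.1], dt=1.0, t_max=6.0)  # -> array([1, 0, 1, 0, 0, 1, 0])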
pyhrf.parallel.
RemoteException
¶Bases: exceptions.Exception
pyhrf.parallel.
dump_func
(func, fn)¶pyhrf.parallel.
merge_default_kwargs
(func, kwargs)¶pyhrf.parallel.
prepare_treatment_jobs
(treatment, tmp_local_dir, local_result_path, local_user, local_host, remote_host, remote_user, remote_path, label_for_cluster)¶Prepare soma-workflow jobs to perform one treatment (i.e., one subject).
pyhrf.parallel.
remote_map
(func, largs=None, lkwargs=None, mode='serial')¶Execute a function in parallel on a list of arguments.
Returns: a list of results
Raises: RemoteException if any remote task has failed
Example:
>>> from pyhrf.parallel import remote_map
>>> def foo(a, b=2): return a + b
>>> remote_map(foo, [(2,),(3,)], [{'b':5}, {'b':7}])
[7, 10]
pyhrf.parallel.
remote_map_marshal
(func, largs=None, lkwargs=None, mode='local')¶pyhrf.parallel.
run_soma_workflow
(treatments, exec_cmd, tmp_local_dirs, server_id, remote_host, remote_user, remote_pathes, local_result_pathes, label_for_cluster, wait_ending=False)¶Dispatch treatments using soma-workflow.
pyhrf.parallel.
save_treatment
(t, f)¶pyhrf.parcellation.
Ant
(a_id, greed, graph, labels, path_marks, site_marks, pressures, world, verbosity=0)¶Bases: pyhrf.parcellation.Talker
action
(time)¶fix_explosion
()¶to_conquer
()¶to_patrol
()¶pyhrf.parcellation.
Talker
(talker_string_id, verbosity=0)¶verbose
(level, msg)¶verbose_array
(level, array)¶pyhrf.parcellation.
Visit_graph_noeud
(noeud, graphe, Visited=None)¶pyhrf.parcellation.
World
(graph, nb_ants, greed=0.05, time_min=100, time_max=None, tolerance=1, verbosity=0, stop_when_all_controlled=False)¶Bases: pyhrf.parcellation.Talker
balanced
()¶force_end
()¶get_final_labels
()¶resolve
()¶site_taken
(site)¶pyhrf.parcellation.
init_edge_data
(g, init_value=0)¶pyhrf.parcellation.
make_parcellation_cubed_blobs_from_file
(parcellation_file, output_path, roi_ids=None, bg_parcel=0, skip_existing=False)¶pyhrf.parcellation.
make_parcellation_from_files
(betaFiles, maskFile, outFile, nparcels, method, dry=False, spatial_weight=10.0)¶pyhrf.parcellation.
make_parcellation_surf_from_files
(beta_files, mesh_file, parcellation_file, nbparcel, method, mu=10.0, verbose=0)¶pyhrf.parcellation.
parcellate_balanced_vol
(mask, nb_parcels)¶Returns: a 3D array of integers
pyhrf.parcellation.
parcellate_voronoi_vol
(mask, nb_parcels, seeds=None)¶Produce a parcellation from a Voronoi diagram built on random seeds. The number of seeds is equal to the number of parcels. Seeds are randomly placed within the mask, except on edge positions.
Returns: a 3D array of integers
pyhrf.parcellation.
parcellation_dist
(p1, p2, mask=None)¶Compute the distance between the two parcellations p1 and p2 as the minimum number of positions to remove in order to obtain equal partitions. 'mask' may be a binary mask to limit the distance computation to some specific positions. Important convention: parcel label 0 is treated as background and corresponding positions are discarded if no mask is provided.
Returns: (distance value, parcellation overlap)
pyhrf.parcellation.
parcellation_for_jde
(fmri_data, avg_parcel_size=250, output_dir=None, method='gkm', glm_drift='Cosine', glm_hfcut=128)¶method: gkm, ward, ward_and_gkm
pyhrf.parcellation.
parcellation_report
(d)¶pyhrf.parcellation.
parcellation_ward_spatial
(func_data, n_clusters, graph=None)¶Make a parcellation based upon Ward hierarchical clustering from scikit-learn.
Returns: parcellation labels
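For reference, the same technique can be illustrated directly with scikit-learn (an independent sketch, not a call to the pyhrf wrapper):
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.feature_extraction.image import grid_to_graph

func_data = np.random.randn(64, 1)       # one feature per position of an 8x8 grid
connectivity = grid_to_graph(8, 8)       # lattice adjacency as a sparse matrix
ward = AgglomerativeClustering(n_clusters=4, linkage='ward',
                               connectivity=connectivity)
labels = ward.fit_predict(func_data)     # spatially constrained parcel labels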
pyhrf.parcellation.
permutation
(x)¶Randomly permute a sequence, or return a permuted range.
If x is a multi-dimensional array, it is only shuffled along its first index.
Parameters: x (int or array_like) – If x is an integer, randomly permute np.arange(x). If x is an array, make a copy and shuffle the elements randomly.
Returns: out (ndarray) – Permuted sequence or array range.
Examples
>>> np.random.permutation(10)
array([1, 7, 4, 3, 0, 9, 2, 5, 8, 6])
>>> np.random.permutation([1, 4, 9, 12, 15])
array([15, 1, 9, 4, 12])
>>> arr = np.arange(9).reshape((3, 3))
>>> np.random.permutation(arr)
array([[6, 7, 8],
[0, 1, 2],
[3, 4, 5]])
pyhrf.parcellation.
rand
(d0, d1, ..., dn)¶Random values in a given shape.
Create an array of the given shape and populate it with random samples from a uniform distribution over [0, 1).
Parameters: d0, d1, ..., dn (int, optional) – The dimensions of the returned array; should all be positive. If no argument is given a single Python float is returned.
Returns: out (ndarray, shape (d0, d1, ..., dn)) – Random values.
See also
random()
Notes
This is a convenience function. If you want an interface that takes a shape-tuple as the first argument, refer to np.random.random_sample.
Examples
>>> np.random.rand(3,2)
array([[ 0.14022471, 0.96360618], #random
[ 0.37601032, 0.25528411], #random
[ 0.49313049, 0.94909878]]) #random
pyhrf.parcellation.
randint
(low, high=None, size=None, dtype='l')¶Return random integers from low (inclusive) to high (exclusive).
Return random integers from the “discrete uniform” distribution of the specified dtype in the “half-open” interval [low, high). If high is None (the default), then results are from [0, low).
Returns: out (int or ndarray of ints) – size-shaped array of random integers from the appropriate distribution, or a single such random int if size is not provided.
See also
random.random_integers()
Examples
>>> np.random.randint(2, size=10)
array([1, 0, 0, 0, 1, 1, 0, 0, 1, 0])
>>> np.random.randint(1, size=10)
array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
Generate a 2 x 4 array of ints between 0 and 4, inclusive:
>>> np.random.randint(5, size=(2, 4))
array([[4, 0, 2, 1],
[3, 2, 2, 0]])
pyhrf.parcellation.
random_pick
(a)¶pyhrf.parcellation.
round_nb_parcels
(n)¶pyhrf.parcellation.
split_big_parcels
(parcel_file, output_file, max_size=400)¶pyhrf.parcellation.
split_parcel
(labels, graphs, id_parcel, n_parcels, inplace=False, verbosity=0, balance_tolerance='exact')¶balance_tolerance : exact or draft
pyhrf.plot.
autocrop
(img_fn)¶Remove extra background within figure (inplace). Use ImageMagick (convert)
pyhrf.plot.
flip
(img_fn, direction='horizontal')¶Mirror the figure (inplace). Use ImageMagick (convert) ‘horizontal’ direction -> use -flop. ‘vertical’ direction -> use -flip.
pyhrf.plot.
mix_cmap
(img1, cmap1, img2, cmap2, norm1=None, norm2=None, blend_r=0.5)¶pyhrf.plot.
plot_anat_parcel_func_fusion
(anat, func, parcel, parcel_col='white', parcels_line_width=0.5, func_cmap=None, func_norm=None, anat_norm=None, func_mask=None, highlighted_parcels_col=None, highlighted_parcels_line_width=1.5)¶pyhrf.plot.
plot_cub_as_curve
(c, colors=None, plot_kwargs=None, legend_prefix='', show_axis_labels=True, show_legend=False, axes=None, axis_label_fontsize=12)¶Plot a cuboid (ndims <= 2) as curve(s).
Returns: None
pyhrf.plot.
plot_cub_as_image
(c, cmap=None, norm=None, show_axes=True, show_axis_labels=True, show_tick_labels=True, show_colorbar=False, axes=None)¶pyhrf.plot.
plot_func_slice
(func_slice_data, anatomy=None, parcellation=None, parcel_col='white', parcels_line_width=2.5, func_cmap=None, func_norm=None, anat_norm=None, func_mask=None, highlighted_parcels_col=None, highlighted_parcels_line_width=2.5, resolution=None, crop_extension=None, blend=0.5)¶pyhrf.plot.
plot_gaussian_mixture
(values, props=None, color='k', lw=1.75)¶axes of values : (component,class)
pyhrf.plot.
plot_gaussian_pdf
(bins, m, v, prop=None, plotArgs={})¶pyhrf.plot.
plot_palette
(cmap, norm=None, fontsize=None)¶pyhrf.plot.
plot_spm_mip
(img_fn, mip_fn)¶pyhrf.plot.
rotate
(img_fn, angle)¶Rotate figure (inplace). Use ImageMagick (convert)
pyhrf.plot.
set_int_tick_labels
(axis, labels, fontsize=None, rotation=None)¶Redefine labels of visible ticks at integer positions for the given axis.
pyhrf.plot.
set_ticks_fontsize
(fontsize, colbar=None)¶Change the fontsize of the tick labels for the current figure. If colorbar (Colorbar instance) is provided then change the fontsize of its tick labels as well.
pyhrf.plot.
set_xticklabels
(labels, positions=None, rotation=None)¶Set the tick labels of the x-axis in the current figure to labels. If positions is provided then enforce tick positions.
pyhrf.rfir.
RFIREstim
(hrf_nb_coeffs=42, hrf_dt=0.6, drift_type='cosine', stop_crit1=0.0001, stop_crit2=1e-05, nb_its_max=5, nb_iterations=500, nb_its_min=1, average_bold=False, taum=0.01, lambda_reg=100.0, fixed_taum=False, discarded_scan_indexes=None, output_fit=False)¶Bases: pyhrf.xmlio.Initable
Class handling the estimation of HRFs from fMRI data. Analysis is voxel-wise and can be multisession (heteroscedastic noise and session-dependent drift). Several conditions can be analysed simultaneously. One HRF is considered at each voxel.
Compute_INV_R_and_R_and_DET_R
()¶Computes both self.InvR and self.DetR.
Requires:
Compute_onset_matrix3
()¶Computes the onset matrix. Each stimulus onset is considered over a period of LengthOnsets seconds if LengthOnsets > DeltaT, and over a single time step otherwise.
Requires:
where self.X[i][m,n,k] is such that:
CptFctQ
(CptType)¶Computes the function at a given iteration
Notes
It requires:
CptSigma
()¶Computes the Sigma at a given iteration.
self.Sigma[m*SBS:(m+1)*SBS,n*SBS:(n+1)*SBS]] -> (m,n)^th block of Sigma in session i.
EM_solver
(POI)¶Requires: everything in the class is supposed to be initialized.
InitMatrixAndVectors
(POI)¶Initialize to zeros: X, y, P, l, h, InvR, Sigma. Initialize to ones: TauM, rb (<-scalar).
InitStorageMat
()¶Initialization of the matrices that will store all voxel results.
Notes
Input signals must have been read (in ReadRealSignal)
ReadPointOfInterestData
(POI)¶Initialize the parameters for a voxel analysis. The voxel ID is ‘POI’ in ‘ConsideredCoord’ initialized in ‘ReadRealSignal’
Notes
Input signals must have been read (in ReadRealSignal)
StoreRes
(POI)¶Store results computed in the voxel defined in POI.
Notes
The estimation at this voxel must have been performed
buildCosMat
(fctNb, tr, ny)¶Build a cosine low frequency basis in P (adapted from samplerbase.py)
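A classical DCT-like construction of such a basis can be sketched as follows (illustrative only; pyhrf's exact formula and the role of the tr argument may differ):
import numpy as np

def cosine_drift_basis(fct_nb, ny):
    # fct_nb low-frequency regressors over ny scans; column 0 is the constant term
    n = np.arange(ny)
    P = np.ones((ny, fct_nb))
    for k in range(1, fct_nb):
        P[:, k] = np.sqrt(2. / ny) * np.cos(np.pi * (2 * n + 1) * k / (2. * ny))
    return P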
buildLowFreqMat
()¶Build the low frequency basis matrix P.
buildPolyMat
(fctNb, tr, ny)¶Build a polynomial low frequency basis in P (adapted from samplerbase.py)
clean_memory
()¶Clean all objects that are useless for outputs
compute_fit
(POI)¶cpt_XSigmaX
(tempTerm2i, SBS, i)¶default_nb_its
= 500¶default_stop_crit1
= 0.0001¶default_stop_crit2
= 1e-05¶getOutputs
()¶linkToData
(data)¶parametersComments
= {'drift_type': 'Basis type in the drift model. Either "cosine" or "poly"', 'hrf_dt': 'Required HRF temporal resolution', 'hrf_nb_coeffs': 'Number of values in the discrete HRF. Discretization is homogeneous HRF time length is then: nb_hrf_coeffs * hrf_dt '}¶parametersToShow
= ['hrf_nb_coeffs', 'hrf_dt', 'drift_type', 'nb_iterations']¶run
()¶function to launch the analysis
pyhrf.rfir.
init_dict
()¶pyhrf.rfir.
rfir
(func_data, fir_duration=42, fir_dt=0.6, nb_its_max=100, nb_its_min=5, fixed_taum=False, lambda_reg=100.0)¶Fit a Regularized FIR on functional data func_data:
Reference: “Unsupervised robust non-parametric estimation of the hemodynamic response function for any fMRI experiment.” Ciuciu, J.-B. Poline, G. Marrelec, J. Idier, Ch. Pallier, and H. Benali. IEEE Trans. Med. Imag., 22(10):1235-1251, Oct. 2003.
Returns: dict of xndarray instances
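A hedged usage sketch, reusing the FmriData.from_vol_ui default dataset from the glm_nipy example above (nb_its_max is lowered only to keep the run short, so the estimates would not be converged):
from pyhrf.core import FmriData
from pyhrf.rfir import rfir

fdata = FmriData.from_vol_ui()
outputs = rfir(fdata, fir_duration=42, fir_dt=0.6, nb_its_max=10)
print(sorted(outputs.keys()))  # dict of xndarray instances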
pyhrf.surface.
create_projection_kernels
(input_mesh, output_kernel, resolution, geod_decay=5.0, norm_decay=2.0, size=7)¶pyhrf.surface.
extract_sub_mesh
(cor, tri, center_node, radius)¶pyhrf.surface.
extract_sub_mesh_with_files
(input_mesh, center_node, radius, output_mesh=None)¶pyhrf.surface.
mesh_contour
(coords, triangles, labels)¶pyhrf.surface.
mesh_contour_with_files
(input_mesh, input_labels, output_mesh=None, output_labels=None)¶TODO: use nibabel here
pyhrf.surface.
project_fmri
(input_mesh, data_file, output_tex_file, output_kernels_file=None, data_resolution=None, geod_decay=5.0, norm_decay=2.0, kernel_size=7, tex_bin_threshold=None)¶pyhrf.surface.
project_fmri_from_kernels
(input_mesh, kernels_file, fmri_data_file, output_tex, bin_threshold=None)¶pyhrf.xmlio.
DeprecatedXMLFormatException
¶Bases: exceptions.Exception
pyhrf.xmlio.
Initable
¶Bases: object
Abstract class to keep track of how an object is initialised. To do so, it stores the init function and its parameters. The aim is to use it in a user interface or to serialize objects. It also allows adding comments and meta-information on init parameters.
assert_is_initialized
()¶check_init_obj
(params=None)¶Check whether the function used for init can be used in this API: it must allow **kwargs and *args, and all arguments must have a value, either a default one or one specified in the input dict params.
from_ui_node
¶get_arg_for_ui
(a)¶get_arg_from_ui
(a)¶get_init_func
()¶get_parameters_comments
()¶get_parameters_meta
()¶get_parameters_to_show
()¶init_new_obj
()¶Creates a new instance
set_arg_translation
(a, t)¶Set the display name of argument a as t
set_init
(init_obj, **init_params)¶Override the init function with init_obj and use init_params as the new init parameters. init_obj must return an instance of the same class as the current object. Useful when the object is not instantiated via its __init__ function but e.g. via a static method.
set_init_param
(param_name, param_value)¶to_ui_node
(label, parent=None)¶pyhrf.xmlio.
UiNode
(label, parent=None, attributes=None)¶Bases: object
Store data hierarchically to be used in a Qt model/tree view setting. Also store additional node-specific data as attributes (in a dict). Attributes must only contain strings.
The resulting data structure is:
col 0 | col 1
|- <node_label> | <node_attributes> #row 0
|
| col 0 | col 1
|- <child_node_label> | <child_node_attributes> #row 0
|
|...
...
This structure is similar to DOM.
Features:
See static method from_py_object.
add_child
(child)¶child
(row)¶childCount
()¶from_py_object
¶from_xml
¶get_attribute
(attr_name)¶get_children
()¶has_attribute
(attr_name)¶is_leaf_node
()¶label
()¶log
(tabLevel=-1)¶serialize_attributes
()¶set_attribute
(attr_name, attr_value)¶set_label
(label)¶to_xml
(pretty=False)¶Return an XML representation (str) of the Node and its children.
type_info
()¶unserialize_attributes
¶pyhrf.xmlio.
XmlInitable
¶alias of pyhrf.xmlio.Initable
pyhrf.xmlio.
from_xml
(sxml)¶pyhrf.xmlio.
numpy_array_from_string
(s, sdtype, sshape=None)¶pyhrf.xmlio.
protect_xml_attr
(sa)¶pyhrf.xmlio.
read_xml
(fn)¶pyhrf.xmlio.
to_xml
(obj, label='anonym', pretty=False)¶Return an XML representation of the init state of the given object obj.
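A minimal round-trip sketch; the DummyAnalysis class below is hypothetical, and whether Initable.__init__ must be called explicitly is an assumption:
from pyhrf.xmlio import Initable, to_xml, from_xml

class DummyAnalysis(Initable):
    def __init__(self, nb_iterations=100, hrf_dt=0.6):
        Initable.__init__(self)
        self.nb_iterations = nb_iterations
        self.hrf_dt = hrf_dt

obj = DummyAnalysis(nb_iterations=50)
sxml = to_xml(obj, label='dummy_analysis', pretty=True)
obj2 = from_xml(sxml)  # re-instantiated from the stored init parameters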
pyhrf.xmlio.
unprotect_xml_attr
(sa)¶pyhrf.xmlio.
write_xml
(obj, fn)¶