Dataset from CoSMoMVPA
This module provides basic I/O support for datasets in CoSMoMVPA. The current implementation provides (1) loading and saving a CoSMoMVPA dataset struct, which is converted to a PyMVPA Dataset object; and (2) loading a CoSMoMVPA neighborhood struct, which is converted to a CosmoQueryEngine object that inherits from QueryEngineInterface.
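In outline, the two conversions look as follows; this is only a minimal sketch outside the doctest format, and the file names are hypothetical (the full, runnable example is given below):

>> from mvpa2.datasets.cosmo import from_any, map2cosmo
>> ds = from_any('my_cosmo_ds.mat')        # CoSMoMVPA dataset struct -> PyMVPA Dataset
>> qe = from_any('my_cosmo_nbrhood.mat')   # CoSMoMVPA neighborhood struct -> CosmoQueryEngine
>> map2cosmo(ds, 'ds_for_matlab.mat')      # PyMVPA Dataset -> CoSMoMVPA struct saved to disk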
A typical use case is running searchlights on MEEG data. Suppose that in Matlab, using CoSMoMVPA, two structs were created: a dataset struct ds and a neighborhood struct nbrhood. Alternatively, they can be defined in Matlab directly, without using CoSMoMVPA functionality. As a toy example, consider the following Matlab code:
>> ds=struct();
>> ds.samples=[1 2 3; 4 5 6];
>> ds.a.name='input';
>> ds.fa.i=[1 2 3];
>> ds.fa.j=[1 2 2];
>> ds.sa.chunks=[2 2]';
>> ds.sa.targets=[1 2]';
>> ds.sa.labels={'a','b','c','d';'e','f','g','h'};
>> save('ds_tiny.mat','-struct','ds');
>> nbrhood=struct();
>> nbrhood.neighbors={1, [1 3], [1 2 3], [2 2]};
>> nbrhood.fa.k=[4 3 2 1];
>> nbrhood.a.name='output';
>> save('nbrhood_tiny.mat','-struct','nbrhood');
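For comparison, a PyMVPA Dataset with the same samples and attributes can be built directly in Python. The following is only an illustrative sketch of how the struct fields map onto Dataset attributes (the labels attribute is omitted for brevity); it is not needed for the workflow below:

>> import numpy as np
>> from mvpa2.datasets.base import Dataset
>> samples = np.asarray([[1, 2, 3], [4, 5, 6]], dtype=float)
>> sa = dict(chunks=[2, 2], targets=[1, 2])   # corresponds to ds.sa in Matlab
>> fa = dict(i=[1, 2, 3], j=[1, 2, 2])        # corresponds to ds.fa in Matlab
>> a = dict(name='input')                     # corresponds to ds.a in Matlab
>> ds_py = Dataset(samples, sa=sa, fa=fa, a=a)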
The .mat files saved above (ds_tiny.mat and nbrhood_tiny.mat) can be loaded in Python using:
>>> import mvpa2
>>> import os
>>> from mvpa2.datasets.cosmo import from_any, CosmoSearchlight
>>> from mvpa2.mappers.fx import mean_feature
>>> data_path=os.path.join(mvpa2.pymvpa_dataroot,'cosmo')
>>> fn_mat_ds=os.path.join(data_path,'ds_tiny.mat')
>>> fn_mat_nbrhood=os.path.join(data_path,'nbrhood_tiny.mat')
>>> ds=from_any(fn_mat_ds)
>>> print(ds)
<Dataset: 2x3@float64, <sa: chunks,labels,targets>, <fa: i,j>, <a: name>>
>>> qe=from_any(fn_mat_nbrhood)
>>> print(qe)
CosmoQueryEngine(4 center ids (0 .. 3), <fa: k>, <a: name>)
where ds is a Dataset and qe a CosmoQueryEngine.
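The query engine follows the QueryEngineInterface protocol, so after being trained on a dataset it can be queried by feature id. A brief sketch outside the doctest format (it assumes the standard train, ids and query_byid members; the neighbor ids follow from nbrhood.neighbors above, converted to base-0 indexing):

>> qe.train(ds)             # train the query engine on the dataset before querying
>> print(qe.ids)            # the four center ids, 0 .. 3
>> print(qe.query_byid(2))  # feature ids in the neighborhood of center id 2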
A Measure of choice can be used for a searchlight; here the measure simply takes the mean over features in each searchlight:

>>> measure=mean_feature()
A searchlight can then be run using the CosmoQueryEngine:

>>> sl=CosmoSearchlight(measure, qe)
>>> ds_sl=sl(ds)
>>> print(ds_sl)
<Dataset: 2x4@float64, <sa: chunks,labels,targets>, <fa: k>, <a: name>>
Note that the output dataset has its feature attributes and dataset attributes taken from the query engine, not from the input dataset; the sample attributes are kept from the input dataset.
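Taking the mean over features is just a toy measure; in a typical MEEG analysis the measure would be, for example, a cross-validated classifier. The following sketch, outside the doctest format, illustrates this; it assumes a dataset with several chunks and targets, so it would not run on the two-sample toy dataset above:

>> import numpy as np
>> from mvpa2.clfs.svm import LinearCSVMC
>> from mvpa2.generators.partition import NFoldPartitioner
>> from mvpa2.measures.base import CrossValidation
>> # report classification accuracy (rather than error) per fold
>> cv = CrossValidation(LinearCSVMC(), NFoldPartitioner(), errorfx=lambda p, t: np.mean(p == t))
>> sl = CosmoSearchlight(cv, qe)
>> ds_accuracy = sl(ds)   # one accuracy per fold and per neighborhood center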
Alternatively it is possible to run the searchlight directly using the filename of the neighborhood .mat file:
>>> sl=CosmoSearchlight(measure, fn_mat_nbrhood)
>>> ds_sl=sl(ds)
>>> print(ds_sl)
<Dataset: 2x4@float64, <sa: chunks,labels,targets>, <fa: k>, <a: name>>
which gives the same result as above.
Leaving the doctest format here, the result can subsequently be stored from Python using:
>> map2cosmo(ds_sl,'ds_sl.mat')
and loaded in Matlab using:
>> ds_sl=importdata('ds_sl.mat')
so that in Matlab ds_sl is a dataset struct with the output of applying measure to the neighborhoods defined in nbrhood.
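The round trip can also be checked from Python by loading the saved file back; a small sketch, again outside the doctest format:

>> import numpy as np
>> ds_back = from_any('ds_sl.mat')                  # reload the stored searchlight result
>> assert np.all(ds_back.samples == ds_sl.samples)  # samples survive the round trip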
Functions
cosmo_dataset(cosmo) | Construct Dataset from CoSMoMVPA format
from_any(x) | Load CoSMoMVPA dataset or neighborhood
loadmat(file_name[, mdict, appendmat]) | Load MATLAB file
map2cosmo(ds[, filename]) | Convert PyMVPA Dataset to CoSMoMVPA struct saveable by scipy's savemat
savemat(file_name, mdict[, appendmat, ...]) | Save a dictionary of names and arrays into a MATLAB-style .mat file
Classes
ChainMapper(nodes, **kwargs) | Class that amends ChainNode with a mapper-like interface
Collection([items]) | Container of some Collectables
CosmoQueryEngine(mapping[, a, fa]) | QueryEngine for neighborhoods defined in CoSMoMVPA
CosmoSearchlight(datameasure, nbrhood[, ...]) | Implement a standard Searchlight measure, but with a separate ...
Dataset(samples[, sa, fa, a]) | Generic storage class for datasets with multiple attributes
FlattenMapper([shape, maxdims]) | Reshaping mapper that flattens multidimensional arrays into 1D vectors
QueryEngineInterface | Very basic class for QueryEngines defining the interface
Searchlight(datameasure, queryengine[, ...]) | The implementation of a generic searchlight measure
StaticFeatureSelection(slicearg[, dshape, ...]) | Feature selection by static slicing argument