
EEGLAB Importer

The EEGLABImporter class is responsible for importing EMG (and other biopotential) data from EEGLAB .set files.

Class Documentation

emgio.importers.eeglab

BaseImporter

Bases: ABC

Base class for EMG data importers.

Source code in emgio/importers/base.py
class BaseImporter(ABC):
    """Base class for EMG data importers."""

    @abstractmethod
    def load(self, filepath: str) -> EMG:
        """
        Load EMG data from file.

        Args:
            filepath: Path to the input file

        Returns:
            EMG: EMG object containing the loaded data
        """
        pass

load(filepath) abstractmethod

Load EMG data from file.

Args: filepath: Path to the input file

Returns: EMG: EMG object containing the loaded data

Source code in emgio/importers/base.py
@abstractmethod
def load(self, filepath: str) -> EMG:
    """
    Load EMG data from file.

    Args:
        filepath: Path to the input file

    Returns:
        EMG: EMG object containing the loaded data
    """
    pass
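
Custom importers follow this same contract: subclass BaseImporter and implement load() so that it returns a populated EMG object. A minimal, hypothetical sketch (the NpyImporter name and the .npy layout are assumptions for illustration, not part of emgio):

import numpy as np

from emgio import EMG
from emgio.importers.base import BaseImporter


class NpyImporter(BaseImporter):
    """Hypothetical importer for a (n_channels, n_samples) array saved with np.save."""

    def load(self, filepath: str) -> EMG:
        data = np.load(filepath)           # assumed shape: (n_channels, n_samples)
        emg = EMG()
        for i, row in enumerate(data):
            # add_channel builds the time index from the sampling frequency
            emg.add_channel(
                label=f'CH{i + 1}',
                data=row,
                sample_frequency=1000.0,   # assumed sampling rate
                physical_dimension='uV',
                channel_type='EMG'
            )
        return emg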

EEGLABImporter

Bases: BaseImporter

Importer for EEGLAB .set files containing EMG data.

Source code in emgio/importers/eeglab.py
class EEGLABImporter(BaseImporter):
    """Importer for EEGLAB .set files containing EMG data."""

    def _extract_metadata(self, data: Dict[str, Any]) -> Dict[str, Any]:
        """
        Extract metadata from EEGLAB .set file.

        Args:
            data: Dictionary containing EEGLAB .set file data

        Returns:
            dict: Dictionary containing extracted metadata
        """
        metadata = {}

        # Extract basic recording information
        if 'setname' in data:
            metadata['setname'] = str(data['setname'][0]) if data['setname'].size > 0 else ''

        if 'filename' in data:
            metadata['filename'] = str(data['filename'][0]) if data['filename'].size > 0 else ''

        if 'filepath' in data:
            metadata['filepath'] = str(data['filepath'][0]) if data['filepath'].size > 0 else ''

        # Extract subject information
        if 'subject' in data:
            metadata['subject'] = str(data['subject'][0]) if data['subject'].size > 0 else ''

        if 'group' in data:
            metadata['group'] = str(data['group'][0]) if data['group'].size > 0 else ''

        if 'condition' in data:
            metadata['condition'] = str(data['condition'][0]) if data['condition'].size > 0 else ''

        if 'session' in data:
            metadata['session'] = str(data['session'][0]) if data['session'].size > 0 else ''

        if 'comments' in data:
            metadata['comments'] = str(data['comments'][0]) if data['comments'].size > 0 else ''

        # Extract recording parameters
        if 'srate' in data:
            metadata['srate'] = float(data['srate'][0][0]) if data['srate'].size > 0 else 0

        if 'nbchan' in data:
            metadata['nbchan'] = int(data['nbchan'][0][0]) if data['nbchan'].size > 0 else 0

        if 'trials' in data:
            metadata['trials'] = int(data['trials'][0][0]) if data['trials'].size > 0 else 0

        if 'pnts' in data:
            metadata['pnts'] = int(data['pnts'][0][0]) if data['pnts'].size > 0 else 0

        if 'xmin' in data and data['xmin'].size > 0:
            metadata['xmin'] = float(data['xmin'][0][0])

        if 'xmax' in data and data['xmax'].size > 0:
            metadata['xmax'] = float(data['xmax'][0][0])

        # Add device information
        metadata['device'] = 'EEGLAB'

        return metadata

    def _determine_channel_type(self, channel_info: Dict[str, Any]) -> str:
        """
        Determine channel type based on channel information.

        Args:
            channel_info: Dictionary containing channel information

        Returns:
            str: Channel type ('EMG', 'ACC', 'GYRO', etc.)
        """
        # Check if type is explicitly specified
        if 'type' in channel_info and len(channel_info['type']) > 0:
            ch_type = str(channel_info['type'][0])

            # Map EEGLAB channel types to emgio channel types
            if ch_type.upper() == 'EMG':
                return 'EMG'
            elif ch_type.upper() in ['ACC', 'ACCELEROMETER']:
                return 'ACC'
            elif ch_type.upper() in ['GYRO', 'GYROSCOPE']:
                return 'GYRO'
            elif ch_type.upper() in ['TRIG', 'TRIGGER']:
                return 'TRIG'

        # If type is not specified or not recognized, try to determine from label
        if 'labels' in channel_info and channel_info['labels'].size > 0:
            label = str(channel_info['labels'][0])
            label_upper = label.upper()

            if 'EMG' in label_upper:
                return 'EMG'
            elif 'ACC' in label_upper:
                return 'ACC'
            elif 'GYRO' in label_upper:
                return 'GYRO'
            elif 'TRIG' in label_upper:
                return 'TRIG'

        # Default to EMG if we can't determine the type
        return 'EMG'

    def _process_channel_info(self, chanlocs: np.ndarray) -> List[Dict[str, Any]]:
        """
        Process channel location information.

        Args:
            chanlocs: Array containing channel location information

        Returns:
            list: List of dictionaries containing channel information
        """
        channel_info_list = []

        # Process each channel
        for i in range(len(chanlocs[0])):
            channel_info = {}

            # Extract channel fields
            for field in chanlocs.dtype.names:
                # Get the field value for this channel
                field_value = chanlocs[0][i][field]

                # Process based on field name
                if field == 'labels' and field_value.size > 0:
                    channel_info['label'] = str(field_value[0])
                elif field == 'type' and field_value.size > 0:
                    channel_info['type'] = str(field_value[0])
                elif field == 'X' and field_value.size > 0:
                    channel_info['X'] = float(field_value[0])
                elif field == 'Y' and field_value.size > 0:
                    channel_info['Y'] = float(field_value[0])
                elif field == 'Z' and field_value.size > 0:
                    channel_info['Z'] = float(field_value[0])

            # Determine channel type
            channel_info['channel_type'] = self._determine_channel_type(channel_info)

            # Add to list
            channel_info_list.append(channel_info)

        return channel_info_list

    def _process_events(self, events: np.ndarray) -> List[Dict[str, Any]]:
        """
        Process event information.

        Args:
            events: Array containing event information

        Returns:
            list: List of dictionaries containing event information
        """
        event_list = []

        # Check if events exist
        if events.size == 0:
            return event_list

        # Process each event
        for i in range(len(events[0])):
            event_info = {}

            # Extract event fields
            for field in events.dtype.names:
                # Get the field value for this event
                field_value = events[0][i][field]

                # Process based on field name
                if field == 'latency' and field_value.size > 0:
                    event_info['latency'] = float(field_value[0][0])
                elif field == 'type' and field_value.size > 0:
                    event_info['type'] = str(field_value[0])
                elif field == 'duration' and field_value.size > 0:
                    event_info['duration'] = float(field_value[0][0]) if field_value[0].size > 0 else 0
                elif field == 'trial_type' and field_value.size > 0:
                    event_info['trial_type'] = str(field_value[0]) if field_value[0].size > 0 else ''

            # Add to list if it has required fields
            if 'latency' in event_info and 'type' in event_info:
                event_list.append(event_info)

        return event_list

    def load(self, filepath: str) -> EMG:
        """
        Load EMG data from EEGLAB .set file.

        Args:
            filepath: Path to the EEGLAB .set file

        Returns:
            EMG: EMG object containing the loaded data
        """
        try:
            # Load the .set file
            data = loadmat(filepath)

            # Create EMG object
            emg = EMG()

            # Extract and store metadata
            metadata = self._extract_metadata(data)
            for key, value in metadata.items():
                emg.set_metadata(key, value)

            # Store source file information
            emg.set_metadata('source_file', filepath)

            # Process channel information
            if 'chanlocs' in data and data['chanlocs'].size > 0:
                channel_info_list = self._process_channel_info(data['chanlocs'])
            else:
                # If no channel locations, create default channel info
                channel_info_list = []
                for i in range(metadata.get('nbchan', 0)):
                    channel_info_list.append({
                        'label': f'Channel{i+1}',
                        'channel_type': 'EMG'
                    })

            # Process event information
            if 'event' in data and data['event'].size > 0:
                event_list = self._process_events(data['event'])
                emg.set_metadata('events', event_list)

            # Extract signal data
            if 'data' in data and data['data'].size > 0:
                # Get data array
                signal_data = data['data']

                # Get sampling rate
                srate = metadata.get('srate', 1000)

                # Create time array for index
                if 'times' in data and data['times'].size > 0:
                    time_index = data['times'][0] / srate
                else:
                    # Create time array based on number of points and sampling rate
                    pnts = metadata.get('pnts', signal_data.shape[1])
                    time_index = np.arange(pnts) / srate

                # Create DataFrame with time index
                df = pd.DataFrame(index=time_index)

                # Add channels to EMG object
                for i, channel_info in enumerate(channel_info_list):
                    if i < signal_data.shape[0]:  # Make sure we have data for this channel
                        # Get channel data
                        channel_data = signal_data[i, :]

                        # Add to DataFrame
                        channel_label = channel_info.get('label', f'Channel{i+1}')
                        df[channel_label] = channel_data

                # Set signals DataFrame
                emg.signals = df

                # Add channel information
                for i, channel_info in enumerate(channel_info_list):
                    if i < signal_data.shape[0]:  # Make sure we have data for this channel
                        channel_label = channel_info.get('label', f'Channel{i+1}')

                        # Add channel info
                        emg.channels[channel_label] = {
                            'sample_frequency': srate,
                            'physical_dimension': 'uV',  # Default unit for EEG/EMG
                            'prefilter': 'n/a',
                            'channel_type': channel_info.get('channel_type', 'EMG')
                        }

                        # Add additional channel metadata
                        if 'X' in channel_info:
                            emg.channels[channel_label]['X'] = channel_info['X']
                        if 'Y' in channel_info:
                            emg.channels[channel_label]['Y'] = channel_info['Y']
                        if 'Z' in channel_info:
                            emg.channels[channel_label]['Z'] = channel_info['Z']

            return emg

        except Exception as e:
            raise ValueError(f"Error reading EEGLAB .set file: {str(e)}")

load(filepath)

Load EMG data from EEGLAB .set file.

Args: filepath: Path to the EEGLAB .set file

Returns: EMG: EMG object containing the loaded data

Source code in emgio/importers/eeglab.py
def load(self, filepath: str) -> EMG:
    """
    Load EMG data from EEGLAB .set file.

    Args:
        filepath: Path to the EEGLAB .set file

    Returns:
        EMG: EMG object containing the loaded data
    """
    try:
        # Load the .set file
        data = loadmat(filepath)

        # Create EMG object
        emg = EMG()

        # Extract and store metadata
        metadata = self._extract_metadata(data)
        for key, value in metadata.items():
            emg.set_metadata(key, value)

        # Store source file information
        emg.set_metadata('source_file', filepath)

        # Process channel information
        if 'chanlocs' in data and data['chanlocs'].size > 0:
            channel_info_list = self._process_channel_info(data['chanlocs'])
        else:
            # If no channel locations, create default channel info
            channel_info_list = []
            for i in range(metadata.get('nbchan', 0)):
                channel_info_list.append({
                    'label': f'Channel{i+1}',
                    'channel_type': 'EMG'
                })

        # Process event information
        if 'event' in data and data['event'].size > 0:
            event_list = self._process_events(data['event'])
            emg.set_metadata('events', event_list)

        # Extract signal data
        if 'data' in data and data['data'].size > 0:
            # Get data array
            signal_data = data['data']

            # Get sampling rate
            srate = metadata.get('srate', 1000)

            # Create time array for index
            if 'times' in data and data['times'].size > 0:
                time_index = data['times'][0] / srate
            else:
                # Create time array based on number of points and sampling rate
                pnts = metadata.get('pnts', signal_data.shape[1])
                time_index = np.arange(pnts) / srate

            # Create DataFrame with time index
            df = pd.DataFrame(index=time_index)

            # Add channels to EMG object
            for i, channel_info in enumerate(channel_info_list):
                if i < signal_data.shape[0]:  # Make sure we have data for this channel
                    # Get channel data
                    channel_data = signal_data[i, :]

                    # Add to DataFrame
                    channel_label = channel_info.get('label', f'Channel{i+1}')
                    df[channel_label] = channel_data

            # Set signals DataFrame
            emg.signals = df

            # Add channel information
            for i, channel_info in enumerate(channel_info_list):
                if i < signal_data.shape[0]:  # Make sure we have data for this channel
                    channel_label = channel_info.get('label', f'Channel{i+1}')

                    # Add channel info
                    emg.channels[channel_label] = {
                        'sample_frequency': srate,
                        'physical_dimension': 'uV',  # Default unit for EEG/EMG
                        'prefilter': 'n/a',
                        'channel_type': channel_info.get('channel_type', 'EMG')
                    }

                    # Add additional channel metadata
                    if 'X' in channel_info:
                        emg.channels[channel_label]['X'] = channel_info['X']
                    if 'Y' in channel_info:
                        emg.channels[channel_label]['Y'] = channel_info['Y']
                    if 'Z' in channel_info:
                        emg.channels[channel_label]['Z'] = channel_info['Z']

        return emg

    except Exception as e:
        raise ValueError(f"Error reading EEGLAB .set file: {str(e)}")

EMG

Core EMG class for handling EMG data and metadata.

Attributes:

  • signals (pd.DataFrame): Raw signal data with time as index.
  • metadata (dict): Metadata dictionary containing recording information.
  • channels (dict): Channel information including type, unit, sampling frequency.
  • events (pd.DataFrame): Annotations or events associated with the signals, with columns 'onset', 'duration', 'description'.

Source code in emgio/core/emg.py
class EMG:
    """
    Core EMG class for handling EMG data and metadata.

    Attributes:
        signals (pd.DataFrame): Raw signal data with time as index.
        metadata (dict): Metadata dictionary containing recording information.
        channels (dict): Channel information including type, unit, sampling frequency.
        events (pd.DataFrame): Annotations or events associated with the signals,
                               with columns 'onset', 'duration', 'description'.
    """

    def __init__(self):
        """Initialize an empty EMG object."""
        self.signals = None
        self.metadata = {}
        self.channels = {}
        # Initialize events as an empty DataFrame with specified columns
        self.events = pd.DataFrame(columns=['onset', 'duration', 'description'])

    def plot_signals(self, channels=None, time_range=None, offset_scale=0.8,
                    uniform_scale=True, detrend=False, grid=True, title=None,
                    show=True, plt_module=None):
        """
        Plot EMG signals in a single plot with vertical offsets.

        Args:
            channels: List of channels to plot. If None, plot all channels.
            time_range: Tuple of (start_time, end_time) to plot. If None, plot all data.
            offset_scale: Portion of allocated space each signal can use (0.0 to 1.0).
            uniform_scale: Whether to use the same scale for all signals.
            detrend: Whether to remove mean from signals before plotting.
            grid: Whether to show grid lines.
            title: Optional title for the figure.
            show: Whether to display the plot.
            plt_module: Matplotlib pyplot module to use.
        """
        # Delegate to the static plotting function in visualization module
        static_plot_signals(
            emg_object=self,
            channels=channels,
            time_range=time_range,
            offset_scale=offset_scale,
            uniform_scale=uniform_scale,
            detrend=detrend,
            grid=grid,
            title=title,
            show=show,
            plt_module=plt_module
        )

    @classmethod
    def _infer_importer(cls, filepath: str) -> str:
        """
        Infer the importer to use based on the file extension.
        """
        extension = os.path.splitext(filepath)[1].lower()
        if extension in {'.edf', '.bdf'}:
            return 'edf'
        elif extension in {'.set'}:
            return 'eeglab'
        elif extension in {'.otb', '.otb+'}:
            return 'otb'
        elif extension in {'.csv', '.txt'}:
            return 'csv'
        elif extension in {'.hea', '.dat', '.atr'}:
            return 'wfdb'
        else:
            raise ValueError(f"Unsupported file extension: {extension}")

    @classmethod
    def from_file(
            cls,
            filepath: str,
            importer: Literal['trigno', 'otb', 'eeglab', 'edf', 'csv', 'wfdb'] | None = None,
            force_csv: bool = False,
            **kwargs
    ) -> 'EMG':
        """
        The method to create EMG object from file.

        Args:
            filepath: Path to the input file
            importer: Name of the importer to use. Can be one of the following:
                - 'trigno': Delsys Trigno EMG system (CSV)
                - 'otb': OTB/OTB+ EMG system (OTB, OTB+)
                - 'eeglab': EEGLAB .set files (SET)
                - 'edf': EDF/EDF+/BDF/BDF+ format (EDF, BDF)
                - 'csv': Generic CSV (or TXT) files with columnar data
                - 'wfdb': Waveform Database (WFDB)
                If None, the importer will be inferred from the file extension.
                Automatic import is supported for CSV/TXT files.
            force_csv: If True and importer is 'csv', forces using the generic CSV
                      importer even if the file appears to match a specialized format.
            **kwargs: Additional arguments passed to the importer

        Returns:
            EMG: New EMG object with loaded data
        """
        if importer is None:
            importer = cls._infer_importer(filepath)

        importers = {
            'trigno': 'TrignoImporter',  # CSV with Delsys Trigno Headers
            'otb': 'OTBImporter',  # OTB/OTB+ EMG system data
            'edf': 'EDFImporter',  # EDF/EDF+/BDF format
            'eeglab': 'EEGLABImporter',  # EEGLAB .set files
            'csv': 'CSVImporter',  # Generic CSV/Text files
            'wfdb': 'WFDBImporter'  # Waveform Database format
        }

        if importer not in importers:
            raise ValueError(
                f"Unsupported importer: {importer}. "
                f"Available importers: {list(importers.keys())}\n"
                "- trigno: Delsys Trigno EMG system\n"
                "- otb: OTB/OTB+ EMG system\n"
                "- edf: EDF/EDF+/BDF format\n"
                "- eeglab: EEGLAB .set files\n"
                "- csv: Generic CSV/Text files\n"
                "- wfdb: Waveform Database"
            )

        # If using CSV importer and force_csv is set, pass it as force_generic
        if importer == 'csv':
            kwargs['force_generic'] = force_csv

        # Import the appropriate importer class
        importer_module = __import__(
            f'emgio.importers.{importer}',
            globals(),
            locals(),
            [importers[importer]]
        )
        importer_class = getattr(importer_module, importers[importer])

        # Create importer instance and load data
        return importer_class().load(filepath, **kwargs)

    def select_channels(
            self,
            channels: Union[str, List[str], None] = None,
            channel_type: Optional[str] = None,
            inplace: bool = False) -> 'EMG':
        """
        Select specific channels from the data and return a new EMG object.

        Args:
            channels: Channel name or list of channel names to select. If None and
                    channel_type is specified, selects all channels of that type.
            channel_type: Type of channels to select ('EMG', 'ACC', 'GYRO', etc.).
                        If specified with channels, filters the selection to only
                        channels of this type.

        Returns:
            EMG: A new EMG object containing only the selected channels

        Examples:
            # Select specific channels
            new_emg = emg.select_channels(['EMG1', 'ACC1'])

            # Select all EMG channels
            emg_only = emg.select_channels(channel_type='EMG')

            # Select specific EMG channels only, this example does not select ACC channels
            emg_subset = emg.select_channels(['EMG1', 'ACC1'], channel_type='EMG')
        """
        if self.signals is None:
            raise ValueError("No signals loaded")

        # If channel_type specified but no channels, select all of that type
        if channels is None and channel_type is not None:
            channels = [ch for ch, info in self.channels.items()
                        if info['channel_type'] == channel_type]
            if not channels:
                raise ValueError(f"No channels found of type: {channel_type}")
        elif isinstance(channels, str):
            channels = [channels]

        # Validate channels exist
        if not all(ch in self.signals.columns for ch in channels):
            missing = [ch for ch in channels if ch not in self.signals.columns]
            raise ValueError(f"Channels not found: {missing}")

        # Filter by type if specified
        if channel_type is not None:
            channels = [ch for ch in channels
                        if self.channels[ch]['channel_type'] == channel_type]
            if not channels:
                raise ValueError(
                    f"None of the selected channels are of type: {channel_type}")

        # Create new EMG object
        new_emg = EMG()

        # Copy selected signals and channels
        new_emg.signals = self.signals[channels].copy()
        new_emg.channels = {ch: self.channels[ch].copy() for ch in channels}

        # Copy metadata
        new_emg.metadata = self.metadata.copy()

        if not inplace:
            return new_emg
        else:
            self.signals = new_emg.signals
            self.channels = new_emg.channels
            self.metadata = new_emg.metadata
            return self

    def get_channel_types(self) -> List[str]:
        """
        Get list of unique channel types in the data.

        Returns:
            List of channel types (e.g., ['EMG', 'ACC', 'GYRO'])
        """
        return list(set(info['channel_type'] for info in self.channels.values()))

    def get_channels_by_type(self, channel_type: str) -> List[str]:
        """
        Get list of channels of a specific type.

        Args:
            channel_type: Type of channels to get ('EMG', 'ACC', 'GYRO', etc.)

        Returns:
            List of channel names of the specified type
        """
        return [ch for ch, info in self.channels.items()
                if info['channel_type'] == channel_type]

    def to_edf(self, filepath: str, method: str = 'both',
               fft_noise_range: tuple = None, svd_rank: int = None,
               precision_threshold: float = 0.01,
               format: Literal['auto', 'edf', 'bdf'] = 'auto',
               bypass_analysis: bool | None = None,
               verify: bool = False, verify_tolerance: float = 1e-6,
               verify_channel_map: Optional[Dict[str, str]] = None,
               verify_plot: bool = False,
               events_df: Optional[pd.DataFrame] = None,
               **kwargs
               ) -> Union[str, None]:
        """
        Export EMG data to EDF/BDF format, optionally including events.

        Args:
            filepath: Path to save the EDF/BDF file
            method: Method for signal analysis ('svd', 'fft', or 'both')
                'svd': Uses Singular Value Decomposition for noise floor estimation
                'fft': Uses Fast Fourier Transform for noise floor estimation
                'both': Uses both methods and takes the minimum noise floor (default)
            fft_noise_range: Optional tuple (min_freq, max_freq) specifying frequency range for noise in FFT method
            svd_rank: Optional manual rank cutoff for signal/noise separation in SVD method
            precision_threshold: Maximum acceptable precision loss percentage (default: 0.01%)
            format: Format to use ('auto', 'edf', or 'bdf'). Default is 'auto'.
                    If 'edf' or 'bdf' is specified, that format will be used directly.
                    If 'auto', the format (EDF/16-bit or BDF/24-bit) is chosen based
                    on signal analysis to minimize precision loss while preferring EDF
                    if sufficient.
            bypass_analysis: If True, skip signal analysis step when format is explicitly
                             set to 'edf' or 'bdf'. If None (default), analysis is skipped
                             automatically when format is forced. Set to False to force
                             analysis even with a specified format. Ignored if format='auto'.
            verify: If True, reload the exported file and compare signals with the original
                    to check for data integrity loss. Results are printed. (default: False)
            verify_tolerance: Absolute tolerance used when comparing signals during verification. (default: 1e-6)
            verify_channel_map: Optional dictionary mapping original channel names (keys)
                                to reloaded channel names (values) for verification.
                                Used if `verify` is True and channel names might differ.
            verify_plot: If True and verify is True, plots a comparison of original vs reloaded signals.
            events_df: Optional DataFrame with events ('onset', 'duration', 'description').
                      If None, uses self.events. (This provides flexibility)
            **kwargs: Additional arguments for the EDF exporter

        Returns:
            Union[str, None]: If verify is True, returns a string with verification results.
                             Otherwise, returns None.

        Raises:
            ValueError: If no signals are loaded
        """
        from ..exporters.edf import EDFExporter  # Local import

        if self.signals is None:
            raise ValueError("No signals loaded")

        # --- Determine if analysis should be bypassed ---
        final_bypass_analysis = False
        if format.lower() == 'auto':
            if bypass_analysis is True:
                logging.warning("bypass_analysis=True ignored because format='auto'. Analysis is required.")
            # Analysis is always needed for 'auto' format
            final_bypass_analysis = False
        elif format.lower() in ['edf', 'bdf']:
            if bypass_analysis is None:
                # Default behaviour: skip analysis if format is forced
                final_bypass_analysis = True
                msg = (f"Format forced to '{format}'. Skipping signal analysis for faster export. "
                       "Set bypass_analysis=False to force analysis.")
                logging.log(logging.CRITICAL, msg)
            elif bypass_analysis is True:
                final_bypass_analysis = True
                logging.log(logging.CRITICAL, "bypass_analysis=True set. Skipping signal analysis.")
            else:  # bypass_analysis is False
                final_bypass_analysis = False
                logging.info(f"Format forced to '{format}' but bypass_analysis=False. Performing signal analysis.")
        else:
            # Should not happen if Literal type hint works, but good practice
            logging.warning(f"Unknown format '{format}'. Defaulting to 'auto' behavior (analysis enabled).")
            format = 'auto'
            final_bypass_analysis = False

        # Determine which events DataFrame to use
        if events_df is None:
            events_to_export = self.events
        else:
            events_to_export = events_df

        # Combine parameters
        all_params = {
            'precision_threshold': precision_threshold,
            'method': method,
            'fft_noise_range': fft_noise_range,
            'svd_rank': svd_rank,
            'format': format,
            'bypass_analysis': final_bypass_analysis,
            'events_df': events_to_export,  # Pass the events dataframe
            **kwargs
        }

        EDFExporter.export(self, filepath, **all_params)

        verification_report_dict = None
        if verify:
            logging.info(f"Verification requested. Reloading exported file: {filepath}")
            try:
                # Reload the exported file
                reloaded_emg = EMG.from_file(filepath, importer='edf')

                logging.info("Comparing original signals with reloaded signals...")
                # Compare signals using the imported function
                verification_results = compare_signals(
                    self,
                    reloaded_emg,
                    tolerance=verify_tolerance,
                    channel_map=verify_channel_map
                )

                # Generate and log report using the imported function
                report_verification_results(verification_results, verify_tolerance)
                verification_report_dict = verification_results

                # Plot comparison using imported function if requested
                summary = verification_results.get('channel_summary', {})
                comparison_mode = summary.get('comparison_mode', 'unknown')
                compared_count = sum(1 for k in verification_results if k != 'channel_summary')

                if verify_plot and compared_count > 0 and comparison_mode != 'failed':
                    plot_comparison(self, reloaded_emg, channel_map=verify_channel_map)
                elif verify_plot:
                    logging.warning("Skipping verification plot: No channels were successfully compared.")

            except Exception as e:
                logging.error(f"Verification failed during reload or comparison: {e}")
                verification_report_dict = {
                    'error': str(e),
                    'channel_summary': {'comparison_mode': 'failed'}
                }

        return verification_report_dict

    def set_metadata(self, key: str, value: any) -> None:
        """
        Set metadata value.

        Args:
            key: Metadata key
            value: Metadata value
        """
        self.metadata[key] = value

    def get_metadata(self, key: str) -> any:
        """
        Get metadata value.

        Args:
            key: Metadata key

        Returns:
            Value associated with the key
        """
        return self.metadata.get(key)

    def add_channel(
            self, label: str, data: np.ndarray, sample_frequency: float,
            physical_dimension: str, prefilter: str = 'n/a', channel_type: str = 'EMG') -> None:
        """
        Add a new channel to the EMG data.

        Args:
            label: Channel label or name (as per EDF specification)
            data: Channel data
            sample_frequency: Sampling frequency in Hz (as per EDF specification)
            physical_dimension: Physical dimension/unit of measurement (as per EDF specification)
            prefilter: Pre-filtering applied to the channel
            channel_type: Channel type ('EMG', 'ACC', 'GYRO', etc.)
        """
        if self.signals is None:
            # Create DataFrame with time index
            time = np.arange(len(data)) / sample_frequency
            self.signals = pd.DataFrame(index=time)

        self.signals[label] = data
        self.channels[label] = {
            'sample_frequency': sample_frequency,
            'physical_dimension': physical_dimension,
            'prefilter': prefilter,
            'channel_type': channel_type
        }

    def add_event(self, onset: float, duration: float, description: str) -> None:
        """
        Add an event/annotation to the EMG object.

        Args:
            onset: Event onset time in seconds.
            duration: Event duration in seconds.
            description: Event description string.
        """
        new_event = pd.DataFrame([{'onset': onset, 'duration': duration, 'description': description}])
        # Use pd.concat for appending, ignore_index=True resets the index
        self.events = pd.concat([self.events, new_event], ignore_index=True)
        # Sort events by onset time for consistency
        self.events.sort_values(by='onset', inplace=True)
        self.events.reset_index(drop=True, inplace=True)

__init__()

Source code in emgio/core/emg.py
def __init__(self):
    """Initialize an empty EMG object."""
    self.signals = None
    self.metadata = {}
    self.channels = {}
    # Initialize events as an empty DataFrame with specified columns
    self.events = pd.DataFrame(columns=['onset', 'duration', 'description'])

add_channel(label, data, sample_frequency, physical_dimension, prefilter='n/a', channel_type='EMG')

Add a new channel to the EMG data.

Args:

  • label: Channel label or name (as per EDF specification)
  • data: Channel data
  • sample_frequency: Sampling frequency in Hz (as per EDF specification)
  • physical_dimension: Physical dimension/unit of measurement (as per EDF specification)
  • prefilter: Pre-filtering applied to the channel
  • channel_type: Channel type ('EMG', 'ACC', 'GYRO', etc.)

Source code in emgio/core/emg.py
def add_channel(
        self, label: str, data: np.ndarray, sample_frequency: float,
        physical_dimension: str, prefilter: str = 'n/a', channel_type: str = 'EMG') -> None:
    """
    Add a new channel to the EMG data.

    Args:
        label: Channel label or name (as per EDF specification)
        data: Channel data
        sample_frequency: Sampling frequency in Hz (as per EDF specification)
        physical_dimension: Physical dimension/unit of measurement (as per EDF specification)
        prefilter: Pre-filtering applied to the channel
        channel_type: Channel type ('EMG', 'ACC', 'GYRO', etc.)
    """
    if self.signals is None:
        # Create DataFrame with time index
        time = np.arange(len(data)) / sample_frequency
        self.signals = pd.DataFrame(index=time)

    self.signals[label] = data
    self.channels[label] = {
        'sample_frequency': sample_frequency,
        'physical_dimension': physical_dimension,
        'prefilter': prefilter,
        'channel_type': channel_type
    }
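
For instance, a channel can be added to an empty EMG object as follows; the sine wave below is only a stand-in signal for illustration:

import numpy as np
from emgio import EMG

emg = EMG()
# Two seconds of data sampled at 1 kHz
t = np.arange(2000) / 1000.0
emg.add_channel(
    label='EMG1',
    data=np.sin(2 * np.pi * 50 * t),
    sample_frequency=1000.0,
    physical_dimension='uV',
    channel_type='EMG'
)
print(emg.signals.head())
print(emg.channels['EMG1'])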

add_event(onset, duration, description)

Add an event/annotation to the EMG object.

Args:

  • onset: Event onset time in seconds.
  • duration: Event duration in seconds.
  • description: Event description string.

Source code in emgio/core/emg.py
def add_event(self, onset: float, duration: float, description: str) -> None:
    """
    Add an event/annotation to the EMG object.

    Args:
        onset: Event onset time in seconds.
        duration: Event duration in seconds.
        description: Event description string.
    """
    new_event = pd.DataFrame([{'onset': onset, 'duration': duration, 'description': description}])
    # Use pd.concat for appending, ignore_index=True resets the index
    self.events = pd.concat([self.events, new_event], ignore_index=True)
    # Sort events by onset time for consistency
    self.events.sort_values(by='onset', inplace=True)
    self.events.reset_index(drop=True, inplace=True)
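
A short usage sketch (onset and duration values are illustrative):

from emgio import EMG

emg = EMG()
emg.add_event(onset=1.5, duration=0.0, description='stimulus')
emg.add_event(onset=0.5, duration=2.0, description='contraction')
# Events are kept sorted by onset time
print(emg.events)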

from_file(filepath, importer=None, force_csv=False, **kwargs) classmethod

The method to create EMG object from file.

Args:

  • filepath: Path to the input file
  • importer: Name of the importer to use. Can be one of the following:
      - 'trigno': Delsys Trigno EMG system (CSV)
      - 'otb': OTB/OTB+ EMG system (OTB, OTB+)
      - 'eeglab': EEGLAB .set files (SET)
      - 'edf': EDF/EDF+/BDF/BDF+ format (EDF, BDF)
      - 'csv': Generic CSV (or TXT) files with columnar data
      - 'wfdb': Waveform Database (WFDB)
    If None, the importer will be inferred from the file extension. Automatic import is supported for CSV/TXT files.
  • force_csv: If True and importer is 'csv', forces using the generic CSV importer even if the file appears to match a specialized format.
  • **kwargs: Additional arguments passed to the importer

Returns: EMG: New EMG object with loaded data

Source code in emgio/core/emg.py
@classmethod
def from_file(
        cls,
        filepath: str,
        importer: Literal['trigno', 'otb', 'eeglab', 'edf', 'csv', 'wfdb'] | None = None,
        force_csv: bool = False,
        **kwargs
) -> 'EMG':
    """
    The method to create EMG object from file.

    Args:
        filepath: Path to the input file
        importer: Name of the importer to use. Can be one of the following:
            - 'trigno': Delsys Trigno EMG system (CSV)
            - 'otb': OTB/OTB+ EMG system (OTB, OTB+)
            - 'eeglab': EEGLAB .set files (SET)
            - 'edf': EDF/EDF+/BDF/BDF+ format (EDF, BDF)
            - 'csv': Generic CSV (or TXT) files with columnar data
            - 'wfdb': Waveform Database (WFDB)
            If None, the importer will be inferred from the file extension.
            Automatic import is supported for CSV/TXT files.
        force_csv: If True and importer is 'csv', forces using the generic CSV
                  importer even if the file appears to match a specialized format.
        **kwargs: Additional arguments passed to the importer

    Returns:
        EMG: New EMG object with loaded data
    """
    if importer is None:
        importer = cls._infer_importer(filepath)

    importers = {
        'trigno': 'TrignoImporter',  # CSV with Delsys Trigno Headers
        'otb': 'OTBImporter',  # OTB/OTB+ EMG system data
        'edf': 'EDFImporter',  # EDF/EDF+/BDF format
        'eeglab': 'EEGLABImporter',  # EEGLAB .set files
        'csv': 'CSVImporter',  # Generic CSV/Text files
        'wfdb': 'WFDBImporter'  # Waveform Database format
    }

    if importer not in importers:
        raise ValueError(
            f"Unsupported importer: {importer}. "
            f"Available importers: {list(importers.keys())}\n"
            "- trigno: Delsys Trigno EMG system\n"
            "- otb: OTB/OTB+ EMG system\n"
            "- edf: EDF/EDF+/BDF format\n"
            "- eeglab: EEGLAB .set files\n"
            "- csv: Generic CSV/Text files\n"
            "- wfdb: Waveform Database"
        )

    # If using CSV importer and force_csv is set, pass it as force_generic
    if importer == 'csv':
        kwargs['force_generic'] = force_csv

    # Import the appropriate importer class
    importer_module = __import__(
        f'emgio.importers.{importer}',
        globals(),
        locals(),
        [importers[importer]]
    )
    importer_class = getattr(importer_module, importers[importer])

    # Create importer instance and load data
    return importer_class().load(filepath, **kwargs)
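
When importer is None, the file extension selects the importer, as sketched below (file names are placeholders):

from emgio import EMG

# '.set' is inferred as 'eeglab', '.edf'/'.bdf' as 'edf', '.csv'/'.txt' as 'csv'
emg_set = EMG.from_file('recording.set')
emg_edf = EMG.from_file('recording.edf')

# The importer can also be forced, e.g. generic CSV parsing for a text file
emg_txt = EMG.from_file('recording.txt', importer='csv', force_csv=True)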

get_channel_types()

Get list of unique channel types in the data.

Returns: List of channel types (e.g., ['EMG', 'ACC', 'GYRO'])

Source code in emgio/core/emg.py
def get_channel_types(self) -> List[str]:
    """
    Get list of unique channel types in the data.

    Returns:
        List of channel types (e.g., ['EMG', 'ACC', 'GYRO'])
    """
    return list(set(info['channel_type'] for info in self.channels.values()))

get_channels_by_type(channel_type)

Get list of channels of a specific type.

Args: channel_type: Type of channels to get ('EMG', 'ACC', 'GYRO', etc.)

Returns: List of channel names of the specified type

Source code in emgio/core/emg.py
def get_channels_by_type(self, channel_type: str) -> List[str]:
    """
    Get list of channels of a specific type.

    Args:
        channel_type: Type of channels to get ('EMG', 'ACC', 'GYRO', etc.)

    Returns:
        List of channel names of the specified type
    """
    return [ch for ch, info in self.channels.items()
            if info['channel_type'] == channel_type]

get_metadata(key)

Get metadata value.

Args: key: Metadata key

Returns: Value associated with the key

Source code in emgio/core/emg.py
def get_metadata(self, key: str) -> any:
    """
    Get metadata value.

    Args:
        key: Metadata key

    Returns:
        Value associated with the key
    """
    return self.metadata.get(key)

plot_signals(channels=None, time_range=None, offset_scale=0.8, uniform_scale=True, detrend=False, grid=True, title=None, show=True, plt_module=None)

Plot EMG signals in a single plot with vertical offsets.

Args:

  • channels: List of channels to plot. If None, plot all channels.
  • time_range: Tuple of (start_time, end_time) to plot. If None, plot all data.
  • offset_scale: Portion of allocated space each signal can use (0.0 to 1.0).
  • uniform_scale: Whether to use the same scale for all signals.
  • detrend: Whether to remove mean from signals before plotting.
  • grid: Whether to show grid lines.
  • title: Optional title for the figure.
  • show: Whether to display the plot.
  • plt_module: Matplotlib pyplot module to use.

Source code in emgio/core/emg.py
def plot_signals(self, channels=None, time_range=None, offset_scale=0.8,
                uniform_scale=True, detrend=False, grid=True, title=None,
                show=True, plt_module=None):
    """
    Plot EMG signals in a single plot with vertical offsets.

    Args:
        channels: List of channels to plot. If None, plot all channels.
        time_range: Tuple of (start_time, end_time) to plot. If None, plot all data.
        offset_scale: Portion of allocated space each signal can use (0.0 to 1.0).
        uniform_scale: Whether to use the same scale for all signals.
        detrend: Whether to remove mean from signals before plotting.
        grid: Whether to show grid lines.
        title: Optional title for the figure.
        show: Whether to display the plot.
        plt_module: Matplotlib pyplot module to use.
    """
    # Delegate to the static plotting function in visualization module
    static_plot_signals(
        emg_object=self,
        channels=channels,
        time_range=time_range,
        offset_scale=offset_scale,
        uniform_scale=uniform_scale,
        detrend=detrend,
        grid=grid,
        title=title,
        show=show,
        plt_module=plt_module
    )
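
A minimal plotting sketch (the file name and channel labels are placeholders and must exist in the loaded data):

from emgio import EMG

emg = EMG.from_file('data.set', importer='eeglab')
# Plot two channels over the first five seconds with the mean removed
emg.plot_signals(
    channels=['EMG1', 'EMG2'],
    time_range=(0, 5),
    detrend=True,
    title='EMG overview'
)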

select_channels(channels=None, channel_type=None, inplace=False)

Select specific channels from the data and return a new EMG object.

Args:

  • channels: Channel name or list of channel names to select. If None and channel_type is specified, selects all channels of that type.
  • channel_type: Type of channels to select ('EMG', 'ACC', 'GYRO', etc.). If specified with channels, filters the selection to only channels of this type.

Returns: EMG: A new EMG object containing only the selected channels

Examples:

    # Select specific channels
    new_emg = emg.select_channels(['EMG1', 'ACC1'])

    # Select all EMG channels
    emg_only = emg.select_channels(channel_type='EMG')

    # Select specific EMG channels only, this example does not select ACC channels
    emg_subset = emg.select_channels(['EMG1', 'ACC1'], channel_type='EMG')

Source code in emgio/core/emg.py
def select_channels(
        self,
        channels: Union[str, List[str], None] = None,
        channel_type: Optional[str] = None,
        inplace: bool = False) -> 'EMG':
    """
    Select specific channels from the data and return a new EMG object.

    Args:
        channels: Channel name or list of channel names to select. If None and
                channel_type is specified, selects all channels of that type.
        channel_type: Type of channels to select ('EMG', 'ACC', 'GYRO', etc.).
                    If specified with channels, filters the selection to only
                    channels of this type.

    Returns:
        EMG: A new EMG object containing only the selected channels

    Examples:
        # Select specific channels
        new_emg = emg.select_channels(['EMG1', 'ACC1'])

        # Select all EMG channels
        emg_only = emg.select_channels(channel_type='EMG')

        # Select specific EMG channels only, this example does not select ACC channels
        emg_subset = emg.select_channels(['EMG1', 'ACC1'], channel_type='EMG')
    """
    if self.signals is None:
        raise ValueError("No signals loaded")

    # If channel_type specified but no channels, select all of that type
    if channels is None and channel_type is not None:
        channels = [ch for ch, info in self.channels.items()
                    if info['channel_type'] == channel_type]
        if not channels:
            raise ValueError(f"No channels found of type: {channel_type}")
    elif isinstance(channels, str):
        channels = [channels]

    # Validate channels exist
    if not all(ch in self.signals.columns for ch in channels):
        missing = [ch for ch in channels if ch not in self.signals.columns]
        raise ValueError(f"Channels not found: {missing}")

    # Filter by type if specified
    if channel_type is not None:
        channels = [ch for ch in channels
                    if self.channels[ch]['channel_type'] == channel_type]
        if not channels:
            raise ValueError(
                f"None of the selected channels are of type: {channel_type}")

    # Create new EMG object
    new_emg = EMG()

    # Copy selected signals and channels
    new_emg.signals = self.signals[channels].copy()
    new_emg.channels = {ch: self.channels[ch].copy() for ch in channels}

    # Copy metadata
    new_emg.metadata = self.metadata.copy()

    if not inplace:
        return new_emg
    else:
        self.signals = new_emg.signals
        self.channels = new_emg.channels
        self.metadata = new_emg.metadata
        return self

set_metadata(key, value)

Set metadata value.

Args:

  • key: Metadata key
  • value: Metadata value

Source code in emgio/core/emg.py
def set_metadata(self, key: str, value: any) -> None:
    """
    Set metadata value.

    Args:
        key: Metadata key
        value: Metadata value
    """
    self.metadata[key] = value

to_edf(filepath, method='both', fft_noise_range=None, svd_rank=None, precision_threshold=0.01, format='auto', bypass_analysis=None, verify=False, verify_tolerance=1e-06, verify_channel_map=None, verify_plot=False, events_df=None, **kwargs)

Export EMG data to EDF/BDF format, optionally including events.

Args:

  • filepath: Path to save the EDF/BDF file
  • method: Method for signal analysis ('svd', 'fft', or 'both')
      - 'svd': Uses Singular Value Decomposition for noise floor estimation
      - 'fft': Uses Fast Fourier Transform for noise floor estimation
      - 'both': Uses both methods and takes the minimum noise floor (default)
  • fft_noise_range: Optional tuple (min_freq, max_freq) specifying frequency range for noise in FFT method
  • svd_rank: Optional manual rank cutoff for signal/noise separation in SVD method
  • precision_threshold: Maximum acceptable precision loss percentage (default: 0.01%)
  • format: Format to use ('auto', 'edf', or 'bdf'). Default is 'auto'. If 'edf' or 'bdf' is specified, that format will be used directly. If 'auto', the format (EDF/16-bit or BDF/24-bit) is chosen based on signal analysis to minimize precision loss while preferring EDF if sufficient.
  • bypass_analysis: If True, skip signal analysis step when format is explicitly set to 'edf' or 'bdf'. If None (default), analysis is skipped automatically when format is forced. Set to False to force analysis even with a specified format. Ignored if format='auto'.
  • verify: If True, reload the exported file and compare signals with the original to check for data integrity loss. Results are printed. (default: False)
  • verify_tolerance: Absolute tolerance used when comparing signals during verification. (default: 1e-6)
  • verify_channel_map: Optional dictionary mapping original channel names (keys) to reloaded channel names (values) for verification. Used if verify is True and channel names might differ.
  • verify_plot: If True and verify is True, plots a comparison of original vs reloaded signals.
  • events_df: Optional DataFrame with events ('onset', 'duration', 'description'). If None, uses self.events.
  • **kwargs: Additional arguments for the EDF exporter

Returns: Union[str, None]: If verify is True, returns a string with verification results. Otherwise, returns None.

Raises: ValueError: If no signals are loaded

Source code in emgio/core/emg.py
def to_edf(self, filepath: str, method: str = 'both',
           fft_noise_range: tuple = None, svd_rank: int = None,
           precision_threshold: float = 0.01,
           format: Literal['auto', 'edf', 'bdf'] = 'auto',
           bypass_analysis: bool | None = None,
           verify: bool = False, verify_tolerance: float = 1e-6,
           verify_channel_map: Optional[Dict[str, str]] = None,
           verify_plot: bool = False,
           events_df: Optional[pd.DataFrame] = None,
           **kwargs
           ) -> Union[str, None]:
    """
    Export EMG data to EDF/BDF format, optionally including events.

    Args:
        filepath: Path to save the EDF/BDF file
        method: Method for signal analysis ('svd', 'fft', or 'both')
            'svd': Uses Singular Value Decomposition for noise floor estimation
            'fft': Uses Fast Fourier Transform for noise floor estimation
            'both': Uses both methods and takes the minimum noise floor (default)
        fft_noise_range: Optional tuple (min_freq, max_freq) specifying frequency range for noise in FFT method
        svd_rank: Optional manual rank cutoff for signal/noise separation in SVD method
        precision_threshold: Maximum acceptable precision loss percentage (default: 0.01%)
        format: Format to use ('auto', 'edf', or 'bdf'). Default is 'auto'.
                If 'edf' or 'bdf' is specified, that format will be used directly.
                If 'auto', the format (EDF/16-bit or BDF/24-bit) is chosen based
                on signal analysis to minimize precision loss while preferring EDF
                if sufficient.
        bypass_analysis: If True, skip signal analysis step when format is explicitly
                         set to 'edf' or 'bdf'. If None (default), analysis is skipped
                         automatically when format is forced. Set to False to force
                         analysis even with a specified format. Ignored if format='auto'.
        verify: If True, reload the exported file and compare signals with the original
                to check for data integrity loss. Results are printed. (default: False)
        verify_tolerance: Absolute tolerance used when comparing signals during verification. (default: 1e-6)
        verify_channel_map: Optional dictionary mapping original channel names (keys)
                            to reloaded channel names (values) for verification.
                            Used if `verify` is True and channel names might differ.
        verify_plot: If True and verify is True, plots a comparison of original vs reloaded signals.
        events_df: Optional DataFrame with events ('onset', 'duration', 'description').
                  If None, uses self.events. (This provides flexibility)
        **kwargs: Additional arguments for the EDF exporter

    Returns:
        Union[str, None]: If verify is True, returns a string with verification results.
                         Otherwise, returns None.

    Raises:
        ValueError: If no signals are loaded
    """
    from ..exporters.edf import EDFExporter  # Local import

    if self.signals is None:
        raise ValueError("No signals loaded")

    # --- Determine if analysis should be bypassed ---
    final_bypass_analysis = False
    if format.lower() == 'auto':
        if bypass_analysis is True:
            logging.warning("bypass_analysis=True ignored because format='auto'. Analysis is required.")
        # Analysis is always needed for 'auto' format
        final_bypass_analysis = False
    elif format.lower() in ['edf', 'bdf']:
        if bypass_analysis is None:
            # Default behaviour: skip analysis if format is forced
            final_bypass_analysis = True
            msg = (f"Format forced to '{format}'. Skipping signal analysis for faster export. "
                   "Set bypass_analysis=False to force analysis.")
            logging.log(logging.CRITICAL, msg)
        elif bypass_analysis is True:
            final_bypass_analysis = True
            logging.log(logging.CRITICAL, "bypass_analysis=True set. Skipping signal analysis.")
        else:  # bypass_analysis is False
            final_bypass_analysis = False
            logging.info(f"Format forced to '{format}' but bypass_analysis=False. Performing signal analysis.")
    else:
        # Should not happen if Literal type hint works, but good practice
        logging.warning(f"Unknown format '{format}'. Defaulting to 'auto' behavior (analysis enabled).")
        format = 'auto'
        final_bypass_analysis = False

    # Determine which events DataFrame to use
    if events_df is None:
        events_to_export = self.events
    else:
        events_to_export = events_df

    # Combine parameters
    all_params = {
        'precision_threshold': precision_threshold,
        'method': method,
        'fft_noise_range': fft_noise_range,
        'svd_rank': svd_rank,
        'format': format,
        'bypass_analysis': final_bypass_analysis,
        'events_df': events_to_export,  # Pass the events dataframe
        **kwargs
    }

    EDFExporter.export(self, filepath, **all_params)

    verification_report_dict = None
    if verify:
        logging.info(f"Verification requested. Reloading exported file: {filepath}")
        try:
            # Reload the exported file
            reloaded_emg = EMG.from_file(filepath, importer='edf')

            logging.info("Comparing original signals with reloaded signals...")
            # Compare signals using the imported function
            verification_results = compare_signals(
                self,
                reloaded_emg,
                tolerance=verify_tolerance,
                channel_map=verify_channel_map
            )

            # Generate and log report using the imported function
            report_verification_results(verification_results, verify_tolerance)
            verification_report_dict = verification_results

            # Plot comparison using imported function if requested
            summary = verification_results.get('channel_summary', {})
            comparison_mode = summary.get('comparison_mode', 'unknown')
            compared_count = sum(1 for k in verification_results if k != 'channel_summary')

            if verify_plot and compared_count > 0 and comparison_mode != 'failed':
                plot_comparison(self, reloaded_emg, channel_map=verify_channel_map)
            elif verify_plot:
                logging.warning("Skipping verification plot: No channels were successfully compared.")

        except Exception as e:
            logging.error(f"Verification failed during reload or comparison: {e}")
            verification_report_dict = {
                'error': str(e),
                'channel_summary': {'comparison_mode': 'failed'}
            }

    return verification_report_dict
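
A typical export call with verification enabled might look like this (file names are placeholders; the return value carries the verification results only when verify=True):

from emgio import EMG

emg = EMG.from_file('data.set', importer='eeglab')

# 'auto' lets the exporter choose EDF (16-bit) or BDF (24-bit) from the signal analysis;
# with verify=True the exported file is reloaded and compared against the original signals
report = emg.to_edf(
    'output.edf',
    format='auto',
    verify=True,
    verify_tolerance=1e-6
)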

Usage Example

from emgio import EMG
from emgio.importers.eeglab import EEGLABImporter

# Method 1: Using EMG.from_file (recommended)
emg = EMG.from_file('data.set', importer='eeglab')

# Method 2: Using the importer directly
importer = EEGLABImporter()
emg = importer.load('data.set')
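
The loaded object can then be inspected and narrowed down before export; a brief follow-up sketch (channel labels and metadata keys depend on the file):

print(emg.get_channel_types())        # e.g. ['EMG', 'ACC']
print(emg.get_metadata('srate'))      # sampling rate read from the file

# Keep only the EMG channels and export them to EDF/BDF
emg_only = emg.select_channels(channel_type='EMG')
emg_only.to_edf('data_emg.edf')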

File Format Support

The EEGLAB importer supports:

  1. MATLAB .set files readable by scipy.io.loadmat (HDF5-based MATLAB v7.3 files are not supported by loadmat)
  2. Both continuous and epoched data
  3. Multiple channel types (EMG, EEG, ACC, etc.)
  4. Event markers and annotations

Channel Type Detection

The EEGLAB importer determines each channel's type as follows (a short sketch appears after this list):

  1. The explicit type field in the EEGLAB chanlocs structure, when present (e.g. EMG, ACC/ACCELEROMETER, GYRO/GYROSCOPE, TRIG/TRIGGER)
  2. Otherwise, naming conventions in the channel label (e.g., channels with 'EMG' in the label are classified as 'EMG')
  3. If neither is conclusive, the channel defaults to 'EMG'
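
The behaviour can be sketched with the internal helper below; _determine_channel_type is not part of the public API, and the dictionary layout shown here simply mirrors the source listing above:

import numpy as np
from emgio.importers.eeglab import EEGLABImporter

importer = EEGLABImporter()

# An explicit type field takes precedence over the label
print(importer._determine_channel_type(
    {'type': np.array(['ACC']), 'labels': np.array(['EMG_biceps'])}))          # -> 'ACC'

# Without a type field, the label is inspected
print(importer._determine_channel_type({'labels': np.array(['TRIG_sync'])}))   # -> 'TRIG'

# Anything unrecognised falls back to 'EMG'
print(importer._determine_channel_type({'labels': np.array(['AUX1'])}))        # -> 'EMG'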

Parameters

  • filepath (str): Path to the EEGLAB .set file (the only argument accepted by EEGLABImporter.load())

Return Values

The load() method returns an EMG object with the following attributes populated (a short access sketch follows this list):

  1. signals (pandas.DataFrame): Signal data with time (in seconds) as the index and channels as columns
  2. channels (dict): Per-channel information, including:
     • channel_type: Type of channel (EMG, ACC, GYRO, TRIG)
     • physical_dimension: Physical unit (defaults to 'uV')
     • sample_frequency: Sampling rate in Hz
     • prefilter: Pre-filtering description
     • X, Y, Z: Channel coordinates when available
  3. metadata (dict): Metadata extracted from the EEGLAB file, including:
     • setname, filename, filepath, source_file: Dataset and file identifiers
     • subject, group, condition, session, comments: Subject and recording information
     • srate, nbchan, trials, pnts, xmin, xmax: Recording parameters
     • events: Event markers (when present in the file)
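
A short sketch of reading these attributes after import (which metadata keys are present depends on the file):

from emgio import EMG

emg = EMG.from_file('data.set', importer='eeglab')

print(emg.signals.shape)        # (n_samples, n_channels), time in seconds as the index
print(list(emg.channels))       # channel labels
print(emg.get_metadata('subject'), emg.get_metadata('srate'))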

Notes

  • The importer automatically handles both continuous and epoched data
  • For epoched data, epochs are concatenated in the time dimension
  • Event markers are preserved in the metadata (see the sketch after these notes)
  • Channel locations are preserved in the channel information when available
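
For example, events stored by the importer can be read back from the metadata; the 'events' key is only present when the .set file contains an event structure, and latencies follow EEGLAB's convention of sample indices:

from emgio import EMG

emg = EMG.from_file('data.set', importer='eeglab')

events = emg.get_metadata('events') or []
srate = emg.get_metadata('srate')
for ev in events:
    # Convert the sample-based latency to seconds using the sampling rate
    print(ev['type'], ev['latency'] / srate)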