

REVE-base

by brain-bzh

Open source · 30k downloads · 22 likes

1.7 (22 reviews) · Embedding · API & Local
About

REVE-base is a language model specialized in processing EEG signals, built on a transformer architecture. Trained on over 60,000 hours of EEG data from 92 datasets and 25,000 subjects, it stands out for its ability to adapt to different electrode configurations and a wide range of EEG-related tasks. Its innovative 4D positional encoding enables it to handle signals of varying lengths and arrangements, providing unprecedented flexibility. This model is particularly valuable for medical, neuroscientific, or brain-computer interface applications where EEG data analysis is critical. Developed by the BRAIN team and the University of Montreal, REVE-base is positioned as a powerful tool for extracting versatile and generalizable EEG representations.

Documentation

Model Card for REVE-base

REVE (project page here) is a transformer-based foundation model for EEG signal processing. It was trained on 60k hours of EEG data from various sources and is designed to be adaptable to any electrode configuration and a wide range of EEG-based tasks.
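One way to picture how a single encoder can adapt to any electrode configuration is to condition each input patch on its electrode's 3D coordinates plus its time index, rather than on a fixed channel order. The sketch below is only an illustration of that idea, not the model's actual encoding: the `sinusoidal_features` helper, the coordinates, and all shapes are assumptions for demonstration.

```python
import numpy as np

def sinusoidal_features(values, n_freqs=4):
    # Encode each scalar coordinate with sin/cos at several frequencies
    # (a generic Fourier-feature scheme, assumed here for illustration).
    freqs = 2.0 ** np.arange(n_freqs)            # (n_freqs,)
    angles = values[..., None] * freqs           # (..., n_freqs)
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

# Hypothetical 3D electrode coordinates (normalized head space).
xyz = np.array([[0.0, 0.9, 0.1],    # e.g. a frontal electrode
                [0.0, -0.9, 0.1]])  # e.g. an occipital electrode
n_patches = 3                       # time patches per channel

# Each patch gets a 4D coordinate: electrode (x, y, z) plus patch-time index.
coords = np.array([[x, y, z, t]
                   for (x, y, z) in xyz
                   for t in range(n_patches)])   # (6, 4)

# Embed all four coordinates and flatten into one positional vector per patch.
pos_enc = sinusoidal_features(coords)            # (6, 4, 2 * n_freqs)
pos_enc = pos_enc.reshape(coords.shape[0], -1)   # (6, 32)
```

Because the encoding is a function of continuous coordinates rather than a lookup over fixed channel slots, an unseen electrode montage still yields valid positional vectors.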

Model Details

Architecture

REVE (Representation for EEG with Versatile Embeddings) is a pretrained encoder explicitly designed to generalize across diverse EEG signals. It introduces a novel 4D positional encoding scheme that enables it to process signals of arbitrary length and electrode arrangement. Using a masked autoencoding objective, we pretrain REVE on over 60,000 hours of EEG data from 92 datasets spanning 25,000 subjects.
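The masked autoencoding objective mentioned above can be sketched in a few lines: split the signal into patches, hide a random subset, and score a reconstruction only on the hidden patches. This is a generic toy version, not REVE's training code; the patch length, mask ratio, and the zero stand-in for the decoder are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy EEG: 4 channels x 1000 samples at 200 Hz, split into 50-sample patches.
eeg = rng.standard_normal((4, 1000))
patch_len = 50
patches = eeg.reshape(4, -1, patch_len)          # (channels, 20, patch_len)
flat = patches.reshape(-1, patch_len)            # (80, patch_len)

# Hide a random subset of patches; the encoder only sees the visible ones.
mask_ratio = 0.5
n_masked = int(mask_ratio * len(flat))
masked_idx = rng.choice(len(flat), size=n_masked, replace=False)

# A real decoder would predict the hidden patches from the visible context;
# a zero prediction stands in here so the reconstruction loss is computable.
prediction = np.zeros_like(flat)
loss = np.mean((prediction[masked_idx] - flat[masked_idx]) ** 2)
```

Training on this objective forces the encoder to model the signal's structure well enough to fill in missing segments, which is what makes the learned representations transferable.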

Developed by the BRAIN team and UdeM

Funded by: This research was supported by the French National Research Agency (ANR) through its AI@IMT program and grant ANR-24-CE23-7365, as well as by a grant from the Brittany region. Further support was provided by a Discovery Grant from the Natural Sciences and Engineering Research Council of Canada (NSERC), by funding from the Canada Research Chairs program and the Fonds de recherche du Québec – Nature et technologies (FRQ-NT). This work was granted access to the HPC resources of IDRIS under the allocation 2024-AD011015237R1 made by GENCI, as well as HPC resources provided by Digital Alliance Canada.

Model Sources

  • Repository: github
  • Paper: arxiv

Uses

Example script to extract embeddings with REVE, using our position bank:

Python
from transformers import AutoModel

pos_bank = AutoModel.from_pretrained("brain-bzh/reve-positions", trust_remote_code=True)
model = AutoModel.from_pretrained("brain-bzh/reve-base", trust_remote_code=True)

eeg_data = ... # EEG data as a torch Tensor (batch_size, channels, time_points), must be sampled at 200 Hz

electrode_names = [...] # List of electrode names corresponding to the channels in eeg_data
positions = pos_bank(electrode_names) # Get positions (channels, 3)
# Expand the positions vector to match the batch size 
positions = positions.expand(eeg_data.size(0), -1, -1)  # (batch_size, channels, 3)

output = model(eeg_data, positions)
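Once embeddings are extracted, a common downstream pattern is to pool them into one vector per recording and fit a lightweight classifier on top of the frozen encoder. The sketch below assumes the model returns token embeddings of shape (batch, tokens, embed_dim); the shapes, the mean-pooling choice, and the linear probe are illustrative assumptions, not REVE's documented API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoder output: (batch, tokens, embed_dim). The real token
# count and width depend on the model's patching; these numbers are made up.
embeddings = rng.standard_normal((8, 120, 512))

# Mean-pool the token embeddings into one vector per recording...
pooled = embeddings.mean(axis=1)                 # (8, 512)

# ...and train only a small linear probe on top of the frozen encoder.
w = rng.standard_normal((512, 2)) * 0.01
logits = pooled @ w                              # (8, 2)
predictions = logits.argmax(axis=1)              # one class per recording
```

Linear probing keeps the pretrained weights fixed, so it is cheap to run and gives a direct read on how informative the extracted representations are for a given task.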
Capabilities & Tags
transformers · safetensors · reve · feature-extraction · eeg · neuroscience · foundation-model · pytorch · custom_code · model-index
Specifications
Category: Embedding
Access: API & Local
License: Open Source
Pricing: Open Source
Rating: 1.7
