AI Explorer
© 2026 AI Explorer. All rights reserved.

reve positions

by brain-bzh

Open source · 22k downloads · 7 likes

1.1 (7 reviews) · Embedding · API & Local

About

The REVE Positions model is an interface designed to provide the electrode positions required for using the REVE EEG Foundation model. It retrieves the spatial coordinates of electrodes by their names, simplifying inference with the REVE model, which is specifically trained to generalize across diverse EEG signals through an innovative 4D positional encoding. This wrapper supports common electrode configurations such as the 10-20, 10-10, 10-05, or EGI 256 systems, as well as specific setups like Biosemi-128. Its key advantage lies in its ability to adapt to varying electrode arrangements and signal lengths, offering greater flexibility for computational neuroscience applications.
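The lookup this wrapper performs can be pictured as a simple name-to-coordinate map. A minimal, illustrative sketch: the electrode names below follow the 10-20 convention, but the coordinate values and the TinyPositionBank class are placeholders of ours, not the actual data or API shipped with brain-bzh/reve-positions.

```python
# Illustrative sketch of a position bank: it maps electrode names to 3D
# coordinates. The coordinate values here are made-up placeholders,
# NOT the positions distributed with brain-bzh/reve-positions.

class TinyPositionBank:
    def __init__(self):
        # Hypothetical unit-sphere-style coordinates for a few 10-20 electrodes.
        self._coords = {
            "Cz": (0.0, 0.0, 1.0),
            "Fz": (0.0, 0.7, 0.7),
            "Pz": (0.0, -0.7, 0.7),
        }

    def __call__(self, names):
        # Return one (x, y, z) triple per requested electrode, preserving
        # the channel order of the input list.
        return [self._coords[name] for name in names]

bank = TinyPositionBank()
positions = bank(["Fz", "Cz"])  # one 3D coordinate per channel, in order
```

The real bank returns a tensor of shape (channels, 3) rather than a list, but the contract is the same: the i-th row is the position of the i-th named channel.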

Documentation

Model Card for REVE Position Bank

Wrapper to provide electrode positions to use the REVE EEG Foundation Model.

Model Details

Model Description

  • Developed by: the BRAIN team and UdeM
  • Funded by: AI@IMT, ANR JCJC ENDIVE, Jean Zay (with project numbers), Alliance Canada and Region Bretagne.

REVE (Representation for EEG with Versatile Embeddings) is a pretrained model explicitly designed to generalize across diverse EEG signals. REVE introduces a novel 4D positional encoding scheme that enables it to process signals of arbitrary length and electrode arrangement.

This position bank repository can be used to fetch electrode positions by name, in order to perform inference with the REVE model.

Model Sources

  • Repository: github
  • Paper: arxiv

Uses

Example script to fetch electrode positions and extract embeddings with REVE.

Python
from transformers import AutoModel

# Load the position bank (trust_remote_code is required for the custom wrapper)
pos_bank = AutoModel.from_pretrained("brain-bzh/reve-positions", trust_remote_code=True)

eeg_data = ...  # EEG tensor of shape (batch_size, channels, time_points), must be sampled at 200 Hz
electrode_names = [...]  # List of electrode names corresponding to the channels in eeg_data

positions = pos_bank(electrode_names)  # Positions tensor of shape (channels, 3)

model = AutoModel.from_pretrained("brain-bzh/reve-base", trust_remote_code=True)

# Expand the positions tensor to match the batch size
positions = positions.expand(eeg_data.size(0), -1, -1)  # (batch_size, channels, 3)

output = model(eeg_data, positions)
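The expand step in the script above broadcasts the single (channels, 3) position tensor across the batch without copying memory, since every trial shares the same electrode layout. The shape logic can be checked in isolation with random tensors in place of real EEG (the batch and channel sizes here are arbitrary examples, not values prescribed by REVE):

```python
import torch

# Stand-in sizes: 4 trials, 64 channels, 3 spatial coordinates.
batch_size, channels = 4, 64
positions = torch.randn(channels, 3)  # (channels, 3), as returned by the bank

# expand() adds a leading batch dimension as a broadcasted view;
# -1 keeps the existing channel and coordinate dimensions unchanged.
batched = positions.expand(batch_size, -1, -1)  # (batch_size, channels, 3)

# No data is copied: every trial views the same underlying storage.
print(tuple(batched.shape))  # (4, 64, 3)
```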

Available electrode names can be printed using the method pos_bank.get_all_positions(), and can be visualized here.

Most common electrode setups are available (10-20, 10-10, 10-05, EGI 256). For Biosemi-128, use the prefix biosemi128_ before the electrode names (e.g., biosemi128_C13).
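The Biosemi-128 prefix rule can be applied with a one-line helper before querying the bank. The helper itself is ours for illustration; only the biosemi128_ naming convention comes from the model card.

```python
def biosemi128_keys(names):
    # Apply the documented biosemi128_ prefix so that names like "C13"
    # resolve in the position bank as "biosemi128_C13".
    return [f"biosemi128_{name}" for name in names]

keys = biosemi128_keys(["C13", "D2"])
print(keys)  # ['biosemi128_C13', 'biosemi128_D2']
```

The resulting list can then be passed to pos_bank in place of the raw channel names.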

Capabilities & Tags
transformers · safetensors · reve-position-bank · feature-extraction · custom_code
Specifications

  • Category: Embedding
  • Access: API & Local
  • License: Open Source
  • Pricing: Open Source
  • Rating: 1.1
