
NeuralFlight

An open-source framework combining computer vision and motor-imagery EEG classification to control simulated drones via gestures, head motion, or imagined movements.

Detailed Introduction

NeuralFlight is an open-source framework for drone control that combines Mediapipe-based computer vision with motor-imagery EEG classification. It enables control of simulated drones via hand gestures, head motion, or imagined movements, all without expensive hardware. The project uses PyTorch for model training and provides a simulator, runnable demos, and example notebooks for rapid prototyping and research.

Main Features

  • Multi-modal control: fist-following hand gestures, head-pose control, and EEG-based motor imagery control.
  • Modern ML stack: PyTorch-based models (EEGNet with residual connections), real-time inference, and pretrained checkpoints (a model sketch follows this list).
  • Simulation-first demos: physics-based simulator and visualization let users develop and test without physical drones.
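
The feature list describes an EEGNet-style model with residual connections at roughly 10K parameters. As a rough illustration of what such a model can look like in PyTorch (the class name `ResidualEEGNet`, layer sizes, and kernel widths are assumptions for this sketch, not the repository's actual architecture):

```python
import torch
import torch.nn as nn

class ResidualEEGNet(nn.Module):
    """Illustrative compact EEGNet-style classifier with one residual block.

    Input shape: (batch, 1, n_channels, n_samples), e.g. 64 EEG channels
    over a 3-second trial at 160 Hz (480 samples).
    """

    def __init__(self, n_channels=64, n_classes=2):
        super().__init__()
        # Temporal conv learns band-like filters along the time axis.
        self.temporal = nn.Conv2d(1, 8, kernel_size=(1, 63), padding=(0, 31), bias=False)
        self.bn1 = nn.BatchNorm2d(8)
        # Depthwise spatial conv collapses the electrode axis.
        self.spatial = nn.Conv2d(8, 16, kernel_size=(n_channels, 1), groups=8, bias=False)
        self.bn2 = nn.BatchNorm2d(16)
        # Residual refinement: shape-preserving convs so the skip adds cleanly.
        self.res = nn.Sequential(
            nn.Conv2d(16, 16, kernel_size=(1, 15), padding=(0, 7), bias=False),
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.Conv2d(16, 16, kernel_size=(1, 15), padding=(0, 7), bias=False),
            nn.BatchNorm2d(16),
        )
        self.act = nn.ELU()
        self.pool = nn.AdaptiveAvgPool2d((1, 8))
        self.head = nn.Linear(16 * 8, n_classes)

    def forward(self, x):
        x = self.act(self.bn1(self.temporal(x)))
        x = self.act(self.bn2(self.spatial(x)))
        x = self.act(x + self.res(x))       # residual skip connection
        x = self.pool(x).flatten(1)
        return self.head(x)

model = ResidualEEGNet()
print(sum(p.numel() for p in model.parameters()))  # ~9.6K, near the stated ~10K budget
logits = model(torch.randn(4, 1, 64, 480))         # 4 trials, 64 channels, 480 samples
```

Because the residual block preserves the feature map's shape, the skip connection is a plain elementwise add, and the adaptive pooling keeps the classification head small, which is where most of the parameter savings come from.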

Use Cases

  • BCI research and rapid prototyping of EEG-based control algorithms.
  • Accessibility: alternative control methods for users with motor impairments.
  • Education: teaching signal processing, deep learning, and robotics using hands-on demos.

Technical Features

  • EEG pipeline with dataset integration (PhysioNet Motor Movement/Imagery) and bandpass filtering (see the filtering sketch after this list).
  • Compact neural architectures (~10K parameters) with residual connections for efficient training.
  • Mediapipe-based hand and face tracking, temporal smoothing, and configurable gesture thresholds for stability (see the tracking sketch below).
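
The first item above refers to bandpass filtering of PhysioNet Motor Movement/Imagery recordings. A minimal sketch of that preprocessing step, assuming the dataset's 160 Hz sampling rate and an 8–30 Hz mu/beta band (the band edges and the `bandpass` helper are illustrative assumptions, not values documented by the project):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(eeg, fs=160.0, low=8.0, high=30.0, order=4):
    """Zero-phase Butterworth bandpass over the mu/beta band.

    eeg: array of shape (n_channels, n_samples);
    fs:  sampling rate of PhysioNet Motor Movement/Imagery (160 Hz).
    """
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    # filtfilt runs the filter forward and backward, so no phase shift.
    return filtfilt(b, a, eeg, axis=-1)

# Example: filter a synthetic 64-channel, 3-second trial.
trial = np.random.randn(64, 480)
filtered = bandpass(trial)
```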
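
The last item combines Mediapipe tracking, temporal smoothing, and gesture thresholds. The sketch below shows one common way to put those pieces together: Mediapipe's Hands solution for landmarks, an exponential moving average for smoothing, and a dead-zone threshold. The constants and the left/right/hover mapping are illustrative, not the project's actual control scheme:

```python
import cv2
import mediapipe as mp

# Legacy "solutions" API; newer mediapipe releases also ship a tasks API.
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.6)

ALPHA = 0.3        # EMA weight: smaller values smooth more but add lag
THRESHOLD = 0.08   # dead zone around frame center (normalized units)
smoothed = None

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    command = "hover"
    if results.multi_hand_landmarks:
        wrist = results.multi_hand_landmarks[0].landmark[0]  # landmark 0 = wrist
        pt = (wrist.x, wrist.y)  # normalized [0, 1] image coordinates
        # Exponential moving average damps per-frame landmark jitter.
        if smoothed is None:
            smoothed = pt
        else:
            smoothed = tuple(ALPHA * p + (1 - ALPHA) * s
                             for p, s in zip(pt, smoothed))
        dx = smoothed[0] - 0.5   # horizontal offset from frame center
        if abs(dx) > THRESHOLD:  # only react outside the dead zone
            command = "right" if dx > 0 else "left"
    print(command)  # stand-in for sending a command to the simulator
cap.release()
```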