New in v0.6.0

Neural Gates are part of the Neural IR backend introduced in Simplex v0.6.0. They enable differentiable control flow for learnable programs.

What are Neural Gates?

Traditional conditionals (if/else) are binary: a branch either executes or it does not. Neural Gates are differentiable: during training they return continuous probability values (0.0 to 1.0), allowing gradients to flow through decision points. After training, they "crystallize" back into efficient discrete branches.
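
Conceptually, the two phases lower a comparison like confidence > threshold along these lines (a sketch of the idea, not actual compiler output; sigmoid and the temperature tau are illustrative):

// Training mode (soft): the comparison is relaxed to a smooth function,
// so the threshold receives gradients. Lower tau = sharper decisions.
let p = sigmoid((confidence - threshold) / tau);   // p in (0.0, 1.0)

// After crystallization (hard): an ordinary discrete branch.
let fired = confidence > threshold;                // zero runtime overhead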

Why Use Neural Gates?

  • Learnable thresholds - Decision boundaries optimize automatically from data
  • End-to-end training - Backpropagation flows through entire program
  • Zero inference overhead - Compiles to standard branches in production
  • Gradual adoption - Mix with traditional conditionals as needed

Basic Syntax

basic_gate.sx
// Define a neural gate
neural_gate should_retry(confidence: f64) -> Bool {
    confidence > 0.7
}

// Use it like a normal function
fn process_result(result: AnalysisResult) -> Action {
    if should_retry(result.confidence) {
        Action::Retry
    } else {
        Action::Accept(result)
    }
}
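
Here the 0.7 literal is the decision boundary that training can tune (the "Learnable thresholds" point above); after crystallization it behaves like an ordinary comparison.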

Compilation Modes

The same code compiles differently based on mode:

Mode        Command                     Behavior
Training    sxc build --mode=train      Soft gates with gradient tracking
Inference   sxc build --mode=infer      Hard branches, zero overhead
Profile     sxc build --mode=profile    Hard gates with activation statistics

Categorical Neural Gates

For selecting from multiple options, use categorical gates with match:

categorical_gate.sx
// Categorical neural gate - selects from N options
neural_gate route_request(query: Embedding) -> Specialist {
    match classify(query) {
        Category::Technical => Specialist::Engineer,
        Category::Creative => Specialist::Designer,
        Category::Business => Specialist::Analyst,
    }
}

// During training: Gumbel-Softmax provides differentiable selection
// During inference: Standard match with zero overhead
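
A categorical gate is called like any other function. A hypothetical dispatch site (Task, Outcome, and the handle_* functions are illustrative, not part of the API):

// Dispatch on whichever specialist the gate selects
fn dispatch(query: Embedding, task: Task) -> Outcome {
    match route_request(query) {
        Specialist::Engineer => handle_engineering(task),
        Specialist::Designer => handle_design(task),
        Specialist::Analyst => handle_business(task),
    }
}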

Contract Verification

Neural gates can include contracts for safety verification:

contracts.sx
neural_gate memory_safe_path(analysis: SecurityAnalysis) -> Bool
    requires analysis.confidence > 0.95    // Must exceed 95%
    ensures result => no_buffer_overflow   // Guarantee if true
    fallback safe_default_path()           // If confidence too low
{
    analysis.is_safe
}

// Contract keywords:
//   requires  - Preconditions (minimum confidence thresholds)
//   ensures   - Postconditions guaranteed when gate fires
//   invariant - Properties across gate transitions
//   fallback  - Handler when confidence below threshold
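
A call site might look like this (a sketch of the intended behavior; fast_path and conservative_path are illustrative helpers):

fn plan_route(analysis: SecurityAnalysis) -> Path {
    // If analysis.confidence <= 0.95, the requires clause fails and the
    // declared fallback (safe_default_path) handles the decision instead.
    if memory_safe_path(analysis) {
        fast_path()    // ensures: no_buffer_overflow is guaranteed here
    } else {
        conservative_path()
    }
}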

Hardware Targeting

Use annotations to control where gates and their supporting functions execute:

hardware.sx
// Run on GPU - batch tensor operations
@gpu
neural_gate batch_classifier(inputs: List<Embedding>) -> List<Label> {
    inputs.map(e => classify_embedding(e))
}

// Run on CPU - branching logic
@cpu
fn process_result(label: Label) -> Action {
    match label {
        Label::Urgent => Action::Escalate,
        Label::Normal => Action::Queue,
        _ => Action::Log
    }
}

// Run on NPU - SLM inference
@npu
fn cognitive_inference(context: Context) -> Response {
    infer("Generate response for: {context}")
}
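
Annotated stages compose into pipelines; a hypothetical composition (assuming List<T> supports map):

// GPU batch classification feeding CPU branching logic
fn handle_batch(inputs: List<Embedding>) -> List<Action> {
    batch_classifier(inputs)                  // runs on @gpu
        .map(label => process_result(label))  // runs on @cpu
}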

Training Example

Here's a complete example of training a neural gate:

train_router.sx
use simplex_learning::{Trainer, Adam};

// Define the gate to train
neural_gate smart_router(query: Embedding) -> Specialist {
    match score_specialists(query) {
        scores if scores.security > 0.8 => Specialist::Security,
        scores if scores.quality > 0.7 => Specialist::Quality,
        _ => Specialist::General,
    }
}

fn main() {
    // Load training data
    let dataset = load_routing_examples("train.json");

    // Create trainer
    let trainer = Trainer::new(smart_router)
        .optimizer(Adam::new(0.001))
        .temperature_schedule(10.0, 0.1, 1000);  // Anneal over 1000 steps

    // Train the gate
    for epoch in 0..100 {
        for (query, expected) in &dataset {
            let loss = trainer.step(query, expected);
            println("Epoch {epoch}, Loss: {loss}");
        }
    }

    // Export trained gate for inference
    trainer.export("router_trained.sx");
}
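
Once exported, the trained gate can be consumed like any other function. A sketch, assuming the exported file is importable as a module and the build uses --mode=infer (security_review, quality_review, and general_handle are illustrative):

use router_trained::smart_router;

// With --mode=infer, the trained gate compiles to a hard match
fn route_to_team(query: Embedding) -> Response {
    match smart_router(query) {
        Specialist::Security => security_review(query),
        Specialist::Quality => quality_review(query),
        Specialist::General => general_handle(query),
    }
}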

Best Practices

When to Use Neural Gates

  • Decision thresholds that need tuning
  • Routing logic that should adapt to data
  • Classification boundaries that evolve
  • Any conditional you'd A/B test manually

When NOT to Use Neural Gates

  • Safety-critical decisions with fixed requirements
  • Simple boolean checks (use regular if)
  • Business logic that must be auditable
  • Code paths that rarely execute

Next Steps