EDEN-MobileViTv3-Custom-ImageNet300 — SOTA Optimized
Primary KPI: EAG (Energy-to-Accuracy Gradient) = 1.9376e-10 ΔAcc/ΔJoules
Abstract
This model is part of Project EDEN (Energy-Driven Evolution of Networks), implementing the E2AM (Energy Efficient Advanced Model) Framework. The goal is to shift AI benchmarking from pure accuracy to Green SOTA — maximising predictive power per Joule consumed.
Applied Technique: Phase 2 – Progressive Unfreezing + AMP (E2AM SOTA)
Profiling Environment
| Component | Specification |
|---|---|
| GPU | NVIDIA GeForce GTX 1080 Ti (11 GB VRAM, 250 W TDP) |
| CPU | Intel Xeon W-2125 (4 cores / 8 threads @ 4.00 GHz) |
| RAM | 63.66 GB System RAM |
| OS | Windows 10 |
| Dataset | Custom-ImageNet300 — ~450,000 images, 300 classes, 224 px |
🟢 Green Delta Table
Comparing this model against the reference baseline (ResNet-50 equivalent)
| Metric | ResNet-50 Baseline | MobileViTv3 (EDEN) | Δ |
|---|---|---|---|
| Accuracy | 0.9573 | 0.8850 | -7.23 pp |
| Total Energy (J) | 380,392,115 | 6,905,436 | 98.18% saved |
| CO₂ Emissions (kg) | 50.1906 | 0.9111 | 98.18% saved |
| EAG Score (ΔAcc/ΔJ) | — | 1.9376e-10 | — |
A positive EAG means this model learns more per Joule than the baseline. A negative EAG indicates a trade-off where higher accuracy required more energy investment.
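The EAG value can be reproduced directly from the Green Delta Table figures; a minimal sketch in plain Python (the rounded table values land slightly off the reported 1.9376e-10):

```python
# EAG (Energy-to-Accuracy Gradient): ratio of the accuracy delta to the
# energy delta between this model and the ResNet-50 baseline.
acc_base, acc_eden = 0.9573, 0.8850
energy_base, energy_eden = 380_392_115.0, 6_905_436.0  # Joules

eag = (acc_eden - acc_base) / (energy_eden - energy_base)
# Both deltas are negative, so EAG comes out positive: accuracy given up
# per Joule saved. With these rounded inputs, eag ≈ 1.936e-10, close to
# the reported 1.9376e-10 (the small gap is rounding in the table).
```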
E2AM Algorithm — Applied Phases
Phase 1 – Zero-Overhead Initialization: Dataset pre-loaded into pinned System RAM to eliminate disk I/O power spikes.
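A sketch of the Phase 1 idea, assuming a dataset small enough to materialise in RAM (the random tensors here are stand-ins for Custom-ImageNet300, not the project's loading code):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Pre-materialise the whole dataset in system RAM so training never
# touches the disk (here: random stand-in images, 300 classes, 224 px).
images = torch.randn(32, 3, 224, 224)
labels = torch.randint(0, 300, (32,))
dataset = TensorDataset(images, labels)

# pin_memory=True places batches in page-locked RAM for fast async
# host-to-GPU copies; PyTorch disables it automatically when no CUDA
# device is present, so this also runs on CPU-only machines.
loader = DataLoader(dataset, batch_size=16, shuffle=True,
                    pin_memory=torch.cuda.is_available())
x, y = next(iter(loader))
```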
Phase 2 – Progressive Unfreezing: Backbone frozen for the first E_unfreeze epochs (only the classification head trains). At E_unfreeze, all layers are unfrozen and the learning rate is decayed. Gradient accumulation over N micro-batches simulates large batch sizes without proportional VRAM cost, slashing power-draw spikes.
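A minimal PyTorch sketch of Phase 2 (the constants `E_UNFREEZE` and `ACCUM_STEPS`, the toy two-layer model, and the 0.1 learning-rate decay factor are illustrative placeholders, not values from the training code):

```python
import torch
import torch.nn as nn

backbone = nn.Linear(8, 8)   # stands in for the MobileViTv3 backbone
head = nn.Linear(8, 3)       # classification head
model = nn.Sequential(backbone, head)

E_UNFREEZE, ACCUM_STEPS, EPOCHS = 2, 4, 3
for p in backbone.parameters():
    p.requires_grad = False  # freeze backbone: only the head trains

opt = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(EPOCHS):
    if epoch == E_UNFREEZE:
        for p in backbone.parameters():
            p.requires_grad = True   # unfreeze all layers
        for g in opt.param_groups:
            g["lr"] *= 0.1           # decay the learning rate
    for step in range(8):            # toy micro-batches
        x = torch.randn(4, 8)
        y = torch.randint(0, 3, (4,))
        # Scale the loss so accumulated gradients match one large batch.
        loss = loss_fn(model(x), y) / ACCUM_STEPS
        loss.backward()
        if (step + 1) % ACCUM_STEPS == 0:
            opt.step()               # one optimizer step per N micro-batches
            opt.zero_grad()
```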
AMP (Automatic Mixed Precision): `torch.cuda.amp.autocast()` runs eligible ops in float16, roughly halving GPU memory traffic and reducing energy per backward pass.
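The AMP step can be sketched as a generic `autocast`/`GradScaler` loop (not the project's actual training code; it falls back to full precision when no GPU is present):

```python
import torch
import torch.nn as nn

use_amp = torch.cuda.is_available()        # AMP needs a CUDA device
device = "cuda" if use_amp else "cpu"
model = nn.Linear(16, 4).to(device)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

x = torch.randn(8, 16, device=device)
y = torch.randint(0, 4, (8,), device=device)

with torch.cuda.amp.autocast(enabled=use_amp):
    # Forward pass runs in float16 where numerically safe.
    loss = nn.functional.cross_entropy(model(x), y)

scaler.scale(loss).backward()  # scale loss to avoid fp16 gradient underflow
scaler.step(opt)               # unscale gradients, then optimizer step
scaler.update()
```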
Sparse Regularisation: an L1 penalty λ·Σ|W| is applied to the trainable weights, driving redundant weights toward zero and enabling future pruning.
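A sketch of how the L1 penalty attaches to the task loss (the toy model and the λ value are placeholders, not the project's settings):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 5)   # placeholder for the trainable layers
lam = 1e-4                 # illustrative L1 strength

x = torch.randn(4, 10)
y = torch.randint(0, 5, (4,))

task_loss = nn.functional.cross_entropy(model(x), y)
# L1 penalty lam * sum(|W|) over trainable parameters: pushes weights
# toward exact zero, making the network prunable later.
l1 = sum(p.abs().sum() for p in model.parameters() if p.requires_grad)
loss = task_loss + lam * l1
loss.backward()
```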
Training Statistics
| Metric | Value |
|---|---|
| Final Accuracy | 0.8850 (88.50%) |
| Total Energy Consumed | 6,905,436 J (1.9182 kWh) |
| Training Time | 10,333 s (2.87 hrs) |
| Estimated CO₂ | 0.9111 kg CO₂e |
| Training Log | `test1\eden_optimized_custom_imagenet_mobilevitv3.csv` |
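As a unit check on the statistics above, the reported Joules convert to kWh and kg CO₂e as follows; the 0.475 kg CO₂e/kWh grid intensity is an assumption inferred from the reported figures, not a value stated by the project:

```python
energy_j = 6_905_436            # total energy from the table, in Joules
kwh = energy_j / 3.6e6          # 1 kWh = 3.6e6 J  -> ~1.9182 kWh
co2_kg = kwh * 0.475            # assumed grid intensity (kg CO2e per kWh)
```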
📊 Training Visualizations
Accuracy & Energy over Training
Green = accuracy (left axis) · Orange dashed = cumulative energy (right axis)
Project-Wide Overview
All EDEN models: energy vs accuracy
Cite This Research
```bibtex
@misc{eden2025,
  title  = {Project EDEN: Energy-Driven Evolution of Networks},
  author = {EDEN Research Team},
  year   = {2025},
  note   = {Hugging Face: Shanmuk4622},
  url    = {https://huggingface.co/Shanmuk4622}
}
```
Evaluation results
- Accuracy on Custom-ImageNet300 (self-reported): 0.885
- F1 Score on Custom-ImageNet300 (self-reported): 0.882

