1 File = 1 Feature — Atomic Microkernel Architecture

Python-based. AI-Native. One file per feature.

MicroCoreOS is a Python framework built on the Atomic Microkernel Architecture. Implement a feature in seconds, not minutes.

Setup
```shell
git clone https://github.com/theanibalos/MicroCoreOS.git
cd MicroCoreOS
cp .env.example .env
uv run main.py
```

The Architecture Overhead

In traditional layered architectures, adding a simple CRUD endpoint means explaining entities, repositories, factories, controllers, and DTOs to your AI. That's 6-8 files and 200+ lines of code for one endpoint.

Context window saturation
Leaky abstractions across layers
Repetitive boilerplate for AI to generate
Diff-noise in every pull request
AI AGENT LOG

"I've identified 7 files to modify...
Creating repository...
Injecting into factory..."

MicroCoreOS Solution

Atomic Ownership

All logic, routing, and operations in one atomic plugin file.

AI Context Manifest

Auto-generated `AI_CONTEXT.md` describes everything available.

Pure Kernel

The core knows zero business rules. It only handles DI and lifecycle.

Extensible Tools

Add your own infrastructure (Redis, Kafka, S3) in minutes. Pure plug-and-play.
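A custom tool can be sketched as a plain class that reads its configuration from the environment once, at boot, and holds no mutable state afterwards. The class below is a hypothetical illustration (the real base class lives in `core/tool_plugin.py`, and its exact interface is an assumption here); an in-memory dict stands in for a real Redis client so the sketch stays self-contained.

```python
import os

# Hypothetical custom tool sketch — not the actual MicroCoreOS tool API.
class RedisTool:
    """Stateless driver: config is read from env vars at boot and never
    mutated. A plain dict stands in for a real Redis connection."""

    def __init__(self):
        # Read connection settings once, at boot, from the environment.
        self.host = os.environ.get("REDIS_HOST", "localhost")
        self.port = int(os.environ.get("REDIS_PORT", "6379"))
        self._store = {}  # stand-in for a real Redis client

    def set(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)
```

Dropping a folder with a class like this into `tools/` is all the integration a tool needs — plugins that name it in their constructor receive it automatically.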

Plugin Anatomy

Every feature follows the same structure — one file, fully self-contained.

domains/users/plugins/list_users_plugin.py

```python
from core.base_plugin import BasePlugin
from domains.users.models import UserModel, UserListResponse


class ListUsersPlugin(BasePlugin):
    def __init__(self, http, db, logger):
        # Dependencies are injected by the kernel.
        self.http = http
        self.db = db
        self.logger = logger

    def on_boot(self):
        # Registers the route on the injected HTTP tool.
        self.http.add_endpoint(
            path="/users", method="GET",
            handler=self.execute, response_model=UserListResponse,
        )
        self.logger.info("Endpoint /users registered.")

    def execute(self, data: dict, context=None):
        # Direct DB access — no ORM layer in between.
        rows = self.db.query("SELECT id, name, email FROM users")
        users = [UserModel.from_row(row).to_dict() for row in rows]
        return {"success": True, "users": users}
```

This is a complete, production-ready feature. One file. No hidden layers.
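The plugin above references `UserModel.from_row(...).to_dict()`. A minimal sketch of what `domains/users/models.py` might contain is shown below — the field names and the tuple shape of a row are assumptions inferred from the plugin's `SELECT id, name, email` query, not the actual model file.

```python
from dataclasses import dataclass

# Hypothetical sketch of domains/users/models.py — fields and row shape
# are assumptions based on the plugin's query above.
@dataclass
class UserModel:
    id: int
    name: str
    email: str

    @classmethod
    def from_row(cls, row):
        # Accepts an (id, name, email) tuple as returned by db.query().
        return cls(id=row[0], name=row[1], email=row[2])

    def to_dict(self):
        return {"id": self.id, "name": self.name, "email": self.email}
```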

Event-Driven Communication Between Plugins

No direct imports. No coupling. Plugins talk through the bus — never to each other. The EventBus is itself a Tool — just another stateless driver living in tools/event_bus/.

Fire & Forget — publish & move on
create_order_plugin
📢 order.created
send_email
notify_supplier
update_inventory

Each in its own file. None imports another.
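The fire-and-forget flow above can be sketched with a minimal in-memory bus. The method names `publish`/`subscribe` are assumptions — the real tool lives in `tools/event_bus/` — but the pattern is the point: the publisher never imports its subscribers.

```python
from collections import defaultdict

# Minimal in-memory EventBus sketch; publish/subscribe names are
# assumptions, not the actual tools/event_bus/ API.
class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Fire & forget: deliver to every subscriber, return nothing.
        for handler in self._subscribers[topic]:
            handler(payload)


bus = EventBus()
received = []
# send_email and update_inventory each subscribe independently.
bus.subscribe("order.created", lambda p: received.append(("email", p)))
bus.subscribe("order.created", lambda p: received.append(("inventory", p)))

# create_order_plugin publishes and moves on.
bus.publish("order.created", {"order_id": 7})
```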

Request / Response — await an answer
create_order_plugin
📨 inventory.check
→ request
check_inventory_plugin
← response (ok / abort)
✅ {"stock": 42}
create_order_plugin → proceeds

Still no direct import. Still self-contained.
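The request/response variant can be sketched the same way. The `request`/`respond` method names and the synchronous call are assumptions for illustration (a real bus could make this async); the caller still never imports the responder.

```python
# Request/response sketch over the bus idea — method names are
# assumptions, not the actual MicroCoreOS API.
class EventBus:
    def __init__(self):
        self._responders = {}

    def respond(self, topic, handler):
        self._responders[topic] = handler

    def request(self, topic, payload):
        # Synchronous for simplicity; routes the payload to the responder.
        return self._responders[topic](payload)


bus = EventBus()
# check_inventory_plugin registers a responder — no imports either way.
bus.respond("inventory.check",
            lambda p: {"stock": 42 if p["sku"] == "A1" else 0})

# create_order_plugin asks, then decides whether to proceed or abort.
answer = bus.request("inventory.check", {"sku": "A1"})
proceed = answer["stock"] > 0
```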

Predictable File Structure

Everything has its place. The kernel auto-discovers plugins in the /domains folder and wires them to the /tools they need — zero manual wiring, zero config files.

Your AI assistant knows exactly where to put new code. No more explaining layers, factories, or DI containers. The structure is the documentation.

core/— Microkernel & orchestrator

~240 lines. Read it once, understand everything.

tools/— Infrastructure drivers + your custom tools

Stateless. Config reads from env/configmaps at boot — no mutable state, ever.

domains/— Business logic (1 file per feature)

1 file = 1 feature. Self-contained — delete it and nothing breaks.

Project Explorer

```
core/                        # ~240 lines total
├── kernel.py
├── container.py
├── registry.py
├── base_plugin.py
└── tool_plugin.py
tools/
├── http_server/
├── sqlite/
├── event_bus/
└── your_custom_tool/
domains/
└── users/
    ├── models/
    │   └── user_model.py
    └── plugins/             # 1 file = 1 feature
        ├── create_user_plugin.py
        └── delete_user_plugin.py
AI_CONTEXT.md                # Manifest for LLMs
```

Declare Once. Use Anywhere.

No service locators. No factory calls. No container.get(). Just declare what you need in your constructor — the kernel delivers it.

1. Declare your dependencies. List what you need in `__init__`. The kernel reads your signature and knows exactly what to inject.

2. Kernel wires everything. At boot time, with zero config files, the orchestrator resolves the dependency graph and delivers each tool instance automatically.

3. Call `self.tool` — that's it. Use any tool anywhere in your plugin. Your constructor is the contract — the AI reads it and instantly knows what the plugin needs.

The constructor is not boilerplate — it's the dependency specification. One glance tells you everything a plugin needs to function.

Plugin

```python
class CreateUserPlugin(BasePlugin):
    def __init__(self, http, db, logger):
        # ↑ just declare what you need
        self.http = http
        self.db = db
        self.logger = logger

    def execute(self, data):
        # ↓ use any tool directly
        self.db.execute("INSERT ...")
        self.logger.info("User created")
        return {"success": True}
```
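Signature-driven wiring like this can be sketched with the standard library's `inspect` module. The `build` function and the `TOOLS` registry below are illustrative assumptions, not the actual kernel code; they only show how a constructor signature alone can drive injection.

```python
import inspect

# Stand-in tool instances; the real kernel boots these from tools/.
class Database:
    def execute(self, sql):
        pass

class Logger:
    def info(self, msg):
        pass

TOOLS = {"http": None, "db": Database(), "logger": Logger()}

def build(plugin_cls):
    # Read the __init__ signature and inject matching tool instances —
    # a sketch of what the kernel's container might do, not its actual code.
    params = inspect.signature(plugin_cls.__init__).parameters
    kwargs = {name: TOOLS[name] for name in params if name != "self"}
    return plugin_cls(**kwargs)

class CreateUserPlugin:
    def __init__(self, db, logger):
        self.db = db
        self.logger = logger

plugin = build(CreateUserPlugin)
```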

Testable by Design

Stop fighting with DI containers in your tests. Because every plugin is a simple class with an explicit constructor, you can test it in perfect isolation.

Mocking is trivial: No complex setup. Just pass your mocks directly.

Side-effect verification: Ensure DB calls and EventBus pings happen exactly as expected without a real environment.

Develop as a Product: Program a feature in the morning, test it locally in the afternoon, drop it into the domains/ folder at night.

Implicit Integration: If it passes its unit tests, the kernel handles the wiring. It will work.

test_plugin.py

```python
from unittest.mock import MagicMock

def test_create_user():
    # Arrange
    db = MagicMock()
    logger = MagicMock()
    data = {"name": "Satoshi", "email": "[email protected]"}

    # Act
    plugin = CreateUserPlugin(None, db, logger)
    response = plugin.execute(data)

    # Assert
    assert response["success"] is True
    db.execute.assert_called_once()
    logger.info.assert_called_with("User created")
```

Core Principles

Designed to be simple, auditable, and rigid where it matters.

Tool = Stateless · Plugin = Stateful

Pure Kernel

Zero business logic in the core. It is a neutral orchestrator that boots what you drop in /domains.

Stateless Tools

Infrastructure drivers (DB, HTTP, Bus). Reusable, stateless, and neutral capabilities.

Atomic Plugins

1 file = 1 feature. Pure business logic implementation. AI agents understand the whole module in one read.

Event-Driven

Plugins communicate via EventBus only. Decoupled by default, scalable by design.

Declarative DI

Dependencies are declared in the constructor. The kernel delivers what's requested.

Auto-Discovery

Just drop a file in a folder. The kernel finds it, boots it, and wires it.
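Drop-in discovery of this kind can be sketched with the standard library alone: scan `domains/*/plugins/*_plugin.py`, import each file, and collect the plugin classes. The function below is an assumption about how such a scan might look, not the kernel's actual registry code.

```python
import importlib.util
import pathlib

# Sketch of auto-discovery — illustrative, not the actual kernel code.
def discover_plugins(root="domains"):
    plugins = []
    for path in pathlib.Path(root).glob("*/plugins/*_plugin.py"):
        # Import the file directly from its path.
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        # Collect every class whose name ends in "Plugin".
        for name in dir(module):
            obj = getattr(module, name)
            if isinstance(obj, type) and name.endswith("Plugin"):
                plugins.append(obj)
    return plugins
```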

The Roadmap

Where MicroCoreOS is heading — from observability to a full tool marketplace.

Phase 1 Completed

Tracer Tool

Integrated mapping of which plugins react to which events. Full observability from day one — no external tools required.

Phase 2 Completed

Hot Reload

Drop a new plugin file and the kernel picks it up instantly — no restart required. Zero-downtime development loop.

Phase 3 In Progress

Observability Dashboard

A visual dashboard built on top of the Tracer Tool — see event flows, plugin timings, and system health in real-time. Currently in development.

Phase 4 Upcoming

Atomic Tool Marketplace

Drop-in tool ecosystem — Redis, PostgreSQL, LLMs — as self-contained folders with their own manifests and AI instructions.

Phase 5 Vision

Polyglot Kernels

Sidecar plugins via gRPC or WASM — write performance-critical modules in Go or Rust, orchestrated by the same MicroCoreOS kernel.

Ready for Atomic Delivery?

Join the early access list for the v0.1.0 release, architecture deep-dives, and AI-native design patterns.

No spam. Just engineering.

© 2024 MicroCoreOS Project. Built for the era of Agentic AI.