LLMs can write Python scripts, but they cannot be trusted to design physical systems where tolerance, voltage, and compatibility matter. A chatbot can tell you how a drone works. It cannot tell you whether this specific T-Motor F60 will overheat when paired with this specific 6S battery on a hot day in Texas.
I built OpenForge to prove that we can bridge this gap. I didn't want a chatbot; I wanted a Generative Manufacturing Engine.
Here is the architecture I developed to turn vague user intent into flight-proven hardware, and how this pattern scales far beyond drones.
The fatal flaw in most AI engineering tools is that they treat the LLM as the Source of Truth. In OpenForge, the LLM is merely the Translator.
The architecture relies on a specialized pipeline in which the LLM proposes and deterministic code disposes. If the AI suggests a part that doesn't fit, the Physics Engine rejects it. The AI is forced to learn within the boundaries of reality.
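The propose-and-validate pattern can be sketched as a small loop. Everything here (the `llm_propose` and `physics_validate` names, the retry budget) is illustrative, not OpenForge's actual API:

```python
def design_loop(requirements, llm_propose, physics_validate, max_attempts=5):
    """Propose-and-validate: the LLM suggests, deterministic checks decide.

    llm_propose(requirements, feedback) -> candidate bill of materials
    physics_validate(bom) -> (ok: bool, error: str | None)
    """
    feedback = None
    for _ in range(max_attempts):
        bom = llm_propose(requirements, feedback)
        ok, error = physics_validate(bom)
        if ok:
            return bom
        # Reject, and feed the reason back so the next proposal
        # stays inside the boundaries of reality.
        feedback = error
    raise RuntimeError("No valid build found within attempt budget")
```

The key design choice: the LLM never gets the final word. Its output is only a candidate until the deterministic layer signs off.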
You cannot automate engineering without structured data. The internet is full of unstructured HTML, messy e-commerce sites, and PDFs. Standard scrapers fail here.
I built a High-Agency Refinery Agent. It doesn't just scrape; it investigates.
If a spec (like weight or mounting pattern) is missing, the agent spins up a headless browser (Playwright), takes a screenshot, uses a Vision Model (Gemini) to identify the Specifications tab, clicks it, and extracts the data.
```python
# tools/refine_arsenal.py - The "Active Recon" Loop
async def active_recon_session(component, missing_keys):
    # 1. Vision AI analyzes the UI screenshot
    ui_plan = await vision_model.analyze(
        prompt="Find the 'Technical Specs' tab or 'Read More' button.",
        image=screenshot
    )
    # 2. Playwright acts on the Vision AI's instructions
    if ui_plan['found_hidden_section']:
        await page.get_by_text(ui_plan['click_target']).click()
    # 3. Extraction Agent reads the newly revealed DOM
    new_specs = await extractor_agent.parse(
        content=await page.content(),
        target_keys=missing_keys
    )
    return new_specs
```
The Insight: This turns the messy web into a structured SQL database. This is applicable to sourcing chips from DigiKey, pumps from McMaster-Carr, or lumber from Home Depot.
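To make "structured SQL database" concrete, here is a minimal sketch of where refined specs could land. The table name, columns, and sample row are illustrative, not OpenForge's actual schema:

```python
import sqlite3

# Illustrative schema: one row per component, holding the specs the
# Refinery Agent recovered (weight, mounting pattern, source page).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE components (
        id INTEGER PRIMARY KEY,
        category TEXT NOT NULL,     -- 'motor', 'battery', 'fc', ...
        name TEXT NOT NULL,
        weight_g REAL,
        mounting_pattern TEXT,      -- e.g. '16x16', '30.5x30.5'
        source_url TEXT
    )
""")
conn.execute(
    "INSERT INTO components (category, name, weight_g, mounting_pattern) "
    "VALUES (?, ?, ?, ?)",
    ("motor", "T-Motor F60", 32.5, "16x16"),  # sample values, not real specs
)
# Downstream stages query by constraint, not by keyword.
row = conn.execute(
    "SELECT name FROM components WHERE category = 'motor' AND weight_g < 40"
).fetchone()
```

Once the data is in this shape, "sourcing chips from DigiKey" is just a different `category` and a different scraper feeding the same table.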
Users speak in intent ("brush busting," "cinematic," "long range"). Engineers speak in constraints (stator volume, deadcat geometry, Li-Ion chemistry).
I built a prompt architecture that acts as a compiler. It forces the LLM to output a Parametric Constraint Object, not a shopping list.
```python
# prompts.py - The Architect Persona
REQUIREMENTS_SYSTEM_INSTRUCTION = """
You are the Chief Engineer. Translate user intent into PARAMETRIC CONSTRAINTS.

INPUT: "I need a brush-busting drone for ranch work."

KNOWLEDGE BASE:
- "Brush Busting" implies: High Torque (Stator >= 2306), Impact Resistance (Arm Thickness >= 5mm).
- "Ranch Work" implies: High Efficiency (6S Voltage), Penetration (Analog Video).

OUTPUT SCHEMA:
{
  "topology": { "class": "Heavy 5-inch", "voltage": "6S" },
  "technical_constraints": {
    "min_arm_thickness_mm": 5.0,
    "motor_stator_index": "2306 or larger",
    "video_system": "Analog"
  }
}
"""
```
The Insight: By decoupling Intent from Selection, we ensure the AI is looking for parts that meet engineering standards, not just parts that have "drone" in the title.
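Concretely, the constraint object (not a keyword) is what drives selection. A minimal sketch, with a hypothetical `select_motors` helper and made-up catalog entries:

```python
def select_motors(catalog, constraints):
    """Filter a parts catalog by engineering constraints, not titles.

    catalog: list of dicts with 'name' and 'stator' (e.g. 2306) fields.
    constraints: the compiler's output, e.g. {"min_stator_index": 2306}.
    """
    return [
        part for part in catalog
        if part["stator"] >= constraints["min_stator_index"]
    ]

catalog = [
    {"name": "1404 micro motor", "stator": 1404},
    {"name": "2306 freestyle motor", "stator": 2306},
    {"name": "2408 cinelifter motor", "stator": 2408},
]
# "Brush busting" compiled to a torque floor; only motors at or
# above the 2306 stator index survive the filter.
picked = select_motors(catalog, {"min_stator_index": 2306})
```

A keyword search would happily return the 1404 micro motor if its listing mentioned "brush busting"; the constraint filter cannot be fooled that way.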
This is the moat. Most AI tools hallucinate compatibility. OpenForge enforces it with a Compatibility Service that runs purely deterministic code.
It checks voltage matching, geometric clearance, and electronic protocols (UARTs, BECs).
```python
# app/services/compatibility_service.py
def validate_build(bom):
    # Pull the relevant components out of the bill of materials.
    battery, motor, fc, vtx = bom["battery"], bom["motor"], bom["fc"], bom["vtx"]

    # 1. Voltage Check (Prevent Fire)
    # A 6S battery (22.2V) on a high-KV motor = explosion.
    if battery.cells >= 6 and motor.kv > 2150:
        return {"valid": False, "error": "CRITICAL: Voltage Mismatch. Motor will burn."}

    # 2. Protocol Check (Prevent Logic Failure)
    # Does the Flight Controller have enough UART ports for the peripherals?
    required_uarts = 0
    if "DJI" in vtx.name:
        required_uarts += 1
    if "GPS" in bom:  # GPS module is an optional entry in the BOM
        required_uarts += 1
    if fc.uart_count < required_uarts:
        return {"valid": False, "error": "I/O Bottleneck: Not enough UARTs."}

    return {"valid": True}
```
The Insight: We treat hardware design like software compilation. If the types (voltage, mounting, protocols) don't match, the build fails before it costs money.
I used drones as the anchor for this project because they are complex systems involving mechanical, electrical, and software constraints. However, the architecture I built is domain-agnostic.
This is a Generalized Assembly Engine.
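One way to read "domain-agnostic": the pipeline stages stay fixed and only the domain's categories and rules change. A hypothetical sketch of that separation (the `Domain` structure and rule shape are my illustration, not OpenForge internals):

```python
from dataclasses import dataclass, field


@dataclass
class Domain:
    """A domain is just data + rules plugged into the same engine."""
    name: str
    categories: list
    rules: list = field(default_factory=list)  # each: dict -> error str | None


def validate(domain, bom):
    """Run every deterministic rule; collect human-readable failures."""
    return [err for rule in domain.rules if (err := rule(bom))]


# Drones today...
drone = Domain(
    name="fpv-drone",
    categories=["motor", "esc", "battery", "frame"],
    rules=[
        lambda b: "Voltage mismatch" if b["cells"] >= 6 and b["kv"] > 2150 else None,
    ],
)
# ...3D printers, e-bikes, or irrigation pumps tomorrow: same engine,
# different categories and rule list.
errors = validate(drone, {"cells": 6, "kv": 2500})
```

The Refinery Agent, the intent compiler, and the validator never need to know they are building drones; they only see categories, constraints, and rules.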
The future of AI in engineering isn't about training a larger model that knows everything. It's about building Agentic Architectures that know how to:

- refine messy, unstructured sources into clean data,
- compile human intent into parametric constraints, and
- validate every proposal against deterministic physics.
OpenForge is a proof-of-concept for this future. We are moving from Computer-Aided Design (CAD) to Computer-Generated Engineering (CGE).
If you are interested in building systems that interface with the physical world reliably, take a look at the repo.


