There is a specific smell to late-stage PCB work. A mix of coffee, solder mask anxiety, and the faint optimism that one more route pass will magically make analog behave. You poke around with oscilloscope probes and pore over datasheets, only to discover you failed to pull reset low; out comes the magnet wire for a bodge.
Over the last week I have been building out an ADE9000 breakout with an AI-assisted workflow wrapped around KiCad scripting. The results are useful, occasionally impressive, and still very far from “hands-off” for precision energy monitoring.
If you grew up with “don’t trust the autorouter” as muscle memory, good news: the saying still holds. We just scaled the blast radius from one menu click to an entire AI pipeline.
What changed on the board, not just on a slide deck
The recent commit sequence in ADE9000_Breakout tells the story better than any marketing copy:
- d1e3352 (2026-05-07): “Initial AI Creation”
- 043161c (2026-05-08): “Complete routing”
- b8c8132 (2026-05-09): “Reroute with JST connector for SPI”
- a469246 (2026-05-09): “Move decoupling closer”
- 84b86e7 (2026-05-12): “Relayout using connectors for analog inputs”
This was not a single-shot generation. It was iterative engineering with scripting and machine assistance in the loop:
- Placement and connector mapping in scripts like place_pcb.py.
- Deterministic route passes in route_pcb.py.
- Critical route seeding and patching in seed_critical_routes.py and patch_remaining_routes.py.
- Mechanical/silk cleanup with apply_board_markings.py and move_refs_to_silkscreen.py.
- Continuous ERC/DRC artifacts (erc.json, drc.json) committed as hard checkpoints.
- 3D STEP model handling pushed into the reusable layout and size-shape skills so mechanical review can keep pace with PCB edits.
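The “ERC/DRC artifacts as hard checkpoints” idea is easy to turn into an automated gate. Here is a minimal sketch; the JSON keys (`violations`, `unconnected_items`) are an assumption modeled on kicad-cli-style reports, so adjust them to whatever your tool actually emits:

```python
import json

def count_drc_issues(report: dict) -> int:
    """Count open issues in a DRC report dict.

    Assumes a report schema with 'violations' and 'unconnected_items'
    lists (kicad-cli style); adapt the keys for your tool's output.
    """
    return len(report.get("violations", [])) + len(report.get("unconnected_items", []))

def gate(report_path: str) -> None:
    """Refuse to proceed while the committed DRC artifact has open issues."""
    with open(report_path) as f:
        report = json.load(f)
    issues = count_drc_issues(report)
    if issues:
        raise SystemExit(f"DRC gate failed: {issues} open issue(s) in {report_path}")

# A clean report passes; a dirty one blocks the commit.
clean = {"violations": [], "unconnected_items": []}
dirty = {"violations": [{"type": "clearance"}], "unconnected_items": [{"type": "net"}]}
```

Wired into a pre-commit hook, this makes “the board passed DRC at this commit” a machine-checked claim rather than a comment in the log.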
That pattern matters. The AI and automation stack gave speed and repeatability, but the engineering value came from repeated correction passes driven by board physics.
The same iteration loop is now visible in whatnick-energy-monitor-skills:
- 335a718 (2026-05-16): Initial whatnick energy monitor skills
- 78bfc44 (2026-05-17): Document STEP model workflow
- 79d925d (2026-05-17): Clarify connector STEP model matching
That repo now captures reusable circuit, layout, routing, and board-shape guidance, plus the ADE9000 project overlay. The 3D model side was not a first-pass success either. STEP exports and connector models took multiple iterations before the project-local paths, model matching, and export workflow were stable enough to trust in a mechanical review.
NotebookLM research: good at methodology, weaker on analog edge cases
I started on this adventure with posts from Samuel Beek promoting AI PCB design. I blended in my research background and curated a set of academic papers on AI and algorithmic approaches to PCB place-and-route. I then queried my pre-built NotebookLM notebook, “PCB Place and Route Algorithms”, specifically for mixed-signal energy-monitoring constraints.
The useful part of that synthesis:
- Current AI placement/routing research still optimizes mostly for geometric proxies (wirelength, overlap, congestion).
- Newer approaches improve constraint capture and collaboration, but are explicitly human-in-the-loop.
- Model quality degrades on low-frequency pattern classes and novel interface combinations (cold-start behavior).
The important caveat from the same query was even more telling: source coverage was strong on AI placement methodology, but thinner on precision metering specifics like safety creepage strategy, return-current choreography around split references, and “what actually ruins ENOB on a real board at 2am”.
That gap is exactly the point.
Schematik, SnapMagic, and the new hardware workflow stack
Samuel Beek’s Schematik story is almost too perfect as an origin myth for this moment. An AI-generated wiring guide for a home-grown door opener took out the fuses in his apartment, which is a fairly direct way for physics to reject your prompt. Schematik has since raised $4.6 million from Lightspeed Venture Partners and is positioning itself as a “Cursor for Hardware”: describe a device in plain language, get a bill of materials, purchase links, and assembly instructions.
The interesting engineering choice is the safety boundary. Schematik is deliberately aiming at low-voltage circuits, typically 3-5 V IoT and maker projects, because that is where the promise is large and the downside can still be bounded. That is a sensible line. It is also a reminder that “AI for hardware” is not one market. The assistant that helps someone build an MP3 player is not the same system I would trust near mains metering, isolated sensing, or a DIN rail enclosure without a lot more constraint machinery around it.
SnapMagic is attacking a nearby but different layer of the stack. Its pitch is an AI copilot for electronics design, built on the huge CAD model base that started as SnapEDA: symbols, footprints, 3D models, part discovery, BOM optimization, supply-chain substitution, and integration with existing EDA tools including KiCad. That matters because a surprising amount of PCB time is not heroic analog insight; it is finding the right model, importing it cleanly, checking whether the footprint is sane, and making sure the thing still exists at Mouser or Digi-Key.
Schematik is closer to natural-language project generation. SnapMagic is closer to CAD-data and component-selection acceleration. My little ADE9000 workflow sits in the garage between them: NotebookLM for research synthesis, KiCad Python for deterministic changes, Freerouting for a first pass, project-local skills for repeatable domain knowledge, and human review for the parts that still smell like physics.
That is the practical opening for individual designers. You do not need to wait for one perfect vendor platform. You can build a small, opinionated workflow from pieces you already control:
- Put your design rules and recurring board-family choices into a local skill or checklist.
- Use AI for datasheet digestion, net naming, script generation, and alternative exploration.
- Keep KiCad, ERC, DRC, STEP export, and git history as the accountability layer.
- Treat external libraries like SnapMagic/SnapEDA as accelerators, then verify footprints, symbols, pin numbers, and 3D models against datasheets and mechanical reality.
- Let autorouters and AI propose routes, but hand-check return currents, decoupling loops, creepage, clocks, testability, and enclosure fit.
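The “verify footprints against datasheets” step in that list can be partly mechanized. A naive sketch: count pad definitions in a .kicad_mod s-expression and compare against the datasheet pin count. The footprint fragment and pin count below are invented for illustration, and a regex is only good enough for a sanity check; use a real s-expression parser before trusting this on production libraries:

```python
import re

def count_pads(kicad_mod_text: str) -> int:
    """Naively count pad definitions in a .kicad_mod s-expression."""
    return len(re.findall(r"\(pad\s", kicad_mod_text))

# Illustrative fragment of an imported footprint (not a real library file):
footprint = """
(footprint "LQFP-40_Example"
  (pad "1" smd roundrect (at -4.5 -3.0) (size 1.2 0.4))
  (pad "2" smd roundrect (at -4.5 -2.5) (size 1.2 0.4))
  (pad "3" smd roundrect (at -4.5 -2.0) (size 1.2 0.4))
)
"""

datasheet_pin_count = 3  # hypothetical; read yours off the actual datasheet
assert count_pads(footprint) == datasheet_pin_count
```

It will not catch a wrong pad pitch, but it catches the embarrassing class of error where an imported footprint silently dropped or duplicated pads.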
That sounds less magical than “hardware Cursor”, but it is much closer to how individual designers can safely get leverage today.
Why this hurts more on energy monitoring boards
An ADE9000-style board is not just “digital plus some analog”. It is a negotiated peace treaty between:
- tiny differential analog signals,
- noisy clocks and digital SPI edges,
- shared ground structures,
- high-voltage interfacing constraints,
- and assembly realities.
AI can route what it can score. Physics punishes what you forgot to score.
For energy monitoring, the failure modes are often subtle first and expensive later:
- A seemingly short route with a terrible return path becomes an EMI antenna.
- “Close enough” decoupling in XY turns into high loop inductance in 3D.
- Digital fanout convenience leaks noise into the measurement front end.
- Clearance passes in one view while creepage silently fails along real surfaces.
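The decoupling failure mode above is worth quantifying, even crudely. A first-order estimate for a trace over a solid return plane is L ≈ μ₀·h·l/w (microstrip approximation, fringing ignored); the numbers below are illustrative, and the formula is only good for comparing placements, not for absolute values:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def loop_inductance_nh(length_mm: float, width_mm: float, height_mm: float) -> float:
    """Crude loop inductance of a trace over a solid return plane, in nH.

    Uses L ~= mu0 * h * l / w. Good enough to compare decoupling
    placements against each other, not for absolute predictions.
    """
    l = length_mm * 1e-3
    w = width_mm * 1e-3
    h = height_mm * 1e-3
    return MU0 * h * l / w * 1e9

# A 5 mm run to the cap on a 0.2 mm dielectric vs the same cap 1 mm away:
far = loop_inductance_nh(5.0, 0.5, 0.2)   # ~2.5 nH
near = loop_inductance_nh(1.0, 0.5, 0.2)  # ~0.5 nH
```

Five times the XY distance is five times the loop inductance, which is exactly the kind of 3D reality that a placement score measuring only “nearest-component distance” never sees.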
The scaled-up autorouter adage
Classic autorouter distrust was about ugly traces and cleanup effort.
AI-assisted distrust is about false confidence.
You now get cleaner visuals, plausible routing, and confidence scores. The board can look more “engineered” while still violating analog intent. That is worse than obviously bad output, because it delays the moment when a human gets suspicious.
In the ADE9000_Breakout commits, that showed up as repeated topology and placement adjustments:
- decoupling moved closer,
- connector strategy revised,
- critical nets explicitly seeded,
- remaining logical gaps patched after Freerouting import,
- then relayout for analog input connector realism.
None of that is anti-AI. It is pro-accountability.
A practical review checklist I now treat as mandatory
Before I trust an AI-assisted pass on a precision board, I manually review:
- Return current continuity under each critical signal path.
- Analog/digital ground interaction at the exact stitch points, not just net names.
- Decoupling loop geometry (pin, cap, via topology), not just nearest-component distance.
- Clock and fast digital net proximity to high-impedance analog channels.
- Creepage and clearance across real isolation boundaries and along surfaces.
- Testpoint access and assembly risks (reworkability, tombstoning, awkward probe points).
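To keep that rule enforceable, the checklist can live as data so nothing passes silently. A minimal sketch (item names condensed from the list above; the structure is my own, not a standard tool):

```python
# Condensed from the manual review checklist; every item starts unreviewed.
CHECKLIST = [
    "return current continuity under critical paths",
    "A/D ground interaction at stitch points",
    "decoupling loop geometry (pin, cap, via)",
    "clock proximity to high-impedance analog",
    "creepage/clearance along real surfaces",
    "testpoint access and assembly risk",
]

def outstanding(reviewed: set) -> list:
    """Items not yet explicitly signed off by a human."""
    return [item for item in CHECKLIST if item not in reviewed]

# The AI pass earns no checkmarks by default.
todo = outstanding({"decoupling loop geometry (pin, cap, via)"})
```

The point is the default: an item is open until a human closes it, never the other way around.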
If any of those rely on “the model probably understood that”, I assume it did not.
What AI is genuinely good for in this workflow
After using it in anger, the wins are real:
- Faster exploration of placement variants.
- Deterministic scriptable edits to keep design intent reproducible.
- Constraint management that catches obvious misses early.
- Better ergonomics for repeated board evolutions.
- Standardizable mechanical and export workflows, STEP models included, though these still took several passes to reconcile footprint libraries, connector variants, and CAD export paths.
This is similar to CNC in machining. You still need a machinist mindset. You just get to fail faster and with better logs.
The part that still needs engineers
The physics does not care whether a trace came from a human, an RL policy, or a nicely branded copilot.
Energy metering boards live or die on the details that are hardest to encode as generic reward functions. The edge where analog integrity, EMC, and safety overlap is still mostly tacit knowledge earned through measurements, bring-up scars, and post-mortems.
So yes, use AI aggressively for PCB work. I certainly am.
Just keep the old sign above the bench.
Don’t trust the autorouter.
Now it applies to systems, not just traces.
If you are building similar mixed-signal boards, I would love to compare review checklists and failure cases that escaped DRC but showed up on the bench.
