A friend of mine went to the doctor for TMJ. The doctor told him to massage his jaw with warm water from a cup. That was the treatment. It worked, but only at home. He couldn't carry a cup of water around with him, so outside the house, the pain just came back.
So he decided to build his own device. A small handheld wand to treat it himself. Simple circuit board, basic firmware, nothing crazy. Vibrations and heat to mimic the warm water treatment, just portable. He wrote the firmware with AI in a weekend. The software wasn't the hard part anymore. The PCB design took him seven months. Yes, seven months of back-and-forth for a board that fits in a wand.
And there are thousands of people like him. Engineers, makers, founders with ideas for simple hardware products who hit a wall the moment they need a circuit board designed.
If you ask an electrical engineer why their workflow is so manual, they'll tell you that's just how it is. That's the answer. Not because they like it, but because nothing better exists. The EDA industry has operated this way for decades. The tools haven't meaningfully changed, and the problem itself is brutally hard.
And when I say hard, I mean it. PCB design is a combinatorial optimization problem, in the same class as the traveling salesman problem, but worse. Even a simple board with 50 components and 10,000 possible grid positions has more placement combinations than there are atoms in the observable universe (roughly 10^80). And that's just placement. The physical shapes exist in 3D, but the real problem space is 10+ dimensional: electromagnetic fields that evolve in time, behavior that changes with frequency and temperature, materials that are imperfect and nonlinear, fields that couple and interact with each other. Add manufacturing variance, current densities, thermals, and signal integrity on top.
And if any of it isn't carefully accounted for, the board either doesn't work or literally self-destructs. It has to be 100% correct, not 90%.
There's no known polynomial-time solution. You can't brute-force it. The best algorithms are heuristics: educated guesses built on decades of human intuition. And that's exactly why the tools have stayed the way they are.
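That placement explosion is easy to sanity-check. A minimal sketch, using the 50-component, 10,000-position example from above and the commonly cited estimate of roughly 10^80 atoms in the observable universe:

```python
import math

# Ordered placement of 50 distinct components onto 10,000 candidate
# grid positions, no two sharing a cell: P(10000, 50) arrangements.
placements = math.perm(10_000, 50)

# Commonly cited estimate for atoms in the observable universe: ~10^80.
ATOMS_EXPONENT = 80

exponent = len(str(placements)) - 1  # floor(log10) of the count
print(f"placements ≈ 10^{exponent}")  # ≈ 10^199
print(exponent > ATOMS_EXPONENT)      # True
```

Even with a coarse grid and a small part count, exhaustive search is off the table, which is why placement tools lean on heuristics.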
This is also why engineers are so conservative about their tools. When your design has to be 100% correct, you don't switch to a new tool version just because it has a nicer UI. You stick with what you know works. Some engineers won't even upgrade from one version of Altium to the next because their entire workflow is built around the specific behavior of the version they're on. A bug in the new version could mean a failed board. That kind of environment doesn't reward experimentation. It rewards muscle memory. And it's part of why the industry has been so slow to change.
There are millions of PCB designers and engineers working globally. The EDA market they depend on is worth over $19 billion and growing fast, projected to hit $31 billion by 2030. But here's the problem: the largest age segment of PCB designers is over 50. More than half the workforce is in the back half of their careers, and the pipeline behind them is nearly empty.
This isn't a small gap. By 2033, the U.S. manufacturing sector will need 3.8 million new workers, and nearly half those jobs are projected to go unfilled. The semiconductor industry needs over a million additional skilled workers globally by 2030. The demand for hardware is exploding. AI chips, IoT, EVs, defense, and space. And the people who know how to design it are aging out.
Robotics alone saw $14 billion in startup funding in 2025, nearly double the year before. Figure AI raised over $1 billion at a $39 billion valuation to build humanoid robots. Skild AI tripled its valuation to $14 billion. Apptronik raised $520 million with Google, Mercedes-Benz, and John Deere backing them. Every one of these companies needs custom motor controllers, sensor arrays, power boards, and compute modules. All custom PCBs. The engineers to design them don't exist yet.
The Knowledge Problem
Most of what makes a great PCB designer great was never written down. It was learned on the job, over years, through mentorship and mistakes. There's no university program that teaches you how to do real PCB layout. There's no bootcamp. IPC is trying with apprenticeship programs, but they can't scale fast enough.
So the knowledge stays locked up. Experienced engineers hoard it not out of selfishness, but because there's no good way to transfer it. The feedback loop between the people who know and the people who want to learn is so broken that young engineers just give up. They look at the tooling, look at the learning curve, and choose software instead.
And honestly? Who can blame them? The gap between modern developer tooling and hardware design tooling is embarrassing. I've been on both sides of this. I've worked at Apple and Meta, and built a robotic surgical arm that had to work under rocket launch vibrations through NASA's RockSat program. Now I'm building Trace, an AI-native PCB design tool. Our team includes engineers from NVIDIA who've designed at the ASIC level. We've seen what world-class dev tooling looks like, and we've seen what the hardware side is stuck with. The gap between those two worlds is insulting.
Software engineers have AI copilots like Cursor and Claude Code writing code alongside them. Instant feedback loops from linters like ESLint and Prettier catching mistakes on every keystroke. Version control on GitHub — branch, PR, merge, done. CI/CD on GitHub Actions and CircleCI running tests automatically on every push. Package managers like npm and pip pulling in any dependency with one command. Frameworks like Next.js and Django scaffolding entire apps in minutes. The entire workflow is integrated, fast, and AI-native.
Hardware engineers have Altium charging thousands per year for what is still a manual workflow. KiCad, free but manual everything. OrCAD that takes months just to learn. Version control means appending "_final_v2_REAL" to filenames. Component selection means opening Digi-Key in one tab, LCSC in another, and a PDF datasheet in a third. DRC is a button you click and pray. Routing is done trace by trace, by hand, for hours. Gerber export is a 12-step wizard you google every time. Separate files for every copper layer, silkscreen, solder mask, paste, outline, and drill. Zip them up, upload to JLCPCB or PCBWay, and hope your design rules match their manufacturing constraints. They usually don't. Minimum trace width, annular ring, via diameter, solder mask clearance. Every fab has different specs, and if you violate any of them, you're back in your EDA tool fixing things and re-exporting. Collaboration means emailing a zip file. The UIs have gotten prettier, but the workflow underneath hasn't changed.
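That fab-compatibility check is mechanical, which is part of the frustration. A minimal sketch of a pre-upload rule check, with placeholder limits and made-up field names (every fab publishes its own numbers):

```python
# Illustrative fab limits only; real fabs each publish their own specs.
FAB_RULES = {
    "min_trace_width_mm": 0.15,
    "min_annular_ring_mm": 0.13,
    "min_via_diameter_mm": 0.45,
    "min_mask_clearance_mm": 0.05,
}

def check_design(design: dict, rules: dict) -> list[str]:
    """Return a list of rule violations; an empty list means it passes."""
    violations = []
    for rule, limit in rules.items():
        # "min_trace_width_mm" in the rules maps to "trace_width_mm"
        # in the design's extracted measurements.
        value = design.get(rule.removeprefix("min_"))
        if value is not None and value < limit:
            violations.append(f"{rule}: {value}mm < {limit}mm minimum")
    return violations

design = {"trace_width_mm": 0.10, "via_diameter_mm": 0.50}
for v in check_design(design, FAB_RULES):
    print(v)  # flags the 0.10mm trace; the via passes
```

Nothing here is hard. The pain is that the numbers live in a PDF on the fab's website, and the check only happens after you've already exported.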
To be fair, the premium tools do have semi-automation features that KiCad doesn't. Altium and Allegro offer push-and-shove interactive routing, room replication for multichannel designs, constraint-driven clearance enforcement, and length matching wizards. These are real features that save time on specific subtasks. But they're power-user shortcuts, not design automation. The engineer still decides where every component goes, still routes every critical trace, still makes every tradeoff. None of these tools reason about your design. They enforce rules you've already set, but they don't derive those rules from a datasheet or understand the cascading consequences of a design decision. The thinking is still entirely on the engineer.
Why Now
You might ask: if the problem is this obvious, why hasn't anyone solved it? Because until recently, they couldn't. When the first wave of LLMs arrived in 2023, they were impressive at generating text and code — but they were terrible at understanding the kind of problem PCB design actually is. It's not just language. It's multi-domain reasoning: electrical, spatial, thermal, mechanical, and manufacturing constraints all at once, with tradeoffs that cascade across domains. Early models couldn't hold that kind of complexity in context. They'd hallucinate component values, ignore physical constraints, or produce designs that looked plausible but violated basic electrical rules.
That's changing. Recent advances in frontier models (deeper reasoning, longer context, better spatial understanding) have shown that, while there's still a long way to go, this kind of problem is no longer out of reach. Models like Claude Opus 4.5 are scoring higher than humans on SWE benchmarks and reasoning through multi-step constraint problems that would've completely tripped up earlier models. The same kind of reasoning (balancing tradeoffs across conflicting constraints) is exactly what PCB design demands. That doesn't mean you point a language model at an EM problem and hope for the best. The AI handles the reasoning layer: understanding design intent, making tradeoff decisions, reading datasheets and extracting circuits. The physics still runs through real solvers, DRC engines, and constraint checkers. The breakthrough is combining the two.
The models aren't perfect, but they're good enough to be genuinely useful: reading a 200-page datasheet and extracting the application circuit, understanding that a decoupling cap needs to be within 2mm of the power pin and why, reasoning about how moving one component changes the routing which changes the impedance which changes the stackup. Not just flagging that something's wrong — understanding the design intent behind why it should be a certain way. The gap between "AI can't do this" and "AI can assist with this" is closing faster than anyone expected.
AI Closes the Gap
We're not claiming we've solved this. We haven't. PCB design is one of the hardest problems AI can be pointed at, and there's a long road ahead. But we're doing the research in-house: combining LLM reasoning with optimization techniques like simulated annealing that have decades of proven results in placement and routing. There are very early signs. The models are starting to reason about constraints in ways that genuinely surprise us. Not perfectly, not every time, but enough to know this isn't a dead end.
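Simulated annealing is worth a concrete look. A minimal sketch, assuming a hypothetical eight-component ring netlist on a 20x20 grid and optimizing only total Manhattan wirelength (a real placer juggles far more than wirelength):

```python
import math
import random

def wirelength(pos, nets):
    """Total Manhattan length over all two-pin nets."""
    return sum(abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1])
               for a, b in nets)

def anneal(components, nets, grid=20, steps=20_000, t0=10.0, seed=0):
    rng = random.Random(seed)
    # Random non-overlapping starting placement on the grid.
    cells = rng.sample([(x, y) for x in range(grid) for y in range(grid)],
                       len(components))
    pos = dict(zip(components, cells))
    cost = wirelength(pos, nets)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
        c = rng.choice(components)
        new = (rng.randrange(grid), rng.randrange(grid))
        if new in pos.values():
            continue  # keep the placement overlap-free
        old, pos[c] = pos[c], new
        new_cost = wirelength(pos, nets)
        # Always accept improvements; accept worse moves with a
        # temperature-dependent probability. That acceptance of bad
        # moves early on is what lets the search escape local minima.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
        else:
            pos[c] = old  # revert the move
    return pos, cost

comps = [f"U{i}" for i in range(8)]
ring = [(comps[i], comps[(i + 1) % 8]) for i in range(8)]
placement, final_len = anneal(comps, ring)
print(f"final wirelength: {final_len}")
```

Perturb, evaluate, probabilistically accept, cool: that loop hasn't changed in decades. What's new is pairing it with a model that can reason about what the cost function should even include.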
This is what gets us excited about building Trace. AI doesn't only speed up the experienced engineer; I like to think of it as two-sided, because it also gives the young engineer and the beginner a shot at designing something meaningful. You won't need 15 years of tribal knowledge to place components correctly when the AI understands spacing rules, stackup constraints, and EMC considerations. You won't need to memorize DRC rules when the tool runs them for you in real time. We're not there yet, but the direction is clear.
We're not replacing engineers. We're 10x-ing them. The veteran gets to move faster and focus on the hard problems. The newcomer gets to actually build something instead of spending six weeks just learning the tool.
Beyond PCB
The semiconductor talent crisis is even more severe. The chip industry needs engineers with expertise in VLSI design, embedded systems, and AI chip architecture — and universities can't produce them fast enough. As chip complexity increases with smaller nodes and 3D integration, the hours per tape-out keep going up.
And here's the irony: AI is driving the biggest surge in chip demand in decades. Data centers, edge inference, autonomous vehicles. But we don't have enough engineers to design the chips that AI runs on. The CHIPS Act is pouring $52.7 billion into domestic semiconductor manufacturing, and the fabs are getting built. But a fab without designers is just an expensive building. The bottleneck was never the factories. It's the people.
The best hardware engineers in the world shouldn't have to fight their tools. And the next generation shouldn't have to choose software just because software tooling respects their time and hardware tooling doesn't. Hardware design shouldn't be held back by outdated tools and decades of unwritten knowledge. The $19 billion EDA market is ready for something new. AI is the catalyst, and Trace is how we get there: buildwithtrace.com.
