@Timot05
I wish you the best. Among other things, you are reinventing Verilog and other attempts. Yet, of course, evolution does not happen without people who are willing to devote their valuable time and effort to consider new ideas.
My personal perspective, after having designed hundreds of PCB's, is that there's a reason for which symbol-based data entry has survived decades of computer-based circuit design evolution.
I have also written more software than I can remember in more languages than I am able to list. I have less than zero interest in using a software process to describe circuits of any kind. What makes sense for hardware design inside an FPGA somehow does not translate well outside the chip.
Software and hardware engineering are very different things. It is probably correct to bring up the fact that many have tried to turn software engineering into precisely the opposite; dragging symbols around and connecting them in schematic form. That approach has failed for all but the most trivial educational tools.
Could you imagine the pain of building an entire software product using only assembly code? That’s about how we felt designing hardware.
Having done this dozens of times, I don't think it is painful and don't see a parallel between assembly coding and designing electronics. In fact, assembly coding is easy for the professional experienced software engineer. Of course, it is probably close to a nightmare for someone who's never done it or never even studied machine level coding. Ballroom dancing is impossible for me. I am not a dancer. And it should be in a range between difficult and impossible for anyone without the requisite training and practice. So, that's not the right metric.
Drawing schematics is easy. Schematics convey lots of information quickly. Function, structure, constraints, implementation notes, support/service notes, fabrication guidelines, etc. Also, they are trivially and instantly easy to understand, gather around and discuss. They present a mental image and state that would require fairly deep concentration to develop if reading a schematic as code (except for trivial circuits, which don't really matter). Schematics, as I said earlier, have survived the test of time for a reason: They work.
That's not to say a different approach isn't possible. All I am saying is that I would think hard before attempting something that very clearly few engineers want and care about.
For reference, I started life drawing all of my schematics and PCB layouts by hand. I then transitioned to using AutoCAD v1.0 and writing tons of LISP code to make that easier. From there I migrated and used a bunch of tools most experienced EE's have touched: OrCAD, Protel, Mentor, Cadence, Altium, KiCad, Eagle and likely others I can't remember.
Oh, yes, I also completed a few layouts back in the day using X-acto knives and stick-on traces and doughnuts on vellum. That was fun.
This page shows a number of approaches. I used all of these back in the early 80's.
https://www.pcbwizards.com/handtape.htm
Yes, that's a roll of black tape she is using:
https://i.imgur.com/yUgZ5zz.png
Schematics are not the problem.
This isn't intended to discourage you at all. All I am suggesting is that you might want to interview 100 EE's and see if this matches the idea of "build something people want". Don't interview software guys who tinker with electronics as a hobby (unless that's your audience). Interview real, working EE's with 10+ years of experience using EDA tools from different vendors and generations. My prediction is that you will quickly discover nearly nobody cares about moving away from schematics. You will definitely find other pain points, just not schematics.
Thanks for sharing this blog post! Super cool to see how things were done early in the industry!
Counterintuitively, I do agree with your points. We didn't end up designing the ato language because we actively wanted to end up there, but rather because we tried everything else first and none of the solutions worked out.
The problem we wanted to solve was: "What does git look like in hardware?". Another way to phrase it is: "How can groups of people coordinate and share their work in hardware?".
The first solution was simply to put KiCad on GitLab. That solves the version control part of the problem but doesn't solve consistency of design outputs, coordination and reuse problems. Then we tried to add automation on top (namely with KiBot) to generate manufacturing outputs automatically. That was pretty cool for consistency of manufacturing outputs, but it didn't solve the coordination and sharing aspect of the problem. And so at that point we had kind of run out of things to try. And that's when we started developing the ato language.
Schematics are definitely a great medium to share a design with your peers. But we've found that for those other aspects I mentioned, code is more suited.
Random thoughts.
I have a feeling that very few hardware projects have a bunch of people working on the same schematic. I have personally never experienced that in the context of a team project. What might happen is that each engineer works on a board and the interface between boards is defined by the team. I have done systems with 15 to 20 PCB's interconnected through custom backplanes this way.
Modern EDA tools have ways to save portions of schematics to a library of reusable designs. This could be one vector for collaboration if, for example, you have someone who is an expert in SMPS (Switch Mode Power Supply) design doing all your power supply designs. There is no need for this person to actively modify an ongoing design, all you need is a verified and tested schematic unit you can just plop down and move on.
I think a key difference in hardware is that you cannot constantly be engaged in modifying the schematic. These are things that become real physical instantiations at a given point in time. And so, the concept of using software tools isn't necessarily applicable. You most definitely do not want someone changing a schematic after it has been released to production.
Then you have to step back and look at things at a system level. The schematic is the least important part of the process. Imagine something like an industrial CNC machine, the real deal, a $100K machine that has to work 24/7 and not kill anyone. This machine likely has dozens of PCB's, wiring diagrams, mechanical components that need to be machined, injection molded, made from sheet metal, hoses, valves, sensors, etc.
The schematic is the least of the problems we come across when designing such systems. It simply isn't a pain point. And there is very little need to iterate like we might in the software domain. In fact, that could be very dangerous, because, at a minimum, you can't just recompile your way out of a mistake.
Sorry if I sound negative about this. That isn't my intent. This is just my opinion based on decades of doing electronics at every level. I have, BTW, used software to generate netlists for very specific designs. In one case it was an array of hundreds of sensors (the same sensor) on a PCB. I just wrote a Python program to generate the netlist. That's a very specific case where a schematic doesn't offer a great deal of value. In this case a simple PDF documenting the board interface and how the sensors were connected (a matrix) was enough.
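For flavor, a generator along those lines might look like the sketch below. The net names, reference designators and the row/column wiring are invented for illustration; they are not from the original board or any particular EDA tool's netlist format.

```python
# Hypothetical sketch: generate a netlist for a rows x cols sensor matrix.
# Each sensor's pin 1 goes to its row net and pin 2 to its column net.
# Names like "ROW0" and "U1-1" are made up for this example.

def sensor_matrix_netlist(rows, cols):
    """Return a dict mapping net name -> list of 'REF-PIN' connections."""
    nets = {}
    for r in range(rows):
        for c in range(cols):
            ref = f"U{r * cols + c + 1}"  # sequential reference designator
            nets.setdefault(f"ROW{r}", []).append(f"{ref}-1")
            nets.setdefault(f"COL{c}", []).append(f"{ref}-2")
    return nets

# A 2 x 3 matrix yields 2 row nets and 3 column nets:
for net, pins in sensor_matrix_netlist(2, 3).items():
    print(net, " ".join(pins))
```

From there, emitting whichever netlist file format the layout tool expects is a matter of formatting the same dictionary.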
We hear you! We're most certainly planning on eating our way up the system's chain, to describe, version control and validate at that level too.
As one example, in an earlier (and likely future) permutation of atopile we could compile harnesses (using the fantastic https://github.com/wireviz/WireViz) by linking two interfaces on connectors.
Longer term, if you can integrate mechanically too, you can at least check, if not generate, the harness lengths, bend radii, drip loops, etc. for that same harness.
Somewhat like software, you can only tractably do that at scale if the units work, and work reliably. Unit tests are the basis of software validation, and we expect the same of hardware validation. We're starting low-level, we know, but with concepts we know (from the insane scale of big software projects) can work on enormous projects too.
Many of the things you mention are not really done in hardware.
For example, unit tests. Even in FPGA designs, you can run functional simulations on portions of a design to help save time and validate. I don't believe we are yet at the stage where we simulate the entire chip. Not sure it would make sense even if you could. You have to worry about real-world effects such as clock skew and jitter that might not necessarily be deterministic. If you have designs running at hundreds of MHz or GHz, at some point you have no option but to run the design on the real IC and debug in hardware.
The other issue is that every company is likely to have their own process for some of the things you mention. Harness design and manufacturing is a good example of this. Companies like Siemens, Zuken, TE and others have professional solutions that often integrate with CAD/CAM tools (like Siemens NX) and produce professional manufacturing-ready documentation. Job shops, in many cases, are set up to receive files from industry standard tools and work directly from them. WireViz is a neat tool, but it is pretty much at the hobby level.
For example:
https://rapidharness.com/
https://www.zuken.com/us/product/e3series/electrical-cable-d...
https://www.sw.siemens.com/en-US/vehicle-electrification-wir...
You should not be discouraged though. That said, I would still urge you to interview a lot of EE's and product design engineers to really understand what you are walking into. You need to realize that you are not likely to change the entire product design and manufacturing industry just because you offer a software-like approach to design. That's just not going to happen. Industries have tons of inertia and they tend to only be interested in solving pressing problems, not adopting entirely new workflows. Also, the EDA/CAD/CAM industries are paved with the corpses of thousands of offerings that, collectively, over time, defined how things are done today.
My guess is you'd have to raise $100MM to $300MM, hire tons of engineers and devote ten solid years to materially influence how things are done. Nobody has the time or budget to introduce new tools, new problems, new training and grind their entire product development process to a halt just to adopt a new paradigm.
I'll give you an example of this from real life. The CAM tool we use to program our CNC machines is crap. We use CAMWorks Professional, which integrates with Solidworks and probably cost us $30K+ per license (between initial purchase and maintenance fees). We want to switch to at least doing CAM using the Fusion360 tools. However, this will definitely cause us to take a hit in productivity and possibly put out bad product for a period of time until the dust settles. And so, while we absolutely detest CAMWorks, we have no choice but to continue using it until a window of opportunity presents itself to make the switch. And, of course, also knowing full-well that the Fusion360 solution isn't utopia. There are no perfect tools. Just choices you might be forced to live with.
This is all so true! But it also describes the problem very well. Of course a $100k CNC machine design will always be developed by a small team collaborating as you describe. But the tools and processes that work for that most emphatically do not work well for smaller projects with more distributed contributors.
Custom PCBs are so easy to order these days, but they are hellish to design and test well. That would change rapidly if we could reuse basic blocks the same way we do with code.
Smaller projects are either handled by one or a few engineers, with each looking after one or more disciplines. I can't see collaboration on a PCB as a thing.
For example, I have personally done electrical, embedded code, FPGA and full mechanical design and manufacturing on many products that can be described as one or a few PCB's with < 1000 components. Not that hard.
I am open to the idea that I just don't have the experience in a domain where more than one engineer actively works on a single circuit board. By "actively works" I mean, simultaneous editing of portions of the PCB with the equivalent of a team leader handling pull requests, etc.
I have worked on PCB's full of analog and digital chips in the 30 x 30 cm range (12 x 12 in) entirely on my own while other engineers worked on other boards that plugged into the same backplane. I have never worked on a single board where anyone else is running simultaneous edits. The closest I have gotten to that would be a power electronics engineer "blessing" a switched mode regulator design that I then manually integrate into my work (and so do others).
So, yeah, don't know. I'd be interested in hearing from anyone who regularly works in an environment where simultaneous team editing of a single PCB happens in the same manner as one might with something like VSCode and "Live Share" collaboration sessions.
Sorry, I just don't know if that proposal is realistic. Not my range of experience, which means nothing at all.
I agree with every comment robomartin has posted on this thread.
I'll add: I HAVE worked on schematic designs with 10-20 people working on an overall schematic design at the same time. Usually one or more engineers own some sub-section of the design; i.e., for a mobile device, one or a few people own the "power" part of the schematic, the camera team owns the camera portion(s) of the schematic, etc.
Mentor (DxDesigner / Design Capture) and Cadence (Concept) support this flow, but with the oddly more popular OrCad this can be a nightmare.
Hi, @Timot05. I’m a former EE with 20 yrs experience working in the industry and have designed dozens of very large complex mixed signal PCBAs 4-32 layers, as well as about the same number of large FPGA SoC designs.
I watched the demo video out of curiosity and here’s my 2 cents, though there is a lot to unpack here:
First, if you want to know the current state of "what git looks like in hardware" as far as PCBA design is concerned, look at Altium, which now uses git under the hood to provide a very nice visual way to show differences in both the schematic and PCB layout between versions. That solves some real pain points for an EE, and therefore EEs actually want it and use it. There are ways to create reusable, version controlled sub-circuits that get put into libraries as well:
https://www.altium.com/altium-365/hardware-version-control https://resources.altium.com/p/introduction-git-altium-desig...
Whatever open source you build should be modeled after that.
I found the above very nice after years of manually using git to version control PCBA designs developed in other ECAD tools like PADS, Orcad, Cadence etc. I even tried to get a couple EE's and layout people to use my methods of git version control, documented in the README of course, but to no avail; most strictly hardware EEs (with no FPGA or SW background) either can't grasp git or don't see the point in spending the time on it. The same people will quickly pay $10k-15k a seat for Altium to get that slick UI on top of git version control because it makes things more visual, not less, which, as others have mentioned, is important in PCBA design; I'll get to that in a second.
But, I understand better what you actually mean because you phrased it another way “how can groups of people coordinate and share their work in hardware?”
That depends of course how / who are you coordinating and sharing with? How are you dividing up the work?
In my experience, even with a very large complex mixed signal design, there is one guy per board, and in the extremely rare case that a single board is being worked by more than one person it is usually divided up by discipline typically, RF, analog and digital and those people contribute their pages in any number of ways including using sub-circuit modules out of an Altium library.
And, EEs lean heavily on reference designs and eval hardware to do their work. Though they do spend a lot of time reading data sheets, they really don't have time to read most of them; they just reference the ones they need to integrate existing designs and handle the new parts of the design (which need to be minimal). They need to get large parts of the working design handed to them, with eval hardware they can test and vet on the bench, and with reference designs (schematics, layout, firmware and some documentation) provided and supported, somewhat, by the vendors of the major components. Not some open source developers who can't really support the major components utilized in their designs effectively, because they don't have access to all the information they would need to do that, talk to the fab, etc. They can only talk to FAEs, same as all other engineers outside the vendor.
Also, PCBA design is very process oriented. Even if a company doesn't have a PCBA design process, this will not slow down an experienced EE who, I'd say, has learned what I call "The Process" with a capital T. Schematic capture and layout are 2 very hard and fast steps in that process, with boundaries that have been defined by decades of EE college curriculum and EDA development and that aren't likely to see any change in the near future. Though I think the EDA industry is in need of some disruption somewhere, it is not here.
I started designing CPLDs and FPGAs right as most people had switched from capturing their chip designs in schematic form to capturing them in verilog and VHDL. It was a disaster. Most EEs in that space (including myself for a time) were really bad at architecting their modules and then structuring and writing the HDL code to make sense and be maintainable. Good documentation and block diagrams were essential for development and maintenance. And, after 20 years, it is still a huge problem for a lot of designs. I really wish someone would make a good tool to aid in the construction of block diagrams from the code of large FPGA designs. Too often I would inherit a legacy design and find myself having to spend a week or two creating block diagrams and other documentation for the design.
I shudder to think what it would be like to try to understand and debug a PCBA for a 20 layer board that was just a bunch of code or text and didn't have a schematic and some decent documentation. It's bad enough that EE's are frequently handed a large schematic and layout with little or no documentation for it except a folder full of the data sheets for each component.
Thanks for your detailed answer! Tons of information in there.
To your point about how the work gets divided, I don't think that work will necessarily be distributed amongst many people for a given design but code gives you a couple of benefits:
- you can branch and merge the changes you make, which means that you can design a given feature and have your team review only this change instead of reviewing a batch of changes.
- If someone actually does want to pick up your design and contribute to it, code makes it easier for them because more information is captured in the code, so they get a better sense of what the design requirements are.
To your last point, we don't really see the code becoming the documentation of the implementation of the module. Actually the goal is to do something similar to the IC space or the software space, where the code becomes the implementation of something but the documentation lives somewhere else, perhaps in a datasheet-like document. Currently the schematic forces you to have both the documentation and the implementation baked into the same document, which forces you to find a middle ground between adding information relevant to those two objectives.
I don't think this is a fundamental, exclusive benefit of "code" (i.e. text). I am skeptical of how textual merging, especially automated by default, can be applied in this domain except in trivial ways. This is usually an easy sell for software because you can structure your system to localize effects (even if it isn't perfect there). The big guns have very sophisticated workflow and tooling for this.
https://www.altium.com/documentation/altium-designer/collabo...
https://www.altium.com/documentation/altium-designer/using-v...
I sincerely don't understand. Perhaps it is informative to consider that SPICE has been around for 50 years. The idea of using textual description for circuit design is not new at all, and they are used when appropriate. One must ask why the basic workflow is the way it is when the alternatives have always been possible.
Not entirely onboard with this forcing assertion, but is this necessarily a bad thing? https://en.wikipedia.org/wiki/Literate_programming
That's a seriously important point. Open source is filled with just plain horrible code. EE's who are not generally trained in software development can deliver reasonable schematics. There's the potential for utterly unintelligible designs if they had to create them using software techniques. I have seen FPGA code that is most-definitely cringe-worthy and unintelligible without, as you said, stopping to draw a schematic.
One way to think of this --in terms of circuits, not FPGA's-- is that a schematic can stand on its own. A text based netlist, even if sophisticated, requires documentation and, more than likely, a schematic.
I should probably add this:
One of the most useful things when developing anything at all is push-back or criticism. That's where you find value, insight, ideas and learning. I always value counterpoint far more than anyone who keeps telling me how great I am and that all is well.
Very recent example: I enlisted ten people a few weeks ago to critique a presentation I had to give at a conference in Zurich. One person, when I sent her the first draft, told me it was fantastic and that it sounded great. I never sent her version 2. Others pounded me hard with critique from every angle. They got every version until they had nothing negative to say. The last person stuck with me until the very last draft, she critiqued it until we both agreed it was time to let it go and focus on delivering what I had.
I say this to highlight that what I've been telling you is 100% in the spirit of constructive criticism. People telling you this is great is fantastic, but that never helps you make a better product or understand if you are on the right path. Keep going. Do what you believe is right. You don't have to accept or act on any criticism or push-back, listen to it, take it in and then do what you think is right given your objectives.
Not at all! Thanks so much for engaging with us on this level. It's greatly appreciated!
Fully agree! We'd much rather be in a situation where criticism is consistently thrown at the project so we can further improve it rather than having positive feedback and the discussion just stopping there.
I am curious to hear that you don't see any value in bringing software workflows to hardware. I found in previous jobs we spent a huge fraction of our time dealing with things like releases, version management and reviews. Thanks for your thoughts!
Here's something that has been true for decades: Storage is cheap and always getting cheaper.
When you are dealing with complex multidisciplinary designs (a full machine), the concept of a release takes on a very different meaning.
Scenario: You update a single circuit board, or the code that runs on a single microprocessor.
The release descriptor for the product, the machine, must include every single element that goes into making that machine. If all you do is commit the software update to Git, you run a very real chance of creating a very serious problem for manufacturing, qualification, testing and post-sales support.
It has been my experience that, because storage is cheap, a release due to any change is a full annotated copy of all design files for the entire design; electrical, firmware, software, mechanical, testing, manufacturing, etc. In some cases we've had to save entire virtual machines with working code, because workstation and system software can mutate and things break. We recently had a case where a Windows 10 automatic update (don't get me started) broke code that used to work in last month's Windows version.
If we accept that storage is cheap, then releases at the product level are simple copies along with a set of relevant "readme" files describing release details. You can come back to this three years later and all is well.
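The "release is a full annotated copy" scheme above is simple enough to sketch in a few lines of Python; the directory layout, the date-stamped naming and the readme filename are assumptions for illustration, not anyone's actual process.

```python
# Sketch of a "release = full copy of everything + readme" step.
# Paths and naming conventions here are invented for the example.
import datetime
import pathlib
import shutil

def cut_release(design_root, releases_root, notes):
    """Copy the entire design tree into a date-stamped release folder."""
    stamp = datetime.date.today().isoformat()
    dest = pathlib.Path(releases_root) / f"release-{stamp}"
    # Full copy: ECAD, firmware, MCAD, test and manufacturing files alike.
    shutil.copytree(design_root, dest)
    # Annotate the snapshot so it is self-describing years later.
    (dest / "README-release.txt").write_text(notes)
    return dest
```

Disk usage grows linearly with releases, but as the comment notes, storage is cheap relative to the cost of being unable to reproduce a shipped configuration.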
In the era of little Arduino boards it is sometimes easy to forget that, more often than not, electronics don't exist just for the sake of electronics, but rather as part of a much larger, and often far more complex, multidisciplinary product. In that context, schematic entry is an insignificant burden. The same is true of hyperventilating about what code editor to use. Some of these things are truly rounding errors in the product life cycle.
The value is mostly in the software workflow, not in the software tooling.
Store your KiCad files in Git. Make a script which automatically adds a revision tag to the PCB. Create a CI pipeline for Gerber generation. Export PNGs so you can do visual diffs.
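The revision-tag step can be sketched like this (Python here; the `REV_PLACEHOLDER` text field and the plain-text search-and-replace are illustrative assumptions, not a documented KiCad workflow, though KiCad files are indeed plain text):

```python
# Sketch: stamp a git revision into a board file before generating outputs.
# Assumes the board file contains the literal "REV_PLACEHOLDER" in a text
# field somewhere -- that placeholder name is invented for this example.
import subprocess

def git_revision():
    # Short, human-readable revision, e.g. a tag plus commit hash.
    return subprocess.check_output(
        ["git", "describe", "--always", "--dirty"], text=True
    ).strip()

def stamp_revision(board_text, rev):
    # KiCad's formats are plain s-expression text, so a simple
    # search-and-replace on a placeholder is enough.
    return board_text.replace("REV_PLACEHOLDER", rev)
```

A CI job would read the board file, pass its contents through `stamp_revision(text, git_revision())`, write it back, and then run the Gerber and PNG exports on the stamped copy.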
If you're collaborating on some project which consists of artwork made in Photoshop you aren't looking for a text-based drawing program either - you are looking for a better way to do diffs.
This is the most useful comment in the thread (so far)!
This product would not work at all for any analog or power designs--EEs like to visualize current flow and a schematic is the best way to do that. Maybe if it could interoperate with small blocks of schematics, treating them as modules, it could be useful. If there was a way to parameterize part values, like those used to build analog filters, that could also be useful, but not in the current text-only form.
The one thing it could be useful for is creating net connectivity for large numbers of pin to pin connections, like DDR memory or PCIe buses. Schematics for these end up looking like large data tables anyway, and can be tedious to create and prone to errors.
I see so many EDA startups demoing their product with simple Arduino boards and other low to medium complexity designs. It's far more effective to start with the most complex board design. Take a server board from the OpenCompute project with a few thousand parts and a few tens of thousands of nets. What would that look like in this language? Would it have too much boilerplate code? What would the experience of creating it be like? How do you handle back annotation? How do you handle pin swaps, or part section swaps? How about BOM variants?
This is a good example. I'll expand it to include such things as breaking up the symbols for a large FPGA (say, over 1000 pins) into multiple schematic blocks and managing them as the schematic design progresses. Very often you have to move pins around within the same block or between blocks just to arrange them logically or align them with whatever they might connect to.
For example, take the output of an FPGA bank and connect it to the input of an HDMI transmitter chip. You would want to align the red, green, blue, clock and control pins so that you can connect them with a bus that looks right. If someone reading this isn't clear, imagine arranging the pins on each device in random order where, for example, the first chip has R0,B8,R1,G5 and the second chip is R0,R1,R2, etc.
This kind of thing is tedious and painful.
The way we solve the problem with Altium was to write code that allows us to fully define an FPGA (and other IC's) using data entry in Excel. One turn of the crank creates all symbols and PCB patterns and exports into the libraries. This took something that was a days-long torture to maybe 45 minutes of work. After that, if you need to move pins around, it takes minutes to update everything.
This is also an example of why a text file based approach would not be as useful. Entering data in Excel is very easy. Editing it is just as easy. In some cases you can even copy and paste from datasheets and easily edit or rearrange. I can also write formulas to generate sequences of pins, etc.
Having to manage this in a text file would not be as slick. Over the years, I have found that Excel, coupled with VBA code, is a very powerful tool for schematic entry and PCB design.
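The core of that data-driven flow can be illustrated with a plain pin table (CSV rows standing in for the Excel sheet; the column names and the per-bank grouping rule are invented for the example, not the actual Altium tooling described above):

```python
# Sketch: turn a tabular pin list into per-bank symbol blocks, so moving a
# pin between blocks is a one-cell data edit rather than a drawing session.
# The table columns ("pin", "name", "bank") are made up for illustration.
import csv
import io
from collections import defaultdict

PIN_TABLE = """pin,name,bank
A1,R0,BANK0
A2,R1,BANK0
B1,CLK,BANK1
B2,HSYNC,BANK1
"""

def symbol_blocks(table_text):
    """Group (pin, name) pairs into one schematic symbol block per bank."""
    blocks = defaultdict(list)
    for row in csv.DictReader(io.StringIO(table_text)):
        blocks[row["bank"]].append((row["pin"], row["name"]))
    # Sort each block by signal name so related pins line up on the symbol.
    return {bank: sorted(pins, key=lambda p: p[1])
            for bank, pins in blocks.items()}
```

Emitting the actual library symbols and footprints from these blocks is then a formatting exercise against whichever EDA tool's library format you target.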
Yeah, I have to agree. It's a neat idea and I can see it having some applications, but I don't really see myself using this for anything serious.
I think OpenSCAD neatly demonstrates the issue with 3D modeling: when you come from a tool like Blender it might look amazing that you can now trivially change dimensions mid-design, but what you're really looking for is the parametric modeling you get from something like Solidworks.
Schematics-as-code doesn't solve a problem I am having. What I want, is for an easy way to import "snippets" in KiCad: I want a library where I can select an MCU, and it'll automatically add all its auxiliary parts too and let me choose from a few pre-made standard routing options. Maybe even make it parametric so it becomes a nice little wizard where if I'm adding an LDO I can enter a voltage and it'll calculate the adjustment resistors for me.
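The LDO part of that wizard is just the classic adjustable-regulator formula Vout = Vref x (1 + R2/R1). As a sketch (Vref = 1.25 V matches an LM317-style part; the fixed R1 default and the E24 snapping are illustrative choices, and the small Iadj term is ignored):

```python
# Sketch of the parametric calculation an LDO wizard could do.
# Vout = Vref * (1 + R2/R1); Iadj is neglected for simplicity.
import math

# E24 preferred-value mantissas (5% resistor series).
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def nearest_e24(value):
    """Snap a resistance (ohms) to the closest E24 preferred value."""
    decade = 10 ** math.floor(math.log10(value))
    candidates = [m * decade for m in E24] + [E24[0] * decade * 10]
    return min(candidates, key=lambda c: abs(c - value))

def ldo_feedback(vout, r1=240.0, vref=1.25):
    """Return (R1, R2) with R2 snapped to E24 for the requested Vout."""
    r2_ideal = r1 * (vout / vref - 1)
    return r1, nearest_e24(r2_ideal)
```

For a 5 V output with R1 = 240 ohms, the ideal R2 is 720 ohms, which snaps to the E24 value 750; a wizard would then report the resulting actual output voltage alongside the chosen values.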
You're right about "code" not being the right solution for everything. In the case of the layout, we have indeed already implemented an MVP of the "snippets" approach you described: https://atopile.io/blog/2024/02/02/-layout-reuse-keeps-getti...
A big part of the challenge is that creating a wizard for each issue is a tall order to develop, but through an expressive and composable language we should be able to articulate the parameters for the configuration just as clearly, and generically solve that class of problems.