But the point still stands, I think. We're talking about 2 or 3 students' worth of tuition to fund this.
The very small college I work for (enrollment under 1,200) just spent $250k on two robotics classrooms.
A school with any engineering program at all will spend this amount of money on something like this with very little hesitation.
And even if they won't spend hard dollars, every institution employs people to write grants (that's what I do). The amount of public and private grant money in the STEM field for higher education is mind boggling. I can think of 4 businesses that would write this check next week after the application is submitted. Hell, I can think of one business that would probably write the check tomorrow after a quick phone call.
As I read more about the dark art of IC fabrication, though, I realized that even this was a faint dream. I had imagined a world of lasers carving troughs and print heads carefully laying down the lines and doping the silicon, an elegant symphony of modern technology.
But the real world is much messier -- every stage involves dangerous and toxic chemicals, and processes that are spoiled by a speck of dust in the wrong place, causing either a cascade of reagent failures or a physical impediment to correctness. It's distressingly analog, oh so messy, built by trial and error and refined by domain experts in ways that are intensely hard to replicate, because all the same lessons need to be learned again each time.
I'm glad to see the work being done here for hobbyist fabrication, but barring huge leaps and bounds, the gap between the neat lines in Magic and the shiny silicon discs is a vast chasm owned by the material scientists, not the electrical engineers or the software engineers.
Well, why not 100um then? It's still way better than discrete components.
Just checking: "4 mil"? When I see (or hear) "mil" I assume millimeters, which clearly isn't right here, but I don't know if this is autocorrect or shorthand for something I've never seen called that before, say "1e-4 meters"?
I do wonder if taking this approach would work better with a novel construction method. Lithography and nasty chemicals are easier for resolution, but nasty chemicals.
On the other hand there will always be someone standing by to tell you an FPGA could have done that.
Someone's going to judge what you do regardless. As long as you're not hurting someone else, go build what you want to build, others be damned.
I've done
Pacman in DCPU-16 asm
An HTML/JS desktop environment
8-bit AVR assembler in JavaScript
An 8-bit fantasy console with web IDE.
and whatever the hell https://c50.fingswotidun.com/ is, plus a bunch of other things that really make people wonder if I have any sort of life plan at all.
People don't even really etch their own PCBs anymore, since ordering them is so fast and cheap, let alone spend $10k+ to manufacture a six-cent item (maybe!), so there was never enough motivation for a DIY movement to make ICs and other nanofabbed stuff.
I would whip out the credit card if I could make 555 timers at home for fun for $1,000.
Not sure I'd put a second mortgage on my house for a chance at maybe making one, assuming I didn't screw up too much.
Clearly a big part of why all this tech has been so successful is that it's about investing a lot up front but eventually being able to mass-produce at a ridiculous scale. Few industries have such a ratio (possibly pharma?).
So it's all about making chips by the hundreds of thousands. It requires a very different approach from any tech intended to make chips by the handful.
If there was a box that received supplies and outputted usable PCBs with minimum external mess, a lot of people that currently buy boards would use it instead.
(And well, PCB manufacturing is basically the same process as chip fabrication, without the miniaturization. If nobody managed to create a "PCB printer", why do people keep hoping for a "chip printer"?)
LPKF did, it's just expensive.
> That's why nobody etches their own PCBs.
Get into exotic laminates and ultra-high-speed performance with very tight tolerances and you'll see some prototyping done in-house.
Now, you're getting those massive print farms that are able to change what they produce on the next print.
> As I read more about the dark art of IC fabrication
I want to push back on this being a "dark art" - there is no magic in engineering (inb4 "any sufficiently advanced technology" etc etc). It's a skillset that requires education, experience, and expertise on par with anything we do in other areas of engineering. The stakes are just a little higher than software because you're dealing with the physical world, and physical things have tangible costs and/or danger.
The thing that may trip people up is that IC fabrication is one of those things that doesn't really have a hobbyist tier. Anything beyond a toy requires multiple people and support staff in addition to gear and raw materials that are hard to get as any old civilian - in addition to the clean room facilities. Like the reason my university closed their lab was partly because the grad/PhD students and professors had moved on, and partly because it was becoming more difficult to source wafers for research institutions that they could actually use (everyone got hired by labs in industry, where they were making their own wafers or buying them wholesale afaict).
* iirc only the penultimate project got taped out and fabbed, with terrible yields due to time constraints
I think "engineering" in software generally means optimizing a path to a targeted set of behaviors so that the piles of garbage underneath don't end up blocking their execution for eternity.
Our starting point is therefore different. You somehow have to work around all the physical piles of dust and the patchwork of fires that must be constantly igniting inside your laser machinery. I picture it something like the mad surgeon in Minority Report, creating a small transient sterile environment to do illegal eye surgery in a room full of filth.
In that light your "art" looks "dark."
Are you sure university labs are really able to do this? If so, how come only a few companies like TSMC and that one Dutch company are able to manufacture microchips? Or are those two completely different things and I'm just confusing myself?
The physical masks themselves are usually made by Hoya, and the technology to actually etch the masks is made by Veeco.
There was a lot of older or used equipment universities could buy before fabs started being millions of square feet with hundred-million-dollar pieces of equipment.
But even running a small-scale fab spitting out 7400 series chips and 555's is still pretty serious business; you need chemical engineers and material scientists as well as electrical engineers and software engineers (and multidisciplinary versions of those people) to keep things running at all. And nobody can do this stuff out of college -- everyone has extensive apprenticeships and practical experience working in other fabs because so much of the process is knowhow rather than technical specifications.
You can poke and prod anything into place with e-beams and FIBs and manually dipping wafers in baths and ovens and such. 1% yield, hour long write times, and all sorts of R&D jank are perfectly fine for checking functionality of your fancy ultra-FET design or making a ring oscillator to simulate integration. Did a grain of dust land on the wafer and ruin 100 of them? No prob, use the other 300, just try not to let it happen again. But integrating a billion transistors, coordinating them to do a billion calculations per second, QAing them to work for a billion seconds with 0 errors, and manufacturing them to profitably sell at $100 a pop? No jank allowed, no small scale antics allowed, and your budget now requires all the zeros it can find and more besides.
Yes, I know of multiple universities that have labs for small scale IC production. In fact anywhere doing research in the field will have some ability to build these things, or access to the industrial labs nearby. Even in industry, there are small scale labs that are used to develop the processes before they get built out at scale.
> If so, how come only a few companies like TSMC and that one Dutch company are able to manufacture microchips?
There are thousands of chip manufacturers worldwide. TSMC is just the largest/most cutting edge. ASML is the company that makes special tools for IC manufacturing (however, researchers can/do experiment with the things that ASML is doing on smaller scales).
But keep in mind - no researcher at a university is trying to manufacture millions of 3nm CPUs for next year's iPhone. Just as an example, today we have GaN switches in our 100+W USB-C chargers that fit in your pocket. That came directly from university and industry research in small-scale labs into wide-bandgap semiconductors, which was developed by fabbing real circuits and testing them.
In 1983, the cult Polish science-education TV program SONDA documented the design and manufacturing of the first batches in a humorous "let's bake a cake" fashion. Paper plotters, light pens, developing/rinsing dies by hand, electron microscope debugging, the whole nine yards!
part 1 https://www.youtube.com/watch?v=AJGp7keIA_o
They are even able to work with external clients to sell the chips they make.
ASML, that one Dutch company, is the only manufacturer of EUV photolithography machines, which are required to produce the cutting-edge of chips. There are plenty of chips that aren't cutting-edge, though, and plenty of reason to produce them in both academic and commercial settings.
The key here is research scale. Larger process nodes, minimal automation, and smaller yields. Which is just fine, because the idea is to prototype new ideas rather than produce millions of chips.
What they get to do may not help with DIY in a garage on Earth.
imho, the deeper problem is that there are just very few situations where you need a custom chip that can't be covered by existing options or FPGAs, and vanishingly few people have the expertise to get anything interesting done even if they had cheap access to fabs
(check out tiny tapeout, though!)
Most people can't even make FPGA designs that work properly, and they YOLO the simulation, given that metastability is beyond most users' understanding.
>few people have the expertise to get anything interesting done even if they had cheap access to fabs
Chicken or egg problem... a walled garden simply gets fewer visitors, and people with expensive toys tend not to share. =3
As long as there's a problem and there's money to be made, these things you mentioned can be solved.
https://jlcpcb.com/blog/flex-pcb-available-at-jlcpcb-from-sp...
https://www.pcbway.com/fpc-rigid-flex-pcb/flex-pcb.html
And in case folks reading this don't already know it, multi-layer rigid printed circuit boards are a common technology based on laminating together multiple very thin rigid layers with each layer carrying separate traces.
Not only built by trial and error, but also continuously adapted in near real time to deal with new sources of error.
The most complicated aspects of semiconductor manufacturing utilize statistical process control to determine the best course of action by relying on large sample sizes. You probably couldn't start up a modern manufacturing line without already having a manufacturing line due to this. Finding viable "hyperparameters" for a photo tool makes training an LLM look like a tutorial. Bootstrapping all of this required direct human involvement with ever-so-careful incremental offloading of these concerns to automation over a period of decades.
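To make the SPC point concrete, here's a toy X-bar control-chart calculation in Python. Everything in it (the critical-dimension numbers, the subgroup size, the lots) is a made-up illustration of the general technique, not anything a real photo tool actually tracks:

    import statistics

    # Hypothetical critical-dimension measurements (nm), grouped per wafer lot.
    lots = [
        [45.1, 44.8, 45.3, 45.0, 44.9],
        [45.4, 45.2, 44.7, 45.1, 45.0],
        [44.6, 45.0, 45.2, 44.9, 45.3],
    ]

    lot_means = [statistics.mean(lot) for lot in lots]
    lot_ranges = [max(lot) - min(lot) for lot in lots]

    grand_mean = statistics.mean(lot_means)
    mean_range = statistics.mean(lot_ranges)

    # A2 is the standard control-chart constant for subgroups of size 5.
    A2 = 0.577
    ucl = grand_mean + A2 * mean_range  # upper control limit
    lcl = grand_mean - A2 * mean_range  # lower control limit

    for i, m in enumerate(lot_means):
        status = "in control" if lcl <= m <= ucl else "OUT OF CONTROL"
        print(f"lot {i}: mean={m:.2f} nm ({status})")

Real fab SPC runs this kind of check across thousands of parameters with far larger sample sizes, which is exactly why you can't bootstrap it without an existing line to learn from.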
There's generally an unstated (and occasionally explicit, as in this case) reverence from software people for the kind of mythical engineering that goes on in fabs. In reality, if you've had any direct experience with the manufacturing process -- and I'm talking about current- or next-gen processes for the most sophisticated mass-market devices like those going into flagship smartphones, mining ASICs, GPUs, and critical applications like EVs -- you know that a bunch of it is in the hands of folks hired mainly because they'll accept low pay and will eventually get the necessary work done to a prevailing standard best described as "adequate".
Valley types especially, but even other software folks would be really surprised by how much of what goes on in fabs is basically the sort of thing that you would expect to see from people plucked from amateur hour. I've posted about this before on HN. Where improvement to existing chipmaker operations is concerned, the fruit hangs so, so low.
Elon's biggest, dumbest misstep is not just buying Twitter; it's buying Twitter and not putting an equal or lesser amount of resources instead into gaining control over how his own (and others') chips are made—doing the same thing for the industry that he did with SpaceX for aerospace.
Again, because it cannot be emphasized enough: what passes for acceptable in fab operations is bonkers.
Can you link to some of your previous comments on this subject?
All these things have been done by hobbyists before, but I suppose doing all of this for a single PCB just isn't attractive.
[1] https://www.engineeringtoolbox.com/particle-sizes-d_934.html
Vacuum chambers and vacuum pumps... metrology microscopes...
The cost to grow single-crystal boules is insane.
Etc.
E.g. these guys are putting retro games onto FPGAs: https://www.analogue.co/developer
If my assumption is true, how is this better than FPGAs?
Why? That assumption ignores a good chunk of individual interest. It's similar to people suggesting you order PCBs instead of making your own: sure, making a thousand copies of a PCB is now cheap enough on the margin to be accessible. But what about making five? Or just one?
Not every human sees a hobby as an investment in a business. Not everyone does projects with a sellable product in mind. Many just want to test their ideas, have fun, scratch their own itch, and build something so that it exists, not to sell it.
The primary value of a home fab to me would be to enable fabbing a single task-specific chip (or a tiny amount of them) for any random need I have, whenever it occurs.
For example JLCPCB is currently offering 5 PCBs (2L, 100x100mm) for $2 + $1.52 shipping. That's why people are saying that making your own PCBs is not economical.
For starters, you can make analog/mixed-signal chips
That doesn't even work very well with 3D printing. You have no chance at all of transferring something from 10µm chips into a commercial fab.
Right now Tiny Tapeout seems to be the best option.
Dedicated DACs/ADCs will almost always offer better performance than the ones you'd find on a microcontroller or even an ASIC.
You also need a large amount of input/output - a good start on a chip would be about 1,000 to 10,000 electrodes. I think it is going to be difficult to put that many on an FPGA.
I recently learned about DNA-directed crystal growth and was excited by the idea that it might be a more tractable approach to being a big thing and making a small thing (like an integrated circuit). I'm not sure how one would go about it in their garage, but programming the steps that need fine control into the chemistry rather than into the machine feels like a win.
There are low-volume lab processes around that can hit feature sizes below 234nm, but a clean room must be considered part of the machine... And it can take years to figure out how to maintain atmospheres and gas mass-flow control.
Pretty cheeky selling community designed hardware without citing the original hobbyists. Nothing they posted looks remotely new or novel. Meh =3
> make me a 1um tall pyramid out of nanoparticles which are all the same type
...than it is to:
> make me a NAND gate out of this solution of nanoparticles, some are doped in different ways than others.
Most miracles involving DNA are heavily inspired by nature, and there's not a lot of natural code available for the construction of logic gates. I'm not sure which of these is harder:
1. make biochemical systems which construct arbitrary solid state circuits
2. abandon solid state and do our computing in living tissue
What I like about the former is that there are a lot of other things that investment in that area might produce, things which aren't a hassle to keep alive, even if a general purpose circuit constructor happens to be out of reach.
I have to wonder if there's not more that can be done on this front for low cost IC prototyping. I don't think the fixed infrastructure is necessarily the problem (i.e. building the fab) as there's enough capacity for cheap chips in volume, meaning each additional wafer isn't the cost limiting factor. There are multi-project wafers (like PCB aggregators), but my understanding is that the hard cost limit currently is the NRE of making the mask set, which isn't getting amortized over a sufficient number of devices in a prototype run.
So cheap masks (or fewer masks) would be an area I'd be interested to see development.
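As a back-of-the-envelope illustration of that amortization problem (all the numbers below are invented assumptions, not quotes from any foundry):

    def cost_per_die(mask_nre, wafer_cost, wafers, dies_per_wafer, yield_frac):
        # Spread the one-time mask NRE plus per-wafer cost over the good dies.
        good_dies = wafers * dies_per_wafer * yield_frac
        return (mask_nre + wafer_cost * wafers) / good_dies

    # Hypothetical mature-node numbers: $250k mask set, $3k/wafer, 400 dies/wafer.
    proto = cost_per_die(250_000, 3_000, wafers=5, dies_per_wafer=400, yield_frac=0.8)
    volume = cost_per_die(250_000, 3_000, wafers=1_000, dies_per_wafer=400, yield_frac=0.8)
    print(f"prototype run: ${proto:,.2f} per die")  # ~$166, almost all of it mask NRE
    print(f"volume run:    ${volume:,.2f} per die")  # ~$10, now dominated by wafer cost

The same wafer and the same fab, but the prototype run pays the whole mask set on a handful of dies.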
Until you actually need to route DDR, run any signal simulation model, etc.
I wouldn't describe this as a "pretty usable" means, however, for the same reasons KiCad is unusable for DDR routing.
https://www.kicad.org/made-with-kicad/categories/Single-boar...
Why are we talking about low-cost at-home IC development for farmers while we don't let them do even that?
More likely we're going to go in the opposite direction, though.
> We communicate entirely over Discord.
Walled garden, unsearchable content, for what strikes me as an open-source-like DIY endeavor.
Why?
From Google?
By searchable, I mean universally accessible, be it via traditional search engines or LLMs.
Instead, any knowledge accrued by that community in that particular forum is locked in a box for anyone not participating in that forum, which is likely 5 nines of humanity if not 6.
It is sad.
There's the obvious stuff: you can't avoid HF, and it's nasty stuff. You can die. But that's not what I'm the most worried about; people can make smart decisions to reduce risk, and ultimately people can make their own decisions about their risk tolerance.
What I'm worried about is the SF6 for the RIE. Kg for kg, that stuff has a global warming potential of more than 24,000 TIMES the warming potential of CO2. If it's all broken down in the plasma chamber, or there's exhaust scrubbers involved like you'd have at an industrial fab, then it's no issue.
But hobbyists are going to be spilling and purging a bunch of unmodified SF6. It's kind of an ecological catastrophe. Some things are better not done at home.
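For a sense of scale, using the roughly 24,000x GWP figure above (the cylinder size is a made-up example):

    GWP_SF6 = 24_000   # approx. kg CO2e per kg of SF6, 100-year horizon
    vented_kg = 2.0    # hypothetical small lecture bottle vented by a hobbyist
    co2e_tonnes = vented_kg * GWP_SF6 / 1_000
    print(f"{vented_kg} kg of vented SF6 ~= {co2e_tonnes:.0f} tonnes CO2e")  # ~48 tonnes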
E-beam lithography has been used since the 1970s. It's slow; it might take a day to make a CPU. That's why it's not used as a production process. But as a prototype process, it works fine. There are a few hobbyists doing this.[1]
E-beam systems are basically scanning electron microscopes with more power. There's a vacuum chamber, means for focusing and steering an electron beam similar to what's inside a CRT, and control equipment. It's all computer-controlled, of course.
This has many advantages. Software can correct for nonlinearities in the scanning. The machine can inspect what it's written by scanning at low power.
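As a toy illustration of the kind of software correction meant here (the cubic distortion model and its coefficient are assumptions for the example, not any real tool's calibration):

    import numpy as np

    def predistort(xy, k3=1e-7):
        # Pre-distort ideal pattern coordinates so a barrel-style deflection
        # error of roughly k3 * r^2 lands the beam where the layout says.
        x, y = xy[:, 0], xy[:, 1]
        r2 = x**2 + y**2
        return np.stack([x * (1 - k3 * r2), y * (1 - k3 * r2)], axis=1)

    # Corners of a small test pattern, in beam-field units.
    pattern = np.array([[0, 0], [500, 0], [0, 500], [500, 500]], dtype=float)
    print(predistort(pattern))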
You still have to coat and etch; it's not a dry process. The beam just exposes the resist.
The equipment is the size of a desk. Here's a machine at CMU.[2] Many universities have such machines.
[1] https://hackaday.com/2024/08/06/creating-1%c2%b5m-features-t...
[2] https://nanofab.ece.cmu.edu/facilities-equipment/fei-sirion....
This is a fascinating property that seems very powerful.