The Route

Dumb data won’t die. And the blame rests squarely on users who resist upgrading to formats designed with today’s boards and manufacturing methods in mind.

That’s the message from the PCEA Portland (OR) Chapter at its meeting in late October. The sentiment was shared across a spectrum of users, from designers to fabricators to assemblers, including the host, Axiom Electronics.

We know the issues: Too often, fabricators and assemblers receive conflicting, duplicate and erroneous design data files. More often than not, the culprit is the Gerber-based data package, which almost always requires modification prior to fabrication or assembly.

So while persistent errors from design to manufacturing often come down to manual entry miscues or obvious omissions – a missing solder mask layer, say, or discrepancies within the netlist – the industry by and large continues to put up with the pain instead of migrating to a newer format.
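To make the "dumb data" complaint concrete, here is a minimal, hypothetical sketch of a one-pad copper layer in RS-274X (Gerber). The commands describe an image only – apertures and flashes – with none of the net, stackup or component data a manufacturer needs:

```python
# A hypothetical single-pad Gerber (RS-274X) copper layer, line by line.
GERBER_LAYER = "\n".join([
    "%FSLAX26Y26*%",           # coordinate format: 2 integer, 6 decimal digits
    "%MOMM*%",                 # units: millimeters
    "%ADD10C,0.500*%",         # define aperture D10: a 0.5mm circle
    "D10*",                    # select aperture D10
    "X10000000Y10000000D03*",  # flash a pad at (10mm, 10mm)
    "M02*",                    # end of file
])

# Nothing here says which net the pad belongs to, which stackup layer the
# file images, or where the drill hits land. All of that travels in
# separate files, and reconciling them by hand is where the conflicting
# and erroneous data creeps in.
```

Intelligent formats, ODB++ and IPC-2581 among them, carry the netlist and stackup in the same package – which is precisely the migration being resisted.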

Read more: Kicking the Gerber Habit

CAD as we know it for printed circuit design came into existence in the mid-1960s. And while some industry designers still remember the days (more fondly than I would!) of hand-taping components and traces, then using a camera to produce films for fabrication, it didn’t take too long before computers started taking over.

In 1970, Racal-Redac, which later was acquired by Zuken, released its original PCB, schematic and silicon layout tool. A few years later, Scientific Calculations introduced SCICARDS for generating photoplots from Gerber files. Of course, the dimensions back then were epic in size – pads were 70 mils or more, and lines and spaces were 25 mils.

By 1976, the EDA market was starting to pop with companies, and not just for design. Makoto Kaneko founded Zuken. A trio of professors at the University of Texas – James Truchard, Bill Nowlin and Jeff Kodosky – launched National Instruments. A former Tektronix engineering manager, Doug Campbell, started Polar Instruments.

Read more: The Hidden Feature of AI Startups: A Much-Needed Human Touch

Ultra-high-density interconnects are more smoke than fire right now, but they won’t be that way for long. Driven by high-density BGAs and RF products, UHDI is finding its way into the mainstream.

Given the number of conferences, webinars and the like, however, readers would be forgiven for thinking UHDI is already standard.

Getting there first means agreeing on what, exactly, UHDI is. The working definition of UHDI is product with line widths and spaces of less than 50 microns, dielectric thicknesses of less than 50 microns, and microvia diameters of less than 75 microns. That’s not a standard definition – yet – and the lower-limit parameters have yet to be defined. At some point, there stands to be overlap with semiconductor technology. Stay tuned as the definition evolves.
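For readers who want that working definition in checkable form, here is a toy sketch; the function name and units are mine, and the undefined lower limits are simply omitted:

```python
# A toy check against the working UHDI definition quoted above.
# Thresholds are upper bounds only; lower limits remain undefined.
def is_uhdi(line_width_um: float, spacing_um: float,
            dielectric_um: float, microvia_dia_um: float) -> bool:
    """True if every feature falls under the working UHDI thresholds."""
    return (line_width_um < 50 and spacing_um < 50
            and dielectric_um < 50 and microvia_dia_um < 75)

assert is_uhdi(40, 40, 35, 60)       # 40-micron lines/spaces qualify
assert not is_uhdi(75, 75, 50, 100)  # conventional HDI geometry does not
```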

I am reminded – to a degree – of the chaos surrounding UHDI’s (slightly) larger cousin, high-density interconnects, which hit widespread production in the late 1990s (although the original concept dates much earlier). Then, the issues could be boiled down to two:

Read more: Let’s Get Small

Speaking, as we were last month, about artificial intelligence and its adoption in electronics design and manufacturing, we observed that a current obstacle to implementation is vendors’ use of customer data to build their models.

And while vendors insist the data are aggregated and anonymized, those customers, naturally, have been generally circumspect: the lessons they have learned – often painstakingly – could end up enabling competitors, and they would ultimately be paying those same vendors for the courtesy.

To that I will add the musings of Neil Thompson, director of the FutureTech research project at MIT’s Computer Science and Artificial Intelligence Laboratory.

Thompson argues that AI systems must not just be capable of performing “human” tasks but also must overcome the costs of implementation, including redesigning processes and methodologies. “There are a lot of places where ... humans are a more cost-efficient way,” he says.

Over the past couple of years, via our PCB Chat podcast and webinars, our editors have spoken with a growing number of AI experts across the spectrum of the electronics supply chain. They include:

Read more: There’s One Problem AI Can’t Resolve: Implementation

Artificial intelligence as applied to electronics design and manufacturing is in its infancy, but interest is high and questions abound as to what it means – and even what it is.

AI is seen as similar to the Internet in 1995: a big, wide-open technology that companies had to embrace and understand. Success depends on narrow implementations that permit companies to see clearly what the return or improvement will be.

AI holds the potential to revolutionize how electronic products are created and manufactured. Unlocking that potential, however, requires a collaborative effort to ensure the technology is understood and implemented effectively. Without concerted action, AI’s transformative promise for the electronics industry may remain elusive.

To address this challenge, the PCEA AI Technical Action Group (AI TG) convened a meeting of select representatives of early adopters and providers of AI tools for the electronics sector. The primary aim of the gathering was to solicit input from industry stakeholders to shape an AI Status and Action Plan.

Read more: A Status Plan for Implementing AI Takes Shape

Should components for military use be made in a dedicated secure facility?

That’s the basic thinking behind a $3.5 billion allocation by the US government to support an undisclosed chipmaker, presumably Intel, in a classified advanced semiconductor development project. The allocation touched off yet another question: whether Chips Act funds were misused when routed to the so-called Secure Enclave program.

The Chips Act, of course, is the foundational legislation upon which the US strategy of reclaiming semiconductor manufacturing dominance is built.

Years ago, the major semiconductor manufacturers, including Intel and Motorola, had designated government segments. Their demise more or less concluded with the rollout of new defense procurement policies, now known as the Perry Initiative.

Read more: A Modern-Day Manhattan Project?
