Features

Mitigating skin effect’s impact on high-speed signals. 

I’ve spent much of the past seven years dealing with insertion loss as it relates to PCB dielectrics, as well as losses due to copper roughness. During that period, there’s been comparatively little discussion of “skin effect,” a significant contributor to signal attenuation that in my view gets less attention than it should. In examining the phenomenon in depth, we’ll also consider what, if anything, can be done to mitigate its impact on high-speed signals.

While writing this article, I’ve been thinking of places that skin appears in nature and pop culture. When I started writing, I flipped on Skinwalker Ranch on the History Channel for the first time as background noise, and they were talking about magnetic fields, current flow, and Tesla coils.

Skin is said to be the largest organ in the human body. It has multiple layers and some amazing properties. Galvanic skin response, used in lie detectors, measures changes in skin conductance caused by sweat-gland activity. I suppose you could call that a “skin effect” too.

It's perfectly reasonable for engineers and PCB designers to ask, “Where should I focus my attention?” insofar as loss is concerned. In Signal and Power Integrity – Simplified,1 Dr. Eric Bogatin points out five ways energy can be lost to the receiver while the signal is propagating down a transmission line:

  1. Radiative loss
  2. Coupling to adjacent traces
  3. Impedance mismatches and glass-weave skew (the latter being my addition)
  4. Conductor loss
  5. Dielectric loss

Each of these mechanisms reduces or affects the received signal, but they have significantly different causes and remedies. Plenty of articles over the years have discussed managing impedance and crosstalk, including ones I’ve written. I’ve also written about managing loss through dielectric-material selection and copper roughness, one of the two components of conductor loss. The other contributor to conductor loss is commonly known as skin effect.
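The skin effect confines high-frequency current to a thin layer at the conductor surface, raising effective resistance as frequency climbs. As a rough illustration, here is a minimal sketch of the standard skin-depth formula, δ = √(ρ / (π·f·μ)), evaluated for copper; the specific frequencies and constants are my own choices for illustration, not values from the article:

```python
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m
RHO_CU = 1.68e-8           # copper resistivity at room temperature, ohm-m

def skin_depth(freq_hz, rho=RHO_CU, mu_r=1.0):
    """Skin depth in meters: delta = sqrt(rho / (pi * f * mu0 * mu_r))."""
    return math.sqrt(rho / (math.pi * freq_hz * MU0 * mu_r))

# Skin depth shrinks with the square root of frequency, so conductor
# loss grows even on a perfectly smooth trace.
for f in (1e6, 100e6, 1e9, 10e9):
    print(f"{f/1e9:7.3f} GHz -> {skin_depth(f)*1e6:6.2f} um")
```

At 1 GHz the result is on the order of 2 µm, thinner than typical PCB copper plating, which is why surface roughness on that scale interacts so strongly with skin-effect loss.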



Taiwan is the benchmark for controlling the spread of Covid-19, minimizing infection rates nationwide with very few deaths.


Three options for leveraging the secure digital ledger. 

In last month’s introduction to blockchain technology,1 we noted how the technology offers a way to automate and simplify multiparty processes that are time-consuming, resource-intense, and therefore costly. We often summarize this sort of process as “high-friction.” But pioneers in applying blockchain to improve multiparty processes learned early that it wasn’t enough to find a process that was slow or frustrating. There needed to be a quantifiable performance (often financial) benefit as well. This wasn’t always easy to establish. Unlike applying automation to improve internal processes, the “friction” in multiparty processes occurs outside an organization. As a result, the costs and performance issues caused by that friction may not be captured well enough inside the organization to understand its true impact.

Perhaps it’s understandable, then, that the most successful early blockchain applications were often driven by companies large and sophisticated enough to not only recognize, but quantify, the opportunities and to have enough influence with their partner companies that those partners were willing to collaborate on a solution. Indeed, a recent article in MIT Sloan Management Review2 states, “The biggest challenge to companies creating blockchain apps isn’t the technology – it’s successfully collaborating with ecosystem partners.”



Updates in silicon and electronics technology.

Ed.: This is a special feature courtesy of Binghamton University.

IBM unveils world’s first 2nm chip technology. IBM announced a breakthrough in semiconductor design and process with the development of the world’s first chip built with 2nm nanosheet technology. The new design is projected to achieve 45% higher performance and 75% lower energy use than today’s 7nm chips. IBM said this new frontier in chip technology will accelerate advancements in AI, 5G and 6G, edge computing, autonomous systems, space exploration, and quantum computing. The technology would likely not be in high-volume production until 2024. (IEEC file #12281, Semiconductor Digest, 4/27/21)


“Egg carton” quantum dot array could lead to ultralow power devices. University of Michigan researchers have developed a new approach to sending and receiving information with single photons of light, using a “quantum egg carton” that captures and releases photons, sustaining “excited” quantum states while it holds the extra energy. Their experiment demonstrated the effect known as nonlinearity, modifying and detecting extremely weak light signals by taking advantage of distinct changes to a quantum system to advance next-generation computing. As silicon-electronics-based information technology becomes increasingly throttled by heating and energy consumption, nonlinear optics is a potential solution. (IEEC file #12154, Science Daily, 3/4/21)



How to easily check current limits between a DC-DC converter and an FPGA. 

The design of power-supply structures on PCBs is not trivial. It requires careful consideration and technique to achieve the best performance. Today’s high-pin-count devices need efficient power distribution systems that permit high-speed/high-frequency switching. The space available on PCBs is increasingly scarce. Thus, engineers fight for every square millimeter, using multiple layers for the layout of signal nets and power areas, with the parts of the power distribution then connected through dedicated power distribution network (PDN) via structures.

Shrinking supply voltages, coupled with increasing IC complexity and the number of voltage rails required, make power integrity analysis inevitable for high-speed designs. This applies to AC as well as DC effects. Consider that modern circuits such as (LP-)DDR memories operate at very low voltages (LP-DDR4 at 1.1V, for example), leaving little margin for drops along the supply path.
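A first-order DC sanity check between a converter and its load can be done by hand before any field-solver analysis. The sketch below estimates trace resistance from geometry (R = ρ·L / (w·t)) and the resulting IR drop; the rail voltage, load current, and trace dimensions are hypothetical numbers of my own, not values from the article:

```python
RHO_CU = 1.68e-8      # copper resistivity, ohm-m
OZ_COPPER_M = 35e-6   # ~35 um thickness for "1 oz" copper weight

def trace_resistance(length_m, width_m, thickness_m=OZ_COPPER_M, rho=RHO_CU):
    """DC resistance of a rectangular copper trace: R = rho * L / (w * t)."""
    return rho * length_m / (width_m * thickness_m)

# Hypothetical example: 1.1 V LP-DDR4 rail, 5 A load,
# 10 mm long x 2 mm wide trace in 1 oz copper.
r = trace_resistance(0.010, 0.002)
drop = 5.0 * r
print(f"R = {r*1e3:.2f} mOhm, IR drop = {drop*1e3:.1f} mV "
      f"({drop/1.1*100:.2f}% of a 1.1 V rail)")
```

Even a few milliohms matters at low rail voltages and high currents, which is why DC drop analysis is checked alongside AC impedance in PDN design.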



EMC for PCB design is anything but black magic.

Electromagnetic compatibility (EMC) problems are often responsible for redesign cycles during the PCB design process, but once engineers and designers understand the basics, they see there’s nothing mystical about it.

EMC is the branch of electrical engineering and physics that deals with the unintentional generation, propagation and reception of electromagnetic waves (in the E and H fields). These can cause undesirable effects in electronic devices, including functional interferences, malfunctions, or even physical damage.

Generally, two fundamental aspects are considered. First is emission: the unwanted generation of electromagnetic energy and its transmission to the sinks, along with the countermeasures needed to reduce it. Second is susceptibility: the degree to which electrical/electronic equipment (or components) in operation become “victims” of unintended electromagnetic interference (EMI).



