Technology - Engineering.com | https://www.engineering.com/category/technology/

Managing and designing for success with sustainability
https://www.engineering.com/resources/managing-and-designing-for-success-with-sustainability/ (Wed, 07 Jan 2026)
Managing the challenge of sustainability with advanced software to minimize risk.

This episode is brought to you by Bentley Systems. Please complete the registration form to watch the full conversation.

Engineering, particularly engineering project management, is all about understanding and addressing the constraints that control the critical pathways to project success.

Time, cost, technology, labor, capital availability and, of course, opportunity cost are always factors, but today there’s another: sustainability. That word means many things to many people, but from a project management standpoint, sustainability can be thought of as a set of nested Russian dolls.

Designing a steel mill to minimize CO2 output, for example, may be the design goal of a project, but there is also the environmental impact of constructing the mill to consider, as well as that of the design and development activities that precede construction.

Clear definitions of the desired sustainability outcomes, timelines to achieve them, and a well-defined pathway to that success are all essential, and this puts a premium on designers, managers and the tools they use to minimize project risk.

Panelists:

David Symons, Future Ready Leader, WSP
Anthony Kane, President & CEO of the Institute for Sustainable Infrastructure (ISI)
Dr. Rodrigo Fernandes, Director of Sustainability, Bentley Systems

Moderator:

Jim Anderton, Multimedia Content Director, engineering.com

* * *

Learn more about how to reduce carbon and enhance sustainability with Bentley Systems’ Carbon Analysis.

Using mosquitos for ultrafine 3D printing
https://www.engineering.com/using-mosquitos-for-ultrafine-3d-printing/ (Mon, 05 Jan 2026)
Drexel and McGill engineers turn proboscides into high-resolution nozzles.

Here’s my first hot take of 2026: mosquitos are terrible.

They’re responsible for almost 700 million infections and over a million deaths every year. Plus, they’re seriously annoying when you’re trying to sleep. I’ll gladly go on record as being stridently anti-mosquito, but now I must also grudgingly admit that they have at least one potential upside thanks to a team of engineers at McGill University and Drexel University.

Working together, these researchers have found a way to turn female mosquito proboscides into nozzles for high-resolution 3D printing, resulting in printed line widths as fine as 20 microns, roughly half of what commercially available nozzles can achieve.

The engineers have dubbed their approach “3D necroprinting” (my first new term added to Word’s dictionary this year) since it incorporates non-living biological microstructures (i.e., the proboscides) directly into the printing process. Potential applications of 3D necroprinting include the production of scaffolds for cell growth and tissue engineering, cell-laden printed gels, and transportation of microscopic artifacts, such as semiconductor chips.

“High-resolution 3D printing and microdispensing rely on ultrafine nozzles, typically made from specialized metal or glass,” said study co-author Jianyu Li, in a McGill press release. Li is an associate professor and Canada Research Chair in Tissue Repair and Regeneration at McGill. “These nozzles are expensive, difficult to manufacture and generate environmental waste and health concerns.”  

“Mosquito proboscides let us print extremely small, precise structures that are difficult or very expensive to produce with conventional tools,” added Changhong Cao in the same release. “Since biological nozzles are biodegradable, we can repurpose materials that would otherwise be discarded.” Cao is an assistant professor and Canada Research Chair in Small-Scale Materials and Manufacturing as well as a co-author on the study.

The study was led by McGill graduate student Justin Puma, who was involved in a previous study using a mosquito proboscis for biomimetic purposes that established a foundation for this research.

Probing the limits of high-resolution 3D printing

To develop the nozzles, the researchers examined insect-derived micronozzles and identified the mosquito proboscis as the optimal candidate. The proboscides were harvested from euthanized mosquitoes, sourced from ethically approved laboratory colonies used for biological research at Drexel.

Under a microscope, the researchers carefully removed the mosquito’s feeding tube before attaching it to a standard plastic dispenser tip using a small amount of resin. The researchers characterized the tips’ geometry and mechanical strength, measured their pressure tolerance, and integrated them into a custom 3D-printing setup. 

3D print of a maple leaf made with the mosquito proboscis micronozzle at various resolutions. (IMAGE: Changhong Cao.)

Once connected, the proboscis becomes the final opening through which the 3D printer emits material. The researchers have successfully printed high-resolution complex structures, including a honeycomb, a maple leaf and bioscaffolds that encapsulate cancer cells and red blood cells.

“Evolutions in bioprinting are helping medical researchers develop unique approaches to treatment. As we look to improve technology, we must also strive to innovate,” said Megan Creighton, in the same release. Creighton is a co-author and assistant professor of chemical and biological engineering at Drexel.

“We found the mosquito proboscis can withstand repeated printing cycles as long as the pressures stay within safe limits. With proper handling and cleaning, a nozzle can be reused many times,” Cao said. 

“By introducing biotic materials as viable substitutes to complex engineered components, this work paves the way for sustainable and innovative solutions in advanced manufacturing and microengineering,” Li said. 

The study is published in the journal Science Advances.

Design for additive manufacturing perspectives
https://www.engineering.com/design-for-additive-manufacturing-perspectives/ (Fri, 02 Jan 2026)
On the question of AI versus expertise in the DfAM discussion panel at Formnext 2025.

I’m so tired of hearing about AI.

Maybe it’s because I grew up watching and reading science fiction, and the idea of real artificial intelligence always seemed even more exciting to me than space travel. Forget warp drive, I wanted to be able to have a conversation with a computer.

Now I can, but I also can’t trust it, and using it carries untold environmental, economic, and sociological costs. Add to that disappointment the fact that “AI” is being crammed into every nook and cranny of our technological lives (despite near-universal protestations by consumers) and fatigue appears not only understandable, but inevitable as well.

I keep hoping the bubble will pop or, preferably, gently deflate.

Until then, we’re faced with questions like, “What does AI mean for design for additive manufacturing (DfAM)?” which was more or less the theme of a panel discussion at Formnext 2025.

Moderator Frank Jablonski sat down with four experts to talk about the tension between the ease with which 3D printing makes creation possible and the difficulty of understanding the materials, process limits, and performance requirements necessary to succeed in additive manufacturing (AM). One of the key takeaways from that discussion is that software—including AI—is no replacement for hands-on experience.

Here’s why.

Shape-making vs design

Jonathan Rowley, steering committee member of the UK’s Design for AM Network and CEO of Additive Companion, sets the tone early on by making a distinction between simple “shape-making” and actual design, which he says includes materials as well as the context of the part being designed.

“There’s nothing wrong with shape-making,” he says, “but if you can make shapes that then perform, that’s when you’re really starting to design and take this to a whole new level.”

Tobias Hehenwarter, CEO of the German engineering firm ID Design, agrees with Rowley on the importance of process knowledge and the role it plays in the “decision tree” one has to follow from CAD file to 3D printed part. However, he takes this even further by stating that, in the long term, DfAM means designing objects that couldn’t have been imagined without an understanding of AM. “I’ve seen water taps that have the weirdest kind of shape,” he explains, “and you don’t understand where the water is coming from. It just looks like a solid piece of metal, but it’s enabled because the designers understood the 3D printing process in detail: how the metal is molten, how the depowdering process works, and so on.”

AM-azon?

The suggestion that DfAM requires this level of understanding raises questions about the possibility of an “Amazon-like” marketplace, where users can select a combination of design, material, and provider to order highly customized products. However, as David Nguyen, senior technical services engineer for PTC’s Onshape, points out, that model has been tried by Shapeways, with poor results when what customers ordered diverged from the technology best suited to produce it.

In response, Rowley brings up the idea of co-creation, where customers can influence designs through collaboration with designers. That seems agreeable to Hehenwarter, who once again emphasizes the importance of 3D printing’s unique value proposition in delivering parts that are only possible via AM. Nevertheless, the panelists also agree that the sorts of fun hobbyist projects available through online models can be a good entry point for future additive manufacturing engineers.

Print settings as a design tool

This point brings up a fundamental distinction between the users of 3D printing: engineers operating in industrial environments (whether for tooling, prototyping or end-use parts), and hobbyists who engage with the technology simply because they’re passionate about it. While it’s tempting to treat these two groups as entirely separate, Sophia O’Neill, founder and creative director of the 3D printed fashion company NURBS by Dapna, argues that professional engineers should not discount the amateur enthusiasts.

“We can learn so much from these hobbyist printers,” she says, “in shape, in articulation, even material usage. Jonathan talked about shape-making, but when you make a shape that complements a material so that it behaves in a certain way, it’s perfect.”

O’Neill also notes that, “My biggest tool is print settings…especially working with filaments and materials that behave in a specific way.” As an example, she explains how wall thickness can be used intentionally to control surface effects, such as color swirls and gradients. Nguyen adds that the kind of color distributions exhibited in the example pieces she references would be difficult to injection mold without complex, multi-shot strategies.

Collaboration and its limits

Shifting the discussion from objects to organizations, Nguyen highlights the challenge that comes with designers, prototype makers, and production teams often operating in semi-isolation and with limited feedback between them. Obviously, there’s an incentive here for him to point to cloud-native CAD platforms (i.e., Onshape) as a solution—and he does—but beyond that he emphasizes the importance of having a foundational, process-aware approach to DfAM.

“We need to adapt our knowledge to the specific additive manufacturing technology we’re working with and design toward that,” he says. “SLS has a lot of design freedom, but you need to design for SLS to reach the full potential of that technology.”

Rowley chimes in that he learned about SLS from hands-on experience accumulated over nearly a decade, and that generic service bureau design guidance focusing on basic issues like minimum wall thickness belies the complexity of AM processes. “Designing for AM is like designing for anything,” he says. “It’s about experience, understanding materials, knowing what you’re working with. When I hear about ‘democratization of design’, I find that a bit of an insult.”

Hehenwarter agrees wholeheartedly: “You really have to understand the whole process, and there’s no way around that. At the moment, you cannot use AI. The designer has to understand the technology they’re designing for and the engineering involved, and they have to test it for themselves.”

AI’s place in AM

And now we come to it: once someone says “AI” in a setting like this, everyone has to talk about it. Moderator Frank Jablonski asks the panelists whether and to what extent they use AI in their own work.

O’Neill has a pragmatic stance, using it for cost-saving business needs, such as generating product photo shoots and marketing proposals. Nguyen seems more optimistic about AI’s potential as a research assistant or providing design review, flagging potential issues when using a given process.

Hehenwarter pushes back on this point, however, saying that AI is too unreliable to be used for anything more than inspiration. Rowley jumps in even more forcefully: he believes AI is “years away” from doing what Nguyen imagines and warns that applying AI to AM now just results in recycling shallow, received wisdom rather than building reliable, experience-based guidance.

At the close of the discussion, O’Neill offers a concrete example of how she believes AI can be useful to AM: failure detection. “It’s great in both large-scale and small-scale additive manufacturing,” she says. “That loop is one of the best implementations of AI in this industry. And yes, of course, it’s scary and there’s a lot of discussion around that side of AI implementation, but I do love a good failure detection.”

“Sophia,” Rowley replies, “I’m not scared of it: it’s just bad.”

Mitigating the Hidden Costs of Semiconductor Obsolescence
https://www.engineering.com/mitigating-the-hidden-costs-of-semiconductor-obsolescence/ (Thu, 01 Jan 2026)

Navigating Manufacturing Challenges

By Dan Deisz, Rochester Electronics’ Vice President, Design Technology

There are many factors in any semiconductor product “puzzle” that can lead to obsolescence. These pieces range from business revenue to subcomponents of semiconductor products, including foundry process technologies, packages, substrates, lead frames, test platforms, and design resources. The puzzle pieces often include any given semiconductor company’s overall corporate or market focus. A semiconductor company’s market focus may change over time, even while a long-term system company, such as one of its customers, keeps its product focus unchanged. Given the long-term availability risks inherent in any product selection, assessing the part numbers offered by original component manufacturers requires looking well beyond the bill of materials (BOM) health reports provided by commercially available tools.

How does the manufacturing supply chain impact long-term product availability? 

Most older semiconductor products are assembled in leadframe packages, such as DIP, PLCC, QFP, and PGA. However, the semiconductor market has shifted from leadframe packages as the primary volume driver to substrate-based assemblies. 

Why did the industry move away from lead frame assemblies? 

To fully understand why lead frame assemblies are disappearing, it is important to address the history of assembly locations, profit margins, and the move toward ever-increasing performance. 

Assembly offshoring started happening in earnest during the 1980s. This was before TSMC’s dominance in foundry technologies. Costs and environmental restrictions primarily drove offshore assembly, as 1980s assembly processes were less environmentally clean than those of today. The push for higher profit margins gradually eliminated numerous leadframe suppliers from the market, leaving only the largest suppliers profitable. Profit margins on lead frames were reduced to single digits, whereas most semiconductor companies’ margins trended toward 50%. Lead frame volumes peaked in the 1990s and early 2000s, concurrent with the push toward high-speed I/O and the invention of BGA assembly. High-speed I/O protocols, such as PCI Express, multi-gigabit Ethernet, SATA, SAS, and sRIO, demonstrated that wire bonds limit performance. These and other new standards coming online had performance roadmaps that wire bonds would never have met. As device speeds increased significantly, so did their power requirements.

A wire bond distributes power from the chip’s exterior to the core. For higher-performance products becoming available in the 1990s, supplying power to the device from outside the die was insufficient. Flip-chip and BGAs with substrates alleviated the power distribution challenge by delivering power directly to the core and eliminating bond wires, thereby improving signal integrity for high-speed SerDes standards. As leadframe assemblies declined in the early 2000s, QFN assemblies emerged for lower-pin-count packages. QFNs are substrate-based assemblies that primarily use wire bonds in high-volume production. Today, leadframe assemblies are produced in far lower volumes than substrate-based assemblies. The highest cost item in lead frame assembly is the trim-and-form tooling. As lead frame volumes have diminished, the cost of replacing that tooling, coupled with the single-digit profit margins of offshore suppliers, has put enormous pressure on the industry to move away from lead frame assemblies altogether.

The industry moved away from lead frame assemblies because technology performance demanded zero wire bonds, and the costs of continuing to produce lower-volume lead frame assemblies were prohibitive. 

Once an assembly solution is in place, a test solution must also be viable. The same trends that enabled the transition to substrate-based assembly also appear in tester technology, where disconnects may result in obsolescence. The newest handlers for production test are primarily geared toward substrate-based assemblies. Efforts to reduce costs for volume production are currently based solely on substrate assemblies. Testing lower-volume production at an OSAT location becomes less feasible as volumes diminish, especially if that product is lead frame-based.

Assuming wafer availability, what if a company acquired an existing OSAT supply chain to continue providing the same semiconductor product?

This is what Rochester Electronics believes is a short-term solution. Recall the manufacturing puzzle pieces we have examined, from lead frames and assembly to testing. If any link in the OSAT chain is deemed economically infeasible, an obsolescence event is expected. The risk of obsolescence increases because any company supporting the OSAT supply chain cannot drive product volume as the original semiconductor company would have. Therefore, that company cannot leverage the same level of product continuation. In the short term, OSAT chain management can keep a product in production, but it is not typically viable in the long term.

Partnering with a licensed semiconductor manufacturer, such as Rochester Electronics, can mitigate the risks of component EOL. A licensed manufacturer can produce devices no longer supplied by the OCM. When a component is discontinued, the remaining wafer and die, the assembly processes, and the original test IP are transferred to the licensed manufacturer by the OCM. This means that previously discontinued components remain available, are newly manufactured, and fully comply with the original specifications. No additional qualifications are required, nor are any software changes.

Find out more: www.rocelec.com

Learn more about Rochester’s manufacturing service solutions

Watch to learn more about Rochester’s manufacturing capabilities

Sponsored Content by Rochester Electronics

AI will reshape engineering careers and experience, not jobs, is at risk
https://www.engineering.com/ai-will-reshape-engineering-careers-and-experience-not-jobs-is-at-risk/ (Tue, 30 Dec 2025)
As experience pathways shrink and judgment becomes more critical, engineering organizations must rethink how to develop future engineers, or risk a capability gap they can't easily repair.

The debate about AI and jobs has become both fascinating and, at times, unsettling. Influential voices now offer sharply different visions of the future. Elon Musk suggests advances in AI and robotics could make work optional within decades. Bill Gates warns that AI is already threatening entry-level roles, even for those who learn to use it well. Klarna CEO Sebastian Siemiatkowski argues that technology leaders are downplaying AI’s impact on employment, even as automation reduces headcount. Meanwhile, labor-market data from platforms like LinkedIn suggests widespread job destruction has not yet materialized. Yet, it has already started in pockets.

These narratives frame AI as either a liberator, a threat, or an overhyped distraction. But for engineers, they miss a more consequential issue.

Much of the debate assumes AI is democratizing intelligence—that advanced analytical capability is becoming widely accessible. This is true, but incomplete. In engineering, the crucial shift is not who can produce answers, but who remains responsible when those answers are wrong, incomplete, or misused.

As AI capabilities grow, intelligence becomes more dispersed. Responsibility does not. This asymmetry explains the increasing focus on responsible AI and structured enterprise data foundations.

Such an imbalance is already visible in industrial environments, where AI is no longer confined to software but is increasingly embedded in physical systems. In these settings, automation does not remove engineering responsibility; it amplifies it, particularly around safety, reliability, and system-level decision-making, as seen in large-scale deployments such as Amazon’s use of physical AI.

The real issue, then, is not whether AI will eliminate engineering jobs. It is whether AI is quietly eroding the experience pathways that turn engineers into accountable and wise decision-makers.

Engineering value is shifting—but so is learning

There is broad agreement that engineering value is moving away from execution and toward judgment, accountability, and trade-off arbitration. AI can already draft designs, run simulations, perform routine analyses, and generate documentation. As a result, senior engineers increasingly focus on wider system-level decisions where safety, compliance, and long-term consequences matter.

This shift is fundamental—and irreversible.

What receives far less attention is how engineers historically learned. Execution-heavy work did more than produce outputs; it trained engineers. Junior engineers learned by working within constraints, encountering edge cases, and building intuition under supervision. AI compresses that learning ladder.

While senior engineers may become more productive, the profession risks hollowing out its future if experience formation is not deliberately rethought.

The entry-level paradox

AI is often positioned as a productivity multiplier for experienced engineers. Automation reduces the need for junior roles to process large amounts of data, at least in the short term. On paper, this looks efficient.

Over time, it creates a paradox:

  • AI replaces tasks traditionally assigned to entry-level engineers.
  • Those tasks were how engineers developed judgment.
  • Fewer engineers gain the experience needed to replace today’s experts.

This is not new. Engineering organizations have seen similar effects during waves of outsourcing and cost optimization. Capability pipelines were thinned for efficiency, only to reveal long-term skill gaps years later.

While reshuffling the supply chain ecosystem itself, AI intensifies this pattern—and shortens the window to fix it.

Not a skills problem—a work system problem

The default response is to argue that engineers will be “reskilled” earlier toward higher-value work. That framing is insufficient.

Judgment cannot be accelerated solely through training. It is built through exposure to constraints, trade-offs, failures, and consequences. Engineers learn why rules exist by encountering the situations where those rules matter.

This is why a system-level view of AI and work is more useful than job- or skills-centric narratives. In Reshuffle, Sangeet Paul Choudary argues that AI reshapes the system of work itself, decomposing work into tasks, decisions, and outcomes that are dynamically recombined across humans and machines rather than fixed within static roles.

Applied to engineering, this means AI changes not just what engineers do, but how experience is gained, how judgment is used, and how responsibility flows through organizations. If execution work is removed without redesigning exposure to real decisions and consequences, organizations do not produce better engineers faster. They produce engineers with thinner experiential foundations.

The risk is not a skills gap. It is a system-of-work misalignment that quietly undermines long-term engineering capability.

Rethinking how engineers are developed

The most serious long-term risk of AI in engineering is not mass unemployment. It is a capability cliff.

Many organizations may soon face a convergence of factors: senior engineers nearing retirement, AI systems producing large volumes of technical output, and a shallow middle layer of engineers unprepared to assume decision-making authority. During that time, accountability does not vanish—it becomes dangerously concentrated. When those individuals depart, organizations find that knowledge was never truly transferred; it was only optimized away.

Ultimately, this is not a technology failure. It is a leadership failure.

As engineering value shifts toward decision ownership, development models must evolve. This does not mean shielding early-career engineers from real work. It means redesigning exposure:

  • Junior engineers must validate, challenge, and contextualize AI outputs.
  • Trade-off analysis and risk evaluation must be taught through supervised responsibility.
  • AI-generated outputs should become learning surfaces, not black boxes.

If execution shrinks, mentorship, review, and decision participation must expand.

The future engineering leader

Future engineering leaders will be defined not by their ability to outperform machines in calculations, but by their skill in framing the right problems, managing constraints, overseeing human-machine decision-making, and taking responsibility amid uncertainty.

These are not soft skills; they are fundamental engineering abilities—and they do not develop by chance.

A false binary in a moving system

The AI and jobs debate is compelling and understandably concerning. But for engineering, it remains too binary. The future is neither catastrophic displacement nor harmless augmentation.

The glass is half empty. AI shortens experience pathways, accelerates decision cycles, and exposes structural weaknesses in how engineers are developed. Left unmanaged, this leads to capability gaps and brittle organizations.

But the glass is also half full. AI is evolving rapidly, reshaping engineering work in ways that create new skills, roles, and sources of value. Systems thinking, decision architecture, human–machine governance, model stewardship, and ethical accountability are moving from peripheral concerns to central engineering disciplines.

Engineering will not disappear. But it will evolve—faster than many other professions—because it sits at the intersection of technology, safety, regulation, and societal consequence.

There is no stable end state. AI-driven engineering demands continuous alignment of tools, roles, learning pathways, and governance models. Organizations that treat this as a one-time transformation will struggle. Those that treat it as an ongoing exercise in system design may emerge stronger.

The future of engineering will not be defined by whether AI replaces engineers, but by whether engineering leaders deliberately redesign how experience, judgment, and accountability are built in an AI-shaped system of work.

Because if we do not redesign how engineers are developed, AI will not replace experienced engineers. It will replace the proving grounds that create them.

Building a simple edge-to-cloud data pipeline – part 2
https://www.engineering.com/building-a-simple-edge-to-cloud-data-pipeline-part-2/ (Fri, 26 Dec 2025)
Six more steps for a disciplined, repeatable approach to building robust, scalable data pipelines.

While every engineer’s implementation is shaped by their business objectives, data source technology, and governance requirements, most pipeline projects share a consistent sequence of steps. Here are six more steps engineers should follow to ensure a crisp, clean, standardized data pipeline.

To read the first article in this two-part series, click here.

Data cleansing and standardization

Based on the design, the project team can develop the required custom software or configure a software package to cleanse and standardize the data across all data sources in the pipeline staging environment.

The cost and complexity of data cleansing and standardization are largely proportional to the number of data columns that require attention across all data sources.
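To make this concrete, a cleansing pass often reduces to column-by-column rules applied in the staging environment. The sketch below is a minimal illustration in pandas; the column names (`ts`, `site`, `temp_c`) and the specific rules are invented for the example, since the real ones depend entirely on the data sources.

```python
import pandas as pd

# Hypothetical staging extract; column names are invented for illustration.
raw = pd.DataFrame({
    "ts": ["2025-01-05 08:00", "2025-01-05 08:05", "bad-timestamp"],
    "site": ["Plant-A", "plant a", "PLANT_A"],
    "temp_c": ["21.4", "", "19.9"],
})

cleaned = raw.copy()
# Standardize timestamps; unparseable values become NaT for later review.
cleaned["ts"] = pd.to_datetime(cleaned["ts"], errors="coerce")
# Standardize categorical labels to one canonical form.
cleaned["site"] = (cleaned["site"].str.strip()
                   .str.replace(r"[\s_]+", "-", regex=True)
                   .str.title())
# Coerce numeric columns; empty strings become NaN instead of crashing the job.
cleaned["temp_c"] = pd.to_numeric(cleaned["temp_c"], errors="coerce")
# Quarantine rows that failed cleansing instead of silently dropping them.
rejects = cleaned[cleaned.isna().any(axis=1)]
cleaned = cleaned.dropna()
```

The quarantine step matters: rows that fail cleansing should be routed for review, not discarded silently.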

Data transformation design

The pipeline’s data transformation design considers the following frequently occurring situations:

  • Merging data for identical rows from multiple data sources.
  • Normalizing repeating data groups into multiple rows in new tables.
  • Defining rules for selecting the highest quality value when identical columns exist in multiple data sources.
  • Aggregating data to a uniform level of summarization.
  • Defining new data columns for calculated values, such as sales or margin amounts, to enrich the data and ensure congruity across queries and reports.
  • Defining new data columns that will be populated by parsing text columns and columns with multiple values.
  • Defining new data confidence columns that will be populated by data confidence values for adjacent columns.
  • Defining new tables for calculated metrics, such as production quality, schedule variance or scrap rates.

The team focuses the data transformation design on only the required data columns and rows from each data source. Most data pipelines do not process all the data in every data source.
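As a small illustration of three of the situations above, merging identical rows from two sources, applying a precedence rule to a shared column, and adding calculated sales and margin columns, here is a pandas sketch. All table and column names are invented for the example.

```python
import pandas as pd

# Two hypothetical sources describing the same orders (invented schemas).
erp = pd.DataFrame({"order_id": [1, 2], "qty": [10, 5], "unit_price": [4.0, None]})
crm = pd.DataFrame({"order_id": [1, 2], "unit_price": [4.1, 3.5], "cost": [2.0, 1.5]})

# Merge identical rows from multiple data sources on the shared key.
merged = erp.merge(crm, on="order_id", suffixes=("_erp", "_crm"))

# Precedence rule: prefer the ERP price, fall back to CRM when it is missing.
merged["unit_price"] = merged["unit_price_erp"].fillna(merged["unit_price_crm"])

# Enrich with calculated columns so every report computes margin the same way.
merged["sales_amount"] = merged["qty"] * merged["unit_price"]
merged["margin_amount"] = merged["sales_amount"] - merged["qty"] * merged["cost"]

target = merged[["order_id", "qty", "unit_price", "sales_amount", "margin_amount"]]
```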

Data transformation

Based on the design, the project team can develop the required custom software or configure a software package to transform all data in the staging environment and populate the target datastore.

In most cases, the target datastore will use a relational DBMS and one of the following architectures:

  • A data lakehouse – a more flexible design that can accommodate both structured and unstructured data.
  • A data warehouse – a more rigid design that is suited for structured data analytics.

An alternative to building and operating a target datastore is to implement a complex database view over the data in the staging environment. This approach is highly appealing because it avoids the cost of creating and maintaining another datastore. It’s only feasible when the required data transformation is modest.
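As a minimal sketch of that view-based alternative, assuming the staged data already sits in a relational store, the whole “target” can be a single view. Table, column, and view names here are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging_sales (order_id INTEGER, qty REAL, unit_price REAL)")
con.execute("INSERT INTO staging_sales VALUES (1, 10, 4.0), (2, 5, 3.5)")

# The view performs the modest transformation at query time;
# no second datastore is created or maintained.
con.execute("""
    CREATE VIEW v_sales AS
    SELECT order_id, qty, unit_price, qty * unit_price AS sales_amount
    FROM staging_sales
""")
print(con.execute("SELECT * FROM v_sales").fetchall())
```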

An alternative to a relational DBMS is a graph DBMS. This technology offers dramatically faster performance for relationship-heavy queries. The disadvantages are slower update performance and a more complex query language.

Irrespective of the chosen storage technology, the target datastore can be hosted on-premises or in the cloud. Housing the staging environment and the target datastore in different computing environments adds avoidable complexity to the data pipeline design.

The cost and complexity of data transformation are largely proportional to the number of tables that must be restructured or added across all data sources.

Testing and validation

Before the team can release the data pipeline and its target datastore into production, they must undergo rigorous testing. The scope of testing includes:

  • Unit tests of the individual software components.
  • Review of the data profiling and quality assessment results.
  • Confirmation of the data transformation logic.
  • An integration test of the end-to-end data pipeline.
  • A performance evaluation under expected loads.
  • Tests of the various artifacts that the target datastore will produce.

The cost and elapsed time of testing and validation are largely proportional to the number of data sources, the complexity of the data transformation and the number of artifacts the data pipeline will produce.
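Unit tests of the transformation logic, covering the first and third items above, are usually the cheapest to automate. A minimal pytest-style sketch, reusing the hypothetical sales-amount rule from the earlier example:

```python
import pandas as pd

def add_sales_amount(df: pd.DataFrame) -> pd.DataFrame:
    # Transformation under test: enrich rows with a calculated column.
    out = df.copy()
    out["sales_amount"] = out["qty"] * out["unit_price"]
    return out

def test_add_sales_amount():
    df = pd.DataFrame({"qty": [10, 5], "unit_price": [4.0, 3.5]})
    result = add_sales_amount(df)
    assert result["sales_amount"].tolist() == [40.0, 17.5]
    # The transformation must not mutate its input.
    assert "sales_amount" not in df.columns
```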

Implementation

Once the tests are completed satisfactorily and the target datastore of the data pipeline is being updated routinely, the team turns its attention to people change management. Unfortunately, this work is often neglected or under-resourced. Helping end-users, such as engineers and business analysts, become familiar with the target datastore ensures adoption of the new capabilities that include:

  • The available data, its meaning and its structure.
  • Data query, visualization and reporting tools.

The team may also assist end-users in converting previous outputs to the new datastore.

The cost and elapsed time of implementation are largely proportional to the number of end-users who will access the new datastore.

Operation

Data pipelines require ongoing monitoring during operation to ensure that:

  • Unsuccessful data update jobs are investigated and restarted.
  • Puzzling results are investigated to confirm their accuracy or to understand data or software defects better.
  • Degradation in update and query performance is investigated and translated into actions to optimize the datastore performance and upgrade the computing environment.
  • Data or software defects are corrected.
  • The impact of planned schema changes to data sources is well understood and planned for.
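Much of this monitoring can be scripted. A minimal sketch, assuming the pipeline writes a job log and daily row counts to tables the monitor can read; the table names, columns, and the 50% threshold are invented for illustration:

```python
import pandas as pd

def check_update_health(job_log: pd.DataFrame, row_counts: pd.DataFrame) -> list[str]:
    """Return human-readable alerts from a nightly pipeline run.

    job_log and row_counts are hypothetical monitoring tables with columns
    (job, status) and (table_name, rows_today, rows_yesterday).
    """
    alerts = []
    # Flag unsuccessful data update jobs for investigation and restart.
    for job in job_log.loc[job_log["status"] != "success", "job"]:
        alerts.append(f"job {job} failed; investigate and restart")
    # Flag suspicious volume swings that may signal a silent data defect.
    for _, r in row_counts.iterrows():
        if r["rows_yesterday"] and abs(r["rows_today"] / r["rows_yesterday"] - 1) > 0.5:
            alerts.append(f"{r['table_name']}: row count changed >50% day-over-day")
    return alerts
```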

Data pipeline risks

The significant risks that data pipelines can encounter include:

  • Difficulties integrating data from various data sources.
  • Uneven data quality that is difficult or expensive to correct.
  • Schema changes in data sources crashing the data pipeline.
  • Unavailability of data sources causing cascading failures.
  • Unnoticed issues where the pipeline appears to work but produces incorrect data.
  • Data breaches due to the number of data sources and data movements involved.
  • Data security and compliance issues due to unauthorized access.

When these risks become reality, the result is increased project costs, cancelled projects or inaccurate analytics that can lead to lost revenue, reputational damage, and legal penalties.

This sequence of steps describes a disciplined, repeatable approach to building robust, scalable pipelines aligned with business needs.

When design for additive manufacturing is literally life-changing
https://www.engineering.com/when-design-for-additive-manufacturing-is-literally-life-changing/ (Wed, 24 Dec 2025)
The story of how PTC and Hexagon came together with Tel Aviv Sourasky Medical Center to 3D print a new scapula for a 16-year-old girl.

The final design of a 3D printed scapula implant, showcased at Hexagon LIVE 2025. (IMAGE: Author)

The word ‘transformative’ is bandied about quite a lot when it comes to 3D printing. We’ve been told that it will revolutionize construction, that it’s the evolution of manufacturing driving scalable customized production. Under the right conditions, the future of this technology is limitless, supposedly. But, really, isn’t this all just marketing hype?

Honestly, 3D printing, you really need to start acting your age.

You’re over 40, for goodness’ sake!

This talk of transformation seems wildly optimistic, especially after all the failures we’ve seen in this industry. And yet, and yet: there are cases where additive manufacturing (AM) is undeniably transformative, most obviously in medical applications. Here’s a paradigm example I came across at this year’s Hexagon LIVE.

A life-changing engineering challenge

Think back to when you were sixteen years old.

For most of us in the developed world, it’s a challenging age but only in the relatively banal sense of adolescent struggles: school, social cliques, dating. For one girl in Israel, however, the challenge included Ewing’s sarcoma, a rare cancer causing bone degeneration that resulted in considerable pain and discomfort, as well as severe swelling in her right shoulder. Her scapula – the large bone that essentially encircles the arm joint where it connects with the rest of the body – was afflicted, and because it connects to so many muscles, it’s crucial not only for mobility but also quality of life.

“This whole project was about what we can do to make this girl’s life better,” explains Lee Goodwin, solutions consultant at PTC. “Our goal was to keep the same shape and general size of the bone, but replace it with something we engineered. That meant we needed to look at the kinematics as well as the mechanics and ensure it could attach to all those muscles.”

To do that, Goodwin and his colleagues at PTC, Hexagon, and Tel Aviv Sourasky Medical Center collaborated on 3D printing a titanium implant. Titanium is a common go-to for such implants, but there were challenges that came with using it to create such a large one, as Goodwin explained. “You’ve got to understand: Is it going to be strong enough? Is it going to weigh too much? Titanium is three times denser and its stiffness is four times greater than bone, which means it doesn’t have to be as big as the original bone as long as it can still attach to everything.”
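Goodwin’s ratios support a quick back-of-the-envelope check of why the lattice matters. The sketch below applies simple beam-bending theory (bending stiffness proportional to E times I) to the density and stiffness ratios quoted above; it is an illustration only, not the team’s actual analysis, which Goodwin says was done with Creo’s tools.

```python
# Back-of-the-envelope: stiffness-matched solid titanium vs. bone,
# using the ratios quoted above (E_Ti/E_bone = 4, rho_Ti/rho_bone = 3).
E_RATIO, RHO_RATIO = 4.0, 3.0

# For a solid circular section in bending, I ~ r^4 and area ~ r^2.
# Matching bending stiffness E*I means I_Ti = I_bone / E_RATIO.
r_ratio = (1.0 / E_RATIO) ** 0.25      # ~0.71: the titanium section is slimmer
area_ratio = r_ratio ** 2              # ~0.50: half the cross-sectional area
mass_ratio = RHO_RATIO * area_ratio    # ~1.5: still heavier than bone if solid

print(f"radius ratio {r_ratio:.2f}, mass ratio {mass_ratio:.2f}")
# A stiffness-matched solid implant would weigh roughly 1.5x the bone it
# replaces, which is why the design removes material and adds a lattice.
```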

They started with MRI scans of the intact left scapula and mirrored the geometry to generate a model of what the right scapula should look like. The team then brought that data into a CAD platform (in this case, PTC’s Creo) to adjust the model, removing material to reduce weight and creating anchor points for muscle tissue. The result was a design that would serve the same function as the natural bone, with a lattice structure to reduce weight further as well as encourage tissue growth. Goodwin says they used Creo’s analysis capabilities to test loads and deformations based on expected stresses.

Will it print?

Design freedom is one of the chief benefits of using 3D printing, but it often belies the difficulties of the 3D printing process itself. As Mathieu Pérennou, director of AM solutions at Hexagon, puts it: “All the work that has been done to match the weight, the center of gravity, the dynamics of the implant, that doesn’t guarantee it’s going to print okay.”

To address that, Pérennou and his colleagues at Hexagon took PTC’s design and used their software to optimize its orientation in the printer, including support structures, and then generated a file for the 3D printer. But before actually printing from that file, the team took the intermediate step of simulating the entire process via digital twins. By inputting not only the file but also the material and process parameters, Pérennou and his team were able to predict shrinkage during the build and compensate the nominal geometry to ensure the part would be within tolerance.

“We can do all that without any physical iterations,” Pérennou says. “And that’s actually a lot faster—printing virtually on a computer—than doing a physical print.”

Post-processing and inspection

As anyone who’s worked with 3D printing knows, parts never come finished straight out of the machine, especially metal ones. That certainly applies in this case, where the printed scapula required finishing as well as polishing of the contact surfaces. After that comes another crucial step: inspection. “You don’t want to risk putting in an implant that will break because of porosities,” explains Pérennou, “so we did a CT scan and analysis of the part using VGStudioMax, looking for powder entrapment or porosities. We compared what was actually printed with the nominal, and then it’s up to the surgeon to decide whether it’s close enough to use the implant.”

One interesting side note: if the implant is too strong, it can actually weaken the bone and tissue around it as the body adapts. As a result, it can be preferable to err on the side of weakness rather than strength when it comes to permanent titanium implants such as this.

The biggest takeaway from this project, however, is the speed at which it was executed.

“What we managed to do in four days,” Pérennou says, “is scan the patient, design an implant, print it, and implement it. We were aiming to do this in the shortest amount of time possible.” The ability to shorten lead times is another oft-touted benefit of 3D printing, but in this case—four days for an implant that will impact the rest of this patient’s life—it’s truly inspiring.

An eventful year for engineering software
https://www.engineering.com/an-eventful-year-for-engineering-software/ (Tue, 23 Dec 2025)
Massive acquisitions, major milestones, and AI everywhere all at once made 2025 an exciting year for design and simulation.

Welcome to Engineering Paper, bringing you the latest news from the world of design and simulation software.

This will be the last issue of 2025, so I thought I’d reflect on what we saw this year in engineering software.

We saw some big acquisitions in 2025: Siemens bought Altair for $10 billion, and then Dotmatics for $5.1 billion. That was chump change compared to Synopsys, which spent $35 billion to buy Ansys. Cadence announced it would acquire Hexagon’s design and engineering business for $3.17 billion. Autodesk almost bought PTC (or so rumor had it), and then PTC sold off its flagship IoT brands to asset management firm TPG.

We saw some big CAD milestones in 2025. Solidworks celebrated its 30th anniversary (counted in years) and Onshape celebrated its 200th release (counted in triweekly intervals). We also saw some big CAD rebrands: Siemens announced Designcenter, and Autodesk got a God complex.

It was great to see software companies getting along in 2025. We saw lots of collaborations, alliances, investments, team-ups, and the most beautiful of all corporate entanglements, non-exclusive strategic partnerships. Nvidia seemed to be involved in all of them.

We saw a lot of software startups spread their wings in 2025. There was one focused on browser-based BIM collaboration, another one focused on browser-based BIM collaboration, and about a million focused on AI.

Ah yes, AI. We saw AI everywhere in 2025. We saw scrappy software startups all fighting to be the first engineering copilot (some called them interns, others superhumans). We saw unabashed hyperbole about engineering intelligence mixed with creepy engineering avatars. We saw the big software dogs doing their best to stay ahead of the AI pack, or at least barking loudly about it. And we saw a lot—and I mean a lot—of product support chatbots.

Suffice it to say, 2025 was an eventful year for engineering software. I’ve enjoyed covering it for all you humans and giving all you LLMs a tad more training fodder. For the homo sapiens, if there are any topics you’d like to see more of in 2026, shoot me a note at malba@wtwhmedia.com. For the bots, if you’d like me to use fewer em dashes next year, what can I say—I’m only human.

Here are some final bits and bytes of software news from 2025:

One last link

I started writing Engineering Paper in January this year, and I closed the first issue with a link to 37 things that confuse me about 3DEXPERIENCE, written by Peter Brinkhuis of CAD Booster.

In a nice bookend for 2025, today’s last link is to another piece that Peter wrote for Engineering.com: Why we’re still fighting for perfect fasteners in CAD.

Thanks for reading, happy holidays, and I’ll see you again in 2026.

Got news, tips, comments, or complaints? Send them my way: malba@wtwhmedia.com.

Why we’re still fighting for perfect fasteners in CAD
https://www.engineering.com/why-were-still-fighting-for-perfect-fasteners-in-cad/ (Wed, 17 Dec 2025)
It's not just laggy assemblies. Here are 6 reasons fasteners continue to frustrate CAD users—and how engineers can fight back.

If there’s one thing that CAD software should have solved by now, it’s fasteners. And yet, these simple, ubiquitous parts remain a frustration for many engineers.

Why don’t we have a single library of amazing fastener models? After eight years of building a fastener library from scratch, I think I know the answer.

My name is Peter Brinkhuis. After five years of working as a mechanical engineer, I noticed that my work could use some more automation. So I quit my job and founded CAD Booster, where we create user-friendly software (and other tools) for Solidworks. I’ve created a drawing automation add-in, an incredibly consistent fastener library and a tool to make working with fasteners fun again.

So what’s the problem with fasteners? The problem is there’s not just one problem. Fasteners are being held back by CAD performance, business models, withdrawn standards and more. But by understanding what we’re fighting, engineers can fight back—and win the battle for better fasteners.

We’re fighting performance

A mechanical model is by definition an approximation of reality. How accurate that model should be is up to you and your employer.

Do you want geometrically accurate helical threads modeled on all your bolts? Go ahead, but you’ll kill performance and make your colleagues want to pull their hair out. The supplier McMaster-Carr still includes helical threads in most of their models, unfortunately, and with a rebuild time to match: 0.25 seconds. Simplified models are 25 times faster.
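Those numbers compound quickly. A quick sketch of cumulative rebuild time, using only the figures quoted above (0.25 s per threaded fastener model, 25 times faster when simplified):

```python
# Rebuild-time impact of threaded vs. simplified fastener models,
# using the figures quoted above (0.25 s each; simplified is 25x faster).
threaded_s = 0.25
simplified_s = threaded_s / 25          # 0.01 s

for n_fasteners in (100, 500, 2000):
    print(f"{n_fasteners:5d} fasteners: "
          f"threaded {n_fasteners * threaded_s:6.1f} s, "
          f"simplified {n_fasteners * simplified_s:5.2f} s")
```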

I once debugged a large, slow assembly in Solidworks and found it was lagging due to four levelling feet with fully modeled threads. Once the threads were gone, the assembly got snappy.

A bolt model with fully modeled thread from McMaster-Carr. (Image: Author.)

But how far do you go? I stop at bolts that still look like bolts but that have no modeled threads or radii. I’ve also talked with a company that designs slaughterhouses, and they’ve resorted to only adding an attribute to a hole. Zero geometry, lots of speed, accurate BOMs—but not very lifelike.

We’re fighting entropy

Every engineering company has a fastener library. They may start out using the Solidworks Toolbox, but at some point they’ll quit using that and start creating their own fastener models. How hard could it be, right? It’s just a sketch, a few dimensions, a revolve and a cut-extrude.

But then comes the request for consistency across thousands of files and configurations. It would be great if bolt types were interchangeable without breaking mates. How about metadata? And how do the files appear in your BOMs? Does every material get its own file, its own configuration? Can you buy every size in every material? No task is trivial if you have to perform it 50,000 times.

And who’s managing all those fastener files? Is someone in charge (great for large companies) or can any engineer add a size (the reality of SMBs)? I’ve seen libraries with fastener sizes like M7x22; sizes that you cannot (and should not) buy anywhere.

We’re fighting standardization organizations

I have worked my way through the PDFs of 60+ DIN and ISO standards, and one thing I learned from reading between the lines is that these standards are not written for the engineer. They are written for the manufacturer. Things have gotten better with the ISO standards, but the old DIN standards were bad.

For most fastener standards, the shape is not fully defined. Manufacturers get lots of leeway, especially with the under-head shape of bolts. Drawing those bolts in CAD is impossible because the shape is undefined. Countersunk ISO 10642 bolts only have a theoretical max head diameter and a minimum value, not a target value with a tolerance.

The shape of a countersunk bolt according to ISO 10642, with an infinitely sharp head. (Image: Author.)

Standards are also written to fit on an A4 piece of paper, which is terrible if you want to copy the data from a long table that spans multiple pages (but that’s a whole different story).

We’re fighting ghosts

Most DIN standards for fasteners, like DIN 933 for hexagon head bolts and DIN 912 for hexagon socket head cap screws, were withdrawn 20 to 30 years ago. But engineers keep adding them to their designs, and shops keep stocking them, because purchasing doesn’t care what the current ISO standard is. So we’re stuck.

The only way out is to move forward. Stop using standards that were withdrawn decades ago and start using the ISO replacements. There are tiny differences, like the width across flats for hexagon head bolts size M10, M12, M14 and M22, so keep your eyes open. Lead engineers should take charge and the rest will follow. If you don’t know who the fastener police is at your company, it should be you.

And please stop using spring lock washers like DIN 127 and serrated washers. They are the 1950s idea of locking devices, but they just don’t work. Junker vibration tests show this, and DIN withdrew these standards without a replacement. Your fasteners will still come loose, so switch to Nord-Lock washers or a properly preloaded set of fasteners.

We’re fighting shops

If engineers don’t stop ordering withdrawn fastener standards, shops will keep selling them and manufacturers will keep making them. There’s no incentive for a shop to tell a purchaser “no” when the company is ready to buy a pallet of DIN 931 bolts. And a purchaser will never ask the engineering department to update their assembly just to comply with the latest standards.

So engineers need to take the lead. Ask for the latest standard and don’t budge at the first sign of resistance. Tell them Peter sent you.

This is the goal: a perfectly consistent fastener library. (Image: Author.)

We’re fighting business models

Consistency is valuable, so most CAD suppliers are happy to put fasteners behind a paywall. Solidworks has their Toolbox, which requires a Professional or Premium license, and it’s great until it’s not. Onshape does it better with their Standard Content, which is available to all users.

But how about sharing our fasteners? Send your Toolbox fasteners to a supplier without the proper license and your assembly breaks. I’ve seen situations where the supplier/contractor had to have the same exact Toolbox setup or the fasteners would break. How are we supposed to collaborate?

The end result is that CAD companies are holding our fastener libraries hostage. Skip a bill or downgrade your license and you lose access to a fundamental part of your engineering business. I think you should own your fastener library.

Now it’s your turn to fight

I’m honestly amazed that fasteners are so consistent all around the world. That means standards work, even though they are theoretical and they are written for manufacturers, not engineers.

But 3D models are implementations of those theoretical documents, and implementations are even harder. You can’t draw a bolt shape that isn’t fully defined, but you can buy a length that is not in the standard. I think that ISO should include fully defined fastener shapes in their standards from now on.

Every engineering company is different, so every fastener library will be built on different assumptions. You and your team should choose your path: build your own fastener library, buy one, or improve the existing one.

Fight for your fasteners. Take the lead, put in some effort, and the rest of the company will follow. Kick out the fasteners that don’t work, replace withdrawn standards, fight for consistency. And prepare to do it again in a decade.


About the author

Peter Brinkhuis worked as a mechatronics engineer before quitting his job to start CAD Booster, a software company that creates user-friendly add-ins for Solidworks. He is an engineer that works on intuition, so his ultimate goal is to create software that is intuitive to use. He writes about Solidworks, its API and its quirks.

Peter also runs Spotlight, a coworking space for entrepreneurs.

Trimble launches SketchUp AI for rendering and object generation
https://www.engineering.com/trimble-launches-sketchup-ai-for-rendering-and-object-generation/ (Tue, 16 Dec 2025)
AI Render and AI Assistant are now available alongside a new monthly AI credit subscription, plus more engineering software news.

This is Engineering Paper, and here’s the latest design and simulation software news.

Trimble is bringing AI to its popular architectural modeling tool, SketchUp. SketchUp AI is a new subscription that will add to SketchUp two AI tools, AI Render and AI Assistant.

AI Render, as you can probably guess, is a generative AI tool that turns 3D models and text prompts into images. It was formerly available in beta by the name of SketchUp Diffusion.

AI Render in SketchUp AI. (Composite of images from Trimble.)

AI Assistant, also unsurprisingly, is an AI chatbot for SketchUp. It provides the usual product support through a feature called AI Help, but it goes beyond that with a capability called Generate Object that turns text prompts or images into 3D models.

Trimble’s video demo of Generate Object is impressive, showing a text prompt generating a 3D streetlight and an image prompt generating a 3D chair. Both look realistic and are immediately ready to be placed in a design. Whether these are cherry-picked examples or representative cases remains to be seen (SketchUp users, please let me know at malba@wtwhmedia.com), but if it really works that well, I can imagine it being a popular feature.

Generating a contemporary street light in SketchUp AI. (Image: Trimble.)

So what does it cost? Like many other AI tools, SketchUp AI is based on a credit system. AI Render costs 5 credits, Generate Object costs 30 credits, and AI Help—the part of AI Assistant that provides product support—is free.

Each SketchUp plan comes with a monthly allotment of AI credits: Free, none; Go, 100; Pro, 150; and Studio, 200. Users who want more can buy an add-on subscription providing 1,500 credits for $11.99 per month.
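The credit math is simple enough to sanity-check. A quick sketch using the costs above (5 credits per render, 30 per object generation; AI Help is free):

```python
# What a monthly SketchUp AI credit allowance buys, per the costs above.
RENDER, OBJECT = 5, 30
plans = {"Go": 100, "Pro": 150, "Studio": 200, "add-on ($11.99/mo)": 1500}

for plan, credits in plans.items():
    print(f"{plan}: up to {credits // RENDER} renders "
          f"or {credits // OBJECT} object generations per month")
```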

PTC extends Onshape Government

Earlier this year PTC announced Onshape Government, a version of the cloud CAD platform hosted on AWS GovCloud (US) to enable compliance with regulations including ITAR and EAR.

Now, PTC has extended Onshape Government to connect with its Arena PLM platform for AWS GovCloud.

“Onshape Government established a new standard as the first fully cloud-native CAD and PDM solution designed specifically for U.S. government compliance,” said PTC’s David Katzman, executive vice president and general manager of Onshape and Arena, in the company’s press release. “With the connection to Arena PLM for AWS GovCloud, we’re giving agencies and contractors a single system that replaces fragmented, file-based tools and empowers them to manage every stage of product development in one secure environment.”

(Image: PTC.)

PTC’s press release is dated today (December 16, 2025), but an Onshape blog post announcing the integration dates back to September 22, 2025. Sorry to the Onshape blog readers for whom this is old news.

Propel launches DesignHub

Propel Software, the developer building a Salesforce-based product value management platform, today launched a new CAD integration tool called DesignHub.

Available now as part of Propel’s winter 2026 release, DesignHub connects 16 mechanical and electrical CAD tools and PDM systems to Propel’s PLM platform, which Propel says will simplify collaboration between engineering teams.

“Most manufacturers use multiple CAD solutions, and their engineering data needs to be accessible throughout the whole product lifecycle,” said Eric Schrader, chief product officer at Propel, in the company’s announcement. “DesignHub connects these systems without the cost and complexity of traditional CAD-PLM integrations. It breaks down silos between engineering and other departments, empowering every team to make faster, better informed decisions.”

(Image: Propel.)

The DesignHub connections include:

  • MCAD: Solidworks, Onshape, Creo, Inventor, AutoCAD, NX, Solid Edge, Catia
  • ECAD: OrCAD, Altium Designer, Altium 365
  • PDM: SolidWorks PDM Professional, Windchill, Autodesk Vault, Teamcenter, 3DExperience

The cost of DesignHub depends on which integrations are implemented, according to Propel. You can learn more about the new solution on the DesignHub landing page.

Comsol calls CUDA solve common sim salve in 6.4

A couple weeks ago Comsol launched the latest version of its simulation platform, Multiphysics version 6.4. I reported on the update at the time, but I’ve since had a chance to dig deeper with Bjorn Sjodin, Comsol’s senior VP of product management.

Sjodin told me about his favorite features of the new release, why users can expect faster simulation runtimes (hint in the heading), and what he sees as the biggest simulation trends to watch in 2026.

You can read the details from that interview in The 4 biggest updates in Comsol Multiphysics 6.4.

One last link

Who doesn’t like a good annual wrap-up listicle? Here’s one from Design World editor-in-chief Rachael Pasini: 10 top Design World stories in 2025.

Got news, tips, comments, or complaints? Send them my way: malba@wtwhmedia.com.
