For PCB and assembly manufacturers, test engineering has become a critical factor in enhancing the profitability of new product introductions (NPIs). Given the trend toward high-mix, low-volume production, the journey from design data to an automated PCB testing program must be quicker and more efficient than ever before.
In this article, we will discuss how to optimize the efficiency of the test engineering process in accordance with these new market realities.
A Brief Introduction to PCB Test Engineering
Test engineering describes the methodology of using design data to define a test strategy that ensures a PCB assembly is fully tested and free of defects. The testing infrastructure has gradually expanded from the “bed-of-nails” in-circuit test (ICT) approach to flying probe machines, which are better suited for high-mix, low-volume production and prototype runs. While ICT remains in use and is an efficient means of IC programming, in-line testing, and high-volume testing, the migration to flying probes continues to accelerate due to the cost of ICT fixtures and the reduction in production volumes.
The field of test engineering is further divided into two separate disciplines:
- Process verification: The product is tested to ensure that it was manufactured properly.
- Product verification: The product undergoes functional testing to ensure that it works according to its design specifications. Product verification differs widely from product to product and is heavily customized for each project.
This article focuses on process verification. During process verification, the PCB is tested to ensure that it was built correctly, that the right components were placed in the right places, and that they operate correctly, with no shorts or opens. If the board was designed correctly, and passes process verification, then in theory, it should function as specified.
The Need for Efficient Test Engineering
The electronics manufacturing sector has seen a fundamental shift toward higher production mixes, as industries move toward individualization and the customization of designs for specific markets. The increase in the number of designs and design variations has pressured test engineering departments to produce reliable, debugged programs in ever-shorter periods of time.
Therefore, the goal for test engineering is to shorten and simplify the path from design data to debugged test programs.
To achieve this, I’ve selected some key guidelines for economizing and optimizing the test engineering task, described in the sections below.
Guideline #1: Maintain a Single Data Source
Commonly, manufacturers use multiple data preparation applications to create operational procedures for the entire line. The applications include SMT process software and testing software, but it is not unusual to see three or four different applications performing these tasks, often resulting in duplicated efforts and inaccuracies.
In addition, each application may rely on different source files, increasing the probability of inconsistent results.
Therefore, it is highly recommended to utilize a single application to create data for all the machines in the line—including testing—thereby creating a digital twin that serves as a “single point of truth” upon which all activities are based.
How to accomplish this?
One of the most crucial requirements in test engineering is the ability to obtain complete layout data which can then be imported into whichever application is being used. Intelligent CAD files, such as ODB++ Design, are highly accurate, and save significant amounts of time when compared to other formats, such as Gerber. If needed, CAM formats such as GenCAD are still preferred over Gerber, but a key objective is to use the same file set for all the machines in the line.
If only Gerber files are available, your test engineering solution must support the import and reverse engineering of Gerber files, including generation of the netlist, which is critical to test engineering analysis.
Gerber import can also be used when CAD files are missing accurate solder mask data. The ability to merge the top and bottom mask Gerber files with CAD-based intelligence yields excellent results, and without the need for reverse engineering.
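As a rough illustration, the format preference described above can be expressed as a simple selection rule. The format tags and priority order here are a hypothetical sketch, not part of any specific tool:

```python
# Preferred layout-data formats, best first: intelligent CAD (ODB++),
# then a CAM format (GenCAD), with Gerber as the fallback of last resort.
PREFERENCE = ["odb++", "gencad", "gerber"]

def pick_source(available):
    """Return the best available format tag, or None if nothing usable exists."""
    for fmt in PREFERENCE:
        if fmt in available:
            return fmt
    return None
```

The key point is that whichever format wins should then feed every machine in the line, so all downstream data derives from the same source.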
The physical layout data does not always contain the electrical information needed for test program development, test debug, and repair activities. In this case, it becomes necessary to import the bill of materials (BOM) or import data from a parts library to obtain this information.
While standard machine output files can be created directly from CAD data, it is preferable to work with complete output files that identify all the characteristics of each part on the board, including component types, model names, electrical values, tolerances, and pin mappings.
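The merge of physical layout data with BOM-sourced electrical data can be sketched as a simple join on the reference designator. The data structures and field names below are illustrative assumptions, not any tool's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Placement:
    refdes: str     # reference designator from CAD, e.g. "R12"
    package: str    # package name, e.g. "0402"
    x: float        # placement coordinates in mm
    y: float

@dataclass
class BomEntry:
    refdes: str
    part_number: str
    value: str      # electrical value, e.g. "10k"
    tolerance: str  # e.g. "1%"

def merge_bom(placements, bom_entries):
    """Attach BOM electrical data to CAD placements by reference designator."""
    bom = {e.refdes: e for e in bom_entries}
    merged, missing = [], []
    for p in placements:
        entry = bom.get(p.refdes)
        if entry is None:
            missing.append(p.refdes)  # flag parts with no electrical data
        else:
            merged.append((p, entry))
    return merged, missing
```

Reporting the unmatched designators is the useful part in practice: every part missing from the merge is a part the test program cannot measure against a nominal value.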
Accurate package outlines are important for ICT and flying probe analysis. CAD outlines can be used, but they may not be sufficiently accurate. Algorithm-based outlines, together with component dimensions, can be used to create a single closed shape. A comprehensive package library database, such as the Valor Parts Library, can be an excellent source of package outlines for DFT activities.
Guideline #2: Ensure Test Probe Placement Efficiency
When through-hole PCBs were prevalent, bottom-side ICT analysis was automatic: any pin could be probed, yielding multiple usable points on each net. The introduction of surface mount components changed the testing paradigm. Now, test engineers must analyze the design, determine how to probe the board, and write scripts to implement the solution; the applications then simply apply what the test engineer has painstakingly created. As the number of assembly projects grows, this time-consuming approach becomes increasingly unscalable.
Advanced software solutions, such as Valor Process Preparation, can automatically determine probe targets based on general guidance and constraints from the test engineer. Consider, for example, a copper pad with a solder mask opening over it and a drill in the center. If the mask opening is the size of the copper pad or larger, the exposed area equals the size of the copper pad; if the mask opening is smaller, the exposed area is based on the mask size. Drills may also affect the result: a small drill in a via can be ignored, but a larger drill should be avoided, so the software must consider the annular ring around the drill when placing probes.
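These pad rules reduce to a short calculation. The function below is a minimal sketch of that logic; the 0.15 mm minimum annular ring is an illustrative assumption, not a tool default:

```python
def probeable_area(copper_d, mask_d, drill_d, min_annular_ring=0.15):
    """
    Return the probeable diameter of a pad (mm), or 0.0 if it should be skipped.

    The exposed diameter is the smaller of the copper pad and the mask
    opening. A drill is acceptable only if enough annular ring remains
    around the hole to land a probe safely.
    """
    exposed = min(copper_d, mask_d)
    if drill_d > 0:
        ring = (exposed - drill_d) / 2
        if ring < min_annular_ring:
            return 0.0  # not enough annular ring to probe reliably
    return exposed
```

For example, a 1.0 mm copper pad under a 0.8 mm mask opening exposes only 0.8 mm; add a 0.6 mm drill and the remaining 0.1 mm ring is too small, so the target is rejected.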
Another issue involves determining how close test probes can be to the board edge. Vacuum fixtures have a gasket around them that must be free of test probes.
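The gasket constraint amounts to a keep-out band around the board outline. A minimal sketch, assuming a rectangular board and an illustrative 5 mm gasket width:

```python
def clear_of_gasket(x, y, board_w, board_h, gasket_margin=5.0):
    """
    True if a probe at (x, y) keeps the required distance from the board
    edge, leaving the vacuum-fixture gasket zone free of probes.
    All dimensions are in mm; the board origin is its lower-left corner.
    """
    return (gasket_margin <= x <= board_w - gasket_margin and
            gasket_margin <= y <= board_h - gasket_margin)
```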
ICT fixtures can be expensive, so it is preferable to use one fixture for multiple board variants. Therefore, during analysis, components should always be treated as placed, even if they are absent from a particular variant.
Different targets can be ignored during analysis, and some types of targets can be prioritized over others. For example, test points are typically preferred over vias.
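Such prioritization can be modeled as a simple ranking over target types. The priority table below is a hypothetical example of how a test engineer's preferences might be encoded:

```python
# Lower number = more preferred target type (an illustrative ordering).
PRIORITY = {"test_point": 0, "component_pad": 1, "via": 2}

def pick_targets(net_targets, needed=1):
    """
    Select up to `needed` probe targets for a net, preferring test points
    over component pads, and pads over vias. Each target is a dict with at
    least a "type" key; unknown types rank last.
    """
    ranked = sorted(net_targets, key=lambda t: PRIORITY.get(t["type"], 99))
    return ranked[:needed]
```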
Package outlines can also be expanded by a specific amount to provide a little more margin around each one if needed.
When preparing for panel testing, it is highly desirable not to flatten the panel into one large board, as debug times will be much longer as a result. The preferred approach is to create a debugged test program for the single board first and then panelize the result.
Sometimes it is necessary to place two probes on a net, for example, for a 4-wire measurement. If there is only one accessible point, ICT fixtures can support two channels double-wired to the same test probe.
Finally, probe locking can be useful when planning tests that require special probes that are specified in the CAD system or in a separate file.
Figure 2.2: Precision robotic probes testing a printed circuit board.