Preprint
Article

Verifying the Accuracy of 3D-Printed Objects Using an Image Processing System


A peer-reviewed article of this preprint also exists.

This version is not peer-reviewed

Submitted: 04 April 2024

Posted: 05 April 2024

Abstract
Verifying the accuracy of a 3D-printed object involves using an image processing system. This system compares images of the CAD model of the object to be printed with images of its 3D-printed counterparts to identify any discrepancies. It is important to note that the integrity of the accuracy measurement depends heavily on the image processing settings chosen. This study focuses on this issue by developing a customized image processing system. The system generates binary images of a given CAD model and its 3D-printed counterparts and then compares them pixel-by-pixel to determine the accuracy. Users can experiment with various image processing settings, such as the grayscale-to-binary conversion threshold, noise reduction parameters, masking parameters, and pixel-fineness adjustment parameters, to see how they affect accuracy. The study concludes that the grayscale-to-binary conversion threshold has the most significant impact on accuracy and that the optimal threshold varies depending on the color of the 3D-printed object. The system can also effectively eliminate noise (filament marks) during image processing, ensuring accurate measurements. Additionally, the system can measure the accuracy of highly complex porous structures where the pore size, depth, and distribution are random. The insights gained from this study can be used to develop intelligent systems for the metrology of additive manufacturing.
Keywords: 
Subject: Engineering - Industrial and Manufacturing Engineering

1. Introduction

Additive manufacturing, commonly known as 3D printing, holds great promise in transforming how we design and manufacture products [1,2]. Its unique capabilities provide an array of opportunities to explore innovative design ideas [1,2,3]. 3D printing fabricates objects by adding materials layer-by-layer [1,2,3]. This manufacturing process has emerged as a highly dependable process for fabricating complex objects, including topologically optimized parts and porous structures [4,5,6,7]. Furthermore, this process has developed a reputation for effectively managing the production of small-volume, high-variety products. Nevertheless, inaccuracy may occur in 3D-printed objects due to pre-processing (e.g., file format conversion and slicing), material processing (e.g., shrinkage, temperature variation, and filament blockage), post-processing (e.g., support removal and surface finishing), flaws in the materials (e.g., inconsistency in material properties and particle sizes), design issues (e.g., loose shells and thin walls), and machine structures (e.g., mechanical deformation). Section 2 briefly describes some relevant studies conducted to manage or eliminate the abovementioned causes.
The accuracy measurement ecosystem of 3D-printed objects consists of a set of system components, as schematically illustrated in Figure 1. The description of this ecosystem is as follows. First, the CAD model of the object to be printed is sliced using an appropriate slicing system. This system provides the program to the control system of the 3D printer. The control system then runs the printer according to the program. Once the printer completes fabricating the object (printed object in Figure 1) or completes fabricating a given cross-section, an image acquisition system can be used to obtain the image of the printed cross-section. The obtained image then undergoes a series of image processing steps and produces a binary image denoted as a print image. Simultaneously, the target image generation system operates on the CAD model and produces a binary image of the desired design cross-section denoted as the target image. Finally, the accuracy elicitation system compares the print image with the target image pixel-by-pixel and determines how much the printed object conforms to the CAD model (see the comparison image in Figure 1).
The remarkable thing is that the image processing settings can significantly alter the print image and, thereby, the measurement results. Consequently, the measurement integrity relies on whether or not the image processing settings are selected correctly. This study focuses on this issue using a custom-made image processing system. (See Section 3 for the general description of the system.) The system first produces binary images of a given CAD model and its 3D-printed counterpart and then compares them pixel-by-pixel to quantify the accuracy. While processing the images, a user can select some vital parameters (e.g., the grayscale-to-binary conversion threshold, noise reduction parameters, masking parameters, and pixel-fineness adjustment parameters). Depending on the settings of these parameters, the accuracy may vary considerably. Thus, the user must know which parameter settings alter the accuracy results and to what extent. Finally, the user can determine the optimal image processing settings, ensuring the integrity of the accuracy measurement.
This article is written based on the above contemplation. For the sake of better understanding, the rest of this paper is organized as follows. Section 2 presents a literature review on the accuracy-checking of 3D-printed objects. Section 3 presents the custom-made image processing system used to investigate the measurement integrity of 3D-printed objects. Section 4 presents how to select the optimal image processing settings for relatively simple objects. Section 5 shows how to apply the system to determine the accuracy of a highly complex 3D-printed object (a porous structure with randomly sized and distributed pores). Section 6 presents the concluding remarks of this study.

2. Related Work

This section briefly describes some selected contributions made so far on measuring the accuracy of 3D-printed objects.
Carew et al. [8] studied the accuracy of 3D modeling and 3D printing in forensic anthropology evidence reconstruction. They found that complex parts exhibit lower accuracy regardless of the additive manufacturing process used during printing. The printing layer resolution may not affect the accuracy because, in most cases, the resolution of the modeling data exceeds that of the printing layer. Other authors found similar results (e.g., Edwards et al. [9]). Lee et al. [10] used two different additive manufacturing processes to create replicas of a tooth. They found that a replica shrinks or enlarges, and its surface becomes rough, depending on the additive manufacturing process. They also found that the replicas can be used in real-life applications despite accuracy problems because the accuracy remains within the stipulated tolerance limits. Leng et al. [11] developed a quality assurance framework to systematize the accuracy assessment of 3D-printed anatomic models. They found that inaccuracy stems from three main areas: 1) image data acquisition, 2) segmentation and processing, and 3) 3D printing and cleaning. They found that both qualitative inspection and quantitative measurement are needed to assess the accuracy. The images of the 3D-printed model, obtained by a high-resolution CT scanner, can be compared with the original images to facilitate the quantitative measurement. George et al. [12] found that if validated workflows are used, the existing 3D printing technologies produce replicas within stipulated accuracy limits. Nevertheless, their study showed that reproducibility is a concern due to the performance of the software used in the workflows; the review and manual adjustment of the STL datasets may cause unpredictable inaccuracy. If a modification in the workflows is needed, it must be carried out in a stepwise fashion aided by STL dataset comparison metrics. The authors emphasized that the comprehensive accuracy evaluation of 3D-printed medical models is still evolving, and new measurement methods must be adopted to achieve better results. Bortolotto et al. [13] employed a low-budget workflow consisting of 64-slice computed tomography (CT), three free and open-source software packages, and a commercially available 3D printer. They measured 3D-printed replicas and original objects using high-precision digital calipers and found a dimensional inaccuracy of about 0.23 mm (0.055%), which is acceptable for medical applications. Herpel et al. [14] fabricated try-in dentures using milling (a subtractive manufacturing process) and 3D printing (an additive manufacturing process). The 3D printing was carried out at five facilities. They found that though the 3D-printed try-in dentures qualify for real-life application, they were less accurate than those produced by milling. Cai et al. [15] introduced the concept of residual STL volume as a metric to evaluate the accuracy and reproducibility of 3D-printed anatomic models. They applied the evaluation to maxillofacial bone and enhanced the accuracy of the 3D-printed structure. Kim et al. [16] studied the accuracy of a simplified 3D-printed implant surgical guide. They printed the same implant using three different additive manufacturing processes, namely, photopolymer jetting (PolyJet), stereolithography apparatus (SLA), and multi-jet printing (MJP). They found that PolyJet and SLA can meet the required accuracy for clinical applications. Kwon et al. [17] studied the accuracy of a 3D-printed patient-specific implant.
The shape datasets were extracted from CT images. The implants were fabricated using a 3D printer that uses photo-resin (curable under ultraviolet rays) with 0.032 mm resolution. To evaluate the accuracy, the implants were scanned using a micro-CT scanner, and the lengths and depths of the press-compressed and decompressed implants were compared using Bland-Altman plots. The average differences in length were 0.67 mm ± 0.38 mm, 0.63 mm ± 0.28 mm, and 0.10 mm ± 0.10 mm, and the average differences in depth were 0.64 mm ± 0.37 mm, 1.22 mm ± 0.56 mm, and 0.57 mm ± 0.23 mm, respectively. Yuan et al. [18] obtained a similar degree of accuracy for 3D-printed dental implants. Borgue et al. [19] considered that imperfections in material properties can lead to errors in 3D printing. They developed a fuzzy logic-based approach for design-for-AM to manage uncertainties in material properties while meeting the quality standards of 3D-printed objects. Holzmond and Li [20] developed a system that detects two common 3D printing errors: filament blockages and low flow. They used a digital image correlation system to compare, in real time, the point cloud captured from the printer-head movement program (G-code) with the point cloud of a printed surface. Li et al. [21] considered that the machine structure is the main cause of error and developed an analytical model of the structure of a given 3D printer to elucidate the printing error. Yu et al. [22] developed an image processing-based approach to enhance the accuracy of 3D-printed microchannels. They modulated the optical proximity effect (curing light transmission) and eliminated channel blockage and shape distortion while printing small-diameter channels using laser curing technology, adjusting the local grayscale of the projection image as the 3D printing continued. Montgomery et al. [23] studied pixel-level grayscale manipulation to improve the accuracy of digital light processing-based 3D printing. They first printed an object according to its 2D binary image. The grayscale image of the printed object was processed to create printing data (a relatively smooth contour), and the processed information was used to print the same object with high accuracy. Their method provided pixel-level grayscale control to create round features from sharp pixels. Ma et al. [24] developed an image processing-based method for measuring layer-wise 3D food printing accuracy and identified the bottleneck of food printing (under- or over-extrusion). They first took a top-view image of the printed object (a cookie). This image was projected on a vertical plane and cropped before being segmented from its background using Otsu's automatic thresholding method [25]. They also converted a layer's printer-head paths (G-code) into a binary image. The image produced from the printer-head paths and the processed image of the printed object were compared to quantify the accuracy. Vidakis et al. [26] developed a method that uses micro-computed tomography (micro-CT) images of 3D-printed objects to elucidate the dimensional and shape accuracies. They, however, did not show how the images were processed and compared with the corresponding target images. Eltes et al. [27] developed an image processing-based accuracy-checking method for 3D-printed biomedical objects. They created surface meshes of the 3D-printed object using 3D scanning and compared them with the target surface meshes derived from CT scan images.
The comparison was conducted using the Hausdorff distance (HD). Nguyen et al. [28] and others (e.g., [29]) developed methods to generate a model of a biomedical object by processing the sliced images from CT-scan data. The model was fabricated using 3D printers, and the CT-scan data of the printed object were obtained to check the accuracy. The details of the comparison mechanism that quantifies the accuracy were not presented. Xia et al. [30] developed an image acquisition and processing technique using a flatbed scanner for shape accuracy evaluation of 3D-printed objects. The algorithms were formulated to extract useful shape information from the scanned images without human intervention. A centroid distance function and a root mean square error color map were used to visualize the inaccuracy effectively.
The abovementioned studies reveal that the image of the target (or design) object and the image of the printed object are two valuable pieces of information by which one can guarantee whether or not a 3D-printed object is reliable. The real-time or offline comparison of these two types of images can provide valuable insights into the underlying manufacturing process and how to improve it. On the one hand, the target or design image can be created from the CAD model in STL format [3,4,5]. On the other hand, the image of the printed object can be created by applying some image processing techniques (e.g., converting a raw image to a grayscale image, converting a grayscale image to a binary image, removing noise from an image, masking and rescaling an image, and the like). To ensure the integrity of the accuracy measurement results, it would be beneficial to develop a specialized image processing system that can examine how different image processing settings impact accuracy. Such a system could provide valuable insights into the extent of these effects and help improve the accuracy of measurements in various applications. The next section presents a custom-made image processing system that fulfills the abovementioned measurement needs.

3. Image Processing System

This section presents the working principle, user interfaces, and performance of the proposed custom-made image processing system. The previous versions of the system can be found in references [31,32].
First, consider the working principle of the image processing system, as schematically illustrated in Figure 2. The left-hand side of Figure 2 shows how the image processing system interacts with the data acquisition and CAD systems. The other side shows how the system processes images for the sake of comparison.
As seen in Figure 2, a 3D printer operates using the STL data collected from a CAD model. After the printing operation, the printed object is collected. An imaging system (e.g., a microscope) can collect the image of the cross-section of the printed object. The image processing system acknowledges the raw image of the printed object as the original image. The system also acknowledges the STL data of the CAD model. From the CAD model, the system generates the binary image of a given cross-section, denoted as the design image. The system processes the original image and produces a grayscale image using user-defined RGB weights. The system then converts the grayscale image into a binary image using a user-defined threshold value. Afterward, the system removes the noise from the binary image using user-defined values of the noise parameters and produces a noise-removed image. The system then applies a user-defined masking operation and produces the masked image. Finally, the design image is compared with the masked image, resulting in a comparison image. The comparison image is represented by pixels of four colors: black, white, violet, and yellow. Note that the processes of producing the binary, noise-removed, and masked images can be reshuffled. For example, one can produce the masked image before producing the noise-removed image. In that case, the noise-removed image is compared with the design image. For the sake of comparison, the resolution of the design image can be adjusted to the resolution of the masked, noise-removed, or binary image.
Let B, W, V, and Y denote black, white, violet, and yellow pixels, respectively. Let $P_D(i,j)$, $P_M(i,j)$, and $P_C(i,j)$, $i = 1, 2, \dots, N$, $j = 1, 2, \dots, M$, denote arbitrary pixels in the design image, the masked/noise-removed/binary image, and the comparison image, respectively. As such, $P_D(i,j) \in \{B, W\}$, $P_M(i,j) \in \{B, W\}$, and $P_C(i,j) \in \{B, W, V, Y\}$. The following rules are maintained while generating a comparison image by comparing a design image with the masked, noise-removed, or binary image.
$$P_D(i,j) = P_M(i,j) = B \;\Rightarrow\; P_C(i,j) = B \quad (1)$$
$$P_D(i,j) = P_M(i,j) = W \;\Rightarrow\; P_C(i,j) = W \quad (2)$$
$$P_D(i,j) = B \,\wedge\, P_M(i,j) = W \;\Rightarrow\; P_C(i,j) = Y \quad (3)$$
$$P_D(i,j) = W \,\wedge\, P_M(i,j) = B \;\Rightarrow\; P_C(i,j) = V \quad (4)$$
Thus, violet or yellow pixels represent errors in a printing process, and minimizing the number of such pixels leads to better quality outcomes. Consequently, the printing error, denoted as E, can be expressed as follows:
$$E = \frac{Y_N + V_N}{T_N} \times 100\% \quad (5)$$
In Equation (5), $Y_N$, $V_N$, and $T_N$ denote the number of yellow pixels, the number of violet pixels, and the total number of pixels in the comparison image, respectively.
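For clarity, the following is a minimal sketch of how rules (1)-(4) and Equation (5) can be implemented, assuming the design and print images are boolean NumPy arrays of equal shape (True = white, False = black); the function name and the RGB codes for violet and yellow are illustrative, not the system's actual code.

```python
# Sketch of the comparison rules (1)-(4) and the error measure (5).
import numpy as np

def compare_images(design: np.ndarray, printed: np.ndarray):
    """Build the four-color comparison image and compute the error E [%]."""
    assert design.shape == printed.shape
    comparison = np.zeros(design.shape + (3,), dtype=np.uint8)  # black by default

    yellow = ~design & printed    # rule (3): design black, print white
    violet = design & ~printed    # rule (4): design white, print black
    comparison[design & printed] = (255, 255, 255)  # rule (2): both white
    comparison[yellow] = (255, 255, 0)              # illustrative yellow
    comparison[violet] = (148, 0, 211)              # illustrative violet
    # rule (1): pixels black in both images stay (0, 0, 0)

    error = 100.0 * (yellow.sum() + violet.sum()) / design.size  # Equation (5)
    return comparison, error
```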

User Interfaces

Based on the outline described above, the image processing system is developed. This sub-section presents the user interfaces of the developed system. The system consists of five independent components: 1) the grayscale-binary interface, 2) the noise-removal interface, 3) the masking-scaling interface, 4) the target interface, and 5) the comparison interface. The grayscale-binary interface, as shown in Figure 3, lets a user input an original image of a cross-section of a 3D-printed object. The user then sets the R, G, and B weights to convert the original image to a grayscale image. The user subsequently sets a value of the threshold (an integer in the interval (0, 255)) to obtain a binary image from the grayscale image. The interface displays the results. The user can save the grayscale and binary images in a preferred directory.
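The conversion logic of this interface can be sketched as follows, assuming Pillow and NumPy; the function name and file name are illustrative, not the system's actual code.

```python
# Sketch of weighted-RGB grayscale conversion followed by fixed thresholding.
import numpy as np
from PIL import Image

def to_binary(path: str, r_w: float, g_w: float, b_w: float, threshold: int):
    """Convert an RGB image to grayscale with user weights, then binarize."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    gray = np.clip(r_w * rgb[..., 0] + g_w * rgb[..., 1] + b_w * rgb[..., 2], 0, 255)
    binary = gray >= threshold          # True = white, False = black
    return gray.astype(np.uint8), binary

# Example: standard luminance weights and a threshold of 150
gray, binary = to_binary("cross_section.png", 0.299, 0.587, 0.114, 150)
```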
Using the noise-removal interface, as shown in Figure 4, a user can remove noise from a given binary image. The sources of noise are mostly the marks of the filaments and other irregularities on a given cross-section of the printed object. Four parameters control the noise removal. The first two, denoted θmin and θmax (angles in degrees), collectively set the orientation of the noise (i.e., the slopes of the noise spots) on the binary image. The other two parameters are the critical spot size (pc) and critical spot length (wc), both measured in pixels. The arbitrary case shown in Figure 4 corresponds to θmin = 0°, θmax = 50°, pc = 300 pixels, and wc = 100 pixels. As such, the system removes black spots whose size is less than or equal to 300 pixels, whose length is less than or equal to 100 pixels, and whose slope, in terms of an angle measured in degrees in the anticlockwise direction of the length, belongs to the angular range [0°, 50°]. Note that the user can select black or white spots as noise. In the case shown in Figure 4, the user selects black spots as noise, which is the natural choice here. The interface displays the images before and after the noise removal operation. The user can save the images in a preferred directory.
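One plausible realization of this step, sketched below, uses connected-component analysis from scikit-image; it assumes black spots on a white background (as in Figure 4), and the mapping of regionprops' orientation convention onto the paper's angle range [θmin, θmax] is an assumption, as is the function name.

```python
# Sketch of spot-based noise removal via connected components.
import numpy as np
from skimage.measure import label, regionprops

def remove_noise(binary, theta_min, theta_max, p_c, w_c):
    """Whiten black spots whose size, length, and slope fall within the limits."""
    cleaned = binary.copy()            # True = white, False = black
    labels = label(~binary)            # label the connected black regions
    for region in regionprops(labels):
        angle = np.degrees(region.orientation) % 180.0  # slope in [0, 180)
        if (region.area <= p_c
                and region.major_axis_length <= w_c
                and theta_min <= angle <= theta_max):
            cleaned[labels == region.label] = True      # paint the spot white
    return cleaned

# Example matching Figure 4: spots up to 300 px in size and 100 px in length,
# oriented between 0 and 50 degrees
cleaned = remove_noise(binary, 0.0, 50.0, 300, 100)
```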
As shown in Figure 5, the masking-scaling interface is used to mask the unnecessary segment of a given binary image, preferably after removing the noise. In this interface, the size of the masked image is rescaled so that the pixel size of the design image (described below) matches that of the masked image. In the case shown in Figure 5, a user crops an image using a rectangular boundary so that it takes the size of the design image and its pixel scale matches that of the design image. The interface displays the images before and after the masking and scaling operations. The user can save the images in a preferred directory.
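A minimal sketch of this masking-scaling step follows, assuming a user-chosen rectangular crop box and nearest-neighbor resampling (so the result stays binary); all names are illustrative.

```python
# Sketch of the masking-scaling step with Pillow.
import numpy as np
from PIL import Image

def mask_and_scale(binary, box, target_size):
    """Crop a boolean image to box = (left, upper, right, lower) and rescale
    it to target_size = (width, height) of the design image's pixel grid."""
    img = Image.fromarray(binary.astype(np.uint8) * 255)
    masked = img.crop(box).resize(target_size, Image.NEAREST)
    return np.asarray(masked) > 127    # back to a boolean image

# Example: crop to a 900 x 900 px window, then match a 600 x 600 px design image
masked = mask_and_scale(cleaned, (100, 100, 1000, 1000), (600, 600))
```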
As shown in Figure 6, the target interface creates a binary image of a given cross-section of the design. For this, the interface allows a user to input the STL data of the 3D CAD model (the design) of the object to be fabricated using a 3D printer. The user then inputs the height of the cross-section to produce the binary image. The interface displays the image. The user can save the image in a preferred directory.
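One way to realize such a target-image generator is sketched below using the open-source trimesh library; this is an assumption, as the paper does not name its STL toolchain, and the mesh is assumed to intersect the slicing plane. The mesh is sliced at the user-given height and the section is rasterized into a binary design image with a user-chosen pixel pitch.

```python
# Sketch of target-image generation: slice an STL mesh and rasterize the section.
import numpy as np
import trimesh

def design_image(stl_path: str, z: float, pitch: float = 0.05):
    """Slice the STL model at height z and rasterize the cross-section."""
    mesh = trimesh.load(stl_path)
    section = mesh.section(plane_origin=[0, 0, z], plane_normal=[0, 0, 1])
    path2d, _ = section.to_planar()    # project the slice onto its plane
    raster = path2d.rasterize(
        pitch=pitch,
        origin=path2d.bounds[0],
        resolution=np.ceil(path2d.extents / pitch).astype(int),
        fill=True)                     # filled interior = solid material
    return np.asarray(raster) > 0      # True = material (white), False = void

design = design_image("design_model.stl", z=2.0)   # illustrative file and height
```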
Finally, Figure 7 shows the comparison interface. In this interface, the user inputs the design image (i.e., the image created by the target interface, as shown in Figure 6) and the binary/noise-removed/masked image for comparison. The interface displays the comparison image with the four-color scheme described above. The interface helps the user save the comparison image in a preferred directory. The interface also displays error-related datasets ($Y_N$, $V_N$, $T_N$, and other relevant statistics).
As mentioned earlier, the comparison image displays the errors in the 3D-printed object. Since the comparison image is produced by comparing the design image with a binary, noise-removed, or masked image, careful consideration is required when setting the parameters relevant to the binary, noise-removed, and masked images. In particular, the threshold used in producing the binary image and the four noise-removal parameters need careful consideration. See the arbitrary cases shown in Figure 8 and Figure 9 for a better understanding. Figure 8 shows the original images of two objects printed in two different colors, green (a relatively dark color) and orange (a relatively bright color). Both objects are printed using the same design (CAD model). The printing conditions for both objects are shown in Table 1.
From the grayscale images (not shown in Figure 8) of the original images, binary images are produced by setting the threshold at 100 and 150, respectively. For the orange object, both thresholds preserve the information of the object. For the green object, on the other hand, both thresholds almost destroy the information of the object, with 150 being the worst. Consequently, a smaller threshold is preferable for dark colors, whereas a higher threshold is better for bright colors.
Figure 9 shows the effect of the settings of the noise-removal parameters, i.e., the two angles (θmin and θmax) and the two sizes (pc and wc). As described before, the two angles collectively set the range of slopes within which the noise should be removed. Meanwhile, pc and wc set the critical spot (i.e., noise) size and length in terms of the number of pixels. If the number of pixels of a spot is greater than pc, the spot will not be removed during the noise removal process; the same holds for wc. Figure 9 shows four different settings of θmin, θmax, pc, and wc, denoted as P1, P2, P3, and P4. The setting P1 removes a noise spot that consists of 200 pixels or fewer, has a length of 100 pixels or less, and can be oriented in any direction. Compared to P1, P2 puts a tight restriction on the orientation of a spot. Compared to P1, P3 puts less restriction on the sizes of a spot. Compared to P1, P4 puts a tight restriction on both the orientation and size. Comparing the images after applying the noise removal process, as shown in Figure 9, reveals that a less restrictive noise orientation and a moderate noise size (corresponding to P1) can effectively remove noise, while the other settings (P2, P3, and P4) may not yield the desired results.

4. Accuracy of a Simple 3D-Printed Object

This section presents a case study where the accuracy is estimated by comparing the design shown in Figure 6 with its 3D-printed counterparts shown in Figure 8. For better understanding, the first half of this section presents the results related to thresholds, and the other half presents the results related to noise removal.
First, consider the results related to thresholds. In this case, binary images are produced for thresholds 10, 20, …, 250 for both the orange and green objects (Figure 8). Afterwards, noise is removed using the same settings (θmin = 0°, θmax = 180°, pc = 200, and wc = 100). The noise-removed images are then processed as described in Section 3 to produce the masked images. The masked images are then compared with the design image (Figure 6) to produce the comparison images. Figure 10 shows four selected screen-prints of the comparison interface. As seen in Figure 10, the corresponding design image (see Figure 6) is compared with the masked images of the orange and green objects. The left-hand-side masked images correspond to threshold 40, whereas the right-hand-side images correspond to threshold 100.
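This sweep can be expressed compactly, reusing the hypothetical helper functions sketched in Section 3 (to_binary, remove_noise, mask_and_scale, compare_images); the file name and crop box are illustrative.

```python
# Sketch of the threshold sweep for one object.
crop_box = (100, 100, 1000, 1000)                 # illustrative mask rectangle
errors = {}
for threshold in range(10, 260, 10):              # thresholds 10, 20, ..., 250
    _, binary = to_binary("orange_object.png", 0.299, 0.587, 0.114, threshold)
    cleaned = remove_noise(binary, 0.0, 180.0, 200, 100)
    masked = mask_and_scale(cleaned, crop_box, design.shape[::-1])
    _, errors[threshold] = compare_images(design, masked)

best = min(errors, key=errors.get)                # threshold with minimal E
```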
Figure 11 shows the images of the orange object to be compared with the design image for thresholds 10, 20, …, 250. As shown in Figure 11, for thresholds 10 and 20, the images consist of white pixels only. With the increase in the threshold, the black pixels increase, and for thresholds 210 and above, the images consist of black pixels only. Figure 12 shows the comparison images corresponding to the images in Figure 11. As seen in Figure 12, yellow pixels decrease and violet pixels increase with the increase in the threshold.
On the other hand, Figure 13 shows the images of the green object to be compared with the design image for thresholds 10,20,…,250. As shown in Figure 13, from threshold 70, black pixels start to dominate the images, and at threshold 130 and above, the images consist of black pixels only. Figure 14 shows the comparison images corresponding to the images in Figure 13. As seen in Figure 14, yellow pixels are rare, and the violet pixels increase with the increase in the threshold.
The errors are calculated by Equation (5) using the comparison images shown in Figure 12 and Figure 14. The results are plotted in Figure 15. Figure 15(a) shows the overall trend in error, and Figure 15(b) shows the error trend for some selected thresholds. The green object's error is calculated for thresholds 70, 75, …, 95, whereas the orange object's error is calculated for thresholds 140, 145, …, 165. The comparison images for thresholds 75, 85, 95, 145, 155, and 165 are not shown in Figure 12 and Figure 14, though the others are shown. As seen in Figure 15, the orange object's error starts to decrease from threshold 30 and becomes minimal at around threshold 160. The minimal error is 2.161%. On the other hand, the error slowly decreases for the green object starting from threshold 10. This decreasing trend continues up to threshold 85. Afterward, the error increases sharply with the increase in the threshold. The minimal error here is slightly less than that of the orange object, namely 1.953%. Both objects exhibit the same error (around 50%) for very high thresholds. When the comparison image is either fully white or fully black, the expected error is about 50% because the design image consists of almost equal numbers of black and white pixels. The plot in Figure 15(a) also exhibits the same result: for the white images (the first two images in Figure 12), the error is about 50%; for the fully black images (the last few images in both Figure 12 and Figure 14), the error is about 50%. Thus, the image processing system presented in this study produces reliable results.
What if the noise removal conditions are varied at the optimal threshold? To answer this question, nine sets of noise removal conditions are considered for each object, as shown in Table 2. The conditions use a constant threshold equal to the optimal threshold for each object. The results are summarized in a plot, as shown in Figure 16. As seen in Figure 16, the error increases slightly when the noise orientation parameters are kept somewhat tight. When the noise orientation parameters are kept wider, the error remains minimal and is not affected by the noise-size parameters listed in Table 2.
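The nine conditions of Table 2 map directly onto the hypothetical helpers sketched in Section 3; for instance, a sketch for the orange object (threshold fixed at its optimal value, 160) could read:

```python
# Sketch of the Table 2 sweep: (theta_min, theta_max, p_c, w_c) per condition.
conditions = {
    "C1": (0, 45, 100, 50),  "C2": (0, 45, 200, 100),  "C3": (0, 45, 500, 250),
    "C4": (0, 90, 100, 50),  "C5": (0, 90, 200, 100),  "C6": (0, 90, 500, 250),
    "C7": (0, 180, 100, 50), "C8": (0, 180, 200, 100), "C9": (0, 180, 500, 250),
}
_, binary = to_binary("orange_object.png", 0.299, 0.587, 0.114, 160)
for name, (t_min, t_max, p_c, w_c) in conditions.items():
    cleaned = remove_noise(binary, t_min, t_max, p_c, w_c)
    masked = mask_and_scale(cleaned, crop_box, design.shape[::-1])
    _, e = compare_images(design, masked)
    print(f"{name}: E = {e:.3f} %")
```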
In summary, if the underlying 3D printing operations remain normal, approximately 3% error can be expected from the proposed image processing-based accuracy measurement method, provided that the optimal threshold for the given color, a wide noise-orientation range, and moderate noise sizes are selected. The threshold impacts the error estimation far more than the other parameters.

5. Accuracy of a 3D-Printed Porous Structure (Complex Object)

As mentioned before, 3D printing has revolutionized the process of fabricating complex objects such as topologically optimized structures and porous structures, paving the way for innovative solutions in various fields and industries [4,5,6,7]. Unlike simple objects, complex objects may consist of loose shells, thin walls, and other difficult-to-fabricate features, resulting in substantial inaccuracy [31,32]. Thus, measuring the accuracy of 3D-printed complex objects has become a critical issue. This section uses the presented image processing system to determine the accuracy of a complex object, i.e., a porous structure consisting of randomly sized and distributed pores.
Figure 17 shows the CAD model and the 3D-printed counterpart of the porous specimen used in this study. The rectangular structure consists of randomly sized and distributed pores all over it. It is designed using the system described in [5]. The outer dimensions of the structure are 30 mm × 30 mm × 30 mm. It is printed using the specifications shown in Table 1, except for the infill-related parameters: this time, the infill rate is kept at 100%, so the infill pattern and infill angle are not relevant here. The filament color is orange. The printing process is interrupted at heights of 1 mm, 5 mm, 10 mm, 15 mm, 20 mm, and 25 mm so that images of the printed cross-sections can be taken at these heights. The images are shown in Figure 18. As seen in Figure 18, the structure exhibits highly complex pores at each height, and the complexity differs from one height to another. The images shown in Figure 18 are processed to measure the accuracy of the porous specimen. The results are summarized in Figure 19. Figure 19(a) shows the original images of the specimen at heights of 1 mm, 5 mm, 10 mm, 15 mm, 20 mm, and 25 mm, which are already shown in Figure 18. These images are processed using the presented system. The optimal image processing settings for an orange-colored object, described in the previous section, are used to obtain the binary images for comparison. The resulting images of the printed specimen for comparison are shown in Figure 19(b). The CAD model (Figure 17) of the specimen is processed at heights of 1 mm, 5 mm, 10 mm, 15 mm, 20 mm, and 25 mm to produce the design images for comparison, as shown in Figure 19(c). The respective comparison images are shown in Figure 19(d). The errors exhibited by the comparison images are summarized in Table 3. This time, the minimal error is 3.85%, corresponding to a height of 1 mm. The maximal error is 10.89%, corresponding to a height of 10 mm. The average error is 8.695%, and the standard deviation is 2.803%. This means that the printing error increased roughly threefold compared to the simple design due to the complexity of the design. Note that for heights of 10 mm, 15 mm, and 20 mm, the noise on the boundaries could not be removed properly. This residual noise is the cause of the high error values at these heights. The error could have been much smaller if the noise had been removed properly.
What if the images of the printed specimen are obtained using a micro-CT scanning device? Exploring this possibility is worthwhile; accordingly, the following case study is conducted.
Figure 20(a) shows images of the cross-sections on the x-y, y-z, and z-x planes of the CAD model of the specimen. The image of the cross-section on the x-y plane corresponds to z = 15 mm. On the other hand, Figure 20(b) shows images of the cross-sections on the x-y, y-z, and z-x planes of the printed specimen (Figure 17). The cross-section image on the x-y plane again corresponds to z = 15 mm. These images are obtained using a micro-CT scanning device. The presented image processing system can process micro-CT scans of specimen cross-sections. Accordingly, Figure 21 shows the results. Figure 21(a) shows the images of the specimen at heights of 1 mm, 5 mm, 10 mm, 15 mm, 20 mm, and 25 mm, obtained by scanning the specimen (Figure 17) using a micro-CT scanning device. These images are processed using the presented system. Since color is not an issue here, the images are processed using the following settings: threshold = 100, θmin = 0°, θmax = 180°, pc = 40, and wc = 20. The resulting images of the printed specimen for comparison are shown in Figure 21(b). The CAD model (Figure 17) of the specimen is processed at heights of 1 mm, 5 mm, 10 mm, 15 mm, 20 mm, and 25 mm to produce the design images, as shown in Figure 21(c). The respective comparison images are shown in Figure 21(d). The errors exhibited by the comparison images are summarized in Table 4. This time, the minimal, maximal, and average errors are 2.18%, 4.06%, and 3.22%, respectively, and the standard deviation is 0.655%. The noise removal process works properly for all cross-section images, resulting in smaller errors.
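With the hypothetical helpers sketched in Section 3, the micro-CT settings stated above amount to the following; file names and the crop box are illustrative, and a single threshold serves every cross-section because color is irrelevant here.

```python
# Sketch of the micro-CT pipeline at the six interruption heights.
for z in (1, 5, 10, 15, 20, 25):                  # heights in mm
    design = design_image("porous_specimen.stl", z=float(z))
    _, binary = to_binary(f"microct_z{z}.png", 0.299, 0.587, 0.114, 100)
    cleaned = remove_noise(binary, 0.0, 180.0, 40, 20)
    masked = mask_and_scale(cleaned, crop_box, design.shape[::-1])
    _, e = compare_images(design, masked)
    print(f"z = {z} mm: E = {e:.2f} %")
```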
In summary, images obtained from micro-CT scans provide more realistic results for complex structures. For micro-CT scans, the color of the printed object does not matter; the same optimal image processing settings can be used regardless of the object's color.

6. Concluding Remarks

This article presents a custom-made image processing system. The system can generate binary images of a given CAD model. In addition, it can generate images of the CAD model’s 3D-printed counterparts. Moreover, it can compare these two types of images pixel by pixel to confirm whether or not the 3D-printed objects comply with the CAD model.
Using the system, a user can see how the accuracy varies due to the image processing settings, such as the grayscale-to-binary conversion threshold, noise reduction parameters, masking parameters, and pixel-fineness adjustment parameters.
It is found that the grayscale-to-binary conversion threshold affects the accuracy the most. In addition, the optimal threshold depends on the color of the 3D-printed object. Control over noise elimination during image processing (e.g., removing marks of the filaments on a given cross-section of a 3D-printed object) makes accuracy-checking more reliable.
The presented system can reliably measure the accuracy not only of 3D-printed objects with simple geometry but also of objects with complex geometry (porous structures with random pore size, distribution, and depth). This is confirmed by the case studies performed.
A simple object can exhibit an error of approximately 3%, even though visual inspection reveals that there is apparently no error in the 3D-printed object compared to its CAD model. A complex object can exhibit an error of approximately 10%, even though visual inspection reveals that the 3D-printed object has slight or no error compared to its CAD model; the error can decrease to approximately 2% when the presented system processes micro-CT scans and compares them with the images of the CAD model.
By leveraging the outcomes of this study, more pragmatic systems for the metrology of additive manufacturing can be developed.

References

  1. D. L. Bourell, D. W. Rosen, and M. C. Leu, The Roadmap for Additive Manufacturing and Its Impact, 3D Printing and Additive Manufacturing, Vol.1, No.1, pp. 6-9, 2014. [CrossRef]
  2. I. Gibson, D. Rosen, B. Stucker. Additive Manufacturing Technologies: 3D Printing, Rapid Prototyping, and Direct Digital Manufacturing, Springer, New York, NY, 2015.
  3. M. M. S. Ullah, Tashi, A. Kubo, K. H. Harib. Tutorials for Integrating 3D Printing in Engineering Curricula. Education Sciences. 2020; 10(8):194. [CrossRef]
  4. M. M. S. Ullah, D. M. D’Addona, K. H. Harib, T. Lin. Fractals and Additive Manufacturing, International Journal of Automation Technology, 2016, 10(2), pp. 222-230.
  5. M. M. S. Ullah, H. Kiuno, A. Kubo, D. M. D’Addona, A system for designing and 3D printing of porous structures, CIRP Annals, 69(1), 2020, pp. 113-116. [CrossRef]
  6. Y. Seto, A. M. M. S. Ullah, A. Kubo, D. M. D’Addona, R. Teti, On the Porous Structuring using Unit Cells, Procedia CIRP, 99, 2021, pp. 381-386. [CrossRef]
  7. S. Ullah, D. M. D’Addona, Y. Seto, S. Yonehara, A. Kubo. Utilizing Fractals for Modeling and 3D Printing of Porous Structures, Fractal and Fractional, 2021; 5(2):40. [CrossRef]
  8. Carew, R. M., Morgan, R. M. and Rando, C., A Preliminary Investigation into the Accuracy of 3D Modeling and 3D Printing in Forensic Anthropology Evidence Reconstruction, Journal of Forensic Sciences, Vol. 64, No. 2 (2019), pp. 342−352. [CrossRef]
  9. Edwards, J. and Rogers, T., The Accuracy and Applicability of 3D Modeling and Printing Blunt Force Cranial Injuries, Journal of Forensic Sciences, Vol. 63, No. 3 (2018), pp. 683−691. [CrossRef]
  10. Lee, K. Y., Cho, J. W., Chang, N. Y., Chae, J. M., Kang, K. H., Kim, S. C. and Cho, J. H., Accuracy of three-dimensional printing for manufacturing replica teeth, Korean Journal of Orthodontics, Vol. 45, No. 5 (2015), pp. 217−225. [CrossRef]
  11. Leng, S., McGee, K., Morris, J., Alexander, A., Kuhlmann, J., Vrieze, T., McCollough C. H., and Matsumoto, J. Anatomic modeling using 3D printing: quality assurance and optimization, 3D Printing in Medicine, Vol. 3 (2017) pp. 6. [CrossRef]
  12. George, E., Liacouras, P., Rybicki, F. J. and Mitsouras, D. Measuring and establishing the accuracy and reproducibility of 3D printed medical models, RadioGraphics, Vol. 37, No. 5 (2017), pp. 1424−1450. [CrossRef]
  13. Bortolotto, C., Eshja, E., Peroni, C., Orlandi, M. A., Bizzotto, N. and Poggi, P., 3D printing of CT dataset: validation of an open source and consumer-available workflow, Journal of Digital Imaging, Vol. 29, No. 1 (2016), pp.14−21. [CrossRef]
  14. Herpel, C., Tasaka, A., Higuchi, S., Finke, D., Kühle, R., Odaka, K., Rues, S., Lux, C. J., Yamashita, S., Rammelsberg, P. and Schwindling, F. S., Accuracy of 3D Printing Compared with Milling—a Multi-Center Analysis of Try-In Dentures, Journal of Dentistry, Vol. 110 (2021), pp. 103681. [CrossRef]
  15. Cai, T., Rybicki, F. J., Giannopoulos, A. A., Schultz, K., Kumamaru, K. K., Liacouras, P., Demehri, S., Small, K. M. and Mitsouras, D., The residual STL volume as a metric to evaluate accuracy and reproducibility of anatomic models for 3D printing: application in the validation of 3D-printable models of maxillofacial bone from reduced radiation dose CT images, 3D Printing in Medicine, Vol. 1, No. 1 (2015), pp. 2. [CrossRef]
  16. Kim, T., Lee, S., Kim, G.B., Hong, D., Kwon, J., Park, J.W. and Kim, N., Accuracy of a simplified 3D-printed implant surgical guide, The Journal of Prosthetic Dentistry, Vol. 124, No. 2 (2020), pp. 195−201. [CrossRef]
  17. Kwon, J., Kim, G.B., Kang, S., Byeon, Y., Sa, H.-S. and Kim, N., Accuracy of 3D printed guide for orbital implant, Rapid Prototyping Journal, Vol. 26, No. 8 (2020), pp. 1363−1370. [CrossRef]
  18. Yuan, F., Sun, Y., Zhang, L. and Sun, Y., Accuracy of chair-side fused-deposition modelling for dental applications, Rapid Prototyping Journal, Vol. 25 No. 5 (2019), pp. 857−863. [CrossRef]
  19. O. Borgue, M. Panarotto, and O. Isaksson, “Fuzzy model-based design for testing and qualification of additive manufacturing components,” Design Science, vol. 8, Art. no. e11, 2022.
  20. O. Holzmond and X. Li, “In situ real time defect detection of 3D printed parts,” Additive Manufacturing, vol. 17, pp. 135-142, 2017. [CrossRef]
  21. T. Li, J. Li, X. Ding, X. Sun, and T. Wu, “An error identification and compensation method for Cartesian 3D printer based on specially designed test artifact,” The International Journal of Advanced Manufacturing Technology, vol. 125, no. 9, pp. 4185-4199, 2023. [CrossRef]
  22. Z. Yu, X. Li, T. Zuo, Q. Wang, H. Wang, and Z. Liu, “High-accuracy DLP 3D printing of closed microfluidic channels based on a mask option strategy,” The International Journal of Advanced Manufacturing Technology, vol. 127, no. 7, pp. 4001-4012, 2023. [CrossRef]
  23. S. M. Montgomery, F. Demoly, K. Zhou, and H. J. Qi, “Pixel-Level Grayscale Manipulation to Improve Accuracy in Digital Light Processing 3D Printing,” Advanced Functional Materials, vol. 33, no. 17, pp. 2213252, 2023. [CrossRef]
  24. Y. Ma, J. Potappel, M. A. I. Schutyser, R. M. Boom, and L. Zhang, “Quantitative analysis of 3D food printing layer extrusion accuracy: Contextualizing automated image analysis with human evaluations: Quantifying 3D food printing accuracy,” Current Research in Food Science, vol. 6, pp. 100511, 2023.
  25. N. Otsu, “A Threshold Selection Method from Gray-Level Histograms,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 9, no. 1, pp. 62-66, 1979. [CrossRef]
  26. N. Vidakis, C. David, M. Petousis, D. Sagris, and N. Mountakis, “Optimization of key quality indicators in material extrusion 3D printing of acrylonitrile butadiene styrene: The impact of critical process control parameters on the surface roughness, dimensional accuracy, and porosity,” Materials Today Communications, vol. 34, pp. 105171, 2023. [CrossRef]
  27. P. E. Eltes, L. Kiss, M. Bartos, Z. M. Gyorgy, T. Csakany, F. Bereczki, V. Lesko, M. Puhl, P. P. Varga, and A. Lazary, “Geometrical accuracy evaluation of an affordable 3D printing technology for spine physical models,” Journal of Clinical Neuroscience, vol. 72, pp. 438-446, 2020. [CrossRef]
  28. P. Nguyen, I. Stanislaus, C. McGahon, K. Pattabathula, S. Bryant, N. Pinto, J. Jenkins, and C. Meinert, “Quality assurance in 3D-printing: A dimensional accuracy study of patient-specific 3D-printed vascular anatomical models,” Frontiers in Medical Technology, vol. 5, 2023-February-07, 2023. [CrossRef]
  29. H. Huang, C. Xiang, C. Zeng, H. Ouyang, K. K. Wong, and W. Huang, Patient-specific geometrical modeling of orthopedic structures with high efficiency and accuracy for finite element modeling and 3D printing, Australasian Physical and Engineering Sciences in Medicine, vol. 38, no. 4, pp. 743-53, Dec, 2015. [CrossRef]
  30. M. Xia, B. Nematollahi, and J. Sanjayan, “Shape Accuracy Evaluation of Geopolymer Specimens Made Using Particle-Bed 3D Printing,” in Second RILEM International Conference on Concrete and Digital Fabrication, Cham, F. P. Bos, S. S. Lucas, R. J. M. Wolfs, and T. A. M. Salet, Eds., 2020: Springer International Publishing, pp. 1011-1019.
  31. S. Yonehara and A.S. Ullah, Elucidating accuracy of 3D printed porous structure by analyzing images of designed and printed structures, Proceedings of the International Conference on Design and Concurrent Engineering 2021 & Manufacturing Systems Conference 2021 (iDECON/MS2021), JSME, September 3-4, 2021, Virtual, Paper ID: 17.
  32. T. Okamoto, S. Yonehara, S. Ura and A.K. Ghosh, A Metrological System for 3D-Printed Objects, Proceedings of the 38th American Society for Precision Engineering Annual Meeting, Boston, Massachusetts, USA, November 12-17, 2023.
Figure 1. Systems for ensuring quality of 3D-printed objects.
Figure 2. The working principle of the proposed image processing system.
Figure 3. Grayscale-binary interface.
Figure 4. Noise-removal interface.
Figure 5. Masking-scaling interface.
Figure 6. Target interface.
Figure 7. Comparison interface.
Figure 8. Effect of threshold.
Figure 9. Effect of the settings of the parameters related to noise removal.
Figure 10. Screen-prints of the selected comparison interface.
Figure 11. Images of the orange object for comparison. The integers in “()” are the thresholds.
Figure 12. Comparison images of the orange object. The integers in “()” are the thresholds.
Figure 13. Images of the green object for comparison. The integers in “()” are the thresholds.
Figure 14. Comparison images of the green object. The integers in “()” are the thresholds.
Figure 15. Error estimation for the orange and green objects.
Figure 16. Variability in the error due to different noise removal settings.
Figure 17. CAD model and 3D-printed counterpart of the porous specimen.
Figure 18. Original images of the porous specimen at different heights.
Figure 19. Different images of the porous structure relevant to accuracy assessment.
Figure 20. Images of the porous structure at different orthogonal planes.
Figure 21. Accuracy assessment using micro-CT scan-driven images of the specimen.
Table 1. Printing conditions.
Material: Thermoplastic filament made of Poly-Lactic Acid (PLA)
Printing technology: Fused Filament Fabrication (FFF)
Extrusion width [mm]: 0.4
Extruder temperature [℃]: 205
Printing speed [mm/s]: 50.0
Infill speed [mm/s]: 80.0
Layer height [mm]: 0.25
Infill density [%]: 15
Infill pattern: Grid
Infill angles [°]: 45, 135
Printer: Raise3D Pro2™
Table 2. Nine sets of conditions for the error analysis.
Condition | θmin [°] | θmax [°] | pc [pixels] | wc [pixels]
C1 | 0 | 45 | 100 | 50
C2 | 0 | 45 | 200 | 100
C3 | 0 | 45 | 500 | 250
C4 | 0 | 90 | 100 | 50
C5 | 0 | 90 | 200 | 100
C6 | 0 | 90 | 500 | 250
C7 | 0 | 180 | 100 | 50
C8 | 0 | 180 | 200 | 100
C9 | 0 | 180 | 500 | 250
Threshold (all conditions): 160 for the orange object; 85 for the green object.
Table 3. Error in the selected layers of the porous structure.
Height [mm] 1 5 10 15 20 25
E [%] 3.85 9.29 10.89 10.50 10.74 6.90
Table 4. Micro-CT driven accuracy estimation at different layers of the porous structure.
Height [mm] 1 5 10 15 20 25
E [%] 2.86 3.55 4.06 3.55 2.18 3.12
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.