1. Introduction
In today’s textile industry, the pieces of leather are manually classified after being cut. This can be done by type of piece, grouping together those that are the same, or by product, grouping together the pieces needed to create a specific product such as footwear or accessories. This process is known as kitting [
1,
2,
3]. Kitting is a strategy that involves assembling all the parts of a product into a package, also known as a kit. This reduces the time spent on final product assembly and increases productivity.
Figure 1 illustrates the process of extracting cut parts from a piece of leather.
Figure 1a shows the leather with the cut pieces, which are placed with random orientations to make the most of the leather. The cut is very thin, and this will be a problem when it comes to locating the pieces, because they cannot be identified using computer vision alone. In contrast,
Figure 1b illustrates the extracted pieces sorted into piles of the same type, leaving an empty leather skin that can be recycled.
1.1. Industry automation
The handling of textile pieces, traditionally carried out manually, is undergoing a high level of automation, optimizing systems and increasing the performance of the processes. To automate a production plant, an optimized design must first be created for the incorporation of automated lines and robots [
4]. In this line, robots are being incorporated into production lines to automate industrial processes [
5,
6]. According to a study carried out in [
7], the implementation of robots in sewing and assembly tasks in the textile industry has led to substantial improvements in production accuracy and speed. The research highlights that robotic systems can operate 24 hours a day, 7 days a week, allowing for continuous production and a significant reduction in production times. Robotics has also positively impacted the quality of the final product. A research article [
8] notes that computer-aided design (CAD)-based sewing robots have demonstrated high accuracy in garment manufacturing, which has led to a significant decrease in material waste and a reduction in associated costs. In terms of inventory management efficiency, one study [
9] shows how robotics and automation have enabled real-time monitoring of stock levels through IoT-based sensors. This has optimized inventory management, allowing companies to adjust their production according to demand and reduce warehousing costs. A review article [
10] highlights that robotics in the textile industry has also had a significant impact on sustainability, as automation allows for a more efficient use of resources, reducing the consumption of energy and raw materials.
Robotics, in particular the application of pick-and-place systems, has revolutionized the textile industry by providing highly efficient solutions in the handling and placement of materials in the production process. These systems allow the automation of material handling tasks and are essential in the manufacture of textile products. A recent study [
11] has highlighted the impact of pick-and-place systems in the textile industry. These systems, often equipped with advanced sensors and computer vision technologies, are able to pick up textile components, such as fabric cuts or clothing pieces, quickly and accurately. They then place these elements in the right position, ready for the next stage of the manufacturing process. This automation has significantly reduced production times, minimizing human error and improving the quality of finished garments. The application of pick-and-place robots has also allowed greater flexibility in the production of customized textile products. Instead of using rigid assembly lines, textile companies can quickly adapt production to market demands and fashion trends, which has boosted agility in the industry. In addition, automation of repetitive tasks has freed workers to focus on more creative and value-added tasks, which has increased job satisfaction and innovation in the industry. Overall, the application of pick-and-place systems in the textile industry is a remarkable example of how robotics has transformed textile production, improving efficiency, quality, and flexibility in product manufacturing. These advances have been critical to the continued growth and competitiveness of the industry. The findings in [
11] and other similar research underscore the importance of this technology in the contemporary textile industry. It is also relevant to mention the studies [
12,
13] which highlights time optimization in pick-and-place applications in footwear production. Another study [
14] was conducted to optimize the execution of simultaneous tasks by multiple robots in a shared space. The research aimed to minimize total time and avoid collisions by using Markov Decision Processes (MDP).
1.2. Handling of flexible pieces in the industry
In the field of textile manufacturing, the pick-and-place process of the cut pieces after production is a fundamental task that is not very automated, due to the complexity of handling textile pieces. This complexity lies in the fact that the deformation of the textile pieces can cause failures in the picking and placing of the pieces, as well as in their transport. Several lines of research address the different problems that arise. In [
15] the authors develop a robotic gripper with soft tips to simulate the human grip. The main objective is to achieve the stability and robustness of a human-like grip with a simple and cost-effective gripper. In [
16], the authors present a robotic system for performing pick-and-place operations with deformable objects. The system uses a structured light scanner to capture a point cloud of the object to be grasped. In [
16], a parallel gripper equipped with a strap embedded in a blade is developed for passive pulling operations. This traction mechanism is effective for picking up thin, flexible objects one at a time. The smooth surface of the belt embedded in the blade provides great adaptability to the shape of the object; therefore, many types of workpieces can be picked up. The authors of [
17] present a comprehensive review whose aim is to highlight the challenges associated with the automated handling of these materials and to analyse the main design principles that have been employed in pick-and-place systems in terms of handling strategy, reconfigurability, gripping technology, distribution of the gripping points, etc.
1.3. Actuators in the leather and textile fashion industry
In the textile industry, grippers play a crucial role in automating tasks such as the handling and assembly of fabrics and garments. These devices enable robots to grip and move textile material accurately and efficiently. In recent years, there have been significant advances in the development of grippers specifically designed for the textile industry. Below are some of the key aspects of grippers used in textile automation. In [
18] a brief review of the different gripper categories is presented. The objective of this article is to provide a brief informative overview of the different classifications, as proper gripper selection plays a vital role in the efficiency and performance of the robotic manipulator. In [
19] the authors provide a detailed overview of the current status of robotic grippers, gripping and sensor-based control methods.
A widely used actuator in industry is the pneumatic actuator. In [
20] the authors present a four-finger pneumatic soft gripper with two sizes and four gripping modes. A fibre-reinforced bending actuator is used to form each soft gripper finger. In this line, hydraulic actuators are another alternative. In [
21] a new hydraulic robotic gripper called WLRG-I (the first generation of wheeled robotic grippers) is introduced, which can handle large loads and is very robust. The gripper can grip objects weighing 20 kg with a dead weight of 2.15 kg. To the authors’ knowledge, robotic grippers with such a high load-to-weight ratio are very rare. Finally, one of the most commonly used solutions for textile materials is suction cups. In [
22] an experiment-based modelling method is introduced that considers the dynamic deformation behaviour of vacuum grippers in interaction with the specific gripper-object combination. In [
23] a new gripper, called POLYPUS, is introduced, which is characterized by under-actuation and vacuum gripping for handling irregular and uniform objects of different materials, such as cardboard, glass, sheet metal and plastic. It offers a unique ability to lift loads ranging from light to heavy objects, whereas most existing grippers are tailored to a specific payload. Being of modular design, POLYPUS can be easily reconfigured for a wide range of object sizes and applications.
The porosity of textile materials presents significant challenges in the context of automation, particularly in robotic handling operations such as picking. Porosity affects the way a material can be gripped, as it alters the contact surface and friction between the material and the robot end-effector. The textile industry commonly does not use vacuum cups (
Figure 2a) for handling due to textile porosity. The technology used to allow this manipulation is the needle gripper (
Figure 2b), which uses a needle mechanism to pick up the piece for manipulation. The main problem with these grippers is that they can damage the grain side of leather materials, so the gripping technology must be adapted to the material being handled.
In this context, due to the inherent complexity of handling flexible or deformable parts, this research presents a different approach, based on gripping each piece at a number of points so that the robot can manage it as a rigid element. For this purpose, a tool based on an array of gripping points has been developed. Due to the number and distribution of these gripping points on the tool, the handled pieces can be considered de facto rigid, and no deformation occurs during transport. This conditions the design of the tool, which must have a large number of gripping points, and also its control: since the grip of each piece depends on its shape, which modifies the position and orientation, the position of the tool has to be recalculated in each case to maximize the number of gripping points.
The article is structured as follows: In
Section 2, the solutions adopted to the problem of cutting and handling textile pieces will be presented. It will begin with a brief description of the Nesting process (2.1) and then address the main problem by adopting a suction cups array actuator as a solution (2.2). Then, the Contour Scanning (2.3) and Computer vision system (2.4) processes will be detailed. In
Section 3, the results in simulation and in a real test will be presented. Finally, the conclusions will be presented in
Section 4.
2. Textile Cutting and Handling System
In the process of cutting and subsequently handling textile pieces, different tasks are performed. In this context, this research addresses the problem of handling textile pieces in order to improve the pick-and-place process in the industry.
Firstly, the problem of placing the flexible pieces is addressed. To this end, a suction cups array actuator has been developed to perform the task as efficiently as possible. This actuator incorporates an algorithmic process that analyses each piece and positions the tool, by means of rotation and translation, so as to maximize the number of suction cups (or needle actuators) that pick up the part. Because the pieces cannot be detected by artificial vision after the cutting process, a program has been developed that, using the CAD of the nesting and the detection of the sheet’s outer contour, sends the robot the position and orientation of each piece so it can be picked up.
2.1. The Nesting process and pieces location
Nesting [
24,
25,
26], in textile manufacturing is a key strategy that goes beyond the simple arrangement of cut patterns on a material. Its key point lies in optimizing the placement of pattern pieces on a roll of fabric or sheet of material in order to minimize the associated costs while simultaneously minimizing the waste generated during the cutting process. In the production of garments and other textile products, the effective application of nesting algorithms has become an essential pillar to achieve optimal levels of efficiency. These algorithms, developed based on optimization principles and heuristic algorithms, play a crucial role in determining the most efficient arrangement of pattern pieces, thus ensuring that the least possible amount of material is used.
Figure 3 shows the Nesting design in CAD format and once cut on the textile material. As can be seen, the coordinate system of each piece and that of the leather sheet in the CAD differ from that of the image taken by vision, so a coordinate system conversion must be performed.
The identification and location of pieces within the original sheet (see
Figure 3) cannot be performed solely through computer vision, due to two very relevant factors. On the one hand, the cut is usually so fine that the silhouette of the cut piece cannot be distinguished. On the other hand, the pieces have the same colour as the background (the same material from which they are cut), generating no contrast between the pieces and their surroundings. This makes it very difficult to locate the pieces by vision alone. However, in this work, the CAD file used by the CNC cutting machine to cut the pieces is available. This CAD file can be opened and read from a Python program to obtain the position and orientation of the pieces with respect to the CAD coordinate system (SCcad).
Through computer vision, the outer contour of the entire original sheet can be located, so when the correspondence between the CAD contour and the contour acquired by vision (CTv) is found, the positions and orientations of the cut elements are known precisely and they can be picked up by the robot tool. This concept is the main contribution of the project: it has made it possible to combine CAD information with computer vision to obtain the position and orientation of elements that cannot be recognized by processing the image alone.
When performing rotation and translation transformations, as shown in
Figure 3, a series of formulas are used to transform the coordinate systems. Rigid transformations are movements in space that preserve the shape and size of an object and are essential for understanding how objects move. These transformations consist of rotations and translations, which can be mathematically represented as a pair formed by a rotation matrix (1), the 2×2 matrix R(θ) = [cos θ, −sin θ; sin θ, cos θ], and a translation vector t, so that a point p is transformed as p′ = R(θ)p + t.
The multiplication of these transformations follows a specific rule that is not commutative, so the final result is affected by the order of operations. The inverse transformation allows for the calculation of reverse trajectories, which is essential in robotics and motion planning. To simplify the composition of multiple transformations, a 3×3 homogeneous matrix representation (2), T = [R t; 0 1], is used that integrates both rotations and translations for calculation purposes.
This representation can be equally adapted for two-dimensional vectors by converting them to three-dimensional homogeneous vectors, which allows for their inclusion in the unified mathematical framework. For example, a two-dimensional vector v = (x, y) can be converted to the three-dimensional vector (x, y, 1) (3).
These equations facilitate the conversion of coordinate systems between the CAD and the vision system, enabling the corresponding pick of each part.
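As an illustration of equations (1)–(3), a minimal NumPy sketch of the 2D rigid transformations is given below; the angle and offset values are illustrative, not taken from the real station:

```python
import numpy as np

def rigid_transform(theta, tx, ty):
    """3x3 homogeneous matrix combining the 2x2 rotation of (1)
    with a translation vector t, as in the representation (2)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

def apply(T, point):
    """Convert a 2D point (x, y) to its homogeneous form (x, y, 1),
    as in (3), apply T, and return the 2D result."""
    x, y, _ = T @ np.array([point[0], point[1], 1.0])
    return np.array([x, y])

# Composition is not commutative: the order of the factors matters.
T_cad_to_vision = rigid_transform(np.pi / 2, 10.0, 5.0)  # illustrative values
p_vision = apply(T_cad_to_vision, (1.0, 0.0))

# The inverse transformation recovers CAD coordinates from vision ones.
T_inv = np.linalg.inv(T_cad_to_vision)
p_cad = apply(T_inv, p_vision)
```

The same 3×3 matrices can be chained with ordinary matrix products to go from SCcad to the camera frame and then to the robot frame.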
Figure 3 displays the detected part outline after converting the coordinate systems between the CAD and the images captured by the camera and processed by the artificial vision system.
2.2. The Actuator: suction cups array actuator
The pick-and-place process, essential in automated manufacturing environments, demands a specialized tool capable of performing the task quickly and efficiently. In this context, suction cups emerge as an outstanding choice to perform these tasks with precision and speed. In this project, a suction cups array actuator has been designed that manipulates the piece preventing it from deforming during transport, allowing it to be considered as rigid during the handling process.
This suction cups array actuator is composed of a matrix of 16 suction cups arranged in a non-symmetrical and non-homogeneous way in order to be able to perform different picks. This design translates into a highly versatile tool adaptable to different sizes and shapes of parts, providing different picks depending on the orientation of the part. The integration of this suction cups array actuator not only simplifies the task, but also contributes to the overall optimization of the pick-and-place process. The combination of speed, accuracy and simultaneous handling capability makes this tool an advantageous component for continuous improvement in the automation of piece handling in industrial environments.
Figure 4 illustrates part of the tool design process using SolidWorks. The implementation shown corresponds to the vacuum gripper, while the needle gripper implementation is equivalent.
Figure 5 presents the exploded view of the components of the same implementation, providing greater detail and several zoom views of the most representative elements.
2.2.1. Implementation of a suction cups array actuator: Hardware
This research has tested a vacuum actuator designed to manipulate flexible objects. The distribution of the gripping points allows a uniform distribution of the grabbing forces, minimizing the risk of slippage or detachment during the manipulation task. This compact and lightweight design allows it to be used with robots that have a low end-effector payload. In this case, a Universal Robots UR5 has been used, so the weight of the tool must be below 5 kg. In any case, the less the tool weighs, the faster it can be moved by the robot without exceeding the robot’s safety limits.
Figure 6a shows the irregular distribution of the suction cups used for the tool and
Figure 6b shows the final weight of the tool.
The sixteen suction cups are distributed on a light frame, as shown in
Figure 6a. To optimize object manipulation, the distance of each suction cup from the centre of coordinates of the frame has been determined, considering as a reference the junction with the robot end effector. This information is essential to work with the matrix and ensure optimal precision in the manipulation operations.
Figure 7 provides a clear visual representation of the spatial arrangement of the suction cups in relation to the centre of the structure, as well as the plan, elevation, and profile of the actuator. The actuator gripper is protected at its edges by a padded liner firmly attached to the tool frame. A quick changer from OnRobot [
27], capable of changing tools in less than 5 seconds, is connected to the suction cup tool. Each of the sixteen suction cups is connected via tubing to a 5-way solenoid valve. The choice of these solenoid valves, specifically of the JSY1000 series [
28], is based on their ability to significantly reduce size thanks to their high flow rate, thus contributing to efficient space management and reducing the overall weight of the system.
An EX260 system, which operates via a fieldbus, has been used to control the suction cups. This communications link allows synchronization and control of gripping and handling operations. In terms of the control interface, the suction cups array actuator is controlled by a computer running programs in Python, as this is the language in which the algorithm has been developed.
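The exact fieldbus driver used to address the EX260 valve unit is not detailed here, so the sketch below only shows how the state of the sixteen solenoid valves could be packed into one output word; `write_output_word` is a hypothetical stand-in for the actual fieldbus write:

```python
def cups_to_word(active_cups):
    """Pack the indices of the active suction cups (0-15) into a 16-bit
    output word, one bit per solenoid valve of the manifold."""
    word = 0
    for cup in active_cups:
        if not 0 <= cup <= 15:
            raise ValueError(f"invalid suction cup index: {cup}")
        word |= 1 << cup
    return word

def write_output_word(word):
    """Hypothetical stand-in for the fieldbus write to the valve unit;
    the real call depends on the chosen fieldbus protocol and driver."""
    print(f"valve outputs <- {word:016b}")

# Illustrative grip using cups 0, 3 and 15.
write_output_word(cups_to_word([0, 3, 15]))
```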
2.2.2. Optimized use of the tool: Software
The actuator is designed to pick pieces of different shapes and sizes by varying the orientation of the tool. To support this task, software has been developed that optimizes the number of suction cups per part, i.e., it calculates the orientation the suction cups array actuator should have in order to pick up the piece with the maximum number of suction cups, thus improving the pick-and-place task, as shown in
Figure 8.
The maximization of the grip points of each piece has been implemented as a Python function that returns the position and orientation with which the end effector must be corrected to perform the grip. This function has been generated using the Visual Studio Code editor, and the packages necessary to achieve the required functionality have been added as development progressed. The packages used to create the program in Python were:
ezdxf: Python interface to the DXF format, developed by Autodesk. Allows developers to read and modify existing DXF documents or create new DXF documents [
29].
math: Module that provides access to mathematical functions defined by the C standard. The functions provided are representation and number theory functions, powers and logarithmic functions, trigonometric functions, hyperbolic functions, special functions, and constants [
30].
matplotlib: Complete library for creating static, animated and interactive visualizations in Python [
31].
NumPy: Library that defines a data type representing multidimensional arrays, together with basic functions to operate on them. It is a stable and fast library [
32].
shapely: Python package used for set-theoretic analysis and manipulation of planar features using GEOS library functions [
33].
The development of the program has been structured in three steps. The first step deals with the preparation of the drawings for processing in the Python code, using AutoCAD. Obtaining the silhouette of the piece is essential, and a specific procedure has been followed to guarantee its appropriate interpretation in the Python code: the contour is represented exclusively with straight lines connected in the required order, excluding the use of circles, semicircles, or splines.
The second step of the program presents the various functions developed for the code, dealing with the manipulation and analysis of the matrices associated with the pieces obtained. Each function has been designed with the purpose of maximizing the number of suction cups coinciding with the part.
Finally, the third step shows the “Driver Code”, which coordinates and executes the previously defined functions. The process operates on a DXF file created in AutoCAD containing the piece to be picked. The program processes the file and displays the points that make up the resulting polygon. In addition, the centre of the polygon is calculated using the “centroid” function. This systematic and detailed approach facilitates obtaining the matrices with the maximum number of suction cups activated inside the part, thus contributing to the efficiency of the system in the manipulation of objects.
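In the actual program, the centroid and the cup-inside-piece tests are provided by shapely; the following stdlib-only sketch shows the underlying geometry (shoelace centroid and ray-casting point-in-polygon) for a piece represented, as in the DXF, by an ordered list of straight-line vertices:

```python
def polygon_centroid(vertices):
    """Shoelace-formula centroid of a simple polygon given as (x, y) tuples."""
    area2 = cx = cy = 0.0
    n = len(vertices)
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        area2 += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    return cx / (3.0 * area2), cy / (3.0 * area2)

def point_in_polygon(point, vertices):
    """Ray-casting test: count edge crossings of a horizontal ray."""
    x, y = point
    inside = False
    n = len(vertices)
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        if (y0 > y) != (y1 > y):
            x_cross = x0 + (y - y0) * (x1 - x0) / (y1 - y0)
            if x < x_cross:
                inside = not inside
    return inside

# Illustrative rectangular piece, 40 x 20 mm.
piece = [(0, 0), (40, 0), (40, 20), (0, 20)]
centre = polygon_centroid(piece)
```

In the real program these two primitives correspond to shapely's `centroid` and containment predicates applied to the polygon read from the DXF.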
The centre of the piece is shown in the graph by a star. The suction cups array actuator has been moved to the centre of the part, with the suction cups represented by a cross inside a circle (
Figure 9).
Next, the first step of the displacement, known as translation, is carried out. In this phase, the translation of the actuator within the space occupied by the part is optimised in order to determine the suction cups to be used, maximising their number. This process is shown in
Figure 10.
Once the first step is completed, the analysis and calculation of the matrices with the highest number of active suction cups inside the piece is carried out, as well as the identification of the activated suction cups in each matrix. In addition, the number of matrices that meet the requirement of the maximum number of active suction cups, set at 2 in the example shown, is determined, and the coordinates used to move the matrix are recorded. When the translation phase is completed, the rotation phase is carried out using the matrices obtained in order to maximise the number of active suction cups inside the leather piece. These matrices are rotated around their axis, thus generating new configurations (see
Figure 11 for more information on the process). After calculating the rotation matrices with the highest number of active suction cups inside the part, the active suction cups in each matrix, the total number of selected matrices, and the coordinates associated with each configuration are recorded. In this case, the maximum number of active suction cups is kept at 2, both in the rotation and translation phases. This increase in the number of candidate matrices offers a variety of options when selecting the optimal position for piece handling.
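A simplified sketch of this two-phase search is shown below; the cup layout, step sizes, and angular resolution are illustrative assumptions, not the real tool parameters:

```python
import math

def inside(p, poly):
    """Ray-casting point-in-polygon test."""
    x, y = p
    hit = False
    n = len(poly)
    for i in range(n):
        (x0, y0), (x1, y1) = poly[i], poly[(i + 1) % n]
        if (y0 > y) != (y1 > y) and x < x0 + (y - y0) * (x1 - x0) / (y1 - y0):
            hit = not hit
    return hit

# Illustrative cup layout (offsets from the tool frame origin, in mm);
# the real tool uses the 16 irregularly placed cups of Figure 6a.
CUPS = [(-15, 0), (0, 0), (15, 0), (0, 15)]

def active_cups(poly, dx, dy, theta):
    """Indices of the cups that land inside the piece for a given tool pose."""
    c, s = math.cos(theta), math.sin(theta)
    placed = [(dx + c * ox - s * oy, dy + s * ox + c * oy) for ox, oy in CUPS]
    return [i for i, p in enumerate(placed) if inside(p, poly)]

def best_pose(poly, centre, steps=5, step_mm=5.0, angles=8):
    """Two-phase search: translate around the piece centre, then rotate,
    keeping the pose that maximizes the number of active cups."""
    best = (-1, None)
    for k in range(angles):
        theta = 2 * math.pi * k / angles
        for i in range(-steps, steps + 1):
            for j in range(-steps, steps + 1):
                dx, dy = centre[0] + i * step_mm, centre[1] + j * step_mm
                n = len(active_cups(poly, dx, dy, theta))
                if n > best[0]:
                    best = (n, (dx, dy, theta))
    return best

piece = [(0, 0), (60, 0), (60, 40), (0, 40)]   # illustrative rectangle
n_cups, pose = best_pose(piece, (30.0, 20.0))
```

The exhaustive grid over translations and rotations mirrors the matrices generated in the translation and rotation phases; in practice the candidate poses with the maximum cup count are all recorded, as described above.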
Figure 12 shows a flowchart of the program developed to carry out the process presented. Software tests have been carried out for different sizes and dimensions of pieces as shown in
Figure 13, with satisfactory results.
2.3. Contour scanning
Once the Nesting is done on the textile material, in the computer aided manufacturing process, obtaining data from the pieces to be collected is fundamental; for this, AutoCAD is used as the main tool. In addition to extracting data from the pieces according to the model sent for cutting on the CNC machine, it is also necessary to obtain detailed information on the uncut base, thus providing a complete set of data for the production process. For the extraction of piece data, the AutoCAD EXTRACDAT command is used. This command allows specific selection of the required data from the templates, thus simplifying access to them from a programming environment such as Python. The data extracted includes the template model, essential for classification, as well as the X and Y coordinates and rotation of each of the templates. The process to obtain this data starts with the use of the EXTRACDAT command in AutoCAD, generating a structured text file that stores the selected information. This data is essential for the classification and organization of the production process.
They also facilitate the traceability of each piece throughout the manufacturing chain. Once the data has been obtained from AutoCAD, the next step involves the manipulation of the cropped image centred on the part. Transformations are applied to identify the contour of the part. This contour, once obtained, provides valuable information, including the centre of the part. Correlation of this centre with that obtained during AutoCAD data extraction establishes a coordinate system to facilitate subsequent piece processing. In addition, the eigenvectors of the contour are obtained. These vectors, which coincide with the axes of inertia of the figure, are used to extract the orientation and shape of the part.
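The exact layout of the file generated by EXTRACDAT depends on the template configured in AutoCAD; assuming a hypothetical comma-separated export with one row per template (model, X, Y, rotation), the data could be read from Python as follows:

```python
import csv
import io

def read_extracdat(text):
    """Parse a hypothetical EXTRACDAT export: one row per template with
    the model name, X/Y coordinates (mm) and rotation (degrees)."""
    pieces = []
    reader = csv.DictReader(io.StringIO(text))
    for row in reader:
        pieces.append({
            "model": row["Model"],
            "x": float(row["X"]),
            "y": float(row["Y"]),
            "rot": float(row["Rotation"]),
        })
    return pieces

# Illustrative export with two templates of the same model.
sample = """Model,X,Y,Rotation
A-12,102.5,48.0,90.0
A-12,240.0,48.0,270.0
"""
pieces = read_extracdat(sample)
```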
2.4. Computer vision system
After the textile or leather sheet has undergone the cutting process, the coordinates of each piece cannot be extracted by vision, as mentioned above. The complexity lies in the fact that the cut is so fine that it is impossible to differentiate each piece by vision. Therefore, it is necessary to resort to the information contained in the CAD file of the Nesting. The coordinates and rotation angles of the individual pieces can be extracted from the CAD file. First of all, the CAD and the image taken by vision must be matched, because the centre of coordinates of the CAD does not have to coincide with the centre of coordinates of the vision captures. Once the translation and rotation between the CAD and the vision capture are known, the rotation-translation matrix is applied to locate the pieces in the robot’s workspace.
The objective at this stage is to locate the textile piece on the conveyor belt using machine vision. Once the position of the sheet is known, the data can be extracted from AutoCAD and the transformations described in
Section 2.1 can be applied. This procedure allows correcting small positioning errors that may occur due to the movement of the sheet on the conveyor belt. At the same time, the correspondence between the AutoCAD, camera, and robot coordinate systems is calculated. The algorithm starts by reading the image taken during the simulation. This image is cropped so that all pixels around the piece that are not of interest, such as the sides of the belt, are removed, centring the image on the textile piece from which information must be extracted. Once the image is cropped and centred on the piece, several morphological transformations are performed to detect the contour of the piece. From this contour, its centre is obtained, with which a first relation between the coordinate system and the centre obtained in the AutoCAD data extraction can be established. In addition, the eigenvectors of the contour can be obtained, which coincide with the axes of inertia of the figure that are also extracted from AutoCAD.
With the above-mentioned data, it is possible to calculate whether the piece has been rotated with respect to AutoCAD. By having the eigenvectors of the axes both from AutoCAD and from the outline extracted from the image, the axes can be drawn on the image and the rotation and translation between the coordinate systems can be calculated. The outline of the part, its centre, and its axes have been drawn on the cropped image. In this case, the rotation is 0 degrees, so the piece has not changed its orientation during the movement of the belts.
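The eigenvector step can be sketched with NumPy as follows; the contour here is an illustrative rectangle rather than a real detected contour, and the real pipeline obtains the points from the morphological processing described above:

```python
import numpy as np

def contour_axes(points):
    """Centre and principal axes (eigenvectors of the covariance matrix)
    of a set of 2D contour points; the axes coincide with the axes of
    inertia that are compared against the AutoCAD data."""
    pts = np.asarray(points, dtype=float)
    centre = pts.mean(axis=0)
    cov = np.cov((pts - centre).T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    major = eigvecs[:, np.argmax(eigvals)]   # axis of largest spread
    angle = np.degrees(np.arctan2(major[1], major[0]))
    return centre, major, angle

# Illustrative contour: corners of an axis-aligned 60 x 20 rectangle.
contour = [(0, 0), (60, 0), (60, 20), (0, 20)]
centre, major, angle = contour_axes(contour)
```

Comparing this angle against the one obtained from the AutoCAD axes yields the rotation between the two coordinate systems.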
3. Results
3.1. Simulation Results
This section describes the simulation process of the workstation with representative images.
Figure 14 shows the complete station before starting the simulation. The sequence starts with the introduction of the uncut leather into the CNC for processing according to the specified cutting pattern. After cutting, the cut pieces move along the rolling mat until they reach the collection area, where an image of the cut base is captured. Once the capture is done, the Python algorithm is started. The first step involves obtaining data, both for each individual piece and the uncut base, directly from AutoCAD. Subsequently, all piece data is recalculated to establish relationships with the data obtained from the captured image. After this recalculation process, the phase of sending data from Python to RobotStudio begins. It is imperative that the robot has been positioned in the waiting area for the start of the collection before this step. From RobotStudio, the signal is sent to initiate the socket communication and start sending data.
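The message format exchanged between Python and RobotStudio is not specified in the text; assuming a hypothetical semicolon-separated layout, the sending side could be sketched as follows (`send_pieces` is not executed here, since it needs a listening controller):

```python
import socket

def format_piece_message(piece_id, x, y, rot):
    """Hypothetical wire format: 'id;x;y;rot' with millimetres and degrees.
    The actual layout agreed with the RAPID program may differ."""
    return f"{piece_id};{x:.2f};{y:.2f};{rot:.2f}"

def send_pieces(host, port, pieces):
    """Open the socket to the robot controller and send one message per
    piece, waiting for an acknowledgement before sending the next one."""
    with socket.create_connection((host, port)) as sock:
        for pid, (x, y, rot) in enumerate(pieces):
            sock.sendall((format_piece_message(pid, x, y, rot) + "\n").encode())
            sock.recv(64)   # wait for the controller's acknowledgement

msg = format_piece_message(3, 102.5, 48.0, 90.0)
```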
Figure 14 shows the robot in a waiting position to receive the first data. The robot executes the pick-and-place task by picking up each piece and depositing it in a pile together with other pieces of the same cutting pattern. This strategy ensures a proper sorting in the case of having several cut models. This simulation process integrates the interaction between the CNC machine, the Python algorithm, and the robot in RobotStudio, showing a pick-and-place operation and the correct sorting of the collected parts.
Figure 14 illustrates the moment when the robot performs the task of picking a piece and moves to the corresponding place. The sequence culminates with the correct pick-up of all the parts, depositing them one by one in the designated pile together with the rest of the pieces of the same model and all aligned in the same orientation. In
Figure 14, the result of the process is presented, showing all the collected templates in an organized manner. This final view illustrates the efficiency of the system in completing the handling of the pieces and their successful sorting, demonstrating the ability of the workstation to perform pick-and-place operations in an autonomous and orderly manner.
3.2. Real tests
The process starts with the operator inserting the uncut textile piece into the CNC machine, indicating the CAD file model to be cut.
Figure 15 shows the piece inserted and ready for cutting. Once positioned, the CNC is activated and performs the cut, generating the different cut pieces. These pieces then move along the CNC conveyor belt, as shown in
Figure 15, until they reach the position where the algorithm starts: it extracts data from AutoCAD, captures the image, and sends the data to the robot. The algorithm begins by extracting the data of each template and of the base in order to establish the relationship between the piece on the table and the AutoCAD data. With all the data extracted, the visual recognition process starts. The camera is activated and records, storing the frames in a folder. When the operator indicates that the piece is in place, or after a specific time has elapsed, the recording stops and the algorithm reads the last 15 frames.
In each of these 15 frames, morphological transformations are applied to detect the contour of the piece and determine its angle of rotation with respect to the AutoCAD data. After all the iterations have been computed, the algorithm discards the results that indicate anomalous angles. Finally, an image of the detected contour, together with the axes of the base in AutoCAD and of the actual rotated base, is displayed on the screen.
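The per-frame filtering step can be sketched in Python. The paper does not state how anomalous angles are identified, so the median-deviation rule and the 5-degree tolerance below are assumptions:

```python
from statistics import median

def robust_angle(angles, tol_deg=5.0):
    """Average the per-frame rotation estimates after discarding outliers.

    angles: one rotation estimate (degrees) per analysed frame.
    tol_deg: hypothetical tolerance; estimates farther than this from
    the median are treated as anomalous and discarded.
    """
    m = median(angles)
    kept = [a for a in angles if abs(a - m) <= tol_deg]
    return sum(kept) / len(kept)

# 15 per-frame estimates; two are anomalous (e.g., from a blurred frame).
estimates = [12.1, 11.9, 12.0, 12.2, 47.0, 11.8, 12.1, 12.0,
             11.9, 12.3, -3.5, 12.0, 12.1, 11.9, 12.2]
print(round(robust_angle(estimates), 2))  # → 12.04
```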
Figure 3 shows the resulting image and the on-screen message with the angle of rotation. This approach combines CNC accuracy with visual recognition, allowing precise alignment of the cut pieces with the AutoCAD data prior to manipulation by the robot. The algorithm then recalculates all the template positions and rotation angles required to send accurate instructions to the robot. This step is essential to ensure the proper handling of each cut part. Afterwards, the algorithm establishes the socket connection and waits to receive a message from the robot controller with the number of the template to be sent. This approach facilitates synchronization between the algorithm and the robot, allowing a coordinated execution of the process. With the connection established, the algorithm sends the data for each piece to the robot sequentially. Once the data for all pieces have been sent, the robot returns to its initial position, awaiting the next pick-and-place process.
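The recalculation of template positions can be illustrated with the rotation-translation mapping the text refers to; the function and parameter names are illustrative, not the paper's actual code:

```python
import math

def cad_to_table(x, y, theta_deg, tx, ty):
    """Map a CAD coordinate of a template onto the worktable frame.

    theta_deg: rotation of the leather base detected by vision;
    (tx, ty): translation of the base origin on the table.
    Both are assumed outputs of the contour-matching step.
    """
    t = math.radians(theta_deg)
    xt = x * math.cos(t) - y * math.sin(t) + tx
    yt = x * math.sin(t) + y * math.cos(t) + ty
    return xt, yt

# A template at (100, 0) in CAD, with the base rotated 90 degrees and
# shifted by (10, 20) on the table, ends up near (10, 120).
xt, yt = cad_to_table(100.0, 0.0, 90.0, 10.0, 20.0)
print(round(xt, 6), round(yt, 6))
```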
Figure 15 illustrates the end of the process, where all pieces have been picked and sorted according to the model that was cut. This final view illustrates the efficiency of the system in coordinating the cutting, visual recognition, and robotic handling operations.
4. Conclusions
This paper deals with the automation of processes in the textile and clothing industry, especially the robotic sorting of CNC-cut parts. A non-symmetrical suction cups array actuator has been developed and tested, complemented with a vision system and algorithms for the handling of textile parts. The results obtained both in simulations and in industrial plant environments highlight the efficiency and practical applicability of this technology in real production environments. Traditionally, the handling of textiles in industrial environments has been a challenge due to their deformable and varied nature. This project addresses these challenges effectively, enabling more precise handling adapted to the specific characteristics of each textile piece. In addition, the integration of a vision system and image processing algorithms allows for more accurate and automated sorting and handling, representing a significant step towards the modernization of the textile industry. Experiments have focused on specific textile types and handling conditions.
The developed suction cups array actuator allows each suction cup to be activated individually, using an algorithm that maximizes the number of gripping points for the picking task: the actuator moves to the position and angle that engage the largest number of suction cups, so that the piece does not deform during the trajectory towards the place position. When pieces are cut and need to be transferred, the computer vision system cannot differentiate them because the cut is very thin. This problem is addressed by matching the CAD file generated by the nesting with the external contour of the leather recognized by vision; by means of a rotation-translation matrix, the correct data is sent to the robot so that it can carry out the pick-and-place task.
The applicability of this technology in the textile and leather industry has been confirmed through experiments in simulated production environments, reaching TRL 4.
Author Contributions
Conceptualization, F.J.M.-P., J.B.-M., J.V.S.H. and C.P.-V.; methodology, J.V.S.H. and C.P.-V.; software, F.J.M.-P. and J.B.-M.; validation, F.J.M.-P. and J.B.-M.; formal analysis, J.V.S.H. and C.P.-V.; investigation, D.M.-G. and F.J.M.-P.; resources, D.M.-G. and F.J.M.-P.; writing—original draft preparation, F.J.M.-P., J.B.-M., J.V.S.H. and C.P.-V.; writing—review and editing, F.J.M.-P., J.B.-M., J.V.S.H. and C.P.-V.; visualization, F.J.M.-P. and J.B.-M.; supervision, J.V.S.H. and C.P.-V.; project administration, C.P.-V.; funding acquisition, C.P.-V. All authors have read and agreed to the published version of the manuscript.
Funding
This research has been partly funded by project CPP2021-008593, grant MCIN/AEI/10.13039/501100011033 and by the European Union-NextGenerationEU/PRTR.
Acknowledgments
This project has been performed in collaboration with CFZ Cobots SL (https://cfzcobots.com/), a company devoted to creating software solutions in combination with hardware implementations for integrators and engineering companies.
Conflicts of Interest
The authors declare no conflict of interest.
References
- H. Brynzér, M.I. Johansson, Design and performance of kitting and order picking systems. International Journal of Production Economics 1995, 41, 115–125, ISSN 0925-5273. [CrossRef]
- R. Hanson, L. Medbo. Kitting and time efficiency in manual assembly. International Journal of Production Research 2012, 1125, 4. [CrossRef]
- R. Hanson, A. Brolin. A comparison of kitting and continuous supply in in-plant materials supply. International Journal of Production Research 2013, 51, 979–992. [CrossRef]
- Borrell Méndez, J.; Cremades, D.; Nicolas, F.; Perez-Vidal, C.; Segura-Heras, J.V. Conceptual and Preliminary Design of a Shoe Manufacturing Plant. Appl. Sci. 2021, 11, 11055. [CrossRef]
- González, A.; Perez-Vidal, C.; Borrell Méndez, J.; Solanes, J.E.; Gracia, L. Development of a collaborative robotic system to polish leather surfaces. International Journal of Computer Integrated Manufacturing 2023, 1, 19. [CrossRef]
- Borrell Méndez, J., González, A., Perez-Vidal, C. et al. Cooperative human–robot polishing for the task of patina growing on high-quality leather shoes. Int J Adv Manuf Technol 2023, 125, 2467–2484. [CrossRef]
- Smith, A., Johnson, B., & Brown, C. Robotic Automation in Textile Industry: Advancements and Benefits. Journal of Textile Engineering & Fashion Technology 2020, 6, 110–112.
- Lee, S., & Kim, H. Precision Sewing with Computer-Aided Robotic Systems in Textile Manufacturing. International Journal of Robotics and Automation 2019, 34, 209–218.
- Wang, Y., Liu, H., & Zhang, Q. IoT-Based Inventory Management in Textile Manufacturing: A Case Study. Journal of Industrial Engineering and Management 2021, 14, 865–878.
- García, L., Pérez, M., & Rodríguez, J. Sustainability Benefits of Automation in the Textile Industry. Journal of Sustainable Textiles 2018, 5, 94–105.
- Chen, X., Wu, Z., & Li, J. Robotic Pick-and-Place Systems in Textile Manufacturing: A Review of Efficiency and Quality Improvements. Robotics and Automation Review 2022, 18, 45–56.
- Borrell Méndez, J., Perez-Vidal, C., Segura Heras, J.V., & Pérez-Hernández, J.J. Robotic Pick-and-Place Time Optimization: Application to Footwear Production. IEEE Access 2020, 8, 209428–209440. [CrossRef]
- Borrell, J., Perez-Vidal, C. & Segura, J.V. Optimization of the pick-and-place sequence of a bimanual collaborative robot in an industrial production line. Int J Adv Manuf Technol 2024, 130, 4221–4234. [CrossRef]
- Mateu-Gomez, D., Martínez-Peral, F.J., & Perez-Vidal, C. Multi-Arm Trajectory Planning for Optimal Collision-Free Pick-and-Place Operations. Technologies 2024, 12, 12. [CrossRef]
- Jørgensen, T.B., Jensen, S.H.N., Aanæs, H. et al. An Adaptive Robotic System for Doing Pick and Place Operations with Deformable Objects. J Intell Robot Syst 2019, 94, 81–100. [CrossRef]
- K. Morino, S. Kikuchi, S. Chikagawa, M. Izumi and T. Watanabe. Sheet-Based Gripper Featuring Passive Pull-In Functionality for Bin Picking and for Picking Up Thin Flexible Objects. IEEE Robotics and Automation Letters 2020, 5, 2007–2014. [CrossRef]
- Andreas Björnsson, Marie Jonsson, Kerstin Johansen, Automated material handling in composite manufacturing using pick-and-place systems—a review. Robotics and Computer-Integrated Manufacturing 2018, 51, 222–229, ISSN 0736-5845. [CrossRef]
- Z. Samadikhoshkho, K. Zareinia and F. Janabi-Sharifi. A Brief Review on Robotic Grippers Classifications. In Proceedings of the 2019 IEEE Canadian Conference of Electrical and Computer Engineering (CCECE), Edmonton, AB, Canada, 2019; pp. 1–4. [CrossRef]
- Baohua Zhang, Yuanxin Xie, Jun Zhou, Kai Wang, Zhen Zhang, State-of-the-art robotic grippers, grasping and control strategies, as well as their applications in agricultural robots: A review. Computers and Electronics in Agriculture 2020, 177, 105694, ISSN 0168-1699. [CrossRef]
- Ye, Y., Cheng, P., Yan, B. et al. Design of a Novel Soft Pneumatic Gripper with Variable Gripping Size and Mode. J Intell Robot Syst 2022, 106, 5. [CrossRef]
- J. Qi, X. Li, Z. Tao, H. Feng and Y. Fu. Design and Control of a Hydraulic Driven Robotic Gripper. In Proceedings of the 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 2021; pp. 398–404. [CrossRef]
- Gabriel, F., Fahning, M., Meiners, J. et al. Modeling of vacuum grippers for the design of energy efficient vacuum-based handling processes. Prod. Eng. Res. Devel. 2020, 14, 545–554. [CrossRef]
- Matteo Maggi, Giacomo Mantriota, Giulio Reina, Introducing POLYPUS: A novel adaptive vacuum gripper. Mechanism and Machine Theory 2022, 167, 104483, ISSN 0094-114X. [CrossRef]
- Heckmann, R., Lengauer, T. A simulated annealing approach to the nesting problem in the textile manufacturing industry. Ann Oper Res 1995, 57, 103–133. [CrossRef]
- G. Chryssolouris, N. Papakostas and D. Mourtzis, A decision-making approach for nesting scheduling: A textile case. International Journal of Production Research 2000, 38, 4555–4564. [CrossRef]
- Cláudio Alves, Pedro Brás, José Valério de Carvalho, Telmo Pinto, New constructive algorithms for leather nesting in the automotive industry. Computers & Operations Research 2012, 39, 1487–1505, ISSN 0305-0548. [CrossRef]
- OnRobot: Quick Changer. Available online: https://onrobot.com/es/productos/quick-changer (accessed on 11 December 2023).
- Electric Vacuum: JSY100. Available online: https://www.smcworld.com/catalog/New-products-en/mpv/es11-113-jsy-np/data/es11-113-jsy-np.pdf (accessed on 11 December 2023).
- Introduction—ezdxf 1.0.3 documentation. (s. f.). Available online: https://ezdxf.readthedocs.io/en/stable/introduction.html (accessed on 11 December 2023).
- math—Mathematical functions. (s. f.). Python documentation. Available online: https://docs.python.org/3/library/math.html (accessed on 11 December 2023).
- Matplotlib—Visualization with Python. (s. f.). Available online: https://matplotlib.org/ (accessed on 11 December 2023).
- NumPy—Bioinformatics at COMAV 0.1 documentation. (s. f.). Available online: https://bioinf.comav.upv.es/courses/linux/python/scipy.html (accessed on 11 December 2023).
- The Shapely User Manual — Shapely 2.0.1 documentation. (s. f.). Available online: https://shapely.readthedocs.io/en/stable/manual.html (accessed on 11 December 2023).
Figure 1.
Visual description of project’s objective, starting from (a) to get (b) automatically.
Figure 2.
Gripping point techniques: (a) based on suction cups and (b) based on needle actuators.
Figure 3.
Nesting design in CAD and textile material (leather).
Figure 4.
Design of the suction cups array actuator: elevation, plan, and profile together with perspective view.
Figure 5.
Exploded-view drawing of tool components with element details.
Figure 6.
Suction cups array actuator implementation detail: (a) Non-symmetrical distribution of the actuator suction cups and (b) Final weight of the tool: 3.18 kg.
Figure 7.
Elevation, plan, and profile of the suction cups array implementation.
Figure 8.
Gripper of a piece with a generic matrix gripper: (a) Gripping by assigning the position and orientation of the workpiece to the robot end effector: 7 gripping points, and (b) Gripping of the workpiece by rotating and moving the tool to maximize the number of gripping points: 11 gripping points.
Figure 9.
Centre of the tool represented in the centre by a star (so gripping point at that point).
Figure 10.
Process for finding the optimal configuration.
Figure 11.
Selection of the actuator arrangement according to the selected suction cups.
Figure 12.
Flowchart of the code developed in Python.
Figure 13.
Process for finding the optimal configuration of various parts.
Figure 14.
Complete simulated process: Nesting, piece cutting, machine vision and pick-and-place.
Figure 15.
Real setup: Set of frames extracted from the video that shows how the system works. These 16 frames depict the entry of a leather piece into the cutting machine, its exit and how the robot equipped with the tool developed in this project manages cut pieces successfully.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).