Improve Quality and Efficiency of Textile Process using Data-driven Machine Learning in Industry 4.0
Volume 4, Issue 2
C.-Y. Lee, J.-Y. Lin, and R.-I. Chang
Published online: 13 April 2018
Abstract
This paper focuses on the relationship between key operation parameters and textile defects, using data-driven machine learning to design an Operation Parameters Recommender System (OPRS) for the textile industry. From the perspective of data science, it integrates historical manufacturing process data, such as machine operation parameters from the warping, sizing, beaming, and weaving processes, with management experience data, such as textile inspection results from the quality control section. Regression models are then applied to predict the textile operation parameters, and classification models are used to predict the quality of the textile. Based on ten-fold cross-validation, experimental results show that the model achieves 90.8% accuracy on quality level prediction, and the best regression model for predicting weaving operation parameters reduces the mean square error (MSE) to 0.01%. By combining these two models, the proposed OPRS provides a complete analysis of operation parameters and performs well compared with previous stochastic methods. Because the OPRS supports technicians in setting operation parameters more precisely, even for a new type of yarn, it can help bridge the technical skills gap in the textile manufacturing process.
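To make the described workflow concrete, the sketch below shows one way the two parts of such a system, a regressor for operation parameters and a classifier for quality level, could be evaluated with ten-fold cross-validation in scikit-learn. The file name, column names, and the choice of gradient-boosting models are illustrative assumptions, not the paper's actual data or algorithms.

# Minimal sketch of the OPRS idea from the abstract, assuming scikit-learn and a
# hypothetical CSV of historical process records; all column names are made up.
import pandas as pd
from sklearn.model_selection import KFold, cross_val_score
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

df = pd.read_csv("textile_process_records.csv")  # hypothetical historical process data

# Features from upstream processes (warping, sizing, beaming); names are assumptions.
features = ["warping_speed", "sizing_concentration", "beaming_tension", "yarn_count"]

# Regression: predict a weaving operation parameter (here, warp tension) and
# report the ten-fold cross-validated mean squared error.
reg = GradientBoostingRegressor(random_state=0)
mse = -cross_val_score(
    reg, df[features], df["weaving_warp_tension"],
    cv=KFold(n_splits=10, shuffle=True, random_state=0),
    scoring="neg_mean_squared_error",
).mean()
print(f"10-fold CV mean squared error: {mse:.4f}")

# Classification: predict the inspected quality level of the produced fabric and
# report the ten-fold cross-validated accuracy.
clf = GradientBoostingClassifier(random_state=0)
acc = cross_val_score(clf, df[features], df["quality_level"], cv=10, scoring="accuracy").mean()
print(f"10-fold CV accuracy: {acc:.3f}")

In a recommender setting, the regressor's predicted parameters would be screened by the classifier's predicted quality level before being suggested to the technician; the exact combination logic used in the paper is not reproduced here.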
To Cite this article
C.-Y. Lee, J.-Y. Lin, and R.-I. Chang, "Improve quality and efficiency of textile process using data-driven machine learning in industry 4.0," International Journal of Technology and Engineering Studies, vol. 4, no. 2, pp. 64-76, 2018.