{"id":112904,"date":"2026-02-05T09:33:56","date_gmt":"2026-02-05T08:33:56","guid":{"rendered":"https:\/\/industry-science.com\/?post_type=article&#038;p=112904"},"modified":"2026-02-10T22:00:57","modified_gmt":"2026-02-10T21:00:57","slug":"ai-assembly-workplace-design","status":"publish","type":"article","link":"https:\/\/industry-science.com\/en\/articles\/ai-assembly-workplace-design\/","title":{"rendered":"Applied AI for Human-Centric Assembly Workplace Design"},"content":{"rendered":"\n<p>Assembly workstations are advancing quickly as artificial intelligence (AI) becomes more integrated into industrial production settings. Specifically, the ability to predict human movements and identify features within a workspace is key for optimizing <a href=\"https:\/\/industry-science.com\/en\/functions\/assembly\/\">assembly<\/a> tasks, tracking progress, and enabling flexible setups. With the increased use of collaborative robots, the traditional fixed automation model is shifting toward human-centered automation, where people and machines collaborate closely on shared tasks. This approach is particularly beneficial in high-mix, low-volume production, where flexibility and adaptability are essential.<\/p>\n\n\n\n<p>Effective human-robot collaboration (HRC) depends on the robot\u2019s ability to anticipate human actions. Achieving this requires systems that can sense and interpret body movements, gestures, gaze, and posture in real time. Manual interpretation of such behaviors is often slow and context-dependent, making AI-powered solutions essential. Predictive models that process multimodal sensor data, such as skeletal joint trajectories and eye-tracking signals, allow systems to infer both the physical actions and the intent behind them. 
These capabilities transform conventional workstations into intelligent, responsive environments.<\/p>\n\n\n\n<p>Smart workstations, which combine human input and robotic support, are designed to adapt to user behavior by providing proactive assistance. While AI-driven support in controlled environments is well established, challenges remain in applying these systems to real-world production. Some of these challenges include model interpretation, transparency of AI decisions, and compliance with emerging regulations. Specifically, predictive AI systems that model human behavior need to be evaluated to ensure they align with ethical standards, such as the European Union\u2019s Artificial Intelligence Act, which categorizes such systems in HRC as \u201chigh-risk.\u201d This classification requires adherence to strict guidelines on human oversight, data protection, fairness, and transparency [1].<\/p>\n\n\n\n<p>In this work, we introduce a holistic and systematic approach for applying AI ethics assessment to an industrial use case based on a Smart Work Assistant (SWA) system that supports human workers in assembly. The approach combines multimodal sensing and AI to predict human intention and focus through motion and gaze data. In addition to the technical implementation, we assess the system\u2019s ethical and regulatory implications using the Z-Inspection\u00ae methodology [2]. Originally designed to evaluate trustworthy AI, Z-Inspection\u00ae has been demonstrated in sectors such as healthcare and offers a structured way to assess alignment with legal, social, and technical standards.<\/p>\n\n\n\n<p>This paper introduces initial ideas for applying Z-Inspection\u00ae in an industrial assembly environment. We explore whether this methodology can effectively support ethical assessments of predictive AI in dynamic, physically interactive settings, where safety, accountability, and human involvement are essential. 
While large-scale testing is still in progress, preliminary observations indicate notable improvements, such as clearer task guidance, less physical strain, and maintained operator control. These insights are distilled into practical and adaptable principles for AI-enhanced assembly work. Results are based on initial qualitative observations from pilot use; plans for quantitative validation, including posture-angle measurements, task durations, and operator comfort ratings, are underway.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Current developments in AI for human-centric assembly<\/h2>\n\n\n\n<p>AI for smart work assistants in assembly offers significant benefits in terms of operational efficiency, human-robot collaboration, and worker well-being. For instance, it can optimize manufacturing processes by adapting assembly lines to fluctuating product designs and market demands, enhance equipment life cycle management through predictive maintenance, and optimize workforce allocation based on skills and production needs [3]. The role of AI in defect inspection and product quality is crucial for improving the product life cycle, from design to after-sales support [4].<\/p>\n\n\n\n<p>In an HRC scenario, the authors of [5] proposed human action recognition (HAR) in assembly lines to enhance HRC efficiency. Their framework has been validated in real-world applications, demonstrating its effectiveness in enhancing human-robot interactions. Similarly, the conversation models proposed in [<a href=\"https:\/\/doi.org\/10.1007\/s10845-023-02228-8\">6<\/a>] create a more natural human-robot interaction in real-world scenarios. 
In [<a href=\"https:\/\/doi.org\/10.1145\/3434074.3447163\">7<\/a>], digital intelligent assistants (DIAs) are proposed that leverage large language models (LLMs) to reduce cognitive workload while improving user experience in assembly processes.<\/p>\n\n\n\n<p>AI can identify potential ergonomic issues in manufacturing processes, preventing workplace injuries and long-term medical problems. Automated solutions analyze operators\u2019 work allocations to ensure ergonomic safety, adapting to changes in line speed or product mix [8]. The integration of AI in manufacturing shifts the role of humans from merely interacting with technology to actively collaborating with AI-enabled agents. This shift necessitates a comprehensive understanding of human-AI collaboration to design systems that support effective production and operator well-being [9].<\/p>\n\n\n\n<p>Integrating AI solutions into manufacturing processes poses challenges related to data integration and scalability. Addressing these challenges is crucial for the widespread adoption of AI in manufacturing [3]. Designing AI-enabled systems that consider human input at the design stage can improve the effectiveness of these systems in production environments [10]. Beyond data integration and scalability, ethical considerations must also be addressed to fully realize the potential of AI in this domain. Future research should focus on developing predictive models, optimizing equipment replacement strategies, and designing human-AI collaborative systems that genuinely support effective production and operator well-being.<\/p>\n\n\n\n<p>Human-centered applications such as HRC employ advanced sensor technology and machine learning to predict human motion behaviors from time-series data, thereby improving robot responsiveness and preventing collisions. 
There are several human motion-modeling techniques, including deep learning [11], Gaussian mixture models (GMMs) [12], and motion diffusion models [13]. GMMs and their derivatives have been shown to handle temporal variability and multimodal distributions well [12, 14]. Similarly, attention-based networks are promising for modeling sequential data in tasks involving gesture and motion prediction [15].<\/p>\n\n\n\n<p>Ethically, several frameworks have been proposed to ensure responsible AI deployment. The EU\u2019s High-Level Expert Group on AI introduced guidelines on Trustworthy AI, emphasizing human agency, transparency, privacy, and non-discrimination. However, these guidelines lack concrete implementation strategies, particularly in dynamic industrial contexts. The Z-Inspection<sup>\u00ae<\/sup> methodology fills this gap by providing a structured, iterative process for assessing AI systems through legal, ethical, and technical lenses. Initially applied in healthcare AI, the methodology has recently been shown to be adaptable to industrial and manufacturing contexts (see [2]).<\/p>\n\n\n\n<p>Despite these advances, a gap remains in integrating predictive AI models with holistic ethical assessments in real-world industrial settings. Our work addresses this by combining technical innovation with regulatory alignment, using a scenario based on the ELAM system, a digital worker assistance system developed by Armbruster Engineering.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Proposed method for AI-supported motion capture system<\/h2>\n\n\n\n<p>In this work, we propose a method for applied AI based on a modular pipeline that employs human motion modeling, inference, and predictive human motion understanding for assistive smart workstations, governed by the Z-Inspection\u00ae trustworthy AI assessment. The Z-Inspection\u00ae process involves three main phases: setup, assessment, and resolution. 
The setup phase establishes the preconditions, the assessment team, the boundaries, and the context, whereas the assessment phase expands to socio-technical scenarios, ethical issues, a map of ethical tensions against applicable regulations, and the definition and execution of action paths. In the resolution phase, concrete recommendations are derived from the assessment.<\/p>
\u00a0Each primitive is encoded as a structured entry in a knowledge representation module, forming a domain-specific motion library.<\/p>\n\n\n\n<p>In this phase, real-time motion capture data is projected into the learned FPCA space. A likelihood-based inference mechanism classifies the observed motion segment and predicts future frames. The system not only identifies which primitive is being executed, but also forecasts the likely subsequent action, enabling anticipatory behavior in robotic assistants. The red dashed lines in <strong>Figure 1<\/strong> highlight the real-time test and inference paths of this phase.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"786\" src=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-1-1024x786.jpeg\" alt=\"Figure 1: Overview of the proposed predictive motion AI system integrated with the ELAM platform. The system combines Morphable Graph modeling for motion primitive learning with a real-time inference engine, governed by an applied ethics layer based on Z-Inspection\u00ae.\" class=\"wp-image-113084\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-1-1024x786.jpeg 1024w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-1-488x375.jpeg 488w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-1-768x590.jpeg 768w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-1-380x292.jpeg 380w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-1-1536x1180.jpeg 1536w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-1-510x392.jpeg 510w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-1-64x49.jpeg 64w, 
https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-1.jpeg 2000w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\"><em>Figure 1: Overview of the proposed predictive motion AI system integrated with the ELAM platform. The system combines Morphable Graph modeling for motion primitive learning with a real-time inference engine, governed by an applied ethics layer based on Z-Inspection\u00ae.<\/em><\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Phase II &#8211; \u201cApplied AI Ethics\u201d assessment through morphological matrix<\/h3>\n\n\n\n<p>To ensure that predictive AI complies with the EU AI Act and promotes trustworthy and explainable AI, especially in smart workplaces, a holistic assessment of applied AI ethics and trustworthiness is proposed based on the Z-Inspection\u00ae methodology. This assessment accompanies the motion-capture-based AI support in ELAM and continuously monitors the system\u2019s ethical, legal, and technical aspects.<\/p>\n\n\n\n<p>The key ethical considerations to address during this assessment include:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>fairness and non-discrimination<\/li>\n\n\n\n<li>transparency and explainability<\/li>\n\n\n\n<li>safety and risk mitigation<\/li>\n\n\n\n<li>human oversight and control<\/li>\n\n\n\n<li>data protection and privacy<\/li>\n<\/ul>\n\n\n\n<p>This should cover fairness in the risk analysis of motion or gaze data, the representativeness of training datasets, and the prevention of systematic disadvantages for specific operators. Similarly, the AI model\u2019s transparency and explainability must include generating understandable predictions and explanations of \u201cwhy\u201d, \u201cwhat\u201d, and \u201cwhat\u2019s next\u201d, as well as informing humans about system decisions. 
<\/p>\n\n\n\n<p>Safety and risk mitigation involves preventing harmful or unintended robot behavior, monitoring risk in real time (such as collision detection), and executing anticipatory motion only under safety constraints. Human oversight and control cover operator override, final decision-making authority, error recovery, and corrective actions for mispredictions. Lastly, data protection and privacy focus on anonymizing motion or gaze data, obtaining proper consent, and minimizing the disclosure of personally identifiable information.<\/p>\n\n\n\n<p>To evaluate the ethics and trustworthiness of an AI application in a production scenario, the holistic concept is systematically described using a morphological matrix. This matrix supports adaptable, transparent, and multi-perspective evaluation during both system development and live operation.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Phase III &#8211; ELAM integration<\/h3>\n\n\n\n<p>In this phase, an application concept for an AI-supported motion capture system within ELAM is proposed for process monitoring, optimization, and ergonomic analysis. The applied AI that serves as input for ELAM\u2019s analytics is assessed for compliance by the applied AI ethics layer.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Applied AI ethics assessment for collaborative assembly scenario in practice<\/h2>\n\n\n\n<p>Consider a smart assembly station (see <strong>Fig. 2<\/strong>) equipped with the ELAM worker assistance system developed by Armbruster Engineering, where a human operator collaborates with a cobot to mount heavy or bulky components, such as a large control cabinet door. Motion sensors and a head-mounted eye tracker continuously monitor the operator\u2019s actions and gaze. 
<\/p>\n\n\n\n<p>When the operator briefly looks at the hinge location and reaches for the screwdriving tool, the AI infers the intention to mount the door and proactively commands the cobot to position the door at an ergonomic height and hold it securely in place while the operator fastens the hinges with a torque-controlled screwdriver. This proactive assistance reduces alignment time and idle periods, eliminates the need for an additional person or auxiliary fixtures, and improves overall safety and ergonomics.<\/p>\n\n\n\n<p>In this setup, the ELAM interface communicates the cobot\u2019s predictive decisions via robot middleware (e.g., ROS 2). The operator retains full control and can override any incorrect actions using a touchscreen. In practice, the override feature increases operator autonomy by enabling immediate correction of AI mispredictions. If the cobot selects an incorrect motion or position, the operator can stop or modify the action without disrupting the workflow. This maintains user control and allows quick error recovery, ensuring that decision authority remains with the human even during predictive assistance.<\/p>\n\n\n\n<p>Ethical oversight is upheld according to the Z-Inspection\u00ae framework, which emphasizes explainability, human oversight, and data traceability. The system can be adapted for other tasks involving heavy or unwieldy parts, such as front panels or chassis subassemblies, supporting versatile holding, precise positioning, and step-by-step guidance during multi-point screwdriving. Because the system relies on camera and motion-based tracking, data protection measures such as anonymization, transparency, and user consent are incorporated from the initial design phase.<\/p>\n\n\n\n<p>In addition to supporting operator autonomy, the predictive assistance reduces unnecessary motion effort and physical strain, as observed qualitatively during pilot tests. 
The Z-Inspection\u00ae assessment directly informed these design choices by identifying risks related to misprediction, transparency, and data handling and translating them into concrete mitigations like override control, intent visualization, safety limits, and privacy-by-design measures. This method links human analysis, ergonomics evidence, and Z-Inspection\u00ae reproducibility.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"337\" src=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-2-1024x337.jpeg\" alt=\"Figure 2: Overview of the proposed electric cabinet assembly process. The collaborative robot overlay illustrates the concept.\" class=\"wp-image-113086\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-2-1024x337.jpeg 1024w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-2-764x251.jpeg 764w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-2-768x253.jpeg 768w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-2-514x169.jpeg 514w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-2-1536x505.jpeg 1536w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-2-510x168.jpeg 510w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-2-64x21.jpeg 64w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-2.jpeg 2000w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\"><em>Figure 2: Overview of the proposed electric cabinet assembly process. 
The collaborative robot overlay illustrates the concept.<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Alignment of AI design with ethical dimensions<\/h2>\n\n\n\n<p>The proposed use case was chosen for its potential in product assembly applications because it supports AI based on a motion capture system and combines it with the ELAM worker assistance system. It includes features for process monitoring, ergonomics, process optimization, and process setup. These four functions can be applied to virtually any external assembly process using a worker assistance system, regardless of product type. The use of motion-capture-based AI in the ELAM system thus has the potential to enhance processes and, consequently, reduce costs. The applied AI ethics assessment layer, in particular, helps to mitigate AI risks.<\/p>\n\n\n\n<p>The applied AI relies on interpretable, explainable probabilistic models. Because the input data consists of skeletal data and spatial semantics, the risk of ethical misuse or loss of trustworthiness is low. From an ethical standpoint, this integration presents both risks and opportunities. Because the system centers on human motion in AI-driven inference, it must be thoroughly assessed against existing AI ethics standards and regulatory frameworks. To support this, the systematic approach introduced here, along with a configurable morphological matrix (see <strong>Fig. 3<\/strong>), can play an important role in risk assessment and mitigation.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"466\" src=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-3-1024x466.jpeg\" alt=\"Figure 3: Morphological matrix mapping ethical dimensions to practical implementation strategies in the predictive motion\nAI system. 
Highlighted options represent the system\u2019s current or proposed design.\" class=\"wp-image-113088\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-3-1024x466.jpeg 1024w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-3-764x348.jpeg 764w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-3-768x350.jpeg 768w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-3-514x234.jpeg 514w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-3-1536x700.jpeg 1536w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-3-510x232.jpeg 510w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-3-64x29.jpeg 64w, https:\/\/industry-science.com\/wp-content\/uploads\/2026\/02\/Tuli_I4S-26-1_Figure-3.jpeg 2000w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\"><em>Figure 3: Morphological matrix mapping ethical dimensions to practical implementation strategies in the predictive motion AI system. Highlighted options represent the system\u2019s current or proposed design.<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Concluding reflections on ethical applied predictive AI beyond \u2018Industry 4.0\u2019<\/h2>\n\n\n\n<p>In conclusion, Z-Inspection\u00ae is a holistic, trustworthy, ethics-driven assessment process that involves independent evaluations by domain experts, ethicists, and engineers, culminating in consensus. Our collaborative assessment with industry partners highlights how operator override enhances human agency, demonstrates ergonomic effects through qualitative observations with planned validation, and outlines the stages and mitigations of the Z-Inspection\u00ae method. 
This approach offers a more transparent, evidence-based, and reproducible framework for ethics evaluation.<\/p>\n\n\n\n<p>The presented approach shows the technical feasibility of predictive AI in human-centered assembly environments and its compliance with responsible AI principles. This provides a foundation for the ethical deployment of AI systems in Industry 4.0 and similar settings. However, several challenges remain. Real-time integration of gaze and motion data is sensitive to hardware limitations, such as sensor quality, operator head movements, and changing lighting conditions. Maintaining calibration and accuracy across sessions remains challenging. Additionally, ensuring demographic and task diversity in training datasets is crucial to prevent systematic bias in AI predictions.<\/p>\n\n\n\n<p>Future research should incorporate longitudinal studies to examine how predictive AI affects trust, usability, and operator performance. Federated learning could improve cross-site generalization while safeguarding data privacy and regulatory compliance. 
Moving these principles toward Industry 5.0 will require adaptive methods that maintain transparency, safety, and human focus within evolving industrial environments.<\/p>\n\n\n\n<p><em>The authors would like to acknowledge the financial support from the DFG within the HiSMoT project (grant number: 500490184), from the Federal Ministry for Economic Affairs and Climate Action (BMWK) within the program \u201cFuture Investments for Vehicle Manufacturers and Supplier Industry\u201d (KoPa 35c) for the project SkaLaB (grant number: 13IK025B) and from the Horizon Europe FLEX4RES project (grant number: 101091903).<\/em><\/p>\n<hr><div class=\"gito-pub-content-bibliography\"><h2>Bibliography <\/h2>(1) European Union: Regulation (EU) 2024\/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300\/2008, (EU) No 167\/2013, (EU) No 168\/2013, (EU) 2018\/858, (EU) 2018\/1139 and (EU) 2019\/2144 and Directives 2014\/90\/EU, (EU) 2016\/797 and (EU) 2020\/1828 (Artificial Intelligence Act). Official Journal of the European Union, L 2024\/1689, 12 July 2024. In force. URL: http:\/\/data.europa.eu\/eli\/reg\/2024\/1689\/oj, accessed 03.06.2025.\r<br>(2) Zicari, R. V.; Brodersen, J.; Brusseau, J.; D\u00fcdder, B.; Eichhorn, T. et al.: Z-Inspection\u00ae: A process to assess trustworthy AI. In: IEEE Transactions on Technology and Society 2 (2021) 2, pp. 83-97. DOI: 10.1109\/TTS.2021.3066209.\r<br>(3) Elyasi, M.; Thevenin, S.; Cerqueus, A.: Use of AI in assembly line design and worker and equipment management: review and future directions. In: Flexible Services and Manufacturing Journal 37 (2025) 2, pp. 367-408. DOI: 10.1007\/s10696-024-09576-4.\r<br>(4) Rychtyckyj, N.; Stephens, A.: Assembly ergonomics filters for prevention of injury risk operations at Ford Motor Company. 2009, pp. 58-65. DOI: 10.1109\/CIVVS.2009.4938724.\r<br>(5) Hartikainen, M.; Spurava, G.; V\u00e4\u00e4n\u00e4nen, K.: Human-AI collaboration in smart manufacturing: Key concepts and framework for design. In: Frontiers in Artificial Intelligence and Applications 386 (2024), pp. 162-172. DOI: 10.3233\/FAIA240192.\r<br>(6) Gkournelos, C.; Konstantinou, C.; Angelakis, P.; Tzavara, E.; Makris, S.: Praxis: A framework for AI-driven human action recognition in assembly. In: Journal of Intelligent Manufacturing 35 (2024) 8, pp. 3697-3711. DOI: 10.1007\/s10845-023-02228-8.\r<br>(7) Li, C.; Park, J.; Kim, H.; Chrysostomou, D.: How can I help you? An intelligent virtual assistant for industrial robots. 2021, pp. 220-224. DOI: 10.1145\/3434074.3447163.\r<br>(8) Roos, G.: AI\u2019s growing role in industrial manufacturing. In: Electronic Products 64 (2022) 3, pp. 5-6.\r<br>(9) Emmanouilidis, C.; Waschull, S.; Bokhorst, J. A. C.; Wortmann, J. C.: Human in the AI loop in production environments. In: IFIP Advances in Information and Communication Technology 633 (2021), pp. 331-342. DOI: 10.1007\/978-3-030-85910-7_35.\r<br>(10) Colabianchi, S.; Costantino, F.; Sabetta, N.: Assessment of a large language model based digital intelligent assistant in assembly manufacturing. In: Computers in Industry 162 (2024). DOI: 10.1016\/j.compind.2024.104129.\r<br>(11) Holden, D.; Saito, J.; Komura, T.: A deep learning framework for character motion synthesis and editing. In: ACM Trans. Graph. 35 (2016) 4, pp. 1-11. ISSN 0730-0301. DOI: 10.1145\/2897824.2925975.\r<br>(12) Min, J.; Chai, J.: Motion graphs++: A compact generative model for semantic motion analysis and synthesis. In: ACM Trans. Graph. 31 (2012) 6, pp. 153:1-153:12. ISSN 0730-0301. DOI: 10.1145\/2366145.2366172.\r<br>(13) Andreou, N.; Wang, X.; Fern\u00e1ndez Abrevaya, V.; Cani, M.-P.; Chrysanthou, Y.; Kalogeiton, V.: LEAD: Latent Realignment for Human Motion Diffusion. In: Computer Graphics Forum, e70093. ISSN 1467-8659. 
DOI: 10.1111\/cgf.70093.\r<br>(14) Saeed, R.; Tuli, T. B.; Weikum, M.; Manns, M.: Towards trajectory-based latent space control for human-robot collaboration. In: Procedia CIRP 134 (2025), pp. 431-436. DOI: 10.1016\/j.procir.2025.02.143.\r<br>(15) Bandi, C.; Thomas, U.: Action Conditioned Attention Encoder-Decoder and Discriminator for Human Motion Generation. In: Fred, A.; Hadjali, A.; Gusikhin, O.; Sansone, C. (eds.): Deep Learning Theory and Applications. Cham: Springer Nature Switzerland 2024, pp. 259-276. ISBN 978-3-031-66694-0. DOI: 10.1007\/978-3-031-66694-0_16.\r<br>(16) Manns, M.; Fischer, K.; Du, H.; Slusallek, P.; Alexopoulos, K.: A new approach to plan manual assembly. In: Int. J. Computer Integrated Manufacturing 31 (2018) 9, pp. 907-920. DOI: 10.1080\/0951192X.2018.1466396.<\/div>
class=\"gito-pub-frontend-post-card-abo-sign gito-pub-login-register-link\" data-targetabo=\"expert\" data-targeturl=\"https:\/\/industry-science.com\/en\/articles\/serious-games-as-a-training-tool\/\" title=\"please login or register - content can only be read in its entirety with a subscription  expert\">\n\t\t\t                         <img decoding=\"async\" src=\"https:\/\/industry-science.com\/wp-content\/plugins\/gito-publisher\/img\/i4s-login.png\">\n\t\t\t                      <\/div>Unforeseen events are increasingly challenging manufacturing companies. Being resilient during crises is becoming a key competence. Serious games (SG) can help make resilience-building processes more transparent. This article derives specific requirements for SG from different phases of resilience and shows how these can be implemented in game mechanics in order to effectively support the training of resilience.                  <\/div>\n               <\/div>\n               <div class=\"gito-pub-frontend-post-card-scientific\"><strong>Industry 4.0 Science<\/strong> | Volume 42 | 2026 | Edition 2 | Pages 98-104<\/div>            <\/div>\n         <\/div>\n      <\/a>\n   <\/div>\n   <div class=\"gito-pub-frontend-post-card gito-pub-flex-item gito-pub-flex-item-1\">\n      <a href=\"https:\/\/industry-science.com\/en\/articles\/experiments-learning-factories\/\">\n         <div class=\"gito-pub-frontend-post-card-row\">         <div class=\"gito-pub-frontend-post-card-column gito-pub-frontend-post-card-column-image\">\n            <picture>\n               <source media=\"(max-width:640px)\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_1765062059_Design-Praxis-640x325.webp\">\n               <source media=\"(min-width:641px)\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_1765062059_Design-Praxis-196x180.webp\">\n               <img decoding=\"async\" class=\"gito-pub-frontend-post-card-image\" 
src=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_1765062059_Design-Praxis-196x180.webp\" alt=\"Conducting Experiments in Hybrid Learning Factories\">\n            <\/picture>\n         <\/div>\n            <div class=\"gito-pub-frontend-post-card-column\">               <div class=\"ellipsis\" style=\"height:185px;overflow:hidden;\" title=\"Conducting Experiments in Hybrid Learning Factories\">                  <table class=\"gito-pub-frontend-post-card-header\">\n            \t     <tr>\n                        <td>                  \t\t   <h4 class=\"gito-pub-frontend-post-card-title\">Conducting Experiments in Hybrid Learning Factories<\/h4>\n                        <div class=\"gito-pub-frontend-post-card-subtitle\">The example of the InTraLab Potsdam<\/div>                        <\/td>\n                     <\/tr>\n                  <\/table>\n                  <div class=\"gito-pub-frontend-post-card-text\">\n<div class=\"gito-pub-frontend-post-card-abo-sign gito-pub-login-register-link\" data-targetabo=\"expert\" data-targeturl=\"https:\/\/industry-science.com\/en\/articles\/experiments-learning-factories\/\" title=\"please login or register - content can only be read in its entirety with a subscription  expert\">\n\t\t\t                         <img decoding=\"async\" src=\"https:\/\/industry-science.com\/wp-content\/plugins\/gito-publisher\/img\/i4s-login.png\">\n\t\t\t                      <\/div>\nIndustrial production is undergoing rapid transformation through digitalization, automation and cyber-physical systems, creating new competence requirements for employees. Learning factories provide experiential environments for developing these competences. This article presents the Industrial Transformation Lab (InTraLab) as a hybrid learning factory combining physical demonstrators and digital simulations.                  
<\/div>\n               <\/div>\n            <\/div>\n         <\/div>\n      <\/a>\n   <\/div>\n   <div class=\"gito-pub-frontend-post-card gito-pub-flex-item gito-pub-flex-item-1\">\n      <a href=\"https:\/\/industry-science.com\/en\/articles\/learning-factories-future-brazil\/\">\n         <div class=\"gito-pub-frontend-post-card-row\">         <div class=\"gito-pub-frontend-post-card-column gito-pub-frontend-post-card-column-image\">\n            <picture>\n               <source media=\"(max-width:640px)\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_521020784_Gorodenkoff-640x325.webp\">\n               <source media=\"(min-width:641px)\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_521020784_Gorodenkoff-196x180.webp\">\n               <img decoding=\"async\" class=\"gito-pub-frontend-post-card-image\" src=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_521020784_Gorodenkoff-196x180.webp\" alt=\"Learning Factories for the Future of Manufacturing in Brazil\">\n            <\/picture>\n         <\/div>\n            <div class=\"gito-pub-frontend-post-card-column\">               <div class=\"ellipsis\" style=\"height:185px;overflow:hidden;\" title=\"Learning Factories for the Future of Manufacturing in Brazil\">                  <table class=\"gito-pub-frontend-post-card-header\">\n            \t     <tr>\n                        <td>                  \t\t   <h4 class=\"gito-pub-frontend-post-card-title\">Learning Factories for the Future of Manufacturing in Brazil<\/h4>\n                        <div class=\"gito-pub-frontend-post-card-subtitle\">Advancing manufacturing through technology and skills development<\/div>                        <\/td>\n                     <\/tr>\n                  <\/table>\n                  <div class=\"gito-pub-frontend-post-card-text\">\n<div class=\"gito-pub-frontend-post-card-abo-sign gito-pub-login-register-link\" 
data-targetabo=\"expert\" data-targeturl=\"https:\/\/industry-science.com\/en\/articles\/learning-factories-future-brazil\/\" title=\"please login or register - content can only be read in its entirety with a subscription  expert\">\n\t\t\t                         <img decoding=\"async\" src=\"https:\/\/industry-science.com\/wp-content\/plugins\/gito-publisher\/img\/i4s-login.png\">\n\t\t\t                      <\/div>\nManufacturing firms in developing countries face challenges in closing productivity gaps while adopting Industry 4.0 technologies. Learning factories are one helpful approach to countering these challenges. One such example is the learning factory F\u00e1brica do Futuroin S\u00e3o Paulo, Brazil, which has engaged students, supported competence development, and collaborated with industry in applied research, functioning as a hub for advanced manufacturing initiatives.                  <\/div>\n               <\/div>\n            <\/div>\n         <\/div>\n      <\/a>\n   <\/div>\n   <div class=\"gito-pub-frontend-post-card gito-pub-flex-item gito-pub-flex-item-1\">\n      <a href=\"https:\/\/industry-science.com\/en\/articles\/energy-transition-serious-gaming\/\">\n         <div class=\"gito-pub-frontend-post-card-row\">         <div class=\"gito-pub-frontend-post-card-column gito-pub-frontend-post-card-column-image\">\n            <picture>\n               <source media=\"(max-width:640px)\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_423992056_BullRun-640x325.webp\">\n               <source media=\"(min-width:641px)\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_423992056_BullRun-196x180.webp\">\n               <img decoding=\"async\" class=\"gito-pub-frontend-post-card-image\" src=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_423992056_BullRun-196x180.webp\" alt=\"Serious Gaming and the Energy Transition\">\n            <\/picture>\n         
<\/div>\n            <div class=\"gito-pub-frontend-post-card-column\">               <div class=\"ellipsis\" style=\"height:166px !important;overflow:hidden;\" title=\"Serious Gaming and the Energy Transition\">                  <table class=\"gito-pub-frontend-post-card-header\">\n            \t     <tr>\n                        <td>                  \t\t   <h4 class=\"gito-pub-frontend-post-card-title\" style=\"line-height:1.2em;\">Serious Gaming and the Energy Transition<\/h4>\n                        <div class=\"gito-pub-frontend-post-card-subtitle\">Collaborative knowledge generation and interactive understanding of complex interrelationships<\/div>                        <div class=\"gito-pub-frontend-post-card-author\"><a href=\"\/authors\/janine-gondolf\/\">Janine Gondolf<\/a> <a href=\"https:\/\/orcid.org\/0000-0002-5644-8328\" target=\"_blank\" title=\"ORCID eintrag \u00f6ffnen.\" rel=\"noopener\">\n        <img decoding=\"async\" src=\"https:\/\/orcid.org\/assets\/vectors\/orcid.logo.icon.svg\" alt=\"ORCID Icon\" style=\"width:16px;height:16px;vertical-align:middle;\"><\/a>, <a href=\"\/authors\/gert-mehlmann\/\">Gert Mehlmann<\/a>, <a href=\"\/authors\/joern-hartung\/\">J\u00f6rn Hartung<\/a>, <a href=\"\/authors\/bernd-schweinshaut\/\">Bernd Schweinshaut<\/a>, <a href=\"\/authors\/anne-bauer\/\">Anne Bauer<\/a><\/div>\n                        <\/td>\n                     <\/tr>\n                  <\/table>\n                  <div class=\"gito-pub-frontend-post-card-text\">\n                     Conveying the complexity and multifaceted nature of the energy transition to a broad audience is a challenge. This article demonstrates how interactive serious games on a multitouch table can help make connections tangible and comprehensible. The games and the table were used in various conversational contexts. 
These are presented here in three case vignettes based on participant observation of the different applications, as well as situated and shared reflection. The vignettes demonstrate how interaction can trigger epistemic processes, enable shifts in perspective, and foster collective thinking, all of which are necessary for shaping the future of society as a whole.                  <\/div>\n               <\/div>\n               <div class=\"gito-pub-frontend-post-card-scientific\"><strong>Industry 4.0 Science<\/strong> | Volume 42 | 2026 | Edition 2 | Pages 62-69<\/div>            <\/div>\n         <\/div>\n      <\/a>\n   <\/div>\n   <div class=\"gito-pub-frontend-post-card gito-pub-flex-item gito-pub-flex-item-1\">\n      <a href=\"https:\/\/industry-science.com\/en\/articles\/learning-module-sustainable\/\">\n         <div class=\"gito-pub-frontend-post-card-row\">         <div class=\"gito-pub-frontend-post-card-column gito-pub-frontend-post-card-column-image\">\n            <picture>\n               <source media=\"(max-width:640px)\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_289023545_Gorodenkoff-640x325.webp\">\n               <source media=\"(min-width:641px)\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_289023545_Gorodenkoff-196x180.webp\">\n               <img decoding=\"async\" class=\"gito-pub-frontend-post-card-image\" src=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_289023545_Gorodenkoff-196x180.webp\" alt=\"Industrial Transformation via a Machining Learning Factory\">\n            <\/picture>\n         <\/div>\n            <div class=\"gito-pub-frontend-post-card-column\">               <div class=\"ellipsis\" style=\"height:166px !important;overflow:hidden;\" title=\"Industrial Transformation via a Machining Learning Factory\">                  <table class=\"gito-pub-frontend-post-card-header\">\n            \t     <tr>\n                    
    <td>                  \t\t   <h4 class=\"gito-pub-frontend-post-card-title\" style=\"line-height:1.2em;\">Industrial Transformation via a Machining Learning Factory<\/h4>\n                        <div class=\"gito-pub-frontend-post-card-subtitle\">A learning module to foster competencies for a sustainability-driven transformation<\/div>                        <div class=\"gito-pub-frontend-post-card-author\"><a href=\"\/authors\/oskay-ozen\/\">Oskay Ozen<\/a> <a href=\"https:\/\/orcid.org\/0000-0001-5566-6633\" target=\"_blank\" title=\"ORCID eintrag \u00f6ffnen.\" rel=\"noopener\">\n        <img decoding=\"async\" src=\"https:\/\/orcid.org\/assets\/vectors\/orcid.logo.icon.svg\" alt=\"ORCID Icon\" style=\"width:16px;height:16px;vertical-align:middle;\"><\/a>, <a href=\"\/authors\/victoria-breidling\/\">Victoria Breidling<\/a> <a href=\"https:\/\/orcid.org\/0009-0000-0384-4813\" target=\"_blank\" title=\"ORCID eintrag \u00f6ffnen.\" rel=\"noopener\">\n        <img decoding=\"async\" src=\"https:\/\/orcid.org\/assets\/vectors\/orcid.logo.icon.svg\" alt=\"ORCID Icon\" style=\"width:16px;height:16px;vertical-align:middle;\"><\/a>, <a href=\"\/authors\/stefan-seyfried\/\">Stefan Seyfried<\/a> <a href=\"https:\/\/orcid.org\/0000-0001-8278-0212\" target=\"_blank\" title=\"ORCID eintrag \u00f6ffnen.\" rel=\"noopener\">\n        <img decoding=\"async\" src=\"https:\/\/orcid.org\/assets\/vectors\/orcid.logo.icon.svg\" alt=\"ORCID Icon\" style=\"width:16px;height:16px;vertical-align:middle;\"><\/a>, <a href=\"\/authors\/matthias-weigold\/\">Matthias Weigold<\/a> <a href=\"https:\/\/orcid.org\/0000-0002-7820-8544\" target=\"_blank\" title=\"ORCID eintrag \u00f6ffnen.\" rel=\"noopener\">\n        <img decoding=\"async\" src=\"https:\/\/orcid.org\/assets\/vectors\/orcid.logo.icon.svg\" alt=\"ORCID Icon\" style=\"width:16px;height:16px;vertical-align:middle;\"><\/a><\/div>\n                        <\/td>\n                     <\/tr>\n                  <\/table>\n             
     <div class=\"gito-pub-frontend-post-card-text\">\n                     Sustainability-enhancing transformation processes are necessary in all sectors if we are to remain within planetary boundaries. This also applies to the industrial sector as a significant emitter of greenhouse gases. Employees need new competencies to master this complex task of industrial transformation. These range from CO2 equivalents accounting to the development and evaluation of transformation scenarios, including technical measures. The learning module developed here addresses these competency requirements and uses the example of the ETA factory to show how a competency-oriented learning module for industrial transformation can be structured. It essentially comprises four phases: data collection and CO2 equivalents accounting, cause analysis, development of measures and evaluation of measures.                  <\/div>\n               <\/div>\n               <div class=\"gito-pub-frontend-post-card-scientific\"><strong>Industry 4.0 Science<\/strong> | Volume 42 | Edition 2 | Pages 38-47 | DOI <a style=\"font-weight:bold !important;\" href=\"https:\/\/doi.org\/10.30844\/I4SE.26.2.38\" target=\"_blank\" rel=\"noopener\">10.30844\/I4SE.26.2.38<\/a><\/div>            <\/div>\n         <\/div>\n      <\/a>\n   <\/div>\n   <div class=\"gito-pub-frontend-post-card gito-pub-flex-item gito-pub-flex-item-1\">\n      <a href=\"https:\/\/industry-science.com\/en\/articles\/digital-twins-production-logistics\/\">\n         <div class=\"gito-pub-frontend-post-card-row\">         <div class=\"gito-pub-frontend-post-card-column gito-pub-frontend-post-card-column-image\">\n            <picture>\n               <source media=\"(max-width:640px)\" srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_1784362718_Andrey-Popov-640x325.webp\">\n               <source media=\"(min-width:641px)\" 
srcset=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_1784362718_Andrey-Popov-196x180.webp\">\n               <img decoding=\"async\" class=\"gito-pub-frontend-post-card-image\" src=\"https:\/\/industry-science.com\/wp-content\/uploads\/2026\/04\/AdobeStock_1784362718_Andrey-Popov-196x180.webp\" alt=\"Experiencing Digital Twins in Production and Logistics\">\n            <\/picture>\n         <\/div>\n            <div class=\"gito-pub-frontend-post-card-column\">               <div class=\"ellipsis\" style=\"height:166px !important;overflow:hidden;\" title=\"Experiencing Digital Twins in Production and Logistics\">                  <table class=\"gito-pub-frontend-post-card-header\">\n            \t     <tr>\n                        <td>                  \t\t   <h4 class=\"gito-pub-frontend-post-card-title\" style=\"line-height:1.2em;\">Experiencing Digital Twins in Production and Logistics<\/h4>\n                        <div class=\"gito-pub-frontend-post-card-subtitle\">The fischertechnik\u00ae Learning Factory 4.0 as a development platform for possible expansion stages<\/div>                        <div class=\"gito-pub-frontend-post-card-author\"><a href=\"\/authors\/deike-gliem\/\">Deike Gliem<\/a> <a href=\"https:\/\/orcid.org\/0000-0001-8098-334X\" target=\"_blank\" title=\"ORCID eintrag \u00f6ffnen.\" rel=\"noopener\">\n        <img decoding=\"async\" src=\"https:\/\/orcid.org\/assets\/vectors\/orcid.logo.icon.svg\" alt=\"ORCID Icon\" style=\"width:16px;height:16px;vertical-align:middle;\"><\/a>, <a href=\"\/authors\/sigrid-wenzel\/\">Sigrid Wenzel<\/a> <a href=\"https:\/\/orcid.org\/0000-0001-9594-1839\" target=\"_blank\" title=\"ORCID eintrag \u00f6ffnen.\" rel=\"noopener\">\n        <img decoding=\"async\" src=\"https:\/\/orcid.org\/assets\/vectors\/orcid.logo.icon.svg\" alt=\"ORCID Icon\" style=\"width:16px;height:16px;vertical-align:middle;\"><\/a>, <a href=\"\/authors\/jan-schickram\/\">Jan Schickram<\/a>, <a 
<p>Artificial intelligence (AI) can enhance smart assembly by predicting human motion and adapting workplace design. Using probabilistic models such as Gaussian Mixture Models (GMMs), AI systems anticipate operator actions to improve coordination with robots. However, these predictive systems raise ethical concerns related to safety, fairness, and privacy under the EU AI Act, which classifies them as high-risk. This paper presents a conceptual method integrating probabilistic motion modeling with ethical evaluation via Z-Inspection®. An industrial case study using the Smart Work Assistant (SWA) demonstrates how multimodal sensing (motion, gaze) and interpretable models enable anticipatory assistance. The approach moves from ethics evaluation to ethics-informed work design, yielding transferable principles and a configurable assessment matrix that supports compliance-by-design in collaborative assembly.</p>
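The article summary mentions probabilistic models such as Gaussian Mixture Models (GMMs) for anticipating operator actions. As a minimal sketch of that idea (not the SWA system's actual pipeline), the following assumes scikit-learn and hypothetical toy data: a GMM fitted to 2-D hand positions from two assembly zones can both classify which zone a new sample belongs to and, via its log-likelihood, flag motions unlike anything seen during training:

```python
# Minimal sketch: fit a Gaussian Mixture Model (GMM) to synthetic 2-D
# hand positions from two assembly zones, then score new observations.
# Toy data for illustration only -- not the authors' implementation.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two hypothetical "reach zones": picking bin near (0, 0), fixture near (5, 5).
picks = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(200, 2))
fixture = rng.normal(loc=[5.0, 5.0], scale=0.3, size=(200, 2))
X = np.vstack([picks, fixture])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

# predict() assigns a new sample to the most likely component (zone);
# score_samples() returns per-sample log-likelihood, so a motion far
# from both learned zones scores much lower than a familiar one.
near_bin = np.array([[0.1, -0.2]])
outlier = np.array([[2.5, 2.5]])   # between zones: unexpected motion
zone = gmm.predict(near_bin)[0]
is_familiar = gmm.score_samples(near_bin)[0] > gmm.score_samples(outlier)[0]
print(zone, is_familiar)
```

In a workstation setting, a low log-likelihood could trigger a fallback to conservative robot behavior, which is one way the anticipatory assistance described above can remain safe when the model is uncertain.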