The workshop’s third day began with a recap of day 2. Thomas R. Kurfess, planning committee chair, summarized the statistics primer and Panel 4, highlighting several key ideas from those sessions. The first related to data curation: How does one assemble data, ensure they are fresh, and ensure they are valid? Digital integration is a second key idea: How are data from different sources linked, and, once the data are assembled, how are they shared and protected? Standards are particularly important for ensuring that people across space and time will be able to make use of the data. Uncertainty quantification is both critical and challenging. Defect detection is a crucial aspect of additive manufacturing (AM), but defect prediction also has a role to play: Can defects be foreseen and prevented before they occur? Last, integrating the various approaches used in AM is vital to developing a unified picture of what is going on via the creation of a digital thread.
Planning committee member Ralph G. Nuzzo then recapped the metrology overview and Panel 5. He began by saying that a key lesson is that new advances in metrology and multimodal characterization methods are essential, including enhancing dimensional accuracy and precision in both AM and post-processing and improving dimensional stability in use; such innovations will support advances in process, control, and materials design. Furthermore, he said, progress in statistics, data analytics, artificial intelligence (AI), and machine learning (ML) is needed to aid automated procedures for machine calibration and to optimize processing parameters in real time along the toolpath in AM processes. The field is still in the early stages of understanding how best to develop information and knowledge from data to advance AM competencies, he said.
He continued with a number of other lessons that he took from the overview and the panel: metrology, calibration, and measurement validation may be painful, but they are essential enablers of AM. Models supported by data analytics are central to efficient process design and validation methods. Data alone are insufficient; metadata are crucial. Current models miss important attributes of the underlying chemistry and physics. There is also a need for robust methods of calibration and operando characterization that minimize operational overhead costs.