StructGNN: An efficient graph neural network framework for static structural analysis
Journal
Computers and Structures
Journal Volume
299
ISSN
0045-7949
Date Issued
2024-08-01
Author(s)
DOI
10.1016/j.compstruc.2024.107385
Abstract
In the field of structural analysis prediction via supervised learning, neural networks are widely employed. Recent advances in graph neural networks (GNNs) have expanded their capabilities, enabling the prediction of responses for structures with diverse geometries through graph representations and the GNNs' message-passing mechanism. However, conventional message passing in GNNs does not align with structural properties, resulting in inefficient computation and limited generalization to extrapolated datasets. To address this, a novel structural graph representation that incorporates pseudo nodes acting as rigid diaphragms in each story is proposed, together with an efficient GNN framework called StructGNN. StructGNN employs an adaptive message-passing mechanism tailored to the structure's story count, enabling seamless transmission of input loading features across the structural graph. Extensive experiments validate the effectiveness of this approach, achieving over 99% accuracy in predicting displacements, bending moments, and shear forces. StructGNN also generalizes more strongly than non-GNN models, with an average accuracy of 96% on taller, unseen structures. These results highlight StructGNN's potential as a reliable, computationally efficient tool for static structural response prediction and offer promise for addressing the challenges associated with dynamic seismic loads in structural analysis.
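The abstract describes two ideas: a graph representation augmented with one pseudo node per story that acts as a rigid diaphragm, and a message-passing depth tied to the structure's story count. The sketch below illustrates only those two ideas as stated in the abstract; it is not the authors' implementation. The mean-aggregation update, the toy two-story frame, and all function names are assumptions standing in for a learned GNN layer.

# Minimal sketch (not the authors' code): (1) add a pseudo node per story,
# connected to every node in that story, mimicking a rigid diaphragm, and
# (2) run as many message-passing rounds as the structure has stories.
# Plain NumPy mean aggregation is used as a stand-in for a learned GNN layer.
import numpy as np

def add_story_pseudo_nodes(num_nodes, edges, story_of_node):
    # Append one pseudo node per story, connected to every node in that story.
    stories = sorted(set(story_of_node))
    pseudo_of_story = {s: num_nodes + i for i, s in enumerate(stories)}
    new_edges = list(edges)
    for n, s in enumerate(story_of_node):
        p = pseudo_of_story[s]
        new_edges += [(n, p), (p, n)]
    return num_nodes + len(stories), new_edges

def message_passing(features, edges, num_rounds):
    # Average-neighbour message passing repeated num_rounds times.
    h = features.copy()
    for _ in range(num_rounds):
        agg = np.zeros_like(h)
        deg = np.zeros(len(h))
        for src, dst in edges:
            agg[dst] += h[src]
            deg[dst] += 1
        deg[deg == 0] = 1                      # avoid division by zero
        h = 0.5 * h + 0.5 * agg / deg[:, None]  # simple residual mixing
    return h

# Hypothetical 2-story, single-bay frame: nodes 0-1 at ground, 2-3 story 1, 4-5 story 2.
edges = [(0, 2), (2, 0), (1, 3), (3, 1), (2, 3), (3, 2),
         (2, 4), (4, 2), (3, 5), (5, 3), (4, 5), (5, 4)]
story = [0, 0, 1, 1, 2, 2]
n_nodes, edges = add_story_pseudo_nodes(6, edges, story)
x = np.zeros((n_nodes, 1))
x[4] = x[5] = 1.0                              # lateral load feature at the roof nodes
num_stories = 2
print(message_passing(x, edges, num_rounds=num_stories))

Because each story's nodes share a pseudo node, the loading feature reaches every node in a story within one round, so a round count equal to the number of stories suffices to propagate it through the whole graph.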
Subjects
Deep learning
Graph neural network
Graph representation
Structural analysis
SDGs
Publisher
Elsevier BV
Type
journal article
