Cloud Accelerated Performance Based Seismic Design
Author(s): , ,
Contribution to IABSE Symposium: Engineering the Future, Vancouver, Canada, 21-23 September 2017, published in , pp. 670-676
DOI: 10.2749/vancouver.2017.0670


Bibliographic Information

Author(s): (Arup, Los Angeles, CA)
(Arup, Los Angeles, CA)
(Arup, Los Angeles, CA)
Medium: conference paper
Language(s): English
Conference: IABSE Symposium: Engineering the Future, Vancouver, Canada, 21-23 September 2017
Published in:
Page(s): 670-676
Number of pages (in PDF): 7
Year: 2017
DOI: 10.2749/vancouver.2017.0670
Abstract:

Non-linear Time History Analysis (NLTHA) is a key enabler of Performance Based Seismic Design (PBSD). Arup's Los Angeles office typically performs these simulations in the LS-DYNA solver. In order to respond to the demands of concurrent design projects, the authors have adopted a cloud-centric approach to accelerate their workflows and to enable the use of non-linear time history analysis as a design tool rather than a verification tool. This paper presents a custom workflow which enables a dramatic compression of the time required for these analyses. The workflow generates LS-DYNA models in parametric fashion via Rhino-Grasshopper. Since a single design iteration can result in 48 to 110 models spanning a range of ground motions and input parameters, these models are typically executed on a compute cluster with a large number of compute cores. The resulting analyses generate a large amount of data (8-16 TB), which is post-processed using "Big Data" approaches typically employed in other industries (e.g. financial or retail firms).
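The parametric sweep described in the abstract (each design iteration crossing a suite of ground motions with a set of input parameters to yield 48-110 models) can be sketched as follows. This is a minimal illustration, not the authors' actual tooling: the ground-motion names, the choice of damping ratio as the swept parameter, and the specific values are all assumptions made for the example.

```python
from itertools import product

# Hypothetical inputs: 11 ground-motion records and 4 parameter values.
# The paper only states that one iteration yields 48-110 models; these
# particular records and parameters are illustrative assumptions.
ground_motions = [f"GM{i:02d}" for i in range(1, 12)]
damping_ratios = [0.02, 0.025, 0.03, 0.05]

def build_run_matrix(motions, params):
    """Return one analysis job per (ground motion, parameter) combination."""
    return [{"motion": m, "damping": d} for m, d in product(motions, params)]

jobs = build_run_matrix(ground_motions, damping_ratios)
print(len(jobs))  # 11 motions x 4 parameter values = 44 models
```

Each job dictionary would then be turned into one solver input deck and dispatched to the cluster, which is why the model count grows multiplicatively with the number of records and parameter values.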