IBM, NASA Team Up to Build Transformer-Based Language Models for Scientific Community
IBM has teamed up with NASA to build a suite of transformer-based language models designed to give the scientific and academic community access to vast amounts of scientific knowledge and information. The IBM-NASA models were trained on 60 billion tokens drawn from a corpus spanning astrophysics, Earth science, planetary science, heliophysics, and biological and physical sciences data.