MOFTransformer

Note

From version 2.0.0, the default pre-training model has been changed from MOFTransformer to PMTransformer.

MOFTransformer (or PMTransformer) is a Python library that focuses on structure-property relationships in porous materials, including Metal-Organic Frameworks (MOFs), Covalent-Organic Frameworks (COFs), Porous Polymer Networks (PPNs), and zeolites. The multi-modal pre-trained Transformer shows remarkable transfer learning capabilities across various properties of porous materials. With MOFTransformer, there is no need to develop and train machine learning models from scratch to predict different properties for different applications. The library provides tools for pre-training, fine-tuning, and feature importance analysis using attention scores.

Features

  • The library provides a PMTransformer checkpoint (ckpt file) pre-trained on 1.9 million hypothetical porous materials.

  • With fine-tuning, the pre-trained model yields high-performance machine learning models for predicting properties of porous materials.

  • The pre-embeddings (i.e., atom-based graph embeddings and energy-grid embeddings) for the CoRE MOF and QMOF databases are available.

  • Feature importance can be easily visualized from the attention scores of fine-tuned models.
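As a conceptual illustration only (this is not the MOFTransformer API), attention scores from a Transformer head can be normalized into per-token importance weights, e.g., per-atom contributions in an atom-based graph embedding. The function name and toy values below are assumptions for illustration:

```python
import numpy as np

def attention_importance(attn_logits):
    """Convert raw attention logits for one head into normalized
    per-token (e.g., per-atom) importance weights via softmax."""
    attn_logits = np.asarray(attn_logits, dtype=float)
    # subtract the max for numerical stability before exponentiating
    exp = np.exp(attn_logits - attn_logits.max())
    return exp / exp.sum()

# toy example: four atom tokens with made-up attention logits
weights = attention_importance([2.0, 1.0, 0.5, 0.1])
# weights sum to 1; larger logits map to larger importance
```

In the library itself, such weights would come from the fine-tuned model's attention layers rather than hand-written logits; this sketch only shows how raw scores become a normalized importance distribution suitable for visualization.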

atom-based graph embedding (figure: _images/1.gif)

energy-grid embedding (figure: _images/6.gif)

patches of energy-grid embedding (figures: _images/7.gif, _images/8.gif)
