Are attention and convolution all you need for RNA modeling?

We previously featured a Kaggle competition about RNA structure prediction. It finished on December 7, 2023, and the top winners have published summaries of their models.

Most of the top-performing teams combined attention and convolution in their models: the primary RNA sequence is modeled with a transformer encoder (self-attention blocks), while the base pair probability matrix (BPPM), usually precomputed by software such as EternaFold, is processed by a convolutional block.

Although implementation details differ, the top three teams all used a similar architecture: a modified transformer in which the attention map of each self-attention block is computed by combining 1) attention values from the primary sequence embedding and 2) BPPM features from the convolutional block (Figure 1).

This makes sense because RNA function depends heavily on secondary structure, and adding the BPPM features to the attention map helps the model learn the interactions between nucleotides.
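To make the idea concrete, here is a minimal sketch of such a BPPM-biased attention block. This is my own illustration of the shared pattern, not any team's actual code; the layer sizes, the conv stack, and the additive-bias formulation are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BPPMBiasedAttention(nn.Module):
    """Self-attention whose attention logits are biased by conv features
    of the base pair probability matrix (BPPM). A sketch of the shared
    idea behind the top solutions, not any team's exact implementation."""

    def __init__(self, d_model: int, n_heads: int, conv_channels: int = 32):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # Small 2D conv stack over the (L x L) BPPM, with one output
        # channel per attention head so each head gets its own bias.
        self.bppm_conv = nn.Sequential(
            nn.Conv2d(1, conv_channels, kernel_size=3, padding=1),
            nn.GELU(),
            nn.Conv2d(conv_channels, n_heads, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor, bppm: torch.Tensor) -> torch.Tensor:
        # x: (batch, L, d_model); bppm: (batch, L, L), precomputed
        # externally, e.g. by EternaFold.
        B, L, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(B, L, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(B, L, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(B, L, self.n_heads, self.d_head).transpose(1, 2)
        # 1) attention logits from the primary sequence embedding
        logits = (q @ k.transpose(-2, -1)) / self.d_head**0.5  # (B, heads, L, L)
        # 2) additive bias from the convolved BPPM
        bias = self.bppm_conv(bppm.unsqueeze(1))               # (B, heads, L, L)
        attn = F.softmax(logits + bias, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, L, -1)
        return self.out(out)
```

Because the bias is added to the logits before the softmax, base pairs with high probability in the BPPM can directly boost attention between the two positions involved, which is exactly the structural signal the sequence embedding alone would have to learn from scratch.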

Figure 1. The 1st place solution to the Kaggle competition.

Atomic AI, a startup using AI and structural biology to develop RNA-targeting drugs, released a large language model for RNA on December 15, 2023, called ATOM-1. The published manuscript didn't reveal the architecture or training process of ATOM-1, but it appears they also modified the transformer to include structure-related features. (The manuscript mainly focuses on applications of the embeddings from ATOM-1's transformer encoder, which are useful for predicting secondary structure, tertiary structure, and in-solution stability.)
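The manuscript gives no code, but the probing setup it describes is a standard one: freeze the language model, take its embeddings, and train a small head per downstream task. A hypothetical sketch, where the embedding dimension and the encoder interface are assumptions and not Atomic AI's API:

```python
import torch
import torch.nn as nn

class StabilityProbe(nn.Module):
    """Tiny regression head on top of frozen LM embeddings. Hypothetical:
    d_embed and the upstream encoder interface are assumptions, not
    ATOM-1's actual API."""

    def __init__(self, d_embed: int = 768):
        super().__init__()
        self.head = nn.Sequential(nn.LayerNorm(d_embed), nn.Linear(d_embed, 1))

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, L, d_embed) from the frozen encoder
        pooled = embeddings.mean(dim=1)       # average over the sequence
        return self.head(pooled).squeeze(-1)  # one stability score per RNA
```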

Featured News

  • New federal rules demand transparency into AI models used in health decisions

  • Google introduces MedLM, a family of foundation models fine-tuned for healthcare industry use cases

  • Healthcare AI should take a Hippocratic oath, like doctors, to ensure high ethical standards

If you find the newsletter helpful, please consider:

  • 🔊 Sharing the newsletter with other people

  • 👍 Upvoting on Product Hunt

  • 📧 Sending any feedback, suggestions, and questions by directly replying to this email or writing reviews on Product Hunt

  • 🙏 Supporting us with a cup of coffee.

Thanks, and see you next time!
