MLP Optimization #17

@hongxiahao91

Description

Task Description

Ours is a model for atomistic simulation, built on a graph transformer architecture. We currently face three problems:

  1. There is a flaw in the model design: the predicted potential-energy surface is not smooth enough, so structure relaxation of materials often fails to converge.
  2. The attention module is fairly primitive, which gives it a very large memory footprint; training consumes a great deal of GPU memory.
  3. We have not yet tried the many advanced techniques from the broader transformer literature, and do not know whether any of them present good opportunities here.
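On problem 2: the memory cost of naive attention comes from materializing the full N×N score matrix. One standard remedy is chunked attention with a running (online) softmax, the core idea behind FlashAttention-style kernels, which keeps peak memory at O(N × chunk) per head. A minimal NumPy sketch of the idea, independent of the actual model code (all function names here are hypothetical, for illustration only):

```python
import numpy as np

def naive_attention(q, k, v):
    # Materializes the full (N, N) score matrix: O(N^2) memory.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def chunked_attention(q, k, v, chunk=32):
    # Processes keys/values in chunks of `chunk` rows, carrying running
    # softmax statistics (row max and row sum) so that only an
    # (N, chunk) score block is ever materialized.
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(v, dtype=np.float64)
    row_max = np.full(n, -np.inf)   # running max of scores per query row
    row_sum = np.zeros(n)           # running softmax denominator per row
    for s in range(0, k.shape[0], chunk):
        kc, vc = k[s:s + chunk], v[s:s + chunk]
        scores = (q @ kc.T) * scale                 # (n, chunk) block only
        new_max = np.maximum(row_max, scores.max(axis=-1))
        correction = np.exp(row_max - new_max)      # rescale old accumulators
        w = np.exp(scores - new_max[:, None])
        row_sum = row_sum * correction + w.sum(axis=-1)
        out = out * correction[:, None] + w @ vc
        row_max = new_max
    return out / row_sum[:, None]
```

The two functions compute the same result; only the peak memory differs. In a production model this would be handled by a fused kernel (e.g. PyTorch's `scaled_dot_product_attention`) rather than a Python loop, but the recurrence is the same.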

Baseline Repository Link (Must be Public)

TBD

Baseline reproduction (minimal)

TBD

Dataset (Must be Public)

TBD

Results and Evaluation Metrics

Matbench Discovery leaderboard

Preconditions (required)

  • The baseline I provide comes from public code hosted on an open-source platform such as GitHub.
  • The task I use/optimize is based on a public dataset.
  • If my request involves private code/data, I will contact interndiscovery@pjlab.org.cn by email instead.
