Ning Liu

Applied Scientist at Amazon, Bellevue, WA.

I am a Machine Learning Scientist with seven years of experience developing and deploying machine learning solutions at scale. My work addresses challenges in LLM post-training, reasoning, and agentic systems, as well as next-generation design in complex physics-based scenarios. My research focuses on LLMs and attention-based foundation models, generative models, meta-learning, disentangled/causal representation learning, and neural operator learning.

I earned my Ph.D. from the University of Michigan, Ann Arbor, and actively collaborate with researchers across academia. Prior to joining Amazon, I was a Principal Machine Learning Scientist at Global Engineering & Materials, Inc. I am passionate about advancing the field of machine learning, and my work has been published in top-tier venues such as ICML, NeurIPS, AISTATS, and Nature Computational Science.

Hit me up if you are interested in discussing ideas or exploring potential collaborations!

news

May 01, 2025 Our paper Neural Interpretable PDEs: Harmonizing Fourier Insights with Attention for Scalable and Interpretable Physics Discovery has been accepted to ICML 2025.
Feb 10, 2025 Happy to share that our paper Harnessing large language models for data-scarce learning of polymer properties has been published in Nature Computational Science.
Oct 03, 2024 The preprint of our paper Disentangled Representation Learning for Parametric Partial Differential Equations is available on arXiv.
Sep 25, 2024 Our paper Nonlocal Attention Operator: Materializing Hidden Knowledge Towards Interpretable Physics Discovery has been accepted to NeurIPS 2024 as a spotlight paper!
May 01, 2024 Our paper Harnessing the power of neural operators with automatically encoded conservation laws has been accepted to ICML 2024 as a spotlight paper!

selected publications

  1. ICML
    Neural Interpretable PDEs: Harmonizing Fourier Insights with Attention for Scalable and Interpretable Physics Discovery
    Ning Liu and Yue Yu
    In Forty-second International Conference on Machine Learning, 2025
  2. NeurIPS
    Nonlocal Attention Operator: Materializing Hidden Knowledge Towards Interpretable Physics Discovery
    Yue Yu, Ning Liu, Fei Lu, Tian Gao, Siavash Jafarzadeh, and Stewart Silling
    In Advances in Neural Information Processing Systems, 2024
  3. Nature Comput. Sci.
    Harnessing Large Language Models for Data-Scarce Learning of Polymer Properties
    Ning Liu, Siavash Jafarzadeh, Brian Y Lattimer, Shuna Ni, Jim Lua, and Yue Yu
    Nature Computational Science, 2025
  4. ICML
    Harnessing the Power of Neural Operators with Automatically Encoded Conservation Laws
    Ning Liu, Yiming Fan, Xianyi Zeng, Milan Klöwer, Lu Zhang, and Yue Yu
    In Forty-first International Conference on Machine Learning, 2024
  5. NeurIPS
    Domain Agnostic Fourier Neural Operators
    Ning Liu, Siavash Jafarzadeh, and Yue Yu
    In Advances in Neural Information Processing Systems, 2023
  6. AISTATS
    INO: Invariant Neural Operators for Learning Complex Physical Systems with Momentum Conservation
    Ning Liu, Yue Yu, Huaiqian You, and Neeraj Tatikola
    In International Conference on Artificial Intelligence and Statistics, 2023