AKOOL Research

Welcome to our research hub, where we showcase groundbreaking work in GenAI.

Featured Publications

Latent Space Energy-Based Neural ODEs

This research introduces a novel framework for modeling continuous-time sequential data using Latent Space Energy-Based Neural Ordinary Differential Equations (ODEs). Our approach enhances the modeling of sequential data, offering new insights for machine learning tasks.

Authors:

Sheng Cheng, Deqian Kong, Jianwen Xie, Kookjin Lee, Ying Nian Wu, Yezhou Yang

Journal:

Transactions on Machine Learning Research (TMLR) 2025

Read the Paper

Molecule Design by Latent Prompt Transformer

We propose the Latent Prompt Transformer (LPT), a novel generative model for the challenging problem of molecule design. Experiments demonstrate that LPT not only effectively discovers useful molecules across single-objective, multi-objective, and structure-constrained optimization tasks, but also exhibits strong sample efficiency.

Authors:

Deqian Kong, Yuhao Huang, Jianwen Xie, Edouardo Honig, Ming Xu, Shuanghong Xue, Pei Lin, Sanping Zhou, Sheng Zhong, Nanning Zheng, Ying Nian Wu

Conference:

The 38th Conference on Neural Information Processing Systems (NeurIPS 2024)

Read the Paper

Learning Energy-Based Models by Cooperative Diffusion Recovery Likelihood

This research presents an innovative method for learning energy-based models using cooperative diffusion recovery likelihood. Our approach leverages the strengths of cooperative learning and diffusion processes to improve the training efficiency and effectiveness of energy-based models.

Authors:

Yaxuan Zhu, Jianwen Xie, Ying Nian Wu, Ruiqi Gao

Conference:

The Twelfth International Conference on Learning Representations (ICLR), 2024

Read the Paper

An Energy-Based Prior for Generative Saliency

In this paper, we introduce a novel energy-based prior for generative saliency models. Our approach enhances the interpretability and performance of saliency detection by integrating energy-based techniques, leading to more accurate and reliable results in various applications.

Authors:

Jing Zhang, Jianwen Xie, Nick Barnes, Ping Li

Journal:

IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2023

Read the Paper

Latent Energy-Based Odyssey: Black-Box Optimization via Expanded Exploration in the Energy-Based Latent Space

This paper introduces Latent Energy-Based Odyssey, a novel black-box optimization method based on expanded exploration within an energy-based latent space. By leveraging the latent space's energy landscape, the method improves the efficiency and effectiveness of optimization tasks, making it a significant advancement in machine learning and optimization.

Authors:

Peiyu Yu, Dinghuai Zhang, Hengzhi He, Xiaojian Ma, Ruiyao Miao, Yifan Lu, Yasi Zhang, Deqian Kong, Ruiqi Gao, Jianwen Xie, Guang Cheng, Ying Nian Wu

Read the Paper

CoopHash: Cooperative Learning of Multipurpose Descriptor and Contrastive Pair Generator via Variational MCMC Teaching for Supervised Image Hashing

CoopHash introduces a novel approach to supervised image hashing through cooperative learning. By leveraging Variational Markov Chain Monte Carlo (MCMC) teaching, it simultaneously optimizes a multipurpose descriptor and a contrastive pair generator. This innovative method enhances the efficiency and accuracy of image hashing, making it a significant advancement in the field.

Authors:

Khoa Doan, Jianwen Xie, Yaxuan Zhu, Yang Zhao, Ping Li

Read the Paper

Latent Plan Transformer for Trajectory Abstraction: Planning as Latent Space Inference

In this paper, we introduce the Latent Plan Transformer, a novel framework that treats planning as latent variable inference. Our approach combines the strengths of transformers and latent variable models to achieve robust and efficient planning in complex environments.

Authors:

Deqian Kong, Dehong Xu, Minglu Zhao, Bo Pang, Jianwen Xie, Andrew Lizarraga, Yuhao Huang, Sirui Xie, Ying Nian Wu

Conference:

The 38th Conference on Neural Information Processing Systems (NeurIPS 2024)

Read the Paper

Progressive Energy-Based Cooperative Learning for Multi-Domain Image-to-Image Translation

We propose a progressive energy-based cooperative learning framework for multi-domain image-to-image translation. This method addresses the challenges of domain adaptation and translation by progressively refining the learning process, resulting in superior performance across multiple domains.

Authors:

Weinan Song, Yaxuan Zhu, Lei He, Ying Nian Wu, Jianwen Xie

Archive:

arXiv, 2024

Read the Paper

About Us

Our team consists of renowned researchers from prestigious institutions, working collaboratively to advance the field of artificial intelligence. We focus on developing innovative solutions that address real-world problems and contribute to the broader scientific community.

Stay Connected

Contact Us: For inquiries, collaborations, or more information about our research, please reach out to us at info@akool.com.