AKOOL Research
A Cutting-Edge Gen AI Platform
Welcome to our research hub, where we showcase groundbreaking work in GenAI.
Featured Publications
An Energy-Based Prior for Generative Saliency
Authors: Jing Zhang, Jianwen Xie, Nick Barnes, Ping Li
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2023
In this paper, we introduce a novel energy-based prior for generative saliency models. Our approach enhances the interpretability and performance of saliency detection by integrating energy-based techniques, leading to more accurate and reliable results in various applications.
Read the Paper
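The core primitive behind an energy-based prior is drawing samples from p(x) ∝ exp(−E(x)). The toy sketch below is not the paper's model: the quadratic `energy`, the step size, and the iteration count are all invented for illustration, with a neural network standing in for `energy` in practice. It shows Langevin dynamics converging toward the low-energy region:

```python
import numpy as np

def energy(x):
    # Toy quadratic energy with its minimum at x = 2.0; a learned prior
    # would replace this with a neural network over saliency maps.
    return 0.5 * (x - 2.0) ** 2

def grad_energy(x):
    return x - 2.0

def langevin_sample(x0, steps=500, step_size=0.01, rng=None):
    """Draw an approximate sample from p(x) ∝ exp(-energy(x)) via Langevin dynamics."""
    rng = np.random.default_rng(rng)
    x = x0
    for _ in range(steps):
        noise = np.sqrt(2.0 * step_size) * rng.standard_normal()
        x = x - step_size * grad_energy(x) + noise
    return x

samples = np.array([langevin_sample(0.0, rng=i) for i in range(200)])
mean_estimate = float(samples.mean())  # hovers near the energy minimum at 2.0
```

For this quadratic energy the stationary distribution is N(2, 1), so the sample mean settles near 2; swapping in a learned energy changes nothing about the sampling loop itself.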
Learning Energy-Based Models by Cooperative Diffusion Recovery Likelihood
Authors: Yaxuan Zhu, Jianwen Xie, Ying Nian Wu, Ruiqi Gao
Conference: The Twelfth International Conference on Learning Representations (ICLR), 2024
This research presents an innovative method for learning energy-based models using cooperative diffusion recovery likelihood. Our approach leverages the strengths of cooperative learning and diffusion processes to improve the training efficiency and effectiveness of energy-based models.
Read the Paper
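The idea of recovery likelihood can be illustrated independently of the paper's architecture. In the toy sketch below, every concrete choice (the unit-Gaussian `energy`, the noise level, the Langevin settings) is ours, not the authors': a clean value is recovered from a noisy observation x̃ by sampling the conditional p(x | x̃) ∝ exp(−E(x) − (x − x̃)²/2σ²), which is much easier to sample than the marginal because it is localized around x̃:

```python
import numpy as np

def energy(x):
    # Stand-in for a learned energy; here it encodes a standard normal prior.
    return 0.5 * x ** 2

def grad_energy(x):
    return x

def recovery_sample(x_noisy, sigma, steps=300, step_size=0.005, rng=None):
    """Langevin sampling from the recovery posterior
    p(x | x_noisy) ∝ exp(-energy(x) - (x - x_noisy)^2 / (2 * sigma^2))."""
    rng = np.random.default_rng(rng)
    x = x_noisy  # initialising at the noisy observation makes recovery easy
    for _ in range(steps):
        grad = grad_energy(x) + (x - x_noisy) / sigma ** 2
        x = x - step_size * grad + np.sqrt(2.0 * step_size) * rng.standard_normal()
    return x

sigma = 0.5
recovered = np.array([recovery_sample(2.0, sigma, rng=i) for i in range(200)])
posterior_mean = float(recovered.mean())  # analytically 2.0 / (1 + sigma**2) = 1.6
```

Because both the toy prior and the noise model are Gaussian, the recovery posterior is known in closed form here, which makes the sampler easy to sanity-check.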
Progressive Energy-Based Cooperative Learning for Multi-Domain Image-to-Image Translation
Authors: Weinan Song, Yaxuan Zhu, Lei He, Ying Nian Wu, Jianwen Xie
Preprint: arXiv, 2024
We propose a progressive energy-based cooperative learning framework for multi-domain image-to-image translation. This method addresses the challenges of domain adaptation and translation by progressively refining the learning process, resulting in superior performance across multiple domains.
Read the Paper
Latent Plan Transformer for Trajectory Abstraction: Planning as Latent Space Inference
Authors: Deqian Kong, Dehong Xu, Minglu Zhao, Bo Pang, Jianwen Xie, Andrew Lizarraga, Yuhao Huang, Sirui Xie, Ying Nian Wu
Conference: The 38th Conference on Neural Information Processing Systems (NeurIPS), 2024
In this paper, we introduce the Latent Plan Transformer, a novel framework that treats planning as latent variable inference. Our approach combines the strengths of transformers and latent variable models to achieve robust and efficient planning in complex environments.
Read the Paper
CoopHash: Cooperative Learning of Multipurpose Descriptor and Contrastive Pair Generator via Variational MCMC Teaching for Supervised Image Hashing
Authors: Khoa Doan, Jianwen Xie, Yaxuan Zhu, Yang Zhao, Ping Li
CoopHash introduces a novel approach to supervised image hashing through cooperative learning. By leveraging Variational Markov Chain Monte Carlo (MCMC) teaching, it simultaneously optimizes a multipurpose descriptor and a contrastive pair generator. This innovative method enhances the efficiency and accuracy of image hashing, making it a significant advancement in the field.
Read the Paper
Latent Energy-Based Odyssey: Black-Box Optimization via Expanded Exploration in the Energy-Based Latent Space
Authors: Peiyu Yu, Dinghuai Zhang, Hengzhi He, Xiaojian Ma, Ruiyao Miao, Yifan Lu, Yasi Zhang, Deqian Kong, Ruiqi Gao, Jianwen Xie, Guang Cheng, Ying Nian Wu
This paper introduces a novel black-box optimization method known as the Latent Energy-Based Odyssey. The approach focuses on expanded exploration within an energy-based latent space, enhancing the optimization process. By leveraging the latent space’s energy landscape, the method improves the efficiency and effectiveness of optimization tasks, making it a significant advancement in the field of machine learning and optimization.
Read the Paper
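The general idea of searching a generator's latent space instead of the raw design space can be sketched with a toy hill-climber. The `decoder`, the `objective`, and every hyper-parameter below are invented for illustration and are unrelated to the paper's actual algorithm; the sketch only shows why latent-space search works when the objective is a black box:

```python
import numpy as np

def decoder(z):
    # Hypothetical frozen generator mapping a 2-D latent code to a design
    # (a scalar here; in practice an image, molecule, or other structure).
    return z[0] ** 2 + np.sin(z[1])

def objective(x):
    # Black-box score: we can query it but not differentiate through it.
    return -(x - 1.0) ** 2  # maximised when the decoded design equals 1.0

def latent_search(n_iters=500, seed=0):
    """Maximise the black-box objective by locally perturbing the latent code."""
    rng = np.random.default_rng(seed)
    z_best = rng.standard_normal(2)
    f_best = objective(decoder(z_best))
    for _ in range(n_iters):
        z_new = z_best + 0.1 * rng.standard_normal(2)  # local latent exploration
        f_new = objective(decoder(z_new))
        if f_new > f_best:  # greedy accept: keep the better latent code
            z_best, f_best = z_new, f_new
    return z_best, f_best

z_star, f_star = latent_search()
```

Replacing this greedy perturbation with a principled sampler over an energy-based latent model is where methods like the one above depart from the toy version.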
Molecule Design by Latent Prompt Transformer
Authors: Deqian Kong, Yuhao Huang, Jianwen Xie, Edouardo Honig, Ming Xu, Shuanghong Xue, Pei Lin, Sanping Zhou, Sheng Zhong, Nanning Zheng, Ying Nian Wu
We propose the Latent Prompt Transformer (LPT), a novel generative model for the challenging problem of molecule design. Experiments demonstrate that LPT not only effectively discovers useful molecules across single-objective, multi-objective, and structure-constrained optimization tasks, but also exhibits strong sample efficiency.
Read the Paper
About Us
Our team consists of renowned researchers from prestigious institutions, working collaboratively to advance the field of artificial intelligence. We focus on developing innovative solutions that address real-world problems and contribute to the broader scientific community.
Stay Connected
Contact Us: For inquiries, collaborations, or more information about our research, please reach out to us at info@akool.com.