Jungtaek Kim's Publications / E-prints / Manuscripts / Technical Reports / Dissertation

(* denotes equal contribution as co-first authors; + denotes co-corresponding authors.)

  1. Mingxuan Li, Jungtaek Kim, and Paul W. Leu (2024),
    "Discovering multi-layer films for electromagnetic interference shielding and passive cooling with multi-objective active learning,"
    NeurIPS Workshop on AI for Accelerated Materials Discovery (AI4Mat-2024),
    Vancouver, British Columbia, Canada, December 14, 2024.
  2. Chaeyun Jang*, Hyungi Lee*, Jungtaek Kim+, and Juho Lee+ (2024),
    "Model fusion through Bayesian optimization in language model fine-tuning,"
    in Advances in Neural Information Processing Systems 37 (NeurIPS-2024),
    Vancouver, British Columbia, Canada, December 10-15, 2024.
    Acceptance rate: 40xx/15671 = 25.8%
    Spotlight Presentation
  3. Seokjun Ahn*, Jungtaek Kim*, Minsu Cho, and Jaesik Park (2024),
    "Budget-aware sequential brick assembly with efficient constraint satisfaction,"
    Transactions on Machine Learning Research,
    2024.
  4. Hyunsoo Chung, Jungtaek Kim, Hyungeun Jo, and Hyungwon Choi (2024),
    "Exploiting preferences in loss functions for sequential recommendation via weak transitivity,"
    in Proceedings of the Thirty-Third ACM International Conference on Information and Knowledge Management (CIKM-2024),
    Boise, Idaho, USA, October 21-25, 2024.
    Acceptance rate: 141/527 = 26.8%
    Short Research Paper Track
  5. Kwang-Sung Jun and Jungtaek Kim (2024),
    "Noise-adaptive confidence sets for linear bandits and application to Bayesian optimization,"
    in Proceedings of the Forty-First International Conference on Machine Learning (ICML-2024),
    Vienna, Austria, July 21-27, 2024.
    Acceptance rate: 2610/9473 = 27.6%
  6. Karinna Martin, Katie Shanks, Yipeng Liu, Jungtaek Kim, Sajad Haghanifar, Mehdi Zarei, Sooraj Sharma, and Paul W. Leu (2024),
    "Minimizing annual reflection loss in fixed-tilt photovoltaic modules using graded refractive index (GRIN) anti-reflective glass,"
    Solar Energy,
    vol. 272, p. 112424, 2024.
  7. Mehdi Zarei, Mingxuan Li, Elizabeth E. Medvedeva, Sooraj Sharma, Jungtaek Kim, Zefan Shao, S. Brett Walker, Melbs LeMieux, Qihan Liu, and Paul W. Leu (2024),
    "Flexible embedded metal meshes by sputter-free crack lithography for transparent electrodes and electromagnetic interference shielding,"
    ACS Applied Materials & Interfaces,
    vol. 16, no. 5, pp. 6382-6393, 2024.
  8. Jungtaek Kim, Jeongbeen Yoon, and Minsu Cho (2024),
    "Generalized neural sorting networks with error-free differentiable swap functions,"
    in Proceedings of the Twelfth International Conference on Learning Representations (ICLR-2024),
    Vienna, Austria, May 7-11, 2024.
    Acceptance rate: 2260/7262 = 31.1%
  9. Jungtaek Kim, Mingxuan Li, Yirong Li, Andrés Gómez, Oliver Hinder, and Paul W. Leu (2024),
    "Multi-BOWS: Multi-fidelity multi-objective Bayesian optimization with warm starts for nanophotonic structure design,"
    Digital Discovery,
    vol. 3, no. 2, pp. 381-391, 2024.
  10. Chaeyun Jang, Jungtaek Kim, Hyungi Lee, and Juho Lee (2023),
    "Model fusion through Bayesian optimization in language model fine-tuning,"
    NeurIPS Workshop on Efficient Natural Language and Speech Processing (ENLSP-2023),
    New Orleans, Louisiana, USA, December 16, 2023.
  11. Hyunsoo Chung and Jungtaek Kim (2023),
    "Leveraging uniformity of normalized embeddings for sequential recommendation,"
    NeurIPS Workshop on Self-Supervised Learning - Theory and Practice (SSL-TP-2023),
    New Orleans, Louisiana, USA, December 16, 2023.
  12. Jungtaek Kim and Seungjin Choi (2023),
    "BayesO: A Bayesian optimization framework in Python,"
    Journal of Open Source Software,
    vol. 8, no. 90, p. 5320, 2023.
  13. Jungtaek Kim, Mingxuan Li, Oliver Hinder, and Paul W. Leu (2023),
    "Datasets and benchmarks for nanophotonic structure and parametric design simulations,"
    in Advances in Neural Information Processing Systems 36 (NeurIPS-2023),
    New Orleans, Louisiana, USA, December 10-16, 2023.
    Acceptance rate: 322/987 = 32.6%
    Datasets and Benchmarks Track
  14. Tackgeun You, Mijeong Kim, Jungtaek Kim, and Bohyung Han (2023),
    "Generative neural fields by mixtures of neural implicit functions,"
    in Advances in Neural Information Processing Systems 36 (NeurIPS-2023),
    New Orleans, Louisiana, USA, December 10-16, 2023.
    Acceptance rate: 3218/12343 = 26.1%
  15. Jungtaek Kim, Seungjin Choi+, and Minsu Cho+ (2022),
    "Combinatorial Bayesian optimization with random mapping functions to convex polytopes,"
    in Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence (UAI-2022),
    Eindhoven, The Netherlands, August 1-5, 2022.
    Acceptance rate: 230/712 = 32.3%
  16. Jinhwi Lee*, Jungtaek Kim*, Hyunsoo Chung, Jaesik Park, and Minsu Cho (2022),
    "Learning to assemble geometric shapes,"
    in Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence (IJCAI-2022),
    Vienna, Austria, July 23-29, 2022.
    Acceptance rate: 681/4535 = 15.0%
  17. Rylee Thompson, Boris Knyazev, Elahe Ghalebi, Jungtaek Kim, and Graham W. Taylor (2022),
    "On evaluation metrics for graph generative models,"
    in Proceedings of the Tenth International Conference on Learning Representations (ICLR-2022),
    Virtual, April 25-29, 2022.
    Acceptance rate: 1095/3391 = 32.3%
  18. Jungtaek Kim and Seungjin Choi (2022),
    "On uncertainty estimation by tree-based surrogate models in sequential model-based optimization,"
    in Proceedings of the Twenty-Fifth International Conference on Artificial Intelligence and Statistics (AISTATS-2022),
    Virtual, March 28-30, 2022.
    Acceptance rate: 492/1685 = 29.2%
  19. Jungtaek Kim (2022),
    "Efficient Bayesian optimization: Algorithms, approximation, and regret analysis,"
    Doctoral Dissertation,
    2022.
  20. Hyunsoo Chung*, Jungtaek Kim*, Boris Knyazev, Jinhwi Lee, Graham W. Taylor, Jaesik Park, and Minsu Cho (2021),
    "Brick-by-Brick: Combinatorial construction with deep reinforcement learning,"
    in Advances in Neural Information Processing Systems 34 (NeurIPS-2021),
    Virtual, December 6-14, 2021.
    Acceptance rate: 2344/9122 = 25.7%
  21. Jungtaek Kim, Michael McCourt, Tackgeun You, Saehoon Kim, and Seungjin Choi (2021),
    "Bayesian optimization with approximate set kernels,"
    Machine Learning,
    vol. 110, no. 5, pp. 857-879, 2021.
    Acceptance rate: 20/107 = 18.7%
    Part of the Special Issue for the ECML-PKDD-2021 Journal Track
  22. Jungtaek Kim, Hyunsoo Chung, Jinhwi Lee, Minsu Cho, and Jaesik Park (2020),
    "Combinatorial 3D shape generation via sequential assembly,"
    NeurIPS Workshop on Machine Learning for Engineering Modeling, Simulation, and Design (ML4Eng-2020),
    Virtual, December 12, 2020.
  23. Jinhwi Lee*, Jungtaek Kim*, Hyunsoo Chung, Jaesik Park, and Minsu Cho (2020),
    "Fragment relation networks for geometric shape assembly,"
    NeurIPS Workshop on Learning Meets Combinatorial Algorithms (LMCA-2020),
    Virtual, December 12, 2020.
  24. Juho Lee*, Yoonho Lee*, Jungtaek Kim, Eunho Yang, Sung Ju Hwang, and Yee Whye Teh (2020),
    "Bootstrapping neural processes,"
    in Advances in Neural Information Processing Systems 33 (NeurIPS-2020),
    Virtual, December 6-12, 2020.
    Acceptance rate: 1900/9454 = 20.1%
  25. Jungtaek Kim and Seungjin Choi (2020),
    "On local optimizers of acquisition functions in Bayesian optimization,"
    in Proceedings of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD-2020),
    Virtual, September 14-18, 2020.
    Acceptance rate: 131/687 = 19.1%
  26. Jungtaek Kim (2020),
    "Benchmark functions for Bayesian optimization,"
    Notes on Bayesian Optimization,
    July 30, 2020.
  27. Jungtaek Kim, Michael McCourt, Tackgeun You, Saehoon Kim, and Seungjin Choi (2019),
    "Bayesian optimization over sets,"
    ICML Workshop on Automated Machine Learning (AutoML-2019),
    Long Beach, California, USA, June 14, 2019.
  28. Juho Lee, Yoonho Lee, Jungtaek Kim, Adam R. Kosiorek, Seungjin Choi, and Yee Whye Teh (2019),
    "Set Transformer: A framework for attention-based permutation-invariant neural networks,"
    in Proceedings of the Thirty-Sixth International Conference on Machine Learning (ICML-2019),
    Long Beach, California, USA, June 9-15, 2019.
    Acceptance rate: 773/3424 = 22.6%
  29. Jungtaek Kim and Seungjin Choi (2019),
    "Practical Bayesian optimization with threshold-guided marginal likelihood maximization,"
    arXiv e-prints, arXiv:1905.07540, May 18, 2019.
  30. Minseop Park, Jungtaek Kim, Saehoon Kim, Yanbin Liu, and Seungjin Choi (2019),
    "MxML: Mixture of meta-learners for few-shot classification,"
    arXiv e-prints, arXiv:1904.05658, April 11, 2019.
  31. Minseop Park, Saehoon Kim, Jungtaek Kim, Yanbin Liu, and Seungjin Choi (2018),
    "TAEML: Task-adaptive ensemble of meta-learners,"
    NeurIPS Workshop on Meta-Learning (MetaLearn-2018),
    Montreal, Quebec, Canada, December 8, 2018.
  32. Jungtaek Kim and Seungjin Choi (2018),
    "Automated machine learning for soft voting in an ensemble of tree-based classifiers,"
    ICML Workshop on Automatic Machine Learning (AutoML-2018),
    Stockholm, Sweden, July 14, 2018.
  33. Inhyuk Jo, Jungtaek Kim, Hyohyeong Kang, Yong-Deok Kim, and Seungjin Choi (2018),
    "Open set recognition by regularizing classifier with fake data generated by generative adversarial networks,"
    in Proceedings of the Forty-Third IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-2018),
    Calgary, Alberta, Canada, April 15-20, 2018.
    Acceptance rate: 1406/2829 = 49.7%
  34. Jungtaek Kim and Seungjin Choi (2018),
    "Clustering-guided GP-UCB for Bayesian optimization,"
    in Proceedings of the Forty-Third IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-2018),
    Calgary, Alberta, Canada, April 15-20, 2018.
    Acceptance rate: 1406/2829 = 49.7%
  35. Saehoon Kim, Jungtaek Kim, and Seungjin Choi (2018),
    "On the optimal bit complexity of circulant binary embedding,"
    in Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-2018),
    New Orleans, Louisiana, USA, February 2-7, 2018.
    Acceptance rate: 933/3800 = 24.6%
  36. Jungtaek Kim, Saehoon Kim, and Seungjin Choi (2017),
    "Learning to transfer initializations for Bayesian hyperparameter optimization,"
    NeurIPS Workshop on Bayesian Optimization (BayesOpt-2017),
    Long Beach, California, USA, December 9, 2017.
  37. Jungtaek Kim, Saehoon Kim, and Seungjin Choi (2017),
    "Learning to warm-start Bayesian hyperparameter optimization,"
    arXiv e-prints, arXiv:1710.06219, October 17, 2017.
  38. Jungtaek Kim, Jongheon Jeong, and Seungjin Choi (2016),
    "AutoML Challenge: AutoML framework using random space partitioning optimizer,"
    ICML Workshop on Automatic Machine Learning (AutoML-2016),
    New York, New York, USA, June 24, 2016.
    Special Track, 3-page paper