
Institute of Information Science, Academia Sinica

Events


Seminar


[DLS2026-5]Conditional expectation and generative AI (Delivered in English)

  • Lecturer: Prof. Eugene Wong (Emeritus Professor, University of California, Berkeley)
    Host: D.T. Lee
  • Time: 2026-07-14 (Tue.) 10:30 ~ 12:00
  • Location: Auditorium N106 at IIS
Abstract

Generative AI deals with the following problem: given a collection of objects of the same type (e.g., images, texts), we want to generate new and interesting examples of that type. The new examples should look like they belong to the initial collection, but not too much like any one of them. A popular approach is the diffusion model: a diffusion equation is used to produce a large set of data samples from the starting examples, and these samples are then used in a machine-learning computation to produce a second diffusion equation, which in turn generates the new examples. In this talk we propose replacing the machine-learning step with a direct estimation of conditional expectation. In particular, we try to identify the various functions that make machine learning so effective and replicate them in the conditional-expectation approach. Our goal is to match or exceed the efficacy of machine learning in generative AI, with much greater efficiency.

In this talk, we present a formulation of the generative AI problem as one of data-driven estimation of conditional expectation and some early evidence of its efficacy and computational efficiency.
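To make the central object concrete: the quantity a diffusion model's learned denoiser approximates is a conditional expectation of the clean data given a noisy observation. The sketch below is a generic nonparametric (Nadaraya-Watson kernel) estimator of such a conditional expectation computed directly from paired samples; it is an illustrative assumption on our part, not the speaker's proposed method, and all names in it are hypothetical.

```python
# Hypothetical sketch: directly estimating E[X0 | Xt = x] from paired
# samples (Xt, X0) with a Nadaraya-Watson kernel estimator, instead of
# training a machine-learning model to approximate it. Generic
# illustration only, not the method presented in the talk.
import numpy as np

def conditional_expectation(x_query, xt_samples, x0_samples, bandwidth=0.5):
    """Estimate E[X0 | Xt = x_query] from paired samples.

    Each clean sample x0 is weighted by a Gaussian kernel on the
    distance between its noisy partner xt and the query point.
    """
    d2 = np.sum((xt_samples - x_query) ** 2, axis=1)  # squared distances to query
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))          # Gaussian kernel weights
    w /= w.sum()                                      # normalize to sum to 1
    return w @ x0_samples                             # weighted average of clean samples

# Toy demo: clean data X0, noisy observations Xt = X0 + noise.
rng = np.random.default_rng(0)
x0 = rng.normal(size=(1000, 2))
xt = x0 + 0.3 * rng.normal(size=(1000, 2))
est = conditional_expectation(np.zeros(2), xt, x0)
```

With enough samples and a well-chosen bandwidth, such an estimate converges to the true conditional mean; the trade-off against a learned model is the cost of scanning the sample set at query time.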

BIO
Eugene Wong is a graduate of Princeton University (B.S.E. 1955, Ph.D. 1959). In a career spanning 70 years, he has served in academia, government, and industry, and in international advisory capacities.
He was on the Berkeley faculty from 1962 to 1994, where he was the Chairman of the Department of Electrical Engineering and Computer Sciences from 1985 to 1989.
He was Associate Director of the White House Office of Science and Technology Policy (1990–1993) and led the Engineering Directorate of the National Science Foundation (1998–2000).
He was a founder of the INGRES Corporation in 1981 and the CEO of Versata Corporation from 2002 to 2004. He served on the Board of NACCO Corporation (2005–2014) and Hyster-Yale Corporation (2014–2023).
He has served in advisory roles for Science and Technology Policy in Hong Kong, Ireland, Switzerland and Taiwan.
In research he is best known for his work in stochastic processes and database management systems (Ingres).
Wong received the ACM Software System Award for his work on Ingres and the IEEE Founders Medal for his lifetime contributions. He is a member of several honorary professional organizations, including Academia Sinica, the U.S. National Academy of Engineering, and the American Academy of Arts and Sciences.