
Haiping Huang (黄海平)

Title: Professor

Office: Room 214, Xian Weijian Hall (冼为坚堂)

Degree: Ph.D. in Science

Alma mater: Institute of Theoretical Physics, Chinese Academy of Sciences

Email: huanghp7@mail.sysu.edu.cn

Lab homepage: https://www.labxing.com/hphuang2018

ORCID page: http://orcid.org/0000-0001-8757-4733

Other homepages: https://www.researchgate.net/profile/Haiping_Huang; https://sites.google.com/site/physhuang

"If you don't work on important problems, it's not likely that you will do important work."

If you are about to receive, or have already received, a Ph.D. and want to build a career in the mathematical foundations of artificial (natural) intelligence, please click: Postdoctoral Openings

English monograph "Statistical Mechanics of Neural Networks": please click Amazon (overseas edition); the domestic edition is published by Higher Education Press.

Career:

2022.04–present    Professor, Sun Yat-sen University (exceptional promotion)

2018.03–2022.03    Associate Professor ("Hundred Talents Program"), Sun Yat-sen University

2014.08–2018.03    Research Scientist, RIKEN, Japan

2012.08–2014.08    JSPS Postdoctoral Fellow, Japan Society for the Promotion of Science

2011.08–2012.08    Visiting Scholar, Hong Kong University of Science and Technology

2006.09–2011.07    Ph.D. student, Institute of Theoretical Physics, Chinese Academy of Sciences

2002.09–2006.06    Undergraduate in Physics, School of Science and Engineering, Sun Yat-sen University

Research areas:

Theoretical physics: the statistical physics of neural computation. Specific research directions:

a. Statistical physics of disordered systems: replica theory, the cavity method, and dynamical mean-field theory of nonlinear dynamics;
b. Theoretical and computational models of neural networks: supervised learning in neural networks, mean-field theory of restricted Boltzmann machines, deep unsupervised learning, mean-field theory of recurrent neural networks and its neuroscientific principles, and phase-transition theory of biological neural networks.

I have obtained internationally recognized results on the solution-space structure of perceptron learning models, the dimensionality-reduction theory of deep neural networks, the phase-transition theory of retinal neural networks, and intrinsic symmetry breaking in unsupervised learning. Four representative results: (1) clarification of the solution-space structure of the discrete perceptron, which resolved the long-standing question in the neural-network theory community of the origin of the computational hardness of the discrete perceptron (Phys. Rev. E, 2014); (2) the physical nature of unsupervised learning (Phys. Rev. E Rapid Communication 2015; Phys. Rev. E 2016; JSTAT 2017; J. Phys. A 2018, 2019; PRL 2020); (3) a first-order phase transition revealing the clustered organization of neural codewords, with a theoretical demonstration that the local cluster around the all-silent codeword (all neurons transiently inactive) approaches the theoretical limit (Phys. Rev. E, 2016); (4) discovery of the dimensionality-reduction mechanism of multilayer neural networks (Phys. Rev. E, 2018) and of the ensemble of linear transformations in deep learning (PRL 2020).

Prospective M.Sc./Ph.D. students, postdocs, and full-time researchers are welcome to apply; senior undergraduates are also welcome to join the group.

The research group focuses on theoretical bases of various kinds of neural computation, including associative neural networks, restricted Boltzmann machines, recurrent neural networks, and their deep variants. We are also interested in developing theory-grounded algorithms for real-world applications, and in relating the theoretical study to neural mechanisms. Our long-term goal is to uncover basic principles of machine/brain intelligence using physics-based approximations.

Funded projects:

(1) Start-up funding for Young Academic Leaders of the "Hundred Talents Program", Sun Yat-sen University (2018–2019)

(2) NSFC Young Scientists Fund: Statistical physics of unsupervised learning in neural networks (2019–2021)

(3) National Science Fund for Excellent Young Scholars: Statistical physics of neural networks (2022–2024)

Honors and awards:

2012: JSPS Postdoctoral Fellowship for Foreign Researchers, Japan Society for the Promotion of Science

2017: Distinguished Research Award, RIKEN, Japan

2020: Fu Lan Research Award, Sun Yat-sen University

2021: National Science Fund for Excellent Young Scholars

Professional service:

Reviewer for more than ten international journals, including Physical Review Letters, Nature Communications, Physical Review X, eLife, Scientific Reports, Physical Review E, Journal of Statistical Mechanics: Theory and Experiment, Journal of Physics A: Mathematical and Theoretical, Neural Networks, European Physical Journal B, Physica A, Neurocomputing, PLoS Computational Biology, and Network Neuroscience.

Program Committee member (session chair) of the International Conference on Mathematical and Scientific Machine Learning (MSML22)

Plenary talk at the 6th National Conference on Statistical Physics and Complex Systems

Invited talk at the 15th Asia-Pacific Physics Conference

Invited talk at the computational neuroscience online seminar of Forschungszentrum Jülich, Germany

Author of "Statistical Physics, Disordered Systems and Neural Networks" in Science Magazine (《科学》, founded in 1915; editor-in-chief: academician Chunli Bai); click the title to read the full text.

Selected publications:

*: corresponding author

Our lab has published around 30 papers (a full list is available on the lab homepage or the ORCID page) covering all main statistical physics journals, including PRL, PRE, JPA, EPL, JSTAT, EPJB, JSP, etc.

[25] Y. Zhao, J. Qiu, M. Xie and H. Huang*, Equivalence between belief propagation instability and transition to replica symmetry breaking in perceptron learning systems, Phys. Rev. Research 4, 023023 (2022)

[24] J. Zhou, Z. Jiang, T. Hou, Z. Chen, KYM Wong and H. Huang*, Eigenvalue spectrum of neural networks with arbitrary Hebbian length, Phys. Rev. E 104, 064307 (2021). Side-by-side paper

[23] Z. Jiang, J. Zhou, T. Hou, KYM Wong and H. Huang*, Associative memory model with arbitrary Hebbian length, Phys. Rev. E 104, 064306 (2021). Side-by-side paper

[22] W. Zou and H. Huang*, Data-driven effective model shows a liquid-like deep learning, Phys. Rev. Research 3, 033290 (2021).

[21] J. Zhou and H. Huang*, Weakly correlated synapses promote dimension reduction in deep neural networks, Phys. Rev. E 103, 012315 (2021).

[20] C. Li and H. Huang*, Learning credit assignment, Phys. Rev. Lett. 125, 178301 (2020)

[19] H. Huang, Variational mean-field theory for training restricted Boltzmann machines with binary synapses, Phys. Rev. E 102, 030301(R) (2020) Rapid Communications

[18] T. Hou, and H. Huang*, Statistical physics of unsupervised learning with prior knowledge in neural networks, Phys. Rev. Lett. 124, 248302 (2020). 

[17] T. Hou, KYM Wong, and H. Huang*, Minimal model of permutation symmetry in unsupervised learning, J. Phys. A: Math. Theor. 52, 414001 (2019). Invited paper for the special issue on statistical physics and machine learning.

[16] H. Huang* and A. Goudarzi, Random active path model of deep neural networks with diluted binary synapses, Phys. Rev. E 98, 042311 (2018).

[15] H. Huang, Mechanisms of dimensionality reduction and decorrelation in deep neural networks, Phys. Rev. E 98, 062313 (2018). First theory model of linear dimensionality reduction.

[14] H. Huang, Role of zero synapses in unsupervised feature learning, 2018 J. Phys. A: Math. Theor. 51 08LT01. Published as a LETTER.

[13] H. Huang, Statistical mechanics of unsupervised feature learning in a restricted Boltzmann machine with binary synapses, J. Stat. Mech. (2017) 053302. Recommended in Quora

[12] H. Huang, Theory of population coupling and applications to describe high order correlations in large populations of interacting neurons, J. Stat. Mech. (2017) 033501. 

[11] H. Huang* and T. Toyoizumi, Clustering of neural codewords revealed by a first-order phase transition, Phys. Rev. E 93, 062416 (2016). Selected as one of the most interesting and intriguing arXiv papers from the past week by MIT Technology Review

[10] H. Huang* and T. Toyoizumi, Unsupervised feature learning from finite data by message passing: discontinuous versus continuous phase transition, Phys. Rev. E 94, 062310 (2016).

[9] H. Huang, Effects of hidden nodes on network structure inference, J. Phys. A: Math. Theor. 48 355002 (2015). 

[8] H. Huang* and T. Toyoizumi, Advanced mean field theory of the restricted Boltzmann machine, Phys. Rev. E 91, 050101(R) (2015). Published as a Rapid Communication

[7] H. Huang* and Y. Kabashima, Origin of the computational hardness for learning with binary synapses, Phys. Rev. E 90, 052813 (2014). Solved a long-standing problem: why a binary perceptron is hard to learn.

[6] H. Huang* and Y. Kabashima, Dynamics of asymmetric kinetic Ising systems revisited. J. Stat. Mech.: Theory Exp. P05020 (2014). 

[5] H. Huang*, K. Y. Michael Wong and Y. Kabashima, Entropy landscape of solutions in the binary perceptron problem, J. Phys. A: Math. Theor. 46 375002 (2013). Selected in the Research Highlights section of J. Phys. A. 

[4] H. Huang, Sparse Hopfield network reconstruction with L1 regularization. Eur. Phys. J. B 86, 484 (2013). 

[3] H. Huang*, and Y. Kabashima, Adaptive Thouless-Anderson-Palmer approach to inverse Ising problems with quenched random fields. Phys. Rev. E 87, 062129 (2013). 

[2] H. Huang* and H. Zhou, Counting solutions from finite samplings. Phys. Rev. E 85, 026118 (2012). 

[1] H. Huang* and H. Zhou, Combined local search strategy for learning in networks of binary synapses. Europhysics Letters 96, 58003 (2011).