{"id":32,"date":"2026-02-21T18:29:23","date_gmt":"2026-02-21T10:29:23","guid":{"rendered":"https:\/\/sioklab.com\/?page_id=32"},"modified":"2026-04-16T20:59:43","modified_gmt":"2026-04-16T12:59:43","slug":"news","status":"publish","type":"page","link":"https:\/\/sioklab.com\/index.php\/news\/","title":{"rendered":"News"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-page\" data-elementor-id=\"32\" class=\"elementor elementor-32\" data-elementor-post-type=\"page\">\n\t\t\t\t<div class=\"elementor-element elementor-element-aaa8bec e-flex e-con-boxed e-con e-parent\" data-id=\"aaa8bec\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-b1e02a9 elementor-widget elementor-widget-heading\" data-id=\"b1e02a9\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">News<\/h2>\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-b359441 e-flex e-con-boxed e-con e-parent\" data-id=\"b359441\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-fa2056d e-con-full e-flex e-con e-child\" data-id=\"fa2056d\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t<div class=\"elementor-element elementor-element-3a0b755 e-flex e-con-boxed e-con e-child\" data-id=\"3a0b755\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-d3a8d4c e-flex e-con-boxed e-con e-child\" data-id=\"d3a8d4c\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-79ef5bc elementor-widget elementor-widget-image\" 
data-id=\"79ef5bc\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img fetchpriority=\"high\" decoding=\"async\" width=\"1500\" height=\"1200\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/04\/openday_news-1.jpg\" class=\"attachment-full size-full wp-image-933\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/04\/openday_news-1.jpg 1500w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/04\/openday_news-1-300x240.jpg 300w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/04\/openday_news-1-1024x819.jpg 1024w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/04\/openday_news-1-768x614.jpg 768w\" sizes=\"(max-width: 1500px) 100vw, 1500px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-2d7ea90 e-con-full e-flex e-con e-child\" data-id=\"2d7ea90\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-b57103f elementor-widget elementor-widget-heading\" data-id=\"b57103f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">[04\/2026] NeuroVoice Showcased at PolyU Alumni Day!<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c982d28 elementor-widget elementor-widget-text-editor\" data-id=\"c982d28\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p>We are delighted to share that the SIOK LAB of the Department of Language Sciences and Technology at The Hong Kong Polytechnic University was invited to participate in the PolyU Alumni Day showcase. 
At the event, our laboratory presented <strong>NeuroVoice<\/strong>, a non-invasive intelligent speech brain-computer interface system developed entirely in-house with full proprietary intellectual property.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-1b4bf41 elementor-widget elementor-widget-text-editor\" data-id=\"1b4bf41\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p>NeuroVoice is designed to help restore natural communication for individuals with speech impairments. It integrates <strong>a non-invasive wearable device<\/strong> with <strong>an intelligent analysis platform<\/strong> to monitor language-related brain activity, interpret neural signals, and translate them into speech and text in real time through an AI-powered model.<\/p><p>During the showcase, many volunteers enthusiastically experienced the NeuroVoice system, which received highly positive feedback from scholars, teachers, students, and alumni alike. We are sincerely grateful for the long-standing support from the University, the Faculty, and the Department, as well as for the tireless dedication of our team members, whose hard work made this presentation possible. We would also like to extend our special thanks to the Dean of the Faculty of Humanities, Dean Hu, for the continued attention and support given to our research.<\/p><p>The showcase also brought us many valuable suggestions from users, all of which our team greatly appreciates. 
We will continue refining the system and strive to launch an upgraded version of NeuroVoice in the near future, with the goal of better serving individuals with speech impairments.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-4c1b49e e-con-full e-flex e-con e-child\" data-id=\"4c1b49e\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t<div class=\"elementor-element elementor-element-9376ca6 e-flex e-con-boxed e-con e-child\" data-id=\"9376ca6\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-8b3203b e-flex e-con-boxed e-con e-child\" data-id=\"8b3203b\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-7adf210 elementor-widget elementor-widget-image\" data-id=\"7adf210\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"789\" height=\"1118\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u8bb2\u5ea7202604.png\" class=\"attachment-full size-full wp-image-488\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u8bb2\u5ea7202604.png 789w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u8bb2\u5ea7202604-212x300.png 212w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u8bb2\u5ea7202604-723x1024.png 723w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u8bb2\u5ea7202604-768x1088.png 768w\" sizes=\"(max-width: 789px) 100vw, 789px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-bf5deb6 e-con-full e-flex e-con e-child\" data-id=\"bf5deb6\" data-element_type=\"container\" 
data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-4edf6bb elementor-widget elementor-widget-heading\" data-id=\"4edf6bb\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">[04\/2026] UBSN Research Seminar: Prof. Yang Yang to Give a Talk at PolyU (17 April 2026)!<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cf3a2f7 elementor-widget elementor-widget-text-editor\" data-id=\"cf3a2f7\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p>We are pleased to share that our lab will host <strong>Prof. Yang Yang (Associate Professor, Institute of Psychology, Chinese Academy of Sciences)<\/strong> for a UBSN Research Seminar under the UBSN Capacity Building Scheme: Inbound Scheme at <strong>The Hong Kong Polytechnic University on 17 April 2026 (Friday)<\/strong>.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e3718ab elementor-widget elementor-widget-text-editor\" data-id=\"e3718ab\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p>In this talk, \u201cHow the Brain Learns to Read and Write \u2013 And Why Some Struggle,\u201d Prof. Yang will present recent findings on the developmental and evolutionary mechanisms of Chinese handwriting and reading based on functional and structural MRI studies. He will discuss how handwriting development is accompanied by focal functional specialization, increasing functional lateralization, and dynamic reconfiguration of cognitive, sensorimotor, and visual networks. 
He will also introduce cross-species evidence from humans and macaques that highlights the anatomical similarity and functional evolution of Exner\u2019s area, a shared brain locus involved in both reading and writing.<\/p><p>In the second part of the talk, Prof. Yang will discuss the neural basis of writing deficits and their relationship to reading impairments in developmental dyslexia. He will present findings showing that children with dyslexia exhibit abnormalities in both regional activation and functional connectivity during handwriting, and that reduced activation in the left supplementary motor area and the right precuneus is linked to impairments in both handwriting and reading. He will also introduce a digital handwriting-based training program that significantly improves writing and reading skills, with transfer effects on attention abilities.<\/p><p><strong>\u2022 Date: 17 April 2026 (Friday) <\/strong><br \/><strong>\u2022 Time: 11:00 am\u201312:00 noon <\/strong><br \/><strong>\u2022 Venue: Room PQ303, PolyU <\/strong><br \/><strong>\u2022 Registration: Please register via the QR code on the poster.<\/strong><\/p><p>We warmly welcome students and colleagues to join us!<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-7362bdf e-flex e-con-boxed e-con e-child\" data-id=\"7362bdf\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-76c71c8 e-flex e-con-boxed e-con e-child\" data-id=\"76c71c8\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-2e92b54 e-flex e-con-boxed e-con e-child\" data-id=\"2e92b54\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element 
elementor-element-6ac8a17 elementor-widget elementor-widget-image\" data-id=\"6ac8a17\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"732\" height=\"384\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260307172011_2271_124.png\" class=\"attachment-full size-full wp-image-547\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260307172011_2271_124.png 732w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260307172011_2271_124-300x157.png 300w\" sizes=\"(max-width: 732px) 100vw, 732px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-338b45d elementor-widget elementor-widget-image\" data-id=\"338b45d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"828\" height=\"237\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260308165452_2288_124.png\" class=\"attachment-full size-full wp-image-549\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260308165452_2288_124.png 828w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260308165452_2288_124-300x86.png 300w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260308165452_2288_124-768x220.png 768w\" sizes=\"(max-width: 828px) 100vw, 828px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-48afb8f elementor-widget elementor-widget-image\" data-id=\"48afb8f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"828\" 
height=\"201\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260308165452_2287_124.png\" class=\"attachment-full size-full wp-image-548\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260308165452_2287_124.png 828w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260308165452_2287_124-300x73.png 300w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260308165452_2287_124-768x186.png 768w\" sizes=\"(max-width: 828px) 100vw, 828px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-4eb1a08 e-con-full e-flex e-con e-child\" data-id=\"4eb1a08\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-409b0e8 elementor-widget elementor-widget-heading\" data-id=\"409b0e8\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">[03\/2026] Congratulations to Gong\u2019s paper is accepted by IEEE Sensors Journal!<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-783e535 elementor-widget elementor-widget-text-editor\" data-id=\"783e535\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p>S Gong, Y Li, Z Kang, B Chai, W Zeng, H Yan, Z Zhang, WT Siok, N Wang. LEREL: Lipschitz Continuity-Constrained Emotion Recognition Ensemble Learning For Electroencephalography [J]. 
IEEE Sensors Journal, 2026<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-4075223 elementor-widget elementor-widget-text-editor\" data-id=\"4075223\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p><strong>Abstract:<\/strong><\/p><p>Accurate and efficient recognition of emotional states is critical for human social functioning, and impairments in this ability are associated with significant psychosocial difficulties. While electroencephalography (EEG) offers a powerful tool for objective emotion detection, existing EEG-based Emotion Recognition (EER) methods suffer from three key limitations: (1) insufficient model stability, (2) limited accuracy in processing high-dimensional nonlinear EEG signals, and (3) poor robustness against intra-subject variability and signal noise. To address these challenges, we introduce Lipschitz continuity-constrained Ensemble Learning (LEL), a novel framework that enhances EEG-based emotion recognition by enforcing Lipschitz continuity constraints on Transformer-based attention mechanisms, spectral extraction, and normalization modules. This constraint ensures model stability, reduces sensitivity to signal variability and noise, and improves generalization capability. Additionally, LEL employs a learnable ensemble fusion strategy that optimally combines decisions from multiple heterogeneous classifiers to mitigate single-model bias and variance. Extensive experiments on three public benchmark datasets (EAV, FACED, and SEED) demonstrate superior performance, achieving average recognition accuracies of 74.25%, 81.19%, and 86.79%, respectively. 
The official implementation code is available at\u00a0<a href=\"https:\/\/github.com\/NZWANG\/LEL\">https:\/\/github.com\/NZWANG\/LEL<\/a>.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-e7fb894 e-flex e-con-boxed e-con e-child\" data-id=\"e7fb894\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-54240cf e-flex e-con-boxed e-con e-child\" data-id=\"54240cf\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-dc5a38e e-flex e-con-boxed e-con e-child\" data-id=\"dc5a38e\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-c8b70ac elementor-widget elementor-widget-image\" data-id=\"c8b70ac\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1587\" height=\"2245\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260302105828_2767_194.jpg\" class=\"attachment-full size-full wp-image-331\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260302105828_2767_194.jpg 1587w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260302105828_2767_194-212x300.jpg 212w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260302105828_2767_194-724x1024.jpg 724w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260302105828_2767_194-768x1086.jpg 768w, 
https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260302105828_2767_194-1086x1536.jpg 1086w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20260302105828_2767_194-1448x2048.jpg 1448w\" sizes=\"(max-width: 1587px) 100vw, 1587px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-e2f4d5d e-con-full e-flex e-con e-child\" data-id=\"e2f4d5d\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-a3675b0 elementor-widget elementor-widget-heading\" data-id=\"a3675b0\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">[03\/2026] LST Research Seminar: Dr. Yuanning Li to Give a Talk at PolyU (24 March 2026)!<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d058576 elementor-widget elementor-widget-text-editor\" data-id=\"d058576\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p>We are pleased to share that our lab will host\u00a0<strong>Dr. Yuanning Li<\/strong>\u00a0(Assistant Professor, School of Biomedical Engineering,\u00a0<strong>ShanghaiTech University<\/strong>) for an\u00a0<strong>LST Research Seminar<\/strong>\u00a0at The Hong Kong Polytechnic University on\u00a0<strong>24 March 2026 (Tuesday)<\/strong>.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-03417d4 elementor-widget elementor-widget-text-editor\" data-id=\"03417d4\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t
<p>In this talk,\u00a0<strong>\u201cNeural coding, computational models and brain-computer interfaces for human languages,\u201d<\/strong>\u00a0Dr. Li will introduce recent computational efforts to understand and reconstruct speech perception and production using\u00a0<strong>human intracranial electrophysiology recordings<\/strong>\u00a0and\u00a0<strong>AI models<\/strong>. He will also discuss converging representations between biological speech networks and deep neural network models, and how tailored deep learning models can enable\u00a0<strong>speech brain\u2013computer interfaces<\/strong>\u00a0that synthesize speech directly from intracranial signals.<\/p><ul class=\"wp-block-list\"><li><strong>Date<\/strong>: 24 March 2026 (Tue)<\/li><li><strong>Time<\/strong>: 15:00\u201316:00 (HKT)<\/li><li><strong>Venue<\/strong>:\u00a0<strong>HHB106<\/strong>, Hung Hom Bay Campus, PolyU<\/li><li><strong>Zoom<\/strong>: Meeting ID\u00a0<strong>925 2441 8249<\/strong>\u00a0| Password\u00a0<strong>557055<\/strong>\u00a0(or scan the QR code on the poster)<\/li><\/ul><p>We warmly welcome students and colleagues to join us onsite or online!<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-f1c20b6 e-flex e-con-boxed e-con e-child\" data-id=\"f1c20b6\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-d6b6fbe e-flex e-con-boxed e-con e-child\" data-id=\"d6b6fbe\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div 
class=\"elementor-element elementor-element-8f44ed6 e-flex e-con-boxed e-con e-child\" data-id=\"8f44ed6\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-00254f7 elementor-widget elementor-widget-image\" data-id=\"00254f7\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1038\" height=\"897\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u65b0\u95fb202260224.png\" class=\"attachment-full size-full wp-image-335\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u65b0\u95fb202260224.png 1038w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u65b0\u95fb202260224-300x259.png 300w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u65b0\u95fb202260224-1024x885.png 1024w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u65b0\u95fb202260224-768x664.png 768w\" sizes=\"(max-width: 1038px) 100vw, 1038px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-4c9d78a e-con-full e-flex e-con e-child\" data-id=\"4c9d78a\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-11176b1 elementor-widget elementor-widget-heading\" data-id=\"11176b1\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">[02\/2026] NeuroVoice Featured on Times Higher Education (THE) !<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-207004c elementor-widget elementor-widget-text-editor\" data-id=\"207004c\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p>We are delighted to 
share that NeuroVoice has been featured on Times Higher Education (THE) in a story on \u201cLeading the way in AI and humanities research\u201d.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-56eb96a elementor-widget elementor-widget-text-editor\" data-id=\"56eb96a\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div><p><span style=\"font-size: 16px;\">NeuroVoice is a brain\u2013computer interface application designed to enhance communication for individuals with speech impairments. It integrates\u00a0<\/span><strong style=\"font-size: 16px;\">a wearable device<\/strong><span style=\"font-size: 16px;\">\u00a0that monitors language-related brain regions with\u00a0<\/span><strong style=\"font-size: 16px;\">an analysis platform<\/strong><span style=\"font-size: 16px;\"> for interpretation and real-time visualisation, and can decode neural activity to \u201ctranslate\u201d it into speech and text via an AI-based model.<\/span><\/p><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-604f09f elementor-widget elementor-widget-text-editor\" data-id=\"604f09f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div><p><span style=\"font-size: 16px;\">Read the full feature here: <\/span><a style=\"font-size: 16px; background-color: #ffffff;\" href=\"https:\/\/www.timeshighereducation.com\/content\/hong-kong-polytechnic-university?shpath=\/the-hong-kong-polytechnic-university\/leading-the-way-in-ai-and-humanities-research\" target=\"_blank\" rel=\"noreferrer noopener\"><strong><em>The Hong Kong Polytechnic University | Times Higher Education (THE)<\/em><\/strong><\/a><\/p><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-65301cc e-con-full e-flex e-con 
e-child\" data-id=\"65301cc\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t<div class=\"elementor-element elementor-element-a24a03d e-flex e-con-boxed e-con e-child\" data-id=\"a24a03d\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-f2b7be6 e-flex e-con-boxed e-con e-child\" data-id=\"f2b7be6\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-727a3d8 elementor-widget elementor-widget-image\" data-id=\"727a3d8\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1080\" height=\"1094\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_2026-01-26_153754_344.jpg\" class=\"attachment-full size-full wp-image-709\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_2026-01-26_153754_344.jpg 1080w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_2026-01-26_153754_344-296x300.jpg 296w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_2026-01-26_153754_344-1011x1024.jpg 1011w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_2026-01-26_153754_344-768x778.jpg 768w\" sizes=\"(max-width: 1080px) 100vw, 1080px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-2229b08 e-con-full e-flex e-con e-child\" data-id=\"2229b08\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-cf70e93 elementor-widget elementor-widget-heading\" data-id=\"cf70e93\" data-element_type=\"widget\" data-e-type=\"widget\" 
data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">[01\/2026] Congratulations To Dr. WANG Nizhuan On His Invited Talk At IECBS-IECNS 2026 !<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c85ca3c elementor-widget elementor-widget-text-editor\" data-id=\"c85ca3c\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p>It is delighted to announce that Dr. WANG Nizhuan has been warmly invited by Prof. Woon-Man Kung to deliver an invited talk at The 5th International Electronic Conference on Brain Sciences &amp; 1st International Electronic Conference on Neurosciences (IECBS-IECNS 2026), which will be held online on March 9\u201311, 2026.<\/p><p>During the conference, Dr. WANG Nizhuan will present a comprehensive analysis to experts, scholars, and colleagues worldwide, highlighting the current landscape, key challenges, and future directions of single-channel EEG-based brain-computer interfaces.<br \/>The talk title is: \u201cSingle-Channel EEG-Based Brain-Computer Interfaces: Current Landscape and Future Directions\u201d. 
He looks forward to meeting everyone at IECBS-IECNS 2026.<\/p><p>For more information about the conference and the speaker session, please visit: https:\/\/sciforum.net\/event\/IECBS-IECNS2026?section=#event_speakers<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-54b0b15 e-con-full e-flex e-con e-child\" data-id=\"54b0b15\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t<div class=\"elementor-element elementor-element-47643c6 e-flex e-con-boxed e-con e-child\" data-id=\"47643c6\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-16ca8dd e-flex e-con-boxed e-con e-child\" data-id=\"16ca8dd\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-3435c6d elementor-widget elementor-widget-image\" data-id=\"3435c6d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1178\" height=\"917\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/news20251222.png\" class=\"attachment-full size-full wp-image-720\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/news20251222.png 1178w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/news20251222-300x234.png 300w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/news20251222-1024x797.png 1024w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/news20251222-768x598.png 768w\" sizes=\"(max-width: 1178px) 100vw, 1178px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-ddf1397 e-con-full e-flex e-con e-child\" data-id=\"ddf1397\" data-element_type=\"container\" 
data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-c1764e5 elementor-widget elementor-widget-heading\" data-id=\"c1764e5\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">[12\/2025] Congratulations to Lei\u2019s paper is accepted by Visual Computing for Industry, Biomedicine, and Art!<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e28ff4f elementor-widget elementor-widget-text-editor\" data-id=\"e28ff4f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p>Lei Wang, Weiming Zeng, Kai Long, Hongyu Chen, Rongfeng Lan, Li Liu, Wai Ting Siok, Nizhuan Wang. Advances in Photoacoustic Imaging Reconstruction and Quantitative Analysis for Biomedical Applications [J]. 
Visual Computing for Industry, Biomedicine, and Art, 2025<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-68f420f elementor-widget elementor-widget-text-editor\" data-id=\"68f420f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p><strong>Abstract<\/strong>:<br \/>Photoacoustic imaging (PAI), a modality that combines the high contrast of optical imaging with the deep penetration of ultrasound, is rapidly transitioning from preclinical research to clinical practice. However, its widespread clinical adoption faces challenges such as the inherent trade-off between penetration depth and spatial resolution, along with the demand for faster imaging speeds. This review comprehensively examines the fundamental principles of PAI, focusing on three primary implementations: photoacoustic computed tomography (PACT), photoacoustic microscopy (PAM), and photoacoustic endoscopy (PAE). It critically analyzes their respective advantages and limitations to provide insights into practical applications. The discussion then extends to recent advancements in image reconstruction and artifact suppression, where both conventional and deep learning (DL)-based approaches have been highlighted for their role in enhancing image quality and streamlining workflows. 
Furthermore, this work explores progress in quantitative PAI, particularly its ability to precisely measure hemoglobin concentration, oxygen saturation, and other physiological biomarkers. Finally, this review outlines emerging trends and future directions, underscoring the transformative potential of DL in shaping the clinical evolution of PAI.<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-83aba79 e-con-full e-flex e-con e-child\" data-id=\"83aba79\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t<div class=\"elementor-element elementor-element-f1cfa9d e-flex e-con-boxed e-con e-child\" data-id=\"f1cfa9d\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-ac5f939 e-flex e-con-boxed e-con e-child\" data-id=\"ac5f939\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-05ca4a9 elementor-widget elementor-widget-image\" data-id=\"05ca4a9\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"945\" height=\"1192\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20251216150207_2154_194.png\" class=\"attachment-full size-full wp-image-722\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20251216150207_2154_194.png 945w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20251216150207_2154_194-238x300.png 238w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20251216150207_2154_194-812x1024.png 812w, 
https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/\u5fae\u4fe1\u56fe\u7247_20251216150207_2154_194-768x969.png 768w\" sizes=\"(max-width: 945px) 100vw, 945px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-d7c38f2 e-con-full e-flex e-con e-child\" data-id=\"d7c38f2\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-1676879 elementor-widget elementor-widget-heading\" data-id=\"1676879\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">\n[12\/2025] Congratulations to Dr. WANG Nizhuan on His Election as Senior Associate Editor of Cognitive Neurodynamics<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-778633f elementor-widget elementor-widget-text-editor\" data-id=\"778633f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p>We are delighted to announce that Dr. Wang Nizhuan has been elected as Senior Associate Editor of Cognitive Neurodynamics (CODY), a prestigious hybrid journal published by Springer Nature.<br \/>Founded in 2007, Cognitive Neurodynamics has established itself as a key academic platform in related fields. It currently has an impact factor of 3.9 and is ranked Q2 in the Journal Citation Reports (JCR). 
The journal focuses on cutting-edge research areas including cognitive neuroscience, brain-computer interfaces, and computational neuroscience, providing a vital forum for scholars worldwide to exchange innovative ideas and findings. For more information about the journal and its editorial board, please visit: https:\/\/link.springer.com\/journal\/11571\/editorial-board.<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-deae2ce e-con-full e-flex e-con e-child\" data-id=\"deae2ce\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t<div class=\"elementor-element elementor-element-acacf16 e-flex e-con-boxed e-con e-child\" data-id=\"acacf16\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-24f9fa9 e-flex e-con-boxed e-con e-child\" data-id=\"24f9fa9\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-0c27992 elementor-widget elementor-widget-image\" data-id=\"0c27992\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"687\" height=\"1249\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/news1.png\" class=\"attachment-full size-full wp-image-723\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/news1.png 687w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/news1-165x300.png 165w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/news1-563x1024.png 563w\" sizes=\"(max-width: 687px) 100vw, 687px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-1a4eb25 e-con-full e-flex e-con e-child\" 
data-id=\"1a4eb25\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-cd38017 elementor-widget elementor-widget-heading\" data-id=\"cd38017\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">\n[10\/2025] Congratulations!<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-bffbac9 elementor-widget elementor-widget-text-editor\" data-id=\"bffbac9\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p>Dr. WANG Nizhuan has been invited to deliver a plenary address at the 2025 International Neural Regeneration Symposium (INRS2025), held from October 24-26, 2025. 
His presentation, titled \u201cFrom Neural Mechanisms to Clinical Diagnosis: Decoding Brain Disorders via AI-powered Neuroimaging,\u201d will showcase his pioneering research at the intersection of AI, neuroimaging and brain disorders.<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-0c951cc e-con-full e-flex e-con e-child\" data-id=\"0c951cc\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t<div class=\"elementor-element elementor-element-9065630 e-flex e-con-boxed e-con e-child\" data-id=\"9065630\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-4ab9ee7 e-flex e-con-boxed e-con e-child\" data-id=\"4ab9ee7\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-7f32697 elementor-widget elementor-widget-image\" data-id=\"7f32697\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1086\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/FreqDGT-scaled.jpg\" class=\"attachment-full size-full wp-image-724\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/FreqDGT-scaled.jpg 2560w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/FreqDGT-300x127.jpg 300w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/FreqDGT-1024x434.jpg 1024w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/FreqDGT-768x326.jpg 768w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/FreqDGT-1536x652.jpg 1536w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/FreqDGT-2048x869.jpg 2048w\" sizes=\"(max-width: 2560px) 100vw, 2560px\" 
\/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d827138 elementor-widget elementor-widget-image\" data-id=\"d827138\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"2294\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/vis-scaled.png\" class=\"attachment-full size-full wp-image-727\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/vis-scaled.png 2560w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/vis-300x269.png 300w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/vis-1024x918.png 1024w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/vis-768x688.png 768w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/vis-1536x1376.png 1536w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/vis-2048x1835.png 2048w\" sizes=\"(max-width: 2560px) 100vw, 2560px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-4eaaf7b e-con-full e-flex e-con e-child\" data-id=\"4eaaf7b\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-c28821e elementor-widget elementor-widget-heading\" data-id=\"c28821e\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">[08\/2025] One paper is accepted to MIND2025 (Oral)!<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-6eb2f21 elementor-widget elementor-widget-text-editor\" data-id=\"6eb2f21\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow 
wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p>Yueyang Li, Shengyu Gong, Weiming Zeng, Nizhuan Wang, Wai Ting Siok. FreqDGT: Frequency-Adaptive Dynamic Graph Networks with Transformer for Cross-subject EEG Emotion Recognition. The 2025 International Conference on Machine Intelligence and Nature-InspireD Computing (MIND).<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-2488437 elementor-widget elementor-widget-text-editor\" data-id=\"2488437\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p><strong>Abstract<\/strong>:<br \/>Electroencephalography (EEG) serves as a reliable and objective signal for emotion recognition in affective brain-computer interfaces, offering unique advantages through its high temporal resolution and ability to capture authentic emotional states that cannot be consciously controlled. However, cross-subject generalization remains a fundamental challenge due to individual variability, cognitive traits, and emotional responses. 
We propose FreqDGT, a frequency-adaptive dynamic graph transformer that systematically addresses these limitations through an integrated framework. FreqDGT introduces frequency-adaptive processing (FAP) to dynamically weight emotion-relevant frequency bands based on neuroscientific evidence, employs adaptive dynamic graph learning (ADGL) to learn input-specific brain connectivity patterns, and implements a multi-scale temporal disentanglement network (MTDN) that combines hierarchical temporal transformers with adversarial feature disentanglement to capture temporal dynamics and ensure cross-subject robustness. Comprehensive experiments demonstrate that FreqDGT significantly improves cross-subject emotion recognition accuracy, confirming the effectiveness of integrating frequency-adaptive, spatial-dynamic, and temporal-hierarchical modeling while ensuring robustness to individual differences. The code is available at https:\/\/github.com\/NZWANG\/FreqDGT.<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-03c6c3b e-con-full e-flex e-con e-child\" data-id=\"03c6c3b\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t<div class=\"elementor-element elementor-element-6b9fac4 e-flex e-con-boxed e-con e-child\" data-id=\"6b9fac4\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-1ec4709 e-flex e-con-boxed e-con e-child\" data-id=\"1ec4709\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-bbac9c6 elementor-widget elementor-widget-image\" data-id=\"bbac9c6\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1488\" 
height=\"893\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/StarFormer.png\" class=\"attachment-full size-full wp-image-726\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/StarFormer.png 1488w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/StarFormer-300x180.png 300w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/StarFormer-1024x615.png 1024w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/StarFormer-768x461.png 768w\" sizes=\"(max-width: 1488px) 100vw, 1488px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-0048794 elementor-widget elementor-widget-image\" data-id=\"0048794\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"811\" height=\"381\" src=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/EEgEmotion2.png\" class=\"attachment-full size-full wp-image-725\" alt=\"\" srcset=\"https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/EEgEmotion2.png 811w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/EEgEmotion2-300x141.png 300w, https:\/\/sioklab.com\/wp-content\/uploads\/2026\/03\/EEgEmotion2-768x361.png 768w\" sizes=\"(max-width: 811px) 100vw, 811px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-4055360 e-con-full e-flex e-con e-child\" data-id=\"4055360\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-26c5903 elementor-widget elementor-widget-heading\" data-id=\"26c5903\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<div class=\"elementor-heading-title elementor-size-default\">[07\/2025] Two papers are accepted to Neural Networks!<\/div>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element 
elementor-element-ff7a442 elementor-widget elementor-widget-text-editor\" data-id=\"ff7a442\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p>Wenhao Dong*, Yueyang Li*, Weiming Zeng, Lei Chen, Hongjie Yan, Wai Ting Siok, Nizhuan Wang.\u00a0<a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S0893608025008081\" target=\"_blank\" rel=\"noreferrer noopener\">STARFormer: A Novel Spatio-Temporal Aggregation Reorganization Transformer of FMRI for Brain Disorder Diagnosis<\/a>.\u00a0<em>Neural Networks<\/em>\u00a0(2025): 107927.<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-990be67 elementor-widget elementor-widget-text-editor\" data-id=\"990be67\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p><strong>Abstract<\/strong>:<br \/>Many existing methods that use functional magnetic resonance imaging (fMRI) to classify brain disorders, such 
as autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD), often overlook the integration of spatial and temporal dependencies of the blood oxygen level-dependent (BOLD) signals, \u2026.<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-6dc679d elementor-widget elementor-widget-text-editor\" data-id=\"6dc679d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p>Hongyu Chen, Weiming Zeng, Chengcheng Chen, Luhui Cai, Fei Wang, Yuhu Shi, Lei Wang, Wei Zhang, Yueyang Li, Hongjie Yan, Wai Ting Siok, Nizhuan Wang.\u00a0<a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S0893608025007282\">EEG emotion copilot: Optimizing lightweight LLMs for emotional EEG interpretation with assisted medical record generation<\/a>.\u00a0<em>Neural Networks<\/em>\u00a0(2025): 107848.<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ee10235 elementor-widget elementor-widget-text-editor\" data-id=\"ee10235\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<div class=\"wp-block-group alignwide has-base-background-color has-background is-layout-flow wp-container-core-group-is-layout-7593a3d2 wp-block-group-is-layout-flow\"><div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-87beb0d0 
wp-block-columns-is-layout-flex\"><div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-f5bb311e wp-block-column-is-layout-flow\"><p><strong>Abstract<\/strong>:<br \/>In the fields of affective computing (AC) and brain-computer interface (BCI), the analysis of physiological and behavioral signals to discern individual emotional states has emerged as a critical research frontier. While deep learning-based approaches have made notable strides in EEG emotion recognition, particularly in feature extraction and pattern recognition, significant challenges persist in achieving end-to-end emotion computation, including rapid processing, individual adaptation\u2026.<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>News [04\/2026] Neuro [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"site-sidebar-layout":"no-sidebar","site-content-layout":"","ast-site-content-layout":"full-width-container","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"disabled","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-r
epeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"class_list":["post-32","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/sioklab.com\/index.php\/wp-json\/wp\/v2\/pages\/32","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sioklab.com\/index.php\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/sioklab.com\/index.php\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/sioklab.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/sioklab.com\/index.php\/wp-json\/wp\/v2\/comments?post=32"}],"version-history":[{"count":130,"href":"https:\/\/sioklab.com\/index.php\/wp-json\/wp\/v2\/pages\/32\/revisions"}],"predecessor-version":[{"id":939,"href":"https:\/\/sioklab.com\/index.php\/wp-json\/wp\/v2\/pages\/32\/revisions\/939"}],"wp:attachment":[{"href":"https:\/\/sioklab.com\/index.php\/wp-json\/wp\/v2\/media?parent=32"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}