About me
This is a page not in the main menu.
Published:
This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.
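For reference, here is a minimal sketch of that setting in a Jekyll _config.yml; only the relevant key is shown, and the rest of the file is assumed:

```yaml
# _config.yml (excerpt)
# With future: false, Jekyll skips posts dated in the future at build time.
future: false
```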
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Short description of portfolio item number 1
Short description of portfolio item number 2
Published in , 1900
Federated learning enables many local devices to jointly train a deep learning model without sharing their local data. Currently, most federated training schemes learn a global model by averaging the parameters of the local models. However, most of these schemes suffer from high communication costs, since they transmit the full local model parameters. Moreover, directly averaging model parameters leads to significant performance degradation due to class-imbalanced, non-iid data on different devices. Real-life federated learning tasks involving extreme classification are affected especially strongly: (1) communication becomes the main bottleneck, since the model size grows proportionally with the number of output classes; and (2) extreme classification tasks (such as user recommendation) normally have extremely imbalanced classes and heterogeneous data across devices. To overcome these problems, we propose federated multiple label hashing (FedMLH), which leverages label hashing to simultaneously reduce the model size (up to a 3.40X decrease) and the communication cost (up to an 18.75X decrease), while achieving significantly better accuracy (up to a 35.5% relative accuracy improvement) and a faster convergence rate (up to a 5.5X increase), essentially for free, on federated extreme classification tasks compared to the federated averaging algorithm.
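For context, the federated averaging baseline mentioned above can be sketched in a few lines. This is a hedged illustration, not the paper's code: the toy least-squares model, the synthetic client data, and the names local_update and fedavg_round are all assumptions. It makes visible the cost FedMLH targets, namely that every client uploads its full parameter vector each round.

```python
# Minimal sketch of federated averaging (FedAvg), the baseline the
# abstract compares against. All names, shapes, and the toy model are
# illustrative assumptions, not the paper's actual implementation.
import numpy as np

def local_update(global_params, local_data, lr=0.1, steps=5):
    """One client's local pass: a few gradient steps on a toy least-squares loss."""
    params = global_params.copy()
    X, y = local_data
    for _ in range(steps):
        grad = X.T @ (X @ params - y) / len(y)  # gradient of 0.5*||Xp - y||^2 / n
        params -= lr * grad
    return params

def fedavg_round(global_params, clients):
    """Server round: clients train locally; the server averages parameters,
    weighted by local dataset size. Note that each client uploads its full
    parameter vector -- the communication cost FedMLH aims to shrink."""
    local_params = [local_update(global_params, data) for data in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    weights = sizes / sizes.sum()
    return sum(w * p for w, p in zip(weights, local_params))

# Toy usage: three clients, each with its own synthetic regression data.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 4)), rng.normal(size=20)) for _ in range(3)]
params = np.zeros(4)
for _ in range(10):
    params = fedavg_round(params, clients)
```

In extreme classification the parameter vector above would be dominated by the output layer, which scales with the number of classes; per the abstract, FedMLH shrinks exactly that payload via label hashing.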
Published in Gut, 67(6):1024–1032, 2018
Recommended citation: Olabisi Oluwabukola Coker, Zhenwei Dai, Yongzhan Nie, et al. Gut, 67(6):1024–1032, 2018.
Published in Microbiome, 6(1):70, 2018
Recommended citation: Zhenwei Dai, Olabisi Oluwabukola Coker, Geicho Nakatsu, et al. Microbiome, 6(1):70, 2018.
Published in Bioinformatics, 35(5):807–814, 2019
Recommended citation: Zhenwei Dai, Sunny H. Wong, Jun Yu, Yingying Wei. Bioinformatics, 35(5):807–814, 2019.
Published in ICML 2019 Deep Phenomena Workshop, 2019
Recommended citation: Zhenwei Dai, Reinhard Heckel. ICML 2019 Deep Phenomena Workshop, 2019.
Published in NeurIPS 2020 (34th Conference on Neural Information Processing Systems), 2020
Recommended citation: Zhenwei Dai, Anshumali Shrivastava. NeurIPS 2020.
Published in ACM SIGMOD 2021, 2021
Recommended citation: Zhenwei Dai, Aditya Desai, Reinhard Heckel, Anshumali Shrivastava. ACM SIGMOD 2021.