## Arya Mazumdar
## Updates

- Mar 8, 2022: A Fortune Education article featuring my quotes on the growth of data science.
- Mar 2022: I gave a talk at Rutgers University on learning mixtures.
- Dec 2021: An *I am Data Science* feature on my research.
- Nov 17, 2021: I gave a talk in the inaugural TILOS Seminar on learning mixtures.
- Foundational research in TILOS: UCSD News.
- The National Science Foundation announced our institute as a National AI Institute, where I am a co-PI and co-leader of the Foundations research. For more information, see our website or read the UCSD News story.
- Finished teaching DSC 40B: Theoretical Foundations of Data Science. All lecture notes are online.
- May 10: I gave a talk in the UCSD CS Theory Seminar on learning mixtures.
- Apr 23: I gave a talk in the UPenn Machine Learning Seminar on optimization with communication constraints.
- I am a PC member of RANDOM 2021 and an area chair for NeurIPS 2021. Please submit your best work.
- On April 9, 2021, we are organizing a Workshop on Communication-Efficient Distributed Optimization. Please register (free) to attend and present a poster.
- Jan–Mar 2021: Finished teaching DSC 190: Algorithms for Data Science. All lecture slides are up.
- I moved to UCSD in the 2020–2021 academic year.
- Please apply for HDSI/Simons postdoctoral fellowships here.
- Ph.D. applicants interested in working with me: please apply formally to the CSE department and mention my name in your application.
- I gave an invited talk at the Asilomar Conference on gradient quantization.
- I gave talks on learning mixtures and trace reconstruction at the CCSP Seminar, the Workshop on Inference Problems: Algorithms and Lower Bounds, the TBSI Workshop on Learning Theory, and at SPCOM.
- 2020 EURASIP JASP Best Paper Award for this paper.
- Foundations of Data Science virtual talk series starting Feb 28, 2020. Thanks, NSF.
- Four sessions at ITA on federated learning. See here.
- The TRIPODS Institute for Theoretical Foundations of Data Science now has a website. Multiple postdoc positions are available.
- NSF HDR TRIPODS Award: HDR TRIPODS: Institute for Integrated Data Science: A Transdisciplinary Approach to Understanding Fundamental Trade-offs and Theoretical Foundations.
- A new NSF award, NSF1909046: New Directions in Clustering: Interactive Algorithms and Statistical Models.
- An **oral** presentation at the ICML 2019 Workshop on Security and Privacy in Machine Learning: Private vqSGD: Vector-Quantized Stochastic Gradient Descent.
- A **spotlight** talk at the ICML 2019 Workshop on Coding Theory for Large-Scale Machine Learning: Reliable Distributed Clustering with Redundant Data Assignment.
- For students interested in CMPSCI 690-OP: Optimization in Computer Science (spring 2019): the course webpage is up.
- I gave a talk on the Shannon Channel on Graph Clustering: Variations on the Block Models. Video link.
- I gave an invited talk at Frontiers of Coding Theory at the Information Theory Workshop, Guangzhou, based on this work.
- We organized a session on New Directions in Clustering at the Allerton Conference, where I gave a talk on the Geometric Block Model. Thanks to all the other speakers.
- The course webpage for CS 514: Algorithms for Data Science, fall 2018, is up.
- I am part of the Workshop on Coding and Information Theory at the Center of Mathematical Sciences and Applications at Harvard, Apr 9–13, speaking about LRCs.
- The Dagstuhl Workshop on Coding Theory for Inference, Learning and Optimization that we organized was a great success. Thanks, attendees.
- An **oral** presentation at the SysML Conference, Feb 16, 2018.
- Invited talk at the Institute for Advanced Study (IAS) on Locally Repairable Codes, Storage Capacity and Index Coding, Feb 5, 2018.
- Invited talk at the Bombay Information Theory Seminar (BITS) on Interactive Algorithms for Clustering and Community Detection, Jan 2018.
- A **spotlight** paper, Semisupervised Clustering, AND-Queries and Locally Encodable Source Coding, among three accepted at NIPS 2017.
- Invited talk at the AMS Fall Sectional Meeting on locally repairable codes, Oct 28.
- Invited talks at Google (Oct 11) and IMA (Oct 28) on Clustering with an Oracle.
- We are organizing the CS Theory Seminar in fall. Look for updates on this page.
- A new NSF award, BSF-NSF1618512: Coding and Information-Theoretic Aspects of Local Data Recovery.
- Invited talk at SIAM Discrete Math: Locally Repairable Codes and Index Coding.
- Invited talk at the Shannon Centenary Session of SPCOM, and invited paper Local Partial Clique Covers and Index Coding.
- Elevated to the grade of IEEE Senior Member.
- Machine Learning and Friends Lunch: Neural Auto-associative Memory via Sparse Recovery, Jan 29, 2016.
- New invited paper and talk at the Information Theory Workshop, Jeju Island, Oct 11–15, 2015.
- NSF CyberBridges Workshop, Aug 31–Sep 1, 2015.
- New NSF grants, NSF1642550 and NSF1526763: Ordinal Data Compression.
- Invited talk on "Recent Results in Local Repair" at the Communication Theory Workshop (CTW), May 13, 2015.
- Invited talk on "Associative Memory via Linear Neural Networks" at ITW, Apr 30, 2015.
- Invited talk on "Memoryless Adversary" at Between Shannon and Hamming: Network Information Theory and Combinatorics, Banff Workshop, Mar 2, 2015.
- Invited talk on "Local Repairability" at the Simons Institute Workshop on Coding, Feb 12, 2015.
- Thanks, NSF, for the CAREER award, Reliability in Large-Scale Storage (2015–2020): UMN part, UMass part.
## Bio

I am a tenured associate professor in the Halıcıoğlu Data Science Institute (HDSI) and an affiliated faculty member in the Computer Science and Engineering and Electrical and Computer Engineering departments at UCSD. From 2015 to 2021, I was first an assistant and then an associate professor of Computer Science at UMass, with additional affiliations with the Center for Data Science and the Department of Mathematics and Statistics. During 2019-2020, I was also a senior scientist at Amazon in Berkeley, CA. My research is mainly supported by the NSF via multiple grants, including an NSF CAREER award. Before joining UMass, I was an assistant professor at the University of Minnesota – Twin Cities. And even before that, I was a postdoctoral scholar (Aug 2011 – Dec 2012) in the Signals, Information and Algorithms Lab, led by Greg Wornell, at MIT. I received my PhD in 2011 from the University of Maryland, College Park, under the guidance of Alexander Barg. My thesis, Combinatorial Methods in Coding Theory, won a distinguished dissertation fellowship award at UMD, as well as an ISIT Jack K. Wolf paper award. While a graduate student, I spent the summer of 2010 at IBM Almaden, San Jose, CA, and the summer of 2008 at HP Labs, Palo Alto, CA, as an intern. I currently serve as an Associate Editor of the IEEE Transactions on Information Theory and as an Area Editor of Foundations and Trends in Communications and Information Theory. Back in a magical decade, I was born in the then-small town of Siliguri in Darjeeling, West Bengal, India. I spent the wonder years there and in Jadavpur, Kolkata, until I graduated with a B.E. in Electronics and Telecommunication Engineering.