MARCHUK INSTITUTE OF NUMERICAL MATHEMATICS
OF THE RUSSIAN ACADEMY OF SCIENCES (INM RAS)

119333, Moscow, Gubkina St., 8.
Tel.: (495) 984-81-20, (495) 989-80-24; fax: (495) 989-80-23; e-mail: director@mail.inm.ras.ru



School on Tensor Methods in Mathematics and Data Sciences

In the fall of 2024 (November 11-20), a school-conference for senior students on tensor methods in mathematics and data sciences will be held on the campus of the joint MSU-PPI University in Shenzhen. The lectures are aimed at undergraduates, graduate students, and young researchers. A mini-conference for young researchers and invited speakers will also take place on November 18-20, within the days of the school. The poster session of young scientists is preliminarily planned for the morning of November 18, and November 18-20 will be dedicated to the invited talks.

The topics of the lectures and workshop presentations include, but are not limited to:

  • novel algorithms and theory for tensor decompositions,
  • advanced optimization techniques,
  • randomized techniques and processing of noisy data,
  • applications of low-rank tensors and optimization in data sciences,
  • applications of tensor and optimization methods in wireless communications.
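As a small taste of the first topic, here is a minimal NumPy sketch of the TT-SVD procedure for computing a tensor-train decomposition. It is an illustration only, not official school material; the function names and the tolerance parameter `eps` are our own choices.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a d-dimensional array into tensor-train (TT) cores
    by sequential truncated SVDs of the unfolding matrices."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r = 1  # current left TT-rank
    rest = tensor.reshape(r * shape[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(rest, full_matrices=False)
        rk = max(1, int(np.sum(s > eps * s[0])))  # drop negligible singular values
        cores.append(U[:, :rk].reshape(r, shape[k], rk))
        # carry the remainder to the next unfolding
        rest = (s[:rk, None] * Vt[:rk]).reshape(rk * shape[k + 1], -1)
        r = rk
    cores.append(rest.reshape(r, shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into the full array."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=([res.ndim - 1], [0]))
    return res.reshape([c.shape[1] for c in cores])
```

With a tight `eps` the reconstruction is exact up to floating-point error; with a looser `eps` the same routine yields compressed low-rank TT approximations, the setting many of the school's lectures address.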

Through the school we hope to increase the students' interest and skills in the covered material and to stimulate new research on the listed topics.

Participation in the school is free of charge. Student accommodation on campus will be provided and covered by the MSU-PPI University in Shenzhen!

Travel grants covering flights may be awarded to young researchers, graduate students, and undergraduates who prepare a research presentation (poster) based on a peer-reviewed publication of theirs.

Lecturers of the school's mini-courses:

  • Yao Xin, Introduction to Evolutionary optimization
  • Eugene Tyrtyshnikov, Kolmogorov Theorem
  • Alexander Beznosikov, Various aspects of efficient optimization: stochasticity, adaptivity, distributedness
  • Sergey Matveev, Basics of tensor train
  • Vladimir Lyashev, Matrix usage in physical object representation
  • Evgeny Atamas, Control of Multiagent Systems

Single-lecture overviews given by:

  • Sergey Kabanikhin, Regularization: from Algebraic Equations to Neural Networks
  • Maxim Shishlenin, Matrix methods and block-Toeplitz matrices in inverse problems of wave tomography
  • Alexander Osinsky, Adaptive cross and maxvol approximation methods
  • Stanislav Morozov

The schedule is available.
The posters are available via the link!
Applications for participation can already be submitted:

  • the preliminary selection deadline is June 15;
  • the deadline has been extended to June 24!

List of invited speakers of our workshop:

  • Ivan Oseledets (AIRI, Skoltech)
  • Eugene Tyrtyshnikov (INM RAS; Lomonosov Moscow State University)
  • Carmine Di Fiore (University of Rome “Tor Vergata”)
  • Chao Wang (City University of Hong Kong)
  • Dario Fasino (University of Udine)
  • Alexander Osinsky (Skoltech; INM RAS)
  • Sergey Dolgov (University of Bath)
  • Michael Kwok-Po Ng (Hong Kong Baptist University)
  • Vladimir Kazeev (Faculty of Mathematics, University of Vienna)
  • Sergey Matveev (Lomonosov Moscow State University)
  • Dmitry Yarotsky (Skolkovo Institute of Science and Technology)
  • Aleksandr Mikhalev (Skolkovo Institute of Science and Technology)
  • Gianluca Ceruti (University of Innsbruck)
  • Tiangang Cui (University of Sydney)
  • Lieven De Lathauwer (KU Leuven)
  • Andrei Krylov (Lomonosov Moscow State University)
  • Maxim Shishlenin (Siberian Branch, RAS)
  • Alexander Beznosikov (MIPT)
  • Sergey Kabanikhin (Siberian Branch, RAS)
  • Alexander Khvatov (ITMO University)
  • Raymond Chan (Lingnan University)
  • Yao Xin (Lingnan University)
  • Yimin Wei (Fudan University)
  • Xiaoman Hu (Huawei)
  • Semyon Dorokhin (MIPT)

Posters and presenters:

  • Low-rank canonical approximations with sequential dimensionality increments (Burtsev Leonid, Taumurzaev Akhmatali)
  • Precoder calculation in MIMO systems using Tucker Decomposition (Blagodarnyi Alexander)
  • Approximation Theory of a Class of Tensor-Train Neural Network (Li Bo)
  • Taking into account the low-rank structure of the matrix of perturbation coefficients in the problem of compensation of nonlinear distortion in fiber optics (Kosolapov Ilya)
  • Diffractive neural networks for image processing (Krasnikov Viktor)
  • Mosaic-skeleton format in method of moments for large-scale scattering problems (Mass Ilya)
  • Shuffling Heuristic in Variational Inequalities: Establishing New Convergence Guarantees (Molodtsov Gleb)
  • Alternating minimization method for low-rank approximation of matrices and tensors in the Chebyshev norm (Morozov Stanislav)
  • Fourier Neural Networks and Kolmogorov-Arnold Neural Networks for MRI Reconstruction (Penkin Maksim)
  • Usage of Tensor Methods in Problems of Searching for Transport Equilibrium: an Overview (Podlipnova Irina)
  • Neural Operators Meet Conjugate Gradients: The FCG-NO Method for Efficient PDE Solving (Rudikov Alexander)
  • Automatic segmentation of epicardial fat and quantification of radiomic parameters in cardiac computed tomography (Samatov Denis)
  • Group control algorithm in an uncertain environment with dynamic obstacles (Sarvarov Gennady)
  • An investigation of the structure of perturbation coefficients for compensation of fiber nonlinear distortions (Sheloput Tatiana)
  • Flow Matching for Solving Inverse Problems (Sherki Daniil)
  • On the quantized TT-ranks of moment sequences (Smirnov Matvey)
  • Nonnegative tensor train for the multicomponent Smoluchowski equation (Tretyak Ilya)
  • Learning from Linear Algebra: A Graph Neural Network Approach to Preconditioner Design for Conjugate Gradient Solvers (Trifonov Vladislav)
  • Low rank structures and efficient solving of the integral equations (Valiakhmetov Bulat)
  • Group and Shuffle: Efficient Structured Orthogonal Parametrization (Yudin Nikolay)

To take part in the competitive selection for the School, you must send a letter to shenzenschool@gmail.com containing:

  • Full name, contact details;
  • Name of the university, faculty, department, course number or year of graduate school;
  • Information about the scientific supervisor (full name, e-mail, contact phone number);
  • A letter of recommendation from the supervisor (strongly desirable);

It is also advisable to attach:

  • For undergraduate students – GPA for examination sessions, topics of coursework and the grades for it; for graduate students – diploma GPA, the topic of the thesis and of the candidate’s dissertation in preparation, the specialty, and grades for the candidate exams in the specialty;
  • Information about publications, participation in conferences, awards, etc.
  • If you wish to give a short talk on the topic of your research – the title of the talk (in English). This is mandatory for obtaining a travel grant.

Organizing contacts:

  • Ph.D. Matveev Sergey Aleksandrovich (Associate Professor at the Faculty of Computational Mathematics and Cybernetics of Moscow State University and researcher at the Marchuk Institute of Numerical Mathematics of the Russian Academy of Sciences)
    e-mail: matseralex@cs.msu.ru

School co-chairs:

  • Tyrtyshnikov Eugene (academician of the Russian Academy of Sciences, director of the Marchuk Institute of Numerical Mathematics of the Russian Academy of Sciences, head of the department of computer science at the Faculty of Computational Mathematics and Cybernetics of Moscow State University)
    e-mail: eugene.tyrtyshnikov@gmail.com
  • Iline Alexander (corresponding member of the Russian Academy of Sciences, professor at the Faculty of Computational Mathematics and Cybernetics of Moscow State University)
    e-mail: iline@cs.msu.su
  • Fomichev Vasily (Doctor of Physical and Mathematical Sciences, Professor at the Faculty of Computational Mathematics and Cybernetics of Moscow State University, Head of the Department of NDSiPU)
    e-mail: fomichev@cs.msu.ru

Aeroflot and the Russian-Chinese MSU-PPI University in Shenzhen have signed a memorandum of cooperation (AEX.RU).

In 2024, the school is held jointly by the Marchuk Institute of Numerical Mathematics of the Russian Academy of Sciences, Moscow State University, the MSU-PPI University in Shenzhen, the Moscow Center of Fundamental and Applied Mathematics, and Huawei.