**STEMFORALL 2024 WORKSHOP**

Welcome to the StemForAll2024 summer workshop. All interested
University of Rochester students are welcome to participate; the
registration process is used only to assign students to suitable
projects. The main idea behind the workshop is to share the research
we are doing with undergraduate students in order to familiarize
them with research methods and techniques. Research papers often
result from these discussions, but the main emphasis is on learning
and the creative process.

**Organizers:** Alex Iosevich and Azita Mayeli

**StemForAll2024 Workshop Instructional Team:** Matthew
Dannenberg, Gabe Hart, Alex Iosevich, Steven Kleene, Anuraag Kumar,
Azita Mayeli, Svetlana Pack, Alice Quillen, Nathan Skerrett,
Stephanie Wang, and Nathan Whybra

**Dates:** July 29 - August 9, 2024

**Structure of the workshop:** The workshop will consist of
supervised research projects and a series of lectures designed to
help you gain the necessary background as you work on your projects.
The exact topics of the lecture series will be determined in the
coming weeks; the exact topics for the research projects will be
selected based on your interests and preferences.

This workshop is partly a culmination of undergraduate research
activities that have transpired during the academic year. Once the
workshop is over, the research projects are likely to continue into
the Fall 2024 semester and beyond.

**Registration process:** Please fill out the online registration
form at the following link. Students at all Rochester-area colleges
and universities are welcome to register.

**Dates and locations:**

The program will run from July 29 until August 9 on the 11th floor
of Hylan Building at the University of Rochester.

**Preliminary schedule:**

**Week 1:**

Discrete Fourier Analysis mini-course: 8:00 a.m. - 9:30 a.m.
(location to be announced)

Probability mini-course: 10:00 a.m. - 11:30 a.m. (location to be
announced)

Lunch break: 11:30 a.m. - 2:00 p.m.

Meeting of the research groups: 2:00 p.m. - 4:00 p.m.

Dinner break: 4:00 p.m. - 7:00 p.m.

Python coding groups: 7:00 p.m. - 9:00 p.m.

**Week 2:**

Research groups meeting on their own, with or without instructors:
8:00 a.m. - 9:00 a.m.

Research groups meeting with project supervisors: 9:00 a.m. - 11:00
a.m.

Lunch break: 11:00 a.m. - 1:30 p.m.

Research group meetings with project supervisors: 1:30 p.m. - 3:00
p.m.

Participants working individually and in small groups: 3:00 p.m. -
4:00 p.m.

Dinner break: 4:00 p.m. - 7:00 p.m.

Evening regroup with supervisors: 7:00 p.m. - 8:00 p.m.


**Workshop Projects:**


**1) Exact signal recovery**

**Project supervisors:** Alex Iosevich and Azita Mayeli

**Project description:** Suppose that a signal of length N is
transmitted via its discrete Fourier transform and some of the
signal is lost in transmission due to noise or interference. Under
what conditions is it possible to recover the original signal
exactly? This innocent-looking question quickly leads to some
interesting techniques and ideas from analysis, combinatorics, and
other areas of mathematics. We are going to investigate these types
of questions from both the theoretical and computational points of
view.

**Project participants:** Karam Aldahleh, Karina Gurevich, Josh
Iosevich, Jonathan Jaimangal, Kelvin Nguyen, Aidan Rohrbach, Nate
Shaffer, and Terrence Wong

**Things to learn (or review) before the workshop:** Discrete
Fourier transform on integers modulo N, Fourier inversion,
Plancherel, basic Gauss sums, the Fourier transform of the indicator
function of the circle, the Fourier transform of the indicator
function of a random set, and Python packages that compute the
discrete Fourier transform.
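As a warm-up for the computational side, the Plancherel identity mentioned above can be checked in a few lines of NumPy. This is only an illustrative sketch; the set A and the length N are arbitrary choices, not part of the project:

```python
import numpy as np

N = 64
A = [3, 17, 42]           # a small subset of Z_N (arbitrary example)
f = np.zeros(N)
f[A] = 1.0                # indicator function of A

# Discrete Fourier transform (NumPy's convention: unnormalized forward)
f_hat = np.fft.fft(f)

# Plancherel in this convention: sum |f(x)|^2 = (1/N) sum |f_hat(m)|^2
lhs = np.sum(np.abs(f) ** 2)
rhs = np.sum(np.abs(f_hat) ** 2) / N
print(lhs, rhs)  # both equal |A| = 3, up to rounding
```

Note that NumPy places the 1/N factor on the inverse transform, so the Plancherel identity picks up the 1/N on the frequency side.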

**Reading materials:** i) Notes on the Fourier transform by Laszlo
Babai, ii) the paper by Hart, Iosevich, Koh and Rudnev on geometric
combinatorics in vector spaces over finite fields, and iii) the
paper on signal recovery by Iosevich and Mayeli.

**2) Buffon Needle Problem**

**Project supervisors:** Gabe Hart and Steven Kleene

**Project description:** This project continues from last summer.
The question is: which convex domain K in d-dimensional Euclidean
space, with a boundary of fixed (d-1)-dimensional Hausdorff content,
maximizes the Buffon probability? Here the Buffon probability p(K,r)
is the probability that if one end of a needle of length r lands in
K with uniform probability, then the other end also lands in K. Last
summer, William Hagerstrom, Gabriel Hart, Tran Duy Anh Le, Isaac Li,
and Nathan Skerrett essentially resolved this question in two
dimensions. They proved that given any convex set K in the plane
whose boundary has length 2 pi, there exists a threshold r_0 such
that if r < r_0 and K is not the unit disk D, then p(K,r) < p(D,r).
The purpose of this year's project is to extend this result to
higher dimensions.
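To make the quantity p(K,r) concrete, here is a small Monte Carlo sketch (my own illustration, not part of the project materials) estimating the Buffon probability for the unit disk D:

```python
import numpy as np

rng = np.random.default_rng(0)

def buffon_probability_disk(r, n_trials=100_000):
    """Monte Carlo estimate of p(D, r) for the unit disk D: drop one
    end of a needle of length r uniformly in D, pick a uniformly
    random direction, and check whether the other end lands in D."""
    # Uniform points in the unit disk (sqrt gives the correct radial law)
    theta = rng.uniform(0, 2 * np.pi, n_trials)
    rho = np.sqrt(rng.uniform(0, 1, n_trials))
    x, y = rho * np.cos(theta), rho * np.sin(theta)
    # Random needle direction
    phi = rng.uniform(0, 2 * np.pi, n_trials)
    x2, y2 = x + r * np.cos(phi), y + r * np.sin(phi)
    return np.mean(x2 ** 2 + y2 ** 2 <= 1.0)

print(buffon_probability_disk(0.1))
```

For short needles the estimate is close to 1, and it decreases as r grows; the same sampler adapts to other convex K by replacing the membership test.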

**Project participants: **William Hagerstrom, Gabe Hart, and
Philip Olapade

**Things to learn (or review) before the workshop:** Basic
probability, integral geometry, and convexity-based inequalities;
also study the results on the Buffon Needle Problem produced during
StemForAll2023.

**Reading materials:** i) Integral Geometry and Geometric
Probability, by Luis Santalo (book), and ii) the paper based on
StemForAll2023 Buffon Needle results (coming soon)

**3) Automated theorem proving**

**Project supervisors:** Alex Iosevich, Azita Mayeli, Stephanie
Wang, and Yifan Zhu

**Project description:** The Automated Theorem Proving (ATP)
research group will focus on first understanding, then developing,
computer programs that can prove mathematical theorems
automatically. We plan to tackle the first few chapters of
"Introduction to Univalent Foundations of Mathematics," which
introduces a new way of thinking about mathematics, emphasizing how
objects are the same (or equivalent) rather than just focusing on
their properties. The group's goal is to use computers to navigate
and validate the theories and exercises in these chapters, helping
to advance our understanding of how these foundational concepts can
be applied and verified using technology. Coding will be done in
Python, and use of ChatGPT is welcome and even encouraged, as long
as the programmer understands the code produced.
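To get a feel for what "proving theorems by computer" means at the most elementary level, here is a toy Python sketch (my own illustration, far simpler than univalent foundations): a brute-force tautology checker for propositional logic:

```python
from itertools import product

def is_tautology(formula, n_vars):
    """Brute-force propositional 'theorem prover': verify the formula
    under every truth assignment of its variables."""
    return all(formula(*vals)
               for vals in product([False, True], repeat=n_vars))

implies = lambda a, b: (not a) or b

# Peirce's law ((p -> q) -> p) -> p, a classical tautology
peirce = lambda p, q: implies(implies(implies(p, q), p), p)
print(is_tautology(peirce, 2))  # -> True
```

Real proof assistants replace this exhaustive search with typed proof terms, which is exactly the jump univalent foundations formalizes.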

**Project participants:** Fatimah Almuallim, Jessica Chen, James
Choe, Ioanna Geba, Zhifeng Guo, Xinliang He, Yunhui Li, Aidan
Lieberman, Shouyi Lin, Zachary Tan, Ruzicka Vuckovic, and Stephanie
Wang

**Things to learn (or review) before the workshop:** Python
programming; read up on automated theorem proving.

**Reading materials:** i) Please read the following notes, and ii)
please read the following book.

**4) Kolmogorov complexity and Hausdorff dimension**

**Project supervisors:** Alex Iosevich, Azita Mayeli and Svetlana
Pack

**Project description:** The purpose of this project is to
understand the emerging connections between Kolmogorov complexity
and Hausdorff dimension, with applications to configuration
problems, complexity of graphs, and machine learning.
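Kolmogorov complexity itself is uncomputable, but compression length gives a computable upper-bound proxy. The following small sketch (an illustration, not part of the project materials) contrasts a highly structured string with a random one:

```python
import random
import zlib

def compressed_size(s: bytes) -> int:
    """zlib-compressed length: a crude, computable upper-bound proxy
    for Kolmogorov complexity (which is itself uncomputable)."""
    return len(zlib.compress(s, 9))

structured = b"ab" * 500                  # highly regular: low complexity
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # incompressible

print(compressed_size(structured), compressed_size(noisy))
```

The regular string compresses to a handful of bytes, while the random string barely compresses at all; this gap is the intuition behind compression-based complexity estimates.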

**Project participants: **Noah Ernst, Adarsh Kumar, Charlie Li

**Things to learn (or review) before the workshop:** Read up on
Kolmogorov complexity and fractal dimension.

**Reading materials: **i) A book on Kolmogorov
complexity

**5) Numerical solutions for partial differential equations**

**Project supervisors:** Alex Iosevich and Kunxu Song

**Project description:** The group will use deep learning methods
to investigate solutions of various partial differential equations.
In particular, they will investigate how to solve the
high-dimensional heat equation using a neural additive model. The
group will also work on other SPDE-related problems if time permits.
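For comparison with learned solvers, a classical finite-difference baseline for the 1D heat equation takes only a few lines of NumPy. This is an illustrative sketch with arbitrary grid parameters, not project code:

```python
import numpy as np

# Explicit finite-difference baseline for the 1D heat equation
# u_t = u_xx on [0, 1] with zero boundary values -- a reference
# against which learned (neural-network) solutions can be compared.
nx, nt = 51, 1000
dx = 1.0 / (nx - 1)
dt = 0.4 * dx ** 2           # satisfies the stability bound dt <= dx^2 / 2
x = np.linspace(0, 1, nx)
u = np.sin(np.pi * x)        # initial condition

for _ in range(nt):
    u[1:-1] += dt / dx ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2])

# Exact solution for this initial data: u(x, t) = exp(-pi^2 t) sin(pi x)
t_final = nt * dt
exact = np.exp(-np.pi ** 2 * t_final) * np.sin(np.pi * x)
print(np.max(np.abs(u - exact)))
```

Having a closed-form solution makes this a convenient sanity check before moving to equations where no exact answer is available.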

**Project participants:** Yuning Ren, Kunxu Song, Xiaoya Tan,
Jingwen Xu, Xianquan Yan

**Things to learn (or review) before the workshop:** Basic theory
of partial differential equations, probability, fundamentals of
stochastic analysis, and Python programming.

**Reading materials:** Coming soon

**6) Graph theory and cycle double covers**

**Project supervisors:** Gabe Hart and Alex Iosevich

**Project description:** A cycle double cover of a graph G is a
set of cycles in G such that every edge of G is included in exactly
two of the cycles. The cycle double cover conjecture states that
every bridgeless graph has a cycle double cover. We will investigate
this conjecture and related problems using both theoretical and
computational methods.
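The defining condition is easy to verify computationally. The following sketch (my own helper function, written for illustration) checks that the four triangles of K4 form a cycle double cover:

```python
from collections import Counter
from itertools import combinations

def is_cycle_double_cover(edges, cycles):
    """Check that every edge of the graph appears in exactly two of
    the given cycles (each cycle listed as a vertex sequence)."""
    count = Counter()
    for cycle in cycles:
        for u, v in zip(cycle, cycle[1:] + cycle[:1]):
            count[frozenset((u, v))] += 1
    return (set(count) == {frozenset(e) for e in edges}
            and all(count[frozenset(e)] == 2 for e in edges))

# K4: every edge lies in exactly two of the four triangles
k4_edges = list(combinations(range(4), 2))
triangles = [list(t) for t in combinations(range(4), 3)]
print(is_cycle_double_cover(k4_edges, triangles))  # -> True
```

The same checker can sit inside a search over candidate cycle families for larger bridgeless graphs.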

**Project participants: **Nicholas Arnold, Fardowsa Abdulle,
Gabe Hart, Aidan Rohrbach, and BingKun Ye

**Things to learn (or review) before the workshop:** Basic graph
theory and the definition of a cycle double cover.

**Reading materials:** i) the Wikipedia graph theory article, ii)
the Wikipedia cycle double cover article, and iii) matroid theory


**7) Math education modeling methods**

**Project supervisors:** Alex Iosevich, Azita Mayeli, Anuraag
Kumar, and Stephanie Wang

**Project description:** At present, the United States severely
underperforms in mathematics relative to its global standing,
leading to countless attempts to aid STEM students. Less studied,
however, is pedagogy at the undergraduate level and the factors
that go into attracting and retaining math majors. In this group,
we will explore the variety of factors that contribute to the
undergraduate mathematics experience through a first-generation
lens, including, but not limited to: allocation of university
resources, available mathematical support, community openness, and
perceived mathematical stigma and career trajectory. Students in
this group can expect to do mathematical modeling of demographics
and statistical analysis, as well as learn how to apply quantitative
measurements to subject data (interviews, surveys, etc.). Coding
will be done in Python and/or R, and use of ChatGPT is welcome and
even encouraged, as long as the programmer understands the code
produced.

**Project participants:** Ayse Bicacki, Anuraag Kumar, Stephanie
Wang, Shuyu Zhang, Kangcheng Zhao, and Xinrui Zhao

**Things to learn (or review) before the workshop:** Read up on
challenges faced by first-generation college students.

**Reading materials:** i) Post-secondary
success and first-generation students, ii) Pre-college
experience and effect on first-generation students


**8) Random walks and finite graphs**

**Project supervisors:** Alex Iosevich, Matthew Dannenberg, and
Anuraag Kumar

**Project description:** We are going to study random walks on
graphs using Markov chain methods. The goal is to determine accurate
distributions of hitting times.
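As a simple starting point, expected hitting times satisfy a linear system that NumPy solves directly. The sketch below (an illustrative example, with the classical closed form i(n - i) as a check) treats simple random walk on the n-cycle:

```python
import numpy as np

def hitting_times_cycle(n, target=0):
    """Expected hitting times of `target` for simple random walk on
    the n-cycle, from the linear system
    h(target) = 0,  h(i) = 1 + (h(i-1) + h(i+1)) / 2  otherwise."""
    A = np.zeros((n, n))
    b = np.ones(n)
    for i in range(n):
        A[i, i] = 1.0
        if i == target:
            b[i] = 0.0
        else:
            A[i, (i - 1) % n] = -0.5
            A[i, (i + 1) % n] = -0.5
    return np.linalg.solve(A, b)

h = hitting_times_cycle(10)
print(h)  # matches the classical answer h(i) = i * (10 - i)
```

The same setup generalizes to any finite graph by replacing the 1/2 transition weights with the graph's transition matrix; full hitting-time distributions then come from iterating that matrix.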

**Project participants:** Yujia Ju, Anuraag Kumar, and Yiling Zou

**Things to learn (or review) before the workshop:** Basic
probability and the theory of random walks.

**Reading materials: **Coming soon

**9) Erdos distance problem on manifolds**

**Project supervisors:** Alex Iosevich, Steven Kleene, and Nathan
Skerrett

**Project description:** The classical Erdos distance problem
asks for the smallest possible number of distances determined by n
points in Euclidean space in dimensions two and higher. In this
project we are going to investigate this problem on Riemannian
manifolds. The starting point for this investigation is Nathan
Skerrett's undergraduate honors thesis.
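For computational experiments in the Euclidean case, counting distinct distances is a one-liner. The sketch below (an illustration, not project code) counts them for the integer grid, the classical near-minimizing configuration in the plane:

```python
from itertools import combinations

def distinct_distances(points):
    """Number of distinct pairwise distances; squared distances avoid
    floating point entirely for integer points."""
    return len({(p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                for p, q in combinations(points, 2)})

# The sqrt(n) x sqrt(n) grid determines roughly n / sqrt(log n)
# distinct distances, far fewer than the n^2 pairs of points.
grid = [(i, j) for i in range(4) for j in range(4)]
print(distinct_distances(grid))  # 16 points, 9 distinct distances
```

On a Riemannian manifold the squared-distance formula would be replaced by (numerical) geodesic distance, which is where the project's real work begins.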

**Project participants: **Lily Stolberg, June Terzioglu and Nate
Shaffer

**Things to learn (or review) before the workshop:** Basic
combinatorics, fundamental results on the Erdos distance problem,
Riemannian geometry

**Reading materials:** i) The Erdos Distance Problem by Garibaldi,
Iosevich and Senger, ii) Notes on Differential Geometry by Deserno,
and iii) Nathan Skerrett's honors thesis


**10) Improving numerical techniques for simulating active matter
and pattern formation with moving boundaries**

**Project supervisors:** Alice Quillen and Nathan Skerrett

**Project description:** Active matter and pattern formation can
be described with PDEs. The behavior of the system can be affected
by the boundaries that confine the continuous medium. Our goal is
to develop numerical techniques based on particle-based or finite
element methods for exploring the behavior of confined active media
in 2D. One possibility is to generalize the Immersed Boundary
method so that it can be used for a more diverse set of PDEs than
hydrodynamics.
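As a minimal instance of a grid-based method with confining boundaries, the sketch below (my own illustration, with arbitrary parameters) takes explicit diffusion steps on a 2D domain with no-flux walls; real active-matter models would add reaction and advection terms on top of this:

```python
import numpy as np

# Explicit diffusion on a 2D grid with no-flux (reflecting) walls:
# the simplest model of a confined continuous medium.
n = 32
dx, dt, D = 1.0, 0.2, 1.0     # dt <= dx^2 / (4 D) for stability
u = np.zeros((n, n))
u[n // 2, n // 2] = 1.0        # initial blob of material

def step(u):
    # Reflecting boundaries: pad by duplicating edge values (zero flux)
    p = np.pad(u, 1, mode="edge")
    lap = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
           - 4 * u) / dx ** 2
    return u + dt * D * lap

for _ in range(200):
    u = step(u)
print(u.sum())  # no-flux walls conserve total mass: stays 1.0
```

Checking conserved quantities like total mass is a standard way to validate the boundary treatment before moving to harder PDEs.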

Associated lectures could cover active matter, simulation
techniques for active matter, pattern formation models,
particle-based and grid-based methods for PDEs, and immersed
boundary methods.

**Project participants:** Aaron Iosevich, Roshan Mehta, and Allen
Shao

**Things to learn (or review) before the workshop:** Coming soon

**Reading materials:** Coming soon


**11) Sales modeling with economic indicators**

**Project supervisors:** Alex Iosevich, Azita Mayeli, and Nathan
Whybra

**Project description:** We are going to build and test neural
network models with economic indicator regressors to effectively
predict future retail sales. A variety of neural network models
will be built using TensorFlow, Keras, Facebook Prophet, and other
tools. Theoretical aspects of this problem will be considered as
well.
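Before reaching for neural networks, a linear least-squares fit on synthetic data makes the "economic indicator regressor" idea concrete. Everything below (the indicator, the coefficients, the noise) is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic toy data: monthly sales driven by a trend plus one
# economic indicator (e.g. a consumer-confidence-style index).
months = np.arange(48, dtype=float)
indicator = 100 + 5 * np.sin(months / 6) + rng.normal(0, 0.5, 48)
sales = 200 + 2.0 * months + 3.0 * indicator + rng.normal(0, 5, 48)

# Baseline: least-squares fit of sales on [1, month, indicator];
# the workshop's neural models would replace this linear fit.
X = np.column_stack([np.ones_like(months), months, indicator])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(coef)
```

Comparing a neural model's forecast error against this kind of linear baseline is the standard way to show the extra machinery is earning its keep.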

**Project participants: **Hashem Alomari, Yiqin He, Sylvia Liu,
Xavier Jiang, Shimo Li, Adam Sun, Yicheng Shi, Josih Torres, Joy
Xiang, Binbo Xu, Jonathan Zhang, Yuxuan Zhao, Zhechao Zhao, and
Daiming Zhou

**Things to learn (or review) before the workshop:** Basics of
Python, including NumPy and pandas, and basic usage of TensorFlow
and related packages.

**Reading materials:** i) Python tutorial, and ii) TensorFlow
tutorials

**12) Forecasting medical data using neural networks**

**Project supervisors:** Alex Iosevich, Azita Mayeli, and Svetlana
Pack

**Project description:** We are going to work with large
collections of medical data, including EEG and seizure data, and
look for identifiable patterns using neural network analysis and
more elementary statistical techniques.

**Project participants: **Nadia Lab Hab, Ji Woong Hong, Alex
Novak, Aabha Pandit, Mishnu Pendri, Aidan Lieberman, Yujun Sun,
Kangmin Sung, Yi Wu, and Haotian Yang

**Things to learn (or review) before the workshop:** Basics of
Python, including NumPy and pandas, and basic usage of TensorFlow
and related packages.

**Reading materials:** i) Python tutorial, and ii) TensorFlow
tutorials