i) A learning theory perspective on Erdős-type problems in
combinatorial geometry
Project team: Alex Iosevich, Azita Mayeli, Brian McDonald
and Emmett Wyman
Description: We are going to study the connections
between the Vapnik-Chervonenkis dimension and problems in
combinatorial geometry, such as the Erdős distance problem, the
Szemerédi-Trotter incidence theorem, and related topics. Roughly
speaking, one can create learning tasks and natural families of
classifiers such that when computing the VC-dimension, one
encounters interesting point configuration problems that shed
considerable light on the aforementioned combinatorial
problems.
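As a toy illustration of the kind of computation involved, the VC dimension of a classifier family can be brute-forced by checking which point sets it shatters. The sketch below uses intervals on the line, a deliberately simple stand-in for the geometric families of the project, purely to show the mechanics:

```python
from itertools import combinations

def shatters(points, classifiers):
    """Check whether the family shatters the point set: every one of the
    2^n possible labelings of the points must be realized by some classifier."""
    labelings = {tuple(c(p) for p in points) for c in classifiers}
    return len(labelings) == 2 ** len(points)

def vc_dimension_intervals(domain):
    """Brute-force the VC dimension of closed intervals [a, b] on a finite
    domain, where h(x) = 1 iff a <= x <= b."""
    classifiers = [
        (lambda x, a=a, b=b: int(a <= x <= b))
        for a in domain for b in domain if a <= b
    ] + [lambda x: 0]  # the empty classifier
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(s, classifiers) for s in combinations(domain, k)):
            d = k
        else:
            break
    return d

print(vc_dimension_intervals(range(6)))  # 2: intervals shatter pairs, never triples
```

For three points x < y < z no interval picks out {x, z} without y, which is why the loop stops at 2; the project's point-configuration questions arise when the same shattering computation is run for richer geometric families.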
Participants: Nathanel Grand and Maxwell Sun
ii) Neural networks and universal algebras
Project team: Charlotte Aten and Alex Iosevich
Description: .pdf
Participants: Nicholas Cimaszewski, Michele Martino,
Svetlana Pack, Conor Taliancic, and Andrey Yao
iii) Neural networks with noise
Project team: Alex Iosevich and Steven Senger
Description: .pdf
Participants: Jordan Darefsky, Lucy Lin, George Lyu, Anna
Myakushina, Edmund Sepeku, Maxwell Sun
iv) Neural networks and sales models with economic
indicators
Project team: Alex Iosevich
Description: Many sales models started returning less
than stellar results during the Covid era, in part because the
training data came from before the Covid period. In this project
we are going to take several publicly available data sets
containing sales data and try to come up with the right mix of
economic (and other) indicators that will make predictions as
stable as possible across time, including the Covid period.
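A minimal sketch of the stability issue, on fully synthetic data with an invented regime shift and a hypothetical "economic indicator": a model that sees the indicator transfers across the shift, while a time-only model does not. The data, indicator, and shift point are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly sales with a regime shift at "month 60"; the shift is
# explained by a hypothetical economic indicator (e.g. an activity index).
n = 120
t = np.arange(n)
indicator = np.where(t < 60, 1.0, 0.4) + rng.normal(0, 0.02, n)
sales = 100 * indicator + rng.normal(0, 1.0, n)

# Fit on the first regime only, then evaluate on the second.
train, test = t < 60, t >= 60

def fit_predict(feature):
    """Least-squares fit of sales on one feature (plus intercept)."""
    A = np.c_[feature[train], np.ones(train.sum())]
    coef, *_ = np.linalg.lstsq(A, sales[train], rcond=None)
    return np.c_[feature[test], np.ones(test.sum())] @ coef

err_time_only = np.abs(fit_predict(t.astype(float)) - sales[test]).mean()
err_with_ind = np.abs(fit_predict(indicator) - sales[test]).mean()
print(err_with_ind < err_time_only)  # True: the indicator-aware model transfers
```

The project's question is the realistic version of this: which mix of indicators plays the role of the feature that survives the regime change.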
Participants: Nicholas O'Brien, Haiyan Huang, George Lyu,
Kevin Xue, Kehan Yu, Kaiyuan Zhao, Stella Zhang
v) VC-dimension and neural networks
Project team: Ivan Chio, Alex Iosevich, Azita Mayeli,
Andrew Thomas, and Emmett Wyman
Description: When we go over Chapter 20 (neural
networks), we are going to see that neural networks are
"universal approximators" in that any Lipschitz function can be
approximated by a neural network arbitrarily closely. This is a
fundamental result, but many real-life data sets are not
realistically described by a Lipschitz function because the
Lipschitz condition limits volatility. In this project we are
going to explore universal approximation in the case where
Lipschitz functions are replaced by more complicated (and
hopefully more realistic) classes of functions, such as functions
whose graphs satisfy a suitable fractal dimension condition.
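The universal approximation statement can be made concrete in one dimension: a piecewise-linear interpolant is exactly a one-hidden-layer ReLU network, so approximating a smooth target reduces to choosing enough knots. A minimal numpy sketch, with f(x) = x² as a stand-in target:

```python
import numpy as np

def relu_interpolant(f, knots):
    """Return a one-hidden-layer ReLU network (one unit per knot) whose
    output is the piecewise-linear interpolant of f at the given knots."""
    y = f(knots)
    slopes = np.diff(y) / np.diff(knots)
    # Each hidden unit turns on at a knot and contributes the change in slope.
    weights = np.concatenate([[slopes[0]], np.diff(slopes)])
    bias = y[0]
    def net(x):
        hidden = np.maximum(x[:, None] - knots[:-1][None, :], 0.0)  # ReLU layer
        return bias + hidden @ weights                               # linear output
    return net

f = lambda x: x ** 2
knots = np.linspace(0.0, 1.0, 21)   # 20 hidden units
net = relu_interpolant(f, knots)

x = np.linspace(0.0, 1.0, 1001)
max_err = np.abs(net(x) - f(x)).max()
print(max_err < 1e-3)  # True: interpolation error is at most h^2/8 with h = 0.05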
Participants: Julie Fleischman, Filippo Iulianelli,
Michele Martino, Svetlana Pack, Conor Taliancic, Nate Whybra,
Kaiyuan Zhao
vi) Natural language processing on the social web
Project team: Alex Iosevich, Boris Iskra and Patricia
Medina
Description: The idea of this project is to extract data
from Twitter on certain topics, set up an NLP pipeline, and
process and analyze this social media data using machine
learning techniques such as SVMs, neural networks, and dimension
reduction methods such as PCA and autoencoders. Tools such as
MongoDB will come in handy, and the participants will have the
option of learning how to work on a given database in the cloud.
The project is inspired by the presentations in the MAA-SIAM and
TRIPODS Advanced Workshop in Data Science for Mathematical
Sciences Faculty (ICERM), which used code based on the book
"Mining the Social Web" by Matthew A. Russell. Several sets of code will
be provided that will dictate the different milestones of the
project.
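A minimal sketch of one of the listed ingredients, dimension reduction by PCA, applied to a toy bag-of-words matrix; the documents below are invented stand-ins for scraped tweets, and a real pipeline would use a proper vectorizer:

```python
import numpy as np

# Toy corpus standing in for scraped tweets (invented examples).
docs = [
    "great match great team",
    "great team great win",
    "bad loss bad defense",
    "bad referee bad loss",
]

# Minimal bag-of-words matrix (rows: documents, columns: vocabulary).
vocab = sorted({w for d in docs for w in d.split()})
X = np.array([[d.split().count(w) for w in vocab] for d in docs], float)

# PCA via SVD of the centered matrix; keep the top 2 components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T  # 2-dimensional document embeddings

# The "win" documents and the "loss" documents separate on component 1.
print(Z[0, 0] * Z[1, 0] > 0 and Z[0, 0] * Z[2, 0] < 0)  # True
```

The same centering-then-SVD step scales directly to the large sparse matrices that come out of real Twitter data, which is where MongoDB and cloud storage enter the project.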
Participants: Ivan Chio, Haiyan Huang, Zhiyu Lei, Lucy
Lin, Anna Myakushina, Edmund Sepeku, Siriu Wang
vii) Hype versus performance in the English football league
Project team: Alex Iosevich
Description: We are going to scrape the web for news
stories on players in the English football league and come up
with a "hype metric" by rating how positive the stories are. We
are also going to compile a variety of performance-based
metrics. We will then run a variety of neural network models to
check how well the "hype metric" and performance ratings
correlate.
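A minimal sketch of the "hype metric" idea, using an invented sentiment lexicon, invented headlines, and invented performance ratings; a real pipeline would score scraped stories with a trained sentiment model rather than word counts:

```python
import numpy as np

# Hypothetical sentiment lexicon (a real project would use a trained model).
POSITIVE = {"brilliant", "stunning", "clinical", "masterclass"}
NEGATIVE = {"poor", "woeful", "miss", "flop"}

def hype_score(headlines):
    """Positive minus negative word count, as a fraction of all words."""
    words = [w.lower().strip(".,!") for h in headlines for w in h.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

# Toy data: per-player headlines and a performance rating (all invented).
players = {
    "A": (["Brilliant hat-trick", "Stunning masterclass display"], 8.1),
    "B": (["Clinical finish seals win"], 7.4),
    "C": (["Woeful miss", "Poor positioning again"], 5.9),
}

hype = np.array([hype_score(h) for h, _ in players.values()])
perf = np.array([p for _, p in players.values()])
r = np.corrcoef(hype, perf)[0, 1]
print(r > 0)  # True on this toy data: hype and performance move together
```

The project's neural network models would replace both the lexicon score and the plain correlation, but the pipeline shape (score the text, compare with ratings) is the same.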
Participants: Noah Boonin, Jordan Darefsky, Kevin Xue