Sajad Hashemi

Research Intern

University of Toronto

Location: Canada - Mississauga
Education: Master's degree, Computer Science and Machine Learning
Experience: 3 years, 0 months

Work Experience

Total years of experience: 3 years, 0 months

Research Intern at University of Toronto
  • Canada - Toronto
  • September 2022 to Present

● Crafted synthetic material datasets to experiment with generative models such as Variational Autoencoders (VAEs), Generative Adversarial Networks (GANs), and latent diffusion models.
● Trained a VAE to reconstruct textures, generating never-before-seen materials while compressing the data representation to 2% of its original size.
● Single-handedly designed and implemented a novel loss function based on the Fast Fourier Transform (FFT) that produces texture reconstructions preserving the spatial statistics of the original texture without requiring a pixel-wise match in the data space (see the sketch after this role's tool list).
● Using the FFT loss function, improved training speed by a factor of 2 and reduced GPU memory requirements by a factor of 4.
● Tools used: Python, PyTorch, FFT, deep generative models, CNNs, VAE, transfer learning, latent diffusion models, generative adversarial networks, graph neural networks, attention mechanism.
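
Below is a minimal sketch of what an FFT-based spectral loss of this kind could look like in PyTorch. The function name, tensor shapes, and the choice of comparing amplitude spectra with an MSE are illustrative assumptions, not the original implementation.

import torch
import torch.nn.functional as F

def fft_spectral_loss(reconstruction: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Compare the amplitude spectra of reconstructed and original textures.

    Matching magnitudes in the Fourier domain penalises differences in
    spatial statistics without forcing a pixel-wise match, so a
    reconstruction with the same texture statistics scores well even if
    it is not the same image in the data space.
    """
    # 2-D FFT over the spatial dimensions; inputs are (N, C, H, W).
    rec_spectrum = torch.fft.rfft2(reconstruction, norm="ortho")
    tgt_spectrum = torch.fft.rfft2(target, norm="ortho")
    # Compare magnitude spectra only; phase is ignored so shifted but
    # statistically identical textures are not penalised.
    return F.mse_loss(rec_spectrum.abs(), tgt_spectrum.abs())

# Usage: add the spectral term to the usual VAE reconstruction + KL objective.
x = torch.rand(4, 3, 64, 64)                 # batch of original textures
x_hat = x + 0.05 * torch.randn_like(x)       # stand-in for VAE reconstructions
print(fft_spectral_loss(x_hat, x).item())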

Machine Learning Intern at Cognitive Systems Corp
  • Canada
  • May 2022 to May 2023

● Developed a semi-supervised machine learning algorithm in Python, combining integer programming and a deep autoencoder, that predicts the number of people in a house from WiFi channel state information (CSI) with 80% accuracy, bringing an estimated additional $2M in value to the company.
● Created an unsupervised machine learning model that improved motion localization and prediction in a house from WiFi CSI. The refined model combined clustering, integer programming, and an NLP bag-of-words representation, achieving 95% accuracy and bringing an estimated $2M in value to the company.
● Applied mixture models, spectral models, and multidimensional scaling from scikit-learn to cluster noisy data, reaching an 80% accuracy score.
● Built a hidden Markov model, combined with integer programming, to extract the hidden states (number of people in the house) from noisy occupancy data, yielding an additional 15% improvement (see the sketch after this role's tool list).
● Tools used: Python, scikit-learn, NumPy, PyTorch, hmmlearn, Linux, Google OR-Tools; signal processing with autoencoders and hidden Markov models; dimensionality reduction with autoencoders and PCA; manifold learning with spectral models, multidimensional scaling, and mixture models.
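
A minimal sketch of the hidden-Markov-model part of such a pipeline using hmmlearn. The synthetic occupancy data, number of states, and Gaussian emissions are illustrative assumptions, and the integer-programming step is not shown.

import numpy as np
from hmmlearn import hmm

# Synthetic stand-in for noisy per-interval occupancy estimates derived
# from WiFi CSI features; the hidden state is the true occupancy (0-3).
rng = np.random.default_rng(0)
true_states = np.repeat([0, 2, 1, 3, 0], 40)
observations = (true_states + rng.normal(0, 0.6, true_states.size)).reshape(-1, 1)

# One hidden state per occupancy level, with Gaussian emission noise.
model = hmm.GaussianHMM(n_components=4, covariance_type="diag",
                        n_iter=100, random_state=0)
model.fit(observations)

# Viterbi decoding recovers the most likely hidden state sequence; the
# learned state indices are arbitrary and still need to be mapped back
# to occupancy counts (e.g. by sorting states by their learned means).
decoded = model.predict(observations)
print(decoded[:20])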

Statistical Computation Intern at Cognitive Systems Corp
  • Canada - Waterloo
  • November 2021 to May 2022

● Enhanced a Python signal segmentation algorithm based on the rate of change in signal amplitude, raising segmentation accuracy from 40% to 85%.
● Applied a kurtosis-based statistical analysis to detect misclassified signals, improving performance from 74% to 98% (see the sketch after this role's tool list).
● Tools used: Python, scikit-learn, NumPy, pandas, Linux; signal processing using convolution.
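
A minimal sketch of kurtosis-based flagging of suspicious signal segments. The threshold value, segment shapes, and function name are illustrative assumptions rather than the original algorithm.

import numpy as np
from scipy.stats import kurtosis

def flag_misclassified(segments: np.ndarray, threshold: float = 3.0) -> np.ndarray:
    """Flag segments whose excess kurtosis exceeds a threshold.

    A heavy-tailed (high-kurtosis) amplitude distribution often points to
    impulsive interference rather than genuine motion, so such segments
    are marked for re-classification.
    """
    # fisher=True gives excess kurtosis: 0 for a Gaussian distribution.
    k = kurtosis(segments, axis=1, fisher=True)
    return k > threshold

rng = np.random.default_rng(1)
clean = rng.normal(size=(5, 1024))          # Gaussian-like segments
spiky = clean.copy()
spiky[:, ::128] += 15.0                     # inject impulsive spikes
print(flag_misclassified(np.vstack([clean, spiky])))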

Data Science Intern at Mobkast Inc.
  • Canada - Toronto
  • May 2021 to November 2021

● Contributed to a robust cost prediction model built on a PyTorch neural network regression framework, tailored to estimating production expenses at Calvin Klein during their commercialization phase; achieved a prediction accuracy of 99% (see the sketch after this role's tool list).
● Tools used: Python, scikit-learn, NumPy, pandas, PyTorch, SQL.
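
A minimal sketch of a PyTorch regression network of this kind. The feature count, layer sizes, and synthetic data are illustrative assumptions, not the model used at Mobkast.

import torch
from torch import nn

class CostRegressor(nn.Module):
    """Small feed-forward network mapping production features to a cost estimate."""
    def __init__(self, n_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1),                # single output: predicted cost
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = CostRegressor(n_features=12)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.rand(256, 12)                      # synthetic feature batch
y = torch.rand(256, 1) * 100.0               # synthetic cost targets
for _ in range(5):                           # a few illustrative training steps
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()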

Education

Master's degree, Computer Science and Machine Learning
  • at Georgia Institute of Technology
  • April 2027 (expected)

Master of Science in Computer Science candidate

Bachelor's degree, Industrial Engineering
  • at University of Toronto
  • April 2023

Specialties & Skills

Software Design
Applied Mathematics
Applied Research
Machine Learning
Software Development

Languages

English: Expert
Persian: Native Speaker
Arabic: Beginner
French: Intermediate