Ramansh Sharma

I am a PhD student at the Kahlert School of Computing at The University of Utah, advised by Professor Varun Shankar. Our research focuses on designing novel scientific machine learning methods to solve partial differential equations.

From 2021 to 2023, I was a visiting scholar at the School of Computing at The University of Utah supervised by Professor Shankar and a remote collaborator at the Approximate Bayesian Inference (ABI) Team at RIKEN supervised by Dr. Gian Maria Marconi and Dr. Thomas Möllenhoff.

Warnock Engineering Building, Office 2883

ramansh [at] cs.utah.edu

Google Scholar  /  GitHub  /  CV  /  Blog


Timeline

August 2023 - Present: PhD student at the University of Utah.

September 2021 - June 2023: Visiting scholar at The University of Utah focusing on physics-informed machine learning.

October 2021 - June 2023: Remote collaborator at the ABI team focusing on curriculum learning.

February 2021 - September 2021: Machine Learning Engineer at the World Resources Institute working on identifying economic and financial incentives for forest and landscape restoration using NLP.

October 2020 - December 2020: Data Scientist at SevaExchange working on recommender systems to connect volunteers with volunteer opportunities.

June 2020 - August 2020: Lead Machine Learning Engineer at Omdena's TrashOut project working on detecting illegal dumpsites (project article).

July 2019 - June 2023: Undergraduate degree in Computer Science and Engineering at SRM Institute of Science and Technology, Ramapuram, India.

Research

My interests lie in the field of physics-informed machine learning (PIML). My current work involves combining physics-informed neural networks with traditional scientific computing techniques. I strongly believe the two fields together hold great promise.


Accelerated Training of Physics Informed Neural Networks (PINNs) using Meshless Discretizations


Ramansh Sharma, Varun Shankar
Neural Information Processing Systems (NeurIPS), 2022
paper / arxiv / code / video / poster

We present a new technique for the accelerated training of physics-informed neural networks (PINNs): discretely-trained PINNs (DT-PINNs). DT-PINNs are trained by replacing exact spatial derivatives with high-order accurate numerical discretizations computed using meshless radial basis function-finite differences (RBF-FD) and applied via sparse-matrix vector multiplication. Additionally, though traditional PINNs (vanilla-PINNs) are typically stored and trained in 32-bit floating-point (fp32) on the GPU, we show that for DT-PINNs, using fp64 on the GPU leads to significantly faster training times than fp32 vanilla-PINNs with comparable accuracy. Our results show that fp64 DT-PINNs offer a superior cost-accuracy profile to fp32 vanilla-PINNs.
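The central mechanism of DT-PINNs, replacing exact (autograd) spatial derivatives with a precomputed sparse differentiation matrix applied via sparse matrix-vector multiplication, can be sketched in a few lines. The sketch below is illustrative only: it uses a second-order finite-difference Laplacian on a 1D grid as a stand-in for the paper's RBF-FD operator, a manufactured solution in place of a network output, and fp64 arithmetic throughout, as the abstract suggests.

```python
import numpy as np
import scipy.sparse as sp

def discrete_pde_residual(D2, u_pred, f):
    """Discrete PDE residual for u'' = f: a sparse matvec replaces
    exact spatial derivatives of the network output."""
    return D2 @ u_pred - f

# Toy stand-in for an RBF-FD differentiation matrix: second-order
# finite-difference Laplacian on a uniform 1D grid over [0, pi].
n = 101
x = np.linspace(0.0, np.pi, n)
h = x[1] - x[0]
main = -2.0 * np.ones(n - 2)
off = np.ones(n - 3)
# Interior-node operator; boundary values of sin(x) vanish on [0, pi],
# so their contributions can be dropped here.
D2 = sp.diags([off, main, off], [-1, 0, 1], format="csr") / h**2

u = np.sin(x)          # stand-in for the network's predicted solution (fp64)
f = -np.sin(x[1:-1])   # forcing term for u'' = f with u = sin(x)
res = discrete_pde_residual(D2, u[1:-1], f)
loss = np.mean(res**2)  # discrete analogue of the PINN residual loss
```

In training, this residual loss would be minimized with respect to the network parameters; the sparse matvec makes each loss evaluation cheap relative to computing exact derivatives by automatic differentiation.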


Beyond modeling: NLP Pipeline for efficient environmental policy analysis


Jordi Planas, Daniel Firebanks-Quevedo, Galina Naydenova, Ramansh Sharma, Cristina Taylor, Kathleen Buckingham, Rong Fang
Fragile Earth proceedings - KDD, 2021
paper / arxiv / code / video

We propose a Knowledge Management Framework based on NLP techniques to tackle challenges in policy analysis such as its resource-intensive nature, the lack of comprehensive information sources, and overlapping jurisdictions. The framework is designed to be platform-, language-, and policy-agnostic. To classify financial incentives in restoration policies, both Sentence-BERT and Cross-Encoders performed well. For sentence classification inference with Sentence-BERT, a random forest classifier can be used to assign a category to a given sentence using the learned Sentence-BERT embeddings.
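The sentence-classification step described above (fixed sentence embeddings fed to a random forest) can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the embeddings here are synthetic random vectors standing in for Sentence-BERT outputs (in practice produced by something like `SentenceTransformer.encode`), and the two classes and their separation are invented for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Stand-in embeddings: two hypothetical classes of policy sentences,
# e.g. "financial incentive" vs. "other", drawn from well-separated
# Gaussians in place of real Sentence-BERT embeddings.
rng = np.random.default_rng(0)
dim = 32
incentive = rng.normal(loc=1.0, size=(50, dim))
other = rng.normal(loc=-1.0, size=(50, dim))
X = np.vstack([incentive, other])
y = np.array([1] * 50 + [0] * 50)

# Random forest assigns a category to each embedded sentence.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
acc = (clf.predict(X) == y).mean()
```

The appeal of this design is that the expensive encoder runs once per sentence, after which a lightweight classifier handles category assignment and can be retrained cheaply as label schemes evolve.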

Mentoring

I frequently mentor undergraduate and high school students who wish to pursue a career in Machine Learning. If you think my academic and professional experiences so far can help you in any way, please send me an email.

Talks

I often speak at events to encourage students to consider a path in machine learning. I share some of those events below:

  1. Bronx Science Machine Learning club: I discussed different ways students can start learning machine learning concepts, make unique and interesting projects (and share them), and approach internship opportunities.

  2. Devs' Street event: I discussed my past projects and the lessons I learned from them on applying machine learning in industry. I shared my personal experience looking for internship opportunities and making presentable projects.

  3. NeurIPS 2020 Nairobi meetup: I co-hosted this online meetup for NeurIPS 2020, where we invited researchers and practitioners from Stanford University, George Washington University, the Facebook AI Residency program, Amazon, Spotify, Google, InstaDeep, and NVIDIA research divisions.

  4. Neural Networks Workshop: I gave a technical workshop at SRM IST in 2020 for Microsoft Student Partner Open Day, introducing neural networks and machine learning.


Adapted from Dharmesh Tailor's fork of Leonid Keselman's website.