Structural Identifiability Toolbox

Posted on Sun 25 July 2021 in posts • Tagged with Maple, Symbolic Computing • 2 min read

Introduction

In this post, I will describe our recently released Structural Identifiability Toolbox, a web-based application for assessing parameter identifiability of differential models.
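
As a toy illustration of what identifiability means (my own example, not one taken from the post): consider the model dx/dt = -a*x(t) with output y(t) = b*x(t) and an unknown initial condition x(0). The observed output is y(t) = b*x(0)*exp(-a*t), so the parameter a is structurally identifiable, while b and x(0) can only ever be recovered through their product b*x(0).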

Click here to check out the application! Read on to learn more.

Why is it better?

The program is fast, free, and available in any web browser, including mobile …


Continue reading

My Google Summer of Code Project

Posted on Tue 08 June 2021 in posts • Tagged with Julia, GSOC • 2 min read

About The Project

Problem Formulation

The problem of parameter identifiability is one of the most crucial issues arising in systems biology. To understand it, we must first describe the setting in which it arises. Systems biology deals with biological processes that are described by …


Continue reading

How I had to translate Matlab code into Maple

Posted on Tue 18 August 2020 in posts • Tagged with python, regular expressions, matlab, maple • 3 min read

In this short post, I wanted to point out one interesting application of regular expressions that I had to work on for my PhD research project. The code was meant as a technical tool to help translate some ordinary differential equation models from numerical (Matlab) to symbolic (Maple) code.
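
To give a flavor of the kind of substitutions involved, here is a minimal Python sketch. It is my own illustration rather than the actual script from the post, and the sample MATLAB line and the dxdt(i)/x(i) naming are assumptions.

    import re

    # A hypothetical MATLAB-style ODE line (naming is my own, not from the post)
    matlab_line = "dxdt(1) = -k1*x(1) + k2*x(2);"

    # Turn the left-hand side dxdt(i) into a Maple derivative diff(x_i(t), t)
    line = re.sub(r"dxdt\((\d+)\)\s*=", r"diff(x_\1(t), t) =", matlab_line)

    # Turn state accesses x(i) into Maple functions of time x_i(t)
    line = re.sub(r"\bx\((\d+)\)", r"x_\1(t)", line)

    print(line)  # diff(x_1(t), t) = -k1*x_1(t) + k2*x_2(t);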

The original …


Continue reading

NumPy-Learn: A Homemade Machine Learning Library

Posted on Sun 14 June 2020 in posts • Tagged with machine learning, python, numpy, deep learning • 12 min read

In this post, I expand on a little class/self-teaching project that I did during the Spring 2020 semester.

NumPy-Learn: A Homemade Machine Learning Library

Organization

In this section we will discuss the main organization of the library:

  • How the layers are built (see the sketch after this list)
  • How loss functions work
  • How a stochastic …
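
To make the layer abstraction concrete before diving in, here is a minimal NumPy sketch of a fully connected layer with a forward and a backward pass. This is my own illustrative example under simple assumptions (a Dense class with a plain gradient-descent update), not the actual NumPy-Learn code.

    import numpy as np

    class Dense:
        """A minimal fully connected layer: y = x @ W + b."""

        def __init__(self, in_features, out_features):
            # Small random weights and zero biases
            self.W = np.random.randn(in_features, out_features) * 0.01
            self.b = np.zeros(out_features)

        def forward(self, x):
            # Cache the input; it is needed for the backward pass
            self.x = x
            return x @ self.W + self.b

        def backward(self, grad_out, lr=0.1):
            # Gradients of the loss with respect to parameters and input
            grad_W = self.x.T @ grad_out
            grad_b = grad_out.sum(axis=0)
            grad_x = grad_out @ self.W.T
            # Plain gradient-descent update
            self.W -= lr * grad_W
            self.b -= lr * grad_b
            return grad_x

    # One forward/backward pass on a random mini-batch
    layer = Dense(4, 3)
    out = layer.forward(np.random.randn(8, 4))
    layer.backward(np.ones_like(out))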

Continue reading

Three Ways to Deal With Imbalance

Posted on Mon 02 March 2020 in posts • Tagged with machine learning, logistic regression, python, scikit-learn, statistical learning • 5 min read

In this post, I put together an interesting example of what to do with imbalanced datasets and why precision and recall matter.
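
As a rough sketch of the kind of comparison involved (my own illustration on synthetic data, not necessarily the code from the post), one simple remedy is to reweight the classes and then judge the model by precision and recall rather than accuracy:

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    # Synthetic imbalanced data: roughly 95% negatives, 5% positives
    X, y = make_classification(n_samples=5000, weights=[0.95, 0.05], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

    # One way to deal with imbalance: weight classes inversely to their frequency
    clf = LogisticRegression(class_weight="balanced", max_iter=1000)
    clf.fit(X_train, y_train)

    # Accuracy alone is misleading here; precision and recall tell the real story
    print(classification_report(y_test, clf.predict(X_test)))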

Introduction

The following is part of a machine learning assignment I did while at CUNY. This particular example illustrates quite well the importance of understanding various …


Continue reading

Linear Regression as the Simplest Classifier

Posted on Mon 24 February 2020 in posts • Tagged with machine learning, linear regression, python, scikit-learn, statistical learning • 12 min read

In this post, I describe a simple application of the linear least squares method to a data classification problem. It is a naive approach and is unlikely to beat more sophisticated techniques such as logistic regression.
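
The core idea fits in a few lines (my own minimal illustration on synthetic data, not the post's exact code): encode the two classes as -1 and +1, fit ordinary least squares, and classify by the sign of the prediction.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, n_features=5, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Encode the two classes as -1 and +1 and fit ordinary least squares
    reg = LinearRegression().fit(X_train, 2 * y_train - 1)

    # Classify by thresholding the continuous prediction at zero
    y_pred = (reg.predict(X_test) >= 0).astype(int)
    print("accuracy:", (y_pred == y_test).mean())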

Imports

Some imports we are going to need for this …


Continue reading

How to write a decent training loop with enough flexibility

Posted on Sat 15 June 2019 in posts • Tagged with deep learning • 2 min read

In this post, I briefly describe my experience in setting up training with PyTorch.

Introduction

PyTorch is an extremely useful and convenient framework for deep learning. When it comes to working on a deep learning project, I am more comfortable with PyTorch than with TensorFlow.
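
For context, the bare-bones loop that any such setup builds on looks roughly like the sketch below; this is my own minimal example with a toy model and data, not the flexible version described in the post.

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset

    # Toy data, model, loss, and optimizer
    data = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))
    loader = DataLoader(data, batch_size=32, shuffle=True)
    model = nn.Linear(10, 1)
    criterion = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    for epoch in range(5):
        for inputs, targets in loader:
            optimizer.zero_grad()                     # clear gradients from the previous step
            loss = criterion(model(inputs), targets)  # forward pass and loss
            loss.backward()                           # backpropagate
            optimizer.step()                          # update parameters
        print(f"epoch {epoch}: last batch loss {loss.item():.4f}")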

In this quick post, I …


Continue reading

RiCNN and Rotation Robustness of ConvNets. A Paper Review

Posted on Sat 15 June 2019 in posts • Tagged with review series, deep learning, computer vision • 4 min read

Lately, I have been reading more papers on modern advances in deep learning in order to get a clear view of what problem I want to focus on during my PhD research.

There is a lot of information to process, and an incredible number of papers are being published from …


Continue reading

Computer Vision. Can You Teach a Machine To See?

Posted on Fri 22 March 2019 in posts • Tagged with education, talks • 1 min read

A little overview of what I talked about at CUNY CSI Science Day.

During the Science Day at the CUNY College of Staten Island, I gave visiting middle and high school students a gentle introduction to the area of computer vision, with fun examples and research results.

It was a …


Continue reading

Harmonic Networks. Implementation of Paper Results

Posted on Sun 10 March 2019 in posts • Tagged with deep learning, computer vision, work in progress • 5 min read

I implement an interesting result from a recent paper on convolutional neural networks.

Introduction

In this post I will briefly discuss my implementation of a model introduced in this paper.

In short, the authors suggest using predefined filters, based on the Discrete Cosine Transform, in a convolutional network.
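
As a rough illustration of the idea (my own sketch, not the implementation from the post or the paper), a small bank of 2D DCT basis filters can be generated with NumPy and then used as fixed convolution weights:

    import numpy as np

    def dct_filter_bank(size=3):
        """Return a (size*size, size, size) array of 2D DCT-II basis filters."""
        n = np.arange(size)
        # 1D DCT-II basis: basis[k, j] = cos(pi * (j + 0.5) * k / size)
        basis = np.cos(np.pi * (n + 0.5)[None, :] * n[:, None] / size)
        # Each 2D filter is the outer product of two 1D basis vectors
        return np.stack([np.outer(basis[u], basis[v])
                         for u in range(size) for v in range(size)])

    bank = dct_filter_bank(3)
    print(bank.shape)  # (9, 3, 3); these would serve as fixed convolution kernels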

I used PyTorch …


Continue reading