Data Science, Big Data, & Cloud nerd with a focus on healthcare & a passion for making complex topics easier to understand. All thoughts are mine & mine alone.

# Minutia Matters — Hello World Year 2

Why I Write the Blogs I Write

I’m currently buried in a full-semester course that’s being run over just four weeks as part of my M.S. in Health Data Science program, so I might not get back to a time-consuming blog post till February. This is really a shame, because…

# Uncovering the Deep State… of Neural Networks

Deep Learning Math Walk-Through and Code Logic

In the previous blog post we walked through an example where we used a neural network with two neurons and a single hidden layer to produce a (wildly inaccurate) prediction function. As we discussed before, this is fine if two kinks in your…

# Form of… An Inaccurate Prediction!

Activating Transformative Powers in Neural Networks

Before we move into the activation function part of the show, let’s stop and consider what we’ve done in the first step. Applying weights and a bias value to variables/features in a data set is, in effect, making a prediction. We know this prediction…
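The idea that weights plus a bias already constitute a prediction can be sketched in a few lines. This is a minimal illustration with made-up numbers, not the post's actual example; the feature values, weights, and bias here are assumptions for demonstration only.

```python
import numpy as np

# With randomized weights and bias, the weighted sum is already a
# (probably terrible) prediction -- no activation function needed yet.
rng = np.random.default_rng(42)
x = np.array([2.0, 3.0])   # two input features from one data point
w = rng.normal(size=2)     # randomized starting weights
b = rng.normal()           # randomized starting bias
z = np.dot(w, x) + b       # weighted sum plus bias = the raw prediction
print(z)
```

Training is then just the process of nudging `w` and `b` until predictions like `z` stop being wildly inaccurate.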

# The First Cut is the Shallowest

Weighting Data Prior to the Activation Function

In this post we’re going to take the first step in making our math real by seeing what happens when you apply the randomized weights and bias values to our data prior to submitting it to an activation function.

Recall that we’ve simplified…

# KISSing Neural Networks

Simplifying and Enhancing Forward Propagation

Publishing this on Christmas Day, there’s a mistletoe joke in here somewhere.

So far we have defined what neural networks are and what they do, and then taken it a level deeper and explained how deep learning works — in theory anyway — in terms…

# Matrix Multiplication

Even I Can’t Come Up With a Way to Make This Entertaining

Sometimes, you just gotta eat your vegetables.

In my next post, we’re going to go into some math behind forward propagation in neural networks. But before we do that, I need to…
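The vegetable-eating in question boils down to one operation: the weighted sums for an entire layer collapse into a single matrix multiply plus a bias vector. A toy sketch with made-up numbers (not the post's actual data):

```python
import numpy as np

# 2 samples x 2 features, fed into a layer of 2 neurons.
X = np.array([[1.0, 2.0],
              [3.0, 4.0]])    # input data: one row per sample
W = np.array([[0.5, -1.0],
              [0.25, 2.0]])   # one column of weights per neuron
b = np.array([0.1, -0.1])     # one bias per neuron

# Forward propagation for the layer, before activation:
Z = X @ W + b                 # shape (2, 2): one row of pre-activations per sample
print(Z)
```

Every per-neuron weighted sum you could write by hand is already inside `Z`, which is why matrix multiplication is worth eating your vegetables for.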

# Kinks In the Works

Increasing Functional Complexity in Neural Networks

(For the record, I almost titled this one “Let’s Get Kinky!” but eventually, thankfully talked myself out of it…)

In the last post on neural networks, we looked at how you could take a couple of neurons, using ReLU activation, and produce a set…
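The two-neuron ReLU setup can be sketched quickly: each neuron contributes one kink where its pre-activation crosses zero, and a linear combination of the two yields a piecewise-linear function. The weights below are illustrative assumptions, not the values from the original post.

```python
import numpy as np

def relu(z):
    """Rectified Linear Unit: zero out anything negative."""
    return np.maximum(0.0, z)

x = np.linspace(-2, 2, 5)

# Two hidden neurons, each with its own weight and bias.
h1 = relu(1.0 * x + 0.5)    # kink at x = -0.5
h2 = relu(-1.0 * x + 0.5)   # kink at x = 0.5

# Output layer: a linear combination of the two hidden activations.
y = 2.0 * h1 + 1.0 * h2     # piecewise-linear, with two kinks
print(y)
```

More neurons mean more kinks, which is how a stack of simple linear pieces starts approximating curves.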

# Machine Learning for Zombies

Neural Networks in Bite-Sized Chunks — The Beginning

Neural networks, a.k.a. Multilayer Perceptrons (MLP), are complex algorithms that take a lot of compute power and a *ton* of data in order…

# Curving the Straights, Hilling the Flats

Creatively Crafting Curvy Classifiers

I almost titled this one “The Hazzards of Duke”, but I’m not sure that would have resonated as much today as it did in the 1980s. For the uninitiated:

When I wrote…

# Generating Useful Synthetic Patient Data for Machine Learning

A Theoretical Approach Moving from Impossible to Plausible

# Background

I have previously written about the need for and potential value of synthetic patient data for building machine learning models in healthcare. My exploration into ways to generate it from a common data format, such as FHIR, has taken a few twists…

## Jason Eden
