Masked Tidepool

Deep Learning from a Legitimate Imposter

A Biography Forgotten

• Machine Learning, Neural Net, Artificial Intelligence, NeurIPS, and Bio

In returning to a more active presence on Twitter, I noticed a link to this blog, started years ago. A chill ran down my spine as I recalled some of the topics I naively tackled so long ago, when I first dipped a toe into the very deep pool of data science. Where is the delete key?!? Why doesn’t this crap expire? But once the panic receded and a wry smirk took its place, I realized something new. There is something useful in such a timeline. An example, if you will.

An Introvert's Guide to NeurIPS

• Machine Learning, Neural Net, Artificial Intelligence, NeurIPS, Conference, and Networking

Day 2

The mind begins to spin with new ideas. People all around are engaged in topics you want to hear more about. Ideas you want to contribute to. And the last thing in the world you want to do right now is talk to another human. You don’t want to miss a word, a chance to be further inspired. You want to hide. I’m here to tell you: that’s okay. Hide.

Victory Lap

• Machine Learning, Neural Net, Convolutional Net, Artificial Intelligence, MNIST, and Kaggle

Kaggle Leaderboard

This took longer than it should have. And though this race was already “won,” I’m going to take a moment to enjoy it and celebrate by … moving on. While I did find the result I was looking for, it arrived wrapped in a mystery that I never solved.

Your Next Pet, or Its?

• Machine Interaction, Neural Net, Psychology, Artificial Intelligence, and Finnegan

I’ll preface this with the fact that I am not a psychologist; I haven’t studied psychology since college. I’m not even a scientist. So what follows should be treated as nothing more than anecdotal. That being said, it seems fascinating, so if anyone out there feels the need to write a doctoral thesis on this, please let me know how it turns out.

My simple neural net was a puppy.

Smaller and Smaller and Troublesome

• Machine Learning, Neural Net, Python, MNIST, and Scikit-Image

A brand new rabbit hole of something to learn. And I’ll wisely choose to side-step this one for the moment, but it is interesting. How do you hold on to as much information as possible while downsizing an image to a size you want to work with?

Handwritten Digits
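The tags above mention Scikit-Image, which offers proper resizing tools; as a minimal NumPy-only sketch of the idea, block averaging keeps the mean intensity of each neighborhood rather than discarding pixels outright (function name and sizes here are illustrative, not the post's actual code):

```python
import numpy as np

def downscale_mean(img, factor):
    """Downscale a 2-D image by averaging factor-x-factor blocks,
    one simple way to keep information while shrinking."""
    h, w = img.shape
    # Trim so both dimensions divide evenly by the factor.
    h, w = h - h % factor, w - w % factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

digit = np.arange(28 * 28, dtype=float).reshape(28, 28)
small = downscale_mean(digit, 2)  # 28x28 -> 14x14
```

Each output pixel is the mean of a 2x2 neighborhood, so a naive every-other-pixel subsample's aliasing is softened, which is the trade-off the excerpt is circling.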

Binomial Bottleneck

• Machine Learning, Dropout, Regularization, AWS, GPU, Python, and Numpy

Neural net architecture is not a playground for those who demand instant gratification. The endless trials with slight variations of one parameter or another (made ever so much worse when you don’t take rigorous notes, head slap) provide feedback only when they are good and ready. So it is much like watching water boil. While we’re waiting, let’s break out Python’s cProfile and see what is going on under the hood. Just to pass the time, of course.
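A minimal cProfile session looks something like the following; the `forward` and `run_trials` functions are placeholders standing in for whatever hot loop the training code actually has:

```python
import cProfile
import pstats
import numpy as np

def forward(X, W):
    # A dense layer with a sigmoid -- the kind of hot loop worth timing.
    return 1.0 / (1.0 + np.exp(-X @ W))

def run_trials(n=50):
    X = np.random.rand(200, 100)
    W = np.random.rand(100, 10)
    for _ in range(n):
        forward(X, W)

profiler = cProfile.Profile()
profiler.enable()
run_trials()
profiler.disable()

# Show the five biggest time sinks, sorted by cumulative time.
stats = pstats.Stats(profiler).sort_stats("cumulative")
stats.print_stats(5)
```

Sorting by `"cumulative"` surfaces the functions whose call trees eat the clock, which is usually more useful than raw call counts when hunting the boiling-water culprit.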

Nice, Neat Rows

• Neural Net, Vectorization, Linear Algebra, Python, and Machine Learning

So I failed to take notes. Lesson learned. It has been a hectic two weeks plowing through a mess that I admittedly should not have made. I should have continued iterating through the design only with a test firmly in hand at each step. I didn’t, and I paid the price. Here is where I should pile up some epic story of failure and perseverance, littered with trials and tribulations, successes and pains. But there is nothing so entertaining to be had in this venture, just a mess of spaghetti and a commit history just as wandering and even more useless. So I’ll just say it:

Test.
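The lesson above, in miniature. A hypothetical example of the kind of small, firm test worth having in hand before each refactoring step (the `sigmoid` here is illustrative, not the post's actual code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def test_sigmoid():
    # Pin the behavior down before the next refactor touches it.
    z = np.linspace(-5.0, 5.0, 11)
    out = sigmoid(z)
    assert out.shape == z.shape               # shape survives
    assert np.all((out > 0.0) & (out < 1.0))  # output stays in (0, 1)
    assert sigmoid(np.array([0.0]))[0] == 0.5 # centered at 0.5
    assert np.all(np.diff(out) > 0.0)         # monotonically increasing

test_sigmoid()
```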

Shallow Dream

• Machine Learning, Perceptron, Python, and Deep Dream

So far, this experiment has yielded one major takeaway: I need more tools!

Based on the very broad concept behind Google’s Deep Dream project, Thunder wondered what my little pet Perceptron had actually learned. It seemed an innocent enough question.
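For a single-layer perceptron the crudest way to peek at what it learned is simply to reshape a neuron's weight vector back into the input image's geometry. A sketch under assumed dimensions (random stand-in weights here; a real net would supply trained ones):

```python
import numpy as np

# Stand-in for trained weights: one output neuron of a 784-input perceptron
# (hypothetical values; after training, these come from the model itself).
rng = np.random.default_rng(0)
weights = rng.normal(size=784)

# The simplest "what did it learn" picture: the weight vector reshaped
# to the input's 28x28 geometry and rescaled into [0, 1] for display.
img = weights.reshape(28, 28)
img = (img - img.min()) / (img.max() - img.min())
```

Bright pixels mark inputs that push the neuron toward firing, dark ones toward silence. Deep Dream iteratively optimizes an input image instead, but for a shallow net this static view tells much the same story.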

The Jungle of Documentation

• Sphinx, ReadtheDocs, Documentation, and Python

Project, check. Splash page for project, check (thank you, gh-pages). Documentation, in case someone wants to use this thing, … Well, I’ve got docstrings, that’s good enough, right? Not so, say they who hold sway.
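Turning those docstrings into real pages is mostly a matter of Sphinx configuration. A minimal `conf.py` sketch (project name and theme are placeholders, not this blog's actual setup):

```python
# conf.py -- a minimal Sphinx configuration
project = "my-project"  # placeholder; use the real package name

extensions = [
    "sphinx.ext.autodoc",   # pull API docs straight from docstrings
    "sphinx.ext.napoleon",  # understand Google/NumPy docstring styles
    "sphinx.ext.viewcode",  # link rendered docs back to source
]

html_theme = "alabaster"
```

With that in place, `sphinx-build` (or a Read the Docs hook) renders the docstrings into browsable HTML, which is roughly the gap this post goes on to cross.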

Speed at a Cost

• Machine Learning, Efficiency, Python, Numpy, and Vectorization

Waiting. Waiting … Waiting …

Okay, that is getting lame.

Running 1000 input vectors through ten neurons hundreds of times, with hundreds of calculations each time, just to verify the code is functional, turns out to take quite a bit of time.
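The gap the tags hint at, sketched side by side: the same forward pass written as nested Python loops and as one NumPy matrix product (dimensions match the excerpt; function names are illustrative):

```python
import numpy as np

X = np.random.rand(1000, 10)   # 1000 input vectors
W = np.random.rand(10, 10)     # weights for ten neurons

def forward_loop(X, W):
    # One input, one neuron, one multiply at a time -- the "waiting" version.
    out = np.empty((X.shape[0], W.shape[1]))
    for i in range(X.shape[0]):
        for j in range(W.shape[1]):
            out[i, j] = sum(X[i, k] * W[k, j] for k in range(X.shape[1]))
    return out

def forward_vec(X, W):
    # The same arithmetic, expressed as a single matrix product.
    return X @ W

# Both paths must agree before comparing their speed.
assert np.allclose(forward_loop(X, W), forward_vec(X, W))
```

The vectorized form hands the whole computation to optimized BLAS routines in one call instead of millions of interpreted Python steps, which is where the waiting goes.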