• Machine Learning, Neural Net, Artificial Intelligence, NeurIPS, and Bio
In returning to a more active presence on Twitter, I noticed a link to this blog, started years ago. A chill ran down my spine as I recalled some of the topics I naively tackled so long ago, when I first dipped a toe in the very deep pool of data science. Where is the delete key?!? Why doesn’t this crap expire? But once the panic receded and a wry smirk took its place, I realized something new. There is something useful in such a timeline. An example, if you will.
The mind begins to spin with new ideas. People all around are engaged in topics you want to hear more about. Ideas you want to contribute to. And the last thing in the world you want to do right now is talk to another human. You don’t want to miss a word, a chance to be further inspired. You want to hide. I’m here to tell you: that’s okay. Hide.
This took longer than it should have. And though this is an already “won” race, I’m going to take a moment to enjoy this and celebrate by … moving on. While I did find the result I was looking for, it came wrapped in a mystery that I never solved.
• Machine Interaction, Neural Net, Psychology, Artificial Intelligence, and Finnegan
I’ll preface this with the fact that I am not a psychologist; I haven’t studied the subject since college. I’m not even a scientist. So what follows should be treated as nothing more than anecdotal. That being said, it seems fascinating, so if anyone out there feels the need to write a doctoral thesis on this, please let me know how it turns out.
• Machine Learning, Neural Net, Python, MNIST, and Scikit-Image
A brand new rabbit hole of something to learn. And I’ll wisely choose to side-step this one for the moment, but it is interesting. How do you hold on to as much information as possible while downsizing an image to a size you want to work with?
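The core trick, should you ever climb down this rabbit hole, is to average (low-pass filter) before you subsample, so fine detail is folded into the smaller image rather than aliased away. A minimal NumPy sketch of that idea — the function name and the gradient test image are my own inventions for illustration:

```python
import numpy as np

def downsample_mean(img, factor):
    """Shrink a 2-D image by an integer factor by averaging each block.

    Averaging every factor x factor block acts as a crude low-pass
    filter, preserving more information than naive subsampling, which
    simply throws away the in-between pixels.
    """
    h, w = img.shape
    h2, w2 = h // factor, w // factor
    # Trim to an exact multiple of the factor, then average each block
    trimmed = img[: h2 * factor, : w2 * factor]
    return trimmed.reshape(h2, factor, w2, factor).mean(axis=(1, 3))

# A 28x28 "MNIST-sized" gradient, shrunk to 14x14
img = np.arange(28 * 28, dtype=float).reshape(28, 28)
small = downsample_mean(img, 2)
print(small.shape)  # (14, 14)
```

Scikit-image wraps a more flexible version of the same idea: `skimage.transform.resize(img, (14, 14), anti_aliasing=True)` smooths before resampling and handles non-integer factors too.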
• Machine Learning, Dropout, Regularization, AWS, GPU, Python, and Numpy
Neural net architecture is not a playground for those who demand instant gratification. The endless trials with slight variations of one parameter or another (made ever so much worse when you don’t take rigorous notes, head slap) provide feedback only when they are good and ready. It is much like watching water boil. While we’re waiting, let’s break out Python’s cProfile and see what is going on under the hood — just to pass the time, of course.
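For the curious, this is all it takes to get cProfile talking. A minimal sketch — the deliberately naive `slow_sum` function and its workload are invented here purely to give the profiler something to chew on:

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Deliberately wasteful: builds a throwaway list on every iteration,
    # so the profiler has an obvious hotspot to report
    total = 0
    for i in range(n):
        total += sum([j for j in range(i % 100)])
    return total

profiler = cProfile.Profile()
profiler.enable()
result = slow_sum(10_000)
profiler.disable()

# Sort by cumulative time and print the top five entries
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
stats.print_stats(5)
print(stream.getvalue())
```

The `cumulative` sort surfaces functions that dominate total runtime including their callees; sorting by `"tottime"` instead isolates time spent in each function body alone, which is often the more actionable view.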
• Neural Net, Vectorization, Linear Algebra, Python, and Machine Learning
So I failed to take notes. Lesson learned. It has been a hectic two weeks plowing through a mess that I admittedly should not have made. I should have continued iterating through the design only with a test firmly in hand at each step. I didn’t, and I paid the price. Here is where I should pile up some epic story of failure and perseverance, littered with trials and tribulations, successes and pains. But there is nothing so entertaining to be had in this venture, just a mess of spaghetti and a commit history just as wandering and even more useless. So I’ll just say it:
Project, check. Splash page for project, check (thank you, gh-pages). Documentation, in case someone wants to use this thing … Well, I’ve got docstrings, that’s good enough, right? Not so, say they who hold sway.