
Machine Learning Course

Enough time has passed since I undertook the Stanford University Natural Language Processing Course for me to forget just how much hard work it was and to start all over again.  This year I decided to have a go at the Coursera Machine Learning Course.

Unlike the 12-week NLP course last year, which estimated 10 hours a week and turned out to be more like 15-20 hours a week, this course was much more realistic in its estimate of 10 weeks at 8 hours each.  I think I more or less hit the mark on that point, spending about one day a week for the past 10 weeks studying machine learning - so around half the time the NLP course required.

The course was written and presented by Andrew Ng, who seems to be rather prolific and something of an academic star in his fields of machine learning and artificial intelligence.  He is one of the co-founders of the Coursera site which, along with its main rival Udacity, has brought about the popular rise of the Massive Open Online Course (MOOC).

The Machine Learning Course followed the same format as the NLP course from last year, which I can only assume is the standard Coursera format, at least for technical courses.  Each week there were one or two main topic areas to study, presented in a series of videos featuring Andrew talking through a set of slides on which he's able to hand-write notes for demonstration purposes, just as if you're sitting in a real lecture hall at university.  To check your understanding of the content of the videos, there are graded questions on each topic.  The second main component each week is a programming exercise, which for the Machine Learning Course must be completed in Octave - so yet another programming language to add to your list.  Achieving a mark of 80% or above across all the questions and programming exercises results in a course pass.  I appear to have done that with relative ease for this course.

The 18 topics covered were:

  • Introduction
  • Linear Regression with One Variable
  • Linear Algebra Review
  • Linear Regression with Multiple Variables
  • Octave Tutorial
  • Logistic Regression
  • Regularisation
  • Neural Networks Representation
  • Neural Networks Learning
  • Advice for Applying Machine Learning
  • Machine Learning System Design
  • Support Vector Machines
  • Clustering
  • Dimensionality Reduction
  • Anomaly Detection
  • Recommender Systems
  • Large Scale Machine Learning
  • Application Example: Photo OCR

The course served as a good revision of some maths I haven't used in quite some time: lots of linear algebra, for which you need a pretty good understanding, and lots of calculus, which you don't really need to understand if all you care about is implementing the algorithms rather than working out how they're derived or proven.  Being quite maths-based, the course used matrices and vectorisation very heavily, rather than the loop structures most of us would reach for when writing complex algorithms.  Again, this was some good revision, as I've not programmed in this fashion for quite some time.  You're definitely reminded of just how efficient you can make complex tasks on modern processors if you stand back from your algorithm for a bit and work out how best to utilise the hardware you have (via the appropriately optimised libraries).
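
To make that concrete, here's a minimal Octave sketch (my own illustration, not taken from the course materials) comparing a loop-based and a vectorised computation of the linear regression cost; the data and variable names are made up for the example:

```octave
% Made-up data: m examples with a bias column and one feature.
m = 1000;
X = [ones(m, 1), randn(m, 1)];
y = 3 + 2 * X(:, 2) + 0.1 * randn(m, 1);
theta = [0.5; 1.5];

% Loop version: visit each training example in turn.
J_loop = 0;
for i = 1:m
  h = X(i, :) * theta;
  J_loop = J_loop + (h - y(i)) ^ 2;
end
J_loop = J_loop / (2 * m);

% Vectorised version: one matrix expression, no explicit loop.
err = X * theta - y;
J_vec = (err' * err) / (2 * m);

printf("loop: %.6f  vectorised: %.6f\n", J_loop, J_vec);
```

On anything beyond toy data the vectorised form is dramatically faster, because the heavy lifting is handed off to the optimised linear algebra libraries underneath Octave.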

The major thought behind the course seems to be to teach as many different algorithms as possible, and there really is a great range: starting off simply with linear algorithms and progressing right up to current state-of-the-art Neural Networks and the ever-fashionable map-reduce stuff.

I didn't find the course terribly difficult.  I'm no expert in any of the topics, but I've studied enough maths not to struggle with that side of things, and I don't struggle with programming either.  I didn't need to use the forums or any of the other social elements offered during the course, so I don't really have a feel for how others found it.  I can certainly imagine someone finding it a real struggle if they don't have a particularly deep background in either maths or programming.

There was, as far as I can think right now, one omission from the course (or maybe two, depending on how you count).  Most of the programming exercises were heavily frameworked for you in advance; you just have to fill in the gaps.  This is great for learning the various algorithms presented during the course, but it does leave a couple of areas you're not so confident with by the end (aside from not really having a wide grasp of the Octave programming language).  The omission I'm thinking of is storing and bootstrapping the models you've trained.  All the exercises concentrated on training a model, holding it in memory and using it; as the program terminates, the model disappears with it.  It would have been great to have another module on the best ways to persist models between program runs, and on how to continue training (bootstrap) a model you have already persisted.  I'll feed that thought back to Andrew when the opportunity arises over the next couple of weeks.
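
For what it's worth, here's a rough Octave sketch of the kind of thing I mean (my own illustration, not something the course covered): saving the learned parameters with save, reloading them with load on the next run, and carrying on training from where you left off.  The data, file name and gradient descent step are all made up:

```octave
% Made-up training data.
m = 100;
X = [ones(m, 1), (1:m)' / m];
y = 4 + 3 * X(:, 2);

model_file = "model.mat";
if exist(model_file, "file")
  load(model_file);            % restores theta saved by a previous run
else
  theta = zeros(2, 1);         % no saved model yet, start from scratch
end

% Continue (bootstrap) training with a few more gradient descent steps.
alpha = 0.1;
for iter = 1:200
  theta = theta - (alpha / m) * (X' * (X * theta - y));
end

save(model_file, "theta");     % persist the model for the next run
disp(theta');
```

Run it twice and the second run picks up the saved theta rather than starting again from the zero vector.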

The problem going forward won't so much be applying what has been offered here as working out what to apply it to.  The range of problems that can be tackled with these techniques is mind-blowing; just look at the rise of analytics we're seeing in all areas of business and technology.

Overall then, a really nice introduction to the world of machine learning.  Recommended!


Making a Cajón

When I asked my best mate what he wanted for his birthday this year, he came back with something rather unexpected: "I'd really like a Cajón!".  I'd never heard of one before, so he explained what it was and I looked it up a bit later too.  It turned out that the sort of thing he wanted, something with an electrical pick-up (to make it semi-acoustic) and an adjustable snare too, was a bit out of budget.  After a bit of research around various different makes and models, I wondered how hard it could really be (it's just a wooden box, after all) and offered to make one.  Matt quickly warmed to the idea, and so, with his knowledge of what he wanted in the way of design and my woodworking experience, we set about a joint project that we've just finished this weekend.

To save writing a long blog post about exactly what we did, I'll simply refer you to a video (below).  This is more or less exactly what we made, following Steve Ramsey's design almost to the letter.  There were a few things we had to make up that the video didn't explain very well, and a couple of design adjustments (where we found the video to be incorrect - we weren't the only ones to notice the problem).


The main departure from Steve's design in the video was the inclusion of an electric pick-up.  However, we didn't depart from Steve's advice there either: we just followed his design for an electric pick-up, a piezo transducer and a 6mm jack socket soldered together, as can be seen from about 4:30 in the video for his stomp box.

We took pictures all the way through, which can be seen in chronological order in my Flickr set or via the slideshow at the bottom of this post.  We started off with a bunch of different stuff we needed to work up.  Here's a cheesy-grinned Matt in the first picture, before we got started, posing with the various bits and pieces:


More or less everything we used is there in the picture above:

  • 4' x 2' x ¾" birch-faced ply sheet (for the top, bottom and sides)
  • 3mm ply (for the front piece, called the tapa)
  • 25mm dowel rod
  • piezo transducer and 6mm jack socket
  • 4 speaker feet
  • Snare wire
  • 2 knobs, M6 thread, 40mm long
  • Clear wax
  • Glue

On the first afternoon, the birch ply was cut to size and rebated to form a box shape, albeit not yet glued together:

This was actually the main part of the work we had to do.  The next time we got together, we modified the back panel so it had a large hole in it (to let the sound out) and a fitting for the jack socket to be screwed through.  After that came the tricky business of fitting the adjustable snare dowel-rod mechanism to the sides, which can best be seen in a couple of different pictures.  Once all that was done we were able to glue it all together and leave it clamped up for a couple of days to dry; the result was a completed box:


Finally, we cut the front to size and fitted that, waxed the whole thing, then fitted the feet and jack socket.  We gave it a few different tests.  The first was to sit on it (since that's how they're played), and it survived that; then Matt had his first play on it in my garage, followed by heading indoors to hook it up to the stereo to test the semi-acoustic-ness of it.  Everything worked well.

We're both really pleased with it.  It's really solidly constructed and feels like it should last a good many years' use.  All the tweaks to a basic Cajón design work really well, including the adjustable snare and the electric pick-up.  It looks really good too; we were really lucky to source such a nice-looking piece of ply (thanks to my cousins at Ascot Timber Buildings), and finished it off with nicely rounded corners and a good quality clear wax.  Of course, the really important bit is the sound, and fortunately it performs on that front too (better than I'd expected).  The bass notes from the middle sound really deep and can be quite loud if you're really going for it, and they graduate to a nice high pitch as you move towards playing at the sides.  When turned on, the snare adds an extra dimension when hitting near the top too.


So, it's happy birthday to Matt (a wee bit late, since we started making it just after his birthday).  There have been loads of people interested in the project as we've been going through, so I'm sure he's going to be a busy boy showing it off all over the place now.

I'll close out with the slideshow and another mention of thanks to Steve Ramsey for his excellent video tutorial.