On A Hockey Website
Just to not let the month of January slip away without another post, I got sentimental and decided to tell a small story about how my website came to life.
There was a void. A lot of the time, people on hockey boards would wonder whether specific statistics for players and teams were available, and they weren't, although the raw data seemed to be there. Then there was the fantasy hockey world, with its pizzazz, asking for a predictive tool - and again, the raw data seemed to be there.
Now, I am a sysadmin by trade, with occasional forays into software development, and since I've been doing Perl for my entire career, I've had some exposure to the Web development process and to databases. I also have a college degree in Engineering, which gave me some grounding in statistics.
So I took a look at the publicly available NHL reports, but was unsure how to use them. I tried a standard relational database approach, but it wasn't working.
The turning point came when I attended a lecture on MongoDB. It turned out to be a perfect fit: take the loosely structured NHL stats documents and just spill them into the Mongo database, then extract the data, summarize it into tables, and store the tables in an SQL database for quick serving on the website. And along came more luck - a lecture on the Mojolicious Perl Web framework, which gave me an easy solution for running a website.
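In outline, the pipeline looks something like the sketch below. This is a minimal illustration, not the site's actual code: the database, collection, and field names (hockey.boxscores, season, the home/away score fields) are made up, and it assumes a local MongoDB instance and an SQLite file.

```perl
use strict;
use warnings;
use MongoDB;
use DBI;

# Connect to a local Mongo instance holding the raw scraped reports.
my $mongo     = MongoDB->connect('mongodb://localhost');
my $boxscores = $mongo->ns('hockey.boxscores');   # hypothetical db.collection

# Summarize goals per team for one season straight from the documents.
my %goals;
my $cursor = $boxscores->find({ season => '19871988' });
while (my $game = $cursor->next) {
    $goals{ $game->{home}{team} } += $game->{home}{score};
    $goals{ $game->{away}{team} } += $game->{away}{score};
}

# Store the summary table in SQLite for quick serving on the website.
my $dbh = DBI->connect('dbi:SQLite:dbname=stats.db', '', '', { RaiseError => 1 });
$dbh->do('CREATE TABLE IF NOT EXISTS team_goals (team TEXT PRIMARY KEY, goals INTEGER)');
my $sth = $dbh->prepare('INSERT OR REPLACE INTO team_goals (team, goals) VALUES (?, ?)');
$sth->execute($_, $goals{$_}) for sort keys %goals;
```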
Thus, I was able to actually implement what I had in mind. First came the spider, to crawl and collect the data available on NHL.com. Fortunately, I was able to scrape everything before the site's design changed drastically and box scores prior to 2002 stopped being available. I got everything from the 1987/88 season on.
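Since Mojolicious ships with its own HTTP client, a spider along these lines is easy to put together. A minimal sketch, assuming a game-sheet URL pattern like the one NHL.com used for its HTML reports; the exact pattern and ID range here are purely illustrative.

```perl
use strict;
use warnings;
use Mojo::UserAgent;
use Mojo::File qw(path);

my $ua = Mojo::UserAgent->new(max_redirects => 3);
path('reports')->make_path;

# Hypothetical report URL pattern; the real NHL.com layout differed and
# has changed since.
for my $game_id (1 .. 10) {
    my $url = sprintf 'http://www.nhl.com/scores/htmlreports/19871988/GS02%04d.HTM', $game_id;
    my $res = $ua->get($url)->result;
    next unless $res->is_success;
    path(sprintf 'reports/GS02%04d.HTM', $game_id)->spew($res->body);
    sleep 1;   # be polite to the server
}
```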
Then I started writing the parsers... and had to take a step back. There were quite a lot of inconsistent and missing reports. Therefore I had to a) add thorough testing of every report I scraped to ensure it held together, and b) look for complementary sources for whatever data was missing. So before I was done with the parsers, I had a large testing framework, and I had also visited every corner of the hockey-related web to resolve the missing or conflicting data, even the online archives of newspapers such as USA Today. Some of the downloaded reports had to be edited manually. Then NHL.com landed another blow, dropping draft pick information from its player info pages. Luckily, the German version of the website still had it, so I began to scrape the German NHL website too.
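The testing largely boiled down to cross-checking every report against itself. A minimal sketch with Test::More, assuming a made-up hash structure for a parsed boxscore; parse_reports() is a hypothetical stand-in for the real parser entry point.

```perl
use strict;
use warnings;
use Test::More;

# Consistency checks for one parsed boxscore (the hash structure is
# illustrative; the real reports allow many more cross-checks).
sub check_boxscore {
    my ($game) = @_;
    for my $side (qw(home away)) {
        my $team  = $game->{$side}{team};
        my $goals = grep { $_->{team} eq $team } @{ $game->{goal_events} };
        is $goals, $game->{$side}{score}, "$team: goal events add up to the final score";
    }
    cmp_ok scalar @{ $game->{periods} }, '>=', 3, 'at least three periods recorded';
}

my @games = parse_reports('reports/');   # hypothetical parser entry point
check_boxscore($_) for @games;
done_testing;
```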
I was able to produce the unusual statistics tables relatively quickly and easily. However, I decided that the website would not open without the prediction models I had in mind. Being a retired chess arbiter and a big chess enthusiast, I decided to try applying the chess Elo rating model to the performances of hockey teams and players. Whether it really works or not, I don't know yet; I guess by the end of the season I'll be able to make a judgement on that.
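The Elo update itself is simple: compute the expected score from the rating gap, then nudge both ratings by a K-weighted fraction of the surprise. A sketch of the standard formula; the K factor of 20 is an illustrative choice, not necessarily what the site uses.

```perl
use strict;
use warnings;

# Standard Elo update: expected score from the rating gap, then a
# K-weighted correction applied symmetrically to both sides.
sub elo_update {
    my ($rating_a, $rating_b, $score_a, $k) = @_;   # $score_a: 1 win, 0.5 draw, 0 loss
    $k //= 20;                                      # illustrative default
    my $expected_a = 1 / (1 + 10 ** (($rating_b - $rating_a) / 400));
    my $delta      = $k * ($score_a - $expected_a);
    return ($rating_a + $delta, $rating_b - $delta);
}

# Example: a 1500-rated team beats a 1550-rated one.
my ($new_a, $new_b) = elo_update(1500, 1550, 1);
printf "%.1f %.1f\n", $new_a, $new_b;   # ~1511.4 and ~1538.6
```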
In October 2016 I opened my website using a free design I found somewhere online. Unfortunately, I quickly realized it was not a good fit for the content the site was serving, so I sighed, took a deep breath, opened w3schools.com in my browser, and created my own design. And a CMS too. At least I am happy with the way the site looks now, and even happier that when someone asks a question - on Twitter, Reddit or hockey forums - whether it's possible to measure a specific metric, I am able to answer, 'Already done! Welcome to my website!'
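For the curious: serving the precomputed tables with the Mojolicious framework mentioned earlier takes only a few lines. A minimal Mojolicious::Lite sketch, reusing the hypothetical stats.db table from the pipeline sketch above; the /goals route and JSON output are illustrative.

```perl
use Mojolicious::Lite -signatures;
use DBI;

# Shared handle to the SQLite file built by the summarizing pipeline.
helper db => sub {
    state $dbh = DBI->connect('dbi:SQLite:dbname=stats.db', '', '', { RaiseError => 1 });
};

# One route serving a precomputed summary table.
get '/goals' => sub ($c) {
    my $rows = $c->db->selectall_arrayref(
        'SELECT team, goals FROM team_goals ORDER BY goals DESC');
    $c->render(json => { teams => $rows });
};

app->start;   # run with: perl app.pl daemon
```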
In the end, I'm a software developer, a web designer, a DBA, a sysadmin, a statistician and an SEO amateur. Oh, and a journalist too, since I'm writing a blog.