Wednesday, March 15, 2017

Sundays

So during the dot-com craziness era, I was on a Saturday phone call with a guy who reported to me. I'm running through deadlines and what needs to happen that weekend and I say "Alright, I'll follow up with you tomorrow so we can see where we are."

He says "Dave. Tomorrow is Sunday. It's my day to worship the Lord."

So now I have to think "oh, shit. Legally, I'm pretty sure I have to leave it at that," so I say "No problem, man, I'll talk to you Monday."

I made it a point not to contact him for anything Sunday-related after that.

But about nine months later, same thing. Saturday phone call, going over the schedule and I said something about what he needs to get done "tomorrow." Then I realized what day it was, so I say "Oops, sorry, didn't realize that today is Saturday."

He says "So what?"

Now, all uncomfortable, I had to say "Ummm...isn't tomorrow your day to 'worship the Lord'?"

I've rarely heard anyone laugh that hard. I love the guy, but if he ever works for me again, he owes me nine months of weekends.


Thursday, February 16, 2017

Euler Zero - backlog

It's taking longer to document my approach to some of these early problems than to actually solve them, so I'm developing quite a backlog. If I ever win the lottery (which I don't actually play*, because...math), forget the "I'd be on a beach, or travel" BS that other people always say - I would immediately hire a tech writer.

*Full disclosure: if the potential payout gets large enough, I might occasionally buy a ticket.

Monday, February 13, 2017

Simple LRU cache implementation

This originally came up as an interview question (my sabbatical is over, so...). The requirement was to implement a fixed-size LRU cache; the public interface was given as:
class LRUCache {
public:
    LRUCache(int size);
    void put(int aKey, int aValue); // cache aValue, associated with aKey
    int  get(int aKey);             // returns value associated with aKey
    // ... fill in anything else necessary
};
Obviously, for a cache implementation we want O(1) lookups, so I started with std::unordered_map as the associative container. Unfortunately, I got sidetracked by my initial choice of a backing-store container for the actual values. The "conversational chat" part of the interview went a bit long and we were on a schedule, so I ran out of time. Although my solution probably would have worked, it still sucked (I would have caught that as soon as I had to fix all of my iterators, but literal pencil-and-paper is a major time sink). At the end, it was pointed out to me that a std::list would have been a more natural (and optimal) choice for the required backing storage. This is embarrassingly obvious, so naturally I had to cook up a solution when I got back home.

It only took a few minutes to throw something together, but then I had a few basic questions about the general problem. For example, the definition of get(): what if aKey is not in the cache? What should be returned? I had asked the interviewer and the answer was -1 (completely understandable - what else would he ask to have written in a few minutes?). I'm never a fan of magic numbers, though, so at the time I just put

const int invalid_value = -1;
into the class declaration, since changing the given interface didn't seem to be an option (really: it was an interview question, not a real work problem). Nevertheless, this obviously needed to be fixed once I had the liberty to change the interface definition back at the ranch.
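
For reference, here's roughly what the quick-and-dirty version looks like (a from-memory sketch of the unordered_map-plus-std::list approach, not the exact interview code or what's in the repo; it keeps the -1 sentinel and assumes size is at least 1):
#include <cstddef>
#include <list>
#include <unordered_map>
#include <utility>

class LRUCache {
public:
    static const int invalid_value = -1;  // the interview's "not found" sentinel

    LRUCache(int size) : capacity_(size) {}

    void put(int aKey, int aValue) {
        auto hit = index_.find(aKey);
        if (hit != index_.end()) {
            // already cached: update the value and mark it most-recently-used
            hit->second->second = aValue;
            items_.splice(items_.begin(), items_, hit->second); // O(1), iterators stay valid
            return;
        }
        if (items_.size() >= static_cast<std::size_t>(capacity_)) {
            // evict the least-recently-used entry (the back of the list)
            index_.erase(items_.back().first);
            items_.pop_back();
        }
        items_.emplace_front(aKey, aValue);
        index_[aKey] = items_.begin();
    }

    int get(int aKey) {
        auto hit = index_.find(aKey);
        if (hit == index_.end())
            return invalid_value;   // the magic number grumbled about above
        // found: move it to the front so it becomes the most-recently-used entry
        items_.splice(items_.begin(), items_, hit->second);
        return hit->second->second;
    }

private:
    int capacity_;
    std::list<std::pair<int, int>> items_;  // front = most recently used
    std::unordered_map<int, std::list<std::pair<int, int>>::iterator> index_;
};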

Anyway, after cooking up a quick-and-dirty working solution, I spent a little more time tweaking the public interface a bit and adding some additional functionality. It's now a simple template-based generic object LRU cache implementation that uses std::unordered_map for the associative container, std::list (by default) for backing storage, iterators for results, etc., and has acceptable performance - pretty much everything you'd actually use is O(1). I didn't spend that much time on it, so it could still use some work, but it's at https://github.com/frasnian/LRUCache (I did wait a week before posting it on GitHub, so I hope they won't mind).
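
To give a flavor of the reworked interface (names and details here are illustrative, not necessarily what the repo actually uses - the real thing also lets you swap out the backing container, which is hard-coded to std::list below for brevity), get() now hands back an iterator into the backing store and a miss is just end(), so the magic number goes away:
#include <cstddef>
#include <list>
#include <unordered_map>
#include <utility>

template <typename Key, typename Value>
class lru_cache {
public:
    using store_type = std::list<std::pair<Key, Value>>;
    using iterator   = typename store_type::iterator;

    explicit lru_cache(std::size_t capacity) : capacity_(capacity) {}

    iterator end() { return items_.end(); }   // returned on a cache miss

    iterator get(const Key& key) {
        auto hit = index_.find(key);
        if (hit == index_.end())
            return end();
        items_.splice(items_.begin(), items_, hit->second);  // mark most-recently-used
        return hit->second;
    }

    void put(const Key& key, const Value& value) {
        auto hit = index_.find(key);
        if (hit != index_.end()) {
            hit->second->second = value;
            items_.splice(items_.begin(), items_, hit->second);
            return;
        }
        if (items_.size() >= capacity_) {     // same eviction as above; assumes capacity >= 1
            index_.erase(items_.back().first);
            items_.pop_back();
        }
        items_.emplace_front(key, value);
        index_[key] = items_.begin();
    }

private:
    std::size_t capacity_;
    store_type items_;
    std::unordered_map<Key, iterator> index_;
};

// Usage:
//   lru_cache<std::string, int> cache(100);
//   cache.put("answer", 42);
//   auto it = cache.get("answer");
//   if (it != cache.end())
//       use(it->second);   // hit - no -1 to worry about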

Again, the stable iterators of std::list make it such an obvious choice for the backing container that it is just embarrassing (I originally even considered a list, but for some reason I had a brain freeze and started thinking about O(n) performance for operations I would never have used here). I'm okay with ending my sabbatical, but the first few interviews are probably going to suck.