Sometimes You Just Gotta Rant

So, this is a thing: There’s a billboard in Utah for a dating website called WhereWhitePeopleMeet.com. (I’m not going to dignify either the site or the news outlet that broke the story by linking to them. If you want to dig deeper, I’m sure you have a favorite search engine.) When I saw this site posted on Facebook, there was (predictably) a cadre of white guys chiming in with dipshittery about “well, black people can have their own dating sites…” and challenging anyone to find something fundamentally wrong with this.

As both a white person and a former employee of a well-trafficked Internet dating service, I have things to say.

Challenge Accepted

If you’re an American looking for a dating service where it’s easy to meet white people, your options are damn near any of them. The mainstream sites in America are all predominantly white. All of them. This is because, despite the racist anxieties clung to by so many conservatives, America is predominantly white. In a state like Utah, which is even whiter than the nation at large, the local pool on a mainstream dating website will be whiter still, and when you take into consideration differences in Internet and smartphone usage/penetration, then even that majority will be overrepresented.

To spell it out for any Fox News viewers who might have wandered to this page by accident: Minorities have specialty dating (and other) services because they are minorities and finding people like them (which, for good or ill, is most people’s preference) is harder than it is for those of us in the majority.

Put even more simply: If you believe that this website exists to overcome real obstacles keeping single white people apart, or that it legitimately addresses some fundamental issue of fairness, the nicest thing that might be said about you is that you are pathologically inattentive.

The point of WhereWhitePeopleMeet.com is not to help white people meet (especially white Utahns). The point of this site is obvious: It exists for those who want to avoid non-white people, and who want to connect with others who have that priority. In that sense, I suppose it’s another minority dating website, though its branding is disingenuous.

You can fill in the blanks yourself about what the underlying motives might be for the site’s users. In any case, if you’re not somewhere on the scale between uncomfortable and appalled about such a site, we’re going to have difficulty being friends.

September Writing Challenge, Post 24: Three Things I Wish American Tech Culture Would Learn

Note: I’ve had a couple things holding my attention this week, and as a result missed a couple of days of the writing challenge. I’ll catch up.

One more note: I’m having a slightly ranty day. Bear with me.

There are a bunch of things that could be done to make the tech culture more sane and humane. Here are three that rank highly on my list:

1. Working more hours does not necessarily make you more productive. In fact, it may make you far, far less so. We work in one of the few professions where it is possible to do negative work on a daily basis – that is, to leave the code worse than we found it. We are more likely to do this when we work long hours. Unfortunately, both American work culture and the tech subculture seek to twist overwork into a virtue. It’s not. Overwork leads to bad decisions. If your boss doesn’t understand this, give him the slide deck I linked earlier in this paragraph (which contains a ton of researched information on productivity topics beyond just hours). If he willfully ignores the facts and says he doesn’t believe it, go work for someone smarter, and let him fail on someone else’s broken back. Also: If you think you’re somehow the exception to this, you’re not. There’s ample research out there – I urge you to look it up.

2. Trading sleep for work just makes you dumber, not more productive. This goes hand-in-hand with the issue of long hours; as with overwork, our culture makes a badge of honor out of sleep deprivation. (I was guilty of this myself when I was younger.) When we don’t get enough sleep, it degrades the quality of our work, and our ability to notice how much our work has degraded. This may be a reason so many people think they’re exceptional in this regard. Spoiler: They’re not. Again, there’s loads of research; Google is your friend.

3. The software profession is not a meritocracy. At least, it’s not if you’re black or a woman. This is made worse by the fact that white guys in the profession often think they’re too smart to have unconscious biases about race, gender, sexuality, &c. It’s made worse still by the fact that most of us in the profession who are any good at it actually did work hard to get there, and feel there’s merit in the rewards we’ve gathered. But if it’s not a meritocracy for everyone, it’s not a meritocracy for anyone, and those of us on the inside need to check our privilege and start examining our own behavior.

</rant>

September Writing Challenge, Post 1: The Worst Thing I Have Seen in Object-Oriented Code, and How to Fix It

This is post #1 in my 30-day writing challenge for September. A couple of notes:

  1. This post is about a technical topic (because I am, after all, me), but not all my posts this month will be. So if today’s post bores you or makes your eyes glaze over, try again tomorrow.
  2. This went way over the five minutes allotted – so far over that I just gave in to the urge to write a (nearly) complete post about it. I think one thing I might try to learn from this challenge is how to break down and/or distill an idea to where it can be expressed in five minutes of top-speed typing – so 400-500 words with no links, editing, or formatting, less if I want to be fancy. And maybe I could back off on the editing perfectionism. It might also help if I avoid topics that activate my ranting gene.

Anyway, on with today’s episode…

This topic goes a little beyond “pet peeve” for me. It’s an anti-pattern I’ve seen far, far too often from novice (and, more rarely, intermediate) coders in object-oriented languages, and it’s guaranteed to make code bug-ridden and insanely expensive to maintain. I know this because I’ve had to alter, debug, and refactor code like this.

Here’s how it happens: You have a class – let’s say, a table cell. The cell shows an employee record with, let’s say, name, ID#, and salary. Very straightforward.

A requirement is added that two different employee types (regular employee and manager) should have different background colors. So you do something like this in your Employee class (pardon my Swift):
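Something roughly like this (a minimal sketch; the names are just illustrative):

```swift
// A sketch of the Employee class; all names here are illustrative.
enum EmployeeType {
    case employee
    case manager
}

class Employee {
    let name: String
    let employeeID: Int
    let salary: Double
    let type: EmployeeType

    init(name: String, employeeID: Int, salary: Double, type: EmployeeType) {
        self.name = name
        self.employeeID = employeeID
        self.salary = salary
        self.type = type
    }
}
```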

…and then in the code that loads the table cell:
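Again in sketch form, assuming the Employee above and a UIKit table cell:

```swift
import UIKit

// Somewhere in the table view data source, while configuring the cell:
func configure(_ cell: UITableViewCell, for employee: Employee) {
    // ...populate the name, ID, and salary labels...

    // Pick the background color based on the employee's type.
    switch employee.type {
    case .employee:
        cell.backgroundColor = .white
    case .manager:
        cell.backgroundColor = .lightGray
    }
}
```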

So you’ve adjusted the background color according to employee type. The requirement is met. Pretty benign, yes?

Then the requirement is added that company officers (who are also managers) need an extra line added to show their equity-based compensation. So, you add another type to the enum (now it’s Employee/Manager/Officer), and in the function that computes the table cell’s height, you add an if statement that computes the height one way for an Officer, and another way for everyone else. And oh, yeah, you go back and make sure the Officer case is covered when you set the table cell’s background color.
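Sketching it out (the enum grows a case, and the height constants here are made-up placeholders):

```swift
import UIKit

enum EmployeeType {
    case employee
    case manager
    case officer
}

// Placeholder layout constants, just for illustration.
let standardRowHeight: CGFloat = 60
let equityLineHeight: CGFloat = 20

func cellHeight(for type: EmployeeType) -> CGFloat {
    // Officers get an extra line for equity-based compensation.
    // (And don't forget to go add .officer to the background-color switch.)
    if type == .officer {
        return standardRowHeight + equityLineHeight
    } else {
        return standardRowHeight
    }
}
```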

Then you add a requirement that you must handle contractors differently: They need a third background color, they show hourly rate instead of salary, they need an extra line in the table cell – but this one shows their security clearance – and when the cell is selected, it takes you to a different kind of detail view than the other types. So you add a type to the enumeration, add a case to the background color switch statement, change how the cell draws so that Contractor gets an hourly rate, while everyone else gets a salary, change the if in the cell height computation to be a switch and compute the height for a Contractor like you do for an Officer (since they’re both adding an extra line), and you add another conditional in the cell selection response to differentially choose what kind of detail screen comes up based on employee type.
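In sketch form, the pile now looks something like this (same placeholder constants as before):

```swift
import UIKit

enum EmployeeType {
    case employee, manager, officer, contractor
}

let standardRowHeight: CGFloat = 60
let extraLineHeight: CGFloat = 20

// Background color: yet another case to remember.
func backgroundColor(for type: EmployeeType) -> UIColor {
    switch type {
    case .employee:   return .white
    case .manager:    return .lightGray
    case .officer:    return .lightGray
    case .contractor: return .yellow
    }
}

// Cell height: Officer and Contractor conflated, since both add one extra line.
func cellHeight(for type: EmployeeType) -> CGFloat {
    switch type {
    case .officer, .contractor:
        return standardRowHeight + extraLineHeight
    case .employee, .manager:
        return standardRowHeight
    }
}

// ...plus more conditionals for salary vs. hourly rate, and for which detail
// screen to push when the cell is selected.
```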

But then the Officer type needs to add yet another line to the table cell to indicate how many shares of the company the officer holds, so you have to go break apart the conflated Officer and Contractor cases in the cell height computation and turn them into two separate behaviors.

And then a type is added for a part-time employee, which also has an hourly rate but no extra line…

And you start using this type enumeration to switch between business logic cases elsewhere in the code…

(By this point, many of you know where I’m going. If you don’t, please, please read on.)

Every single one of those switch/case statements becomes an opportunity to forget to add a case. (Less so in a few languages, such as Swift, that demand that you cover all cases – but even then it’ll bite you if you add a default case.) Very quickly, every switch/case becomes a tangled mess where cases are conflated and you’ll have to separate them when requirements change. What used to be a simple table cell becomes a 2000-line behemoth. Listen: I am not exaggerating. I have seen this table cell with 2000 LoC, and methods hundreds of lines long, because of exactly what I’m describing here.

And what do you think the Employee class looks like? Or any other class that touches the Employee class?

Every time you have to make a change related to employees, you dread it, because you know you’ll spend a day or more combing through all the cases and debugging all these complex decision paths for even the simplest change.

Where did we go so wrong?

The screw-up was here:
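This spot in the sketch from earlier, where the Employee carries its type around as an enum value:

```swift
class Employee {
    // ...name, ID, and salary as before...
    let type: EmployeeType   // the object drags a type code around with it

    init(type: EmployeeType) {
        self.type = type
    }
}
```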

Actually, even that is too much. The screw-up was here:
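The enum itself (still the illustrative names from the sketch above):

```swift
enum EmployeeType {
    case employee
    case manager
}
```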

If you catch yourself representing a type with an enumeration (or a similar device in your language of choice), stop. Stop. Stop! STAAAAAAAAHP.


There’s a better way to represent types in an object-oriented language. You represent a type with a type:
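Here’s a sketch of what that can look like. I’m using Worker as the name of the common base class, and the rest of the names are illustrative:

```swift
// The base class holds what every kind of worker shares...
class Worker {
    let name: String
    let workerID: Int

    init(name: String, workerID: Int) {
        self.name = name
        self.workerID = workerID
    }
}

// ...and each subclass carries only what makes it special.
class Employee: Worker {
    var salary: Double = 0
}

class Manager: Employee { }

class Officer: Manager {
    var equityCompensation: Double = 0
    var sharesHeld: Int = 0
}

class Contractor: Worker {
    var hourlyRate: Double = 0
    var securityClearance: String = ""
}
```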

And then the thing you absolutely do not do is switch on the class of an employee instance to determine behavior in other classes like your table cell. (I’ve seen that done too, and it’s even worse than the enumeration anti-pattern.) When a component like a table cell needs to change behavior based on a type, you make a table cell class for each class of worker you have to represent. Each child class contains only the things that make it special, and commonalities are moved up to a parent class.
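For instance (again a sketch, assuming UIKit table cells and the Worker classes above):

```swift
import UIKit

// The cell hierarchy mirrors the Worker hierarchy. The base cell lays out
// what every worker shares; each subclass adds only what makes it different.
class WorkerCell: UITableViewCell {
    func configure(with worker: Worker) {
        textLabel?.text = worker.name
        // ...the rest of the shared layout: ID, base row height, default background...
    }
}

class EmployeeCell: WorkerCell {
    // Shows salary; standard background color.
}

class ManagerCell: EmployeeCell {
    // Only difference: the manager background color.
}

class OfficerCell: ManagerCell {
    // Adds the equity-compensation line (and, later, the shares-held line).
}

class ContractorCell: WorkerCell {
    // Hourly rate instead of salary, a security-clearance line, its own
    // background color, and its own detail screen on selection.
}
```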

What you wind up with instead of the tangled mess of switch/case statements is a larger number of much smaller classes, each with a radically reduced number of conditional statements – which means a radically reduced number of chances to screw up.

This is what objects are for in an object-oriented language – tying together groups of related types that have some common behavior and some divergent behavior, while minimally expressing the things that make each type divergent. This is the primary way we manage complexity in the OO milieu.

Of course, now we have multiple parallel class hierarchies – a hierarchy of Worker types, a matching hierarchy of table cell types, maybe a matching hierarchy of components to calculate compensation… Does that sound unwieldy or difficult to manage?

You and I are not the first to notice the problem – it is addressed with the Abstract Factory Pattern. I’d describe that to you, but it is amply explicated elsewhere. And I urge you to learn it, because it’s tremendously valuable for managing complexity in your code.

And I promise, I’ll follow up this post with an example of how to use that pattern in a case like this. (Though possibly not in September.)

Does anyone else have any computing language abuses they’d like to share? Any language, any paradigm – drop some science in the comments below!