Recently, I’ve put together two tiny projects mostly for personal usage.
Usually I strive for my projects to uphold good code practices as far as possible. I consider myself a bit of a perfectionist, and I want my code to be a joy to work on, both for me and for others.
But that’s hard. As I alluded to in the previous post on here, it seems that I can’t hold focus on working on a single project for too long.
I’ve been wondering how to alleviate this. Perhaps the aforementioned idealism is at fault.
Recently, a friend asked what has informed my philosophy on programming and computing in general. Even though I couldn’t exactly give a list of citations, it made me think about what my philosophy even is, and about putting it down in concrete terms. So I’ll attempt to list my opinions here, as they apply to software I create.
- A project should have the bare minimum set of dependencies. Trivial code should be authored by oneself, and the necessity of any libraries should be thoroughly considered. Prevent NPMization as far as you can.
- Corollary to the above: the affordances provided by the language’s or runtime’s standard library should be exploited, and it’s safe to rely on them as long as they’re not known to be subject to deprecation.
- Security should be baked in from the start. It’s easier to fix a security problem if you already have the necessary infrastructure to do so as opposed to tacking it on after the fact.
- Don’t fight the environment. Follow the paradigms that the runtime supports best.
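To make the second principle a bit more concrete, here’s a small PHP sketch (the tasks and values are made up for illustration): several chores that routinely attract third-party dependencies are already covered by the standard library.

```php
<?php
// JSON serialization: no library needed, json_encode() is built in.
$json = json_encode(['name' => 'Ada', 'id' => 7]);

// Query-string parsing: parse_str() handles it.
parse_str('page=2&sort=date', $params);

// A crude URL slug: just preg_replace(), trim(), and strtolower().
$slug = strtolower(trim(preg_replace('/[^A-Za-z0-9]+/', '-', 'Hello, World!'), '-'));

echo $json, "\n";            // {"name":"Ada","id":7}
echo $params['page'], "\n";  // 2
echo $slug, "\n";            // hello-world
```

None of this is exotic, which is rather the point: before reaching for a package, it’s worth checking whether the runtime already does the job.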
The cost of perfection
For multiple reasons, both of the aforementioned projects were written in PHP. As for the last principle outlined above, the paradigm that PHP supports best seems to be mostly object-oriented programming, with some imperative programming mixed in.
OOP is likely the most popular programming paradigm in use today. Programming languages such as Java which support and encourage or even force OOP are among the most popular.
My experience with OOP has been a mixed bag at best. I believe I know OOP principles well enough insofar as they’re applied in practice, and I’ve both contributed to a few projects and written plenty of my own, following best practices as well as I could.
However, OOP can be taken to an extreme, and it often is. OOP encourages thinking about the environment and the world of the program as composed exclusively of other OOP primitives—mostly, objects—and boxing everything else into OOP categories so no trace of other styles remains.
This may sound a bit disparaging when put this way, but creating a pure OOP design really does seem like catnip to some minds. I’ve been struck by this idea multiple times: taking object-oriented principles as far as possible, creating an immaculate system of objects that only talk to other objects through well-defined interfaces. The resulting project was probably perfectly testable, and its components were fiercely decoupled.
Yet I can’t say I was happy with it afterwards. Sure, it might seem fun to build all the scaffolding around the fundamentally imperative idea of sequential commands mutating data, to make it look more at home in an object-oriented world, but I find working on such code afterwards quite hard.
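To make the contrast concrete, here’s a deliberately exaggerated PHP sketch (all class and function names here are invented for illustration, not taken from my actual projects): the same trivial task, once wrapped in interface-and-injection scaffolding, and once as a plain function.

```php
<?php
// The "immaculate" version: an interface, a final class, constructor
// injection — decoupled and testable, but a lot of ceremony for one string.
interface GreeterInterface {
    public function greet(string $name): string;
}

final class TemplateGreeter implements GreeterInterface {
    public function __construct(private string $template) {}

    public function greet(string $name): string {
        return sprintf($this->template, $name);
    }
}

// The bodge: one function, no scaffolding.
function greet(string $name): string {
    return "Hello, $name!";
}

echo (new TemplateGreeter('Hello, %s!'))->greet('world'), "\n"; // Hello, world!
echo greet('world'), "\n";                                      // Hello, world!
```

Both produce the same output; the difference is how much machinery stands between the reader and what the code actually does.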
Perhaps it’s just confirmation bias. Until now I haven’t really tried stepping out of the OOP zone for my web projects in particular, even though I’ve done some others in the meantime. But I do feel strongly that the OOP paradigm tends to be taken to an unusual extreme more often than not.
In any case, my resolution is to follow the call of the bodge: swap the code design idealism for taking some shortcuts, as long as they don’t contradict the scope of the project at hand.
Though the inner idealist might suggest that it’s worth putting all the effort into every single detail of every single project, that isn’t sustainable and will just kill the joy of working on anything before it gives tangible results.
The only two things that will matter in the end are whether the project works, and whether it works well for its users. It’s okay to put in details that the user will notice, but obsessing over the technical side will only get you so far, which actually isn’t that far at all.
If the project never ships, it has failed both of these goals. Therefore it’s worth making the tradeoff of relaxing code quality requirements in order to ship the project in the first place.
Despite what I claimed earlier, I acknowledge that the cathedral-ish nature of what I call “extreme OOP” has its merits and is popular for a reason. If one believes in test-driven development or extensive automatic testing, properly using object-oriented paradigms greatly simplifies accomplishing these goals.
Unfortunately, my aforementioned projects, as well as others I’m considering, are so small, menial and integrated that putting significant effort into unit testing would prolong their development cycle quite a bit, and they can usually be tested manually much faster.
This is somewhat in line with the opinions of people like Marco Arment and “Underscore” David Smith, who don’t do much, if any, unit testing on their apps, mostly testing them manually instead. I can’t commit to suggesting anyone else do this because it seems to go against good practice, but perhaps there is a place for a category of people who aren’t aligned with automatic testing.
Perhaps it’s worth pondering whether what we call “good practices” are much more than social conventions, like the other norms we’ve been following without giving any thought to them, both within society and within programming.
I can’t claim this is a panacea by any measure. I’ve also noticed that this post speaks almost exclusively about myself and my experiences and opinions. It seems like this blog doesn’t actually have any recurring readers, so with this post I’m treating it more as a place to dump my thoughts in search of some clarity, and perhaps one day it will either come back to haunt me or be useful to someone else. Until then, it is what it is.