The Art of Writing Garbage Software

Written By: Duncan Robertson
Published: Sun Nov 03 2024

I've been writing software for a very long time. I'm 31 as of writing and I started somewhere around the 6th grade, so approximately 20 years, which translates to the vast majority of my life. Before I was coding myself, I was listening to my dad talk about coding, as he was employed as a developer. In my time I have written a whole lot of code, but if you peruse my public repositories, you might notice they are, how you say, empty. The reason for that is that almost 100% of the software was in fact ✨garbage✨.

By garbage, what I really mean is code not intended to be a product, code not intended to be open sourced for a community to use, and code not carefully formatted and organized as a showcase of talent. Usually, it was an exploration of a single concept, programming feature, language, or puzzle. I would plug away until I reached the level of code completion that let me be satisfied with what I learned, or until the code became so tangled and knotted that I knew I was approaching things incorrectly and would need to regroup later. Both were perfectly acceptable outcomes to me, as I was interested in learning something, but I feel like this kind of approach has become disincentivized, and I think there are a couple of reasons why.

Entrepreneurship in Software

Technology, and by extension software, has become a poster child for startup companies, especially in the United States. Trillion-dollar companies Apple, Facebook, Google, and Microsoft are all homegrown American success stories of a few entrepreneurial kids growing those companies by writing killer software. This sets a backdrop where both investors and innovators are trying to hit it big in the tech space. Everyone wants to find that next big company and ride it all the way to the bank.

Business and computer science are a popular combination in university, either as a double major or some combination with a minor. It makes a lot of sense, as there's big venture capital money available for anybody who can produce an innovative new app that gets its hooks into a large population. The fervor for the next big thing means you can be handed a blank cheque even if you have absolutely no path to profitability. Twitter famously has operated at a loss nearly every year since it was founded. It's no mystery why; it has to handle a tremendous amount of network traffic and it has very few options to capitalize on people using its services. It doesn't matter, though: because it has a huge audience, it received funding on the bet that if it ever finds a way to effectively make money per user, the result will be huge returns.

This seems to have warped the learning of coding towards the end goal of creating products. There are endless tutorials on how to make Twitter/Instagram/Facebook/Amazon clones, and this is starting to affect the expectations of companies hiring coders. It's becoming commonplace to need a portfolio of project apps when applying to a job. These should be at least semi-polished, functioning applications that show off your elite hacker skills. If you are interested in pursuing entrepreneurship, obviously I have nothing against trying to stack a portfolio. After all, you are trying to launch one of them to make money. However, if you are trying to work as a developer for a company, innovating and launching new ideas will almost certainly be a minuscule part of what you do. So why are companies expecting you to have them?

In my opinion, this is mostly a pacifier for HR. They can click through the application, and if it feels good to use and there are lots of code files that go along with it, they can feel confident that the applicant is a good developer. Is that true? Hard to say. The applicant could have copy-pasted from a tutorial, or had an AI like ChatGPT or Copilot write the code for them. It is impossible to determine whether someone truly understands their own code just by reading it. Sure, the team's technical lead can review the code for the app, which they almost certainly won't fully do because they're busy and that's a lot of code, but even then, they will mostly be looking for bad code smells. This all blends into the main issue.

HR Filtering for Developers is Hard 

Having good software developers is becoming more and more important for most medium to enterprise businesses, and many small businesses as well. There's also a record number of people applying for these developer roles with varying degrees of skill. So how do you separate the good from the bad? Well, it's harder than you might think.

Problem 1: Lack of Accreditation 

Anybody can code, which is awesome, but it's also part of the problem. If you want to become a software professional, you can learn the entirety of what you need for free online, study for a certificate, attend a bootcamp, obtain a computer science degree, or any combination of these to put on your résumé. Do any of those things prove you can do the job of a developer? Absolutely not. A hiring company may put more or less value on one thing or the other, but honestly, at best they show you may possibly have some of the skillset required. Experience is probably your best indicator, and even that isn't foolproof.

Other industries have standardized job levels with licensing requirements. These can involve taking and passing practical and written tests that show you meet the level of skill required for that license. Certifications in software are the closest thing to this but they are often a better representation of the candidate's ability to memorize exam material versus their practical ability to use that software. 

Problem 2: Lack of Options to Test Development Skills 

Well, if there's no standardized licensing, why doesn't the hiring company create its own tests as part of the interview? This is a common practice nowadays when hiring a developer, but it has its own pitfalls. Tactics include:

  • Requiring the candidate to write a solution to a problem with no internet access 

  • Requiring the candidate to answer a question about how a given piece of code functions 

  • Requiring the candidate to take home a coding problem and write a full solution and return it 

  • Requiring the candidate to answer complicated algorithm theory questions like how to invert a binary tree 

  • Requiring the candidate to answer "creativity" questions to test how they think, AKA the Google special 

You'll note that all of these "tests" somehow artificially limit the tools available to the developer. On top of that, they miss what I consider to be the single most important element of being a developer: iteration based on feedback. In software, solutions are very rarely "right" immediately; it's much closer to traditional writing than some people might think. There are usually first, second, and third drafts required before software functions just right. This should be expected and encouraged, but it is extraordinarily hard to distill in an interview setting. 
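To make the algorithm-trivia tactic concrete, the binary-tree inversion mentioned above is a question that is trivial once you've seen it and opaque if you haven't. A minimal Python sketch (class and function names are mine, purely for illustration):

```python
# "Invert a binary tree": mirror it by swapping the left and right
# children at every node, recursively.
class TreeNode:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def invert(node):
    """Return the mirror image of the tree rooted at node."""
    if node is None:
        return None
    # Swap the (already inverted) subtrees.
    node.left, node.right = invert(node.right), invert(node.left)
    return node

# The tree 1-(2,3) becomes 1-(3,2).
root = invert(TreeNode(1, TreeNode(2), TreeNode(3)))
```

Four lines of actual logic, yet it has famously been used to reject experienced developers; knowing the trick says little about day-to-day ability.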

Problem 3: Nobody Wants a Noob 

The software environments of today's modern workplaces are vast and complicated, and companies don't want to hire a developer who doesn't have specific experience with the technologies they work with. So much so that companies started standardizing on technology stacks that developers have experience with. For example, big web stacks include LAMP (Linux Apache MySQL PHP), MEAN/MERN (MongoDB Express Angular/React Node), .NET, and Java.

The reason for this is obviously so that companies can more directly target developers and filter out candidates without experience in their technologies of choice. The problem with this is you are potentially excluding good candidates who don't have the prerequisite experience with the technologies you use. Entry-level jobs are routinely posted requiring years of technical experience, which seems unfair at best and cruel at worst. There are many examples of job postings requiring more years of experience with a technology than that technology has existed, or of the creator of a technology being rejected due to lack of experience.

This problem gets worse the second a company starts to work with a more esoteric technology stack. If you've got a lesser-known web application server and pair it with a smaller JavaScript framework, you're in for a bad time. The majority of the people who have experience with that stack already work for your company, and trying to hire someone with direct experience in all of your technologies is a fool's errand.

At the end of the day, developers are not factory workers. You can't reduce the job to familiarity with a technology and expect equal results from all contributors who know that technology. The job is much closer to an art, where a good artist is able to fluidly switch between the required mediums.

Advantages of Garbage 

Getting good at anything requires a lot of time investment. It's just the nature of the beast that anybody who has ever been really, really good at something has spent the hours. The amount of time required to become proficient in something is measured in the thousands of hours. Someone may have a natural talent but they must still put in the work to achieve results.

What do you work on while you are getting good, then? The answer in most cases is that you produce some hot, stinky garbage. Think about this from a musical perspective. I don't know if you've ever had the pleasure of being in the presence of someone learning an instrument for the first time. It's usually not recognizable as music. It's going to take years of exercises, music theory, and muscle memory before they reach competence. Then there's a whole other level between being able to play something and being able to compose something.

If someone attempts to learn music by simply copying the notes of a song, they will most likely wind up becoming a mimic. They will know individual songs by memory, but there's something missing in the fundamentals. This is actually incredibly common in the musical space, as it's a fast track to playing whole pieces and many people have a great talent for mimicry. These individuals usually hit a ceiling, however, as pieces get too complex to match by ear, and they struggle to contribute to composition. To improve, they usually have to go back to the fundamentals and put the work in.

These garbage reps are consistent across high-skill fields. Artists don't start out sculpting beautiful faces, welders don't start with tight seams, and gamers don't just start out able to beat bosses in Elden Ring. It's the same with coding: you need the garbage reps before you will be able to start producing high-value work, but the industry is trending towards teaching the skill as mimicry. With the advent of coding AI models, the problem escalates, as it's even easier to mimic results without understanding how they work.

How to Write Garbage 

The most important part is to start with a blank slate. If you think the technology you work with can't operate from a blank slate, that's a perfect garbage project: try removing the boilerplate project templates and build systems and making it work anyway. Understanding all the technologies in the build system for your language is very valuable. I understand you might not be able to do this right away, so just try to keep things as blank as you possibly can.

Now that you've got your blank slate, you need a concept. Maybe it's a type of encryption, or a system integration, or just a general problem like how to implement a data type or algorithm. Now that you've got your problem, do not look up how to implement it. You can use the reference documentation for your language and perhaps the Wikipedia description of your problem if it exists, but you need to work through the solution without help. This is important because the problem you are working on almost certainly has a very well optimized and defined solution, and you don't want it. You should create a solution from whole cloth.
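As one hypothetical example of what a from-whole-cloth data type might look like mid-project, here's a naive hash map sketched in Python, with nothing borrowed from a reference implementation (the name GarbageMap and all design choices are invented for illustration):

```python
# A garbage-project first draft of a hash map: a fixed list of
# buckets, no resizing, no load-factor tuning, no deletion yet.
class GarbageMap:
    def __init__(self, bucket_count=16):
        # Each bucket is a plain list of (key, value) pairs.
        self._buckets = [[] for _ in range(bucket_count)]

    def _bucket(self, key):
        # Lean on Python's built-in hash(); modulo picks a bucket.
        return self._buckets[hash(key) % len(self._buckets)]

    def set(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite an existing key
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)

m = GarbageMap()
m.set("answer", 1)
m.set("answer", 2)  # second set overwrites the first
```

It's slow, it never resizes, and collisions degrade it to linear scans; all of that is fine, because discovering those flaws yourself and refactoring them away is exactly the learning the project exists for.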

Work through the process of making it function. You should be familiar with all aspects of the code because it's 100% your garbage, which means when it's not working it's your fault, and when it's slow it's your fault. Since you don't have a predefined solution, this should take you multiple iterations, and here is the value: going from nothing to a solution, refactoring as you go to get there. 

You can use LeetCode as a jumping-off point when you're just getting started with this, but ideally you create your own slightly larger garbage projects. These projects should be small enough to throw straight in the trash when you're done, but not so simple that they contain only one or two functions. The scope is up to you, but if you've reached the point where you're starting to implement "new features," it's probably not garbage anymore. If that's the case, you should start to evaluate whether you really think the project is worth continuing. There's absolutely nothing wrong with having a real project, but they represent a higher time investment and less learning potential.

This website started as a garbage project for me and is slowly creeping into something more like a real project. It's still garbage and could wind up binned, but it can return some real value if I keep it going. Specifically, one of the values it provides is the ability for me to quickly write garbage web applications and tinker with those technologies. Don't waste your time writing code to impress an HR rep or hiring manager; write code that has value to you. It will make you a far better developer, and even if it doesn't show up on paper, it will show up when people evaluate your results.