What every new software developer should know
The tech landscape changes dramatically every 5-10 years. And we poor software developers have to constantly run and re-skill just to catch up. Was it always like this?
As a software developer today, you get bombarded with an unending stream of articles & videos convincing you that you have to switch to X database, Y programming language, or Z cloud service. If you're just starting out, this deluge of "information" can be overwhelming at best and downright impostor-syndrome-inducing at worst.
"How am I ever gonna learn this infinitely long list of best practices, design patterns, libraries, performance optimizations, scaling strategies, and developer tools? I'll never be able to make an app!" you exclaim. "How did the facebooks, githubs, googles, instagrams, and airbnbs of the previous generation accomplish all this with just 2 people?".
The short answer is: they didn't. Let me explain. Like I said, the tech landscape changes every 5-10 years. If we go back in time to the 2000s, it was a totally different scene:
- companies were just starting to break out of the dot-com crash,
- venture-capital funding and startup ecosystems were not widespread,
- the "cloud" didn't quite exist, and
- smartphones didn't exist ("What?! So people just...read newspapers on the toilet?")
The economic and technological realities of that era dictated how technologies were marketed on the web, which in turn shaped the experience of new software developers entering the field. Let's break them down one by one.
1. The dot-com crash
It might be hard to believe, but a couple of decades ago "internet" and "web" were dirty words which most investors wouldn't touch with a 10-foot pole. Why? Because a lot of companies from this era turned out to be overselling and overhyping their internet/web products, and ended up burning through their huge investments with no revenue to show for it. (The modern-day equivalent of this is crypto/web3).
All in all, investors lost trillions of dollars during this 1999-2001 era of market panic, and many internet companies went out of business. To see how truly drastic this was, have a look at this graph of total American VC funding per year since 1995.
Suddenly, there was much lower demand for web developers. New companies forged in the ashes of this era had to operate lean and accomplish a lot with few resources. They couldn't afford to hire large teams of devs and give each one their own microservice to develop. The founders did everything themselves, and they did it using simple monolithic web frameworks.
Why monolithic frameworks instead of microservices? Well, there's a really cool pattern called Conway's Law, which states that a system's design mirrors the communication structure of the people who create it. Since the "people" during this era were usually just 1-2 founders, the system design that best matched their social organization was a monolith.
Interestingly, there weren't many monolithic frameworks around - basically one canonical framework per programming language, e.g. Rails for Ruby or Django for Python. It was easy for new developers to know what to do: just pick the framework for the language you're comfortable with and use that. Easy!
This is not to say that web development itself was easy during this era. We take for granted how much better today's tooling and access to knowledge are. To be clear, what I'm saying is that choosing what to learn was easier for a new developer back then than it is today.
2. Less funding
The startup ecosystem as it is today was not so well established 20 years ago. Your company couldn't get millions of dollars in funding just because you had the latest tech buzzword in your name...investors were wary of that due to the aforementioned dot-com crash.
If we follow the money, we can see that less funding meant lower demand for developers and hence less marketing targeted at developers. Which meant fewer shiny frameworks, libraries, programming styles, databases, and developer tools to distract you from mastering the monolithic tool you already know and love.
We don't have this luxury today - if you are a junior developer, your skills are in extremely high demand and may be so for decades to come (yay?). Companies have no interest in empowering you to create products yourself - they need large teams of employees, not founders, so they do everything in their power to convince us to learn tools that they use in-house. Not that being an employee is necessarily a bad thing. Some ambitions, like serving videos to billions of people or taking humanity to Mars, can only be accomplished by large groups of employees working together.
But as a junior developer, maybe you just want to make a personal blog or send a tweet every time your plant needs watering. Your projects, hacks and startup ideas at this stage can totally be accomplished using simple tools like a monolithic framework running on a basic server. You don't really need to learn modern enterprise tools, languages or frameworks. But alas...tech companies need you and will do all sorts of subtle marketing to convince you to learn these technologies.
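To make that concrete, here's a rough sketch of what "a monolithic framework running on a basic server" can look like. I'm assuming Flask here purely because it's compact (Django, Rails, or any classic monolith has the same shape), and the routes and posts below are made-up placeholders, not a prescription:

```python
# app.py - a one-file "monolith": routing, data, and pages all live together.
from flask import Flask, abort

app = Flask(__name__)

# A list is enough to start; you can swap in SQLite later without changing the shape.
POSTS = [
    {"slug": "hello-world", "title": "Hello, world", "body": "My first post."},
    {"slug": "thirsty-plant", "title": "My plant tweets now", "body": "A moisture sensor and a cron job."},
]

@app.route("/")
def index():
    # Build a simple list of links to each post.
    links = "".join(f'<li><a href="/{p["slug"]}">{p["title"]}</a></li>' for p in POSTS)
    return f"<h1>My blog</h1><ul>{links}</ul>"

@app.route("/<slug>")
def post(slug):
    # Look up the requested post; 404 if it doesn't exist.
    for p in POSTS:
        if p["slug"] == slug:
            return f'<h1>{p["title"]}</h1><p>{p["body"]}</p>'
    abort(404)

if __name__ == "__main__":
    # Run directly on a cheap rented virtual server.
    app.run(host="0.0.0.0", port=8000)
```

Run it with python app.py on a basic rented server and your blog is live. No containers, no clusters, no cloud console. Most of what gets pitched beyond this is the marketing machine at work.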
Get used to being bombarded with one-year subscription discounts, content marketing (i.e. tech "articles" that are actually just ads), free student subscriptions, free ebooks, free cloud functions, free microservice monitoring, and so on. Let's be frank: the purpose of all this marketing is either to attract developers to work at these companies, or to convince devs to integrate (the paid versions of) these tools into their current company's workflow.
3. No "cloud"
What even is the "cloud"? Many say it's "just someone else's computer/server". While that is a useful simplification, it's not entirely true. It would be more accurate to say that the cloud is a collection of resources (memory, storage, processors) that appears to be running as a single computer/server/endpoint. This is pretty great and allows a whole new level of flexibility in scaling large apps. But wait...if the cloud didn't exist in the 2000s, does that mean it was impossible to scale apps? Of course not! The truth is that for most of us (and especially new hackers), the cloud and its scaling capabilities offer no benefit whatsoever over renting a normal virtual server.
What?! If there's no benefit, then why are we being told to learn containerization, container orchestration, load balancing, database clustering, microservice architecture, API payload serialization, content delivery networks, and more? Again, the answer comes down to this: large companies that deliver their services to millions of people need teams of employees who can work with their large-scale systems. These technologies surely benefit large companies and sometimes even society, but they rarely benefit you, the solo hacker, who just wanted to put your personal blog on the web and share it with people. (Well, they do benefit you in terms of the eye-popping salaries that are now common in the industry.)
If we invoke Conway's Law again, notice the new design pattern we are all told to learn: as companies hire larger and more geographically distributed teams of developers, their monolithic codebases have splintered into dozens of microservice fragments. Coincidence? Nah. It's Conway's Law, baby - the system design (microservices) mirrors the communication structure of the people (distributed teams) who create it.
4. No smartphones
Ah, smartphones. Just uttering the word evokes a myriad of powerful imagery, from the disruption of major industries, to cute videos of cats, to mass swaying of democratic elections via social media manipulation, to ordering food by clicking on a picture of the food. But I digress - we're here to talk about life on the web as seen through the eyes of a new software developer.
In the 2000s, before smartphones, the only major interface to the internet was the open web. And that was mostly accessed on desktop computers or laptops using a web browser. That's it. For a new developer, this was a simple reality to contend with - just learn web development and you're off to the races! (Browser incompatibilities notwithstanding - I'm looking at you, Internet Explorer). In fact, it was so simple that you could literally right-click on any webpage, open the developer tools, and study its HTML and CSS to learn from it.
The tech ecosystem today has, sadly, become closed off and fragmented. If you make a cool new app and want to deploy it to everyone's phone, you have to choose between making it a web app, a progressive web app, a web app inside a native container, or a native app. And if you go down that last route, you're faced with the horror of learning both iOS and Android development, the latter of which has a fragmented device ecosystem of its own. How would a new developer even know how to make these choices? And why should they have to make these pointless choices at all?
Don't get me wrong, I think smartphones are modern miracles, and the companies behind their proliferation do deserve their rewards for advancing the world. Whether the best way to reward them is to let them gate-keep separate, proprietary "app stores" and charge hefty payment fees, I don't know. But for a new developer, none of this matters - all they see is that they have more platforms to target, more programming languages to learn, and more hiccups to face in their learning journey.
More problems = more marketing
Whew! Between competing frameworks, cloud tools, and multiple target platforms, there's a lot of information for new developers to juggle and filter as they're learning. The irony of all this is that the more fragmented things get, the more companies and services spring up that promise to "simplify", "manage", and "optimize" all this complexity. Which means more marketing and hence more noise that distracts the young developer on their learning journey. It's a vicious spiral where noise begets more noise, and before we know it, the web will turn into a giant advertisement-delivery machine with only rare traces of original content. (Maybe it already has).
People > Technology
I hope I gave you important context on why today's tech world is the way it is. Perhaps these trends of rapid change and rampant marketing were clearer to me because I come from an embedded software / firmware background where technology evolves extremely slowly (most of us are still using C99 or, if we're lucky, C11). The 4 points I discussed above have had minimal impact on embedded software, although that is starting to change as well.
In the end - regardless of which technologies I've used in my day jobs, side projects, or startup ideas - it was my colleagues and online teachers who taught me everything I know. I have deep respect and gratitude especially for those mentors and tutors who encouraged solid foundations and kept things simple when I was a beginner, instead of adding to the noise. It made development a lot easier and set me up for a joyful work-life balance. I hope, dear reader, that you can achieve the same 🧘🏽‍♂️
P.S.
This piece was inspired by the philosophy of minimalist coders like Pieter Levels, Miguel Grinberg, Caspar v. Wrede, Jon Yongfook, and many others.
A big thanks to Tom Hart, Matija Han, and Natalie Plociennik for reading drafts of this and giving honest, helpful feedback!
By the way, if you're still here - I'm actually a MicroPython educator and I create written tutorials and videos right here! If you're into home hacks, electronics, IoT, or shiny rainbow lights and want an easy (and free) way to learn it all, just hit Follow/Subscribe below and I'll email you my next tutorial! If you'd like to ask questions, chat with me, or tell me that this article is complete nonsense, hit me up on my twitter. Till next time! 😃