I got laid off a few months ago. It wasn't because of any performance faults, but because the company was hemorrhaging money, and the project I was building was deemed too experimental/not close enough to key revenue drivers.
As nice as a few months of paid vacation have been, I don't particularly want to experience another layoff. Which is why the main thing I've thought about when looking for my next role is defensibility: how likely is it that this role or company will still be around in five years?
In the past this would be a decision about the prospects of the company. If you were a programmer for IBM in the 1980s, the likelihood that IBM will still need programmers in 5 years is pretty high. Sure, you might need to learn a new language or retool in minor ways, but the role itself would still exist. So the majority of your decision would rest on the direction of the company, and whether you felt the company was in a position to succeed. You'd look at factors like revenue growth, profitability and company culture to get an idea of whether IBM was the right place for you.
AI has changed the equation at both the company level and the role level (and this will become more apparent in the next 1-2 years).
It's fair to say that for the first few years of doing anything, no matter how confident you are, the results are probably shit.
The reason companies put up with this is that eventually the total shit turns into a little shit, which turns into pretty good. All that noise about 10,000 hours and whatnot. Turns out doing something repeatedly over a long period of time generally gives good results in the end. So for every intern project that nobody touches after they leave, or entry-level analyst report that nobody reads, a certain subset of those individuals will eventually turn into strong performers. Honed by the criticism of senior employees, the juniors are trained to become better.
But what happens when that 10,000 hours of mastery can instead be found in a cluster of NVIDIA V100 GPUs?
As it turns out, a good deal of what constitutes white-collar work can be outsourced to AI. If you were a managing partner at a law firm, and you had the option to either:
A) Let an articling student bumble their way through stacks of old cases in order to find precedent arguments, that you may or may not agree with
B) Direct an AI to pull up the 10 most likely matches, from which you could determine what cases best match your arguments in a matter of minutes
What would you choose?
In the case of businesses interested in short-term profit maximization, the instant gratification that comes with AI tools may be too great to pass up. A lot of the work put on junior employees is the "grind"-y stuff – the sort of task that more senior employees don't want to worry themselves with. There's a sort of perverse pride in junior employees being in the office at 11pm on a Friday, responding to a senior's request to "pls fix".
But what if the senior employee could send the same document to an AI, with the same request, and forgo the difficulty of trying to sculpt the document through the Adderall-infused junior analyst? The AI makes the changes instantly, for next to no cost, and with a lower rate of error (once properly trained for the task). How does a junior compete?
Unless the company willingly takes on the additional cost of upskilling juniors, eventually the seniors leave and the backfill is weaker candidates. And if the company does try to think long term and eats the training costs, what's to prevent competitors, who have replaced their own juniors with AI, from poaching those trained juniors for their backfill at higher rates (funded by their savings on training)?
Okay, maybe a bit of an exaggeration.
So much of what these applications do is lower the friction of interacting with different platforms, or perform repetitive tasks. You sell the service on its time-saving ability: this app makes it easier to create data visualizations; that app lets you book meetings more efficiently. Especially when dealing with large enterprises, business leaders will pay through the nose for any sort of optimization of the company. Many SaaS platforms do millions in revenue off a sub-100-company client base.
Autonomous agents present a wrinkle in this model, where previous moats will be eroded by larger players. There will still be a need for CRMs and database systems, but much of the software that glues these platforms together will find its entire business model cannibalized overnight. We've already seen demonstrations of GPT-4 plugins working together to perform tasks like identifying prospects, crafting emails to them, and then recording the events in Salesforce. We'll begin to see many of these companies either pivot their offerings or get swallowed.
As an aside, the situation is even more dire for lines of business that can be codified. Rules-based professions like law and accounting will struggle to employ the same numbers of people as before. That's not to say they'll be going out of business, but rather that you'll see more "10x lawyers and accountants", who are able to perform superhuman amounts of work by leveraging AI. Previously constrained by time, they'll be unburdened of the monotonous.
I feel as though I'm at a crossroads in career aspirations.
For my entire educational pathway, the goal was always to get into the hardest program, get the highest marks, and then get the job with the most pay and prestige. I did the education part pretty well, but when it came time to build a career, I found myself confused - a bit of "that's it?" shock.
Most of my talented classmates went to big tech or finance corporations, where they fulfill some tiny slice of some giant profit scheme. They live in a big city, they work 60 hour weeks, and they get paid handsomely.
But can I really convince myself that using my time on earth to improve average user session length by 12%, or ensure that a company is properly hedged for inflation rate risk, is a noble and worthwhile cause?
You can argue that you can use the funds you procure from one of these jobs to fund the life you really want, but it doesn't quite add up to me. Too often the game seems to be zero-sum, or even a net negative for the world.
So much of white collar work seems to be bullshit, or seven levels removed from any real world impact. The carpenter can point to the houses he framed. The civil engineer can point to the bridge she designed. The data analyst for a big bank can't point to anything really. The big tech software engineer (if they're lucky) might be able to point to a button or feature they shipped.
To me, the satisfaction of a job is directly correlated with its real-world positive impact. Perhaps I was wrong in my educational pursuits; if that's the metric, maybe I should have taken a path to medicine or the trades.
All this is to say I hope I can look back at this post, employed in a career that creates real world good, and that AI hasn't doomed us all to mass unemployment.