The funny thing about a CS degree is that they don't teach you coding for software development.
They teach you C as a foundational tool for writing algorithms and understanding memory, machines, and assembly as a path to building compilers, or Python as a way of doing machine learning.
But that's about it.
You eventually find postgraduate CS students going to bootcamps to learn software development.
I always thought that was weird.
Why would they do that?
As someone who had been writing code since I was a kid, I expected them to teach software development and all the cool stuff I would see on Twitch being done by Prime and Theo.
I guess now I understand why.
Why more effort was placed on being able to whiteboard a compiler than on actually writing one. Why more effort was put into understanding whole systems rather than any individual component.
In fact, one funny bit about CS is that toward the end, you're no longer taught anything related to CS but rather management: how the technology doesn't matter as much as you think, and how it's just one piece governed by the business environment it operates in.
One example I found interesting was why Dynamo (the system that later inspired DynamoDB) was developed. Essentially, it was built because Amazon wanted an always-writable database. And Amazon wanted an always-writable database for one simple reason: so that when customers added items to their cart, those items wouldn't get lost. I think reading that drove the point home. Dynamo itself is amazing technology, but it wouldn't exist if the business didn't want it.
Another example, still under distributed databases, is Google Spanner, where Google had to engineer a distributed database system in which they could trust the timestamps. They had to guarantee total timestamp ordering, and they did it by refusing to treat it as a purely software problem, integrating GPS receivers and atomic clocks into their datacenters just to make the system work. And the funny part is that this was done so the advertising side of the business could be more efficient. And guess who earns the most money for Google.
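The core trick behind Spanner's trusted timestamps is usually described as "commit wait": the clock API (TrueTime) returns an uncertainty interval rather than a single instant, and a transaction waits out that uncertainty before reporting its commit. Here's a deliberately simplified sketch of the idea; the interval width, the `commit` shape, and the names are all my own illustration, not Spanner's actual API.

```python
# Simplified model of TrueTime-style commit wait.
# A TrueTime-like clock returns a bounded interval [earliest, latest]
# that is guaranteed to contain the true time.
import time
from dataclasses import dataclass

EPSILON = 0.005  # assumed clock uncertainty in seconds (hardware clocks keep this small)

@dataclass
class TTInterval:
    earliest: float
    latest: float

def tt_now() -> TTInterval:
    t = time.time()
    return TTInterval(t - EPSILON, t + EPSILON)

def commit(write) -> float:
    # Choose the commit timestamp at the latest bound of the interval...
    s = tt_now().latest
    # ...then wait until every clock agrees s is in the past ("commit wait").
    while tt_now().earliest < s:
        time.sleep(0.001)
    return s
```

Because `commit` only returns once `s` is definitely in the past, any transaction that starts afterward gets a strictly later timestamp, so timestamp order matches real-time order. The software alone can't make `EPSILON` small; GPS and atomic clocks can, which is why it stopped being a pure software problem.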
And based on that, it's easier to come to the buzzword-heavy conclusion that coding is dead as a method of instruction for computers.
But a caveat is that Computer Science isn't.
I think anyone with a CS background will understand that computers still need to be instructed somehow; we just won't be doing it by writing out the code, at least not in the same way.
Something I think will slowly start to become a reality is more frameworks. In essence, frameworks are an opinionated way to define reusable and stable pieces of software that can be put together in any way to achieve a goal, like building an app or building a machine learning model. They are a level of abstraction above the code itself. These may not necessarily be built or designed by a human, but I think they'll be built and designed based on human needs.
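To make "a level of abstraction above the code itself" concrete, here's a toy sketch. Every name in it is made up; the point is only that the person instructing the computer composes stable, pre-tested pieces instead of writing the plumbing.

```python
# Illustrative only: a tiny "framework" of stable pieces plus a composition layer.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Pipeline:
    steps: list[Callable]

    def run(self, value):
        # The framework owns the wiring; users only choose the pieces.
        for step in self.steps:
            value = step(value)
        return value

# Stable, reusable building blocks the framework ships:
def clean(text: str) -> str:
    return text.strip().lower()

def tokenize(text: str) -> list[str]:
    return text.split()

# The "instruction" is just the composition:
app = Pipeline(steps=[clean, tokenize])
app.run("  Hello World  ")  # -> ["hello", "world"]
```

Whether a human or a machine picks the steps, the blocks themselves stay reliable, which is exactly the property that makes this a plausible instruction layer.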
Even though code can be generated, I think the current backlash from existing engineers comes from not wanting to lose the ability to write reliable software. But there's a pathway toward AI-generated code that reliably produces working software: abstract away the unreliable parts into a fixed, tested layer, so the LLM never has to write them.
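One way to picture that pathway is a fixed harness that owns the guarantees, with generated code confined to a narrow, pure contract. This is a hypothetical sketch of the pattern, not any real tool's API.

```python
# Hypothetical pattern: the reliable parts (validation, invariants) live in a
# hand-written, tested harness that generated code cannot bypass.
from typing import Callable

def run_pipeline(transform: Callable[[list[int]], list[int]],
                 data: list[int]) -> list[int]:
    # Harness-owned guarantee: inputs are checked before generated code runs.
    if not all(isinstance(x, int) for x in data):
        raise TypeError("input must be a list of ints")
    out = transform(data)
    # Harness-owned guarantee: outputs are checked after generated code runs.
    if len(out) != len(data):
        raise ValueError("transform must preserve length")
    return out

# The only piece an LLM would generate: a pure function inside the contract.
generated = lambda xs: [x * 2 for x in xs]

run_pipeline(generated, [1, 2, 3])  # -> [2, 4, 6]
```

The generated part can be wrong, but it can't corrupt storage or skip validation, because it was never handed those responsibilities in the first place.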
My point is, coding is becoming an old tool. There are new tools on the horizon; I'm not referring to AI itself, but to tools for instruction. A framework, for example, is a way of instructing. I don't know what other tools belong on this list, but I do know there are a lot more pages to turn.
Happy instructing, nerds.