* Database Systems (relational algebra, SQL)
* Concurrent Programming
* Network Programming
It seems most are exposed to them partially through project work but without the base knowledge.
Is this typical for CS undergraduate degrees because you get to pick your own classes?
People/companies commonly treating both the same is IMHO one of the major problems of the current industry.
None of the topics you mentioned are fundamental to CS.
They are fundamentals of software development.
With respect to computer science they are at most specializations, and even then what you might do with them in a science context can differ greatly from what you would need for production-focused software development. Though they do contain some fundamentals, e.g. set theory in relational databases and graph theory in network programming and concurrent programming.
You can (rightfully) have a master's in computer science _without having ever written a single line of code_. And going back ~20 years that wasn't even that uncommon.
Today a lot of universities have realized that this mismatch causes problems and are teaching the fundamentals of software development in addition to the fundamentals of computer science. Additionally, a lot of computer science today requires the use of tooling, which in turn requires some programming and SQL.
Still, what the "fundamentals of software development" are is a much less clear topic than the "fundamentals of computer science" (and even there people disagree all the time). And, for example, "relational databases/SQL" is one of the things people can strongly disagree on: whether it's foundational to software development or not (anymore).
I may be showing my age, but there was a time when companies had new-hire programs and OJT where entry-level workers could learn the job, and learn it the way the company wanted it done. Now there seems to be an expectation that every fresher should have a decade of experience under their belt and know the ins and outs of a job they've never done, or at best did for a few weeks/months as an intern/college hire. This is just another example of companies living on the cheap and pushing the onus of training new workers from the company onto the worker. In my opinion it's stupid and short-sighted.
The vast majority of CS students aren't interested in the minutiae of theory, they're interested in reliably getting a job in a field that will let them have a family. And schools know it too, which is why a degree can command and justify a six-figure cost. All of the saccharine scholarly platitudes aside, the cost of learning about Homer or first-year calculus keeps going up every year.
Forcing students to pay for their own vocational training isn’t stupid and short sighted at all. It’s a form of industry-wide tacit collusion [1] and it’s working.
[1] https://en.m.wikipedia.org/wiki/Tacit_collusion
That's why boot camps haven't really taken off: most students can reproduce what they learned but will struggle to grow beyond it. They learned the tech before the theory!
I wasn't going insane, I was just going Marxist. (Which sounds like an insult, but isn't.)
More specifically, it seems that you can easily get a load of cash for your startup if you're a Stanford CS grad. Then you grow up, have a couple of liquidity events and put some of your cash in a VC fund managed by a college pal. Then you get asked to listen to a few pitches from young kids and you spend the hour talking about fun times on the quad and in Terman hall. And you vote to give them some cash 'cause they seem like nice kids.
Don't get me wrong. I'm as committed to perpetuating classist stereotypes as the next guy. But it all seems a bit financially and socially incestuous.
I agree with your sentiment. This was true for a while, but it isn't anymore. Companies would hire you just because you had a college degree, because it showed you were capable of learning. Once they stopped that and required particular degrees, it became a trade school.
If someone is probably going to head somewhere else in a couple years, it might make sense to put them through a one-week orientation, but probably not a six-month class.
And hiring a trained worker is overall less expensive than hiring an untrained worker who takes a year to get up to speed and then leaves for greener pa$ture$.
What's more, if you have a brain drain of experienced people leaving for Big Tech, an untrained worker who is able to make messes that won't be cleaned up sometimes costs even more than the wages paid to the employee.
The result of this is that after a few rounds of people leaving management is faced with two options. Either contract projects out as the standard approach rather than having software developers on staff or move to "we're only hiring experienced people with all of these qualifications" so that on the job training isn't necessary.
While some will also say "raise your wages to be competitive with Big Tech" - that isn't always practical or able to be justified for budgeting based on the revenue of the company.
The overall industry appears to be bimodal with "Big Tech" and "everyone else". Any employee who can move from "everyone else" to "Big Tech" can out compete any wage offered by "everyone else" (and in boom times this was a much easier prospect).
With that consideration, it was often very poor ROI for any company in the "everyone else" portion of the industry to offer training.
If someone gets an offer from Facebook and puts it on the table, there is no way to compete with that as anything other than another big tech company. There's no way that say... Jack's Links ( https://jobs.smartrecruiters.com/JackLinksProteinSnacks/7439... ) can compete with working at a big tech company.
This also goes for interviewing. I've seen new grads (back in the boomier times) say "I have an offer with {big tech co}" partway through the interview, and we'd say "ok" and stop the process since there's really no point in going on - the smaller shop is better off using its time interviewing other candidates that may accept.
If you could somehow anonymously go through your company's hiring process, you'd be offered more as a new joiner for your skills and experience than someone already there, which is a bizarre set-up (even though any new joiner has to spend time ramping up and existing staff have a great deal of institutional knowledge).
So companies have created a system where if you want to get a fair market rate, you _have to_ move every 2-3 years. I know a lot of people who stay put in spite of this, as they don't like the friction/effort of interviewing & are comfortable/like colleagues etc. If places paid market rate to existing staff, I'd imagine their churn would plummet.
The University of Bologna, the first university, had faculties of law, medicine and theology. That’s a trade school for lawyers, notaries, physicians and priests. While university has always been an alien place for anyone not of the bourgeoisie or higher it’s always been mostly about getting a good job afterwards. There were never enough people whose family had enough money to support them doing anything or nothing to support that many scholars. College is and always has been in large part about getting its graduates good jobs. Saying it’s not a trade school is primarily about snobbery. One of the ways it makes its graduates suitable for those jobs is by teaching them the habitus of university men (and nowadays women) so that they can’t be mistaken for the kind of people who do go to a trade school.
Wasn't this also a time when people could expect to stay at the same company for 10+ years? Things are different now as total compensation is linked to how your options package is doing. People don't stay at jobs where their options aren't likely to be worth much.
Why as a hiring manager would I waste time with a junior dev that does “negative work” knowing that by the time they get productive they would leave instead of poaching a former junior dev from another company?
(More a comment on how some pedagogical methodologies constrain us and less on whether college CS programs are or aren't trade schools.)
OP needs to hire Computer Engineering grads, not Computer Science grads.
Like you said, that background does have a lot of benefits. But the downside is that a CE grad typically spends less time doing "software" work than a CS grad. When a CE student is taking courses on circuit design, FPGAs, and CPU architectures, a CS student might be taking courses about databases and concurrency. I don't think I touched SQL or a multithreaded program in a university course. Those were all "high level" things that CS students focused on.
This isn't meant as a knock on CE grads -- I just don't think it's wise to suggest them as the solution to OP's problem since they have their own set of "blind spots" to deal with.
Anyone who can figure out timing in a complex digital circuit can learn how threads and mutexes and message queues work, but there’s a good chance a CE grad would not be able to speak to those topics very well compared to a CS grad.
Again, this isn’t a criticism of CE. I just doubt OP would have their socks blown off in the areas they mentioned if they started interviewing engineering students.
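For context, the threads-and-mutexes material in question is small enough to sketch. A minimal Java example (class name and counts are made up for illustration) of why a shared counter needs a lock at all:

    // Minimal illustration of threads and a mutex (here, a synchronized block).
    // Without the lock, the two threads can interleave the read-modify-write
    // in increment() and lose updates; with it, the final count is 200000.
    public class Counter {
        private long count = 0;
        private final Object lock = new Object();

        public void increment() {
            synchronized (lock) {   // the "mutex": one thread in here at a time
                count++;
            }
        }

        public long get() {
            synchronized (lock) {
                return count;
            }
        }

        public static void main(String[] args) throws InterruptedException {
            Counter c = new Counter();
            Runnable work = () -> {
                for (int i = 0; i < 100_000; i++) c.increment();
            };
            Thread t1 = new Thread(work);
            Thread t2 = new Thread(work);
            t1.start(); t2.start();
            t1.join(); t2.join();
            System.out.println(c.get()); // 200000 with the lock; usually less without
        }
    }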
My CS degree had digital logic (far too many hours working on Mentor Graphics), machine language programming (from Professor Larus of SPIM fame), compilers (from Professor Fischer), databases (from Professor DeWitt), and networking (from Professor Landweber).
Many classes were cross-listed with the engineering department. The ECE degree was much more focused on "designing a computer" rather than "writing software".
The difficult part there is that some degrees are "build hardware", some are "write software", others are "study the science", and others are "a survey of all things"... and others are "just enough in the business school to write html and do JavaScript."
Here are the foundations of computer science: Sets, Boolean algebra, integers, strings, functions, logic circuits, iteration, recursion, proof by induction, loop invariants, automata, regular expressions, context-free languages, Turing machines, computability, asymptotic complexity, data structures, algorithms, NP-completeness, models, formal logic systems, operational semantics.
But to agree with the OP, a lot of CS graduates may not have learned or have forgotten the mathematical underpinnings of CS. A typical bad "CS" education would be some hodge-podge of algorithm design, hand-wavy analysis, and teaching specific technologies/tools (e.g. CSS, NoSQL).
With the rising popularity of tangential fields, we've taken to calling it 'computation science' to help delineate it from disciplines that rely on the application of technologies built using CS that run on "computers".
Also, all of the topics that the OP mentions are areas of study in CS; there are entire CS conferences dedicated to them. Your comment reads as some really strange and arbitrary gatekeeping around what CS means.
Also helps work toward justifying the "software engineer" title that many non-qualifying people like to use.
Learning how to use SQL is more of a trade school course.
I have a BS, Bachelor of Science. The foundational classes were math, math, math, math, and more math. There were no classes in how to operate a machine tool. I befriended the guy who ran the machine shop that built apparatus for the scientists, and he taught me how to run the machines. But that wasn't a class, it was just something I did on my own initiative.
While SQL the language is more "trade school", the original poster mentioned:
>> Database Systems (relational algebra, SQL)
Certainly I would expect a CS course to cover databases, in the sense of 3rd-normal-form etc, sorting, searching, indexing and so on. I wouldn't expect them to necessarily be proficient in any one database product, but I would expect them to understand different ways of storing data, and how to design a "data layout" based on good practices (again 3rd normal form etc).
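For concreteness, the usual textbook statement of third normal form, in terms of functional dependencies (a generic definition, not tied to any particular product):

    % A relation schema R with dependency closure F+ is in 3NF iff, for every
    % nontrivial functional dependency X -> A in F+, either X is a superkey of R
    % or A is a prime attribute (part of some candidate key) of R:
    \forall (X \to A) \in F^{+},\; A \notin X \implies
        (X \text{ is a superkey of } R) \;\lor\; (A \text{ is a prime attribute of } R)

In practice this is the old rule of thumb that every non-key attribute depends on the key, the whole key, and nothing but the key.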
But in my long career, I've never encountered anyone who learned math on the job.
Math was my superpower at work.
But since I was not a CS major, I never properly learned the academic side of CS. I run into this deficit now and then, and I'm not proud of it. Andrei Alexandrescu joined me on the D project, and had the academic chops, which was greatly appreciated by me.
Would you have been a CS major if that had been an option at Caltech?
For everyone wondering why CS would not have been an option, at the time Walter was at Caltech they didn't offer CS as an undergraduate major. They did have a CS department which did offer some undergraduate CS classes but it only offered graduate degrees. Undergraduates interested in CS typically majored in math or physics or engineering and took the undergraduate CS courses in addition to the coursework of their major.
I don't remember when they started offering an undergraduate CS degree. I think it was not too long after I graduated (class of '82), which was a couple years or so behind Walter.
No. I wanted to design airplanes, jet engines, rockets, etc. I had a secondary interest in electronics, and a tertiary one in programming. I wasn't that interested in the academic side of computing.
P.S. nice to see other techers here!
There wasn't widespread access to computers at MIT until Project Athena. So CS was historically mostly not hands on. And that's somewhere where CS was in the engineering school. At many schools, it was part of the math department.
So, there's a strong academic tradition of CS being more about math than programming. Even the current CS intro course at MIT is mostly about algorithms and you're pretty much expected to pick up Python on your own.
Authored by Dr. Jan Roskam perhaps?
https://www.amazon.com/Airplane-Design-Part-Preliminary-Airp...
I remember entering a Java course, where the professor specifically said to write Java 6, even though it was EOL and Java 8 had been out for a while, and you would get points knocked off if you used any "fancy" Java 8 stuff they didn't know.
Whether it is the former or the latter matters a lot. Oftentimes the people taking the course don't appreciate which it is, which is problematic. In your example, someone could be given a really good desktop programming class on Win32 whether they used Windows XP or Windows 10. The success of this is orthogonal to the tools at hand, really. It is what the professor and/or students make of it.
What's wrong with that?
You're supposed to cover the basics in intro to programming courses, and typically Java is not outright taught in college, and instead it's object-oriented programming using a random OO language, which more often than not is Java. Do you need to use Java20 to learn inheritance?
Also, unless you expect to work exclusively on greenfield projects, the bulk of any developer work is maintaining legacy applications. I still see an awful lot of projects in java 8 and only a few in java 11. Insisting on the latest and greatest java release makes as much sense as pressing to cover Lombok or Kotlin.
Do you need to prohibit Java 20 to teach inheritance? Working with an outdated version as basis is fine, not accepting newer versions for no reason is something different imho.
> I still see an awful lot of projects in java 8 and only a few in java 11.
Sure, but that doesn't mean knowledge of newer features should be discouraged.
Java20 did not introduce inheritance, nor did it introduce any OO concept. Java20 adds no value to those learning OO. If you're writing Java code but are not using any Java20 feature, you are not writing Java20 code.
> Working with an outdated version (...)
It is not outdated. You are not using any feature. You're just succumbing to the misplaced belief that new means more value. It doesn't, especially when the only thing that you get is more complexity.
> Sure, but that doesn't mean knowledge of newer features (...)
Repeat after me: Java20 adds no OO feature. You do not have to use Java20 to learn OO. OO courses are not java courses. Do you understand this?
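For what it's worth, the inheritance being argued about really is version-agnostic. A toy sketch like the one below (class names invented) compiles unchanged on Java 6 and on any current release:

    // Plain inheritance and dynamic dispatch; nothing here depends on the Java version.
    class Shape {
        double area() { return 0.0; }
    }

    class Circle extends Shape {
        private final double r;
        Circle(double r) { this.r = r; }
        @Override double area() { return Math.PI * r * r; }
    }

    class Square extends Shape {
        private final double side;
        Square(double side) { this.side = side; }
        @Override double area() { return side * side; }
    }

    public class Shapes {
        public static void main(String[] args) {
            Shape[] shapes = { new Circle(1.0), new Square(2.0) };
            for (Shape s : shapes) {
                // The overriding area() is chosen at runtime (polymorphism).
                System.out.println(s.getClass().getSimpleName() + ": " + s.area());
            }
        }
    }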
Sure, technically you can still teach HTML in Internet Explorer 6 and OOP in Java 7 or something, but that has no added benefit and is actually more difficult to set up nowadays. And you can't deny that old versions are outdated when they don't even receive security updates anymore.
> the only thing that you get is more complexity.
What part of OOP basics got more complex in Java? And why teach Java at all if simpler OOP languages exist? Not to mention that avoiding complexity can actually be counterproductive for teaching. As an example, I got more than 5 years of Java-focused programming education and afterwards I still didn't know what a classpath is and couldn't write a single line outside an IDE because nobody wanted to expose complicated stuff to the students. Instead of forbidding students from using a newer Java version one should focus on the actually important part.
Some universities choose a more “exotic” but more stable programming language for that reason (like SML or Scheme or Smalltalk), but then also get criticized because it’s not a major industry standard.
But like, that’s the rule. I took a web class in college and the professor was teaching how to use <i> and <b> tags, which are no longer supported as part of the standard. (Not to mention that these days it’s not really helpful.)
https://webmasters.stackexchange.com/questions/27693/should-...
What role do you believe lambdas and support for functional programming play in object-oriented programming courses?
Most algos classes would probably benefit from streams, and there are many of them being taught in Java for reasons unrelated to OOP. Universities often stick to one language for much of the programming curriculum.
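As a rough illustration of what streams add in such a course (a minimal Java 8 sketch with arbitrary values):

    import java.util.Arrays;
    import java.util.List;
    import java.util.stream.Collectors;

    public class StreamDemo {
        public static void main(String[] args) {
            List<Integer> xs = Arrays.asList(3, 1, 4, 1, 5, 9, 2, 6);

            // Declarative pipeline: keep the evens, square them, sort, collect.
            List<Integer> evenSquares = xs.stream()
                    .filter(n -> n % 2 == 0)
                    .map(n -> n * n)
                    .sorted()
                    .collect(Collectors.toList());

            System.out.println(evenSquares); // [4, 16, 36]
        }
    }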
Pointless. They are OO courses, not FP courses. You have FP courses already covering that.
> second that under the hood everything can be shoehorned to object model.
You do not need lambdas or FP for that.
So, what's the point of lambdas and FP in a OO course?
If the intent is actually to teach polymorphism and encapsulation, both of those can be done in raw C.
However, polymorphism and encapsulation have existed for much longer than “OO” languages. What is stdout? Seems pretty polymorphic to me. How does one gain access[1] to a variable declared in a .c file that is not exposed via the header? That’s encapsulation…
[1]: there are of course ways, but they are often also true of private members of class instances too.
Learn assembly, then learn C, then code up a small OS and learn semaphores and stacks and how a computer actually works.
They are not required courses. I agree with GP that DBs are not fundamental to CS in any way. They're trivial given the other building blocks of CS, though: predicate logic, algorithms, data structures, set theory. Discrete math basics (also fundamental CS) give lots of examples of normal forms (prime decomposition, mod p arithmetic, etc).
On top of all that, it’s likely their first (or close to first) time in an interview setting. Nerves have a way of making you forget a lot of things.
I personally don't agree with the purist approach that CS degrees should just be maths. There needs to be some minimal application at the very least and different universities will have different maths-CS-engineering boundaries based on how they've evolved.
I've encountered far too many CS interns and grads who couldn't actually write reliable, non-spaghetti code. They also typically write commit messages that consist entirely of "Updated $THING" (no shit, I can see that from your diff), nothing about the why.
My perspective may be slightly skewed from being called in as a consultant to fix the software these "computer scientists" have been hacking on, but I encountered a lot of these people when I was an undergrad student also.
If you want a trade school grad, hire one from a trade school. There's nothing wrong with that.
I wouldn't hire a Caltech grad to fix my transmission, either. (I could fix your transmission, but AFAIK I was the only motorhead at Caltech, which was pretty disappointing.)
TasTAFE offers a Cert III in Information and Communications Technology. This looks to be one third tech support, one third networking and server operations, and one third python / web / CSS. So not really much programming.
A programming trade school to me would look like: four years on the job as an apprentice, full time, at some discounted pay rate.
Two weeks four times a year at a training institution, away from the demands of the job, where theory and practice are taught.
My trade certificate (metal fabrication) is a Cert III, my trade school included a fairly wide variety of subjects, both theory and practical.
The employer would be required to give progressively more difficult tasks, and doing apprentice swaps with other businesses to cover industry experience if in-house doesn't provide opportunities for key requirements.
Is anyone, anywhere doing that?
I help train two apprentices every year here in metal fabrication. It's not that difficult, but few developing foetuses are born wielding a welder. I mean the trade is not difficult; training people is hard.
When finished you can continue for a master's degree at a university, but have to take 1 year to get caught up on the theory you missed by not doing a bachelor's degree at the university level.
You can do it in Switzerland:
https://www.berufsberatung.ch/dyn/show/1900?id=7671
If someone wants a person with good CS fundamentals and an idea of how to write code in a non-trivial codebase (so commit messages and an idea of concrete SQL and not just "I've proven this expression is correct but never ran it") they should look at Computer/Software Engineering programs, or at least select CS candidates from schools where the program incorporate real software development.
These usually are the worst "programmers", in my experience. And I was one as I didn't go to CS school and was mostly self-taught. Lacking the basics of CS, you are just a code monkey.
> I've encountered far too many CS interns and grads who couldn't actually write reliable, non-spaghetti code.
An intern is an intern, I guess.
> They also typically write commit messages that consist entirely of "Updated $THING" (no shit, I can see that from your diff), nothing about the why.
You should have an employee handbook and a coding style. Lazy people are everywhere, however.
That's the type of condescending, discriminatory bias that having a degree affords, I guess.
But you're right that there should be schools that teach the practical programming skills. But it's the trade schools that should be developed towards this goal, not universities.
Lots of degrees have no pathway to a job or career.
In Europe, where third level is far cheaper, it's more looked upon as continuing education. The way in the US people look at high school. Sure you'd like to get a job after, but it's really just a stepping stone to your career and not a huge financial investment that needs to be repaid.
It's still made for those that want to be educated.
In the Netherlands they have, for example, the Hoger Beroep Onderwijs (HBO), which translates to Higher Professional Education. It is exactly tailored for the job market.
HBO sounds more like a vocational system that provides diplomas vs a bachelor degree in reference to universities in this discussion.
This type of system definitely exists in many places but I just assume it's out of scope as we're talking about in reference to CS graduates i.e. bachelor courses.
This is just a simple misunderstanding of what college is for: expanding one's knowledge for the sake of expanding one's knowledge. Being a good programmer is not the intent, but it might be a by-product.
Sure, but those things can be learned on the actual job. So after a few years, the CS student knows both the theory (because of uni) and the practice (because of work)... Whereas someone who only learnt the practical stuff, well, usually lacks the theoretical foundations.
So a CS degree is useless in industry. By the time you learn how to build things that work, you know how things work.
Employers MUST NOT require university degree where trade school is sufficient (99% of positions in industry).
This bears repeating.
It's important to learn how to deal with asynchronous and concurrent and parallel programming, but it's more important to have a firm grasp of message-passing and event-driven architectures.
Rote learning the computational complexity of obscure algorithms no one uses does not hold a candle to knowing how to write clean and testable code, and design a component to accommodate changes.
This computer science bullshit starts to feel a lot more like an exercise in ladder-pulling than actually preparing for and assessing an individual's ability to do meaningful work.
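To make the message-passing point a bit more concrete, here is a minimal producer/consumer sketch over a blocking queue (the names and the poison-pill convention are just illustrative choices):

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class MessagePassingDemo {
        public static void main(String[] args) throws InterruptedException {
            // The threads never share mutable state directly; they exchange messages.
            BlockingQueue<String> mailbox = new ArrayBlockingQueue<>(16);

            Thread producer = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; i++) {
                        mailbox.put("event-" + i);  // blocks if the queue is full
                    }
                    mailbox.put("STOP");            // poison pill: tells the consumer to quit
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    while (true) {
                        String msg = mailbox.take(); // blocks until a message arrives
                        if ("STOP".equals(msg)) break;
                        System.out.println("handled " + msg);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }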
Software developers aren't paid to write code. They are paid to deliver value.
Your code delivers no value whatsoever. Many master's and PhD theses boil down to a shareable library or even a function call. You might feel clever for coming up with it or by knowing some trivia behind it, but ChatGPT can easily spit it out, negating all the value you believe you were able to create.
There is no value in something being hard. There is value in delivering value.
We don’t do it because we’re paid for it[0]. If that’s your primary motivation maybe this is the wrong website.
[0] sure, [too] many people are in tech primarily to make money, but it’s a bit sad.
I'm fairly sure it's the people that pay our salaries and choose to keep us employed while we do whatever they need to, based on their perception of our output and how the needs of the customers/users are met.
Will ChatGPT and other solutions like that be helpful with the complex, low level stuff? Perhaps, sometimes - depending on how explicit you'll be with your prompts, but more knowledge about what and how to ask will be necessary and even then good results aren't guaranteed per se.
But will they help out a lot of developers with common, well known tasks and problems, for the majority of developers whose job consists of creating a few RESTful endpoints, maybe a database migration and hooking it up to React? I'm inclined to say yes a bit more confidently.
For starters, if you can't argue what the value is, then no one can fill in the void in your reasoning.
Secondly, software developers have, as part of their job description, the task of finding solutions to problems. If they find solutions to problems that don't involve or require whatever trick you have in mind, that tells you what its value is.
Thirdly, did you ever see any job advert explicitly requiring prior knowledge of any trick you have in mind? Or do you see ads for the ability to solve problems with a pre-established software stack?
> who can't multiply and shift t
Tell me, how many open positions have you ever seen whose main ask was "can multiply and shift"? Zero.
> That would be like Google hiring engineers who only got good because they're good at using Google.
No, it would be like Google hiring people without any CS or IT degree who were good at software development but rejecting PhDs who had nothing to offer other than their little trivia on stuff no one cares about.
You know, reality.
The person paying for it with their own money.
> Many masters and PhD thesis are boiled down to a shareable library or even function call.
Oh wow... I have to ask, hast thou ever wondered _where_ the nicely-packaged shared library or function call implementation code came from in the first place?
Thank you!
There’s math and science behind concurrency and parallelism, and the coordination thereof.
Network programming is a practical application of several parts, each having foundations in math and science, from coding/decoding to state machines. Wires are all about math and physics too.
There’s more that is often left out. Specifically holistic systems and their research. Lisp, Smalltalk, Self etc. These systems reach into math, HCI, compilers, JITs…
How many database programmers do you know who ever did anything involving relational algebra when creating or tuning up a database?
This point is especially pointless with the prevalence of ORM frameworks.
> There’s math and science behind concurrency and parallelism, and the coordination thereof.
Please find me the best example you can come up with of what you see as math being put to practice in concurrent programming.
> Network programming is a practical application of several parts (...)
This is too much grasping at straws. Just show what you feel is the absolute best example.
> There’s more that is often left out.
There is nothing. You showed nothing as your best examples, and once you showed nothing you were left with nothing else to show.
Enough with this silly "but it is math" nonsense. You can come up with a math formalization of an egg but that does not mean a degree in math will help you boil one. Enough with this nonsense.
Formalising 3rd normal form and giving it a rigorous definition so that we can test if data is properly normalised is a big deal. Ditto having algorithms in concurrent systems that we can guarantee don't have logic bugs in them.
If those things aren't math to you, then what are you willing to accept? Do you mean programmers have to sit down and draw an integral sign or it doesn't count? You're correct that doesn't happen, and you're probably setting yourself up to be steamrollered by someone who sees a well known pattern that the academics have been researching for years.
The typical corporate database has big tracts that have been "designed" by someone who doesn't appear to have thought very much about how to store data in an organised fashion. It raises the question of what fuzzy thoughts go through their brain, since they are only interacting with a database because they have a need to store data in an organised fashion. Hopefully their thoughtlessness doesn't matter, but doing better than that is a low bar.
It isn't. Half the world already relies on ORM frameworks that barely require any input from the developer to set up relational databases, let alone knowledge of how to normalize them.
On the rare occurrences a project feels that specialization on relational database design is relevant, more often than not it's far more useful to require expertise on specific RDBMS than trivia tricks.
> If those things aren't math to you (...)
You're missing the whole point. No one is arguing that you have math formalisms. The whole point is that those formalisms do not add any value at all. Don't you understand that?
I mean, I can whip out a web app with multiple services running in the backend and with datastore using both DynamoDB and S3, and build a business around it. We can do this without ever knowing who Jack or Joe or Jim or Jose Cobb was, let alone what he wrote. What does it tell you?
I'm going to go a bit further than just not understanding, I'm actively disagreeing with you. Based on your arguments here, in a workplace I would advocate not letting you make any decisions about database schema.
Sounds like the feeling might be mutual there though. Not saying you're bad at whatever it is you do; just that I want schemas designed by someone who has thought deeply about schema design. Databases usually outlive all the employees assembling them, can outlive an entire organisation, and getting the design right can save days to months of work.
The academics have entire classes of bugs that they can demonstrate don't exist in normalised data, there are benefits to everyone coordinating around a standard shared view of how to lay out data, and drifting away from that standard usually turns out not to lead to real benefits long term. You can believe that is valueless or isn't math or whatever, but there is no point in aggressively introducing opportunities for bugs when there is no reason for it. And to avoid doing that, the person making the decisions needs to understand the theory.
It's unfortunate that most people these days prefer unstructured blobs of json to formal relational databases. It's impossible to get up to speed in these types of environments because every time you think you "understand" the data, you find out about a random property that gets set in rare occurrences that throws off the whole relational model you intuitively built in your mind.
This also does not hold up well in an operations heavy environment. Like all things in programming, a well balanced approach is necessary. Some combination of formally structured data with some flexibility for future requirements is probably the best course of action here, but that's too much work for programmers these days.
...and this explains why database designs are what they are. It's also why you don't learn SQL and if you do you never admit it: It condemns you to a lifetime of maintaining CRUDs written by people who don't understand relational algebra.
You mean good enough?
Are they, though? Or are you just trying to put up a strawman?
Because I assure you plenty of highly reliable infrastructure is running 24-7 without the faintest concern over normalization.
Writing from personal experience. I can also assure you that a lot of this highly reliable infrastructure works only because of a lot of tears and cursing that could have been avoided if the designers had any idea what they were doing.
Apart from the fact that it isn't: elements have a specific order and that order matters.
SQL is SQL, it's not relational algebra or whatever bastardization of it you can think up. In fact when it comes to SQL the only use relational algebra has is to tell you that anyone talking about it has never used SQL.
But it's useful to learn the foundational mathematics of these things as there is a direct relationship between them.
SQL has semantics that can be understood by relational algebra of relational bags. It's of course not a 1:1 translation, but the mathematical underpinnings are explained there. It's useful to learn the simpler, more abstract thinking tool in order to understand the concrete programming tool.
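As a small example of that mapping (the standard textbook correspondence, up to duplicate/bag semantics, over a made-up employee table):

    % SELECT name, salary FROM employee WHERE dept = 'R&D'
    % reads, modulo bag semantics, as a projection over a selection:
    \pi_{name,\ salary}\big(\sigma_{dept = \text{'R\&D'}}(employee)\big)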
Also wrong for Turing machines: it really is infinite. That's a big difference from arbitrarily large. The halting problem is undecidable for TMs but not for arbitrarily large ones (you'll need precise definitions though).
The 7,918-state machine Z stops (well, Z cannot be proven to run infinitely long) iff ZFC is inconsistent. For this it needs a finite amount of space, but we cannot calculate how much. If we could calculate an upper bound, we would have proven ZFC is consistent.
There's only so many that you can buy.
It's not exactly the same thing, but the underlying lessons and mental model matter.
Of course, the move back towards nonrelational data stores these days may make the point moot.
You must be a delight to work with.
https://www.di.fct.unl.pt/en/education/integrated-master-com...
https://tecnico.ulisboa.pt/en/education/courses/undergraduat...
In two undergrad courses, we programmed in an old language and a very outdated environment, the one the professor had used their entire career - and apparently that continued until the prof retired. The courses were about implementing algorithms, so the language didn't really matter, but I would say the course was probably a net negative from a software engineering perspective.
We had a database course covering the mathematics in good detail. We learned the relational model, normal forms and such mathematically, with a little basic SQL at the end. That was great for understanding, as at the end of the course you'd be able to write out the actual maths performed by a query but it didn't really teach SQL.
Multiple professors wrote typical "researcher code" - I don't necessarily want to say it's bad, but it's very different from what you'd see as good industry code: lots of global state, variable names with 1-2 letters, magic numbers. It's understandable why research code is like that, but it is much more fitting for an implementation of a research paper than for a software product.
In my degree, database classes required both theory and writing a full-blown application in Oracle.
Naturally, like everything, there are students that work around it, but then they aren't in a position to complain about not learning.
We did not truly learn any of those though, it's just that they were used.
The specific languages did not really matter, they were secondary, just some tools. One Theoretical CS professor used some really obscure thing whose name I forgot, really niche, again it was only used to demonstrate some theoretical aspects (provability I think).
Most content was math and formal stuff. Some experiments and low-level stuff, like creating a sound directly from a chip (no computer), in assembler, as part of intro to hardware.
I would have felt pretty irritated, and that I was wasting my time, if we had had courses about learning programming languages (designing them, writing compilers etc. is another matter). This is something easily and better done on your own, learning the actual languages.
It’s ultimately a matter of the focus of the major and department.
You can graduate without writing a single line of code, without having ever heard of SQL, and knowing nothing about math.
If you do it right. Or is that "do it wrong"?
Mildly off-topic, but is "BS" the usual shortening of this type of degree where you hail from? If it's United States I would be even more surprised that this would be the case, since who wants a "BS Degree"? In the UK at least we normally refer to them as BSc degrees and there's never really a need to expand.
"CS" on the other hand is not short for "computer programmer job training" so I'm confused by the expectation of the original Ask HN question.
SQL itself is full of problems, but the foundations behind it are definitely not; database theory and database internals is one of the deepest domains of computer science, up there with operating systems. And dismissing this knowledge as "trade school course"work is... sad.
We need, essentially, the divide between chemistry degrees and chemical engineering degrees. That hasn't happened yet for CS, but it needs to.
And that's what the OP is asking for - people who have software engineering knowledge, not CS knowledge.
Though I struggle with leet code quizzes so there's that..
Then why does almost every semester have some heavy programming course?
1) We should teach relational databases purely from relational algebra, without any practical examples.
or
2) Relational database is a trade school course too.
If 1), maybe it makes sense in an academic sense, but I personally would rather work with an intern who doesn't know databases at all than someone who knows a bit of relational algebra but not SQL.
If 2), how about operating systems? Compilers? Networking? Are these all just implementation details that don't belong in a CS curriculum?
Hum, allow me to disagree strongly.
Maybe the vast majority of those you read, but not overall. My experience is that most research papers look like math, or linguistics, etc., while the annexes show example use cases that relate to actual pragmatic applications.
From my friends that did PhDs and regularly write research papers, here is what it looks like:
- One of them works on detecting or bypassing virus signatures. The papers look like high level linguistic theorems to prove that some grammars respect some constraints. It's mainly algebra and language theory.
- Another works on watermarks that can resist compression. These ones are 100% algebra and analysis. The applications to watermarking are only mentioned as possible use cases, and a proof of concept was done in his thesis.
- One works on transactional memories. I don't think she even knows how to program herself; her papers are hardcore math in some intricate vector space. She has no idea what transactional memories are used for, she's just interested in the underlying math problems to be solved.
Programming is just a way to write down a mechanism. A mechanical computer would be "programmed" quite differently than by writing code in a text editor.
It was stated above that 'a function is effectively calculable if its values can be found by some purely mechanical process'. We may take this statement literally, understanding by a purely mechanical process one which could be carried out by a machine. It is possible to give a mathematical description, in a certain normal form, of the structures of these machines. The development of these ideas leads to the author's definition of a computable function, and to an identification of computability with effective calculability. It is not difficult, though somewhat laborious, to prove that these three definitions [the 3rd is the λ-calculus] are equivalent.
— Turing (1939) in The Undecidable, p. 160
Computer science is most definitely the science of programming machines (computers). It seems ridiculous to me to say CS is not the study of programming. Does CS involve other disciplines like math and physics? Yes. Of course. But it uses math and physics towards the end of programming a machine to carry out the programs.
Let’s consult Wikipedia:
Computer science is the study of computation, automation, and information.[1][2][3] Computer science spans theoretical disciplines (such as algorithms, theory of computation, information theory, and automation) to practical disciplines (including the design and implementation of hardware and software).[4][5][6] Computer science is generally considered an academic discipline and distinct from computer programming which is considered to be a technical field.[7]
It directly contradicts the narrative you’re peddling. Please, listen to people trying to explain the nuance to you and actually engage. Don’t just keep repeating the same misconception ad nauseam.
“A computer is a machine that can be programmed to carry out sequences of arithmetic or logical operations (computation) automatically”
“Computation is any type of arithmetic or non-arithmetic calculation that follows a well-defined model (e.g., an algorithm)” AKA a program.
So computation is executing a program. And computers are machines which execute programs.
Turing and the Church-Turing thesis define computability as whether something can be expressed as a program (an algorithm).
So computer science is about programming. The science of programming, programs, and the machines that run them.
Nobody is arguing that computers don't compute. Why are you stuck on that minutia?
This is patently incorrect. An algorithm is an idea, whereas a program contains a concrete representation (implementation) of one or more algorithms executable on a computer.
"Programming" in industry has a very narrow definition though.
Computer Science is the study of how we design algorithms to process data. We do computer science using grammars that allow us to describe abstract operations on data. We categorize the different types of algorithmic solutions to problems. We study the limits of efficiency and prove things about the various classes of algorithmic problems and their solutions. We generally work with discrete structures and type systems (like our beloved lambda calculus, origin of the Y combinator).
You can design algorithms without ever compiling a single piece of code just like you can add numbers without ever using a calculator. In computer science we talk about abstract syntax trees, higher order functions, context free and regular grammars, finite state automata, logic, and numbers. You can even get meta and modify your own algorithm’s data as part of the algorithm itself. All this happens independent of a particular instance of a physical machine.
Programming means “issue instructions to a given instance of a machine so that it behaves a certain way”. We take all our theory and apply it to an electrical device that has a physical processor and fixed memory. We program a microchip by writing chains of instructions to its memory. We measure performance in cycles per second. When programming we talk about machine instructions, loadable images, calling conventions, binary interfaces, program counter, alignment, and words.
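One way to see the distinction in miniature: the automaton below (states, alphabet, transition function, accepting states) is the computer-science object; the Java encoding of it is the programming part, and any other encoding of the same automaton would do. A toy example, names invented:

    // DFA accepting binary strings with an even number of 1s.
    // The abstract object is (states, alphabet, delta, start state, accepting states);
    // this class is merely one concrete encoding of it.
    public class EvenOnes {
        // States: 0 = even number of 1s seen so far, 1 = odd number.
        // DELTA[state][symbol], with '0' -> column 0 and '1' -> column 1.
        private static final int[][] DELTA = {
                {0, 1},  // from state 0: '0' stays, '1' moves to state 1
                {1, 0},  // from state 1: '0' stays, '1' moves back to state 0
        };

        static boolean accepts(String input) {
            int state = 0;                      // start state
            for (char c : input.toCharArray()) {
                state = DELTA[state][c - '0'];  // apply the transition function
            }
            return state == 0;                  // state 0 is the only accepting state
        }

        public static void main(String[] args) {
            System.out.println(accepts("0110")); // true:  two 1s
            System.out.println(accepts("111"));  // false: three 1s
        }
    }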
TL;DR: just go read the Wikipedia page on Computer Science, it’s quite clear.
https://en.wikipedia.org/wiki/Computer_science
Geezer reaction:
https://en.wikipedia.org/wiki/Computer_(occupation)
https://en.wikipedia.org/wiki/Analog_computer
https://en.wikipedia.org/wiki/Mark_I_Fire_Control_Computer
With a sound type system, you can do things like compose a new type out of other types, and it will work consistently (not produce weird edge behaviors).
For example, C's `void` type introduces anomalous behavior. A function returning `void` is not composable: a call like `f(g())`, where `g` returns `void`, would work in a sound type system, but does not work in C. You'll see it in the compiler implementation because it's a special case that appears over and over.

As has been pointed out to me several times by CS academics, I could stand taking a course in type theory. I have a good intuitive sense about it (likely from my math background), but have no idea how to prove something about it.
A CS courseload certainly doesn't (and certainly not in the required selection of courses) cover soundness of a language. Even my theory-focused classload only covered automata up to proofs of regularity etc. This is, sort of, step 0 in soundness-proving methodologies, but it's only step zero.
The whole reason we want to prove a type system sound is so we can prove certain things about the programs that use that type system.
Typed lambda calculus was formalized before the first programmable computers, and its relation to programming wasn't clarified for another 20+ years (and real type systems don't really start to appear in programming languages for another decade after that, afaik).
We use type systems to help abstract patterns of data into logical constructs that can be reasoned about. They are logic systems with grammar that describes relationships between axioms and constructs.
Seriously, go read a textbook and then we can pick this discussion up. Wikipedia has a good overview.
https://en.m.wikipedia.org/wiki/Type_theory
We don’t call set theory and category theory “computer science” unless it’s about programming computers.
Much as how people who study computability are computer scientists, even though none of the asymptotic improvements to matrix multiplication since 1990 are even remotely relevant to any kind of real-life program.
But there is computer science not applied to computers: Operations research for instance is basically a branch of CS: it is not about programming and it has applications in business, logistics, etc.
My computer science degree covered a lot of topics that didn't require a computer, for example relational algebra, discrete mathematics, and introductory formal logic. Of course, the practical usages of these are often best done through a computer and programming, but it's not a requirement.
In reality all computer science degrees I'm familiar with make an attempt to expose students to relevant programming languages, it's simply practical and students demand it, but the details of coding style, how to use git, or similar topics may not be relevant in a computer science degree. That stuff isn't so hard to learn if you have clear guidelines and they have some baseline amount of intelligence which is one thing a degree tries to validate.
A computer is a machine which runs programs (it computes). That's literally the practical and theoretical basis for computing. Instead of running the program in our heads, we have designed machines to do it for us.
We can calculate SHA256 with a paper and pencil. But we created machines (computers) to compute for us, according to the instructions we give them (the program, the algorithm).
not sure what you mean by this, because there's no such thing as programming without computers. (actually, the first computers were invented before programmable computers were invented)
Or, you really don't think I can design a logical grammar to formally express the creation of a PB&J?
The answer is: it runs on dad, poorly :)
They should know systems programming, databases, operating systems, web app development, security, basic logic, data structures, algorithms, JavaScript, and some backend languages. Add some ability to interview users, design at least basic UI, organize tasks, release your product and monitor it.
Math, physics, statistics, calculus, proofs, theorems, and computability should be the interest when you decide to pursue a PhD. For someone who's going to build ERP systems in Java, Oracle and Angular these things are completely useless. Most developers are like this.
The more important question is: Are you explicitly mentioning databases, concurrency and networks in your posting? If not, then it explains why candidates are not filtering themselves out.
To answer your question: we don't. I have no expectation around this. Primarily asking for my own curiosity in terms of differences in CS degrees.
At some level, we had SQL and network programming in my degree, but only to the level that we knew it existed and how to do the very basics. We did not have concurrent programming at all as a mandatory course.
Most of your deeper expertise comes from the project work here, and depending on which project you end up on or choose, it might or might not include the above.
Why don't they know database systems? They might have taken a database course for 4 months 3 years ago and never touched a database again because it's not a trade school. School just validates that you can learn a series of related skills over a few months when necessary.
What's the last thing you started and dropped after a few months just before the pandemic? How comfortable would you be if you interviewed for a job exclusively on that skill?
21 year olds won't know very much of anything in general.
A 21 year old 3rd year college intern is... 21 years old
Three quarters of a four year computer science degree doesn't change the fact they're a 21 year old.
Even in the topics they have covered, the knowledge won't be very deep.
A true mental model of concurrent programming is not something easily obtained.
Frankly, most 35 y/o engineers don't truly appreciate the intricacies of inter-thread concurrent algorithms, unless it's their specialized area.
Frankly, most engineers in the industry are too lazy to learn SQL well.
Lower your expectations of 21 year olds. Lower your expectations of the workforce in general. Hackernews is a self-selecting community of tech workers who study their job in their spare time as a hobby.
Most people I've worked with go home and watch football after 5pm.
It's worth noting that if you're in the position of interviewing then you probably have 10+ /years/ of industry specific experience.
The applicant's prior ten years included 2 years of covid prison, a fractured high school experience, and learning basic life skills.
Also, usually what you're learning in class are the fundamentals that undergird what you need to know for software development in the job. The basic underpinnings and context to understand what you're learning those first two weeks on the job.
And if you're going to be that dismissive about what you learn in a college course, I'll tell you most of what I learned the first two weeks on the job: how to use the specific IDE the team I was hired onto preferred, how to build their specific project (barely applicable outside the project), how to contact IT to get a ticket to get them to install the IDE because I didn't have permission to, how to use the timecard website to log my hours, where the people on my team prefer to go out for lunch, a few hours of HR sexual harassment and cybersecurity training, how to set up my 401k and medical benefits, etc. etc. Basically, nothing to do with "computer science" which is what the original post was about.
Assuming your semester is twelve weeks long (as is the case at my university), that's less than 4h/week. I'm guessing you're only counting lectures as learning. If you only go to lectures for learning and don't do any kind of work on your own, no tutorials, no office hours, no revising for exams, no practice exercises, no homework, no discussions with your peers, nothing... Yeah, you'll probably feel like your first two weeks at the job is a crash course. But I'd say you failed at taking advantage of learning at a university to its fullest.
They are still only 21 years old, and just literally haven't had as many afternoons to spend tinkering.
When I was in school mostly the poor kids and lower middle class had jobs, but I wonder if that's still the case.
Those 45 hours of lectures are usually condensed material with little to no time to practice. It's expected that you practice during the other 90 hours (and on your own time if you plan on having straight A's).
While you may get more hands-on experience in a few months of working full-time, you usually learn much fewer concepts.
Once you get into a job, you're constantly revising past mistakes, doing new things and all the while you have coworkers who are helping you- they don't want to wait for you to make a mistake, they want to help you get it right the first time if you need the help.
Uni courses rarely cover real-world knowledge that you will use on the job. Aside from some specialized jobs, most of what they teach you is either too low-level or mostly useful as background knowledge. So many practices aren't covered in college courses- even things as simple as version control have only recently started to become common.
You're going to be learning a lot on the job, and at a decent job what you learn will make what you went through in college pale in comparison.
I genuinely don't think most workplaces actually resemble this ideal. Sometimes you learn... plenty of times you do something repetitive. Sometimes you don't even learn about your own bugs (hello agile). And sometimes they give you great advice, and plenty of times they just don't.
In CS, we regularly had courses worth 300h in a semester, e.g. analysis, system architecture or software engineering.
In practice, students start to complain once a course tries to enforce their full "hour contingent".
Related: one of the faculties here recently announced the introduction of a threshold grade. If you were bad at school, you are not taken into consideration, even if there are available places.
I had a couple coworkers who were in the same classes as me and as part of trying to improve my time management I'd ask them how long the homework took and would get ridiculous answers like 'an hour' (2+ week assignments usually take tens of hours). I couldn't tell if they were smarter than I thought, braggarts, or liars, but after I switched from a support role to a coding role, in the space of a semester I was doing my homework in 2-3 hours. Often those homework problems are just a bit harder than an interview question, but without practice you're improvising the whole thing and that's a lot of effort.
Before we started talking about 10,000 hours, I already had a rule of thumb that your competency as a developer tends to ratchet up at 100 hours, and rather substantially around 1000 hours. An internship will definitely hit 100 hours, but 1000 is still easily achievable in a year. 10k hours might as well be an imaginary number. That's longer than they've been in school and so feels like an unreachable finish line. Demotivating for sure. 1k just means "work hard for a bit".
There were certainly a lot of people who didn't really understand the question, and I couldn't really help them much without running afoul of the poorly worded guidelines on what would earn you an expulsion, but there were a lot of people spending a lot of time banging their heads against typos and simple structural errors.
I learned early on that the facts and tasks I biff on are often the things I remember the clearest later on, but also that most people are not like that. They remember the things that they got right easily on the test. I'd wager that for most of them, that time spent wrestling with the computer provided very little to no growth opportunities at all. 10 hours on a homework assignment might have yielded at most 90 minutes of actual progress.
For the more difficult classes, that number can be even higher.
So for a class that has 3 hours of lecture time per week, that's 9+ hours per week on that subject alone.
On the job training only works for certain types of skills
This whole “gaps” argument is just people who’ve bought into a system. People coming out of uni have “gaps” as well just different gaps… gaps rich kids have so it’s ok with Google.
That is optimistic
Now I feel obliged to work even harder to make a few million dollars out of it before I'm 35.
I spent so long just jamming my brain full of Vim shortcuts and functional programming theory...
I need to make some money out of it.
Hard to enjoy my hobbies when I'm in debt to the bank and my mortgage isn't paid.
35-year-old here, with a similar box of CS knowledge floating around. Comfortably sub-million, but with a nice house and a couple dogs.
You really aren't obliged, and you don't really need to "make some money" out of it. More money would be nice, don't get me wrong. But you said you learned it because you enjoyed it. Which I did, too. I got my money's worth out of the learning.
You are not obligated to financialize everything you enjoy. It's super important to understand that.
The mortgage anxiety--I get that too. YMMV, but for me, all it took was practice. I don't think about the mortgage on my house. I've owned it for about five years. The money goes out every month. I work on renovating a room or building some furniture or going for a walk with my dogs. It need not be a terrifying specter; it'll either get paid in due course or I'll die and won't have to worry about it.
I think there are a lot of brains (and mine is definitely one) wired to make a bigger deal of things than necessary to provide a sense of direction and meaning, but this one is so abstract and so luck-based that, for me at least, it was deeply valuable to fight against it. I kicked it into the back of a closet, and for me it's a lot easier now to just live.
Again, YMMV.
Maybe? Like--define optimize. VC-chasing (alluded to by the great-grandparent poster elsewhere in-thread) is almost certainly less optimized from an EV perspective.
If it's one's dream, absolutely chase it--but it should not feel obligatory because you absolutely can have a lot of happiness now without precluding later happiness, too, if you so wish.
Nor did I say one had to optimize. I just said it makes sense if someone chooses to do so.
If you're only trading time for money in your career, IMHO, you don't have a career, you only have a job.
At the least you should also be trading your time for expertise, networking, and accomplishment, although I would also recommend some degree of self-actualization and everyday enjoyment (if you're in a position to seek for those).
Unfortunately in academia it only matters if you hit your grades, which are gamed to hell and back. So you really can't tell at a glance if an A student is going to be good or bad, I hired based on a transcript once and got burned so badly that I'll never do it again.
You are much better off developing a specialty that you enjoy. Money is a side-effect of happiness and persistence.
Yeah "money is not everything" but I'd be able to do more things I like if I had house and spacious garage. But that's $$, $$$ if I also want to live in place with reasonable commute
My house is 2000 square feet. I have a recording studio and a wood shop and too many 3D printers. I am getting a garage permitted for another ~800 square feet. The place cost $480,000 four years ago and is around $620,000 now on Zillow. I am a fifteen minute walk from a train and can be in downtown Boston within the hour.
There are ways to get what you want other than "be an extremely unlikely curve-breaker of a success". It frequently means "not being in San Francisco or New York", granted, but...you've got options.
I'm glad for you, but I don't think you are making the point you intended.
But I don’t expect the person to whom I replied to be concerned with individually solving it either, and so while I think we can solve our own problems while being mindful of our society’s, the latter didn’t seem appropriate here.
You don't really need a graph to understand that change...
Good on you for fucking with the NIMBYs though, they're the reason house prices went up by an insane amount.
My house, on a 30-year, 10% down, if bought today at the Zillow valuation + my property taxes and insurance, would be about $3800/month. That is absolutely not nothing, but the same square footage in Alameda, CA--a better substitute in some ways, esp. re: transit--is in the $1mil to $1.25mil range, which would be more like $7500/month (if you could make the 10% down at all).
Given that staff+ compensation in the Boston area is pretty reliably north of $200K these days, these are much friendlier numbers. And again, Boston is still an expensive area! There's lots of places in the country that are cheaper. I use Boston as an example because even on the edge of the suburbs there's still decent-to-good public transit. Go just down the road to Providence, which has a thriving local scene of its own and is still on the commuter rail to Boston but otherwise kinda lacks local public transit, and you can get a house like mine that's roughly as walkable to the basics (restaurants, supermarkets, etc.) for like $380K. And if your job is in Boston you can still be at South Station in under an hour and a half of sitting on a train that, IMO, beats BART and CalTrain pretty solidly, let alone driving (which I've known a lot of people, historically, to do from Providence--and that seemed awful).
Me, I work mostly-remote, so I'm not tied to being here in the first place. I just like it here. But realizing that there are significant options is important.
My hobbies take up a lot of space so I understand what you want. We had to move out of an in town location to get the house we wanted. A 1,000sqft in town house the same size as our duplex rental wasn’t going to cut it.
Luckily I was able to find jobs in an area 45 minutes away from downtown with virtual worker status (3-5 days a month in office max). I changed jobs and was able to find many other ones in the same area.
There's a lot of great places out there. Believing that we have to all cluster around a few cities is pretty bad for us.
The West Coast does have a higher employment of Software Developers per 1,000 people. Especially SF/San Jose. Texas seems to be increasing more than other areas outside of California and Washington.
https://www.bls.gov/oes/current/oes151252.htm#st
It doesn't. Money buys comfort, power, prestige, security, ..., but not happiness. A happy person is a happy person. An unhappy person is an unhappy person. It sucks in a way but from a different pov it is a wonderful equalizing fact of human experience (like the bliss of orgasm, available to poor and rich alike).
A million is a lot but my first reaction to your comment was this is something only wealthy people can afford to say. Especially in the US where we don’t have universal healthcare.
Just typing this made me realize I need concrete, achievable goals though so I appreciate your comment. I am not the gp but this is something I expect a lot of us need to work on. Something like:
1. What are my financial goals? 2. Is it realistic to have those goals? 3. How do I know if I am on track? If not, what can I do to correct my course?
Money is largely a product of working, but broadly uncorrelated with time or effort. If you think this is untrue, try working a minimum wage job or two for a while. This shouldn't stop you working hard for returns, but it should offer a reality check.
Money doesn't buy happiness, but it can remove stressors. However realistically this is linked to income-outgoings / debt. Earning a lot but being highly leveraged and living an expensive lifestyle ends up not being much less stressful.
I may work until I'm 70. I may be dead before I'm 50. I don't want to be miserable today in order to live a mythical good life in the future. So I plan assuming the most important thing is being content day in day out.
I guess I have a high discount rate. It's not obviously right!
1. Allowing one's discount rate to vary with age.
2. Varying the time horizon. Exponential decay probably doesn't map very well to how people think across different phases of life.
For many people, their discounting may change roughly as follows:
- When you first pop out, life is only the present.
- When young, e.g. 2 or 3 years old, kids may want things very soon, within seconds, minutes, maybe hours.
- Young'uns start to understand money. They might look towards saving for their first bicycle. Now time horizons look more like 1 to 12 months.
- As people start to develop educational and professional goals, they allow more deferred gratification: preparing for their career or buying a house. This horizon might be around 1 - 10 years.
- Somewhere in a career, people might start planning 5 - 50 years out, often for kids and retirement.
- As people approach their end of expected lifespan (or face a life threatening illness), they probably care a lot about:
-- personally, living in the moment. This might mean checking off the bucket list as well as high quality relationships
-- financially, allocating money away from themselves towards emphasizing long-term, inter-generational goals, like providing for their family or donating to charity. These goals have 20+ year horizons.
- The ultra wealthy have the means to create charities with 50+ year time horizons via an endowment.
do things that are good
ultimately no one knows when we will expire
balance is hard and sometimes it’s nice to look back and say oh i worked so hard for nothing but the experience of working hard and that’s okay too
we all end in the same place
“nothing is good or bad, thinking makes it so”
As a side note, finding and focusing on developing a speciality that I enjoy is a lot harder than making a lot of money. For me at least.
It's just so fun to do something where knowing stuff actually changes people's lives every day.
I'll be happy along the way regardless I'd say.
Because that's what usually happens with such "bet my life on it" goals on something that affects most of your daily efforts for years.
This is not a "I'll learn to play guitar this year" kind of resolution.
And don't pretend your way is the only way of looking at things. "Money is not important" is something people say if they have a lot of money.
Well if money was key to happiness, why would people with money say that? Wouldn't you expect the opposite? The reality is money helps with lots of things in life, but it's never more than a means to an end. You should always be striving to enjoy what you can in life as you go, because cultivating a happy life requires being present. Otherwise you'll blow right by the stuff around you, and by the time you get the money you were seeking you may be on an island. That's all people mean when they make that statement.
2) Happiness isn't my goal in life. It's not a healthy goal either.
Not sure if you were going to go there, but let's skip the 'on a permanent heroin IV' interpretation of happiness.
but i aim for peace which isn’t joy
i find pleasure to be destructive (sugar, sex, booze, drugs)
so i’ve been aiming for peace which isn’t happiness but it is pleasant without causing me to crave more of it
where joy or dopamine rush always makes me want moar
Sugar, booze, drugs, video games, social media, television, etc. are destructive. Those are unnatural things which are engineered to give a dopamine rush. I've been healthier in the periods of my life when I went without them. Going "clean" is stressful, but lands me in a healthier place. I can't usually do that because of family and social ties.
The odd item on your list is sex. Sex -- in the context of a permanent, stable relationship -- builds social bonds and helps with emotional regulation. Studies show that too.
in shorter term relationships it makes me want sugar booze drugs more
sometimes it’s like
oh this is “pleasure” why not get more
i hope that in a committed relationship it would be different
I'm not going to generalize to anyone else, but for me:
- Casual sex is psychologically harmful.
- Regular sex with e.g. a spouse is healthy.
I find that it (1) Builds a close emotional bond. Conflicts don't seem so important. (2) Emotionally stabilizes me and makes stress go away.
As well as a number of additional positive effects. The older I grow, the more I agree with conservative, traditional cultures about sex: One lifetime partner is ideal. A small number or zero is okay. A large number is bad.
I'll mention: I'm not drifting conservative overall. Some places, I'm to the left of the left, some to the right of the right, some off-axis, and some at the center.
Also, stoicism.
A good book -- more mainstream than most of these comments -- is "Man's Search for Meaning" by Frankl. It was listed as one of the ten most influential books in the US a few decades ago, although it has fallen somewhat by the wayside in recent years. I believe the core thesis. It's worked well for me.
Leading a meaningful and purposeful life, having good relationships, helping people, doing interesting things, learning, and growing tends to be a more effective path to emotional well-being than searching for happiness.
To answer your question directly: My high-level goal is to leave the world a better place than I found it and to have a positive impact on the lives around me.
Or everybody should be left to live oblivious to some statistical facts, lest they be hurt?
Depends on what happens to the economy.
15% inflation for 10 years will make us all millionaires!
That's assuming wages keep up with inflation, which is something we're seeing very few signs of these days.
I don't think I can do it in 1 year. 5 years is perfectly possible.
We live in the golden age of computer science entrepreneurialism.
Venture capital has never been easier to get. The low hanging fruit is largely still unpicked. And, with full remote, hard working highly capable talent has never been easier to hire (hire Europeans, they're overeducated and underpaid).
IMO, not starting a business in the next 5 years would be remarkably foolish.
Just have to write a 35 page business plan, get it reviewed by capable people, have a good idea, and execute.
I think you've missed a lot of developments in the last year or so...
>Just have to write a 35 page business plan, get it reviewed by capable people, have a good idea, and execute.
Thousands of aspiring SV millionaires have done just that and their companies still got nowhere.
Not to mention this sounds like the business version of "how to draw an owl" meme.
Then if you can't impress people with your 35 page document, back to your day job.
Getting excited about a subpar startup idea seems like the biggest point of failure.
> I think you've missed a lot of developments in the last year or so...
Trend vs business cycle.
The long term trend is the world's accumulating vast stores of capital. And capital has to be deployed.
Once the current business cycle is over, the capital saturation will mean Silicon Valley VC idiocy will resume.
Last word in that sentence doing a lot.
Step 2: ?????????
Step 3: Profit!
iirc, graduates at Goldman and similar traditionally do/did a 20 page investment banking proposal every 2 days.
They're filtering, so whenever they don't have enough real work, they research and write a business proposal for some IB deal or whatever line of work.
Essentially, they just research, write, then supervisor reviews, director reviews, nonstop.
That's investment banking. Write dozens of proposals, pitch a couple to actual companies the partner thinks are good ideas.
https://mergersandinquisitions.com/investment-banking-pitch-...
https://www.10xebitda.com/investment-banking-presentations/
And it works. They're usually quite good ideas (after all those drafts).
Meanwhile, Silicon Valley funds ridiculous stupidity, because somehow technology startups just make that much money.
I have no idea how Silicon Valley VC firms make any sort of return on capital.
Then it wasn't on the merits of the business plan, but the idea, niche, timing, connections, delivery, and several other factors besides...
Millions can scribble some startup idea on a napkin. Getting a VC to take you seriously is another matter, even with a fully fleshed presentation, a prototype, and a well written and researched business plan.
no we don't, not any more. the disruptors are now the ones to be disrupted, and they're trying awful hard to keep that from happening.
like, sure there are still new businesses starting, that's true in a lot of fields. but SVB imploding is a fantastic sign as to what is truly happening to startup culture.
Given that I'm not one to balk at writing 35 pages and an 8 digit bank account is relevant to my interests could you throw out a few of the low hanging fruit? I assume you won't be angry if I take one.
The state of the globe's capital markets for a range of non-equity financial products is appalling.
Forget cryptocurrency exchanges. The global financial tools available for bonds, futures, you name it, just about everything else is so appalling.
You need to pick an area that you really like and focus on it and I think the money will come when you improve and specialize your talents in that area.
That area, of course, should preferably coincide with the jobs available in your area in order to make it easier on yourself, but nothing says you can't apply for remote jobs.
You are young and worried about paying off a mortgage... and that prevents you from enjoying what you like in life?
What will you say when you get older and have sick relatives, dying parents, troubled teenagers, health issues... and so on? Life is not about waiting for the end of the storm to be happy. It is about learning to dance in the rain.
Be sad when it is time, manage crises when you have to... And collect and share as much joy as you can the rest of the time. No matter what, don't wait and enjoy your life!
Do not let anyone slow you down.
Bad news: computer science knowledge has rapidly diminishing returns past a certain point in the real world.
Fast-flowing games work better, soccer is particularly great because it has 2x 45 minutes of solid action. NFL has too many breaks where my brain can start thinking about other things.
I say this as an Australian.
Gridiron edited down so it's only back to back plays, no ads no stoppages, is the greatest sport on television.
lol good luck. if anything the amount of ads will only increase.
it's why soccer (futbol) never became a thing in the US -- no channel on the planet is going to air 90 minutes without ads.
As an avid American Football fan, I agree; condensed replays are great, especially when you are interested in, but not deeply invested in a particular game (for those, I prefer to watch long-form).
Frankly those who do would try to avoid it anyway unless it is truly the best/only solution to the problem.
The best way to not have concurrency bugs is to have as little of it as possible. If I have a choice of making a single job faster vs just running 64 jobs in parallel on the CPU, I'd pick the second every time I can, because the code will be simpler and less buggy every single time.
I maybe misunderstanding what you're trying to say, don't you mean you'd pick the first one every time? Make the single job faster instead of introducing concurrency with 64 parallel jobs?
I believe this also happens because often the most accessible/obvious form of concurrency is the super problematic "shared data by default" kind (C/C++/Java "Threads"), which is basically ASKING for trouble.
Also programming education sometimes even reinforces this in a very harmful way from what I've seen. Telling a student about mutexes/semaphores/atomicity is actively harmful IMO if it is not accompanied by strong discouragement and actual practice in finding subtle race conditions.
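To make that concrete, the kind of "subtle race condition" exercise I have in mind can be tiny. A minimal sketch in Python (hypothetical, not from any actual course): a few threads doing an unsynchronized read-modify-write on a shared counter, where whether updates actually get lost depends on the interpreter and scheduling.

```python
import threading

counter = 0  # shared mutable state, deliberately unprotected

def work(iterations: int) -> None:
    global counter
    for _ in range(iterations):
        tmp = counter   # read
        tmp += 1        # modify
        counter = tmp   # write -- another thread may have written in between

threads = [threading.Thread(target=work, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With a lock around the read-modify-write this would always print 400000;
# without one, lost updates may or may not show up on a given run.
print(f"expected {4 * 100_000}, got {counter}")
```

The point of the exercise isn't the fix (a lock is one line); it's watching a program that "looks right" give a different wrong answer each run.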
In C++ the patterns I don't see bugs in are thread pools for processing large batches of heterogeneous data, and fine-grained parallelism for stuff like vector and matrix operations. But in the latter case it is all library driven, and in the former, it is a really simple pattern.
I dunno, I reinvented message-passing concurrency with multiple processes before I knew the terminology or had any formal programming training. It came completely naturally and it was nearly a decade before I understood why people thought concurrency was difficult to get right.
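For anyone who hasn't seen that style spelled out, a minimal sketch in Python (toy job of squaring numbers, hypothetical names -- not the parent's actual setup): the workers share no memory and only exchange messages over queues.

```python
from multiprocessing import Process, Queue

def worker(jobs: Queue, results: Queue) -> None:
    # No shared memory: the only communication is messages on the two queues.
    while True:
        item = jobs.get()
        if item is None:          # sentinel meaning "no more work"
            break
        results.put((item, item * item))

if __name__ == "__main__":
    jobs, results = Queue(), Queue()
    workers = [Process(target=worker, args=(jobs, results)) for _ in range(4)]
    for p in workers:
        p.start()

    tasks = list(range(10))
    for n in tasks:
        jobs.put(n)
    for _ in workers:             # one sentinel per worker so each one exits
        jobs.put(None)

    answers = [results.get() for _ in tasks]   # collect before joining
    for p in workers:
        p.join()
    print(sorted(answers))
```

Because nothing is shared, there is nothing to race on; the hard part moves to protocol design (who sends what, and when to stop), which is a lot easier to reason about.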
If I were just following the course material I wouldn’t expect too much from what you can learn from course work.
I have even become slightly addicted to learning everything I can about building software. From reading (3 of Robert C. Martin's books, and a few other popular ones from hacker news' top 100 reads) and taking Udemy courses[0] in my free time.
I also have been working at a small teacher resource website for the past few years, to get some on the job knowledge.
Even then I still don’t feel like I’ve got the best understanding of SQL, networking or concurrency. I spent most of my time learning specific languages and practices and principles like Agile and spec docs. I’m working on building my own web app and have been creating a dev log for it[1], as well as building an arguably crappy personal website[2].
[0] https://www.udemy.com/user/legozombieking/ [1] https://youtube.com/playlist?list=PLeFBzv7SGgs903uH6Mfm34J94... [2] https://www.cadleta.dev/
(But after saying that, congrats on the learning you've been doing)
Software engineers consistently overvalue the benefits of "I need to be a crazy outlier," and consistently undervalue getting to bed on time, using a visual debugger, and "normal work habits."
Still, gotta actually do the thing.
All that went away with SQL as the management was pushed down to the database. It freed the developer up to concentrate on the problem, rather than the hygiene.
Likewise, I guess, the kids these days don't need to worry about the OSI model, subnetting, routing, TTL...
I can complain that a cloud native Hello World application can be made with the end user knowing NOTHING about the underlying code to get it there, but that's just me being an old guy, shouting at a cloud.
I look at what I know, and what the kids know, and recognize that each of these technologies was spoon-fed to me over decades. I didn't learn Linux, SQL, networking, GPU acceleration, Network Intrusion and Detection and how to manage it...all at once, when I was 21.
I have been developing for 5 years now and I'm always finding a new neat thing even in SQL.
Recently, I had an issue with deadlocks. I learned about them in school (even did a presentation on them) but had to research how SQL handles them and how I could rectify the issue we were having. I really enjoy this part of the job. There is always something 'new' to learn, even if it has existed for a while, like SQL.
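In case it helps the next person who hits this: the shape of the bug is the same whether the locks are database row locks or plain mutexes -- two transactions each holding something the other wants. A toy sketch in Python (thread locks standing in for the DB's row locks; the timeouts are only there so the demo doesn't hang the way a real deadlock would before the database's detector kills one side):

```python
import threading, time

lock_a = threading.Lock()   # stand-in for "row A"
lock_b = threading.Lock()   # stand-in for "row B"

def txn_one():
    with lock_a:                               # locks row A first
        time.sleep(0.1)                        # give the other "transaction" time to grab B
        if not lock_b.acquire(timeout=1):      # now wants B, which txn_two holds
            print("txn_one: deadlocked waiting for B, rolling back")
            return
        lock_b.release()

def txn_two():
    with lock_b:                               # locks row B first
        time.sleep(0.1)
        if not lock_a.acquire(timeout=1):      # now wants A, which txn_one holds
            print("txn_two: deadlocked waiting for A, rolling back")
            return
        lock_a.release()

t1 = threading.Thread(target=txn_one)
t2 = threading.Thread(target=txn_two)
t1.start(); t2.start(); t1.join(); t2.join()
# The usual fix: make every transaction touch the rows in the same order (A then B).
```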
It's not even a Jack-of-all-Trades thing, and I hate it. You're a Git away from building your startup and you've got no idea what libraries or dependencies it has, you just logged into AWS with a credit card and ran with it.
(Not you, exactly, the 'Royal You')
If you're a person who likes being visually stimulated while learning, spending free time outside of work to learn how to query tables and design schemas is about as exciting as watching paint dry.
I have never used and will (in the foreseeable future) not use SQL.
I already have a ton of stuff on my plate that I need or want to learn, SQL is not one of them and I won't include it "just because". Right now it's absolutely useless to my skillset and not something that I have an interest with either. This might change, and then we'll see, but right now it's not even a thought.
Spend some time doing initial technical screen interviews. You'll quickly realize most people with years of industry experience have barely any clue what they're doing.
Think about the average quality of engineer you've seen make it to a panel interview. Then realize the majority of applicants don't make it to a manager screen, and the majority of those screened don't make it to the panel. And a large percentage (varies by company) won't make it through the interview panel. If you're only thinking about people who are hired, you're even more distant.
None of this is to insult people. This skill set is hard. It's just that if you work with competent engineers and spend your days in technical communities, you're in a bubble that tricks you into thinking most people in the industry are way more capable than they actually are.
Well-rounded computer scientists spend a lot of time reading papers (and docs and blogs and...) and a lot of time thinking about things. But that means having less time left over for being 10x engineers unless they're also 10x computer scientists.
I would never expect network programming, and maybe by "concurrent" all OP really means is threads, but it's hard to say. They should understand that computers are a multi-threaded environment even if they don't have experience implementing thread-safe software.
Then that's a failing of whatever education system they were in, not of their age. Infantilizing 21 year olds even further, isn't doing them, or anybody else any favors.
It's basic physics
(a 31-year-old may not know as much as a 21-year-old, but that's not [usually] because they can't know more)
We could be teaching all this stuff at a far younger age, and many countries do.
But you can't teach it all
Again - this is basic physics
Not to mention “college isn’t a trade school”, “if you want your CS program to teach you how to program, go to a boot camp”, and so on.
And that's what's churned out record levels of student debt, pointless degrees, and infantilized "adults"
Meanwhile, vocational schools and community colleges in the U.S. are considered low-prestige by both industry employers and American society itself, despite being less likely to result in high student debt and perhaps fewer pointless degrees.
Just complete lack of coordination from different segments of society, and misaligned incentives across the board.
/s
Through time, regardless of what a performer can do, without an impressive file (by comparison to some known slackers already in the corp) they won't even be given a chance.
These outstanding performers are getting harder to find since so many more people have a degree these days, even PhD's, and with all the average-to-below-average students now earning degrees and thus diluting the degreed talent pool.
Plus more often than not the original need for enhanced credentials was purely bureaucratic, which can work OK to an extent in a well-run bureaucracy, but when you've got a key position where an average performer just will not do, it really doesn't help no matter how many degrees they have.
Often you need someone who has put 4 years or less of effort into rapidly earning credentials better-than-average (at the expense of getting deep into the actual science), who will take that type of success into the workplace, having an advantage doing more of the same politically within the org.
Other times, especially in science, you need someone who has had a lifetime of interest, to build upon that much more technical background, which could not be achieved in merely 4 years anyway, even by the same person.
When you need a lifetime of progress you're going to get a lot more from a 21-year-old who didn't just start a few years earlier.
If you're in your final semester of a formal CS education then you have almost certainly taken courses on all of the things the poster mentioned. I remember studying all of these things, at least at the introductory level, in the second last year of my university education.
A lot of skills are acquired by experience and when needed; don't hire fresh graduates for foundational and theoretical knowledge, hire for learning ability. Give them a take-home assignment where they have to show they know or can learn something new.
But, not all four years were spent on CS. Usually 1+ of those years are spent on generals and dicking around trying to decide what to major in.
And leetcode does not prove anything.
Is part of this because not too long ago, Facebook + Google were basically giving the top new grads $175k signing bonuses to convince as much top tier talent to work for them as possible (and things of this nature)?
This throttled the market as a whole, limiting supply of workers which drove up TC.
The upside at least is that I can play that game too and optimize for it, and then reimplement the same practice which further limits the supply of labor and benefits my TC at the expense of the company I work for. Win.
You don't need to know the intricacies, just the foundational basics. Locks, blocking threads, deadlocks etc. Any half decent CS course will teach you these concepts.
The real issue IMO is that you can get a CS degree without taking a concurrent programming course
Maybe we have different definitions of "CS course"
After 3 years of working as a programmer in the same company I understood why people do that. It's a sanity preservation mechanism.
So then I burnt out and quit.
I say we shouldn't even have a distinction between work and home. We should set up dorms so that people don't have to leave the premises. Rent can just be taken out of their pay, talk about convenience! In fact, do they really need to get paid, when the company can provide all their needs? Isn't getting to do the work reward enough?
I mean honestly it’s like debugging - for God’s sake just stop writing the bugs in the first place.
You could spend your evening honing your skills for a while, yes, but even then at some point you might feel tired of just spending all your time on computer-related stuff.
Then you log on HN, like you did for the last decade or so, and see another "Ask HN: I don't feel passionate about coding anymore" post. It's okay buddy, passion will come back, or might not in a way that makes it okay, you just need a breather.
Sorry, went on a bit of a tangent, cause I do agree with you in fact. I do expect my colleagues to meet a baseline of competency, one that is usually only met by "doing your homework", and at the same time, I don't think it's reasonable to expect them to do it out of habit.
Are you a founder, perhaps? Or just a toxic individual?
You know there's life outside work, right?
I think this is a really important point!
Of course it's acceptable to go home and watch soccer.
But then you are relying on your employer to keep your skills current. Some employers will send you to a conference once a year, but I'm not sure that's enough.
If you don't keep learning you'll realize that when you are 45 your skills are 20 years out of date.
Considering how tech salaries exploded in the last decade, I think we're getting a fair deal for spending some of our free time learning.
It's also dangerous to willingly give this away based on the salaries during a boom. The expectation won't go away when the boom goes away... In fact, it will probably get worse.
I’m not saying that I’ve never worked “overtime” to do what would have been 8 hours worth of work for someone who knows what they are doing.
Should we expect more? No.
If you are lucky enough to work a job you love, work as long as you want - but no one is obligated to do more than they are contracted for.
But even 'back in the day', on my first job out of college as a freshly minted CS major, my boss looked at me and said 'today you start to learn to write software'. He was mildly joking with me but was also very serious. A CS degree does not mean you can write software. Even 25 years ago I graduated with people who only took the intro Pascal class and nothing else. These were CS majors. I knew going in that I was 'green' and had no real practical coding skills. Those I learned very quickly on the job.
I had all the fundamentals. The thing I was missing was writing software itself. Why testing is important. Why structure and readability is important. How to decompose the program into different systems. I am glad for the CS degree as those issues do pop up from time to time. Mostly though I am gluing this framework to that framework so someone can fill out a form and automate some paperwork system.
For new devs I take on, I stress good code structure, testing, regression, QA, path testing, logical layout of architecture, end user requirements and delivery of code. I assume fundamentals. If they are missing them I get them on projects where they would learn particular ones.
I learned very little on the job. It was mostly things outside the job.
So, also exceptional in terms of wetware.
Gates had a 'genius' for ruthlessly exploiting a monopoly in business; apart from that there is nothing else exceptional about the man apart from being a thoroughly unlikeable and harmful person, in my view.
The original paper seems to be https://www.sciencedirect.com/science/article/pii/0012365X79...
Rather interesting, I didn't know that about his history either.
Cheers. I've found the full paper is accessible through Sci-Hub [1]. I've had a very quick glance over it. The paper was submitted in January 1978 (when he'd already been at Microsoft for 2 years). But this NPR transcript [2] of a chat with Gates' Harvard professor Harry Lewis describes him as coming up with the solution a few days after the professor referenced the problem in class (in either 1974 or '75). Gates' co-author Christos Papadimitriou was an assistant professor at Harvard at the time. Papadimitriou went on to a glittering academic career and he recounts his recollection of the collaboration with Gates in a 2013 ACM profile [3].
So it appears that I do have to concede Gates this one contribution towards computer science. Although the paper contains a somewhat intriguing note right at the bottom :
> Ervin Györi of the Hungarian Academy of Sciences and György Turán of J. Atilla University have independently discovered a proof of Theorem 1 [the five-thirds solution]. Their algorithm and proof are essentially the same as ours.
And their paper [4] was published in 1978, preceding publication of the Gates & Papadimitriou paper in 1979.
[1] https://sci-hub.se/https://doi.org/10.1016/0012-365X(79)9006...
[2] https://web.archive.org/web/20110919161456/http://www.npr.or...
[3] https://www.acm.org/articles/people-of-acm/2013/christos-pap...
[4] https://scholar.google.com/citations?view_op=view_citation&h...
They just wouldn't have gotten as early of a start.
Edit: Not my downvote. Corrective upvote in fact.
Don’t forget that as late as 1993, Apple was worth more than MS and it was going back and forth with HP to be the number one PC seller. Things started going south around the time of Windows 95 - only two years before he came back.
We can't all define "better" exactly the same way, but I think it can be agreed there were mistakes due to immaturity that would not have occurred otherwise.
Profit is necessary to stay in business so it's important. But it can't be the only and exclusive goal. A company is just a structure to make things happen / collectively achieve things.
Please don’t tell me that you believe in some BS “mission statement” that companies spout? Did you also believe Google’s “Don’t be Evil” motto they had for years?
> Profit and profitability are absolute requirements. That is why even non-profit corporations must strive mightily for profitability. However, this does not mean that profit is the basic purpose of a business. Profitability really is an essential ingredient by itself which might be better spoken of in terms of an optimal, rather than a maximum size. In support of this thesis, Drucker noted that the primary test of any business is not the maximization of profit, but the achievement of sufficient profit to allow for the risks of the financial activity of the business, and thus to avoid catastrophic loss leading to failure. And we might add, achieve the success which would benefit both the business and society. So profit is necessary, but [not] the purpose of business.
https://www.hrexchangenetwork.com/hr-talent-management/colum...
For instance, out of all the companies that YC has invested in, less than a dozen have gone public.
And once you accept outside funding, you are immediately at the mercy of the investors whose only goal is to make a profit.
And honestly what Drucker thinks is irrelevant. What’s relevant is what the owners of the companies want - either public or private companies.
What do you think the pension funds and mutual funds want when they own stock in a company? What do you want from your retirement accounts?
Not every for-profit company has investors.
>Do you really think that any investor cared about anything besides making money?
Some of the most successful investors already have enough money and/or lucrative investments, and truly do care more about other things, sometimes even while executing their foundational money-making investments.
For a start, all 3 were from top universities. Gates and Zuck were at Harvard; Jobs was, I think, at Reed and partially at Stanford. How many people graduate from there? How many graduates of top universities go straight to FAANG for those fat salaries? How many go to work at their rich parents' companies?
Steve Jobs founded his company (with the genius Wozniak) at a young age. Why would the theoretical "next Jobs" want to work at OP's company? The "next Jobs" (whoever they are) has probably already founded their own company. Also, founders are outliers; since most new companies fail, you don't hear about those who tried and lost.
Bill Gates knew how to program well enough to get BASIC working by the time he dropped out. There are people like this nowadays too. But they won't work as a junior in OP's company when they can be hired as a mid or even a senior. They probably did a few internships starting in their first year of university, so they can go to FAANG and earn a ton of money, or they just get hired by the place where they did their first internship.
In a matter of days, I have seen several similar comments about the HN audience in HN comments. Is consuming HN really an added qualifying factor, or is the HN audience a little too full of themselves? I don't know if I can attribute anything but being a bit curious as a quality of the HN audience. I don't think it's true that HN is a self-selecting community who study their job in their spare time. I spend time on HN, I am in tech, and I used to spend a ton of time "studying my job in my spare time" when I was younger; as I get older and have a family, my priorities are different. These days I tend to spend time with family, learning how to ski, or doing yard work instead of spending energy on the latest shiny framework/language every year. I still work on personal projects, but as I grow older I have other priorities and things I'd rather do as well. I just don't think I am better just because I read HN as opposed to Reddit or Instagram or WhatsApp.
I still remember this stem cell professor, Irving Weissman, giving a lecture, and the introduction said he got his start in stem cells when he began doing some stem cell isolation when he was 15, in school, in the '60s. So 21-year-olds, if they’re good, should know a bit about the thing they spent 3 years studying. Let’s not hide behind some weird watered-down metric that they’re too young.
There's only a limited amount of time in life to gain experience. A 15-year-old's brain is quite literally not fully developed yet; expecting someone to have chosen and begun studying their specialty by then is ridiculous. Such an early start is also not generally possible without the privilege of good mentorship. There's also a reason universities historically wanted well-rounded individuals and require general education classes -- you'll end up stunted or miss opportunities for cross-pollination otherwise.
Unfortunately, the job market is getting more and more competitive.
Software engineers had it easy in the last 10 years due to high demand, but things are changing now IMO.
Automation and AI will make most basic programming jobs redundant. Combine that with saturation at the entry level, and everyone will need to push harder to differentiate themselves from others. Race to the bottom...
You are probably not an engineer, since you should understand GPT makes programming harder, not easier. You won't necessarily make something easier by making it more high-level. Following your logic, you could conclude the introduction of C made Assembler engineers redundant, or that introducing Python left C engineers without a job. This is not true. Using GPT to code is leveraging natural language as the highest-level language for the job, which is certainly leading to trouble, because it's not the best tool for the job -- people specifically invented new languages so it's easier to express the business algorithm. All the attempts to make coding look easier by making it more like natural language have failed, and the thought that GPT would suddenly change something is naive and ignorant. Writing code is a pure thought process, and fingers have long learned to tap it out by heart with the usual syntax, without falling for the trap of ambiguities and inconsistencies in natural language. You just can't build reliable things with prose; you do it with stricter rules of expression in mind.
I 100% would not have written the code as well as it came out with gpt's help.
That’s exactly my point. The current scenario where someone can just go into a 3 months javascript bootcamp won’t be enough.
In my team, there is a grad dev doing bare minimum work. He has no initiative and struggles to understand basic requirements. I need to break down the task so much that I’m almost doing the work. In a few years, with better tooling/copilot/gpt, I will be able to just “finish” the job myself, and this kind of dev is made redundant.
Maybe this kind of dev is not common in FANG, but I met several, from small to big companies, in my over 10 years software engineer career.
Realistically 3 months of any bootcamp was never enough.
>In my team, there is a grad dev doing bare minimum work. He has no initiative and struggles to understand basic requirements.
This kind of person has been around throughout my 25+ year career, starting in the dot-com boom. "You should get into programming because of the money!" This is the result. With programming, you have to have an almost unhealthy obsession with it to be successful. These people get weeded out during the crashes, one of which we are in the midst of right now.
FWIW, we have one of those too.
I kinda doubt that. You still need someone to act as a translator between user and machine.
AI/automation will help more senior developers to the point that most basic tasks can be done instantly and you don’t need to ask a junior dev to do them.
That actually sounds to me like the opposite, i.e. "Race to the top", or just "a race".
My university internship and first job was at an insurance company.
Know who works less than employees in the insurance industry?
Almost nobody. I don't think anyone I've ever met in my entire life worked less than people at my first tech job.
Actual programming is more complex and involves tons of non-code logic.
I graduated with my Bachelors in CS in 2016, and those classes were optional senior electives. You were required to take a certain number as well as some required ones (i.e., Computer Architecture, Analysis of Algorithms). I chose to take Database Systems, Data Mining & Machine Learning, Robotics, Computer Vision, etc. as electives but not Concurrent Programming or Network Programming because I already felt comfortable with those topics. Others chose classes in topics like mobile application programming or programming language theory.
Those topics may be foundational for you, but not for others.
Database transaction locks, data races (a form of race condition), SQL (declarative graph traversal combined with a simple projection): slightly derivative.
Compilers and SQL are technically not the foundations IMO.
Jumping/reading/evaluating/copying data, binary trees/log base 2 hierarchies, state machines, set theory, functional programming, Von Neumann model plus knowledge of multiple pipelines for integer adding are the basics.
...But, studying compilers and SQL is highly advised. Compiling code, and an understanding of database transactions locks are incredibly important practical skills.
Interns / juniors have little to no practical experience, and practical experience is where you _really_ learn how to program.
I think there is room for innovation in CS/SE education. Imo some sort of "code review" class where students analyze and report on a bit of code would do wonders for interns / juniors ability to onboard quickly into their first job. I've written about this in the past [1]
[1] https://sophiabits.com/blog/the-one-change-id-make-to-se
Your comment makes me think about a blog post that I once read (I cannot find it now). Roughly, it was a list of things you need to learn as a computer programmer, but are not explicitly taught in classrooms.
One nice example: What is "staging" in Git? Hell, I've used Git for years now, and that part of my mental model is _still_ fuzzy, but the model is good enough to do my job well. I cannot know everything crystal clear.
I would say the same for concurrent programming. Not looking down upon anybody, but the vast majority of programmers will never go deeper than: "Oh, just use a thread/worker pool with a runnable to do work." I've gone deeper, but blew off both my feet several times! (Hat tip to the in/famous C++ quote.) And I only need to write concurrent code a couple times a year. I'm always rusty when I get back into it.
You may have heard this one before, but staging is just saying, on your local filesystem, of all the files I've changed in this repo since the last commit, these are the files I want to add into my next commit. You mark them for this with `git add <filename>`.
(Slightly more detail: it doesn't need to be file-level; you can stage only certain changes within files if you like, using `git add -p <filename>`.)
I remember in university we had peer review for writing courses, yet that didn't really seem to elevate anybody's writing more than it had been before. The people who did well continued to do so and the people who did not also continued to do so.
For example several first year students not only had no familiarity with calculus, many were having a hard time with algebraic concepts. Concern was around finding some way to add a remedial math course before Calc 1. This pushed everything else out farther.
CS grads may lack foundational knowledge simply because high school grads lack foundational knowledge. What it took to "pass" high school 30-50 years ago seems substantially more rigorous than today.
The administration didn't care. They didn't even care if the students dropped out. Butts in seats = more money. That's it.
Maybe the issues around high school math are part of it, but this was kind of obvious in the 1990s as well. I remember sharing a cube as an intern with a graduate student at another well-known school; they were taking graduate classes in the summer and were using the same book for a supposedly graduate-level class that I had used for a class my freshman year at RPI.
I think RPI's core program requiring Data Structures & Analysis, Fundamentals of CS/Models of Computation, Programming Language Design, study of Grammars, etc.. is an outlier for undergraduate CS curriculum.
It seems rare I come across someone who can analyze the complexity of an algorithm, check if something can be parsed by a Regular Expression, really understands recursive algorithms, etc.. unless they went to graduate school, and the industry doesn't seem to expect people to understand this stuff.
But you can't necessarily generalize. Not all high schools are the same and they never were. And old professors at RPI are just mirroring the historical reputation of RPI as a tough school. In the past they probably just had more freedom to fail all those students out of the school.
Another major consideration: "CS degree" is a misnomer in the US. We have thousands of colleges and universities that vary from "best in the world" the "literally fraudulent". I've even come across a few small colleges where the CS department is staffed entirely by people with a few years of industry experience and an MS, who likely wouldn't even clear the bar for a senior eng role.
I would suggest looking at the CS department's page from whatever university you mainly recruit from to get an idea of what the core of their program looks like.
I'm sure it's needed by a large percentage of programmers, but maybe that's my bias speaking.
That was a fun course where I learnt horribly bad patterns like "your data forms a key so you should use that as your key instead of having a surrogate key"
The prof was good at relational algebra, but not designing software
I used SQL all the time in web dev but haven't touched it since going to college. I do embedded so it's just never been necessary.
Now if this is a web dev or data related posting and students are coming in supposedly for this specialty then this ask HN is a little more understandable. Although perhaps poorly worded.
Oh that's cool! I've never worked on that side of things. I assumed SQLite was used in embedded stuff, but I guess not.
SQL was then something that was a side thing in the class that you were supposed to pick up for one of the projects.
We had to learn to normalize schemas in that class. I have yet to come across a team in my career where normalization was understood well enough to where it could be correctly matched to the architecture of the entire system.
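For anyone who only ever heard the word in passing: "normalize" here just means splitting repeated facts out into their own table so each fact lives in one place. A toy sketch using Python's built-in sqlite3 (hypothetical table names, nothing to do with that class):

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Unnormalized: the customer's name and city repeat on every order row, so
# changing a city means updating many rows (and risking them drifting apart).
con.execute("""
    CREATE TABLE orders_flat (
        order_id  INTEGER PRIMARY KEY,
        customer  TEXT,
        city      TEXT,
        item      TEXT
    )""")

# Normalized: each customer fact lives in exactly one row; orders reference it by key.
con.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT,
        city        TEXT
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        item        TEXT
    );
""")

con.execute("INSERT INTO customers VALUES (1, 'Ada', 'Boston')")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(10, 1, 'keyboard'), (11, 1, 'monitor')])

# A join reassembles the flat view whenever it is actually needed.
for row in con.execute("""
        SELECT o.order_id, c.name, c.city, o.item
        FROM orders o JOIN customers c USING (customer_id)"""):
    print(row)
```

The hard part the course actually taught was the judgment call: knowing when to do this, and when a bit of denormalization is the right match for the system's architecture.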
The other amazing takeaway was going through the historical failures of hierarchical & graph databases. Since I started working hierarchical databases have come back with tons of hype twice now (once in the late 90s, once in the late 2010s). Both times they failed exactly the same way as history would predict based on what happened in the 60s and 70s that led to the development of relational databases. Sometimes it seems like the industry hype train is completely unaware of computer science.
Perhaps the problem is that we train so many computer scientists to do programming, but that's the learn-on-the-job part I guess.
I first encountered version control and unit tests on the first day of my first real programming job, because I studied physics, math and computer science. And I'm still (by far) the go-to person in the office for all matters related to plasmas or cosmology.
All the things you mention are fairly big topics, and not only that, you only really understand them by doing a bunch of coding over several years. You can get introduced to them in university, but a degree course is only so many topics and they all need an introduction. Chances are if a student has done these things it's superficially, in practicals that are similar to practicals in other sciences: you don't really understand it, you write it up, and then you don't rely on what you did for further studies.
I studied a bunch of things at university, leaving without being particularly good at any of them. For instance I built a radio and a bridge in my first year on the engineering degree, but I couldn't just become an EE or civil engineer from that. I wrote a thesis about early WiFi for the business school, but that doesn't mean I could just be a product manager.
Similarly, a student may have done a bit of joining tables in SQL, a bit of multithreading, and a bit of routing during practicals, but you wouldn't think they really understood any of those things in the way someone with a couple of years on the job would.
However in my experience most of them pick things up relatively quickly and end up becoming good engineers and reliable team players.
At a BSc level with no experience we should be looking for genuine curiosity, motivation and interest in learning and solving hard problems.
Everything else can be taught.
We have even expanded our program to include interns with degrees in other areas of science such as mathematics, physics, materials and mechanical engineering, some with very little programming experience, with great results.
Our objective is always to hire them and retain them, so we do invest plenty of resources in training them well.
These engineers all turn out to be okay but we don't end up with any new or advanced ideas out of this pool. We cargo cult every behavior that Google does because a few of our senior engineers were ex-early-Googlers.
If feels like working with the blind. There's no interest from our engineering teams about new developments in the field and they don't even recognize when the work that they're doing is a fit for new patterns.
99% of these people don't know what a CRDT is and don't recognize when they're accidentally building one. Once every six months or so someone will post on Slack about their eureka moment of just discovering what a Bloom filter is and how it might apply to long-lived problems that we have.
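For reference, the whole trick fits in a screenful. A toy sketch in Python (not something my org actually ships; real implementations size the bit array and hash count from the expected item count and target false-positive rate):

```python
import hashlib

class BloomFilter:
    """Toy Bloom filter: membership tests with false positives but no false negatives."""

    def __init__(self, num_bits: int = 1 << 16, num_hashes: int = 4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, item: str):
        # Derive k independent-ish bit positions by salting one strong hash.
        for salt in range(self.num_hashes):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: str) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

bf = BloomFilter()
bf.add("alice@example.com")
print("alice@example.com" in bf)   # True
print("bob@example.com" in bf)     # almost certainly False (small false-positive chance)
```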
To someone entirely from the practical/self-taught/trade side of things it's a kick in the teeth knowing what I bring to the table and how my org depends on me but doesn't respect me enough to hire other people with the same background.
My curriculum was: Year 1: Intro Comp Sci / Year 2: 2 courses in Logic, 2 courses in data structures and algorithms / Year 3: 2 courses in processor design, 1 course in finite state automata, 1 course in parsers / Year 4: a course in ethics, a course in team programming (which covered UML and version control), and two electives.
I believe a major was 14 courses so I'm missing one, or it may have been it was three electives. I didn't take databases because I was already a paid sysadmin before I started college and mostly at the time database courses were just ten tedious weeks of normalization crap.
Also, treat your interns better. The reason to hire interns is because you plan to devote some of your resources to help them in their professional development. Stop asking what you can get out of your interns and start asking how you can best give something to them.
I got them to work on an internal tool using Node and Vue.js for 2 months. They were good programmers overall.
After their internship ended and completed the requirements for college, I received a personal letter from them thanking me for the knowledge and support they got from me. Apparently, I lost the letter after closing our startup after a year. But I vividly remember the gratitude I got from them.
What I learn is that internship is a two-way street. You learn how to communicate with them effectively at their level and they learn from you in writing software.
This may be different in different universities and it may have changed today, but the tests we took were largely about remembering lecture talking points and being able to regurgitate them with or without any real understanding.
For example, you might learn a bit about relational databases, but your understanding will be limited to the talking points of the lecture. E.g., you might get a question asking you to explain the use of primary keys, but if you asked them how they might design a relational database for some data, with normalized tables, they'd have no idea, because they'd never have actually put the talking points into use.
It upset me because by the time I had finished university I had launched two startups and worked professionally as a developer for 3 years. I was consistently helping students with practical exercises while at uni, given I was one of the most capable on the course, but none of my experience really helped me in the tests, because I discovered so little of it was about practical understanding; it was mostly just an extended English exam that tested writing ability and being able to regurgitate talking points from lecture slides.
It's not that the students weren't smart or capable individuals, its just that the course didn't incentivise obtaining depth of knowledge in what was taught so no one did.
I was fortunate to get my degree where I did. In our RDBMS course, the mid-term and final was to build a fairly sophisticated model based on a set of written (in English not code) requirements.
In our senior two semester class (I forgot the name), we had to design and spec out an application first semester using Microsoft project (I'm old) and in the second semester, we had to build it and demo it to the class the last few weeks.
SQL? Yes. Database theory? Have never discussed it beyond "what is an index"? So I never looked at that topic again.
Concurrent programming? Never dealt with it outside of courses and jobs that care about it mention it, so I self selected out. So I never looked at that topic again after the course.
Network programming? Took a course in it, but outside of a few devops use cases, I have never had any reason to recall that knowledge. I just memorized 5 versions of that test and went in to it with that.
My advice to my undergrad self would be to basically abandon anything that is not fun projects (so you get familiar with the languages themselves), hackathons (so you have culture fit), and leetcode.
I can't imagine the average ROI on learning these things is great.
Second, I don't know what part of the world you are in, but usually formal education courses have certain requirements to them, roughly x1 hours of social sciences, x2 hours of humanities and so on. Then there are basic prerequisites like math. In the end, the final number of hours for the subject is not as high as it would seem at first glance.
Finally, there is competition among education providers and their "tiers". Universities/colleges compete not only among themselves, but against codecamps too. The premise of code camp is to help somewhat computer literate people memorize a bunch of text macros that yield certain result on the screen. Colleges must adapt to compete, dropping the quality floor even lower.
In the end, unless you have graduates from "general" college/university you can expect that deep foundational understanding will be replaced with quick factoids on how to produce certain result in certain specific context without understanding said context or even being aware of said context.
Greybeards looked the same at "us", by the way.
Universities have no vested interest in education of students - their money comes most often from government loans at the time of attendance. Once the student graduates, they have their money, they don't care how successful the graduate is in the industry.
As such, if you ever want to see massive education reform, all that needs to happen is make the state colleges run their own bank system and give out education loans, instead of the government adjacent banks. You would see massive changes overnight.
Is this really the case for software development/engineering? I would think having lots of lackluster applicants would tank undergrad rankings in U.S. News, and that having successful career fairs and good starting salaries are in fact a priority. (Granted, these universities care more about research.)
And the institution doesn't care, but many of the teachers do care, so they do what they can to help the students.
Also, they cook the books. I went to a college that supposedly had a high 90% placement rate. But what they actually counted is if the person had a job soon after they graduated, no matter what field it was in. Because most people who had to pay those tuition prices needed to be earning money to pay them back already, they took whatever job they could.
I called them on it and they claimed that I was refusing to follow their advice and that's why I couldn't get a job in the industry. In reality, they were just horrifically bad at job searches. The interviews they sent me to were for horrible companies, and even then I didn't even get a second interview.
They claimed it was because I refused to follow their advice of hand-writing a note to give to the secretary to pass along to the interviewer. I informed them that seeing my handwriting was a massive turn-off, but that continued to be the career counselor's excuse as to why I couldn't get a job.
My dad ended up going in and yelling at both the counselor and the head of the school, which changed their attitude considerably. They were still completely ineffectual, though.
Career fairs are reflective of general job market. As long as companies do see career fairs as a viable recruitment path they do participate, even if for the sole reason of poaching "the best". Companies do not really care (at least as first-order effect. Of course there are second-order considerations) whether the grey mass is sub-par or completely unemployable. On the other hand, as long as an institution is not notorious for producing unemployable graduates they also do not really care (again, as first order effect) whether the lowest rated classmates are merely slightly below average or "complete morons".
Why not? It's a zero-sum game, in that the number of students in a given year is more or less fixed. I see it as a competition to attract students to apply AND eventually graduate.
Also, what does network programming mean to you?
Basic networking knowledge, or actually writing low-level network code?
Besides that: higher-ed institutions suck. Unless it is something like a top-3 school, do not expect a lot just because it is a degree; everything is up to the person.
The rest were mostly fluff. Interesting fluff, and good background material, but very little useful stuff.
The interesting things were available through student societies, and if you managed to get to work at the University's IT department.
Your typical liberal arts degree has 4 years; about 2 of them are dedicated to liberal arts, or core education, and two years to the major, so you only have, say 16 classes for a CS major. Of those, say 4 are math, 4 are the intro programming sequence, and 3 are the architecture sequence (circuits, architecture, and operating systems), so you only have 5 or so classes, for your foundational classes and cool electives.
What normally happens is that, depending on uni and student, students will take some of them, but not all of what you call foundational knowledge. And the same with what the next person wants ;). Your foundational knowledge doesn't include AI, Data Science, cloud, mobile, graphics, UI, ...
Learning more theory later on is still possible but those are more like financial investments that give lower yields over longer time periods. So they are best done "early" in your personal development.
In terms of interviewing interns, just find out what they do know, and judge the best one on a balance of talent, knowledge and people skills. They will do you proud. No need to have a set expectation against specific skills unless that is the core domain they'll be working on.
I never took any database modules, I am entirely self-taught on SQL and MongoDB through some side projects.
I never took any concurrent programming courses (not interested).
I only took one network course and quickly forgot everything except that there are 7 layers in the OSI model (or is it 5?)
Needless to say, I never needed any of that knowledge in my work as a frontend engineer. Even if it were somehow needed, I could just fire up MDN/Wikipedia or ChatGPT and ask.
What I did take are a lot of software engineering and AI modules, and I found them to be more useful or interesting.
Network programming, concurrency and distributed systems are literal force multipliers. Super impactful fields.
But yes, regardless of what people choose to learn in university. There’s always a much more than can and should be learned after.
This might be especially true for theory/fundamentals. It's easy to skip that stuff if your program's focus is on immediate job-readiness training.
I actually would love to know more about network programming. Specifically, "how do I go from electrons wiggling in a wire (or radio waves in air) to the TCP stack"
I have had hints of this - the OSI model, wireshark, etc but when it comes to figuring out the nitty gritty of networks I feel like I'm stabbing around in the dark. How should I configure my networks on AWS? What's the best way to get VPC's talking to each other safely?
At an interview with a former (excellent) boss I was doing an exercise and set up a REST API. He said one simple thing - "ok, but why HTTP?" - and suddenly my very, very faint memories of netcat came to mind and I ended up making it ~10 times faster by just putting bare messages on TCP instead of using HTTP. (I'd have used a low-overhead protocol in a production situation.) But I want that sort of idea to come naturally.
Should I just take the AWS cert courses or is there a better way? I want to have more than just "uhh netstat -peanut and grep for stuff" to figure things out.
Similar, how do the internals of linux work? I can generally get what I need to do, done, but I only knew to check load average after a colleague mentioned it.
Everything I've learned is top-down; I think I'd like to learn bottom-up.
For more direct help, Beej's Guide to Network Programming is widely recommended: https://beej.us/guide/bgnet/
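To give a flavour of what that guide covers (it uses C; this is a rough Python analogue, and the host/port are made-up assumptions), here is a minimal sketch of "bare messages on TCP" in the spirit of the netcat anecdote above:

```python
# Minimal sketch, not production code: raw bytes over a TCP socket,
# with no HTTP framing on top. Host and port are arbitrary assumptions.
import socket

HOST, PORT = "127.0.0.1", 9000

def serve_once():
    """Accept one connection and echo back whatever arrives."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)          # raw bytes, no headers
            conn.sendall(b"echo: " + data)

def send_message(msg: bytes) -> bytes:
    """Connect, send one message, and read the reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as c:
        c.connect((HOST, PORT))
        c.sendall(msg)
        return c.recv(1024)

# Run serve_once() in one process/terminal and send_message(b"hi") in another.
```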
To put it this way: I'm just about a decade out from when I finished my BS degree, and my CS courses were (in order):
1. A SICP-based intro course using Scheme. I loved this course!
2. A data structures course using Java
3. A machine structures / light hardware design (just to understand pipelines, caching, etc) in C and assembly
4. Now pick whatever CS area you want to study
I mentored a new grad from my uni who just graduated, so is roughly 10 years younger than me. The curriculum had been changed to:
1. Same SICP-ish course, but using Python
2. Data structures was cut shorter to make room for ML
3. C++ course to build some kind of "distributed system" (but no discussion of the fundamentals of how the ABI works, for example)
I had to explain very basic things to this guy (e.g. what the stack was, basic gdb usage), who was otherwise very bright.
That's not to say "the kids these days are all dumb!", as I have TAed some classes where some of the students were far better hackers / coders than I was, but I just think the funnel towards skills and technology that have Proper Nouns that can be put on the resume is an unfortunate pressure being placed on universities these days.
The "foundation" depends on the context. Your process may be different but many companies will consider D&A as foundational knowledge as it is the closest thing we have so far to a standardized measure.
In my country you get to pick just a small percent of classes and the foundational ones are mandatory for anyone.
>Database Systems (relational algebra, SQL)
We did a shitload of courses on databases in both the undergraduate and graduate programs. Not my favorite, but they were useful. There's no way to avoid dealing with databases as a programmer.
> Concurrent Programming
Did that and parallel programming, too.
> Network programming
Did that as a subset of the Operating Systems course, where we had to tackle many aspects of Linux systems programming.
We did a lot of other courses that were at least as important: Algorithms, Data structures, 3D programming, Testing (I forget what the course was called), Formal Languages and automata, Data mining, ML, AI, Digital circuits, Cryptography, Big data, Cloud, Complexity, Web, Semantic Web, etc.
I don't know about CS graduates, but I've seen developers with 5 to 10 years of experience lacking basic skills such as commonly used sort algorithms and time complexity. Their justification was along the lines of: "I know JS and TypeScript and React and I don't need anything else".
I still remember being asked ridiculous questions about design patterns and UML as a new grad. Stuff that is never taught in a typical CS degree - but interviewers seemed aghast I didn't know them. I still remember one saying "but you didn't learn design patterns?!".
I think the only things you should expect from new grads is ability to code, basic understanding of computer architecture, and possibly data structures / algorithms.
We learnt the fundamental concepts plus did some programming, e.g. for databases: the relational model, normalisation, etc., and coding in Java/PHP with Postgres/Oracle.
Our curriculum is pretty generic, and most of the alumni work in industry as software engineers, IT consultants, startup founders, etc. Not many do academic work.
I'm not saying that all of us understand the basic concepts well, though. Some maybe only read a bit of theory and spend more focus on writing apps, instead.
I've had UCB/Cal State students say things like "Why would I ever need to interact with Excel when there's X?" - uh, if your boss or execs use it! I recall the "Hadoop big data expert" who couldn't figure out how to do a VLOOKUP during an open-browser interview, or folks ready to graduate from a data bootcamp who couldn't give even one example of how they could distinguish between plausible and accurate data. Also, if any local programs teach students about the ascendancy/utility of PowerShell for anything IT/cloud/admin/DevOps-related, it hasn't trickled down to recent applicants.
I've been interviewing freshers for a long time.
Sometimes they'll know a lot of things, sometimes they can't tell a computer apart from a file cabinet.
One can't predict what a person knows based on their degree; it gives you a general idea, but it doesn't hold every time.
Moreover, there are various other factors that you should consider while interviewing freshers: they might be nervous, they might have travelled a long way to get there, many many factors.
Looking back on it now, a lot of the courses could have been replaced with something a bit more practical, e.g., how to use Git. But strangely, some of the larger team projects made all sorts of assumptions around being able to code, knowing about networks and networking etc. which were never explicitly taught in the course.
No concept of processes, no idea about any data structures, intimidated by everything. They've studied C/C#/Python in a class but can't remember anything about it. It's really a lack of passion and interest and it's endemic I think. People study CS because they've heard it pays well.
I fully expect these people to become my managers!
They have better social skills than the CS students of the 1990s, so this is a really good bet.
For example, my concurrency course involved a lot of formal methods and temporal logic. Because the teachers were doing research in that area. In retrospect, this was all a bit academic and not very practical. This stuff does not map very well to the real world when you are actually trying to solve some concurrency issues. But enough of it stuck that I was able to read up on this when I needed to.
And of course, some of that academic stuff actually works out in the real world once in a while. E.g. the java.concurrent package is a straight integration of a framework that came out of the academic world and very nicely done. Great stuff and I still remember pulling that in as a separate maven dependency before that was integrated.
I think of university more as a place where you learn to learn new things rather than as a place where you actually learn a lot of things. Mostly becoming a good software engineer is basically an apprenticeship. I was lucky with my early gigs and I learned a lot on the job from more senior people. I've worked with lots of people with either no degree or a non computer science degree as well. It's fine as long as they can learn new things.
Do an experiment: pick a handful of intern applicant resumes, look at their degree and then look at what the uni requires to get that degree. In the US, you will often find that there are ABET-accredited and non-accredited versions of the same degree. What you are looking at is commonly required for ABET but often avoided when possible. Or in the case of one person, I've found that the college they claim to have studied at doesn't have any such degree program.
Reminds me of a conversation with a new hire that was re-orged onto my team, and a machine learning engineer on another team. She said "Ah, you are lucky to join his team this way; they are all the people who took those hard classes like Operating Systems and said 'more please' instead of quickly dropping the course. Their interview process is very hard."
It's also worth remembering that the US BS program is not like Germany's old Diplom system. It's not 2 years of gen ed then 3 years of only computer science. It's 4 years total and entirely possible for a student to study SQL for one semester two years ago and never touch it again as part of coursework, if they took databases early and don't have a school job.
My impression is that the philosophy of CS curriculum at most places is to keep the barrier of entry low. That's why bootcamps are a thing, and my fellow CS grads are excited about ChatGPT being able to write code. On the other hand, there is a growing shortage of electrical engineers because the level of gatekeeping is too high.
I am an electrical engineer myself and I have working knowledge of all the things you mentioned.
I think I had one course on "parallel" (concurrent) programming. I had exactly zero courses on SQL or relational algebra, although we did write a basic database. I had one course that MIGHT have touched on a bit of network programming.
So, academically, I would have had little or none of what you are considering is "foundational" CS knowledge.
I actually did learn SQL during a job I did during school, and I was fascinated with race conditions and mutexes so knew a fair bit about the theory of concurrency. And I loved learning about all kinds of protocols, like SMTP and Telnet and FTP, so I knew quite a bit about networking.
I think the disconnect here isn't that "students don't learn the fundamentals", it's that you think the fundamentals are what they need to "do work" on day 1. Back in the day, University was supposed to train you to THINK in your field, and so things like data structures and algorithms, yes.
And, I have to say, it has done well for me. I started as a C/C++ programmer, then moved to Java, worked in PHP and have now been writing a lot of Python. I've helped with C#, Visual Basic and a number of other languages.
And if tomorrow I need to use Go, or Rust, or $whateverLanguageOfTheWeek it is, I'll be able to. I understand what can make a program run fast or like a dog.
If someone can do all of the work through 3/4 years of university, they should be able to learn and do a large variety of tasks.
Nowadays, it might be that a CS program requires some or even all of the courses you mention in their study. But do you expect someone who took this for 1 of perhaps 20 courses to be an expert at them?
There are lots of "foundational knowledge" materials out there you can pick from the realm of "CS" - there's a theoretical track, practical track, grad school track, "just enough CS to be CS while I have fun doing other stuff" track ...
If it's vital [to you] that an intern (which, most often, is defined as someone you'll only have for a few weeks or months and who is distinctly still in school) have all that "foundational knowledge", then put it in the job posting :)
Most internships I've ever seen have far fewer requirements (because the company is going to do some amount of training for the interns during their tenure). They're more along the lines of:
- Jr majoring in <degree> with minimum <GPA>
And ... that's it
I knew "a lot" coming out of high school - more than ended up being covered in pretty much every [core] class I took over the next couple years
Didn't make me at 20 comparable to someone who'd been doing the work professionally for a decade or more :)
Be aware that university today is largely unaffordable to most. I don't mean that as in "it's expensive". I mean that literally most people don't qualify for federal student loans and cannot afford the payments on private loans. Yes, you have to pay on private loans as you go to school. Those payments are based on a percentage, which has obviously increased massively over the last ten years. Schools don't have jobs for students who are not in "financial need" (based on parents' income). Therefore you have to work a job unrelated to your field just to go to school.
Good engineers find out quickly that university is much more about the fed paying itself at your expense. The system is corrupted by politics (the blue kind) with accreditation clearly influenced by whomever sits in the chair. It's not about merit, it's a gravy train for gov workers.
Look elsewhere for employment.
I chose the "traditional" stuff, like operating systems, compilers, did my databases course with a lot of interest, embedded, and then wrote some software in my free time (beammp.com's server, which is a game server for a multiplayer game, lots of concurrency and network stuff), and other side projects, a lot of stuff from scratch, and a good lot of open source contrib.
A lot of my peers don't do their own side projects, and if they do, it's a website or something on top of layers of abstraction (like a CRUD app in TS).
A different subset of my peers chose specializations in which they barely need to know what a variable is, such as security related courses which are more law than CS (e.g. forensics), some who go for "fullstack" webdev almost exclusively (no interaction with hardware, DBs only through abstractions with no sql in sight, abstractions of abstractions and a lot of copy paste).
The few of my peers who are interested in CS to the degree that they enjoy learning the foundational things share my view, and I often talk to them about this topic.
I don't mean to be dismissive of those other disciplines - they are valid, and the depth of knowledge that can be acquired about, say, TS, is not something I'm questioning.
To me, it's just not that CS-y to write HTML; I feel that close to the hardware is where the programmers with the degrees should be. In my experience, as a webdev you get outpriced by third-world-country developers very easily, and that's a tough spot. Not so much in, say, robotics.
But then again that's just where most of the money is. And I don't think some Indian remote developers are a threat when it comes to any serious employer in the west: The code quality you get for those 15$/h is often crap and the communication sucks. Not to mention they can always just decide to be fed up with you and run with the 100'000s of Rupees they already got, when things get hard.
Engineering is a craft you learn under the tutelage of peers and masters, i.e. at work or perhaps in a community (e.g. open source). That is what you should expect to provide to your interns. They're not cheaper, fully-formed labor.
Since then, I perceive that CS programs have come under immense industry pressure to crank out software engineers and not computer scientists. The need for real computer science is not very large as a percentage in the entire computing/software industry. And generally once a funky CS problem is solved, it gets encoded into software and then reused by software engineers over and over.
I think this has forced CS departments to start directing some of the curriculum towards software engineering rather than CS -- resulting in students who are more diluted in the fundamentals, but who also aren't great software engineers.
I also think that some departments are also now trying to figure out how to support the massive growth in the kinds of computer science theory needed to support the emerging data science/machine learning/deep learning/etc. fields. When I graduated GPUs were still relatively new and we simply didn't learn those until later MS programs. Now I couldn't see somebody graduating undergrad without knowing some basics on how GPUs work on the hardware and software side, and understand the data structures and computations required to work with the new classes of models emerging in the field.
If you need someone to write a network stack and a concurrent database system, you don't need an intern, you need an engineer.
Interns are generally students. Sometimes they're still in school. If they had skills in these kinds of deep topics, they would be applying for full time engineering roles, not internships.
People apply to internships because they lack experience and skills. It's your job to mentor them and expose them to these kinds of specialized roles so that they can decide in what direction to start their career. Interns come to you to learn and gain experience they couldn't otherwise.
Interns aren't workers, they're apprentices. Once you treat them that way, you'll have a much more rewarding relationship. They're there for you to teach, not to do an engineer's job for a fraction of the pay.
Nearly all of them are strong in the areas of design and software engineering principles. They are also strong in SQL, the Git/GitHub ecosystem, and at least one language/framework. And they know just enough about infrastructure to use a simple CI/CD pipeline. But anything beyond that infrastructure-wise (networks, security, tiered architecture, IaC, services, PaaS, etc.) is foreign territory for them because it isn't covered during a typical undergrad program.
Now this is perfectly OK in my opinion because I provide that for them initially and teach them many of those concepts while coaching their projects. There is limited time in any undergrad program, and I'd rather universities spend that time focusing on what they are currently focusing on because that is a foundation I can work with.
In general, however, fresh CS grads from even good universities need to be onboarded on software tools and certain types of systems when they have their first software engineering jobs. For example, many college students don’t know git. Basic sysadmin and more advanced Unix command line skills are also generally not taught.
Personally, I still think that you should be exposed to those for a computer science degree. Here is why:
1. Universities have evolved, and maybe always have, to meet the demands of the labor market, and most jobs will touch on those topics in one way or another.
2. Generally, you need to persist data, whether it's on a drive, in a data store and so on. Having heard of different data stores and maybe differences in query languages, seems very relevant. This doesn't mean you know how to write your own database.
3. Which processor these days has only one core? Even in languages like Python or Ruby, data races can result in subtle bugs. Having some idea that access to shared resources may need to be protected is useful (see the sketch after this list).
4. Whether it's writing code in a microservice architecture or integrating an API, we make network requests. Having an idea of how this looks different for HTTP, TCP or UDP provides a lot of context for making better decisions.
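As a minimal sketch of point 3 (my own illustrative example, not anything more): the read-modify-write below is not atomic, so concurrent increments can be lost unless access to the shared counter is protected.

```python
# Hedged sketch: a shared counter incremented from several threads.
# Without the lock, the load/add/store sequence can interleave and
# updates can be lost; with the lock, the final count is deterministic.
import threading

counter = 0
lock = threading.Lock()

def safe_increment(n):
    global counter
    for _ in range(n):
        with lock:            # serialize the read-modify-write
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000; drop the lock and the result may come up short
```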
To me the idea of a formal degree, is some level of exposure, so that even when you have never touched those things years later, you have some reference in your head to start looking it up and refreshing your memory.
I also agree, that a computer science degree doesn't mean you become a software engineer and so it might not make sense to force everyone to take those classes, but then again, see point 1. Alternatively, which classes would be more suited or a viable alternative to those if you have to make a choice?
So American CS curricula do emphasize teaching concurrency as a theme, as well as expose students to all levels of HW/SW abstraction (architecture, OS, networks, etc.). It's just that there's less emphasis on the specific trends because those are expected to change over time anyways. Also, CS is diverse, you wouldn't necessarily expect a quantum computing student or an AI student to know industrial level practices for concurrent programming.
The various software engineering modules I had were pretty lacking and flawed in their teaching and I could tell that at the time and even now with several years on industry experience I still believe they are lacking foundational areas especially the ones you identified.
For example:
- Databases were barely taught or even used. We had some fairly poorly put together “web” module that covered PHP, a tiny bit of SQL, no HTML, CSS, JS.
- Networking module was “here is a Cisco CLI, setup RIP following these instructions, congratulations you are now network experts”. I think we covered NAT in one module.
It seems to be a deeper issue too. I remember that the vast majority of people in the software engineering modules couldn't write code at all - they'd never even so much as attempted it.
This was in other modules too - a majority didn’t know what operating systems were beyond “what, there’s things other than windows?” all the way to being for all intents and purposes tech illiterate.
To the point I remember a second year teaching another second year what copy and pasting is.
This was around 2012-2015.
So I don't know where the blame lies, really. I think it's all an amalgamation of:
- Clear lack of interest or passion about anything in their degree of study (why the hell sign up for it then, with so many people doing that?)
- Completely failed tech or STEM education
- Total failure to vet applications to the university
- Whatever STEM GCSE or A levels they had gained clearly not being up to scratch
- Some modules being so dumbed down as to be meaningless or something you couldn’t gain equivalent learning from googling for a few days
I have noticed that undergrad CS curriculums in most places are shying away from making practical skills part of mandatory courses.
Sure, theoretical underpinnings are ‘more’ foundational. But unless you know how to use these (and similar) tools, there is simply not much motivation to understand the theory naturally.
If you don't want to understand the theory, what's the point in going to college?
Lets compare with a graphic library; you can learn the DrawRectangle API without learning the DrawLine API.
Not so with concurrency. Some kids won't understand the debugger (if any, LOL), some kids will get stuck at coding, some kids will have weird conceptual hangups. All the kids need to learn all of it, all at once. It's a big chunk.
Same problem with networking. Pre-reqs are seen by uni admin as a way to restrict tuition income, LOL. You can't assume the kids are up to the level of a Net+ cert or even an A+ cert. It's a big chunk all at once. See above, you "probably" can't do networking "correctly" unless you async it. Does the class system even have a pre-req that includes the concept of a TCP/UDP network port? It's just a HUGE topic to inject all at once.
When people say DB they are usually unclear. I took a senior level DB class and learned Codd-Normal forms and all that which makes SQL query writing pretty easy. The problem is your interns aren't graduated seniors yet and they probably eliminated that class from the curriculum because "No-SQL movement" or whatever new hotness caused a distraction. I think you'd be a better logical thinker about organization of data if you know your Codd-Normal forms or have at least been exposed to the general concept. But other kids want to learn mobile app dev or frontend or whatever new hotness.
Finally a lot of uni work is filtering. Must be smart enough to learn up to this level even if you never use diffeqs again. So the kids you're interviewing don't know anything useful, but they're smart enough to learn, and it'll turn out OK, probably.
There was some depth but it wasn’t geared towards praxis. It was much more geared towards “base knowledge”, and the curriculum looks much the same today. SICP is still popular.
I happened to take a networking class where we learned sockets, but most people didn’t.
The core upper division CS classes were: OS, algorithms, compilers. Then people may take DB, graphics, networking, more algorithms, UX, “software engineering”, or maybe some HW classes.
Another thing to consider is where you’re recruiting from. Berkeley/Stanford/MIT will be somewhat similar but San Jose St or other state school will be much more focused on teaching python, SQL, C++, etc.
This all is to say: as a matter of curriculum design maybe we can train future software engineers better, but when presented with an individual candidate consider not only what the candidate knows now but also that person's ability to learn.
Even if it doesn't sink in while reading - and it probably won't - it will make the lecture make so much more sense. You'll learn much faster than you would without the preparation.
In my view, those classes and the education in general have exactly the same role with respect to your first job. They won't sink in fully during those four months, but they will provide context for the experiences in your first job and allow you to gain experience much faster.
In other words, view that new graduate as somebody who hasn't covered the material in class yet, but has done some reading to prepare.
These are not required courses in the CS curriculum of most schools in the US. Elective yes, but at some smaller schools, these are not even options.
Have you checked out the degree map of some of the schools? (Even the most well known ones) And you will see it.
1) Different colleges have wildly different expectations and learning material. I went to a very small school with a high-quality CS education. I was amazed by my friends who went to bigger schools; I'd often have to study 3x more than them for the same grade, and their course material was basically constantly one semester lagging.
2) You're interviewing juniors. Major concepts in CS really started to "click" for me and my cohort towards the end of junior year. I think this is about when you have enough exposure to really start to grok big fundamental topics.
3) A lot of students do not learn the fundamentals and just hobble along to get the degree. I couldn't believe some of the shit my friends didn't know after getting the same degree as I did. Stuff like they still didn't understand pointers, couldn't explain TCP vs UDP, etc.
But they're not upset when some interviewer tells them to do Leetcode whiteboard performance art, for pieces they've hopefully memorized but will pretend to be approaching for the first time, as a hazing ritual, and the interviewer, briefly feeling in a position of power normally denied them, says, "I want to see how you think", as if the interviewer can actually discern that.
People are accustomed to having their ducks lined up for them, and they just have to do the things they were told, and then they get the big paycheck, and they get to be the one hazing the new pledges. You're not playing along with the system.
And your list isn't at all what I expected it to be, all pretty practical 'application' stuff. I think that's largely the 'problem'.
> Is this typical for CS undergraduate degrees because you get to pick your own classes?
I don't think it makes much difference who chose them (student or programme director) - they'll be different at different institutions (or among students at the same institution) and not all of them will line up with your own education or opinion of what it should be.
If you absolutely need those things, then make it clearer in the job description. If you don't, then why not ask what they have studied (or most enjoyed, or whatever) and ask questions about those areas?
I would argue those are specialized areas, not base knowledge. Moreover, what kind of questions are you asking? It is more likely that you have a misaligned assessment of an undergraduate's knowledge.
That's with an undergraduate degree. Some programs don't cover these things, or they are optional.
Personally, I definitely lacked knowledge of a few things when I graduated from undergrad (2008)
* Source Control. This wasn't as commonplace back in 2008.
* Linear Algebra. This wasn't a required class, but I took it after I graduated and proved to be invaluable.
* Concurrency. I ended up learning this myself, since this was only very lightly touched on.
No one is going to be familiar with everything when they're fresh out of undergrad. But usually you have at least some specialty (mine was 3D graphics at the time).
Some of the best I've had were from math, physics, philosophy, EE, & drama.
Not only do students come into university (and sometimes even into CS) not knowing what a file system is, many of them have a total lack of interest in learning what is perceived by them to be pointless.
I'd argue it is going to be pretty difficult to engage with any of those foundational topics if you aren't willing to engage with the basic metaphor of most operating systems, files and directories.
None of the courses you listed I would expect of all CS students in an undergraduate degree, and quite frankly, databases is something I would explicitly expect few CS students to have taken (the only branch I'd expect to be less popular to take would be specialization into numerical modelling, although that's more because I expect the people taking such courses to be science majors and not CS majors).
Honest question here: what's the location and comp like?
Keep in mind some students will end up with 3 internships during their undergrads, and many will end up interning twice at the same place. Why should they jump ship to your company?
I recall a story someone told me a while ago. A software business that paid local CoL/prevailing wages hired an intern one summer who ran circles around the other, more senior devs. Needless to say, they loved him, and the next summer they tried to get him back, even offering a signing bonus for an internship (something they considered unheard of), but he was already at a large search engine company down in the Bay. You can guess the comp was probably already 3x what his previous job was offering. Of course, he wouldn't return.
There's a whole class of engineers who are completely invisible to most companies, even if they are in the same "local market" [0][1] (some use the term "dark matter devs", but I know it has another meaning [2]). These folks tend to fly under the radar quite a bit. If you are a tier-2 market or company, your chances of attracting one are close to nil. Because they are extremely valuable, they don't interview a lot and tend to hop between companies where they know people (or get fast-tracked internally).
FAANG companies have internship pipelines, with bonus for returning interns. These guys are off the market years before they even graduate.
[0] https://blog.pragmaticengineer.com/software-engineering-sala...
[1] http://danluu.com/bimodal-compensation/
[2] https://www.hanselman.com/blog/dark-matter-developers-the-un...
Computer Science is about, you know, science. Not craft or engineering. A typical programmer's work is craft.
You would not hire a metallurgist, trained to develop new kinds of steel, to work in a machine shop.
Yes, big corporations can have research departments where computer scientists are needed; every big player has one. But most programming work is not that.
It is a problem in our craft: requiring a university degree for the simplest positions. It is wrong. You try to hire people who are overqualified in some areas (which are not needed for these positions anyway!) and underqualified in the needed skills, because they were trained for OTHER work!
Not by choice though.
Vast majority of people who get CS degrees don't want to be computer scientists. They want to do programming work, so they get a CS degree because that's the degree employers require. They don't even know that said degree won't focus on what they actually want to learn.
And it's not like they have better choices. There are no programming trade schools. Bootcamps seem to have the right idea in principle, but implementations are often questionable, and most employers won't consider bootcamp graduates.
Yep, I agree with that.
But I think the way to go is to build trade schools and relax employment requirements, not change the CS curriculum.
Leave craft to craftsmen, engineering to engineers and science to scientists.
BTW, Electrical Engineering has always been an "Engineering" degree, not a "Science" degree, even when it is studied at universities.
What you really get out of a CS education is not a lot of crystallized knowledge but an awareness of the shape of the literature. I used to joke that I could get through any interview with "look it up in the hashtable" and "look it up in the literature". If you're aware that something exists in the literature and know how to look it up, you can get it done. For instance, I wouldn't trust anybody (including myself or ChatGPT) to code up a binary search correctly from memory, but I would look it up in a (recent) algorithms book.
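For what it's worth, here is a sketch of the classic iterative version; in the spirit of the comment above, even this should be double-checked against a reference, since off-by-one errors are the traditional trap:

```python
# Hedged sketch of iterative binary search over a sorted list.
def binary_search(xs, target):
    """Return an index of target in sorted list xs, or -1 if absent."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2   # Python ints don't overflow, unlike C
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
```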
What's important is how they think and how much knowledge they have. No one cares if you use the terminal or you're still using windows for programming.
A lot of the coursework is around mathematics, data, machine learning, and compilers, and that is something that got them excited too.
A take home test really works well in this setup as they can research on these concepts and try to solve it if they are really motivated. However this does not scale.
We test them on how fast they can learn things, how motivated a candidate is, and how driven and ambitious the candidate is. If the grad is really good, these concepts can be picked up pretty fast from their peers.
Likewise, I could have taken a database class but I wasn't interested.
I did take the programming class, though. Theorems, proofs, pen, paper.
Obviously, my university's CS department came out of the math tradition of CS, not the engineering tradition. That doesn't mean I didn't take a hardware design class, though.
A comment I read somewhere was that SQL goes from basic to advanced with nothing in between. It's a slippery slope, of course: you start writing your own query, then you figure a more sophisticated query would be helpful, and pretty soon you're having to care about optimisation and indexes.
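A small illustration of that slope (my own made-up table and column names, using SQLite via Python): the same query that is fine on a toy table becomes a full scan as the data grows, until you add an index.

```python
# Hedged sketch: watch the query plan flip from a table scan to an index
# search once an index exists. Schema and data are invented for the example.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
db.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.1) for i in range(100_000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"
print(db.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())  # full table scan

db.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(db.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())  # search using the index
```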
https://en.m.wikipedia.org/wiki/Pedagogy_of_the_Oppressed
https://en.m.wikipedia.org/wiki/Small_Is_Beautiful
People forget the memorization after taking the final exam, and it takes some dedicated interest to take that many similar courses in a row.
So basically it's the old problem of juniors that in order to have a chance to get experience they need to have experience. If I was in a position of hiring graduates, I would focus on how well they master the true fundamentals, and let them pick up the practical details in the job.
There are also a decent number of electives, though; I have strong networking and IP knowledge because I picked networking courses in my final year.
In a lot of cases those areas are also covered solely by electives, so unless the student was lucky enough to take those classes, or they happen to spend their free time reading about these topics, they won't know them.
Heck, even a decent-sized side-project would quickly expose one to DB intricacies, computing tasks in parallel, or working across networks of computers.
I don't see how a more stringent CS curriculums would help here - the market is often ahead of academia in this respect (computing).
But generally, since moving to Europe, I see that a lot of new grads don’t have that foundational knowledge as you mention. I’ve even had other colleagues complain that university recruiting was terrible because people had zero idea about things like operating systems. My feeling is that unis started targeting skills that were required in the job market instead of foundational knowledge.
In contrast, my uni was very “systems” focused and didn’t really focus on skills employers were looking for at the time.
I'd expect that if a CS student has a passion for what they do they will try to learn topics beyond the curriculum. Doing the bare minimum has a name: mediocrity. Maybe that is what you are seeing.
Other comments mention that students come unprepared and thus introductory courses need to be more/longer. I think that too is true.
There's a bell curve to aptitude and drive to learn. The curriculum at any university reflects the intersection of that median in the student body with a social narrative.
OP – what do you think this dynamic you have noticed would have to do with the folks picking their classes? Could it be more related to, say, your notion of what constitutes topical knowledge, or indeed, how to measure it?
What are you interviewing for? Examples of your dialogue might help but the power of reasoning from first principles, ELI5 comprehension, window management continues to increase over time while the value of mastering specific content remains very context dependent.
There is usually a gap of 1-3 years for a university curriculum update to be approved. Some of the above topics might be just one course, or one chapter of one course, or a few pages of a chapter.
As you know, in that time things can completely change in both established and new areas of knowledge.
Learning how to learn is the most important thing for a CS grad to learn for themselves.
Comp sci students don't typically deal with a lot of real data, or the requirements around data that shape how it's structured in the real world. They definitely do not deal much with scaling DBs, or concurrent access, or anything you might find in the typical distributed systems that exist IRL.
> Concurrent Programming
Same for the same reasons above. They usually just have to do some simple things with simple systems on a local level.
> Network Programming
Same again.
Frankly you can't have it all.
What I see successful companies do is invest in uni programs so they have a stake in what the students learn. Take, for example, the UT Inventors Program.
This has been a problem for a while. Hopefully this provides some extra context.
So in general you cannot expect the typical college senior to know any particular one of these advanced topics. The baseline is still data structures and algorithms.
Yes. For example, here's MIT's computer science and engineering degree requirements. After math and basic programming and engineering fundamentals, it's all electives.
http://catalog.mit.edu/degree-charts/computer-science-engine...
Day one, we started with the Gang of Four, the 23 classic design patterns, and what MVC is. Then we're moving on to SOLID, and I'll push him into mastering Dependency Injection and unit testing.
I feel like his college professor or something lol, and I’m still so surprised that programmers aren’t taught the ABC’s in school.
When I am teaching a newbie how to build a modern frontend using React, and Redux for state management, how would I describe what Redux is at the core without mentioning a state pattern or one way data binding?
SOLID is a fundamental principle in TDD and AGILE development lol. Again, how can one master Jest frontend testing without properly understanding Dependency Injection first?
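To make the Redux point concrete, here is a deliberately language-agnostic sketch (written in Python purely for illustration; the names are mine, not Redux's actual API) of the core idea: a pure reducer mapping (state, action) to a new state, with a store that owns the state and pushes updates out in one direction.

```python
# Hedged sketch of the reducer / one-way-data-flow pattern at Redux's core.
def counter_reducer(state, action):
    """Pure function: same state + action always yields the same new state."""
    if action["type"] == "increment":
        return {**state, "count": state["count"] + 1}
    if action["type"] == "decrement":
        return {**state, "count": state["count"] - 1}
    return state

class Store:
    def __init__(self, reducer, initial_state):
        self._reducer = reducer
        self._state = initial_state
        self._subscribers = []

    def dispatch(self, action):
        # State changes flow one way: action -> reducer -> new state -> views.
        self._state = self._reducer(self._state, action)
        for callback in self._subscribers:
            callback(self._state)

    def subscribe(self, callback):
        self._subscribers.append(callback)

store = Store(counter_reducer, {"count": 0})
store.subscribe(lambda s: print("state is now", s))
store.dispatch({"type": "increment"})   # state is now {'count': 1}
```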
Rather, none of those things are in fact foundational to the field of "Computer Science" -- it's not a programming/software engineering program or apprenticeship. For better or worse (and you clearly think worse!).
OP, I'm guessing you do not have a CS degree?
Most developers probably do not need data structures, OS, compiler type courses, but instead would benefit from higher level, engineering type courses that reflect modern application design and development.
Note: I am 3 decades out of school, and there have been some moves in this direction I can see from interviewing juniors. But not enough.
I encountered the same lack of foundational computer science knowledge, but -here 's the twist- mostly from American students.
The best candidates I ever interviewed came from (in no particular order) : the Technical University of Munich, 42/Epita/Epitech, and the École Polytechnique Fédérale de Lausanne.
Needless to say I didn't get that job. I still do poorly at interviews, but I have gotten a little better.
Don't hire people based off programming riddles.
Don't hire people because they know an SQL join - hire people who can come up to speed with SQL in a reasonable amount of time.
Industry has a responsibility to teach good coding practices. That's just the way it is.
primarily, it is your expectation that advanced low-level knowledge of these topics is "fundamental" to being a modern programmer. today, you can get along just fine without knowing them.
additionally, most cs graduates are simply taking the class as their major, and it is not their great underlying passion or hobby. where you or i spend our spare time reading about how chess.com balances load or the changelog of the new postgres point release, most students will simply be doing something else.
i took a few introductory cs classes at my university. we were taught to use IDEs and were only given a cursory explanation of the shell. i would consider proficient use of a shell to be foundational knowledge, but in an era where all your files exist within a gui and the only shell you ever need to touch pops up at the bottom of vscode, it can be largely avoided.
most students don't feel the need to go beyond the bare minimum to graduate, so they don't.
probably. although these were all topics that i taught myself through books and practice anyhow.
there are great resources for learning these topics these days, many of which are even free.
maybe try asking them how they focused their studies and try to suss out if they're willing to fill any gaps they may have.
Also, it is a very good example of Goodhart's Law: https://en.m.wikipedia.org/wiki/Goodhart%27s_law
Doing a post final semester internship sounds kind of weird to me, so I would suspect you are European?
There is somewhat of a divide between computer science and engineering. Computer science tends to mean all math and algorithms. Engineering is closer to what you listed.
Why would you expect interns to have deep understanding of these topics? What required courses would they have learned about these topics in?
Why don't you already know not to expect this knowledge from candidates?
We were taught SQL heavily.
Network programming like packet switching, etc was something we didn't get into until grad school though.
I was being paid to write code for over 15 years before I needed to interact with a database or know anything about a network. What you would call foundational knowledge others would call niche.
And because some people say they did CS, while in fact they did some related study. No experience = no knowledge.
It's not just people who just finished their study.
I have been thinking the same thing. I am training a CS student in computer operation fundamentals, and it appears that colleges do not teach the practicalities of computing. I can only assume it's for cost/time reasons.
I've seen postings even require you have authored OSS and you provide a repo link. I guess you end up with fewer candidates or none.
Concurrent programming, for example, is also not always taught the same way. In school I learned the theory of concurrency, like threads and mutexes, but I "really" learned it for the first time on the job.
I do not think it is reasonable at this point to expect a candidate to have basic knowledge about _all_ the areas of CS.
- You can write a basic program with loop/branch/etc. structure. Nice if you know a bit about recursion (fibo); nicer if you've done side projects of any kind;
- You are eager to learn
MIT has an extra online class teaching their CS grads the basics of software development.
Look for competency, not knowledge
The answer is obvious.
A "typical" CS student cannot care about all this and also jump hoops to land a job.
For example in my current programming job I don’t do anything involving networking or databases
Because a lot of companies have moved on to NoSQL databases and key value stores.
> * Concurrent Programming
Because many languages now use lightweight threads (fibers, coroutines, etc.) that don't involve context switching.
> * Network Programming
Because these days barely any company does anything beyond web requests and nobody implements these from scratch. No DNS lookups, no ACKs, no manual buffer writing and reading, no marshalling, etc.
The value of learning database systems is learning how to model and represent data. All entirely relevant to NoSQL and key value stores.
> Because many languages use lightweight threads (fibers, coroutines, etc.) now that don't involve context switching
Concurrent programming is everywhere. It is not just something that has to do with threads. Even a simple CRUD app with a React front-end and a DB back-end is a concurrent system. If you've ever used a Promise or async/await in JS, you're dealing with concurrency.
> barely any company does anything beyond web requests
You still need the basics and at least an understanding of what DNS is, what a firewall is, etc.
No one here is saying that graduates need to have mastered all of these different areas. But they should at least be able to recognise them when they hit problem later on in their careers. I've worked with junior web developers in the past who had no idea that they're working with a concurrent system.
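A small sketch of that last point (written with Python's asyncio for brevity, though the same hazard exists with JS promises; the bank-balance scenario is my own invented example): a check-then-act split across an await point lets two concurrent requests both pass the check, even though only one thread is involved.

```python
# Hedged sketch: a "lost update" style bug in single-threaded async code.
import asyncio

balance = 100

async def withdraw(amount):
    global balance
    if balance >= amount:        # check ...
        await asyncio.sleep(0)   # ... control yields here (think: a DB call)
        balance -= amount        # ... then act on a now-stale check
        return True
    return False

async def main():
    results = await asyncio.gather(withdraw(80), withdraw(80))
    print(results, "final balance:", balance)  # [True, True] final balance: -60

asyncio.run(main())
```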
I doubt that this situation has changed much in the intervening time; if anything, it's likely gotten significantly worse. So it stands to reason that most CS graduates actually don't care about programming, computers, computer science, or anything related, but see this as a pathway to get ahead in life via a decent-paying office job with a high probability of remote work in their lifetime. There were 3 people (including myself) out of around 450 in the CS program I attended who seemed to actually care about knowing how computers really worked, and all three of us went on to have pretty lucrative careers. I have no idea how the other 447 people ended up, because I didn't associate more than required with folks who thought college was mostly about getting blackout drunk and still getting the paper at the end of the experience.
At least for me, my observation has been that the folks who actually cared all did quite well, and these were often the folks who actually put in the effort to do internships and co-ops or contributed to open source software during their time in college. If you are concerned with the skills of intern candidates, I'd suggest either things have degraded far beyond expectations from a few decades ago, or alternatively they have strong skills in areas of CS that you aren't looking at. I know a lot more about at least one topic than anyone I meet, and I presume anyone I meet knows a lot more about at least one topic than I do. CS has become a much broader field in the time since I was in college, and it's very reasonable to believe that students now who care deeply about CS also care about different things than I did. I spent a lot of time messing with network programming and getting interested in information security, these days I imagine someone might be more interested in things like optimizing web-assembly or building cross-platform applications using react-native, and learning these more "front-end" things deeply, vs focusing on systems programming concepts. For many people, the browser basically is their operating system.
Databases didn't fit into my schedule because I was also studying philosophy and mathematics, and I focused my electives on areas where there was a lot of overlap between at least two of those subjects: graph theory, abstract algebra, formal logic, formal semantics, philosophy of language, philosophy of science, compilers, formal languages & automata, programming paradigms (comparative survey of functional, logic, and object-oriented programming), etc.
I don't think network programming and concurrent programming were even offered at my school, but I did implement networked multiplayer on a TRPG that I implemented in my OOP class, one of the few CS courses I had where we had relatively long group projects assigned. I have no idea if it was any good from a network programming perspective, but that part of our game worked reliably.
I think those topics you mention would be more central to the curriculum of a software engineering degree (which was not offered at my school).
Allocate time to teach others around you.
The foundations are things like (this list should not be read as being comprehensive):
* Formalized languages
* The implementation, all the way down to the hardware, underneath those languages and how to bridge between them (this is far too frequently omitted unless it comes up in some 1 hour credit C course - and lately we're seeing C replaced with C++ courses trying to pretend to be abstract, making them pretty pointless to teach at all)
* Algorithms
* Complexity, Big-O notation or equivalent, computability in general, and mapping underlying algorithms to different problems
* The reality of just how different an implemented language and platform is from the abstract idea of one - limitations on sizes, errors, failures, etc, all the things that complicate the lives of a theorist trying to do real work
I'm especially annoyed by the C++ classes - said language is a massive cognitive load to inflict on students, and a huge, vocational distraction from the theory and concepts a degree SHOULD be teaching. A better course spread would be some machine language, C, LISP, Smalltalk, some (modern version of a) goal oriented language like Prolog or KL/1, something with intrinsic multiprocessing support, and one that is essentially distributed, and so on. Languages that demonstrate the breadth of what a language can encompass, rather than grinding students into the bottomless pit that C++ has become.
I do agree that these all are relevant: databases (by implementing one, with attention to ACID, but with a lot of assumptions about reliability given and highlighted), concurrency (both with a language that does it intrinsically, and in one that doesn't), and network programming (at least three totally different approaches here: intrinsic to a cluster environment, intrinsic to the language, and implemented via libraries like in C). However, these ideas are not each important enough to count as core.
The point of a non-vocational, classical degree is to be able to understand the field and to be able to create tools - including new ones. The higher the degree, the more important it becomes to be able to extend the field. The objectives (in part) of a classical degree are what I've described, with the goal of producing synthesists and creators within the field of computer science.
In a vocational education, the grads will hit the ground ready to write code using existing libraries and tools, perfect to drop into some project underway and use what they've learned to tie everything together. But they'll be pretty naïve when it comes to creating those tools, and generally unaware of ideas that fit the tasks better and just need to be pulled in. Have them learn whatever language is the current fad for a couple of years, and train them in all the current things. But they'll have a harder time as the tools shift underfoot.
Some things bridge both the classic and the vocational. Source code control, for one (UTexas CS basically requires knowing git, for example), along with all the variety of tools we take for granted to share work, google, communicate, and try to make AI write our homework or job assignments. But the classes for git should be quite different in a vocational versus classical curriculum.
Basically, given what I've heard from former students at various places (and I taught for a decade myself), I see many colleges leaning towards teaching a vocational CS curriculum and pretending it's classical, and this damages the field overall. At the same time, I've seen overly theoretical degrees in CS that I think are also a problem, in a different way, if the students were led to believe they'd be able to actually create software when they were done.
The most pathological example I've seen is a professor at the University of Texas who was trying to teach his students IPC using C and Unix as the demonstration environment (essentially a classical lesson). However, since the professor's awareness of the implementation was too limited, he was using examples built on the wrong paradigm - FIFO pipes - instead of sockets. The result was that the examples only worked for processes with a shared parent process. This undercut the objective so badly that the students were missing the point of IPC, since they could have had a single process produce essentially the same results as the example, and were getting no payoff from the FIFO aspect. The professor's limited vocational grounding was producing students who were failing to understand both the practical AND the theoretical.
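For contrast, here is a minimal sketch of the socket-based version of that lesson, using a Unix domain socket. The path, message, and command-line arguments are invented for illustration. The point is that the server and client are two completely unrelated processes that rendezvous through a filesystem path, so no shared parent is needed and the IPC is real.

```c
/* Run as "./ipc server" in one shell and "./ipc client" in another. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <sys/un.h>

#define SOCK_PATH "/tmp/ipc_demo.sock"   /* illustrative rendezvous path */

int main(int argc, char **argv) {
    struct sockaddr_un addr = {0};
    addr.sun_family = AF_UNIX;
    strncpy(addr.sun_path, SOCK_PATH, sizeof(addr.sun_path) - 1);

    int fd = socket(AF_UNIX, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    if (argc > 1 && strcmp(argv[1], "server") == 0) {
        unlink(SOCK_PATH);                      /* remove any stale socket file */
        if (bind(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0 ||
            listen(fd, 1) < 0) { perror("bind/listen"); return 1; }
        int conn = accept(fd, NULL, NULL);      /* blocks until a client connects */
        char buf[128];
        ssize_t n = read(conn, buf, sizeof(buf) - 1);
        if (n > 0) { buf[n] = '\0'; printf("server got: %s\n", buf); }
        close(conn);
    } else {
        if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            perror("connect"); return 1;
        }
        const char *msg = "hello from an unrelated process";
        write(fd, msg, strlen(msg));
    }
    close(fd);
    return 0;
}
```

Swapping the address family and sockaddr type for AF_INET turns the same structure into network programming between machines, which is exactly the payoff the FIFO version of the lesson never delivered.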
So it's not just a case of a poor curriculum poorly serving the students; the professors themselves are suffering from the problem of being too polarized between theory and practice. The problem needs to be viewed as endemic in some colleges, and I'm not sure anyone's really talking about it enough.
The three bodies of knowledge you list (DBs, concurrent and network programming) are not mainstream at all. Someone can go through an entire career in software engineering and never touch any of these areas except, perhaps, superficially through libraries, etc.
Is it, then, fair to judge recent CS grads based on a test of these skills?
Again, while I agree with the general sentiment that the average CS graduate has serious gaps in knowledge, skills and experience, I have to temper my thinking.
I am not sure it is fair to use such skills tests as a metric, any more than giving someone 60 minutes to complete 60 calculus problems is a measure of their understanding of the topic, their ability to apply it, or their capacity to learn what they don't know.
That last part, to me, is the most important thing I try to learn about someone in an interview. I could not care less what they know today. The basics have to be there, of course. Past that, I need to understand how their mind works, if they are adaptable and what their approach to learning and applying something they don't know looks like.
Some examples of the stunts I will pull:
- Implement something in LISP, Forth or APL, knowing they don't know the languages. I want to see how they react and solve that problem. No, not the code challenge, the matter of being asked to do something they don't have a clue how to approach.
- Write an FPGA module in VHDL when they have never used anything other than Verilog.
- Design a multi-failure tolerant circuit when they have never done such a thing.
- Explain how to design an electrical DDR4 interface (again, knowing they have never done it).
- Expanding on that, explain how to design an SDRAM memory controller from scratch.
Etc.
This isn't at all about looking for the correct or perfect answers. In my 40 years in engineering I can probably say 75% to 90% of what I have worked on has had an element of "How the hell do I do this?". You want people who are able to deal with that, adapt, and deliver. Engineering is constantly evolving; what someone learned in school, at some point, becomes irrelevant.
When I went to engineering school, FPGAs, the internet, and digital circuits running in the GHz range did not exist. I had to learn all of that, and more, as life and career progressed.
I think the right paradigm and metric is to evaluate the person rather than the contents of the mental database they happen to have stored at that point in their journey.
It's like asking why someone with a degree in theoretical physics cannot fix my car.
The real question is why so many people with CS degrees don't know what CS is really about.