I was a student at Northeastern (where Matthias Felleisen was a professor) from 2016-2020, so I have first-hand experience with exactly this system of teaching.
The combination of the "program design" process and the simplicity of the teaching language (student Racket) made the introductory courses at Northeastern excellent. I found that students who already had lots of programming experience hated the initial courses, but those of us with limited experience really excelled. For me, it really validates that Dijkstra quote about Basic programmers being psychologically injured.
The second introductory course used Java, but it covered largely the same topics as the first, Racket-based course, just using Java constructs. It was a much gentler introduction to the extra complexity of "real" programming languages, but the program design process was identical.
As I understand it, Northeastern is unique in its CS pedagogy, and there's only 1 other school I know of (WPI) that uses Racket as its teaching language. I will always be grateful for my time there.
A sad development is that the current administration is attempting to strangle the curriculum Felleisen et al. have developed over the last 2+ decades in favor of returning to the "old way" of teaching he criticizes in this essay. Their motivation is largely (though not exclusively; ideological elements also enter into the picture) a consequence of Northeastern recently snapping up various bankrupt colleges worldwide and wanting to homogenize the curriculum across these new satellite campuses. Sadly, this means homogenizing down. Apparently, training faculty in this curriculum is too much for them.
This is so sad. I got an incredible CS education at Northeastern. I’m very successful in my career, and for someone who didn’t know how to program before college, I found that the CS curriculum pioneered by Felleisen prepared me far better than graduates of other colleges. The curriculum was tough and I spent many nights banging my head against homework assignments. But, everything eventually “clicked” and I graduated feeling confident, empowered, and humbled.
Not a single CS major in my graduating class got a 4.0, and I refer to this with honor and respect. The curriculum taught us how to think, how to problem-solve, and how to design programs. It felt like the curriculum was created to foster _understanding_, not to crank out high GPAs.
I’m so disappointed that the Northeastern admin is trying to force such an excellent CS program into something more “accessible,” à la a boot camp. That’s not a knock against boot camps, which should be a low-cost way for people to get their foot in the door of this amazing profession! But for a 4-5 year university costing $60k per year, I would expect to be challenged, learn theory, become versed in things I’ll never use on the job, and come out a well-rounded SE.
Felleisen may be a bit cantankerous, but he sure as hell knows how to approach CS education, and I can’t thank him enough for the opportunity I had to learn via his approach.
I've been advocating the use of a LISP on the feedback committee of a local CS school I'm part of... Some students start the course with quite strong JS/Java/C#/Python skills, and some have zero exposure to programming.
A LISP would in most cases:
* level the playing field for all pupils
* focus on learning the concepts over learning the language (I argue LISPs are almost syntax-free)
My initial thought is that's a great idea. But then I start to think about how college classes are supposed to build on what you already know. Your math department doesn't begin with addition, the English department doesn't start with picture books.
Perhaps the real issue is forcing everyone with experience to start over at the beginning.
I've been an assistant to a professor teaching introductory programming at a university. And we chose ML (later Haskell), as the first programming language, exactly because of this reason. Weaker students with no programming experience can build their knowledge on top of their mathematical knowledge from school. Whereas stronger students, with lots of programming experience, were challenged to reconsider their assumptions. Both groups did learn significantly.
> Your math department doesn't begin with addition
Well, ... actually, ... "Mathematik für Informatiker I" (mathematics for computer scientist I) did start with groups, then abelian groups, i.e. addition.
If you wanted native English speakers and second-language English speakers to be on a level playing field in a literature class, maybe you could teach the class using entirely Esperanto or Lojban translations of the works you are studying.
The language one speaks has nothing to do with literature class*, as the point is to teach reading comprehension, critical thinking, writing, and whatnot. The exposure of great works before college helps build a firm foundation on which to read and dissect more complex works.
* Obviously the works need to be readable in a language one knows. But it's not like the essence of literature classes change whether one speaks English or German or whatever. That's not the point.
I attended the University of Delaware around the same time, where the CS honors program also started with Racket.
As someone self-taught with experience in imperative languages like Obj-C, Java, and Haxe, most intro courses would have been redundant.
Racket’s functional approach, however, required a significant mindset shift. It deepened my understanding of core programming principles and fundamentally changed how I approach problem-solving.
UConn had a Racket programming course for maybe a decade, up until last year. Enough people complained that it was too hard and a weed-out course that the administration dropped it. Yet another blunder by the CSE department.
> As I understand it, Northeastern is unique in its CS pedagogy, and there's only 1 other school I know of (WPI) that uses Racket as its teaching language.
Not Racket, but Indiana University uses Scheme. Dan Friedman is a professor there and teaches a great 300-level programming languages class (in Scheme ofc)
In-state tuition to that school (and Purdue for that matter) is one of the few reasons I'd advocate for living in Indiana after growing up there haha.
"In 2016, there should not be many undergraduates that are familiar with the version of Basic that Dijkstra was referring to when he made this quote in 1975"
Dijkstra was talking about Dartmouth Basic in 1975:
- Variables: Single letter, optional digit.
- Control flow: FOR loops, GOTO for others.
- Subroutines: GOSUB line, RETURN.
- Parameters: Passed via global variables.
- Functions: 26 (FNA–FNZ), one line each.
- IF statements: One line only.
It's much worse than assembly. On all but the shittiest machines, you can store code pointers in RAM and registers, and in a subroutine call, the return address is stored somewhere you can read and change it (whether on a stack, in a register, or before the first instruction of the called subroutine). This allows you to implement object-oriented programming, switch/case, backtracking, and multithreading in assembly. You can't do that in BASIC.
Also, since the early 01960s, all but the most primitive assemblers have macros and named labels. One result is that they have an unlimited number of named functions with line number independence, as marcosdumay said. Many of them have a sufficiently powerful macro system to implement nested control structures like while loops and multiline if. Since the 01970s they've also had local labels. BASIC doesn't have any of that.
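The code-pointer technique described above can be sketched in a high-level analogy (Python here, purely illustrative; in assembly this would be a table of addresses jumped to indirectly). Storing "code pointers" in a data structure gives you switch/case-style dispatch, which Dartmouth BASIC had no way to express, since a GOSUB target could not be stored or computed:

```python
# Illustrative analogy: a dispatch table of "code pointers".
# In assembly this is a jump table of addresses; in Dartmouth BASIC
# there was no equivalent, because GOSUB targets were fixed line numbers.
def add(a, b): return a + b
def sub(a, b): return a - b
def mul(a, b): return a * b

dispatch = {"+": add, "-": sub, "*": mul}

def calc(op, a, b):
    # Indirect "jump" through the table: the code to run is data.
    return dispatch[op](a, b)
```

The same indirection underlies object-oriented method dispatch and the other techniques mentioned above.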
Modern assembly gives you named functions, line-number independence, an unlimited number of functions, places for carrying a value over RET... BASIC had none of those.
> For me, it really validates that Dijkstra quote about Basic programmers being psychologically injured.
> I was a student at Northeastern (where Matthias Felleisen was a professor) from 2016-2020, so I have first-hand experience with exactly this system of teaching.
This may be so; however, you likely don't have first-hand experience with the early, unstructured versions of BASIC to which Dijkstra was referring in his quote. Those versions lacked structured control flow beyond FOR and GOTO (no while loops, no multi-line if-then-else). Later versions of BASIC evolved to support modularity, OOP, local variables, and everything else. Dijkstra tended toward hyperbole and exaggeration, IMHO.
Ha, Ha! Teamwork is vastly overrated in the industry. Almost everything achieved by mankind is because one person put together a lot of knowledge in his own head and came up with insights. Even when they worked in teams, each was an individual and did his own thinking.
Today "teamwork" has come to mean playing politics, jockeying for influence, taking credit for other people's ideas, and so on.
When was the last time anybody cared what HR had to say? I've never encountered an HR department whose primary role wasn't to indoctrinate or to create roadblocks for everybody else.
Hyperbole: exaggerated statements or claims not meant to be taken literally.
Do you really think these statements were meant to be taken literally?
"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
Or this?
"the use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense,"
I think that saying that Dijkstra was capable of making hyperbolic statements is rather kind in this case ;-)
All languages have hyperbole, which lets one cut through noise and convey insight via forceful phrases. That is its proper use; only when it is used for mere egoistic reasons is it frowned upon.
Dijkstra was instrumental in inventing structured programming when programming was literally anything-goes spaghetti. This was the main reason his famous GOTO paper was such a hit. Given this background you can understand his comment about BASIC. This comment https://news.ycombinator.com/item?id=42461921 explains how bad Dartmouth BASIC was. Looked at in this light, you can see how a person trying to establish structured programming might feel about that flavour of BASIC, and his quote nicely captures that.
Dijkstra was also all about mathematical notation, precision, and rigour. In fact, the article I had linked to mentions that he was unhappy with all languages and hence taught his class at UT Austin using a language of his own design and notation. Now realize that COBOL at that time had all the unstructured faults of BASIC listed above, plus an even worse handicap in that it used structured English rather than mathematical notation! To somebody who was all about mathematics this would be sheer torture, and thus his quote.
Quote from previously linked article;
Dijkstra argued that computer programming should be taught in a radically different way. His proposal was to teach some of the elements of mathematical logic, select a small but powerful programming language, and then concentrate on the task of constructing provably correct computer programs. In his view, programs should be considered the same way as formulas, while programming should be taught as a branch of mathematics.
One should always think when reading an acknowledged great's quotes/phrases. This is not to say that they can never be wrong but this probability is generally quite low.
In a rare moment of self-awareness I realise that I'm arguing on the internet.
The only point I can make is that I'm tending towards idiocy ;-)
Let us just agree with the facts: (a) I first learned to program in BASIC at 10 years old, (b) I am indeed psychologically damaged by the stigma promoted by Dijkstra and (c) it's true that I struggled with pointers in C in first year university. So hey, he was probably right - both Dijkstra and the OP. Honest enough for ya? ;-)
p.s. I lost two internet points trying to be Captain Defend-the-old-BASIC-programmers. Lessons learned.
I don’t really enjoy pair programming. I like pair “thinking” (if that’s a correct term). I like to think about a problem and the design space with others… but writing actual code with others doesn’t really appeal to me. It’s like two people painting parts of the same picture on the same canvas; it’s not gonna look pretty.
Pair programming can be great when applied selectively, as needed for short bursts on specific problems.
Pair programming when enforced full-time as a way of having two developers work together is completely exhausting, unpopular with most developers, and slower than having everyone work on their own problems. There is a certain personality who really likes full-time pair programming because they are very social and like coworking problems, but most people dislike it.
>> slower than having everyone work on their own problems
That’s fascinating; I recognise every bit of everything else you said, but in my experience this was the golden redeeming quality: velocity of correct code is massively (maybe 4x) sped up.
Maybe the difference is overall: pairing becomes too tiring, so you might be extremely productive, but the short periods of it mean the slow-and-steady solo approach wins out because you can do it for longer.
Pair programming done on a task such as "let's make this completely standard thing we've done five times already and need a slight variation of" is horrible. Working with new developers, whether they're senior or junior, is great though. It doesn't have to be full days, but a few hours a day of pair programming brings someone up to speed incredibly quickly.
When there are hard problems to track down, I think all developers turn to pair debugging, if only for the rubber-duck effect.
Ha, humorously, I like pair debugging even less than pair programming. I need to think deeply when debugging a thorny issue, and I can't do that while interspersing communication.
I quite like "pair debugging". As someone that doesn't really like using a debugger, it's good to have someone that does use it to go through code (possibly not mine) with me commenting or suggesting problems. Probably irritates the heck out of the co-debugger though!
As a person who really enjoys debugging including (but not limited to) using a debugger: Yep, I'd rather do this myself than have to keep a mental channel open to listen to your comments and suggestions.
With more junior devs, I usually solve a problem together with them, but I'm in the driver's seat. The best opportunity is when they come to you with a question. Just dive right into the code with them (this also shows them that it's "safe" to come to you with questions). It's so easy now with Slack and Zoom to jump into an ad-hoc session.
I'll talk out loud to myself and verbalize my thought process.
The objective is to show them tools, techniques, and approaches that they would otherwise not pick up. I'm not literally coding with them; I'm doing the coding and explaining my inner monologue for a given problem.
Tooling in particular can be hard to pick up. Even little things like the JavaScript debug console in VS Code can be a productivity changer.
For me, this has been very successful and I have helped more junior devs accelerate their careers. A lot of it is honestly selfish; the more I teach them to work and think like me, the more of my work I can hand off to them. In exchange, I get to work on more interesting things -- win-win
IMO, the more senior counterpart benefits less from pair programming and therefore enjoys it less. However, it's still the fastest way to get someone familiarized with a concept/project when done correctly. It might be really valuable in a CS curriculum.
That doesn't sound like any pair programming I've ever done. There's always one person who writes the actual code and the other is there to think along, spot any bugs and discuss any better ways to do what we're doing (covering the rubber duck angle too). And then swap when the other one has a better idea on how to do something.
People always analogize this, but I've found "pair programming" with LLMs to be extremely underwhelming. I've tried it with Aider+Claude, Copilot Chat, 4o and o1, and the experience is always the same: I spend most of the time correcting misunderstandings before eventually giving up and doing it myself.
To this day I haven't found an LLM application that works better than regular autocomplete Copilot. It's sufficient to get over the blank canvas problem in a lot of cases without being overly ambitious and biting off more than the LLM can chew.
I've yet to try Cursor, but that's because I don't have a lot of hope: I first heard the same kinds of great things about Aider and Claude and o1, and all of those have disappointed. It's hard to want to switch editors to see if this time is for real.
I used Copilot and dropped it when it stopped being free. But Cursor is a different beast - the autocomplete is far better and intuitive. The chat functions like a personal StackOverflow (or just paste an error message for debugging).
For me as a senior eng, Cursor is where AI turned the corner from "maybe helpful for fringe / basic things" to "actually amplifies my productivity". Took about 30 min to flip the switch for me, so I suggest you give it a try.
The trouble is that I heard all of this—personal stack overflow, error messages—already with Claude, and I'm skeptical that anything Cursor can do on top of that will be worth losing the productivity of the JetBrains IDEs.
I imagine it's easier to switch for someone who's already making do with VS Code, because at that point the main hurdle is just being willing to pay for an editor. I already pay for an editor (a bundle of them), but "VS Code with better AI integration" just... doesn't appeal, especially when the word-of-mouth recommendations say all the same things that I've already heard about other tools that didn't work for me.
I had the perfect use-case for LLM-assisted coding a few days ago: I had a well-defined function to implement, for which dynamic programming was a great fit. I haven't used DP in years but it's a well-trodden path, why not throw the problem at the LLM and let it implement it?
Well, despite careful prompting, it kept getting it wrong. At some point it generated some credible code, but I spent the day finding problems, nudging it to fix them, asking for explanations for stuff that looked (and was) incorrect... The resulting code was bug riddled and horrible (LLMs tend to fix problems by adding more code rather than rearchitecting the existing code to eliminate edge-cases). I ended up spending a whole lot more time: I had to spend ages carefully nudging the LLM to no avail, understand LLM-generated garbage code _and_ still solve the problem myself.
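For what it's worth, the "well-trodden" kind of dynamic programming being described is compact to write by hand. Since the original problem isn't specified, here is a minimal illustrative sketch using a classic DP problem (longest common subsequence, top-down with memoization):

```python
from functools import lru_cache

def lcs_length(a: str, b: str) -> int:
    """Length of the longest common subsequence of a and b, via top-down DP."""
    @lru_cache(maxsize=None)  # memoize subproblem results: O(len(a)*len(b)) states
    def go(i: int, j: int) -> int:
        if i == len(a) or j == len(b):
            return 0  # one string exhausted: empty common subsequence
        if a[i] == b[j]:
            return 1 + go(i + 1, j + 1)  # matching heads extend the subsequence
        return max(go(i + 1, j), go(i, j + 1))  # skip one head or the other
    return go(0, 0)
```

The whole technique is the recurrence plus the cache; when an LLM keeps "fixing" such code by appending special cases, that structure is usually what got lost.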
I was a CS student for 2 semesters at Northeastern before dropping out thanks to a job offer. No prior coding experience.
I think that the curriculum design and principles that guide the NEU CS education are fantastic. I’ve been fortunate (or unfortunate, depending on your perspective) to quickly find myself in a mentorship position at work, and there have been a number of times where I realize that the boot camp hire just isn’t thinking the way I do _at all_. The first things drilled into my head were function signatures and manipulating data structures (by implementing a subset of the Ruby Enumerable module in Racket). This has made problem solving by manipulating data structures (a decently common part of the job, especially at first!) genuinely trivial. Things more or less immediately translate to a map, filter, andmap, ormap, or reduce when trying to get data from its input to its output for whatever unit of work I’m trying to do.
Other developers on my team though experience each new technique/thing as a new or different thing, which to me seems far more overwhelming. I think most developers naturally develop the intuition, but being told upfront “everything is just these 5 or a combination of them lol” implicitly by the work we were doing was something I’m grateful for.
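That "everything is one of these five" mental model can be illustrated in Python (hypothetical data; `functools.reduce` stands in for Racket's fold, and `all`/`any` play the role of andmap/ormap):

```python
from functools import reduce

# Hypothetical input data for illustration.
orders = [
    {"item": "widget", "qty": 2, "price": 9.99},
    {"item": "gadget", "qty": 0, "price": 24.50},
    {"item": "gizmo",  "qty": 5, "price": 3.25},
]

# filter: keep only orders that were actually placed
placed = [o for o in orders if o["qty"] > 0]

# map: compute the line total for each remaining order
totals = [o["qty"] * o["price"] for o in placed]

# reduce: collapse the totals to a single revenue figure
revenue = reduce(lambda acc, t: acc + t, totals, 0.0)

# andmap / ormap analogues: predicates over the whole collection
all_cheap = all(t < 100 for t in totals)
any_large = any(t > 15 for t in totals)
```

Most "get data from its input to its output" tasks decompose into some pipeline of exactly these operations.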
I never enjoyed the pair programming at Northeastern. I was so behind my peers at the time, since most everyone else was like an AP CS student or had been coding since they were a child. I was busy trying to brute force the learning with 40+ hour weeks just for the CS fundamentals classes. I never found someone in my position. I was only paired with people that the intro course was trivial for or they just did not care at all haha. Most brutal part was waiting 5+ hours at office hours with a white board wait list 90+ names deep and then office hours would end and they would send us home. Life before ChatGPT was crazy.
If a university charges you $32,495/term (Northeastern) for full-time study, when in a lot of other countries people make a living feeding their families on 20,000 US dollars or less, why on earth should you have to wait 5+ hours to get help from the professor? There should be as many professors and TAs as possible to help you out!
It costs more than that to put food on your family in Boston, and most of that money goes to rent-seeking administrators rather than TAs and professors.
From the linked article : "How to Design Programs is the first text book on programming that explicitly spells out how to construct programs in a systematic manner."
You didn't explain how the "bootcamp grads" thought process differs from yours.
> Things more or less immediately translate to a map, filter, andmap, ormap, or reduce when trying to get data from its input to its output for whatever unit of work I’m trying to do
It comes across as smug. "How dare these bootcamp grads write a for loop when I am wrangling a complex reduce expression."
Even though that may not be what you meant.
Probably why people even around here find the PLT nerds obnoxious.
Whatever you described is often not the hard part of the code at $dayjob.
Also for most people database dictates the choice of data structure and algorithm.
CS is not complete without compilers, networks, DBs, OSes and computer architecture. Yet somehow PLT nerds pretend they unlocked a super power with map and reduce.
> You didn't explain how the "bootcamp grads" thought process differs from yours.
In my experience, their thought process starts with "I know framework/library X" and ends with "What library/framework solves my specific problem".
In recent years it seems like they've completely outsourced their thought process to tools like ChatGPT. However, it's been a while since I've worked with a recent college graduate, so outsourcing one's thought process may be the new normal?
I have worked with a few bootcamp grads who didn't start their thought process this way but that's something they've had to learn on their own.
Map, filter and reduce are typically simpler than for/foreach because of how intermediate variables are handled; commonly there's a scope boundary that reduces the risk of unwanted mutation or context pollution, a boundary that for/foreach/while in the same language don't have.
I have met some "bootcamp grads", and unless they've managed to learn it on their own they tend to struggle with data structures, especially transformations and reductions. Getting an intuition for data and learning to keep mental models of it is rather important to be effective in software development. HtDP is quite good at teaching this specifically, and you also pick up several algorithmic techniques that are good to have and not very discoverable in themselves to a newbie, like recursion.
Furthermore, once you've gotten fluent with scalars and flat collections you're well prepared for trees, and when you get the hang of trees you can start programming programs, i.e. metaprogramming, since in the abstract a program is a tree traversed by the execution. From there getting good at software architecture is achievable too.
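The scope-boundary point above can be made concrete in Python (a contrived sketch): the loop version leaks its accumulator and index into the enclosing scope, while the functional version keeps everything inside the expression.

```python
values = [3, -1, 4, -1, 5]

# Imperative version: `total` and `n` live in the enclosing scope
# and can be mutated from anywhere in the loop body.
total = 0
for n in values:
    if n > 0:
        total += n * n
# `n` is still bound here (to 5) after the loop -- a common source
# of accidental reuse and context pollution.

# Functional version: no intermediate variables escape the expression.
total_fn = sum(n * n for n in values if n > 0)

assert total == total_fn  # both compute 9 + 16 + 25 = 50
```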
After re-reading my comment, you certainly _could_ read it in a smug/conceited tone. And I did explain my thoughts, they approach each logic problem with more novelty than I do, not for a lack of practice or ability, but for a lack of a mental model to map heuristics to.
But I will say that my comments to them, and here, come from a place of wanting to raise all tides, so to speak. There is no smugness where there is no (or little) ego, and I think you’re projecting yourself onto my comment.
Good response. The GP is trying to be woke where it is not warranted.
> they approach each logic problem with more novelty than I do, not for a lack of practice or ability, but for a lack of a mental model to map heuristics to.
Very right; A proper "mental model" is fundamental to all types of learning.
I just hold that PLT or DSA are not end-all of CS. You learn more by studying OS, networks, compilers, DBMS, processor architecture. But somehow PLT nerds pretend knowing map and filter is a superpower. Just like Kubernetes people think they can throw distributed systems at any problem.
> I just hold that PLT or DSA are not end-all of CS.
Nobody said this; that's your preconception. The comment has nothing to do with PLT or DSA. What it was talking about was patterns used in functional programming, which most developers coming from imperative languages don't really appreciate (I was one of them). Hence, when somebody says it is useful, I try to understand them rather than dismissing it out of hand. For example, here is a recent HN submission, "Haskell vs. Ada vs. C++ vs. Awk vs ... An Experiment in Software Prototyping Productivity", which gives something to think about - https://news.ycombinator.com/item?id=42445328 See the "Lessons Learned" section here - https://news.ycombinator.com/item?id=42460631
> You learn more by studying OS, Networks, compilers, DBMS, processor architecture.
Again, this is orthogonal to what the comment was about. These are application domains/end products and not programming technique itself. They are not in conflict.
You certainly have not understood what the GP was talking about. It is about Concepts, Computation and Mental Models. The fact that you equate it with PLT just proves my point.
> CS is not complete without compilers, networks, DBs, OSes and computer architecture.
This is completely orthogonal to what the comment is talking about.
I have no experience in this style, but my experience with bootcampers and others has me wondering how much of the benefit is accidental because of their implementation of this style, rather than because of the different teaching style itself.
Specifically: I've had multiple co-workers who learned one programming language (bootcamp, self-taught, or otherwise) and were resistant to learning another one. Based on things they've said after I'd pushed them to do something in another language, I think their resistance was entirely because they remember how difficult the first programming language was and expect further languages to be similarly difficult. Instead, as they start to actually work with the second language they realize how much is conceptually similar - the things this article refers to as typically learned implicitly through experience.
But the alternate style described in the article does the same thing: moving from a sort-of pseudocode (the design process) to student Racket to Java (per another comment here) gets that same implicit learning benefit independent of a full restructuring of the curriculum.
I wouldn't be surprised if keeping the more traditional style and simply requiring different languages in different courses/years gets most of the benefits with minimal changes.
That’s an interesting comment. I had a similar experience going from Python to JavaScript, then TypeScript, but now I’m really excited to pick up new languages (C, C#).
Yep, it gets fun after about the third language. This was the best thing about my programming languages course in college. We learned a bunch of languages a bit, while comparing and contrasting, which made all of them, and all the ones I've learned since then, a lot less scary.
These are all pretty similar though. If you want to widen your horizons, perhaps try a Lisp or one of the ML family languages (e.g. Haskell). Those languages are actually different from the ones you've used.
I think I kinda agree, but there's a big caveat. I've thought about these problems a lot, and I like the idea of cooperation and team-work being how we grow and develop. It doesn't square off with my own lived experience though. I consider myself a pretty good developer. That hasn't come out of "teamwork" though. My early developer days, the days where I didn't quite know enough to actually plan out what I needed to work on, weren't dominated by helpful voices guiding me along. They were filled with antisocial freaks on the internet telling me how stupid I was for asking such a basic question. The odd thing was that it didn't repel me, it drew me in. I became one of them, I sat down and searched day and night. I spent 4-7 hours every day after school just trying to understand what this computer thing was, and how it all fit together. The early work I did to learn the technical aspects of being a developer was all fueled by antisocial hyper focus, at a great cost to many other parts of my life. It undeniably made me pretty good at the technical aspects though.
I've since leveled out a bit. With the technical stuff roughly figured out. I've since moved on to the people problems, and there it's much more about the cooperation. I couldn't meaningfully contribute to those "people problems" If I didn't have the antisocial beginnings though.
I have a hard time "developing" developers when I look back at my own lived experience. I like what and who I am, but the cost has been pretty steep. I'm not sure I can take other people down that path in good conscience.
I’m local to Boston and I’ve had great experience with all of the co-ops I’ve worked with from Northeastern. I’ve worked with them across many fields (ME, EE, and CS) and they have been almost uniformly great to work with.
They have all had some exposure to real world engineering practices in their respective disciplines’ teaching tracks.
They have to do co-ops to graduate, so you have them to work with for six months instead of ten weeks.
They are all eager to do real work and ship real stuff, and they understand quickly how to integrate into a team to do that. I don’t know if this is a selective property of Northeastern’s culture or something they instill in their students, but I really like it.
Worked with university interns before who are clearly on an industry pit stop en route to a postgraduate program and subsequent career as a corduroy elbowed academic. Thanks for playing, you’re a wonderful little creature, but we aren’t motivated to do the same things with our time. They may play the same game I do, but they’ve chosen a different character arc. They’re journeyman wizards building an arcana and looking for a tower. I’m a dwarf foreman looking to dig out the next Moria.
Love the Northeastern crowd. I’d put them as equivalent to or better than their rodent ringed counterparts from the other side of the Charles river.
This is an interesting article. I do appreciate the focus on pair programming, which is probably something that's really helped me improve as a programmer, and their process seems quite interesting.
It would be interesting to see if this method actually works, i.e. if Northeastern programmers are "more useful to their employers"/"better developers" than graduates of universities using the older approach ('tinker until it works', as the featured article puts it).
Well...back when I was working in Boston-based companies, I spent many a day at Northeastern interviewing intern candidates there. I didn't have too much trouble finding promising programmers there. So, yeah, I guess it works.
That's true of a software developer in the narrow sense. In the broader sense there are skills around collaborating with others, working sync/async, delivering with quality, communicating status, mentoring junior people, educating others about what is going on, and maintaining the systems over time.
Well, of course. But primarily, you have to be able to write code. And to do that, reading books is the primary resource (and for a lot of the other things you mention).
It's the way I roll, I guess. Everything technical with regards to computing I have learned in life, I've learned from books/manuals. That's how it's been done for most of human history. And are you suggesting that you can't get anything of value out of, e.g., "The C Programming Language"?
From a Lisp-like introduction, to OOP (in Java), then to ACL2, and finally back to OOD. It's quite interesting in its own right, but now that functional programming is more and more adopted by the mainstream, perhaps there's no need for the "practical appeal" of OOP/OOD in the mix, or does the author genuinely believe that's the way to go?
My first thought is this looks like a well-designed curriculum, and several other posters here who have studied at or hired from that university are very positive about it.
That said, it's not quite a red flag but perhaps a yellow one for me when someone trots out the "everyone else is doing it wrong" line with particular emotion-triggering words. Scott Alexander once said this was the approach of "every therapy book, ever" (https://slatestarcodex.com/2019/11/20/book-review-all-therap...)
For example, we start with the curriculum being "unique" (though they do caveat this in a link on the side), which sits aloof from what is "currently fashionable", and then (Sec 1.1) paints "the vast majority" of other courses as "traditional" (the section title) and "old-fashioned". Dismissing things as "traditional", an emotion-laden word for some to say the least, normally activates my B.S. detector, because every other startup pitch works like that. Come and invest in our innovative crypto as opposed to traditional, old-fashioned fiat currency!
Sometimes, something has become tradition because people tried it, it went well, they kept on trying it, and it kept on going well. (see also: Chesterton's fence)
I'm sure there are CS courses that could improve by following Northeastern's principles, but I'm also sure there are a lot of other colleges that turn out competent programmers who understand program design and teamwork and systematic reasoning.
Whether to start with a C-style, python style (indentation is structure), or (lisp (style)) language is a matter of taste, but I don't think I'd have got on well with the DrRacket IDE. I like to use my own editor, with my own color scheme and keybindings and regexp search/replace (where I don't need to check each time whether it's \1 or $1 to refer to a capture group), and where I can interact with git and store my code in a repo out of the box (or by opening a terminal window). Anything else feels too much like a walled garden to me.
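On the `\1` vs `$1` point: the substitution syntax really does differ between tools, which is why it has to be re-checked each time. A minimal illustration in Python, whose `re.sub` takes backslash references, with the JavaScript dollar-reference equivalent shown in a comment:

```python
import re

# Python's re.sub uses backslash references for capture groups:
swapped = re.sub(r"(\w+) (\w+)", r"\2 \1", "hello world")
print(swapped)  # world hello

# JavaScript's String.replace uses dollar references for the same job:
#   "hello world".replace(/(\w+) (\w+)/, "$2 $1")  // "world hello"
# sed, meanwhile, is back to backslashes: s/\(hello\) \(world\)/\2 \1/
```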
If I was trying to develop good developers, I'd add a 7th initial, recurring step to the vertical: Talk to your customer as often and directly as possible to ensure you are still working on the right problem.
It seems to me that a lot of wasted energy is in the form of working on problems that no one cares about. Not that this is necessarily bad (hobby, fun, art, side projects, new ideas, etc), but in a concrete business setting you need to be a bit more aggressive about making sure the customer still gives a shit about what you are working on over time.
I find all of this leads neatly into the 2nd most important thing for me, which is making sure you have a good domain model (schema). If the tables, columns and relations that represent the business are high quality and accurate (i.e., your customer understands them), the code that follows will usually go smoothly. Staying on the customer's heels regarding the applicability of your software to the business means you can keep this well aligned over time.
I think much of the tech bloat we see today is a side effect of attempting to outrun the complexity of the customer's specific needs. After sitting on the phone with vendors and customers for a week, you will likely not find yourself playing around with the idea of using some esoteric language or infra to implement things. It's incredible what being directly exposed to the customers can do for a developer's growth.
It’s completely true. Software development sits at the crossroads of art and craftsmanship, so we like to do things nicely for the sake/pleasure of it, and we can lose sight of what really matters. Most of the time, what really matters is the customer using the product, and if the product is a tool, the criteria most important to the customer have to be aligned with what we spent the most time on. Sometimes we want to make a generic case out of specific ones so that all customers can benefit, and that can lead us to higher, unnecessary levels of abstraction; sometimes, because there isn’t enough abstraction, it leads to hard-to-maintain spaghetti code that only works under specific conditions and is hard to evolve. The balance is thin and often blurry, because it’s a bet on an uncertain future that depends on how well we can predict future business needs.
> It seems to me that a lot of wasted energy is in the form of working on problems that no one cares about.
That's what classic software methodologies such as Waterfall are good at: everything must be carefully discussed with the customer during the "requirements analysis" phase.
It is much more than that; the "Waterfall Model" SDLC is very much misunderstood. In fact, the commonly used diagram for waterfall was an example of what not to do! Folks should read the following:
You know what would make us all faster? The entire team in a meeting talking about which JIRA tickets they moved yesterday and which ones they plan to move today. We should also ask the same in-depth technical questions on projects which we have already asked that developer a dozen times.
Why would that make us all faster? Also, is faster better, or better is better?
Waterfall development is the most appropriate way to develop software most of the time. CRUD apps developed by startups don't change requirements often; it's their clueless managers who change their minds as they come to understand what they should have known before starting the project.
That's a strange non sequitur. A statement against Waterfall is not a statement for Scrum (no reason to shout, I get that you don't like it but shouting its name is weird).
Please, point out non-trivial successful (delivered on time, on budget, and with all initially planned features) Waterfall projects that did not modify Waterfall into something sensible (that is, incorporated feedback loops and probably executed as a series of iterations rather than one 5-year long project with hard distinctions between each phase).
I was a student at Northeastern (where Matthias Felleisen was a professor) from 2016-2020, so I have first-hand experience with exactly this system of teaching.
The combination of the "program design" process and the simplicity of the teaching language (student Racket) made the introductory courses at Northeastern excellent. I found that students who already had lots of programming experience hated the initial courses, but those of us with limited experience really excelled. For me, it really validates that Dijkstra quote about Basic programmers being psychologically injured.
The second introductory course used Java, but it mostly covered all the same topics as the first Racket-based course, just using Java constructs. It was a much more gentle introduction to the extra complexity of "real" programming languages, but the program design process was identical.
As I understand it, Northeastern is unique in its CS pedagogy, and there's only 1 other school I know of (WPI) that uses Racket as its teaching language. I will always be grateful for my time there.
A sad development is that the current administration is attempting to strangle the curriculum Felleisen et al. have developed over the last 2+ decades in favor of returning to the "old way" of teaching he criticizes in this essay. Their motivation is in large part—though not exclusively; ideological elements also enter into the picture—a consequence of Northeastern recently snapping up various bankrupt colleges worldwide and wanting to homogenize the curriculum across these new satellite campuses. Sadly, this means homogenizing down. Apparently, training faculty in this curriculum is too much for them.
This is so sad. I got an incredible CS education at Northeastern. I’m very successful in my career, and for someone who didn’t know how to program before college, I found that the CS curriculum pioneered by Felleisen prepared me far better than graduates of other colleges. The curriculum was tough and I spent many nights banging my head against homework assignments. But, everything eventually “clicked” and I graduated feeling confident, empowered, and humbled.
Not a single CS major in my graduating class got a 4.0, and I refer to this with honor and respect. The curriculum taught us how to think, how to problem-solve, and how to design programs. It felt like the curriculum was created to foster _understanding_, not to crank out high GPAs.
I’m so disappointed that the Northeastern admin is trying to force such an excellent CS program into something more “accessible”, à la a boot camp. That’s not a knock against boot camps, which should be a low-cost way for people to get their foot in the door for this amazing profession! But, for a 4-5 year university costing $60k per year, I would expect to be challenged, learn theory, become versed in things I’ll never use on the job, and come out a well-rounded SE.
Felleisen may be a bit cantankerous, but he sure as hell knows how to approach CS education, and I can’t thank him enough for the opportunity I had to learn via his approach.
I've been advocating the use of a LISP in the feedback committee of a local CS school I'm in... Some start the course with quite strong JS/Java/C#/Python skills, and some have zero exposure to programming.
A LISP would in most cases:
* level the playing field for all pupils
* focus on learning the concepts over learning the language (I argue LISPs are almost syntax-free)
* while not delving into type systems just yet!
The level playing field is interesting.
My initial thought is that's a great idea. But then I start to think about how college classes are supposed to build on what you already know. Your math department doesn't begin with addition, the English department doesn't start with picture books.
Perhaps the real issue is forcing everyone with experience to start over at the beginning.
> Your math department doesn't begin with addition
Addition is a mandatory topic in school, so you can assume students know it when they get to university.
Anyway, one of my coworkers in the first year of university asked what 1/16 + 1/16 was, and after some tries from the students the best answer was 1/32.
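For the record, the right answer is 1/8, not 1/32: with equal denominators the numerators add. A throwaway check using Python's `fractions` module:

```python
from fractions import Fraction

# 1/16 + 1/16: same denominator, so the numerators add (2/16 = 1/8).
answer = Fraction(1, 16) + Fraction(1, 16)
print(answer)  # 1/8
```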
> The level playing field is interesting.
I've been an assistant to a professor teaching introductory programming at a university. And we chose ML (later Haskell), as the first programming language, exactly because of this reason. Weaker students with no programming experience can build their knowledge on top of their mathematical knowledge from school. Whereas stronger students, with lots of programming experience, were challenged to reconsider their assumptions. Both groups did learn significantly.
> Your math department doesn't begin with addition
Well, ... actually, ... "Mathematik für Informatiker I" (mathematics for computer scientist I) did start with groups, then abelian groups, i.e. addition.
If you wanted native English speakers and second-language English speakers to be on a level playing field in a literature class, maybe you could teach the class using entirely Esperanto or Lojban translations of the works you are studying.
The language one speaks has nothing to do with literature class*, as the point is to teach reading comprehension, critical thinking, writing, and whatnot. The exposure of great works before college helps build a firm foundation on which to read and dissect more complex works.
* Obviously the works need to be readable in a language one knows. But it's not like the essence of literature classes change whether one speaks English or German or whatever. That's not the point.
It is easier to pass a literature class when you can already read the language the literature is in.
I attended the University of Delaware around the same time, where the CS honors program also started with Racket.
As someone self-taught with experience in imperative languages like Obj-C, Java, and Haxe, most intro courses would have been redundant.
Racket’s functional approach, however, required a significant mindset shift. It deepened my understanding of core programming principles and fundamentally changed how I approach problem-solving.
UConn had a Racket programming course for maybe a decade up until last year. Enough people complained that it was too hard and a weed-out course and the administration dropped it. Yet another blunder by the CSE department.
> As I understand it, Northeastern is unique in its CS pedagogy, and there's only 1 other school I know of (WPI) that uses Racket as its teaching language.
Not Racket, but Indiana University uses Scheme. Dan Friedman is a professor there and teaches a great 300-level programming languages class (in Scheme ofc)
In-state tuition to that school (and Purdue for that matter) is one of the few reasons I'd advocate for living in Indiana after growing up there haha.
There shouldn't be a lot of people that knows the Basic Dijkstra was talking about in an undergrad course in 2016.
> There shouldn't be a lot of people that knows the Basic Dijkstra was talking about in an undergrad course in 2016.
please clarify.
few know basic in 2016?
few know Dijkstra said it in 2016?
in 2016 few knew that Dijkstra made the claim at some earlier point in time?
I don't understand what you want to say.
"In 2016, there should not be many undergraduates that are familiar with the version of Basic that Dijkstra was referring to when he made this quote in 1975"
see https://www.cs.utexas.edu/~EWD/transcriptions/EWD04xx/EWD498... for the original quote
thank you!
Dijkstra was talking about Dartmouth Basic in 1975:
In terms of control flow, that's basically assembly, just with a friendlier syntax.
It's much worse than assembly. On all but the shittiest machines, you can store code pointers in RAM and registers, and in a subroutine call, the return address is stored somewhere you can read and change it (whether on a stack, in a register, or before the first instruction of the called subroutine). This allows you to implement object-oriented programming, switch/case, backtracking, and multithreading in assembly. You can't do that in BASIC.
Also, since the early 01960s, all but the most primitive assemblers have macros and named labels. One result is that they have an unlimited number of named functions with line number independence, as marcosdumay said. Many of them have a sufficiently powerful macro system to implement nested control structures like while loops and multiline if. Since the 01970s they've also had local labels. BASIC doesn't have any of that.
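The "code pointers in RAM" argument above can be sketched with a high-level analog: storing callables in a data structure is what gives you switch/case and method dispatch, and it is exactly what a fixed-line-number GOTO/GOSUB cannot express. A Python sketch of the idea:

```python
# Dispatch through stored "code pointers": the table maps names to routines,
# so the call site never hard-codes a jump target. Assembly does this with
# raw addresses; Dartmouth BASIC, limited to GOTO/GOSUB with fixed line
# numbers, has no way to store or compute where to jump.
def on_add(a, b):
    return a + b

def on_mul(a, b):
    return a * b

dispatch = {"add": on_add, "mul": on_mul}

print(dispatch["mul"](6, 7))  # 42
```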
Modern assembly gives you named functions, line-number independence, unlimited functions, places for carrying a value over RET... Basic had none of those.
> For me, it really validates that Dijkstra quote about Basic programmers being psychologically injured.
> I was a student at Northeastern (where Matthias Felleisen was a professor) from 2016-2020, so I have first-hand experience with exactly this system of teaching.
This may be so; however, you likely don't have first-hand experience with the early, unstructured versions of Basic to which Dijkstra was referring in his quote. These early versions lacked structured control flow such as proper loops or even if-then-else blocks. Later versions of Basic evolved to support modularity, OOP, local variables and everything else. Dijkstra tended towards hyperbole and exaggeration IMHO.
> Dijkstra tended towards hyperbole and exaggeration IMHO.
No.
Dijkstra, being Dutch, was famously blunt, vigorously contrarian, an uncompromising perfectionist, and extremely honest.
A summary of his life and works: The Man Who Carried Computer Science on His Shoulders - https://inference-review.com/article/the-man-who-carried-com...
"... blunt, vigorously contrarian, uncompromising perfectionist and extremely honest."
Or what HR would call "not a team player".
Ha, ha! Teamwork is vastly overrated in the industry. Almost everything achieved by mankind is because one man put together a lot of knowledge in his own head and came up with insights. Even when they worked in teams, each man was an individual and did his own thinking.
Today "Teamwork" has come to mean playing politics, jockeying for influence, taking credit for other people's ideas and so on.
When was the last time anybody cared what HR had to say? I've never encountered an HR department whose primary role wasn't to indoctrinate or to create roadblocks for everybody else.
Hyperbole: exaggerated statements or claims not meant to be taken literally.
Do you really think these statements were meant to be taken literally?
"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
Or this?
"the use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense,"
I think that saying that Dijkstra was capable of making hyperbolic statements is rather kind in this case ;-)
All languages have hyperbole, to enable one to cut through noise and convey insight via forceful phrases. That is its proper use; only when it is used for merely egoistic reasons is it frowned upon.
Dijkstra was instrumental in inventing Structured Programming when programming was literally anything-goes spaghetti. This was the main reason his famous GOTO paper was such a hit. Given this background, you can understand his comment about BASIC. This comment https://news.ycombinator.com/item?id=42461921 explains how bad Dartmouth BASIC was. Looked at in this light, you can see how a person trying to establish structured programming might feel about that flavour of BASIC, and his quote nicely captures that.
Dijkstra was also all about mathematical notation, precision, and rigour. In fact, the article I had linked to mentions that he was unhappy with all languages and hence taught his class at UT Austin using a language of his own design and notation. Now realize that COBOL at that time had all the unstructured faults of BASIC listed above, plus an even worse handicap: it used Structured English rather than mathematical notation! To somebody who was all about mathematics this would be sheer torture, and thus his quote.
Quote from previously linked article;
Dijkstra argued that computer programming should be taught in a radically different way. His proposal was to teach some of the elements of mathematical logic, select a small but powerful programming language, and then concentrate on the task of constructing provably correct computer programs. In his view, programs should be considered the same way as formulas, while programming should be taught as a branch of mathematics.
One should always think when reading an acknowledged great's quotes/phrases. This is not to say that they can never be wrong but this probability is generally quite low.
So he used hyperbole at least twice. Does that mean he tended to it?
In a rare moment of self-awareness I realise that I'm arguing on the internet.
The only point I can make is that I'm tending towards idiocy ;-)
Let us just agree with the facts: (a) I first learned to program in BASIC at 10 years old, (b) I am indeed psychologically damaged by the stigma promoted by Dijkstra, and (c) it's true that I struggled with pointers in C in first-year university. So hey, he was probably right, both Dijkstra and the OP. Honest enough for ya? ;-)
p.s. I lost two internet points trying to be Captain Defend-the-old-BASIC-programmers. Lessons learned.
At University of Waterloo we used scheme
I don’t really enjoy pair programming. I like pair “thinking” (if that’s a correct term). I like to think about a problem and the design space with others… but writing actual code with others doesn’t really appeal to me. It’s like 2 people painting parts of the same picture on the same canvas; it’s not gonna look pretty.
Pair programming can be great when applied selectively, as needed for short bursts on specific problems.
Pair programming when enforced full-time as a way of having two developers work together is completely exhausting, unpopular with most developers, and slower than having everyone work on their own problems. There is a certain personality who really likes full-time pair programming because they are very social and like coworking problems, but most people dislike it.
>> slower than having everyone work on their own problems
That’s fascinating; I recognise every bit of everything else you said, but in my experience this was the golden redeeming quality: velocity of correct code is massively (maybe 4x) sped up.
Maybe the difference is overall - pairing becomes too tiring so you might be extremely productive but the short periods of it mean that the slow and steady solo approach wins out because you can do it for longer.
Pair programming done on a task such as "let's make this completely standard thing we've done five times already and need a slight variation of" is horrible. When working with new developers, whether they're senior or junior, it's great though. It doesn't have to be full days, but a few hours a day of pair programming brings someone up to speed incredibly quickly.
When there are hard problems to track down, I think all developers go to pair debugging, if only for the rubber-duck effect.
Ha, humorously, I like pair debugging even less than pair programming. I need to think deeply when debugging a thorny issue, and I can't do that while interspersing communication.
I quite like "pair debugging". As someone that doesn't really like using a debugger, it's good to have someone that does use it to go through code (possibly not mine) with me commenting or suggesting problems. Probably irritates the heck out of the co-debugger though!
As a person who really enjoys debugging including (but not limited to) using a debugger: Yep, I'd rather do this myself than have to keep a mental channel open to listen to your comments and suggestions.
With more junior devs, I usually solve a problem together with them, but I'm in the driver's seat. The best opportunity is when they come to you with a question. Just dive right into the code with them (this also shows them that it's "safe" to come to you with questions). It's so easy now with Slack and Zoom to jump into an ad-hoc session.
I'll talk out loud to myself and verbalize my thought process.
The objective is to show them tools, techniques, and approaches that they would otherwise not pick up. I'm not literally coding with them; I'm doing the coding and explaining my inner monologue for a given problem.
Tooling in particular can be hard to pick up. Even little things like the JavaScript debug console in VS Code can be a productivity changer.
For me, this has been very successful and I have helped more junior devs accelerate their careers. A lot of it is honestly selfish; the more I teach them to work and think like me, the more of my work I can hand off to them. In exchange, I get to work on more interesting things -- win-win
IMO, the more senior counterpart benefits less from pair programming and therefore enjoys it less. However, it's still the fastest way to get someone familiarized with a concept/project when done correctly. Might be really valuable in a CS curriculum.
That doesn't sound like any pair programming I've ever done. There's always one person who writes the actual code and the other is there to think along, spot any bugs and discuss any better ways to do what we're doing (covering the rubber duck angle too). And then swap when the other one has a better idea on how to do something.
Maybe it's ok if the other coder is an LLM.
People always analogize this, but I've found "pair programming" with LLMs to be extremely underwhelming. I've tried it with Aider+Claude, Copilot Chat, 4o and o1, and the experience is always the same: I spend most of the time correcting misunderstandings before eventually giving up and doing it myself.
To this day I haven't found an LLM application that works better than regular autocomplete Copilot. It's sufficient to get over the blank canvas problem in a lot of cases without being overly ambitious and biting off more than the LLM can chew.
I've yet to try Cursor, but that's because I don't have a lot of hope: I first heard the same kinds of great things about Aider and Claude and o1, and all of those have disappointed. It's hard to want to switch editors to see if this time is for real.
I used Copilot and dropped it when it stopped being free. But Cursor is a different beast - the autocomplete is far better and intuitive. The chat functions like a personal StackOverflow (or just paste an error message for debugging).
For me as a senior eng, Cursor is where AI turned the corner from "maybe helpful for fringe / basic things" to "actually amplifies my productivity". Took about 30 min to flip the switch for me, so I suggest you give it a try.
The trouble is that I heard all of this—personal stack overflow, error messages—already with Claude, and I'm skeptical that anything Cursor can do on top of that will be worth losing the productivity of the JetBrains IDEs.
I imagine it's easier to switch for someone who's already making do with VS Code, because at that point the main hurdle is just being willing to pay for an editor. I already pay for an editor (a bundle of them), but "VS Code with better AI integration" just... doesn't appeal, especially when the word-of-mouth recommendations say all the same things that I've already heard about other tools that didn't work for me.
Same.
I had the perfect use-case for LLM-assisted coding a few days ago: I had a well-defined function to implement, for which dynamic programming was a great fit. I haven't used DP in years but it's a well-trodden path, why not throw the problem at the LLM and let it implement it?
Well, despite careful prompting, it kept getting it wrong. At some point it generated some credible code, but I spent the day finding problems, nudging it to fix them, and asking for explanations for stuff that looked (and was) incorrect... The resulting code was bug-riddled and horrible (LLMs tend to fix problems by adding more code rather than rearchitecting the existing code to eliminate edge cases). I ended up spending a whole lot more time: I had to spend ages carefully nudging the LLM to no avail, understand LLM-generated garbage code _and_ still solve the problem myself.
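To make the anecdote concrete (the commenter's actual function isn't specified, so this is a hypothetical stand-in): the kind of "well-trodden" dynamic programming meant here is textbook material like minimum-coin change, where a bottom-up table beats greedy choices:

```python
# Hypothetical example of a "well-trodden" DP task: fewest coins summing to
# a target amount. best[a] holds the optimal coin count for amount a.
def min_coins(coins, amount):
    INF = float("inf")
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return best[amount] if best[amount] != INF else -1

# 16 = 5 + 5 + 5 + 1 (4 coins); a greedy 12-first strategy would need 5.
print(min_coins([1, 5, 12], 16))  # 4
```

This is exactly the shape of problem an LLM "should" nail, which is what makes the failure mode described above notable.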
Agree. I actually decided to blog about that here earlier today: https://jensrantil.github.io/posts/pair-mob-sessions/
I completely agree.
I was a CS student for 2 semesters at Northeastern before dropping out thanks to a job offer. No prior coding experience.
I think that the curriculum design and principles that guide the NEU CS education are fantastic. I’ve been fortunate (or unfortunate, depending on your perspective) to quickly find myself in a mentorship position at work, and there have been a number of times where I realize that the boot-camp hire just isn’t thinking the way I do _at all_. The first things drilled into my head were function signatures and manipulating data structures (by implementing a subset of Ruby's Enumerable module in Racket). This has made problem solving by manipulating data structures (a decently common part of the job, especially at first!) genuinely trivial. Things more or less immediately translate to a map, filter, andmap, ormap, or reduce when trying to get data from its input to its output for whatever unit of work I’m trying to do.
Other developers on my team though experience each new technique/thing as a new or different thing, which to me seems far more overwhelming. I think most developers naturally develop the intuition, but being told upfront “everything is just these 5 or a combination of them lol” implicitly by the work we were doing was something I’m grateful for.
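The "everything is these five" framing maps cleanly outside Racket too. A sketch in Python, using `filter`/`map`/`reduce` plus `any`/`all` as rough analogs of Racket's `ormap`/`andmap` (the order data here is made up for illustration):

```python
from functools import reduce

orders = [{"item": "widget", "qty": 3, "price": 4.0},
          {"item": "gadget", "qty": 0, "price": 9.5},
          {"item": "gizmo",  "qty": 2, "price": 2.5}]

shipped  = list(filter(lambda o: o["qty"] > 0, orders))         # filter
revenues = list(map(lambda o: o["qty"] * o["price"], shipped))  # map
total    = reduce(lambda acc, r: acc + r, revenues, 0.0)        # reduce
any_big  = any(r > 10 for r in revenues)                        # ormap
all_paid = all(r >= 0 for r in revenues)                        # andmap

print(total)  # 3*4.0 + 2*2.5 = 17.0
```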
I never enjoyed the pair programming at Northeastern. I was so behind my peers at the time, since most everyone else was an AP CS student or had been coding since they were a child. I was busy trying to brute-force the learning with 40+ hour weeks just for the CS fundamentals classes. I never found someone in my position; I was only paired with people for whom the intro course was trivial or who just did not care at all haha. The most brutal part was waiting 5+ hours at office hours with a whiteboard wait list 90+ names deep, and then office hours would end and they would send us home. Life before ChatGPT was crazy.
If a university charges you $32,495/term (Northeastern) for full-time study, when in a lot of other countries people feed their families on 20,000 US dollars or less, why on earth would you wait 5+ hours to get help from the professor? There should be as many professors and TAs as possible to help you out!
It costs more than that to put food on your family in Boston, and most of that money goes to rent-seeking administrators rather than TAs and professors.
There is a decently large number of TAs for the freshman first-semester course (aka Fundies 1); this semester has 77 TAs.
"translate to a map, filter, andmap, ormap, or reduce"
A hadoop programmer is born
Are there any books that discuss this technique?
It's the philosophy behind How to Design Programs (https://htdp.org/) and Matthias' other educational works. On HtDP might be relevant to your interests (https://felleisen.org/matthias/OnHtDP/index.html)
From the linked article : "How to Design Programs is the first text book on programming that explicitly spells out how to construct programs in a systematic manner."
You didn't explain how the "bootcamp grads" thought process differs from yours.
> Things more or less immediately translate to a map, filter, andmap, ormap, or reduce when trying to get data from its input to its output for whatever unit of work I’m trying to do
It comes across as smug. "How dare these bootcamp grads write a for loop when I am wrangling a complex reduce expression."
Even though that may not be what you meant.
Probably why people even around here find the PLT nerds obnoxious.
Whatever you described is often not the hard part of the code at $dayjob.
Also, for most people, the database dictates the choice of data structure and algorithm.
CS is not complete without compilers, networks, DBs, OSes and computer architecture. Yet somehow PLT nerds pretend they unlocked a super power with map and reduce.
> You didn't explain how the "bootcamp grads" thought process differs from yours.
In my experience, their thought process starts with "I know framework/library X" and ends with "What library/framework solves my specific problem".
In recent years it seems like they've completely outsourced their thought process to tools like ChatGPT. However, it's been a while since I've worked with a recent college graduate, so outsourcing one's thought process may be the new normal?
I have worked with a few bootcamp grads who didn't start their thought process this way but that's something they've had to learn on their own.
Who hurt you?
Map, filter and reduce are typically simpler than for/foreach, because of how intermediate variables are handled and commonly there's a scope boundary reducing the risk of unwanted mutation or context pollution that doesn't exist in for/foreach/while in the same language.
I have met some "bootcamp grads", and unless they've managed to learn it on their own they tend to struggle with data structures, especially transformations and reductions. Getting an intuition for data and learning to keep mental models of it is rather important to be effective in software development. HtDP is quite good at teaching this specifically, and you also pick up several algorithmic techniques that are good to have and not very discoverable in themselves to a newbie, like recursion.
Furthermore, once you've gotten fluent with scalars and flat collections you're well prepared for trees, and when you get the hang of trees you can start programming programs, i.e. metaprogramming, since in the abstract a program is a tree traversed by the execution. From there getting good at software architecture is achievable too.
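The list-to-trees progression above can be sketched in a few lines of Python; the tiny expression-tree shape here is invented for the example, but the idea is standard: the recursion mirrors the data definition, the same way a reduce mirrors the shape of a list.

```python
# A tiny expression tree: either a plain number,
# or a tuple ("+" | "*", left_subtree, right_subtree).
def evaluate(node):
    """Fold over the tree: one case per case of the data definition."""
    if isinstance(node, (int, float)):
        return node                      # leaf: a scalar
    op, left, right = node               # interior node: recurse on children
    l, r = evaluate(left), evaluate(right)
    return l + r if op == "+" else l * r

expr = ("+", 1, ("*", 2, 3))  # represents 1 + 2 * 3
print(evaluate(expr))  # 7
```

This is also the metaprogramming connection: `evaluate` is a miniature interpreter, a program traversing the tree of another program.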
Well I certainly didn’t do well in English class!
After re-reading my comment, you certainly _could_ read it in a smug/conceited tone. And I did explain my thoughts: they approach each logic problem with more novelty than I do, not for a lack of practice or ability, but for a lack of a mental model to map heuristics to.
But I will say that my comments to them, and here, come from a place of wanting to raise all tides, so to speak. There is no smugness where there is no (or little) ego, and I think you’re projecting yourself onto my comment.
Good response. The GP is trying to be woke where it is not warranted.
> they approach each logic problem with more novelty than I do, not for a lack of practice or ability, but for a lack of a mental model to map heuristics to.
Very right; a proper "mental model" is fundamental to all types of learning.
I am not woke.
I don't support boot camps.
I just hold that PLT and DSA are not the be-all and end-all of CS. You learn more by studying OSes, networks, compilers, DBMSes, and processor architecture. But somehow PLT nerds pretend knowing map and filter is a superpower, just like Kubernetes people think they can throw distributed systems at any problem.
> I just hold that PLT or DSA are not end-all of CS.
Nobody said this; that's your preconception. The comment has nothing to do with PLT or DSA. What it was talking about was patterns used in functional programming, which most developers coming from imperative languages don't really appreciate (I was one of them). Hence when somebody says something is useful, I try to understand them rather than dismissing it out of hand. For example, here is a recent HN submission, "Haskell vs. Ada vs. C++ vs. Awk vs ... An Experiment in Software Prototyping Productivity", which gives something to think about - https://news.ycombinator.com/item?id=42445328 See the "Lessons Learned" section here - https://news.ycombinator.com/item?id=42460631
> You learn more by studying OS, Networks, compilers, DBMS, processor architecture.
Again, this is orthogonal to what the comment was about. These are application domains/end products, not the programming technique itself. They are not in conflict.
You certainly have not understood what the GP was talking about. It is about Concepts, Computation and Mental Models. The fact that you equate it with PLT just proves my point.
> CS is not complete without compilers, networks, DBs, OSes and computer architecture.
This is completely orthogonal to what the comment is talking about.
PS: Here is a good tutorial on Map/Filter/Reduce model of computation - https://web.mit.edu/6.005/www/fa15/classes/25-map-filter-red...
I have no experience in this style, but my experience with bootcampers and others has me wondering how much of the benefit is accidental because of their implementation of this style, rather than because of the different teaching style itself.
Specifically: I've had multiple co-workers who learned one programming language (bootcamp, self-taught, or otherwise) and were resistant to learning another one. Based on things they've said after I'd pushed them to do something in another language, I think their resistance was entirely because they remember how difficult the first programming language was and expect further languages to be similarly difficult. Instead, as they start to actually work with the second language they realize how much is conceptually similar - the things this article refers to as typically learned implicitly through experience.
But the alternate style described in the article does the same thing: moving from a sort-of pseudocode (the design process) to student Racket to Java (per another comment here) gets that same implicit learning benefit independent of a full restructuring of the curriculum.
I wouldn't be surprised if keeping the more traditional style and simply requiring different languages in different courses/years gets most of the benefits with minimal changes.
That’s an interesting comment, I had a similar experience going from Python to JavaScript then TypeScript, but now I’m really excited to pick up new languages (C, C#)
Yep, it gets fun after about the third language. This was the best thing about my programming languages course in college. We learned a bunch of languages a bit, while comparing and contrasting, which made all of them, and all the ones I've learned since then, a lot less scary.
These are all pretty similar though. If you want to widen your horizons, perhaps try a Lisp or one of the ML family languages (eh Haskell). These languages are actually different than the ones you used.
I think I kinda agree, but there's a big caveat. I've thought about these problems a lot, and I like the idea of cooperation and teamwork being how we grow and develop. It doesn't square with my own lived experience though. I consider myself a pretty good developer, but that hasn't come out of "teamwork".

My early developer days, the days where I didn't quite know enough to actually plan out what I needed to work on, weren't dominated by helpful voices guiding me along. They were filled with antisocial freaks on the internet telling me how stupid I was for asking such a basic question. The odd thing was that it didn't repel me; it drew me in. I became one of them. I sat down and searched day and night, spending 4-7 hours every day after school just trying to understand what this computer thing was and how it all fit together. The early work I did to learn the technical aspects of being a developer was all fueled by antisocial hyperfocus, at a great cost to many other parts of my life. It undeniably made me pretty good at the technical aspects though.
I've since leveled out a bit. With the technical stuff roughly figured out. I've since moved on to the people problems, and there it's much more about the cooperation. I couldn't meaningfully contribute to those "people problems" If I didn't have the antisocial beginnings though.
I have a hard time "developing" developers when I look back at my own lived experience. I like what and who I am, but the cost has been pretty steep. I'm not sure I can take other people down that path in good conscience.
That reminds me of Peter Van Roy, Concepts, Techniques, and Models of Computer Programming, 2004.
Similar approach for first years in two of the main universities in Belgium.
One uses lisp with SICP, the other the language developed in this book (Oz).
The book constructs many of the current programming paradigms, starting from almost nothing, with an emphasis on software design.
See the paradigms poster built http://www.info.ucl.ac.be/people/PVR/paradigmsDIAGRAMeng201....
I’m local to Boston and I’ve had great experience with all of the co-ops I’ve worked with from Northeastern. I’ve worked with them across many fields (ME, EE, and CS) and they have been almost uniformly great to work with.
They have all had some exposure to real world engineering practices in their respective disciplines’ teaching tracks.
They have to do coops to graduate, so you have them to work with for six months instead of ten weeks.
They are all eager to do real work and ship real stuff, and they understand quickly how to integrate into a team to do that. I don’t know if this is a selective property of Northeastern’s culture or something they instill in their students, but I really like it.
Worked with university interns before who are clearly on an industry pit stop en route to a postgraduate program and subsequent career as a corduroy elbowed academic. Thanks for playing, you’re a wonderful little creature, but we aren’t motivated to do the same things with our time. They may play the same game I do, but they’ve chosen a different character arc. They’re journeyman wizards building an arcana and looking for a tower. I’m a dwarf foreman looking to dig out the next Moria.
Love the Northeastern crowd. I’d put them as equivalent to or better than their rodent ringed counterparts from the other side of the Charles river.
Hey there! ;)
This is an interesting article. I do appreciate the focus on pair programming, which is probably something that's really helped me improve as a programmer, and their process seems quite interesting.
It would be interesting to see if this method actually works, i.e. if Northeastern programmers are "more useful to their employers"/"better developers" than graduates of universities using the older approach ('tinker until it works', as the featured article puts it).
Well...back when I was working in Boston-based companies, I spent many a day at Northeastern interviewing intern candidates there. I didn't have too much trouble finding promising programmers there. So, yeah, I guess it works.
Get computer. Play with it a bit. Get book on programming in X. Write code.
Books, and reading them and doing, are the only way of learning how to be a software developer.
That's true of a software developer in the narrow sense. In the broader sense there are skills around collaborating with others, working sync/async, delivering with quality, communicating status, mentoring junior people, educating others about what is going on, and maintaining the systems over time.
Well, of course. But primarily, you have to be able to write code. And to do that, reading books is the primary resource (and for a lot of the other things you mention).
There is a hands-on book about the OP's approach to programming (How to design programs). It distills some excellent ideas I had to learn over years.
It does feel long and dull to me now -- I haven't seen a book condensing the same for a non-beginner looking to improve.
Why specifically books? I was never able to get real value out of them in the case of programming.
It's the way I roll, I guess. Everything technical with regards to computing I have learned in life I've done so from books/manuals. It is kind of all of human history to do so. And are you suggesting that you can't get anything of value out of (for e.g.) "The C Programming Language"?
What do you mean by "write code"?
I mainly write code by opening my notebook and drawing boxes with little arrows between them.
From a Lisp-like introduction, to OOP (in Java), then to ACL2, and finally back to OOD. It's quite interesting in its own right, but now that functional programming is more and more adopted by the mainstream, perhaps there's no need for the "practical appeal" of OOP/OOD in the mix, or does the author genuinely believe that's the way to go?
The book A Little Java, A Few Patterns by the same author and his colleague might answer your question.
Thanks for the recommendation.
Nice article. Pair programming seems good for growing as a developer. The described process also seems well thought out and worth exploring
I found the text interesting but either my browser is not loading something or the "meat" developing the core concepts is lacking.
Would it be correct to understand this as a syllabus, and not the actual explanations/lecture/content?
You might want to try with another browser then.
My first thought is that this looks like a well-designed curriculum, and several other posters here who have studied at or hired from that uni are very positive about it.
That said, it's not quite a red flag but perhaps a yellow one for me when someone trots out the "everyone else is doing it wrong" line with particular emotion-triggering words. Scott Alexander once said this was the approach of "every therapy book, ever" (https://slatestarcodex.com/2019/11/20/book-review-all-therap...)
For example, we start with the curriculum being "unique" (though they do caveat this in a link on the side), sits aloof from what is "currently fashionable", and then (Sec 1.1) paint "the vast majority" of other courses as "traditional" (section title) and "old-fashioned". Dismissing your "traditional", an emotion-laden word for some to say the least, normally activates my B.S. detector because every other startup pitch works like that. Come and invest in our innovative crypto as opposed to traditional, old-fashioned fiat currency!
Sometimes, something has become tradition because people tried it, it went well, they kept on trying it, and it kept on going well. (see also: Chesterton's fence)
I'm sure there are CS courses that could improve by following Northeastern's principles, but I'm also sure there's a lot of other colleges that turn out competent programmers who understand program design and teamwork and systematic reasoning.
Whether to start with a C-style, python style (indentation is structure), or (lisp (style)) language is a matter of taste, but I don't think I'd have got on well with the DrRacket IDE. I like to use my own editor, with my own color scheme and keybindings and regexp search/replace (where I don't need to check each time whether it's \1 or $1 to refer to a capture group), and where I can interact with git and store my code in a repo out of the box (or by opening a terminal window). Anything else feels too much like a walled garden to me.
If I was trying to develop good developers, I'd add a 7th initial, recurring step to the vertical: Talk to your customer as often and directly as possible to ensure you are still working on the right problem.
It seems to me that a lot of wasted energy is in the form of working on problems that no one cares about. Not that this is necessarily bad (hobby, fun, art, side projects, new ideas, etc), but in a concrete business setting you need to be a bit more aggressive about making sure the customer still gives a shit about what you are working on over time.
I find all of this leads neatly into the 2nd most important thing for me which is making sure you have a good domain model (schema). If the tables, columns and relations that represent the business are high quality and accurate (I.e., your customer understands them), the code that follows will usually go smoothly. Staying on the customer's heels regarding the applicability of your software to the business means you can keep this well aligned over time.
I think much of the tech bloat we see today is a side effect of attempting to outrun the complexity of the customer's specific needs. After sitting on the phone with vendors and customers for a week, you will likely not find yourself playing around with the idea of using some esoteric language or infra to implement things. It's incredible what being directly exposed to the customers can do for a developer's growth.
It’s completely true. Software development sits at the crossroads of art and craftsmanship, so we like to do things nicely for the sake/pleasure of it, and we can lose sight of what really matters. Most of the time, what really matters is the customer using the product, and if the product is a tool, the criteria most important to them have to be aligned with what we spent the most time on. Sometimes we want to make a generic case out of specific ones so that all customers benefit, and that can lead to higher, unnecessary levels of abstraction; and sometimes, because there isn’t enough abstraction, it leads to hard-to-maintain spaghetti code that works only under specific conditions and is hard to evolve. The balance is thin and often blurry, because it’s a bet on an uncertain future that depends on how well we can predict the business.
> It seems to me that a lot of wasted energy is in the form of working on problems that no one cares about.
That's what classic software methodologies such as Waterfall are good at: everything must be carefully discussed with the customer during the "requirements analysis" phase.
It's funny, because one of the main tenets of Agile was that developers should talk to the customer all the time, unlike Waterfall.
Are modern "Agile" shops not allowing developers to talk to the customers? That's the only Agile principle that Scrum didn't dare destroy.
It is much more than that; the "Waterfall Model" SDLC is very much misunderstood. In fact, the commonly used diagram for waterfall was an example of what not to do! Folks should read the following:
Wikipedia - https://en.wikipedia.org/wiki/Waterfall_model
The Myth of the 'Waterfall' SDLC - http://www.bawiki.com/wiki/Waterfall.html This is a very good analysis.
What are the more modern methodologies?
"hey $name, is what you're working on going to benefit the customer"
no need for meetings, PMs or frameworks if it's a small shop
Scrum, which means letting a clueless "scrum master" and "product owner" change directions twice a week so they can pretend to be working.
You know what would make us all faster? The entire team in a meeting talking about which JIRA tickets they moved yesterday and which ones they plan to move today. We should also ask the same in-depth technical questions on projects which we have already asked that developer a dozen times.
Why would that make us all faster? Also, is faster better, or better is better?
Waterfall development is the most appropriate way to develop software most of the time. The CRUD apps developed by startups don't change requirements often; it's their clueless managers who change their minds as they come to understand what they should have known before starting the project.
I heard Kent Beck say (if my memory serves) that Waterfall is the method used at Facebook (he worked there).
> Waterfall development is the most appropriate way to develop software, most of the times.
This is only true if you work on unimportant projects where delays and failures are acceptable.
> This is only true if you work on unimportant projects where delays and failures are acceptable.
And Scrum is only for important projects where you can release crap full of bugs and wrong use cases every 2 weeks.
That's a strange non sequitur. A statement against Waterfall is not a statement for Scrum (no reason to shout, I get that you don't like it but shouting its name is weird).
Please, point out non-trivial successful (delivered on time, on budget, and with all initially planned features) Waterfall projects that did not modify Waterfall into something sensible (that is, incorporated feedback loops and probably executed as a series of iterations rather than one 5-year long project with hard distinctions between each phase).
One must remember, Scrum is not agile.
Also, Scrum-as-practiced-in-most-software-companies is not Scrum.
And real communism has never been tried
More like https://en.wikipedia.org/wiki/Sturgeon%27s_law
Most people are incompetent at what they do. That includes managers at software companies.