You're missing the point. Stanford being like the Army doesn't mean anything about any particular individual. It means that institutionally, [the engineering department at] Stanford consciously or unconsciously balances hardcore engineering fundamentals with making graduates industry-friendly, whether that means being Java/C++ heavy (which Stanford is), or encouraging spinoffs with IP policies, or whatever else. MIT (the OP thinks, and I find plausible) doesn't really institutionally pay attention to that in the same way. That doesn't mean that there are no hardcore fundamental Stanford people, or that MIT people can't go into industry.
If you scroll up to the other posts in that thread, it turns out to be a wider discussion about the role of MIT in research and the technology world overall. In particular: the claim that MIT was instrumental in solving a lot of the fundamental technical problems necessary for the Internet to work right; that MIT was neutered in its ability to do such huge things by the cutoff of blue-sky DARPA funding in the early-to-mid 00's; that a lot of the late-90's Internet boom in SV depended on MIT having solved these big fundamental problems; and that we should be worried that MIT is no longer doing things like this.
Rather outlandish, I'd say, but interesting, and at least worth talking about.
"Stanford being like the Army doesn't mean anything about any particular individual. It means that institutionally, [the engineering department at] Stanford consciously or unconsciously balances hardcore engineering fundamentals with making graduates industry-friendly, whether that means being Java/C++ heavy (which Stanford is), or encouraging spinoffs with IP policies, or whatever else. MIT (the OP thinks, and I find plausible) doesn't really institutionally pay attention to that in the same way. That doesn't mean that there are no hardcore fundamental Stanford people, or that MIT people can't go into industry."
Exactly. Up until very recently, MIT's introductory programming class (6.001) had the students learn Scheme, not for the purpose of learning Scheme, but to teach them general principles about algorithms. Nowadays, I believe they're using Python for that purpose. If a student wants to learn C++ or Java (or God forbid, Fortran), they're expected to get up to speed on their own time.
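To make the "general principles, not the language" point concrete: a classic SICP-style exercise is abstracting the common pattern of summation into a higher-order procedure. Here's a minimal sketch of that idea in Python (the language the course reportedly uses now); this is an illustrative example in the spirit of the old 6.001 material, not actual MIT coursework.

```python
# SICP-style exercise: capture the general pattern "sum term(x) for
# x = a, next(a), next(next(a)), ... up to b" once, as a higher-order
# function, then instantiate it for specific sums. The lesson is the
# abstraction, not the syntax of any particular language.

def sum_over(term, a, nxt, b):
    """Sum term(x) over the sequence a, nxt(a), ... while x <= b."""
    total = 0
    while a <= b:
        total += term(a)
        a = nxt(a)
    return total

def sum_integers(a, b):
    return sum_over(lambda x: x, a, lambda x: x + 1, b)

def sum_cubes(a, b):
    return sum_over(lambda x: x ** 3, a, lambda x: x + 1, b)

print(sum_integers(1, 10))  # 55
print(sum_cubes(1, 10))     # 3025
```

The same exercise reads almost identically in Scheme, which is the point: the procedure-as-argument idea transfers across languages.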
Some time ago, the software engineering course and, I think, the compiler course changed from CLU to Java.
Now, for undergraduates, whatever doesn't use Python (e.g. AI) uses Java, like 6.005, "Elements of Software Construction". Or MATLAB for a lot of EE.
Other departments have always had field specific introductory computing courses that teach whatever's relevant today, e.g. based on a quick skim of OCW, Java for civil engineering, "FORTRAN, C, C++, MATLAB, and Mathematica" for Earth, Atmospheric, and Planetary Sciences, Python for Biological Engineering, etc.
From what I've heard, in EECS after the post-dot-com enrollment crash, the order came from on high that Scheme was to be terminated with extreme prejudice in the basic undergraduate curriculum.