I can't tell if you're joking, but I have a 2018 Mac Mini with just 8GB of RAM, and I often run Eclipse, IntelliJ, and PyCharm at the same time (along with multiple browsers and other stuff), and performance is fine.
I was actually surprised by this--when I first started using this computer, I thought for sure I would need to add more RAM, which on the 2018 model is too complicated to do yourself (at least, it seemed too risky to me).
Semi-joking, but the problem is real for me. I have a 2013 13" MacBook Pro with 8GB of RAM, and my system can't cope with my workflow ... dozens of tabs in Safari, webapps in Chrome (YouTube, Google Docs, ...), Eclipse with Scala / Java, ... it's a huge struggle.
I was handed a 2017 MacBook Pro with 8GB of RAM at my current job while waiting for my actual laptop to be delivered, and it was a nightmare.
I keep a lot of tabs open to look things up, but nothing excessive on that machine. I also run VSCode or PyCharm and would bring up 5-10 containers at times.
Having to put up with it for weeks seriously hurt not only my productivity but also my mood.
Unless you're a very basic user, I don't get why you would settle for 8GB in 2020. 8 gigs of RAM costs basically nothing; it's not worth changing your workflow in the slightest to work around that artificial limitation.
It is odd, because these memory issues are very real, but if you ever say "wow, devs are getting lazy, and these 'desktop' apps that rebundle Chrome (e.g. Slack, Skype) are really killing my machine with inordinate quantities of logic in JavaScript", you get shouted down.
It's bizarre. If everyone used the native toolkits we'd have far less memory usage and everyone (even the memory-constrained) would have a good experience.
Also, these memory hogs do a lot of allocation and deallocation, which is a general problem with interpreted languages. And allocation is the enemy of speed and of energy usage: it will eat into your daily battery life as everything gets interpreted.
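To make that concrete, here's a toy CPython sketch (the sizes and iteration counts are arbitrary) of the same work done with constant reallocation versus in-place growth:

    # Growing an immutable bytes object reallocates and copies it on every
    # step; a mutable bytearray grows in place with amortized reallocation.
    import timeit

    def churn():
        s = b""
        for _ in range(2_000):
            s = s + b"x" * 10  # bytes are immutable: each step allocates a new object
        return s

    def reuse():
        buf = bytearray()
        for _ in range(2_000):
            buf += b"x" * 10  # extends in place
        return bytes(buf)

    print("reallocating:", timeit.timeit(churn, number=100))
    print("in-place:    ", timeit.timeit(reuse, number=100))

The first version spends most of its time allocating and copying rather than computing, which is exactly the kind of hidden work that burns CPU cycles and battery.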
I remember feeling the same when I was forced to upgrade from 32MB of RAM to 128MB to run the combination of browser, chat, and IDE on Windows NT4, back when most software moved from hand-optimized assembly to mass-produced C++.
With every layer of abstraction added to ease development, the hardware requirements go up. You can build things fast, or you can build fast things; doing both is tricky.
I think OSes in general just eat a portion of whatever memory you give them. Right now I'm puttering around with a dozen tabs in Firefox and not doing much else; my biggest memory hogs are Firefox at 3.5GB and Apple Mail at ~500MB, yet somehow 12GB of 16GB are in use. Better for the UX to keep things resident if you have memory to spare, I suppose.
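If you want to see where it actually went, here's a quick sketch using the third-party psutil package (pip install psutil) to list the biggest residents:

    import psutil

    procs = []
    for p in psutil.process_iter(["name", "memory_info"]):
        mem = p.info["memory_info"]  # None if access to the process is denied
        if mem is not None:
            procs.append((mem.rss, p.info["name"] or "?"))

    # Top five processes by resident set size.
    for rss, name in sorted(procs, reverse=True)[:5]:
        print(f"{rss / 2**30:5.2f} GiB  {name}")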
When you are memory constrained, you can definitely tell: everything comes to a halt and you just twiddle your thumbs between commands. This 16GB machine of mine shipped with 4GB, which was painful even 8 years ago when it was released, and I upgraded it myself to 8GB six months into ownership. A few years later, when JavaScript became more pervasive on the web, I hit the memory ceiling on 8GB a lot just from having tabs open in Chrome, back when it was perhaps more of a memory hog, so I opted for 16GB and haven't had issues since.
I think at 16GB you should be set for at least 5 years. Most people, even a lot of devs on company-issued equipment, are working with 8GB and complaining about it right here in this very thread.
If you have larger requirements, a lightweight, thin laptop with a teensy fan isn't for you. Even if it had the hardware specs, the physics of heat dissipation don't work in your favor, and you are better off spending the same money on more hardware sitting in a box under your desk. Me and my sore back are eyeing this up; all my computing is done on a cluster anyway.
Same. I have a MacBook Pro (Retina, 13-inch, Early 2015) with 8GB of RAM and I've been doing fine. Sure, there are some hiccups every now and then but it works. I have Spotify, VS Code, Slack, Kitty tmux sessions and more open 24/7.
What? It's probably swapping like a bastard, which with SSDs is probably not that horrible. Even 16GB is low for me (I do my work inside a Linux VM guest). I got myself a Mac Mini with 64 gigs, for great justice.
If I open up PyCharm and IntelliJ and Spotify and SourceTree and Docker and three different browsers and iTerm and Remote Desktop and a few other apps all at once, I will get an occasional hiccup, but it's really not as bad as I would have expected. I think 16GB would be nice though.
For comparison, I also have a 2012 Mac Mini at home with an SSD and 16GB RAM, and it's still chugging along pretty well too, although it's noticeably slower than the 2018 model with 8GB RAM.
With regard to swapping, I'm curious whether that means my SSD is going to wear out sooner. Maybe investing in more RAM would be worth it even if I don't feel like I need it.
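If you want to keep an eye on it, here's a minimal sketch using the third-party psutil package; it's the swapped-out side that translates into SSD writes:

    import psutil

    swap = psutil.swap_memory()
    gib = 2**30
    print(f"swap used: {swap.used / gib:.1f} GiB of {swap.total / gib:.1f} GiB")
    # Cumulative bytes swapped in/out since boot; sustained growth in "sout"
    # means steady write traffic hitting the SSD.
    print(f"swapped in: {swap.sin / gib:.1f} GiB, swapped out: {swap.sout / gib:.1f} GiB")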
Exactly. I'm deciding between 32GB or even 64GB, just to be on the safe side, because nowadays you're running Slack, Spotify, several messengers, Firefox, Chrome, IntelliJ, Docker, and Kubernetes on your local machine.
Which is kind of horrifying, if you stop and think about it. You're wondering whether you need another 32G of RAM to run a basic working environment, a glorified text editor, and some communications software. I used a BBC that could do that in 32K of RAM in the 1980s! Obviously I'm not really suggesting the functionality today is equivalent, but the idea that ultimately you're meeting the same basic needs yet it now takes a million times as much space is... unsettling.
It's certainly amazing how much memory consumption has grown. I like to think of it in terms of economics: we could never write today's software using 80s methods. Slack in assembler? Impossible. Kubernetes in C++? Maybe, but there would be security holes, and Go is just more productive. Developers are expensive, very expensive.
Such is the accepted wisdom in much of the industry, but I'm a bit of a sceptic on this score. Of course developer time is expensive, particularly if you're in somewhere like the Bay Area where salaries are an extra 0 compared to most of the world. But we live in an era of virtualisation and outsourcing (sorry, "cloud computing") when businesses will knowingly pay many times the cost of just buying a set of servers and sticking them in a rack in order to have someone else buy a much bigger server, subdivide it into virtual servers, and lease them at a huge mark-up. All kinds of justifications have been given for this, many of which I suspect don't stand up to scrutiny anywhere other than boardrooms and maybe golf courses.
There's a nice write-up somewhere, though regrettably I can't immediately find it, of the economics of cloud-hosting an application built using modern trends. IIRC, it pitched typical auto-scaling architectures, consisting of many ephemeral VMs running microservices and some sort of orchestration to manage everything, against just buying a small number of highly specified machines and getting on with the job using a more traditional set of skills and tools. Put another way, it was the modern trend of making everything extremely horizontally scalable using more hardware and virtualisation against a more traditional vertical-scaling approach using more efficient software to keep within the capacity of a small number of big machines.

The conclusion was astonishingly bad for the modern/trendy case, to the point where doing it looked borderline insane unless your application drops into a Goldilocks zone, in terms of capacity and resources required, that relatively few applications will ever get near, and those that do may then move beyond it on the other side. And yet that horizontal-scaling strategy is viewed almost as the default today for many new software businesses, because hiring people who are good enough to write the more efficient software is assumed to be too expensive.
We live in a world where any one-man startup thinks, and its investors hope, it will have 10k employees by year end. Therefore, if you are going to be burning money anyway, what's another line item on the monthly outflow if it means you don't have to spend 3 months hiring someone to toil in the server room, and a couple more ordering and assembling a farm that might crash the day your startup gets linked on Hacker News?
There are technical reasons for this, like being able to handle sudden load, but mostly it's ideological. We aren't building companies, we are building stock pumps disguised as the utopian future. If you are wondering what a blue-chip company looks like in tech, they are the ones that own their own infrastructure.
Maybe there is a middle road for cash-poor companies, where you handle baseline demand in-house for the sake of cost and sense, but have some sort of insurance policy with a cloud service to step in if demand surges.
We don't have the same basic needs. People are running VMs on their laptops so they can test things in an environment similar to production without having to run extra servers for every developer/sysadmin to test with. Back in the 80s, your QA and production environments were very likely the same!
I'll admit that modern text editors and communication software have grown resource-hungry, but a lot of that comes from being able to deliver a strong, cross-platform experience. I remember desktop Java doing much the same, with resource usage just as bad. Same with applets.
People are running VMs on their laptops so they can test things in an environment similar to production without having to run extra servers for every developer/sysadmin to test with.
Sure, but that immediately raises the next question of why those VMs are so big...
Yeah, the fixed cost of a VM context is on the order of kilobytes in the host kernel and megabytes in the guest kernel. And with memory ballooning, a guest VM acts much like a regular process in terms of memory usage. It's not the VMs that hog memory, it's the applications, with or without VMs.
Why does it matter when you can afford that RAM? Just buy it and forget about it; it's cheap enough. We used to land on the Moon with a CPU less performant than Apple's HDMI adapter cable. It's a fun comparison, but not very useful; that's just the way things are, and it's not going to change anytime soon.
I realise it's how things are today and not going to change any time soon, but it still feels like we as an industry have moved all too easily in a very wasteful direction. Sure, with RAM you can just buy more, but it's symptomatic of a wider malaise. Other capacities, particularly CPU core speeds, have long since stopped increasing on a nice exponential-looking curve to compensate for writing ever more layers of ever more bloated software in the name of (presumed) greater programmer efficiency. It just feels like we've lost the kind of clever, efficient culture that we used to have, and I'm not sure we weren't sold a bill of goods in return.
I'm not sure whether the curve is still exponential or not, but it's there. Single-thread performance is still increasing a little every year, and core count is increasing like never before. A 16-core consumer CPU is not a dream anymore.
RAM size slowly increases as well. 4GB was enough 10 years ago. 8GB was enough a few years ago. Today I would suggest 16GB as a bare future-proof minimum, and one can buy 64GB for a reasonable price.
We still have room for more layers. And it's not only about efficiency, it's also about security. Desktops are still not properly sandboxed; my calc.exe can still access my private SSH key.
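That's not an exaggeration; here's a minimal sketch showing that any unprivileged process you run can just read the key (assuming the default OpenSSH location):

    from pathlib import Path

    key = Path.home() / ".ssh" / "id_rsa"
    if key.exists():
        # No prompt, no permission dialog: the bytes are just there.
        print(f"read {len(key.read_bytes())} bytes of private key material")
    else:
        print("no key at the default location")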
Once performance growth really stops, we will start to optimize. Transistor density will double every few years until at least 2030, and AFAIK there are plans beyond that, so probably not soon.
I have 8GB on my work laptop with almost all of this (except Kubernetes, but I fail to understand why you would need a local Kubernetes) and it's fine, I usually have 2GB free memory.
Don't exaggerate your memory requirements; you would be more than fine with 16GB.
That's not even close to an exaggeration. I'm running only half those things (or their equivalents) right now on a Windows box. I just checked, and I've got 14.8GB in use.
Fortunately, I have a Dell XPS 15 with 32GB of RAM, but the second I start up a single VM, one more messaging app, a small handful of Docker containers, or any IDE (of which I'm running none right now), I'm going over 16GB.
Realistically, most of us on HN probably need around 20-24GB, but laptops don't come in those increments.
I develop for a living. I use 6 GB including a browser, a VM and an IDE.
Some of you greatly exaggerate your needs. Some workflows require 16+ GB of RAM, but most people complaining about RAM either mismanage it or don't understand that caches are not mandatory: the OS fills spare memory with reclaimable caches and hands it back under pressure (see the snippet below).
Right now on macOS I'm running Firefox, Outlook, 2 VSCode instances, Postman, 1 Electron chat app and another chat app and I'm under 5GB. Uptime 4 days.
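A minimal way to see the distinction, using the third-party psutil package: "available" already accounts for what the OS can reclaim from caches, so it's the honest headroom number, not "used".

    import psutil

    vm = psutil.virtual_memory()
    gib = 2**30
    # Per psutil's definition, "available" is memory that can be handed to
    # processes immediately, without swapping -- including what the OS can
    # reclaim from its caches.
    print(f"total:     {vm.total / gib:.1f} GiB")
    print(f"available: {vm.available / gib:.1f} GiB")
    print(f"in use:    {vm.percent:.0f}%")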