
Yeah, a computer can easily render frames at the same rate the monitor accepts them while being horribly out of phase with the timing the monitor expects. Imagine you and a friend playing on a swingset, both swinging at the same speed, but you're always at the top when your friend is at the bottom. That's likely the most common source of tearing: the frame was rendered in time, but, due to poorly configured graphics drivers, the game process being briefly swapped out for something else at the wrong moment, naive timing in the game's render loop, or something else, the buffer swap still happens too late.
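To make the swingset analogy concrete, here's a toy calculation (not any real graphics API; the names and numbers are purely illustrative). If the renderer is locked to the display's 60 Hz rate but phase-shifted, every un-synced buffer swap lands the same fraction of the way through scanout, which is where the tear line appears:

```python
# Illustrative sketch: two clocks at the same 60 Hz rate but out of
# phase, like the swingset analogy above. No real API is used here.
REFRESH_HZ = 60
period = 1.0 / REFRESH_HZ  # one refresh interval, in seconds

def tear_position(swap_offset):
    """Fraction of the screen already scanned out when the swap lands.

    swap_offset: seconds after vblank at which the buffer swap occurs.
    Returns 0.0 for a perfectly timed swap, ~0.5 for one that's half a
    period late.
    """
    return (swap_offset % period) / period

# Renderer at exactly 60 FPS but half a period out of phase:
print(tear_position(period / 2))  # tear line halfway down the screen
# Perfectly in phase: swap lands during vblank, no visible tear.
print(tear_position(0.0))
```

Note that because both rates match, the tear line never moves: a fixed phase error produces a stationary tear, while a slight rate mismatch makes it crawl up or down the screen.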

The traditional way to solve this is v-sync, which works great on realtime systems like game consoles, but on a multitasking operating system it's rather difficult to synchronize an application perfectly to the display's refresh rate. As such, every graphics card I've ever used adds a frame or two of buffering when v-sync is enabled, resulting in a buttery-smooth but very noticeably laggy display. It doesn't matter how demanding the scene is, either: I was playing a Quake source port the other day, a game released over 17 years ago, and even that was unplayable for me with v-sync enabled.
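The lag from that extra buffering is easy to estimate. Assuming each queued frame waits one full refresh interval before it's displayed (a simplification; real pipelines vary), the added latency is just frames divided by refresh rate:

```python
# Back-of-envelope latency added by v-sync frame buffering, assuming
# each queued frame waits one full refresh interval (illustrative).
def vsync_latency_ms(refresh_hz, buffered_frames):
    return buffered_frames * 1000.0 / refresh_hz

print(vsync_latency_ms(60, 2))  # 2 buffered frames at 60 Hz ≈ 33 ms
print(vsync_latency_ms(120, 2))  # same queue depth at 120 Hz ≈ 17 ms
```

Around 33 ms of added input-to-photon delay is well within what many players can feel, which matches the "buttery-smooth but laggy" experience described above.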

I'm very excited for this technology, because allowing programs to control both the frequency and phase of display updates has the potential to eliminate the vast majority of display artifacts I've experienced.



If they're both able to maintain 120hz/120FPS why can't they just sync up once? Or once every N minutes if they drift apart?


I thought I was pretty clear about that. Synchronization is relatively straightforward in a realtime system, where your program has total control over the timing of execution. On a desktop operating system, however, the graphics card and its drivers can lie, the operating system can lie, the operating system can swap your process out for something else whenever it wants, there is a delay associated with accessing a PC's high-accuracy timer (so even the clock lies!), and on top of that, everyone's PC is different. It's still possible to get some rough degree of synchronization going, but it's very difficult and imperfect.
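You can see a piece of this for yourself with a few lines of Python (a rough demo, not a rigorous benchmark): ask the OS for a 1 ms sleep and measure how late it actually wakes you up. The overshoot varies with OS, timer resolution, and system load, which is exactly the unpredictability that makes phase-locking to a display so hard:

```python
import time

# Rough demo of desktop timing slop: request a 1 ms sleep and measure
# the worst-case overshoot across several samples.
def sleep_overshoot_ms(requested_ms=1.0, samples=50):
    worst = 0.0
    for _ in range(samples):
        t0 = time.perf_counter()
        time.sleep(requested_ms / 1000.0)
        actual_ms = (time.perf_counter() - t0) * 1000.0
        worst = max(worst, actual_ms - requested_ms)
    return worst

print(f"worst overshoot: {sleep_overshoot_ms():.3f} ms")
```

At 60 Hz you only have about 16.7 ms per frame, so even a fraction of a millisecond of scheduling slop per frame is enough to drift out of phase and miss a vblank.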

If you still don't think I'm being serious about the timing stuff, consider that some folks still keep around old machines running DOS for real-time communication with microcontrollers.


Now this is interesting. What can operating systems, drivers, and GPU manufacturers do to restore a DOS-like, real-time sync of frame generation/transfer/display?


Nothing. Otherwise the OS would have to stop calling itself preemptive.


Now that every gamer has a multi-core processor, couldn't you allocate n-1 of the cores to the game with guaranteed non-preemption, and keep one core where preemption can occur?
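Linux already lets you approximate this, though without a hard non-preemption guarantee: boot with the `isolcpus=` kernel parameter to keep the scheduler's load balancer off certain cores, then pin your process to them. A minimal sketch of the pinning half, using the real (Linux-only) `os.sched_setaffinity` call; the choice of core here is just for illustration:

```python
import os

# Linux-only sketch: no mainstream desktop OS offers guaranteed
# non-preemption, but pinning a process to a core (ideally one reserved
# at boot with isolcpus=...) gets part of the way there.
def pin_to_core(core):
    os.sched_setaffinity(0, {core})   # 0 = the current process
    return os.sched_getaffinity(0)    # the cores we may now run on

# Pick a core this process is actually allowed to use:
target = min(os.sched_getaffinity(0))
print(pin_to_core(target))  # prints the single core we're now pinned to
```

Even pinned, the kernel can still preempt the process for interrupts and higher-priority work, which is why the parent comment's "nothing" is roughly right for stock desktop kernels.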


It's kind of a cool idea. I wonder if OSes would ever implement something like this. You would probably need to put some iOS-like restrictions on it: only one app at a time, it must be running full screen and in focus, and it cannot be given that kind of priority in the background. Like having a dedicated, built-in console system.




