It's wild to me that browsers expose this kind of control over my system to third-party developers. I think making the browser an "application platform" was overall a mistake. Call me crazy, but I just want a browser that fetches and displays web sites.
Firefox uses a different method that doesn't require lowering the minimum timer resolution.
Either way, this global behavior no longer applies on modern Windows 10/11 machines (as of Windows 10, version 2004), since each process must now call timeBeginPeriod itself if it wants increased timer resolution: https://randomascii.wordpress.com/2020/10/04/windows-timer-r...
https://learn.microsoft.com/en-us/windows/win32/api/timeapi/...
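For reference, a minimal sketch of how a process opts in under the post-2004 per-process semantics (my own illustration, not taken from the linked docs; link against winmm.lib):

    #include <windows.h>   /* pulls in mmsystem.h; link with winmm.lib */

    int main(void) {
        /* Request 1 ms timer resolution. Since Windows 10 2004 this only
           affects the calling process; before that it was system-wide. */
        if (timeBeginPeriod(1) == TIMERR_NOERROR) {
            Sleep(1);          /* now wakes after ~1-2 ms, not ~15.6 ms */
            timeEndPeriod(1);  /* always pair with a matching end call */
        }
        return 0;
    }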
Are you objecting to the fact that that's the case or that browser windows are being used to show HD videos?
Regarding smoothness: you'll need proper frame pacing, and the article doesn't mention how to do that properly. A rough sketch of what I mean follows.
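By frame pacing I mean something like this (an assumption about what a minimal version looks like, not anything from the article): sleep coarsely while the next frame deadline is far away, then busy-wait the last couple of milliseconds so frames start on a fixed cadence. Raising the timer resolution is what makes the Sleep(1) step usable at all:

    #include <windows.h>   /* link with winmm.lib for timeBeginPeriod */

    int main(void) {
        LARGE_INTEGER freq, now, next;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&next);
        const LONGLONG frame = freq.QuadPart / 60;   /* 60 Hz target */

        timeBeginPeriod(1);  /* 1 ms Sleep granularity for this process */
        for (int i = 0; i < 600; i++) {
            /* render_frame();  -- hypothetical placeholder */
            next.QuadPart += frame;  /* fixed deadlines, so error doesn't drift */
            for (;;) {
                QueryPerformanceCounter(&now);
                LONGLONG left = next.QuadPart - now.QuadPart;
                if (left <= 0) break;
                if (left * 1000 / freq.QuadPart > 2)
                    Sleep(1);  /* coarse wait while far from the deadline */
                /* else: spin out the last ~2 ms for precision */
            }
        }
        timeEndPeriod(1);
        return 0;
    }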
Another example of Windows' technical debt sitting there, low-hanging-fruit-wise, to be cashed in by performance-oriented developers. It's interesting that YouTube changing the timer resolution propagates to other processes .. does this hark back to darker days in the MS-DOS era? YouTube, another Turbo button?
$ gunzip -c somefile.tar.gz | tar xvf -
.. because there was, once, a day when the terminal buffer available for this pipe was bigger than available memory offered to a process by default, meaning the thing would unpack faster if I did that versus:
$ tar zxvf somefile.tar.gz
Admittedly, this discrepancy of available memory was usually because the BOFH hadn't realized there was also an allocation for pipes-per-user, so it was a neat trick to get around the hard limits that BOFH had imposed on some of my processes in terms of heap allocation ..
(like make a 20-page Word doc, and start selecting from the first page and drag through - it will go faster if you jiggle. Same in Excel and nearly every Windows app, even Windows Explorer)
It was not giving the impression of being faster, it was faster.
https://retrocomputing.stackexchange.com/questions/11533/why...
The setting in question is the minimum timer resolution. Changing this will only have an impact on applications that depend heavily on that resolution, i.e. it's not some sort of turbo button for general execution speed. In fact according to the docs, a higher resolution can "reduce overall system performance, because the thread scheduler switches tasks more often."
An application whose performance depends on the timer resolution should be setting that resolution itself, using the Win32 API function mentioned in the thread, timeBeginPeriod, which includes the following in its documentation:
> For processes which call this function, Windows uses the lowest value (that is, highest resolution) requested by any process. For processes which have not called this function, Windows does not guarantee a higher resolution than the default system resolution.
> Starting with Windows 11, if a window-owning process becomes fully occluded, minimized, or otherwise invisible or inaudible to the end user, Windows does not guarantee a higher resolution than the default system resolution. See SetProcessInformation for more information on this behavior.
> Setting a higher resolution can improve the accuracy of time-out intervals in wait functions. However, it can also reduce overall system performance, because the thread scheduler switches tasks more often. High resolutions can also prevent the CPU power management system from entering power-saving modes.
https://learn.microsoft.com/en-us/windows/win32/api/timeapi/...
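To make that concrete, here is a small measurement sketch (my own illustration, not from the docs; exact numbers vary by machine, but on the default ~15.6 ms tick the first loop typically shows Sleep(1) taking ~15 ms):

    #include <stdio.h>
    #include <windows.h>
    #pragma comment(lib, "winmm.lib")   /* MSVC; otherwise link winmm manually */

    /* Average how long Sleep(1) actually takes over 20 iterations. */
    static double avg_sleep_ms(void) {
        LARGE_INTEGER f, a, b;
        QueryPerformanceFrequency(&f);
        QueryPerformanceCounter(&a);
        for (int i = 0; i < 20; i++) Sleep(1);
        QueryPerformanceCounter(&b);
        return (b.QuadPart - a.QuadPart) * 1000.0 / f.QuadPart / 20.0;
    }

    int main(void) {
        printf("default resolution: Sleep(1) ~ %.2f ms\n", avg_sleep_ms());
        timeBeginPeriod(1);   /* request 1 ms resolution */
        printf("1 ms resolution:    Sleep(1) ~ %.2f ms\n", avg_sleep_ms());
        timeEndPeriod(1);
        return 0;
    }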
> For processes which have not called this function, Windows does not guarantee a higher resolution than the default system resolution.
There should at least be a mention that changing this resolution can affect other processes.
Is this a bug? It's hard to see it as a feature.
These APIs are from the 90s. In the early 90s, when they were introduced, a global system interrupt firing 1000 times per second could very well have taken a percent or two (or more) off overall CPU performance (people already complained about the "overhead" of having a "real OS").
On the other hand, writing audio players on DOS you had the luxury of receiving your own interrupt within a few samples' worth of audio, which meant you could have very tight audio buffers with lower latency and quicker response to user triggers.
Not having that timing fidelity available would have made Windows a no-go platform for audio software, so giving developers the freedom to enable it when needed was necessary. Removing it within the next 10 years would probably have risked bad regressions.
As a sibling comment noted, they finally removed it during Windows 10's lifespan; with modern CPUs _AND_ multicore, they probably felt safe enough, performance-wise, to isolate high-accuracy threads/processes on separate cores, let the other cores sleep more, and actually win some battery life out of it.
It might not be "perfect engineering", but considering the number of schedulers written for Linux over the years to address desktop (audio) vs. server loads, it was a fairly practical and usable design.
That sorta is what it’s saying. If you don’t set it yourself, you won’t get any better than the “default system resolution”. But if the default system resolution changes (say, by entering a sort of “performance mode” when playing games or watching videos), then it would imply it will affect all processes that are using the default, right?
“Prior to Windows 10, version 2004, this function affects a global Windows setting. For all processes Windows uses the lowest value (that is, highest resolution) requested by any process. Starting with Windows 10, version 2004, this function no longer affects global timer resolution.”
You get it, I get it, but I guarantee you there are a thousand developers for each one of us who won't get it and wonder why the heck things change now and then, without realizing they also need to test their timer-dependent code under less than hygienic conditions in order to validate the results ..
I think this is, technically, a distasteful situation, and whoever wrote those docs kind of wanted to avoid admitting the painful truth and stating outright that changing the timer resolution will have a system-wide impact, because .. really .. why should it? There is no good reason for it. Only bad reasons, imho. Ancient, technical-debt'ish kinda reasons.
I wanted FreeBASIC to have a RAD IDE back when I was still clinging to VB6 as it was being replaced by VB.NET. I hope someday Microsoft open sources bits and pieces of VB6.
After that are the various Visual Basics.
For DOS we are up to 4; hopefully 5 is next, which has interesting TUI apps like DOSSHELL as well as QBasic, EDIT, and QuickHelp.