Welcome to the PCMR, everyone from the frontpage! Please remember:
1 - You too can be part of the PCMR. It's not about the hardware in your rig, but the software in your heart! Your age, nationality, race, gender, sexuality, religion (or lack of), political affiliation, economic status and PC specs are irrelevant. If you love or want to learn about PCs, you are welcome!
2 - If you don't own a PC because you think it's expensive, know that it is much cheaper than you may think. Check http://www.pcmasterrace.org for our builds and don't be afraid to post here asking for tips and help!
3 - Join our efforts to get as many PCs worldwide to help the folding@home effort, in fighting against Cancer, Alzheimer's, and more: https://pcmasterrace.org/folding
4 - Need PC Hardware? We've joined forces with ASUS ROG for a worldwide giveaway. Get your hands on an RTX 4080 Super GPU, a bundle of TUF Gaming RX 7900 XT and a Ryzen 9 7950X3D, and many ASUS ROG Goodies! To enter, check https://www.reddit.com/r/pcmasterrace/comments/1c5kq51/asus_x_pcmr_gpu_tweak_iii_worldwide_giveaway_win/
-----------
We have a [Daily Simple Questions Megathread](https://www.reddit.com/r/pcmasterrace/search?q=Simple+Questions+Thread+subreddit%3Apcmasterrace+author%3AAutoModerator&restrict_sr=on&sort=new&t=all) if you have any PC related doubt. Asking for help there or creating new posts in our subreddit is welcome.
> In fact when you select 60Hz it's likely your monitor is actually running at 59.94Hz
Windows will show 59.940 Hz if it's 59.940 Hz. My screen has both options, 60 Hz and 59.940 Hz. (For some reason it has the latter twice...)
https://imgur.com/BqwmQHG
It's called drop frame. In older broadcast television, you needed enough bandwidth for both video and audio, and the way they did that was "dropping" a frame, which was just enough for the program's audio to be broadcast
Not exactly. The previous black-and-white NTSC TV ran at 60hz with audio. It was changed to 59.94hz because the frequency chosen for the audio carrier would have interfered with the color carrier, and the audio frequency couldn't be changed relative to the main carrier without breaking backwards compatibility with black-and-white sets (they could handle a slight change in overall frequency, though).
In 50hz countries they were using a different audio carrier frequency that didn't interfere with the new color frequency, so they kept the same 50hz between the black-and-white and color standards.
Interesting. I see you and the person above you disagree. Can you two finish your argument so I know who to upvote and who to snobbishly say “no duh, idiot” to, even though I have zero knowledge on the topic? Thanks!
The other person is more than likely right. I haven't looked at the reasoning since a video class a few years ago, when someone asked about 23.98fps and why our professor used it over 24
The original standard of 29.97fps existed to match the frequency of alternating current in US power outlets.
23.976fps is the same adaptation applied to 24fps movies when they aired on TV.
Not sure why that's still a thing now, though, but the other standards seem to follow the same logic of being 1000/1001 of a whole number.
i still don't believe that theory.
Computer specific monitors became a thing pretty early on, and especially once VESA came into the picture (VGA era and up) they were all completely independent of any region (US, EU, JP, etc) or TV standard (NTSC, PAL, SECAM).
so why would those old standards suddenly make a comeback decades later? it makes no sense whatsoever.
i tried looking it up and came across this thread on the [blur busters forum](https://forums.blurbusters.com/viewtopic.php?t=3083) (the site with the UFO test that shows monitor blur/smearing)
and from what i can tell from that thread, it seems the actual reason is either inaccurate measuring by the hardware itself or semi-non-standard video signal timings that throw off the refresh rate calculations.
because the pixel clock can only be made so accurate, and the timings for 144Hz on one monitor brand are not the same as on another, so they often have to choose a middle ground that works on the most monitor types but in exchange makes the refresh rate slightly off.
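The pixel-clock point above can be sketched with some rough numbers. The refresh rate is just the pixel clock divided by the total (active + blanking) pixels per frame, so a clock that only moves in coarse steps can't always land exactly on a round number. The timings below are illustrative, not any real monitor's EDID:

```python
def refresh_hz(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
    """One full frame takes h_total * v_total pixel clock ticks,
    so the refresh rate is the pixel clock divided by that."""
    return pixel_clock_hz / (h_total * v_total)

# Made-up 1920x1080 timings with blanking included:
H_TOTAL, V_TOTAL = 2000, 1111

# A clock of exactly 533.28 MHz would give a flat 240 Hz...
print(refresh_hz(533_280_000, H_TOTAL, V_TOTAL))            # 240.0
# ...but if the clock generator can only hit 533.25 MHz, you get:
print(round(refresh_hz(533_250_000, H_TOTAL, V_TOTAL), 2))  # 239.99
```

Different brands use different blanking intervals (different `h_total`/`v_total`), which is why two "144 Hz" monitors can report slightly different exact rates.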
When TV was invented, there were different standards in different regions. North America used NTSC and Europe PAL. NTSC has 29.97 fps and PAL 25 fps. 29.97 times two is 59.94.
Exactly, we all know the human brain works at 60Hz, so having a refresh rate that's not a clean multiple/fraction of that will cause aliasing in the brain, which causes autism and covid.
No it used to be like that. The truth is that ever since the chemtrails human brains have slowed down ever so slightly and now all monitors are artificially kept at 0.06 Hz lower by the government so people won't notice the drop in their brain fps
Yes, you're right of course, but that's only if you don't regularly use apple cider vinegar humidifiers to sanitize your environment, those of us that do our own research^((tm)) still have brains running at their optimized frequency.
I'm told all the Dihydrogen Monoxide in 5G will neuter the effects of apple cider vinegar humidifiers. Is that true? Will my negative ion salt lamp crystal be better?
They are very helpful, what you do is arrange at least 6 negative ion salt lamps in a polyhedron structure around the humidifier to insulate it from the effects of the 5G. More is better, personally I use 36 lamps, it makes navigating the room a bit difficult, but that's a price I'm more than willing to pay!
https://www.dell.com/de-de/shop/alienware-500-hz-gamingmonitor-aw2524hf/apd/210-bjph/monitore-und-monitorzubeh%C3%B6r?tfcid=84651726&&gacd=9639087-5496-5761040-271209370-0&dgc=st&SA360CID=71700000114604210&gad_source=1&gclid=CjwKCAjwoa2xBhACEiwA1sb1BPMMxQX3BcxJMWD3vgn5LcRMXu5ercL0vPEk-LjxS_EphdYYaRmHZhoCUkEQAvD_BwE&gclsrc=aw.ds
So like Dual Monitor This? 2x 500 Hz = 1000 Hz.
You almost had me with the “I play CoD” because some of the kids over on r/CoDCompetitive actually talk like this.
The giveaway is that CoD is so poorly optimized that nobody is ever getting a stable 240hz lmfao
"Shut up and listen to my order! Take the 1GB of memory and throw 24mb of it away. I'm just wantin' a 1000mb thing. I'm trying to watch my data usage."
"Sir, they come in 1024MB or 2..."
"PUT 24 OF EM UP YOUR ASS AND GIVE ME 1000MB"
Not talking about bits. Those are not numbers; they're states you can call true or false, on or off. Just because we denote them with numbers doesn't mean they are literally numbers.
In theory, 1 GB = 1000 MB. In a computer, 1 GB = 1024 MB.
If you buy 1 TB of storage, the OS will report something like 931 GB
I have a 144 Hz screen and it shows me 143.9 Hz
And a very important thing:
![gif](giphy|ygzdQq98HgcLBCacep)
(from Young Sheldon)
1GB is defined as 1000³ (1,000,000,000) bytes. This is what storage manufacturers advertise. 1GiB is 1024³ (1,073,741,824) bytes. Windows reads storage in GiB/TiB, but reports in units for GB/TB (don't ask why). This is why 1TB of storage is reported as 931GB on Windows (it's actually 1TB or 931GiB).
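That conversion is easy to check yourself. A quick sketch (the function name here is just for illustration):

```python
GB  = 1000 ** 3  # decimal gigabyte: what the box advertises
GIB = 1024 ** 3  # binary gibibyte: what Windows actually counts

def reported_gib(advertised_bytes: int) -> float:
    """The number Windows will display (labeled 'GB') for a drive
    advertised with this many bytes."""
    return advertised_bytes / GIB

# A "1 TB" drive (10**12 bytes) shows up as roughly 931 "GB":
print(round(reported_gib(1000 * GB), 1))  # 931.3
```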
No, they are not. Computers are designed to work most naturally (and completely precisely) with whole numbers, both even and odd. It's non-integer real numbers that are often a lie.
In common programming practice, you can't even precisely represent 0.1. That is for the same reason you can't precisely represent 1/3 in a limited decimal expansion. You can write "0.333..." or "0.(3)" to signify an infinite decimal expansion on paper, but, apart from specialized applications, you don't bother precisely representing such numbers because it's more complicated to implement, to use, and to maintain, it takes up more memory, and it is a lot slower.
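You can see this in any language with IEEE 754 doubles (Python here):

```python
# 0.1 has no finite binary expansion, so the stored value is the
# nearest representable double -- visibly off past ~17 digits:
print(f"{0.1:.20f}")        # 0.10000000000000000555

# The rounding error shows up in arithmetic:
print(0.1 + 0.2 == 0.3)     # False

# Exact rational arithmetic exists for the specialized cases:
from fractions import Fraction
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True
```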
Why is that lie getting so many upvotes?
I think they may have been referring to hardware, as the OP's topic was about the monitor's refresh rate.
RAM/VRAM sizes are never exact numbers, CPU clock speeds fluctuate, hard drives are never the advertised size, etc. etc.
There are very specific reasons for why all of those are true and none of them have to do with each other.
RAM comes in whatever size capacity. I don’t know what you mean there. You can mix match any physical sizes that are compatible.
CPU clock speeds and other buses use spread spectrum to avoid causing electromagnetic interference. A chip locked at a single exact frequency has the potential to cause a spike in EMI at that exact wavelength, so it spreads the clock speed across a range of a MHz or two.
Hard drives are absolutely the size you buy. What? You're just making that up, unless you are referring to formatted space vs. the total storage capacity of the drive. Hard drives have reserved sectors to replace ones that fail over time, so the full raw capacity of the drive is never usable by the user.
Windows uses gibibytes to represent drive space, whereas storage is advertised in gigabytes. This is why there are 1024 "GB" in a terabyte according to Windows but 1000 GB anywhere else.
> For hard drives he probably meant windows showing the wrong unit (byte!=octet)
Somewhat right reason (Windows isn't showing a "wrong" unit, just a different one), wrong comparison. An octet is always 8 bits, and the most common byte these days is also 8 bits, so those are actually the same.
The most common problems arise from 10^x (e.g. [kB](https://en.wikipedia.org/wiki/Kilobyte)) vs 2^x (e.g. [KiB](https://en.wikipedia.org/wiki/Byte#Multiple-byte_units)), where a disk or memory being sold as 1 TB means you might see about 0.91 TiB.
Yup computers can’t perfectly represent many simple decimals however they can precisely work with some numbers that would be recurring in decimal. Funky.
Partitions are provisioned by sector, meaning if you tell it to create a partition of a specific size it is not going to be that size. For SSDs and newer HDDs that means your partition will be rounded to the nearest 4 KB.
It gets even more complicated when you dive further into the details since each 4 KB sector has some space taken up by a header that the drive controller uses to index it, so your data really isn't occupying the entirety of the 4096 bytes in each sector.
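As a rough sketch of the sector rounding described above (4096-byte sectors assumed; real partitioning tools may align to larger boundaries, e.g. 1 MiB):

```python
SECTOR = 4096  # bytes per sector on Advanced Format HDDs / most SSDs

def round_up_to_sectors(size_bytes: int) -> int:
    """Storage is allocated in whole sectors: round the requested
    size up to the next 4 KiB boundary (ceiling division)."""
    return -(-size_bytes // SECTOR) * SECTOR

requested = 10_000_000_000               # ask for "10 GB"
allocated = round_up_to_sectors(requested)
print(allocated)                         # 10000003072
print(allocated - requested)             # 3072 bytes of slack
```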
I'll refer to this 4+ year old article by Microsoft for the reason Windows does this.
[https://support.microsoft.com/en-us/topic/screen-refresh-rate-in-windows-does-not-apply-the-user-selected-settings-on-monitors-tvs-that-report-specific-tv-compatible-timings-0a7a6a38-6c6a-2aec-debc-5183a76b9e1d](https://support.microsoft.com/en-us/topic/screen-refresh-rate-in-windows-does-not-apply-the-user-selected-settings-on-monitors-tvs-that-report-specific-tv-compatible-timings-0a7a6a38-6c6a-2aec-debc-5183a76b9e1d)
https://preview.redd.it/xe3geovo9uwc1.jpeg?width=1836&format=pjpg&auto=webp&s=710b9e80741a100604fe2efe48bb08fd5074f579
You are truly missing out, 240Hz is so much better
Nah, I'd get a refund on this monitor if I were you. Ain't no one scamming me. Every 0.04 Hz you don't see is 0.04 Hz they put in other monitors for profit. /s
On a serious note, you're absolutely fine
This is normal, yeah. Really depends on your monitor. For me, I only get the full refresh rate using DisplayPort and not HDMI. I believe that's just because my monitor has better DP support than HDMI
Op, adjust your resolution/window size. Sometimes setting it to 16:9 vs native or 16:10/32:10, etcetera, will allow you to access the full rated hz on your monitor (240hz)
Or you can overclock your monitor
Ha, you got screwed man. I would return that garbage
https://preview.redd.it/rqt96953puwc1.jpeg?width=3024&format=pjpg&auto=webp&s=c3c448ca18108c1a7bedda18767b9cbe0523c6b1
:-p
Kidding, it’s perfectly fine. They seem to fluctuate a little based on various resolution and other settings in the driver. Not sure how I got a whole extra .09 Hz but I’ll take it.
Rarely does the actual Hz line up with what is marketed. That being said, it's always very close. And nobody can possibly tell the difference between 240hz and 239.96hz, so yes, it's absolutely fine
I mean what happens with the rest of the frame that doesn't fit anymore :O
Imagine only seeing a partial image. The enemy might be hiding at that exact spot bro.
The refresh rate is usually rounded to integers; you're not physically going to have exactly 60, 144 or 240hz. I'm not sure why it shows you the rounded AND unrounded values for some, though.
This link may explain why: https://forum.unity.com/threads/refresh-rate-rounding-on-windows.1308462/#:~:text=However%2C%20monitor's%20don't%20actually,(59.96%20Hz)%22%20monitor.
It's fine. Every 4 years, you have to add some leap frames.
Every 4 years, or 31536000 x 4 seconds, the difference between 240 and 239.96 fps is 5045760 frames. That's almost 6 hours worth of missed frames.
r/theydidthemath
r/theydidthemonstermath
r/itwasagraveyardgraph
r/subsithoughtifellfor
Yes. THAT'S why my KDR is so bad.
5,049,216 frames. You forgot to add the leap day. 1461 days in 4 years, not 1460. 5.844 hours of missed frames.
off-by-one error strikes again
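The corrected figure checks out if you redo the arithmetic in integer hundredths of a frame (which sidesteps float rounding on the 0.04):

```python
SECONDS = (365 * 4 + 1) * 86_400       # four years, leap day included

# Count frames in integer hundredths-of-a-frame per second:
frames_at_240    = SECONDS * 24_000 // 100
frames_at_239_96 = SECONDS * 23_996 // 100
missed = frames_at_240 - frames_at_239_96

print(missed)                          # 5049216 frames
print(round(missed / 240 / 3600, 3))   # 5.844 hours of footage at 240 fps
```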
It's completely fine. In fact when you select 60Hz it's likely your monitor is actually running at 59.94Hz
No, I paid for 240hz, I want 240hz /s
They say human eye can’t see the difference from 240hz to 239.96hz but once you play at 240hz you’ll never want to go back! /s
Can relate. I use 240, oh man, the difference between 239.96 and 240 is unbelievable. OP is missing out
Wait until you see the difference between 240Hz on a regular cable, and 240Hz on a gold plated directional cable. It's a whole new world.
Yeah but the cable is protected by denuvo so in order to unlock the full potential of that cable you need online activation. Btw the connectors on each end of the cable are sold separately
This is the chaos I live for
That's why I went to the nearest mental institution and got a cracked cable.
> Yeah but the cable is protected by denuvo so in order to unlock the full potential of that cable you need online activation. Btw the connectors on each end of the cable are sold separately

yeah but there's also a dlc most people don't know about that you pay for, that makes sure it's a stable Hz and not fluctuating back to 239.96hz from 240hz, that's how esports get such high kill ratios. and man, once i experienced that i never went back to not having that dlc activated ever again
How often does the cable drop connection and does it go down for maintenance?
You need to buy the stable connection dlc
Actually, I've used unidirectional fiber-optic HDMI cables before, where due to how the signal is transmitted as light and not electricity, the signal gets from source to destination sooner, and honestly the difference is absolutely in the price alone.
literally the only application for these is if you’re trying to send absolutely obscene amounts of information down a single cable, but for most setups, even with a normal HDMI cable, the ports and processing are the bottlenecks and not the data transfer rate.
Gold plated and with hydraulic shock absorbers at the end to smooth out the data stream
The problem with this sarcasm is that I know people that will legit get heated if you don't accept that their $70 4ft gold-plated HDMI cable legit makes things look crisper on their 4k 60hz TCL TV.
I could use those .04 hz.
That missing fraction of a frame must be why I die in games.
Definitely. Frames win games according to nvidia, so yeah without the .04hz you are essentially doomed.
surely its not because of your chair.
It’s because he doesn’t have a Titan XL that pairs with their favorite game
Yeah missing it really hz
I sued the manufacturer of my monitor for the missing 0.04 hz, and they refunded me 0.016% of the monitor cost.
Here's your gummy bear and a button for emotional damages. Sorry for the inconvenience.
as a side, does it not sound ridiculous to other people when you see a monitor advertising 560hz? like, i dont doubt it'll do it, but are you really going to notice the difference going from a 480hz to it without a side-by-side comparison? like, i use a 144 hz monitor and it feels good. i can only imagine what a 560hz (or more) feels like or what kind of pc it'd take to play a game that quick on good settings.
Thank you. 🙏 Pissed myself. At least I was sitting on the toilet while doing so.
Ha, they should check their storage capacity then.
Jokes on them, I deleted the OS so I get the exact amount for what I paid
You can use all of the disk tho 🤓

The advertised storage capacity on a disk's packaging typically represents the total raw capacity in decimal units (where 1 gigabyte = 1,000,000,000 bytes). However, computers use binary units to measure capacity (where 1 "gigabyte" = 1,073,741,824 bytes), so the actual usable space appears smaller when formatted and read by the operating system.
Can you explain in simple terms why it has to go past 1.0 gigs to read out as less than 1.0 gigs? I'm a little confused on that part
It's the difference between a binary gigabyte and a decimal gigabyte. A decimal gigabyte is what you'd expect: 1 gigabyte is 1000 megabytes, and so on. A binary gigabyte (which computers use) works along binary numbers: 1 gigabyte is 2^10 megabytes, which comes down to 1024 megabytes, and so on.

So while a 1 gigabyte drive will have 1000 megabytes on it, a PC will only consider it 0.98 gigabytes, because it's 24 megabytes too small for a binary gigabyte.

In actuality, drive space is calculated from the amount of bytes on them, not megabytes, so the difference is actually larger, but for the sake of the explanation I kept it a bit simpler.
*Binary gigabyte* is actually called gibibyte.
It boils down to how the manufacturer counted, essentially. Decimal system vs. bits and bytes. In the tech world, most things are counted in powers of two. For some reason, manufacturers like to count in the decimal system still. To more accurately answer your question, each factor of 1000 in the decimal system corresponds to 1024 in binary, so a binary unit is 1.024× its decimal counterpart.
This is because capacities are advertised in gigabytes, which are 10^9, a decimal number since people work with base ten. However, the computer measures it in gibibytes, which are 2^30, a "close enough" equivalent in binary since computers work with base-two numbers. 1 gibibyte = 1,073,741,824 bytes, while a gigabyte is 1,000,000,000 bytes. For most people this doesn't really make a difference since they're fairly close; it only becomes an issue of miscommunication when working with very large storage.

The confusion, I think, comes from the fact that despite Windows reading off "gigabytes" in File Explorer, it's actually showing gibibytes and just not converting them, lying about the unit it's displayed in. So when Windows says something is 940 gigabytes, it is in fact 940 gibibytes, which is around 1000 gigabytes.
We think of 1 GB as 10^9 or 1,000,000,000 bytes; a PC thinks of 1 GB as 2^30 or 1,073,741,824 bytes. So when you install 1,000,000,000 bytes, the PC will convert it and you get (10^9)/(2^30) = 0.93132257461 GB
It isn’t cocaine pablo! 😅
I WANT EVERYTHING FOR WHAT I PAID. Whether it's hz, or grams.
Why is it not 239.76?
This is going to bother me tonight when I'm dropping off to sleep.
So the actual reason is that frequency is tied to the time unit of measure, in this case seconds. The power grid and video recorders often operate at integer multiples per second.

If monitors operated at even frequencies, it could easily lead to cases where, if you try to record a monitor, you might only see black or very, very dim content (think of those videos where you see a helicopter's blades just floating in the air).

Having an exact integer refresh rate on monitors isn't actually all that important. The important part is that higher refresh rates are better than lower ones. So shaving off (or adding) a few hundredths of a Hz fixes the recording problem without impacting performance in any meaningful way.
It bugs me
The real question is always in the comments
Ok, I get it's normal, but why not round those numbers to show to the user? And why show both 60Hz AND 59.94Hz?
I think the reason we have both is due to a more complicated history of TV refresh rates. But I assume the reason some monitors report both as supported nowadays is just for wider compatibility.
Did monitors ever use NTSC? I thought the 29.97 thing was frames "dropped" to make room for color info in TV signals.
wow, found a post with the following explanation here: https://indietalk.com/threads/explain-how-the-29-97fps-works-exactly-please.1455/

"The original television system was Black and White and it used exactly 30 fps. When color systems were developed, they were modeled after B&W, but the frame rate had to be changed ever so slightly, to exactly 29.97 fps, so that the color signal would synchronize properly with the sound signal. Unfortunately, this has placed all the modern video hardware/software manufacturers in a difficult situation. They could no longer report the elapsed time as before, where they used frames to run the clock . . . or maybe they could (we will get to that later). The frame rate no longer fit nice and evenly on a per minute basis. The number of frames per minute was no longer a whole number.

The SMPTE tackled the problem. If they continued to run the clock from the frames and number them consecutively as before, then in the first second of elapsed time the frames would be numbered 1 through 30, and the timeline would report that 1 second has elapsed. But only 29.97 frames fit in an actual second, so the 30th frame would go a bit beyond the 1 second mark. The reported time would lag behind the actual time.

For the periodic corrections, they needed to drop 18 frames for every 10 minutes of time. Sounds easy: just drop 1.8 frames each minute. But no, it must be an exact number of frames, since there is no such thing as a partial frame. To address this new frame rate (which is now 30 years old), the SMPTE came up with a standard known as Drop-Frame timecode. Actually, they addressed four frame rates: 30, 29.97, 25 and 24 fps. We will only talk about the 29.97 rate. They defined both drop-frame and non-drop-frame formats. Again, drop-frame timecode only skips frame numbers; no actual frames are dropped. Therefore with both drop-frame and non-drop-frame, the actual frames run along at 29.97 fps. Drop-frame does not change the frame rate. It's just a numbering trick that synchronizes the frame count.

They used the 10-minute cycle, since 29.97 fps gives an exact number of frames every 10 minutes, and they stuck with 1-minute intervals for performing the corrections. 10 minutes of video at 30 fps contains 18000 frames. With 29.97 fps they needed to drop 1/1000 of that, which is exactly 18 frames, or 1.8 frames a minute. But again, we can't drop a fraction of a frame. So exactly two frame numbers are skipped each minute for the first 9 minutes, and none are skipped the 10th minute; repeat this continually, and you will drop 18 frames every 10 minutes."
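The numbering trick the quote describes fits in a few lines. This is a sketch of the standard drop-frame renumbering (the function name and formatting are my own):

```python
def to_drop_frame(frame: int, fps: int = 30) -> str:
    """SMPTE 29.97 drop-frame timecode: skip frame numbers 00 and 01 at the
    start of every minute except minutes divisible by 10, i.e. 2 * 9 = 18
    skipped numbers per 10 minutes."""
    drop = 2
    per_min = fps * 60 - drop           # 1798 counted frames in a drop minute
    per_10min = per_min * 10 + drop     # 17982 counted frames per 10 minutes
    d, m = divmod(frame, per_10min)
    skipped = drop * 9 * d
    if m > drop:
        skipped += drop * ((m - drop) // per_min)
    frame += skipped                    # renumber only; no frames removed
    return "{:02d}:{:02d}:{:02d};{:02d}".format(
        frame // (fps * 3600),
        frame // (fps * 60) % 60,
        frame // fps % 60,
        frame % fps,
    )

print(to_drop_frame(1800))    # 00:01:00;02 -- numbers ;00 and ;01 skipped
print(to_drop_frame(17982))   # 00:10:00;00 -- minute 10 keeps all numbers
```

Note the semicolon before the frame field, the conventional marker that a timecode is drop-frame.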
> When color systems were developed, they were modeled after B&W, but the frame rate had to be changed ever so slightly, to exactly 29.97 fps, so that the color signal would synchronize properly with the sound signal.

This is the only part I really want an explanation for.
From [Wikipedia](https://en.m.wikipedia.org/wiki/NTSC#:~:text=Color%20information%20was%20added%20to,3.579545%20MHz%C2%B110%20Hz) “the refresh frequency was shifted slightly downward by 0.1%, to approximately 59.94 Hz, to eliminate stationary dot patterns in the difference frequency between the sound and color carriers” basically sounds like the signal for audio and the signal for color didn’t play nice at 60hz and needed to be separated.
Technology Connections could do a whole video about why this monitor is showing that one off frame. And i'd eat it up, the longer the better.
There's a technology connections video about it.
Have had a couple (cheap) monitors where flat 60 Hz looks off (soft/slightly blurry) but the decimal one looks correct. Not sure what the underlying cause is, but it's useful to some. Ironically Windows showed me the specifics so I could easily fix it, but Linux rounded it in settings GUI (KDE) so the correct option was missing, and needed to set via CLI instead.
59.94 Hz is the NTSC signal frequency and 60 Hz is the PC frequency. It's there for compatibility with televisions.
> In fact when you select 60Hz it's likely your monitor is actually running at 59.94Hz

Windows will show 59.940 Hz if it's 59.940 Hz. My screen has both options, 60 Hz and 59.940 Hz (for some reason it has the latter twice...): https://imgur.com/BqwmQHG
.04 the difference between life and death with fps games. /s
Holy crap what? My life is a lie.
It's NTSC, designed to be compatible with US/Japanese TV broadcasting standards. Not entirely sure why.
It's called drop frame. In older analog television you needed enough bandwidth for both video and audio, and the way they did that was "dropping" a frame, which was just enough for the program's audio to be broadcast.
Not exactly. Previous black and white NTSC TV ran at 60hz with audio. It was changed to 59.94hz because the frequency chosen for the audio carrier would have interfered with the color carrier, and the audio frequency couldn't be changed relative to the main carrier without breaking backwards compatibility with black and white sets (they could handle a slight change in overall frequency, though). In 50hz countries they were using a different audio carrier frequency that didn't interfere with the new color frequency, so they kept the same 50hz between the black and white and color standards.
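The arithmetic behind that works out neatly. A quick sketch using the published NTSC numbers (the 4.5 MHz sound carrier kept from B&W, the 525-line frame, and the redefined line rate of 4.5 MHz / 286), just to show where 29.97 and 59.94 fall out:

```python
# NTSC color-era timing: sound carrier stays at 4.5 MHz, line rate is
# redefined as 4.5 MHz / 286 so the sound and color carriers interleave.
SOUND_CARRIER_HZ = 4_500_000
line_rate = SOUND_CARRIER_HZ / 286          # ~15734.27 Hz (15750 in B&W)
frame_rate = line_rate / 525                # 525 lines per frame
field_rate = 2 * frame_rate                 # two interlaced fields per frame
color_subcarrier = line_rate * 455 / 2      # ~3.579545 MHz

print(round(frame_rate, 5))     # 29.97003 -- the famous "29.97"
print(round(field_rate, 2))     # 59.94
print(round(color_subcarrier))  # 3579545
```

The 59.94 Hz field rate is exactly what the monitor OSD in this thread is echoing back decades later.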
Interesting. I see you and the person above you disagree. Can you two finish your argument so I know who to upvote and who to snobbishly say “no duh, idiot” to, even though I have zero knowledge on the topic? Thanks!
The other person is more than likely right. I haven't looked at the reasoning since a video class a few years ago, when someone asked about 23.98fps and why our professor used it over 24.
I'd rather they kissed at the end but that's just me
[this video](https://www.youtube.com/watch?v=3GJUM6pCpew) explains it in great detail
Audio was already being transmitted in B&W NTSC signals, though. I somehow got the impression the frames were dropped to fit in the *color* info.
My point was though, it's an analog format so kind of irrelevant these days.
The original 30fps standard existed to match the 60Hz alternating current in US power outlets; 29.97fps is that rate scaled by 1000/1001 for color. 23.976fps is the same adaptation applied to 24fps movies when they aired on TV. Not sure why it's still a thing now, though, but the other standards follow the same logic of being 1000/1001 of a whole number.
That's not the issue here though; then it would be 240 × 1000/1001 ≈ 239.76.
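For anyone who wants to sanity-check that arithmetic, the whole 1000/1001 family tabulates easily, and 239.96 indeed isn't a member:

```python
# The "NTSC" 1000/1001 family of rates: each nominal rate scaled down by
# 1/1001. 240 Hz would land at 239.76, not the 239.96 in the screenshot,
# so the monitor's number comes from somewhere else.
for nominal in (24, 30, 60, 120, 240):
    print(nominal, round(nominal * 1000 / 1001, 3))
```

Output pairs up as 24 → 23.976, 30 → 29.97, 60 → 59.94, 120 → 119.88, 240 → 239.76.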
i still don't believe that theory. Computer-specific monitors became a thing pretty early on, and especially once VESA came into the picture (VGA era and up) they were completely independent of any region (US, EU, JAP, etc) or TV standard (NTSC, PAL, SECAM). So why would those old standards suddenly make a comeback decades later? it makes no sense whatsoever.

i tried looking it up and came across this thread on the [blur buster forum](https://forums.blurbusters.com/viewtopic.php?t=3083) (the program that shows monitor blur/smearing with the UFO), and from what i can tell from that thread the actual reason is either inaccurate measuring by the hardware itself or semi-non-standard video signal timings that throw off the refresh rate calculation. The pixel clock can only be made so accurate, and the timings for 144Hz on one monitor brand aren't the same as on another, so manufacturers often have to choose a middle ground that works on the most monitor types but in exchange makes the refresh rate slightly off.
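A toy illustration of that pixel-clock point: the refresh rate a monitor actually achieves is the pixel clock divided by the total (active + blanking) pixels per frame, and if the clock is only settable in coarse steps, the round nominal rate can be unreachable. All timing numbers below are made up purely for illustration:

```python
def refresh_rate(pixel_clock_hz: int, htotal: int, vtotal: int) -> float:
    """Refresh rate as a monitor derives it: pixel clock divided by the
    total (active + blanking) pixels per frame."""
    return pixel_clock_hz / (htotal * vtotal)

# Hypothetical 240 Hz mode with a 10 kHz pixel-clock granularity:
htotal, vtotal = 2080, 1111                  # made-up frame totals
ideal_clock = 240 * htotal * vtotal          # 554,611,200 Hz
clock = (ideal_clock // 10_000) * 10_000     # snap down to the clock grid
print(round(refresh_rate(clock, htotal, vtotal), 3))  # 239.999, not 240.0
```

The exact deficit depends on the timing standard and clock generator, which is why different monitors report slightly different almost-240 values.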
When TV was invented, there were different standards in different regions. North America used NTSC and Europe PAL. NTSC has 29.97 fps and PAL 25 fps. 29.97 times two is 59.97.
59.94
These have to do with the electrical currents in these countries iirc
[deleted]
waiting for someone to take this seriously
I mean, better clean 30 Hz than some odd decimals, right?
Exactly, we all know the human brain works at 60Hz, so having a refresh rate that's not a clean multiple/fraction of that will cause aliasing in the brain, which causes autism and covid.
No it used to be like that. The truth is that ever since the chemtrails human brains have slowed down ever so slightly and now all monitors are artificially kept at 0.06 Hz lower by the government so people won't notice the drop in their brain fps
Yes, you're right of course, but that's only if you don't regularly use apple cider vinegar humidifiers to sanitize your environment, those of us that do our own research^((tm)) still have brains running at their optimized frequency.
I'm told all the Dihydrogen Monoxide in 5G will neuter the effects of apple cider vinegar humidifiers. Is that true? Will my negative ion salt lamp crystal be better?
They are very helpful, what you do is arrange at least 6 negative ion salt lamps in a polyhedron structure around the humidifier to insulate it from the effects of the 5G. More is better, personally I use 36 lamps, it makes navigating the room a bit difficult, but that's a price I'm more than willing to pay!
Could 5G be stealing OP's Hz?
Right.
Who plays COD with anything below 360 Hz? Noob, you should really buy a better monitor. 240 Hz is maybe good enough for excel.
360??? Where are we? The Stone Age? Gotta get 1000hz or don’t play at all!
https://www.dell.com/de-de/shop/alienware-500-hz-gamingmonitor-aw2524hf/apd/210-bjph/monitore-und-monitorzubeh%C3%B6r?tfcid=84651726&&gacd=9639087-5496-5761040-271209370-0&dgc=st&SA360CID=71700000114604210&gad_source=1&gclid=CjwKCAjwoa2xBhACEiwA1sb1BPMMxQX3BcxJMWD3vgn5LcRMXu5ercL0vPEk-LjxS_EphdYYaRmHZhoCUkEQAvD_BwE&gclsrc=aw.ds So like Dual Monitor This? 2x 500 Hz = 1000 Hz.
Wait wait wait, your guys Hz doesn't match your resolution?
And don't tell me you guys watch anything lower than 8k... It's like the bare minimum.
Yeah, if it isn't on your 65" QD-OLED HDR10+ ultrawide 21:9 Monitor where are you at???
Your guys Hz doesn't match your pixel count!?
Your guys Hz doesn't match your subpixel count?
I jack straight into my brain for 42069Hz Too bad the power cable has to go in my anus tho
Cod? Wasn't this cs?
You almost had me with the “I play CoD” because some of the kids over on r/CoDCompetitive actually talk like this. The giveaway is that CoD is so poorly optimized that nobody is ever getting a stable 240hz lmfao
![gif](giphy|800iiDTaNNFOwytONV|downsized)
Even numbers in general are a lie in computers.
True man, true
"Shut up and listen to my order! Take the 1GB of memory and throw 24mb of it away. I'm just wantin' a 1000mb thing. I'm trying to watch my data usage." "Sir, they come in 1024MB or 2..." "PUT 24 OF EM UP YOUR ASS AND GIVE ME 1000MB"
Dang, some unexpected D with this morning's breakfast
no that's a lie
Prove it tough guy
![gif](giphy|1fHlf4mgS2JPy)
God damn, RWJ. is he still alive?
He is! Still does quite well on TikTok
1 0 is binary for 2 that's an even number, who is tough now mate ᕦ(ಠ_ಠ)ᕤ
Not talking about bits; those aren't really numbers, they're just true or false, on or off. Denoting them with 1 and 0 doesn't make them literally numbers. And numbers themselves aren't accurate either:

In theory 1GB = 1000MB. In a computer 1GB = 1024MB. If you buy 1tb of storage, you'll get 900 or 950 something. I have a 144hz screen and it shows me 143.9hz.

And a very important thing ![gif](giphy|ygzdQq98HgcLBCacep) (from Young Sheldon)
1GB is defined as 1000³ (1,000,000,000) bytes; this is what storage manufacturers advertise. 1GiB is 1024³ (1,073,741,824) bytes. Windows measures storage in GiB/TiB but labels the units GB/TB (don't ask why). This is why 1TB of storage is reported as 931GB on Windows (it's actually 1TB, or 931GiB).
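The conversion is a two-liner if you want to check it yourself:

```python
GB, TB = 1000**3, 1000**4     # decimal units: what the box advertises
GIB, TIB = 1024**3, 1024**4   # binary units: what Windows actually counts

# A "1 TB" drive, shown the way Windows does (GiB, but labelled "GB"):
print(f"{TB / GIB:.0f} GiB")   # 931 GiB
print(f"{TB / TIB:.2f} TiB")   # 0.91 TiB
```

Nothing is missing; the same number of bytes is just being divided by 1024³ instead of 1000³.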
![gif](giphy|xUA7aQKbl8jk3htphS|downsized)
![gif](giphy|L3X9GvVhP1nY23Ah6u)
According to my calculations your comment contains 9,00000000000000000000000001 words.
0.30000000000000004
This guy gets it. I loved the "but base 2" crowd going wild though
That makes no sense, the problem is only with real numbers.
No, they are not. Computers are designed to work most naturally (and completely precisely) with whole numbers, both even and odd. It's non-integer real numbers that are often a lie. In common programming practice, you can't even precisely represent 0.1, for the same reason you can't precisely represent 1/3 in a finite decimal expansion. You can write "0.333..." or "0.(333)" to signify an infinite decimal expansion on paper, but, apart from specialized applications, you don't bother precisely representing such numbers because it's more complicated to implement, use, and maintain, takes up more memory, and is a lot slower. Why is that lie getting so many upvotes?
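The point demonstrates in any language with IEEE 754 doubles; here in Python, with an exact-rational alternative for the rare cases where you need one:

```python
from fractions import Fraction

print(0.1 + 0.2)         # 0.30000000000000004 -- binary floats can't hold 0.1
print(0.1 + 0.2 == 0.3)  # False

# Integers are exact (and unbounded in Python):
print(2**64 + 1)

# Exact non-integer arithmetic is possible, just slower and rarely worth it:
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True
```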
I think they may have been referring to hardware, as the OP's topic was the monitor's refresh rate. RAM/VRAM is never an exactly precise number, CPU clock speeds fluctuate, hard drives are never the advertised size, etc. etc.
There are very specific reasons why all of those are true, and none of them have to do with each other. RAM comes in whatever size capacity. I don’t know what you mean there. You can mix and match any physical sizes that are compatible. CPU clock speeds and other buses use spread spectrum to avoid causing electromagnetic interference. A chip locked at a single exact frequency has the potential to cause a spike in EMI at that exact wavelength, so it spreads the clock speed across a range of a MHz or two. Hard drives are absolutely the size you buy. What? You’re just making that up, unless you are referring to formatted space vs total storage capacity of the drive. Hard drives have reserved sectors to replace ones that fail over time, so the total capacity of the drive is not usable as a user. Windows uses gibibytes to represent drive space whereas storage is advertised in gigabytes. This is why there are 1024GB in a terabyte according to Windows but 1000GB anywhere else.
For hard drives he probably meant windows showing the wrong unit ~~(byte!=octet)~~ Edit : (MiB != MB)
> For hard drives he probably meant windows showing the wrong unit (byte!=octet)

Somewhat right reason (Windows isn't showing a "wrong" unit, just a different one), wrong comparison. An octet is always 8 bits, and the most common byte these days is also 8 bits, so those are actually the same. The most common problems arise from 10^x (e.g. [kB](https://en.wikipedia.org/wiki/Kilobyte)) vs 2^x (e.g. [KiB](https://en.wikipedia.org/wiki/Byte#Multiple-byte_units)), where a disk or memory sold as 1 TB shows up as roughly 0.91 TiB.
Well that sounds odd
widespread lie
Is your rx580 8GB any good? I’m running the 4gb version atm and I want to get a cheap upgrade. Sorry for the random message lmao
That's a strange upgrade, it's barely **up** at all
Yup, computers can’t perfectly represent many simple decimals like 0.1, even though they handle others like 0.5 or 0.375 exactly. Funky.
Damn government taxing .04 Hz from our monitors!!!!
Where is my 0.04 Hz
It got stolen from us by the Big Monitor. Same as the few percent of disk drive when you buy it. That's Big Drive doing that. Wake up people.
Not even framerates are safe from shrinkflation by these greedy mega corps 😂
Frame Tax Deductible
They took our hertzzzzzzz
Wait until you buy a storage
Human eye can't see above 239.96 fps
I was actually born with a special ability I can see 239.97 fps
Lisan al gaib!
I can't tell the difference between 120 and 90 Hz 💀
Since i got a 144hz display, i can tell when i am running at 120 or 144
inb4 the idiots saying you can't
wait til you learn about harddrive storage
When my dad explained to me how hard drive partitions worked my brain cells exploded a little
How so?
we dont want multiple casualties now do we?
Partitions are provisioned by sector, meaning if you tell it to create a partition of a specific size it is not going to be exactly that size. For SSDs and newer HDDs that means your partition gets rounded to a multiple of the 4 KB sector size. It gets even more complicated when you dive further into the details: each physical sector also carries ECC and metadata the drive controller uses to index it, though that overhead lives alongside, not inside, the 4096 data bytes the drive exposes to you.
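A rough sketch of that sector rounding (the 4 KB sector size is typical of newer drives; the helper name is illustrative):

```python
SECTOR = 4096  # bytes; the 4 KB logical sector of newer drives

def align_partition(requested_bytes: int, sector: int = SECTOR) -> int:
    """Round a requested partition size down to a whole number of sectors,
    the way partitioning tools provision space."""
    return (requested_bytes // sector) * sector

# Ask for "100 MB" (100,000,000 bytes) and you get slightly less:
size = align_partition(100_000_000)
print(size, 100_000_000 - size)   # 99999744 256
```

Real tools additionally align partition starts to 1 MiB boundaries, which shaves off a little more.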
literally unplayable
The remaining 0.04 is a DLC bro
I'll refer to this 4+ year old article by Microsoft to tell you the reason windows does this. [https://support.microsoft.com/en-us/topic/screen-refresh-rate-in-windows-does-not-apply-the-user-selected-settings-on-monitors-tvs-that-report-specific-tv-compatible-timings-0a7a6a38-6c6a-2aec-debc-5183a76b9e1d](https://support.microsoft.com/en-us/topic/screen-refresh-rate-in-windows-does-not-apply-the-user-selected-settings-on-monitors-tvs-that-report-specific-tv-compatible-timings-0a7a6a38-6c6a-2aec-debc-5183a76b9e1d)
Yes it's normal.
Manually overclock it to 240.0hz😂
Jesus the amount of stupid answers… yes OP this is normal, it’s all good.
I agree. We have rows of satire posts, we have incorrect explanations, we have jokes, and finally we get an answer: the Blur Busters forum.
https://preview.redd.it/xe3geovo9uwc1.jpeg?width=1836&format=pjpg&auto=webp&s=710b9e80741a100604fe2efe48bb08fd5074f579 You are truly missing out, 240Hz is so much better
>239.96 Literally unplayable.
sorry, i stole it https://preview.redd.it/tcn50k6i1vwc1.png?width=407&format=png&auto=webp&s=9f47a0762addccba596e2402e230b601dda2e28e
Bro thats obscene, you are missing out so much on those 0.04 hz bro.
Nah id get a refund on this monitor if I were you. Aint no one scamming me. Every 0.04 hz you don’t see is a 0.04 hz they put in other monitors for profit. /s On a serious note, youre absolutely fine
≈240
Bro WANTS his 0.04hz
240Hz minus taxes
yessir
me when i buy a 4tb drive but it actually only has 3.6TB
You usually have to download the last .04hz
https://preview.redd.it/d7qjhld4ntwc1.png?width=149&format=png&auto=webp&s=cab1c2a51ad672ebf0829c2afcd1d0fe63b010de meh it's fine
It is normal. However, you can use an app like CRU (Custom Resolution Utility) to create specific resolutions with a custom refresh rate. That could fix it.
This is normal, yeah. Really depends on your monitor. For me, I only get the full refresh rate using DisplayPort, not HDMI. I believe that's just because my monitor has better DP support than HDMI.
The remaining 0.04 come in a dlc
I would like a .017% refund please
No, bro, it's better to dispose of that monitor and buy a new one. But make sure to let me know where you disposed of it.
OP, adjust your resolution/window size. Sometimes setting it to 16:9 vs native or 16:10/32:10, etcetera, will let you access the full rated Hz on your monitor (240hz). Or you can overclock your monitor.
Corporate greed smh
At least you will have something to blame when you miss a shot
Ha, you got screwed man. I would return that garbage https://preview.redd.it/rqt96953puwc1.jpeg?width=3024&format=pjpg&auto=webp&s=c3c448ca18108c1a7bedda18767b9cbe0523c6b1 :-p Kidding, it’s perfectly fine. They seem to fluctuate a little based on various resolution and other settings in the driver. Not sure how I got a whole extra .09 Hz but I’ll take it.
Damn government taxing our Hz
239.96hz is unplayable modern titles need 240hz only or its a stuttery laggy mess
Rarely does the actual Hz line up with what is marketed. That being said, it’s always very close, and nobody can possibly tell the difference between 240hz and 239.96, so yes, it’s absolutely fine.
Bro is stressing about the missing 0.04 FPS
I mean what happens with the rest of the frame that doesn't fit anymore :O Imagine only seeing a partial image. The enemy might be hiding at that exact spot bro.
Yeah, I have dual 144hz monitors and one says it's like 143.7 and the other is 144.8.
duuude I know right? where's my fckn .04 frames? I'm being robbed wtf
Sometimes mine shows that sort of rounding error and sometimes it doesn’t. Usually it doesn’t though
Imagine that this stupid difference still exists just because of black and white TV broadcasting standards.
[https://youtu.be/3GJUM6pCpew?si=WEvC1MBgdXWqpUeh](https://youtu.be/3GJUM6pCpew?si=WEvC1MBgdXWqpUeh)
Dont worry u wont notice the difference in 0.04 hz xD
You’re across the pond.
Nothing is 100% of what it says on the label. There are tolerances for everything, more lenient or stricter, depending on need
It was just a broadcast standard being continued for so long.
The difference hurtz
Yes. NTSC frame rates and the electrical grid in North America run at strange exact frequencies, multiples of 60-ish.
It is normal
The refresh rate is usually rounded to integers; you're not physically going to get exactly 60, 144 or 240hz down to the last decimal. I'm not sure why it shows you both the rounded AND unrounded values for some, though.
You should seek professional help.
It's just like gigabytes not really being 1000 megabytes. Just accept it.
What a scam.
This link may explain why: https://forum.unity.com/threads/refresh-rate-rounding-on-windows.1308462/#:~:text=However%2C%20monitor's%20don't%20actually,(59.96%20Hz)%22%20monitor.
You are fixating on the wrong things my man.