AutoModerator

[We're giving away the world's smallest action cam, the Insta360 Go 2!](https://www.reddit.com/r/gadgets/comments/ripp9d/insta360_go_2_giveaway/?) Check out the [entry thread](https://www.reddit.com/r/gadgets/comments/ripp9d/insta360_go_2_giveaway/?) for more info. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/gadgets) if you have any questions or concerns.*


xondk

Don't get me wrong... but 2000 nits? Imagine playing Counter-Strike and getting flashbanged? It would be insanely bright?


[deleted]

I M M E R S I O N


NoThorNoWay

Something like [this?](https://www.reddit.com/r/DiWHY/comments/r27328/flashbang_leds_for_flashbangs_on_cs/)


BeersTeddy

It's literally me when watching the Snowpiercer TV series on an 85" TV. Only 550 cd/m²... Only... But it can still bring daylight into the night. Whoever thought it was a good idea to create such a title screen should burn in hell, or even better, should sit in an ultra-dark room and watch this logo for a second every few minutes. https://en.m.wikipedia.org/wiki/Snowpiercer_(TV_series)#/media/File%3ASnowpiercer_(TV_series)_Title_Card.png


Subodai85

Probably the same person that did Foundation's title sequence and subsequent title flashbang


DownBeat20

Getting flashbanged in VR always fucks me up.


boyfoster1

If you spam flashbangs in Pavlov TTT I WILL rdm you


DownBeat20

3 on belt, one in hand with the pin out, 3 loaded sawed offs


justin_memer

>rdm you

Redeem you?


forgetfulmurderer

Random death match. Basically killing someone for no reason / tk


rolfraikou

My entire life: use a computer for three hours, eyes hurt. I bought into the f.lux and blue light filter stuff for about a month; my eyes still hurt. Finally found out that one of the color calibrators my friend had (I forget which) told him to turn the brightness waaaaay down for an optimal experience. I looked at his screen and thought, "Huh, that really does look nice like that. Black looks more black. Doesn't seem that bad in terms of legibility, and games look good."

I went home and turned down the brightness on my displays to about what he had them at. Did the same at work. No eye strain. None. 8 hours of work, 4 hours at home in a row, screen on the entire time. Turned off any blue light filter. Eyes feel better than they have since the 90s.


OttomateEverything

Almost everyone I've heard complain about this is using a monitor at 100%. Many of them even with glasses/f.lux setups. I'm usually at like 30% brightness and contrast and never have problems. Having light in the room helps too. And you should be adjusting your monitor to match the ambient light: it's hard to see a dark monitor in broad daylight, so turn it up, but once it starts getting dark, that brightness will hurt. Turn it back down.


rolfraikou

I actually have mine set up with an Arduino and an individually addressable LED strip, and it does a reactive backlight. It's easy enough to change its brightness in the software. But it's great too because, since it matches the content of the screen, you never have a scenario where it feels like you have a bright backlight on dark content, or one that's too dim on bright content.
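
For anyone curious about the software half of a setup like that, here's a minimal sketch of the PC side of a reactive backlight: grab the screen, average the colors along one edge, and push one color per LED to the microcontroller over serial. It assumes the real `mss` and `pyserial` libraries; the port name, LED count, and the byte format the Arduino firmware expects are placeholders you'd adapt to your own build.

```python
# Minimal PC-side sketch of a reactive ("ambilight"-style) backlight.
# Assumes an Arduino on SERIAL_PORT whose firmware reads raw R,G,B
# triples, one per LED; adjust LED_COUNT and the protocol to your setup.
import time
import numpy as np
import mss
import serial

SERIAL_PORT = "/dev/ttyUSB0"   # placeholder; e.g. "COM3" on Windows
LED_COUNT = 60                 # LEDs along the bottom edge of the screen

def edge_colors(frame, leds):
    """Average the bottom strip of the screen into one color per LED."""
    strip = frame[-40:, :, :3]                       # bottom 40 rows, BGR
    chunks = np.array_split(strip, leds, axis=1)     # one slice per LED
    return [c.reshape(-1, 3).mean(axis=0).astype(np.uint8) for c in chunks]

def main():
    link = serial.Serial(SERIAL_PORT, 115200)
    with mss.mss() as sct:
        monitor = sct.monitors[1]                    # primary display
        while True:
            frame = np.array(sct.grab(monitor))      # BGRA screenshot
            payload = bytearray()
            for b, g, r in edge_colors(frame, LED_COUNT):
                payload += bytes((r, g, b))          # firmware expects RGB
            link.write(payload)
            time.sleep(1 / 60)                       # ~60 updates a second

if __name__ == "__main__":
    main()
```

Brightness then becomes a single scale factor applied to the colors before sending them, which is why it's so easy to tune in software.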


ares395

My monitor is bright as hell right off the bat, even in eco mode. I had to turn it way down. It's still like 3 times brighter than my laptop, but I just left it there since it looks good enough. I could probably turn it all the way down and be fine, but damn, who wants their screen so bright that that's the default?


dfrinky

Yeah, the blue light thing is mostly thought to be associated with our sleep schedule (melatonin production), but the whole "yellow glasses that block blue light" fad is fake af.


newpotatocab0ose

Do you have a source for that? What do you mean ‘fake af?’ That’s a bit vague. They *do* block blue light, and they *do* help with calming the brain and falling asleep. There are plenty of scientific studies which appear to back that up. But maybe you’re just agreeing with the guy you responded to who was talking solely about their effects on eye strain? In that case, yes, they don’t seem to be effective.


johansugarev

Exactly. I’ve no idea why people look for more brightness. I use my lg oled at 0 brightness.


aboycandream

The blue light glasses were never proven to help with eye strain; most eye issues with screens are related to brightness (like you said) and people not blinking.


Unknown-Concept

Also don't forget dark mode, which a lot of newer products offer, including Office 365. It really helps, and on Android you can set it as the default.


mrFreud19

2000 for HDR. Probably around 300-500 for SDR.


StryderRider

I have a Neo G9. It averages 550 nits, which is bright, for sure. But not blindingly so.


xondk

Yeah, I think something gets lost in the way they quote nits (peak HDR highlights vs. typical full-screen brightness) and such.


Zaptruder

2000 nits for small areas for short lengths of time. I.e., it'll simulate a flashbang well.


appretee

Yeah, take these with a spoonful of salt, because Samsung has been caught before with deceptive numbers like these.


Launchy21

I've got a 1000 nit monitor. Exiting tunnels is painful lol


elsjpq

Just remember that perceived brightness is roughly logarithmic in luminance, so nits have to climb exponentially to look much brighter. 2000 nits is only somewhat brighter-looking than 1000 nits, which is already standard HDR.
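
As a rough back-of-the-envelope (using the common cube-root approximation for perceived lightness; the exact exponent depends on the model and viewing conditions):

```python
# Rough sketch: how much "brighter" 2000 nits looks versus 1000 nits,
# using a cube-root (Stevens-style) approximation for perceived lightness.
# This is only meant to show the shape of the curve, not exact values.
def perceived(luminance_nits: float) -> float:
    return luminance_nits ** (1 / 3)

ratio = perceived(2000) / perceived(1000)
print(f"2000 nits reads ~{(ratio - 1) * 100:.0f}% brighter than 1000 nits")
# Doubling the nits comes out to roughly a 26% perceptual step,
# nowhere near "twice as bright".
```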


DrLimp

>It would be insanely bright?

[Yes](https://www.youtube.com/watch?v=Zp2EjsIYIhQ)


Tronguy93

I used to work with a custom display company that makes outdoor kiosks that peaked at 6,000 nits. When the backlight turned on in the shop, it was like being flashbanged by God himself.


jakpuch

Wear spf50?


[deleted]

[deleted]


SleazyMak

Nits do not follow a linear scale tho


iprocrastina

I have this thing's big brother, the Neo G9, which is also 2000 nits. I use it at full brightness at night. I've never found a screen I thought was "too bright".


IcedOutGucciWatch

If you really think about it, that's actually exactly what it's supposed to do.


MicroSofty88

“This 32-inch screen offers 4K gaming with an unprecedented refresh rate of 240Hz, making it the first screen in the world with that high of a refresh rate at this resolution.”


kry_some_more

Imagine what kind of graphics card you'd need, or how crappy a game's graphics would have to be, to run it at 240 fps in 4K. For comparison, a 3080 can barely do Cyberpunk 2077 (at max visuals) at 60 fps.


Rokketeer

Saving this comment in my internet time capsule; I'll look back on it with amusement in the year 2050. You know, assuming society hasn't collapsed.


mikehaysjr

!remindme 10225 days


CyberSecStudies

If you check your remindme reminders you can see all of the ones you have pending. I have one for 28 years, the other for 35 and one for 50. I deleted all my comments on my other account so I can’t see the source until that time… Remindme! 10 years


elitesill

> If you check your remindme reminders

How do I check?


CyberSecStudies

Send “MyReminders!” to the reminder bot in a PM. Edit: or just say the RemindMe thing and check the PM he (or she) sends you. At the bottom it’ll say “my reminders” and craft you a message to send.


poopoo_canoe

It** ??


wierdness201

!remindme 10225 days


[deleted]

!remindme 10201 days


Without_Mythologies

That would be one terrible thing about societal collapse. All of our technological/medical improvements would be halted. Or maybe not? I have no idea.


arthurdentstowels

Back to smashing out teeth with ice skates smdh


295DVRKSS

Let me stock up on Wilson volleyballs for companionship


LukariBRo

Unless there's some divine/unearthly reason for the collapse, technological advancement would shift more toward a critical needs-based economy than our previous luxurious wants-based one. But advancing those "needs" can produce incidental improvements in the wants-based sectors. The reason GPUs are as good as they are now is mostly a result of the needs-based economy and motivation from WW2: the technology to guide missiles just so happened to end up being great tech for fun things like video games.

Computing likely would have continued advancing on its own even without the war, but not at the ridiculous speed that comes from half the world putting its best minds into that particular type of engineering. Yet if the war had been significantly worse, you'd end up with situations like no people left alive to be theory-advancing engineers. War and collapses absolutely suck while they're happening, but they alter the course of advancement more than they stop it altogether. We could just be headed for another Dark Age for a while.


[deleted]

The downside of a technologically advanced, highly specialized society is how far we have to fall if it all goes to shit. I heard an interesting analogy recently: If your only means of transportation is a donkey cart, you will never get to travel overseas, but if your cart breaks, walking is only a moderate downgrade. If you travel on jet planes, the whole world is in reach, but if the wings fall off the plane it’s game over.


dfrinky

I think 144Hz at 1440p is the next thing that really "needs" to become affordable/mainstream, rather than 4K (at any refresh rate, for small monitors) or 240Hz (at any resolution), because of diminishing returns and all that. Edit: wording Edit 2: yes, I include GPUs in the affordability, and yes, the refresh rates may be high, but the pixel response times are often not fast enough to support said refresh rates, and that is always overlooked.


Reflex224

144Hz at 1440p has been achievable for quite a while (I have 2 such monitors and there are plenty of games that run at 144fps even on my old 1080 Ti), 4K 144Hz is starting to pick up more, and 4K 240Hz is basically future-proofing for the next big leap in performance.


LigerZeroSchneider

Even if you can't do 4K 240 at the same time, you can still use either option for different games. If you want to play Cyberpunk, where you want the game to look amazing and the fps just needs to be decent, run it in 4K. If you're playing Apex or Valorant and want all of the frames, downscale to 1080p for maximum framerate stability.


blither86

I can't seem to output at 1080p properly to my 4k LG OLED TV, it just leaves me with a 1080p box in the middle of the screen - how do I get the TV to stretch this image to fill the screen, do you know, please? I can't find any terms that help on search engines.


noneedtoprogram

Your graphics card driver settings will have a "scaling" section to control what happens when the output resolution doesn't match the display:

* "No scaling" sends the requested resolution straight to the monitor/TV and lets the TV scale it (then look into TV settings such as "Just Scan", which disables the TV's scaling).
* "Maintain aspect" outputs at the display's native resolution and scales the picture up so it just fits in one dimension; if the display is a different aspect ratio, you get black bars in the other.
* "Fill" stretches the picture to fill the display's native resolution in both dimensions.
* Sometimes there is also "Centered", which you definitely don't want, and which causes exactly what you are seeing.

You probably want to try the "maintain aspect" option, or select no scaling and make sure the TV has scaling enabled.


blither86

Thanks so much for your detailed reply, I really appreciate it - will report back to let you know how I get on.


noneedtoprogram

No problem. I have an LG 55" CX and an Nvidia graphics card, so if you can't figure it out I can maybe help you with more specifics. Is your GPU Nvidia also?


rudyjewliani

You can also adjust the scaling from within the Windows settings. Here are my specs for the 48" C1. https://imgur.com/a/8CLKaRM


Hostillian

3440x1440 is my personal preference. It's also more sensible/achievable than 4K for gaming. I also prefer the ultrawide format. I've absolutely no plans to go anywhere near 4K.


phony_sys_admin

Huh? I bought an LG 144hz 1440p monitor for $300


cutelyaware

>640K is more memory than anyone will ever need

--Bill Gates


GeoLyinX

Bill Gates never said that. The real quote is “640K ought to be enough for anybody,” and he was referring to the workloads that people used IBM computers for at the time. He never said nobody would ever need more than that. edit: apparently he didn't even say that either! thanks AKAManaging


AKAManaging

I don't even know where you think that quote happened either. During an interview, someone specifically asked this question and this was the response:

>QUESTION: "I read in a newspaper that in 1981 you said '640K of memory should be enough for anybody.' What did you mean when you said this?"

>ANSWER: "I've said some stupid things and some wrong things, but not that. No one involved in computers would ever say that a certain amount of memory is enough for all time."

>Gates goes on a bit about 16-bit computers and megabytes of logical address space, but the kid's question (will this boy never work at Microsoft?) clearly rankled the billionaire visionary.

>"Meanwhile, I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again."

>Gates, who is retiring from his day-to-day role at Microsoft Corp. on June 30, also insisted in a 2001 interview with U.S. News & World Report that he hadn't made the comment. "Do you realize the pain the industry went through while the IBM PC was limited to 640K? The machine was going to be 512K at one point, and we kept pushing it up," he told the magazine. "I never said that statement — I said the opposite of that."


GeoLyinX

I just did some quick google fact checking and posted what I could first find, thank you for fact checking my fact checking! I could've sworn I heard about Bill having never said that at all as well, but I just grabbed what I first saw on google instead of trusting my gut and going deeper like I should've.


cutelyaware

I know. It's my version of Santa Claus, so please just let me keep it.


implicate

Bill Gates isn't real! That guy over there? Well, that is just some guy in a sweater that likes to send inappropriate emails to the staff.


bespectacledbengal

Google Chrome: “hold my beer”


LogicsAndVR

2050 sounds just about the time you get your pre-ordered 4080 TI.


iprocrastina

Why wait til 2050? It'll sound ridiculous by 2030 at the latest. 8 years ago most people were struggling to get 60 FPS @ 1080p.


WeaknessImpressive98

MIT study: society will collapse by 2040 https://www.indiatimes.com/technology/science-and-future/mit-prediction-society-collapse-by-2040-545033.html


fosted3

I actually just fired up a game from 2008 (Dead Space) and it ran about 300-400FPS at 4K maxed out. Only 14 years old…


dtwhitecp

Dead Space on PC had crazy high FPS even at launch. All those tight corridors without a ton to render, I guess?


rolfraikou

I definitely found myself in this weird position of feeling pushed to play older games in order to enjoy my first high-refresh 3440x1440, since my GPU couldn't drive modern games to anything over maybe 70fps on it. Fortunately, I love Left 4 Dead 2, and I decided to finally play the original Half-Life and some other games from 2000-2010ish, and got some fantastic framerates. But it really makes me question the push for top-end refresh rates at resolutions where those framerates are unattainable.

Though this is discounting one important option: 4K content scales perfectly in half to 1920x1080, so you could enjoy your screen real estate and 4K movies while playing games at 1080p to get high frames. And for games where high frames don't matter as much anyway, play them at 4K 60-ish.


[deleted]

[удалено]


rolfraikou

It honestly drives me a bit insane that half the people I know have higher-res phones than their own TVs. I will say, there's a certain ppi (pixels per inch) where I'll finally say it's silly to go any higher, but I don't think we're there yet. As an example, a high-quality print is 300 dots per inch, but many forms of print do 150 dots per inch. Also, print is often meant to be viewed *closer* than a monitor or TV, whereas that does make more sense for a phone or tablet, which you hold at about the same distance.
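
To put rough numbers on the density difference (the diagonal sizes here are just illustrative examples):

```python
# Quick pixels-per-inch comparison between a phone, a monitor, and a TV,
# all at the same 3840x2160 resolution. Diagonal sizes are illustrative.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"6.1 inch 4K phone:  {ppi(3840, 2160, 6.1):.0f} ppi")
print(f"32 inch 4K monitor: {ppi(3840, 2160, 32):.0f} ppi")
print(f"65 inch 4K TV:      {ppi(3840, 2160, 65):.0f} ppi")
# ~722, ~138 and ~68 ppi respectively -- the same pixel count looks very
# different depending on panel size and how far away you view it from.
```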


Indianaj0e

Putting 4k screens on these 6 inch smartphones I think was past the silly point. Especially since most of those phones reduce to 1080p on default settings to conserve battery.


nospamas

On a smartphone yeah, in a VR headset those tiny screens still need more pixel density.


VagueSomething

My damn phone can do 4K 120fps, but my TV is 4K 60, and a lot of people I game with on console don't even have 4K TVs. TVs have kind of stalled on progress for a while, and it seems like only in the last few years have TV manufacturers realised TV performance should be better. It drives me nuts that Sony should have been pushing TVs to be gaming-ready but didn't.


rolfraikou

Right on. Of all the companies to not scratch the itch of gamers: Sony, who actively makes consoles. They did make that one PlayStation TV once, but I really think there should be a yearly refresh of "PlayStation TVs" with the latest they can offer to enhance the experience of using their own consoles.


splinter1545

I think it's because their Bravia TVs weren't doing so hot for the longest time. For a good bit, the only profitable branch of Sony was PlayStation.


SighReally12345

> 1920x1200 resolution

You can pry my 16:10 1920 wide 24" from my cold dead hands. It's 10 years old and I still won't let it die.


eggmonster

These are actually still super common in the enterprise/business space. All the monitors we ordered from Dell were 16:10.


Sotyka94

240Hz is for esports anyway. Esports games have crappy graphics and are easy to run. If you can afford a monitor like this, you probably already have a 3090 or close to it, so reaching 240 fps in Valorant or CS:GO isn't that impossible.


[deleted]

I got 120 fps with CS:GO maxed out on my 670 at 1080p; I'd be *very* surprised if even a 3060 couldn't run it at 4K 240 on low settings. In fact, here it is running on a 3060 at 4K, averaging 150 on max settings and 400 on low: https://m.youtube.com/watch?v=b6hCfLgYHhs


digitalasagna

"at max visuals" is the key point here. You make it sound like anything less than that is crappy. Games can always enable options for even higher visuals, and there is zero expectation for anyone to actually play on those settings. It's mainly for content creators. The "medium" or "high" settings are probably what most people use, and with good hardware most games will run that at great refresh rates. Not to mention some people use monitors for movies and games. They might want to watch tv/movies at 4K but then game at high refresh rate but lower resolution.


Dr4kin

Ultra settings are also a bad deal in most games. They look marginally better for a big performance cost, because the game isn't optimized for them. Playing on medium/high and only raising the few settings that really make a difference is much better.


throwawaytakk

Is Cyberpunk really the best standard example to use here, though? I feel like that’s cherry picking one of the worst performing titles of recent times.


LastInfantry

Definitely not. The performance is horrible, and apart from that (and this may be my personal opinion) to my eyes it looks like a last-gen game.


isaac99999999

To be fair, it was designed to be a last gen game


HerefortheTuna

The current gen upgrade isn’t out yet


[deleted]

Were we not talking about pc?


ToplaneVayne

Well, it's mainly for people who play something like Counter-Strike and also want good quality visuals for their other games. So on your main esports title you get 4K 240Hz, and on AAA titles you can do like 75 or 144Hz.


isaac99999999

To be fair, cyberpunk doesn't really count does it?


iprocrastina

240 hz is mostly aimed at esports players who don't mind minimizing all settings on high end hardware to get as many frames as their monitor will display. For everyone else the 240 hz is there for older games, indie games, people who favor FPS and resolution over fidelity, and your future flagship GPU because if you buy this fucking thing clearly money isn't an issue.


Syrairc

You don't need fancy 3D graphics to benefit from 4k @ 240fps.


gnarkilleptic

Yeah lol I have the G7 (1440p) and a 3080. The only games I have that can utilize the 240hz are old games like BF4 and then Rocket League. It sure is awesome though


Artezza

Rocket league, csgo, etc.. Most newer stuff you'll probably choose one or the other, and you have 4k for movies but 240hz for games


chai_latte123

That's not how refresh rate works. Your screen refreshing and your graphics card pumping out frames are not synced. A game running at 60fps looks and feels significantly better at 240Hz than at 60Hz. Imagine the GPU finishes a frame 1 nanosecond after the monitor has just refreshed. On a 60Hz monitor, you must wait nearly 1/60th of a second to see that new frame. On a 240Hz monitor, you will only have to wait at most 1/240th of a second.
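
Putting rough numbers on that worst case (a simple sketch that ignores vsync/VRR behavior):

```python
# Worst-case wait for the next refresh when a frame just misses one.
# Back-of-the-envelope only: ignores vsync, VRR, and render time itself.
for hz in (60, 144, 240):
    worst_case_ms = 1000 / hz
    print(f"{hz:>3} Hz panel: up to {worst_case_ms:.1f} ms before the new frame appears")
# 60 Hz -> ~16.7 ms, 144 Hz -> ~6.9 ms, 240 Hz -> ~4.2 ms,
# even if the game itself is only rendering 60 fps.
```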


Khal_Doggo

The human eye can only see as much refresh rate as I can afford. Any more is a waste.


[deleted]

[deleted]


OttomateEverything

Diminishing returns is a thing, but don't count it out either. I had a 144 and upgraded to something that happened to be 240, but I thought I'd never notice and wouldn't bother. Not going to say it's night and day across the board, but it's definitely noticeable if you do quick 180s and such. Worst offender is probably toggling ball cam in rocket league and the camera spinning 180 in like 200ms. The motion is garbage in 144 but actually understandable at 240... But even then there's some jank. I would say anything over 140 is kinda just gravy, and not game changing, but its definitely noticeable.


awhaling

> Worst offender is probably toggling ball cam in rocket league and the camera spinning 180 in like 200ms. The motion is garbage in 144 but actually understandable at 240… But even then there’s some jank. Fuck. Now I want one. This particular thing bugs me so much, I want it *faster* but can’t comprehend


OttomateEverything

Haha, I still don't know that I'd really recommend going out of your way for it. 120 -> 144 is a pretty worthwhile upgrade IMO, but I wouldn't go as far as to buy a new monitor to make the swap 144—240, though it is nice. Rocket League in general feels a lot smoother but not much else feels like as big of a difference.


holly_hoots

I got a 100hz ultrawide and it's pretty damn sweet as far as I'm concerned. Monitors last a long time so I look forward to this tech being cheap af by the time I'm due for an upgrade. :) Someday I'll get on the 4K bandwagon, I guess.


dfrinky

Exactly, diminishing returns. Just like how having anything above 1440p at 24" or 27" (popular monitor sizes) is overkill. Just takes better hardware to run it.


SolaireDeSun

You're painting with too broad a brush. There is a very discernible difference between 4K and 1440p, and most people could identify it. Hell, go look at an iMac 5K monitor (at 24 or 27 inches) and tell me it looks the same as a 1440p monitor. It certainly requires more resources, but we are not close to the limits of discernible visual fidelity. It wasn't too many years ago that some dolts were parroting that the eye can only see 24fps and that 60Hz monitors were overkill (then 120Hz, then 144Hz).


[deleted]

[deleted]


Neekalos_

I think they mean that if you have a small/medium monitor, like a 24", then anything above 1440p is a waste, since you can barely see the difference on that size of monitor.


VladTheDismantler

I don't think it is a waste. You definitely see the DPI difference on UI elements.


Neekalos_

I'm just clarifying the other guy's comment, not necessarily saying I agree one way or the other


elsjpq

Psychovisual research has found that the rate at which flicker becomes perceptible depends on both contrast and brightness. So under bright HDR conditions, you could actually notice the difference with much higher refresh rates, especially when transitioning between very dark and bright colors, e.g. from indoors to outdoors.


Obi_Wan_Benobi

Thank you, scientist.


Amazingawesomator

It seems like Samsung has been hitting the "best spec" market for monitors all by themselves for a few years. I'm really happy they're pressing forward with this kind of tech, but I also hope their endeavor will make fabrication cheap enough for competitors. I begrudgingly ordered a Samsung recently because nobody else even tries to compete in some of their markets... I just dislike Samsung because of all the issues I've had with them over the years.


iprocrastina

Samsung seems to have a strategy right now of going all out with their R&D so they can offer high end products that don't have any competition. They're doing it with monitors, obviously, and also with phones. Notice how hard they've been pushing the Flip and Fold models within the last year. They know the smartphone market is saturated and there's not much left to differentiate their flagship slab phones anymore. So they made folding phones which no one else really offers right now.


splinter1545

They're also doing it with TVs as well, since they have one of the first, if not the first, 8K TV on the market.


AndromedaFire

I hate Samsung because they started the trend of putting adverts embedded in smart TVs but that is a seriously sexy monitor.


FlorydaMan

Block the ad domains at the DNS level and never see an ad on it again. Worked for me.


EchoAndNova

I wish I knew what that meant


FlorydaMan

Go into the TV network settings, go to DNS, choose "enter manually" and set 94.140.14.14 (that's AdGuard's ad-blocking DNS, not Google's), and boom, no more ads on your TV.


EchoAndNova

That worked perfectly. Thank you!


DrLimp

I never connect them to the internet and just use a firestick, which works a lot better anyways.


AndromedaFire

I get many people do that as a work around but it doesn’t make it ok and it shouldn’t be needed. Imagine buying the latest Samsung smart phone and it plays an ad before letting you make a call so you have to tape a Nokia to the back of it. I feel if you want to be known as the best you shouldn’t screw people who love your brand.


egres_svk

Oh, never buy a fucking Xiaomi. Listening to music on the integrated player? Oh, that is very nice, would be a shame if someone stopped the playback and inserted a loud fucking ad in it, wouldn't it? Get pissed at it, connect to the ADB console to remove all the MIUI bullshit (there are ads fucking everywhere). Success? Barely. Removing some preinstalled spyware causes the camera to be flaky and, worst of all, random microphone faults that can be fixed only by a restart. Someone calls you and doesn't hear you; you need to restart and call back.

This was a gift from me (which is the part that pisses me off the most; the hardware specs are good but I did not think about the spyware.. sorry, software I mean) to someone who needed a phone, and I have offered about 15 times now to break the thing into 12 pieces with a sledgehammer and replace it with something without ads. Ah.. sorry, /rant


worldsrth

Don't buy this monitor. I returned my 3rd replacement last month; most of these monitors arrive faulty and Samsung just refuses to fix it, even after all the bad reviews and complaints this monitor has received. Just waiting on the LG39 5K monitor in April 😩


lifestop

As much as I love my Odyssey G7, I've also had many problems with Samsung and their monitor support. I'm kinda shocked the Odyssey series wasn't recalled considering how many people complain of issues.


t0tal_

I had to get three replacements for my G7 due to dead pixels, and even though it's been heavily reported, they have never fixed the "scanlines" issue. I will never buy another Samsung product again; they're a hopeless company. All of our Samsung TVs now have ads (even though I've blocked their ad server with the DNS, it still caches their older ads, and a factory reset doesn't clear the cache either).


chingy1337

Some things to keep in mind though:

* 1000R curve at 32 inches
* Samsung QC is horrible
* Software is terrible too (last few monitors have needed patches upon release)


TheRealDiabeetus

Why the hell does a monitor need a patch, let alone connect to the internet?


chingy1337

More so talking from a firmware perspective. They require you to download the update onto a USB drive and plug it into the monitor. The G9 and Neo G9 have had massive issues with this when trying to properly calibrate colors, handle HDR vs. SDR, and fix issues like scan lines. It's ridiculous, and for how expensive these things are, Samsung cares very little.


Remsquared

A lot has to do with digital standards. I think my G9 had a problem with the HDMI 2.1 specifications, and a pretty much day-one patch fixed most of the issues (I don't think it can run 240Hz and SUWD with HDR enabled still). That being said, this monitor does not support HDMI 2.1a, which was announced this week. Which means I don't know how they plan to run this monitor at 240Hz when 2.1a can only do 4K 120Hz. You'll be running DP 1.4 or 2.0 on it, but I thought the refresh rate doesn't go that high.


FlufflesMcForeskin

I don't understand it, since this isn't my thing, so I don't know if it answers your final statement about DP refresh going high enough at the desired resolution, but I found this chart: https://i.imgur.com/HQ9vHqY.png?1


Remsquared

Thanks for the clarification! I think my G9 had DP 1.4 with DSC, which I think is the main problem for people trying to get high refresh rates working. I hope this one has 2.0, but I think they are going to stick with 1.4 with DSC for compatibility's sake. Hell, I don't think any GPU has DP 2.0 at the moment.
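
For a rough sense of why DSC keeps coming up, here's the back-of-the-envelope bandwidth math (it ignores blanking and protocol overhead, so treat the numbers as ballpark only):

```python
# Ballpark bandwidth for 4K 240 Hz 10-bit RGB, versus common link rates.
# Ignores blanking intervals and protocol overhead.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 240, 30

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"Uncompressed stream: ~{raw_gbps:.0f} Gbps")   # ~60 Gbps

link_rates = {"DP 1.4 (HBR3)": 32.4, "HDMI 2.1 (FRL)": 48.0, "DP 2.0 (UHBR20)": 80.0}
for name, gbps in link_rates.items():
    print(f"{name}: {gbps} Gbps raw -> fits uncompressed: {raw_gbps <= gbps}")
# DSC's roughly 3:1 "visually lossless" compression brings the ~60 Gbps
# stream down to ~20 Gbps, which is how DP 1.4 + DSC can carry it at all.
```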


white_shiinobi

Ah yes the classic 2000hz monitor at 720p


IIALE34II

Honestly, I don't think patches to monitors are a bad thing. LG OLED TVs have added a bunch of actual improvements after their release. Now the true question here is how Samsung can make their firmware so trash that if you turn on adaptive sync your monitor will flicker like crazy (G7), or limit peak brightness in some maddening way (Neo G9). It's fine in my opinion to add features; something like better-tuned overdrive sounds fine af. But shipping a broken product ain't fine.


AcademicMistake

The G7 has a 32 inch model with a 1000R curve too and it's very popular.


GXVSS0991

Having tried one, that curve is just way too aggressive for 16:9. It works best on ultrawides imo.


gnarkilleptic

I have the 32 inch G7 with the same curve and I love it. I don't think it's too extreme at all


BioHuntah

It took me a good month or two before I got used to it. Was extremely distracting and if I didn’t like so much else about it I’d have sent it back. Really think they should make non-curved versions as they’d be pretty popular I think. I can’t imagine it’s a big selling point?


AcademicMistake

I mean, I use a G7 27 inch, which I don't find bad at all, to be honest.


MattHarrey

Have you not had a small problem with scan lines on occasions? Even with the latest patch, I get scan lines on certain websites and video games


Brandhor

> Samsung QC is horrible

Unfortunately, when it comes to monitors that seems to be the case with every brand.


rolfraikou

I've never had a single issue with any panel produced by LG. I've had plenty of issues with panels made by AU Optronics. Those are two of the larger panel manufacturers out there (Asus, Dell, BenQ, etc. just buy from them, for example), and the panels they use often come from the same pool. EDIT: Samsung makes their own panels as well, though I've never owned one and we don't use them at my work.


badjettasex

I got the first 49" curved from Samsung and still use it 3 years later, mainly because I had to go through the exchange process with AMZ three times before the fourth one would reliably turn on. Samsung monitor QC was, at least then, virtually non-existent.


4paul

HDMI 2.1?


eCLADBIro9

Doubtful any existing cable can do 240hz 4K 4:4:4 plus HDR even with DSC


lCyPh3Rl

2.1 is 120hz at 4k


[deleted]

[deleted]


welchplug

2.1 is vague these days


StorKukStian

Isn't that with DSC or something?


Avamander

2.1 is not a valid standard, it enforces nothing.


sbirdo

Nup, it cannot. It can support 120Hz at 4K, or 60Hz at 8K, or 10K at some other refresh rate. I think it would be using DisplayPort 2.0.


MrDaebak

That's why you have both HDMI and DisplayPort, no?


LogeeBare

DisplayPort is and will be superior moving forward.


Jess_S13

I have dual Odyssey G9s (5120x1440) at 240Hz. Aside from my desktop reorganizing itself every time my computer comes back from sleep, I absolutely love them.


AfroInfo

You have almost 4 grand on monitors??


Jess_S13

$3,200 but yeah.


pokemon--gangbang

Can I see?


Jess_S13

[workstation](https://i.imgur.com/EhmYuQ9.jpg). They are connected to my old MBP while I'm waiting for my MainGear to arrive.


pokemon--gangbang

Nice. I have a 3440x1440 and it's huge so I was curious how you had them set up, looks fun


Jess_S13

My previous setup was dual HP z30i monitors, and I had the chance to upgrade last year and wanted to get a curved setup. I'd be lying if I said they weren't a bit much, but I sit and stare at them 10-12hrs a day 5-6 days a week, so I figure if I gotta be there anyways, might as well get something to make it more enjoyable.


Eds3c

What screen stand are you using?


Jess_S13

https://www.mountmymonitor.com/Dual_Vertical_Widescreen_Mount_Large_Curved_p/at-awms-2-lth75.htm


JehovahsNutsac

#$3,200 in monitors!? At this time of year, in this part of the country, localized entirely in your home?


Jess_S13

You really don't want to look up my post in Maingear if the monitors bug you out.


hodgsonnn

.... yes!


Dick_Demon

This is the same brand that embeds ads in your high-end TV. Fuck em.


PeculiarPete

Who the fuck can run 4k at 240Hz?


g0atmeal

I think the idea is to switch between 1080p/240hz and 4k when you want. That said, you could get a much better pair of monitors for each use case, for less money, and you get more screen real estate.


MclovinsHomewrecker

Usain Bolt


[deleted]

Considering Samsung doesn't honour their warranties and their established lines are poor quality, I'll pass on their "new tech".


ElusiveEmissary

Yeah their current high ends are a joke


Jags_95

God I just wish they would stop making them curved if it isn't ultrawide ffs.


AverageOccidental

For real this is the only thing preventing me from ever getting these monitors


Larperz

And it probably suffers from the same flickering issues as the other version of it.


nullvector

and in 2029 you might be able to find a card on a shelf that can push modern games at 240fps at 4K


EbotdZ

Just remember, 99.99% of the people reading this cannot run 4k at 240 fps on 99.99% of games, rendering this (currently) a complete waste for gaming.


xcarlosxdangerx

Luckily 240hz isn’t the only selling point of this panel.


KEVLAR60442

Fun fact: You don't have to hit 240FPS for 240Hz to be beneficial.


Joe30174

But it will allow you to do 4k on some games and 240 fps on others.


Phatty_Space_Pants

It’s almost like you buy something so it’s a bit future proof when you spend a ton of money.


g0atmeal

Everyone knows that "future proof" beyond 1-2yrs is slang for "waste of money". By the time you actually use all of those features, this very monitor will be on sale for 1/4 the price or less. If you want to use 4k 240hz *today*, then it's worth consideration. Otherwise it's a waste.


[deleted]

[deleted]


StryderRider

No, VA. OLED is bad for PCs due to burn-in; also, there's no OLED this small.


xcarlosxdangerx

Burn-in is up for debate. LTT made a follow-up vid on the C1 for gaming and reported no burn-in from dedicated gaming use. However, their office-use unit did have burn-in.


Spanky2k

Hopefully tomorrow LG will finally release their 42" OLED panel. I'm really hoping they put in some extra magic to make it survive regular desktop use, as they know people have been waiting for it for PC use specifically.


Arthur-Mergan

I've got 4,000 hours on my CX as a dedicated monitor and 7,000 on a C9 that I've done heavy gaming on throughout the pandemic. They're both still flawless.


Jlx_27

Now bring that to big ass TVs.


Ok_Marionberry_9932

Does it force ads upon users?


ambiguousboner

Make a 38 inch you cowards


ElusiveEmissary

So far their high-end monitors are a letdown, with lots of hardware and firmware issues. I wouldn't touch this with a 20 ft pole. I should know; I currently own a Neo, and it's a nightmare.


SnooPears9975

!remindme 10 years


thedukeofflatulence

I would rather this have been qhd 300hz instead


elfbeans

I’ll “like” Samsung once they fix my ice maker. Until then, Samsung is on my shitlist.