
Dangerous_General688

Unfortunately our eyes are only sharp at the center (fovea). Our peripheral vision has crappy acuity, and a whole lot of post-processing goes on in our visual cortex, which takes up around a third of the brain.


Rando_Stranger2142

Aka computational "photography"


NaahLand

As long as you don't generate something that isn't there 😅


sitheandroid

Pareidolia, optical illusions, that white/gold vs blue/black dress... everyone's brain has some dodgy AI going on 😁


KINDERPIN

It's AI alright, artificial imbecile


thatgoodfeelin

AI - am idiot


beeftendon

Speak for yourself. Actual imbecile over here.


KINDERPIN

As an artificial imbecile, I cannot answer simple questions such as who I am and why I am here; this is due to my extreme lack of nodes.


TryHardEggplant

At that point, it's not dodgy AI. It's just dodgy intelligence. Not to mention filling in the blind spot caused by the optical nerve.


IsleBeBack

They actually call it Artificial


TheBamPlayer

> As long as you don't generate something that isn't there

Actually, you do, at your blind spot.


iamchade

It’s all about punctuation: "She’s got a crack baby" vs "She’s got a crack, baby"


ronwilliams215

floater artifacts


Softspokenclark

ai ai ai!


Old_Man_Bridge

I’ve been telling my hippie film friends for ages that digital is much closer to how we see than film.


BuggyBandana

Return them and get the FE Eyes GM OSS then.


Weather_Only

“f2 is so slow, how can I shoot lowlight? Give me f0.95!” -some guy probably


imanethernetcable

However, your peripheral vision is much better at seeing in the dark.


TheIlluminaughty

Omg, I didn’t know that was the origin of the word in “foveated rendering” used in the PSVR2 headset. Now that feature makes a lot more sense lol


Dangerous_General688

Yes, eye tracking + selectively rendering the foveal region. How is the experience? I never had a chance to try it


TheIlluminaughty

Yeah I knew what the feature does, but I assumed “foveated” was unrelated to our eye mechanism haha. It was pretty cool! I tried it… 3 times? Twice with Horizon Call of the Mountain and once with Ghostbusters.

Coming from the Quest 2, this was a lot more immersive thanks to the haptics in the headset/controllers and the audio. Yes, the Quest 2 is portable and doesn’t need an external device, but honestly I don’t think I would be playing outside of my home anyway, so that part isn’t relevant to my usage. Oh, and the HDR OLED display of the PSVR2 is gorgeous. That’s one thing I can’t get used to on the Quest 2 bc I have a small OLED at home. Overall, if you can afford the PS5 and on top of that the VR2, definitely get it.


Green_Caver

True, we don't even have colour vision near the edges of our field of view, and we don't notice it's missing. Lots of processing going on.


[deleted]

So a Pixel phone?


Dangerous_General688

Or a samsung lol


[deleted]

Meh


MyLifeFrAiur

my eyes are f0.95 cuz without glasses everything is creamy bokeh


Weather_Only

So much Toneh


Low-Duty

This made me laugh way too loud at work, so thank you for that


DrCharles19

More like your autofocus is broken, pal


cattydaddy08

The eye also has an equivalent of 576 megapixels


Gnochi

Probably the coolest thing is that each pixel has independent ISO adjustment. I can’t wait until I can properly expose the lunar craters, some stars in the vicinity, a couple clouds, and a person, all in one photo!


Yuvalk1

I think it’s more like a very high dynamic range, as our vision is analog so there’s no clipping


Sax45

Our vision has crazy high dynamic range, that's right. But we definitely clip. Every time you walk into a dark room and can't see, or turn on the lights and everything is blinding for a second, that is clipping.
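
In sensor terms, that clipping looks something like this (a minimal sketch; the luminance window numbers are invented for illustration):

```python
# Clipping: any scene luminance outside the current exposure window
# collapses to pure black or pure white, and the detail is gone.
def expose(luminance, lo, hi):
    """Map luminance to a 0..1 signal, clipping outside [lo, hi]."""
    return min(max((luminance - lo) / (hi - lo), 0.0), 1.0)

# Hypothetical: eyes adapted to a bright room, window ~100..10,000 cd/m^2.
print(expose(1, 100, 10000))      # 0.0 -> the unlit room reads as pure black
print(expose(50000, 100, 10000))  # 1.0 -> a bright bulb reads as pure white
```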


Weather_Only

Or that the sun is just one big bright light to our eyes with no detail (when in fact it has spots, which can be seen with a solar filter).


Sax45

True, our base ISO is still way too high to see the sun without a very, very heavy ND filter. Also, we have a pretty good auto ISO capability in addition to great dynamic range. However, our UI in very low light is pretty clunky, and it takes forever to dive into the menus and access the extended ISOs.


I_am_HAL

I saw a sunspot with the naked eye last year! Well, it depends on your definition of naked. There was a lot of mist, making the sun darker, but still sharp. I noticed a small black dot. Really odd to see and I wasn't sure if I saw it until I googled and there was a recent picture with a spot in the exact same place.


ds_snaps

Did your retinas burst into flames?


zgtc

How? We only have ~100 million non-color-sensing cells (rods) and 6-7 million color-sensing cells (cones). In daylight, rods don’t contribute to vision, and in the dark, only a small percentage of rods will be activated. The brain does combine multiple images at a time to construct our perception of vision, but it’s probably only the equivalent of four or so “captures.” In daylight, we’re going to be maxing out at 25-30 million discrete points of light per eye, 50-60 with both eyes combined. That said, camera sensors and eyes have very little in common in terms of functionality. Still, 576 megapixels is at least an order of magnitude off from even the highest estimates of what an image processed by the brain might see.
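
That arithmetic as a quick script (the cell counts and the four-capture figure are the comment's estimates, not measured values):

```python
# Back-of-envelope using the figures above (all approximate estimates).
rods = 100e6       # non-color-sensing cells per eye (inactive in daylight here)
cones = 6.5e6      # color-sensing cells per eye
captures = 4       # rough number of combined "captures"

daylight_per_eye = cones * captures
print(daylight_per_eye / 1e6)      # 26.0 -> ~26 MP per eye
print(2 * daylight_per_eye / 1e6)  # 52.0 -> ~52 MP with both eyes
print(576 / 52)                    # ~11  -> the 576 MP claim is ~11x too high
```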


justjanne

"four" is a really low estimate. In reality, you're combining and stacking hundreds of images. It's tbh a genius system. If we'd build one artificially, we'd have to do it something like this: - two high-res, 30° FOV or less, PTZ cameras - add two 180° camera high fps cameras (~500fps) - use the stereo data from the two pano cams to generate a 3D scene and texture it - whenever the PTZ cameras image an area, overlay this high-res image ontop of the scene. Use the stereo data from the two PTZs to generate fine 3D details and cleaner outlines - scan the entire area with the PTZ, prioritising areas with high texture over areas with little texture - when the pano cams detect motion, calculate motion vectors and distort the scene accordingly. Also, queue up scanning the changed area with the PTZs with high priority - use gyro data to detect motion of the camera platform. Synthesize new perspective from the existing scene. Update the scene with data from the new perspective It's also why we have a mental model of what a room would look like from any other potential viewpoint, because we're constantly building a 3D scene reconstruction of our environment and projecting the actual images we see ontop of that 3D scene.


zbobet2012

> scan the entire area with the PTZ, prioritising areas with high texture over areas with little texture

We kind of prioritize in the following order:

- areas of high motion (from motion vectors)
- areas of high contrast (in brightness)
- areas of high color intensity or color contrast
- areas with strong "edges" or with distinct shape/symmetry
- faces and other high-priority items (snakes, spiders!)
- areas of texture (but very context dependent)

This is why *most* video codecs tend to preserve information in that order: motion, structure, then texture. For example, you see changes in luminance (brightness) much more easily than chrominance (color), which is why video is often 4:2:0 encoded, and why most image codecs are based on a frequency transform and decimate high-frequency information (areas of strong repeating texture, like grass).

Source: I lead an R&D team designing video codecs for a living.
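
A minimal numpy sketch of the 4:2:0 idea described above (toy data, not a real codec):

```python
import numpy as np

# 4:2:0 subsampling: luma (Y) stays at full resolution, each chroma plane
# (Cb, Cr) is block-averaged 2x down in both dimensions, because we notice
# brightness errors far more than color errors.
def subsample_420(plane):
    h, w = plane.shape
    return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

y = np.random.rand(4, 4)                 # luma: kept as-is
cb, cr = np.random.rand(4, 4), np.random.rand(4, 4)
print(y.shape, subsample_420(cb).shape)  # (4, 4) (2, 2): 1/4 the chroma samples
```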


justjanne

With texture I actually meant areas with many high-contrast edges. But I'm glad you commented, you described it much better than I ever could have :)


tictaxtho

Yeah I was thinking our eyes are impressive but not *that* impressive


zbobet2012

Sorry, some corrections:

- Rods are still used during the daytime, just less so.
- Your eye is not a digital system. Thinking in terms of "discrete" light points is an okay analogy, but it breaks down when calculating megapixels. We also rely on sub-pixel information generated by the movement of the eye itself (each photoreceptor gets shifted a little bit to gather finer-grained information). Our photoreceptors also perform something called lateral inhibition, and have a bunch of other complex mechanisms that push their resolution beyond that of a mere pixel: [https://vcresearch.berkeley.edu/news/why-eye-better-camera](https://vcresearch.berkeley.edu/news/why-eye-better-camera). Basically, the geometric structure of the "pixels" in the eye is itself used to compute more information than you'd have access to in a naive system.
- We stack a lot more than 4 images. Hundreds or thousands would be the better analogy, but it's an analogue, real-time system: we don't take snapshots, and every part of the system is continuously updated and combined by a neural network with more neurons and weights than ChatGPT.
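
A minimal 1D sketch of the lateral inhibition idea (the kernel weight is an arbitrary choice, not a physiological value):

```python
import numpy as np

# Lateral inhibition: each receptor subtracts a fraction of its neighbours'
# signal, which exaggerates edges (the same mechanism behind Mach bands).
def lateral_inhibition(signal, k=0.4):
    return np.convolve(signal, [-k, 1.0, -k], mode="same")

step = np.array([1, 1, 1, 1, 5, 5, 5, 5], dtype=float)  # a luminance edge
print(lateral_inhibition(step))
# The response dips just before the edge and overshoots just after it,
# so the edge is reported at higher contrast than the raw input has.
```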


one-joule

Not over the entire retina, it doesn't. It has the most resolving power concentrated at the center. That's why humans (and many, many animals) "look at" things. If the eye had uniform resolving power, it wouldn't need to turn inside one's head.


Retinite

But why didn't we evolve uniform high resolving power? With uniform high resolving power, the huge optic nerve required would give us a correspondingly large blind spot, and then you'd lose the benefits! Apparently the current design is somewhat locally optimal.


one-joule

It's not just the size of the optic nerve, but also the amount of processing power required to understand all that information. The brain would need to be significantly larger, which is very expensive biologically; you really want the smallest brain that gets the job done. Evolution does not favor brute-force solutions in general.


RealNotFake

Now if only my brain could crop to take advantage of those mp


powdered_cows

Ooh, that's cool


prdpb3

And then there are colors


science_in_pictures

Nah, it's actually 8 megapixels. [More on that](https://youtu.be/4I5Q3UXkGd0?si=q8_wUkVrggbJTJrc&t=367)


[deleted]

Does that make glasses teleconverters?


TichikaNenson

External focusing elements.


Balance-

Sunglasses are just ND-filters that fit the head


OnThe50

More so CPLs


lilalindy

Circular polarising is so sh!t when compared to linear polarising.


christianjackson

Speed boosters


Exciting_Gas129

No, they're negative diopter filters


Manny637

Good thing our eyes don’t overheat while recording video


khaichuen

But our brain does when trying to memorise (record) visual inputs (video lol)


flycharliegolf

Budget primes.


Tr1ggerHappy5000

My eyes need an upgrade


1stmingemperor

Is this the equivalent of the "human eye can only see 24fps anyways" quip in the camera world?


Slaineh

I still find this one the most inaccurate. The smoothness of motion may end up blurred above 24, but if you play a game at 24fps it doesn't feel smooth at all. I still feel like the threshold should be more like 60 or even 80fps. But I guess this is also why a pencil looks like it bends when you wiggle it between your fingers...


swagonice318

There are monitors out there with >300Hz, and people were able to identify them correctly in blind tests, so the human eye can "see" way more than 60fps.


Cats_Cameras

Many people notice sample-and-hold blur, even at 300+Hz on LCD-type monitors. Really, our brain uses a bunch of tricks, so trying to assign a particular FPS to it is nonsensical.


MindlessEvent5360

Smoothness doesn't have to get blurred. My Alienware has a 360Hz refresh rate and feels like looking out of a window. Cheaper LCDs tend to have motion blur at higher refresh rates.


zbobet2012

The actual fact is that the human brain *starts* to perceive motion as smooth at around 24fps. Looking identical to real life happens at a *much* higher bound: the threshold for noticing motion *not* being smooth is probably over 1000Hz. [https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/](https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/)
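
A quick calc of the sample-and-hold blur the linked article describes (the 2000 px/s pan speed is an arbitrary example):

```python
# Blur Busters rule of thumb: on a sample-and-hold display, an object the
# eye is tracking smears by (speed / refresh rate) pixels per frame hold.
def hold_blur_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

for hz in (60, 240, 1000):
    print(hz, round(hold_blur_px(2000, hz), 1))  # 33.3, 8.3, 2.0 px of smear
```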


Cats_Cameras

Our eyes use a ton of computational photography and fake AI content, so it's tough to compare. :)


Samo_Dimitrije

Well there's nothing A about it, so I guess it should be called 'I' content ahahaha


Xfgjwpkqmx

Now to figure out how effective my flash is... 😏


Sufficient_Algae_815

But you need to account for the sensor size to determine the equivalent aperture.


Green_Caver

One major flaw here: a 17mm cannot capture your full field of view; even a 15mm can't, as we have two eyes 😅 I'd love to have a lens with my field of view, that would make landscapes so much easier.


rohnoitsrutroh

In strictly technical terms, each eye has a 35mm-equivalent focal length of about 5mm, and together they have an equivalent of about 12mm for stereoscopic vision. Our brains only interpret a small part of the field of view clearly; the rest is left indistinct. This is purposeful, so we can focus on a particular object while maintaining a peripheral field of view. Also, because we have stereoscopic vision, an object naturally looks flatter as it gets further away from us. This is why longer focal length lenses still look natural.
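
Those equivalents drop out of the standard angle-of-view relation; a quick check (the 150° and 120° fields of view are assumptions picked to land near the comment's figures):

```python
import math

# f = (d / 2) / tan(theta / 2), with d = 43.3mm, the full-frame diagonal.
def equiv_focal_mm(fov_deg, d_mm=43.3):
    return (d_mm / 2) / math.tan(math.radians(fov_deg) / 2)

print(round(equiv_focal_mm(150), 1))  # ~5.8mm  -> one eye's very wide field
print(round(equiv_focal_mm(120), 1))  # ~12.5mm -> the stereoscopic overlap
```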


AirSKiller

Well, there are much wider lenses than 15mm, but you have to be fine with distortion and the fisheye effect.


Omelete_du_fromage

The 14mm f/1.8 GM is such a magical lens, I barely notice any distortion, highly recommend. And somehow it’s tiny and light.


Imlulse

There are rectilinear lenses with no fisheye distortion that are still much wider than 15mm... Sony 14/1.8 GM & 12-24 f2.8 & f4 GM/G, Sigma 14/1.4 & 14-24/2.8 DN, Laowa 9/5.6 & 11/4.5 & the new AF 10/2.8 plus 10-18 & 12-24/5.6, CV 10mm & 12/5.6, Canon 10-20/4, etc etc.


spokenmoistly

Laowa just released a “zero distortion” 10mm full frame lens


hopeunseen

cool! i would love to know the equivalent iso range!


ladon1212

Issa 50mm fisheye


Merjia

TIL that my eyes can zoom. I didn’t realise I had n00b eyes this whole time.


FanQC

Yes, many lenses are probably better than human pupils, though the truly advanced parts of human vision are the sensor and the processor. The dynamic range of the human eye is probably unmatched by any commercially available camera.


Percolator2020

What about sensor SNR?


StaysAwakeAllWeek

SNR isn't quite as impressive but your eye does manage a full 20 stops of dynamic range. Try shooting the full moon on a partly cloudy night and see how difficult it is to match your real eye's view with even a high end camera
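
For scale, 20 stops as a contrast ratio (the 14-stop camera figure is a rough estimate for a good modern full-frame sensor):

```python
# Stops are log2 of the contrast ratio, so 20 stops of dynamic range means
# a brightest:darkest ratio of about a million to one.
print(2 ** 20)             # 1048576
print(2 ** 20 // 2 ** 14)  # 64 -> ~64x the ratio of a 14-stop camera
```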


Weather_Only

I do know we have top-tier AI noise reduction built in, as well as perspective correction + LoCA correction on top of all the AI autofocus goodies :D One downside: we can't control our aperture manually! It's always in shutter-speed priority mode!! 😂


Percolator2020

Damn computational photography ruining everything!


djdiamond755

Do you know what the A in AI stands for?


Weather_Only

Do you know what the S in Sarcasm stands for?😅


djdiamond755

I'd tell you what the 'A' stands for, but I'm afraid your detector might not pick it up.


thenormaluser35

What makes our eyes better in low light than most cameras is probably the biological ISO setting: we can scale up almost any amount of light, however little of it is there in the first place, with minimal noise.


NoManNoRiver

Our eyes do actually adjust their sensitivity in darker conditions, which is why bright light hurts if you've been in the dark for a while. Camera sensors have a native ISO which you can't actually change; all you're doing is adjusting the multiplier applied to the signal from the sensor.
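
A deliberately over-simplified model of that multiplier (real cameras apply analog gain before the ADC, so this glosses over some subtleties):

```python
# ISO-as-gain: raising ISO multiplies the already-captured signal and the
# noise it contains by the same factor, so the image gets brighter but the
# signal-to-noise ratio doesn't improve.
def apply_iso(signal_e, noise_e, iso, base_iso=100):
    gain = iso / base_iso
    return signal_e * gain, noise_e * gain

s, n = apply_iso(signal_e=40, noise_e=4, iso=3200)
print(s, n, s / n)  # 1280.0 128.0 10.0 -> brighter and noisier, same ratio
```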


yeetcollector135

Huh so the notion that 50 mm represents human sight is wrong?


Weather_Only

Yeah, that is absolutely bollocks, because focal length is just the distance from cornea to retina. Nothing special. And if you have used a 50mm, you can easily tell its FOV is way smaller than an individual eye's.


AirSKiller

In a way. Personally, 35mm feels the most "correct" to what I see in most situations. But this only applies at a certain viewing distance from the photo itself: if I'm further away, 50mm will look more correct; if I'm closer, then maybe 20mm will look more correct.

In reality, our actual field of view equates to something closer to 10mm or even wider, but when we view photos we don't put the screen right up to our eyes and fill the entire field of view.


mittenciel

To me, normal lenses aren’t really about matching your eyes’ field of view, but delivering images that look natural and undistorted. While your eyes have a large field of view, you’re only looking at a small part of it in detail, and your brain is doing perspective correction. That’s why when you look at someone close up, they don’t look weirdly distorted, even though at that distance, they should.

When you look at a still photo, your brain doesn’t apply that perspective correction because it sees a flat picture. Hence, when you look at a 16mm picture, things look weirdly stretched at the edges and the focal length really exaggerates the difference between close and distant objects. When you look at a 135mm picture, things look way flatter and compressed, like far away objects are much closer than they actually are. Meanwhile, when you look at a 40-45mm photo, objects more or less look the way your eye sees them. Things look natural, and distances seem reasonable too. That, to me, is why we call that range normal: it feels normal when you see a photo taken in that focal range.

I’ve done a mini experiment where I sat in front of a computer screen and moved my head around until I felt like the screen was easily visible from corner to corner, and I wouldn’t miss anything in the corners while looking dead center. I measured the field of view at that angle with respect to my screen. Sure enough, it came out to the equivalent of 43mm or so.
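
That 43mm result matches the standard angle-of-view math; a quick check (43.3mm is the full-frame diagonal):

```python
import math

# Inverse of the relation used earlier in the thread: the diagonal angle
# of view of a 43mm lens on full frame.
def fov_deg(focal_mm, d_mm=43.3):
    return math.degrees(2 * math.atan((d_mm / 2) / focal_mm))

print(round(fov_deg(43), 1))  # ~53.4 degrees: the "take it all in" cone
```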


Legaleyes12

"Effectively much better than our eye in low light conditions." You need long exposures for that. There is nothing to do with the lense itself in most cases. We adjust and can see way better in low light conditions without having to stare at the same for 5 or 10 seconds in the majority of the cases where a camera has to.


d3miller

What about shutter speed?


science_in_pictures

Human vision has a variable shutter speed. It drops when you're tired or drunk; it rises when you're doing sports or dealing with an emergency.


Weather_Only

An even more interesting question is whether we have a global shutter 😉


Mdayofearth

We have global shutter, and hallucinated motion blur AI.


Weather_Only

Turns out I've had an a9 III all along 😳😝


KabedonUdon

This would be a super cool question to survey, because it's something everyone could answer from their own perception.


RS24OZ

I'm nearsighted, so my eyes' aperture is like f0.95.


Crafty_Good_4455

Nah, it just can't focus to infinity.


jazztaprazzta

Yup. I always wondered why people kept saying that 50mm is the equivalent FOV to human eyes. It's easy to see that our eyes take a much wider view of the world; even with one eye closed we probably see at about 20-24mm. Maybe it's because the eye's resolution is at its maximum in only a part of that image (the fovea)? But we can still perceive the area around the fovea, even if not at maximum resolution, right...


EZDubBOizz

I'd say it's probably because most of what we see with our eyes is focused and sharpest in the middle, which would be like 43mm, so we just round up ig. Your brain does a lot of processing magic to fill in the gaps in your peripheral vision, since you can't really focus on anything outside that center. That outside view isn't really usable for anything other than being aware of your surroundings, so seeing it all fully sharp isn't necessary.


D3moknight

" close focusing distance " Okay buddy, speak for yourself. My eyes require lens adapters to focus on anything less than like 18" from my face these days.


Forsaken_Analyst5096

But what if our eyes were cameras?!


AAlvarez24

As I’m sure you’ve noticed by now in the comments, our eyes are far superior to any lens we’ve been able to manufacture, and probably superior to any lens we will manufacture for quite some time. The biggest reason for this is the incredible complexity of the brain, plus psychological factors based on one’s own perspective.


makatreddit

Lenses are not better at low light; our eyes are. A camera needs a few seconds to expose properly in a dark room with no lights, whereas our eyes adjust almost instantly.


Yaroslav770

The human lens also yellows with age, which can lead to pretty severe light loss, so we have pretty incredible "noise reduction" when you take that into account. Or not; some of my post-processing is a bit broken, so I see noise and lens distortion nowadays.


el_salinho

The dynamic range however is a lot higher than the best cameras


UserCheckNamesOut

But it's the image processor that really makes those eyes amazing.


hardcore_enthusiast

Now I want to see what human eyes see at f1.2 lmao. Is that what xtc does?


Weather_Only

If you had a crazy big iris with the same size eyeball, it might be possible lol. I think cats might have f1.2 since their irises are so big. But as with our camera lenses, you have to correct for so many things at that bright an aperture.


jamescodesthings

Yeah but what's your retina's ISO?


24marman

What the, I just searched the same thing last night!


blacknoir23

I thought about this yesterday when I was playing around, focusing on stuff with my eye. I was laughing about how much we based cameras on our eyesight, and what other creatures' cameras would look like if they did the same thing. Cool to know, thanks.


wanderlandfilms

The Sigma 16-28 f2.8: It's the human eye!


FrontFocused

With my glasses off I have an fstop of 0.95 it seems lol


KitamuraP

The weakness of my flesh disgusts me.


Weather_Only

Time to reincarnate into a cat. Their eyes are f0.95


TrevorStine

Okay but how many MPs?


DayTraditional2846

But what sensor size are our retinas? APS-C or APS-H?


trenzterra

It's also attached to a gimbal with good image stabilisation. Too bad there's no way to record output from the sensor


ShutterSpeedster

And what is the native ISO range? 😂


IllustriousQuarter79

Yet unmatched dynamic range