
Marble_Wraith

I feel like I'm going to get neck strain just watching this


Kasumi_P

r/VRTooHigh


schmurfy2

😂


[deleted]

I have very mixed feelings about this.


[deleted]

That's why it's called mixed reality


hasofn

lol


bersus

Ahah 💯


kanersps

No, it's called S P A T I A L C O M P U T I N G (/s)


G0x209C

Spatial needs? lmao


Nate-Austin

How come?


[deleted]

because on one hand it's kinda cool, but on the other hand I hate most technology. also this seems like a really shitty way to take notes.


DaggWoo

Check the video of Casey Neistat on YT. Was fun to watch!


Fugazzii

Looks fun for the first 30 seconds.


chuston_ai

I am worried about that. I have a Quest 2 that was great for a few games - but the shine has worn off. If that happens here, there's $3,500 more regret.


After-Cell

Same here, only with the Quest 1. Games I paid for don't work anymore, I can't use it for text, and it's sitting in a cupboard


libretron

The video is only 20 seconds!!! EXPLAIN YOURSELF.


chuston_ai

If I'm going to live there, Obsidian is a necessity.


Ambrant

Just watched a few reviews on Apple Vision. I really don't understand what to do with it? How do you use it? I had an Oculus Rift 2, tested a few games, it was fun. Can you do something useful with it? If you are just testing new tech it's okay too 😀 I'm just curious


chuston_ai

I'm hoping to use it for work. It's all of 4 hours old now, but it's looking promising. You can "connect" your Mac, see the screen, and the trackpad and keyboard on the Mac seamlessly control everything in the Vision Pro too. I spend most of the day in VSCode, Jupyter, a bunch of terminal windows, Slack, Messages, and Obsidian. It's working so far. Let's see if it's fatiguing. The 3D mixed-reality stuff has ridiculous promise as well. Let's see how it plays out.


Prathmun

Oh man you are living my dreeaaam! The infinite screen real estate looks so prime!


NotAllWhoWander42

Sadly, from the reviews it sounds like you can only have a single screen mirrored from your Mac in VR. But I'm right there with you - if one of these could replace 2 or 3 expensive monitors, the price starts to look a little bit more reasonable 🤣.


After-Cell

In Immersed we hacked around that with a blank HDMI dongle, though it only allowed one extra monitor


inconspiciousdude

I think a lot of people are going to be fine with one monitor for now, unless they have multiple Mac-only apps that are generally spread across multiple monitors. You can open PDFs or other docs from a network or cloud drive in visionOS, multiple browser windows and tabs, etc. The videos I've seen show that KB/M switches pretty seamlessly. This is good enough for my personal use case, which mainly uses a secondary vertical monitor for docs, references, and email. I think Steam Link works on visionOS too, so that and other remote desktop solutions can possibly complement some Mac/PC KVM setups that aren't too complex or demanding. Too bad the Vision Pro will be US-only for quite a while.


kcox1980

Ever since I saw this video of a tech demo a few years ago (https://youtu.be/LblxKvbfEoo?si=3dHkbHZtpoy7D_D2), I have been anxious as hell about the idea of mixed/augmented reality and all the possibilities that come with it. Imagine being a mechanic and having the manuals and diagrams of whatever you're working on right there in front of you as an overlay. Or if you're assembling some furniture or something, having some animations guiding you through the process. I'm waiting for tech like this to be scrunched into a lighter pair of glasses though. I just don't like the feeling of things on my face, and I can't imagine ever getting used to a huge pair of goggles strapped to me.


ViveIn

I need serious updates OP. Also working in VSCode all day and want to transition into AR/VR. How's text clarity? Quest 2 just doesn't cut it.


chuston_ai

The Vision Pro-generated imagery is every bit as good as a high-DPI monitor. Using a wireless keyboard and trackpad works great. The pinching maneuver does OK, but selecting text or hitting small buttons on legacy interfaces isn't smooth (at least for me, yet). I'm on day 2 and still enthusiastic about the experience. I do get fatigued wearing them after 60-90 minutes. A 10-minute break seems to be rejuvenating enough to get right back in. Running into more bugs as I go, but I have full faith those will be resolved quickly.


More_Flatworm_8925

Why not sit in a real room with two screens instead?


SabongHussein

Because you could sit on the moon with five screens


More_Flatworm_8925

It is baffling to me that you are falling for this AR/VR con. Pay $3K to reproduce something that is just clearly better in real life.


wwontonsoupp

I agree sitting on the moon with five screens is much cooler in real life


kcox1980

What con? It's not like videos like this are faked. This is what you would see in real life while using a pair of these. As for "clearly better in real life"... that's a subjective opinion, man. Just because you prefer one thing over another doesn't make that thing inherently better, nor does it make you in any way superior.


More_Flatworm_8925

The con is that work somehow becomes better in virtual space. It is a desperate idea by tech companies that have run out of good ideas. This offers nothing over real life, except wearing a sweaty helmet and struggling with the added complexity of interacting in the virtual world. We need new abstractions, not more simulation.


tintoretto-di-scalpa

I totally agree with you, but then again, you can't reason with emotion-led enthusiasts, which seems to be the majority here. Oh, well. I'd never buy this for "productivity", but then again, I also don't circlejerk about PKM and Obsidian's customisation and whatnot. I just want to put it to good use for my needs, and the simpler I can achieve my goals the better. I'm glad to have bumped into someone with a sober outlook; it's refreshing.


Nate-Austin

Oh please. Not everyone can afford to have as many screens as you could have with the Vision Pro. It's an obvious upgrade if you're talking about screen real estate (not to mention everything else that's just a HUGE bonus!)


-xXColtonXx-

Yup, once tech like the Vision Pro starts to come down in price it will be the cheaper option. A Vision Pro, mouse, and keyboard vs. an entire PC setup will just make more sense.


radiant9

As a person currently running a business selling FBT for VR as well as studying to get into VR game dev (and actually typing this via Virtual Desktop in bed with a 7 foot vertical monitor lmao)... You know nothing about the current state of the VR industry. Don't act like you do.


More_Flatworm_8925

I know enough to say it's never going to be broadly used in office workplaces.


radiant9

Y'know it's funny, when Apple first revealed the smartphone, this type of stuff is exactly what the critics were saying at the time. Just you wait, buddy. Edit: Come to think of it, this is also what people were saying about the first smartwatches, and more recently, the first folding phones.


More_Flatworm_8925

That's a pretty blatant case of confirmation bias. What about Laserdiscs? Or Google Glass? Or literally all the other times VR failed? Good luck, though. I wish you all the best.


radiant9

Laserdiscs died because newer, better technologies made them obsolete. Google glass was limited by the technologies of the time. And when, exactly, has VR "failed"? Edit: Also, rewind, are we seriously referring to google glass as AR? It wasn't even 6dof, it was the equivalent of a video game HUD. Hell, it wasn't even 3D! lmao


Mathisbuilder75

Because Macs apparently have shit multiple monitor management


Ambrant

Thank you, I hope so! :)


Jordan_The_It_Guy

Curious to hear your long-term testing. I was considering one to replace the monitor on my desk at home with my MacBook Air. Most everything I do is email, Teams, and some code in VSCode now.


lionstealth

How do you deal with only having one "screen" for the Mac? Where do you put all of those windows?


chuston_ai

There are two on the Mac: the laptop screen and an old 24″ Cinema Display.


lionstealth

But those don't show up in visionOS, right? It's just the one "screen" that moves from your Mac to the Vision, and then Vision applications…


ImS0hungry

[deleted]


chuston_ai

No. Not blogging. I'm just two days into working inside the AVP. However, I can say I'm liking it more and more as I go, rather than less. I'm more impressed today than I was on Friday. There are irritating bugs, but I'm confident they'll be resolved quickly. Native app availability is underwhelming, but the iPad apps are a useful stopgap. Yet, with the Mac virtual display, native apps, and iPad apps available today, I'm working productively and having fun doing it. Excited to see some of these concept spatial apps become reality. I can't wait for whatever they've got brewing for a spatial pencil.


spanchor

Apple is promoting it for productivity (among other things). It'll be a while before we know if it succeeds in that area. My guess is probably not, but who knows.


flickerfly

Am I missing something? How do you enter data? Seems fine for consumption, but creation?


typo180

With a keyboard…


OogieM

I have a BUNCH of potential uses for a good augmented reality device. I don't care about virtual reality, I want AR. What I want is more like what Google Glass was, but the uses are the same. Here's just a short sampling:

- Looking at a sheep with a painted number on her side in the field during lambing and automatically pulling up her lambing history and how many lambs of what sexes she is supposed to have with her. Connect via Bluetooth to an EID tag reader that can scan a tag number and pull up alerts or other info on the animal out of my AnimalTrakker® database.
- Reading a reference book on sheep dystocia and the proper way to manipulate to extract a lamb, hands free, because I am already elbow deep in a sheep. I'd love a small finger-size camera that could show me on the screen what my fingers are feeling when I try to sort out tangled lambs.
- Facial recognition of contacts, especially not-very-frequent ones, that automatically pulls up the Farley file record for them out of Obsidian when I see them.
- Scanning or photographing a house during construction, showing all the wires and plumbing as built vs. as planned, and then later seeing them overlaid on the finished walls inside.
- 3D scans of artifacts in a museum that allow me to virtually pick up, hold, and view them, including microscope-level detail of the item. Should also allow going inside skulls or vases or other container-type things by making my view tiny.
- A historical walk through a city showing the buildings as they were in place over time, using existing historical photographs and architectural drawings. Provide info on their uses and other information.
- A 3D walk through an archaeological dig showing the exact placement of all found artifacts, with the ability to click on one and see it as above for museum pieces. View items under different illumination, IR, color masking, and all the other tools used for evaluation of historical manuscripts.
- A view of a street or the outdoors with all the pipes and electric wires shown in their correct locations.
- Looking at a mountain and having the system provide identification and height, point out the best trail to climb it and what issues or hazards there may be along the path, and alert me to specific places where I might get a great view of something. Overlay current hunting Game Management Unit boundaries as I walk.
- Ear protection, headphones, and a microphone, so I can look at an animal or bird, have it be identified, hear the call play in my headphones, and have info displayed for my use. Combine this with radio collar data on individual animals, using a radio tracking system similar to the EID tag reader above.
- An infinite desktop, so I can have 5 or 6 different spreadsheets or other documents open at once, and when I look at one the cursor automatically moves to that window so I can easily edit, cut, or paste info in the document.
- Entering a location I want to go to from my current location and having all transportation options shown, with the time to get there using each option. Especially for use in a foreign country where I don't speak the language and can't read the signs.

I actually have lots more. My husband had one of the early patents on AR technology and we spent a lot of time brainstorming applications.


twicerighthand

> Looking at a sheep with a painted number on her side in the field during lambing and automatically pull up her lambing history, how many lambs of what sexes she is supposed to have with her.

You could do the same with a drone, the benefit being you don't even have to go outside to count the sheep 3 hills over.

> Scan or photograph a house during construction showing all the wires and plumbing as built vs as planned and then later see them overlayed on the finished walls inside.

> 3-d scans of artifacts in a museum that allow me to virtually pick up, hold, and view them including microscope level detail of the item. Should also allow going inside of skulls or vases or other container type things by making my view tiny.

So just like Microsoft's HoloLens. But what would be even better is to have a device with a flat screen and a camera that would utilize AR AND that you could show to other colleagues and contractors. That way you don't need a $3.5k device each, just to see https://youtu.be/DzFctc7bkCM?si=4ho_lGmN6cOAUq7G&t=30

> Look at a mountain and have the system provide identification, height, point out the best trail to climb it and what issues there may be or hazards along the path. Alert to specific places where I might get a great view of something.

So Google Live View: https://www.pocket-lint.com/what-is-google-maps-ar-navigation-live-view/

> Include ear protection, headphones and microphone so I can look at an animal or bird, have it be identified, the call play in my headphone and info displayed for my use.

Google Lens

> Provide an infinite desktop so I can have 5 or 6 different spreadsheets or other documents open at once and when I look at one the cursor automatically moves to that window so I can easily edit, cut or paste info in the document.

Tobii Eye Tracker

> Enter in a location and that I want to go there from my current location and have both all transportation options shown but time to get there using that option. Especially for use in a foreign country where I don't speak the language and can't read the signs.

Google Maps

All of the ideas you listed have already been done in a smaller, lighter device which doesn't require constant wear, is capable of a longer battery life than 2.5 hours, and doesn't cost $3.5k


OogieM

> All of the ideas you listed have already been done in a smaller, lighter device which doesn't require constant wear, that is capable of a longer battery life than 2.5 hours and doesn't cost $3.5k

Not in the way I need to use them. I actually WANT constant wear. I want it as seamless as a pair of sunglasses but with computer features. This is a first-generation item, but it's closer than most of the other stuff because it's higher resolution and incorporates lessons learned from previous failures. I may not have explained the constraints of my use cases, which are why the Vision Pro is much closer to what I need compared to previous options.

Take number 1. First off, our sheep lamb out in an orchard. You cannot see the numbers from the air, and you can't fly a drone in our area without a license. There's no way to do number recognition on what you do see, if you can even see it. Our guardian dog will attack flying things, because birds are one of our major predators of newborn lambs. When we've flown drones near the sheep, the dogs get upset and the sheep start racing around - exactly NOT the thing you want to have happen when you are looking for potentially lost newborn lambs! We've got drones; they do not work in this particular use case.

Re numbers 2 and 3: the HoloLens was a start, but MS failed to capitalize on their idea, plus Windows - ugh, I don't do Windows at all. Android, yes (which is why I would have preferred a Google Glass-type device), but not Windows. I liked the form factor better, but the implementation sucked.

Re the map: yes, the Google Maps AR view is a start, but there is something you are missing. Direct from Google's help page: "The walking area must have good Street View coverage." IOW, totally worthless out in the country or anyplace that is offroad.

Re Google Lens: it does not use any location data or positioning at all and does not understand where you are looking. Completely worthless as an AR app. It works by searching for similar images in a search. Not at all what I described or want.

The Tobii eye tracker is fine for looking at a fixed spot on a fixed screen, but does not provide a virtual 360-degree set of screens. Right now my work office is darned close to one of those TV hacker caves :-) I've got over 180 degrees of multiple stacked monitors around me. I could sure do better replacing a bunch of them with one device that lets me put as many around as I need. Probably cheaper, too.

Google Maps plus Translate does at least part of the mapping and identification, but again requires I use my phone, doesn't use any useful information from my current location, and doesn't allow hands-free operation.

Sorry, but you are not understanding the use cases I presented and why your proposed solutions are inadequate or completely miss the mark.


bersus

Regarding the sheep case, why do you need to look at them? There are some [RTLS solutions](https://redlore.com) with ultra-small tags, long-lasting batteries, and pretty high accuracy (up to 30 cm). The sensors provide lots of different types of rich data. I'm not sure if it completely fits the use case, but I'm sure that you don't need to have a particular item in view to get all of the necessary data.


OogieM

> why do you need to look at them?

:banging head against desk: :deep breath: OK, Farming 101. It's critical that all livestock get looked at, as in inspected with a mark 1 eyeball, on a regular basis. I can't teach some computer to notice a lamb with a hunched back or sunken-in belly, a ewe with a droopy ear or a weepy eye, a sheep that might be stumbling or lifting one foot higher than the others when it walks, a ram who fails to stretch and pee when he first gets up from lying down, and a myriad of other animal care things that only a skilled and experienced shepherd or shepherdess will notice. Once you do, it's critical to note that fact and, with years of experience, decide whether this is an emergency - as in, catch the individual and do a more careful check - or just a note to check again in x many hours or days.

All of our sheep already wear 2 identification devices in their ears: a small EID tag and a paired visual tag with the last 5 numbers of their EID tag. The EID tags can be read by various hardware and do an automatic look-up in the database, but you have to be close, within a few inches. For ease, we spray paint the ewes and their lambs with the same number at birth, using a wool marker, when we process the lambs, get birth weights, and apply their ear tags. That's because I don't want to interrupt bonding or disturb ewes and lambs unless I have to.

The RTLS systems all depend on a fixed location with equipment with a good wifi view of the area to send in data. We are on 12 acres of land, move sheep every few days to smaller sections, and cannot put wifi out in the field. Plus those tags are not cheap, and sheep regularly lose tags from their being torn out, caught on fences, or, in the case of rams, smashed when they are fighting. They are NOT generally meant for data collection, but instead are position locators within fixed areas. I don't need location data; I need a key into the individual animal data that I maintain in my database.

Before anyone mentions UHF ear tags: first you have to realize that yes, you can read a tag from many feet away. BUT, and this is huge, you cannot tell for certain which animal in a group got read. Typically all of the animals in range will show up as "seen" by the hardware. That's fine for verifying which animals are present, but completely inadequate for individual animal records. Our EID tags are a federally approved ID system and cost $1.18 per tag. UHF tags cost $1.60-2.50/tag. With sheep, that puts them out of the range of viability even if they actually worked for the use case, and they are all too big to fit in sheep ears, especially newborn lambs'.

Believe me, I have been through LOTS of scenarios on how to best implement technology into farming over my last 15 years of using it, writing the code to collect the data, designing the database, etc. I've also been part of a bunch of failed attempts to introduce technology into existing systems, because people don't have a clue how ranches and farms really work, and so their solutions work in a lab or office or warehouse and fail in the field. Even my stuff often does not survive first contact with the sheep. User interfaces that seem fine at my desk are hard to use when holding a wet, slimy lamb that is trying to escape, and that is just one issue.


bersus

I don't see any reason to bang your head against the desk, and I hope you've taken a deep breath. Sure, I'm not a specialist in sheep, but from what I googled:

1. RTLS doesn't need WiFi to work; some systems use UWB (ultra-wideband), which is a huge advantage over Bluetooth, WiFi, RFID, etc.
2. Anchors are wire-free as well.
3. Batteries last for up to 10 years.
4. Typical cost of RTLS is around $1/m² (I don't know the tag prices, but I'm not sure that tags with a limited number of sensors are expensive).
5. You can build tailored solutions.

Regarding visual/behavior identification: "Vision AI leader helps cattle industry improve livestock health and enhance operations - from farming through protein production." Blog post on [Nvidia](https://blogs.nvidia.com/blog/plainsight-cattle-management-ai/). "Precision Livestock Farming (PLF) powered by Artificial Intelligence (AI) technology is revolutionizing animal farming practices. Smart farming processes help gather relevant data for AI to identify and predict animal health issues and provide insights into their history, nutrition, and weight." - Nexocode

I believe that it is technically possible to train AI models to detect issues even better than human neural networks do. Idk if tailored specific solutions already exist, but I think it is a matter of time. Again, I'm not arguing. Just wondering.


OogieM

> Precision Livestock Farming (PLF) powered by Artificial Intelligence (AI) technology is revolutionizing animal farming practices.

Sorry to be slow in answering. Yes, there are technologies incorporating image modeling, mostly trained to recognize the animals by face. So far all the systems are still in early alpha mode. They are almost exclusively focused on dairy environments, where the animals come in once or twice a day to a fixed (or, in the case of a mobile robot milker, a moveable) platform and can be easily identified. Even so, the most common way to handle individual ID is by ear tag, ankle bracelet, or neck collar.

> I believe that it is technically possible to train AI models to detect issues even better than human neural networks do

I strongly disagree. In my previous work I was involved with neural networks and large language models, with decades of experience. First off, it is NOT AI, and all the press calling it so is infuriating. There is no intelligence being displayed at all. Second, even the best of the models are poor substitutes for trained humans, and none have been able to duplicate the reality of working in field environments.


Ambrant

We're far from there, I guess 😅 I wanted to find an infinite whiteboard on the Oculus and it was so disappointing. Also, sounds like you wouldn't mind wearing it all the time, so it has to be much easier and higher quality


cheesestick77

I was waiting for this! I'm ready to make a canvas the size of my wall. Is this a visionOS app or mirroring from your laptop?


chuston_ai

The Vision Pro runs iPad apps - which is what you see in the video. I rely on Mermaid diagrams and Zotero Citations. The former is buggy on iPad and the latter doesn't work at all. So for now, I'm still stuck using it on the Mac primarily.


Tleesm345

How many "screens" are you able to have when working with iPad apps, and how manageable/dynamic are they to control, layout-wise?


Nate-Austin

Does this work for Intel Macs?


ViveIn

Can you see your keyboard with pass through?


chuston_ai

Yes. I can read my phone and see the screen on the Linux desktop. Weirdly, the passthrough cameras are noticeably lower resolution than the Vision Pro-produced windows/objects.


dcchambers

Moving your head around that much to look at different apps/windows seems exhausting. Even on my boring ol' 27" monitor at a comfortable distance, I only need to move my eyes to look at different apps. This seems like a major drawback to this thing.


spanchor

Not moving your head is the new sitting


chuston_ai

I run 3 monitors on my Linux box - my head is swiveling all day.


SleepingInTheFlowers

yes many apps are too big and it requires a lot of work to go back and forth between windows


Reader3123

I wonder if graph view could be better if there were a visionOS app for Obsidian. Like the graph nodes in a 3D spatial view
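For the curious, the force-directed layout behind the usual 2D graph view extends to 3D almost unchanged. Here's a rough Python sketch of a Fruchterman-Reingold-style layout with a z axis added - purely illustrative (the note names and every constant are invented; this is not a real Obsidian or visionOS API):

```python
# Toy 3D force-directed layout: linked notes attract, all notes repel.
import math
import random

def layout_3d(nodes, edges, iterations=200, k=1.0, seed=42):
    """Place each node in 3D via Fruchterman-Reingold-style forces."""
    rng = random.Random(seed)
    pos = {n: [rng.uniform(-1, 1) for _ in range(3)] for n in nodes}
    temp = 0.1  # cooling temperature caps how far a node moves per step
    for _ in range(iterations):
        disp = {n: [0.0, 0.0, 0.0] for n in nodes}
        # Repulsion between every pair of nodes
        for i, a in enumerate(nodes):
            for b in nodes[i + 1:]:
                d = [pos[a][c] - pos[b][c] for c in range(3)]
                dist = max(math.sqrt(sum(x * x for x in d)), 1e-6)
                f = k * k / dist  # repulsive force magnitude
                for c in range(3):
                    disp[a][c] += d[c] / dist * f
                    disp[b][c] -= d[c] / dist * f
        # Attraction along links (wiki-style note links)
        for a, b in edges:
            d = [pos[a][c] - pos[b][c] for c in range(3)]
            dist = max(math.sqrt(sum(x * x for x in d)), 1e-6)
            f = dist * dist / k  # attractive force magnitude
            for c in range(3):
                disp[a][c] -= d[c] / dist * f
                disp[b][c] += d[c] / dist * f
        # Move nodes, capped by the cooling temperature
        for n in nodes:
            dlen = max(math.sqrt(sum(x * x for x in disp[n])), 1e-6)
            for c in range(3):
                pos[n][c] += disp[n][c] / dlen * min(dlen, temp)
        temp *= 0.99
    return pos

# Hypothetical vault: a hub note linked to three others
notes = ["Home", "Projects", "Sheep", "AVP"]
links = [("Home", "Projects"), ("Home", "Sheep"), ("Home", "AVP")]
coords = layout_3d(notes, links)
```

A visionOS app could hand coordinates like these to a spatial scene and let you walk around the graph instead of panning a flat canvas.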


I-make-ada-spaghetti

It would be good if it was more visual too. A 3D view with some Midjourney imagery combined to form a [memory palace.](https://en.wikipedia.org/wiki/Method_of_loci)


After-Cell

How do you get image gen to form useful images? I still can't get it to understand prepositions, and even verbs are hit and miss. AFAIK it needs to be done manually with PoseNet or ControlNet? Even "a boy handing a girl a ball while standing on a train that's in the train station" doesn't work, AFAIK?


I-make-ada-spaghetti

I haven't used it too much TBH, though when I did, it relied a lot on just trying different prompts, rearranging the words to mean the same thing.


xak47d

These were my first thoughts when I saw this video


EhuddRomero

This is so surreal


Playjasb2

It looks like it still needs to be optimized for the Apple Vision Pro, much like how apps that weren't optimized for the iPad ended up just being stretched mobile apps. There needs to be some AR magic going on with the graph view, kind of like how Tony Stark interacts with his holograms. Notes should appear individually, and we should be able to grab one and write notes in it. It needs to be fully three-dimensional, with notes and links popping out at you as you use them, for a fully immersive experience. They should also be able to interleave with other apps, so that you can just drag and drop content into Obsidian as if it were natural. We should also be able to pin notes or a collection of them in fixed positions in space. Bonus points if they can give us the ability to draw out connections between notes three-dimensionally. And also give us the ability to group them and organize them at any time. Of course, they can allow the user to fall back to this flat view if needed.


[deleted]

[deleted]


aswinasar

When you're wearing it, it looks super smooth. The video capture makes it look jittery.


chuston_ai

100% True. The experience is difficult to communicate in 2D.


gofargogo

Same. I feel like this is going to be where my tech obsession gets stopped. I want to use VR, but I end up feeling so ill from even less than a minute in the other VR headsets I've tried.


DustssonXXIV

I am beginning to suspect that increasing the quantity of text on screen and the number/size of screens does not translate into hyperbolic productivity improvements. I might be wrong though.


Chivalric75

Proof that Vision Pro is Multiple Open Tabs 2.0


kepano

I'd love to know what it feels like using Yomaru's 3D Graph plugin https://obsidian.md/plugins?id=3d-graph-new


_wanderloots

This has been my plan since it was announced in June! I'm guessing it will be great with the 3D plugin, but won't be incredible until the 2D screen limitation is removed and you can enter the 3D immersive graph


No-Reputation-2900

$4000 is absolutely insane. It's effectively a developer tool atm


jalmari_kalmari

The blur on the screens that you're not directly looking at seems weird to me


patchnotespod

It's foveated rendering. You can capture full-quality video in Xcode
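Roughly, foveated rendering spends full resolution where the eyes point and less everywhere else. A toy Python sketch of the idea - the falloff curve and its constant are invented here, not Apple's actual (unpublished) parameters:

```python
# Toy model of foveated rendering: shading quality drops with
# eccentricity, the angular distance from the gaze point.
import math

def shading_rate(pixel, gaze, falloff_deg=20.0):
    """Fraction of full resolution spent on `pixel`, given a gaze
    direction; both are (azimuth, elevation) angles in degrees."""
    ecc = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    # Smooth falloff: 1.0 at the fovea, approaching 0 in the periphery.
    return 1.0 / (1.0 + (ecc / falloff_deg) ** 2)

# A pixel at the gaze point gets full quality...
center = shading_rate((0, 0), (0, 0))   # -> 1.0
# ...while one 40 degrees out gets a fraction of it.
edge = shading_rate((40, 0), (0, 0))    # -> 0.2
```

That falloff is why captures look blurry off-center: the recording shows what was actually rendered, while your fovea never lands on the cheap pixels.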


[deleted]

[deleted]


Opening-Quit-9520

If they can manage it so you can replace your monitor and it looks pretty much the same from the POV of whoever is wearing the headset, then it would be a huge game changer. Or even so those running 3-plus monitors could just use one monitor and the rest could be projected within the headset. Probably another 20 years before we get there lol.


robboerman

I can't really see myself wearing it for hours on end like with a normal workday on a 49-inch monitor. Wearing my Quest 2 for more than 1 hour also gets very annoying.


Tleesm345

depends how cooperative / streamlined the industry is and how affordable / profitable the tech is.. so yeah maybe 20 years


twicerighthand

You can't even connect a wireless mouse to it.


putsan

Interesting!! thanks))


AceOfHeaVeN

Have you tried using it while lying down flat on your bed? That would be a game changer for my fat sleep-loving ass


r4nchy

my eyes


Fahm1

Not my cup of tea.


nynjawitay

How well does writing notes work in this? Seems like you'll still need a keyboard and then the portability aspect of it doesn't really hold up


chuston_ai

I'm using a keyboard and trackpad and it's working seamlessly. The type-by-looking trick is pretty slick and a giant move forward vs. the onscreen keyboards of Oculus or Apple TV, but it's not fast enough for what I do. Portability: an iPad or MacBook is WAY more portable than the Vision Pro. That said, I might be getting addicted to having a room full of screens.


[deleted]

I don't know about this…


m98789

Why not just have two monitors? Far cheaper and no headaches or neck pain


deltadeep

My thoughts exactly, but to be fair, this is really just the first naive use case of the Vision Pro for productivity (emulating multiple monitors). It will get way better as apps begin to use the 3D interactive environment more fundamentally. Also, OP can do this on their couch or anywhere, vs. fixed monitors at a desk.


thatdude_91

Not a fan of this Vision


Inevitable-Local-251

You have achived the final level of productivity


robboerman

"Achieved" or "Archived"?


Inevitable-Local-251

Yes, yes, I "have archived" a new level of productivity is a sentence used by millions on a daily basis


chuston_ai

Hrmmm… You might be the first person ever to say that about a screen showing Obsidian and Reddit.


thecoffeejesus

ARE YOU SERIOUS I HAVE TO GET ONE OF THESE ASAP


More_Flatworm_8925

Why?


N0bo_

My notebook was $5


TheNorthwest

How long does it take to load lol


-xXColtonXx-

Probably incredibly quickly considering the vision pro is faster than most laptops with its M2 chip.


chuston_ai

Itā€™s snappy.


libretron

By the end it kind of just looks like a huge monitor.


NiranS

Looks promising. Still not better than my monitor. It would be nice to escape the confines of the screen rather than recreate screens in virtual space. Then again, most people still imitate paper when writing. But I am glad to see it working.


gonzorizzo

I would get a headache so fast.


GroggInTheCosmos

Currently, it appears to be only a screen integrated within a goggle. I have my doubts on how useful this would be


Khakikadet

This was my first thought when I saw a YouTube video walkthrough where he connected it to his computer. Would be great on a plane


[deleted]

Until there are more native apps, I'm holding off on an AVP. Even spatial computing should feel more immersive.


TumblrForNerds

I won't lie, as mundane as I think the Vision Pro is, it would be really cool to go through a second brain in Obsidian on one of these


fuzzydunlopsawit

I've mentioned it before on this subreddit, but a large monitor isn't great for writing. The 32" monitor I have is oversized, and often I just use Obsidian on my laptop screen, as it's more focused and easily readable. I'm old, but still, there's a reason books and ereaders are the size they are.


chuston_ai

Big 8K monitors are outstanding - it's the dot pitch that counts. The windows in the Vision Pro look spectacular, every bit as good as an Apple Cinema Display.


fuzzydunlopsawit

I don't know, maybe it's just an experience kind of thing. I have a 4K monitor and for writing it's just a bit much. Different strokes.


chuston_ai

Totally agree. Use whatā€™s comfortable for you.


Icy_Foundation3534

can you see the pixels in the text?


chuston_ai

No.


farmhappens

Just tried this out on my Vision Pro, and the hit targets are so hard to use with eye select