The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off. For example, it's clear that a lot of the Rust UI framework developers have been working on Macs for the last few years. The font rendering in many of them looks bad once you plug them into a more normal-DPI monitor. If they hadn't been using Macs with Retina displays, they would have noticed.
Also, it's not only about the screen resolution. Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!
I've reported many issues where, to reproduce them, they needed to enable 10x throttling in the browser. Or use a Windows machine.
> Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!
Part of what QA testing should be about: performance regressions.
Yes! I’m glad to see this pointed out - when working on UIs, I regularly move them between 3 monitors with varying resolution & DPI. 4k @ 200%, 2K at 125%, and 2K at 100%. This reveals not only design issues but application stack issues with DPI support.
As a designer, one should keep a couple of cheap, low-res monitors reset to the factory defaults for proofing what many users are going to see.
I must confess I felt a lot of lust looking at the self color calibration feature.
It is extremely useful if your work ends up on paper. For photography (edit: film and broadcast, too) it would be great.
My use case is comics and illustration, so a self-color-correcting Cintiq or tablet would be great for me.
This exactly. The same thing people do for sound: listen in the car, over shitty headphones, etc. That's just quality control, not the fault of any piece of equipment.
Yes this is universal in pro mixing setups, having filters or even actual physical hardware to provide the sound of stock earbuds, a crappy Bluetooth speaker, sound system in a minivan, etc.
Conversely if you only use a ~110 DPI display you won't know how bad it looks on a ~220 DPI display.
The solution here is wide device testing, not artificially limiting individual developers to the lowest common denominator of shitty displays.
Yeah sure, as long as you have a lot of resources for testing widely.
Still, if you were to make an analogy, you should target a few devices that represent the "average", just as it's done for (most) pop music production.
> if you were to make an analogy, you should target a few devices that represent the "average"
For Macs, 220DPI absolutely is the average.
I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from - my former employers equipped all the software engineers with dual-4K displays nearly a decade ago.
One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world, and >2K displays have been cheap on desktop for a really long time.
I believe there are a lot of people using 1080p monitors because they bought them a while ago and they're still working fine. There are also a lot of lower-end 1080p monitors still being sold today.
> One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world
I personally see a lot of 1080p screens on new gaming laptops too. Lots of people get those for work from what I see with my peers. When I sold my RTX 3060 laptop with a 1080p screen, most buyers wanted it for professional work, according to them.
> I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from
If anything, this is exactly the place where I'd expect a bunch of people to be rocking an older Thinkpad. :)
I don't get marketing people. The only link in the press release is to Adobe's Creative Cloud. Why aren't there two taps to buy the monitor with Apple Pay and have it shipped when it's available?
> The redemption period ends August 31, 2026. For full details, visit https://www.asus.com/content/asus-offers-adobe-creative-clou....
Well, the monitor is €8,999, so maybe it’d be more than two taps for me:
> The monitor is scheduled to be available by October 2025 and will costs €8,999 in Europe (including VAT)
Buy a €9k monitor and get 3 months of a cloud subscription for free. What a deal!
If you’re not careful, that adobe creative cloud sub will cost you more than the monitor when you try to cancel
Too rich for me. Also I don’t need a creative cloud sub. But I’m the wrong customer for such a monitor.
I’ll wait till 8k becomes more of the norm for say 1-1.5k
Human eye resolution is about 1 arcminute. The comfortable field of view is about 60°, or 3600 arcminutes. A 4K display should mostly suffice %)
But I run 2x scaling on my Mac, so an 8K is effectively a 4K.
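For anyone who wants to sanity-check the 1-arcminute arithmetic above, here's a rough sketch (the acuity and field-of-view figures are the rule-of-thumb values quoted, not hard limits):

    # Rough check: how many pixels can a 1-arcminute eye resolve across a 60° field of view?
    acuity_arcmin = 1.0   # assumed angular resolution of the eye
    fov_deg = 60.0        # assumed comfortable horizontal field of view

    resolvable_pixels = fov_deg * 60 / acuity_arcmin   # 60 arcminutes per degree
    print(resolvable_pixels)  # 3600.0 -> a 3840-wide (4K) panel roughly saturates this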
Me and a friend were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no improvements resolution-wise since then, apart from high-end pro monitors.
Why is this? 5k/6k at 27" would be the sweet spot for me, and potentially 8k at 32". However, I'm not willing to drop $2k per monitor to go from a very nice 27" 4k to 27" 5k.
You can get 8K TVs for <$1000 now. And a Quest 3 headset has 2 displays at far higher PPI for $600.
Because the vast majority of monitor sales volume is (public) tenders from companies buying in huge volumes, and those companies still mostly look for monitors below 4K (without fancy specs and without, e.g., USB-C).
If 4K reaches the mass market for those, the specs will shift down and there will be room in the (much smaller) premium-tier monitor segment.
Heck, even if you just want USB-C and an integrated webcam on an average display, the price hike compared to one without them is crazy, because everything except those basic office monitors is still niche production...
> and potentially 8k at 32"
What's your actual use-case for this? I run a 32" 4K, and I have to stick my nose within a foot (~30cm) of the display to actually spot individual pixels. Maybe my eyesight isn't what it used to be
I'd kill for a 40" 5k or 6k to be available - that's significantly more usable desktop real estate, and I still wouldn't be able to see the pixels.
The likelihood of dead pixels increases quadratically with resolution, hence panel yield drops correspondingly. In addition, the target audience who has hardware (GPUs) that can drive those resolutions is smaller.
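To make the yield argument concrete, here's a rough sketch; the per-pixel defect rate is a made-up illustrative number, not a real manufacturing figure:

    # P(at least one dead pixel) grows roughly linearly with pixel count at small
    # defect rates, and pixel count grows quadratically with linear resolution.
    p_defect = 1e-8  # hypothetical probability that any given pixel is dead

    for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
        pixels = w * h
        p_any_dead = 1 - (1 - p_defect) ** pixels
        print(f"{name}: {pixels / 1e6:.1f} MP, P(>=1 dead pixel) ~ {p_any_dead:.1%}")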
as a gamer 8k makes me sweat because i can't imagine what kind of hardware you'd need to run a game :O probably great for text-based work, though!
You don't really need 8K for gaming, but upscaling and frame generation have made game rendering resolution and display resolution almost independent.
And if all else fails, 8K means you can fall back to 4K, 1440p or 1080p with perfect integer scaling.
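The integer-scaling point is easy to verify, since 7680x4320 divides cleanly by the common lower resolutions (a trivial sketch):

    # 8K's native resolution is an exact multiple of 4K, 1440p and 1080p,
    # which is what makes perfect integer scaling possible in principle.
    native_w, native_h = 7680, 4320
    for w, h in [(3840, 2160), (2560, 1440), (1920, 1080)]:
        exact = native_w % w == 0 and native_h % h == 0
        print(f"{w}x{h}: {native_w // w}x scale, exact = {exact}")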
Except that the hardware doesn’t necessarily offer perfect integer scaling. Oftentimes, it only provides blurry interpolation that looks less sharp than a corresponding native-resolution display.
The monitor may or may not offer perfect scaling, but at least on Windows the GPU drivers can do it on their side so the monitor receives a native resolution signal that's already pixel doubled correctly.
The Asus PA27JCV is rather less than $2k...
> Me and a friend were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no improvements resolution-wise since then, apart from high-end pro monitors.
It's mostly because the improvement over 4k is marginal. In fact, even from 1920x1080 it's not so big of a deal, which is why people keep buying such monitors in 2025.
And the worst part is that the highest-spending consumer segment for PC parts, the gamers, can't really use high-resolution displays at their full potential because it puts such a burden on the GPU (DLSS helps, but the result is an even smaller improvement over 1920x1080 than regular 4K is).
Nice monitor, but its target demographic is pretty small, and its price makes Eizo look cheap.
I’ve done a lot of color-calibrated work, and, for the most part, don’t like working in a calibrated system. I prefer good ol’ sRGB.
A calibrated system is a “least common denominator” system, where the least capable element dictates what all the others do. So you could have one of these monitors, but, if your printer has a limited gamut, all your images will look like mud, anyway, and printer technology is still pretty stagnant. There was a big burst of improvement in inkjets, but there hasn’t been much progress in a long time. Same with scanners. I have a 12-year-old HP flatbed that is still quite valid.
A lot of folks get twisted over a 60Hz refresh rate, but that’s not something I worry about. I’m old, and don’t game much. I also watch entertainment on my TV; not my monitor. 60Hz is fine, for me. Lots of room is my priority.
6K 32" ProArt model PA32QCV might be more practical for YN crowd at 1299 USD VS 8-9K USD PA32KCX will run you
I'm not buying a new monitor with a decade-old version of DisplayPort. Non-oled monitors are products that last a long time (at least a decade) so if I bought this monitor, I'd still be using DisplayPort 1.4 from 2016 in 2036. I need UHBR20 on a new monitor so I can rest assured that I will have some lanes available for my other peripherals. I've already lived the hell of needing to dedicate all 4 lanes to DisplayPort, leaving only a single USB2.0 connection remaining for all my other peripherals to share[0].
[0] https://media.startech.com/cms/products/gallery_large/dk30c2...
An aside - this monitor is proving surprisingly difficult to buy in the UK. Everywhere I look it seems to be unavailable or out of stock, and I’ve been checking regularly.
Relatedly, I also don’t understand why a half-trillion dollar company makes it so hard to give them my money. There’s no option to order ASUS directly on the UK site. I’m forced to check lots of smaller resellers or Amazon.
It was the same in the US till maybe 2-3 weeks ago. Maybe they're slowly rolling it out to various markets.
I've been enjoying the PA32QCV in the last couple months. It's definitely not perfect, but the 220 PPI at 32 inch is just amazing to code on.
I'd imagine for most people the HDR perf difference is more noticeable than the resolution. This new monitor can do 1200 nits peak with local dimming, PA32QCV can only do 600 nits peak with no local dimming. Also Dolby Vision.
Is there a good 5K monitor at 27" that doesn't burn the wallet? It's worth mentioning that it should also be very reliable, because these monitors seem to have issues after a while, especially burn-in.
ASUS ProArt PA27JCV
No mention of 120Hz; I'm waiting for a 6k or higher-density display that can do higher refresh rates.
I was going to joke about 8k@120Hz needing like 4 video cables, but it seems we are not too far from it.
[8k@120Hz Gaming on HDMI 2.1 with compression](https://wccftech.com/8k-120hz-gaming-world-first-powered-by-...)
> With the HDMI 2.2 spec announced at CES 2025 and its official release scheduled for later this year, 8K displays will likely become more common thanks to the doubled (96 Gbps) bandwidth.
Uncompressed, absolutely - we need another generation bump with over 128 Gbps for 8K@120Hz with HDR. But with DSC, HDMI 2.1 and the more recent DisplayPort 2.0 standards make it possible; support just isn't quite there yet.
Nvidia quotes 8K@165Hz over DP for their latest generation. AMD has demoed 8K@120Hz over HDMI, but not on a consumer display yet.
https://en.wikipedia.org/wiki/DisplayPort#Refresh_frequency_...
https://en.wikipedia.org/wiki/HDMI#Refresh_frequency_limits_...
https://www.nvidia.com/en-gb/geforce/graphics-cards/compare/
My primary monitor is the Samsung 57" 8Kx2K 240Hz ultrawide. That's the same amount of bandwidth, running over DisplayPort 2. It mostly works!
I have three 4K 27" which yield a bit more screen real estate. Otherwise I'd love to go to a single ultrawide.
I use the same monitor and I love it. Couldn't recommend it more to people.
Fifty seven inches??
Just two 4k monitors slapped together, it’s 8k wide but 2k tall.
Is it actually good for productivity? The curve isn’t too aggressive? Could you, e.g. stack 3 independent windows and use all 3? Or you kind of give up on the leftmost / rightmost edges ?
Also as far as 6k goes, that's half the bandwidth of 8k.
Thunderbolt 5 supports up to 120Gbps one-way.
> 4 video cables
The IBM T220 4k monitor required 4 DVI cables.
https://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors
I wouldn't hold my breath. Competing models seem to top out around 120 Hz but at lower resolutions. I don't imagine there's a universal push for higher refresh rates in this segment anyway. My calibrated displays run at 60 Hz, and I'm happy with that. Photos don't really move much, y'know.
> Photos don't really move much, y'know.
They do when you move them (scroll)
And?
Can you provide a ROI point for scrolling photos at 120Hz+ ?
Sure, give me your ROI point for an extra pixel and I can fit refresh rate in there.
It looks and feels much better to many (but not all) people.
I don't really know how you expect that to translate into a ROI point.
I imagine your mouse still moves plenty though.
I swore a blood oath that I would never buy an Asus product ever again, after three terrible laptops from them in a row, but holy hell do I kind of want this monitor.
My main "monitor" right now is an 85" 8K TV, that I absolutely love, but it would be nice to have something smaller for my upstairs desk.
I have a fantastic Asus laptop that is 8 years old now and (after an easy battery replacement) easily does everything I want from it and feels nice and solid. I was so impressed that I recommended Asus to someone else, and what they got was pretty awful.
So basically, YMMV. They make good stuff, and they make awful stuff.
What would you pick for your next laptop if you had to buy one?
I had an Asus laptop, but the frequent security firmware updates for one of the Dell laptops that I had make me think Dell might be a good candidate in terms of keeping up with security updates.
Not sure about the current latest models from Asus/Dell/HP/etc., but I liked the fact that disassembly manuals are provided for older Dell and HP machines. I can hardly find disassembly manuals for Asus when I have to do maintenance such as swapping out thermal paste/pads and clearing out the heatsink fins.
I’m only one data point, but I also swear that I would never buy an Asus laptop again. If you are fine with the operating system, a MacBook Pro is the best in my opinion. It’s not even close.
Otherwise I had okay Dell or Lenovo laptops. Avoid HP, even the high end Zbook ones. A framework might be worth a try if you have a lot of money.
I have used a ZBook G1a for the past few months because it is the only laptop with AMD's Ryzen 395+, and while not thinkpad or XPS/Precision tier, the laptop has been perfectly fine.
I've been toying with getting one of these with 128GB of RAM. What's your opinion (especially since you have compared it to thinkpad/xps)?
You can also run Asahi Linux or Windows for ARM on Macs
I run Asahi Linux as a daily. Support is imperfect and for a daily driver you can probably forget about using anything newer than an M2 at the moment. On my M2, missing features include USB-C video out and microphone support. Windows on ARM is worse and has zero drivers for Mac hardware as far as I know.
What are the cons of having a large TV as a monitor? I've been considering something like this recently, and I wonder why is this not more common.
Depending on the specific TV, small details like text rendering can be god-awful.
A bunch of TVs don't actually support 4:4:4 chroma subsampling, and at 4:2:2 or 4:2:0 text is bordering on unreadable.
And a bunch of OLEDs have weird sub-pixel layouts that break ClearType. This isn't the end of the world, but you end up needing to tweak the OS text rendering to clean up the result.
Someone mentioned the latencies for gaming, but also I had a 4K TV as a monitor briefly that had horrible latency for typing, even. Enough of a delay between hitting a key and the terminal printing to throw off my cadence.
Only electronic device I’ve ever returned.
Also they tend to have stronger than necessary backlights. It might be possible to calibrate around this issue, but the thing is designed to be viewed from the other side of a room. You are at the mercy of however low they decided to let it go.
You could probably circumvent this by putting the display into Gaming Mode, which most TVs have. It removes all the extra processing that TVs add to make the image "nicer". These processes add a hell of a lot of latency, which is obviously just fine for watching TV, but horrible for gaming or using as a pc monitor.
It was a while ago (5 years?), so I can’t say for certain, but I’m pretty sure I was aware of game mode at the time and played with the options enough to convince myself that it wasn’t there.
> horrible latency for typing
Was this the case even after enabling the TV's "game mode" that disables a lot of the latency-inducing image processing (e.g. frame interpolation)?
Game mode is a scam. It breaks display quality on most TVs, and still doesn't respond as fast as a PC monitor with <1ms latencies... it might drop itself to 2 or 3 ms, which is still at least 2x or 3x slower.
You can think 'but that's inhumanly fast, you won't notice it', but in reality this is _very_ noticeable in games like Counter-Strike, where hand-eye coordination, speed and pinpoint accuracy are key. If you play such games a lot, you will feel it if the latency goes above 1ms.
Where are you finding monitors with <1ms input lag? The lowest measured here is 1.7ms: https://www.rtings.com/monitor/tests/inputs/input-lag
Most people lack an understanding of displays and of what they're quoting, and are in fact quoting the vendor's claimed pixel response time as the input lag.
It's gotta be the most commonly mixed-up thing I've seen in the last twenty years as an enthusiast.
Well, at least I didn't misunderstand my own lack of understanding :D
The part about feeling the difference in response times is true though, but I must say, my experience is a bit dated ^^ I see that high-resolution monitors generally have quite slow response times.
<1ms was from CRT times :D which was back in my main Counter-Strike days. I do still find noticeable 'lag' on a TV vs. a monitor, though I've only tested at HD (1080p) - I own only one 4K monitor, and my own age-induced latency by now far exceeds my display's latency :D
false advertisements :D
High latency on TVs makes them bad for games etc., as anything that's sensitive to I/O timing can feel a bit off. Even 5ms compared to 1 or 2ms response times is very noticeable in hand-eye coordination across input -> monitor.
In the context of this thread that's a non-issue. Good TVs have been in the ~5ms@120Hz/<10ms@60Hz world for some time now. If you're in the market for a 4K-or-higher display, you won't find much better, even among specialized monitors (as those usually won't be able to drive higher Hz with lower lag with full 4k+ resolution anyway).
It sort of depends on what you perceive as 'high'. Many TVs have a special low-latency "game" display mode. My LG OLED does, and it's a 2021 model. But OLED in general (in a PC monitor as well) is going to have higher latency than IPS for example, regardless of input delay.
OLED suffers from burn-in, so you'll start seeing your IDE or desktop after a while, all the time.
I have a couple of budget vertical Samsung TVs in my monitor stacks.
The quality isn't good enough for photo work, but they're more than fine for text.
I'm sure there are reasons with regards to games and stuff, but I don't really use this TV for anything but writing code and Slack and Google Meet. Latency doesn't matter that much for just writing code.
I really don't know why it's not more common. If you get a Samsung TV it even has a dedicated "PC Mode".
"PC Mode" or "Gaming mode" or whatever is necessary - I can tell any other mode easily just by moving the mouse, the few frames of lag kill me inside. Fortunately all tvs made in this decade should have one.
If you play video games, display latency. Most modern TVs offer a way to reduce display latency, but it usually comes at the cost of various features or some impact to visual quality. Gaming monitors offer much better display latencies without compromising their listed capabilities.
Televisions are also more prone to updates that can break things and often have user hostile 'smart' software.
Still, televisions can make a decent monitor and are definitely cheaper per inch.
For me, on macOS, the main thing is that the subpixel layout is rarely the classic side-by-side RGB, which is the only layout macOS supports for text antialiasing.
If I were to use a TV, it would be an OLED. That being said, the subpixel layout is not great: https://pcmonitors.info/articles/qd-oled-and-woled-fringing-...
IIRC Apple dropped sub pixel antialiasing in Mojave or Sonoma (I hate these names). It makes no sense when Macs are meant to be used with retina class displays.
A.K.A. workaround for a software limitation with hardware. Mac font rendering just sucks.
For me it's eye fatigue. When you put a large 4K TV far enough away that it covers the same viewing angle as a 27" desk monitor, you're almost 1.5m away from it.
I have been using a 43-inch TV as a monitor for the last 10 years, currently an LG. You get a lot of screen space, and you can sit away from the desk and still use it. Just increase the zoom.
Usually refresh rate and sometimes feature set. And it’s meant to be viewed from further away. I’m sure someone else could elaborate but that’s the gist.
There is a lot of marketing material at the linked page. But there is no mention of price or available sizes. Also, there is no link to purchase one. This is November. I can look these things up, but why link to a PR fluff piece if there's something more substantial available?
Here's some specs: https://www.asus.com/displays-desktops/monitors/proart/proar...
8K, 32-inch, 275 ppi, 60 Hz, 2x Thunderbolt 4, 1x DisplayPort 2.1
> But there is no mention of price and available sizes
No idea about prices, but, assuming they follow the usual conventions for model codes, that's a 32" unit.
8K HDR implies that DSC becomes unavoidable... but DSC's "visually lossless" criterion relies on the human eye and is statistically subjective at face value.
Any domain experts know how that actually squares in practice against automated colorimeter calibration?
DisplayPort 2.1 (which the monitor supports) provides sufficient bandwidth for 7680x4320@60 Hz 10-bit without DSC when using UHBR20. The press release unfortunately doesn’t clarify whether the monitor supports UHBR20 or only the lower UHBR10 or UHBR13.5 speeds. Of course, the GPU must also support that (Nvidia RTX 5000 only at the moment, as I believe AMD RX 9000 is only UHBR13.5).
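A rough bandwidth sketch backs that up (the blanking overhead here is an approximation; the 128b/132b and 16b/18b factors are the published DP 2.x and HDMI 2.1 FRL coding overheads):

    # Back-of-the-envelope link-budget check; treat these as rough numbers.
    def raw_gbps(w, h, hz, bits_per_channel, blanking_overhead=1.06):
        return w * h * hz * bits_per_channel * 3 * blanking_overhead / 1e9

    needed_8k60 = raw_gbps(7680, 4320, 60, 10)    # ~63 Gbps
    needed_8k120 = raw_gbps(7680, 4320, 120, 10)  # ~127 Gbps

    uhbr20_effective = 80 * 128 / 132  # DP 2.1 UHBR20 after 128b/132b coding, ~78 Gbps
    hdmi21_effective = 48 * 16 / 18    # HDMI 2.1 FRL after 16b/18b coding, ~43 Gbps

    print(f"8K60 10-bit needs ~{needed_8k60:.0f} Gbps; UHBR20 carries ~{uhbr20_effective:.0f} Gbps, HDMI 2.1 ~{hdmi21_effective:.0f} Gbps")
    print(f"8K120 10-bit needs ~{needed_8k120:.0f} Gbps -> DSC or a faster link")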
I believe you're right regarding AMD's lack of UHBR20 on its cards. Fingers crossed for their next gen!
8K 60fps 4:4:4 8bpp uncompressed requires a 96gbit HDMI cable, which is labeled Ultra96 in HDMI 2.2 afaik: https://www.hdmi.org/download/savefile?filekey=Marketing/HDM...
DisplayPort over USB4@4x2/TB5 at 120Gbps would be required for uncompressed 12bpp.
Apologies for not tracking; the monitor in question is spec'd with HDMI 2.1 and TB4 I/O.
Apologies are never expected when it comes to USB and HDMI naming and spec stuff. I have to look them up every time.
But that's 8K with DSC, or... 24fps maybe, then? Weird oversight/compromise for such a pro color-focused monitor; perhaps Asus reused their legacy monitor platform. "8K HDR" at 24fps could be a niche for theater movie mastering, perhaps?
I tried a 32" 4k for a while but the form factor never worked for me. 8k seems absurd after working with that monitor.
27" 1440p is much easier to drive and live with day to day. I can still edit 4k+ content on this display. It's not like I'm missing critical detail going from 4k=>qhd. I can spot check areas by zooming in. There's a lot of arguments for not having to run 4k/8k displays all day every day. The power savings can be substantial. I am still gaming on a 5700xt because I don't need to push that many pixels. As long as I stay away from 4K I can probably use this GPU for another 5 years.
32" 4k is pretty much the worst of all worlds configuration. It is just dense enough that traditional 100% scale is not great, but not dense enough to get that super smooth hidpi effect either. I'd argue that for desktop monitors around 200 ppi is sweet spot, so 5k for 27" or 6k for 32".
This 8k is bit overkill, but I suppose makes some sense to use a standard resolution instead of some random number.
These things aren't for use in an office setting where you're fiddling with a web browser, Excel, or writing software. They're for situations where colour calibration matters, so either designing for print, or working on video.
Particularly for the people doing video an 8k display is great - that means you can have full resolution 4k video on screen with space around it for a user interface, or you can have a display with the 8k source material on it if the film was shot at that resolution.
Can confirm. I use a Dell 6K 32", and it's frankly amazing. I still use an older Dell 4K 24" (rotated 90º) off to one side for email/slack/music but I just use the single 32" for ~90% of what I do.
There are two instances where 32" is helpful. First, for Xcode and Android Studio, where you write some UI code with the phone/tablet preview on the right, in both horizontal and vertical orientation.
And second, for writing and research: recently I had to get a certificate for which I had to write a portfolio of old-fashioned essays. 32", or even 40", is extremely helpful for this. Basically I kept my screen organized in three columns, with the word processor on the left and two PDFs in the middle and on the right.
42" 4k 100%
I don't want to ever go back but I got this 2020 Dell for 200. I don't want to pay 800-1400 if I ever have to replace it
I HATE (yes, all caps) Apple for very actively discouraging 1440p as a useful resolution (as in, it is literally, not figuratively, painful to use out of the box). I'm a happy customer of BetterDisplay just to make it bearable, but it's still not as sharp as any other OS.
This looks amazing for creators — 8K, HDR, and auto calibration in one screen!
The specs look impressive, especially the 8K HDR and built-in color calibration. It’ll be interesting to see how it performs compared to Apple’s Pro Display XDR in real workflows.
I shudder to think how small the macOS ui text will be on this but I’m willing to find out.
For macOS, 8K should have a larger screen. This 8K monitor is 32 inches, which leaves us with a very awkward 275ppi. 42" would be 209ppi, which is great for 16.5" from your face. 48" would be 183ppi, which is great for 18.8" from your face (my preference). But at 32" and 275dpi, that would be a 12.5" viewing distance, which is far too close for a 32" monitor. You'd be constantly moving your neck to see much of the screen--or wasting visual acuity by having it further.
macOS is optimized for PPIs at the sweet spot in which Asus's 5K 27" (PA27JCV) and 6K 32" (PA32QCV) monitors sit. Asus seemed to be one of the few manufacturers that understand a 27" monitor should be 5K (217ppi), not 4K (163ppi). 4K will show you pixels at most common distances. But if you follow that same 217ppi up to 8K, that leads to 40.5" not 32".
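For anyone who wants to reproduce those numbers, here's a small sketch (using the usual ~1-arcminute rule of thumb for when individual pixels stop being resolvable; real comfort distances vary from person to person):

    import math

    # PPI from panel geometry, and the distance at which one pixel subtends ~1 arcminute.
    def ppi(w_px, h_px, diag_in):
        return math.hypot(w_px, h_px) / diag_in

    def min_distance_in(ppi_value, arcmin=1.0):
        return (1 / ppi_value) / math.tan(math.radians(arcmin / 60))

    for label, w, h, diag in [('8K 32"', 7680, 4320, 32),
                              ('8K 42"', 7680, 4320, 42),
                              ('5K 27"', 5120, 2880, 27),
                              ('4K 27"', 3840, 2160, 27)]:
        p = ppi(w, h, diag)
        print(f"{label}: {p:.0f} ppi, pixels vanish from ~{min_distance_in(p):.1f} inches away")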
My wife has a triple vertical PA27JCV setup and it's amazing. I've been able to borrow it for short stints, and it's nearly everything I've ever wanted from a productivity monitor setup.
Yeah I currently daily drive a 43" monitor and it has been a life changer since I got it in 2022.
I'm still happy with it, would kill for an 8K 43" 120hz monitor but that's still a ways away.
What is the right size for a 4K monitor and the distance from our eyes? I have a 27" Skyworth monitor already. If I set the macOS resolution at 4K, the default font is too small. My distance from the monitor is around 16.5".
You can scale the UI according to your preferences, but the real problem is that if your monitor’s ppi is not close to the macOS sweet spot of 220ppi (or an integer multiple thereof) you’re going to have aliasing issues with text and other high contrast elements.
https://griffindavidson.com/blog/mac-displays.html has a good rundown.
I recently (a couple of weeks ago) got the 6K version of this screen, the Asus PA32QCV. It has the same pixel density as my MacBook Pro, so the UI looks great. To be honest, it's enough screen real estate that I now operate with my laptop in clam shell mode.
My only complaint is that the KVM leaves a bit to be desired. One input can be Thunderbolt, but the other has to be HDMI/DisplayPort. That means I need to use a USB-C cable for real KVM when switching between my two laptops. I'd like two cables, but four cables isn't the end of the world.
You can run it natively, but it is better to downscale to 4k or 1080p. I run three 5k versions of this monitor and they are all downscaled to 1440p. I get 1:1 pixel mapping so text looks crisp in every app except Microsoft Teams.
Isn't downscaling the wrong term? You're still taking advantage of its native resolution.
It'll look normal, maybe even a little big by default if the XDR is anything to go by
OSX does great at scaling UIs for high resolutions
Why does it have blinders?
To prevent glare and reflections usually. Similar to how a lens hood functions.
I long for an "Eizo"-like quality monitor, 15 or 17 inch, with a "retina" PPI count.
Am I the only one who thinks this would make sense?
Hey, I think that's a great idea, too. 4K panels on phones (tiny!) exist for some absurd reason. But somehow there are no 22" 4K monitors. I think they probably don't sell well. Probably the same reason why all monitors are 16:9.
About twice the price of the Dell 8k.
This is a direct competitor to the Apple Pro Display XDR.
I wouldn’t be surprised if it comes in at a similar price point.
The sustained 1,000 nit HDR and Dolby Vision support suggest their target market is very specifically film color grading.
It's already on sale in the Chinese market for about 70,000 CNY, so the price is likely around 9,000–10,000 USD.
How much
Realistically, what's the point of all those pixels at 32 inches? 5K at 27 inches seems more than enough.
If you need 5k at 27 inches, you need more at 32". But if you're saying that 32" are excessive, I think it's a personal preference. I would never go back to a smaller monitor (from 32) personally - especially as you grow older.
Apparently, ASUS believes there's an addressable market willing to pay a premium for +26.5% color-calibrated ppi in larger form factor.