Jacob Roach / Digital Trends

As former game developer and product manager for Dolby Vision Gaming Alexander Mejia points out, static metadata creates a big problem for game developers: "There are more and more HDR TVs, monitors, and laptops on the market than ever, but if you grab a couple from your local big-box retailer, your game is going to look drastically different on each one … How do you know that the look you set in your studio will be the same one the player sees?"

The vast majority of monitors are dealing with static metadata. There are a few HDR10+ monitors, but they're exclusively Samsung's most expensive displays. Only a select few monitors support Dolby Vision, like Apple's Pro Display XDR, and none of them are gaming monitors. TVs and consoles widely support Dolby Vision, however, and that dynamic metadata is a big reason why console HDR is so much better than HDR on PC.

There are numerous examples of developer apathy toward HDR. Mejia writes that developers "still need to deliver a standard dynamic range version of your game - and creating a separate version for HDR means twice as much mastering, testing, and QA." It doesn't help the matter that HDR is usually an afterthought for game developers. On my Samsung Odyssey G7, for instance, Tiny Tina's Wonderlands looks dark and unnatural with HDR turned on, but Devil May Cry 5 looks naturally vibrant. Look up user experiences on these two games, and you'll find reports ranging from the best HDR game ever to downright terrible image quality.
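To see why static metadata causes the inconsistency Mejia describes, consider a toy tone-mapping sketch. This is not a real HDR pipeline; the function, the naive linear curve, and all the nit values are hypothetical, chosen only to illustrate the idea. With static metadata, the display compresses every scene against one container-wide peak brightness; with dynamic metadata, it can adapt to each scene's actual peak.

```python
def tonemap(nits, metadata_peak, display_peak):
    """Naive linear tone map (illustrative only).

    nits:          luminance of a pixel in the scene
    metadata_peak: peak brightness the metadata advertises
    display_peak:  what the display can actually output
    """
    if metadata_peak <= display_peak:
        # Content fits within the display's range: pass it through.
        return min(nits, display_peak)
    # Otherwise compress the whole advertised range to fit.
    return nits * display_peak / metadata_peak

# A dark scene whose brightest pixel is 200 nits, shown on a 600-nit monitor.
scene_peak = 200.0

# Static metadata: one peak (say, 4000 nits) for the entire game,
# so even this dark scene gets squeezed down hard.
static = tonemap(scene_peak, metadata_peak=4000.0, display_peak=600.0)

# Dynamic metadata: the scene reports its own 200-nit peak,
# so no compression is needed at all.
dynamic = tonemap(scene_peak, metadata_peak=scene_peak, display_peak=600.0)

print(static, dynamic)  # 30.0 200.0
```

Under the static scheme the dark scene's 200-nit highlight lands at 30 nits, which is exactly the kind of "dark and unnatural" result described above, while per-scene metadata preserves it. Real displays use far more sophisticated curves, but the mismatch between advertised and actual peak is the same.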