Intel’s PR rep for its Arc discrete graphics has sat down for another “Between Two Monitors” tech chat. The company has promised to keep releasing information about Arc ahead of its upcoming launch, the rationale being that gamers should have everything they need to know before making a purchase. To that end, it has been publishing short videos tackling questions from gamers. This week’s briefing discusses how Arc will handle Variable Refresh Rate (VRR) and HDR, and there’s talk of support for HDMI 2.1.
Starting with VRR, the GPU world is currently bifurcated between Nvidia’s G-Sync and AMD’s FreeSync. However, in May the Video Electronics Standards Association (VESA) formally announced its own open standard based on Adaptive Sync. It quantifies a display’s ability to deliver variable refresh rates over DisplayPort using a battery of performance tests. The effort is designed to supersede the competing proprietary standards and, hopefully, reduce confusion among customers. After all, the average gamer may not be able to parse the various G-Sync and FreeSync tiers on offer, especially with labels like “G-Sync Compatible” and “FreeSync Premium Pro” muddying the waters further.
To demonstrate Arc’s capabilities in this arena, Ryan Shrout fired up Death Stranding on a 4K, 120Hz Acer monitor using an Arc A750 GPU. He didn’t state which certification the monitor carries, only that it supports variable refresh rates. He shows the monitor syncing its refresh rate with the game’s frame rate, which hovers around 100 FPS. The game is rendering at 1440p, so overall the GPU is performing quite well in this title. He states Arc is “fully supporting DisplayPort VRR standards,” and summarizes by saying Arc will support “any and all” VESA Adaptive Sync standards. In addition, the company will be validating Arc on over 100 of the most popular VRR displays to ensure a smooth experience at launch. As an aside, it’s not clear why Ryan appears to be wearing two watches in this video; an easter egg of some kind?
Moving on to HDR, he says Arc will support it on compliant monitors. However, there are several HDR certifications that vary depending on a monitor’s brightness capabilities, including DisplayHDR 400, DisplayHDR 600, and so on. From there, he notes that HDR is difficult to demonstrate in a video, since you’d need an HDR display of your own to see the difference. Therefore, as proof it’s working, he jokes that Intel spares no expense and has a highly accurate external testing device. That “device” is a person named Ken, who looks at the monitor and agrees it’s working.
Finally, he says the lower-end Arc GPUs, which include the A310, A380, and A580, will support HDMI 2.0 natively. However, they can be outfitted with a PCON chip, which performs a protocol conversion from DisplayPort 2.0 to HDMI 2.1, to add HDMI 2.1 support. That decision will be left up to the partners making the GPUs, though. The higher-end GPUs, the Arc A750 and A770, will support HDMI 2.1 natively. A chyron on screen also states that all Arc GPUs will support DisplayPort 2.0.
The company recently stated that Arc’s launch is “now in sight,” but it’s still not clear when that will happen. The big question is whether Intel can launch before AMD’s and Nvidia’s next-gen GPUs arrive, which might be around September. Intel is probably hoping it can pull that off, as its competitors’ GPUs are rumored to be quite powerful. However, that power might come at a steep cost, both in price and in heat output. It’s therefore possible Intel is aiming to undercut them on price-to-performance and efficiency.