Choices, choices. When it comes to connecting a computer to a monitor (or picking a monitor to buy) one of the last things you want to get bogged down in is a decision over which connections you need.
And the truth is, a lot of people overthink it — in most cases it doesn’t matter. It can matter when new technologies overlap, such as when HD was giving way to 4K, but then you usually don’t have a choice.
For general-purpose use — and a single 4K display is no longer a special case — it doesn’t matter. You’re probably better off with HDMI, in general, simply because monitors with DisplayPort (DP) in addition to HDMI tend to be more expensive.
But if you’re one of the edge cases where you do need to think about it — for console or PC gaming, business travel or multimonitor configurations — here are some guidelines.
On the road
If you travel for business and want to hook up to monitors in different locations, it’s HDMI all the way (and version doesn’t really make a difference). That’s the most pervasive connection type. You may want to ensure your laptop has a built-in full-size HDMI connection as well, though they’re getting scarce; while you can use USB-C-to-HDMI or micro HDMI-to-HDMI dongles, they’re easy to lose and you don’t want to spend the first 20 minutes of your meeting time hunting for one. Any laptop that’s light enough to tote everywhere doesn’t have a DP connection, anyway.
For gaming… ugh. There really are no easy answers, because that opens the “AMD vs. Nvidia vs. neither” can of worms. If you’ve already got a monitor with a DP 1.4 connector and a matching graphics card, use DP, simply because it gives you the most ways to configure adaptive refresh and at the moment supports the highest refresh rates. HDMI 2.1 tops out at 144Hz uncompressed (240Hz compressed) at 4K, while DP 1.4 can hit 360Hz at 1080p.
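To see roughly why those refresh-rate ceilings fall where they do, you can compare raw pixel rates against what a DP 1.4 link carries. The sketch below is back-of-the-envelope arithmetic that ignores blanking intervals and protocol overhead (real requirements run somewhat higher), using the standard HBR3 payload figure:

```python
# Back-of-the-envelope check: uncompressed pixel data rate vs. the
# ~25.92 Gbit/s of payload a DP 1.4 (HBR3) link carries after encoding.
# Ignores blanking intervals, so real requirements are a bit higher.
DP_1_4_GBPS = 25.92

def data_rate_gbps(width, height, hz, bits_per_pixel=24):
    """Raw data rate in Gbit/s for an uncompressed 8-bit-per-channel stream."""
    return width * height * hz * bits_per_pixel / 1e9

rate_1080p_360 = data_rate_gbps(1920, 1080, 360)   # ~17.9 Gbit/s, fits uncompressed
rate_4k_144 = data_rate_gbps(3840, 2160, 144)      # ~28.7 Gbit/s, exceeds the link

print(f"1080p@360Hz: ~{rate_1080p_360:.1f} Gbit/s")
print(f"4K@144Hz:    ~{rate_4k_144:.1f} Gbit/s")
```

So 1080p at 360Hz squeezes into a DP 1.4 link uncompressed, while 4K at high refresh rates overshoots it and needs Display Stream Compression — which is why the compressed and uncompressed ceilings differ.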
If you’re trying to pick a monitor, HDMI is the budget-friendly choice, but DP gives you more options if you plan to upgrade to a new graphics card or add an external GPU in the near future. DP is also the obvious choice if you’ve got an Nvidia card and want to use G-Sync, which only works over DP.
For consoles, you really don’t have a choice — they only support HDMI — but you have to pay attention to not just the version of HDMI, but the specific list of features it supports. For features like variable refresh rate (VRR) and dynamic HDR metadata for better HDR rendering, you can’t assume that a monitor with an HDMI 2.1 connection automatically supports them. You need to check that the monitor’s or TV’s specs for HDMI 2.1 explicitly state that they do, at the resolution(s) you care about.
For color-critical work, either HDMI 2.1 or DP 1.4 has enough bandwidth to handle an uncompressed data stream, even at 4K.
For multimonitor setups, you might want (or need) both. Depending upon your graphics card you may only have one of each connection type or USB-C only, anyway, and if you’re forced to daisy chain the USB-C DP it may limit the resolution options. (For multi-computer setups — connecting two computers to a single monitor — the connection type matters less than the features of the monitor.)
Decide for yourself
If you’d like to make up your own mind, though, here are some considerations to take into account:
- If a feature requires a specific version of one of the standards, that means that both the monitor and the graphics card need to have it. In other words, if your graphics card uses DP 1.4 but your monitor is DP 1.2, you won’t get HDR.
- DP 1.2 and later support daisy chaining, allowing you to drive more than one monitor off a single output connection. The number of monitors depends upon their resolutions, and you’ll most likely need a splitter or hub. It’s not automatically supported, though, so check the monitor specs.
- Mini DP, occasionally found on gaming laptops, and USB-C DP Alt Mode both carry DP 1.4; USB-C with Alt Mode support is effectively the successor to Mini DP.
- HDR display requires DP 1.4 or HDMI 2.0a (or later). On the graphics card side, that means Nvidia GeForce GTX 1050 (i.e., Pascal) or AMD Radeon RX 400 series or newer cards. (Nvidia’s RTX series supports HDMI 2.0b, which is necessary for displaying Hybrid Log Gamma HDR but currently only really matters if you’re editing HDR video.)
- Currently, 8K requires two DP 1.4 connections for 60Hz or a single connection for 30Hz. And a monster system. But we haven’t seen any 8K monitors since a couple trickled out about five years ago.
- Adaptive sync technologies synchronize the framerate output of games with monitor refresh rates (how fast the screen can update) to prevent temporal artifacts like tearing (where you briefly see elements of two frames at the same time). For the purpose of this decision, you only really need to know that AMD FreeSync works over both HDMI and DP, while Nvidia’s G-Sync only works over DP. However, G-Sync doesn’t work over USB-C (even though it’s technically DP) because USB doesn’t directly connect to the graphics processor. That’s changing (in theory) with Intel’s 12th-gen mobile chipsets, which support designs that do.
- HDMI is almost universal on modern monitors — i.e., in every conference room everywhere — while DP is generally only available on higher end, more expensive models. Cheap monitors usually have an old version of HDMI, frequently 1.4, which only really matters if you’re trying to play 4K HDCP 2.2-protected content. Those monitors are rarely 4K, though.
- As of today, some monitors have finally added HDMI 2.1 connections, which can support generic variable refresh rates (i.e., they’ll work with a TV), resolutions above 5K and Dynamic HDR. Most current graphics cards have HDMI 2.1 support.
- If your system doesn’t have discrete graphics, it probably only has an HDMI connection, anyway.
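The daisy-chaining limits and the 8K numbers above can both be sanity-checked with rough pixel-rate arithmetic. This sketch ignores blanking intervals and MST overhead (so real-world limits are somewhat lower than the estimates), and uses the standard HBR2/HBR3 payload figures for DP 1.2 and DP 1.4:

```python
# Rough uncompressed pixel data rates vs. DisplayPort payload rates.
# Ignores blanking intervals and MST overhead, so actual daisy-chain
# counts come in lower than these optimistic estimates.
DP_1_2_GBPS = 17.28   # HBR2 payload after 8b/10b encoding
DP_1_4_GBPS = 25.92   # HBR3 payload after 8b/10b encoding

def gbps(width, height, hz, bits_per_pixel=24):
    """Raw data rate in Gbit/s for an uncompressed video stream."""
    return width * height * hz * bits_per_pixel / 1e9

# Daisy chaining: how many identical 1440p@60 displays fit on one link?
qhd_60 = gbps(2560, 1440, 60)               # ~5.3 Gbit/s per monitor
chain_len = int(DP_1_4_GBPS // qhd_60)      # 4 on DP 1.4 (optimistic ceiling)

# 8K: one DP 1.4 link handles 30Hz but not 60Hz.
rate_8k_60 = gbps(7680, 4320, 60)           # ~47.8 Gbit/s, needs two links
rate_8k_30 = gbps(7680, 4320, 30)           # ~23.9 Gbit/s, fits one link

print(chain_len, round(rate_8k_60, 1), round(rate_8k_30, 1))
```

The 8K math is why two DP 1.4 connections are needed for 60Hz: one link’s ~25.9 Gbit/s covers 30Hz with room to spare, but 60Hz nearly doubles the requirement.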