Sunday, June 12, 2022

Connecting the Dots - Part 1 - USB

A while back, I decided to audit my USB-connected devices and their accompanying cables (because that's the kind of thing I do in my spare time) in an effort to lend some semblance of order to the confused mass of cables I had steadily accrued over the years, and I recently did the same for my HDMI- and Bluetooth-enabled devices. Over the course of these audits, I discovered several things I previously didn't know about all three technologies and what I could be doing better. 

In this blog post (Part 1 of 3 in this series), I'll discuss what it was I discovered during my USB audit, what I was doing wrong, and how you can make sure you're getting the most out of your devices by optimizing your connections. In future posts I'll discuss the same for HDMI and Bluetooth.


USB - Universal Serial Bus

If you've used any device with a microprocessor in the last 20 years, you probably have at least a vague idea of what USB is, so I won't waste your time by covering the basics. Instead, I'll just let it suffice to say that USB is basically the standard interface used to physically connect digital devices in this day and age, providing both data transfer and power capabilities.


The well-known USB trident logo.


USB Data Transfer Protocols


When it comes to USB, you've probably heard that there are basically two kinds of USB devices and cables on the market today, each named after the data transfer protocol it implements: USB 2.0 and USB 3.0. (In case you're wondering, USB 1.0 came and went in the 90s with few people taking notice; it wasn't until USB 2.0 was released in 2000 that the technology started catching on.) Virtually every USB cable and device sold today implements one of these two protocols. (Note: I know that USB 3.1 and USB 3.2 exist in addition to USB 3.0, but for the sake of simplicity, I'll refer to them all as USB 3.0.)

However, have you ever wondered just what the differences are between the two versions of USB and what they mean for you? Well, wonder no more! Simply put, USB 3.0 can transfer data about 10x faster (5 Gbit/s vs. 480 Mbit/s) and can provide 80% more power (900 mA vs. 500 mA at 5 V) than USB 2.0. 
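
Here's a quick back-of-the-envelope sketch in Python using those nominal figures. Real-world throughput is lower because of protocol overhead, so treat these numbers as theoretical ceilings rather than benchmarks:

    # Rough comparison of USB 2.0 vs. USB 3.0 using nominal spec values.
    # Real-world throughput is lower because of protocol overhead.
    USB2_BITS_PER_SEC = 480e6   # 480 Mbit/s
    USB3_BITS_PER_SEC = 5e9     # 5 Gbit/s
    USB2_WATTS = 5 * 0.5        # 5 V at 500 mA = 2.5 W
    USB3_WATTS = 5 * 0.9        # 5 V at 900 mA = 4.5 W

    file_size_bits = 4 * 8e9    # a 4 GB file

    print(f"USB 2.0: ~{file_size_bits / USB2_BITS_PER_SEC:.0f} s")       # ~67 s
    print(f"USB 3.0: ~{file_size_bits / USB3_BITS_PER_SEC:.1f} s")       # ~6.4 s
    print(f"Speedup: ~{USB3_BITS_PER_SEC / USB2_BITS_PER_SEC:.1f}x")     # ~10.4x
    print(f"Extra power: {(USB3_WATTS / USB2_WATTS - 1) * 100:.0f}%")    # 80%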


USB Connector Types


Unfortunately, there's another layer to understanding USB which tends to confuse people. In addition to there being two data transfer protocols to worry about, there are also many different USB connector types to consider. 

On occasion, you've probably found yourself digging through a pile of USB cables looking for one that has an end in the shape of the port that you're trying to plug it into. With so many different types of USB connectors, this really has become a nightmare for consumers. However, I'll try to clear up the confusion by using the below chart to show you the most common connector types and which data transfer protocol each one implements.

The most common USB connector types, with the exception of USB-C (more on that later).

USB 2.0 Connectors


On the top left of the chart, you can see a USB 2.0 Type A connector. This is the standard USB connector that you probably see every day and is used by many different devices.

Next to it is the USB 2.0 Type B connector. This connector is typically found on peripherals such as printers and scanners; it marks the device end of the connection, while the Type A end plugs into the host computer.

On the middle left is a USB 2.0 Mini-A connector, next to a Mini-B. These connector types were introduced as a first attempt to design a category of compact connectors for use with small, mobile devices such as smartphones, tablets, and digital cameras.

As devices became thinner over time, a new category of USB connectors was needed; the result was the USB 2.0 Micro category, located on the bottom left of the chart. Besides being smaller, the Micro category is more durable than the Mini category it replaced, designed to survive 10,000 connect-disconnect cycles.


USB 3.0 Connectors


With the introduction of the USB 3.0 protocol, new connector types were needed to handle the extra power and data. To help consumers tell which protocol a given USB cable or device implements, a blue insert is usually included inside the connector on the cable and inside the port on the device to indicate implementation of the USB 3.0 protocol (hence the blue outline on the chart). 

An example of a PC that has both USB 2.0 Type A (center-right) and USB 3.0 Type A (center-left) ports. Note the blue insert in the USB 3.0 ports signifying the USB 3.0 protocol.


On the top right of the chart is a USB 3.0 Type A connector, which is simply a USB 2.0 Type A connector with five additional contact points (called pins). These additional pins help the cable achieve the data transfer rate and power output required by the USB 3.0 protocol.

Likewise, the USB 3.0 Type B connector is very similar to its 2.0 predecessor, with an extra set of five pins inside the added notch at the top of the connector.

The USB 3.0 Micro-B connector is essentially a USB 2.0 Micro-B with a second five-pin section joined at the hip, doubling the number of pins available from five to ten. Unlike its USB 2.0 predecessor, the USB 3.0 Micro-B connector is usually used with external data storage devices instead of mobile devices.

So, there ya go! An easy-to-understand breakdown of the most common USB connectors. Now when you're searching for that particular cable, you know which name to scream at the wall in frustration.
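
If it helps to see that breakdown as data, here's the same chart as a small Python lookup table (the names are my own shorthand, not official USB-IF terminology):

    # Connector type -> protocol it implements, per the chart above.
    # Names are informal shorthand, not official USB-IF identifiers.
    CONNECTOR_PROTOCOL = {
        "Type A (2.0)":  "USB 2.0",
        "Type B (2.0)":  "USB 2.0",
        "Mini-A":        "USB 2.0",
        "Mini-B":        "USB 2.0",
        "Micro-A":       "USB 2.0",
        "Micro-B":       "USB 2.0",
        "Type A (3.0)":  "USB 3.0",          # blue insert, 5 extra pins
        "Type B (3.0)":  "USB 3.0",          # extra notch with 5 more pins
        "Micro-B (3.0)": "USB 3.0",          # 10 pins, two-part connector
        "USB-C":         "USB 2.0 or 3.0",   # more on that below
    }

    print(CONNECTOR_PROTOCOL["Micro-B"])     # USB 2.0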


What about backward compatibility?


Good question. A big selling point of USB is backwards compatibility; that is, newer USB cables should (theoretically) work with older USB devices.

The good news is that the USB 3.0 protocol is backward compatible with the 2.0 protocol. The bad news is that in order for a USB 2.0 device to use a USB 3.0 cable, the connectors must also be compatible. 

As you can see from the chart, there's no way you're going to cram a USB 3.0 Micro-B connector into a USB 2.0 Micro-B port (likewise for the Type B connectors). For Type A, however, you can plug a USB 3.0 Type A cable into a USB 2.0 Type A port, and everything should work fine; the additional pins in the connector simply won't be used by the device.
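
To put that logic in code form, here's a rough Python sketch (the function and its names are mine, purely for illustration): the connectors have to physically fit, and the link then falls back to the older of the two protocols.

    # Illustrative sketch of USB backward compatibility: the link runs at
    # the slower of the two protocol versions, but only if the connectors fit.
    NOMINAL_SPEED = {"USB 2.0": "480 Mbit/s", "USB 3.0": "5 Gbit/s"}

    def effective_protocol(cable_protocol, device_protocol, connectors_fit):
        if not connectors_fit:
            return None  # e.g. a 3.0 Micro-B plug won't go into a 2.0 Micro-B port
        # The bus falls back to the older protocol.
        return min(cable_protocol, device_protocol)  # "USB 2.0" sorts before "USB 3.0"

    # USB 3.0 Type A cable plugged into a USB 2.0 Type A port: fits, runs at 2.0 speed.
    link = effective_protocol("USB 3.0", "USB 2.0", connectors_fit=True)
    print(link, NOMINAL_SPEED[link])  # USB 2.0 480 Mbit/s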

In the end, the backwards compatibility we were initially promised by USB was only partially delivered. 

Consumers -1, USB Cable Manufacturers +1


What about USB-C?


That's right. The one commonly used USB connector type that is not on the above chart is also the newest one, USB-C.

So many pins!

USB-C is a new connector type intended to replace the Type A, Type B, and Micro-B connectors so that there will be only one common connector type going forward, reducing confusion and frustration for consumers. Furthermore, with a whopping 24 pins, USB-C is intended to be as future-proof as possible, providing the capability to handle large amounts of data and power. Also, USB-C cables can implement either USB 2.0 or USB 3.0 (typically the latter), whereas the other connector types are all tied to a specific protocol.

Devices today are increasingly switching over to USB-C, though adoption hasn't been as fast as many would like (I'm looking at you, Apple). Still, the tide seems to be turning, and it might not be long before you can throw away that tangled mess of Type A, B, and Micro-B cables.

What I learned

For my part, I never really tried to understand the different connector types, what they're used for, and how that relates to the USB protocol they implement. Instead, I was content to play "Find the plug (A) that looks like it goes into port (B)" ad nauseam. 

That said, after my audit, I was able to document how many USB-connected devices I owned, which version of the USB protocol they implemented, and which connector type they used. This helped me declutter my collection of cables by separating the ones I knew I needed and storing them near the devices that used them while also discarding the ones I knew I wouldn't need (usually because the device the cable came with was long gone).

Additionally, I found that I had two USB 3.0 devices that I was connecting using a USB 2.0 cable, meaning that I wasn't taking full advantage of either device's capabilities; without this audit, I would have never recognized my mistake.

What you can apply

My advice is this: Check to make sure that you're using the right cable for the device. Just because the USB cable you're using fits and your device works doesn't mean that you're doing everything right. Using the wrong cable can potentially keep you from realizing your device's full potential.

The easy answer is to simply use the USB cable that came packaged with your device, as it almost certainly implements the same USB protocol as the device itself. In the event that said cable isn't available, you'll need to employ some deductive reasoning to figure out which cable you need to use.

For most connector types, only one USB protocol is implemented. If your device has a Mini, Micro, or Type B port, then the only cable that will fit that port also implements the correct corresponding USB protocol.

For Type A and USB-C devices, it gets a little trickier. USB 2.0 and USB 3.0 Type A ports look the same and are physically compatible with both USB 2.0 and 3.0 Type A cables; the only visible difference is a blue insert that device manufacturers usually add to indicate implementation of the USB 3.0 protocol.

USB-C's versatility gives it the ability to implement either USB 2.0 or 3.0. Unfortunately, there is no visible difference between USB 2.0 or 3.0 USB-C cables and ports.


Since you can't tell a USB 2.0 USB-C port from a 3.0 one just by looking, my rule of thumb is to always use USB 3.0 cables with USB-C devices, just to be safe.

Overall, the rule is simple: Use USB 3.0 cables with USB 3.0 devices and USB 2.0 cables with USB 2.0 devices.
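
If you'd rather see that decision process spelled out, here's a tiny rule-of-thumb helper in Python (again, the function and port names are my own informal shorthand, not anything official):

    # Rule-of-thumb cable picker, following the advice above.
    # Port names are informal shorthand, not official USB-IF terminology.
    def recommend_cable(port_type, port_is_blue=False):
        if port_type in ("Mini-A", "Mini-B", "Micro-B (2.0)", "Type B (2.0)"):
            return "USB 2.0 cable (the only kind that fits)"
        if port_type in ("Type B (3.0)", "Micro-B (3.0)"):
            return "USB 3.0 cable (the only kind that fits)"
        if port_type == "Type A":
            # 2.0 and 3.0 Type A ports look alike; the blue insert is the clue.
            return "USB 3.0 cable" if port_is_blue else "USB 2.0 cable"
        if port_type == "USB-C":
            # No visible difference between 2.0 and 3.0 ports, so default to 3.0.
            return "USB 3.0 cable, to be safe"
        return "Check the device's documentation"

    print(recommend_cable("Type A", port_is_blue=True))  # USB 3.0 cable
    print(recommend_cable("USB-C"))                      # USB 3.0 cable, to be safe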

I hope this helped answer any questions you may have had about USB cables and devices. Be sure to keep an eye out for my post on HDMI in the coming weeks!