Sunday, July 24, 2022

Connecting the Dots - Part 2 - HDMI

For Part 2 of my "Connecting the Dots" series, I'll be discussing HDMI, both the underlying technology and the cable types we use to connect HDMI-enabled devices. If you've ever wondered what the different versions of HDMI mean and which cables to use to ensure that you get the best multimedia experience with your equipment, this post is for you. Let's get started!


HDMI - High-Definition Multimedia Interface


If you've owned a TV or multimedia device of any kind over the last 15 years or so (and I'm betting you have), you've probably heard the term HDMI, whether in reference to your device's features or the cables used to connect your devices.

Simply put, HDMI is the standard interface type used to connect multimedia devices (much like USB is the standard for connecting computing devices). HDMI has gone through a handful of versions over the years, and there are 4 types of HDMI cables. 

Sounds straightforward, right? Unfortunately, our friends in the TV industry managed to make what was supposed to be a simple standard somewhat complicated, but hopefully I can clear up some of the confusion in this post.

Not pronounced "Hedemi"

HDMI Versions


HDMI 1.0 was released in December 2002 as a new standard interface type for transmitting both audio and video signals; this was a major innovation, as most multimedia transmission interfaces up to that point carried either video or audio, not both. The promise of HDMI was that it would make multimedia connections simpler by providing a single connection that could handle a variety of signal types.

Despite this promise, HDMI didn't see widespread adoption in consumer electronics until the rise of high-definition TVs in the late 2000s. By then, the most common HDMI version was HDMI 1.4.


HDMI 1.4b


HDMI 1.4 was released in June 2009, and pretty much every TV manufactured since 2010 supports its final revision, HDMI 1.4b. Now the oldest of the HDMI versions still in production, HDMI 1.4b helped kickstart the rise of high-definition TVs by supporting video up to 1080p @ 120Hz and 4K @ 24Hz, along with surround sound via a feature called Audio Return Channel (ARC).

While virtually all new TVs manufactured today employ newer versions of HDMI, many older TVs in use today are still running strong on HDMI 1.4b.


HDMI 2.0


Introduced in September 2013, HDMI 2.0 added support for video content resolutions up to 4k @ 60Hz, 32-channel audio content, and an enhanced color space called Rec. 2020. HDMI 2.0 reached large-scale adoption in the 2014 model year and remained the dominant HDMI version through 2015.

In April 2015, an update to HDMI 2.0, called HDMI 2.0a, was released. This version of HDMI added support for High Dynamic Range (HDR) content (which is all the rage with TVs today). HDMI 2.0a was the flagship HDMI standard for the 2016-2018 model years. 

This implementation of HDR technology was further refined with the release of HDMI 2.0b in March 2016, which added support for HLG (Hybrid Log-Gamma). Most TVs manufactured in the 2019-2021 model years support up to HDMI 2.0b.


HDMI 2.1


HDMI 2.1 was announced in January 2017 and formally released later that year. The most significant revision to the HDMI standard since the interface was first introduced, HDMI 2.1 adds support for video resolutions up to 4K @ 120Hz and 8K @ 60Hz, object-based and high-definition audio formats, an upgrade to ARC called eARC (Enhanced Audio Return Channel), and gaming features such as Variable Refresh Rate (VRR) and Auto Low Latency Mode (ALLM).

Because HDMI 2.1 is such a major change to the HDMI specification, adoption of the new version has been slow. Though the spec has been around since 2017, it wasn't until the 2022 model year that a wide range of TVs supported it.
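To keep the alphabet soup straight, here's a quick reference that condenses the versions above into a small lookup table. This is just a minimal Python sketch restating the headline figures from this section, not an exhaustive spec comparison.

```python
# Quick-reference summary of the HDMI versions discussed above (headline figures only).
HDMI_VERSIONS = {
    "1.4b": {"era": "2009-2011", "max_video": "1080p @ 120Hz / 4K @ 24Hz",
             "notable": ["ARC (Audio Return Channel)"]},
    "2.0":  {"era": "2013",      "max_video": "4K @ 60Hz",
             "notable": ["32-channel audio", "Rec. 2020 color space"]},
    "2.0a": {"era": "2015",      "max_video": "4K @ 60Hz",
             "notable": ["HDR (High Dynamic Range)"]},
    "2.0b": {"era": "2016",      "max_video": "4K @ 60Hz",
             "notable": ["HLG (Hybrid Log-Gamma)"]},
    "2.1":  {"era": "2017",      "max_video": "4K @ 120Hz / 8K @ 60Hz",
             "notable": ["eARC", "VRR", "ALLM"]},
}

for version, spec in HDMI_VERSIONS.items():
    print(f"HDMI {version}: {spec['max_video']} -- {', '.join(spec['notable'])}")
```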


HDMI Cables


Like the HDMI spec itself, HDMI cables come in several different flavors, which are:

  • Standard
  • High Speed
  • Premium High Speed
  • Ultra High Speed

While most HDMI cables look about the same, they come in several different types.

Now, it's worth noting that each HDMI cable type doesn't necessarily correspond to a particular version of HDMI. Instead, the primary difference between the cable types is how much bandwidth each one provides. Newer versions of HDMI require more bandwidth, which is why they may require certain cable types. I'll explain in more detail below.


Standard HDMI Cable


The original HDMI cable, Standard HDMI cables have about 5 Gb/s of bandwidth, which is sufficient to support up to HDMI 1.2. An older spec, Standard HDMI cables are no longer widely manufactured and have largely been replaced by High Speed HDMI cables.


High Speed HDMI Cable


The most common HDMI cable available today, High Speed HDMI cables have about 10 Gb/s of bandwidth, which supports all HDMI versions up through HDMI 2.0b. Because of their ubiquity, these cables are quite affordable and can be found at retailers like Amazon for just a few dollars.


Premium High Speed HDMI Cable


Premium High Speed HDMI cables have 18 Gb/s of bandwidth and are additionally certified to meet standards of durability and EMI (electromagnetic interference) shielding. 

The label affixed to all Premium High Speed HDMI cables. The QR code can be scanned for authentication.

The Premium High Speed spec for HDMI cables was created by HDMI's licensing body in response to the flood of cheap High Speed HDMI cables that eventually came to saturate the market. The concern was that some of these cheaper High Speed HDMI cables were of suspect build quality, and manufacturers of high-quality HDMI cables wanted a way to distinguish their cables from cheaper ones. Thus, the Premium High Speed HDMI cable was born, with each one carrying a scannable QR code on the label that can be used for authentication.

While Premium High Speed HDMI cables don't support any more versions or features of HDMI than regular High Speed HDMI cables, they are intended to offer the consumer assurance of quality that had been lacking in the marketplace.


Ultra High Speed HDMI Cable


Ultra High Speed HDMI cables are the newest HDMI cables, sporting a bandwidth of 48 Gb/s, sufficient to support the full HDMI 2.1 spec. 

If you see this label, know that you're getting the best (and most expensive) HDMI cable available.

Ultra High Speed HDMI cables also adhere to the same physical standards as Premium High Speed HDMI cables and offer QR code verification. Be aware that these cables are more expensive, averaging about $10/foot (though prices will likely come down over time).
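Putting the four cable types side by side, the numbers above boil down to a simple rule: a cable's rated bandwidth has to meet or exceed what the signal needs. Here's a minimal sketch using the approximate Gb/s figures quoted in this section (the exact certification values differ slightly):

```python
# Approximate rated bandwidth for each HDMI cable type, per the sections above (Gb/s).
CABLE_BANDWIDTH_GBPS = {
    "Standard": 5,             # enough for HDMI 1.2-era signals
    "High Speed": 10,          # covers most content up through HDMI 2.0b
    "Premium High Speed": 18,  # full HDMI 2.0 bandwidth, plus certified build quality
    "Ultra High Speed": 48,    # the full HDMI 2.1 spec
}

def cable_can_carry(cable_type: str, required_gbps: float) -> bool:
    """Return True if the cable's rated bandwidth covers the signal's requirement."""
    return CABLE_BANDWIDTH_GBPS[cable_type] >= required_gbps

# Example: a 4K @ 120Hz HDMI 2.1 signal needs far more than 18 Gb/s.
print(cable_can_carry("Premium High Speed", 40))  # False
print(cable_can_carry("Ultra High Speed", 40))    # True
```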


What I learned


Years ago, a tech journalist explained that low-cost High Speed HDMI cables were sufficient for connecting all HDMI-enabled devices, and that more expensive HDMI cables were a waste of money. For the most part, he was right; after reading his piece, I proceeded to equip all my devices with cheap High Speed HDMI cables purchased from Amazon, and I never had much cause to revisit that decision.

However, with the emergence of new HDMI features as part of the HDMI 2.0 and 2.1 specs, I wanted to check again to make sure my old cables were still up to the task. Fortunately, I found that they are, though that will change as more HDMI 2.1 devices find their way into my home over the next few years.


What you can apply


The good news here is that picking the right HDMI cable is actually pretty straightforward. 

First off, Standard HDMI cables aren't really around anymore, so you don't have to worry about them at all.

Most devices today work fine with High Speed HDMI cables, since they're likely using HDMI 2.0b or earlier. Additionally, High Speed HDMI cables are abundant (and cheap), so you shouldn't have much trouble obtaining them.

Premium High Speed HDMI cables are only really necessary in environments where cable durability or EMI shielding is a requirement, such as if you're running the cable through a wall or in an area with high EMI (very rare).

Ultra High Speed HDMI cables are required only if you're connecting two HDMI 2.1-enabled devices to one another and want the full suite of features the spec offers. Most HDMI 2.1 features are gaming-centric, so only the newest TVs and gaming consoles currently on the market support it. However, that will likely change over time as HDMI 2.1 slowly becomes the dominant version of HDMI.
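If you'd rather see those rules of thumb in one place, here's a small sketch that encodes them. The decision points are just the ones described above; your own setup may add others.

```python
def recommend_hdmi_cable(need_hdmi_21_features: bool = False,
                         in_wall_or_high_emi: bool = False) -> str:
    """Encode the cable-buying rules of thumb described above."""
    if need_hdmi_21_features:
        # The full HDMI 2.1 feature set (4K/8K at high refresh, VRR, ALLM) needs 48 Gb/s.
        return "Ultra High Speed"
    if in_wall_or_high_emi:
        # Certified durability and EMI shielding matter more than extra bandwidth here.
        return "Premium High Speed"
    # Everything else (HDMI 2.0b and earlier) is fine on an affordable High Speed cable.
    return "High Speed"

print(recommend_hdmi_cable())                            # High Speed
print(recommend_hdmi_cable(need_hdmi_21_features=True))  # Ultra High Speed
```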

In summary, if you're happy with the way your devices are performing today, then you're probably set; there's no need to rush out and get new HDMI cables, even with all the new versions of HDMI coming out. That said, if you run into a situation where you need a newer cable, go ahead and shell out a little extra money to make sure you've got the right cable for the job; it'll be worth it in the end.

Sunday, June 12, 2022

Connecting the Dots - Part 1 - USB

A while back, I decided to audit my USB-connected devices and their accompanying cables (because that's the kind of thing I do in my spare time), in an effort to lend some semblance of order to the confused mass of cables I had steadily accrued over the years. I recently did the same for my HDMI- and Bluetooth-enabled devices. Over the course of these audits, I discovered several things I previously didn't know about these technologies and what I could be doing better. 

In this blog post (Part 1 of 3 in this series), I'll discuss what it was I discovered during my USB audit, what I was doing wrong, and how you can make sure you're getting the most out of your devices by optimizing your connections. In future posts I'll discuss the same for HDMI and Bluetooth.


USB - Universal Serial Bus

If you've used any device with a microprocessor in the last 20 years, you probably have at least a vague idea of what USB is, so I won't waste your time by covering the basics. Suffice it to say that USB is basically the standard interface used to physically connect digital devices in this day and age, providing both data transfer and power capabilities.


The well-known USB trident logo.


USB Data Transfer Protocols


When it comes to USB, you've probably heard that there are basically two kinds of USB devices and cables currently on the market, each named after the data transfer protocol it implements: USB 2.0 and USB 3.0. (In case you're wondering, USB 1.0 came and went in the 90s with few people taking notice; it wasn't until USB 2.0 was released in 2000 that the technology started catching on.) Virtually every USB cable and device on the market today implements one of these two protocols. (Note: I know that USB 3.1 and USB 3.2 exist in addition to USB 3.0, but for the sake of simplicity, I'll refer to them all as USB 3.0.)

However, have you ever wondered just what the differences are between the two versions of USB and what they mean for you? Well, wonder no more! Simply put, USB 3.0 can transfer data about 10x faster and can provide 80% more power than USB 2.0. 
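For the curious, those "10x" and "80%" figures fall straight out of the headline numbers in the two specs. A quick back-of-the-envelope check (the values are the commonly quoted maximums; real-world throughput is lower):

```python
# Commonly quoted maximums for the two protocols.
USB2_DATA_MBPS, USB3_DATA_MBPS = 480, 5000   # 480 Mb/s vs 5 Gb/s (SuperSpeed)
USB2_POWER_W, USB3_POWER_W = 2.5, 4.5        # 5V @ 500mA vs 5V @ 900mA

print(f"Data: ~{USB3_DATA_MBPS / USB2_DATA_MBPS:.0f}x faster")           # ~10x
print(f"Power: {(USB3_POWER_W / USB2_POWER_W - 1):.0%} more available")  # 80% more
```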


USB Connector Types


Unfortunately, there's another layer to understanding USB which tends to confuse people. In addition to there being two data transfer protocols to worry about, there are also many different USB connector types to consider. 

On occasion, you've probably found yourself digging through a pile of USB cables looking for one that has an end in the shape of the port that you're trying to plug it into. With so many different types of USB connectors, this really has become a nightmare for consumers. However, I'll try to clear up the confusion by using the below chart to show you the most common connector types and which data transfer protocol each one implements.

The most common USB connector types, with the exception of USB-C (more on that later).

USB 2.0 Connectors


On the top left of the chart, you can see a USB 2.0 Type A connector. This is the standard USB connector that you probably see every day and is used by many different devices.

Next to it is the USB 2.0 Type B connector. This connector is typically found on peripherals such as printers; it plugs into the peripheral itself, while the Type A end of the cable plugs into the computer.

On the middle left is a USB 2.0 Mini-A connector, next to a Mini-B. These connector types were introduced as a first attempt to design a category of compact connectors for small, mobile devices such as digital cameras, MP3 players, and early mobile phones.

As devices became thinner over time, a new category of USB connectors was needed; the result was the USB 2.0 Micro category, located on the bottom left of the chart. Besides being smaller, the Micro category is more durable than the Mini category it replaced, designed to survive 10,000 connect-disconnect cycles.


USB 3.0 Connectors


With the introduction of the USB 3.0 protocol, new connector types were needed to handle the extra power and data. To help consumers tell which protocol a given USB cable or device implements, a blue insert is usually included inside the connector on the cable and inside the port on the device to indicate implementation of the USB 3.0 protocol (hence the blue outline on the chart). 

An example of a PC that has both USB 2.0 Type A (center-right) and USB 3.0 Type A (center-left) ports. Note the blue insert in the USB 3.0 ports signifying the USB 3.0 protocol.


On the top right of the chart is a USB 3.0 Type A connector, which is simply a USB 2.0 Type A connector with five additional contact points (called pins). These additional pins help the cable achieve the data transfer rate and power output required by the USB 3.0 protocol.

Likewise, the USB 3.0 Type B connector is very similar to its 2.0 predecessor, with an extra set of five pins inside the added notch at the top of the connector.

The USB 3.0 Micro-B connector is essentially a USB 2.0 Micro-B with a second five-pin section joined at the hip, doubling the number of pins from five to ten. Unlike its USB 2.0 predecessor, the USB 3.0 Micro-B connector is usually found on external data storage devices rather than mobile devices.

So, there ya go! An easy-to-understand breakdown of the most common USB connectors. Now when you're searching for that particular cable, you know which name to scream at the wall in frustration.


What about backward compatibility?


Good question. A big selling point of USB is backward compatibility; that is, newer USB cables should (theoretically) be compatible with older USB devices.

The good news is that the USB 3.0 protocol is backward compatible with the 2.0 protocol. The bad news is that in order for a USB 2.0 device to use a USB 3.0 cable, the connectors must also be compatible. 

As you can see from the chart, there's no way you're going to cram a USB 3.0 Micro-B connector into a USB 2.0 Micro-B port (and likewise for the Type B connectors). Type A is the exception: you can use a USB 3.0 Type A cable with a USB 2.0 Type A port, and everything should work fine; the additional pins in the connector simply won't be used by the device.
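To make the compatibility picture concrete, here's a small sketch of the "will it fit, and what speed do I get?" logic. The fit rules follow the chart above; it's a simplification, not an exhaustive matrix.

```python
# Can a USB 3.0 cable's connector physically plug into a USB 2.0 port of the same family?
FITS_USB2_PORT = {
    "Type A": True,    # same shape; the extra pins simply go unused
    "Type B": False,   # the added notch won't fit a 2.0 Type B port
    "Micro-B": False,  # the widened plug won't fit a 2.0 Micro-B port
}

def negotiated_protocol(connector: str, cable_protocol: str, device_protocol: str):
    """Return the protocol the link will run at, or None if the plug won't fit."""
    if cable_protocol == "3.0" and device_protocol == "2.0" and not FITS_USB2_PORT[connector]:
        return None  # physically incompatible
    # Otherwise the link falls back to the slower of the two protocols.
    return min(cable_protocol, device_protocol)

print(negotiated_protocol("Type A", "3.0", "2.0"))   # 2.0 -> works, at USB 2.0 speed
print(negotiated_protocol("Micro-B", "3.0", "2.0"))  # None -> won't fit
```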

In the end, the backwards compatibility we were initially promised by USB was only partially delivered. 

Consumers -1, USB Cable Manufacturers +1


What about USB-C?


That's right. The one commonly used USB connector type that is not on the above chart is also the newest one, USB-C.

So many pins!

USB-C is a new connector type intended to replace the Type A, Type B, and Micro-B connectors so that there will be only one common connector type going forward, reducing confusion and frustration for consumers. Furthermore, with a whopping 24 pins, USB-C is intended to be as future-proof as possible, providing the capability to handle large amounts of data and power. Also, USB-C cables can implement either USB 2.0 or USB 3.0 (typically the latter), whereas the other connector types are all tied to a specific protocol.

Devices today are increasingly switching over to USB-C, though adoption hasn't been as fast as many would like (I'm looking at you, Apple). Still, the tide seems to be turning, and it might not be long before you can throw away that tangled mess of Type A, B, and Micro-B cables.

What I learned

For my part, I never really tried to understand the different connector types, what they're used for, and how that relates to the USB protocol they implement. Instead, I was content to play "Find the plug (A) that looks like it goes into port (B)" ad nauseam. 

That said, after my audit, I was able to document how many USB-connected devices I owned, which version of the USB protocol they implemented, and which connector type they used. This helped me declutter my collection of cables by separating the ones I knew I needed and storing them near the devices that used them while also discarding the ones I knew I wouldn't need (usually because the device the cable came with was long gone).

Additionally, I found that I had two USB 3.0 devices that I was connecting using a USB 2.0 cable, meaning that I wasn't taking full advantage of either device's capabilities; without this audit, I would have never recognized my mistake.

What you can apply

My advice is this: Check to make sure that you're using the right cable for the device. Just because the USB cable you're using fits and your device works doesn't mean that you're doing everything right. Using the wrong cable can potentially keep you from realizing your device's full potential.

The easy answer is to simply use the USB cable that came packaged with your device, as it almost certainly implements the same USB protocol as the device itself. In the event that said cable isn't available, you'll need to employ some deductive reasoning to figure out which cable you need to use.

For most connector types, only one USB protocol is implemented. If your device has a Mini, Micro, or Type B port, then the only cable that will conform to that port will also implement the correct corresponding USB protocol.

For Type A and USB-C devices, it gets a little trickier. USB 2.0 and USB 3.0 Type A ports look the same and are physically compatible with both USB 2.0 and 3.0 Type A cables; the only visible difference is a blue insert that device manufacturers usually add to indicate implementation of the USB 3.0 protocol.

USB-C's versatility means a USB-C cable or port can implement either USB 2.0 or 3.0, and unfortunately there's typically no visible difference between the two. My rule of thumb is to always use USB 3.0 cables with USB-C devices, just to be safe.

Overall, the rule is simple: Use USB 3.0 cables with USB 3.0 devices and USB 2.0 cables with USB 2.0 devices.
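Put another way, the deduction process above boils down to a few lines. This is just a sketch of my rules of thumb, using the connector names from the chart:

```python
def cable_to_grab(port_type: str, has_blue_insert: bool = False) -> str:
    """Suggest which USB cable to use for a given port, per the rules of thumb above."""
    if port_type in ("Mini-A", "Mini-B", "Type B", "Micro-B"):
        # The connector shape itself fixes the protocol: whatever fits is correct.
        return "whichever cable physically fits the port"
    if port_type == "Type A":
        # USB 2.0 and 3.0 Type A look alike; the blue insert usually marks 3.0.
        return "USB 3.0 Type A cable" if has_blue_insert else "USB 2.0 Type A cable"
    if port_type == "USB-C":
        # No visible difference between 2.0 and 3.0, so default to the faster cable.
        return "USB 3.0 USB-C cable"
    raise ValueError(f"Unknown port type: {port_type}")

print(cable_to_grab("USB-C"))                         # USB 3.0 USB-C cable
print(cable_to_grab("Type A", has_blue_insert=True))  # USB 3.0 Type A cable
```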

I hope this helped answer any questions you may have had about USB cables and devices. Be sure to keep an eye out for my post on HDMI in the coming weeks!

Monday, January 31, 2022

Upscale Upsell

We all love TVs, and we especially love TVs with great picture. There's nothing quite like walking into Best Buy, finding the biggest, most expensive TV on display, and taking a moment to enjoy the impressive picture quality it exhibits.

Unfortunately, most content we regularly watch on our TVs falls short of the eye-popping glory of the demo reels we see on the retail floor. While TV display technology has improved by leaps and bounds over just the last few years, content has struggled to keep pace. To account for this, TVs leverage a specific process to handle content produced at a lower resolution and spruce it up to make it look like what you saw at the store. 

This process is called "upscaling", and while it's very useful, not all upscaling is the same. In this post, I'll discuss what upscaling is and how to avoid one specific mistake that many TV owners make.


TV Picture Basics

TV picture quality is usually measured in terms of resolution, that is, how many pixels make up the picture. TVs with higher resolution generally have better picture quality: the more pixels in the picture being displayed, the more detail the TV is able to show.

Now, there are plenty of other factors to consider when measuring picture quality (such as contrast, brightness, color accuracy, etc.), but for the sake of simplicity, picture resolution is the most commonly used measurement.

The most common display resolutions.

Today, the most common resolution for TVs is 4K (sometimes also called "UHD"). Almost every TV you see on the shelf at any electronics store features 4K resolution; in fact, you'll have to look hard to find one that isn't 4K.

That said, many of the movies and TV shows we regularly watch aren't produced in 4K. Instead, most of them are still produced in either 1080p or 720p (lower resolutions commonly called "HD").

So, how are movies and TV shows produced in lower resolutions like 1080p and 720p displayed on a 4K TV? The answer is via a process called upscaling.
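To put numbers on those resolutions: a 4K frame has four times as many pixels as a 1080p frame and nine times as many as a 720p frame, which is exactly the gap upscaling has to fill. Quick arithmetic:

```python
# Standard pixel dimensions (width x height) for the resolutions mentioned above.
RESOLUTIONS = {"720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}

pixels = {name: w * h for name, (w, h) in RESOLUTIONS.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")   # 921,600 / 2,073,600 / 8,294,400

print(pixels["4K"] / pixels["1080p"])    # 4.0 -> a 4K screen has 4x the pixels of 1080p content
print(pixels["4K"] / pixels["720p"])     # 9.0 -> and 9x the pixels of 720p content
```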


Upscaling

Without getting into too much technical detail, upscaling is a process which converts lower-resolution video content (such as 1080p or 720p) to a higher resolution (such as 4K) so that it can be displayed on a higher resolution screen.
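As a very rough illustration of the idea, the simplest possible upscaler just repeats each pixel to fill the larger grid. Real TVs use far more sophisticated interpolation and sharpening; this minimal NumPy sketch only shows the resizing step:

```python
import numpy as np

def nearest_neighbor_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Naively upscale a frame by repeating each pixel `factor` times in both directions.
    Real TV upscalers interpolate, denoise, and sharpen; this only illustrates resizing."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# A stand-in for a 1080p grayscale frame (height x width), scaled 2x to 4K dimensions.
fake_1080p = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
fake_4k = nearest_neighbor_upscale(fake_1080p, 2)
print(fake_1080p.shape, "->", fake_4k.shape)  # (1080, 1920) -> (2160, 3840)
```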

Sony TVs have some of the best upscaling technology on the market.

TV manufacturers invest a lot of time and effort into upscaling technologies because they want all content viewed on their TVs to look good, and they know a lot of the content consumers watch is of lower resolution than 4K. In fact, the upscaling technology built into many premium TVs today is so good that the average viewer can't even tell the difference between content produced in 4K and content upscaled by their TV to 4K.

However, not all upscaling processes are created equal; some devices are better at upscaling than others, and many consumers make the mistake of not letting their TV handle upscaling duties.


The Device Chain

When trying to determine how to get the best picture quality for your TV, it's important to note how content gets to your TV. Many people have a streaming device (such as a Roku, Fire TV, or Apple TV), gaming console, cable box, or Blu-ray player connected to their TVs, as well as streaming apps built directly into the TV itself.

Together, the TV and all of the devices connected to it are called a device chain. An example of the device chain in my living room is below.

Good old MS Paint never disappoints.

As you can see, I have an Xbox and a Roku connected to my TV. The Xbox and Roku send video and audio signals to the TV; the TV displays the video signal and sends the audio signal to the Sound System to be played.

Now, this is a very simple setup and all seems well and good. However, there is one problem.

My Roku contains all of the apps I use for streaming content (such as Netflix, Disney+, and YouTube TV), so most of the content I view goes from my Roku to my TV. The problem is that a lot of the content I watch (mostly sports and TV shows on YouTube TV, but also some movies on Netflix and Disney+) is produced in 1080p or 720p. Therefore, before it can be displayed on my TV, the content needs to be upscaled to 4K.

No problem! My TV will just handle the upscaling and everything will be great... right? Well, no. The problem is, my Roku actually upscales the content I'm watching before it's sent to the TV. Therefore, as far as the TV is concerned, it's getting a 4K signal from the Roku and no upscaling is needed.

An example of the upscaling process in my setup. 1080p content is streamed from Netflix (or another app) to my Roku, which then upscales the content to 4K before sending it to the TV. This is sub-optimal, as my TV's upscaling capabilities are better than my Roku's.

The problem is that the Roku isn't as good at upscaling content as my TV is. You see, streaming devices like Rokus are designed to be portable and affordable; while they are capable of upscaling content, they're simply not as effective at it as TVs. 

Today, TVs (especially premium ones) have very powerful processors and sophisticated upscaling algorithms that streaming devices simply can't match. Thus, to get the best picture quality possible, you need to ensure that your TV is handling upscaling duties, not your input device.
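Here's a toy model of that device chain to make the point concrete: whichever box upscales first "wins," because every device downstream sees an already-4K signal and leaves it alone. The device names mirror my setup; the upscaler labels are purely illustrative.

```python
# Toy model: each device upscales anything below its output resolution,
# and downstream devices leave an already-native signal untouched.
CHAIN = [
    {"name": "Roku", "output_lines": 2160, "upscaler": "decent"},
    {"name": "TV",   "output_lines": 2160, "upscaler": "excellent"},
]

def trace_upscaling(content_lines: int, chain) -> None:
    lines = content_lines
    for device in chain:
        if lines < device["output_lines"]:
            print(f"{device['name']} upscales {lines}p -> {device['output_lines']}p "
                  f"using its {device['upscaler']} upscaler")
            lines = device["output_lines"]
        else:
            print(f"{device['name']} passes the {lines}p signal through untouched")

trace_upscaling(1080, CHAIN)
# Roku upscales 1080p -> 2160p using its decent upscaler  <- the TV's better upscaler never runs
# TV passes the 2160p signal through untouched
```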


Roku Workaround

Some devices have a pass-thru option where all content streamed is passed to the TV without any sort of modification by the device itself. If your device has this option, it's probably in your best interest to enable it.

Unfortunately, Roku devices have no such option. However, there is a workaround that will allow you to send content in its native resolution from the Roku to the TV, which will in turn allow the TV to handle upscaling.

First, you need to know the resolution of the content you're planning to stream (ex. 1080p, 720p, etc.). Once you have that, you can set your Roku to output video at that specific resolution by going to Settings -> Display Type, and then selecting the resolution of the content you're planning to stream.

This essentially forces the Roku to output all video content at that specific resolution, and if that resolution is lower than what your TV supports, your TV will upscale it using its upscaling engine.
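In other words, the workaround is just "match the Roku's output to the content." A tiny sketch of the rule (the resolution labels are illustrative; check your own Roku's Display Type menu for the exact options):

```python
def display_type_to_force_tv_upscaling(content_lines: int, tv_native_lines: int = 2160) -> str:
    """Return the Roku 'Display type' to select so the TV, not the Roku, does the upscaling.
    Resolutions are given as lines of vertical resolution (720, 1080, 2160)."""
    if content_lines >= tv_native_lines:
        return f"{tv_native_lines}p"  # content is already native; nothing to upscale
    return f"{content_lines}p"        # match the content so the Roku passes it through as-is

print(display_type_to_force_tv_upscaling(1080))  # "1080p" -- let the TV handle the 4K upscale
```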

Now, I know this isn't the most convenient workaround, but it's really the only one that Roku leaves open to us. That said, I probably wouldn't bother trying to work around Roku's auto-upscaling unless I were preparing to watch a major event that I know is going to need to be upscaled (like the upcoming Super Bowl, which, despite being the world's second most-watched sporting event, is still being broadcast in 1080i. Ugh...). For casual viewing, allowing Roku to handle upscaling is acceptable in my book.


Conclusion

Upscaling is a powerful and (unfortunately) necessary technology in modern home entertainment. TVs today have become extremely effective at upscaling, so much so that the process almost goes unnoticed by the average consumer.

Upscaling is most effective when performed by your TV, and when certain devices get in the way and try to perform upscaling themselves, the result can be sub-optimal picture quality. Always do what you can to ensure that each device is performing the process for which it is best suited, and you'll have a quality viewing experience.

Tuesday, January 11, 2022

Best of the Rest

Can a non-"power" program ever again win college football's ultimate prize? Should they even try?

Last night, Georgia won the College Football Playoff (CFP), and with it, the program's first national championship since 1980. While seeing the country's best teams play head-to-head for the title is exciting, I can't help but wonder: What about the little guys?

It's been an ongoing controversy for years in the upper echelon of college football that the smaller, so-called "Mid-Major" Football Bowl Subdivision (FBS) programs (more commonly known today as the "Group of Five") are routinely denied the opportunity to play for a national championship. However, what if the best Group of Five teams were recognized at the end of each season and awarded a "national championship" of their own? Certainly, they deserve something for their efforts, aside from a brush-off from the CFP Selection Committee and a spot in a consolation bowl game.

Side-stepping the overarching controversy for now, I decided to take a look back at the past few seasons to see which of these teams can claim the title of "Best of the Rest".


Methodology

Before I could begin compiling my list, I needed to pick a starting point. I settled on 1992, since that was the inaugural year of the "Bowl Coalition", college football's first attempt at an organized national championship at the FBS level.

The 1984 BYU Cougars are the last non-"power" football program to win the national championship, though it wasn't without controversy.

Next, I had to figure out which teams were eligible. While the distinction between "Power" and "Mid-Major" teams today is fairly clear, such wasn't the case in 1992. After some research, I determined that programs that met the following criteria were sufficiently "Mid-Major" teams beginning in 1992:

- Teams belonging to the following conferences:

○ Big West

○ Mid-American

○ Western Athletic

- Independent Programs EXCEPT Notre Dame and Penn State

It goes without saying that the college football landscape has shifted quite a bit since 1992: Teams have switched conferences and some conferences have merged or folded while new ones have been formed. Additionally, many programs moved up from "Mid-Major" to "Power" status as a result of the realignments that occurred over the years; this is why you may see some teams on my list that today are considered "Power" programs.

Finally, I had to decide how to select the best mid-major team each season. This was actually the simplest part: I picked the highest-ranked mid-major team(s) in the final AP & Coaches poll each season. 

The reason is simple: Introduced in 1936 and 1950 respectively, the AP and Coaches polls are widely considered the definitive college football polls (often called the "major wire service polls"). Each team that finishes ranked #1 in either poll has a legitimate claim to the title "National Champion". Therefore, to crown the mid-major national champions, I used the same polls that are used to determine the FBS national champions.


Mid-Major Champions

Without further ado, here's my list of mid-major College Football Champions* (since 1992):

1992 - Hawaii

1993 - Louisville

1994 - Utah

1995 - Toledo (AP), East Carolina (Coaches)

1996 - BYU

1997 - Colorado State

1998 - Tulane

1999 - Marshall

2000 - Colorado State (2)

2001 - Louisville (2)

2002 - Boise State

2003 - Boise State (2)

2004 - Utah (2)

2005 - TCU

2006 - Boise State (3)

2007 - BYU (2)

2008 - Utah (3)

2009 - Boise State (4)

2010 - TCU (2)

2011 - Boise State (5)

2012 - Utah State (AP), Boise State (Coaches) (6)

2013 - UCF**

2014 - Boise State (7)

2015 - Houston

2016 - Western Michigan

2017 - UCF (2)

2018 - UCF (3)

2019 - Memphis

2020 - Cincinnati

2021 - Cincinnati (2)

2022 - Tulane (2)

2023 - SMU

*All schools listed are consensus champions unless otherwise noted.

**For 2013 only, the American Athletic Conference inherited the former Big East Conference's automatic BCS bid due to a contractual obligation. However, for competition purposes, the American is considered a mid-major conference.

By conference:

- WAC, Mountain West (9)

- American (8)

- C-USA, MAC (3)

- Independent (2)


Looking Ahead

While retroactively selecting the top mid-major team each season is an interesting academic exercise, that's all it really amounts to. After all, there's no championship associated with being a season's top-performing mid-major team, only a statistical distinction. The controversy of mid-major teams being excluded from the CFP isn't going anywhere: in fact, after Cincinnati was denied a berth in the 2020 College Football Playoff despite completing an undefeated regular season, the rancor only grew louder. In 2021, the Bearcats were finally rewarded for putting together a second consecutive undefeated regular season with a spot in the Playoff, only to be smashed in the semifinal round by Alabama by a score of 27-6.

Despite being the only undefeated team in the nation, Cincinnati was no match for Alabama in the 2021 Cotton Bowl

One idea being kicked around is expanding the CFP with one spot guaranteed for the highest-ranked Group of Five team. That way, the Group of Five will always be included in the national championship picture.

However, even this system does not address the fundamental issue with pitting the Power Five against the Group of Five: Parity. Even if the Group of Five were always granted a spot in the playoffs, they would likely never make it all the way to the championship due to the competitive gap between college football's "Haves" and "Have-nots".

Instead, I would rather see a system where the top Group of Five teams play one another for a championship that each has a fair chance of winning, as opposed the selecting the single best Group of Five team and sending them to be destroyed on national TV by the country's top Power Five team. Call it the "Mid-Major National Championship" or name it after a sponsor ("The Domino's Pizza Championship presented by Mountain Dew" has a nice ring to it). All I'm saying is give the little guys something to play for other than a mediocre bowl and or near-certain defeat at the hands of Alabama.