Thursday, March 14, 2024

Box Office Blues

Show business is a funny business. There are times when moviegoers show up in force to see really mediocre movies, while more acclaimed (and deserving) films are ignored. In fact, some of my favorite movies in recent years were box office flops.

Don't be fooled: This place can be brutal.

Looking back at these movies, I can't help but wonder: How hard did they fail at the box office and why? In this post, I'll explore a few of these films (in no particular order) and offer explanations for their box office fates.

Note: While movie studios are generally open about film production budgets, they are less forthcoming about other expenses (including marketing and distribution costs). This is why films may still fail to turn a profit even if their box office earnings comfortably eclipse their production budgets. Generally, most films need to earn about 2.5-3x their production budget to turn a profit during their theatrical run, though this does vary.
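For the numerically inclined, that rule of thumb is easy to play with. Here's a quick back-of-the-envelope sketch in Python (the 2.5x multiplier is just the industry estimate mentioned above, and the helper is purely illustrative):

# Rough profitability check using the ~2.5-3x rule of thumb. The multiplier
# is an industry estimate, not exact accounting, which is far messier.
def estimated_outcome(gross_millions, budget_millions, multiplier=2.5):
    """Compare worldwide gross against an estimated break-even point."""
    break_even = budget_millions * multiplier
    verdict = "likely profitable" if gross_millions >= break_even else "likely a loss"
    return f"{verdict} (needed ~${break_even:.0f}M, made ${gross_millions:.0f}M)"

# Example: Solo's reported numbers ($393M gross vs. $275M budget)
print(estimated_outcome(393, 275))  # likely a loss (needed ~$688M, made $393M)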


Solo: A Star Wars Story

Solo was sunk by fan discontent over The Last Jedi
The only Star Wars movie to lose money at the box office, Solo was actually one of my favorite Star Wars adventures, and one of only two of the Disney era (along with Rogue One) that I can fully embrace.

Released in May 2018, Solo opened to solid reviews (69% on Rotten Tomatoes) and earned a worldwide gross of $393 million, a sum which would have brought a tidy profit for many other productions. However, with a large production budget of $275 million, Solo ended up losing approximately $77 million in spite of this haul. So, what happened? The reasons are two-fold:

1. The production was highly troubled, with the original directors (Phil Lord and Christopher Miller) fired partway through filming. Director Ron Howard was hired to finish the film and ended up reshooting approximately 70% of the movie, causing costs to balloon.

2. Solo was released just over five months after Star Wars: The Last Jedi. While the negative fan reaction to The Last Jedi is well-documented, what is less discussed is the fact that many Star Wars fans responded by boycotting Solo.

Alden Ehrenreich may not be Harrison Ford, but he did the character justice
Ultimately, the combination of costly reshoots and lingering fan resentment over The Last Jedi was just too much for Solo to overcome. That's a shame, as almost every Star Wars fan I know who actually saw Solo really enjoyed it (including me), and the real tragedy is that we're unlikely to see any further Han Solo adventures as a result.



Dungeons & Dragons: Honor Among Thieves

A solid swashbuckling adventure undone by the narrow appeal of its IP
I'll be honest, I've never been a D&D fan. However, I was intrigued enough by Honor Among Thieves to see it when it was released in March 2023. And man, what a treat.

Funny, fast-paced, and action-packed, Honor Among Thieves earned an impressive 91% rating on Rotten Tomatoes. However, the film managed to gross only $208 million against a production budget of $150 million (Paramount has never officially revealed how much Honor Among Thieves lost [likely in the tens of millions], stating only that it was a "disappointment").

Chris Pine channels his Captain Kirk energy in Honor Among Thieves
As for explaining the movie's underperformance, that likely comes down to the niche appeal of the Dungeons & Dragons IP. Moviegoer surveys indicated that the audience demographic skewed heavily towards millennial men, overlapping with the game's core fanbase. That is to say that if you weren't a fan of the game, you probably didn't go see the movie. A pity for sure, since even my parents (who are about as far removed from D&D fandom as you can get) enjoyed it.


Scott Pilgrim vs. the World

A superb film that defied classification... and that was its downfall
Full disclosure: Even I passed on Scott Pilgrim vs. the World when it was released in August 2010. A cartoony movie based on a cult graphic novel about a goofy gamer/garage musician guy trying to impress a girl with weird hair? Forgive me if I wasn't in line at the midnight premiere. In fact, it wasn't until well after its theatrical release that I finally got around to watching it late one night when I had absolutely nothing else to do.

Oh, man. Only then did I discover what I was missing. Definitely one of the top 3 wittiest movies I've ever seen, Scott Pilgrim brought me laughs and feels in equal measure, and remains one of the few movies I watch on a regular basis.

Michael Cera and Mary Elizabeth Winstead may seem like an odd match, but they make it work
With a strong RT score of 82% and my personal seal of approval, it's disappointing to me that Scott Pilgrim brought in only a measly $49 million at the box office against a production budget of $85 million (like with many movies, the final gain/loss tally was never officially released by the studio [Universal]).

In my estimation, Scott Pilgrim failed at the box office for the same reason I didn't see it during its theatrical release: It wasn't marketed well. Was it a comic book movie? Comedy? Action? Was the target audience only young adults?

While it's common knowledge that genre-bending movies are tricky to market (see Cowboys & Aliens), it's still tough to swallow that one of my all-time favorite movies fell victim to this old problem. Maybe one day Hollywood will figure it out, but I'm not holding my breath.


Mission: Impossible - Dead Reckoning

Dead Reckoning couldn't compete with Barbenheimer
In the past, I've had a love/hate relationship with the Mission: Impossible film series. At best, I thought of them as solid action flicks good enough for a home movie night, while at worst, I considered them poor imitations of the Bond series. However, more recently I've been pleasantly surprised with how Tom Cruise and director Christopher McQuarrie have revitalized this once-stagnant series.

Beginning with the 2015 surprise hit Mission: Impossible - Rogue Nation, the Cruise/McQuarrie combo hit on a formula (a mix of lighthearted approachability and breathtaking practical action sequences) that produced back-to-back blockbuster successes in the M:I series (with 2015's Rogue Nation and 2018's Fallout raking in $689 million and $791 million, respectively).

With the duo returning for Dead Reckoning (which earned a 96% RT rating), it seemed that the box office run was sure to continue. Unfortunately, audiences had other plans. Against a whopping production budget of $291 million, Dead Reckoning brought in a total of $568 million, for a loss of around $100 million.

Despite Dead Reckoning's box office struggles, Tom Cruise remains the king of stunts
I've gotta be honest: When the numbers came in for Dead Reckoning, I was pretty disappointed. I enjoyed Dead Reckoning as much as Rogue Nation and Fallout, and my hope was that a third consecutive hit would guarantee the continuation of the series for years to come. So, why the decline in fortunes for what was supposed to be a sure-bet blockbuster? One word: Barbenheimer.

As it turns out, Dead Reckoning had the misfortune of releasing in July 2023, just one week before the biggest theatrical pop culture phenomenon of the year, which certainly diverted audience attention away from what would have otherwise been a box office hit.

Moral of the story? Don't go head-to-head with pink-clad blondes or nuclear bombs.


Star Trek Beyond

Tough competition held back Star Trek Beyond
When J.J. Abrams rebooted Star Trek in 2009 to both critical and commercial acclaim, I was as happy as a fat kid in a candy store. The new Star Trek series (or the Kelvin Timeline, as it would later be known) was everything the original wasn't: Fast, flashy and fun. It felt like the property had found a new lease on life, one where it could break out of the heavy-handed sci-fi niche it had always existed within and achieve broad appeal.

With a successful follow-up in 2013's Star Trek Into Darkness, it was thought that 2016's Star Trek Beyond, the third (and thus far, latest) entry in the Kelvin film series, would see similar success. Armed with a strong 86% RT score, why wouldn't it?

Well, as you can probably guess (since it's on this list), it didn't. Against a production budget of $185 million, Star Trek Beyond managed to generate only about $343 million at the box office, amounting to a loss of about $51 million.

Was Star Trek Beyond the last adventure for this cast?
Now, let me be clear about one thing: For me, Star Trek Beyond falls squarely into "Good, not great" territory. While I found it to be a perfectly acceptable entry in the series, it wasn't quite the smash that I was hoping for. That said, the film deserved to turn a profit, and its $50+ million loss basically assured that we would see no further Star Trek films made for the time being.

So, what happened? While analysts mainly pointed to stiff competition at the box office during its release window (and that quite possibly played a part), I think the biggest thing that hindered Star Trek Beyond was the departure of J.J. Abrams as director. 

At the time, Abrams was the hottest director in Hollywood, and when he was replaced with Justin Lin (so that Abrams could direct Star Wars: The Force Awakens), that put a damper on audience enthusiasm for the film. That, plus a substantial production budget, meant that Star Trek Beyond would have to fight an uphill battle to turn a profit (one that it ultimately lost).

The Man from U.N.C.L.E.

The Man from U.N.C.L.E. released 48 years after the TV show ended
Movies adapted from TV shows have a checkered record. For many, the challenge centers around getting modern audiences to connect with characters and stories that went off the air years or even decades ago.

Such was the struggle with The Man from U.N.C.L.E. While the film itself was a slick, funny, charming spy thriller (co-led by the Man of Steel himself, no less) that earned a 68% approval rating from Rotten Tomatoes, it had the misfortune of releasing at a time when the box office was already saturated with spy thrillers that were more relatable to modern audiences.

Even Henry Cavill's star power couldn't save The Man from U.N.C.L.E.
This disconnect was on full display at the box office, where The Man from U.N.C.L.E. managed to gross only $110 million against a production budget of $75 million, for an approximate loss of a whopping $80 million.

Moral of the story? Don't wait almost half a century before giving a TV show its big screen debut.


Conclusion


While I know that not many moviegoers keep a close eye on box office receipts, I do, because it's important to me that the movies I enjoy experience success. Part of it is that I want those who do good work to be rewarded, but perhaps more selfishly, I want to see more good movies get made. After all, when good movies turn a profit, it's more likely that others will come down the pike.

That said, the market isn't always fair. Sometimes good (or even great) movies simply have bad luck, whether it's a bad release window, stiff competition, or simple audience ignorance. But even when good movies don't achieve the success they deserve, the fact that they exist at all gives us the opportunity to enjoy them.

Wednesday, September 6, 2023

Upscale Upsell Part 2

If you've recently played a video game on a PlayStation 5 and/or an Xbox Series X, you've probably noticed how detailed the graphics looked and how smooth the motion seemed. When it comes to the visual fidelity of video games, ensuring both a high degree of image detail (resolution) and smoothness of motion (framerate) simultaneously is essential, but that's very difficult for even modern PCs and gaming consoles to achieve.

Star Wars Jedi: Survivor is one of the latest games to support both DLSS and FSR.

Luckily, GPU (Graphics Processing Unit) manufacturers Nvidia and AMD have each developed a system specifically designed to address this challenge (DLSS and FSR, respectively). So, which of these technologies is the best?


Nvidia DLSS

DLSS was first released in 2019 and was first supported by Battlefield V.

Nvidia's DLSS (Deep Learning Super Sampling) is a hardware-based system exclusive to the company's RTX line of GPUs. DLSS allows games to render frames at a lower resolution (ex. 1080p) and then utilizes AI to upscale them to a target resolution (ex. 4K), an operation that is less computationally expensive than rendering natively at the target resolution, thus allowing more compute cycles to be allocated to maintaining framerate.


AMD FSR

FSR was first released in 2021 and was first supported by Farming Simulator 2022.

AMD's FSR (FidelityFX Super Resolution) works on essentially the same principle (except it uses a spatial upscaling algorithm rather than AI), with one major difference: FSR is a software-based system whereas DLSS is hardware-based. This means that while DLSS is restricted to GPUs that support it natively, FSR can be implemented on virtually any modern GPU.
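To see why this render-low-then-upscale trick helps framerate at all, remember that GPU work per frame scales roughly with the number of pixels shaded. Here's a minimal sketch of that arithmetic in Python (the resolutions are standard; the proportional cost model is a deliberate simplification, and the upscale pass itself isn't free):

# Approximate pixel counts for common render resolutions.
RESOLUTIONS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

def relative_render_cost(render_res, target_res):
    """Fraction of native pixel work done when rendering low and upscaling."""
    return RESOLUTIONS[render_res] / RESOLUTIONS[target_res]

# Rendering at 1080p and upscaling to 4K shades only 25% of the pixels,
# leaving the rest of the frame-time budget for higher framerates.
print(f"{relative_render_cost('1080p', '4K'):.0%}")  # -> 25%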


Conclusion

Unfortunately, comparing the two technologies head-to-head is not straightforward. Tom's Hardware performed an in-depth test of the two and found that while DLSS generally outperformed FSR, the fact that it is more computationally expensive than FSR and exclusive to RTX GPUs kept them from crowning it the undisputed victor of the matchup.

Not to be outdone, Intel released their own upscaler, XeSS, in 2022, though it has yet to see widespread adoption.

The good news is that nothing is stopping games from implementing both technologies (or even all three, counting XeSS), and the expectation is that multi-upscaler support will soon become the norm. Ultimately, the difference between DLSS and FSR isn't likely to be noticeable to the average consumer, so rest assured that regardless of which system the game you're playing utilizes, the visual experience is going to be a quality one.

Note: Both the PlayStation 5 and Xbox Series X/S use AMD GPUs, so only FSR is available on those platforms. The Nintendo Switch uses an Nvidia GPU, but it does not include support for DLSS.


Sunday, July 24, 2022

Connecting the Dots - Part 2 - HDMI

For Part 2 of my "Connecting the Dots" series, I'll be discussing HDMI, both the underlying technology and the cable types we use to connect HDMI-enabled devices. If you've ever wondered what the different versions of HDMI mean and which cables to use to ensure that you get the best multimedia experience with your equipment, this post is for you. Let's get started!


HDMI - High-Definition Multimedia Interface


If you've owned a TV or multimedia device of any kind over the last 15 years or so (and I'm betting you have), you've probably heard the term HDMI, whether in reference to your device's features or the cables used to connect your devices.

Simply put, HDMI is the standard interface type used to connect multimedia devices (much like USB is the standard for connecting computing devices). HDMI devices come in 4 primary versions, and there are 4 types of HDMI cables. 

Sounds straightforward, right? Unfortunately, our friends in the TV industry managed to make what was supposed to be a simple standard somewhat complicated, but hopefully I can clear up some of the confusion in this post.

Not pronounced "Hedemi"

HDMI Versions


HDMI 1.0 was released in December 2002 as a new standard interface type for transmitting both audio and video signals; this was a major innovation, as most multimedia transmission interfaces up to that point transmitted either video or audio, not both. The promise of HDMI was that it would make multimedia connections simpler by providing a single connection that could handle a variety of signal types.

Despite this promise, HDMI didn't see widespread adoption by consumer electronics until the rise of high-definition TVs in the late 2000s. By then, the most common HDMI version was HDMI 1.4.


HDMI 1.4b


Released in June 2009, HDMI 1.4b is now the oldest HDMI version still in production, and pretty much every TV manufactured since 2010 has supported it. HDMI 1.4b helped kickstart the rise of high-definition TVs by supporting video up to 1080p @ 120Hz and 4K @ 24Hz, as well as surround sound (via a feature called Audio Return Channel, or ARC).

While virtually all new TVs manufactured today employ newer versions of HDMI, many older TVs in use today are still running strong on HDMI 1.4b.


HDMI 2.0


Introduced in September 2013, HDMI 2.0 added support for video content resolutions up to 4K @ 60Hz, 32-channel audio content, and an enhanced color space called Rec. 2020. HDMI 2.0 reached large-scale adoption in the 2014 model year and remained the dominant HDMI version through 2015.

In April 2015, an update to HDMI 2.0, called HDMI 2.0a, was released. This version of HDMI added support for High Dynamic Range (HDR) content, which is all the rage with TVs today. HDMI 2.0a was the flagship HDMI standard for the 2016-2018 model years.

This implementation of HDR technology was further refined with the release of HDMI 2.0b in March 2016, which added support for HLG (Hybrid Log-Gamma). Most TVs manufactured in the 2019-2021 model years support up to HDMI 2.0b.


HDMI 2.1


Announced in January 2017, HDMI 2.1 is the most significant revision of the HDMI standard since it was first introduced. It adds support for video resolutions of 4K and 8K @ 120Hz, object-based and high-definition audio formats, an enhanced version of ARC called eARC (Enhanced Audio Return Channel), and gaming features such as Variable Refresh Rate (VRR) and Auto Low Latency Mode (ALLM).

Because HDMI 2.1 is such a major change to the HDMI specification, adoption of the new version has been slow. Though announced back in early 2017, it wasn't until the 2022 model year that a wide range of TVs supported it.


HDMI Cables


Like the HDMI spec itself, HDMI cables come in several different flavors, which are:

  • Standard
  • High Speed
  • Premium High Speed
  • Ultra High Speed

While most HDMI cables look about the same, they come in several different types.

Now, it's worth noting that each HDMI cable type doesn't necessarily correspond to a particular version of HDMI. Instead, the primary difference between the cable types is how much bandwidth each one provides. Newer versions of HDMI require more bandwidth, which is why they may require certain cable types. I'll explain in more detail below.


Standard HDMI Cable


The original HDMI cable, Standard HDMI cables have about 5 Gb/s of bandwidth, which is sufficient to support up to HDMI 1.2. An older spec, Standard HDMI cables are no longer widely manufactured and have largely been replaced by High Speed HDMI cables.


High Speed HDMI Cable


The most common HDMI cable available today, High Speed HDMI cables have about 10 Gb/s of bandwidth, which supports all HDMI versions up through HDMI 2.0b. Because of their ubiquity, these cables are quite affordable and can be found at retailers like Amazon for just a few dollars.


Premium High Speed HDMI Cable


Premium High Speed HDMI cables have 18 Gb/s of bandwidth and are additionally certified by the HDMI Forum to meet standards of durability and EMI (electromagnetic interference) shielding. 

The label affixed to all Premium High Speed HDMI cables. The QR code can be scanned for authentication.

The Premium High Speed spec for HDMI cables was created by the HDMI Forum (the body that regulates HDMI standards) in response to the appearance of cheap High Speed HDMI cables that eventually came to saturate the market. The concern was that some of these cheaper High Speed HDMI cables were of suspect build quality, and manufacturers of high-quality HDMI cables wanted a way to distinguish their cables from cheaper ones. Thus, the Premium High Speed HDMI cable was born, with each one carrying a scannable QR code on the label that can be used for authentication.

While Premium High Speed HDMI cables don't support any more versions or features of HDMI than regular High Speed HDMI cables, they are intended to offer the consumer assurance of quality that had been lacking in the marketplace.


Ultra High Speed HDMI Cable


Ultra High Speed HDMI cables are the newest HDMI cables, sporting a bandwidth of 48 Gb/s, sufficient to support the full HDMI 2.1 spec. 

If you see this label, know that you're getting the best (and most expensive) HDMI cable available.

Ultra High Speed HDMI cables also adhere to the same physical standards as Premium High Speed HDMI cables and offer QR code verification. Also, be aware that these cables are more expensive, averaging about $10/foot (though prices will likely come down over time).
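Since the four cable tiers really just boil down to bandwidth, here's a small Python lookup using the figures cited above (the helper and its name are my own, not anything official from the HDMI Forum):

# The four HDMI cable tiers and their approximate bandwidth, per this post.
HDMI_CABLES_GBPS = {
    "Standard": 5,             # supports up to HDMI 1.2
    "High Speed": 10,          # supports up through HDMI 2.0b
    "Premium High Speed": 18,  # 18 Gb/s plus durability/EMI certification
    "Ultra High Speed": 48,    # full HDMI 2.1 spec
}

def cheapest_cable_for(required_gbps):
    """Return the least-capable tier that still meets the bandwidth need."""
    for name, bandwidth in HDMI_CABLES_GBPS.items():  # insertion order
        if bandwidth >= required_gbps:
            return name
    raise ValueError("No listed cable tier supports that bandwidth")

print(cheapest_cable_for(48))  # HDMI 2.1 features -> 'Ultra High Speed'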


What I learned


Years ago, a tech journalist explained that low-cost High Speed HDMI cables were sufficient for connecting all HDMI-enabled devices, and that more expensive HDMI cables were a waste of money. For the most part, he was right; after reading his piece, I proceeded to equip all my devices with cheap High Speed HDMI cables purchased from Amazon, and I never had much cause to revisit that decision.

However, with the emergence of new HDMI features as part of the HDMI 2.0 and 2.1 specs, I wanted to check again to make sure my old cables were still up to the task. Fortunately, I found that they are, though that will change as more HDMI 2.1 devices find their way into my home over the next few years.


What you can apply


The good news here is that picking the right HDMI cable is actually pretty straightforward. 

First off, Standard HDMI cables aren't really around anymore, so you don't have to worry about them at all.

Most devices today work fine with High Speed HDMI cables, since they're likely using HDMI 2.0b or earlier. Additionally, High Speed HDMI cables are abundant (and cheap), so you shouldn't have much trouble obtaining them.

Premium High Speed HDMI cables are only really necessary in environments where cable durability or EMI shielding is a requirement, such as if you're running the cable through a wall or in an area with high EMI (very rare).

Ultra High Speed HDMI cables are required only if you're connecting two HDMI 2.1-enabled devices to one another and want the full suite of features the spec offers. Most HDMI 2.1 features are gaming-centric, so only the newest TVs and gaming consoles currently on the market support it. However, that will likely change over time as HDMI 2.1 slowly becomes the dominant version of HDMI.

In summary, if you're happy with the way your devices are performing today, then you're probably set; there's no need to rush out and get new HDMI cables, even with all the new versions of HDMI coming out. That said, if you run into a situation where you need a newer cable, go ahead and shell out a little extra money to make sure you've got the right cable for the job; it'll be worth it in the end.

Sunday, June 12, 2022

Connecting the Dots - Part 1 - USB

A while back, I decided to audit my USB-connected devices and their adjoining cables (because that's the kind of thing I do in my spare time) in an effort to lend some semblance of order to the confused mass of cables I had steadily accrued over the years, and I recently did the same for my HDMI- and Bluetooth-enabled devices. Over the course of these audits, I discovered several things I previously didn't know about these technologies and what I could be doing better.

In this blog post (Part 1 of 3 in this series), I'll discuss what it was I discovered during my USB audit, what I was doing wrong, and how you can make sure you're getting the most out of your devices by optimizing your connections. In future posts I'll discuss the same for HDMI and Bluetooth.


USB - Universal Serial Bus

If you've used any device with a microprocessor in the last 20 years, you probably have at least a vague idea of what USB is, so I won't waste your time by covering the basics. Instead, I'll just let it suffice to say that USB is basically the standard interface used to physically connect digital devices in this day and age, providing both data transfer and power capabilities.


The well-known USB trident logo.


USB Data Transfer Protocols


When it comes to USB, you've probably heard that there are basically two kinds of USB devices/cables currently on the market, each named after the data transfer protocol it implements: USB 2.0 and USB 3.0 (in case you're wondering, USB 1.0 came and went in the 90s with few people taking notice; it wasn't until USB 2.0 was released in 2000 that the technology started catching on). Virtually every USB cable and device on the market today implements one of these two protocols (Note: I know that there are also USB 3.1 and USB 3.2 in addition to USB 3.0, but for the sake of simplicity, I'll refer to them all as USB 3.0).

However, have you ever wondered just what the differences are between the two versions of USB and what they mean for you? Well, wonder no more! Simply put, USB 3.0 can transfer data about 10x faster and can provide 80% more power than USB 2.0. 
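To put that "10x faster" claim in real terms, here's a quick Python sketch using the nominal signaling rates (480 Mb/s for USB 2.0 and 5 Gb/s for USB 3.0); keep in mind real-world throughput is lower due to protocol overhead, so treat these as best-case numbers:

# Nominal signaling rates in megabits per second.
USB_RATES_MBPS = {"USB 2.0": 480, "USB 3.0": 5000}

def transfer_seconds(file_gigabytes, usb_version):
    """Best-case time to move a file of the given size (1 GB = 8,000 Mb)."""
    megabits = file_gigabytes * 8000
    return megabits / USB_RATES_MBPS[usb_version]

# A 4 GB movie file: ~67 seconds over USB 2.0 vs. ~6.4 seconds over USB 3.0.
for version in USB_RATES_MBPS:
    print(f"{version}: {transfer_seconds(4, version):.1f} s")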


USB Connector Types


Unfortunately, there's another layer to understanding USB which tends to confuse people. In addition to there being two data transfer protocols to worry about, there are also many different USB connector types to consider. 

On occasion, you've probably found yourself digging through a pile of USB cables looking for one that has an end in the shape of the port that you're trying to plug it into. With so many different types of USB connectors, this really has become a nightmare for consumers. However, I'll try to clear up the confusion by using the below chart to show you the most common connector types and which data transfer protocol each one implements.

The most common USB connector types, with the exception of USB-C (more on that later).

USB 2.0 Connectors


On the top left of the chart, you can see a USB 2.0 Type A connector. This is the standard USB connector that you probably see every day and is used by many different devices.

Next to it is the USB 2.0 Type B connector. This connector is typically used with printers and other peripherals; its distinct shape marks the device end of the cable and prevents you from accidentally connecting two host machines to each other.

On the middle left is a USB 2.0 Mini-A connector, next to a Mini-B. These connector types were introduced as a first attempt to design a category of compact connectors for use with small, mobile devices such as smartphones, tablets, and digital cameras.

As devices became thinner over time, a new category of USB connectors was needed; the result was the USB 2.0 Micro category, located on the bottom left of the chart. Besides being smaller, the Micro category is more durable than the Mini category it replaced, designed to survive 10,000 connect-disconnect cycles.


USB 3.0 Connectors


With the introduction of the USB 3.0 protocol, new connector types were needed to handle the extra power and data. To help consumers tell which protocol a given USB cable or device implements, a blue insert is usually included inside the connector on the cable and inside the port on the device to indicate implementation of the USB 3.0 protocol (hence the blue outline on the chart). 

An example of a PC that has both USB 2.0 Type A (center-right) and USB 3.0 Type A (center-left) ports. Note the blue insert in the USB 3.0 ports signifying the USB 3.0 protocol.


On the top right of the chart is a USB 3.0 Type A connector, which is simply a USB 2.0 Type A connector with five additional contact points (called pins). These additional pins help the cable achieve the data transfer rate and power output required by the USB 3.0 protocol.

Likewise, the USB 3.0 Type B connector is very similar to its 2.0 predecessor, with an extra set of five pins inside the added notch at the top of the connector.

The USB 3.0 Micro-B connector is essentially a USB 2.0 Micro-A and Micro-B joined at the hip, doubling the number of pins available from five to ten. Unlike its USB 2.0 predecessor, USB 3.0 Micro-B connectors are usually used with external data storage devices instead of mobile devices.

So, there ya go! An easy-to-understand breakdown of the most common USB connectors. Now when you're searching for that particular cable, you know which name to scream at the wall in frustration.


What about backward compatibility?


Good question. A big selling point of USB is backward compatibility; that is, newer USB cables should (theoretically) be compatible with older USB devices.

The good news is that the USB 3.0 protocol is backward compatible with the 2.0 protocol. The bad news is that in order for a USB 2.0 device to use a USB 3.0 cable, the connectors must also be compatible. 

As you can see from the chart, there's no way you're going to cram a USB 3.0 Micro-B connector into a USB 2.0 Micro-B port (likewise for the Type B connectors). For Type A connectors, though, you can plug a USB 3.0 Type A cable into a USB 2.0 Type A port, and everything should work fine; the additional pins in the connector simply won't be used by the device.
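Those rules are simple enough to restate in a few lines of Python (the fit table below is just my own summary of the chart, not an official compatibility matrix):

# Which USB 3.0 plugs physically fit the USB 2.0 port of the same name?
FITS_USB2_PORT = {
    "Type A": True,    # same shape; the extra pins simply go unused
    "Type B": False,   # the added notch changes the shape
    "Micro-B": False,  # the doubled-width connector doesn't fit
}

def usb3_cable_works_with_usb2_device(connector):
    """True if a USB 3.0 cable with this connector can serve a USB 2.0 device."""
    # The 3.0 protocol is backward compatible, so physical fit is the only issue.
    return FITS_USB2_PORT.get(connector, False)

print(usb3_cable_works_with_usb2_device("Type A"))   # True
print(usb3_cable_works_with_usb2_device("Micro-B"))  # False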

In the end, the backwards compatibility we were initially promised by USB was only partially delivered. 

Consumers -1, USB Cable Manufacturers +1


What about USB-C?


That's right. The one commonly used USB connector type that is not on the above chart is also the newest one, USB-C.

So many pins!

USB-C is a new connector type intended to replace the Type A, Type B, and Micro-B connectors so that there will be only one common connector type going forward, reducing confusion and frustration for consumers. Furthermore, with a whopping 24 pins, USB-C is intended to be as future-proof as possible, providing the capability to handle large amounts of data and power. Also, USB-C cables can implement either USB 2.0 or USB 3.0 (typically the latter), whereas the other connector types are all tied to a specific protocol.

Devices today are increasingly switching over to USB-C, though adoption hasn't been as fast as many would like (I'm looking at you, Apple). Still, the tide seems to be turning, and it might not be long before you can throw away that tangled mess of Type A, B, and Micro-B cables.

What I learned

For my part, I never really tried to understand the different connector types, what they're used for, and how that relates to the USB protocol they implement. Instead, I was content to play "Find the plug (A) that looks like it goes into port (B)" ad nauseam.

That said, after my audit, I was able to document how many USB-connected devices I owned, which version of the USB protocol they implemented, and which connector type they used. This helped me declutter my collection of cables by separating the ones I knew I needed and storing them near the devices that used them while also discarding the ones I knew I wouldn't need (usually because the device the cable came with was long gone).

Additionally, I found that I had two USB 3.0 devices that I was connecting using a USB 2.0 cable, meaning that I wasn't taking full advantage of either device's capabilities; without this audit, I would have never recognized my mistake.

What you can apply

My advice is this: Check to make sure that you're using the right cable for the device. Just because the USB cable you're using fits and your device works doesn't mean that you're doing everything right. Using the wrong cable can potentially keep you from realizing your device's full potential.

The easy answer is to simply use the USB cable that came packaged with your device, as it almost certainly implements the same USB protocol as the device itself. In the event that said cable isn't available, you'll need to employ some deductive reasoning to figure out which cable you need to use.

For most connector types, only one USB protocol is implemented. If your device has a Mini, Micro, or Type B port, then the only cable that will conform to that port will also implement the correct corresponding USB protocol.

For Type A and USB-C devices, it gets a little trickier. USB 2.0 and USB 3.0 Type A ports look the same and are physically compatible with both USB 2.0 and 3.0 Type A cables; the only visible difference is a blue insert that device manufacturers usually add to indicate implementation of the USB 3.0 protocol.

USB-C's versatility gives it the ability to implement either USB 2.0 or 3.0. Unfortunately, there is no visible difference between USB 2.0 or 3.0 USB-C cables and ports.


Since you can't tell the difference by looking, my rule of thumb is to always use USB 3.0 cables with USB-C devices, just to be safe.

Overall, the rule is simple: Use USB 3.0 cables with USB 3.0 devices and USB 2.0 cables with USB 2.0 devices.

I hope this helped answer any questions you may have had about USB cables and devices. Be sure to keep an eye out for my post on HDMI in the coming weeks!

Monday, January 31, 2022

Upscale Upsell

We all love TVs, and we especially love TVs with great picture. There's nothing quite like walking into Best Buy, finding the biggest, most expensive TV on display, and taking a moment to enjoy the impressive picture quality it exhibits.

Unfortunately, most content we regularly watch on our TVs falls short of the eye-popping glory of the demo reels we see on the retail floor. While TV display technology has improved by leaps and bounds over just the last few years, content has struggled to keep pace. To account for this, TVs leverage a specific process to handle content produced at a lower resolution and spruce it up to make it look like what you saw at the store. 

This process is called "upscaling", and while it's very useful, not all upscaling is the same. In this post, I'll discuss what upscaling is and how to avoid one specific mistake that many TV owners make.


TV Picture Basics

TV picture quality is usually measured in terms of resolution. That is, how many pixels comprise the picture. TVs with higher resolution generally have better picture quality, as the more pixels that comprise the picture being displayed, the more detail the TV is able to show.

Now, there are plenty of other factors to consider when measuring picture quality (such as contrast, brightness, color accuracy, etc.), but for the sake of simplicity, picture resolution is the most commonly used measurement.

The most common display resolutions.

Today, the most common resolution for TVs is 4K (sometimes also called "UHD"). Almost every TV you see on the shelf at any electronics store features 4K resolution; in fact, you'll have to look hard to find one that isn't 4K.

That said, many of the movies and TV shows we regularly watch aren't produced in 4K. Instead, most of them are still produced in either 1080p or 720p (lower resolutions commonly called "HD").

So, how are movies and TV shows produced in lower resolutions like 1080p and 720p displayed on a 4K TV? The answer is via a process called upscaling.


Upscaling

Without getting into too much technical detail, upscaling is a process which converts lower-resolution video content (such as 1080p or 720p) to a higher resolution (such as 4K) so that it can be displayed on a higher resolution screen.
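For the curious, here's a toy Python example of the crudest possible method, nearest-neighbor upscaling, just to show what "converting 1080p to 4K" means mechanically (real TVs use far more sophisticated, and usually proprietary, algorithms):

# Nearest-neighbor upscaling: enlarge a frame by repeating each pixel.
import numpy as np

def upscale_nearest(frame, factor):
    """Repeat each pixel `factor` times along both spatial axes."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in 1080p frame
frame_4k = upscale_nearest(frame_1080p, 2)  # 1080p -> 4K doubles each axis
print(frame_4k.shape)  # (2160, 3840, 3), i.e. 4K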

Sony TVs have some of the best upscaling technology on the market.

TV manufacturers invest a lot of time and effort into upscaling technologies because they want all content viewed on their TVs to look good, and they know a lot of the content consumers watch is of lower resolution than 4K. In fact, the upscaling technology built into many premium TVs today is so good that the average viewer can't even tell the difference between content produced in 4K and content upscaled by their TV to 4K.

However, not all upscaling processes are created equal; some devices are better at upscaling than others, and many consumers make the mistake of not letting their TV handle upscaling duties.


The Device Chain

When trying to determine how to get the best picture quality for your TV, it's important to note how content gets to your TV. Many people have a streaming device (such as a Roku, Fire TV, or Apple TV), gaming console, cable box, or Blu-ray player connected to their TVs, as well as streaming apps built directly into the TV itself.

Together, the TV and all of the devices connected to it are called a device chain. An example of the device chain in my living room is below.

Good old MS Paint never disappoints.

As you can see, I have an Xbox and a Roku connected to my TV. The Xbox and Roku send video and audio signals to the TV; the TV displays the video signal and sends the audio signal to the Sound System to be played.

Now, this is a very simple setup and all seems well and good. However, there is one problem.

My Roku contains all of the apps I use for streaming content (such as Netflix, Disney+, and YouTube TV), so most of the content I view goes from my Roku to my TV. The problem is that a lot of the content I watch (mostly sports and TV shows on YouTube TV, but also some movies on Netflix and Disney+) is produced in 1080p or 720p. Therefore, before it can be displayed on my TV, the content needs to be upscaled to 4K.

No problem! My TV will just handle the upscaling and everything will be great... right? Well, no. The problem is, my Roku actually upscales the content I'm watching before it's sent to the TV. Therefore, as far as the TV is concerned, it's getting a 4K signal from the Roku and no upscaling is needed.

An example of the upscaling process in my setup. 1080p content is streamed from Netflix (or another app) to my Roku, which then upscales the content to 4K before sending it to the TV. This is sub-optimal, as my TV's upscaling capabilities are better than my Roku's.

The problem is that the Roku isn't as good at upscaling content as my TV is. You see, streaming devices like Rokus are designed to be portable and affordable; while they are capable of upscaling content, they're simply not as effective at it as TVs. 

Today, TVs (especially premium ones) have very powerful processors and sophisticated upscaling algorithms that streaming devices simply can't match. Thus, to get the best picture quality possible, you need to ensure that your TV is handling upscaling duties, not your input device.


Roku Workaround

Some devices have a pass-thru option where all content streamed is passed to the TV without any sort of modification by the device itself. If your device has this option, it's probably in your best interest to enable it.

Unfortunately, Roku devices have no such option. However, there is a workaround that will allow you to send content in its native resolution from the Roku to the TV, which will in turn allow the TV to handle upscaling.

First, you need to know the resolution of the content you're planning to stream (ex. 1080p, 720p, etc.). Once you have that, you can set your Roku to output video at that specific resolution by going to Settings -> Display Type, and then selecting the resolution of the content you're planning to stream.

This essentially forces the Roku to output all video content at that specific resolution, and if that resolution is lower than what your TV supports, your TV will upscale it using its upscaling engine.

Now, I know this isn't the most convenient workaround, but it's really the only one that Roku leaves open to us. That said, I probably wouldn't bother trying to work around Roku's auto-upscaling unless I were preparing to watch a major event that I know is going to need to be upscaled (like the upcoming Super Bowl, which, despite being the world's second most-watched sporting event, is still being broadcast in 1080i. Ugh...). For casual viewing, allowing Roku to handle upscaling is acceptable in my book.


Conclusion

Upscaling is a powerful and (unfortunately) necessary technology in modern home entertainment. TVs today have become extremely effective at upscaling, so much so that the process almost goes unnoticed by the average consumer.

Upscaling is most effective when performed by your TV, and when certain devices get in the way and try to perform upscaling themselves, the result can be sub-optimal picture quality. Always do what you can to ensure that each device is performing the process for which it is best suited, and you'll have a quality viewing experience.

Tuesday, January 11, 2022

Best of the Rest

Can a non-"power" program ever again win college football's ultimate prize? Should they even try?

Last night, Georgia won the College Football Playoff (CFP), and with it, the program's first national championship since 1980. While seeing the country's best teams play head-to-head for the title is exciting, I can't help but wonder: What about the little guys?

It's been an ongoing controversy for years in the upper echelon of college football that the smaller, so-called "Mid-Major" Football Bowl Subdivision (FBS) programs (more commonly known today as the "Group of Five") are routinely denied the opportunity to play for a national championship. However, what if the best Group of Five teams were recognized at the end of each season and awarded a "national championship" of their own? Certainly, they deserve something for their efforts, aside from a brush-off from the CFP Selection Committee and a spot in a consolation bowl game.

Side-stepping the overarching controversy for now, I decided to take a look back at the past few seasons to see which of these teams can claim the title of "Best of the Rest".


Methodology

Before I could begin compiling my list, I needed to pick a starting point. I settled on 1992, since that was the inaugural year of the "Bowl Coalition", college football's first attempt at an organized national championship at the FBS level.

The 1984 BYU Cougars are the last non-"power" football program to win the national championship, though it wasn't without controversy.

Next, I had to figure out which teams were eligible. While the distinction between "Power" and "Mid-Major" teams today is fairly clear, such wasn't the case in 1992. After some research, I determined that programs that met the following criteria were sufficiently "Mid-Major" teams beginning in 1992:

- Teams belonging to the following conferences:

○ Big West

○ Mid-American

○ Western Athletic

- Independent Programs EXCEPT Notre Dame and Penn State

It goes without saying that the college football landscape has shifted quite a bit since 1992: Teams have switched conferences and some conferences have merged or folded while new ones have been formed. Additionally, many programs moved up from "Mid-Major" to "Power" status as a result of the realignments that occurred over the years; this is why you may see some teams on my list that today are considered "Power" programs.

Finally, I had to decide how to select the best mid-major team each season. This was actually the simplest part: I picked the highest-ranked mid-major team(s) in the final AP and Coaches polls each season.

The reason is simple: Introduced in 1936 and 1950 respectively, the AP and Coaches polls are widely considered the definitive college football polls (often called the "major wire service polls"). Each team that finishes ranked #1 in either poll has a legitimate claim to the title "National Champion". Therefore, to crown the mid-major national champions, I used the same polls that are used to determine the FBS national champions.
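For the programmatically inclined, the selection method boils down to just a few lines. Here's a Python sketch (the poll below is a hypothetical placeholder, not real final-poll data):

# Pick the highest-ranked eligible mid-major from a season's final poll.
final_poll = [  # (rank, team) pairs from a hypothetical final AP poll
    (1, "Georgia"), (2, "Alabama"), (8, "Cincinnati"), (24, "Houston"),
]
mid_majors = {"Cincinnati", "Houston"}  # eligible teams per the criteria above

def best_of_the_rest(poll, eligible):
    """Return the highest-ranked (lowest rank number) eligible team."""
    candidates = [(rank, team) for rank, team in poll if team in eligible]
    return min(candidates)[1] if candidates else None

print(best_of_the_rest(final_poll, mid_majors))  # -> Cincinnati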


Mid-Major Champions

Without further ado, here's my list of mid-major College Football Champions* (since 1992):

1992 - Hawaii

1993 - Louisville

1994 - Utah

1995 - Toledo (AP), East Carolina (Coaches)

1996 - BYU

1997 - Colorado State

1998 - Tulane

1999 - Marshall

2000 - Colorado State (2)

2001 - Louisville (2)

2002 - Boise State

2003 - Boise State (2)

2004 - Utah (2)

2005 - TCU

2006 - Boise State (3)

2007 - BYU (2)

2008 - Utah (3)

2009 - Boise State (4)

2010 - TCU (2)

2011 - Boise State (5)

2012 - Utah State (AP), Boise State (Coaches) (6)

2013 - UCF**

2014 - Boise State (7)

2015 - Houston

2016 - Western Michigan

2017 - UCF (2)

2018 - UCF (3)

2019 - Memphis

2020 - Cincinnati

2021 - Cincinnati (2)

2022 - Tulane (2)

2023 - SMU

*All schools listed are consensus champions unless otherwise noted.

**For 2013 only, the American Athletic Conference inherited the former Big East Conference's automatic BCS bid due to a contractual obligation. However, for competition purposes, the American is considered a mid-major conference.

By conference:

- WAC, Mountain West (9)

- American (8)

- C-USA, MAC (3)

- Independent (2)


Looking Ahead

While retroactively selecting the top mid-major team each season is an interesting academic exercise, that's all it really amounts to. After all, there's no championship associated with being a season's top-performing mid-major team, only a statistical distinction. The controversy of mid-major teams being excluded from the CFP isn't going anywhere: In fact, after Cincinnati completed an undefeated regular season in 2020 and was still denied a Playoff berth, the rancor only grew louder. In 2021, the Bearcats were finally rewarded for putting together a second consecutive undefeated regular season with a spot in the Playoff, only to be smashed in the semifinal round by Alabama, 27-6.

Despite being the only undefeated team in the nation, Cincinnati was no match for Alabama in the 2021 Cotton Bowl

One idea being kicked around is expanding the CFP with one spot guaranteed for the highest-ranked Group of Five team. That way, the Group of Five will always be included in the national championship picture.

However, even this system does not address the fundamental issue with pitting the Power Five against the Group of Five: Parity. Even if the Group of Five were always granted a spot in the playoffs, they would likely never make it all the way to the championship due to the competitive gap between college football's "Haves" and "Have-nots".

Instead, I would rather see a system where the top Group of Five teams play one another for a championship that each has a fair chance of winning, as opposed to selecting the single best Group of Five team and sending them to be destroyed on national TV by the country's top Power Five team. Call it the "Mid-Major National Championship" or name it after a sponsor ("The Domino's Pizza Championship presented by Mountain Dew" has a nice ring to it). All I'm saying is give the little guys something to play for other than a mediocre bowl and/or near-certain defeat at the hands of Alabama.

Tuesday, September 21, 2021

Rocket Ball

When I moved to Huntsville last year, one of the first things I did was research the professional sports scene in town. With the initial outbreak of the COVID-19 pandemic at its height, I knew there was little chance I would get to see any pro sports teams play anytime soon, but I wanted to make sure I was well-versed in local sports when the time did come for play to resume. Throughout my studies, I found that Huntsville has a rich (if uneven) history of minor league professional sports, and one that seems to have a bright future.

Let's dig into the history of professional sports in Huntsville.


Baseball

Huntsville Stars (1985-2014)


The Stars are notable for being Huntsville's longest-tenured professional sports team (30 years)

In July 1984, an ownership group led by Larry Schmittou, President of the Southern League's Nashville Sounds, purchased the Evansville Triplets of the American Association. Schmittou essentially planned to swap the locations of the two teams: Send the Sounds to Evansville, Indiana and bring the Triplets down to Nashville. However, after the City of Evansville refused to pay for upgrades to the Triplets' home stadium, Schmittou made an agreement with Huntsville Mayor Joe W. Davis to construct a new, 10,000-seat multipurpose facility (later named "Joe W. Davis Stadium") in Huntsville that would serve as the new home of the Sounds, who would be renamed the Huntsville Stars and begin play as a "new" franchise in 1985. Meanwhile, Schmittou followed through with his plan to move the Triplets to Nashville, where the team assumed the name and history of the Nashville Sounds.

Beginning in 1985, the Stars played 30 seasons in Huntsville, first as an affiliate of the Oakland Athletics (1985-1998) and then the Milwaukee Brewers (1999-2014). During that time, the team fielded a number of future MLB stars (including Ryan Braun, José Canseco, Jason Giambi, Tim Hudson, and Mark McGwire) and won the Southern League championship in 1985, 1994, and 2001.

Just before the 2014 season and after several years of sagging attendance, the Huntsville Stars were purchased by an ownership group led by Ken Young with plans to relocate the team to Biloxi, Mississippi. The plan was to have the Stars play a final "farewell" season in Huntsville in 2014 before moving to Biloxi for the 2015 season and assuming a new identity as the Biloxi Shuckers. However, construction delays with the new stadium meant that for their first 54 games of the 2015 season, the Shuckers played exclusively on the road while the finishing touches were put on their new facility. On June 6, MGM Park hosted the Shuckers' home opener, finally completing the move.


Rocket City Trash Pandas (2020-Present)


Huntsville's newest professional sports team and current envy of minor league baseball

In November 2017, an ownership group led by Ralph Nelson purchased the Southern League's Mobile BayBears (who had been struggling with poor attendance and dilapidated facilities) with the intention to relocate the team to Madison, Alabama (a suburb of Huntsville). Following a fan poll, the distinctive name Rocket City Trash Pandas was chosen. 

Toyota Field. I've spent many an evening here

The deal also included the construction of a new, $46 million ballpark in Madison, later named "Toyota Field". While the new stadium was completed and the team's operations were migrated to Madison in time for the start of the 2020 season, the onset of the COVID-19 pandemic delayed and later canceled the season altogether. 

Undeterred, the Trash Pandas began play in 2021 as an affiliate of the Los Angeles Angels. Over the course of their inaugural season, the Trash Pandas led all of Double-A in attendance with an average of over 5,000 fans per game, good enough for 10th overall in minor league baseball.


Basketball

Huntsville Lasers (1991-1992)


The Lasers were the shortest-lived pro sports team to play in Huntsville

In 1991, the Huntsville Lasers were founded as a charter member of the newly formed Global Basketball Association (GBA). The Lasers had a short, tumultuous existence which included the firing of their general manager and publicity manager during the preseason. This dysfunction, combined with low attendance and high costs (mostly due to travel), meant the Lasers lasted for only a season and a half, folding along with the GBA in December 1992.


Huntsville Flight (2001-2005)



One of the coolest logos I've seen

In 2001, the Huntsville Flight were formed as a founding member of the new NBA Development League (known today as the G League). The Flight played their home games at the Von Braun Center, and in their third season (2003-04) advanced to the league finals, losing by a mere two points to the Asheville Altitude. 

Unfortunately, the Flight suffered from both low attendance and the NBA Development League's shaky business model, the combination of which forced the Flight to relocate to Albuquerque, New Mexico to become the Albuquerque Thunderbirds following their fourth season in Huntsville. Today, the franchise still exists as the Cleveland Charge.


Football 

Huntsville Rockets (1962-1966)


In 1962, the Huntsville Rockets were formed as a member of the Dixie Professional Football League (DPFL). After one season of play, the Rockets moved to the new Southern Professional Football League (SPFL) where they played for two seasons (1963-1964) before the SPFL folded. Following the demise of the SPFL, the Rockets joined the North American Football League (NAFL) in 1965 where they played a season and a half, folding midway through the 1966 season after failing to pay league dues.


Huntsville/Alabama Hawks (1967-1969)


In 1967, the remnants of the Huntsville Rockets organization came together to form the Huntsville Hawks and joined the Professional Football League of America (PFLA), itself a reorganization of the NAFL. After a single season, the PFLA folded and the Hawks joined the Continental Football League (CoFL), rebranding themselves as the Alabama Hawks. The Hawks played in the CoFL for two seasons before both the team and the league folded.

It's worth noting that during their short existence, the Hawks served as an unofficial "affiliate" of the Atlanta Falcons. Several Hawks players eventually played for the Falcons and the two teams even played an exhibition game in 1969 (which the Falcons won 55-0), one of the rare occasions that an NFL team played a non-NFL team.


Tennessee Valley Vipers (2000-2010)


The Vipers won Huntsville's first pro football championship

In 2000, the Tennessee Valley Vipers were founded as a charter member of the AF2 (stylized as "af2"), the official developmental league of the Arena Football League (AFL). The Vipers played in af2 for five seasons (through 2004) before leaving af2 for a rival league, United Indoor Football (UIF), following a feud between league management and owner Art Clarkson.

This logo needs to go extinct

Upon joining UIF, the Vipers attempted to rename themselves the Tennessee Valley Pythons (as af2 retained ownership of the "Tennessee Valley Vipers" trademark). However, af2 threatened to sue, arguing that since both "Vipers" and "Pythons" referred to snakes, it had the potential to confuse fans. Under legal pressure, the Vipers settled on the name Tennessee Valley Raptors.

Unfortunately, fans didn't connect with the team's new identity or the new league's peculiar rules. After playing a single season in the UIF, Raptors owner Art Clarkson relocated the team to Rockford, Illinois where the team once again rebranded as the Rock River Raptors. The team later folded in 2009.

In 2006, after the departure of the Raptors, the af2 founded a new franchise in Huntsville. The new team assumed the name and history of the Tennessee Valley Vipers (which the former team vacated upon leaving af2). The new Vipers played in af2 for four seasons (through 2009) and despite struggles with attendance, the team managed to win the league championship in 2008.

The Vipers' final incarnation

In 2010, the Vipers were moved up to the parent AFL and were renamed the Alabama Vipers. Competing against franchises based in major cities like Dallas, Phoenix, Orlando, and Cleveland, the Vipers played only one season in Huntsville before relocating to Atlanta and rebranding as the Georgia Force. The franchise later folded in 2012.


Alabama Hammers (2011-2015)


Reminds me of my alma mater, Southern Miss

In 2011, following the departure of the Alabama Vipers to Georgia, a new ownership group stepped in to fill the void by founding the Alabama Hammers as an expansion franchise in the Southern Indoor Football League (SIFL). Named after the Alabama state bird (the Yellowhammer), the Hammers played in the SIFL for one season before the league, plagued by financial instability, folded.

In 2012, the Hammers joined the Professional Indoor Football League (PIFL) where they played for four seasons, winning the league championship in 2013. After the 2015 season, both the Hammers and the PIFL folded.


Hockey

Huntsville Blast (1993-1994)


Huntsville was just a pit stop for the Blast

In 1993, the Roanoke Valley Rampage of the East Coast Hockey League (ECHL) relocated to Huntsville and became the Huntsville Blast. The Blast played a single season in Huntsville (1993-94), averaging barely 1,500 fans per game before being sold and relocated to Tallahassee, Florida to become the Tallahassee Tiger Sharks. The organization still exists today as the Utah Grizzlies.


Huntsville Channel Cats (1995-2004)


The Channel Cats won a championship in each league in which they played

In 1995, the Huntsville Channel Cats were founded as an expansion franchise for the Southern Hockey League (SHL) by Knoxville, Tennessee-based doctors John Minchey and John Staley. In their first and only season in the SHL, the Channel Cats won the league championship before the SHL folded after the 1995-96 season.

In 1996, following the demise of the SHL, the Channel Cats joined the Central Hockey League (CHL) for the 1996-97 season. The Channel Cats played a total of five seasons in the CHL (through the 2000-01 season), winning the league championship in the 1998-99 season.

After Minchey and Staley threatened to move the team following the 1998-99 season, the Channel Cats were sold to a local businessman, John Cherney. Cherney disliked the nickname "Channel Cats" and changed the team's moniker to the Huntsville Tornado beginning with the 2000-01 season, an unpopular move with fans. A combination of poor attendance and date conflicts with the Huntsville Flight and UAH Chargers led the organization to suspend operations for the next two seasons.

The team played as the Huntsville Tornado for one season (2000-01) before temporarily suspending operations

In 2003, Cherney helped found a new league called the South East Hockey League (SEHL) and revived team operations with the name returning to the Huntsville Channel Cats following a fan poll. The Channel Cats played in the SEHL for the 2003-04 season, winning the league championship before the league folded.

Following the demise of the SEHL in 2004, a plan was formulated to combine the remnants of the SEHL with that of the WHA2 (World Hockey Association 2, another defunct hockey league) to form a new league, the Southern Professional Hockey League (SPHL). 

At the same time, a new ownership group proposed starting a new professional hockey team in Huntsville to join the SPHL in place of the Channel Cats. Ultimately, the decision on whether the Huntsville Channel Cats would survive as an SPHL franchise or if the new proposed team would replace them fell squarely on the venue, the Von Braun Center.

In the end, the Von Braun Center selected the new ownership group's franchise as their new tenant, which was soon christened the Huntsville Havoc. The Channel Cats folded immediately following the Center's decision.


Huntsville Havoc (2004-Present)


The Havoc are one of the SPHL's more successful franchises

In 2004, the Huntsville Havoc began play as a founding member of the SPHL. To date, the Havoc have played 17 seasons in the SPHL and won the league championship in the 2009-10, 2017-18, and 2018-19 seasons. Furthermore, the Havoc have regularly led the SPHL in attendance and are considered one of the league's most stable franchises.


Soccer

Huntsville Fire (1997-1998)


The name was appropriate, considering the circumstances

In 1997, the Daytona Beach Speedkings were founded as a charter member of the Eastern Indoor Soccer League (EISL). Due to dismal attendance (selling fewer than 200 tickets per game), the team was sold and relocated just 8 games into the season, arriving in Huntsville in July 1997 as the Huntsville Fire.

Upon the team's arrival in Huntsville, attendance increased to about 2,500 per game and this number held steady through the team's second season in 1998. However, it wasn't enough to save the franchise or the league; the former suspended operations in September 1998 and the latter folded in December. 


Future

I know that the history of professional sports in Huntsville consists of more misses than hits, but that's the nature of minor league sports. However, the success of the Huntsville Havoc and Rocket City Trash Pandas (as well as the Huntsville Stars, who stayed around for 30 years before eventually moving) shows that with the right combination of ownership, venue, and fan engagement, professional sports franchises in Huntsville (and anywhere, really) can survive and thrive.

In the years since teams like the Flight, Hammers, and Fire have come and gone, Huntsville has experienced an incredible amount of growth. A generation of young, educated, affluent professionals has reshaped the area, and I think they're hungry for local sports that they can identify with.

Rendering of the proposed renovation of Joe Davis Stadium

A more tangible sign of hope may be the recent plan to renovate Joe W. Davis Stadium and turn it into a multi-purpose sports facility with the aim of attracting a minor league soccer team. I know this is a bit of a "down the road" scenario, but hey, if Huntsville keeps growing, why not? Given Huntsville's current trajectory, I don't see why this city can't be a place where professional sports can thrive.