Thursday, November 17, 2016

Crimson Tide

A county-by-county map of the results of the 2016 United States Presidential election.


A Chinese businessman once voiced his disdain for American elections, lamenting that they are "highly unpredictable" and thus difficult to plan around. To be honest, this shouldn't be surprising; businesses like predictability, and in authoritarian states like China, elections often have pre-determined outcomes. However, the 2016 United States Presidential election, held last week, proved to be anything but predictable, as every pollster, pundit, and prognosticator was proven wrong when Donald Trump won the presidency by a score of 306-232 electoral votes over Hillary Clinton. This came as a shock to everyone, Clinton and Trump supporters alike (myself included), and in the days since, people have been asking themselves the same questions: How did this happen? Why did so many people vote for Trump, in spite of his inflammatory and bone-headed comments? Did gender, race, or immigration play a role? In this post, I'm going to examine each of these questions and give you my own thoughts on the factors that led to Trump becoming the 45th President of the United States.

The campaign leading up to this election was anything but ordinary: Wild accusations, gaffes, and insults traded between the candidates became everyday news. Donald Trump, with his off-script, firebrand manner, made many inflammatory statements about his opponents, Republicans and Hillary Clinton alike. In addition, the revelation of questionable past business dealings and unflattering comments about women and minorities made Trump's campaign a daily public-relations nightmare. The Clinton campaign wasted no time using the struggles of the Trump campaign to their advantage; indeed, the Democrats' strategy for winning the election seemed to consist solely of discrediting Trump's character and his fitness to be President. Considering Trump's lack of experience (having never served previously in government or the military) and his unrestrained manner, this appeared to be a solid game plan. As the campaign wore on, Clinton's lead in the polls steadily grew, and by Election Day, it seemed like the election would be only a formality, a coronation for the country's first female President.

Unfortunately for Clinton, the United States isn't China. As usual with Presidential elections, the first few states called on Election Day carried no surprises. However, after a couple of hours, the heavily anticipated vote counts from the swing states of Florida and North Carolina started to trickle in, and they indicated that Trump was faring much better than expected. In particular, Florida, a battleground state where many expected the count could go late into the night or early hours of the next morning before a winner was declared, was actually called quite early for Trump, a surprise to many observers and an ill omen for the Clinton campaign. Before long, Trump had secured victory in both North Carolina and Florida, and began looking toward the Rust Belt swing states of Wisconsin, Pennsylvania, and Michigan for an edge. As the numbers came in, the nation held its breath as it watched the unthinkable happen: Trump, who needed a Hail Mary to flip at least one of these "reliably blue" states, ended up securing victory in all three to win the election.

I watched all of this drama unfold live on CNN. It was interesting to see the mood of the commentators change throughout the night, from confident and relaxed at the onset to concerned and confused at around the midpoint and finally to shocked and dismayed at the conclusion. Once it was obvious that Trump was going to win the election, the commentators and analysts began asking: How did this happen? After all of the terrible and off-putting things Trump had said and done, why would so many people still go out and vote for him? After these questions were posed, it wasn't long before accusations of sexism, racism, and xenophobia on the part of the American electorate began flying. Celebrities and personalities all over social media voiced their displeasure, accusing voters of sexism for having decided against electing the first female President, despite the fact that Clinton was obviously much more qualified to hold the office than Trump. On CNN, contributor Van Jones delivered his now-famous monologue in which he declared that the result constituted a "White-lash", or a vote of opposition by white people, against the increased political presence of minorities and influx of immigrants into the country.

Ultimately, these critics are wrong in their assessments of the voters and the reasons why Trump was elected. You see, voters have very practical concerns when it comes to selecting a new Commander-in-Chief. Many political observers and activists often see issues through a certain viewpoint that agrees with their ideology but may not be shared by the average voter. For example, many of these observers and activists were hoping to see the first female President elected, especially when that candidate was running against someone who had made very negative remarks about women in the past; for them, the politics of gender were front and center in this election. However, voters who had more pressing concerns (such as economic security, healthcare, taxes, and jobs) didn't have the luxury of spending their vote on the candidate who would provide the best political optics; instead, they voted for the candidate who they felt would serve them best as President.

That candidate was Donald Trump, a successful businessman who made trade reform, with the aim of bringing back well-paying manufacturing jobs, a centerpiece of his campaign, and whose experience building businesses in the private sector seemed well suited to help him do just that; by contrast, Clinton admitted that she possessed little economic acumen and promised that her economic policy would simply be a continuation of that of President Obama's administration. Did this decision to vote for Trump mean that the voters endorsed his vulgar past comments about women and, by extension, established them as sexists? Of course not. No one likes the remarks that Trump made about women, but the American voters were not voting for a role model or a best friend; they were voting for a President who could help make their lives better, and when the choice presented to them for that purpose was between Donald Trump and Hillary Clinton, they chose Trump.

Next, you have the Van Jones argument, that voters supported Trump because they hold racist and xenophobic sympathies. Aside from the obvious problems with proving that over 61 million Americans are unabashed racists and xenophobes, you can discredit this idea entirely by simply looking at the election data. In 2008, President Obama was elected over John McCain by a margin of 365-173 electoral votes and 69-60 million popular votes. In 2012, President Obama was again elected by a margin of 332-206 electoral votes and 66-61 million popular votes. As you can see, Obama was elected by comfortable margins each time. However, many of the voters who helped place Obama into office voted for Trump this time around. Does that mean that the same voters who voted for Obama twice have all of a sudden become racists in the four years since the last election? Ridiculous. A truly bigoted nation doesn't elect an African-American to the Presidency by comfortable margins twice. Of course, this kind of rhetoric shouldn't be surprising coming from Van Jones. After all, he's made a career out of race politics, so it's in his professional interest to spin every topic into a race issue. The day that race is no longer a hot-button political issue is the day that he finds himself out of a job.

Unfortunately, the Left's excuses for the loss didn't end there: The Electoral College, the media, independent candidates, the FBI, the Russians, the Democratic party, minorities, and Hillary Clinton herself have all been blamed for the loss. However, instead of tackling each of these excuses in turn, suffice it to say that the Left is currently in a tailspin, desperate to find a scapegoat that massages their political ideology rather than accepting the most obvious and accurate explanation: That their movement is simply out of touch with the average American voter. You see, through Obama's 8 years in office, the Democratic agenda has been met with a lot of success on the coasts of America; social justice and liberal progress initiatives thrived in the left-leaning states of New York, Massachusetts, California, Washington, and Oregon. But while that was happening, the heart of America (the states of Iowa, Wisconsin, Michigan, and Pennsylvania) was struggling, and was decidedly less interested in any social agenda than in economic policy, an area where it had been left behind. So, it should come as no surprise that when the 2016 Presidential Election came around and the Democrats asked these same states to vote to keep them in office, the answer was a resounding "No".

Instead, the voters in these states started their own movement, a turn in the tide, if you will, that demands change in the form of a new government that puts the people's basic need of economic security first, rather than frivolous "cultural progress" initiatives. That is what Donald Trump and the Republican Party have promised to deliver, and that is what the American people really need. With a unified Republican government in office starting on January 20th, this is the best opportunity that we have had in a long while to effect real, constructive change and progress. I'll be watching closely, because if they manage to succeed, America's best days will truly lie ahead.

Wednesday, November 2, 2016

The Everyday Heroine

From an early age, I developed an appreciation for classic movies. Whether it was the screwball comedies of the 1930s, the World War II epics of the 1940s and 1950s, or the gritty Westerns of the 1960s, something about the style and spirit of that era of Hollywood captured my imagination. Despite this affection for Hollywood's Golden Age, I never really took the time to fully explore the most prolific works and performers of the time period; whether it was a lack of time, means, or direction, this intriguing pursuit always remained an elusive endeavor. However, with a little help from various online resources (such as Wikipedia and IMDB), I've decided that the time is right to begin this oft-deferred adventure in earnest.

For this particular undertaking, I am going to select a handful of the biggest stars of the classic Hollywood era (in no particular order) and, using the American Film Institute's 100 Years, 100 Stars list as a guide, watch/research each of their biggest hits. Since there is such a large body of content to process, I expect that this pursuit will take years to see through. That said, I've elected to begin this journey with a profile of a star not listed on the aforementioned guide; a star whose peculiarities on and off the screen (as well as her odd and winding career path) carved out for her a niche in Hollywood history that is as unique as she was. For this first step of my journey through Hollywood history, I present my profile of the actress Jean Arthur.

Jean Arthur

1930s publicity photo.

Early Years


Jean Arthur was born Gladys Georgianna Greene in Plattsburgh, New York in 1900, to parents Johanna Nelson and Hubert Greene. Arthur and her three older brothers had a somewhat nomadic childhood; Hubert worked as a photographer, and his job took the family all across the country for many years before finally settling in New York City in 1915.

Silent Film Career


While working as a model in New York City in the early 1920s, Arthur was discovered by Fox Film Studios and offered a role in a silent film. Arthur accepted, moving to Hollywood and taking the stage name "Jean Arthur" after her two favorite childhood heroes (Joan of Arc, whose French name was Jeanne d'Arc, and King Arthur). Despite a promising start, Arthur was panned by critics who lamented her lack of acting talent, and she considered leaving Hollywood altogether. However, Arthur found that she enjoyed acting and decided to continue her pursuit of a career on the silver screen.

Eventually, Arthur took a job making low-budget westerns for a small studio called Action Pictures. Working conditions were harsh (as B-westerns were often shot on-location with few amenities) and the roles were stereotypical "damsel in distress" parts (which Arthur loathed, likening them to a "diet of spinach"), but the films were moderately successful and helped keep Arthur's career afloat. Her success in these films eventually won her roles in more promising projects, though she was still plagued by criticism over her talent.

Paramount Pictures Career


In 1928, Arthur signed with Paramount Pictures, a major step forward in her career. At around this time, Paramount decided to begin producing sound films, or "talkies". Arthur was hesitant to transition to sound film for several reasons, not the least of which was her throaty, "frog-like" voice. Arthur and studio executives alike were afraid that audiences would not take to her voice (which she called a "fog horn" after hearing a recording of herself during a voice test), and as a result, she was not considered for the top roles at Paramount.

Arthur and her co-stars in a publicity photo for The Saturday Night Kid (1929), one of Arthur's early films.
Left to right: Arthur, Clara Bow, Jean Harlow, Leone Lane.

After years of languishing in sub-par productions and with her career deteriorating, Arthur's contract with Paramount expired in 1931 and was not renewed. Shortly after, she was advised to move back to New York, since she presumably no longer had a viable career in Hollywood. Reluctantly, she packed up and went back east, one of many Hollywood washouts.

Broadway Career


Upon returning to New York in 1931, Arthur began to pursue a career on Broadway. While the productions she starred in drew little attention and her Broadway career was generally regarded as a failure, Arthur steadily developed her craft and was positively received by critics. Eventually, this led to her rediscovery by Columbia Pictures, who offered her a contract to return to Hollywood in 1934. Despite her lack of commercial success, Arthur considered her years spent on Broadway to be the happiest of her life.

Columbia Pictures Career


In 1935, Arthur starred in The Whole Town's Talking, her first Hollywood hit. Directed by the legendary John Ford, the film made Arthur into a star and established her screen identity as that of a hard-nosed working girl, the role type that she would be associated with for the rest of her career. Also at this time, Arthur began to bleach her naturally brunette hair a distinctive blonde that would become her iconic look.

Mr. Deeds Goes to Town (1936)


In 1936, director Frank Capra chose Arthur to star alongside Gary Cooper in his next comedy, Mr. Deeds Goes to Town, after Carole Lombard dropped out three days before the start of production. Mr. Deeds was a hit and launched Arthur into superstardom; critics raved over Arthur's performance and the chemistry between her and Cooper.

Cooper and Arthur in Mr. Deeds Goes to Town (1936)

However, during the production of Mr. Deeds, Arthur began to suffer from bouts of stage fright that would plague her for the rest of her career. In fact, Arthur would often become violently ill before shooting and would suffer crying fits between takes. Co-star Cooper stepped in to console Arthur through her anxiety; Arthur appreciated his efforts and would go on to name Cooper her favorite co-star.

Later in 1936, Arthur reunited with Cooper to film the swashbuckling western, The Plainsman. The film was a success and Arthur's performance as Calamity Jane was popular with audiences and critics (and was also her favorite role of her career). However, the success of Mr. Deeds and The Plainsman brought another problem for Arthur to the forefront: Her reclusive personality.

During her first years in Hollywood, Arthur's reclusiveness did little to impact her career, as she had yet to accrue a following. However, by the end of 1936 Arthur was a household name, and as such she was expected to give interviews, participate in photo shoots, and attend parties to socialize with the Hollywood elite (like virtually all other major stars of her day). Despite this, Arthur typically declined to appear in public and loathed almost all forms of attention, which somewhat impacted her public appeal and frustrated studio executives.

You Can't Take it With You (1938)


In 1938, Arthur teamed up again with Frank Capra to film the screwball comedy You Can't Take It With You alongside Jimmy Stewart.

Still from You Can't Take it With You (1938)
From left to right: Lionel Barrymore, Jimmy Stewart, Jean Arthur, Edward Arnold.

An adaptation of a Pulitzer Prize-winning play, You Can't Take it With You was met with significant hype; Columbia was so confident in the film that studio executives held a massive press screening prior to its general release. Luckily, the film met expectations and was very well received, going on to win the Academy Award for Best Picture for 1938. As a result of her run of successful films, Arthur was a finalist to play the coveted role of Scarlett O'Hara in the upcoming production of Gone With the Wind (though the role would famously be awarded to Vivien Leigh).


Only Angels Have Wings (1939)


In 1939, Arthur joined with Cary Grant and director Howard Hawks to film Only Angels Have Wings. Arthur and Hawks clashed during filming, as Arthur was not accustomed to Hawks' highly improvisational style of directing. In addition, Hawks asked Arthur to play her role with more subtlety than she was used to; Arthur unhappily relented and performed her role as directed. Years later, after witnessing Lauren Bacall's performance in another Hawks film (To Have and Have Not), Arthur formally apologized to Hawks, as she finally understood what Hawks was asking of her.

Grant and Arthur in Only Angels Have Wings (1939)


Only Angels Have Wings was also notable for being the debut of future Hollywood star Rita Hayworth. Arthur shunned Hayworth during the production of Only Angels Have Wings, as she saw Hayworth as a threat to her position as Columbia's top actress (a title Hayworth would indeed assume upon Arthur's retirement). Years later, Arthur came to regret her snubbing of Hayworth.

Mr. Smith Goes to Washington (1939)


Later in 1939, Arthur reunited with Capra and Stewart to film Mr. Smith Goes to Washington, another box office success.

Stewart and Arthur in Mr. Smith Goes to Washington (1939)

While the film was originally conceived as a sequel to Mr. Deeds Goes to Town (tentatively titled Mr. Deeds Goes to Washington), Gary Cooper was unavailable at the time, so Jimmy Stewart was chosen for the lead role and the film took shape as Mr. Smith Goes to Washington. Though the movie was a hit, the two stars repeatedly clashed during filming, with Arthur believing that Stewart was playing his role too "cute" and not channeling the "commanding" screen presence that Cooper had embodied. After the conclusion of production, Arthur vowed never to work with Stewart again, even going so far as to pass on the lucrative female lead in It's a Wonderful Life simply to avoid Stewart. Despite their differences, Stewart considered Arthur "the finest actress I ever worked with," and Arthur later considered Mr. Smith Goes to Washington one of her favorite films.

After filming Mr. Smith Goes to Washington, Arthur slowed down the pace of her work considerably. Starring in no fewer than 10 productions from 1936 to 1939 had left her physically and emotionally exhausted, so she negotiated an easing of her workload with Columbia, agreeing to appear in no more than seven pictures over the next five years.

Unfortunately for Arthur, her first three films following Mr. Smith would prove to be unsuccessful. 1940's Too Many Husbands, while moderately successful at the box office, was largely overshadowed by its sister production, Cary Grant's My Favorite Wife. Arizona, also in 1940, attempted to reclaim some of Arthur's fire from The Plainsman, yet it failed spectacularly. 1941's The Devil and Miss Jones (produced by Arthur's then-husband Frank Ross) also underwhelmed.

Despite beginning the 1940s in a slump, Arthur remained quite popular with audiences, and her fortunes would soon turn around.

The Talk of the Town (1942)


Grant, Colman, and Arthur in The Talk of the Town (1942)

In 1942, Arthur starred alongside Cary Grant and Ronald Colman in The Talk of the Town. Directed by George Stevens, The Talk of the Town sees Arthur play an innocent schoolteacher caught between two suitors vying for her affections: a radical political activist (Grant), on the run after being accused of arson, and a stuffy law professor (Colman). While more of a dramatic comedy (or "dramedy") than the pure screwball comedies she was best known for, Arthur shone in her role, and the film was a hit, performing well at the box office and even earning an Academy Award nomination for Best Picture. Arthur worked particularly well with Stevens, as they had similar approaches to their craft; Arthur would later name Stevens her favorite director.

The More The Merrier (1943)


McCrea, Coburn, and Arthur in The More The Merrier (1943)


After their success together with The Talk of the Town, Arthur and Stevens reunited in 1943 for The More The Merrier, a comedy about a young lady who finds herself unexpectedly sharing an apartment with two men during a housing shortage in Washington, D.C. Co-starring Joel McCrea and Charles Coburn, The More The Merrier was another hit, both commercially and critically. But perhaps most importantly, for her performance in the film, Arthur finally earned her first Academy Award nomination for Best Actress (which she ultimately lost to Jennifer Jones).

The reasons that Arthur had been consistently overlooked by the Academy over the course of her career (in spite of her excellent performances) were varied. Firstly, her best performances were often overshadowed by those of her co-stars (such as Gary Cooper in Mr. Deeds and Jimmy Stewart in Mr. Smith). In addition, Arthur's best roles were in her comedies, which never fared particularly well with the Academy. And finally, for an actor or actress to win an Oscar, it was expected that their resident studio would do quite a bit of politicking on their behalf. Unfortunately, the head of Columbia (Arthur's studio), Harry Cohn, was no fan of Arthur's; the two of them fought titanic battles over contracts and picture assignments (Arthur was quite picky about the films in which she would agree to appear and often rejected assignments, angering Cohn to no end), and as a result, Cohn felt no obligation to appeal to the Academy on her behalf. Nonetheless, modern film historians feel that Arthur's lack of recognition from the Academy is one of the great injustices in Hollywood history.

Retirement


Arthur in Shane (1953), her final (and only color) film appearance

After filming The More The Merrier, Arthur appeared in two more films (A Lady Takes a Chance in 1943 and The Impatient Years in 1944) to fulfill her contract with Columbia (neither film attracted much attention). Eager to leave the pressure and unwelcome attention of Hollywood, Arthur reportedly ran through the studio's streets while exclaiming "I'm free! I'm free!" on the day her contract expired.

While Arthur was convinced to come out of retirement for a couple of one-offs (A Foreign Affair in 1948 and Shane in 1953), the conclusion of her Columbia contract essentially marked the end of her film career. After her retirement from Hollywood, Arthur returned to Broadway for some limited stage work. Aside from a modestly successful production of Peter Pan which ran from 1950-51, Arthur's second stint on Broadway was an utter failure.

In 1966, Arthur attempted a short, ill-fated comeback on TV with The Jean Arthur Show, which was cancelled after 11 episodes. Afterwards, she decided to try her hand at teaching drama at Vassar College, where she instructed a future Hollywood star, Meryl Streep. Though the novelty of having a Hollywood star as an instructor was intriguing for both the school and students alike, clashes between Arthur and the faculty over her teaching style brought her final artistic endeavor to a premature end.

With every meaningful pursuit of her post-Hollywood life having ended in failure, Arthur became as reclusive as ever, retreating to the confines of her Hollywood estate. Arthur later died from heart failure on June 19, 1991 at the age of 90. In accordance with her wishes, no funeral service was held. Arthur's remains were cremated and scattered off the coast of Point Lobos, California.

Legacy


At a time in Hollywood when female roles were reserved largely for either damsels in distress or femme fatales, Arthur's screen presence as a worldly, hard-nosed working girl blazed a trail for the actresses who came after her. In addition, Arthur's up-and-down road to stardom inspired many caught in the depths of Hollywood obscurity not to give up on their dreams.

1935 publicity photo.

At the height of her career, the labels "The Everyday Heroine", "The Quintessential Comedic Leading Lady", and "The Queen of Screwball Comedy" were all ascribed to Arthur. In the years following her retirement, Arthur gradually came to be known as "The Actress Nobody Knew"; her distaste for publicity stood in stark contrast to the attention-hungry culture of Hollywood, and much of her personal life was shrouded in mystery (she largely lived as a recluse and had no children). That said, I believe that Arthur's legacy lies not in her reclusion or even her trailblazing career path, but simply in the endearing spirit in which she performed. Whether bringing to life the romance of Mr. Deeds Goes to Town, the comedy of You Can't Take it With You, or the drama of Only Angels Have Wings, Arthur presented a natural charm on the silver screen that is worth remembering.

Tuesday, September 6, 2016

Where Eagles Dare

"Anyone. Anywhere. Anytime." This was the tagline for the University of Southern Mississippi athletics program for a few years in the early 2000s. While this marketing campaign has since ended, the mindset this slogan encapsulates has always been, and continues to be, at the heart of Southern Miss athletics. This spirit was put on display again this past weekend, as Southern Miss played one of the best games in its history and defeated another Goliath.

From the beginning, Southern Miss has held a reputation as a giant-slayer. Whether it's ending Bear Bryant's 57-game home winning streak in 1982, defeating #6 Florida State on the road in 1989 (featuring a quarterback by the name of Brett Favre, no less), or upsetting #7 Houston in the 2011 Conference USA Championship Game, the Golden Eagles have demonstrated a willingness and an ability to effectively compete against any opponent, big or small, throughout the program's history.

This tenacity was on display again this past weekend, when Southern Miss went on the road to take on Kentucky. While the Wildcats are far from an SEC powerhouse, the size and resources of that program still dwarf those of Southern Miss by a considerable margin. Add in the fact that the Golden Eagles would be playing in a hostile environment and you can forgive anyone for doubting our prospects for securing a win.

That said, Southern Miss wouldn't be where it is today without having prevailed in such games time and again. Every Southern Miss fan knows the history of the program and the ability of our teams to deliver hard-fought wins even in the most daunting of situations, and this year's squad is no different: Not only does it have the talent it needs to compete, but it also possesses the resolve to fight to the end, no matter what.

That resolve was tested early. The normal Week 1 jitters, the pressure that comes with playing in a hostile environment, and Kentucky's superior size took their toll on Southern Miss in the first half, which at one point saw the Golden Eagles down by a score of 35-10. Hanging with a bigger opponent on the road is one thing, but a 25-point comeback against the same? Let's just say even this Southern Miss fan was ready to throw in the towel.

Good thing I wasn't asked to lead this team. Unlike yours truly, the Golden Eagles didn't entertain the notion of giving up and instead came back swinging. The offense, led by QB Nick Mullens and RB Ito Smith, lit up the scoreboard with 34 straight points, whilst the defense, led by DL Dylan Bradley, held the Wildcats to nothing. In the end, the game was a 44-35 victory for Southern Miss and the largest come-from-behind win in the program's history.

While this win was exciting (to put it lightly), it comes by no means as a surprise: This kind of grit is in the Southern Miss DNA. You see, Southern Miss is a place where the underdogs, the misfits, the rejects and all who want to play hard for a chance at success can come to compete. Southern Miss is where that determination and spirit is channeled into one victory after another over the likes of Alabama, Auburn, LSU, Florida State, Ole Miss, Mississippi State, Houston, TCU, Louisville, Kansas, Virginia, and now Kentucky, even as the gulf between the "big" and "small" schools continues to widen. Southern Miss is where you'll find that the college football spirit is alive and well.

To put it simply, Southern Miss is where Eagles dare.

Thursday, June 9, 2016

Golden NuGet

One of the hassles of software engineering is finding the right libraries for your project. Need jQuery? Until recently, you had to be prepared to scrounge the web for a jQuery host, download an archive containing the code, unzip the files, copy them to your project directory, add the references to your project, and then hope and pray that you've got the correct version. Sound like a headache? Trust me, it is... and that's just jQuery, one of the more popular and readily-available libraries that many engineers use on a daily basis. Need Bootstrap? iTextSharp? jQuery UI? AngularJS? Be prepared for more of the same. Luckily, Microsoft recognizes this problem and has prepared a nifty little solution that takes the pain out of locating and managing these commonly-used libraries.

This solution is a service called NuGet (pronounced "new-get"). NuGet consists of two distinct components, the first of which is the NuGet Gallery, which is essentially a warehouse for commonly-used libraries for .NET, such as jQuery, bootstrap, and others. Using NuGet, developers don't have to search across the web to find the libraries they need; instead, the NuGet Gallery functions as a one-stop source for these libraries and others, housing over 600,000 different libraries (called "packages" by NuGet) in all at the time of this writing.

The second component of NuGet is the NuGet Package Manager, a plug-in that is included with Visual Studio. With the NuGet Package Manager, you don't even have to leave your IDE to get the libraries you need. Better yet, in addition to downloading your desired packages on demand, the NuGet Package Manager will also automatically install your packages to your solution by copying the code files to the appropriate directories and adding the references to your project file. Still need jQuery? Instead of following the tedious steps I mentioned earlier, simply open your project in Visual Studio, go to 'Tools' -> 'NuGet Package Manager' -> 'Package Manager Console', and then type the command 'Install-Package jQuery'. There ya go! NuGet Package Manager will automatically download the latest version of jQuery from the NuGet Gallery and install it to your project. Pretty nice, eh?
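Under the hood, NuGet also keeps a record of what it has installed; in classic .NET projects, this takes the form of a small manifest file called packages.config at the root of the project. Here's a minimal sketch of what that file might look like after installing jQuery (the version number and target framework below are illustrative examples, not necessarily what you'll see on your machine):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- One entry per installed package: the package id, the version
       that was installed, and the framework it was installed against -->
  <package id="jQuery" version="2.2.4" targetFramework="net452" />
</packages>
```

This manifest is what lets NuGet know exactly which versions your project depends on, which comes in handy for the update and uninstall scenarios below.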

Not only does NuGet make your life easier by streamlining the process of installing packages, but it makes maintenance a breeze, too. Let's say some time goes by and you want to make sure that you're using the latest version of jQuery (or any other package). NuGet Package Manager can check for available updates to all of the NuGet packages included in your project and install them automatically. To do this, simply re-open the NuGet Package Manager Console and type the command 'Update-Package'. Presto! All of your NuGet packages are now updated to the latest available version.

Finally, NuGet makes removing packages easy, too. Eliminating unneeded libraries and cleaning up deprecated references is a necessary and often tedious process. NuGet takes the pain out of this by "cleaning up" when a package is uninstalled by removing the package code and references automatically. Decided that you don't need that jQuery package? Just open the NuGet Package Manager Console and type the command 'Uninstall-Package jQuery'. NuGet will now remove all of the jQuery code and references, leaving your project as sparkly-clean as it was to start.
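Taken together, the full package lifecycle described above comes down to three Package Manager Console commands. Here's a quick console sketch using jQuery as the example package (actual output and version numbers will vary):

```
PM> Install-Package jQuery      # download the latest jQuery from the Gallery and add references
PM> Update-Package              # update every NuGet package in the project
PM> Uninstall-Package jQuery    # remove jQuery's code and references
```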

In summary, NuGet is a very powerful service that does a lot to make the jobs of .NET software engineers easier and improves the overall integrity of a solution. The degree of convenience that NuGet brings to package installation, maintenance, and uninstallation cannot be overstated, and I applaud Microsoft for supporting their developers by releasing such a great tool. Now, if they could just develop one that'll automatically prepare all of my documentation for me...

Saturday, May 14, 2016

Key Feature

I have a confession to make: Back in college, I acquired a disturbing habit. That is, I began to regularly lock my keys in my car by accident. I don't know if it was the stress of my coursework or some other external force that caused me to pick up this tendency (as I had never done it in high school); I only know that I ended up making a humiliating call to my mother and/or a locksmith to come to my rescue on multiple occasions as a result. However, while I may not have gotten any smarter over the years, cars apparently have.

My new car, a 2016 Ford Fusion Titanium, uses a keyless "fob" (possibly from the German word "Fuppe", translated as "pocket" - basically just a small RF remote) for entry and exit in lieu of a traditional key, meaning that I typically don't have to take my "key" out of my pocket during the normal course of my commute. While this has certainly reduced the risk of me locking my "key" in my car, I do occasionally find myself taking it off my keychain for some reason (usually to hand to a hotel valet or an attendant at a carwash).

Now, you may presume that in these situations I would be vulnerable to the same mishaps as before, but surprisingly enough, my car has me covered. As it turns out, when I attempt to lock my car, it checks to see if the fob is still inside; in the case that it is, the car refuses to lock and honks the horn to inform me that the fob is within. While this feature may not seem like a dealmaker, it definitely bailed me out at my local Publix earlier today when I separated my fob from my keychain, dropped it in my cup holder, and later proceeded to attempt to lock my car as I was disembarking (I had just left a carwash where, as usual, I had to separate my fob from the keychain).

Needless to say, I owe the engineers at Ford a big "Thank you" for this feature; without it, I would probably be helping myself to a serving of Humble Pie right about now. Sometimes, it's the smallest things that make all the difference.

Friday, May 6, 2016

STEMulus Package

The April 2016 jobs report came out today. It indicates that U.S. employers added 160,000 jobs this past month, the lowest rate in 7 months as economic growth is moderating. President Obama touts this as a success story, declaring that he has added 14 million jobs since taking office, 3rd-most among U.S. Presidents (behind FDR and Reagan). However, the real story is one level deeper.

While President Obama's figures are accurate, the one statistic that is conspicuously absent is the quality of the jobs that are being added. If you dig a little deeper, you'll see that the jobs lost during the recession were mostly middle-class manufacturing jobs. On the other hand, most of the jobs added since have been entry-level and temporary service-sector jobs. This helps explain why while jobs are being added, wage growth has been stagnant.

This is a bad deal. Replacing good jobs with poor ones isn't going to do the economy much good in the long run. The good news is that there are plenty of good jobs available in the U.S. The bad news is that we have a skills deficit that is preventing us from filling those jobs, because unlike the jobs lost during the recession, these require education and training.

Case in point, it is projected that the U.S. will add over 1.5 million positions for Software Engineers over the next 10 years. However, the U.S. is only graduating about 40,000 Software Engineers per year. Do the math (40,000 graduates per year for 10 years comes to just 400,000) and you'll see that at that rate, we'll be short about 1.1 million Software Engineers. It goes without saying that these are good, lucrative jobs; jobs that are ripe for the filling but that we simply don't have trained people to take.
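The arithmetic behind that shortfall is simple enough to sketch out (the figures below are the projections quoted above, not data of my own):

```python
# Projected demand vs. supply of U.S. Software Engineers over the next decade
projected_openings = 1_500_000   # new positions expected over 10 years
graduates_per_year = 40_000      # U.S. Software Engineering graduates per year
years = 10

supply = graduates_per_year * years          # engineers we'll actually produce
shortfall = projected_openings - supply      # positions left unfilled
print(f"Supply: {supply:,}; Shortfall: {shortfall:,}")
```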

The problem is that we have a skills management issue. The jobs of the 21st century economy are here and they're good ones, but they require skills that our people aren't getting. Only 8% of all U.S. college students are in STEM (Science, Technology, Engineering, and Math) programs. By comparison, in China, where the educational system does much more to emphasize careers, about 30% of all college students are studying STEM. If we don't have enough Americans to fill these STEM positions, guess where they're going to go?

At the same time we're facing this problem, our political leaders have chosen to fight over - wait for it - raising the minimum wage. Seriously? Compared to the real employment problem we're facing, the minimum wage is little more than a distraction. Instead, we desperately need to focus on giving our people the skills and training they need to succeed in today's economy.

In a nutshell, our problem is that we have a 20th century workforce and a 21st century economy. Fixing this begins with education, where we need to emphasize STEM as a cornerstone for success in today's world; there's no reason why a developing nation should be graduating more engineers and scientists than the U.S. We already have the most advanced and capable higher education system in the world; if we can use it in conjunction with the private sector to give students the skills they need to be successful today, sub-2% economic growth, wage stagnation, and underemployment will quickly become the issues of yesterday.

Thursday, April 21, 2016

As You Like It

I've always had a soft spot for theatre. Whether it's the fond memories of being in my church's Christmas plays as a kid, seeing the talent of the actors on display in person, or the emotional connection the performance establishes with the audience in a way that cinema simply cannot recreate, I've always felt a special appreciation for the stage. And this past Saturday evening, I found myself rediscovering the joy of the theatre when I attended the University of South Alabama Department of Theatre's production of Shakespeare's "As You Like It".

One of Shakespeare's lesser-known works, "As You Like It" is a crowd-pleasing comedy at heart; it tells the story of a pair of star-crossed lovers, Orlando and Rosalind, who each separately flee their homes to escape persecution and seek refuge in the Forest of Arden, where they proceed to join up with an exiled Duke and his merry band of followers who have taken up residence there. From there, the comedy takes shape in earnest as intrigue, plots of revenge, and the fickleness of the heart play havoc with the characters' attempts to find true love. Along the way, the cast delivers some of Shakespeare's most popular lines ("All the world's a stage" and "Too much of a good thing", among others) whilst musing on the merits of love, reconciliation, and the simple pastoral life.

"As You Like It" is light-hearted and fairly fast-paced; so fast, in fact, that early on I was worried that I may have trouble keeping up with the plot. However, these fears were soon alleviated as I slowly adapted to the Shakespearean dialogue and began recognizing characters and scenes from the plot summary I had read the night before (I know that's cheating, but I'm new at this!). Once I got settled, I was quickly immersed in the production; the trademark Shakespeare humor was riveting and the cast pulled it off wonderfully with only the occasional slip up (hey, 17th century English is tough!). Brianna Bond's and Zachary Fitzgibbon's lead-role performances were excellent; both Bond's witty, calculating Rosalind and Fitzgibbon's youthful, lovesick Orlando captured the audience's emotional investment and held it throughout the production.

That said, my personal favorite performance of the night was Abigail St. John as Celia (Rosalind's cousin and best friend à la Mercutio in "Romeo and Juliet"). St. John's performance exuded an energy that went unmatched on the stage and she very quickly established herself as an audience favorite (despite her relatively small role). And hey, to be completely honest, I find it to be a pleasant surprise when a member of the supporting cast can steal a bit of the spotlight (Will Ballard's Jacques and Blake Waters' Touchstone were each good as well).

All-in-all, I thought that this was a great production. The performances of the actors, the charm of the script, and the intimate atmosphere fostered by the compact theatre in the Laidlaw Performing Arts Center all combined to make my first visit to the theatre in well over a year particularly satisfying. The University of South Alabama's Department of Theatre has certainly demonstrated its quality, and this group of students is top-notch as well. I'm definitely looking forward to becoming a regular patron; after all, each ticket sold is not only a pass to an enjoyable evening, but also supports the artistic endeavors and educational initiatives of a local institution. Having a good time for a good cause? That's how I like it.

Tuesday, March 15, 2016

Trial and Error

In my career, I've been through quite a few interviews; some have gone well, a few have gone poorly, and a handful have left me downright puzzled. To be sure, I'm not the world's best interviewee; thinking on my feet is not my strong suit and I tend to lock up under pressure. As a result, I feel I may have missed out on a few opportunities for which I thought I was qualified and well-suited because of a poor interview, and I know I'm not the only one. In fact, people regularly lose out on opportunities not because of a lack of ability or aptitude, but because of poor interview practices and simply succumbing to the pressure of the moment. That said, a new trend is emerging (pioneered by the IT industry) that has the potential to rectify this issue for good... but at a high cost.

The problem that every recruiter faces is that they have to accurately gauge a candidate's ability to perform a job, compare him/her to others, and ultimately decide which one(s) to hire. This is particularly difficult in the IT industry, where technical skills are vitally important but difficult to measure in a traditional interview. To address this, some companies are experimenting with an approach that I call "trial" hiring.

In trial hiring, a company will simply hire an applicant for a "trial" period in lieu of performing a formal interview. During this trial period (typically two weeks), the candidate in question will perform his/her job as normal while being evaluated by a manager. At the end of this trial period, the manager will make a decision on whether or not to hire the candidate permanently. The idea (and the benefit) is clear: candidates get to show off their skills doing real work (as opposed to submitting to an arbitrary and potentially inaccurate technical examination), giving the prospective employer a good idea as to whether or not they would be a good fit while also allowing the candidate time to develop a feel for the workplace (making it easier for him/her to decide if it's the right place for them).

A downside of this approach is that for the trial period, the candidate is unpaid. Companies don't want to invest capital in unproven quantities and definitely don't want unqualified candidates signing on for a quick paycheck (especially if the position is a particularly lucrative one). However, on the other side, the candidate is giving up a full two weeks of his/her time to perform full-time work for which he/she will not be paid. This could be a deal breaker for many candidates who simply can't afford such an arrangement; traditional interviews typically take no more than a few hours and yield a definite yes/no answer within a day or two, but the trial hiring approach essentially stretches that process out for the duration of the trial.

Additionally, there may be a situation where multiple candidates are being evaluated simultaneously for a single position; in this case, each candidate is essentially in competition with the others for the job. This kind of competition could foster a "Survivor"-like environment that would not be beneficial for the candidates or the company (I've seen this firsthand in IT consulting). For example, the candidates, under pressure to win a position, might elect to work harder than normal and to do so at an unsustainable pace; in the end, they might be giving their future employer an inaccurate reading on their potential for long-term productivity and/or cause themselves to "burn out" (that is, to lose interest in a job and look for opportunities elsewhere), in addition to all of the other ill effects that accompany excessive work under stressful conditions. Furthermore, candidates in this situation may also become actively hostile and attempt to "sabotage" one another by making unwarranted negative or damaging statements about a competitor or actively disrupting a competitor's work; such actions can tarnish a company's reputation and drive away quality talent (see this article written about Amazon in the New York Times).

Personally, I'm split on the idea of trial hiring. On one hand, I do like the fact that candidates get to perform real work to demonstrate their skills and thus have a greater hand in determining the outcome of the process. However, the time and compensation loss is a tough pill to swallow, not to mention the unpleasantries that could ensue should a candidate find himself/herself in a competition with others for the same position. To help mitigate these two drawbacks, I would recommend that companies offer at least some compensation for the work performed during the trial period (a single check at a predetermined percentage of the regular rate for the position, for example) and that companies also avoid placing more than one candidate at a time on trial for a single position (to keep from forcing candidates into direct competition with one another).

In the end, is trial hiring right for corporate America? The strategy has already elicited strong opinions on both sides of the argument (for both moral and practical reasons), but it is still far from mainstream. Ultimately, as with all practices, the effect of trial hiring on the bottom line will provide the clearest indication of its future in business.

Wednesday, March 2, 2016

Trump Card

With Super Tuesday behind us and another 595 delegates awarded, Businessman Donald Trump looks to be unstoppable in his quest to become the Republican Party's nominee for the 2016 Presidential Election. Currently, Trump holds commitments from 319 delegates, almost a hundred more than the next closest candidate. Polls currently have Trump ahead in most of the upcoming primaries/caucuses, including the crucial March 15 contests of Florida and Ohio; wins in those two states could virtually guarantee Trump the nomination.

Many in the Republican Party are concerned about the implications of a Trump candidacy both for the party and the nation; his eccentricities and character flaws are well-documented (so I won't bore you with rehashing them here), and most Americans not directly supporting his candidacy harbor a highly negative opinion of him. This has brought into question his ability to win enough "mainstream" support to win a general election, as well as the impact he will have on the image of the RNC going forward.

In this unfamiliar scenario, three questions come to mind: What, exactly, has led to Donald Trump's rise to prominence? What mistakes did the other Republican candidates make that allowed Trump to obtain the mantle of frontrunner? And what can be done now to prevent Trump from winning the nomination? I'll explore each of these questions and present my answer for each one.

From Novelty to Frontrunner

For all of his shortcomings, Donald Trump has one thing going for him: He has a larger-than-life persona. Whether it's because of his multi-billion dollar business, his reality TV show, or his foray into politics, everyone knows Donald Trump. However, I think the more interesting question is this: What does he represent? What does a New York City jet-set billionaire have that makes him so appealing to the American blue-collar Average Joes that they will turn out in huge numbers to support him?

Simply put, many Americans are upset. They're upset with the economy, where low-skill jobs that offer middle-class wages are quickly disappearing. They're upset at the state of the world, where the threat of terrorism is at its highest point in years. And they're upset with the government, which has been stuck in political gridlock for years. These same Americans see Donald Trump as the answer to these issues, an outsider who isn't afraid to clean house and make the bold moves that'll get the country running again without regard for political expediency (as opposed to the other candidates, who ostensibly represent the status quo). After all, why wouldn't someone who possesses the acumen to build and run such a successful business empire on sheer force of will be a good fit for President of the United States?

Misunderestimated

At the beginning of the race, the other Republican candidates didn't see Trump as being a serious threat. After all, they were the established figures in the party and he was a political newcomer who had to pay people to attend his campaign announcement! How could anyone take him seriously as Commander-in-Chief? In fact, you could see this dismissive attitude toward Trump in the early debates where the other candidates scarcely acknowledged his presence and instead took potshots at the Democrats and one another. In retrospect, this was a missed opportunity to cut Trump down to size early.

In addition, the fracturing of the Republican field hasn't helped. Trump's support comes mainly from those looking for a "non-establishment" candidate and he has rallied almost all of those voters behind his campaign. However, the non-establishment supporters are still outnumbered by the "establishment" supporters; should these voters rally behind a single candidate (like the non-establishment voters have done for Trump), then that candidate would likely win the party's nomination. The problem is that there were/are quite a few establishment candidates and each has been hesitant to drop out; Rand Paul only dropped out after Iowa, Chris Christie didn't drop out until after New Hampshire, Jeb Bush waited until after South Carolina, and John Kasich's campaign is still active.

While these candidates haven't won many delegates, they have kept support away from Ted Cruz and Marco Rubio, the two who could conceivably defeat Trump one-on-one. Had the Republican candidates who knew from early on that the nomination was unwinnable dropped out then, more support would have been opened up to rally behind a non-Trump candidate.

No Lead is Safe

As imminent as a Trump candidacy looks now, it still isn't a foregone conclusion. Many elements of the Republican Party are strongly opposed to him and he still needs to win over 900 additional delegates to clinch the nomination. In addition, Cruz and Rubio had strong performances on Super Tuesday (Cruz winning Oklahoma, Texas, and Alaska, and Rubio winning Minnesota), which could be a telling sign that Trump's campaign can be defeated. Should the remaining non-Trump candidates win enough delegates to prevent Trump from clinching the nomination before the convention, this would force a "brokered" convention.

In the event of a brokered convention, the delegates are free to vote for whichever candidate they choose. In this scenario, the delegates who were previously pledged to candidates other than Trump will need to coalesce around a single "alternative" candidate to defeat Trump. Since the non-Trump delegates will outnumber the Trump delegates in this scenario and are unlikely to support Trump, this would effectively give the nomination to the alternative candidate (likely Cruz or Rubio).

In conclusion, while Donald Trump's rise to the front of the RNC race has been remarkable (for better or for worse), the fact is that it's the stresses that today's world has placed on everyday Americans that have helped put him there. In addition, missteps by the Republican establishment early in the race allowed Trump to shore up his support and build a lead to the point where he's nearly insurmountable. However, all is not lost: Trump can still be defeated and there are still many more primaries/caucuses yet to take place. Should the Republican candidates (and their supporters) play it smart from here on out, there is a possibility that they could Dump the Trump.

Friday, February 19, 2016

Taking Back TV

When I was young, I was amazed by cable TV. So many channels! Nickelodeon, Cartoon Network, Fox News (yep, I was that kid), and hundreds of others? To someone who grew up with only our local ABC and FOX affiliates, that kind of selection was unbelievable. However, what I didn't understand at the time was that the seemingly unlimited buffet of viewing options laid out before me actually represented a heavy-handed industry practice that often left consumers high and dry. That said, recent changes in the industry landscape are making it possible for consumers to reclaim their fair share of control over TV.

Let's start with how the cable TV industry works today. Channels are owned and produced by entities called Content Owners (or simply "owners"). Owners include companies like Viacom, Disney, Turner, and Discovery Communications. The owners, in turn, offer their channels to Content Carriers (or simply "carriers") to deliver to customers. Carriers include DirecTV, Dish Network, Comcast, Cox, Charter, and others. The carriers are charged a fee by the owners for each channel they carry (called a subscriber fee) which varies by channel and is based on the number of customers who subscribe to each channel (Channel A may have a subscriber fee of $0.50 per subscriber and Channel B might have a fee of $0.35 per subscriber). This fee is largely what determines the prices the consumers pay when subscribing to a cable service.
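The economics above are easy to sketch in a few lines of code. Note that the per-subscriber fees here are the hypothetical figures from the example above, not real contract rates:

```python
# Hypothetical monthly per-subscriber fees a carrier pays each content owner
subscriber_fees = {"Channel A": 0.50, "Channel B": 0.35}
subscribers = 1_000_000  # customers subscribed to a package carrying both channels

# The carrier's monthly content cost for this package; this cost flows
# directly into the retail price consumers pay for their cable service.
monthly_cost = sum(subscriber_fees.values()) * subscribers
print(f"Monthly content cost: ${monthly_cost:,.2f}")
```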

Now, here's the sticking point: If you have a cable subscription, you likely weren't allowed to pick out which individual channels you wanted. Instead, your carrier asked you to pick a "package" containing a pre-determined assortment of channels. Hmm... when you think about it, that does seem a little strange. After all, why should you have to pay for hundreds of channels that you care nothing about if you just want the handful that you actually watch? This has been a major pain point for consumers over the years; besides, what other industry makes you do that? And how is that fair? If you want to go to the grocery store and buy a Coke, does the store make you buy a Pepsi, too? The more you think about it, the more you might be upset at your carrier for not letting you select your channels à la carte.

However, the reason the carrier sells to consumers in this way is because that is precisely how channels are sold to them by the owners. For example, Disney owns ESPN, ABC Family, and the Disney Channel (among many others). DirecTV may want to carry ESPN, but not ABC Family or the Disney Channel. However, it is Disney's policy that they will only sell these channels together, so if DirecTV wants ESPN, they're going to have to pay for (and carry) ABC Family and the Disney Channel, too. In addition, as part of the agreement between Disney and DirecTV to carry these channels, Disney may dictate that DirecTV must offer all three in the same bundle; that is, DirecTV cannot offer consumers a package that includes ESPN but not ABC Family and the Disney Channel.

Simply put, if you're wondering why you're paying $70 per month for 200 channels when you only watch about 5 or so, this is it. You have no choice but to buy all of these channels from your carrier because your carrier has no choice but to buy them that way from the owner. Pretty raw deal, huh? Well, the good news is that change is on the horizon.

With the rise of high-speed internet, new content delivery services like Netflix, Hulu, and Amazon Prime have started taking over consumers' viewing habits. Increasingly, Americans are spending more time watching content using these services than through cable. In fact, a small but quickly growing number of consumers (called "cord cutters") are simply cancelling their cable subscriptions and getting all of their content over the internet. This new trend has forced owners and carriers to change their business plans. Now that consumers have more choice in the market, the practice of forcing them to overpay for their content is starting to become a liability. The first evidence of this shift in business strategy is being seen in a new service called Sling TV.

Sling TV, launched in January of 2015, is an internet-based TV service that provides channels like ESPN, A&E, AMC, CNN, History, and others for a much lower price than traditional cable ($20 for the base package, as opposed to the average Cable TV subscription which costs $70). The way Sling TV accomplishes this is by focusing on offering only a small selection of popular channels, which significantly reduces subscriber fees. In addition, since Sling is entirely internet-based, consumers can watch TV over any internet-connected device including tablets, cell phones, PCs, gaming systems, and media set-top boxes.
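The savings pitch is easy to quantify with the round figures quoted above (and the "about 5 channels watched" estimate from earlier in this post):

```python
cable_monthly = 70    # average cable subscription, ~200 channels
sling_monthly = 20    # Sling TV base package
channels_watched = 5  # channels a typical household actually watches

annual_savings = (cable_monthly - sling_monthly) * 12   # dollars saved per year
cable_per_watched = cable_monthly / channels_watched    # effective cost per watched channel (cable)
sling_per_watched = sling_monthly / channels_watched    # effective cost per watched channel (Sling)
print(annual_savings, cable_per_watched, sling_per_watched)
```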

Win-win, right? Well, almost. Sling TV is still in its infancy, and as such, there have been some growing pains. First off, content delivery over IP (the internet) isn't as smooth or reliable as delivery over cable. As a result, unstable feeds and even short-term blackouts have been regular occurrences. In addition, the bandwidth requirement to maintain a quality feed is pretty high (no official measurement, but the general rule is that about 50-75 Mbps or higher is needed), so unless you have a relatively strong connection, your best bet may be to stick with cable for now.

Still, for all of its shortcomings, Sling TV is a huge step forward for the industry. With it, the marketplace has choice and the cable TV establishment has competition. That, together with a consumer base that is largely dissatisfied with being taken advantage of and is willing to give new players in the industry a look, constitutes a strong start to the quest to take back TV.

Monday, February 15, 2016

Fight or Flight

Last week, Carrier Corporation (an air conditioner manufacturer) announced that they would be moving their plant in Indiana (where about 1,400 people are employed) to Mexico. While many see this as an example of corporate greed that should be punished, I believe that it is a symptom of the pressures of a global 21st century economy and an indictment of the outdated regulation of business in the United States. In fact, I think there's an opportunity here to dig deeper into the driving factors behind this move and to consider what steps can be taken to alleviate the pressure that corporate America finds itself under today and prevent this kind of corporate flight in the future.

Now, before I go any further, let me say that anytime anyone is laid off this way, it's a tragedy; I found myself in a similar situation just recently, and let me tell you, it's no fun (to put it lightly). That said, there are three points I'd like to make in this post: Why this happened, what can be done to prevent more moves like it, and why I believe the "progressive" approach is mistaken.

First off, the story here is probably one that you've heard many times before: The cost of labor and regulation in the United States is very high, so to remain competitive, Corporation XYZ has decided to relocate operations to country ABC, where regulation is almost non-existent and people work for a buck and a quarter. While this may be a tired line, it's still pretty accurate. In Carrier's case, the cost of labor included a minimum $23/hour base wage plus a comprehensive suite of benefits for every worker. That's a pretty high cost (especially the benefits), so when you consider that, by comparison, Mexican laborers will perform the same work for a quarter of the pay and a fraction of the benefits, it's a wonder that Carrier stayed in the United States as long as it did. In addition, when you note the costly regulation that manufacturers must follow in the U.S. (worker benefits such as ObamaCare and various EPA rules), the capital expense of day-to-day business is extraordinarily high. These factors considered, the move makes perfect business sense.

So, what can be done? For starters, understand that generally speaking, businesses actually like to stay in America as opposed to moving abroad. In fact, the vast majority of businesses will elect to remain stateside even if it means absorbing a somewhat higher cost of operation. However, manufacturers in particular (like Carrier) are under an existential threat from global competition; for them, it's no longer about how good a "Made in the USA" sticker will look on the box of the product, but whether the company is going to survive another 5 years. That said, there are a few things that can be done to lower the cost of doing business in the United States.

First, lower the corporate tax rate. Right now, the base U.S. corporate tax rate is 35%, the highest in the developed world. This rate needs to be cut to 20%, which is closer to what other developed countries charge. Secondly, end double taxation. Currently, profits made by U.S. companies overseas are taxed there, and then taxed again (at 35%) when the capital is brought stateside; the net result is that companies keep profits offshore and invest over there rather than here. Instead, once profits have been taxed, allow them to move across borders freely. Finally, curb the power of organized labor. In the past few decades, labor unions have grown very powerful and effective at forcing companies to award their members excessive benefits (usually through threat of strike or legal action). Instead, the entire country needs to be "Right-to-Work", making it illegal for unions to force company employees to join as a condition of their employment (as it is already in most states). This step alone will compel labor unions to bring their demands more in-line with real market value.

Finally, the progressive side of this discussion has their own ideas of how companies that move operations overseas should be treated. In a nutshell, they believe that companies who move operations offshore should be punished with fines, tariffs, and other sanctions. That threat, they reason, will keep companies from moving abroad and force them to stay in America. However, the fault in that logic is that no company wants to be kept prisoner by the government. If the government is willing to impose such harsh penalties on a company that relocates, what else would it be willing to do? Such targeting and corporate prosecution does not create an environment conducive to sustained economic prosperity and will instead accelerate the trend of corporate flight as more businesses will seek to escape such tyranny.

In addition, imposing tariffs or other protectionist penalties is likewise a folly, a relic of the 19th century. In a global, 21st-century economy, protectionism may prevent short-term loss, but at the cost of long-term development. For an example, consider Japan: once a leading figure in the technology sector of the post-World War II global economy, Japan has suffered through consecutive decades of economic stagnation. Tentpoles of the Japanese economy like Sony, Sharp, Nintendo, and Panasonic are struggling to catch up to Korean, Chinese, and American competition and are losing cash and market share at an alarming rate. A major contributor to these companies' predicament is the very protectionist policies they sought to have implemented; Japan imposes more penalties on imports than any other developed economy. As a result, Japanese corporations faced little domestic competition, and in its absence they failed to adapt to foreign competition. Companies like Microsoft, Apple, LG, Samsung, and Google now sell more products in Japan than domestic companies, which are left playing a very difficult (and costly) game of catch-up. We don't want the same thing to happen in the United States.

In conclusion, the realities of competition in a global economy are pressuring U.S.-based companies to make tough decisions in order to remain viable. However, common-sense steps can be taken to help corporate America compete effectively in the 21st century without resorting to the punitive approach of the progressive movement. The United States is still, and can always be, a great place to do business.

Friday, February 12, 2016

Pullin' Gs

Recently, AT&T announced that it has begun rolling out its new 5G network on a trial basis in Austin, TX, with Verizon following suit with its own field tests later this year. The expectation across the industry is that 5G services will be widely available within the next few years.

So, what's so great about 5G? For starters, AT&T is claiming that 5G can deliver data at speeds from 10 to 100 times faster than 4G. Fast enough, in fact, for wireless cellular-based internet access to supplant the fixed cable-based system used in most American residences (much like the new startup Starry Internet). In addition, 5G is designed to be more software reliant, meaning that future upgrades should be faster and easier; rather than replacing a large amount of hardware, the network can be updated by simply deploying new firmware. Should these claims come to fruition, we could be looking at a seismic shift in the internet service landscape in America. After all, who wouldn't want to download a movie to their phone in seconds (as opposed to hours) and dump their expensive cable internet connection for a faster (and presumably cheaper) wireless solution? While all of that does sound good, I do harbor a few concerns.
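To put that "movie in seconds" claim in perspective, here's a quick back-of-the-envelope calculation. The 4 GB movie size and the 20 Mbps 4G baseline are assumed figures chosen only for illustration; the 100x multiplier is the upper end of AT&T's claimed range:

```python
# Rough download-time comparison for a 4 GB movie (assumed file size)
# on a typical 4G connection versus the claimed 100x 5G speedup.
movie_gb = 4.0
four_g_mbps = 20.0                 # assumed typical 4G speed
five_g_mbps = four_g_mbps * 100    # upper end of AT&T's 10-100x claim

movie_megabits = movie_gb * 8000   # 1 GB = 8,000 megabits (decimal units)

t_4g = movie_megabits / four_g_mbps   # 1600 seconds, roughly 27 minutes
t_5g = movie_megabits / five_g_mbps   # 16 seconds

print(f"4G: ~{t_4g / 60:.0f} min, 5G: ~{t_5g:.0f} s")
```

Under those assumptions, the same download drops from about half an hour to well under a minute, which is why 5G is being pitched as a potential cable replacement.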

First off, one problem the wireless industry has been grappling with for years is capacity. Since wireless networks are relatively new and almost everyone owns a smartphone (which, in turn, typically employ multiple processes and applications that consume a substantial amount of data on a daily basis), cellular carriers have been concerned about how much traffic their networks can effectively handle. To prevent the networks from becoming too congested, most cellular data plans include data caps to discourage consumers from using too much data. While these usage-based caps have proven effective in preventing any widespread data delivery issues (at least according to the carriers), they remain a major point of contention for consumers. After all, what's the point of having a phone that I can stream a movie on if I'm just going to hit my monthly data cap a quarter of the way through?

In this way, increasing speeds is like putting a bigger engine in a car without increasing the fuel capacity: You'll go faster, but not farther. Likewise, while network speeds have steadily improved over the years, data caps really haven't budged. Personally, I would venture to say that most users are much more satisfied with their connection speeds than their data usage limits; after all, my cellular connection speed from my desk at work is 16.5 Mbps (with a 3 GB monthly cap), which is not much slower than my 20 Mbps DSL connection (with no usage cap) at home. However, it seems as if the major carriers have spent much more time and effort attempting to improve network speed and coverage rather than capacity; I can only guess as to why this is (possibly a better return on investment?), but I can tell you that unless the rollout of 5G is met with a significant data cap increase (or outright elimination), the new technology is going to be greeted by consumers with resounding indifference.
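The engine-versus-fuel-tank point is easy to quantify using the speed and cap figures from my own plan above:

```python
# How long until a 3 GB monthly cap is exhausted at a sustained
# 16.5 Mbps connection (the figures from my work plan above)?
cap_gb = 3.0            # monthly data cap in gigabytes
speed_mbps = 16.5       # measured connection speed in megabits per second

cap_megabits = cap_gb * 8000          # 1 GB = 8,000 megabits (decimal units)
seconds_to_cap = cap_megabits / speed_mbps
minutes_to_cap = seconds_to_cap / 60

print(f"Cap exhausted after ~{minutes_to_cap:.0f} minutes of full-speed use")
```

In other words, running that connection flat-out would burn through an entire month's allowance in under half an hour, and a 10-100x speed bump only shrinks that window further unless the caps grow with it.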

Secondly, could coverage be extensive enough to make 5G a viable replacement for land-based internet? Certainly, building and maintaining the infrastructure would be easier; instead of laying untold miles of cable, a carrier could simply install towers that each deliver wireless internet access across a large area. However, as with cellular connections today, natural and man-made obstructions could come into play. For instance, what about customers who live in rural, heavily wooded areas? The obstructions caused by trees, hills, and other geographic features would degrade the wireless signal, potentially delivering a consumer experience inferior to that of traditional terrestrial internet. That said, it's worth pointing out that these very consumers already struggle mightily to obtain traditional cable internet due to the expense of building out the infrastructure (a problem 5G would presumably not suffer), so this point could well be moot.

Finally, how much would all of this cost? While major investments in new technology are, by nature, very costly, let's be realistic: consumers are very price-conscious and aren't likely to quickly adopt something that is too expensive. For example, once 5G coverage starts to be built out, we'll see a new generation of 5G-compatible phones hit the market. It's not outside the realm of possibility that the carriers, in an attempt to recoup some of their investment, may elect to place a 5G "tax" on each 5G device purchased through them (though they'll likely call it something like an "access fee"). Alternatively, in a more conspicuous manner, a carrier may simply increase the cost of its data plans to achieve the same end. And if a home internet access plan is offered, how would its pricing compare to traditional cable internet? Earlier I presumed it would be cheaper since the infrastructure would be less expensive to build and maintain, but the initial rollout of any new service incurs significant cost; early adopters may be compelled to foot the bill for that investment.

In conclusion, while I do think the introduction of 5G technology is exciting and a big step forward, I'm going to be careful about getting too enamored before my concerns are addressed; after all, I'd rather take cautious enthusiasm than risk crushing disappointment, haha. As always, feel free to let me know what you think and what questions or concerns you may have about the 5G rollout.

Wednesday, February 10, 2016

Granite State of Mind

Last night, New Hampshire held its Presidential primary and unsurprisingly (at least, if you've been following the polls), Bernie Sanders and Donald Trump each scored resounding victories. And while this is only one state and the nomination process is far from over, I think that the results of last night's primary tell us a lot about the state of the election.

Let's start on the Democratic side. Well before her candidacy was even officially announced, everyone knew that Hillary Clinton was going to be in the running for the nomination. There was a lot of excitement among Democrats at this prospect; after all, they had just successfully voted the first black President into office to much fanfare, so it follows that the novelty of putting the first woman into office would be too much to resist.

However, novel as that idea may be, Hillary is tied very closely to the Democratic "establishment" (traditional party center where most of the party support and resources are focused), which has become very unpopular since the last election. This is especially true with young Democrats, who feel that they were betrayed by the broken promise of "Change We Can Believe In" when they elected Barack Obama (the Affordable Care Act, or "ObamaCare", was far from the health care revolution it was sold as, and the rest of Obama's time in office has proven unproductive).

Instead, young Democrats have embraced the "non-establishment" (standing within the party but apart from the "establishment") candidate Bernie Sanders, a self-described socialist who has made initiatives like universal single-payer health care, tuition-free higher education, and the dissolution of major financial institutions the main points of his campaign. Sanders' more revolutionary goals of an expanded welfare state and higher taxation strike a chord with young, idealistic voters who are still recovering from the recession and want to see more social guarantees. From here on out, the Democratic nomination looks to become a pitched battle for the party's future.

On the Republican side, Donald Trump has finally broken through and scored the big win that many had been predicting since he rose to frontrunner status. After his upset loss in Iowa, some observers questioned whether the polls showing huge national support for Trump were accurate, but the New Hampshire primary seems to have put those doubts to rest. As on the Democratic side, there is a divide in the Republican party between the establishment and the non-establishment, with the latter group showing some real muscle after propelling Cruz and Trump to victories in Iowa and New Hampshire, respectively.

The real question now is whether another Republican candidate (Rubio, Kasich, Bush) can round up enough establishment support to defeat Donald. Before this primary, I would have told you that Rubio had the best shot at doing just that, but his gaffe at this past Saturday's debate (repeating himself three times while being mocked for it by Christie) seems to have hurt him; he finished 4th in New Hampshire after being projected to finish 2nd. However, now that Christie and Fiorina are both dropping out, more mainstream Republican support should start freeing up to coalesce around a candidate who can serve as an alternative to the Trump firebrand.

That said, the South Carolina debate is this Saturday (8:00 p.m. CST on CBS) with the primary following a week thereafter. As we saw in New Hampshire, a debate this close to a primary can have a significant impact on the outcome and this field is tighter now than it has been in quite some time. If you have the opportunity, be sure to tune in!

Tuesday, February 9, 2016

Millennial Chess

Hey, everyone!

Welcome to my new blog, Millennial Chess! Here, you'll find my thoughts on a wide range of subjects, encompassing sports, technology, entertainment, politics and other aspects of modern life. Simply put, my goal is to provide a civil, welcoming destination for anyone interested in analyzing and/or discussing the topics of the day. As such, please feel free to comment on any of my posts you may find intriguing!

Just to give you a little background about myself, my name is Garrick Aube (pronounced 'obey') and I'm from a small town in south Mississippi called Poplarville. I started blogging on-and-off through high school, but as the normal pressures of college (and the start of my career thereafter) began to pick up, I gradually fell out of the habit. As of this post, it's been about 6 years since I last attempted to blog in earnest, but now that life seems to have calmed down a bit, I've decided the time is right to take another swing at it!

That said, as of right now I really don't have a set schedule for when or how often I'll try to post, so please bear with me while I try to get my authorial processes restarted, haha. Still, I hope that you all enjoy your time here and feel free to let me know what you think of the blog once I get up and running.

Thank you, and welcome again to Millennial Chess!