As you all probably know, last week, a 71-year-old ex-cop shot and killed a man inside a Tampa movie theater for texting during the trailers. Apparently he will be using the infamous “Stand Your Ground” statute in his defense, reminding us all that Florida is bizarro world where logic has died and idiocy runs rampant. Many on the internet have sarcastically wondered why we even keep Florida around, but throughout the past couple of days, I've been coming back time and time again to one simple question:
Do we really need Florida?
Are we seriously better off as a country with this flaccid peninsula jutting out from us? I mean, Florida is so bad that Complex felt the need to publish a “25 Worst Things To Happen in Florida in 2013” list, and "Man Beats Child to Rhythm of 'Blurred Lines'" only came in at #18.
But while pondering this, I decided to be as objective as I could and came up with a 100% scientific approach to solving this question…
A pros and cons list!
I’ll weigh all of Florida’s benefits with all the things that suck about Florida, and by the end, we’ll know whether we really need Florida as a state or not.
Pros:
Roy Jones Jr.! (Ben made me include this one)
The Winter Equestrian Festival! (Shauna made me include this one)
Dexter Seasons 1-4!
Cons:
Stand Your Ground
Dexter Seasons 5-8
Welp, after some careful analysis, it looks like the cons outweigh the pros, meaning we are officially better off without Florida.
And that’s not even mentioning the fact that, according to user BlueHerons on city-data.com, "You will never find ANYTHING in your price range in Sarasota."