Comma usage is the difference between this:
[image]
and this:
[image]
This has been a public service announcement about the importance of commas.
Syamantak Payra spelled his way to the top last week at the 5th annual MetLife South Asian Spelling Bee, winning the championship with the word “dghaisa.”
Now, you’re probably thinking two things. First, what in the world is a “dghaisa”? Well, apparently it’s a Maltese boat similar to a gondola. Yeah, I didn’t know either. Second, you’re probably wondering why South Asian Americans seem to have taken such a shine to spelling bees (the last five winners of the Scripps National Spelling Bee are all of South Asian descent, and now the community even has its own spelling bee circuit). As a South Asian American who happens to love spelling, I thought I’d take a stab at this one.
In my years of pondering this topic, I’ve read a number of interesting theories and been asked some pretty hilarious questions on why the South Asian American community has fallen head over heels for spelling bees. Here’s my response to the FAQs:
Are spelling bees big in India? What the heck is a bee anyway?
Despite what some people might think, no, spelling bees aren’t really a thing in India. In fact, spelling bees are very much an American phenomenon. According to the Scripps National Spelling Bee website, the first recorded usage of the phrase “spelling bee” was in the U.S. in 1875, but the etymology of “bee” in this sense is a bit of a mystery. Most sources suggest that the term “bee” for a communal gathering is a reference to the insect (being busy or social like a bee), but this is just a theory.
Is there something about Indian culture that makes people go crazy for spelling?
Um, no. I love spelling, but I can’t speak for a billion people. I never entered a major spelling bee, and my non-South Asian best friend always trounced me at the school bee.
Yes, rote memorization is emphasized in the Indian education system, but that argument oversimplifies both the education system and spelling. First, in my opinion, good spelling isn’t actually rote memorization. Nobody can memorize every word in the language, especially since so many of the words we use aren’t originally English. You have to learn to recognize the root of a word and the rules that come into play as a result (that’s why being bilingual can provide an advantage). Second, the kids competing in this bee might have South Asian parents, but they grow up in America. And when they’re standing in front of a microphone trying to spell “logorrhea,” they’re on their own.
Lastly, I’m not a fan of the “but Indians make their kids study so hard” argument, for a number of reasons I won’t get into here because that’s an essay in itself. If you’re interested, you can read more about why I have a beef with the model minority stereotype.
You still haven’t answered my question. Why spelling bees?
Okay, okay. Here goes. James Maguire, author of American Bee: The National Spelling Bee and the Culture of Word Nerds, offered up the simplest yet most convincing reason in his 2006 interview that aired on the PBS NewsHour:
“Indian-Americans are very, very strong at the bee. And, of course, an Indian-American boy won in 1985, and I think it inspired a lot of immigrant pride. I think recent Indian immigrants said to themselves, ‘Well, if one of our own can win this quintessentially American contest, then we really want to be, you know, interested in this.’ So Indian-Americans put a lot of emphasis on it.”
In other words, South Asian Americans saw somebody who looked like them win the spelling bee and thought, “Hey, I can do that!” Mind you, 1985 was before South Asians had risen to national prominence in the U.S. It was just two years before the Dotbusters committed a spree of hate crimes against South Asians in Jersey City, and Indians were often the butt of jokes on TV and in movies. I can’t emphasize enough how empowering it must’ve been to see a fellow South Asian American excel at something — anything — on the national stage.
This phenomenon isn’t unique to South Asian Americans and spelling bees, either. In Daniel Coyle’s book The Talent Code, there’s actually an entire section called “If she can do it, why can’t I?” Coyle answers the question of why so many Russian women were suddenly dominating tennis, or why a slew of South Korean women golfers were joining the LPGA Tour. The reason is that talent hotbeds developed when a single star rose to prominence and prompted others to say, “If she can do it, why can’t I?” To us, it looks like a sudden, strange anomaly. In actuality, it’s a slow and steady climb.
So there you have it. The more South Asian American kids win the National Spelling Bee, the more other South Asian American kids become interested in trying to win it. Over time, the South Asian Spelling Bee circuit has become the hotbed to nurture and grow talent. But it all started in 1985 with Balu Natarajan, a little luck, and the word “milieu.”
This week marks the 25th anniversary of when Johnny met Baby, and a leotard-clad Patrick Swayze (RIP, Patrick!) danced his way to heartthrob status. Dirty Dancing debuted in theaters on August 21, 1987, becoming one of the most beloved films of the 1980s — and my life.
Still, I can admit that Dirty Dancing doesn’t always make sense. The biggest incongruity is the nonsensical (yet oddly addictive) soundtrack. The characters boogie down to some very ’80s-sounding songs in a movie clearly set in the ’60s. Considering this major discrepancy — which I excuse due to the sheer awesomeness of Dirty Dancing — it might seem strange that for years, my biggest problem with the movie was these two lines crooned by Eric Carmen in “Hungry Eyes” (the song that accompanies the dance practice montage where Johnny starts to fall for Baby):
I’ve got hungry eyes
I feel the magic between you and I
What’s wrong with this famous refrain, you ask? Well, let’s forget for a moment that it rhymes the word “eyes” with “I” (I mean, think about how much better it rhymes with romantic words like “thighs,” “french fries,” or “cauterize”). The real conundrum is the phrase, “between you and I.”
If you’re a stickler for grammar, then you’re thinking to yourself that this should be “between you and me” — and you’re right (Grammar Girl offers a great explanation of why that’s the case, for those of you who geek out over pronouns). Basically, saying “between you and I” is like saying “between we” instead of “between us.” I know, artists often take license with grammar in favor of rhyming, rhythm, or emotion. But Eric Carmen isn’t the first or last singer to belt out that erroneous phrase with a whole lot of feeling — Jessica Simpson has a song called “Between You and I.” So why does this pesky error persist?
Well, one theory is pretty simple. As the website for Oxford Dictionaries puts it, “People make this mistake because they know it’s not correct to say, for example, ‘John and me went to the shops’. They know that the correct sentence would be ‘John and I went to the shops’. But they then mistakenly assume that the words ‘and me’ should be replaced by ‘and I’ in all cases.” In other words, it’s hypercorrection: we over-correct toward what we think sounds right.
The second theory is more complicated. A recent Lexicon Valley podcast for Slate presented a riveting debate (seriously) that managed to change my mind about the supposedly errant phrase in “Hungry Eyes.” It suggests that “between you and I” might not be “wrong,” per se. After all, it’s used more widely than “between you and me” by everyone from Shakespeare to Mark Twain to that guy you overheard on the train.
In other words, maybe the rules of language are dynamic. If a word or phrase catches on — anything from “gotta” to “bling” to “between you and I” — it becomes a part of our cultural repertoire and contemporary language. Sometimes it even becomes officially sanctioned by the Oxford English Dictionary. It might not be grammatically correct, but that doesn’t mean it’s going away. The key is knowing when it’s important to follow the rules (i.e., if it’s your job to write grammatical, accurate prose) and when it’s okay to play with them, like in music, creative writing, and even daily conversation.
After all, good writing and communication isn’t always about the rules — it’s about how it feels, kind of like dancing. As Johnny Castle would say, “It’s not the mambo, it’s a feeling… a heartbeat.”
By now, you’ve probably read a whole lot about the Millennials. As a generation, they’ve been called spoiled, entitled, needy, narcissistic, and most recently, well, totally screwed.
For a while, I’d read these articles that ragged on ‘trophy kids’ and nod my head in agreement: “Darn whippersnappers don’t know the value of a hard-earned dollar!” That is, until I had a horrifying realization akin to Bruce Willis at the end of The Sixth Sense: I AM a Millennial…!
Furious Web research ensued. How could this be?! I learned how to type on a typewriter. I made mix tapes and read newspapers. Two of my favorite movies were Singles and Reality Bites. And I still think the Dewey Decimal System is a superior method of classifying information! But there it was, staring me in the face: “Generation Y, also known as the Millennials,” born between the late ’70s and early 2000s.
I had always thought of myself as Gen Y — born just after Gen X (aka the MTV generation) but before the Millennials, the generation raised online. Other writers talked about this conundrum of feeling like the in-between generation, too. Doree Shafrir wrote a piece for Slate suggesting that we call ourselves “Generation Catalano” (if you don’t get the reference, you’re not Generation Catalano). But somehow, that didn’t seem satisfactory.
After I got through the denial, I started to wonder — why do we lump together people born 20 to 30 years apart for the sake of categorization? Do people born in the late ’70s who first learned about Facebook well after college really have that much in common with babies born into a world where it’s ubiquitous? But it seems defining generations has long been a messy science with lots of overlap. The Baby Boomers were born between 1946 and 1964. Gen X is still sometimes defined as people born between the early ’60s and early ’80s, and there’s already a struggle to define Gen Z.
The idea of defining cultural generations first rose to prominence in the second half of the 1800s. In 1863, Émile Littré defined a generation as “all men living more or less in the same time” (from The Generation of 1914 by Robert Wohl). Soon after, the naming of generations followed — the Lost Generation, who lived through World War I; the Greatest Generation, who fought in World War II; the Silent Generation, who grew up during the Depression and the war; and of course, the Baby Boomers, born after World War II.
Baby Boomers, interestingly, have a lot in common with Millennials. They came of age in a time of turmoil. They were youth-obsessed (you know, “hope I die before I get old”). They even ushered in the “Me Decade,” a term Tom Wolfe coined to describe the ’70s. And recently, the criticism has shifted away from younger Millennials (who can’t find jobs no matter how desperately they want to) and onto the Boomers as the self-absorbed ones. As Joel Kotkin’s piece for The Daily Beast states: “Boomer America never had it so good. As a result, today’s young Americans never had it so bad.”
My intention here isn’t to place blame, though — there are a lot of complicated reasons we’re in a huge mess right now. Rather, it’s to rethink how we define generations. On some level, it feels like we’re creating sports teams and pitting them against each other. Which generation will emerge the greatest of all time?
So first, let’s openly acknowledge that the boundaries are fuzzy — there’s inherently a lot of overlap. Second, let’s be okay with generations being shorter, as massive changes (you know, like the Internet) create wider cultural gaps much more quickly. When you look at it culturally, Gen Y and the Millennials really should be separate generations, and I’m not saying that because I have a problem with being a Millennial (on the contrary). After all, we are living at the start of a new millennium.
That brings me to my last point. Generations are huge swaths of time — even if you’re only talking about 15 years. Even though the purpose of these categories is to generalize about the ideas and trends that define a certain time period, let’s stop with the nasty generalizations directed at young people who haven’t even had a fair chance to self-actualize. After a while, as Erika Andersen notes in Forbes, it just makes you sound like a bad parody of that song from Bye Bye Birdie: “Why can’t they be like we were, perfect in every way? What’s the matter with kids today?”
When I first heard the news about Jonah Lehrer fabricating quotes by Bob Dylan in his book “Imagine: How Creativity Works,” I was shocked. Aside from the fact that I’ve really enjoyed reading his writing over the years, it takes serious balls to lie about Dylan, of all people.
But when I heard on Friday that Fareed Zakaria had admitted to plagiarizing sections of Jill Lepore’s New Yorker piece on gun control in his own column for Time, well, that was just completely befuddling. I mean, we’re talking about the former editor of Newsweek International, a current editor-at-large for Time, host of his own show on CNN, and a seemingly omnipresent columnist and pundit on international relations and foreign policy.
Looking at his resume, you might wonder, “How does he do it?” Well, that’s precisely the problem. At some point, it’s physically impossible to keep up with the demands for more information in today’s 24-hour news cycle. Something has to give. In the case of tech blogger Om Malik, it was his health, and he suffered (but survived) a heart attack at the young age of 41. In the case of Fareed Zakaria, it was his reporting that took the hit, and you can see the full extent of his ‘lapse’ on the Atlantic Wire.
There’d been warning signs that Zakaria was overextended — like when he gave essentially the same graduation speech at Duke and Harvard. Still, copying yourself isn’t necessarily a moral quandary — it’s just really tacky. Chances are, things came to a head, and Zakaria likely hired research or writing assistants to stay on top of the heap of assignments. Of course, that’s just speculation — though I’m not alone in my thinking. But when he says the incident “is entirely my fault,” he’s right. Whether the lapse was a result of his own reporting or of failing to review someone else’s, it was sloppy work for someone of his caliber.
Now, before it sounds like I’m going too easy on Zakaria, I should point out that plagiarism makes me furious and I think it’s a fundamental sin of journalism. A few years ago, my writing partner and I poured our hearts (and a whole lot of time) into writing a piece for Little India magazine about Indian immigrants who are leaving the U.S. and returning home. We were shocked and angry when we learned that Mona Sarika had plagiarized us extensively in an online piece she wrote for the Wall Street Journal.
But Zakaria isn’t right out of journalism school, nor is he a struggling journalist trying to make a name for himself. As the Dallas Morning News points out, Zakaria is widely respected as a first-rate thinker. I say “is” (not “was”) because despite some cries for all plagiarists’ heads on a platter, I believe he deserves a second chance. I’m surprised by my position (which I know is supported by at least one other group of editors), since I’m usually of the hard-nosed Jack Shafer school on this issue. But in this case, I can’t help but wonder if (a) there’s a systemic problem that needs to be addressed, (b) the degree of the offense and the intent behind it should matter (is this as bad as Jayson Blair?), and (c) maybe I’m a little biased and would hate to see Zakaria — at long last, a smart Indian-American personality on TV — get fired.
To be sure, he hasn’t exactly done us any favors with this recent incident. And saying that plagiarism is on the rise because journalists are too busy or we as an audience are too demanding is a total cop-out. There are hordes of skilled journalists out there who would happily share the burden but can’t seem to find work.
The good news is, we live in a world where there’s obviously still a demand for good ideas and information. The bad news is, as beat reporters fall by the wayside, it seems original reporting is getting replaced with linking off to what other people reported and calling it a day (ahem, HuffPo). And with editorial budgets getting slashed, full-time fact-checkers (who might uncover some of these transgressions in their research) are becoming a luxury for all but the most elite organizations. But that’s opening a can of worms, and I don’t expect anyone to have a simple solution to the business problems facing journalism.
At the root, this is a marketing problem — and by that, I mean Zakaria’s own desire to be the face of foreign policy news in the U.S., and the media’s desire to build up marketable personalities who can sell books and draw ratings. And for better or worse, the only real solution is personal responsibility. Call me crazy, but maybe having your own TV show and doing it with integrity is enough. Sure, you can take on the occasional side project (an article here, a book deal there) but what’s the point in biting off more than you can chew in the name of building your brand?
I hope to see work from Zakaria in the future, just much less of it. After all, when you’re trying to be Superman, you usually end up more like Icarus — burned and bruised from your nasty fall.