Embracing Failure

The silence on this blog is glaring. Crafting yet another introduction to yet another apology for silence fills me with a familiar feeling of fraudulence, a nagging insecurity about my repeated inability to embrace the constant grind that being a writer requires.

Yet I can embrace one thing: the knowledge that the root of all of this silence comes from my persistent willingness to value success over failure. That is, my silence stems not from an inability to speak, but from an inability to accept the failure that inevitably comes from each and every attempt I will ever make to express myself wholly. Good ideas do not emerge intrinsically; they emerge from iterations and iterations of an idea, and it takes a lot of failure to get to a successful idea worth sharing. And while I could have been iterating on failures, I was, instead, finding short-term successes in my career. Much of my daily job requires producing successes, which means that much of my work happens in the form of ephemeral face-to-face interactions or in short, satisfying e-mails. I am not shaming myself for immersing myself fully in this work (it has been the right thing to do at this point in my life), but I also want to embrace more fully the work required to write.

That is, today, I want to embrace the work of failure. I’m starting this by writing what’s going to be something of a failure at a blog post.

Why am I already saying that this blog post is a failure? I’m saying it’s a failure because it does not adhere to any of the conventions of a blog post. Its title is not that catchy; it won’t get caught on any search engine optimization filters. The only way anyone will ever read or find this is if I share it through social media. Otherwise, it’ll just be a part of the block of Internet content noise. Fine. So be it. I’m embracing it.

It’s also a failure because it’s not going to compel you to do anything. It won’t offer you an action item; it’ll simply offer you some understanding of how I’m feeling about my writing now and my identity as a “writer.” Maybe that’ll be worthwhile to you. But this post is still something of a blogging failure precisely because it doesn’t explicitly offer you much of anything. I’m aware of your needs as an audience, but I’m not really addressing them right now. I’m going to apologize – yes, sorry – but I know that apology probably seems a little insincere because I’m not really going to do anything to remedy this failure of a blog post. It’s going to live as a little failure.

I’m deliberately producing a failure to help myself embrace it. What I’ve learned from my various identity pivots over the years – from thinking of myself as a creative writer, to a journalist, to a burgeoning professor, to a “higher education professional” – is that boxing myself into successes and expecting myself to produce my best possible work at all times is not possible. My pivots have been possible because I’ve been willing to “fail” and have, instead, turned my failures or falterings into futures. I’ve not always been entirely comfortable with that, but remaining flexible, trying new things, and discovering new strengths has, much to my surprise, made me pretty happy.

I’m going to fail a little more on this blog soon, making this my space to keep experimenting with ideas and to see what’s working and what’s not. This is the only way I’ll continue to learn and grow.


Citation Woes

I sheepishly admitted something embarrassing to a member of my dissertation committee yesterday: I still use Microsoft Word to take notes and keep track of citations.

“Jenaaaaae,” he groaned. “Seriously? It’s time to get with the twenty-first century.”

For those of you new to the citation software game, Microsoft Word is not twenty-first century. It’s kind of like the AOL of citation management; if you’re still using it, it means you’re old or incompetent.

For those of you skeptics wondering, “Why should it matter where citations get stored? Can’t they just be copied and pasted from document to document? Wouldn’t that be easier?”

The resounding answer to that is: “No. No no no. Why would you do that to yourself? No.”

Why is the resounding answer a series of emphatic “no”s? Well, because it takes way longer to hunt down various citations from random files in random places on one’s computer. Copying and pasting citations from various documents (rather than storing them in one place) is kind of like putting your clothing in a bedroom closet, in a bathroom medicine cabinet, and in the refrigerator. Why would you ever do that? You wouldn’t.

And I know this. I’ve known this. I downloaded EndNote, a citation management program that our library lets students download for free, when I first started graduate school. I felt very pious about it and I quickly learned the basics. It wasn’t hard to use, but it felt cumbersome, like an additional step in the already-long and concentrated process of writing up research.

Here and there, I’d halfheartedly enter in some citations for papers I was working on, telling myself, “You’ll thank yourself for this later.” The problem was that I never got into a habit of it, and I found that for my seminar papers and my prospectus, I was often working against deadlines that made the simple task of entering a citation into EndNote feel like an enormous chore. Though I recognized the looming, long-term disadvantage of copying and pasting citations for the rest of my days, the short-term incentive to finish writing and move on to anything else sounded better.

It also doesn’t help that I’ve found great soothing comfort in a Word document. It is, after all, the only word processor I’ve ever known, and it’s pretty much done everything I’ve ever needed it to do (except crash on me on several occasions; it never, never had to do that).

So the thought of abandoning the warm familiarity of a long, blank scrolling page (ha, who’d have thought I’d ever describe a blank page as “warm?”) has just made me want to cuddle up in all of its mainstream, moneyed glory even more. In other words: old habits die hard.

But indeed, it is time to get with it, this twenty-first century business, and actually manage my citations in a reasonable, organized, deliberate, and totally un-crazy way.

During this meeting, my committee member showed me Zotero, and while it is in many ways similar to EndNote, I think it may actually suit my working style a whole lot better and (one can only hope) actually motivate me to store my citations in a reasonable and systematic way.

The biggest advantage of Zotero for me is that I can save readings directly from my browser into Zotero. With the version of EndNote I got from the library, this isn’t possible (though perhaps newer versions of EndNote do this?).

Another huge advantage is that Zotero includes a note-taking tab alongside each stored citation, which allows me to write endlessly in a vertical view. In EndNote, the “notes” section is a single bar that also scrolls endlessly, but it scrolls horizontally rather than vertically. This might seem like a totally nit-picky thing, but as a reader, I find it much less cumbersome to read down than across.

Zotero also features a dynamic tagging view that will allow me to sort my articles and books by tags that I determine. Again, EndNote does this too, but with Zotero, the tags look a lot more like ones I’d see on a blog.

In the scheme of things, these might seem like some small differences, but learning and becoming attached to new tools is part of becoming a better researcher. The more flexible I can become in how I write and store information, the better!

My Hesitation to Talk about Research is Not About You.

Whenever I am asked, “So, what’s your research about anyway?” my stomach dips. My mind goes blank for a moment; what is my research about? Do I even know? Sometimes, I feel like the project I’m working on is so big that I don’t know where to begin and there’s a certain dread I experience in trying to capture this project in a few sentences. So, my answer typically meanders through some qualified, fuzzy statements, like “Well, you know, it’s like…”

Let me just assure you of this: my hesitation is not about you. It’s all about me.

After I passed my qualifying exams, I thought that I would finally feel comfortable with the big research question. After all, I managed to convince a committee of five professors of my competence; surely, I could convince others of the same. Yet months after my qualifying exam, I somehow feel more insecure than ever about explaining my research.

The reason for my insecurity is simple: the more I learn, the more I know that I don’t know very much at all. The more I start to probe my research questions, the more depth, complexity, and texture they seem to take on. I feel as though I’ve cracked open a large egg and have just discovered that the creature inside is not just a lizard but a fire-breathing dragon. I’ve got to get myself one hell of a shield.

Feeling like an incompetent nincompoop is a classic learned person problem. It’s impossible not to feel like one when your days are filled with reading, writing, reflecting, and asking questions. In fact, at the beginning of every graduate seminar I took, the professor always asked the participants to introduce themselves and their major fields of interest, and every time, my peers and I would have to qualify our interests with statements like, “Well, I’m not really an expert or anything, but…” Arguably, all of us in that room were well on our way to being experts, yet none of us could own that title. It felt uncomfortable. It still feels uncomfortable.

The thing is, the only way I’ll be able to convince other people that I’m doing more than picking lint out of my navel these days is by giving the dreaded 2-minute version of my research. People like Nicholas Kristof have called for making academic research accessible to the public, and I completely agree. I don’t struggle very much with colloquial language and making ideas accessible, but I find I struggle with condensing an argument and making it clear without either obfuscating the point or over-simplifying it. In other words, I don’t want to misrepresent an idea, but I also don’t want to get bogged down.

What’s the solution here? I’m not sure yet, but here’s my call to you if you’re reading this (and if you are, it probably means you know me in person because – let’s be real – this blog is mostly getting circulated through my Facebook friends): ask me about my research. Keep asking me. The more practice I get, the less uncomfortable I’ll feel, and hey, maybe the more coherent I’ll become too. I want to be able to share ideas and discuss them with you, whether you’re an academic or not. I might be afraid to do so and I might seem a little weird about it, but ultimately, I’ll be grateful. If I don’t seem grateful, just point me back to this blog post and allow me to eat my words. They’ll be delicious.

New Year Resolution

I have a common New Year’s story for you. It starts with an evening of carnivalesque revelry, of glitter and alcohol, of photographs and finger foods. It ends with a chorus of declarations of hope for the New Year, of promises to do things better and to make the next year even better than the last. It’s an evening with a lot of hope; there’s no one who doesn’t love the idea of a fresh start.

The fresh start begins with closed shop windows and quiet streets. Everyone sleeps off the night before. People bustle in their kitchens, cleaning up the messes, eventually floating over to big desk chairs in front of their computer screens to keep refreshing the e-mail inboxes for work e-mails that won’t arrive, at least not until tomorrow. The hope still lingers, and it’s supplemented by the creeping reminder that the very next day – January 2nd – will be another calendar day where we’re all still responsible for the things we promised we would be even before a New Year began.

The resolutions we set and remain mindful of on the 31st and 1st don’t disappear entirely perhaps, but they go forgotten. Yet this is not the kind of forgetting one does with misplaced car keys. This is often a willful forgetting, a forgetting done because it’s much more convenient to forget than to remember. It’s much easier to say, “I forgot my last year’s New Year’s Resolution” than to assert “I didn’t succeed” or even worse, “I didn’t feel like following through.”

These aren’t stories with resolutions of their own. Rather, they’re beginnings to familiar stories, cliches really, that remind me how much I, too, fall into the typical traps and patterns of New Year traditions. On the one hand, I find New Year’s Eve, the celebration of all of the possibilities of the year to come, to be one of the most joyous, reflective holidays of the year. Yet it’s also a holiday with a severe letdown, a booming crash after skating through the holiday season. Above all, it’s sobering.

I’ve been particularly sobered by the realization that this coming year is going to mean much more than the years prior. This is the year where I really will be forced to define for myself who I want to be in the university. How do I want to define myself? How will I produce work to show that I can define myself in the ways I so desire?

I’m afraid. So, my New Year’s Resolution turns out to be quite simple: Don’t Be Afraid. Be Courageous.

New Year’s Resolutions have a particular way of sounding maddeningly cliche, and mine falls into this unfortunate trap, but I think I (and I don’t think I’m the only graduate student in this position) have been so caught up in anxieties about who and where I will be in the next few years that I have not tackled who I am and who I want to be right now. I have been afraid of the outcomes of my choices before I have even made the choices in the first place. Hence, I resolve to tackle my time in graduate school with courage, to make it through and finish what I’ve started, and not to worry about the outcomes, not yet. This doesn’t mean I should turn a blind eye to the reality of higher education, but that I should still make the most of the opportunities I have, relish them, and allow these opportunities to be positive, proud moments for me in 2014.

Make it New

I’m a little self-conscious about the title of this post, mostly because I don’t intend to talk about Ezra Pound or Modernism here, and in fact, I find both the poet and the movement a little tiresome. This is for a number of reasons, the primary one being the whiplash from reading dozens of Modernist texts for my preliminary exams.

That said, Ezra Pound’s famous expression “Make it New” is perfect in my mind. To me, “Making it New” is essential to surviving in graduate school.

Let me take a step back for a moment: like so many others, I’ve found my graduate school experience simultaneously gratifying, exhilarating, and disheartening. I’ve had moments of great confidence and empowerment and perhaps even more moments of self-doubt. I just recently erased the subtitle on my to-do list, “Happenings of a Continued Adolescence.” Though it was initially written in jest, I realized I was reinforcing my own self-loathing and insecurities, and I just couldn’t have that.

Though I recognize that graduate school is necessary career training and apprenticing for a life in the academy (or, increasingly, in other kinds of professional institutions and research settings), I can’t shake the feeling much of the time that I’m doing college all over again, but with higher stakes and more responsibilities. It’s a lot like treading water, endlessly egg-beating, with only the potential for the rescue ship to come in.

Yet I have to remind myself (frequently) that I’m very happy here, reading and writing, and though it may feel like I’m staying in one place much of the time, I am (very slowly and often imperceptibly) moving forward, building skills, and becoming a sharper thinker and communicator.

One way I remind myself of this is by reminding myself of what, indeed, makes me happy. In college, I kept a gratitude journal for a little while, trying to map what made me feel happy during a time where I felt very sad. So, even today, I look at what I’m grateful for: my flexible schedule, my time to read as many books (and whatever books) I want and reflect on those books, my ability to exercise on a regular basis, my proximity to a beautiful campus, and the opportunity to work with dynamic and motivated students. These reminders keep me pushing through.

That said, to really dig into and find joy in reading and writing, I’ve found that I truly also need novelty. Let me start by saying that, as a general rule, graduate students are rarely doing anything truly novel on their own. Rather, graduate students often (nervously) emulate advisers, hero theorists, and trendy trailblazers in the field.

Yet we need to feel like our work is new and discover what makes it new: that’s what builds scholarship.

For me, post-preliminary exam (and for those following at home: I passed), I’m revisiting a lot of old work and trying to make it new again. I’m not necessarily revising work or outputting much for others. Rather, I’m trying to make it new for myself. I’m trying to remind myself of why I’m in graduate school in the first place and why the topics I’m purportedly interested in (life writing, writing about technology, writing with technology) are even interesting to me at all. Then, I want to make my work new to others. But I’ve got to convince myself first.

Let me make one thing clear: I’ve never been convinced that my ideas are uninteresting. However, I’ve had many days recently when I’ve been reading books and have taken a pause to wonder: why am I interested in this? What is actually of interest here? What am I noticing that’s new? What am I noticing that’s exciting? I can usually think of something, come up with a million questions, and then never manage to reach one conclusion about them. Days often go from exhilaration about a new book or question to complete despair about whether there’s anything of value to say about anything. Do questions even matter anymore? Do books matter? AM I AN INTERESTING PERSON??

This spiral is not impossible to recover from, but it’s one that also leads to the larger, looming purpose of making work new again: writing a dissertation.

I mean, here’s the thing. I don’t know how to write a dissertation. If a dissertation walked up to me, looked me in the eyes, and said “Cheerio!”, I don’t think I’d recognize it.

This shouldn’t be news to me. But it is.

I hadn’t put it together until recently that I’m at the scary-exciting moment of figuring out how to interpret a new genre of writing. This is something I haven’t had to do in a long time and I’m now facing the questions I have to tell my own writing students to face all of the time: what’s the purpose of this writing? Who’s the audience? What are the constraints? What’s your exigence for writing?

So, what have I been doing to a.) make my ideas feel new again and b.) tackle an unfamiliar genre?

Well, in addition to reading, I’ve been revisiting writing I know how to do. I started penning a short story for the first time in forever, and I’ve been drafting up some short articles that I can (hopefully) submit to GradHacker. In other words, I’ve been trying to rediscover what I like not just about reading, but also about writing in the first place. Why do I care so much about writing? Why is knowing how to write – and knowing how to write well – something that endlessly fascinates me? Perhaps more importantly, why do I still think I’m a “good” writer and how can I continue to validate and enjoy my own practice?

These are questions that scare me because they require deep introspection, an opportunity to consider my values and my everyday practices. At the same time, this is a rare opportunity, one that I know will be important to all of the choices I make in the future. I hope that making it new is a practice that will soon feel very practiced and very – well – old.

In Which I Will Not Be Afraid!

“You’re going to come running back to academia,” a colleague assured me when I described to her my summer internship working in a Technical Communications department at a company in San Diego.

Maybe so. I received a whole packet of documents today with the types of reports I’ll be expected to write. And now? I kind of feel like this:

As in, wait: am I prepared to do something at which I could potentially fail?

So, OK. Wait. You need more information before you can understand my quick surge of panic this evening.

Potential projects for me include improving the usability of help information for breast cancer imaging software, creating a more interactive, educational platform for understanding electronic health records software, and collating a series of articles about electronic health records use into one more cohesive space.

This is all very cool stuff! These are the sorts of projects that could:

  • Make doctors’ lives easier!
  • Improve patients’ ability to get the results they need to be healthy!
  • Save HR departments from having to lead terrible training sessions!

So, real world solutions! Cool! I don’t often get to say that my work inspires tangible change in a working environment that – and we’re about to get real – SAVES LIVES. (Though come on, my understanding of esoteric literary theory should clearly impact your outlook on your digital reading/writing practices. I wrote this great essay on lolcats; you should read it some time).

But I’ll admit it: I’m scared of working in a field with which I am not comfortable and familiar. This anxiety is clearly the vestige of some serious “straight-A student syndrome”; I’m compelled to pursue projects in which I feel that success is within my reach. This is the first job I’ve undertaken where I don’t feel comfortable with what I’m doing. I’m going to have to learn on the job and – well – maybe fail a few times.

I could go into any number of hackneyed aphorisms about this. One must not try; one must DO. (I’m on a Yoda kick tonight if you didn’t get that already. Looking for inspiration in all of the right places).

What am I going to do?

1.  Get over myself. And promptly.

2.  Look for some points of familiarity. I will say that upon looking through the project documents sent to me, I did see some places that I could contribute my knowledge. Many of the documents (especially the step-by-step help guides for using the mammogram software) were driven very much by the logic of a page. That is, while they are conversational in tone (typically a good start to making help text accessible), they’re a little – well – verbose. Clearly, I can sympathize. Verbosity is always my inclination.

But I thought a lot about my discussions in my Literacy and Technology class concerning the relationship between content and design and there are certainly design issues at stake here. So, I could certainly do some re-design work if nothing else.

3. Employ my love of organizing and re-organizing. There’s probably nothing I love more than a great spreadsheet or a clean table. A lot of the writing could probably be organized into the sweet, sweet symmetry of a table! I suppose this is still an issue of design, but if my background in literature has provided me with one practical skill it is the ability to distinguish main points from blocks of dense text. So, while the language of these documents is difficult for me to understand, I can typically distinguish the purpose of the pieces I read.

So, those are my strategies for not being such a fail-fearing wimp. Who knows? Thinking through ways to work through these challenges is compelling. Plus, I have a new work environment to anticipate. I happen to crave novelty. Maybe the office will even feel like this:

One can only hope.

The Shortcut Divide

“Wait, wait, what is it that you highlighted? Do I click here?”

“Do I need to get the YouTube BEFORE I make the post?”

“Hold on: what are you doing exactly? What button did you press?”

These are questions I hadn’t thought about.

“Oh, um, you just highlight the URL. You know, the long series of words that are in this bar – yes, this one.”

“It’s the button with the music note and the camera. You see that? There?”

“So, I just clicked the ‘video’ button. Yeah, where it says ‘video.'”

It’s easy to take for granted the processes online that feel so natural to those of us who have used computers for as long as we can remember.

Yesterday, I attended the last portion of a WordPress workshop for middle school and high school history teachers led by UC Davis Digital History developer Phillip Barron. I was asked to give a small presentation on the functional literacy project I developed for UWP 270 as well as any tips or experiences I had about working with WordPress.

Frankly, I’m not sure I had too much to offer (I still have so much to learn myself!). The main piece of advice I gave was not to have fear. Looking out at the group, mostly absorbed in their laptops, I couldn’t help but remember that feeling I get almost every time I sit in front of a computer and start a new project. For the most part, when I sit with my hands upon the keys, trying to figure out how to tweak the code of a webpage, I alternate between fear, frustration, and impatience. I know that it is within my control to change whatever it is that I want to change, but I do not intuitively know how to manipulate space on a screen. It takes me a long time to figure out how to do something new.

I felt really impressed with the group that was there because I know how challenging it is to change the way that we think about our work. Learning is fun. Learning is exhilarating. But much of the time, learning is also really, really hard.

So, there was a part of me that felt like a huge hypocrite standing at the front of the room, speaking about pushing past fears as if it was something I had already done. But after I gave my “talk,” I helped individual teachers work on their webpages, and I realized that I did have something to offer: patience. If there’s something that teaching has taught me, it is the patience to help people when they’re struggling, to listen through struggles, and to find solutions.

I realized that even as I struggle and complain about my own technical incompetency, I still have the major advantage of being young, of growing up knowing how to copy and paste, of understanding what a hyperlink and a URL are, of opening up millions of tabs on my screen and being able to effortlessly click among them. The concept I found I helped the participants with the most was just that: copying, pasting, and choosing the right buttons for certain actions.

For many of the things participants wanted to do, I had already found shortcuts. And that, I realized, was the source of the divide between my experience with computers and many of these workshop participants’ experience with computers: I felt comfortable enough with the fluid processes of using a computer and browsing the Web that I wasn’t afraid to take shortcuts and cut corners. The workshop participants, however, wanted every step laid out in sequential order; they wanted to know exactly what to press and when to press it.

There’s clearly no wrong way to use a computer, but the desire for linearity, for seeing all of the steps, for making the process of using a computer transparent surprised me a little bit. It shouldn’t have, of course. After all, what have I been reading about all year? The inherent discomfort in our postmodern tendencies to shake up the order of our lives, to see our lives and experiences as disconnected and fragmented. The shortcuts might seem easier, but for someone entering a new world and a new way of thinking, they can simply be baffling.

If nothing else, yesterday was a good reminder of why I’m so fascinated with and interested in the intersections between literacy and technology: the way we use the devices so central to our lives changes the ways that we think about leading our lives. I think after yesterday, I’m even more motivated to think closely this summer about how I’m going to adapt to teaching in a computer classroom and how I might consider integrating 21st-century literacies into my formal writing class.

Fear and Self-Loathing in the Humanities

Greetings!

It’s good to see you. Take a seat. Oh, you’re already seated? Well, grab yourself a cup of tea. Make yourself comfortable.

Ever since DML, this blog has been sadly neglected. This was not for lack of things to write about. In fact, I have a folder of bookmarks full of exciting links I gathered over the past few weeks from SXSW Interactive, a new (and very fascinating) advertisement from the Guardian, and the buzz over Curator’s Code. Oh, and don’t even get me started on the chitter-chatter circulating re: Mike Daisey’s falsified Foxconn tour. For those of you who haven’t listened to it already, it is absolutely worth your hour to check out the “Retraction” episode from This American Life, all about the fact-checking process on Mr. Daisey’s story and, of course, the “defense” from Daisey himself. This goes not only for those interested in technology, but for anyone interested in issues of journalistic ethics and – well – what it means to capture an experience “truthfully.”

The great thing about writing about technology is it seems that there’s no lack of things to talk about. New stories arise every day. Gadgets are exciting! The development of new technologies (typically) means the cultivation of new ideas or at least the streamlining of old ones.

But I don’t think I’m going to talk about those things today. Those things will (likely) emerge in upcoming posts, ones that will appear shortly after this one, not after a long hiatus. After all, what good is reporting on news that ever-so-quickly becomes old news?

I think this post will end up being more reflective because I’ve hit the end of (yet another) quarter. Two quarters down and one to go for completing my first year in a PhD program. It’s surreal. I’m sure this will feel all the more surreal once I’ve actually completed this first year. By then, I’ll have even more right to say, “Well, NOW I’m 1/5 of the way through this whole graduate school experience.”

But at 2/3 of the way through this first year, I’ve surprised myself with the number of apologies I’ve been compelled to make about my career choice. Why am I pursuing a PhD in English? I end up falling back on hackneyed lines:

I love to read! I love to write! I love to teach!

Heck, this is a great way to spend five years! The economy sucks right now, so why not go back to school?

Yet all of these reasons somehow sound like lies to me. They’re not. They’re abbreviations of the truth. Yet they’re stale and they somehow feel ridiculous to say aloud. Sometimes, I feel like telling people I’m pursuing a PhD in English is as remarkably dumb as telling people I’ll be an elephant trainer in the circus. And if I told people that, I’d probably get a more positive response than a tight-lipped smile, raised eyebrows, and an “Oh, so what do you plan to do with THAT?” at the end.

Let me explain: I was at a family event this past weekend and it became starkly clear to me that no one understands how I spend my time and make a living. I was introduced to friends of the family as a “teacher” or simply as a “writer,” but never as a student of English. Questions that followed included:

“So, do you like working with kids?”

“So, have you read any good books lately?”

I didn’t know how to answer these questions. Kids? Well, are college students “kids?” If so, aren’t I technically still a “kid,” only two years out of college? Am I simply a kid teaching kids?

And GOOD books? Well, I’ve read plenty of books. I read at least two books every week. Are they GOOD books? I don’t know; I thought about a lot of different things. Could I name you new favorite writers? Probably not. But, again, with the thinking. That’s how I spend most of my time. In my head. Thinking.

How do I explain the fact that I’ve chosen a career where my job is to think? I may have had final seminar papers to write (a primary reason why I haven’t written in here lately) but other than papers that one other person (my professor) will read, I’ve produced very little. Nothing, in fact. The other favorite question asked this weekend:

“So, when are you going to write a novel?”

I’m not producing a novel any time soon. This isn’t my job. But again, when one’s job is relegated to reading stuff, saying some smart things in a room with ten other people, grading undergraduate essays, and writing a little bit here and there, what does that mean precisely?

Well, here’s the short answer: I see this time as an investment. One way or the other, I’m investing in a future. Will that be a future in academia? Possibly. It doesn’t have to be. And I think that’s OK. I’d like to think I’m receiving some valuable training on how to be an organized thinker, how to communicate with others. These are likely skills I’d gain doing a lot of different kinds of things; I just so happen to be receiving this training while talking about things like literature and literacy.

Given this perspective, I still struggle with my compulsion to laugh nervously when someone asks me what I want to do with my future. I don’t like that I feel the need to backpedal and apologize for choices that I’ve (mindfully) made. This would not be an issue if I were a law school student or a medical school student. Without having to say a word, my accomplishments would be deemed impressive. But since I’m pursuing a higher degree in the humanities, I’m met with skepticism. This whips my ego into a tailspin; the inner former Honors student emerges indignant: “I’m making a GOOD choice that YOU simply don’t understand!”

Of course, it’s a good thing I don’t allow that inner former Honors student to emerge like the Ghost of Christmas Past too often because she says some awfully silly, immature things. However, whether I like it or not, she is still very much a part of me and very much a part that leads me to wonder, “Why do I pursue work that so few people understand?” If I care so much about communication, about clarity in prose and clarity in thought, why am I in a field that takes me a good five minutes to justify?

This does not mean that I regret my choice to attend graduate school by any means. I’ve been given a wonderful opportunity to work with bright, forward-thinking individuals in a major, public institution. My college experiences all indicated to me that the kind of work I’d do in academia would be work that would fulfill me, enrich me, and make me feel like I am contributing to something larger than myself.

Unfortunately, I often feel mired in recursive self-loathing/self-indulgent thinking. I want to remain open to questioning, open to seeing the bad for the bad and the good for the good in the field that I’m in. But I think that I also need to make sure that the bad I see in the work that I do is not influenced by my fear of uncertainty or my fear of criticism.

Next quarter will be a time to remain open-minded, to pursue the work I love, and to be mindful of the fact that training (AKA graduate school) exists for a reason: to give me the space to ensure that I continue to make choices that will ultimately benefit both myself and others.

Functions and Conjunctions

I have a confession to make: computers scare me.

Admittedly, this is not ideal. I’m pursuing graduate work in digital literacy/media studies. Yet if something goes wrong, I’m the first to call tech support, panicked. What am I thinking? And yes, I ask myself this all the time.

I’m likely ripe for a psychological diagnosis, but I think this anxiety is precisely what drives me to pursue this study: I know all too well that, as Bradley Dilger and Jeff Rice write in the intro to From A to <A>: Keywords of Markup, “the literate mind has extended to the markup mind” (xiv). Plus, as the discourse of computing and computer control becomes an increasingly big part of working culture (and popular culture), I recognize that (as Stuart Selber writes in Multiliteracies for a Digital Age) “one must effectively appropriate the language of a community in order to have a voice within it” (45).

So, yeah, this stuff is important. But that recognition doesn’t stop me from feeling utterly terrified.

Allow me to clarify: the kinds of things I can do on computers don’t scare me. Software is my friend. I love using my Word Processor. I love that I can make calculations easily with Excel (this will come in handy with tax season upon us!). Heck, I love using the Internet and I love the kind of cultural resources the Internet has wrought.

The point is, it’s the inner machinations of the computer that scare me. If there’s a window that pops up that’s anything out of the ordinary, I feel panicked. If something is frozen on my screen, I frantically hit CTRL ALT DEL in the hopes that if I just shut a troublesome program down, the problem will figure itself out. Is my computer running really slowly? Well, shoot, I’ll just shut it down, restart it, and let my heart beat as loud as a tom-tom until the Windows welcome sound greets me again. Ah, it’s working! It’s working! Life is beautiful! I’ll never hurt you again, my dearest Shibi!

Anyway, what this all boils down to (I think) is my fear of losing control. This perhaps scares many of us (especially those of us in graduate school; our work is very much contingent on finding answers and organizing those answers). Yet when it comes to the computer, I fear this loss of control even more because I know how central this computer (this machine) is to so much of my life. Scary but true.

Mitigating the kinds of fear I have is exactly the reason that writers like Selber, Dilger, and Rice advocate for the development of functional literacy in the classroom. In short, my understanding of functional literacy (given the discussions we’ve had in our readings and in class) is the development of technical computing skills (and the ability to apply those skills in appropriate and confident ways). After all, developing functional literacy is empowering! Knowing how to do something is a powerful step toward actually doing stuff and being a part of a larger community of other “do-ers.”

But the range of functionally literate skills seems expansive to me. Selber suggests that functionally literate students merely need to be aware of “those online activities considered to be customary in English courses at the post-secondary level” (44), whereas Dilger and Rice suggest that “In the age of new media, there is no way to avoid markup. Markup is text. Markup is communication. Markup is writing” (xi).

So, the range here runs from using a word processor (yay!) to learning HTML markup (yay, but scary!).

To me, that range is ENORMOUS. I consider myself a fairly adept user of certain kinds of software and programs, but I’m forcing myself to learn coding and markup and, boy, it’s hard. I can’t imagine having to learn this kind of stuff in an English class. Granted, I’m a few weeks behind on my Code Year lesson (when you have fifty-five undergraduate papers in front of you to grade, a lecture to prepare, and class presentation questions to write, you know where your priorities go), but still. I think if I were told I had to edit a web text for a writing class, I would feel enormously alienated from the processes of writing I grew up with. Though I may be a “digital native,” there are some terrains that to me are still like the wild, wild west.

In any case, I’ve probably gone on too long, but I suppose my question with the development of functional literacy is this: how much will future students need to know? To what extent do computer users need to be quasi-programmers to be a part of “the conversation?”

Code Year Lesson 2, Part 2

Yikes, I’ve been woefully negligent of my coding lately! Admittedly, it’s a task that I end up avoiding. For whatever reason, thinking about the coding often fills me with anxiety; I know that the work will be a little bit tedious and will often be difficult for me to process. I imagine it’s natural to avoid tasks that feel uncomfortable, but I think I have to do a better job of either a.) finding joy in the process or b.) becoming a little bit more disciplined. I’d like to rely on option a), but I have a feeling that the mean option b) may have to take the wheel.

Technically, I’m a week behind. I just finished up the “activity” for Lesson 2, which was about defining functions in JavaScript.

Do you remember those word problems in math class? You know, the ones that were like:

“Jimmy drove 100 miles at sixty miles per hour to meet Carmen. Carmen is 500 miles away driving at 30 miles per hour. Calculate how long it will take Jimmy to meet Carmen. Throw in the number of corn nuts Jimmy will munch on as he drives to meet Carmen.”

I mean, I guess they weren’t exactly like that, but I remember walking away from each of those problems with an exasperated, “WHO CARES?” Plus, it was always so difficult for me to wrap my mind around the spatial-temporal connection. What if Jimmy had to pull over for a flat tire? Why is Carmen such a slow driver?

In any case, the “functions” activity for Code Year initially terrified me as it reminded me of those old word problems…

My task was to create a program that calculated New York taxi cab fare. It started nicely enough:

Declare a variable! Yeah! I’ve done this before! For those of you who don’t remember, declaring a variable just means giving the program a name to store a value under. Sweet. Now the program knows the baseFare is $2.50.
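Since the original screenshot didn’t survive, here’s a minimal sketch of that declaration in JavaScript (the $2.50 value and the name baseFare come from the post itself):

```javascript
// Declaring a variable: give the program a name (baseFare) and store a value in it.
var baseFare = 2.50;

console.log(baseFare); // 2.5
```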

Oh, and that little yellow “caution” sign at the start of line three of my code was to scold me for forgetting a semicolon. Unfortunately, I forgot to screenshot my “good job!” when I fixed the syntax.

Then, we had to bring back our knowledge of functions. To refresh your memory, functions are reusable blocks of code that you create so that you don’t have to repeat certain lines of script. A function is a piece of code you define once so that you can run it as many times as you need.

So, here’s the function I defined to calculate the taxi fare. For whatever reason, it was really difficult for me to keep the variables straight initially. Somehow, defining the base fare BEFORE I established the function that finds the final taxi fare seemed confusing.
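The screenshot of that first function is also missing, so here’s a hedged reconstruction of the step as described: the base fare gets declared first, and only then is the function that uses it defined. At this stage the function knows nothing about miles, so all it can do is return the base fare.

```javascript
// Step 1: declare the variable the function will depend on.
var baseFare = 2.50;

// Step 2: define the function. It refers to baseFare, which is why
// the lesson has you declare the variable before the function.
var taxiFare = function() {
    return baseFare;
};

console.log(taxiFare()); // 2.5
```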

I mean, for someone used to thinking in a narrative form, this would be kind of like giving the cast of characters at the front of a book before the book even begins. I suppose that’s not so weird, but casts of characters typically only appear at the beginnings of plays. When I’m reading a novel, I somehow don’t expect this to happen. In the same way, I somehow didn’t expect this to happen in functions either.

Of course, in math, you must define your variables before you solve your problems, so I suppose if I was used to thinking in a “math-brained” way, this would have been more intuitive.

Now, the lesson grew increasingly tricky as we had to include a parameter in our function. A parameter lets you pass a specific value into the function each time you call it. In this case:

I had to include a parameter for the miles traveled in the taxi to determine its total cost. Without that parameter, I would have been unable to compute the cost of the taxi ride. In the initial function, the total fare was determined according to the base fare alone. That, of course, is incomplete, since the rest of the fare is determined by the per-mile rate and, of course, the number of miles traveled.
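Again without the original screenshot, here’s a sketch of the function with a milesTraveled parameter. The $2.00 per-mile rate is an assumption for illustration; the post never states the actual figure from the lesson.

```javascript
var baseFare = 2.50;
var costPerMile = 2.00; // assumed rate -- the real lesson value is lost with the screenshot

// milesTraveled is the parameter: a value the caller supplies with each call.
var taxiFare = function(milesTraveled) {
    return baseFare + costPerMile * milesTraveled;
};

console.log(taxiFare(5)); // 12.5 (base fare plus five miles at $2.00 each)
```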

See how this is reminiscent of those old word problems?

But look! We had to deal with one more factor:

That’s right: the cost of the fare CHANGES during certain hours of the day. I guess it’s about time I start calculating those corn nut delays, right?

As you can see, I didn’t end up getting the final function right. Frankly, I’m still a little stumped on this one. I think I made a simple syntax error, but I’m pretty sure I set it up correctly. All I had to do was add in an “if/else” statement to determine that final fare. However, again, something went wrong…
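Since the buggy version and its screenshot aren’t in the post, here is a sketch of one working way to set up that final if/else. The surcharge amount ($0.50) and the night window (8pm to 6am) are assumptions, not the lesson’s actual numbers.

```javascript
var baseFare = 2.50;
var costPerMile = 2.00;    // assumed per-mile rate
var nightSurcharge = 0.50; // assumed flat surcharge

// hourOfDay is on a 24-hour clock; the if/else adds the surcharge at night.
var taxiFare = function(milesTraveled, hourOfDay) {
    var fare = baseFare + costPerMile * milesTraveled;
    // The if/else statement: surcharge applies between 8pm and 6am.
    if (hourOfDay >= 20 || hourOfDay < 6) {
        return fare + nightSurcharge;
    } else {
        return fare;
    }
};

console.log(taxiFare(5, 22)); // 13   (12.5 plus the night surcharge)
console.log(taxiFare(5, 14)); // 12.5 (daytime, no surcharge)
```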

This, of course, took me a while to figure out and, with the help of the forums that are part of the Codecademy lessons, I finally got to this point. However, it was that kind of “backwards” thinking that threw me off again. Why couldn’t the night surcharge declared at the beginning of the function just operate as the factor needed to do the calculation? Couldn’t the computer just “figure it out?”

Obviously, the answer to that is “no.” As a programmer, it is your job to define all the variables necessary to make sure the end result comes out accurately. If I think about it more, it’s somewhat similar to the process of writing a paper. A writer should not assume that the reader just “gets” his argument. He has to go through the pains of explicating it, justifying his point, and showing evidence. Something similar is happening here: I have to explicate the problem and show all my steps to guarantee that the computer “gets it.” Otherwise, he’s stumped!

I suppose he is a little stumped at present, considering the fact that my syntax is somehow wrong. At times like these, I want to apologize for any student’s English grammar that I’ve ever scoffed at. It’s incredibly difficult to remember these grammatical rules when learning a new language! I suppose I’ve always taken that for granted.

Next week, I’ll complete the work I was supposed to do this week on Conditionals in JavaScript. Get stoked!