Category Archives: arbitrariness

Toyoko INN for railfans

Looking to stay at a quality business hotel, but don’t want to tear yourself away from the trains? Look no further: Toyoko INN delivers!

That’s right… From certain floors and rooms of the Toyoko INN chain of hotels, you can overdose on train watching without leaving the comfort of your bed. Just ask at the front desk to be accommodated. Here are some samples of train views from their various hotels, including the one I’m going to be staying in next week!


Not only do I get to enjoy Sendai and the hospitality of Tohoku University, but I’ll potentially get some great views of the Tohoku Shinkansen while I’m there. I thought Toyoko INN outdid itself when I received socks and q-tips as “presents” when I became a point club member a few years back, but I was wrong. Can’t wait!


don’t learn to code

There is a lot of speculation going on – on the Internet, at conferences, everywhere – about how we might integrate IT skills (for lack of a better word) into humanities education. Undergrads, graduate students, faculty: they all, the thinking goes, need some marketable tech skills as part of their education in order to participate in the intellectual world and economy of the 21st century.

I hear a lot, “learn to code.” In fact, my alma mater has a required first-semester course for all information science students, from information retrieval specialists to preservationists, to do just that, in Python. Others recommend Ruby. They rightly stay away from the language of my own training, C++, or god forbid, Java. Coding seems to mean scripting, which is fine with me for the purposes of humanities education. We’re not raising software engineers here. We tend to hire those separately.*

I recently read a blog post that advocated for students to “learn a programming language” as part of a language requirement for an English major. (Sorry, the link has been buried in more recent tweets by now.) You’d think I would be all about this. I’m constantly urging that humanities majors acquire enough tech skills to at least know what others are talking about when they might collaborate with them on projects in the future. It also allows one to experiment without the need for hiring a programmer at the outset of a project.

But how much experimentation does it actually allow? What can you actually get done? My contention is: not very much.

If you’re an English major who’s taken CS101 and “learned a programming language,” you have much less knowledge than you think you do. This may sound harsh, but it’s not until the second semester of first-year CS courses that you even get to data structures and algorithms, the building blocks of programming. Even at that point, you’re only just starting to get an idea of what you’re doing. There’s a lot more to programming than learning syntax.

In fact, I’d say that learning syntax is not the point. The point is to learn a new way of thinking, the way(s) of thinking that are required for creating programs that do something interesting and productive, that solve real problems. “Learning a programming language,” unless done very well (for example in a book like SICP), is not going to teach you this.

I may sound disdainful or bitter here, but I feel this must be said. As someone who has gone through a CS curriculum, I frankly find it insulting to hear “learn a programming language” as if that alone will allow one to “program” or “code.” Coding isn’t syntax, and it’s not learning how to print to the screen. Those are your tools, but they’re not everything. You need theory and design, the big ideas and patterns that allow you to do real problem-solving, and you’re not going to get that from a one-semester Python course.
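To make that concrete, here’s a minimal, made-up sketch (my own toy example, not from any curriculum): two functions an English major fresh out of CS101 could read line by line. Both use only beginner-level Python syntax; the interesting difference between them is a data-structures idea, not syntax.

# Toy example: both functions answer "does this list contain a duplicate?"
# and both use only first-semester syntax.

def has_duplicates_naive(names):
    # Compare every pair: easy to write, but the number of comparisons
    # grows with the square of the list length.
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if names[i] == names[j]:
                return True
    return False

def has_duplicates_with_set(names):
    # Same syntax difficulty, but a data-structures idea (a set, i.e.
    # hashing) makes it roughly linear in the list length.
    seen = set()
    for name in names:
        if name in seen:
            return True
        seen.add(name)
    return False

print(has_duplicates_naive(["Mori Rintaro", "Ogai Gyoshi", "Mori Rintaro"]))     # True
print(has_duplicates_with_set(["Mori Rintaro", "Ogai Gyoshi", "Mori Rintaro"]))  # True

The syntax is the easy part; knowing that the second version exists, and why it matters when the list has a hundred thousand entries instead of three, is the part a one-semester course rarely gets to.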

I’m not saying there’s no point in trying to learn a programming language if you don’t currently know how to program. But I wish the strategies generally being recommended were more holistic. Learning a programming language is a waste of time if you don’t have concepts that you can use it to express.


* I’m cursed by an interdisciplinary education, in a way. I have a CS degree but no industry experience. I program both for fun and for work, and I know a range of languages. I’m qualified in that way for many DH programming jobs, but they all require several years of experience that I passed up while busy writing a Japanese literature dissertation. I’ve got a bit too much humanities for some DH jobs, and too little (specifically teaching experience) for others.

disciplinarity and undergraduate education

I have a quick comment on a recent blog post I read: “The Politics of Disciplinarity at the Undergraduate Level” (Natalia Cecire). This is adapted and expanded from a lengthy comment I left on said blog.

I have an admission to make: I was a naive, stereotypical computer science major. How so? I looked down, so very much, on the humanities – on what I perceived to be the humanities. Soft, vague, insular, self-interested, and ultimately irrelevant to my (or anyone else’s) life. “Learning for learning’s sake” was my hobby, but somehow it seemed ridiculous as a university course. How would humanities majors get jobs? Perhaps it’s partly my humble background, but majoring in something without a definable endpoint – a career that would make up for the investment in a college education – just seemed worse than pointless. It seemed irresponsible and naive.

Yet I was the one who was naive, along with my fellow CS majors who mocked MBAs and even the information science students. They were the ones who couldn’t hack it, right? If you’re not in a hard science or engineering (and we counted ourselves among them), you’re just playing around; you can’t make it to our league.

Who was I kidding? Myself.

I am now, as you know, in a humanities PhD program. I’m in an area studies department but study the history of the book, and came to it via literature (and before that, via a very social-science oriented history department, which is also partly the explanation for my attitude toward things like cultural studies and other vague humanities, including history departments with this bent).

It’s been a hard road, admittedly, for me to come to terms with this. I’ve never felt fully at home in the humanities and it’s because of the carryover of this attitude. And yet at the same time I’ve been doing a dual degree in information science, the very discipline I used to mock along with my CS buddies as for the kids who couldn’t hack our program, who couldn’t move from pseudocode to real programming, to real work.

And as you may guess, I’ve changed my mind, in that I’ve become less naive (I would hope) and much more broad-minded about what a degree can mean. Of course it’s more difficult to translate a humanities degree directly into a concrete job – but that doesn’t mean that one’s degree isn’t widely applicable and doesn’t prepare one for a variety of life paths. I know that’s often considered a platitude uttered by career counselors at universities everywhere (not to mention tenured professors who don’t understand undergrads’ lack of appreciation for “learning for learning’s sake”), but it’s true.

One of the things that was lacking from my CS education was a strong dose of critical thinking. It wasn’t until a few years into my humanities PhD program that I could think critically about the science discipline I had come from: about the impossibility of being truly objective and the need instead to recognize and be aware of one’s own biases, and about how the questions we are able to ask, the problems we are able to pose, are not self-evident. Thinking critically about code, about programming, about application design from the very concept of applications to the endpoint of execution, was not in my DNA until I had already left the field and joined the legions of critical thinkers that inhabited another.*

The blog post referenced above speaks to how the politics of disciplinarity at the “academic” level can have perhaps unintended consequences for attitudes at the undergraduate level, and so I’m sharing my undergraduate attitude, and gradual change of attitude, above. Below, I’d like to address another consequence that the author brings up: the possibility of differential undergraduate tuition that would reflect the perceived value of various “hard” versus “soft” majors. This is what I had to say in my comment on her blog:

One school, at least, has already implemented a policy of differential undergrad tuition: the University of Michigan (where I am currently a student). Tuition varies by college, with Engineering being the clearest example, but since Computer Science sits in the college of Arts & Sciences while carrying the same money-making assumptions as engineering, it also gets differential (higher) tuition at the upperclassman level.

I was a computer science major as an undergrad, and this kind of system would have strongly discouraged me from pursuing the degree. I was often the only woman, or one of perhaps two or three, in a class of 40-60 students, and differential tuition has serious implications for the demographics of the major, which are already an issue. I also have to say that as a computer science undergrad with a double major in history, I held that unfortunate attitude: CS is “real work” whereas history is something fun I did on the side, something not really relevant to anything but history and academia itself.

I’m now a PhD candidate in the history of the book (within an area studies department – humanities, in other words), and I see now how patronizing and narrow-minded that attitude was. But it is so prevalent that even I – and I naively considered myself broad-minded – held it for a long time, and actively mocked those outside the “hard” sciences because of it.

It’s so pervasive, and I’m glad that you addressed the fact that what is often written off as academic squabbles and pissing matches impacts undergrads profoundly as well.


* That’s not to say that everyone who majors in the humanities ends up being able to think critically. I meet many who get by completely unable to do so. But here I speak from my own experience and say that it is what allowed me to do so.

pseudonymity and the bibliography

My research is on authorship, and specifically on varied practices of writing and ways that authorship is performed.

For my study – that is, late 19th-century Japan – the practice of using pseudonyms, multiple and various, is extremely common. It’s an issue that I consider quite a bit, and a practice that I personally find simultaneously playful and liberating. It’s the ultimate in creativity: creating not just a work but one’s authorship, and one’s authorial name, every time.

This does raise a practical issue, however, that leads me to think even more about the meaning and implications of using a pseudonym.

How does one create a bibliography of works written under pen names?

The easy version of the problem is this: when compiling the master bibliography for my dissertation, I have a choice of citing works in a number of ways. I can cite them with that instance’s pen name, then the most commonly known pen or given name in brackets afterward. I can do the reverse. Or I can be troublesome and cite only the pen name. Then again, I could adopt the practice that is the current default – born of now attributing works solely to the most commonly known name rather than to the name originally on the work – and not bother with the original pen name at all, obscuring the original publication context entirely. I could pretend, for example, that Maihime was written by Mori Ogai, and not Mori Rintaro. Citing by the original pen name flies in the face of convention, but it is the only way that I can cite the work and remain consistent with the overarching argument that I make in my dissertation: that use of, and attribution to, specific, variable pen names matters, both for understanding context and for understanding the work itself. It goes without saying that this is crucial for understanding authorship itself.

But there is another issue, and it goes hand-in-hand with citing works by writers whose name does not follow Western convention of given name first, last name second. Of having two names at all. The issue comes in the form of citation managers.

I’ve been giving Zotero a go lately and quite enjoying it. But I find myself making constant workarounds, because most of my sources are by Japanese writers, and the writers of my primary sources are not only Japanese but also use pen names. My workaround is to treat the entire name as one single last name, so I can write it in the proper order and not have it wrangled back into “last name, first name” – neither of those labels being quite true here. For a Japanese writer, I want to retain the last-name-then-given-name order; for someone using a pen name, the problem is that no part of the name is a last or given name at all. It’s what I’d like to call an acquired name.

Mori Ogai is now the most commonly used name of the writer Mori Rintaro (Mori being the last name, Rintaro being his given name). Ogai is a shortened version of his early pen name Ogai Gyoshi. Ogai Gyoshi isn’t a false last plus given name. It’s always in the order Ogai Gyoshi, neither of them is a “real” name, and it is a phrase, not a name. It’s as though he’s using a word that happens to have a space in it.

So when I put some of Mori Rintaro’s writing into Zotero, I put in “Mori Rintaro” as the last name. Sometimes I just put in “Ogai” as the name, when he signs a piece that way. Occasionally it’s “Ogai Mori Rintaro” (this is, in fact, the author of Maihime – I took a shortcut in my example above). And then there are some pieces in which the last name in Zotero is “Ogai Gyoshi.”
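For the technically curious, here is a minimal Python sketch of the logic behind that workaround (mine, not Zotero’s actual internals, and the function names are made up): a Western-style formatter assumes every creator splits into a family and a given name, which is exactly what mangles an acquired name, whereas treating the whole signature as a single field leaves it intact.

# A sketch of the single-field workaround, not Zotero's real code.

def format_western(family, given):
    # Typical citation-manager behavior: "Family, Given".
    return family + ", " + given

def format_single_field(name):
    # The workaround: the whole name is one indivisible string,
    # written exactly as it was signed.
    return name

# Splitting the pen name "Ogai Gyoshi" into fields invents a false
# family/given pair:
print(format_western("Gyoshi", "Ogai"))          # Gyoshi, Ogai -- nonsense
# Keeping it as a single field preserves the name as signed:
print(format_single_field("Ogai Gyoshi"))        # Ogai Gyoshi
print(format_single_field("Ogai Mori Rintaro"))  # the author of Maihime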

I don’t know how to go about this any other way, but it’s less about me having to be a little hacky to get it to do what I want, and much more a constant reminder of our current (Western) assumptions about names, authorship, and naming conventions. It’s a reminder of how different the time and place that I study is, and how much more dynamic and, frankly, fun it was to write in late 19th-century Japan than it is now, either here as an American or even in Japan. Names are taken a bit more seriously now, I’d argue, and more literally. It’s a little harder to play with one’s name, to make one up entirely for a one-off use, at this point – and I think it’s for the worse.

(Obviously, there are exceptions: musicians come immediately to mind. And it’s not as though writers do not adopt pen names now. But it’s not in the same way. And this, incidentally, is something I love about the early Internet – I’m referring to the nineties in particular. Fun with handles, fun with names, all pseudonymous, and all about fluid, multiple identity.)

Showa 40s vs 1970

I was listening to a podcast interview with a favorite author just now (Kakuta Mitsuyo, if you’re wondering) and I came to a realization about Japanese and Western calendars. For dates from Meiji onward, I have come to crave Japanese reign years when using Japanese, crazy as it sounds – I want to ditch Western years altogether!

Why is this? Frankly, Western years are a mouthful to say and much harder to pick up when you’re listening, especially if the speaker is talking quickly. 2009 becomes “the year two thousand nine.” Try 1999: “the year one thousand nine hundred ninety-nine!” Do you see the problem?

Well, many learners of Japanese hate the confusion of having a separate, less often used Japanese reckoning of years according to emperors’ reigns. The infamous Hirohito is known as the Showa emperor in Japan, and the Showa period starts with his accession in 1926. I was born in Showa 56 – 1981. Incidentally, I know this because the high school where I taught my first year in Japan ran on Japanese years, and one needs to know one’s own birthday! For most official forms, you are still expected to write your birthday in Japanese years. If you want to impress a functionary, learn this and write it proudly. They will be unnecessarily astounded.

In any case, the author was talking about her childhood and said “In the Showa 40s…” I actually sighed with relief! When the host breezed over that day’s date in Western years at the beginning of the podcast I had simply stopped listening, but the Showa 40s – it just clicked. 1965-1974. It just makes sense to me somehow.

As you may know, I study the Meiji period (1868-1912). Specifically, it’s the Meiji 20s, or 1887-1896. I find myself writing dates as Meiji 20-something all the time. Why? I can’t explain it. It’s certainly in part because the publication dates in the books I read are all in Meiji years – it was still the norm then to use Japanese years. But I can convert easily now from studying it for so long. So, why?

Really, it’s not me becoming accustomed to Japan. No normal Japanese person born after the Taisho period (1912-1926) would do such a thing. I think it’s much simpler: I am a nerd who studies literary history. It’s the sad truth. Well, a happy Heisei 23 to you all then!
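If you want to do the conversion yourself, the arithmetic is just a fixed offset per era: take the era’s first Western year, add the reign year, and subtract one (since year 1 is the start year itself). A quick sketch in Python, with era names romanized without macrons as I do above:

# Reign-year to Western-year conversion (era year 1 = the era's start year).
ERA_START = {"Meiji": 1868, "Taisho": 1912, "Showa": 1926, "Heisei": 1989}

def to_western(era, year):
    return ERA_START[era] + year - 1

print(to_western("Showa", 56))   # 1981, my birth year
print(to_western("Meiji", 20))   # 1887
print(to_western("Heisei", 23))  # 2011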

(By the way, why no pre-Meiji reign years for me? They’re too short, numerous, and confusing. They sound the same. I just can’t take it. But if I studied early modern? I bet I’d be using Bunka-Bunsei like it’s 1821!)

new phone destroys old phone

I’m jumping headfirst into the 21st century here for once, rather than being dragged kicking and screaming. Yes, I have replaced an old electronic device (to be fair, only 1.5 years old) with a fancy shiny new one.

Samsung Epic 4G

I posted once about my Blackberry destroying certain parts of my internet life by being a laptop replacement that could nonetheless do fewer things. Well, I now have a laptop replacement that can do quite a few more things, but with a harder-to-use keyboard, and which sucks its battery down like you wouldn’t believe. Goodbye, poor Blackberry. I, who defended you so stubbornly, am now transferring your pictures away and will soon recycle you, and remember you fondly in spirit.

Blackberry Curve Purple


RIP Blackberry Curve Purple!

P.S. The new phone is one that someone else gave up for an iPhone and donated to yours truly. So I’m still in character, fear not.

the tradeoff: elegance vs. performance

Oh snap – I just fixed this by turning on caching in the Cocoon sitemap. Thanks Brian Pytlik Zillig for pointing out that this is where that functionality is useful! And note to self (and all of us): asking questions when you’re torn between solutions can lead to a third solution that does much better than either of the ones you came up with.

With programming or web design, “clean and elegant” is a satisfaction for me second only to “it’s working, by god, it’s finally doing what it’s supposed to.” So what am I to do when I’ve got a perfectly clean and elegant solution – one that involves zero data entry and takes up only a handful of lines in my XSLT stylesheets – but that drags performance down so hard that it takes nearly a minute to load the homepage of my application?

The alternative is this: two XML files (one for each problem area) that list all of the data that I’d otherwise be grabbing dynamically out of all the files sitting in a certain directory. This is time-consuming and not very elegant (although it certainly could be worse). The worst part is that it requires explicit maintenance on the part of the user. Wouldn’t it be nice to be able to give my application to anyone who has a directory of XML files, without any need for them to hand-customize it, even just a small part?

On the other hand, I can’t expect Web users to sit there and wait at least 30 seconds for TokenX to dynamically generate its list of texts, an action that would take a split second if it were only loading the data out of an XML file. I already have all the site menu data stored in XML for retrieval, meaning that modifications need only take place once and that nested menus can be easily entered without having to worry about the algorithm I’m using to make them appear nested on the screen in the final product.

You can tell from reading my thought process here what the solution is going to be. It’s too bad, because aiming for elegance often ends up leading you to better performance at the same time. Practicality vs. idealism: the eternal question to which we already know the answer.
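The caching fix itself was just Cocoon sitemap configuration, but for anyone curious about the general pattern, here’s a rough Python sketch of the idea (the names and the timeout are mine, not anything from TokenX or Cocoon): keep the zero-data-entry directory scan, but only pay for it occasionally instead of on every page load.

import os
import time

# Cache the expensive directory scan so the "no hand-maintained list"
# approach doesn't run on every request.
_cache = {"texts": None, "stamp": 0.0}
CACHE_SECONDS = 300  # arbitrary freshness window

def list_texts(directory):
    now = time.time()
    if _cache["texts"] is None or now - _cache["stamp"] > CACHE_SECONDS:
        _cache["texts"] = sorted(
            f for f in os.listdir(directory) if f.endswith(".xml")
        )
        _cache["stamp"] = now
    return _cache["texts"]

The first visitor after a cold start still pays for the scan; everyone after that gets the XML-file speed without anyone having to maintain a list by hand.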

in which I acquire typographical empathy

Guys. I humbly apologize for forcing you to read the monospace font of your browser’s (or OS’s) choice for the past year. I’ve learned the error of my ways.

From now on it’s serif all the way.

And I now supply Linux font defaults just in case, which I didn’t before out of ignorance. At least there’s something to try to fall back on before font-family: serif now.

where are the japanese exchange students?

I was recently reading Jake Adelstein’s review of Reimagining Japan, and he noted the need for openness as a topic explored in the book – defining the lack of it as a reluctance both of young Japanese to go abroad and of companies to reach outside their own borders. I don’t have any profound insights into this issue (or even into whether it is the issue that people make it out to be), but it reminded me of a touching conversation I had last year.

I organized a flower-viewing party (for cherry blossom season) in the last April that I lived in Tokyo. For those of you who haven’t had the experience of cherry blossom season in Japan, I’ll give you a representative image: a cold, cloudy, miserable day, often with no cherry blossoms in sight, a park completely covered in blue tarps (“leisure sheets!”), populated by shivering drunk people trying their damnedest to get even more drunk while snacking on party foods like octopus dumplings. Doesn’t it sound like that romantic image of an elegant branch of cherry blossoms against the clear blue sky, perhaps with Fuji-san lurking nearby, that has become the representative image of Japan? Well, don’t believe it. The version I’ve given you is the hard, cold reality.

Still, you do hanami (“flower viewing”) in late March and early April, flowers or not, and regardless of whether you need a winter coat. Once you’re plastered, you won’t notice anyway! Well, let’s move on now that I’ve set the scene.

I was chatting with my friend Naoko, who had brought along her cousin – shy but nice, and interested in learning and practicing English. We got to talking about how I’d come to Japan in the first place and what my experiences had been like.

As we talked, she surprised me with her reaction: She revealed that she’d love to live abroad for a year or two after college, and she was so jealous of people like me and my friends who had been able to do that in Japan. It really threw me. After all, she was shy and hesitant about speaking, but her English was passable enough that after a few weeks in a place like the US or UK, she’d be doing fine. Her interest in foreign countries and languages was obvious. So why the resignation to not having a chance? What was stopping her?

Her answer to this really blew me away. “Well,” she said, “companies want to hire their new employees when they graduate from college, on time.” (By this she means that graduation is in late February, and the start of the new school/employment year is in April. Because of this, if you happen to not pass your entrance exams or not get a job offer, you have to wait until that time in the next year to try again.) “So if I were to go abroad, I’d come back and I wouldn’t be a new graduate, and I wouldn’t be with the class I graduated with. So it would be really difficult to get a job because I wouldn’t be in the category of people that companies want to hire.”

It really shouldn’t have surprised me so much. After all, it’s true. There is a deeply ingrained system of when and how, from college exams to job interviews. Of course, part-time jobs, and going into business for yourself, are different. But overall, despite a lot of shifting preferences and more varied ways of living compared to a decade or two ago, that system is still there. And if you don’t fit into the path that leads you to a position as a regular company employee (as opposed to contract or part-time), you are going to be stuck in what’s still considered by many to be an underclass.

The irony here is that Japanese firms would benefit immensely from young employees who have at least traveled, if not lived, abroad – anywhere. In fact, I worked very informally at a large company in Tokyo and my contact there confided in me more than once that he’d like the old guard to open their minds to hiring foreigners as in-house workers, instead of using contractors to do tasks like translation. His argument was that it would be more cost-effective and flexible (at one point they needed a rush translation and had to pay through the nose to get someone else to do it), and even more than that, that it would change the culture of the workplace in a positive way. But at the same time, he sighed when he said this and said, “There’s no way that could happen now. I’m hoping in two or three years, it might be possible, if I keep working on them.”

I see how this system is set up and I understand the logic of it, because everyone knows how it works and it’s self-perpetuating because of it. But looking at it from the outside, it doesn’t make a whole lot of sense. On the one hand I’m reassured and even inspired by the people I’ve talked to in Japan who have either made a path that weaves in and out of the system as they like, or who have become successful within powerful companies and used that position to shape their work into something international. There are plenty of people I know who lived abroad for a while (from three months to like 20 years) who were then perfectly successful when they returned. But by and large, if this attitude is there – well, the fear that success can only be had along one specific, limited path is self-fulfilling.

I can only hope that more people like my friend’s cousin who have a desire to go abroad and experience life in other countries do get the courage to do it regardless of the rigid system that they live in, and make something successful out of it. But in the meantime, I am a lot more understanding of why the Asian exchange students at my university come to us from every nation but Japan.

my poor laptop, cont’d.

I’m being dragged kicking and screaming into obsolescence, despite having perfectly good hardware and a brand new battery.

This time, it’s not being able to upgrade to Java 1.6 without installing Yellow Dog Linux, following instructions for putting IBM’s PowerPC release of 1.6 on it, and hoping for the best. Ordinarily, I would do just that, but I didn’t know I needed Java 6 for anything until, well, yesterday.

It’s downright embarrassing. I have to borrow a laptop from a kind workshop organizer on Saturday at DH2011 because one of the visualization tools we’re running is a Java app that needs, yes, 1.6.

I’m being pressured more and more toward a newer laptop – in contrast to my recent two posts, which were more me complaining about things that wouldn’t necessarily force me to upgrade to something less than five years old. How frustrating!

(And I never thought I’d regret not having brought my Linux netbook along with me this summer, thinking there’s no way I could need a desktop and two laptops, which is ridiculous – but there is probably a JDK 1.6 sitting on that Ubuntu install. But there are 12 hours between me and the netbook until August. Too bad!)

A random positive note to end this series of posts about my ridiculous computing situation: when I was doing research to find Java 6 for PowerPC, I came across a cottage industry of people helping others install it (and Linux) on their – get this – 64-bit PPC PlayStation 3! It warms the heart to know that there’s still a phenomenal console out there (and really, it is the best of the three) that uses PPC architecture. Hooray for Sony (and for IBM, which uses 64-bit PPC architecture in its workstations and releases the JDK for the rest of us).