Category Archives: intellectual laziness

the first-world internet

I heard an interesting presentation today, but it concluded with a very developed-world, class-based interpretation of the Internet that I simply can’t agree with.

Although it’s true that more students are coming from abroad to study in the US (attributed in the presentation partially to budgetary issues in public schools in the US, another issue entirely), the idea of ‘globalization’, I’d argue, is really a concept based in the developed world. Yes, we have more students studying ‘cross-border’ topics, and interested in the world outside of the US. American students are coming into more contact with international students thanks to their presence in American universities, and perhaps gaining more cultural competency through this interaction. ‘Global studies’ are now a thing.

But this presentation concluded by talking about the global power of the Internet, and of globalization generally: the ability to reach across borders and communicate unimpeded. The Internet doesn’t just have the potential to break down barriers, but already actively does so, this presenter posited. It doesn’t just encourage dissent but is already a channel for dissent, an opportunity available to all.

International students in the US may be experiencing this power of the Internet, yes. But at home? Students from nations such as China and Saudi Arabia may not have experienced the Internet in this way, and may not be able to experience it back home in the same way as they can in the West, in Korea, in Japan, in other developed countries. (And I realize that’s a problematic term in itself.) Moreover, not all American students have experienced this Internet either. The students we find in universities generally already have opportunities not available to everyone, including their access to technology and the Internet.

There’s also the inherent assumption that this global access – and ‘global studies’ in general – takes place in English. While many students abroad are studying English, not all have this opportunity; moreover, their access to the educational opportunities of the developed world is limited to those opportunities available in English. Many undergraduates and even graduate students in the US limit themselves to the kind of global studies that can take place without foreign language competency. I realize that many do attempt foreign language study; while the vast majority of undergraduates I encounter who are interested in Japan and Korea cannot read materials in their focus countries’ languages, they are often enrolled in language classes and doing their best. However, there are many more who are not. They do not come to the world – they expect the world to come to them.

And there are many, many students around the world who do not have access to the English Internet, or cross-border collaboration in English through the opportunities the Internet potentially affords (or doesn’t, depending on the country). They may not even have reliable access to electricity, let alone a data connection. This is changing, but not at the speed that the kind of thinking I encountered today assumes.

Related to this, another presentation talked about the power of MOOCs and online learning experiences in general. While I generally agree that there is much potential here, the vast majority of MOOCs currently available require English, a reliable connection, and reliable electricity. They are by and large taken by educated adult men who already speak English. There is potential, but that is not the same as actual opportunity.

Overall, I think we need to question what we are saying when we talk about the power of the global Internet, and distinguish between potential and reality. Moreover, we need to identify exactly which groups we are talking about when we talk about globalization, global studies, and cross-border/cross-cultural communication. Even without the assumption of a developed-world, upper-class Internet, we need to recognize that by and large, our work is still conducted in silos, especially in the humanities. Science researchers in Japan may be doing English-language collaboration with international colleagues, but humanities researchers largely cannot communicate in English, and cross-language research in those fields is rare. I can’t speak for countries other than Japan and the US, really, but despite the close mutual interest in areas such as Japanese literature and history, there is little collaboration between the two countries – despite the potential, as with digitizing rare materials and pooling resources to create common-interest digital archives, for example.

Even those international students often conduct their American educations in language and culture silos. Even the ones with reliable Internet access use country-based chat and social media, although resources such as Facebook are gaining in popularity. We go with what is most comfortable for us, what comes to us; that doesn’t apply only to Americans. Our channels of communication are those that allow us the path of least resistance. Even if Twitter and Facebook weren’t blocked in China, would they prove as popular as Sina Weibo and other Chinese technologies? Do Americans know what Line is or are they going to continue using WhatsApp?

If we find that English, money, and understanding of American cultural norms are major barriers to our communication, we might find other ways. Yes, that developed-world Internet may hold a lot of potential, but its global promise may not go in a direction that points toward us in America anyway.

don’t learn to code

There is a lot of speculating going on – on the Internet, at conferences, everywhere – about the ways in which we might want to integrate IT skills (for lack of a better word) with humanities education. Undergrads, graduate students, faculty: they all need some marketable tech skills as part of their education in order to participate in the intellectual world and economy of the 21st century.

I hear a lot, “learn to code.” In fact, my alma mater has a required first-semester course for all information science students, from information retrieval specialists to preservationists, to do just that, in Python. Others recommend Ruby. They rightly stay away from the language of my own training, C++, or god forbid, Java. Coding seems to mean scripting, which is fine with me for the purposes of humanities education. We’re not raising software engineers here. We tend to hire those separately.*

I recently read a blog post that advocated for students to “learn a programming language” as part of a language requirement for an English major. (Sorry, the link has been buried in more recent tweets by now.) You’d think I would be all about this. I’m constantly urging that humanities majors acquire enough tech skills to at least know what others are talking about when they might collaborate with them on projects in the future. It also allows one to experiment without the need for hiring a programmer at the outset of a project.

But how much experimentation does it actually allow? What can you actually get done? My contention is: not very much.

If you’re an English major who’s taken CS101 and “learned a programming language,” you have much less knowledge than you think you do. This may sound harsh, but it’s not until the second-semester, first-year CS courses that you even get into data structures and algorithms, the building blocks of programming. Even at that point, you’re just barely starting to get an idea of what you’re doing. There’s a lot more to programming than learning syntax.

In fact, I’d say that learning syntax is not the point. The point is to learn a new way of thinking, the way(s) of thinking that are required for creating programs that do something interesting and productive, that solve real problems. “Learning a programming language,” unless done very well (for example in a book like SICP), is not going to teach you this.

I may sound disdainful or bitter here, but I feel this must be said. It’s frankly insulting as someone who has gone through a CS curriculum to hear “learn a programming language” as if that’s going to allow one to “program” or “code.” Coding isn’t syntax, and it’s not learning how to print to the screen. Those are your tools, but not everything. You need theory and design, the big ideas and patterns that allow you to do real problem-solving, and you’re not going to get that from a one-semester Python course.
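A toy example may make the gap concrete (the task and names here are my own illustration, not from any course or from the post I read): the loop syntax below is week-one material, but knowing that a dictionary (hash map) is the right structure for counting – rather than, say, rescanning a list for every word – is exactly the kind of design thinking a syntax tutorial skips.

```python
# Toy illustration: the syntax is the easy part; choosing the
# right data structure is the part a syntax course rarely teaches.

def word_counts(text):
    """Count word frequencies with a dict (hash map): one pass,
    constant-time updates, instead of rescanning a list per word."""
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts
```

Writing the for loop takes minutes to learn; recognizing a counting problem and reaching for the right structure is what takes a curriculum.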

I’m not saying there’s no point in trying to learn a programming language if you don’t currently know how to program. But I wish the strategies generally being recommended were more holistic: learning a programming language is a waste of time if you don’t have concepts to express with it.

 

* I’m cursed by an interdisciplinary education, in a way. I have a CS degree but no industry experience. I program both for fun and for work, and I know a range of languages. I’m qualified in that way for many DH programming jobs, but they all require several years of experience that I passed up while busy writing a Japanese literature dissertation. I’ve got a bit too much humanities for some DH jobs, and too little (specifically teaching experience) for others.

blog link: the inferiority of blackness as a subject

I don’t do nearly enough (or any) linking to other inspiring blog posts. Today that will change. I came across an eloquent and inspiring (not to mention blood-pressure-raising) post today critiquing a Chronicle of Higher Education blog post that clearly should never have been published – which attacked doctoral students’ dissertation titles as a statement on the illegitimacy of black studies departments as a whole. Seriously. Picking on PhD students, by someone with no graduate education, as though black studies is a complete waste of time because it doesn’t meet the writer’s standards of legitimate, relevant topics. By the way, did you know that racism is dead? We’ve got a black president! Clearly studies that focus on race are no longer relevant. Especially topics in black studies that are written on by, you know, black scholars.

Anyway, what I will link to is not that offensive and, frankly, less-than-worthwhile post. I’ll link to a much better one.

“The Inferiority of Blackness as a Subject” at Tressiemc <– Click that link; read and weep for humanity. Get mad. Sign the petition. Share to your own audience. Let’s not put up with this kind of thing being posted at Chronicle of Higher Ed.

multilingual laziness

I’ve come across a few sites lately – commercial and academic – that offer “translation” into several languages. I click on them out of curiosity. The link does not take me to a translation.

It takes me to Google Translate!

I have a few things to say about this:

  1. It’s unprofessional. It looks unprofessional too. That bar at the top of the site that says it’s being translated by Google? Kind of ruins the effect.
  2. If you’re just sending your site text to Google, why not offer all languages? I just visited a site that offers about 7; another (Rakuten) offers a handful. Google can handle more than 7 languages.
  3. It’s deceptive. Listing only a few languages makes it look like you’re offering actual human translations that you made or commissioned yourself. In fact, this is what I thought about all of the sites that I’ve visited recently that “offer” translations via Google. There’s no button that says “powered by Google Translate,” only a few flags that the user assumes lead to a real translation. They don’t. They lead to Google.
  4. Last but not least – it’s laughable. The translations aren’t so much inaccurate as hilarious. The only reason I can use the English Rakuten.jp is that I know Japanese and can guess at the original phrasing that produced such funny English. I have to double-check everything with the original Japanese text that it helpfully supplies a link to.*

This is the impression that your “serious” commercial or academic site is leaving: you’re naive about machine translation; you’re too cheap or unimaginative to get your own local translations; you’re out to deceive your users; and you’re an Engrish generator.

Is this the impression you want to leave?

Machine translation isn’t here yet. It may never be. Machine translation is hard. At the very least, get a native speaker of each language to read each end of the translation. You can identify the parts that need to be fixed simply by watching them make funny faces.

Google Translate may be okay in a pinch if you need to, for example, order something from a storefront or service that isn’t in your language. If you want to be taken seriously, find a human.

—-

* The reason I use Rakuten’s international English site is to narrow down the stores to ones that ship internationally. I wish I could use the Japanese site with this filtering.

on coding

I can’t count the number of times I’ve heard these in the past year:

“I want help learning how to code.”
“Will I have to learn to code?”
“Do digital humanities scholars need to know how to code?”
“How much do I need to know about coding?”

In every case, I’m left scratching my head: how can I begin to answer this question? What is “coding”?

The Problem

What I want to ask in these cases is simpler than you might think. I want to ask what the problem is. What is the solution that you need? Do you need to display data, to run numbers through formulas, to create a nice desktop software application with a GUI? What is the purpose? We can’t begin to answer these questions without the goal concretely in mind.

Coding

And here is the one word that has come to drive me up the wall. Coding. It’s everything and nothing. I’ve heard it used to refer to everything from HTML markup to large and complex software development projects. Coding is agnostic: it doesn’t specify what’s going on. “Learning to code” could mean anything from learning the software development process, to learning the basics of a particular language (enough to get to “Hello, world”?), to learning a language in depth, to learning scripting languages for using databases on your Web site, to marking up text with HTML and creating CSS stylesheets to go with it. But without specifying any of these, “learning to code” has absolutely no meaning. There really is no such thing. Learning the fundamentals of good programming, possibly – but a tutorial on how to write a for loop in Python is no such thing (and when I dissect the coding question, that is oddly what it ends up being much of the time).

A Little Rephrasing

Having discussed the problem with “coding,” let’s move forward and try to produce a real answer to those questions at the top.

Instead of thinking of this as some mysterious, wide-ranging skill for making things with a computer, whatever the kind, I will rephrase it as this: creation. Problem solving through creating something that helps you arrive at the solution. We build and create all kinds of things without a computer, without giving it any instructions. We make tools, we write and draw. We give instructions to our subordinates.

Putting the two big pieces together: What is your problem, what is the goal, and what do you need to learn in order to create something that will solve the problem? Leave programming language and hardware requirements out of it. What do you need to get done?

Now this is the tough part, in my experience. You have a problem and a goal. (For example, “I have all this data, and I want to discover patterns in it. I want that displayed as a map.”) But you need to lay out the concrete steps. What exactly are we working with? What do we want to do with it? How on earth are we going to make that happen? Lay it out: step 1, step 2, and so on. How are the steps connected to each other? Annoyed as I was at having to diagram program flow charts in college, I have really appreciated their value since. Make a map of where you have to go in your solution, including all of the stops, in order to reach your goal.
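Those steps can be sketched before choosing any tool. Here is a hypothetical sketch of that data-to-map example – the function names, the “place,value” record format, and the print standing in for a real mapping tool are all invented for illustration:

```python
# Hypothetical sketch: each stop on the "map" of the solution
# becomes one named step. The record format and names are invented.

def load_data(lines):
    """Step 1: parse raw 'place,value' records into (place, value) pairs."""
    rows = []
    for line in lines:
        place, value = line.strip().split(",")
        rows.append((place, float(value)))
    return rows

def find_patterns(rows):
    """Step 2: reduce the data to something displayable --
    here, a total per place."""
    totals = {}
    for place, value in rows:
        totals[place] = totals.get(place, 0.0) + value
    return totals

def display(totals):
    """Step 3: hand the summary to whatever mapping or visualization
    tool you eventually pick; a plain print stands in for it here."""
    for place, total in sorted(totals.items()):
        print(place, total)

display(find_patterns(load_data(["kyoto,2", "tokyo,5", "kyoto,3"])))
```

The point is the decomposition: only after the steps exist do you ask which technology should implement each one.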

Going Forward

From there, I don’t want to say that the “coding” that I think people are referring to is trivial per se, but the really difficult part has already been done once the problem is thought out and possible solutions mapped. Learning to implement the solution is orders of magnitude easier, in so many cases, than coming up with robust, quality solutions and the concrete steps needed to carry them out.

My advice to all of those with “coding” questions is this: to follow the above steps, and then go out and get a human or a book that is specialized in what you’d use to solve the problem. Google is your friend, if you don’t have an all-knowing expert at your disposal (and who does?). You are probably not the first person who has had to implement your solution concretely, using a specific technology. They may not have done it in your field, but think of the problem itself and not whether it’s processing data sets from history or from biology. Get a sense of how people approached it: visualization software? A specific programming language? None of the above?

Then go get the book or that human, and put in the time. Programming – not to mention Web design in these days of CSS (yes, I started in 1996, and never quite internalized anything after table-based layout) – has a learning curve. You are not the first one to encounter it, and you can overcome it too. Go through the examples: type them in and run them, play around with them and modify them, then try to understand why you broke them and how to fix them. Even better are books or very thorough tutorials on the Web (so few and far between) that take you through a project just like the one you’re working on.

How Much?

So, do you need to know how to code? Who knows. You need to be able to do – or hire someone to do – whatever the solution to your problem requires. Unless you are looking at a career in software development, no one is expecting you to become a programmer. Do you need help “learning how to code?” I hereby order you to never use that phrase again. You need help in figuring out which technology works for your problem and some advice about where to go to learn it. Now go, and enjoy this excuse for getting to learn something new and making something exciting!

Coda

Obviously, I have some personal bias here. I have a programming background, have done quite a bit of Web design and implementation (including sites that get their information from databases), and have used about 10 programming languages over the years. I’m not an expert in any of them. And I don’t always know what technology is going to help me solve my problem. What I’ve learned, really, is that I’m going to have more or less of a learning curve, I’m going to do a lot of research figuring out what might help me the most, and I might end up on a forum or emailing or calling someone with what I think is an embarrassingly dumb question. It happens. In fact, I called up my brother with a really stupid question two weeks ago, but if I hadn’t, it would have taken me weeks to find the small mistake that was killing everything. If it’s of any comfort, the first learning curve is immensely harder than any of the ones that come after. Once you first “get it” by making a project work, you will have problem-solving and logic skills that will vastly improve your life.

So go forth and learn some new technologies! And never use the word “coding” again.

using a cable too difficult for reporter

This one is going to be quite off the topic of research – well, maybe it links in with digital humanities. The digital part. The part that uses computer hardware.

But seriously? Writing a whole article about your own incompetence (and ignorance) when it comes to hooking your laptop’s headphone jack up to your stereo? With one cable? When what you should be talking about is the new Thunderbolt connection in MacBook Pros?

Spare us! Please!

But since you won’t, at least indulge me in a response.

some questions for the media

…to which I already know the answers. So don’t worry, I’m not looking for an explanation of the obvious.

I’d simply like to juxtapose some stories to think about.

First up – a kid is arrested in an FBI sting for attempting to detonate a car bomb in Portland, OR. (Note: I am on the side of the FBI in this one. From everything I’ve heard about the story, it seems about as far from entrapment as you can get and still be running a sting.) It’s front page news. Obviously. It should be.

So what isn’t front page news? What did I just have to spend over five minutes digging through CBSnews.com to find, under the “front page news” of the Unabomber’s Montana property going up for sale, and “how to feel sexy while aging” (answer: have lots of sex. not making that up.)?

Guy attempts to set mosque on fire in Corvallis, OR, days after the kid is arrested. Guy is arrested for doing so. He’s in jail. This isn’t even second-page news. This is comb-through-the-site-for-a-few-minutes news. Conduct-a-few-searches-because-I-can’t-find-it (even though it was sent to my cell phone via Google News a few minutes before) kind of news.

Should it be? I think you can guess what my answer would be to that question. I dare not even ask whether it should be covered as “domestic terrorism” in the same way that, say, the same action undertaken by a brown person with an accent would be. If I went and asked that, I’d have to keep asking about our shifting use of “terrorism” and why it never seems to apply to our most bountiful domestic terrorists, white power and violent anti-abortion groups.

Regardless of your views on any of this, wouldn’t it be nice for this kind of article to be a little closer to the top of the page? As opposed to, say, the media’s freak-out about new TSA screening procedures when it turned out that, as reported in the media, absolutely no freak-out actually occurred in reality despite their predictions? Wouldn’t it be nice to have a front-page story about something that happened, in addition to all that stuff that didn’t happen?

Oh well. Let’s move on.

Second story of the day is the continued “deliberation” over whether to repeal Don’t Ask Don’t Tell (DADT). There has been a Pentagon study. Admiral Mullen has repeatedly called for its repeal. Then the heads of the various service branches call for it not to be repealed “while we’re fighting.” (Conveniently for them, it doesn’t look like this will ever not be the case, so they’re kind of off the hook.) Sen. McCain goes into increasingly complex contortions to avoid admitting that a large-scale study has been done that overwhelmingly concludes the policy should be repealed. Other senators waffle. It is endless.

The question that comes up again and again is how active-duty personnel, especially in combat, think it would affect their ability to do their jobs. How will it impact the unit? How will it impact their own effectiveness? Morale?

So here’s a question I would like to hear asked, just once. Even once would be enough.

How do currently serving gay and lesbian personnel feel their effectiveness and morale is influenced by DADT? How would its repeal impact their ability to do their jobs, in combat, where they are already serving? How do the people directly impacted by DADT feel about it currently and how would they feel if it was done away with?

The questions that will never be asked. I’m allowed to dream, aren’t I?

moratoria: “western” edition

Some of you who know me well (academically) will probably not be surprised by this post, but here I go anyway. I just need to vent a little.

I am typing up handwritten notes right now, getting organized. I am typing some words over and over (used by the authors of the things I took notes on, not me): “western,” “european” and their “influence”.

Okay, I am officially calling you out on this, scholars. This, as far as I am concerned, is about as INTELLECTUALLY LAZY as you can get.