Archive for June 12th, 2008

INTERPRETERS should be available to schools to assist parents for whom English is not their first language, secondary school managers have suggested.

They also called for extra resources for schools to allow them to translate and publish enrolment policies in several languages, in a response to the recently published audit of these policies. All inquiring parents must be given a copy of a school’s enrolment policy in an accessible language and must be provided with all the relevant information, according to the Association of Management of Catholic Secondary Schools.

In a submission to Education Minister Batt O’Keeffe, the association said parents must know their entitlements and responsibilities in the process. But they strongly rejected further regulation, which they said would restrict parental choice. They also dismissed the suggestion of a statutory enrolment commissioner. “It would constitute unacceptable interference in the responsibilities and functions of boards of management. It is simply not workable and would generate a raft of additional administrative work for an already overburdened school management.”

Likewise, any attempt to appoint a person to take over the enrolment function is an “unacceptable interference”, the submission added. The INTO, in its submission, argued that the role of the school patron in setting out enrolment guidelines should be enhanced and given statutory recognition with agreed limits. “To have the minister directing a school on enrolment policy or practice is generally not constructive, but it is better to have such a role devolved through the patron to a local mechanism or committee. Such a mechanism could be based on a parish, a town and its hinterland, or cluster arrangements where necessary.”

Source: http://www.independent.ie


The state of Alaska is claiming it does not have to provide voting ballots and other election materials in Yup’ik as well as English to residents of the Bethel area.

State attorneys argued their case Wednesday before U.S. District Court Judge Tim Burgess in Anchorage. The Native American Rights Fund and the American Civil Liberties Union filed a lawsuit last year on behalf of Yup’ik elders and four tribal councils in Western Alaska.

They claim the state and the city of Bethel are violating federal voting laws by failing to provide election materials in Yup’ik. The state responded that Yup’ik is historically an unwritten language and therefore exempt from written translation requirements.

Lawyers for Bethel said the city does a good job of providing translators instead.

Source: http://www.fortmilltimes.com


‘Gospel of Judas’ story criticized for ‘scholarly malpractice’

Prominent scholars have accused the National Geographic Society’s 2006 series of articles on the Gospel of Judas of mistranslation, commercial exploitation, and “scholarly malpractice.” A recent essay in the Chronicle Review asserts that the widely publicized reports of the gospel’s portrayal of a “noble Judas,” including reports from the National Geographic project team itself, have been thoroughly challenged by experts who believe the public has been misled.

On April 6, 2006 the National Geographic Society announced the completed restoration and translation project surrounding the rediscovered apocryphal Gospel of Judas, a second-century text written by a heretical Gnostic sect. A documentary on the gospel aired on April 9, Palm Sunday.

National Geographic’s introductory webpage for the Gospel of Judas summarizes its interpretation of the text:

“The Gospel of Judas gives a different view of the relationship between Jesus and Judas, offering new insights into the disciple who betrayed Jesus. Unlike the accounts in the canonical Gospels of Matthew, Mark, Luke, and John, in which Judas is portrayed as a reviled traitor, this newly discovered Gospel portrays Judas as acting at Jesus’ request when he hands Jesus over to the authorities.”

Since the publication of National Geographic’s interpretation, a heated debate over the magazine’s controversial view has arisen in scholarly circles. Thomas Bartlett described the scholarly criticisms of National Geographic’s interpretation in his essay “The Betrayal of Jesus,” published in the May 30 issue of the Chronicle Review, a publication of the Chronicle of Higher Education.

Bartlett summarized the contents of the Gospel of Judas, in which Bartlett says the character of Judas is more prominent than he is in the canonical New Testament. “He and Jesus discuss theological matters, like the meaning of baptism and whether the human spirit dies. Perhaps the most striking aspect of the text is Jesus himself, who is often laughing, playful, and aggressive and who seems to enjoy mocking his disciples. For those familiar with the Jesus taught in Sunday school, that may come as a jolt,” Bartlett wrote.

According to Bartlett, the text of the Gospel of Judas survives in a roughly 1,700-year-old codex, originally leather-bound and written in Coptic, an ancient Egyptian language. It is thought to have been discovered in a cave by an Egyptian farmer sometime in the 1970s. The codex, which includes other ancient apocryphal writings such as the Letter of Peter to Philip, was purchased by a Cairo antiquities dealer and later spent 16 years in a safe deposit box in Hicksville, New York.

Swiss antiquities dealer Frieda Tchacos Nussberger purchased the manuscript in 2000. In 2004 she reportedly sold the rights to translate and publish the gospel to the National Geographic Society for $1 million.

The codex itself was in poor condition, its fragile and torn pages requiring careful restoration.

To study and restore the codex, National Geographic brought together a panel of experts including Gregor Wurst, a professor of ecclesiastical history and patristics at the University of Augsburg, in Germany; Bart Ehrman, a professor of religious studies at the University of North Carolina at Chapel Hill; Elaine Pagels, a professor of religion at Princeton University; and Marvin Meyer, an expert in Coptic studies.

According to Bartlett, National Geographic’s materials presented Judas in a positive light:

“In an online video clip, Meyer calls the text’s Judas the ‘most insightful and the most loyal of all the disciples.’ In Ehrman’s essay, Judas is ‘Jesus’ closest friend, the one who understood Jesus better than anyone else, who turned Jesus over to the authorities because Jesus wanted him to do so.’ The teaser on the documentary’s DVD case asks, ‘What if this account turned Jesus’ betrayal on its head, and in it the villain became a hero?’”

Bartlett reports that though these interpretations attracted an initial flood of media attention, many scholars now argue that National Geographic’s coverage seriously distorts the text.

April D. DeConick, a professor of biblical studies at Rice University, examined the English translation of the Gospel of Judas on the Internet soon after the National Geographic documentary aired. In her reading, Judas was not handing Jesus over as a friend but rather sacrificing him to a demon god named Saklas.

Translating from the original Coptic the next day, she found what she considered a major error. The National Geographic team had translated one line spoken by the gospel’s Jesus as “O 13th spirit, why do you try so hard?” The word “spirit” was used to render “daimon,” which in other early Christian texts is usually translated as “demon.” The number 13 also signifies the realm of a demon, Ialdabaoth, in the beliefs of the Sethian Gnostic sect thought to have written the Gospel of Judas.

Professor DeConick believes other errors in the translation include a phrase saying that Judas “would ascend to the holy generation” which should have been translated to say he would not “ascend.” Another translated passage said that Judas would be “set apart for the holy generation” where the original said “set apart from the holy generation.”

According to Bartlett, DeConick suggests the translators were overly influenced by St. Irenaeus’ comments on the Gospel of Judas. In his work “Against Heresies” the Church Father wrote that the gospel, which he considered heretical, portrayed Judas as “knowing the truth as no others did.”

In a December 2007 essay in the New York Times, DeConick explained her criticisms, asking, “How could these serious mistakes have been made? Were they genuine errors or was something more deliberate going on? This is the question of the hour, and I do not have a satisfactory answer.”

She suggested that National Geographic’s desire for an exclusive led it to insist on nondisclosure agreements from cooperating scholars, whose work then could not be corrected by their peers.

DeConick also organized a conference on the Gospel of Judas at Rice University, where many attendees were critical of the National Geographic research team. She has expanded her criticisms of the project in her book The Thirteenth Apostle: What the Gospel of Judas Really Says.

Professor Bart Ehrman has defended the National Geographic Society’s actions, saying its nondisclosure agreements were necessary to secure its exclusive rights to the Gospel of Judas story.

Terry D. Garcia, executive vice president for mission programs at National Geographic, also said such agreements were necessary. “The last thing we wanted were multiple voices talking about bits and pieces of this project,” he says. “All that would do was fan speculation and create unsubstantiated claims that might impede the research.”

Garcia attacked the assertions in Professor DeConick’s New York Times essay, calling them “the height of irresponsibility.”

Marvin Meyer, the National Geographic project’s coptologist, said he was bothered by DeConick’s suggestion that some of the translation had been deliberately falsified.  However, he did voice some criticisms of the National Geographic Society’s approach to the Gospel of Judas research.

“We have at times gnashed our teeth to work with them,” Meyer said, according to Bartlett. “We have found things to be highly irregular in terms of how we do things in scholarship.”

In a May 30 press statement, the National Geographic Society responded to Bartlett’s Chronicle Review essay. The statement accused Bartlett of mischaracterizing the “long and painstakingly careful” process of preserving and presenting the codex as a “rushed job.” National Geographic said that its disputed translation choices are “addressed in extensive footnotes in both the popular and critical editions of the gospel” and chastised Bartlett for not mentioning that DeConick’s New York Times essay coincided with the release of her book on the Gospel of Judas.

Speaking with CNA, Bartlett said that he was reluctant to characterize the overall reaction of the academic community to the debate. However, he said he has noted a large response from various Christian blogs and websites. He said some Christians had expressed a “great deal of consternation and concern” about whether the Gospel of Judas would change traditional Christian interpretations of the biblical figure, though many were generally skeptical towards the material presented in the National Geographic project.

He said that Craig A. Evans, an evangelical Christian and professor of New Testament at Acadia Divinity College who was on the Gospel of Judas project, is now “pretty vehement” against the “good Judas” interpretation. According to Bartlett, Evans feels the first translation was “problematic and inaccurate.”

Bartlett also addressed the National Geographic Society’s characterization of his Chronicle Review essay, calling the society’s statement “inaccurate in a number of ways.”

He also provided more detailed responses to the society’s accusations on several academic web logs. One fact that Bartlett pointed out was that, contrary to the claims of National Geographic, he did report that later editions of the Gospel of Judas acknowledge alternate readings of the text and make some corrections to the translation. However, he said, “the best-selling first edition of the book and the television documentary watched by millions do not include these caveats.”

“I understand that National Geographic must be reeling from criticism of its Judas project by biblical scholars. But your sloppy, bewildering response to my article doesn’t help your case,” he wrote.

Source: http://catholicnewsagency.com


Machine translation for the most part targets Internet and technical texts

Like all technology, machine translation (MT) has its limits, says Mike Dillinger, President of the Association for Machine Translation in the Americas and Adjunct Professor, Department of Psychology, San José State University.

At the invitation of the Department of Artificial Intelligence, Mike Dillinger has been giving a course on paraphrasing and text mining at the School of Computing. Dillinger considers that without clean and clear texts machine translation will not work well and that, despite technological advances, there will always be a demand for human translators to render legal or literary texts. Machine translation, he adds, for the most part targets Internet and technical texts. MT’s Internet bias means that web content creators must be trained to ensure their documents are machine translatable.

– As an acclaimed expert in machine translation, how would you define the state of the art in this discipline?

The state of the art is rapidly changing. A far-reaching new approach was introduced fifteen to twenty years ago. At the time we faced a two-sided problem. First, it took a long time and a lot of money to develop the grammatical rules required to analyse the original sentence and the “transfer” or translation rules. Second, it looked to be impossible to manually account for the vast array of words and sentence types in documents.

The new approach uses statistical techniques to identify qualitatively simpler rules. It does this swiftly, automatically and on a massive scale, covering much more of the language. Similar techniques are used to identify terms and their possible translations. These are huge advances! Before, system development was very much a cottage industry; now systems are mass-produced. Today’s research aims to increase the qualitative complexity of the rules to better reflect syntactic structures and aspects of meaning. We are now exploiting the qualitative advances of this approach.

– Machine translation systems have been in use since the 1970s. Is this technology now mature?

If maturity means readiness for use in industrial applications, the answer is definitely yes. MT has been widely used by top-ranking industrial and military institutions for 30 years. The European Community, Ford, SAP, Symantec, the US Armed Forces and many other organizations use MT every day. If maturity means use by the general public to enter a random sentence for translation, I would say, just as definitely, no. Like all technology, machine translation has its limits. You don’t expect a Mercedes to run well in the snow or sand: where it performs best is on a dry, surfaced road. Neither do you expect a Formula 1 car to win a race using ordinary petrol or alcohol; it needs special fuel. Unfortunately, people very often expect a perfect translation of an unclear or error-riddled text. For the time being at least, without clean and correct texts, machine translation will not work properly.

– Do you think society understands MT?

Not at all! It’s something I come across all the time. A lot of people think that “translation” is being able to tell what the author means, even if he or she has not expressed himself or herself clearly and correctly. Therefore, many have great expectations about what a translation system will be able to do. This is why they are always disappointed. On the other hand, those of us working in MT have to make a big effort to get society to better understand what use it is and when it works well: this is the mandate of the association I chair.

– What is MT about? Developing programs, translation systems, computerized translation, manufacturing electronic dictionaries? How exactly would you define this discipline?

MT is concerned with building computerized translation systems. Of course, this includes building electronic dictionaries, grammars, databases of word co-occurrences and other linguistic resources. But it also involves developing automatic translation evaluation processes, input text “cleaning” and analysis processes, and processes for guaranteeing that everything will run smoothly when a 300,000 page translation order arrives. As these are all very different processes and components, MT requires the cooperation of linguists, programmers and engineers.

– What are the stages of the machine translation process?

1. Document preparation. This is arguably the most important stage, because you have to assure that the sentences of each document are understandable and correct.
2. Adaptation of the translation system. Just like a human translator, the machine translation system needs information about all the words it will come across in the documents. It can be taught new words through a process known as customization.
3. Document translation. Each document format, like Word, PDF or HTML, has many features apart from the sentences that actually have to be translated. This stage separates the content from the wrapping, as it were.
4. Translation verification. Quality control is very important for human and machine translators. Neither words nor sentences have just one meaning, and they are very easy to misinterpret.
5. Document distribution. This stage is more complex than is generally thought. When you receive 10,000 documents to be translated into 10 different languages, checking that they were all translated, putting them all in the right order without mixing up languages, and so on, takes a lot of organizing.
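The five stages above can be sketched as a simple pipeline. This is purely illustrative: every function and field name here is invented for the example, not taken from any real MT system.

```python
# Hypothetical sketch of the five-stage MT pipeline described above.
# All names here are illustrative, not a real MT library's API.

def translate_documents(documents, target_language):
    results = []
    for doc in documents:
        # 1. Document preparation: keep only clean, non-empty sentences.
        cleaned = [s.strip() for s in doc["sentences"] if s.strip()]

        # 2. System adaptation: collect unknown words for customization.
        known = {"the", "part", "description"}          # stand-in lexicon
        unknown = {w for s in cleaned for w in s.lower().split()} - known

        # 3. Document translation: translate content, format kept aside.
        translated = [fake_translate(s, target_language) for s in cleaned]

        # 4. Translation verification: flag sentences needing review.
        flagged = [s for s in translated if "[?]" in s]

        # 5. Document distribution: tag each output with its language
        #    so thousands of outputs can be routed without mix-ups.
        results.append({"language": target_language,
                        "sentences": translated,
                        "new_words": sorted(unknown),
                        "needs_review": flagged})
    return results

def fake_translate(sentence, lang):
    # Placeholder for a real MT engine call.
    return f"[{lang}] {sentence}"
```

The point of the sketch is that translation proper is only one of five steps; preparation, customization, verification and distribution surround it.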

– Is this technology a threat to human translators? Do you really think it creates jobs?

It is not a threat at all! MT takes the most routine work out of translators’ hands so that they can apply their expertise to more difficult tasks. We will always need human translators for more complex texts, such as legal and literary texts. MT today is mostly applied to situations where there is no human participation. It would be cruel even to have people translate e-mails, chats, SMS messages and random web pages. The required text throughput and translation speed is so huge that it would be excruciating for a human being. It is a question of scale: an average human translator translates from 8 to 10 pages per day, whereas, on the web scale, 8 to 10 pages per second would be an extremely low rate. The adoption of new technologies, especially in a global economy, seldom boosts job creation. What it does do is open up an increasingly clear divide between low-skilled routine jobs and specialized occupations.

– Is the deployment of this technology a technical or social problem?

First and foremost it is a social engineering problem because people have to change their behaviour and the way they see things. The MT process reproduces exactly the same stages as human translation, except for two key differences:
a) In translation systems, you have to be very, very careful about the wording. Human translators apply their technical knowledge (if any) to make up for incorrect wording, but machine translation systems have no such knowledge: they reproduce all too faithfully the mistakes in the original/source text. It is hard to get them to translate more accurately, but there are now extremely helpful automatic checking tools. Symantec is a recent example that uses an automatic checker and a translation system to achieve extremely fast and very good results.
b) Translation systems have to handle a lot of translated documents. What happens if an organization receives 5,000 instead of the customary 50 translated documents per week? Automating the translation process ends up uncovering problems with other parts of document handling.

– You mentioned the British National Corpus that includes a cross section of texts that are representative of the English language. It contains 15 million different terms, whereas machine translation dictionaries only contain 300,000 entries. How can this barrier to the construction of an acceptable MT system for society be overcome?

This collection of over 100 million English words is a good mirror of macro language features. One is that we use a vast number of words. However, word frequency is extremely variable: of the 15 million terms, 70% are seldom used! To overcome this variability in vocabulary usage, we now use the most common words to create a core system to which 5,000 to 10,000 customer-specific words are added. This is reasonably successful. For web applications, however, this simply does not work. Even the best systems are missing literally millions of words, and new words are invented every day. At least three remedies are applied at present: ask the user to “try again”, ask the user to enter a synonym, and automatically or semi-automatically build synonym databases. As I see it, we will have to develop web content author guidance systems, such as have already been developed for technical documents. There are strong economic arguments for moving in that direction.
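The coverage problem Dillinger describes, that a small core vocabulary covers most running text while a long tail of rare words remains uncovered, can be illustrated with a quick frequency count. The corpus below is a toy stand-in, not the British National Corpus.

```python
from collections import Counter

# Toy corpus standing in for a large text collection.
text = ("the system translates the text and the system checks the text "
        "while rare words like zeugma or petrichor appear only once")
counts = Counter(text.split())

# Rank word frequencies and measure what fraction of running text
# the top-N most frequent words cover.
total = sum(counts.values())
ranked = [n for _, n in counts.most_common()]

def coverage(top_n):
    return sum(ranked[:top_n]) / total

# Even in this tiny sample, a handful of common words covers a large
# share of the text, while most distinct words occur exactly once.
print(coverage(3), coverage(len(ranked)))
```

On real corpora the effect is far more extreme, which is why a core system plus a few thousand customer-specific terms works for technical documents but breaks down on the open web.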

– The Association for Machine Translation in the Americas, which you chair, is organizing the AMTA 2008 conference to be held in Hawaii next October. What innovations does the conference have in store?

There is always something! Come and see! One difference this year is that several groups are holding conferences together. AMTA, the International Workshop of Spoken Language Translation (IWSLT), a workshop by the US government agency NIST on how to evaluate translation evaluation methods, a Localization Industry Standards Association meeting attracting representatives from large corporations, and a group of Empirical Methods in Natural Language Processing (EMNLP) researchers will all be at the same hotel in the same week. Finally, as it is to be held in Hawaii, our colleagues from Asia will be there to add an even more international edge. For more information, see the conference web site.

Source: http://www.innovations-report.de


RFID and AIDC News: With Globalization Comes New Challenges in Label Printing

When it comes to globalization, many challenges are well understood: supplier selection, logistics and so on. However, there is one important area that has not received much attention – international label printing.

Whether for import or export, a global supply chain world has two key labeling challenges:

  • Translating English (or whatever the local language is) into the appropriate words in the targeted language; and
  • Printing the characters of these different languages, especially for non-Roman alphabets such as the Asian languages and Arabic.

There is no magic bullet for the translation. In general, says Mark Fralick, SCDigest’s Technology Editor, you need to set up database tables that have the right translation for each language or location you need to support.

For example, “Part Description” may need to be translated into multiple languages to support different international customers. Even within the English language, there may be differences in terminology. For example, an automobile “trunk” in the US is called a “boot” in the UK.

While there are hard-coded ways of handling this issue, Fralick says the use of database-driven retrieval methods is the best practice.

“I would use the ISO code for each destination country; then select the local language terminology or translation for each field on the label based on that,” Fralick said. This means a company needs to maintain a database of term usage and translations for each country it needs to support with customized label printing.
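A minimal sketch of the database-driven lookup Fralick describes, keyed by ISO country code. The table contents and function name are invented for illustration; a real system would hold the same rows in a database rather than an in-memory dictionary.

```python
# Per-country label terminology lookup, keyed by ISO country code.
# The table contents are illustrative, not real product data.

TRANSLATIONS = {
    # (iso_country_code, field) -> localized label text
    ("US", "part_description"): "Part Description",
    ("GB", "part_description"): "Part Description",
    ("DE", "part_description"): "Artikelbeschreibung",
    ("US", "trunk"): "Trunk",
    ("GB", "trunk"): "Boot",
}

def label_field(country, field, default=None):
    """Return the localized text for a label field, falling back to an
    explicit default (or the raw field name) when no row exists."""
    return TRANSLATIONS.get((country, field), default or field)

print(label_field("GB", "trunk"))              # UK terminology
print(label_field("FR", "part_description"))   # falls back to field name
```

The fallback matters in practice: a missing translation should produce a readable (if unlocalized) label rather than a failed print job.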

So that general approach addresses the translation side of the problem, but does not fully address the challenges of printing non-English characters.

“Keep in mind also that not every language can be represented in the character set most commonly used by the US and Western Europe, so understand if you will have the additional requirement of these sorts of character encodings, and make sure your database can handle it,” Fralick added.

One approach, says Fralick, is to think of these local fields as image strings, as opposed to character strings – meaning Mandarin characters could be stored in the database as GIFs and sent to the printer that way. But that may require even more database maintenance, and would result in relatively slow “first out” label printing, as the printer would need to process what could be a significant number of images, rather than printing the characters directly from internal fonts, as it does when printing English.

Unicode to the Rescue?

A standards-based approach to global character printing, Unicode, solves many of the issues around international character sets, says Hugh Gagnier, Sr. Vice President of Business Development and Operations for Zebra Technologies, a manufacturer of thermal printers.

“Unicode offers a scalable, flexible approach to globalization that is saving our customers and our distributors a lot of time and money,” Gagnier recently told Supply Chain Digest.

Unicode is a global standard for representing virtually any character in any language – perhaps as many as 100,000 in all. Unicode provides a unique number for every character, no matter what the platform, no matter what the program, no matter what the language. It can be thought of as a global version of the “ASCII” character set used in western languages to represent characters on computers.

So, with a Unicode-compatible printer, Unicode characters are sent to the machine, and it translates the Unicode representations into the appropriate global characters.

So, a company would still use the database approach described above, but would store those translations as Unicode numbers, not images or other ways of dealing with the international characters.

This has a number of advantages. It speeds printing, because the characters are accessed as native fonts in the printer. It makes it easy to print multiple character sets on one label, which was often difficult with traditional approaches. It eliminates the need to download or license specific local fonts, as many companies have done in an attempt to solve the problem. It eliminates the need to do that repeatedly as support is required for new countries/languages. If the printer supports Unicode, it means a company – or reseller – can maintain a single pool of inventory that will work anywhere in the world and support shipping to any country.
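The core property described above, one number per character regardless of platform or language, can be seen directly in Python, where `ord` returns a character's Unicode code point. The Japanese string below is only an illustrative stand-in for localized label text.

```python
# Each character has a single Unicode code point, independent of
# platform, program, or language -- the property the article describes.
for ch in "Aßæ漢字":
    print(f"U+{ord(ch):04X}  {ch}")

# A Unicode-capable printer receives these numbers and renders the
# right glyphs from its own internal fonts; the label data stays as
# plain text in the database, not as images.
label_text = "部品説明"                  # illustrative Japanese label text
encoded = label_text.encode("utf-8")    # the bytes actually transmitted
assert encoded.decode("utf-8") == label_text
```

Storing translations as Unicode text rather than images is what makes the single-database, single-printer-pool model in the article possible.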

“This has been a huge benefit for contract manufacturers like Solectron or Jabil, since they can deploy the same printer anywhere or take on new labeling and shipping requirements very rapidly,” Gagnier said.

The same holds true for printer resellers, which can now maintain a single stock of printers that can be quickly localized, rather than in some cases having different printers that have customized fonts installed. With Unicode, a user or reseller easily activates a specific language set that needs supporting.

Despite all these advantages, the concept of Unicode label printing is still somewhat new, even though the overall Unicode standard, which has many other uses besides label printing, has been around for some time. (More general information can be found at the Unicode Consortium web site.)

But it is starting to gain traction quickly, fueled by the rapid growth of the global supply chain.

Zebra added full Unicode support to its thermal printers in the last few years, even to some of its small desktop printers, and is starting to see rapid adoption by users and resellers, Gagnier said. In addition, many label design packages are now offering Unicode support as well.

“It’s just a lot better to solve this issue inside the printer than outside the printer,” Gagnier added.

Source: http://www.scdigest.com


As a German SEO consultant, I have worked with UK and US SEO companies and other clients on many internationalisation and localisation projects in recent months. In many cases, the international sites we tried to optimise failed to compete with even much smaller local competitors, and the SEO measures undertaken were far from sufficient due to the structural limitations of these projects.

Thus I want to introduce the 10 most common fatal localisation mistakes English-language sites make when entering other markets.

  1. No local domain. Using internationalcompany.com with no local domain like .fr for France, .de for Germany or .pl for Poland means everybody will link to the .com domain, and the non-English-speaking audience will bounce off it before finding the small flag in the top right corner. In the meantime a domain grabber will make big bucks off your brand.
  2. Translating before doing local market research. Ever tried selling beef in India? Or freedom fries in France? Not all mistakes are that easy to spot. Nonetheless most companies just translate their sites without even taking a look at what a new market demands.
  3. No local server. You need a German server to rank high in Google for Germany. The difference is substantial.
  4. Translation full of grammatical and spelling errors. I’m astounded how many business sites fail at that and how bad. Nobody will trust you if you can’t even spell correctly trying to sell something. Hire a translator who is a native speaker of the language you want to localize to and actually lives there not someone living next door.
  5. Setting up a completely new domain for a new country days before you enter the market. You should register the most common international domains months or years before you enter those markets. A domain might already be gone later, and you risk ending up in the Google sandbox, not being acknowledged as an authority and thus not ranking.
  6. Being far too late to the market. I’m still amazed by companies which need months or years to offer in Germany a product or service first offered in the US. Why give away 100 million German-speaking potential customers to copycats and local businesses? Coming too late (like Facebook in Germany or eBay in Poland) means you will probably never be the market leader.
  7. Not having a local address or representation. With the rise of local search and a plethora of local websites and services that replaced directories you won’t even get a link without a proper address.
  8. Not offering payment via PayPal or other locally accepted or widespread payment methods. Unlike in the US, for example, people in Germany don’t use credit cards much.
  9. Broken character sets. Recently I joined several ad networks and affiliate networks, and many of the translated sites had broken German “Umlauts”. In most cases I will leave such a site.
  10. No local blog. If you do not have a “company interface” in the local language you won’t reach the public. You will rely solely on search engine traffic, but for the reasons above you won’t get it for a while. No useful localised content means no local links, and without local links you won’t rank, even as an authority domain.
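The broken German umlauts mentioned in point 9 are typically a mojibake symptom: UTF-8 bytes decoded as Latin-1 somewhere in the pipeline. A short illustration of how the damage arises, and the round-trip repair that works when (and only when) the mis-decoding is known:

```python
# Classic mojibake: UTF-8 bytes mistakenly decoded as Latin-1.
original = "Größe"                    # German word with umlaut and eszett
garbled = original.encode("utf-8").decode("latin-1")
print(garbled)                        # the familiar "Ã¶"-style garbage

# If you know exactly which wrong decoding happened, it is reversible:
repaired = garbled.encode("latin-1").decode("utf-8")
assert repaired == original
```

The practical lesson for localized sites is simply to declare and use UTF-8 consistently end to end, so the repair step is never needed.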

Source: http://blog.seoptimise.com


The Jordanian embassy in Tel Aviv has created a web site in Hebrew, for the first time since the Hashemite Kingdom signed a peace agreement with Israel in 1994.

“The purpose of the web site is to address Israeli public opinion directly, in an effort to present the way Jordan sees the situation in the region,” Ambassador Ali al-Ayid told Haaretz on Tuesday.  This is the only official Arab web site in Hebrew that is in regular operation.

The web site, http://www.jordanembassytelaviv.gov.jo, is administered by embassy staff and contains a wide variety of information. Among the documents available on the site is a full translation into Hebrew of the Arab Peace Initiative as it was approved by the Arab League Summit in Beirut in 2002. The site also contains the texts of all agreements between the two countries, including the 1994 peace treaty.

In addition, the site offers important information for the Israeli visitor to Jordan – both tourist information and, for those who wish to do business in the kingdom, economic information, such as Jordan’s economic agreements with Israel. And, as is fitting for a monarchy, one can also find information on the royal family.

Source: http://www.haaretz.com


A lucky novelist will discover a bonanza in little-known IMPAC prize
Per Petterson, last year’s winner of the IMPAC prize.

Sometime tomorrow, one lucky novelist will get the most famous literary prize that no one’s ever heard of.

That’s an exaggeration, but it’s safe to say that many Americans are completely unaware of the International IMPAC Dublin Literary Award, even though it hands out more money than any other annual fiction prize in the world. Tomorrow’s winner will get 100,000 Euros, or roughly $157,000.

There are natural reasons why the IMPAC award is less visible than such major American literary awards as the Pulitzer Prize and National Book Award or the British Man Booker Prize.

For one thing, the IMPAC is centered in Dublin, which is not a major publishing center. For another, it is only 13 years old, so it’s still building its track record.

Americans may be particularly ignorant of the IMPAC because only one U.S. author has won it — Edward P. Jones in 2005 for his slavery saga, “The Known World,” which also took the Pulitzer Prize and the National Book Critics Circle Award.

While most American awards are limited to U.S. authors and the Booker prize deals with writers from the British Commonwealth nations and Ireland, the IMPAC encompasses works from anywhere in the world, as long as they’ve been translated into English.

In this year’s short list of eight novels, for instance, the authors are from Algeria, Australia, France, Ireland, Israel, Lebanon, Spain and Sri Lanka.

And unlike other major literary competitions, the nominations for the IMPAC come from libraries all over the world. Cathy McKenna, senior librarian at the Dublin City Library and IMPAC award administrator, said 122 libraries from around the globe, including 24 in the United States, made nominations this year.

The process generates an intriguing mix of well-known novels and lesser-known local favorites, she said. And in many cases, the winning books are nominated by libraries from outside the author’s nation.

When Irish novelist Colm Toibin won the IMPAC in 2006 for “The Master,” his fictional treatment of the life of Henry James, one of the nominating libraries was the Public Library of Cincinnati and Hamilton County.

Joan Luebering, the manager of the Loveland Branch Library in that system, was the person who passionately pushed for “The Master” and remembers how thrilled she was when it was chosen for the award.

“I think it’s hard for people to find things that are out of the mainstream,” Ms. Luebering said, “especially since there’s so much information out there, so I think the role of librarians of the future will be to help people find the information they need from the flood.”

After the library nominations flow in during the autumn before the prize year, an international panel of authors and academics chooses the short list of up to 10 books the following spring and then selects the winner in June.

One of this year’s judges, Nigerian novelist Helon Habila, said the IMPAC sometimes offers the chance to lift up lesser-known novelists who write just as well as their more famous counterparts. This year, for instance, several libraries had nominated Cormac McCarthy’s apocalyptic novel, “The Road,” which already had won the Pulitzer Prize in America. (A portion of the movie adaptation was filmed in the Pittsburgh region this year.)

“One of the thoughts in my mind,” Mr. Habila said, “was that he might not need this award as much as a younger writer. It won’t have as big an impact on his career.”

The IMPAC award is named for an international company based in Florida, and therein lies part of its history.

In 1992, Dublin Lord Mayor Gay Mitchell wanted to start an international fiction prize that would call attention to the rich literary history of the city that spawned James Joyce, George Bernard Shaw and William Butler Yeats.

Coincidentally, James Irwin, an Irish-American who is chairman of IMPAC, a productivity improvement firm whose European headquarters is in Dublin, “had in his mind that he wanted to do something about books, but he wasn’t sure what,” said his personal assistant, Christopher Houghton.

The son of a New York City fire battalion chief, Mr. Irwin says in his official biography that both his parents were avid readers and encouraged him to do the same. They told him “the sky was my limit, and if there was such a thing as a limit, books were the friendly doorways to the sky.”

When he found out about Mr. Mitchell’s campaign, Mr. Irwin decided to endow the contest with enough money so that it could award 100,000 Euros each year, easily making it the most lucrative fiction award in the world.

Big prize money, however, has not sparked higher visibility for the award. One factor could be that Americans are simply not reading novels and short stories as much as they used to.

A poll done last year by The Associated Press and Ipsos showed that 1 in 4 Americans had not read a single book in the previous year. In 2004, a study titled “Reading at Risk,” by the National Endowment for the Arts, showed that fewer than half of Americans said they ever read literature.

If it is daunting to get people to read literary fiction by American authors, it can be all the tougher to persuade them to pick up a book by a foreign writer.

And that is a shame, said Mr. Habila, the Nigerian novelist who teaches at George Mason University in Fairfax, Va., because “nothing introduces you to a people’s culture and sensibilities better than literature does.

“The news can give you a kind of one-sided view of people in the world,” he said, “but literature offers a deeper view.”

Although it flies under the radar of many Americans, the IMPAC is well known by now in the publishing industry, and winning the prize can provide a significant boost to an author’s fortunes, industry experts say.

Last year’s victor, Norwegian author Per Petterson, who won for his father-son novel, “Out Stealing Horses,” experienced an immediate surge in requests from foreign publishers, said his agent, Froydis Kristiansen Jorve. The book has now been published in more than 30 languages, she said.

Beyond its benefit to authors, supporters of the IMPAC prize say they hope it will help convince people of the power of fiction to promote understanding and closer bonds among people throughout the world.

“A lot of the big publishing houses feel there’s not a readership for literature in translation, and people are intimidated by it,” said Jill Schoolman, a founder of Archipelago Books, a small press in Brooklyn, N.Y., that specializes in translating foreign fiction.

“But if you talk to most good readers, I don’t think they’re afraid of translations at all. They just want to read a good book.”

Source: http://www.post-gazette.com


Canada Offers an Apology for Native Students’ Abuse

The government of Canada formally apologized on Wednesday to Native Canadians for forcing about 150,000 native children into government-financed residential schools where many suffered physical and sexual abuse. The system of schools, which began shutting down in the 1970s, after decades of operations, was dedicated to eradicating the languages, traditions and cultural practices of Native Canadians and has been linked to the widespread incidence of alcoholism, suicide and family violence in many native communities.

“The treatment of children in Indian residential schools is a sad chapter in our history,” Stephen Harper, the prime minister of Canada, said in a speech in the House of Commons, where a small group of former students and native leaders sat in front of him. “Today, we recognize that this policy of assimilation was wrong, has caused great harm and has no place in our country.”

An apology from the prime minister had been sought by native groups for years and was part of a broad, court-sanctioned settlement with the government and the church organizations that operated the schools. The federal government also agreed to pay 1.9 billion Canadian dollars (about $1.85 billion) to surviving students and to establish a truth and reconciliation commission to document the experiences of children who attended the schools.

Harry S. LaForme, a Mississauga Indian and a justice of the Ontario Court of Appeal who will oversee the commission, said the schools program was responsible for making the relationship between native people and other Canadians “so unworkable, so filled with mistrust.”

“The policy of the Canadian residential schools wasn’t to educate Indian children,” he said in an interview. “It was to kill the Indian in the child, it was to erase the culture of Indian people from the fabric of Canada.”

In a rare break with parliamentary tradition, several native leaders were allowed to speak from the floor of the House of Commons. Some spoke in their native languages. All praised Mr. Harper for offering the apology, though native groups remain at odds with the government on several issues, including spending on native communities.

“The memories of residential schools sometimes cuts like merciless knives at our souls,” Phil Fontaine, the national chief of the Assembly of First Nations, the national association of native groups, told the House of Commons. He wore a ceremonial feathered headdress. “Never again will this House consider us ‘the Indian problem’ just for being who we are,” he said.

In 1990, Mr. Fontaine, an Ojibway, became one of the first native leaders to disclose that he had been sexually abused while attending the Fort Alexander Indian Residential School in Manitoba. The federal government has admitted that sexual and physical abuse in the schools was widespread. In his speech, Mr. Harper acknowledged that “while some former students have spoken positively about their experiences at residential schools, these stories are far overshadowed by tragic accounts of the emotional, physical and sexual abuse and neglect of helpless children.”

Attendance at residential schools was made mandatory by the government in 1920 for native children between the ages of 7 and 16 as part of a program it called “aggressive assimilation.” Children were forced to leave their parents and were harshly punished for speaking their own languages or practicing their religions. All but a small number of the approximately 130 schools were run by Christian denominations that operated them as missionary schools, some as far back as the 19th century. Those denominations were the Anglican, United, Roman Catholic and Presbyterian Churches.

Although the history of the program has been reviewed by various government commissions and courts, many details are still unknown, including the number of children who died from abuse or neglect. The commission run by Justice LaForme will have access to previously closed church and government archives to fill in some of those blanks. The commission also plans to hold hearings around the country to question former students and others familiar with the operation of the schools.

Mr. Harper and many fellow members of the Conservative Party initially resisted offering an apology, suggesting that it would be applying current cultural values to the past. Mr. Fontaine said in an interview that he believed that Mr. Harper changed his mind after the government of Australia formally apologized to its aboriginal people earlier this year for its policy of forced assimilation.

Source: http://www.nytimes.com


The Cultural Services of the French Embassy are pleased to announce that Linda Coverdale’s translation of Ravel by Jean Echenoz (The New Press) won the French-American Foundation and Florence Gould Foundation’s 21st annual $10,000 translation prize for a fictional work published in 2007.

The translation and publication of Ravel was supported by “French Voices,” a translation grant program run by FACE (the French-American Cultural Exchange) and the Book Office of the Cultural Services of the French Embassy. “French Voices” aims to make 30 contemporary French fiction and non-fiction books available in translation to American readers by the end of 2008, by helping to offset the risks involved in bringing translated texts to the American marketplace.

The publishers of selected books receive $6,000 to cover translation costs and, if necessary, help identifying an American publishing house to distribute the book in the United States. In addition, translators can apply for a 6- to 8-week residency at the Villa Gillet in Lyon, France, so as to meet with the work’s French author while being immersed in the French language. Each “French Voices” work receives the patronage of a major American writer, either in the form of a preface or through his or her involvement in a promotional event.

A committee of French and American professionals meets several times a year to read and discuss the works submitted for consideration by “French Voices.” Books must not have been published before 2000, and those embodying new trends in fiction or under-represented perspectives or points of view are especially sought. The committee is headed by a representative from the Book Office of the French Embassy’s Cultural Services and comprises representatives from the Translation Committee of PEN American Center as well as writers, translators, university professors, literary agents, editors, and booksellers. Committee members include:

Esther ALLEN (Columbia, Center for Literary Translation)

Olivier COHEN (L’Olivier)

Emmanuelle ERTEL (translator, NYU)

Stéphane GERSON (NYU, Institute of French Studies)

Siri HUSTVEDT (writer)

A. KAISER (translator)

Suzanna LEA (agent)

Elisabeth PEELLAERT (translator)

Jeannette SEAVER (Arcade Publishing)

Frédéric VIGUIER (NYU, Institute of French Studies)

Alyson WATERS (translator, Yale)

Among the titles selected by the committee and already published are: Ravel by Jean Echenoz (with a preface by Adam Gopnik), Kick the Animal Out by Véronique Ovaldé (with a preface by Siri Hustvedt), and How to talk about books you haven’t read? by Pierre Bayard (with a preface by Francine Prose). Selected books that have not yet been published include Aux États-Unis d’Afrique by Abdourahman A. Waberi (with a preface by Percival Everett), Cousine K by Yasmina Khadra, Origines by Amin Maalouf, Le bébé by Marie Darrieussecq, Écrits politiques by Maurice Blanchot, La Vie extérieure by Annie Ernaux, and L’Intérieur de la nuit by Leonora Miano. Please see below for the full list of 2006-2007 grantees.

Source: http://www.frenchculture.org/

