Feb 04 2012
 

As a typesetting and data conversion service provider, we sell on page price. No matter how much we spend on R&D and how well we cope with difficult content or authors, in the end it’s the unit price that matters. If a customer’s procurement department only compares page prices, they will try to replace us with commodity service providers whenever possible. But that only works well for standardized processes or less complicated manuscripts/authors. As a consequence, the relatively easy, high-volume production gets offshored while the so-called boutique production remains at our shop (because some authors or editors insist that their complicated material be treated appropriately). Our average page price then rises even further, because there is less cheap off-the-shelf production in the mix. And so the procurement people who only compare page prices seek to squeeze us out.

What can be done about it? A couple of things.

  • Establish a low-cost production line and rigidly define what is included in the price and what is not. The drawback: the surcharges that you will inevitably have to add still contribute to the average cost per page, making you look even more expensive. But at least the reasons become much more transparent.
  • Raise awareness that there are different production categories. The customer might already distinguish such categories; convince them that “the book is typeset in a standard layout” or “the author used a template” is not a sufficient criterion on its own.
  • Emphasize the role of the intake report. Establish automated tools for checking adherence to templates, image profiles, etc. (a minimal example of such a check is sketched after this list).
  • Establish a Web-based frill counter where you document the production editor’s or author’s special requests. For titles that are supposed to go through a standard workflow, ask a customer’s representative to approve every single deviating requirement.
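
As an aside on the automated intake checks mentioned above: here is a minimal sketch of what such a check could look like, assuming the manuscripts arrive as flat XML with @style attributes on paragraphs. The element names, attribute layout and the approved style list are assumptions for illustration, not a description of our actual tooling:

<!-- check-styles.xsl: report paragraphs whose style is not part of the template.
     Element names, attribute names and the approved style list are illustrative. -->
<xsl:stylesheet version="2.0"
  xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
  xmlns:xs="http://www.w3.org/2001/XMLSchema"
  exclude-result-prefixes="xs">

  <!-- Paragraph styles permitted by the template; anything else gets reported -->
  <xsl:variable name="approved-styles" as="xs:string*"
    select="('Heading1', 'Heading2', 'BodyText', 'Quote', 'Caption')"/>

  <xsl:template match="/">
    <report>
      <!-- also catches paragraphs without any @style at all -->
      <xsl:for-each select="//p[not(@style = $approved-styles)]">
        <violation style="{@style}">
          <xsl:value-of select="substring(normalize-space(.), 1, 60)"/>
        </violation>
      </xsl:for-each>
    </report>
  </xsl:template>

</xsl:stylesheet>

The same pattern extends to other intake checks; the point is that the report gets generated rather than compiled by hand.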

These measures will assist customers in improving their own processes and, hopefully, in moving away from underparameterized per-page pricing expectations. Per-page pricing isn’t going to work in the long run anyway, since more and more publications will be unpaginated. However, moving from page to kilobyte metrics won’t be a solution either.

To answer the initial question: a boutique shouldn’t necessarily sell commodities. However, flat, per-page price comparisons might suggest diluting boutique production prices with commodity prices.

Another reason for us as a boutique to keep standardized, high-volume production: we deliver not only boutique production, but also boutique workflow consultancy and automation. In order to fully understand high-volume production requirements, we have to do it ourselves. Therefore, we as a boutique also strive to sell commodities, against all odds.

Apr 25 2011
 

XSLT/XPath 2.0/3.0 are powerful technologies. But sometimes they’ll drive you nuts. A large share of issues falls into the category of “why doesn’t my template match?” The reasons are manifold. Here are some typical traps:

  • not including the namespace prefix (see the sketch after this list)
  • typo in the namespace URI
  • processing the document in a different mode than the template is supposed to match in. A very subtle example: accidentally closing the xsl:template start tag, cutting off the existing mode declaration – this actually happened to me:
<xsl:template match="HyperlinkTextSource[…]">
    mode="idml2xml:ConsolidateParagraphStyleRanges-remove-empty" priority="4">
  • other typos (in predicates, element names, modes)
  • skipping intermediate elements, e.g., formulating predicates for <td>s when the template is supposed to match <tr>
  • other templates have higher priority
  • import precedence: the template lives in an imported stylesheet, and a matching template of whatever priority exists in the importing stylesheet or in a stylesheet imported after the one that contains the template in question
  • logical misconceptions in the predicates
  • if the template is supposed to match a result of a previous transformation: the output of the previous transformation is not as expected
  • processing some surrounding element with xsl:copy-of instead of xsl:apply-templates
  • not looking at the actual output document, or looking at another part of the document, while your template indeed matched (thanks, @fbuehring)
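
To make the first two traps concrete, here is a minimal, made-up sketch (document, namespace URI and element names are invented for illustration): the source uses a default namespace, so an unprefixed match pattern silently matches nothing.

<!-- Source document: note the default namespace -->
<book xmlns="http://example.com/ns">
  <para>Hello world</para>
</book>

<!-- Never fires: an unprefixed "para" only matches elements in no namespace -->
<xsl:template match="para">
  <p><xsl:apply-templates/></p>
</xsl:template>

<!-- Works: bind a prefix to the document's default namespace URI and use it -->
<xsl:stylesheet version="2.0"
  xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
  xmlns:ex="http://example.com/ns">
  <xsl:template match="ex:para">
    <p><xsl:apply-templates/></p>
  </xsl:template>
</xsl:stylesheet>

In XSLT 2.0 you can alternatively set xpath-default-namespace="http://example.com/ns" on xsl:stylesheet and keep the unprefixed pattern. A typo in the URI (the second trap) fails in exactly the same silent way, which is what makes both so hard to spot.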

My tactics for debugging these cases include

Continue reading »

Feb 01 2011
 

BREAKING: In #iOS 6/#MacOS 11, Web access only thru proxy.apple.com. Analyst: Web pages bypassing IAP a threat to revenue model

With all the recent fuss about Apple’s In-App Purchase (IAP) API enforcement, the train toward Web apps will pick up even more speed.

Apple themselves are constantly improving Mobile Safari: it supports most of the events that native iOS apps support, it is fast enough, and you can already imitate a native app’s user experience quite well. Plus: the Web has become really interoperable. You can get almost the same user experience with any iOS browser as with any Android browser, without the need to develop for different hardware/software platforms.

German weekly Die Zeit has already drawn the consequences and abandoned the idea of native apps for its main content.

So, in view of the hefty 30% app tax, will players such as Sony or Amazon offer their libraries as pure Web sites?

Continue reading »

Dec 21 2010
 

…in bullshitty publishing industry insiders’ insights.

When I saw this term for the second time today, on page 6 of the flashy “ebook” at zmags (a warm greeting to the surprisingly numerous iPad users who read this blog), I thought it might be worth analyzing whether I had been missing something in past years or whether this is a new fad in publishing industry analyst lingo. The latter is true, obviously:

Period        until Q1/2001   Q2/2001–2006   2007   2008   2009   2010
Occurrences               0             33     63    102    206    968

[Figure: the table above as a hockey stick diagram, with its inflection point in 2009]

A hockey stick – that’s what every vertical’s analyst strives for

Continue reading »

Dec 05 2010
 

As a proud, although spec-wise so far inactive, IDPF member, I stumbled across the first editor’s draft of EPUB 3.0. Many features in the current early draft are important for educational content, such as annotations or MathML 3. I think that, besides the other important roles it will play, EPUB 3.0 has the potential to become the dominant school textbook format. The primary reasons, in my view, are:

  • HW/SW vendor neutrality, open standards:
    • A broader installed base of reading systems means economies of scale for the textbook vendor
    • Neutrality is especially important for markets where public procurement and/or public curriculum definition dominates, as in the German school textbook market
  • EPUB’s design metaprinciple of embracing and packaging mainstream user agent technologies: a promise that interactive applications may be developed more cost-effectively than traditional learning applications

Continue reading »

Nov 17 2010
 

Suppose you imported XML data into an InDesign document. Suppose that the layout should convey the markup’s semantics: keywords in italics, proper names in small caps, block quotes indented, etc.

There are several ways to map the markup to the layout (a common one is sketched below). But it is important to know: no matter how you’ve mapped it, once the mapping has taken place, markup and layout information may evolve in totally different directions. And this is dangerous. For real-world XML document types and real-world typesetting, mapping is a one-way street, leading from markup to layout, and not the other way round, as we’ll see later. If you trust that, after author corrections have been carried out, everything that looks like a keyword will still be a keyword in the exported XML, you may be proven wrong. Or the two paragraphs that you see in InDesign may still be a single one in the XML, because the paragraph has been split only visually after import and the markup hasn’t been updated accordingly.
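
One common way to establish the initial mapping – sketched here with made-up element and style names, not a description of a particular project – is to decorate the XML with InDesign’s aid:pstyle/aid:cstyle attributes in an XSLT pass before import, so that InDesign assigns the corresponding paragraph and character styles:

<!-- add-aid-styles.xsl: decorate semantic markup with InDesign style hints before
     XML import. Element and style names are illustrative only. -->
<xsl:stylesheet version="2.0"
  xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
  xmlns:aid="http://ns.adobe.com/AdobeInDesign/4.0/">

  <!-- identity template: copy everything that has no special mapping -->
  <xsl:template match="@* | node()">
    <xsl:copy>
      <xsl:apply-templates select="@* | node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- markup-to-layout mapping, strictly one-way -->
  <xsl:template match="keyword">
    <keyword aid:cstyle="Italic"><xsl:apply-templates/></keyword>
  </xsl:template>

  <xsl:template match="person">
    <person aid:cstyle="SmallCaps"><xsl:apply-templates/></person>
  </xsl:template>

  <xsl:template match="blockquote">
    <blockquote aid:pstyle="Quote"><xsl:apply-templates/></blockquote>
  </xsl:template>

</xsl:stylesheet>

Whatever the concrete mechanism, the crucial point stands: the mapping only runs in one direction. Once the text has been edited in InDesign, nothing forces the exported XML back into agreement with what the layout shows.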

Continue reading »