Benutzer:Dirk Hünniger/wb2pdf


Summary

mediawiki2latex converts MediaWiki markup to LaTeX and, via LaTeX, to PDF. It can be used to export pages from any project running MediaWiki, such as Wikipedia. It is also possible to generate epub and odt output files.
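To illustrate the kind of markup translation the tool performs, here is a minimal, self-contained sketch that maps a few MediaWiki constructs (bold, italics, section headings) to their LaTeX counterparts. This is a toy illustration only, not mediawiki2latex's actual conversion logic; the real converter handles far more constructs, including templates, tables, and images.

```python
import re

# Toy sketch: map a few MediaWiki constructs to LaTeX.
# Not the actual mediawiki2latex implementation.
RULES = [
    (re.compile(r"'''(.+?)'''"), r"\\textbf{\1}"),           # bold before italics
    (re.compile(r"''(.+?)''"), r"\\emph{\1}"),               # italics
    (re.compile(r"^== (.+?) ==$", re.M), r"\\section{\1}"),  # level-2 headings
]

def wiki_to_latex(text: str) -> str:
    """Apply each substitution rule in order and return the LaTeX result."""
    for pattern, repl in RULES:
        text = pattern.sub(repl, text)
    return text

print(wiki_to_latex("== Summary ==\nThis is '''bold''' and ''italic''."))
```

Note that the bold rule must run before the italics rule, because `'''` also matches the shorter `''` pattern.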

Web Version

You can try mediawiki2latex at the following URL:

https://mediawiki2latex.wmflabs.org/

Remember:

  • The server enforces a time limit of four hours and a limit of 2000 pages per request.
  • There is no limit on the locally installed versions described below.
  • There is another server that can compile larger documents (up to 5000 pages): https://mediawiki2latex-large.wmflabs.org/

Installation Instructions

See the Installation Instructions.

User Manual

See the User Manual.



Command Line Version

A command line version is currently available as part of the Debian Stretch distribution, as well as current Ubuntu releases.

LaTeX Intermediate Code

On Linux you can keep the intermediate LaTeX code by passing the -c command line option with an absolute pathname.
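A sketch of such an invocation is shown below. Only the -c option is documented here; the -u and -o option names for the source URL and output file are assumptions, so check `mediawiki2latex --help` on your installation for the exact option names.

```shell
# Hypothetical invocation; option names other than -c are assumptions.
# Verify with: mediawiki2latex --help
mediawiki2latex -c /tmp/latex-tree -o /tmp/article.pdf -u "https://en.wikipedia.org/wiki/LaTeX"
```

With -c, the generated LaTeX sources should be written under the given absolute path, so you can inspect or modify them before recompiling manually.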

Media

Talk

File:Wb2pdfTalk.ogv

Slides


Poster

File:Wb2pdfPoster.png

In Action

To see the program in action, watch: Datei:Wb2latexCompilingWikibook2PDF.ogg

Developers

The following link, Benutzer:Dirk Huenniger/wb2pdf/details, explains some of the inner workings of the software.

Quality and Statistics

A test run in October 2014, processing 4369 featured articles of the English Wikipedia, produced a PDF file in every case. These were all the featured articles we were able to find at the beginning of the test.

In May 2020 we looked at the usage of the web server and saw that the 50 requests examined resulted in the following output:

  • PDF: 37
  • ODT: 6
  • EPUB: 3
  • ZIP: 2
  • FAIL: 2

The failures are believed to be caused by attempts to process large books from the Wikipedia book namespace that exceeded the time limit of four hours.

In December 2018 we also did a test run on 100 featured articles of the English Wikipedia. In two cases a PDF was not created. We ran these two cases once again and got a PDF in each case. The total size of all PDFs was 2.2 GB on disk, and 5 GB were downloaded in order to produce them. The process took 6 hours and 15 minutes. The computer used was an i5-8250U notebook with 8 GB of memory, running only one instance of mediawiki2latex at a time. The internet downstream speed was 11.6 MBit/s.
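The figures above can be cross-checked with a little arithmetic, using only the numbers reported here: the average PDF size, the average processing time per article, and how much data the 11.6 MBit/s link could have moved in the total run time.

```python
# Cross-checking the December 2018 test run figures reported above.
articles = 100
total_pdf_gb = 2.2
downloaded_gb = 5.0
duration_h = 6.25            # 6 hours 15 minutes
downstream_mbit_s = 11.6

avg_pdf_mb = total_pdf_gb * 1000 / articles   # average PDF size in MB
avg_minutes = duration_h * 60 / articles      # average time per article in minutes

# Data the link could have transferred in the total run time, in GB
# (MBit/s -> MB/s is a division by 8; 3600 seconds per hour).
link_capacity_gb = downstream_mbit_s / 8 * duration_h * 3600 / 1000

print(round(avg_pdf_mb), round(avg_minutes, 2), round(link_capacity_gb, 1))
```

The link could have carried roughly 32 GB in that time, while only 5 GB were actually downloaded, which suggests that processing rather than download bandwidth dominated the run time.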

The largest book we have created with mediawiki2latex so far (8991 pages) is available here:

https://drive.google.com/file/d/1EfSMj34KE1YZHsNpHRExJvLwsuxEQEgT/view?usp=sharing