Updated documentation.

[ci skip]
Bastian Kleineidam 2013-03-26 17:35:26 +01:00
parent 92150ddbda
commit 30e76470e3
4 changed files with 39 additions and 30 deletions


@@ -145,11 +145,11 @@ for example using up to 4 processes:
 .RE
 .SH ENVIRONMENT
 .IP HTTP_PROXY
-.B mainline
+.B dosage
 will use the specified HTTP proxy when downloading URL contents.
 .SH NOTES
 Should retrieval fail on any given strip
-.B mainline
+.B dosage
 will attempt to retry. However the retry information is only outputted
 in the
 .B second
@@ -172,11 +172,7 @@ Else the return value is zero.
 Users can report or view bugs, patches or feature suggestions at
 .I https://github.com/wummel/dosage/issues
 .SH AUTHORS
-Jonathan Jacobs <korpse@slipgate.za.net>
-.br
-Tristan Seligmann <mithrandi@slipgate.za.net>
-.br
-Bastian Kleineidam <bastian.kleineidam@web.de>
+Jonathan Jacobs, Tristan Seligmann, Bastian Kleineidam <bastian.kleineidam@web.de>
 .SH COPYRIGHT
 Copyright \(co 2004-2005 Tristan Seligmann and Jonathan Jacobs
 .br


@@ -77,6 +77,15 @@ Writes out an RSS feed detailing what strips were downloaded in the last 24
 hours. The feed can be found in <B>Comics/dailydose.xml</B>.
 </DL>
 <P>
+<DL COMPACT><DT><DD>
+<B>json </B>-
+Write a JSON file with all download infos (URLs, images). Can be used with
+other scripts, eg. order-symlinks.py to add symbolic links.
+</DL>
 This option can be given multiple times.
 <DL COMPACT>
 <DT><B>-t</B>, <B>--timestamps</B><DD>
@@ -181,7 +190,7 @@ the beginning.
 <P>
-On Unix, <B><A HREF="../man1/xargs.1.html">xargs</A>(1)</B> can download several comic strips in parallel,
+On Unix, <B>xargs(1)</B> can download several comic strips in parallel,
 for example using up to 4 processes:
 <DL COMPACT><DT><DD>
 <B>cd Comics &amp;&amp; find . -type d | xargs -n1 -P4 dosage -b . -v</B>
@@ -193,7 +202,7 @@ for example using up to 4 processes:
 <DL COMPACT>
 <DT>HTTP_PROXY<DD>
-<B>mainline</B>
+<B>dosage</B>
 will use the specified HTTP proxy when downloading URL contents.
 </DL>
@@ -201,7 +210,7 @@ will use the specified HTTP proxy when downloading URL contents.
 <H2>NOTES</H2>
 Should retrieval fail on any given strip
-<B>mainline</B>
+<B>dosage</B>
 will attempt to retry. However the retry information is only outputted
 in the
@@ -238,13 +247,7 @@ Users can report or view bugs, patches or feature suggestions at
 <A NAME="lbAM">&nbsp;</A>
 <H2>AUTHORS</H2>
-Jonathan Jacobs &lt;<A HREF="mailto:korpse@slipgate.za.net">korpse@slipgate.za.net</A>&gt;
-<BR>
-Tristan Seligmann &lt;<A HREF="mailto:mithrandi@slipgate.za.net">mithrandi@slipgate.za.net</A>&gt;
-<BR>
-Bastian Kleineidam &lt;<A HREF="mailto:bastian.kleineidam@web.de">bastian.kleineidam@web.de</A>&gt;
+Jonathan Jacobs, Tristan Seligmann, Bastian Kleineidam &lt;<A HREF="mailto:bastian.kleineidam@web.de">bastian.kleineidam@web.de</A>&gt;
 <A NAME="lbAN">&nbsp;</A>
 <H2>COPYRIGHT</H2>


@@ -1,7 +1,5 @@
-diff --git i/doc/dosage.1.html w/doc/dosage.1.html
-index e4f07f0..b8110eb 100644
---- i/doc/dosage.1.html
-+++ w/doc/dosage.1.html
+--- dosage.1.html.orig 2013-03-26 19:02:14.690504207 +0100
++++ dosage.1.html 2013-03-26 19:04:24.808185171 +0100
 @@ -4,7 +4,7 @@
  </HEAD><BODY>
  <H1>DOSAGE</H1>
@@ -11,7 +9,16 @@ index e4f07f0..b8110eb 100644
 <A NAME="lbAB">&nbsp;</A>
 <H2>NAME</H2>
-@@ -269,7 +269,7 @@ Copyright &#169; 2012-2013 Bastian Kleineidam
+@@ -190,7 +190,7 @@
+ <P>
+-On Unix, <B><A HREF="../man1/xargs.1.html">xargs</A>(1)</B> can download several comic strips in parallel,
++On Unix, <B>xargs(1)</B> can download several comic strips in parallel,
+ for example using up to 4 processes:
+ <DL COMPACT><DT><DD>
+ <B>cd Comics &amp;&amp; find . -type d | xargs -n1 -P4 dosage -b . -v</B>
+@@ -282,7 +282,7 @@
 </DL>
 <HR>
 This document was created by


@@ -61,6 +61,10 @@ OPTIONS
 rss - Writes out an RSS feed detailing what strips were
 downloaded in the last 24 hours. The feed can be found
 in Comics/dailydose.xml.
+json - Write a JSON file with all download infos (URLs,
+images). Can be used with other scripts, eg. order-sym
+links.py to add symbolic links.
 This option can be given multiple times.
 -t, --timestamps
@@ -126,13 +130,13 @@ EXAMPLES
 ENVIRONMENT
 HTTP_PROXY
-mainline will use the specified HTTP proxy when down
-loading URL contents.
+dosage will use the specified HTTP proxy when download
+ing URL contents.
 NOTES
-Should retrieval fail on any given strip mainline will attempt
-to retry. However the retry information is only outputted in
-the second and successive output levels.
+Should retrieval fail on any given strip dosage will attempt to
+retry. However the retry information is only outputted in the
+second and successive output levels.
 At the time of writing, a complete Dosage collection weighs in
 at around 3.0GB.
@@ -153,9 +157,8 @@ BUGS
 at https://github.com/wummel/dosage/issues
 AUTHORS
-Jonathan Jacobs <korpse@slipgate.za.net>
-Tristan Seligmann <mithrandi@slipgate.za.net>
-Bastian Kleineidam <bastian.kleineidam@web.de>
+Jonathan Jacobs, Tristan Seligmann, Bastian Kleineidam <bas
+tian.kleineidam@web.de>
 COPYRIGHT
 Copyright © 2004-2005 Tristan Seligmann and Jonathan Jacobs
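
The parallel-download recipe documented in these pages can be tried without dosage installed by substituting echo for dosage — a minimal sketch, assuming a Unix shell with find and xargs; the comic directory names below are made up for illustration:

```shell
# The man page's parallel-download pattern, with `echo` standing in for
# `dosage` so the sketch runs even where dosage is not installed.
# The comic directory names are illustrative only.
mkdir -p /tmp/dosage-demo/Comics/xkcd /tmp/dosage-demo/Comics/sinfest
cd /tmp/dosage-demo/Comics
# find lists each comic subdirectory; xargs fans them out to up to
# 4 parallel workers, one directory per invocation (-n1 -P4).
find . -mindepth 1 -type d | xargs -n1 -P4 echo dosage -b . -v
```

With the real dosage binary in place of echo, each worker downloads one comic's strips into its directory under the base path given by -b.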