recently fixed bugs
The following renders incorrectly:

    [[!toc ]]
    # header1
    content1
    # header2
    [[!map pages="sandbox"]]
Removing the [[!toc ]] directive, or moving it to the end of the page, makes the whole wiki page render as expected.
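For instance, this variant (same content, with the directive moved to the end) renders correctly:

    # header1
    content1
    # header2
    [[!map pages="sandbox"]]
    [[!toc ]]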
Hint: in all cases, the non-interpreted markdown code is copied as-is into the HTML output, without any leading `<p>` or any other HTML formatting.
You're using the old version of markdown, which is known to have a broken block HTML parser that will get confused if markdown is present between two separate HTML blocks, and not format the markdown. This is fixed in Text::Markdown 1.0.19; markdown 1.0.2 also fixes the problem. Install either one. I'm going to make ikiwiki's dependencies list Text::Markdown before markdown, since people keep stumbling over this. (The downside is that the old broken markdown is faster.) --Joey
done
If you wish to install ikiwiki in your home directory (for example because you don't have root access), you need to set environment variables (such as `PATH` and `PERL5LIB`) to point to the directories that contain your personal copy of IkiWiki.
The CGI wrapper remembers `PATH`, but not the environment variable `PERL5LIB`. Consequently, it will look for plugins and so on in the usual system directories, not in your personal copy. This is particularly insidious if you have a system copy of a different version installed, as your CGI wrapper may then load code from that version.
I think the CGI wrapper should remember `PERL5LIB` too.
-- Martin
Thanks a lot for pointing me to this location in the code; I had been looking for it for some time. This brutal patch implements your solution as a temporary fix:
    *** Wrapper.pm.old  2012-08-25 16:41:41.000000000 +0200
    --- Wrapper.pm      2012-10-01 17:33:17.582956524 +0200
    ***************
    *** 149,154 ****
    --- 149,155 ----
              $envsave
              newenviron[i++]="HOME=$ENV{HOME}";
              newenviron[i++]="PATH=$ENV{PATH}";
    +         newenviron[i++]="PERL5LIB=$ENV{PERL5LIB}";
              newenviron[i++]="WRAPPED_OPTIONS=$configstring";
      #ifdef __TINYC__
As I am not sure that remembering `PERL5LIB` is a good idea, I think a prettier solution would be to add a config variable (say, `cgi_wrapper_perllib`) which, if set, contains the `PERL5LIB` value to include in the wrapper, or another (say, `cgi_wrapper_remember_libdir`) which, if set, remembers the current `PERL5LIB`.
-- Bruno
Update: I had not seen this bug earlier, but I ran into the same issue and made a more general solution. You can already add stuff to `%config{ENV}` in the setup file, but it was being processed too late for `PERL5LIB` to do any good.

This change moves the `%config{ENV}` handling earlier in the wrapper, so anything specified there is placed back in the actual environment before Perl gets control. Problem solved!
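For example, a minimal sketch of such a stanza in the setup file (the library path here is made up):

    # in ikiwiki.setup: extra environment to pass through to the wrappers
    ENV => {
        PERL5LIB => '/home/me/ikiwiki/lib/perl5',
    },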
-- Chap
Thanks, this looks like a nicer solution than the above. Some review:
    + $val =~ s/([\\"])/\\$1/g;
This is probably OK, because the configuration is unlikely to include non-ASCII, but I'd prefer something that covers all possibilities, like this:

    my $tmp = $val;
    utf8::encode($tmp) if utf8::is_utf8($tmp);
    $tmp =~ s/([^A-Za-z0-9])/sprintf "\\x%02x", ord $1/ge;

and then passing `$tmp` to `addenv`.
    + delete $config{ENV};

I don't think this is particularly necessary: there doesn't seem to be any harm in having it in the storable too?
--smcv
Happy to make the escaping change, thanks for the sharp eye.
Merged with that change. --smcv
My thinking on `delete` is: once it's handled, it's handled. The C code is going to put this straight into the real environment and then do a simple `exec` ... is there any way this hasn't been handled? It just takes up space twice in the generated wrapper otherwise. Admittedly it's not much space, but there seems to be even less point ... ?
-- Chap
That makes sense, as long as nothing else is going to read `$config{ENV}` for purposes other than copying it into the actual environment. --smcv
I've just backported your ikiwiki 1.43 and installed it, but now I can't rebuild all my ikiwiki pages. When I run `ikiwiki --setup ikiwiki.setup`, I see the following error:

    internal error: smileys.mdwn cannot be found
    BEGIN failed--compilation aborted at (eval 5) line 111.

I have the smiley plugin enabled in my `ikiwiki.setup` file, but I've never had a `smileys.mdwn` page.
Probably the reason for the problem is that you've removed many pages from the `basewiki` directory and created symlinks for those pages, but they don't exist in the latest package:

    $ LANG=C apt-cache policy ikiwiki
    ikiwiki:
      Installed: 1.43gpa1
      Candidate: 1.43gpa1
      Version Table:
     *** 1.43gpa1 0
            500 http://gpa.net.icm.edu.pl sarge/main Packages
            100 /var/lib/dpkg/status
    $ dpkg -L ikiwiki | grep smileys.mdwn
--Paweł
This seems to be a bug in your 1.43gpa1 version, whatever that is. In the package I built, I see:

    joey@kodama:~>dpkg -L ikiwiki | grep smileys.mdwn
    /usr/share/ikiwiki/basewiki/smileys.mdwn
    joey@kodama:~>ls -l /usr/share/ikiwiki/basewiki/smileys.mdwn
    -rw-r--r-- 1 root root 1643 Feb 13 18:03 /usr/share/ikiwiki/basewiki/smileys.mdwn
--Joey
You're right. My backport was built without the symlinks, because I store all rebuilt sources in a CVS repo, and it seems that CVS doesn't support symlinks. Grrr... I need to switch to another repo now. --Paweł
Ok, done then --Joey
I just set up my first OpenID account and tried to log in to ikiwiki.info. It all works, but being relatively unfamiliar with OpenID, when I was presented with the login page it wasn't at all clear which bits needed to be filled in.
At the moment it looks like this:

    Name:
    Password:
    OpenID:
    [Login] [Register] [Mail Password]
Really this form is presenting two entirely separate ways to log in: the "normal" user/pass, OR OpenID. Also (I assume) the [Register] and [Mail Password] actions are only relevant to the user/pass form.
I would suggest that the form be split into two parts, something like this:

    Login (or register) with a username and password:
    Name:
    Password:
    [Login] [Register] [Mail Password]

    **OR**

    Login with OpenID:
    OpenID URL:
    [Login]
As an example, the first time I went to log in I filled in all three fields (user, pass, openid) and then clicked [Register], because from the layout I assumed I still had to instantiate an account with ikiwiki ... and to make it even more confusing, it worked! Of course it worked by creating me an account based on the username and password, ignoring the OpenID URL.
If you want to keep it as one form, then perhaps using some javascript to disable the other pieces of the form as soon as you fill in one part would help? E.g. if you put in an OpenID URL, then Name/Password/Register/Mail Password gets greyed out; if you enter a username, then the OpenID URL gets greyed out. -- Adam.
It's one form for architectural reasons -- the OpenID plugin uses a hook that allows modifying that form, but does not allow creating a separate form. The best way to make it obvious how to use it currently is to just disable password auth; then it's nice and simple. Javascript is an interesting idea. It's also possible to write a custom template that is displayed instead of the regular signin form, and it should be possible to use that to manually lay it out better than FormBuilder manages with its automatic layout. --Joey
I've improved the form, I think it's more obvious now that the openid stuff is separate. Good enough to call this done. I think. --Joey
Looks good, thanks! -- AdamShand
I have a commit doing:

    -[[map pages="link(tag/<TMPL_VAR name>) and !papers/*"]]
    +[[map pages="link(sourcepage()) and !papers/*"]]
ikiwiki now fails to compile the site, barfing:

    Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki.pm line 1288.
    ikiwiki.setup: Can't use string ("") as a subroutine ref while "strict refs" in use at /usr/share/perl5/IkiWiki.pm line 1288.
    BEGIN failed--compilation aborted at (eval 6) line 200.
After forcefully entering the Perl mode of thinking, I reduced this to line 1285 of IkiWiki.pm (2.53), which apparently returns `undef`:

    my $sub=pagespec_translate($spec);

Why does it even bother parsing the diffs of `recentchanges`?
I have not recompiled this site in ages, so I am not sure when this problem was introduced, but it wasn't there when I last worked on the site, about a year ago in September 2007.
-- madduck
I can't reproduce this problem. When I try, the generated `recentchanges/change_$sha1._change` file has the diff properly escaped, so that the map is not expanded at all. I also tried de-escaping that, and still failed to reproduce any crash. The bogus pagespec simply expands to nothing. The line directly after the line you quoted checks for syntax errors in the pagespec translation eval, and seems to be working fine:

    joey@kodama:~>perl -e 'use IkiWiki; my $sub=IkiWiki::pagespec_translate("link(tag/) and !papers/*"); print "caught failure:".$@'
    caught failure:syntax error at (eval 14) line 1, near "|| &&"

Based on your line numbers, you are not running a current version of ikiwiki. (It doesn't quite seem to be version 2.53.x either.) Try with a current version, and see if you can send me a source tree that can reproduce the problem? --Joey
Did not hear back, so calling this done, unless I hear differently. --Joey
Just in case someone else sees this same error message: I was able to reproduce this by having an incomplete (not upgraded) rcs backend that didn't provide `rcs_commit_staged()` when attempting to submit a blog comment. --JeremyReed
The img plugin is not generating the proper `class` attribute in its HTML output.

The plugin receives something like the following:

    [[!img 129199047595759991.jpg class="centered"]]

and is supposed to generate HTML like the following:

    <img src="129199047595759991.jpg" class="centered" />

but is generating the following:

    <img src="129199047595759991.jpg" class="centered img" />

This seems to be happening with all images inserted using the plugin (that use the `class=yaddayadda` argument to the `img` directive). I remember it didn't happen before, and I suspect an ikiwiki upgrade is to blame. I tested with a blog created from scratch, with a single post, and the problem appeared there too.
This is happening with version 3.20100815 of ikiwiki.
How is this a bug? It's perfectly legal HTML for a class attribute to put an element into multiple classes. notabug --Joey
    open (IN, "$config{wikistatedir}/aggregate" ||
        die "$config{wikistatedir}/aggregate: $!");
It looks like the intent was "open this file, and die if you can't", but I'm pretty sure it actually means "open this file and ignore errors silently". Shouldn't this be `open(IN, $file) || die "$file: $!";` (i.e. with the parens before the call to `die`)? --Ethan
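For illustration, a self-contained sketch of the precedence trap (the filename is made up): the string is always true, so the `||` short-circuits inside the argument list and `die` is unreachable, letting a failed `open` pass silently.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Buggy: || binds inside open's argument list; the string is truthy,
    # so die() never runs and open's failure goes unreported.
    open(my $in, '<', '/nonexistent' || die "never reached: $!");
    print "still running; the failed open went unnoticed\n";

    # Correct: let open() return false first, then die on that result.
    open(my $in2, '<', '/nonexistent') || die "/nonexistent: $!";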
Thanks, done --Joey
Hi!
While working on Reproducible Builds for Tails, we noticed that the img plugin's output is not deterministic: PNG images embed timestamps.
The `img-determinism` branch in the https://git-tails.immerda.ch/ikiwiki.git Git repository has a fix for this problem, plus a new test (which fails without this change, and succeeds with the branch merged).
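For context, here is a hedged sketch of one way to suppress such timestamps with PerlMagick (illustrative only, not necessarily what the branch does; the file names are made up):

    #!/usr/bin/perl
    # PNG encoders record creation/modification times in ancillary
    # chunks (tIME, plus date:* text entries), so two otherwise
    # identical builds can produce different bytes.
    use Image::Magick;

    my $im = Image::Magick->new;
    $im->Read('in.png');
    # Ask the PNG coder to omit the time-related chunks.
    $im->Set(option => 'png:exclude-chunk=date,time');
    $im->Write('out.png');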
Thanks, merged --smcv
Hi, I am trying to build a template. The compilation of this template results in a weird exception, and I have isolated its cause to the following point. If I have this in the template code:

    [[!inline
    pages="\"
    template=extract-entry
    ]]
There is no problem at all; I can use the template with the desired result. But if I try to use this (just adding the "show" parameter):

    [[!inline
    pages="\"
    template=extract-entry
    show=\
    ]]
I get this exception on the Git bash console:

    $ git push
    Counting objects: 7, done.
    Delta compression using up to 8 threads.
    Compressing objects: 100% (4/4), done.
    Writing objects: 100% (4/4), 410 bytes, done.
    Total 4 (delta 3), reused 0 (delta 0)
    remote: From /home/b-odelama-com/source
    remote:    eb1421e..5e1bac5  master -> origin/master
    remote: Argument "\x{3c}\x{54}..." isn't numeric in numeric lt (<) at /usr/share/perl5/IkiWiki/Plugin/inline.pm line 231.
    remote: Argument "\x{3c}\x{54}..." isn't numeric in numeric lt (<) at /usr/share/perl5/IkiWiki/Plugin/inline.pm line 231.
    To ssh://b-odelama-com@odelama-com.branchable.com/
       eb1421e..5e1bac5  master -> master
Please, let me know what to do to avoid this kind of error.
When you add a template page `templates/foo.mdwn` for use with the template directive, two things happen:

- `[[!template id=foo ...]]` becomes available;
- a wiki page `templates/foo` is built, resulting in an HTML file, typically `templates/foo/index.html`.

The warnings you're seeing come from the second of these: when ikiwiki tries to process `templates/foo.mdwn` as an ordinary page, without interpreting the `<TMPL_VAR>` directives, `inline` receives invalid input.

This is a bit of a design flaw in template and edittemplate, I think - ideally it would be possible to avoid parts of the page being interpreted when the page is being rendered normally rather than being used as a template.
There is a trick to avoid parts of the page being interpreted when the page is being used as a template, while having them appear when it's rendered as a page:
    <TMPL_IF FALSE>
    <!-- This part only appears when being used as a page. It assumes
    that you never set FALSE to a true value :-) -->
    [[!meta robots="noindex,nofollow"]]
    This template is used to describe a thing. Parameters:
    * name: the name of the thing
    * size: the size of the thing
    </TMPL_IF>
    The thing is called <TMPL_VAR name> and its size is <TMPL_VAR size>
I suppose you could maybe extend that to something like this:
    <TMPL_IF FALSE>
    <!-- This part only appears when being used as a page. It assumes
    that you never set FALSE to a true value :-) -->
    [[!meta robots="noindex,nofollow"]]
    This template is used to describe a thing. Parameters:
    * name: the name of the thing
    * size: the size of the thing
    </TMPL_IF>
    <TMPL_IF FALSE>
    [[!if test="included() and !included()" then="""
    </TMPL_IF>
    <!-- This part only appears when being used as a template. It also
    assumes that you never set FALSE to a true value, and it relies on
    the [[pagespec|ikiwiki/pagespec]] "included() and !included()"
    never being true. -->
    The thing is called <TMPL_VAR name> and its size is <TMPL_VAR size>
    <TMPL_IF FALSE>
    """]]
    </TMPL_IF>
but that's far harder than it ought to be!
Perhaps the right solution would be to change how the template plugin works, so that templates are expected to contain a new `definetemplate` directive:

    This template is used to describe a thing. Parameters:
    * name: the name of the thing
    * size: the size of the thing

    [[!definetemplate """
    The thing is called <TMPL_VAR name> and its size is <TMPL_VAR size>
    """]]
with templates not containing a `[[!definetemplate ]]` being treated as if the whole text of the page was copied into a `[[!definetemplate ]]`, for backwards compatibility?
, for backwards compatibility?--smcv
OK, here is a branch implementing what I said. It adds the `definetemplate` directive to goodstuff as its last commit. Templates with the current strange semantics will still work, until IkiWiki breaks compatibility.
Possible controversies:

- Should the `definetemplate` plugin be core, or in goodstuff, or neither?
- Should `[[!definetemplate ]]` be allowed on any page (with the implementation of `template("foo")` looking for a `definetemplate` in `templates/foo`, then a `definetemplate` in `foo`, then falling back to the current logic)? If not, should `[[!definetemplate ]]` raise an error when used on a page not in `templates/`, since it will have no practical effect there?
- Is it OK to rely on `definetemplate` being enabled in the basewiki's templates?
- Should the "use definetemplate" wording in the documentation of template and edittemplate be stronger? Should those plugins automatically load definetemplate?
--smcv
this looks like a good idea to me.
i'd put it in core, and add a transition for when compatibility gets broken, provided the transitioning system will be used for that. templates can't be expected to just work as markdown+ikiwiki too.

(it being in core would also solve my qualms about `section => "web"` / `[[!tag type/web]]`.)

if definetemplate gets deemed core, no "use definetemplate!" notes on the template/edittemplate pages will be required any more.
first i was sceptical of the approach of re-running scan to make sure the `my %templates` is filled, but it is indeed a practical solution.

the name "`definetemplate`" gives me the first impression that something is assigned (as in `#define`), but actually it highlights a region in the file. wouldn't "`templatebody`" be a better description of the meaning of the directive? --chrysn
Thanks for your feedback! Looking at its description on this wiki, I agree that `type/web` doesn't fit, and core does seem better. I like your `templatebody` suggestion, too, particularly if templates remain restricted to `/templates`. I'll try to come up with better wording for the documentation to say "use `templatebody`, like this", with a note about backwards compatibility later.

Rationale for `my %templates`: yes, it does seem a bit odd, but if I used `$pagestate{$tpage}{template}` instead of a `my` variable, I'd sometimes still have to force a `scan`, because template has to expand the template at scan time so that it can contain links etc. - so I have to make sure that if the template has changed, it has already been scanned (scanning happens in random order, so that can't be guaranteed). This means there's no benefit in reading it back from the index, so it might as well just be in-memory.

I suppose an alternative way to do it would be to remember what was passed to `needsbuild`, and only force a `scan` for templates that were in that list - which potentially reduces CPU time and I/O a little, in exchange for a bigger index. I could do that if Joey wants me to, but I think the current approach is simpler, so I'll stick with it if it isn't vetoed. --smcv

@name: even outside `/templates`, `[[!templatebody ]]` would be interpreted as "when this page is used as a template, this is what its contents should be", and be suitable.

@`%templates`: my surprise wasn't that it's not in `%pagestate`, but rather that the `scan` function was used for it at all, rather than plain directive parsing that ignores everything else -- but i agree that it's the right thing to do in this situation. --chrysn
Branch and directive renamed to `ready/templatebody` as chrysn suggested. It's on-by-default now (or will be if that branch is merged). Joey, any chance you could review this?

There is one known buglet: `template_syntax.t` asserts that the entire file is a valid HTML::Template, whereas it would ideally be doing the same logic as IkiWiki itself. I don't think that's serious. --smcv
Looking over this, I notice it adds a hash containing all scanned files. This seems to me to be potentially a scalability problem on rebuild of a site with many pages. Ikiwiki already keeps a lot of info in memory, and this adds to it, for what is a fairly minor reason. It seems to me there should be a way to avoid this. --Joey
Maybe. Are plugins expected to cope with scanning the same page more than once? If so, it's just a tradeoff between "spend more time scanning the template repeatedly" and "spend more memory on avoiding it", and it would be OK to omit that, or reduce it to a set of scanned templates (in practice that would mean scanning each template twice in a rebuild). --s
Commit f7303db5 suggests that scanning the same page more than once is problematic, so that solution is probably not going to work.
The best idea I've come up with so far is to track whether we're in the scan or render phase. If we're in the scan phase, I think we do need to keep track of which pages we've scanned, so we don't do them again? (Or perhaps that's unnecessary - commit f7303db5 removed a scan call that's in the render phase.) If we're in the render phase, we can assume that all changed pages have been scanned already, so we can drop the contents of `%scanned` and rely on a single boolean flag instead.

This is not actually good enough for the templatebody directive, which does in fact need to scan certain pages during the render phase, namely when a page that is rendered due to dependencies uses a template that no other page being rendered in this pass was using. I've reverted this optimization, to fix wrong rendering of templatebody, and applied a more limited version which only optimizes rebuilds (the worst case of this memory consumption). --smcv
`%scanned` is likely to be no larger than `%rendered`, which we already track, and whose useful lifetime does not overlap with `%scanned` now. I was tempted to merge them both and call the result `%done_in_this_phase`, but that would lead to really confusing situations if a bug led to `render` being called sooner than it ought to be.

My ulterior motive here is that I would like to formalize the existence of different phases of wiki processing - at the moment there are at least two phases, namely "it's too soon to match pagespecs reliably" and "everything has been scanned, you may use pagespecs now", but those phases don't have names, so [[plugins/write]] doesn't describe them.
I'm also considering adding warnings if people try to match a pagespec before scanning has finished, which can't possibly guarantee the right result, as discussed in conditional preprocess during scan. My `wip-too-soon` branch is a start towards that; the docwiki builds successfully, but the tests that use IkiWiki internals also need updating to set `$phase = PHASE_RENDER` before they start preprocessing. --s

reviewing those modifications, i think this is a good way to go. along with warning about pagespecs evaluated in scan phase, i think it should be an error to invoke scan in the render phase; that would mean that `readtemplate` needs to check whether it's invoked as a scan or not to decide whether to scan the template page, but would be generally more robust for future plugin writing.

At the moment templatebody really does need to re-scan templates in the render phase, unfortunately. Not scanning in the render phase seems to be precisely how wrong rendering of templatebody happened. --s
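For reference, a minimal sketch of what named phases might look like (only `PHASE_RENDER` is mentioned above; `PHASE_SCAN` and the flag's placement are assumptions for illustration):

    # Hypothetical sketch, not the actual IkiWiki.pm code.
    use constant PHASE_SCAN   => 0;  # too soon to match pagespecs reliably
    use constant PHASE_RENDER => 1;  # everything has been scanned
    our $phase = PHASE_SCAN;

    # pagespec matching could then warn when called too soon:
    # warn "pagespec matched before scanning finished" if $phase == PHASE_SCAN;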
addendum: if the new phase state is used to create warnings/errors about improper ikiwiki api use by plugins (which is something i'd advocate), that should likewise warn if `add_link` actually adds a link in the render phase. such a warning would have helped spotting the link-related template evaluation oddities earlier. --chrysn

Merged --smcv
When I set up my picture page with `[[!img defaults size=300x]]`, the HTML validator complains that the value for height is missing, and IE browsers won't show the pictures at all; no problems with Firefox, though. If I set up my picture page with `[[!img defaults size=300x300]]`, then all the images are stretched oddly. What am I doing wrong?
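For illustration of the expected behaviour (an assumption based on the report, with made-up numbers): a 600x400 source image scaled with `[[!img photo.jpg size=300x]]` ought to emit both attributes, proportionally:

    <img src="photo.jpg" width="300" height="200" />

rather than omitting `height` (invalid HTML, and what IE chokes on) or forcing 300x300 (which distorts the aspect ratio).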
This is a bug. --Joey
And .. fixed --Joey
Not quite; for some reason it requires me to update wiki pages twice before the height value shows up.