WikiBugs/LongPageGetsTruncated


I'm still in the process of trying to replicate this. One of my Wiki pages got to be a few thousand words long. I'd just finished editing it and had clicked "Save". Looking at the newly saved page, though, I noticed it was chopped off about half way down. Despite lots of cutting and pasting, removing all the hyperlink stuff, it still wouldn't save all the text, and it seemed to be random as to where it cut it off. I'm currently in the process of rebuilding the page, so I'll post a link if I can get it to reliably fail.

BrendanTregear

Old versions of Netscape (4.x) under Windows have a 32Kb limit for text boxes. The wiki itself has a limit of about 200Kb (configurable), but you should receive an error message if you reach that limit. --CliffordAdams
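
For reference, the post-size limit Clifford mentions is set in the configuration section of wiki.pl; if memory serves it looks roughly like the line below (treat the exact number as an example, since defaults vary between versions):

    # Approximate excerpt from the configuration section of wiki.pl;
    # the exact default may differ between versions.
    $MaxPost = 1024 * 210;   # Maximum 210K posts (about 200K for pages)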

I'm seeing the same problem with a 12672-byte chunk of text in Mozilla 0.9.8+. It is cut off at 11799 bytes. In addition, the submit of the text never comes back. The page is not saved into the db until I hit "stop" in my browser. The wiki.pl process shows itself in the 'S' state. I can't find anything obvious about the text at the truncation point that would trigger anything. I was able to get the page up by manually editing the wiki .db file -- JoelBecker

Gecko (the Mozilla engine) has a bug that makes it hang on posts bigger than 10kB. This should already be fixed in the latest version. -- DavidSchmitt

I have seen this as well, and it is very easily reproducible on our Wiki running on Solaris 2.6 with Apache 2.0, using IE 6.0:

  1. Open a Wiki page with a pageful of content
  2. Open a Word document or even any Windows text document in Notepad.
  3. Copy and paste a pageful of text into the beginning of the Wiki page, then save it.
  4. The saved content is truncated. And if you look at the db file on the file system, it is HUGE!

It appears to have to do with cutting and pasting between Windows (where the browser is running) and Unix (where the Wiki is running), because I have verified that this does not happen if the Wiki is running on Windows NT. It happens even if you run dos2unix on the text file before you copy and paste its contents into the Wiki.

-- Angelique Faustino
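
If the Windows-to-Unix pasting theory is right, the saved page should contain stray carriage-return characters. One quick way to check (a sketch only; the path below is just an example of where UseModWiki keeps its page files):

    perl -ne '$cr += tr/\r//; END { print "CR count: $cr\n" }' page/S/SandBox.db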

I've only ever seen 2 different truncated pages produced for any given save. One page is shorter than the other, but there are never more than 2.

I was trying to determine if there was a maximum character limit, so I created a page with 6181 characters. If I add a character, it will truncate; otherwise, it will save properly at that maximum after 2 tries. 6181 is not an accurate figure, though. The problem is that some characters are "worth" more than others. For example, in the page that I made, if I replace two letter characters (let's say XY) with one semicolon (;), the page will always truncate when I save, but with XY it can save without truncating. Thus, semicolons take up more space than regular letter characters. Furthermore, tags have different values as well. I had a set of <pre> tags in the page. If I remove them and add the same number of characters elsewhere, I get to add at least 17 characters more than what the tags consisted of. This happens even if, among those characters, you add the same characters used in the tags; meaning it's not the value of those specific characters, but the fact that it's a tag that gives it a different value altogether. Lastly, I verified that this problem has nothing to do with cutting and pasting. If I cut and paste the page I made to another page, I get an identical page that truncates under the same conditions as the first.

-Syed
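
One possible reading of the "some characters are worth more" behaviour (an assumption on my part, not something verified against wiki.pl) is that the browser sends the page as a URL-encoded POST body, where characters such as ; and the angle brackets in <pre> expand to three-byte %XX sequences and so hit any byte limit sooner than plain letters. A small standalone sketch of that expansion:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use URI::Escape qw(uri_escape);
    # Show how URL-encoding inflates certain characters in a form POST body.
    for my $s ('XY', ';', '<pre>') {
        my $enc = uri_escape($s);
        printf "%-6s -> %-12s (%d -> %d bytes)\n", $s, $enc, length($s), length($enc);
    }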

I see this same feature with Mozilla 1.0, working with the 0.91 CGI.pl. I get something more like 1081 characters. Something tells me it's browser-dependent. I don't get this "feature" with Konqueror. I see HTTP/1.1 POST commands, so it's not a limit on the <textarea> field length, is it?

- Mike Dodds < magister AT the-magister.com >

Ah, I have it: mod_perl. My Apache 2.0 comes configured by default to use modules/mod_perl.so. I'm not sure how much detail to go into here, so forgive me if I'm a little pedantic. Using that module means that Perl scripts are interpreted by the module, not by /usr/bin/perl. I just disabled that "feature", resorting to the directive "AddHandler cgi-script cgi pl". That ensures that the script is interpreted by the system's perl. There's a link from UseModWiki/Install to [an Apache Tutorial] that documents this procedure. I suspect that, if this is the problem, many people will be stumped. Can I recommend that this Apache directive be strongly noted in UseModWiki/Install?

- Mike Dodds < magister AT the-magister.com >
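
For anyone else hitting this, the change Mike describes amounts to something like the following httpd.conf excerpt (illustrative only; module paths and extensions depend on your installation):

    # Stop handing Perl scripts to mod_perl:
    #LoadModule perl_module modules/mod_perl.so
    # Run .cgi and .pl files as plain CGI scripts under the system perl:
    AddHandler cgi-script cgi pl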

I don't understand: CGI.pl 0.91? I'm using CGI.pm 2.81, and before that I've been using CGI.pm 2.735 (if I remember correctly). I'm using this little CGI script to determine the version. Put it where your wiki.pl is, and name it version.pl or similar. Then call it from your browser. What does it say? -- AlexSchroeder

    #!/usr/bin/perl
    use strict;
    use CGI qw/:standard/;
    my $q = CGI->new;
    # $CGI::VERSION holds the version of the installed CGI.pm
    print $q->header()
      . $q->start_html()
      . $q->p("CGI Version: ", $CGI::VERSION)
      . $q->end_html();


This is maddening. Once wiki.pl compiles under mod_perl, it does one POST request correctly, but then no others. If I modify the script, it recompiles and works...once. Ieee! I'm running as straight CGI now and it seems fine, but if anyone can remedy this issue (which I think is tied to PersistentCGI), I will personally ship them a batch of "Fire Fudge" (yes, the food...with cayenne and your choice of peanuts, cashews or toffee mixed in.) --LukeStark
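
One possible explanation for the works-once behaviour (an assumption about persistent-Perl environments in general, not a confirmed diagnosis of wiki.pl) is that mod_perl compiles the script once and then re-enters it, so file-scoped lexicals keep their values from the previous request instead of starting fresh. A minimal sketch of that effect in plain Perl:

    #!/usr/bin/perl
    use strict;
    use warnings;
    # Simulates a persistent interpreter re-running compiled code: the
    # file-scoped lexical $buffer is captured by handle_request() as a
    # closure, so it is NOT reset between "requests".
    my $buffer = '';
    sub handle_request {
        my ($input) = @_;
        $buffer .= $input;   # stale data accumulates across calls
        return length($buffer);
    }
    print handle_request("first post"),  "\n";   # prints 10
    print handle_request("second post"), "\n";   # prints 21, not 11

Under vanilla CGI every request gets a fresh interpreter, which would explain why running it as straight CGI behaves.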

