Patch #1404: Raise wiki content page size
Status: Closed (100% done)
Updated by Thomas Lecavelier over 16 years ago
I'm quite dumb... But I think I found a problem: the uploaded file is lost when you hit an error just before a successful submit (in my case: I forgot the subject when creating this patch issue).
Hoping this patch is interesting.
Updated by Jean-Philippe Lang over 16 years ago
Thanks for your patch, Thomas.
But the migration fails with PostgreSQL (tested with 8.3).
I found this ticket on Rails Trac: http://dev.rubyonrails.org/ticket/3818.
== 95 ChangeWikiContentsTextLimit: migrating ==================================
-- change_column(:wiki_contents, :text, :text, {:limit=>131072})
rake aborted!
RuntimeError: ERROR C42601 Mtype modifier is not allowed for type "text"
P52 F.\src\backend\parser\parse_type.c L273 RtypenameTypeMod:
ALTER TABLE wiki_contents ADD COLUMN "text_ar_tmp" text(131072)
Maybe we could test the database type and only apply the change on MySQL.
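A minimal sketch of what that could look like, assuming a Rails 2.x-era migration (the class name matches the one in the error output above; everything else is illustrative):

# Sketch: apply the :limit only on MySQL, whose TEXT type is capped at 64KB.
# PostgreSQL's text type is unbounded, so it needs no change (and rejects the
# length modifier, as the error above shows).
class ChangeWikiContentsTextLimit < ActiveRecord::Migration
  def self.up
    if ActiveRecord::Base.connection.adapter_name =~ /mysql/i
      change_column :wiki_contents, :text, :text, :limit => 131072
    end
  end

  def self.down
    if ActiveRecord::Base.connection.adapter_name =~ /mysql/i
      change_column :wiki_contents, :text, :text
    end
  end
end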
Updated by Thomas Lecavelier over 16 years ago
- % Done changed from 100 to 80
It would be far better to stay database-agnostic... I'll do some research tomorrow. Thank you for pointing out the problem.
Updated by Thomas Lecavelier over 16 years ago
After some research, it appears that PostgreSQL's text type has no limit. But the error when migrating a :text column with :limit => x on a PostgreSQL database looks like a genuine bug.
My first thought is to make two different patches:
- Check on creation that the wiki content is not too long for its database
- Provide a migration to raise the wiki content page size to 128KB or 256KB (even though I think 64KB is a good limit), with a monkey patch for this PostgreSQL problem
But PostgreSQL is now a problem:
- How do you write a test that fails when the content is too big, knowing that PostgreSQL text columns have no limit?
As I don't want to provide a patch without matching test cases, I'm listening to any brilliant mind who could help me find a good, reliable test case... :-/
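One database-agnostic possibility (just a sketch, not part of the patch): enforce the limit in the model itself, so the failure can be triggered and tested on any backend:

# Sketch: validate the length in WikiContent. The 131072 limit is the one
# discussed above; the validation fires on every database, so a test can
# exercise it even where text columns are unbounded.
class WikiContent < ActiveRecord::Base
  validates_length_of :text, :maximum => 131072
end

# A matching test (Test::Unit style, as Redmine used at the time):
class WikiContentTest < Test::Unit::TestCase
  def test_rejects_text_over_the_limit
    content = WikiContent.new(:text => 'a' * 131073)
    content.valid?
    assert content.errors.on(:text)
  end
end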
Updated by Thomas Lecavelier over 16 years ago
It seems that the pgsql bug doesn't exist anymore. Here is the updated patch, against trunk@r1736. Hoping it solves the problem.
Updated by Mischa The Evil about 16 years ago
Thomas Lecavelier wrote:
Here is the updated patch, against trunk@r1736. Hoping it solves the problem.
I've just applied this patch on a clean r1900 using MySQL 4.1.20 to prevent the silent truncation of wiki pages bigger than 64K. However, when testing it by creating a new wiki page containing 65,485 characters (not counting LFs), I don't get any error message. Instead it still silently truncates the wiki page content to 64K.
Is this due to the patch (if so: how could it be solved?), or is my older MySQL version to blame?
Updated by Jean-Philippe Lang about 16 years ago
The db migration is missing in the updated patch. I'll have a look at it.
Updated by Thomas Lecavelier about 16 years ago
Erf... The migration comes back... Here is the corrected patch, against trunk@1905.
Hoping this patch is the one... -_-
Updated by Mischa The Evil about 16 years ago
Thomas Lecavelier wrote:
Here is the corrected patch, against trunk@1905
Ok, now the DB migration (which changes the column types to MEDIUMTEXT/MEDIUMBLOB, which can store 16 million characters) is included in the patch.
The above change will provide some more space (up to 16 million characters) for wiki pages, though the patch isn't fully functional IMHO.
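For reference, the Rails MySQL adapter picks MEDIUMTEXT/MEDIUMBLOB whenever the requested :limit exceeds 64KB, so the migration presumably reduces to something like this (class name taken from the later comments; details illustrative):

# Sketch: on MySQL, a :limit above 64KB makes the adapter emit
# MEDIUMTEXT/MEDIUMBLOB (up to 16MB) instead of TEXT/BLOB.
class EnlargeWikiContents < ActiveRecord::Migration
  def self.up
    change_column :wiki_contents, :text, :text, :limit => 16.megabytes
    change_column :wiki_content_versions, :data, :binary, :limit => 16.megabytes
  end

  def self.down
    change_column :wiki_contents, :text, :text
    change_column :wiki_content_versions, :data, :binary
  end
end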
Expected behaviour
The changes in wiki_controller.rb are IMO intended to show a flash notice and prevent saving (and thus truncating) the content to the DB when the wiki page is too big.
Actual behaviour
If I try to create a (new) wiki page longer than 64K characters without having applied the DB migration (otherwise I would have to test this with over 16 million characters in a page), I don't receive the flash message showing the language string value of text_wiki_content_too_large. Also, the content of the page is still truncated and saved to the DB.
HTH...
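For what it's worth, the kind of guard the patch seems to aim for in wiki_controller.rb would look roughly like this (a sketch only; the 64KB figure and the placement are assumptions, while l() and the string key come from the thread):

# Sketch: refuse to save when the submitted text exceeds what the column
# holds, instead of letting MySQL truncate it silently.
if params[:content][:text].size > 64.kilobytes
  flash.now[:error] = l(:text_wiki_content_too_large)
  render :action => 'edit'
  return
end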
Updated by Mischa The Evil over 15 years ago
Are there any updates on this issue?
Updated by Eric Gallimore about 15 years ago
I just encountered this behavior while importing a Trac database. I had a problem with two fields, and did this to resolve them:
- Edit the wiki_contents table to make column "text" type LONGTEXT (type TEXT is too short).
- Edit the wiki_content_versions table to make column "data" type LONGBLOB (type BLOB is too small).
Having done this, it seems that everything imported properly.
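Eric's manual fix corresponds to DDL along these lines (a sketch wrapped in a throwaway migration; the class name is hypothetical, and LONGTEXT/LONGBLOB raise the cap to 4GB on MySQL):

# Sketch of the manual column changes described above (MySQL-specific DDL).
class EnlargeColumnsForImport < ActiveRecord::Migration
  def self.up
    execute "ALTER TABLE wiki_contents MODIFY `text` LONGTEXT"
    execute "ALTER TABLE wiki_content_versions MODIFY `data` LONGBLOB"
  end
end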
Updated by Thomas Lecavelier about 15 years ago
Oh… It looks urgent now. I'll propose a new patch by the end of the week.
Updated by Thomas Lecavelier about 15 years ago
Rewritten patch (see comment in #1071) => raise wiki content to 16MB.
Updated by Jean-Philippe Lang about 15 years ago
Trying to migrate with a Postgres 8.3 database gives the following error:
== EnlargeWikiContents: migrating ============================================
-- change_column(:wiki_contents, :text, :text, {:limit=>16777216})
rake aborted!
An error has occurred, this and all later migrations canceled:
RuntimeError: ERROR C42601 Mtype modifier is not allowed for type "text"
F.\src\backend\parser\parse_type.c L273 RtypenameTypeMod:
ALTER TABLE "wiki_contents" ALTER COLUMN "text" TYPE text(16777216)
Updated by Thomas Lecavelier almost 15 years ago
Thank you for reviewing my patch, JP. According to this page:
http://www.postgresql.org/docs/7.4/interactive/datatype-character.html
it appears that the Postgres/Rails adapter should silently ignore the limit. I'll find or create a Rails bug report about it. I'll try to modify the patch along these two points:
- Don't submit the :limit key when the AR adapter is Postgres
- Detect capacity overflow in the wiki to warn of potential data loss (see the sketch below)
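A sketch of the second point (the helper name is hypothetical; columns_hash reports the limit the adapter actually created, and nil where the type is unbounded, as on Postgres):

# Sketch: compare the submitted size against the real column limit, so the
# check works on MySQL and is a no-op on PostgreSQL's unbounded text type.
def wiki_text_too_large?(text)
  limit = WikiContent.columns_hash['text'].limit
  limit && text.size > limit
end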
I'll update the patch in a while, since my first child was born this Saturday ;) Have a happy Xmas!
Updated by Gerry Gerry about 14 years ago
- Status changed from Closed to Reopened
Updated by Mischa The Evil about 14 years ago
- Status changed from Reopened to Closed
- % Done changed from 80 to 100
- Estimated time deleted (1.00 h)