When trying to upload a large video (a 1.5GB MP4) to the media library, I receive the following error: General error: 2006 MySQL server has gone away. The upload status bar climbs to 100%, but then after about another minute it shows the error. I don’t seem to have an issue with smaller video files. Are there any settings I can change to solve the problem? This is a 1.8.4 Docker install on a Windows 10 machine.
Sounds like the upload is taking longer than MySQL is prepared to wait for an update from the webapp.
You’d need to increase MySQL’s timeouts to resolve it.
To do so, you’ll need to adjust your docker-compose file slightly in the cms-db section:
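For example, a minimal sketch of that section (based on the standard xibo-docker compose layout - your image tag and other service settings may differ, and 600 is an illustrative value rather than a recommendation):

```yaml
# docker-compose.yml excerpt - only the cms-db service shown.
# "--wait-timeout" is in seconds; choose a value comfortably longer
# than your largest upload takes to process.
cms-db:
    image: mysql:5.6
    command: --wait-timeout=600
```

You can confirm the active value afterwards with `SHOW VARIABLES LIKE 'wait_timeout';` from a MySQL client.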
Where wait-timeout is the number of seconds MySQL will hold an inactive connection open before dropping it.
Update … just to try something different, I created a fresh install of 1.8.4 and was able to get a working file after 2 tries. There is no obvious indication that Test file 1 had an issue - it shows as media #20 with the correct duration - but as compared to Test file 2 (media #21, which does work) I can now see that the file size is not correct. Also, looking at the file size in windows explorer, it is now apparent that media #21 is exactly the same size (1,592,990 KB) as the original file (output.mp4). Every other attempt created a file that was slightly smaller.
Still cannot get the file to work on the production install. What is most concerning is that there is no error produced (now that MySQL is no longer timing out) until the player tries to play the file. Even then, the player just shows a blank screen - I would not have known to check the log if I did not happen to catch that the video was not playing.
Thanks again for any input you may have!
So what changed between you getting a bad upload and a good one?
Do you consistently get a good upload now or not?
Nothing changed between getting a bad upload and a good one - same computer, same database, same file. New database is more reliable - the upload succeeded 7 out of 12 times. 3 of the 5 failures were a result of trying to upload 4 at a time - and at least an error was produced upfront on those (see below - the 4th file, Test file9, uploaded ok).
Machine 1 - cannot get file to upload with at least a dozen attempts.
Machine 2 - now has 2 databases. “A”, very similar to the one on machine 1, cannot upload. “B” is a newly created database - and, as mentioned above, is more reliable, but still not 100%.
Both machines have Windows 10 Professional installed.
Both machines have the same version of Docker and CMS 1.8.4.
Both are using the same “output.mp4” file for the upload.
I thought you said it gave you no indication that the file hadn’t uploaded correctly, yet your screenshot shows an error on upload.
That upload error is normally caused either by filesystem permissions, or by a temporary file left over from a previous failed upload attempt. I’d try clearing the shared/cms/library/temp directory and see if that changes anything.
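If you want to do that from a shell on the host, a minimal sketch (the path assumes the default shared volume layout; the first two lines just simulate a leftover file for illustration):

```shell
# Simulate and then clear a leftover temp file from a failed upload.
# The directory path assumes the default shared volume layout.
mkdir -p shared/cms/library/temp
touch shared/cms/library/temp/stale_upload.part   # stand-in for a leftover file
rm -f shared/cms/library/temp/*                   # clear the directory
```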
I don’t have an issue with those 3 failures - as mentioned above, at least an error was generated informing of the problem. No error was produced for any of the other dozen-plus failures. I checked shared/cms/library/temp - the folder is empty.
OK. Please can you step through the Report Fault wizard and capture one of these failures, so we can see what the CMS is seeing?
As a test, I’m uploading a 3GB file here on 1.8.4.
It uploads without errors as expected, and the file size and contents match in the library (I’m running md5sum on them both to verify).
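If you want to run the same check, a minimal sketch (the file names here are stand-ins - point them at your original file and the stored copy under shared/cms/library/):

```shell
# Compare an original file against its uploaded copy byte-for-byte.
# Both paths are illustrative stand-ins for the real files.
printf 'example payload' > original.mp4   # stand-in for output.mp4
cp original.mp4 uploaded.mp4              # stand-in for the library copy
md5sum original.mp4 uploaded.mp4          # identical hashes => identical files
```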
What settings exactly have you got in your config.env for:
There are no settings for the 3 items you list. Should there be? If so, how do I add them?
Report fault created a troubleshoot file - how do I send that to you?
So if you’re not setting them explicitly, then they’ll default as follows:
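As an illustration only - the variable names below follow the CMS_PHP_* convention used by xibo-docker, and both the names and the default values are my assumptions here, so verify them against your own release:

```
# Likely defaults for the xibo-docker CMS container (assumptions -
# check the image for your release before relying on these):
CMS_PHP_POST_MAX_SIZE=2G
CMS_PHP_UPLOAD_MAX_FILESIZE=2G
CMS_PHP_MAX_EXECUTION_TIME=300
```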
You might want to increase your MAX_EXECUTION_TIME so that it’s slightly longer than whatever value you’ve set the mysql timeout to.
To do so, you can just add a line to config.env, and then down/up the containers.
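A sketch of that edit (the variable name assumes the CMS_PHP_* convention from xibo-docker, and 600 is an illustrative value - pick something longer than whatever MySQL timeout you configured):

```
# Line to append to config.env. Name and value are assumptions;
# the point is that max_execution_time must exceed the MySQL
# wait-timeout so PHP isn't killed mid-upload.
CMS_PHP_MAX_EXECUTION_TIME=600
```

Then run `docker-compose down` followed by `docker-compose up -d` so the containers are recreated with the new value.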
Increasing max_execution_time might have solved the problem. I’ll do some further testing and let you know.
Can I play 5GB files without any problems with these changes?
I’ve tested up to 3GB here, but I don’t see why you couldn’t go up to 5GB.
Why not include these changes in the default Docker files? It only has upsides.
Not everyone will want their users uploading multi-gigabyte files.
The entries in config.env have been supported since PHP 5.6 was introduced in release 1.8.3.
If you want large file uploads, then that is all that should be required.
The change to the docker-compose file for MySQL seems to only apply to the OP here. I’ve tested with larger files than they are using and haven’t hit MySQL timeouts at all in the same scenario.
Thanks for the answer. For future releases, I think an option to set the maximum upload file size in the web interface would be good - i.e. set the maximum possible size in config.env, and then have a lower limit switchable in the web UI.
It’s not possible to set it in the web interface, as it’s a function of PHP, not Xibo.