I'm writing a CMS in PHP+MySQL. I want it to be self-updatable (through one click in the admin panel). What are the best practices?
How do I compare the current version of the CMS with the version of an update (both the application itself and the database)? Should it just download a zip archive, unzip it and overwrite the files? (But what should happen to files that are no longer used?) How do I check that an update was downloaded correctly? The CMS also supports modules, and I want these modules to be downloadable from the admin panel.
And how should I update MySQL tables?
A slightly more experimental solution could be to use something like the phpsvnclient library.
With features:
- List all files in a given SVN repository directory
- Retrieve a given revision of a file
- Retrieve the log of changes made in a repository or in a given file between two revisions
- Get the repository latest revision
This way you can see if there are new files, removed files or updated files and only change those in your local application.
I reckon this will be a little harder to implement, but the benefit would probably be that it is easier and quicker to add updates to your CMS.
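Whatever the transport (an SVN client or plain HTTP), the heart of this approach is diffing a remote file list against the local install. Here is a minimal sketch of that diff, assuming the update server can serve a JSON manifest mapping relative paths to md5 checksums; the manifest URL and format are illustrative assumptions, not part of phpsvnclient:

```php
<?php
// Hypothetical manifest: the update server returns a JSON map of
// relative path => md5 checksum for the latest release.
$manifest = json_decode(
    file_get_contents('https://updates.example.com/manifest.json'),
    true
);

$root = __DIR__;
$new = $changed = $removed = [];

// New or changed files: present in the manifest, missing or different locally.
foreach ($manifest as $path => $md5) {
    $local = $root . '/' . $path;
    if (!file_exists($local)) {
        $new[] = $path;
    } elseif (md5_file($local) !== $md5) {
        $changed[] = $path;
    }
}

// Removed files: on disk locally, but absent from the manifest.
$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS)
);
foreach ($iterator as $file) {
    $relative = substr($file->getPathname(), strlen($root) + 1);
    if (!isset($manifest[$relative])) {
        $removed[] = $relative;
    }
}
```

With those three lists you download only the new/changed files and delete the removed ones, which also answers the "files that are no longer used" part of the question.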
RJD22
so what is your solution? I think that the file permission problem will occur regardless of which way is used, php-svn or downloading zip archives. – Mulvey
- Keep your code in a separate location from configuration and other variable files (uploaded images, cache files, etc.)
- Keep the modules separate from the main code as well.
- Make sure your code has file system permissions to change itself (use SuPHP for example).
If you do these, the simplest approach would be to completely download the new version (no incremental patches) and unzip it into a directory adjacent to the one containing the current version. Because there won't be variable files inside the code directory, you can just remove or rename the old one and rename the new one to replace it.
You can keep the version number in a global constant in the code.
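For example, a rough sketch of the constant plus the directory swap; all paths and version numbers here are illustrative:

```php
<?php
// version.php, shipped inside the code tree:
define('CMS_VERSION', '1.4.2');

// Upgrade: unpack the new release next to the current one...
$zip = new ZipArchive();
if ($zip->open('/var/www/releases/cms-1.5.0.zip') === true) {
    $zip->extractTo('/var/www/releases/cms-1.5.0');
    $zip->close();
}

// ...then swap directories. Because no variable files live inside
// the code directory, the swap is two cheap renames, and rolling
// back is just the reverse.
rename('/var/www/cms-current', '/var/www/cms-old');
rename('/var/www/releases/cms-1.5.0', '/var/www/cms-current');
```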
As for MySQL, there's no way around writing an upgrade script for every version that changes the DB layout. Even automatic solutions that change the table definition can't know how to migrate the existing data.
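One common way to organize those scripts (the layout below is just one option, not a standard): one statement set per schema version, applied in order, with the current schema version recorded in the database. The table and column names are assumptions for the sketch:

```php
<?php
// Each entry upgrades the schema by exactly one version.
$migrations = [
    2 => "ALTER TABLE articles ADD COLUMN slug VARCHAR(255) NOT NULL DEFAULT ''",
    3 => 'CREATE TABLE tags (id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(100))',
];

$pdo = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');

// Assumes a one-row schema_version table tracking the installed version.
$current = (int) $pdo->query('SELECT version FROM schema_version')->fetchColumn();

foreach ($migrations as $version => $sql) {
    if ($version > $current) {
        $pdo->exec($sql);
        $pdo->exec('UPDATE schema_version SET version = ' . $version);
    }
}
```

Data conversions that a bare ALTER can't express go into the same per-version script as additional statements or PHP code.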
You have two scenarios to deal with:
- The web server can write to files.
- The web server can not write to files.
This just dictates whether you will be decompressing a ZIP file or using FTP to update the files. In either case, your first step is to take a dump of the database and a backup of the existing files, so that the user can roll back if something goes horribly wrong. As others have said, it's important to keep anything that the user is likely to customize out of the scope of the update. WordPress does this nicely. If a user has made changes to core logic code, they are likely smart enough to resolve any merge conflicts on their own (and smart enough to know that a one-click upgrade is probably going to lose their modifications).
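A minimal sketch of that backup step, assuming mysqldump is available on the host; credentials and paths are placeholders:

```php
<?php
$stamp = date('Ymd-His');

// Dump the database so the user can roll back.
exec(sprintf(
    'mysqldump --user=%s --password=%s cms > /tmp/cms-db-%s.sql',
    escapeshellarg('dbuser'),
    escapeshellarg('dbpass'),
    $stamp
));

// Archive the current code files as well.
$zip = new ZipArchive();
$zip->open("/tmp/cms-files-$stamp.zip", ZipArchive::CREATE);
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/var/www/cms', FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    // Store entries relative to the web root.
    $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen('/var/www/cms') + 1));
}
$zip->close();
```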
Your second step is to make sure that your script doesn't die if the browser is closed. This is a process that really should not be interrupted. You could accomplish this via ignore_user_abort(true), or some other means. Or, if you like, allow the user to check a box that says "Keep going even if I get disconnected". I'm assuming that you'll be handling errors internally.
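For instance, at the top of the upgrade script:

```php
<?php
// Keep running even if the admin closes the browser mid-upgrade,
// and lift PHP's execution time limit for the duration.
ignore_user_abort(true);
set_time_limit(0);
```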
Now, depending on permissions, you can either:
- Compress the files to be updated to the system /tmp directory
- Compress the files to be updated to a temporary file in the home directory
Then you are ready to either:
- Download and decompress the update in situ, that is, in place
- Download and decompress the update to the system's /tmp directory and use FTP to update the files in the web root (the FTP variant is sketched below)
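For the second, FTP-based path, PHP's FTP extension covers the basics. A sketch, with host, credentials and paths as placeholders (and assuming the remote directories already exist; real code would create them with ftp_mkdir as needed):

```php
<?php
// Decompress the update into a staging area first.
$zip = new ZipArchive();
if ($zip->open('/tmp/update.zip') === true) {
    $zip->extractTo('/tmp/update');
    $zip->close();
}

// Then push the files into the web root over FTP,
// since the web server itself cannot write there.
$conn = ftp_connect('localhost');
ftp_login($conn, 'ftpuser', 'ftppass');

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/tmp/update', FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    $remote = '/public_html/' . substr($file->getPathname(), strlen('/tmp/update') + 1);
    ftp_put($conn, $remote, $file->getPathname(), FTP_BINARY);
}
ftp_close($conn);
```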
You can then:
- Apply any SQL changes as needed
- Ask the user if everything went OK
- Roll back if things went badly
- Clean up your staging area in the system /tmp directory, or any staging files in the user's web root / home directory.
The most important aspect is making sure you can roll back changes if things go bad. The other thing to ensure is that, if you use /tmp, you check the permissions of your staging area; 0600 should do nicely.
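For instance, when creating the staging area (paths illustrative):

```php
<?php
// A private, unguessable staging directory: 0700 on the directory,
// 0600 on the package we drop into it.
$staging = '/tmp/cms-upgrade-' . bin2hex(random_bytes(8));
mkdir($staging, 0700);

$package = file_get_contents('https://updates.example.com/cms-1.5.0.zip'); // placeholder URL
file_put_contents("$staging/update.zip", $package);
chmod("$staging/update.zip", 0600);
```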
Take a look at how WordPress and others do it. If your choice of license and theirs agree, you might even be able to re-use some of that code.
Good luck with your project.
The web server can write to files. Nice suggestions about taking a dump of the database and a backup of the existing files in case something goes wrong, and about making sure that the script doesn't die if the browser is closed. Thanks. – Mulvey
There is a SQL library called SQLOO (which I created) that attempts to solve this problem. It's still a little rough, but the basic idea is that you set up the SQL schema in PHP code and SQLOO then changes the current database schema to match the code. This allows the SQL schema and the attached PHP code to be changed together, in much smaller chunks.
http://code.google.com/p/sqloo/
http://code.google.com/p/sqloo/source/browse/#svn/trunk/example (examples)
Based on experience with a number of applications, CMS and otherwise, this is a common pattern:
- Upgrades are generally one-way. It's possible to take a snapshot of the full system state for a restore upon failure, but restoring usually entails losing any data/content/logs added to the system since the upgrade. Performing an incremental rollback can put data at risk if something was not converted properly (e.g. database table changes, content conversions, foreign key constraints, index creation, etc.). This is especially true if you've made customizations that rollback scripts couldn't possibly account for.
- Upgrade files are packaged with some means of authentication/verification, such as md5 or sha1 hashes and/or a digital signature, to ensure they came from a trusted source and were not tampered with (see the verification sketch after this list). This is particularly important for automated upgrade processes. Suppose a hacker exploited a vulnerability and pointed the updater at a rogue source.
- Application should be in an offline mode during the upgrade.
- Application should perform a self-check after an upgrade.
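A sketch of the verification step from the second point, assuming the vendor publishes a SHA-1 hash alongside each package (both URLs are placeholders):

```php
<?php
// Fetch the package and its published hash from the vendor.
$package  = file_get_contents('https://updates.example.com/cms-1.5.0.zip');
$expected = trim(file_get_contents('https://updates.example.com/cms-1.5.0.zip.sha1'));

// Refuse to install anything that doesn't match.
if (!hash_equals($expected, sha1($package))) {
    throw new RuntimeException('Update package failed verification; aborting.');
}
file_put_contents('/tmp/cms-1.5.0.zip', $package);
```

Note that a hash fetched over the same channel as the package only guards against corruption; for protection against tampering you need the digital signature mentioned above.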
I agree with Bart van Heukelom's answer; it's the most usual way of doing it.
The only other option would be to turn your CMS into a bunch of remote Web Services/scripts and external CSS/JS files that you host in one location only.
Then everyone using your CMS would connect to your central "CMS server" and all that would be on their (calling) server is a bunch of scripts to call your Web Services/scripts that do all the processing and output. If you went down this route you'd need to identify/authenticate each request so that you returned the corresponding data for the given CMS user.