[Koha-patches] [PATCH] Bug 8268: Add database dump to export tool

Marc Balmer marc at msys.ch
Tue Jun 19 04:22:39 CEST 2012


This is a MySQLism and can't go in as is.

The script backup.sh should really call backup_mysql.sh or backup_postgresql.sh, depending on which database is being used.

backup_postgresql.sh can be a stub for now, but provisions have to be made to allow for DB-independent backups.
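The dispatch could look something like this (a sketch only; the script names backup_mysql.sh / backup_postgresql.sh are hypothetical, and the helper just echoes the chosen script so the routing logic is testable on its own):

```shell
#!/bin/sh
# Sketch: backup.sh picks an engine-specific backup script based on the
# <db_scheme> entry in koha-conf.xml. Only the routing is shown here.
pick_backup_script() {
    case "$1" in
        mysql)        echo "backup_mysql.sh" ;;
        Pg|postgres*) echo "backup_postgresql.sh" ;;
        *)            echo "unsupported db_scheme '$1'" >&2; return 1 ;;
    esac
}

# In backup.sh proper one would then do something like:
#   scheme=$(xmlstarlet sel -t -v 'yazgfs/config/db_scheme' "$KOHA_CONF")
#   exec "$(dirname "$0")/$(pick_backup_script "$scheme")"
```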

- Marc


-- 
Marc Balmer
micro systems, http://www.msys.ch/
Tel. +41 61 383 05 10, Fax +41 61 383 05 12

Am 18.06.2012 um 23:47 schrieb Jared Camins-Esakov <jcamins at cpbibliography.com>:

> This patch builds on work by Lars Wirzenius for the Koha packages.
> 
> To date, the only way for a Koha librarian to obtain a complete backup
> of their system has been to log into the system via SSH (or FTP) to
> download the mysqldump file. This patch makes it possible for
> superlibrarians in properly configured systems to download nightly backups
> via the staff client's Export tool.
> 
> Because this functionality has potentially grave security
> implications, system administrators must manually enable it in the
> koha-conf.xml configuration file.
> 
> The following configuration settings have been added to the koha-conf.xml
> file:
> * backupdir => directory where backups should be stored.
> * backup_db_via_tools => whether to allow superlibrarians to download
>  database backups via the Export tool. The default is disabled, and
>  there is no way -- by design -- to enable this option without manually
>  editing koha-conf.xml.
> * backup_conf_via_tools => whether to allow superlibrarians to download
>  configuration backups via the Export tool (this may be applicable to
>  packages only). The default is disabled, and there is no way -- by
>  design -- to enable this option without manually editing koha-conf.xml.
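For reference, the new `<config>` entries look like this taken together (the backupdir value is illustrative; both flags default to off):

```xml
<config>
  <!-- ... existing entries ... -->
  <backupdir>/var/spool/koha/library</backupdir>
  <backup_db_via_tools>0</backup_db_via_tools>
  <backup_conf_via_tools>0</backup_conf_via_tools>
</config>
```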
> 
> This commit modifies the following scripts to make use of the new
> backupdir configuration option:
> * koha-dump and koha-run-backups in the Debian packages
> * The sample backup script misc/cronjobs/backup.sh
> 
> Note that for security reasons, superlibrarians will not be allowed
> to download files that are not owned by the web server's effective user.
> This imposes a de facto dependency on ITK (for Apache) or running the
> web server as the Koha user (as is done with Plack).
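The ownership rule mirrors Perl's -o file test (owned by the process's effective uid). A minimal shell illustration of the same check, using test's -O operator:

```shell
# Sketch: export.pl only serves files owned by its effective user
# (the web server, or the Koha user under Plack). The shell -O test
# performs the equivalent check to Perl's -o.
f=$(mktemp /tmp/ownercheck-XXXXXX)   # created by us, so owned by our euid
if [ -O "$f" ]; then
    echo "would be listed for download"
else
    echo "would be hidden"
fi
rm -f "$f"
```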
> 
> To test:
> 1. Apply patch.
> 2. Go to export page as a superlibrarian. Notice that no additional
>   export options appear because they have not been enabled.
> 3. Add <backupdir>$KOHADEV/var/spool</backupdir> to the <config> section
>   of your koha-conf.xml (note that you will need to adjust that so that
>   it is pointing at a logical directory).
> 4. Create the aforementioned directory.
> 5. Go to export page as a superlibrarian. Notice that no additional
>   export options appear because they have not been enabled.
> 6. Add <backup_db_via_tools>1</backup_db_via_tools> to the <config>
>   section of your koha-conf.xml
> 7. Go to the export page as a superlibrarian. Notice the new tab.
> 8. Go to the export page as a non-superlibrarian. Notice there is no
>   new tab.
> 9. Run: mysqldump -u koha -p koha | gzip > $BACKUPDIR/backup.sql.gz
>   (substituting appropriate user, password, and database name)
> 10. Go to the export page as a superlibrarian, and look at the "Export
>    database" tab. If you are running the web server as your Koha user,
>    and ran the above command as your Koha user, you should now see the
>    file listed as an option for download.
> 11. If you *did* see the file listed, change the ownership to something
>    else: sudo chown root:root $BACKUPDIR/backup.sql.gz
> 11a. Confirm that you no longer see the file listed when you look at the
>     "Export database" tab.
> 12. Change the ownership on the file to your web server (or Koha) user:
>    sudo chown www-data:www-data backup.sql.gz
> 13. Go to the export page as a superlibrarian, and look at the "Export
>    database" tab. You should now see backup.sql.gz listed.
> 14. Choose to download backup.sql.gz
> 15. Confirm that the downloaded file is what you were expecting.
> 
> If you are interested, you can repeat the above steps but replace
> <backup_db_via_tools> with <backup_conf_via_tools>, and instead of
> creating an sql file, create a tar file.
> 
> To test packaging: run koha-dump, confirm that it still creates a
> usable backup.
> ---
> Makefile.PL                                        |    9 ++
> debian/scripts/koha-dump                           |    6 +-
> debian/scripts/koha-run-backups                    |   13 ++-
> debian/templates/koha-conf-site.xml.in             |    6 ++
> etc/koha-conf.xml                                  |    6 ++
> .../intranet-tmpl/prog/en/modules/tools/export.tt  |   54 ++++++++++++
> misc/cronjobs/backup.sh                            |   18 ++---
> tools/export.pl                                    |   89 +++++++++++++++++++-
> 8 files changed, 181 insertions(+), 20 deletions(-)
> 
> diff --git a/Makefile.PL b/Makefile.PL
> index 9ac1474..46a3524 100644
> --- a/Makefile.PL
> +++ b/Makefile.PL
> @@ -226,6 +226,10 @@ command-line, e.g., READMEs.
> 
> Directory for Apache and Zebra logs produced by Koha.
> 
> +=item BACKUP_DIR
> +
> +Directory for backup files produced by Koha.
> +
> =item PAZPAR2_CONF_DIR
> 
> Directory for PazPar2 configuration files.
> @@ -293,6 +297,7 @@ my $target_map = {
>   './services'                  => 'INTRANET_CGI_DIR',
>   './skel'                      => 'NONE',
>   './skel/var/log/koha'         => { target => 'LOG_DIR', trimdir => -1 },
> +  './skel/var/spool/koha'       => { target => 'BACKUP_DIR', trimdir => -1 },
>   './skel/var/run/koha/zebradb' => { target => 'ZEBRA_RUN_DIR', trimdir => -1 },
>   './skel/var/lock/koha/zebradb/authorities' => { target => 'ZEBRA_LOCK_DIR', trimdir => 6 },
>   './skel/var/lib/koha/zebradb/authorities/key'  => { target => 'ZEBRA_DATA_DIR', trimdir => 6 },
> @@ -548,6 +553,7 @@ my %test_suite_override_dirs = (
>     KOHA_CONF_DIR  => ['etc'],
>     ZEBRA_CONF_DIR => ['etc', 'zebradb'],
>     LOG_DIR        => ['var', 'log'],
> +    BACKUP_DIR     => ['var', 'spool'],
>     SCRIPT_DIR     => ['bin'],
>     ZEBRA_LOCK_DIR => ['var', 'lock', 'zebradb'],
>     ZEBRA_DATA_DIR => ['var', 'lib', 'zebradb'],
> @@ -1227,6 +1233,7 @@ sub get_target_directories {
>         $dirmap{'DOC_DIR'} = File::Spec->catdir(@basedir, $package, 'doc');
>         $dirmap{'ZEBRA_LOCK_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'lock', 'zebradb');
>         $dirmap{'LOG_DIR'} =  File::Spec->catdir(@basedir, $package, 'var', 'log');
> +        $dirmap{'BACKUP_DIR'} =  File::Spec->catdir(@basedir, $package, 'var', 'spool');
>         $dirmap{'ZEBRA_DATA_DIR'} =  File::Spec->catdir(@basedir, $package, 'var', 'lib', 'zebradb');
>         $dirmap{'ZEBRA_RUN_DIR'} =  File::Spec->catdir(@basedir, $package, 'var', 'run', 'zebradb');
>     } elsif ($mode eq 'dev') {
> @@ -1256,6 +1263,7 @@ sub get_target_directories {
>         $dirmap{'DOC_DIR'} = File::Spec->catdir(@basedir, $package, 'doc');
>         $dirmap{'ZEBRA_LOCK_DIR'} = File::Spec->catdir(@basedir, $package, 'var', 'lock', 'zebradb');
>         $dirmap{'LOG_DIR'} =  File::Spec->catdir(@basedir, $package, 'var', 'log');
> +        $dirmap{'BACKUP_DIR'} =  File::Spec->catdir(@basedir, $package, 'var', 'spool');
>         $dirmap{'ZEBRA_DATA_DIR'} =  File::Spec->catdir(@basedir, $package, 'var', 'lib', 'zebradb');
>         $dirmap{'ZEBRA_RUN_DIR'} =  File::Spec->catdir(@basedir, $package, 'var', 'run', 'zebradb');
>     } else {
> @@ -1277,6 +1285,7 @@ sub get_target_directories {
>         $dirmap{'DOC_DIR'} = File::Spec->catdir(@basedir, $package, 'doc');
>         $dirmap{'ZEBRA_LOCK_DIR'} = File::Spec->catdir(File::Spec->rootdir(), 'var', 'lock', $package, 'zebradb');
>         $dirmap{'LOG_DIR'} =  File::Spec->catdir(File::Spec->rootdir(), 'var', 'log', $package);
> +        $dirmap{'BACKUP_DIR'} =  File::Spec->catdir(File::Spec->rootdir(), 'var', 'spool', $package);
>         $dirmap{'ZEBRA_DATA_DIR'} =  File::Spec->catdir(File::Spec->rootdir(), 'var', 'lib', $package, 'zebradb');
>         $dirmap{'ZEBRA_RUN_DIR'} =  File::Spec->catdir(File::Spec->rootdir(), 'var', 'run', $package, 'zebradb');
>     }
> diff --git a/debian/scripts/koha-dump b/debian/scripts/koha-dump
> index 99c3894..2fe9edd 100755
> --- a/debian/scripts/koha-dump
> +++ b/debian/scripts/koha-dump
> @@ -44,7 +44,9 @@ mysqlhost="$( xmlstarlet sel -t -v 'yazgfs/config/hostname' $kohaconfig )"
> mysqldb="$( xmlstarlet sel -t -v 'yazgfs/config/database' $kohaconfig )"
> mysqluser="$( xmlstarlet sel -t -v 'yazgfs/config/user' $kohaconfig )"
> mysqlpass="$( xmlstarlet sel -t -v 'yazgfs/config/pass' $kohaconfig )"
> -dbdump="/var/spool/koha/$name/$name-$date.sql.gz"
> +backupdir="$( xmlstarlet sel -t -v 'yazgfs/config/backupdir' $kohaconfig )"
> +[ -z "$backupdir" ] && backupdir="/var/spool/koha/$name"
> +dbdump="$backupdir/$name-$date.sql.gz"
> echo "* DB to $dbdump"
> mysqldump --databases --host="$mysqlhost" \
>     --user="$mysqluser" --password="$mysqlpass" "$mysqldb" | 
> @@ -54,7 +56,7 @@ chmod g+r "$dbdump"
> 
> 
> # Dump configs, logs, etc.
> -metadump="/var/spool/koha/$name/$name-$date.tar.gz"
> +metadump="$backupdir/$name-$date.tar.gz"
> echo "* configs, logs to $metadump"
> tar -C / -czf "$metadump" \
>     "etc/koha/sites/$name" \
> diff --git a/debian/scripts/koha-run-backups b/debian/scripts/koha-run-backups
> index 7bf39c5..9675f89 100755
> --- a/debian/scripts/koha-run-backups
> +++ b/debian/scripts/koha-run-backups
> @@ -17,7 +17,7 @@
> # Daily cron job for koha.
> # - dump all sites, except one called 'demo'
> 
> -dirname="/var/spool/koha"
> +dirname=""
> days="2"
> 
> show_help() {
> @@ -58,10 +58,15 @@ done
> for name in $(koha-list --enabled | grep -Fxv demo)
> do
>     koha-dump "$name" > /dev/null
> +    if [ -z "$dirname" ]; then
> +        backupdir="$( xmlstarlet sel -t -v 'yazgfs/config/backupdir' /etc/koha/sites/$name/koha-conf.xml )";
> +    else
> +        backupdir="$dirname/$name";
> +    fi
> 
>     # Remove old dump files.
>     # FIXME: This could probably be replaced by one line of perl.
> -    ls "$dirname/$name/" | 
> +    ls "$backupdir/" |
>     sed "s:^$name-\([0-9-]*\)\.\(sql\|tar\)\.gz$:\1:" |
>     sort -u |
>     tac |
> @@ -69,8 +74,8 @@ do
>     tac |
>     while read date
>     do
> -        tardump="$dirname/$name/$name-$date.tar.gz"
> -        sqldump="$dirname/$name/$name-$date.sql.gz"
> +        tardump="$backupdir/$name-$date.tar.gz"
> +        sqldump="$backupdir/$name-$date.sql.gz"
>         if [ -e "$tardump" ] && [ -e "$sqldump" ]
>         then
>             rm "$tardump"
> diff --git a/debian/templates/koha-conf-site.xml.in b/debian/templates/koha-conf-site.xml.in
> index a440c96..d8fbd7c 100644
> --- a/debian/templates/koha-conf-site.xml.in
> +++ b/debian/templates/koha-conf-site.xml.in
> @@ -263,6 +263,12 @@
>  <intrahtdocs>/usr/share/koha/intranet/htdocs/intranet-tmpl</intrahtdocs>
>  <includes>/usr/share/koha/intranet/htdocs/intranet-tmpl/prog/en/includes/</includes>
>  <logdir>/var/log/koha/__KOHASITE__</logdir>
> + <backupdir>/var/lib/koha/__KOHASITE__</backupdir>
> + <!-- Enable the two following to allow superlibrarians to download
> +      database and configuration dumps (respectively) from the Export
> +      tool -->
> + <backup_db_via_tools>0</backup_db_via_tools>
> + <backup_conf_via_tools>0</backup_conf_via_tools>
>  <!-- <pazpar2url>http://__PAZPAR2_HOST__:__PAZPAR2_PORT__/search.pz2</pazpar2url> -->
>  <install_log>/usr/share/koha/misc/koha-install-log</install_log>
>  <useldapserver>0</useldapserver><!-- see C4::Auth_with_ldap for extra configs you must add if you want to turn this on -->
> diff --git a/etc/koha-conf.xml b/etc/koha-conf.xml
> index f31e31c..bb79355 100644
> --- a/etc/koha-conf.xml
> +++ b/etc/koha-conf.xml
> @@ -282,6 +282,12 @@ __PAZPAR2_TOGGLE_XML_POST__
>  <intrahtdocs>__INTRANET_TMPL_DIR__</intrahtdocs>
>  <includes>__INTRANET_TMPL_DIR__/prog/en/includes/</includes>
>  <logdir>__LOG_DIR__</logdir>
> + <backupdir>__BACKUP_DIR__</backupdir>
> + <!-- Enable the two following to allow superlibrarians to download
> +      database and configuration dumps (respectively) from the Export
> +      tool -->
> + <backup_db_via_tools>0</backup_db_via_tools>
> + <backup_conf_via_tools>0</backup_conf_via_tools>
>  <pazpar2url>http://__PAZPAR2_HOST__:__PAZPAR2_PORT__/search.pz2</pazpar2url>
>  <install_log>__MISC_DIR__/koha-install-log</install_log>
>  <useldapserver>0</useldapserver><!-- see C4::Auth_with_ldap for extra configs you must add if you want to turn this on -->
> diff --git a/koha-tmpl/intranet-tmpl/prog/en/modules/tools/export.tt b/koha-tmpl/intranet-tmpl/prog/en/modules/tools/export.tt
> index fa98d78..9c27482 100644
> --- a/koha-tmpl/intranet-tmpl/prog/en/modules/tools/export.tt
> +++ b/koha-tmpl/intranet-tmpl/prog/en/modules/tools/export.tt
> @@ -29,6 +29,12 @@ $(document).ready(function() {
> <ul>
> <li><a href="#bibs">Export bibliographic records</a></li>
> <li><a href="#auths">Export authority records</a></li>
> +[% IF ( allow_db_export ) %]
> +<li><a href="#db">Export database</a></li>
> +[% END %]
> +[% IF ( allow_conf_export ) %]
> +<li><a href="#conf">Export configuration</a></li>
> +[% END %]
> </ul>
> <div id="bibs">
> <p>
> @@ -207,6 +213,54 @@ Calendar.setup(
> </form>
> </div>
> 
> +[% IF ( allow_db_export ) %]
> +<div id="db">
> +<form method="post" action="/cgi-bin/koha/tools/export.pl">
> +    <p><b>Note: This export file will be very large, and is generated nightly.</b></p>
> +    <fieldset class="rows">
> +    <legend> Choose a file </legend>
> +    [% IF ( dbfiles ) %]
> +        <ul>
> +        [% FOREACH dbfile IN dbfiles %]
> +            <li><label><input type="radio" name="filename" value="[% dbfile %]" /> [% dbfile %]</label></li>
> +        [% END %]
> +        </ul>
> +    [% ELSE %]
> +        <p>Unfortunately, no backups are available.</p>
> +    [% END %]
> +    </fieldset>
> +
> +    <input type="hidden" name="op" value="export" />
> +    <input type="hidden" name="record_type" value="db" />
> +    <fieldset class="action"><input type="submit" value="Download database" class="button" /></fieldset>
> +</form>
> +</div>
> +[% END %]
> +
> +[% IF ( allow_conf_export ) %]
> +<div id="conf">
> +<form method="post" action="/cgi-bin/koha/tools/export.pl">
> +    <p><b>Note: This export file will be very large, and is generated nightly.</b></p>
> +    <fieldset class="rows">
> +    <legend> Choose a file </legend>
> +    [% IF ( conffiles ) %]
> +        <ul>
> +        [% FOREACH conffile IN conffiles %]
> +            <li><label><input type="radio" name="filename" value="[% conffile %]" /> [% conffile %]</label></li>
> +        [% END %]
> +        </ul>
> +    [% ELSE %]
> +        <p>Unfortunately, no backups are available.</p>
> +    [% END %]
> +    </fieldset>
> +
> +    <input type="hidden" name="op" value="export" />
> +    <input type="hidden" name="record_type" value="conf" />
> +    <fieldset class="action"><input type="submit" value="Download configuration" class="button" /></fieldset>
> +</form>
> +</div>
> +[% END %]
> +
> </div>
> 
> </div>
> diff --git a/misc/cronjobs/backup.sh b/misc/cronjobs/backup.sh
> index 38026cb..0806c6c 100755
> --- a/misc/cronjobs/backup.sh
> +++ b/misc/cronjobs/backup.sh
> @@ -1,23 +1,19 @@
> #!/bin/sh
> # Script to create daily backups of the Koha database.
> # Based on a script by John Pennington
> +BACKUPDIR=`xmlstarlet sel -t -v 'yazgfs/config/backupdir' $KOHA_CONF`
> KOHA_DATE=`date '+%y%m%d'`
> -KOHA_DUMP=/tmp/koha-$KOHA_DATE.dump
> -KOHA_BACKUP=/tmp/koha-$KOHA_DATE.dump.gz
> +KOHA_BACKUP=$BACKUPDIR/koha-$KOHA_DATE.sql.gz
> 
> -mysqldump --single-transaction -u koha -ppassword koha > $KOHA_DUMP &&
> -gzip -f $KOHA_DUMP &&
> -# Creates the dump file and compresses it;
> -# -u is the Koha user, -p is the password for that user.
> -# The -f switch on gzip forces it to overwrite the file if one exists.
> +mysqldump --single-transaction -u koha -ppassword koha | gzip -9 > $KOHA_BACKUP
> 
> -mv $KOHA_BACKUP /home/kohaadmin &&
> -chown kohaadmin.users /home/kohaadmin/koha-$KOHA_DATE.dump.gz &&
> -chmod 600 /home/kohaadmin/koha-$KOHA_DATE.dump.gz &&
> +#mv $KOHA_BACKUP /home/kohaadmin &&
> +#chown kohaadmin.users /home/kohaadmin/koha-$KOHA_DATE.dump.gz &&
> +#chmod 600 /home/kohaadmin/koha-$KOHA_DATE.dump.gz &&
> # Makes the compressed dump file property of the kohaadmin user.
> # Make sure that you replace kohaadmin with a real user.
> 
> -echo "$KOHA_BACKUP was successfully created." | mail kohaadmin -s $KOHA_BACKUP ||
> +[ -f "$KOHA_BACKUP" ] && echo "$KOHA_BACKUP was successfully created." | mail kohaadmin -s $KOHA_BACKUP ||
> echo "$KOHA_BACKUP was NOT successfully created." | mail kohaadmin -s $KOHA_BACKUP
> # Notifies kohaadmin of (un)successful backup creation
> # EOF
> diff --git a/tools/export.pl b/tools/export.pl
> index b439a8d..88d34b8 100755
> --- a/tools/export.pl
> +++ b/tools/export.pl
> @@ -33,7 +33,7 @@ my $filename=$query->param("filename");
> my $dbh=C4::Context->dbh;
> my $marcflavour = C4::Context->preference("marcflavour");
> 
> -my ($template, $loggedinuser, $cookie)
> +my ($template, $loggedinuser, $cookie, $flags)
>     = get_template_and_user
>     (
>         {
> @@ -57,10 +57,23 @@ my ($template, $loggedinuser, $cookie)
>         $branch = C4::Context->userenv->{'branch'};
>    }
> 
> +my $backupdir = C4::Context->config('backupdir');
> +
> if ($op eq "export") {
> +    my $charset  = 'utf-8';
> +    my $mimetype = 'application/octet-stream';
>     binmode STDOUT, ':encoding(UTF-8)';
> -    print $query->header(   -type => 'application/octet-stream', 
> -                            -charset => 'utf-8',
> +    if ( $filename =~ m/\.gz$/ ) {
> +        $mimetype = 'application/x-gzip';
> +        $charset = '';
> +        binmode STDOUT;
> +    } elsif ( $filename =~ m/\.bz2$/ ) {
> +        $mimetype = 'application/x-bzip2';
> +        binmode STDOUT;
> +        $charset = '';
> +    }
> +    print $query->header(   -type => $mimetype, 
> +                            -charset => $charset,
>                             -attachment=>$filename);
> 
>     my $record_type        = $query->param("record_type");
> @@ -159,6 +172,30 @@ if ($op eq "export") {
>             push @sql_params, $authtype;
>         }
>     }
> +    elsif ( $record_type eq 'db' ) {
> +        my $successful_export;
> +        if ( $flags->{superlibrarian} && C4::Context->config('backup_db_via_tools') ) {
> +            $successful_export = download_backup( { directory => "$backupdir", extension => 'sql', filename => "$filename" } )
> +        }
> +        unless ( $successful_export ) {
> +            warn "A suspicious attempt was made to download the db at '$filename' by someone at " . $query->remote_host() . "\n";
> +        }
> +        exit;
> +    }
> +    elsif ( $record_type eq 'conf' ) {
> +        my $successful_export;
> +        if ( $flags->{superlibrarian} && C4::Context->config('backup_conf_via_tools') ) {
> +            $successful_export = download_backup( { directory => "$backupdir", extension => 'tar', filename => "$filename" } )
> +        }
> +        unless ( $successful_export ) {
> +            warn "A suspicious attempt was made to download the configuration at '$filename' by someone at " . $query->remote_host() . "\n";
> +        }
> +        exit;
> +    }
> +    else {
> +        # Someone is trying to mess us up
> +        exit;
> +    }
> 
>     my $sth = $dbh->prepare($sql_query);
>     $sth->execute(@sql_params);
> @@ -255,6 +292,16 @@ else {
>         push @authtypesloop, \%row;
>     }
> 
> +    if ( $flags->{superlibrarian} && C4::Context->config('backup_db_via_tools') && $backupdir && -d $backupdir ) {
> +        $template->{VARS}->{'allow_db_export'} = 1;
> +        $template->{VARS}->{'dbfiles'} = getbackupfilelist( { directory => "$backupdir", extension => 'sql' } );
> +    }
> +
> +    if ( $flags->{superlibrarian} && C4::Context->config('backup_conf_via_tools') && $backupdir && -d $backupdir ) {
> +        $template->{VARS}->{'allow_conf_export'} = 1;
> +        $template->{VARS}->{'conffiles'} = getbackupfilelist( { directory => "$backupdir", extension => 'tar' } );
> +    }
> +
>     $template->param(
>         branchloop               => \@branchloop,
>         itemtypeloop             => \@itemtypesloop,
> @@ -264,3 +311,39 @@ else {
> 
>     output_html_with_http_headers $query, $cookie, $template->output;
> }
> +
> +sub getbackupfilelist {
> +    my $args = shift;
> +    my $directory = $args->{directory};
> +    my $extension = $args->{extension};
> +    my @files;
> +
> +    if ( opendir(my $dir, $directory) ) {
> +        while (my $file = readdir($dir)) {
> +            next unless ( $file =~ m/\.$extension(\.(gz|bz2|xz))?$/ );
> +            push @files, $file if ( -f "$directory/$file" && -o "$directory/$file" );
> +        }
> +        closedir($dir);
> +    }
> +
> +    return \@files;
> +}
> +
> +sub download_backup {
> +    my $args = shift;
> +    my $directory = $args->{directory};
> +    my $extension = $args->{extension};
> +    my $filename  = $args->{filename};
> +
> +    return unless ( $directory && -d $directory );
> +    return unless ( $filename =~ m/\.$extension(\.(gz|bz2|xz))?$/ && not $filename =~ m#(^\.\.|/)# );
> +    $filename = "$directory/$filename";
> +    return unless ( -f $filename && -o $filename );
> +    return unless ( open(my $dump, '<', $filename) );
> +    binmode $dump;
> +    while (read($dump, my $data, 64 * 1024)) {
> +        print $data;
> +    }
> +    close ($dump);
> +    return 1;
> +}
> -- 
> 1.7.2.5
> 
> _______________________________________________
> Koha-patches mailing list
> Koha-patches at lists.koha-community.org
> http://lists.koha-community.org/cgi-bin/mailman/listinfo/koha-patches
> website : http://www.koha-community.org/
> git : http://git.koha-community.org/
> bugs : http://bugs.koha-community.org/
