ID: 26042
Updated by: [email][/email]
Reported By: vsv3 at alu dot ua dot es
-Status: Verified
+Status: Closed
Bug Type: mcrypt related
Operating System: Linux 2.4.22 Debian Woody
PHP Version: 4CVS, 5CVS
New Comment:

This bug has been fixed in CVS.

In case this was a PHP problem, snapshots of the sources are packaged
every three hours; this change will be in the next snapshot. You can
grab the snapshot at [url][/url].

In case this was a documentation problem, the fix will show up soon at

In case this was a website problem, the change will show
up on the site and on the mirror sites shortly.

Thank you for the report, and for helping us make PHP better.

Fixed in 5.0; will be fixed in the 4.x branch after the 4.3.4 release.

Previous Comments:

[2003-10-30 18:06:48] [email][/email]

Looks like a leak starting in mcrypt_generic_init(). When
mcrypt_generic_init() is called from PHP userland, it
never gets deinit'd unless you explicitly call
mcrypt_generic_deinit().

Perhaps mcrypt_generic_init() should check whether the
td has already been init'd and, if so, deinit it first.
I haven't looked too closely at the libmcrypt source yet
to see the easiest way to do that.

This is using libmcrypt 2.5.7 and an up-to-date 4_3.



[2003-10-30 10:41:23] vsv3 at alu dot ua dot es

When I use these functions to encrypt some data, the memory that PHP
uses doesn't get freed. It consumes more and more memory.

Reproduce code:
$key = '123456789012345678901234567890';
$iv = '12345678';

$nVeces = 100000;
$n = 0;
$td = mcrypt_module_open( MCRYPT_FISH, '', MCRYPT_MODE_CBC, '' );
while( $n < $nVeces ) {
    $fichero = file_get_contents( "/tmp/hola" );

    mcrypt_generic_init( $td, $key, $iv );
    $fichero_enc = addslashes( mcrypt_generic($td, $fichero) );
    unset( $fichero_enc );

    if( isset($fichero_enc) ) echo "<b><h1>We could not destroy the
variable</h1></b><br />";
    $n = $n + 1;
}
mcrypt_module_close( $td );

Expected result:
A script that executes consuming less than 1 MB of memory (with a
'/tmp/hola' file of 1 kB).

Actual result:
A script that consumes 100 MB of memory or more. A similar script with
other data has consumed more than 2000 MB.


Edit this bug report at [url][/url]