[ROBOTS JJA] Lib-WWW-perl5

Juan Jose Amor (jjamor@infor.es)
Wed, 13 Nov 1996 09:59:57 +0100 (MET)


Hello,

We are trying to run some programs (web robots and the like) written in
Perl 5 which use the Lib-WWW-Perl5 package.

In practice we cannot run these programs at all, because they abort with
a segmentation fault.
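
For reference, a minimal libwww-perl fetch looks something like the sketch
below (the URL is only a placeholder), just to show the kind of request
these robots issue:

------------
#!/usr/bin/perl
# Minimal libwww-perl check: load LWP and fetch a single page.
# Sketch only -- the URL below is a placeholder.

use LWP::UserAgent;
use HTTP::Request;

$ua  = new LWP::UserAgent;
$req = new HTTP::Request 'GET', 'http://www.example.com/';
$res = $ua->request($req);

if ($res->is_success) {
    print "OK, got ", length($res->content), " bytes\n";
} else {
    print "Request failed: ", $res->code, " ", $res->message, "\n";
}
------------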

For example, when we run the checkbot program under the Perl debugger
(the -d flag), the diagnostics are:

------------
Stack dump during die enabled outside of evals.

Loading DB routines from perl5db.pl patch level 0.94
Emacs support available.

Enter h or `h h' for help.

Got SEGV!
Signal SEGV at /usr/lib/perl5/AnyDBM_File.pm line 6
eval {...} called at /usr/lib/perl5/AnyDBM_File.pm line 9
require AnyDBM_File.pm called at checkbot line 126
main::BEGIN called at /usr/lib/perl5/DynaLoader.pm line 0
eval {...} called at /usr/lib/perl5/DynaLoader.pm line 0
IOT trap
------------
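
If I read the trace correctly, the crash happens inside the eval block in
AnyDBM_File.pm that requires the first DBM backend it can find. A short
probe script like the sketch below (the module list is the usual
AnyDBM_File default order) should stop, or crash, at whichever DBM module
is broken, which would at least tell us which DBM library to look at:

------------
#!/usr/bin/perl
# Probe each DBM backend that AnyDBM_File normally tries, one at a time.
# If one of them segfaults, this script dies at that module, which tells
# us which DBM library is the problem.
# Sketch only: the list is the usual AnyDBM_File default order.

@candidates = qw(NDBM_File DB_File GDBM_File SDBM_File ODBM_File);

foreach $mod (@candidates) {
    print "Trying $mod ... ";
    if (eval "require $mod") {
        print "ok\n";
    } else {
        print "failed: $@";
    }
}
------------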

Can you help me? We are using Perl 5.003 compiled on Linux 2.0.23 (from
Slackware 96) and libwww-perl 5.03.

Thanks in advance.

Cheers, Juanjo AI

==============================================================================
= Juan Jose Amor Iglesias InterNet E-Mail: jjamor@infor.es =
= Inforespaña FidoNet NETMail: Juan Jose Amor, 2:341/12.19 =
= SubNet NETMail: Juan Jose Amor, 93:341/102.19 =
= WWW: http://lml.ls.fi.upm.es/~jjamor/ =
= ** Strategie de la Rupture ** PGP key Id: 4601E115 available ** =
= =
= Serrano, 81. Madrid (SPAIN) =
==============================================================================

_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html