[NBLUG/talk] I hate perl. :-)

Eric Eisenhart eric at nblug.org
Thu Jun 1 09:51:23 PDT 2006

> >Well, with JSP you have the possibility of a persistent application
> >that completely skips the whole dynamic compilation phase.
> That still seems rather silly, since Java code needs to be run in a 
> virtual machine and so will still take a performance hit.  If you're 
> looking for performance, wouldn't it make more sense to compile it into 
> the machine's native code?

The performance hit is actually quite small.  And there's always the
possibility of Just In Time (JIT) compilation where the java bytecode is
compiled into native code on load.  Most programs end up with their primary
performance constraint being external anyways: waiting on files to be read
off disk, waiting on the database engine, waiting for network I/O, etc... 
Or fundamental algorithms.

Performance hits at that level mostly vanish in the noise anyway.
Programmers are more expensive than hardware.  Imagine you've got a
decent sized project that'll take 10 programmers a year to write in Java or
either twice as many programmers or twice as much time to write in C.  Even
if the Java runs 10 times slower, you can buy an awful lot of hardware with
your $1M+ in savings.  And it's entirely possible to write a faster program
in a "slower" language.  (Given just the memory management issues, I'm
guessing my 2:1 ratio of programmer time is a much lower ratio than most
projects would actually encounter)

> >You also have
> >the advantage (esp. as compared to PHP) of a much more consistent and
> >organized language.  Basically Java is a designed language vs. the
> >grown/hacked on mess of inconsistency that PHP is.
> >  
> Fair enough, although I have yet to run into a web application that was 
> so complex that this would have made any difference.  All in good time, 
> I suppose...

You pretty much need a team of more people than you can count on your hands
spending months on just that project.  For something like that I'd be likely
to insist on Java, despite my own personal love of Perl.  Perl's at least a
designed, organized and somewhat consistent language, but the OO stuff lacks
the structure and enforcement a big team needs.

I'll give you a quick example of how a "slower" language can be faster than
a "fast" language.  A simple enough task: counting lines.

In C, written the lazy (obvious/natural for a C programmer) way:
#include <stdio.h>

int main() {
  int linecount=0;
  int c;
  while( (c=getc(stdin)) != EOF ) {
    if (c == '\n') {
      linecount++;
    }
  }
  printf("%d\n", linecount);
  return 0;
}
In perl, written similarly lazy:
#!/usr/bin/perl
my $linecount = 0;
while(<>) {
  $linecount++;
}
print "$linecount\n";

Note that the perl implementation is about half as many lines, therefore
likely about half as much time to write, half as many bugs, half as much
time to debug, etc.  These two do differ in opinion about whether a file
that doesn't end in a newline ends with a line or just some random trailing
bytes: the C version counts only newline characters, while Perl's readline
treats a final unterminated chunk as a line.

so, I put the first in "linecount.c" and "gcc -o linecount linecount.c", the
second in linecount.pl (chmod +x linecount.pl) of course.

$ time ./linecount < linecount.c

real    0m0.006s
user    0m0.000s
sys     0m0.000s

6 thousandths of a second, and 13 lines.  Same result as wc -l.

$ time ./linecount.pl < linecount.c

real    0m0.021s
user    0m0.010s
sys     0m0.010s

Perl is obviously much slower: about a 7:2 speed ratio.  Perl's gotta compile
first, then run its funny bytecode language, etc...

Something bigger, perhaps:
$ time ./linecount < /usr/lib/libperl.a

real    0m0.600s
user    0m0.600s
sys     0m0.000s

Over half a second.

$ time ./linecount.pl < /usr/lib/libperl.a

real    0m0.076s
user    0m0.060s
sys     0m0.010s

What's this?  Perl's now faster than C?  Well, of course it is, my Perl
program has a highly optimized set of I/O routines that outsmart me behind
the scenes by prereading big old chunks of file in sizes the OS likes, and
my program processes things a line at a time instead of a character at a
time.

Though, to be fair:
$ time wc -l < /usr/lib/libperl.a

real    0m0.028s
user    0m0.000s
sys     0m0.020s

I looked in the wc code, and it does similar smart things to what Perl does
behind the scenes on my behalf: it's reading in 16K at a time and counting
things within that buffer. But taking its 683 lines (plus a key function
from another file in the coreutils package) down to something that just
counts lines of STDIN would still result in something longer than both
programs put together with yet more chances for bugs.
Eric Eisenhart
NBLUG Co-Founder, Scribe and InstallFest Coordinator
The North Bay Linux Users Group -- http://nblug.org/
eric at nblug.org, IRC: Freiheit at fn AIM: falschfreiheit
