What are some specific examples of backward incompatibilities in Perl versions?

Twenty-two years separate the initial public release of Perl 1.0 (December 18, 1987) from the current stable release, 5.10.1 (2009).

During those 22 years the following notable releases have been made:

  • Perl 1.0 (1987 - initial release)
  • Perl 2 (1988 - better regular expressions)
  • Perl 3 (1989 - support for binary data streams)
  • Perl 4 (1991 - released largely to identify the version of Perl described in the Camel Book)
  • Perl 5 (1994 - major changes introduced, near complete rewrite of the interpreter)
  • Perl 5.6 (2000 - 64-bit support, Unicode strings, large file support)
  • Perl 5.8 (2002 - improved Unicode support, new IO implementation)
  • Perl 5.10 (2007 - new switch statement, regular expression updates, smart match operator)

I'm looking for specific examples of backwards incompatibilities during the history of Perl.

Question:

  • In the 22 year history of Perl, are there any examples of Perl backwards incompatibility where Perl source code targeting Perl version X won't run under version Y (where Y > X)?

Please include references and code examples when possible.

Endowment answered 6/12, 2009 at 14:11 Comment(0)

One of the biggest deliberate incompatibilities is array interpolation which changed between Perl 4 and Perl 5.

my @example = qw(1 2 3);
print "[email protected]";

In Perl 4 that would be:

foo@example.com

In Perl 5 that would be:

foo1 2 3.com

Fortunately, if the array doesn't exist Perl will warn you about "possible unintended interpolation".

Threads underwent a big change between 5.005 and 5.6. "5005 threads" used the traditional POSIX threading model in which all global data is shared. In theory this was faster, since Perl could lean directly on POSIX threads, but in practice it was a nightmare for Perl coders: most Perl modules were not thread-safe, and it never really worked well.

In 5.6, ActiveState and others implemented fork() emulation on Windows. When you fork() on Windows, Perl makes a copy of the interpreter object and runs the opcodes of both interpreters in the same process. This was known as "multiplicity".

In 5.8, Arthur Bergman ran with that and used it to create ithreads. Because multiplicity is emulating a separate process, no data is shared by default. Only data you say is shared is shared. This makes them much safer to use, though it took a long time before ithreads were stable. Folks like Elizabeth Mattijsen and Jerry Hedden made that happen.
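
To make that concrete, here is a minimal ithreads sketch (nothing beyond the core threads and threads::shared modules is assumed): data is private unless you explicitly mark it shared.

use threads;
use threads::shared;

my $private = 0;            # each thread works on its own copy
my $shared : shared = 0;    # explicitly marked as shared

my @workers = map {
    threads->create(sub {
        $private++;                       # invisible outside this thread
        { lock($shared); $shared++; }     # visible to all threads
    });
} 1 .. 4;

$_->join for @workers;

print "private: $private, shared: $shared\n";   # private: 0, shared: 4

Run under an ithreads-enabled perl, the parent's $private stays at 0 while $shared ends up at 4.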

5005threads were finally expunged in 5.10.0. A compatibility layer exists, but I doubt it would really work in production code.

Another big incompatibility came wrt Unicode between 5.6 and 5.8. Unicode in 5.6 blew. Whether or not a string was Unicode was decided by the surrounding scope. It was completely re-engineered in 5.8 so now the Unicodeiness of a string is tied to the string. Code written using 5.6's Unicode usually had to be rewritten in 5.8, often because to get 5.6's Unicode to work right you had to do ugly hacks.
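
As a rough sketch of the 5.8 model (using just the core Encode module), the string itself, not the surrounding scope, carries its Unicodeiness:

use Encode qw(decode);

my $bytes = "caf\xc3\xa9";              # UTF-8 encoded bytes
my $chars = decode('UTF-8', $bytes);    # a character string

print length($bytes), "\n";             # 5 - raw bytes
print length($chars), "\n";             # 4 - characters
print utf8::is_utf8($chars) ? "flagged\n" : "not flagged\n";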

Recently, 5.10.1 made a bunch of incompatible changes to smart match. Fortunately, smart match itself was only introduced in 5.10.0, so it's not a big deal. The story there is that Perl 6 introduced the smart-match concept, and it was backported to a development version of Perl 5. Time passed, and Perl 6's idea of smart matching changed. Nobody told the Perl 5 folks and it went out in 5.10.0 unchanged. Larry Wall noticed and did the equivalent of OMG YER DOIN IT WRONG!!! The new Perl 6 version was seen as significantly better, so 5.10.1 fixed it.
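
For reference, a minimal sketch of what 5.10-era smart matching and the new switch statement look like (this assumes a 5.10-era perl; later releases warn about, and eventually removed, these features):

use feature ':5.10';

my @primes = (2, 3, 5, 7);
my $n = 5;

say "$n is in the list" if $n ~~ @primes;    # array membership test

given ($n) {                                 # the new switch statement
    when (2)       { say "two" }
    when (@primes) { say "some other prime" }
    default        { say "not prime" }
}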

Welladvised answered 6/12, 2009 at 21:33 Comment(0)

Pseudo-hashes are a recent example that springs to mind. In general, the perldelta files give an overview of the incompatible changes in each version. These changes are almost always either obscure (like pseudo-hashes) or small.
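
For the curious, a pseudo-hash was an array reference whose first element mapped field names to array indices; here is a rough sketch of the syntax that went away (it only runs on perls before 5.10):

my $ph = [ { one => 1, two => 2 }, 'first value', 'second value' ];

print $ph->{one}, "\n";   # 'first value' before 5.10;
                          # dies with "Not a HASH reference" on 5.10+
print $ph->[1], "\n";     # the same element, accessed as a plain array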

Dolan answered 6/12, 2009 at 14:28 Comment(4)
You beat me to it. I was about to state pseudo hashes. :-) – Former
It's only small if you can manage to teach all the devs in your company to not reference pseudohash fields like $this->[$this->[0]->{fieldname}]. Sigh... – Forme
Pseudo-hashes were always labeled an experiment. Not my fault if you used them in production code. :P – Welladvised
Especially since you could have used fields to get the pseudo-hash behavior, like the docs said to. The fields pragma still works. – Greenling

Yes. There are many, although they're usually minor. Sometimes this is due to deprecation cycles ultimately ending in removal. Sometimes it's due to changing semantics for new (and experimental) features. Sometimes it's bug fixes for things that didn't work correctly. The Perl developers take great pains to preserve backwards compatibility between versions wherever possible. I can't recall ever having a script that was broken by upgrading to a new version of Perl.

The internal hash order has changed several times. While this isn't something you should depend on, it can cause problems if you unwittingly do.
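
A hypothetical illustration of the kind of code that silently changes behavior when the internal order does:

my %config = ( alpha => 1, beta => 2, gamma => 3 );

my ($first) = keys %config;          # fragile: relies on internal hash order
print "first key happens to be: $first\n";

for my $key (sort keys %config) {    # robust: impose an explicit order
    print "$key = $config{$key}\n";
}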

Binary incompatibility between major (5.x) releases is common, but that usually just means that any XS extensions need to be recompiled.

The complete list is far too long to reproduce here. You can get it by checking the "Incompatible Changes" section of each version's perldelta.

Wellfounded answered 6/12, 2009 at 14:29 Comment(2)
The internal "hash order" is randomized in the newer versions.Gusty
@Brad Gilbert: Yes and no. Hash randomization was added in 5.8.1 but as of 5.8.2 it only occurs if the key distribution is poor.Wellfounded

OTOH, there are some wild features dating back to Perl 1 that still work. For example, what does this print?

%foo = (foo => 23);
print values foo;

That's right, 23. Why? Because "associative arrays" were not first-class objects in Perl 1: $foo{bar} worked, but there was no %foo. I really don't know why; even the Perl 1 man page acknowledges this is warty. So, for compatibility with Perl 1, you can access a global hash without using a %, which might come in handy if your keyboard is broken or Apple decides nobody uses the % symbol.

chdir has some oddities. chdir() with no argument takes you to your home directory, replicating the shell's cd behavior. Unfortunately, so do chdir undef and chdir "", making it difficult to catch errors around chdir. Fortunately this behavior is deprecated. I will have to make sure it dies in 5.14.
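
A hypothetical sketch of why that makes error handling awkward (TARGET_DIR is just a made-up environment variable that may be unset):

my $dir = $ENV{TARGET_DIR};                    # may well be undef

# Looks safe, but chdir(undef) quietly changes to $HOME and returns
# true, so the error check never fires:
chdir $dir or die "can't chdir to '$dir': $!";

# Safer: reject an empty or undefined argument yourself.
die "no target directory given\n" unless defined $dir && length $dir;
chdir $dir or die "can't chdir to '$dir': $!";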

$[ is still around and remains undeprecated, but "highly discouraged". It changes what the first index of an array is, so if you're a human like me and count from 1 you could do:

$[ = 1;
@foo = qw(foo bar baz);
print $foo[2];   # prints bar

Perl 5 changed it to be file-scoped, as otherwise it was a performance drag and a great source of CrAzY.

Welladvised answered 7/12, 2009 at 2:15 Comment(1)
Hash %foo missing the % in argument... That's a warning I've never seen before. I like perldiag's description: "Really old Perl let you omit the % on hash names in some spots. This is now heavily deprecated." Apparently it's not deprecated heavily enough to actually remove it, though. – Wellfounded

I've had some funky errors with Perl 4 and Perl 5 evaluating the left-hand and right-hand sides of an assignment in a different order. Quoting the Perl traps for the unwary:

LHS vs. RHS of any assignment operator. LHS is evaluated first in perl4, second in perl5; this can affect the relationship between side-effects in sub-expressions.

@arr = ( 'left', 'right' );
$a{shift @arr} = shift @arr;
print join( ' ', keys %a );
# perl4 prints: left
# perl5 prints: right

For some new and possibly incompatible stuff, see the FAQ on the differences between Perl 4 and Perl 5.

Valdavaldas answered 10/12, 2009 at 17:12 Comment(0)
