Is there any research on (or better use of) RAII in GC languages?
Note: object-lifetime RAII, not using/with block-scope RAII

It seems like it's possible using extra GC categories: short-lived objects (check the category somewhat frequently), long-lived objects (check the category less frequently), and resource objects (check the category very frequently). Or possibly with a separate reference-counting GC just for resource objects.
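As an aside, CPython's reference counting already behaves like the "very frequent" check described above for resource objects: cleanup tied to an object's lifetime runs the moment the last reference disappears. A minimal sketch (the `Resource` class and `released` list are illustrative, not from any real library):

```python
import weakref

released = []

class Resource:
    """Stand-in for a handle (file, socket, lock) needing prompt release."""
    def __init__(self, name):
        self.name = name
        # Tie cleanup to the object's lifetime, not to a block scope.
        weakref.finalize(self, released.append, name)

r = Resource("db-handle")
del r  # CPython: refcount hits zero, the finalizer runs immediately
```

On a purely tracing collector the finalizer would instead run at some unspecified later collection, which is exactly the gap the question is asking about.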

It seems like the using/with style has some benefits: it promotes a more functional style of I/O (forgive me if I'm wrong and this is not the functional style), discouraging I/O spread out all over the place, versus the flexibility of object-based RAII (which is easier to use). But some problems probably involve resource lifetimes that are hard to track.
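The "contained I/O" style above can be sketched in Python: the with-block confines the file handle to one function instead of threading a long-lived handle through the program (`read_lines` and the temp-file setup are illustrative):

```python
import os
import tempfile

def read_lines(path):
    # The with-block scopes the handle: opened, used, and closed here,
    # even if an exception is raised while reading.
    with open(path) as f:
        return f.read().splitlines()

# usage: create a throwaway file, read it, clean up
tmp = tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt")
tmp.write("host=localhost\nport=5432\n")
tmp.close()
lines = read_lines(tmp.name)
os.unlink(tmp.name)
```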

Are there reasons, besides GC complexity and speed, that this has not been done in mainstream languages? (I understand that some languages use reference counting as part of GC in their main implementations, so RAII may work there; but since their specs don't mandate reference counting for some or all objects, and other implementations in use don't have reference counting, object-lifetime RAII is of limited use in those languages.)

P.S.: Do they have C++-style RAII in Perl?

Taraxacum answered 10/9, 2010 at 15:14 Comment(2)
Interesting. Supposedly C++/CLI allows C++-style RAII.Taraxacum
What are you asking? Is there a specific problem you're trying to address? Is there a significant advantage you are proposing? The question you've written sounds like you suspect there is a problem that may not actually exist.Extraordinary

Many languages make it a lot easier to write a custom inner-block processor than it traditionally was in C++ (this may have been addressed in the current drafts of the latest standard). When you have these, much of the need to use RAII for exact resource handling becomes much less pressing; you can do something like this:

using (Transaction t = makeTX()) {
    // blah
}

instead of:

{
    Transaction t = makeTX();
    // blah
}

There's not a huge difference really except that when you have multiple nested using constructs it's much clearer what the order of resource release is. (It's also IMO easier to do special handling in the case where an exception is thrown, useful for things like transactions where you'd want to roll back on error, but I don't expect everyone to agree with me there.) Also note that there are many different ways of writing using constructs, some much more heavyweight than others, but we don't really need to explore the differences here.

Given that exact resource handling is dealt with in this different way, there's a lot less demand for the C++ RAII style and it is viable to use Garbage Collection (GC) instead, as that handles complex cases (i.e., anywhere where it is difficult to tie object lifetime to a specific scope) far more easily. To be fair, there are cases when you need exact resource management with a non-trivial lifetime, but those cases are nasty for everyone.
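As a sketch of a lifetime that can't be tied to one scope: an object placed in a shared registry outlives the function that created it, and the collector reclaims it whenever the last reference goes away (the names here are illustrative):

```python
import gc
import weakref

class Session:
    pass

def make_session(registry):
    s = Session()
    registry.append(s)   # lifetime escapes this scope: tied to the registry
    return weakref.ref(s)

registry = []
probe = make_session(registry)
assert probe() is not None   # still alive after the creating scope exited
registry.clear()
gc.collect()                 # reclaimed once unreachable, whenever that is
```

Scope-based constructs like using/with can't express this at all; GC handles it without any extra bookkeeping.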

Perl uses Garbage Collection and has cheap subroutine blocks, as do most other scripting languages in one form or another (because the division between code and data is looser in scripting languages than in more traditional compiled languages). The only big scripting language that I'm aware of that doesn't use GC is Tcl, and that's because the value system there is guaranteed loop-free for technical semantic reasons, so reference counting is sufficient. Code blocks are still very cheap there though.

If we look at mainstream compiled languages (i.e., not scripting languages) then we really see a divide in about 1990. Languages from before then (including C++) tend to not assume garbage collection (with some exceptions such as Lisp, Smalltalk and the functional programming languages) whereas languages from after that point (notably Java and C#) do assume GC. I guess there was a substantial philosophical shift about that point, probably coupled to some clever implementations that dealt with the most egregious problems in GC before that point. When you have GC, you simply don't think of RAII as a solution; it's very rooted in C++'s model of the world.


("Exact resource handling": I just made that term up.)

Campbell answered 9/10, 2010 at 6:54 Comment(2)
The reason I asked about Perl is that I thought that, like Python, it used ref counting plus some extra checking by default, so I thought it might have undefined behavior which in practice works like RAII, especially since Perl (Perl 5) is unlikely to have other implementations.Taraxacum
Note that this feature is not equivalent to RAII. I see new language designers constantly making this mistake (e.g. defer in Go). With RAII I can nest objects controlling resources arbitrarily deep, and tie the lifetimes of all the resources to the lifetimes of the objects. E.g. destroying the Server object can automatically tear down all connections and then free their memory. And this is done only once, where you define the classes, not at every use site like with defer.
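The comment's point about composing cleanup once, at definition time, can be sketched in Python with contextlib.ExitStack (`Server` and `connection` are hypothetical; the `log` list records the order of events):

```python
from contextlib import ExitStack, contextmanager

log = []

@contextmanager
def connection(n):
    log.append(f"open {n}")
    try:
        yield n
    finally:
        log.append(f"close {n}")

class Server:
    # Teardown is composed once, here in the class,
    # not repeated at every use site as with defer.
    def __init__(self):
        self._stack = ExitStack()
        for n in range(2):
            self._stack.enter_context(connection(n))

    def close(self):
        # Tears down every held connection, in reverse order of acquisition.
        self._stack.close()

srv = Server()
srv.close()
```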
