In a recent Overload journal article, "Enforcing the Rule of Zero", the authors describe how we can avoid writing the Rule of Five operators, since the reasons for writing them are:
- Resource management
- Polymorphic deletion
Both of these can be taken care of by using smart pointers.
Here I am specifically interested in the second part.
Consider the following code snippet:
#include <iostream>
#include <memory>
using namespace std;

class Base
{
public:
    virtual void Fun() = 0;
};

class Derived : public Base
{
public:
    ~Derived()
    {
        cout << "Derived::~Derived\n";
    }
    void Fun() override
    {
        cout << "Derived::Fun\n";
    }
};

int main()
{
    shared_ptr<Base> pB = make_shared<Derived>();
    pB->Fun();
}
In this case, as the authors of the article explain, we get polymorphic deletion by using a shared pointer, and this does work.
But if I replace the shared_ptr with a unique_ptr, I am no longer able to observe the polymorphic deletion.
Now my question is: why are these two behaviors different? Why does shared_ptr take care of polymorphic deletion while unique_ptr doesn't?
unique_pointer? – Greg
std::shared_ptr carries around a pointer to the deleter function. When you assign one std::shared_ptr to a compatible one, that pointer is one of the members copied or moved. This does not happen with std::unique_ptr, and since your base class does not have a virtual destructor, you get boned. – Damiandamiani