Eric Lippert (a longtime Microsoft programming-language designer) wrote about this topic in 2004. The article discusses VBScript, VBA, and VB6 together, and his conclusions appear to apply to all of them.
TLDR: He concludes:
- There is no flaw in the language or its compiler/interpreter: in general, there is no need in the VB-family languages to explicitly set objects to Nothing.
- There are, however, specific cases, arising from complex program logic or buggy objects, where it is required; programmers have overgeneralized from those cases.
Here's a summary of the key points.
Firstly he cites some plausible-sounding but incorrect rationales:
Explanation #1: (Bogus) Perhaps some earlier version of VB required this.
...
Explanation #2: (Bogus) Circular references are not cleaned up by the VB6 garbage collector.
Neither is a correct assumption.
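To make the "no need in general" point concrete, here is a minimal VB6/VBA-style sketch (the procedure and object names are my own, purely illustrative, not from the article): when a local object variable goes out of scope, its reference is released automatically, so the explicit assignment just before End Sub accomplishes nothing.

```vb
' Minimal sketch: the explicit cleanup at the end is redundant.
Sub DoWork()
    Dim widget As Object
    Set widget = CreateObject("Scripting.Dictionary")

    widget.Add "key", "value"

    ' Redundant: widget would be released automatically anyway,
    ' the moment it goes out of scope at End Sub.
    Set widget = Nothing
End Sub
```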
Next, he reasons about why some programmers may have adopted this practice too broadly:
Explanation #3: It's a good idea to throw away expensive resources early. Perhaps people overgeneralized this rule?
...
I can see how overapplication of this good design principle would lead to this programming practice. ... I'm still not convinced that this is the whole story though.
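As an illustration of the design principle behind explanation #3, the "release expensive resources early" habit might look something like the sketch below (a hedged example of my own; the file path is a placeholder):

```vb
Sub SummarizeLogFile()
    Dim fso As Object, stream As Object
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set stream = fso.OpenTextFile("C:\logs\app.log", 1)  ' 1 = ForReading

    Dim firstLine As String
    firstLine = stream.ReadLine

    ' Release the expensive resource (an open file handle) as soon as
    ' it is no longer needed, instead of holding it for the rest of a
    ' long-running procedure.
    stream.Close
    Set stream = Nothing

    ' ... lengthy processing that no longer needs the file ...
    Debug.Print firstLine
End Sub
```

Overapplying that habit to every object variable, everywhere, is the overgeneralization he suspects.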
IMO the most persuasive argument is the last one, which cites a known ADO issue that did in fact require explicit object cleanup:
Explanation #4: ... There is a difference [between clearing variables yourself before they go out of scope, and letting the scope finalizer do it for you]
...
[If] two objects have some complex interaction, and furthermore, one of the objects has a bug whereby it must be shut down before the other, then the scope finalizer might pick the wrong one!
...
The only way to work around the bug is to explicitly clean up the objects in the right order before they go out of scope.
And indeed, there were widely-used ADO objects that had this kind of bug. Mystery solved.
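A hedged sketch of the workaround pattern explanation #4 describes (the article does not name the specific buggy objects, so the ADO objects, connection string, and query here are purely illustrative): shut the dependent object down first, then the object it depends on, rather than leaving the order to end-of-scope teardown.

```vb
Sub QueryWithOrderedTeardown()
    Dim conn As Object, rs As Object
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=SQLOLEDB;Data Source=.;Integrated Security=SSPI"
    Set rs = conn.Execute("SELECT COUNT(*) FROM Orders")

    Debug.Print rs.Fields(0).Value

    ' Explicit, ordered cleanup: the recordset is shut down before the
    ' connection it depends on. Relying on the scope finalizer would
    ' leave that order up to the runtime.
    rs.Close
    Set rs = Nothing
    conn.Close
    Set conn = Nothing
End Sub
```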
He goes on to mention that even Microsoft documentation may have encouraged the Set = Nothing practice, which in turn influenced numerous programmers who probably had no idea what the original need for it was.
He's also rightfully critical that things got as far as they did:
What is truly strange to me though is how tenacious this coding practice is. OK, so some objects are buggy, and sometimes you can work around a bug by writing some code which would otherwise be unnecessary. Is the logical conclusion “always write the unnecessary code, just in case some bug happens in the future?” Some people call this “defensive coding”. I call it “massive overgeneralization”.