I don't think it is correct to treat "greedy" and "eager" as synonyms.
Greedy and Thrifty Optimizations
Greedy algorithms refer to the optimization paradigm that treats the locally best choice as if it were the best global choice. This is done iteratively, so the local neighbourhood changes with each step. The algorithm always takes the best of the options it "sees" in the current iteration. An example of a greedy optimization algorithm is gradient descent.
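As a minimal sketch of this greediness, here is gradient descent: at every step it moves in the locally best (steepest-descent) direction and never looks further ahead. The target function, learning rate, and step count are illustrative assumptions.

```python
# Greedy optimization sketch: gradient descent only ever makes the
# locally best move (a step against the gradient at the current point).

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Greedily follow the negative gradient starting from x0."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # locally best move at the current point
    return x

# Example (assumed): minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
# converges toward the minimizer x = 3
```

Note that on a function with several local minima, this procedure simply converges to whichever one the starting point happens to lie near.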
A non-greedy / thrifty optimization algorithm considers options more globally: it tries out many more candidate solutions before committing. Examples are Bayesian optimization and many swarm optimization techniques, especially firefly optimization (which, as far as I know, aims to find all the local optima).
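To illustrate the contrast without the machinery of Bayesian or swarm optimization, here is a deliberately simplified "more global" strategy: random multi-start local search, which probes many starting points and keeps the best result. The objective function, search range, and step sizes below are all made-up assumptions, not part of any of the named methods.

```python
import random

# Toy sketch of a less greedy strategy: random multi-start local search.
# Instead of following a single local trajectory, it samples many starting
# points (global exploration) and refines each one locally, keeping the best.

def multi_start_minimize(f, n_starts=50, span=3.0, steps=200, eps=1e-3):
    best_x, best_val = None, float("inf")
    for _ in range(n_starts):
        x = random.uniform(-span, span)              # explore globally
        for _ in range(steps):                       # then refine locally
            g = (f(x + eps) - f(x - eps)) / (2 * eps)  # crude numeric gradient
            x -= 0.01 * g
        if f(x) < best_val:
            best_x, best_val = x, f(x)
    return best_x

# Assumed test function with two local minima near x = -2 and x = 2;
# the one near x = -2 is the better (global) one.
f = lambda x: (x**2 - 4)**2 + x
```

A single greedy descent from an unlucky start would get stuck in the worse basin near x = 2; the multi-start version almost always ends up near the global minimum around x = -2.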
Eager and Lazy Learning
"Eager" is used in the context of "eager learning". The opposite of "eager learning" is "lazy learning". The terms denote whether the mathematical modelling of the data happens during a separate previous learning phase, or only when the method is applied to new data. For example, polynomial regression is eager, while Gaussian processing regression or kernel regression are lazy.
This is closely related to whether the method is parametric (often eager) or non-parametric (often lazy), but the correspondence is not perfect. For example, decision trees are eager learners, yet non-parametric.