Topological link prediction can exploit either the entire network topology (global methods) or only the local neighbourhood of the candidate link (local methods). Global methods are commonly believed to be the best. Is this belief well-founded? The Stochastic Block Model (SBM) is a global method regarded as one of the best link predictors and is therefore used as a reference for comparison. However, our results suggest that SBM, despite its high computational cost, cannot in general outperform the Cannistraci-Hebb (CH) network automaton model, a simple local learning rule of topological self-organization that has been shown to be the current best local, parameter-free, deterministic rule for link prediction. To elucidate the reasons for this unexpected result, we formally introduce the notion of local-ring network automata models and their relation to the definition of common neighbours in complex network theory. After extensive tests, we recommend the Structural Perturbation Method (SPM) as the new baseline among global methods. However, even SPM does not outperform CH overall, and in several evaluation frameworks we found, astonishingly, the opposite. In particular, CH was the best predictor for synthetic networks generated by the Popularity-Similarity-Optimization (PSO) model, and on PSO networks with community structure it performed even better than the original internode hyperbolic distance used as a link predictor. Interestingly, when tested on non-hyperbolic synthetic networks, the performance of CH dropped significantly, indicating that this rule of network self-organization could be strongly associated with the emergence of hyperbolic geometry in complex networks. The supposed superiority of global methods seems to be a "misleading belief" caused by a latent geometry bias in the few small networks used as benchmarks in previous studies. We propose to found a latent geometry theory of link prediction in complex networks.
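To make the contrast between global and local methods concrete, the following is a minimal sketch of a local, parameter-free, common-neighbour-based score in the spirit of the Cannistraci-Hebb local-community idea; it is an illustrative assumption, not the exact CH index evaluated in the article, and the function name ch_style_score and the toy graph are hypothetical.

```python
# Minimal sketch of a CH-style local (common-neighbour-based) link predictor.
# Assumption: the score rewards common neighbours that are densely connected
# among themselves (internal links) and penalises links leaving the local
# community (external links); the exact CH formulation may differ.
import networkx as nx

def ch_style_score(G, u, v):
    """Score the candidate link (u, v) using only its local neighbourhood."""
    cn = set(G[u]) & set(G[v])            # common neighbours = local community
    score = 0.0
    for z in cn:
        nz = set(G[z]) - {u, v}
        internal = len(nz & cn)           # links of z inside the local community
        external = len(nz - cn)           # links of z leaving the local community
        score += (1 + internal) / (1 + external)
    return score

# Toy usage: rank the non-observed links of a small graph by the local score.
G = nx.karate_club_graph()
candidates = [(u, v) for u in G for v in G if u < v and not G.has_edge(u, v)]
ranked = sorted(candidates, key=lambda e: ch_style_score(G, *e), reverse=True)
print(ranked[:5])
```

Because the score depends only on the neighbourhood of the candidate link, it is deterministic and cheap to compute, in contrast with global methods such as SBM or SPM that require processing the whole topology.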