As in the title, I was wondering whether the entropy of a system (it can be any entropy, from Boltzmann to Rényi etc., it is of no importance) is a function or a functional, and why. Since it is usually defined as $$S(p)=\sum_{i}g(p_i)$$ for some $g$ that has to be continuous etc., it seems it has to be a functional. But then I see that $S_{BG}$, for example, which is defined as $S_{BG}=-\sum_i p_i \log p_i$, just needs the value of each $p_i$ in order to be evaluated, right?
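To make what I mean concrete, here is a toy example of my own (a fair two-state distribution): the entropy takes the whole tuple of probabilities as its argument and returns a single number, $$S_{BG}\!\left(\tfrac{1}{2},\tfrac{1}{2}\right)=-\tfrac{1}{2}\log\tfrac{1}{2}-\tfrac{1}{2}\log\tfrac{1}{2}=\log 2.$$ So the input is the full distribution $(p_1,\dots,p_n)$, not a single $p_i$, which is exactly where my confusion about "function vs. functional" comes from.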
The way I see it, it has to be a functional, but it is not clear to me why. Also, many authors refer to the entropy as a function, while others call it a functional.
Thank you!