For context: the usual greedy approximation algorithm for the minimum vertex cover problem (given a graph, find the smallest set of vertices such that every edge is incident to at least one selected vertex) goes like this:
while some uncovered edge remains:
    pick an arbitrary uncovered edge e = (u, v)
    add both u and v to the cover
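As a concrete reference point, here is a minimal sketch of the edge-greedy algorithm in Python (my own illustrative implementation, assuming the graph is given as a list of edges; iterating over the edge list in its given order plays the role of "pick an arbitrary uncovered edge"):

```python
def edge_greedy_cover(edges):
    """Edge-greedy 2-approximation: for each still-uncovered edge,
    add both of its endpoints to the cover."""
    cover = set()
    for u, v in edges:
        # An edge is uncovered iff neither endpoint is in the cover yet.
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover
```

Since vertices are only ever added two at a time, for one picked edge each, the resulting cover always has even size, which is the $2|E|$ in the proof below.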
There is a nice proof that this algorithm approximates the minimum vertex cover within a factor of $2$: "the picked edges $E$ are pairwise non-adjacent by construction (they form a matching); any minimum vertex cover must contain at least one endpoint of each edge in $E$ and so has size $\ge |E|$, while the greedy algorithm produces a cover of size exactly $2|E|$."
Consider the other (more natural/intuitive) greedy algorithm that iterates over vertices instead:
while some uncovered edge remains:
    pick a vertex v incident to the maximum number of uncovered edges
    add v to the cover
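For comparison, a minimal sketch of this vertex-greedy variant (again my own illustrative implementation, assuming a simple graph given as a list of edges; ties for the maximum are broken arbitrarily):

```python
def vertex_greedy_cover(edges):
    """Vertex-greedy: repeatedly add the vertex covering the most
    still-uncovered edges."""
    uncovered = {frozenset(e) for e in edges}
    cover = set()
    while uncovered:
        # Count how many uncovered edges each vertex is incident to.
        counts = {}
        for e in uncovered:
            for v in e:
                counts[v] = counts.get(v, 0) + 1
        best = max(counts, key=counts.get)  # arbitrary tie-breaking
        cover.add(best)
        uncovered = {e for e in uncovered if best not in e}
    return cover
```

On a star graph this returns just the center, whereas the edge-greedy algorithm returns the center plus one leaf.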
What do we know about this algorithm? (I tried to look it up on various sites, but there seems to be almost no discussion of it; every textbook/lecture I could find discusses only the edge-greedy algorithm.) Why is it not mentioned as an alternative to the edge-greedy algorithm? Is there a proof that it has a good approximation bound, and in particular, if not, are there example graphs on which it performs strictly worse than the edge-greedy algorithm?
(I tried writing a program to search for examples, but on most of the graphs I tried the vertex-greedy algorithm performs better than the edge-greedy one, e.g., star graphs, edge forests, and cycles.)
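In case it helps, this is roughly the shape of the search harness I mean (a self-contained sketch; the graph size `n`, edge probability `p`, and number of trials are arbitrary choices of mine, and it only flags instances where the vertex-greedy cover comes out strictly larger):

```python
import itertools
import random

def edge_greedy(edges):
    # For each uncovered edge, take both endpoints.
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

def vertex_greedy(edges):
    # Repeatedly take a vertex covering the most uncovered edges.
    remaining, cover = set(edges), set()
    while remaining:
        counts = {}
        for u, v in remaining:
            counts[u] = counts.get(u, 0) + 1
            counts[v] = counts.get(v, 0) + 1
        best = max(counts, key=counts.get)
        cover.add(best)
        remaining = {e for e in remaining if best not in e}
    return cover

def search(n=7, trials=200, p=0.4, seed=0):
    """Sample random graphs and collect those where vertex-greedy
    returns a strictly larger cover than edge-greedy."""
    rng = random.Random(seed)
    worse = []
    for _ in range(trials):
        edges = [(u, v) for u, v in itertools.combinations(range(n), 2)
                 if rng.random() < p]
        if edges and len(vertex_greedy(edges)) > len(edge_greedy(edges)):
            worse.append(edges)
    return worse
```

Note that edge-greedy's result depends on the (arbitrary) edge order and vertex-greedy's on tie-breaking, so a single run per graph only samples one execution of each.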