
Consider an array $A$ with $n$ values and the following operations:

  1. get(i): Returns the value of $A[i]$.
  2. insert(x): Inserts the element $x$ into any free place in $A$ (not necessarily at $A[x]$ or at the end of the array).
  3. delete(i): Deletes the element $A[i]$.

What is the best known (amortized) time complexity of all operations? In particular, the delete operation seems to require at least $\Omega(\log n)$ time.

See also the papers in this post. However, my insert operation is defined slightly differently; does this have any effect?

Panzerkroete

1 Answer


OK, so I think you can build a structure that has $O(1)$ (amortized) complexity for all operations, using $O(n)$ space after $n$ insertions.

The underlying array is a dynamic array (let's call it $T$): it has amortized $O(1)$ complexity for insertion at the end. In the array, you store pairs of a value and a boolean: the boolean is set to false when you delete and to true when you insert. You also keep a stack/queue/any dynamic data structure with $O(1)$ push and pop (let's call it $S$) that holds the indices of empty slots in the array that were previously filled (i.e., slots that are not at the end of the array).

  • get(i): if the boolean at position $i$ is true, return the element at position $i$ in $T$: $O(1)$. Otherwise, return an error: there is no element at that position.
  • Insert(x): if $S$ is empty, insert the element at the end of $T$ (setting the associated boolean to true): $O(1)$ amortized with a dynamic array. If $S$ is not empty, pop a position from $S$ and insert the element at that position: $O(1)$ as well.
  • Delete(i): set the boolean at position $i$ to false and push $i$ onto $S$: $O(1)$ as well.
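Here is a minimal sketch of that structure, assuming Python; the class name `SparseArray` and the error handling are illustrative choices, not part of the answer itself.

```python
class SparseArray:
    def __init__(self):
        self.T = []   # dynamic array of [value, alive] pairs
        self.S = []   # stack of indices freed by delete, available for reuse

    def get(self, i):
        """Return the value at position i, or raise if the slot is empty. O(1)."""
        if 0 <= i < len(self.T) and self.T[i][1]:
            return self.T[i][0]
        raise IndexError(f"no element at position {i}")

    def insert(self, x):
        """Insert x into some free slot and return its position. O(1) amortized."""
        if self.S:                      # reuse a previously freed slot
            i = self.S.pop()
            self.T[i] = [x, True]
            return i
        self.T.append([x, True])        # otherwise append at the end
        return len(self.T) - 1

    def delete(self, i):
        """Mark position i as empty and remember it for reuse. O(1)."""
        if not (0 <= i < len(self.T)) or not self.T[i][1]:
            raise IndexError(f"no element at position {i}")
        self.T[i][1] = False
        self.S.append(i)
```

For example, after `insert(5)`, `insert(7)`, `delete(0)`, a later `insert(9)` reuses slot 0, so `get(0)` returns 9 while `get(1)` still returns 7.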

The only downside of this method is that the array is never shrunk, and therefore its size is $O(\text{number of insertions})$ rather than $O(\text{number of elements})$.

GBathie