
I have an algorithm that, on each iteration, generates a copy of an object containing a bit array and modifies the copy (the modification is limited to a single element per copy).

While doing the time analysis I cannot neglect the copy cost, because the bit array encodes information about the input set, so for a very large number of elements I have to account for it.

It has been said that efficient data structures exist, such as big-endian Patricia trees, that can achieve $O(\log n)$ cost when copying an instance. I have confirmed there are implementations using them, such as Haskell's Data.IntSet, which provides very good time complexity for certain operations that I can use to refactor this part of the algorithm ($O(\min(n, W))$, where $W$ is the word size).
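To illustrate (the Data.IntSet calls below are standard containers API; the example itself is my own): an "update" returns a new set that shares almost all of its structure with the old one, so keeping the previous version around costs only the handful of rebuilt nodes, not a full copy.

```haskell
import qualified Data.IntSet as IntSet

main :: IO ()
main = do
  let s0 = IntSet.fromList [1 .. 10]
      -- insert returns a NEW set; s0 is untouched, and s1 shares every
      -- subtree off the rebuilt path, so the "copy" costs O(min(n, W))
      s1 = IntSet.insert 42 s0
      s2 = IntSet.delete 3 s1
  print (IntSet.member 42 s0) -- False: the original version still exists
  print (IntSet.member 42 s1) -- True
  print (IntSet.size s2)      -- 10: one element added, one deleted
```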

I have seen another post on Stack Exchange explaining that it is possible to achieve a similar copy cost with persistent AVL trees.
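For reference, here is the idea behind that AVL claim, as a minimal sketch of persistent (path-copying) AVL insertion in Haskell; the types and names are my own, not taken from the linked post. Insertion rebuilds only the $O(\log n)$ nodes along the search path and shares everything else with the previous version, so both versions stay valid.

```haskell
-- Persistent AVL sketch: nodes are immutable, so an insert allocates
-- new nodes only along the root-to-leaf path it follows.
data AVL a = Leaf | Node Int (AVL a) a (AVL a)  -- Int caches the height

height :: AVL a -> Int
height Leaf = 0
height (Node h _ _ _) = h

node :: AVL a -> a -> AVL a -> AVL a
node l x r = Node (1 + max (height l) (height r)) l x r

-- Restore the AVL invariant with the four standard rotations.
balance :: AVL a -> a -> AVL a -> AVL a
balance l x r
  | height l > height r + 1 = case l of
      Node _ ll lx lr
        | height ll >= height lr -> node ll lx (node lr x r)                    -- LL
        | Node _ lrl lrx lrr <- lr -> node (node ll lx lrl) lrx (node lrr x r)  -- LR
      _ -> node l x r
  | height r > height l + 1 = case r of
      Node _ rl rx rr
        | height rr >= height rl -> node (node l x rl) rx rr                    -- RR
        | Node _ rll rlx rlr <- rl -> node (node l x rll) rlx (node rlr rx rr)  -- RL
      _ -> node l x r
  | otherwise = node l x r

insert :: Ord a => a -> AVL a -> AVL a
insert x Leaf = node Leaf x Leaf
insert x t@(Node _ l y r)
  | x < y = balance (insert x l) y r
  | x > y = balance l y (insert x r)
  | otherwise = t
```

A bit array could be represented this way by storing the indices of the set bits; updating one element then touches only one root-to-leaf path.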

I'm looking for confirmation of this, or for any other specialized structure that can achieve the reported time complexity for copying.

Jesus Salas

1 Answer


Yes, you can use standard techniques for persistent data structures to make a version of Patricia trees that has the properties you want. See also What classes of data structures can be made persistent? on this site.
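As an illustration of the path-copying technique this refers to (a hedged sketch of my own, not code from the linked question): a bit array of size $2^d$ stored as a perfect binary trie supports single-bit updates that rebuild only the $d = O(\log n)$ nodes on the path to that bit, sharing every other subtree between versions.

```haskell
-- Persistent bit array as a perfect binary trie of depth d (2^d bits).
data Trie = Bit Bool | Branch Trie Trie

-- All-False array of 2^d bits; the two children are the SAME shared
-- subtree, so construction is O(d) space and time.
empty :: Int -> Trie
empty 0 = Bit False
empty d = let t = empty (d - 1) in Branch t t

-- Set bit i to b: only the branches on the path to i are copied.
setBit' :: Int -> Int -> Bool -> Trie -> Trie
setBit' _ _ b (Bit _) = Bit b
setBit' d i b (Branch l r)
  | i < half  = Branch (setBit' (d - 1) i b l) r
  | otherwise = Branch l (setBit' (d - 1) (i - half) b r)
  where half = 2 ^ (d - 1)

getBit :: Int -> Int -> Trie -> Bool
getBit _ _ (Bit b) = b
getBit d i (Branch l r)
  | i < half  = getBit (d - 1) i l
  | otherwise = getBit (d - 1) (i - half) r
  where half = 2 ^ (d - 1)
```

After `new = setBit' d i True old`, both `old` and `new` remain queryable, which is exactly the per-iteration "copy and modify one element" pattern in the question, at $O(\log n)$ per version instead of $O(n)$.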

D.W.