I have an algorithm that, on each iteration, generates a copy of an object containing a bit array and modifies it (the modification is limited to a single element per copy).

While doing the time analysis I cannot neglect the copy cost: the bit array encodes information about the input set, so for a very large number of elements I have to account for it.
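For concreteness, here is a minimal sketch of the naive step I am describing (the names are hypothetical; I am using an unboxed vector of `Bool` to stand in for the bit array):

```haskell
import qualified Data.Vector.Unboxed as V

-- Naive step: copy the whole bit array, then flip one element.
-- V.// allocates a fresh vector, so each iteration costs O(n).
step :: Int -> V.Vector Bool -> V.Vector Bool
step i bits = bits V.// [(i, not (bits V.! i))]
```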
It has been said that efficient data structures exist, such as big-endian Patricia trees, that can achieve $O(\log n)$ when copying an instance, and I confirmed there are implementations based on them, like Data.IntSet from Haskell, that provide very good time complexity for certain operations that I can use to refactor this part of the algorithm ($O(\min(n, W))$, where $W$ is the word size).
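A minimal sketch of the refactor I have in mind, assuming the bit array is represented as the set of indices whose bit is 1: each "copy plus one modification" becomes a single insert/delete on a persistent set, which costs $O(\min(n, W))$ and shares the rest of the structure with the previous version.

```haskell
import qualified Data.IntSet as IS

-- Flipping bit i produces a NEW IntSet in O(min(n, W)); the old
-- version remains valid and shares almost all nodes with the new one.
flipBit :: Int -> IS.IntSet -> IS.IntSet
flipBit i s
  | IS.member i s = IS.delete i s
  | otherwise     = IS.insert i s
```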
I have seen another post on Stack Exchange explaining that it is possible to achieve a similar copy cost with persistent AVL trees.
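As I understand it, the mechanism behind that claim is path copying: an update rebuilds only the $O(\log n)$ nodes on the root-to-leaf search path and shares every other subtree with the previous version. A minimal sketch on a plain binary search tree (a real AVL version would additionally rebalance, which does not change the asymptotics):

```haskell
-- Path copying on a binary search tree of Ints.
data Tree = Leaf | Node Tree Int Tree

-- insert rebuilds only the nodes along the search path; every
-- subtree off that path is shared between the old and new tree.
insert :: Int -> Tree -> Tree
insert x Leaf = Node Leaf x Leaf
insert x t@(Node l y r)
  | x < y     = Node (insert x l) y r  -- rebuild left spine node
  | x > y     = Node l y (insert x r)  -- rebuild right spine node
  | otherwise = t                      -- already present: share whole tree
```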
I'm looking for confirmation of this, or for any other specialized structure that can achieve the reported time complexity for copying.