I'm developing an (IMHO) very interesting algorithm for integer division. The algorithm uses bit shifting (it shifts left to multiply by 2). I'm wondering whether << c is $O(1)$ or $O(c)$, where c is the shift amount. I hear it depends on the architecture, but algorithm analysis should ideally be architecture-independent. So, for the purposes of analysing the time complexity of my algorithm, should I count << c as 1 primitive operation or as c primitive operations?
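For concreteness, here's a rough sketch of the style of shift-based division I'm talking about (not my exact algorithm, just an illustration; `shift_divide` is a made-up name and it assumes divisor != 0):

```c
#include <stdint.h>
#include <stdio.h>

/* Sketch of shift-and-subtract division: repeatedly double the
   divisor via << until the next doubling would overshoot the
   dividend, subtract, and accumulate the matching power of two. */
uint32_t shift_divide(uint32_t dividend, uint32_t divisor) {
    uint32_t quotient = 0;
    while (dividend >= divisor) {
        uint32_t d = divisor;  /* current multiple of the divisor */
        uint32_t q = 1;        /* matching power of two */
        /* the guard (d << 1) > d stops d from wrapping around */
        while ((d << 1) > d && dividend >= (d << 1)) {
            d <<= 1;           /* multiply by 2 with a left shift */
            q <<= 1;
        }
        dividend -= d;
        quotient += q;
    }
    return quotient;
}

int main(void) {
    printf("%u\n", shift_divide(100u, 7u));  /* prints 14 */
    return 0;
}
```

The `<<` in the inner loop is the operation whose cost I'm asking about.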
I've looked at the other questions here, and they say it depends on the machine.
I'm using the RAM model of computation, but I don't want to assign << c a cost of 1 when it may in fact cost c.
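To make the cost-c convention concrete: charging c primitives for << c amounts to modelling it as c single-bit shifts, as in this toy sketch (`shl` is a made-up name; hardware with a barrel shifter would do this in one step):

```c
/* Counting << c as c primitive operations means treating it
   as c single-bit shifts, one unit-cost primitive each: */
unsigned shl(unsigned x, unsigned c) {
    while (c-- > 0) {
        x <<= 1;  /* one unit-cost primitive per iteration */
    }
    return x;
}
```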