
RE: Why Bitcoin is not sustainable and a better alternative to the proof-of-work algorithm. Part I

in #bitcoin 8 years ago (edited)

As with any blockchain endeavor, there is much to cover at this early stage. In this case, identification based on timestamps is the single most crucial factor, so I will comment only on that, and I will take a rather theoretical view.

Identifiability

Mathematically speaking, there is a foundation for this. Say our stamps take values in (a convex subset of) the reals. Then any two events occurring at distinct instants receive distinct stamps, because the reals form a dense, uncountable continuum: their cardinality is 2^ℵ₀, strictly greater than ℵ₀, and between any two distinct reals there always lies another real, so the supply of distinguishable stamps is never exhausted.
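To make the property being relied on here explicit, the following is a short restatement in symbols (my own sketch, not taken from the post): between any two distinct real-valued timestamps there is always a third.

```latex
% Density of the reals (restated for illustration): between any two distinct
% timestamps t_1 < t_2 there is a third one, e.g. the midpoint.
\[
  \forall\, t_1, t_2 \in \mathbb{R}:\quad
  t_1 < t_2 \;\Longrightarrow\;
  \exists\, t_3 \in \mathbb{R}\ \text{such that}\ t_1 < t_3 < t_2,
  \qquad \text{e.g. } t_3 = \tfrac{t_1 + t_2}{2}.
\]
```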

Uniqueness

That property also leads to a problem. A real number in general requires an infinite sequence of digits to the right of the decimal point, and two distinct reals can agree on arbitrarily many of those leading digits, i.e., they can lie arbitrarily close to each other (see Cauchy sequences). For our computational systems, which have clear limits on the number of digits that can be stored, this means that at some point two distinct reals become indistinguishable: they are rounded to the identical stamp.
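A minimal sketch of that effect, assuming 64-bit IEEE 754 floats and a present-day Unix timestamp (the numbers are my own illustration, not from the post):

```python
import math

# With 64-bit floats, a Unix timestamp around 1.7e9 seconds has a spacing of
# roughly 2.4e-7 s between adjacent representable values, so any two events
# closer together than that collapse onto the same stamp.
t = 1_700_000_000.0          # a present-day Unix timestamp, in seconds
print(math.ulp(t))           # ~2.38e-07: gap to the next representable float

# Two events 10 nanoseconds apart receive identical stamps at this precision.
print(t == t + 1e-8)         # True: the offset is below half the spacing
```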

Further acknowledgement

Our timestamps cannot actually take values in the reals. In basically every well-known theory, spacetime (as a generalization of time) is treated as a continuous, linearly ordered structure, so that between the coordinates of any two events further coordinates may exist. The question of how far we can go in splitting and stretching the resolution, i.e., how many digits these coordinates can meaningfully have, leads into deep water. Many will agree that time intervals are eventually restricted by the Planck time, the smallest physically meaningful unit of time (on the order of 10^-44 seconds). At that resolution the known laws of physics probably no longer add up, so even for this purely theoretical exercise it is not entirely clear whether two events can in fact be uniquely identified by time. Luckily, we will probably never reach an observational level at which we have to concern ourselves with that.

As for realistic bottlenecks: we will have to find a time resolution at which the (strictly positive) probability of two events coinciding on an identical timestamp is still acceptable, while the storage of such high-precision data remains manageable. If that defines a theoretical bound within which to operate, I believe there is surely leeway. However, I would say that a set of predefined rules on how to proceed when such a collision happens has to be in place, possibly together with features that split any transaction into a sequence of small transactions, so that two coinciding transactions carry very little economic significance.
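A back-of-the-envelope sketch of that trade-off (my own illustration; the function, the birthday-bound approximation, and the ~7 transactions per second figure are assumptions, not something proposed here):

```python
import math

def collision_probability(events_per_second: float, resolution_seconds: float) -> float:
    """Approximate the chance that at least two events share a timestamp
    within a one-second window, via the birthday bound
    p ~ 1 - exp(-n(n-1) / (2 * buckets))."""
    buckets = 1.0 / resolution_seconds      # distinct stamps available per second
    n = events_per_second
    return 1.0 - math.exp(-n * (n - 1.0) / (2.0 * buckets))

# Example: roughly 7 transactions per second at nanosecond vs. microsecond resolution.
print(collision_probability(7, 1e-9))   # ~2.1e-08 per second
print(collision_probability(7, 1e-6))   # ~2.1e-05 per second
```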


According to Prof. Einstein and co., there is no universal moment of 'now', and whether two events separated by even a tiny amount of space occurred simultaneously depends on the observer's frame of reference.

Seriously, this idea would be far too dependent on very accurately synchronised clocks, and unworkable because, even if we expressed the exact time down to every one of the roughly nine billion oscillations of caesium-133 atoms that occur each second, there is still a risk that two transactions will happen at the same apparent moment.