Stardates are generated by a mathematical formula which varies depending on location in the galaxy, velocity of travel, and other factors, and can vary widely from episode to episode. And lest we forget Swiss watchmakers in all of this, Swatch introduced their own bizarro decimal time system in 1998. Called Swatch Internet Time, it divided the day into 1,000 ".beats".
Nope, not confusing at all. The French Republican Calendar was another attempt by revolutionary France to decimalize everything. It wasn't particularly successful. Also interesting is the Chinese ke, a unit of decimal time.
Decimal time: the revolution that never was. By Christophe Roulet.
[Decimal clocks: Nicoud, Sallanches; Esquivillon and Deschoudens, Geneva]
This didn't stop some areas of the country from continuing to observe decimal time, and a few decimal clocks remained in use for years afterwards, presumably leading to many missed appointments!
Some applications using decimal time are available in both Google Play and the Apple Store.
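The conversion such apps perform is just a rescaling: the Republican scheme split the day into 10 hours of 100 minutes of 100 seconds, i.e. 100,000 decimal seconds against 86,400 standard ones. A quick sketch (the function name is our own):

```python
def to_decimal_time(h, m, s):
    """Convert standard h:m:s to French decimal time (10 h / 100 min / 100 s)."""
    frac = (h * 3600 + m * 60 + s) / 86400  # fraction of the day elapsed
    total = round(frac * 100000)            # decimal seconds in a day
    dh, rem = divmod(total, 10000)          # 10,000 decimal seconds per decimal hour
    dm, ds = divmod(rem, 100)
    return dh, dm, ds

print(to_decimal_time(12, 0, 0))  # noon -> (5, 0, 0)
print(to_decimal_time(18, 0, 0))  # 6 pm -> (7, 50, 0)
```

Noon becoming "5 o'clock" is exactly the sort of thing that made the system a hard sell.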
The argument made most often to explain this is the utility of having many divisors, and in the absence of any better explanation it is, at the very least, plausible. Ultimately it is utility that should drive any discussion in metrology, not some idiotic desire to create a unified system that appeals to the sensibilities of a small population whose sense of aesthetics is driven by symmetry. Measurements, after all, are just a tool, and as a tool they should serve the needs of the user first and above all.
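The divisor argument is easy to make concrete: 12 splits evenly into halves, thirds, quarters and sixths, while 10 only manages halves and fifths. A few lines of Python show the comparison:

```python
def divisors(n):
    """All whole-number divisors of n."""
    return [d for d in range(1, n + 1) if n % d == 0]

for n in (10, 12, 24, 60):
    print(n, divisors(n))
# 10 has only 4 divisors; 12 has 6; 60 (minutes, seconds) has 12
```

This is the usual explanation for why 12, 24 and 60 keep turning up in traditional time and measure systems.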
Nevertheless the major division of the day and night was by twelve, not by ten, even though people counted by tens. That, together with the fact that weight and linear measure used divisions of twelve or sixteen more often than any other, and that division by ten is notable by its absence in these evolved systems, is the question you need to answer before extending the notion that metric has superior utility.
Apparently the Chinese used their decimal system for years, only changing to the Western system when the Church interfered. So it was a good enough system for them. The reason we still use our system is pure inertia.
Everybody knows it: all the clocks are made for it, appointments, calendars, everything. Britain changed from our weird, antiquated duodecimal system of currency (12d in 1s, 20s in £1) to a sensible decimal one in 1971. It took a couple of years of getting people ready for it, and a big campaign to get the information out.
But it was worth it. The ancient Chinese division of the day into 100 units (the ke) coexisted alongside a system which divided the day into 12 double-hours — in other words they used both simultaneously. Other divisions of the day were used at various times as well, including 120, 96 and 108 units, with the ke itself further subdivided. Had they chosen to, they could have insisted on decimal clocks — they were a large enough market that these would have been provided had there been a demand.
Adoption had to be forced by law, and they tried introducing metric time twice — failing both attempts. There are deeper underlying reasons why decimal time has failed, and it has little to do with bad attitudes on the part of the public and everything to do with poor utility.
Even in countries that are officially metric, imperial measures still abound. Carpentry, the needle trades, and several other areas cling to the old system. Much of aviation still uses decimal inch, and decimal ounce as well.
Carpentry not so much — mostly amateurs about my age and older who refuse to make the switch, and even less so in the building trade. The fact remains that if metric were so damned superior, as all of its doctrinaire supporters constantly assert, it would have displaced the old units on that factor alone. And my desk reference shows well over a hundred of these metrified older units in official use throughout the world. For example (and a small confession), being English we still use miles on the roads, so I have a good understanding of what a mile is and of relative speeds in mph; however, I have very little understanding of yards, feet and inches, as I grew up using meters for all other measurements.
Imperial would mean nothing to me if we had gone fully metric. At the end of the day they are both fairly arbitrary concepts constructed by man, so you're most likely to favour what you have grown up using. Though having said that, I wish we had gone fully metric, as having a system that is mostly metric can get confusing at times. Obviously not in everyday life.
You only think it is more sensible because you are used to it. Wait, how is it not perfectly sane to use an average extremity size as the base unit of length? It guarantees that an average person will be able to roughly measure things with no other size reference. They just did that to match the existing length.
You could do the same with inches: pick a different interval or a different element. The units of Common Measure use exactly the same standards as SI. The chosen sizes of the units are based primarily on convenience for the task to which they are applied. Common Measure has a much richer set of units than SI because it has many domain-specific units for length, volume, energy, etc. However, the standards upon which they are defined are exactly the same standards used for SI.
In either case, the definition is based on a constant standard: the speed of light. The U.S. never abandoned metric standards; we just define a set of customary units in terms of the metric ones as well. One inch is exactly 2.54 cm. Convenience units occur everywhere.
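Because the inch-to-millimetre ratio is exact (25.4 mm to the inch, fixed by the 1959 international yard and pound agreement), conversions can be done in exact rational arithmetic rather than floating point. A small illustration:

```python
from fractions import Fraction

MM_PER_INCH = Fraction(254, 10)  # exactly 25.4 mm per inch, by definition

def inches_to_mm(inches):
    """Exact inch-to-millimetre conversion using rational arithmetic."""
    return Fraction(inches) * MM_PER_INCH

print(inches_to_mm(1))                       # 127/5, i.e. exactly 25.4 mm
print(float(inches_to_mm(Fraction(1, 10))))  # 2.54 -> the classic 0.1" pin pitch
```

The point is that "customary" units are no less precisely defined than SI ones; they share the same underlying standards.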
Using one base unit, e.g. the meter, and combining it with unit prefixes is quite a good solution. Of course, the meter itself was originally based on an arbitrary piece of metal. Just like the imperial units. SebiR: Although in just this case it is quite funny.
Even here in Europe, plumbers themselves quite often use inch dimensions for pipes. It has something to do with the relation of inside and outside diameters, which changed over the development of new higher-strength materials and manufacturing techniques. I know.
Our HDDs have 3.5-inch form factors. Not only are convenience units not dumb, they are precisely that: convenient, in that they are useful for the industry to which they are applied. SI wants all industries to use the same units, which is just clumsy if data interchange between industries is infrequent. Moreover, when certain types of accuracy are necessary, Conventional Measure (what the U.S. uses) serves well. It turns out that base ten can be applied to any unit system. How about that? An astronomical unit is about 1.496 × 10^11 meters.
A light year is about 9.46 × 10^15 meters. Which one is deci, and which one is deca? How do you go from peta to milli? Why is this a good thing again? It gets even worse if you have fractions of that standard object. The parsec exists in astronomy for exactly this reason. Distances to nearby objects were measured by parallax, so the unit was chosen to match the measurement: a star showing one arcsecond of parallax is one parsec away.
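The parsec's convenience is that it inverts the measurement directly: distance in parsecs is simply the reciprocal of the parallax in arcseconds, with no constant to remember. A sketch (the Proxima Centauri parallax figure is approximate):

```python
def distance_parsecs(parallax_arcsec):
    """Distance in parsecs is the reciprocal of the parallax in arcseconds."""
    return 1.0 / parallax_arcsec

# Proxima Centauri shows roughly 0.7685 arcsec of parallax:
print(round(distance_parsecs(0.7685), 3))  # ~1.301 pc
```

A unit matched to the instrument beats a "clean" decimal unit here, which is the commenter's point.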
Pat: That is really just too much. Have you ever heard of the calculator? The computer? One inch is 2.54 cm by definition. This makes all of the other measurements related to the inch equally well-defined and precise. A calculator gives you numbers, not measurements. But you know what I do have? A yardstick. When you show me a meterstick that can be divided in 3 as accurately as a yardstick, then you can call my argument pointless. Conventional Measure units and SI units all use the same standards today.
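The yardstick point is arithmetic, not craftsmanship: 36 inches split three ways lands exactly on 12 inches, while 1000 mm split three ways never lands on a whole millimetre. In Python:

```python
from fractions import Fraction

# A yard is 36 inches: a third is a whole number of inches.
print(36 % 3 == 0, 36 // 3)                # True 12

# A metre is 1000 mm: a third leaves a remainder.
print(1000 % 3, float(Fraction(1000, 3)))  # 1 333.333...
```

Any base-10 unit has this problem with thirds; base-12 units trade it for awkward fifths instead.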
SI superiority is largely a case of Eurocentric bias. As an earlier poster suggested, the factors of conventional measure (usually base 2 or base 12) are much more useful than base 10, and the sizes of the typical units are also more useful for the purposes for which they were made.
USA aside, conversion of a country to metric is not hard. Mixing the two is just plain confusing. Yes, but especially in electronics we have to do it often. Older packages are often specified in inches and newer ones in metric. And then you must not mismatch 0,65 mm pin spacing and 0,635 mm (25 mil), which look identical at a glance. But as the dimensions get smaller they are mostly metric.
Although a 0,4mm BGA is difficult to route out. Egyptians had 12 finger segments? Theoretically a good argument, but then where did we get base ten from? From bloody chimps counting on their fingers. I suppose instead of holding up finger segments like we hold up fingers, one could point with the other hand, which would at least offer an explanation as to why they stopped at 12 instead of rocking it all the way to 24 using both hands. They could do much better than that, counting ones on the left hand and twelves on the right, for a total count of 60. But I see it as important that the measurement system has the same base as the number system we are used to, so that the conversion to a multiple unit is just a carry to the next digit.
If we were all used to hexadecimal numbers and calculated with them routinely, then a base-16 unit system would be no problem. But 12 inches to a foot, 3 feet to a yard and some odd number like 1,760 to the mile, which suddenly contains the divisor 5, is what makes this system really difficult to handle.
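The mixed radix can be seen in action with a sketch that decomposes a length in inches into miles, yards, feet and inches using those factors (12 in/ft, 3 ft/yd, 1,760 yd/mi):

```python
def inches_to_mixed(total_inches):
    """Decompose a length in inches into (miles, yards, feet, inches)."""
    feet, inches = divmod(total_inches, 12)
    yards, feet = divmod(feet, 3)
    miles, yards = divmod(yards, 1760)
    return miles, yards, feet, inches

print(inches_to_mixed(100000))  # (1, 1017, 2, 4): not obvious at a glance
print(inches_to_mixed(63360))   # (1, 0, 0, 0): exactly one mile
```

In a decimal system the same decomposition is just digit grouping, which is the whole complaint.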
First, if this were such an important factor, why did these odd conversions evolve in the first place? Second, how often do you need to do such conversions by hand anymore? And finally third, I submit that it is far easier to make mistakes doing conversions by hand by shifting decimal points, and the resulting errors are harder to intuit.
In other words a conversion of inches to yards yields a number so different that an error of magnitude is obvious. In the case of cm to meters the integers are the same alright, but a slip of a decimal place is not always easy to see. You are wrong about Linux time. The fact that you can print that number out in almost any programming language and the default will be in base ten reflects on the bias of the programming language, not Linux.
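On the Linux point: to the kernel, Unix time is just an integer count of seconds since 1970-01-01 UTC; base ten only appears when a language renders the value. A quick illustration:

```python
import time

t = int(time.time())  # seconds since the epoch, stored as a plain integer
print(t)              # base-ten rendering is the print function's choice...
print(bin(t))         # ...the same value in base two, as the machine holds it
print(hex(t))         # ...or in hexadecimal; the stored bits never change
```

Whichever rendering you pick, the underlying counter is identical, which is why "Linux time is decimal" is a confusion of value with representation.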
Most CPUs these days force things, at least at the machine language level, into binary. In this sense, you could say that the OS is imposing binary, just by not working on anything but binary-based computers. Yeah, the 8080 and Z80 had an instruction for that too, which I think was just a marketing feature — the Decimal Adjust Accumulator instruction was just one of many changes that would be necessary to make BCD arithmetic work.
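The fix-up that DAA performs can be sketched in a few lines: do a plain binary add, then nudge any nibble that has left the 0-9 range back into decimal. A Python model of that adjustment (our own helper, not the actual silicon, and ignoring subtraction and flag subtleties):

```python
def bcd_add_byte(a, b):
    """Add two packed-BCD bytes, then apply a DAA-style decimal fix-up.

    Returns (adjusted byte, carry flag)."""
    r = a + b
    half_carry = (a & 0x0F) + (b & 0x0F) > 0x0F
    if (r & 0x0F) > 9 or half_carry:   # low digit overflowed the decimal range
        r += 0x06
    if (r & 0x1F0) > 0x90:             # high digit overflowed
        r += 0x60
    return r & 0xFF, r > 0xFF

print(hex(bcd_add_byte(0x27, 0x35)[0]))  # 0x62: 27 + 35 = 62
print(bcd_add_byte(0x48, 0x59))          # (7, True): 48 + 59 = 107, carry out
```

As the commenter says, this one instruction only patches addition; a full decimal machine would need far more than a post-add adjustment.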
Why not? Leap seconds. Gonna steal that definition.